Optimization techniques in statistics
Rustagi, Jagdish S
1994-01-01
Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximum principle. Variational methods and optimiza
Statistical properties of consideration sets
Carson, RT; Louviere, JJ
2015-01-01
© 2015 Elsevier Ltd. The concept of a consideration set has become a central concept in the study of consumer behavior. This paper shows that the common practice of estimating models using only the set of alternatives deemed to be in the set considered by a consumer will usually result in estimated parameters that are biased due to a sample selection effect. This effect is generic to many consideration set models and can be large in practice. To overcome this problem, models of an antecedent ...
Statistics in clinical research: Important considerations.
Barkan, Howard
2015-01-01
Statistical analysis is one of the foundations of evidence-based clinical practice, a key in conducting new clinical research and in evaluating and applying prior research. In this paper, we review the choice of statistical procedures, analyses of the associations among variables and techniques used when the clinical processes being examined are still in process. We discuss methods for building predictive models in clinical situations, and ways to assess the stability of these models and other quantitative conclusions. Techniques for comparing independent events are distinguished from those used with events in a causal chain or otherwise linked. Attention then turns to study design, to the determination of the sample size needed to make a given comparison, and to statistically negative studies.
Statistical Techniques for Project Control
Badiru, Adedeji B
2012-01-01
A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management, then explores how to temper quantitative analysis with qualitati
The maximum entropy technique. System's statistical description
Belashev, B Z
2002-01-01
The maximum entropy technique (MENT) is applied to search for the distribution functions of physical quantities. MENT naturally incorporates the demand of maximum entropy, the characteristics of the system and the connection conditions. This makes it possible to apply MENT to the statistical description of closed and open systems. Examples in which MENT has been used to describe equilibrium states, nonequilibrium states and states far from thermodynamical equilibrium are considered
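The core of MENT, maximizing entropy subject to known constraints, can be sketched numerically. The toy Python below is not from the paper; the discrete states and the mean constraint are invented for illustration. It finds the maximum-entropy distribution over a finite set of states with a prescribed mean, using bisection on the Lagrange multiplier of the Gibbs form p(x) ∝ exp(−λx).

```python
import math

def maxent_distribution(states, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `states` with a fixed mean:
    p(x) ~ exp(-lam * x); the multiplier lam is found by bisection."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in states]
        z = sum(w)
        return sum(x * wi for x, wi in zip(states, w)) / z
    # mean_for is decreasing in lam, so plain bisection works
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(-lam * x) for x in states]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_distribution([0, 1, 2, 3, 4], target_mean=1.0)
```

When the mean constraint equals the midpoint of the states, the multiplier vanishes and the uniform distribution, the unconstrained entropy maximizer, is recovered.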
21 CFR 820.250 - Statistical techniques.
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Statistical Considerations in Environmental Microbial Forensics.
McBride, Graham; Gilpin, Brent
2016-08-01
In environmental microbial forensics, as in other pursuits, statistical calculations are sometimes inappropriately applied, giving rise to the appearance of support for a particular conclusion or failing to support an innately obvious conclusion. This is a reflection of issues related to dealing with sample sizes, the methodologies involved, and the difficulty of communicating uncertainties. In this brief review, we attempt to illustrate ways to minimize such problems. In doing so, we consider one of the most common applications of environmental microbial forensics-the use of genotyping in food and water and disease investigations. We explore three important questions. (i) Do hypothesis tests' P values serve as adequate metrics of evidence? (ii) How can we quantify the value of the evidence? (iii) Can we turn a value-of-evidence metric into attribution probabilities? Our general conclusions are as follows. (i) P values have the unfortunate property of regularly detecting trivial effects when sample sizes are large. (ii) Likelihood ratios, rather than any kind of probability, are the better strength-of-evidence metric, addressing the question "what do these data say?" (iii) Attribution probabilities, addressing the question "what should I believe?," can be calculated using Bayesian methods, relying in part on likelihood ratios but also invoking prior beliefs which therefore can be quite subjective. In legal settings a Bayesian analysis may be required, but the choice and sensitivity of prior assumptions should be made clear.
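The review's third question, turning a likelihood ratio into an attribution probability, is a one-line application of Bayes' rule on the odds scale. A minimal Python sketch (the numeric values are illustrative, not taken from the review):

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Combine a prior attribution probability with a likelihood ratio
    (strength of evidence) via Bayes' rule on the odds scale:
    posterior odds = LR * prior odds."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = likelihood_ratio * prior_odds
    return post_odds / (1.0 + post_odds)

# An even prior (0.5) combined with evidence of LR = 9
posterior_probability(0.5, 9.0)  # -> 0.9
```

The sketch makes the review's caveat concrete: the output depends on the prior as much as on the data, which is why the choice of prior should be made explicit in legal settings.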
Tools & techniques--statistics: propensity score techniques.
da Costa, Bruno R; Gahl, Brigitta; Jüni, Peter
2014-10-01
Propensity score (PS) techniques are useful if the number of potential confounding pretreatment variables is large and the number of analysed outcome events is rather small so that conventional multivariable adjustment is hardly feasible. Only pretreatment characteristics should be chosen to derive PS, and only when they are probably associated with outcome. A careful visual inspection of PS will help to identify areas of no or minimal overlap, which suggests residual confounding, and trimming of the data according to the distribution of PS will help to minimise residual confounding. Standardised differences in pretreatment characteristics provide a useful check of the success of the PS technique employed. As with conventional multivariable adjustment, PS techniques cannot account for confounding variables that are not or are only imperfectly measured, and no PS technique is a substitute for an adequately designed randomised trial.
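The balance check described above, standardized differences in pretreatment characteristics, can be computed directly. A hedged Python sketch; the commonly cited 0.1 threshold is a convention, not a universal rule:

```python
import statistics

def standardized_difference(treated, control):
    """Standardized difference of one pretreatment covariate between
    treated and control groups:
    |mean_t - mean_c| / sqrt((var_t + var_c) / 2).
    Values below ~0.1 are often read as adequate balance."""
    mt, mc = statistics.fmean(treated), statistics.fmean(control)
    vt, vc = statistics.variance(treated), statistics.variance(control)
    return abs(mt - mc) / ((vt + vc) / 2.0) ** 0.5
```

In practice this is computed for every covariate before and after PS matching or weighting, as a check that the technique actually reduced imbalance.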
Statistical and Computational Techniques in Manufacturing
2012-01-01
In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, owing to the great complexity of manufacturing engineering and the high number of parameters involved, conventional approaches are no longer sufficient. Statistical and computational techniques have therefore found several applications in manufacturing, namely modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics; statistical and computational science researchers; mechanical, manufacturing and industrial engineers; and professionals in industries related to manu...
Considerations for choosing HOT lane delineation techniques
Energy Technology Data Exchange (ETDEWEB)
Hlavacek, I.; Vitek, M. [Texas Univ., Austin, TX (United States); Carrizales, J.; Machemehl, R. [Texas Dept. of Transportation, Austin, TX (United States)
2007-07-01
Transportation agencies are looking for ways to move more vehicles with fewer tax dollars. One innovative technique is managed lanes: special-purpose lanes on highways with controlled access to allow manipulation of service parameters. An example of a managed facility, high-occupancy toll (HOT) lanes are gaining support as a tool for encouraging higher vehicle occupancy while dealing with many of the shortcomings of high-occupancy vehicle (HOV) lanes. Because managed lanes, including HOT lanes, are controlled-access facilities and must be separated from general-purpose lanes, one of the principal issues is the type of delineation technique that should be used. Though the specifics of delineating a managed lane are numerous and allow for a wide variety of possibilities, most delineation techniques used can be categorized into three broad types: concrete barriers, buffer separation, and plastic posts. This paper presented the advantages, disadvantages, and required and desirable conditions for implementation of each type. The project involved assembling an expert panel to gather collective knowledge of factors and to share experiences and make recommendations regarding delineation device implementation. Panel recommendations and secondary-source information were presented as conceptual guidance regarding managed lane delineation measures. Buffer-type delineators were found to be least costly in terms of both initial and maintenance costs, while concrete barriers provide the best means of controlling access and are therefore the best means of guaranteeing toll collection from all users. 11 refs.
Statistical considerations for confirmatory clinical trials for similar biotherapeutic products.
Njue, Catherine
2011-09-01
For the purpose of comparing the efficacy and safety of a Similar Biotherapeutic Product (SBP) to a Reference Biotherapeutic Product (RBP), the "Guidelines on Evaluation of Similar Biotherapeutic Products (SBPs)" issued by the World Health Organization (WHO) state that equivalence or non-inferiority studies may be acceptable. While, in principle, equivalence trials are preferred, non-inferiority trials may be considered if appropriately justified, such as for a medicinal product with a wide safety margin. However, the statistical issues involved in the design, conduct, analysis and interpretation of equivalence and non-inferiority trials are complex and subtle, and require that all aspects of these trials be given careful consideration. These issues are important in order to ensure that equivalence and non-inferiority trials provide valid data that are necessary to draw reliable conclusions regarding the clinical similarity of an SBP to an RBP. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Quest for HI Turbulence Statistics New Techniques
Lazarian, A; Esquivel, Alejandro
2001-01-01
HI data cubes are sources of unique information on interstellar turbulence. Doppler shifts due to supersonic motions contain information on the turbulent velocity field which is otherwise difficult to obtain. However, the problem of separating velocity and density fluctuations within HI data cubes is far from trivial. The analytical description of the emissivity statistics of channel maps (velocity slices) in Lazarian & Pogosyan (2000) showed that the relative contribution of the density and velocity fluctuations depends on the thickness of the velocity slice. In particular, the power-law asymptotics of the emissivity fluctuations change when the dispersion of the velocity at the scale under study becomes of the order of the velocity slice thickness (integrated width of the channel map). These results are the foundations of the Velocity-Channel Analysis (VCA) technique, which allows one to determine velocity and density statistics using 21-cm data cubes. The VCA has been successfully tested using data cubes obta...
A survey of statistical downscaling techniques
Energy Technology Data Exchange (ETDEWEB)
Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik
1997-12-31
The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
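The analog method that performed so competitively here is conceptually simple: match today's large-scale field against a historical archive and reuse the regional observation from the closest match. A minimal Python sketch; the field vectors and rainfall values are invented for illustration:

```python
def analog_downscale(large_scale_today, archive):
    """Analog downscaling: find the historical large-scale pattern
    closest (Euclidean distance) to today's pattern and return the
    regional value (e.g. rainfall) observed on that day.
    `archive` is a list of (pattern_vector, regional_value) pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best = min(archive, key=lambda rec: dist(rec[0], large_scale_today))
    return best[1]

# Toy archive: two historical large-scale patterns with their rainfall
archive = [([1.0, 0.0], 5.0), ([0.0, 1.0], 12.0)]
analog_downscale([0.9, 0.1], archive)  # -> 5.0
```

In real applications the large-scale patterns are usually compressed first (e.g. leading EOF/principal-component coefficients) so the distance is computed in a low-dimensional space.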
Developing Communication Skills: General Considerations and Specific Techniques.
Joiner, Elizabeth Garner; Westphal, Patricia Barney, Ed.
This practical book is designed for the classroom teacher of a second or foreign language at any level. The articles are grouped into two distinct but interdependent sections on general considerations and specific techniques. The contents of the first section are as follows: "Moi Tarzan, Vous Jane?: A Study of Communicative Competence" by P.B.…
Statistical distributions and the entropy considerations in genetics
Lukierska-Walasek, Krystyna
2014-01-01
Zipf's law implies statistical distributions of hyperbolic type, which can describe the properties of stability and entropy loss in linguistics. We present the information theory from which it follows that if a system is described by distributions of hyperbolic type, it leads to the possibility of entropy loss. We try to find the correspondence between the histograms of gene lengths and the distributions of hyperbolic type for some microorganisms, such as Borrelia burgdorferi, Escherichia coli and Saccharomyces cerevisiae.
Statistical consideration and challenges in bridging study of personalized medicine.
Li, Meijuan
2015-01-01
Applications of personalized medicine are becoming increasingly prominent. A well-characterized, market-ready companion diagnostic assay (CDx) is often desired for patient enrollment in device-drug pivotal clinical trial(s) so that the Food and Drug Administration can ensure that appropriate clinical and analytical validation studies are planned and carried out for the CDx. However, such a requirement may be difficult or impractical to accomplish. A clinical trial assay (CTA) instead of the CDx may then be used for patient enrollment in the clinical trial. A concordance study (or bridging study) will be required to assess the agreement between the CDx and the CTA in order to bridge the clinical data (e.g. overall survival) from the CTA to the CDx and to evaluate the drug efficacy in the CDx intended-use population. In this article, we discuss statistical challenges in the design and analysis of bridging studies. In particular, we provide statistical methods for estimating the drug efficacy in the CDx intended-use population using results from the bridging study and the CTA-drug pivotal clinical trial.
DISTRIBUTED MONITORING SYSTEM RELIABILITY ESTIMATION WITH CONSIDERATION OF STATISTICAL UNCERTAINTY
Institute of Scientific and Technical Information of China (English)
Yi Pengxing; Yang Shuzi; Du Runsheng; Wu Bo; Liu Shiyuan
2005-01-01
Taking into account the whole system structure and the uncertainty in component reliability estimation, a system reliability estimation method based on probability and statistical theory for distributed monitoring systems is presented. The variance and confidence intervals of the system reliability estimate are obtained by expressing system reliability as a linear sum of products of higher-order moments of component reliability estimates when the number of component or system survivals obeys a binomial distribution. The characteristic function of the binomial distribution is used to determine the moments of the component reliability estimates, and a symbolic matrix which facilitates the search for explicit system reliability estimates is proposed. Furthermore, a case of application is used to illustrate the procedure; with the help of this example, various issues, such as the applicability of this estimation model and measures to improve the system reliability of monitoring systems, are discussed.
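For intuition, a simplified version of the problem can be sketched: estimate series-system reliability as the product of per-component binomial estimates, with a first-order (delta-method) variance standing in for the paper's higher-order-moment expansion. The Python below is an illustrative approximation, not the authors' method; the test data are invented:

```python
import math

def series_system_reliability(tests):
    """Point estimate and approximate 95% confidence interval for a
    series system's reliability from per-component binomial test data.
    `tests` is a list of (survivals, trials) pairs. The variance uses
    a first-order (delta-method) approximation:
    Var(R) ~ R^2 * sum((1 - r_i) / (r_i * n_i))."""
    r_hats = [s / n for s, n in tests]
    r_sys = math.prod(r_hats)
    rel_var = sum((1 - r) / (r * n) for r, (_, n) in zip(r_hats, tests))
    var = r_sys ** 2 * rel_var
    half = 1.96 * math.sqrt(var)
    return r_sys, max(0.0, r_sys - half), min(1.0, r_sys + half)
```

The clipping to [0, 1] hints at why the paper's more careful moment-based treatment matters: near the boundaries, the normal approximation for a binomial proportion degrades.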
Statistical Considerations of Data Processing in Giovanni Online Tool
Shen, Suhung; Leptoukh, G.; Acker, J.; Berrick, S.
2005-01-01
The GES DISC Interactive Online Visualization and Analysis Infrastructure (Giovanni) is a web-based interface for the rapid visualization and analysis of gridded data from a number of remote sensing instruments. The GES DISC currently employs several Giovanni instances to analyze various products, such as Ocean-Giovanni for ocean products from SeaWiFS and MODIS-Aqua; TOMS & OMI Giovanni for atmospheric chemical trace gases from TOMS and OMI; and MOVAS for aerosols from MODIS, etc. (http://giovanni.gsfc.nasa.gov). Foremost among the Giovanni statistical functions is data averaging. Two aspects of this function are addressed here. The first deals with the accuracy of averaging gridded mapped products vs. averaging from the ungridded Level 2 data. Some mapped products contain mean values only; others contain additional statistics, such as the number of pixels (NP) for each grid cell, standard deviation, etc. Since NP varies spatially and temporally, averaging with or without weighting by NP will give different results. In this paper, we address differences among various weighting algorithms for some datasets utilized in Giovanni. The second aspect is related to different averaging methods affecting data quality and interpretation for data with a non-normal distribution. The present study demonstrates results of different spatial averaging methods using gridded SeaWiFS Level 3 mapped monthly chlorophyll a data. Spatial averages were calculated using three different methods: arithmetic mean (AVG), geometric mean (GEO), and maximum likelihood estimator (MLE). Biogeochemical data, such as chlorophyll a, are usually considered to have a log-normal distribution. The study determined that differences between methods tend to increase with increasing size of a selected coastal area, with no significant differences in most open oceans. The GEO method consistently produces values lower than AVG and MLE. The AVG method produces values larger than MLE in some cases, but smaller in other cases. Further
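The three averaging methods compared in the study can be illustrated on a small positive-valued sample. A Python sketch with invented toy data, not SeaWiFS values; for log-normal data the geometric mean estimates the median, which is why GEO sits below both AVG and the log-normal MLE of the mean:

```python
import math

def spatial_means(values):
    """Three averages for positive, roughly log-normal data
    (e.g. chlorophyll-a): arithmetic mean (AVG), geometric mean (GEO)
    and the log-normal maximum-likelihood estimate of the mean (MLE),
    exp(mu + sigma^2 / 2) with mu, sigma^2 fitted to the logs."""
    n = len(values)
    avg = sum(values) / n
    logs = [math.log(v) for v in values]
    mu = sum(logs) / n
    var = sum((l - mu) ** 2 for l in logs) / n   # ML (biased) variance
    geo = math.exp(mu)
    mle = math.exp(mu + var / 2.0)               # mean of the fitted log-normal
    return avg, geo, mle
```

For a constant field all three agree; the more skewed the sample, the further GEO falls below the other two, matching the paper's observation that differences grow in heterogeneous coastal areas.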
Statistical normalization techniques for magnetic resonance imaging
Directory of Open Access Journals (Sweden)
Russell T. Shinohara
2014-01-01
While computed tomography and other imaging techniques are measured in absolute units with physical meaning, magnetic resonance images are expressed in arbitrary units that are difficult to interpret and differ between study visits and subjects. Much work in the image processing literature on intensity normalization has focused on histogram matching and other histogram mapping techniques, with little emphasis on normalizing images to have biologically interpretable units. Furthermore, there are no formalized principles or goals for the crucial comparability of image intensities within and across subjects. To address this, we propose a set of criteria necessary for the normalization of images. We further propose simple and robust biologically motivated normalization techniques for multisequence brain imaging that have the same interpretation across acquisitions and satisfy the proposed criteria. We compare the performance of different normalization methods in thousands of images of patients with Alzheimer's disease, hundreds of patients with multiple sclerosis, and hundreds of healthy subjects obtained in several different studies at dozens of imaging centers.
Predicting radiotherapy outcomes using statistical learning techniques
Energy Technology Data Exchange (ETDEWEB)
El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O [Washington University, Saint Louis, MO (United States); Lindsay, Patricia E; Hope, Andrew J [Department of Radiation Oncology, Princess Margaret Hospital, Toronto, ON (Canada)
2009-09-21
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among
Predicting radiotherapy outcomes using statistical learning techniques*
El Naqa, Issam; Bradley, Jeffrey D; Lindsay, Patricia E; Hope, Andrew J; Deasy, Joseph O
2013-01-01
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for ‘generalizability’ validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
Predicting radiotherapy outcomes using statistical learning techniques
El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.
2009-09-01
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning
Izenman, Alan Julian
2006-01-01
Describes the advances in computation and data storage that led to the introduction of many statistical tools for high-dimensional data analysis. Focusing on multivariate analysis, this book discusses nonlinear methods as well as linear methods. It presents an integrated mixture of classical and modern multivariate statistical techniques.
Assimilation of Multi-Sensor Synoptic and Mesoscale Datasets: An Approach Based on Statistic, Dynamic, Physical and Synoptic Considerations
Xiaolei...
2016-06-13
The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques
Menil, Violeta C.
2005-01-01
In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
Montecarlo Techniques as a tool for teaching statistics
Bueno, FM Alexander
2014-01-01
Probability theory and statistics are two of the most useful mathematical fields, and also two of the most difficult to learn. In other scientific fields, such as physics, experimentation is a useful tool for developing students' intuition, but applying this tool to statistics is much more difficult. In this paper we show how Monte Carlo techniques can be used to perform numerical experiments, through the use of pseudorandom numbers, and how these experiments can aid the understanding of statistics and physics. Monte Carlo techniques are broadly used in scientific research, but they are usually learnt in very specific courses in higher education. Through computer simulation these techniques can also be taught at elementary school, where they can help students understand and visualise concepts such as the variance, the mean or a probability distribution function. Finally, the use of new technologies, such as JavaScript and HTML, is discussed.
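A classic first numerical experiment of this kind is estimating π with pseudorandom numbers. The paper's examples use JavaScript/HTML; the self-contained Python sketch below is an equivalent illustration:

```python
import random

def monte_carlo_pi(n, seed=0):
    """Estimate pi by drawing n pseudorandom points in the unit square
    and counting the fraction that land inside the quarter circle
    x^2 + y^2 <= 1; that fraction approximates pi / 4."""
    rng = random.Random(seed)  # fixed seed makes the experiment repeatable
    inside = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n
```

Running it for increasing n lets students see the estimate tighten at the O(1/√n) Monte Carlo rate, which is exactly the kind of intuition-building experiment the paper advocates.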
Statistical principles and techniques in scientific and social research
Krzanowski, Wojtek J
2007-01-01
This text provides a clear discussion of the basic statistical concepts and methods frequently encountered in statistical research. Assuming only a basic level of Mathematics, and with numerous examples and illustrations, this text is a valuable resource for students and researchers in the Sciences and Social Sciences. This graduate-level text provides a survey of the logic and reasoning underpinning statistical analysis, as well as giving a broad-brush overview of the various statistical techniques that play a major role in scientific and social investigations. Arranged in rough historical order, the text starts with the ideas of probability that underpin statistical methods and progresses through the developments of the nineteenth and twentieth centuries to modern concerns and solutions. Assuming only a basic level of Mathematics and with numerous examples and illustrations, this text presents a valuable resource not only to the experienced researcher but also to the student, by complementing courses in ...
Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR
2014-07-12
Report to the U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: landmine detection, signal processing, GPR. The views, opinions and/or findings contained in this report are those of the author(s) and should not...
Validation of Models : Statistical Techniques and Data Availability
Kleijnen, J.P.C.
1999-01-01
This paper shows which statistical techniques can be used to validate simulation models, depending on which real-life data are available. Concerning this availability, three situations are distinguished: (i) no data, (ii) only output data, and (iii) both input and output data. In case (i) - no real
Statistical Techniques for Efficient Indexing and Retrieval of Document Images
Bhardwaj, Anurag
2010-01-01
We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…
Statistically tuned Gaussian background subtraction technique for UAV videos
Indian Academy of Sciences (India)
R Athi Lingam; K Senthil Kumar
2014-08-01
Background subtraction is an efficient technique for segmenting targets from the non-informative background of a video. The traditional background subtraction technique suits videos with a static background, whereas video obtained from an unmanned aerial vehicle has a dynamic background. Here, we propose an algorithm with a tuning factor and Gaussian update for surveillance videos that works effectively for aerial video. The tuning factor is optimized by extracting statistical features of the input frames. With the optimized tuning factor and Gaussian update, an adaptive Gaussian-based background subtraction technique is proposed. The algorithm involves modelling, update and subtraction phases. This running-Gaussian-average background subtraction technique applies updates at both the model generation phase and the subtraction phase. The resulting video extracts the moving objects from the dynamic background. Sample videos with various properties, such as cluttered background, small objects, moving background and multiple objects, are considered for evaluation. The technique is statistically compared with the frame differencing technique, the temporal median method and the mixture-of-Gaussians model, and performance evaluation confirms the effectiveness of the proposed technique after optimization for both static and dynamic videos.
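The modelling/update/subtraction phases of a running Gaussian average can be sketched as follows (a minimal illustration on synthetic frames; the tuning factor `alpha` and threshold `k` are placeholders, not the paper's optimized values):

```python
import numpy as np

def update_background(frame, mean, var, alpha=0.05):
    """Update the per-pixel Gaussian model (mean, variance) with a new frame."""
    diff = frame - mean
    mean = mean + alpha * diff
    var = var + alpha * (diff ** 2 - var)
    return mean, var

def foreground_mask(frame, mean, var, k=2.5):
    """Flag pixels more than k standard deviations from the background mean."""
    return np.abs(frame - mean) > k * np.sqrt(var)

rng = np.random.default_rng(0)
mean = np.full((8, 8), 100.0)   # initial background model
var = np.full((8, 8), 4.0)
for _ in range(20):             # let the model settle on the static background
    frame = 100.0 + rng.normal(0.0, 2.0, (8, 8))
    mean, var = update_background(frame, mean, var)

frame = 100.0 + rng.normal(0.0, 2.0, (8, 8))
frame[3, 3] = 200.0             # a bright moving-object pixel
mask = foreground_mask(frame, mean, var)
```

The bright pixel stands far outside the settled Gaussian model, so it is flagged as foreground while almost all background pixels are not.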
Green, John; Wheeler, James R
2013-11-15
Solvents are often used to aid test item preparation in aquatic ecotoxicity experiments. This paper discusses the practical, statistical and regulatory considerations. The selection of the appropriate control (if a solvent is used) for statistical analysis is investigated using a database of 141 responses (endpoints) from 71 experiments. The advantages and disadvantages of basing the statistical analysis of treatment effects on the water control alone, the solvent control alone, the combined controls, or a conditional strategy of combining the controls when they are not statistically significantly different, are tested. The latter two approaches are shown to have distinct advantages, and it is recommended that they continue to be the standard for regulatory and research aquatic ecotoxicology studies. However, wherever technically feasible a solvent should not be employed, or at least its concentration should be minimized. Copyright © 2013 Elsevier B.V. All rights reserved.
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase of harmful air pollutants in the ambient atmosphere. In this study, several statistical and artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area: principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN). The forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods offer limited accuracy because they cannot predict the extreme points, i.e. the pollution maximum and minimum cut-offs. As an alternative to these traditional methods, the coupling of statistical techniques with artificial intelligence (AI) is proposed for forecasting: PCA, ANN and fuzzy logic are combined to forecast air pollutants over the Delhi urban area. The statistical measures, e.g. correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model compare favourably with those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
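The MLR step can be sketched with ordinary least squares on synthetic data (the predictor names and coefficients below are invented for illustration, not the Delhi CPCB series):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))              # e.g. wind speed, temperature, humidity
true_beta = np.array([2.0, -1.0, 0.5])
y = 10.0 + X @ true_beta + rng.normal(0.0, 0.1, 200)  # pollutant concentration

A = np.column_stack([np.ones(len(X)), X])  # design matrix with an intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ beta
r = np.corrcoef(pred, y)[0, 1]             # correlation coefficient R
```

With a well-specified linear model, the recovered intercept is close to the true value of 10 and R is near 1; the statistical scores mentioned in the abstract (NMSE, FB, IOA) would be computed from `pred` and `y` in the same way.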
Lightweight and Statistical Techniques for Petascale Debugging
Energy Technology Data Exchange (ETDEWEB)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger, or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which had already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed in the Cooperative Bug Isolation (CBI) project to isolate programming errors in widely used sequential or threaded applications to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, leaving application scientists a greater overall amount of time to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis
Categorical and nonparametric data analysis choosing the best statistical technique
Nussbaum, E Michael
2014-01-01
Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
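The neighbour-correlation idea behind such spatial models can be illustrated with Moran's I, a standard spatial-autocorrelation statistic (a sketch only; the transect counts and binary adjacency weights below are invented):

```python
import numpy as np

x = np.array([8.0, 9.0, 7.0, 2.0, 1.0, 2.0])  # nematode counts along a transect
n = len(x)
W = np.zeros((n, n))
for i in range(n - 1):                         # adjacent plots are neighbours
    W[i, i + 1] = W[i + 1, i] = 1.0

z = x - x.mean()
I = (n / W.sum()) * (z @ W @ z) / (z @ z)      # Moran's I
```

A value of I well above 0 (here about 0.7) indicates that neighbouring plots have similar counts, i.e. positive spatial autocorrelation, which is the situation where site-specific rather than field-wide treatment pays off.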
A Survey on Statistical Based Single Channel Speech Enhancement Techniques
Directory of Open Access Journals (Sweden)
Sunnydayal. V
2014-11-01
Speech enhancement is a long-standing problem with various applications, such as hearing aids and the automatic recognition and coding of speech signals. Single-channel speech enhancement techniques are used to enhance speech degraded by additive background noise. Background noise can have an adverse impact on our ability to converse without hindrance in very noisy environments, such as busy streets, a car, or the cockpit of an airplane; such noise can affect the quality and intelligibility of speech. The objective of this survey paper is to provide an overview of algorithms that enhance a noisy speech signal corrupted by additive noise. The algorithms are mainly based on statistical approaches, and different estimators are compared. Challenges and opportunities in speech enhancement are also discussed. This paper helps in choosing the best statistical technique for speech enhancement.
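One classic estimator in this statistical family is spectral subtraction; a minimal sketch on a synthetic tone in Gaussian noise (not any specific algorithm from the survey):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 512
t = np.arange(n)
clean = np.sin(2 * np.pi * 16 * t / n)           # "speech": a tone on FFT bin 16
noisy = clean + rng.normal(0.0, 0.5, n)

# noise magnitude spectrum estimated from a speech-free stretch
noise_mag = np.abs(np.fft.rfft(rng.normal(0.0, 0.5, n)))

spec = np.fft.rfft(noisy)
mag = np.maximum(np.abs(spec) - noise_mag, 0.0)  # subtract magnitudes, floor at zero
enhanced = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), n)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_enhanced = np.mean((enhanced - clean) ** 2)
```

Subtracting an estimated noise magnitude spectrum and keeping the noisy phase reduces the mean squared error relative to the clean signal; the estimators the survey compares (Wiener, MMSE, etc.) refine this same magnitude-domain idea.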
Comparison of three Statistical Classification Techniques for Maser Identification
Manning, Ellen M; Ellingsen, Simon P; Breen, Shari L; Chen, Xi; Humphries, Melissa
2016-01-01
We applied three statistical classification techniques - linear discriminant analysis (LDA), logistic regression and random forests - to three astronomical datasets associated with searches for interstellar masers. We compared the performance of these methods in identifying whether specific mid-infrared or millimetre continuum sources are likely to have associated interstellar masers. We also discuss the ease, or otherwise, with which the results of each classification technique can be interpreted. Non-parametric methods have the potential to make accurate predictions when there are complex relationships between critical parameters. We found that for the small datasets the parametric methods logistic regression and LDA performed best, for the largest dataset the non-parametric method of random forests performed with comparable accuracy to parametric techniques, rather than any significant improvement. This suggests that at least for the specific examples investigated here accuracy of the predictions obtained ...
Combining heuristic and statistical techniques in landslide hazard assessments
Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni
2014-05-01
As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
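The landslide index method mentioned above weights each factor class by the log of its landslide density relative to the map-wide density. A minimal sketch with made-up class areas and landslide counts:

```python
import math

class_area = {"gentle": 600.0, "moderate": 300.0, "steep": 100.0}  # km^2 per slope class
class_slides = {"gentle": 5, "moderate": 15, "steep": 30}          # mapped landslides

overall_density = sum(class_slides.values()) / sum(class_area.values())
weights = {
    c: math.log((class_slides[c] / class_area[c]) / overall_density)
    for c in class_area
}
```

Classes with above-average landslide density get positive weights (here "steep") and below-average classes negative ones ("gentle"); the weights-of-evidence method extends this by also accounting for the absence of landslides in each class.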
Statistical optimisation techniques in fatigue signal editing problem
Energy Technology Data Exchange (ETDEWEB)
Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)
2015-02-03
Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small-amplitude cycles from the recorded signal, but a long recorded signal can render the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a constrained single-objective Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines an overlapping window with fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable-amplitude strain data into different segments, while considering the retention of statistical parameters and vibration energy. In the second section, the fatigue data editing problem is formulated as a constrained single-objective optimisation problem that can be solved with the GA. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise from constraints on the segment selection, expressed as deviation levels over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that solving fatigue signal editing within an optimisation framework is effective and automatic, and that the GA is robust for constrained segment selection.
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. The effect has been particularly noticeable in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analysing and interpreting these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been used in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and the structure of the data set.
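As a sketch of one of the exploratory techniques such reviews cover, the following computes a PCA ordination by SVD of a tiny, invented sample-by-taxon abundance matrix:

```python
import numpy as np

# rows: samples, columns: taxon abundances (invented values)
X = np.array([
    [10.0, 2.0, 1.0],
    [12.0, 1.0, 2.0],
    [2.0, 9.0, 1.0],
    [1.0, 11.0, 2.0],
])
Xc = X - X.mean(axis=0)                     # centre each taxon
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s ** 2 / (s ** 2).sum()         # variance explained per axis
scores = Xc @ Vt.T                          # sample coordinates (the ordination)
```

Here the first axis captures the trade-off between the first two taxa and explains most of the variance, so the four samples separate cleanly into two community types along PC1.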
Seasonal drought predictability in Portugal using statistical-dynamical techniques
Ribeiro, A. F. S.; Pires, C. A. L.
2016-08-01
Atmospheric forecasting and predictability are important for promoting adaptation and mitigation measures that minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3 months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system with lead times of up to 6 months. ERA-Interim reanalysis data are used to build a set of SPI predictors integrating recent past information prior to the forecast launch. The advantage of combining predictors with both dynamical and statistical backgrounds for predicting drought conditions at different lags is then evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA so that forecasted PCs and persistent PCs can be used as predictors. The second hybridization step is a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using R2 and binary event scores. Results are obtained for the four seasons; winter is found to be the most predictable season, and most of the predictive power lies in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary (though modest) information beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and to guidance for users (such as farmers) in their decision-making.
Comparison of Three Statistical Classification Techniques for Maser Identification
Manning, Ellen M.; Holland, Barbara R.; Ellingsen, Simon P.; Breen, Shari L.; Chen, Xi; Humphries, Melissa
2016-04-01
We applied three statistical classification techniques-linear discriminant analysis (LDA), logistic regression, and random forests-to three astronomical datasets associated with searches for interstellar masers. We compared the performance of these methods in identifying whether specific mid-infrared or millimetre continuum sources are likely to have associated interstellar masers. We also discuss the interpretability of the results of each classification technique. Non-parametric methods have the potential to make accurate predictions when there are complex relationships between critical parameters. We found that for the small datasets the parametric methods logistic regression and LDA performed best, for the largest dataset the non-parametric method of random forests performed with comparable accuracy to parametric techniques, rather than any significant improvement. This suggests that at least for the specific examples investigated here accuracy of the predictions obtained is not being limited by the use of parametric models. We also found that for LDA, transformation of the data to match a normal distribution led to a significant improvement in accuracy. The different classification techniques had significant overlap in their predictions; further astronomical observations will enable the accuracy of these predictions to be tested.
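A hand-rolled two-class LDA of the kind compared in this study can be sketched as follows (purely synthetic "maser host vs. non-host" features; real applications would use measured mid-infrared or continuum properties):

```python
import numpy as np

rng = np.random.default_rng(2)
hosts = rng.normal([2.0, 2.0], 0.5, (100, 2))     # sources with masers
nonhosts = rng.normal([0.0, 0.0], 0.5, (100, 2))  # sources without
X = np.vstack([hosts, nonhosts])
y = np.array([1] * 100 + [0] * 100)

mu1, mu0 = hosts.mean(axis=0), nonhosts.mean(axis=0)
Sw = np.cov(hosts.T) + np.cov(nonhosts.T)         # within-class scatter
w = np.linalg.solve(Sw, mu1 - mu0)                # discriminant direction
threshold = w @ (mu1 + mu0) / 2.0
pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
```

LDA projects each source onto a single discriminant axis and thresholds at the midpoint of the class means; with well-separated Gaussian classes like these, accuracy is near perfect, which matches the paper's finding that parametric methods do well on data that approximate their assumptions.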
Statistical technique for analysing functional connectivity of multiple spike trains.
Masud, Mohammad Shahed; Borisyuk, Roman
2011-03-15
A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains.
Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques
Kuan, Gary M
2008-01-01
The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, that will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss, with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss through a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be maintained. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper; it provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
The use of statistical techniques in par-level management.
Klee, W B
1994-02-01
The total quality management movement has allowed the reintroduction of statistics in the materials management workplace. Statistical methods can be applied to the par level management process with significant results.
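A common par-level rule sets par at mean usage plus a service-level multiple of its standard deviation under a normal model; this is one plausible reading of the statistical methods alluded to, not the paper's specific procedure:

```python
import statistics

usage = [12, 15, 11, 14, 13, 16, 12, 15]   # daily usage counts for one supply item
mean = statistics.mean(usage)
sd = statistics.stdev(usage)
par = mean + 1.65 * sd                     # ~95% service level under a normal model
```

Rounding `par` up gives the stocking level; raising the multiplier (e.g. 2.33 for ~99%) trades higher inventory cost for fewer stockouts.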
Ahmad, Saira Ijaz; Malik, Samina; Irum, Jamila; Zahid, Rabia
2011-01-01
The main objective of the study was to identify the instructional methods and techniques used by secondary school teachers to deliver instruction to students, and to explore the teachers' basic considerations in selecting these instructional methods and techniques. Participants in the study included 442 teachers…
Tóthfalusi, Lászlo; Endrényi, László; Chow, Shein-Chung
2014-05-01
When the patent of a brand-name, marketed drug expires, new, generic products are usually offered. Small-molecule generic and originator drug products are expected to be chemically identical. Their pharmaceutical similarity can typically be assessed by simple regulatory criteria such as the expectation that the 90% confidence interval for the ratio of geometric means of some pharmacokinetic parameters be between 0.80 and 1.25. When such criteria are satisfied, the drug products are generally considered to exhibit therapeutic equivalence. They are then usually interchanged freely within individual patients. Biological drugs are complex proteins; their complexity arises, for instance, from their large size, intricate structure, sensitivity to environmental conditions, difficult manufacturing procedures, and the possibility of immunogenicity. Generic and brand-name biologic products can be expected to show only similarity, not identity, in their various features and clinical effects. Consequently, the determination of biosimilarity is also a complicated process which involves assessment of the totality of the evidence for the close similarity of the two products. Moreover, even when biosimilarity has been established, it may not be assumed that the two biosimilar products can be automatically substituted by pharmacists. This generally requires additional, careful considerations. Without declaring interchangeability, a new product could be prescribed, i.e. it is prescribable. However, two products can be automatically substituted only if they are interchangeable. Interchangeability is a statistical term and it means that products can be used in any order in the same patient without considering the treatment history. The concepts of interchangeability and prescribability have been widely discussed in the past, but only in relation to small-molecule generics. In this paper we apply these concepts to biosimilars and we discuss: definitions of prescribability and interchangeability and
Firstenberg, H.
1971-01-01
The statistics of the Monte Carlo method are considered in relation to the interpretation of the NUGAM2 and NUGAM3 computer code results. A numerical experiment using the NUGAM2 code is presented, and the results are interpreted statistically.
TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION
Directory of Open Access Journals (Sweden)
А. А. Vershinina
2014-01-01
The article presents a technique for the statistical analysis of a region's attractiveness for foreign direct investment: a definition of the technique is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.
Statistical tools for the calibration of traffic conflicts techniques.
Oppe, S.
1982-01-01
To compare the results of various conflict techniques from different countries, an international experiment took place in Rouen in 1979. The experiment showed that, in general, with each technique the same conclusions were reached with regard to the problems of safety at two intersections in Rouen.
Experimental Data Mining Techniques(Using Multiple Statistical Methods
Directory of Open Access Journals (Sweden)
Mustafa Zaidi
2012-05-01
This paper discusses possible approaches to non-linear multivariable problems using experimental data-mining techniques based on an orthogonal array. The Taguchi method is a very useful technique for reducing the time and cost of an experiment, but it ignores all interaction effects, and its results here were not encouraging, which motivated further study. The laser cutting process, a non-linear multivariable problem, is therefore modelled by one-way and two-way analysis of variance as well as linear and non-linear regression analysis. These techniques are used to explore better analysis methods and to improve laser cutting quality by reducing process variations caused by controllable process parameters. The size of the data set causes difficulties in modelling and simulation; a decision tree, for example, is a useful technique but was unable to produce good predictions here. The results of the analysis of variance are encouraging. Taguchi and regression methods normally optimize input process parameters for a single characteristic.
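The one-way ANOVA used above decomposes total variation into between-group and within-group sums of squares; a sketch with invented laser-power data:

```python
import numpy as np

groups = [
    np.array([9.8, 10.1, 10.0, 9.9]),     # responses at laser power level 1
    np.array([10.9, 11.2, 11.0, 11.1]),   # level 2
    np.array([12.1, 11.8, 12.0, 12.2]),   # level 3
]
k = len(groups)
n = sum(len(g) for g in groups)
grand = np.concatenate(groups).mean()

ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F (here well over 100) says the between-level variation dwarfs the within-level noise, i.e. laser power significantly affects the response; with two factors the same decomposition extends to two-way ANOVA.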
Correlation techniques and measurements of wave-height statistics
Guthart, H.; Taylor, W. C.; Graf, K. A.; Douglas, D. G.
1972-01-01
Statistical measurements of wave height fluctuations have been made in a wind wave tank. The power spectral density function of temporal wave height fluctuations evidenced second-harmonic components and an f^(-5) power-law decay beyond the second harmonic. The observations of second-harmonic effects agreed very well with a theoretical prediction. From the wave statistics, surface drift currents were inferred and compared to experimental measurements with satisfactory agreement. Measurements were made of the two-dimensional correlation coefficient at 15 deg increments in angle with respect to the wind vector. An estimate of the two-dimensional spatial power spectral density function was also made.
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
Velocity field statistics and tessellation techniques : Unbiased estimators of Omega
Van de Weygaert, R.; Bernardeau, F.; Muller; Gottlober, S.; Mucket, J.P.; Wambsganss, J.
1998-01-01
We describe two new - stochastic-geometrical - methods to obtain reliable velocity field statistics from N-body simulations and from any general density and velocity fluctuation field sampled at a discrete set of locations. These methods, the Voronoi tessellation method and the Delaunay tessellation method…
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical
Insights into the softening of chaotic statistical models by quantum considerations
Cafaro, C.; Giffin, A.; Lupo, C.; Mancini, S.
2012-05-01
We analyze the information geometry and the entropic dynamics of a 3D Gaussian statistical model and compare our analysis to that of a 2D Gaussian statistical model obtained from the higher-dimensional model via introduction of an additional information constraint that resembles the quantum mechanical canonical minimum uncertainty relation. We uncover that the chaoticity of the 2D Gaussian statistical model, quantified by means of the Information Geometric Entropy (IGE), is softened with respect to the chaoticity of the 3D Gaussian statistical model.
Adaptive Steganography: A survey of Recent Statistical Aware Steganography Techniques
Directory of Open Access Journals (Sweden)
Manish Mahajan
2012-09-01
Steganography is the science of hiding secret data in a carrier medium, which may be an image, audio, formatted text or video; the main idea is to conceal the very existence of the data. We deal here with image steganography. Many algorithms have been proposed for this purpose in the spatial and frequency domains, but in almost all of them it has been noticed that embedding secret data in an image disturbs certain characteristics or statistics of the image. Based on these disturbed statistics, steganalysts can infer the existence of secret data, which they then decode with the help of available steganalytic tools. Steganalysis is the science of attacking hidden data to gain unauthorized access; although steganalysis is not a part of this work, it is discussed occasionally as part of the literature review. Our main emphasis is not purely on the spatial or frequency domain but rather on adaptive, or model-based, steganography. Adaptive steganography is not an entirely new branch; it builds on the spatial and frequency domains with an additional layer of mathematical modelling. It takes the important characteristics and statistics of the cover image into account before embedding the secret data, so that the disturbance of image statistics mentioned above, which attracts forgery or unauthorized access, can be minimized. In this survey we analyze various steganography algorithms that are based on a mathematical model, in other words algorithms in the category of model-based steganography.
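For contrast with the adaptive schemes surveyed, the simplest spatial-domain method, LSB embedding, can be sketched in a few lines (the "image" here is just a list of 8-bit pixel values):

```python
def embed(pixels, message_bits):
    """Overwrite the least significant bit of the first len(message_bits) pixels."""
    out = list(pixels)
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract(pixels, n_bits):
    """Read the message back out of the least significant bits."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [100, 101, 102, 103, 104, 105, 106, 107]
secret = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, secret)
recovered = extract(stego, len(secret))
```

Each pixel changes by at most 1, so the image looks unchanged; yet exactly this scheme perturbs first-order pixel statistics in a detectable way, which is the weakness that motivates the model-based approaches discussed in the survey.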
Statistical techniques for the characterization of partially observed epidemics.
Energy Technology Data Exchange (ETDEWEB)
Safta, Cosmin; Ray, Jaideep; Crary, David (Applied Research Associates, Inc, Arlington, VA); Cheng, Karen (Applied Research Associates, Inc, Arlington, VA)
2010-11-01
Techniques appear promising for constructing an integrated, automated detect-and-characterize capability for epidemics: working off biosurveillance data, it provides information on the particular ongoing outbreak. Potential uses lie in crisis management, planning, and resource allocation; the parameter estimation capability is ideal for providing the input parameters of an agent-based model (index cases, time of infection, infection rate). Non-communicable diseases are easier to characterize than communicable ones: a small anthrax release can be characterized well with 7-10 days of post-detection data, plague takes longer, and large attacks are very easy.
Statistical Theory of the Vector Random Decrement Technique
DEFF Research Database (Denmark)
Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.
1999-01-01
The Vector Random Decrement technique has previously been introduced as an efficient method to transform ambient responses of linear structures into Vector Random Decrement functions which are equivalent to free decays of the current structure. The modal parameters can be extracted from the free d...
Web Usage Statistics: Measurement Issues and Analytical Techniques.
Bertot, John Carlo; McClure, Charles R.; Moen, William E.; Rubin, Jeffrey
1997-01-01
One means of Web use evaluation is through analysis of server-generated log files. Various log file analysis techniques and issues are presented that are related to the interpretation of log file data. Study findings indicate a number of problems; recommendations and areas needing further research are outlined. (AEF)
Some Bayesian statistical techniques useful in estimating frequency and density
Johnson, D.H.
1977-01-01
This paper presents some elementary applications of Bayesian statistics to problems faced by wildlife biologists. Bayesian confidence limits for frequency of occurrence are shown to be generally superior to classical confidence limits. Population density can be estimated from frequency data if the species is sparsely distributed relative to the size of the sample plot. For other situations, limits are developed based on the normal distribution and prior knowledge that the density is non-negative, which ensures that the lower confidence limit is non-negative. Conditions are described under which Bayesian confidence limits are superior to those calculated with classical methods; examples are also given on how prior knowledge of the density can be used to sharpen inferences drawn from a new sample.
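The Beta-posterior reasoning behind such Bayesian frequency limits can be sketched as follows; the uniform prior, the plot counts, and the Monte Carlo construction of the interval are illustrative assumptions, not Johnson's exact closed-form procedure.

```python
import random

def bayes_freq_interval(n_plots, n_occupied, alpha0=1.0, beta0=1.0,
                        level=0.95, n_draws=100_000, seed=42):
    """Bayesian credible interval for frequency of occurrence.

    With a Beta(alpha0, beta0) prior and n_occupied plots occupied out
    of n_plots, the posterior is Beta(alpha0 + x, beta0 + n - x); the
    interval is read off sorted Monte Carlo draws from that posterior.
    """
    rng = random.Random(seed)
    a = alpha0 + n_occupied
    b = beta0 + n_plots - n_occupied
    draws = sorted(rng.betavariate(a, b) for _ in range(n_draws))
    lo = draws[int((1 - level) / 2 * n_draws)]
    hi = draws[int((1 + level) / 2 * n_draws)]
    return lo, hi

low, high = bayes_freq_interval(50, 3)   # 3 of 50 sample plots occupied
```

Because the posterior is supported on (0, 1), the lower limit is strictly positive by construction, mirroring the paper's point that incorporating non-negativity sharpens the lower bound.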
Consideration of techniques to mitigate the unauthorized 3D printing production of keys
Straub, Jeremy; Kerlin, Scott
2016-05-01
The illicit production of 3D printed keys based on remote-sensed imagery is problematic as it allows a would-be intruder to access a secured facility without the attack attempt being as obviously detectable as conventional techniques. This paper considers the problem from multiple perspectives. First, it looks at different attack types and considers the prospective attack from a digital information perspective. Second, based on this, techniques for securing keys are considered. Third, the design of keys is considered from the perspective of making them more difficult to duplicate using visible light sensing and 3D printing. Policy and legal considerations are discussed.
Statistical mechanics of sensing and communications: Insights and techniques
Energy Technology Data Exchange (ETDEWEB)
Murayama, T; Davis, P [NTT Communication Science Laboratories, NIPPON TELEGRAPH AND TELEPHONE CORPORATION, 2-4, Hikaridai, Seika-cho, 'Keihanna Science City', Kyoto 619-0237 (Japan)], E-mail: murayama@cslab.kecl.ntt.co.jp, E-mail: davis@cslab.kecl.ntt.co.jp
2008-01-15
In this article we review a basic model for analysis of large sensor networks from the point of view of collective estimation under bandwidth constraints. We compare different sensing aggregation levels as alternative 'strategies' for collective estimation: moderate aggregation from a moderate number of sensors for which communication bandwidth is enough that data encoding can be reversible, and large scale aggregation from very many sensors - in which case communication bandwidth constraints require the use of nonreversible encoding. We show the non-trivial trade-off between sensing quality, which can be increased by increasing the number of sensors, and communication quality under bandwidth constraints, which decreases if the number of sensors is too large. From a practical standpoint, we verify that such a trade-off exists in constructively defined communications schemes. We introduce a probabilistic encoding scheme and define rate distortion models that are suitable for analysis of the large network limit. Our description shows that the methods and ideas from statistical physics can play an important role in formulating effective models for such schemes.
Stalked protozoa identification by image analysis and multivariable statistical techniques.
Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C
2008-06-01
Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining the geometrical, morphological and signature data and subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to be responsible for the best identification ability and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.
Vaughn, Brandon K.
2009-01-01
This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this…
Design of UWB pulse radio transceiver using statistical correlation technique in frequency domain
Directory of Open Access Journals (Sweden)
M. Anis
2007-06-01
In this paper, we propose a new technique to extract low-power UWB pulse radio signals, near the noise level, using a statistical correlation technique in the frequency domain. The receiver consists of many narrow bandpass filters which extract energy either from the transmitted UWB signal, interfering channels, or noise. Transmitted UWB data can be eliminated by statistical correlation of multiple bandpass filter outputs. Super-regenerative oscillators, tuned within the UWB spectrum, are designed as the bandpass filters. Summers and comparators perform the statistical correlation.
Statistical and Managerial Techniques for Six Sigma Methodology Theory and Application
Barone, Stefano
2012-01-01
Statistical and Managerial Techniques for Six Sigma Methodology examines the methodology by illustrating the most widespread tools and techniques involved in Six Sigma application. Both the managerial and statistical aspects of Six Sigma are analyzed, allowing the reader to apply these tools in the field. The book offers insight into variation and risk management and focuses on the structure and organizational aspects of Six Sigma projects. It covers Six Sigma methodology, basic managerial techniques, basic statistical techniques, methods for variation and risk management and advanc
Directory of Open Access Journals (Sweden)
Tomislav Car
1989-12-01
The statistical approach is one of the methods offering quantitative indications of structure security and a basic techno-economic criterion for their application. The basis of the approach is the determination of the probability of error occurrence in the different interactive relationships between natural forces and structure properties. The results of that analysis are the basis for a quantitative definition of the techno-economic optimum for structure and system application (the paper is published in Croatian).
Zhou, Cong; Simpson, Kathryn L; Lancashire, Lee J; Walker, Michael J; Dawson, Martin J; Unwin, Richard D; Rembielak, Agata; Price, Patricia; West, Catharine; Dive, Caroline; Whetton, Anthony D
2012-04-01
A mass spectrometry-based plasma biomarker discovery workflow was developed. Plasma from either healthy volunteers or patients with pancreatic cancer was 8-plex iTRAQ labeled, fractionated by 2-dimensional reversed phase chromatography and subjected to MALDI ToF/ToF mass spectrometry. Data were processed using a q-value based statistical approach to maximize protein quantification and identification. Technical (between duplicate samples) and biological variance (between and within individuals) were calculated and power analysis was thereby enabled. An a priori power analysis was carried out using samples from healthy volunteers to define sample sizes required for robust biomarker identification. The result was subsequently validated with a post hoc power analysis using a real clinical setting involving pancreatic cancer patients. This demonstrated that six samples per group (e.g., pre- vs post-treatment) may provide sufficient statistical power for most proteins with changes >2-fold. A reference standard allowed direct comparison of protein expression changes between multiple experiments. Analysis of patient plasma prior to treatment identified 29 proteins with significant changes within individual patients. Changes in Peroxiredoxin II levels were confirmed by Western blot. This q-value based statistical approach in combination with reference standard samples can be applied with confidence in the design and execution of clinical studies for predictive, prognostic, and/or pharmacodynamic biomarker discovery. The power analysis provides information required prior to study initiation.
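The q-value idea underlying such a workflow can be illustrated with the Benjamini-Hochberg step-up adjustment, a common way of turning p-values into FDR-controlling q-values; the paper's exact q-value procedure may differ, and the p-values below are made up.

```python
def bh_qvalues(pvals):
    """Benjamini-Hochberg step-up adjusted p-values: each p-value is
    scaled by m/rank, and a running minimum (taken from the largest
    p-value downward) enforces monotonicity."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    q = [0.0] * m
    running_min = 1.0
    for k, i in enumerate(reversed(order)):      # largest p first
        rank = m - k                             # 1-based rank of pvals[i]
        running_min = min(running_min, pvals[i] * m / rank)
        q[i] = running_min
    return q

qs = bh_qvalues([0.001, 0.02, 0.03, 0.5])
```

A protein would be reported when its q-value falls below the chosen false discovery rate, so many candidates can be screened without inflating the expected proportion of false positives.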
Taniguchi, Masahiko; Henry, Sarah; Cogdell, Richard J; Lindsey, Jonathan S
2014-07-01
Depending on growth conditions, some species of purple photosynthetic bacteria contain peripheral light-harvesting (LH2) complexes that are heterogeneous owing to the presence of different protomers (containing different αβ-apoproteins). Recent spectroscopic studies of Rhodopseudomonas palustris grown under low-light conditions suggest the presence of a C3-symmetric LH2 nonamer comprised of two distinct protomers. The software program Cyclaplex, which enables generation and data-mining of virtual libraries of molecular rings formed upon combinatorial reactions, has been used to delineate the possible number and type of distinct nonamers as a function of numbers of distinct protomers. The yield of the C3-symmetric nonamer from two protomers (A and B in varying ratios) has been studied under the following conditions: (1) statistical, (2) enriched (preclusion of the B-B sequence), and (3) seeded (pre-formation of an A-B-A block). The yield of C3-symmetric nonamer is at most 0.98% under statistical conditions versus 5.6% under enriched conditions, and can be dominant under conditions of pre-seeding with an A-B-A block. In summary, the formation of any one specific nonamer even from only two protomers is unlikely on statistical grounds but must stem from enhanced free energy of formation or a directed assembly process by as-yet unknown factors.
Scientific Opinion on Statistical considerations for the safety evaluation of GMOs
DEFF Research Database (Denmark)
Sørensen, Ilona Kryspin
in the experimental design of field trials, such as the inclusion of commercial varieties, in order to ensure sufficient statistical power and reliable estimation of natural variability. A graphical representation is proposed to allow the comparison of the GMO, its conventional counterpart and the commercial...... such estimates are unavailable may they be estimated from databases or literature. Estimated natural variability should be used to specify equivalence limits to test the difference between the GMO and the commercial varieties. Adjustments to these equivalence limits allow a simple graphical representation so...... in this opinion may be used, in certain cases, for the evaluation of GMOs other than plants....
The principle of inverse effectiveness in multisensory integration: some statistical considerations.
Holmes, Nicholas P
2009-05-01
The principle of inverse effectiveness (PoIE) in multisensory integration states that, as the responsiveness to individual sensory stimuli decreases, the strength of multisensory integration increases. I discuss three potential problems in the analysis of multisensory data with regard to the PoIE. First, due to 'regression towards the mean,' the PoIE may often be observed in datasets that are analysed post-hoc (i.e., when sorting the data by the unisensory responses). The solution is to design discrete levels of stimulus intensity a priori. Second, due to neurophysiological or methodological constraints on responsiveness, the PoIE may be, in part, a consequence of 'floor' and 'ceiling' effects. The solution is to avoid analysing or interpreting data that are too close to the limits of responsiveness, enabling both enhancement and suppression to be reliably observed. Third, the choice of units of measurement may affect whether the PoIE is observed in a given dataset. Both relative (%) and absolute (raw) measurements have advantages, but the interpretation of both is affected by systematic changes in response variability with changes in response mean, an issue that may be addressed by using measures of discriminability or effect-size such as Cohen's d. Most importantly, randomising or permuting a dataset to construct a null distribution of a test parameter may best indicate whether any observed inverse effectiveness specifically characterises multisensory integration. When these considerations are taken into account, the PoIE may disappear or even reverse in a given dataset. I conclude that caution should be exercised when interpreting data that appear to follow the PoIE.
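The randomisation suggestion in the final point can be sketched as a simple permutation test; the data, the use of enhancement scores, and the choice of Pearson correlation as the test parameter are all illustrative assumptions, not Holmes's specific procedure.

```python
import random

def perm_null_corr(uni, enh, n_perm=2000, seed=1):
    """Permutation null for the (negative) correlation between
    unisensory responsiveness and multisensory enhancement that would
    indicate inverse effectiveness.  Returns the observed Pearson r
    and a two-sided permutation p-value."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    rng = random.Random(seed)
    r_obs = pearson(uni, enh)
    count = 0
    shuffled = list(enh)
    for _ in range(n_perm):
        rng.shuffle(shuffled)          # break the pairing, keep the values
        if abs(pearson(uni, shuffled)) >= abs(r_obs):
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)

uni = [1, 2, 3, 4, 5, 6, 7, 8]
enh = [9, 8, 6, 6, 4, 3, 2, 1]   # enhancement falls as responsiveness rises
r, p = perm_null_corr(uni, enh)
```

Because the permutation destroys only the pairing between unisensory response and enhancement, a small p-value indicates that the observed inverse relationship is not an artefact of the marginal distributions alone.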
Energy Technology Data Exchange (ETDEWEB)
Tal, Balazs; Bencze, Attila; Zoletnik, Sandor; Veres, Gabor [KFKI-Research Institute for Particle and Nuclear Physics, Association EURATOM, PO Box 49, H-1525 Budapest (Hungary); Por, Gabor [Department of Nuclear Techniques, Budapest University of Technology and Economics, Association EURATOM, Muegyetem rkp. 9., H-1111 Budapest (Hungary)
2011-12-15
Time delay estimation methods (TDE) are well-known techniques to investigate poloidal flows in hot magnetized plasmas through the propagation properties of turbulent structures in the medium. One of these methods is based on the estimation of the time lag at which the cross-correlation function (CCF) estimate reaches its maximum value. The uncertainty of the peak location sets the smallest determinable flow velocity modulation, and therefore the standard deviation of the time delay imposes an important limitation on the measurements. In this article, the relative standard deviation of the CCF estimate and the standard deviation of its peak location are calculated analytically using a simple model of turbulent signals. This model assumes independent (non-interacting) overlapping events (coherent structures) with randomly distributed spatio-temporal origins moving with the background flow. The result of our calculations is a general formula for the CCF variance, which is valid not only in the high event density limit but for arbitrary event densities. Our formula reproduces the well-known expression for high event densities previously published in the literature. In this paper we also present a derivation of the variance of the time delay estimate, which turns out to be inversely proportional to the applied time window. The derived formulas were tested in real plasma measurements. The calculations are an extension of the earlier work of Bencze and Zoletnik [Phys. Plasmas 12, 052323 (2005)], where the autocorrelation-width technique was developed. Additionally, we show that velocities calculated by a TDE method possess a broadband noise which originates from this variance; its power spectral density cannot be decreased by worsening the time resolution, and it can be coherent with the noises of other velocity measurements in which the same turbulent structures are used. This noise should not be confused with the impact of zero mean frequency zonal flow
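The peak-of-CCF estimator whose variance is analysed above can be sketched with a brute-force lag search; the toy signal and the direct-sum correlation estimate are illustrative, not the authors' analysis code.

```python
def xcorr_delay(x, y, max_lag):
    """Estimate the delay of y relative to x as the lag maximising the
    cross-correlation estimate sum_i x[i] * y[i + lag]."""
    n = len(x)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(x[i] * y[i + lag]
                  for i in range(n) if 0 <= i + lag < n)
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

sig = [0, 0, 1, 3, 1, 0, 0, 0, 0, 0]
delayed = sig[-2:] + sig[:-2]          # same pulse shifted right by 2 samples
lag = xcorr_delay(sig, delayed, 4)
```

With noisy finite-length signals the located peak jitters around the true lag, and the spread of that jitter is exactly the time-delay variance the paper derives.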
Energy Technology Data Exchange (ETDEWEB)
Simonson, K.M.
1998-08-01
The rate at which a mine detection system falsely identifies man-made or natural clutter objects as mines is referred to as the system's false alarm rate (FAR). Generally expressed as a rate per unit area or time, the FAR is one of the primary metrics used to gauge system performance. In this report, an overview is given of statistical methods appropriate for the analysis of data relating to FAR. Techniques are presented for determining a suitable size for the clutter collection area, for summarizing the performance of a single sensor, and for comparing different sensors. For readers requiring more thorough coverage of the topics discussed, references to the statistical literature are provided. A companion report addresses statistical issues related to the estimation of mine detection probabilities.
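A minimal sketch of summarising the FAR of a single sensor, treating false alarm counts as Poisson; the normal approximation on the count and the numbers are illustrative assumptions, not the report's exact methods, which include references to exact procedures in the statistical literature.

```python
import math

def far_interval(n_false_alarms, area, z=1.96):
    """Point estimate and approximate 95% confidence interval for a
    false alarm rate (alarms per unit area), treating the count as a
    Poisson observation over the surveyed clutter-collection area."""
    rate = n_false_alarms / area
    half = z * math.sqrt(n_false_alarms) / area   # normal approx on the count
    return rate, max(0.0, rate - half), rate + half

rate, lo, hi = far_interval(18, 3.0)   # 18 false alarms over 3 units of area
```

The width of the interval shrinks with the surveyed area, which is why choosing a suitable clutter-collection area size is one of the report's topics.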
How well do test case prioritization techniques support statistical fault localization
Tse, TH; Jiang, B.; Zhang, Z; Chen, TY
2009-01-01
In continuous integration, a tight integration of test case prioritization techniques and fault-localization techniques may both expose failures faster and locate faults more effectively. Statistical fault-localization techniques use the execution information collected during testing to locate faults. Executing a small fraction of a prioritized test suite reduces the cost of testing, and yet the subsequent fault localization may suffer. This paper presents the first empirical study to examine...
Scheib, Stacey A; Tanner, Edward; Green, Isabel C; Fader, Amanda N
2014-01-01
The objectives of this review were to analyze the literature describing the benefits of minimally invasive gynecologic surgery in obese women, to examine the physiologic considerations associated with obesity, and to describe surgical techniques that will enable surgeons to perform laparoscopy and robotic surgery successfully in obese patients. The Medline database was reviewed for all articles published in the English language between 1993 and 2013 containing the search terms "gynecologic laparoscopy" "laparoscopy," "minimally invasive surgery and obesity," "obesity," and "robotic surgery." The incidence of obesity is increasing in the United States, and in particular morbid obesity in women. Obesity is associated with a wide range of comorbid conditions that may affect perioperative outcomes including hypertension, atherosclerosis, angina, obstructive sleep apnea, and diabetes mellitus. In obese patients, laparoscopy or robotic surgery, compared with laparotomy, is associated with a shorter hospital stay, less postoperative pain, and fewer wound complications. Specific intra-abdominal access and trocar positioning techniques, as well as anesthetic maneuvers, improve the likelihood of success of laparoscopy in women with central adiposity. Performing gynecologic laparoscopy in the morbidly obese is no longer rare. Increases in the heaviest weight categories involve changes in clinical practice patterns. With comprehensive and thoughtful preoperative and surgical planning, minimally invasive gynecologic surgery may be performed safely and is of particular benefit in obese patients. Copyright © 2014 AAGL. Published by Elsevier Inc. All rights reserved.
Bliefernicht, Jan; Laux, Patrick; Siegmund, Jonatan; Kunstmann, Harald
2013-04-01
The development and application of statistical techniques, with a special focus on recalibration of meteorological or hydrological forecasts to eliminate the bias between forecasts and observations, have received a great deal of attention in recent years. One reason is that retrospective forecasts are nowadays available, which allows a proper training and validation of such techniques. The objective of this presentation is to propose several statistical techniques of differing complexity and to evaluate and compare their performance for the recalibration of seasonal ensemble forecasts of monthly precipitation. The techniques selected in this study range from straightforward normal-score and quantile-quantile transformations and local scaling to more sophisticated and novel statistical techniques such as the Copula-based methodology recently proposed by Laux et al. (2011). The seasonal forecasts are derived from the Climate Forecast System Version 2, the current coupled ocean-atmosphere general circulation model of the U.S. National Centers for Environmental Prediction, used to provide forecasts up to nine months ahead. The CFS precipitation forecasts are compared to monthly precipitation observations from the Global Precipitation Climatology Centre. The statistical techniques are tested for semi-arid regions in West Africa and on the Indian subcontinent, focusing on large-scale river basins such as the Ganges and the Volta. In both regions seasonal precipitation forecasts are a crucial source of information for the prediction of hydro-meteorological extremes, in particular droughts. The evaluation uses retrospective CFS ensemble forecasts from 1982 to 2009; the training of the statistical techniques is done in cross-validation mode. The outcome of this investigation illustrates large systematic differences between forecasts and observations, in particular for the Volta basin in West Africa. The selection of straightforward
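The most straightforward of the recalibration techniques mentioned, empirical quantile-quantile mapping, can be sketched as follows; the toy climatologies and the nearest-rank lookup are illustrative assumptions (operational implementations usually interpolate between ranks).

```python
import bisect

def quantile_map(value, fc_sorted, obs_sorted):
    """Empirical quantile-quantile mapping: replace a forecast value
    by the observed value at the same empirical quantile, so the
    corrected forecasts inherit the observed climatology."""
    n = len(fc_sorted)
    rank = bisect.bisect_left(fc_sorted, value)  # quantile in forecast climatology
    rank = min(max(rank, 0), n - 1)
    return obs_sorted[rank]

fc_clim = [10, 20, 30, 40, 50]   # model climatology (biased high), sorted
obs_clim = [5, 12, 22, 31, 41]   # observed climatology, sorted
corrected = quantile_map(30, fc_clim, obs_clim)
```

A forecast at the model's median is mapped onto the observed median, removing the systematic wet bias of this toy model while preserving the rank of each forecast.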
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
A survey of image processing techniques and statistics for ballistic specimens in forensic science.
Gerules, George; Bhatia, Sanjiv K; Jackson, Daniel E
2013-06-01
This paper provides a review of recent investigations on the image processing techniques used to match spent bullets and cartridge cases. It is also, to a lesser extent, a review of the statistical methods that are used to judge the uniqueness of fired bullets and spent cartridge cases. We review 2D and 3D imaging techniques as well as many of the algorithms used to match these images. We also provide a discussion of the strengths and weaknesses of these methods for both image matching and statistical uniqueness. The goal of this paper is to be a reference for investigators and scientists working in this field.
Basics, common errors and essentials of statistical tools and techniques in anesthesiology research.
Bajwa, Sukhminder Jit Singh
2015-01-01
The statistical portion is a vital component of any research study. The research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped research activities throughout the globe. Results and inferences are not accurately possible without proper validation with the appropriate statistical tools and tests. Evidence-based anesthesia research and practice has to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge of the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analyses, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and also to highlight the common areas where the most statistical errors are committed, so that better statistical practices can be adopted.
de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.
2017-01-01
Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
A STATISTICAL CORRELATION TECHNIQUE AND A NEURAL-NETWORK FOR THE MOTION CORRESPONDENCE PROBLEM
VANDEEMTER, JH; MASTEBROEK, HAK
1994-01-01
A statistical correlation technique (SCT) and two variants of a neural network are presented to solve the motion correspondence problem. Solutions of the motion correspondence problem aim to maintain the identities of individuated elements as they move. In a preprocessing stage, two snapshots of a m
Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis
DEFF Research Database (Denmark)
Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...
Computer program uses Monte Carlo techniques for statistical system performance analysis
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
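A Monte Carlo tolerance analysis in the spirit of the program described above might look like this; the component distributions, their parameters, and the spec limit are made-up illustrations, not the program's actual inputs.

```python
import random

def monte_carlo_yield(n_trials=20_000, seed=7):
    """Sample each component's misalignment from its own distribution,
    stack them into a total system error, and record how often the
    system stays within spec (unbiased estimate via random sampling)."""
    rng = random.Random(seed)
    in_spec = 0
    for _ in range(n_trials):
        total = (rng.gauss(0.0, 0.10)        # component A misalignment
                 + rng.gauss(0.0, 0.05)      # component B misalignment
                 + rng.uniform(-0.1, 0.1))   # component C (uniform play)
        if abs(total) < 0.35:                # system performance spec
            in_spec += 1
    return in_spec / n_trials

yield_frac = monte_carlo_yield()
```

Because each trial draws from the full distribution of every component, the estimated yield reflects the complete disturbance statistics rather than a worst-case stack-up.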
Statistical Techniques Used in Published Articles: A Historical Review of Reviews
Skidmore, Susan Troncoso; Thompson, Bruce
2010-01-01
The purpose of the present study is to provide a historical account and metasynthesis of which statistical techniques are most frequently used in the fields of education and psychology. Six articles reviewing the "American Educational Research Journal" from 1969 to 1997 and five articles reviewing the psychological literature from 1948 to 2001…
Nuclear Technology. Course 26: Metrology. Module 27-7, Statistical Techniques in Metrology.
Espy, John; Selleck, Ben
This seventh in a series of eight modules for a course titled Metrology focuses on descriptive and inferential statistical techniques in metrology. The module follows a typical format that includes the following sections: (1) introduction, (2) module prerequisites, (3) objectives, (4) notes to instructor/student, (5) subject matter, (6) materials…
Novick, Melvin R.
This project is concerned with the development and implementation of some new statistical techniques that will facilitate a continuing input of information about the student to the instructional manager so that individualization of instruction can be managed effectively. The source of this informational input is typically a short…
Troendle, James F
2016-09-01
No large, randomized, placebo-controlled trial of iodine supplementation in pregnant women in a region of mild or moderate iodine deficiency has been completed in which a primary outcome measure was an assessment of the neurobehavioral development of the offspring at age ≥2 y. In this article, I discuss considerations for the design of such a trial in a region of mild iodine deficiency, with a focus on statistical methods and approaches. Exposure and design issues include the ethics of using a placebo, the potential for overexposure to iodine, and the possibility of community randomization. The main scientific goal of the trial is important in determining the follow-up period. If the goal is to determine whether iodine supplementation during pregnancy improves neurobehavioral development in the offspring, then follow-up should continue until a reasonably reliable assessment can be conducted, which might be at age ≥2 y. Once the timing of assessment is decided, the impact of potential loss to follow-up should be considered so that appropriate statistical methods can be incorporated into the design. The minimum sample size can be calculated by using a sample size formula that incorporates noncompliance and assumes that a certain proportion of study participants do not have any outcome observed. To have sufficient power to detect a reasonably modest difference in neurobehavioral development scores using an assessment tool with an SD of 15, a large number of participants (>500/group) is required. The minimum adequate number of participants may be even larger (>1300/group) depending on the magnitude of the difference in outcome between the supplementation and placebo groups, the estimated proportion of the iodine-supplementation group that fails to take the supplement, and the estimated proportion of pregnancies that do not produce outcome measurements.
Dane, Aaron; Wetherington, Jeffrey D
2014-01-01
At present, there are situations in antibiotic drug development where the low number of enrollable patients with key problem pathogens makes it impossible to conduct fully powered non-inferiority trials in the traditional way. Recent regulatory changes have begun to address this situation. In parallel, statistical issues regarding the application of alternative techniques, balancing the unmet need with the level of certainty in the approval process, and the use of additional sources of data are critical areas to increase development feasibility. Although such approaches increase uncertainty compared with a traditional development program, this will be necessary to allow new agents to be made available. Identification of these risks and explicit discussion around requirements in these areas should help clarify the situation, and hence, the feasibility of developing drugs to treat the most concerning pathogens before the unmet need becomes even more acute than at present.
STATISTICAL INFERENCES FOR VARYING-COEFFICIENT MODELS BASED ON LOCALLY WEIGHTED REGRESSION TECHNIQUE
Institute of Scientific and Technical Information of China (English)
梅长林; 张文修; 梁怡
2001-01-01
Some fundamental issues on statistical inferences relating to varying-coefficient regression models are addressed and studied. An exact testing procedure is proposed for checking the goodness of fit of a varying-coefficient model fitted by the locally weighted regression technique versus an ordinary linear regression model. Also, an appropriate statistic for testing variation of model parameters over the locations where the observations are collected is constructed, and a formal testing approach, which is essential to exploring spatial non-stationarity in geographical science, is suggested.
Energy Technology Data Exchange (ETDEWEB)
Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)
2012-10-15
Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (both more experienced in chronic cerebrovascular disease than in neoplastic disease, reflecting the hospital's well-known geriatric medicine department; cerebral tumors were therefore excluded from this study and addressed only in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast, and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with FBP (p > .05), whereas a statistically significant difference was found between image quality at 200 mAs with FBP and at 300 mAs with FBP (p < .05). Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.
Baek, Seung Yeb; Bae, Dong Ho
2011-02-01
Gas welding is a very important and useful technology in the fabrication of railroad cars and commercial vehicle structures. However, since the fatigue strength of gas-welded joints is considerably lower than that of the base material due to stress concentration at the weld, the fatigue strength assessment of gas-welded joints is very important for the reliability and durability of railroad cars and for establishing criteria for long-life fatigue design. In this study, after evaluating the fatigue strength using a simulated specimen that satisfies not only the structural characteristics but also the mechanical condition of the actual structure, the fatigue design criteria are determined and applied to the fatigue design of the gas-welded body structure. To save time and cost in the fatigue design, we investigated accelerated life prediction using a probabilistic statistics technique based on the theory of statistical reliability. The (Δσa)R–Nf relationship was obtained from actual fatigue test data, including welding residual stress. On the basis of these results, the (Δσa)R–(Nf)ALP relationship derived from statistical probability analysis was compared with the actual fatigue test data. It is therefore expected that accelerated life prediction will provide a useful method of determining the criteria for fatigue design and predicting a specific target life.
Alvarez, Odalys Quevedo; Tagle, Margarita Edelia Villanueva; Pascual, Jorge L Gómez; Marín, Ma Teresa Larrea; Clemente, Ana Catalina Nuñez; Medina, Miriam Odette Cora; Palau, Raiza Rey; Alfonso, Mario Simeón Pomares
2014-10-01
Spatial and temporal variations of sediment quality in Matanzas Bay (Cuba) were studied by determining a total of 12 variables (Zn, Cu, Pb, As, Ni, Co, Al, Fe, Mn, V, CO₃²⁻, and total hydrocarbons (THC)). Surface sediments were collected annually at eight stations during 2005-2008. Multivariate statistical techniques, such as principal component analysis (PCA), cluster analysis (CA), and linear discriminant analysis (LDA), were applied to identify the most significant variables influencing the environmental quality of sediments. Heavy metals (Zn, Cu, Pb, V, and As) and THC were the most significant species contributing to sediment quality variations during the sampling period. Concentrations of V and As were determined in sediments of this ecosystem for the first time. The variation of sediment environmental quality with the sampling period and the differentiation of samples into three groups along the bay were obtained. The usefulness of the multivariate statistical techniques employed for the environmental interpretation of a limited dataset was confirmed.
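The PCA step used in studies like the one above can be sketched compactly. The matrix below is synthetic stand-in data with a known low-dimensional structure, not the Matanzas Bay measurements; dimensions and variable counts are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a sediment dataset: 32 samples x 5 correlated
# variables (think Zn, Cu, Pb, V, THC); two hidden factors drive them.
latent = rng.normal(size=(32, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + 0.1 * rng.normal(size=(32, 5))

# PCA: centre the data, form the covariance matrix, eigendecompose it.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]           # sort by descending variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()         # fraction of variance per PC
scores = Xc @ eigvecs                       # sample coordinates on the PCs
print("variance explained by PC1+PC2:", explained[:2].sum())
```

Because the synthetic data have two latent factors, the first two components capture nearly all the variance, which is the kind of dimensionality reduction such studies rely on to identify the dominant variables.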
Donges, Jonathan F; Loew, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-01-01
Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP) analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships between both eigen and network approaches are derived and illustrated using exemplary data sets. These results show that climate network analysis can complement classical eigen techniques and provides substantial additional information on the higher-order structure of statistical interrelationships in climatological data sets. Hence, climate networks are a valuable su...
Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor
2016-09-01
In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
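Confidence bounds of the kind described in [1] can be sketched as follows. The savings figures below are invented for illustration, and a normal approximation (z = 1.96) stands in for the exact small-sample t-based bounds the paper would use:

```python
import math
import statistics

# Illustrative (not the paper's) annual energy savings, in kWh, measured
# for 12 high-efficiency motors replacing standard-efficiency units.
savings = [4120, 3980, 4500, 4210, 3890, 4310, 4050, 4400, 4150, 3970, 4280, 4190]

n = len(savings)
mean = statistics.fmean(savings)
se = statistics.stdev(savings) / math.sqrt(n)   # standard error of the mean

# Normal approximation for a 95% interval; small-sample t-based bounds
# (as in the cited analysis) would be slightly wider.
z = 1.96
lo, hi = mean - z * se, mean + z * se
print(f"95% CI for mean annual saving: ({lo:.0f}, {hi:.0f}) kWh")
```

The same construction applies to the cost-saving and payback-period parameters once their per-motor values are computed from the formulas in the data article.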
Design of U-Geometry Parameters Using Statistical Analysis Techniques in the U-Bending Process
Wiriyakorn Phanitwong; Untika Boochakul; Sutasn Thipprakmas
2017-01-01
The various U-geometry parameters in the U-bending process result in processing difficulties in the control of the spring-back characteristic. In this study, the effects of U-geometry parameters, including channel width, bend angle, material thickness, tool radius, as well as workpiece length, and their design, were investigated using a combination of finite element method (FEM) simulation, and statistical analysis techniques. Based on stress distribution analyses, the FEM simulation results ...
Energy Technology Data Exchange (ETDEWEB)
Park, Jinyong [Univ. of Arizona, Tucson, AZ (United States); Balasingham, P [Univ. of Arizona, Tucson, AZ (United States); McKenna, Sean Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kulatilake, Pinnaduwa H.S.W. [Univ. of Arizona, Tucson, AZ (United States)
2004-09-01
Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater
The statistical analysis techniques to support the NGNP fuel performance experiments
Pham, Binh T.; Einerson, Jeffrey J.
2013-10-01
This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
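Of the three statistical techniques named, control charting is the simplest to illustrate. A toy 3-sigma Shewhart chart for flagging a drifting thermocouple follows; the readings are simulated, not AGR data:

```python
import statistics

# Simulated thermocouple readings (deg C); the last two values drift upward
# as if the sensor were failing.
readings = [1000.2, 999.8, 1000.5, 999.6, 1000.1, 1000.3, 999.9,
            1000.0, 1000.4, 999.7, 1006.0, 1007.5]

# Control limits are set from an in-control baseline (first 10 points).
baseline = readings[:10]
centre = statistics.fmean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

# Any point outside the 3-sigma band warns of a possible sensor failure.
out_of_control = [i for i, r in enumerate(readings) if not lcl <= r <= ucl]
print("points outside 3-sigma limits:", out_of_control)
```

In the system described, such flags would be raised automatically on the measured data stream, with correlation and regression analysis providing the complementary cross-checks.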
Ratner, Bruce
2011-01-01
The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has
Directory of Open Access Journals (Sweden)
VIMALA C.
2015-05-01
In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as Hidden Markov Models (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM), and Decision Trees (DT) are the most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the developed system, the highest word recognition accuracy was achieved with the HMM technique: 100% accuracy during the training process and 97.92% during testing.
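Of the techniques listed, DTW is compact enough to sketch directly. Below is a minimal pure-Python implementation of the standard DTW recurrence; the sequences are illustrative toy signals, not Tamil speech features:

```python
import math

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    inf = math.inf
    # D[i][j] = minimal accumulated cost aligning a[:i] with b[:j].
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping steps.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# A time-shifted copy of a template aligns perfectly; a different shape does not.
template = [0, 1, 2, 3, 2, 1, 0]
shifted  = [0, 0, 1, 2, 3, 2, 1, 0]
other    = [3, 3, 3, 0, 0, 0, 3, 3]
print(dtw_distance(template, shifted))  # 0.0 -- warping absorbs the shift
print(dtw_distance(template, other))    # large
```

This tolerance to timing differences is why DTW works for isolated-word template matching, where the same word is spoken at varying speeds.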
Kassomenos, P.; Vardoulakis, S.; Borge, R.; Lumbreras, J.; Papaloukas, C.; Karakitsios, S.
2010-10-01
In this study, we used and compared three different statistical clustering methods: a hierarchical technique, a non-hierarchical technique (K-means), and an artificial neural network technique (self-organizing maps, SOM). These classification methods were applied to a 4-year dataset of 5-day kinematic back trajectories of air masses arriving in Athens, Greece at 12.00 UTC at three different heights above the ground. The atmospheric back trajectories were simulated with the HYSPLIT Version 4.7 model of the National Oceanic and Atmospheric Administration (NOAA). The meteorological data used for the computation of trajectories were obtained from the NOAA reanalysis database. A comparison of the three statistical clustering methods through statistical indices was attempted. It was found that all three statistical methods depend on the arrival height of the trajectories, but the degree of dependence differs substantially. Hierarchical clustering showed the highest level of dependence on the arrival height for fast-moving trajectories, followed by SOM. K-means was found to be the clustering technique least dependent on the arrival height. The air quality management applications of these results in relation to PM10 concentrations recorded in Athens, Greece, were also discussed. Differences in PM10 concentrations during certain clusters were found to be statistically significant (at the 95% confidence level), indicating that these clusters appear to be associated with long-range transport of particulates. This study can improve the interpretation of modelled atmospheric trajectories, leading to a more reliable analysis of synoptic weather circulation patterns and their impacts on urban air quality.
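Of the three clustering methods compared, hierarchical clustering can be sketched briefly as single-linkage agglomeration. The 2-D points below are a toy stand-in for trajectory coordinates, not the HYSPLIT output:

```python
# Minimal single-linkage agglomerative clustering on 2-D points, a toy
# stand-in for clustering back-trajectory coordinates.
def single_linkage(points, k):
    clusters = [[p] for p in points]
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    while len(clusters) > k:
        # Merge the pair of clusters with the smallest minimum point distance.
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(p, q) for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

pts = [(0, 0), (0.5, 0.2), (0.1, 0.4),   # one air-mass origin region
       (5, 5), (5.3, 4.8), (4.9, 5.2)]   # a distant one
groups = single_linkage(pts, 2)
print([len(g) for g in groups])  # -> [3, 3]
```

Real trajectory clustering works on much higher-dimensional vectors (whole trajectory paths), but the agglomeration loop is the same; K-means and SOM differ mainly in how cluster representatives are formed and updated.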
The k-means clustering technique: General considerations and implementation in Mathematica
Directory of Open Access Journals (Sweden)
Laurence Morissette
2013-02-01
Data clustering techniques are valuable tools for researchers working with large databases of multivariate data. In this tutorial, we present a simple yet powerful one: the k-means clustering technique, through three different algorithms: the Forgy/Lloyd algorithm, the MacQueen algorithm, and the Hartigan and Wong algorithm. We then present an implementation in Mathematica and various examples of the different options available to illustrate the application of the technique.
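The tutorial's implementation is in Mathematica; the Forgy/Lloyd algorithm it describes can be sketched in Python as follows (toy 2-D data, illustrative only):

```python
import random

def kmeans(points, k, iters=50, seed=1):
    """Forgy/Lloyd k-means on a list of (x, y) tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # Forgy initialisation: random points
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's group.
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                            (p[1] - centroids[c][1]) ** 2)
            groups[i].append(p)
        # Update step: each centroid moves to the mean of its group.
        new = []
        for i, g in enumerate(groups):
            if g:
                new.append((sum(p[0] for p in g) / len(g),
                            sum(p[1] for p in g) / len(g)))
            else:
                new.append(centroids[i])   # keep an empty cluster in place
        if new == centroids:               # converged
            break
        centroids = new
    return centroids, groups

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, groups = kmeans(pts, 2)
print(sorted(len(g) for g in groups))  # -> [3, 3]
```

MacQueen's variant updates the centroid after every single assignment rather than after a full pass, and Hartigan-Wong moves points only when doing so reduces the within-cluster sum of squares.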
Hamer, Harold A.; Mayer, John P.; Huston, Wilber B.
1961-01-01
Results of a statistical analysis of horizontal-tail loads on a fighter airplane are presented. The data were obtained from a number of operational training missions with flight at altitudes up to about 50,000 feet and at Mach numbers up to 1.22. The analysis was performed to determine the feasibility of calculating horizontal-tail load from data on the flight conditions and airplane motions. In the analysis the calculated loads are compared with the measured loads for the different types of missions performed. The loads were calculated by two methods: a direct approach and a Monte Carlo technique. In the direct method, a time history of tail load is calculated from time-history measurements of the flight quantities. In the Monte Carlo method, frequencies of occurrence of tail loads of given magnitudes are derived from statistical information on the flight quantities; this method could be useful for extending loads information for the design of prospective airplanes. The procedures used and some of the problems associated with the data analysis are discussed. The results indicate that the accuracy of the loads, regardless of the method used for calculation, depends largely on knowledge of the pertinent airplane aerodynamic characteristics and center-of-gravity location. In addition, reliable Monte Carlo results require an adequate sample of statistical data and a knowledge of the more important statistical dependencies between the various flight conditions and airplane motions.
Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.
2012-04-01
Climate and soil are two of the most important limiting factors for agricultural production. Nowadays, climate change has been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools to simulate the most relevant aspects of the climate expected for the twenty-first century in the frame of climate change. These models are able to reproduce the general features of the atmospheric dynamics, but their low resolution (about 200 km) prevents a proper simulation of lower-scale meteorological effects. Downscaling techniques allow this problem to be overcome by adapting the model outcomes to local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international settings, leading to excellent results for future climate scenarios. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, mainly grape, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, detecting areas highly sensitive to negative impacts produced by any modification of climate in the different regions, mainly those under a protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
Ajorlo, Majid; Abdullah, Ramdzani B; Yusoff, Mohd Kamil; Halim, Ridzwan Abd; Hanif, Ahmad Husni Mohd; Willms, Walter D; Ebrahimian, Mahboubeh
2013-10-01
This study investigates the applicability of multivariate statistical techniques including cluster analysis (CA), discriminant analysis (DA), and factor analysis (FA) for the assessment of seasonal variations in the surface water quality of tropical pastures. The study was carried out in the TPU catchment, Kuala Lumpur, Malaysia. The dataset consisted of 1-year monitoring of 14 parameters at six sampling sites. The CA yielded two groups of similarity between the sampling sites, i.e., less polluted (LP) and moderately polluted (MP) at temporal scale. Fecal coliform (FC), NO3, DO, and pH were significantly related to the stream grouping in the dry season, whereas NH3, BOD, Escherichia coli, and FC were significantly related to the stream grouping in the rainy season. The best predictors for distinguishing clusters in temporal scale were FC, NH3, and E. coli, respectively. FC, E. coli, and BOD with strong positive loadings were introduced as the first varifactors in the dry season which indicates the biological source of variability. EC with a strong positive loading and DO with a strong negative loading were introduced as the first varifactors in the rainy season, which represents the physiochemical source of variability. Multivariate statistical techniques were effective analytical techniques for classification and processing of large datasets of water quality and the identification of major sources of water pollution in tropical pastures.
López, S.; Dhanoa, M.S.; Dijkstra, J.; Bannink, A.; Kebreab, E.; France, J.
2007-01-01
The in vitro gas production technique is used widely in animal nutrition for feed evaluation and to study the kinetics of microbial fermentation processes in the digestive tract. This technique is based on the assumption that gas produced in batch cultures inoculated with mixed microorganisms from r
Directory of Open Access Journals (Sweden)
Land Walker H
2011-01-01
Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
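The odds-ratio interpretation referred to above comes from exponentiating logistic regression coefficients. The sketch below fits plain LR by gradient descent on simulated data; it illustrates the epidemiologic quantity being compared, not the kernel/perceptron SL method of the paper, and the data-generating parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Simulated case-control-style data: one exposure with true log-odds slope 0.7,
# so the true odds ratio per unit exposure is exp(0.7) ~ 2.0.
x = rng.normal(size=n)
logit = -0.5 + 0.7 * x
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # binary outcome

# Plain logistic regression fitted by gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - p) / n            # average score-function step

odds_ratio = np.exp(beta[1])   # multiplicative change in odds per unit exposure
print(f"estimated OR per unit exposure: {odds_ratio:.2f}")
```

The paper's contribution is recovering the same kind of odds ratio from a nonlinear kernel classifier, where no single coefficient exists to exponentiate.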
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
Energy Technology Data Exchange (ETDEWEB)
Binh T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
Hutton, Brian; Wolfe, Dianna; Moher, David; Shamseer, Larissa
2017-05-01
Research waste has received considerable attention from the biomedical community. One noteworthy contributor is incomplete reporting in research publications. When detailing statistical methods and results, ensuring that analytic methods and findings are completely documented improves transparency. For publications describing randomised trials and systematic reviews, guidelines have been developed to facilitate complete reporting. This overview summarises aspects of statistical reporting in trials and systematic reviews of health interventions. A narrative approach was taken to summarise features regarding statistical methods and findings from reporting guidelines for trials and reviews. We aim to enhance familiarity, among statisticians and their collaborators, with the statistical details that should be reported in biomedical research. We summarise statistical reporting considerations for trials and systematic reviews from guidance documents including the Consolidated Standards of Reporting Trials (CONSORT) Statement for reporting of trials, the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) Statement for trial protocols, the Statistical Analyses and Methods in the Published Literature (SAMPL) Guidelines for statistical reporting principles, the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement for systematic reviews, and PRISMA for Protocols (PRISMA-P). Considerations regarding sharing of study data and statistical code are also addressed. Reporting guidelines provide researchers with minimum criteria for reporting. If followed, they can enhance research transparency and contribute to improved quality of biomedical publications. Authors should employ these tools in planning and reporting their research. Published by the BMJ Publishing Group Limited.
Controlling intrinsic alignments in weak lensing statistics: The nulling and boosting techniques
Joachimi, B
2010-01-01
The intrinsic alignment of galaxies constitutes the major astrophysical source of systematic errors in surveys of weak gravitational lensing by the large-scale structure. We discuss the principles, summarise the implementation, and highlight the performance of two model-independent methods that control intrinsic alignment signals in weak lensing data: the nulling technique, which eliminates intrinsic alignments to ensure unbiased constraints on cosmology, and the boosting technique, which extracts intrinsic alignments and hence allows one to further study this contribution. Making use only of the characteristic redshift dependence of the signals, both approaches are robust, but they reduce the statistical power due to the similar redshift scaling of intrinsic alignment and lensing signals.
Performance of Statistical Temporal Downscaling Techniques of Wind Speed Data Over Aegean Sea
Gokhan Guler, Hasan; Baykal, Cuneyt; Ozyurt, Gulizar; Kisacik, Dogan
2016-04-01
Wind speed data is a key input for many meteorological and engineering applications. Many institutions provide wind speed data with temporal resolutions ranging from one hour to twenty-four hours. Higher temporal resolution is generally required for some applications, such as reliable wave hindcasting studies. One solution to generate wind data at high sampling frequencies is to use statistical downscaling techniques to interpolate values at the finer sampling intervals from the available data. In this study, the major aim is to assess the temporal downscaling performance of nine statistical interpolation techniques by quantifying the inherent uncertainty due to the selection of different techniques. For this purpose, hourly 10-m wind speed data taken from 227 data points over the Aegean Sea between 1979 and 2010, having a spatial resolution of approximately 0.3 degrees, are analyzed from the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis database. Additionally, hourly 10-m wind speed data from two in-situ measurement stations between June 2014 and June 2015 are considered to understand the effect of dataset properties on the uncertainty generated by the interpolation technique. The nine statistical interpolation techniques selected are w0 (left constant) interpolation, w6 (right constant) interpolation, averaging step function interpolation, linear interpolation, 1D Fast Fourier Transform interpolation, 2nd and 3rd degree Lagrange polynomial interpolation, cubic spline interpolation, and piecewise cubic Hermite interpolating polynomials. The original data is downsampled to 6 hours (i.e. wind speeds at the 0th, 6th, 12th and 18th hours of each day are selected), the 6-hourly data is then temporally downscaled to hourly data (i.e. the wind speeds at each hour between the intervals are computed) using the nine interpolation techniques, and finally the original data is compared with the temporally downscaled data. A penalty point system based on
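The downscaling experiment described above (subsample to 6-hourly, interpolate back to hourly, score against the original) can be sketched with one of the nine techniques, linear interpolation. The wind series below is synthetic, not NCEP data:

```python
import numpy as np

rng = np.random.default_rng(0)
t_hourly = np.arange(240.0)                       # 10 days of hourly time steps
wind = 8 + 2 * np.sin(2 * np.pi * t_hourly / 24) + 0.3 * rng.standard_normal(240)

# Down-sample to 6-hourly (keep hours 0, 6, 12 and 18 of each day).
t_6h, wind_6h = t_hourly[::6], wind[::6]

# Temporal downscaling: interpolate the 6-hourly samples back to hourly.
wind_linear = np.interp(t_hourly, t_6h, wind_6h)

# Score the reconstruction against the original hourly series.
rmse = float(np.sqrt(np.mean((wind_linear - wind) ** 2)))
print(f"linear interpolation RMSE: {rmse:.3f} m/s")
```

The same loop can be repeated with a cubic spline or Hermite interpolant to compare techniques, which is the essence of the uncertainty quantification described in the abstract.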
Institute of Scientific and Technical Information of China (English)
郑力燕; 于宏兵; 王启山
2016-01-01
Multivariate statistical techniques, such as cluster analysis (CA), discriminant analysis (DA), principal component analysis (PCA) and factor analysis (FA), were applied to evaluate and interpret the surface water quality data sets of the Second Songhua River (SSHR) basin in China, obtained during two years (2012−2013) of monitoring of 10 physicochemical parameters at 15 different sites. The results showed that most physicochemical parameters varied significantly among the sampling sites. Three significant groups of sampling sites, highly polluted (HP), moderately polluted (MP) and less polluted (LP), were obtained through hierarchical agglomerative CA on the basis of similarity of water quality characteristics. DA identified pH, F, DO, NH3-N, COD and VPhs as the most important parameters contributing to spatial variations of surface water quality. However, DA did not give a considerable data reduction (40% reduction). PCA/FA resulted in three, three and four latent factors explaining 70%, 62% and 71% of the total variance in the water quality data sets of the HP, MP and LP regions, respectively. FA revealed that the SSHR water chemistry was strongly affected by anthropogenic activities (point sources: industrial effluents and wastewater treatment plants; non-point sources: domestic sewage, livestock operations and agricultural activities) and natural processes (seasonal effects and natural inputs). PCA/FA in the whole basin showed the best results for data reduction because it used only two parameters (about 80% reduction) as the most important parameters to explain 72% of the data variation. Thus, this work illustrated the utility of multivariate statistical techniques for the analysis and interpretation of data sets and, in water quality assessment, for the identification of pollution sources/factors and the understanding of spatial variations in water quality for effective stream water quality management.
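A minimal sketch of the PCA step used above, via singular value decomposition of the standardized data matrix; the site-by-parameter matrix here is random illustrative data, not the SSHR measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical site-by-parameter matrix: 15 sampling sites x 10 parameters.
X = rng.standard_normal((15, 10))
X[:, 1] = X[:, 0] + 0.1 * rng.standard_normal(15)   # two correlated parameters

# Standardize each parameter, then PCA via singular value decomposition.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)                     # variance explained per PC
print("variance explained by the first 3 PCs:", float(explained[:3].sum()))
```

The rows of `Vt` are the component loadings; inspecting which parameters load heavily on the leading components is how PCA/FA identifies the "most important parameters" referred to in the abstract.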
Now you see it, now you don't: statistical and methodological considerations in fMRI.
Loring, D W.; Meador, K J.; Allison, J D.; Pillai, J J.; Lavin, T; Lee, G P.; Balan, A; Dave, V
2002-12-01
We illustrate the effects of statistical threshold, spatial clustering, voxel size, and two approaches to multiple comparison correction on fMRI results. We first analyzed fMRI images obtained from a single subject during a noun-verb matching task. Data were analyzed with Statistical Parametric Mapping (SPM) using two different voxel sizes, and results were displayed at three different levels of statistical significance. At each statistical threshold, results were first uncorrected for multiple comparisons and spatial extent and then presented using a spatial extent cluster of 20 voxels. We then statistically controlled the Type I error rate associated with multiple comparisons by using the false discovery rate and by the random field adjustment for false-positive rate used by SPM. We also examined group results from language and graphesthesia paradigms at three levels of statistical significance. In all circumstances, apparent random activations decreased as more conservative statistical approaches were employed, but activation in areas considered to be functionally significant was also reduced. These issues are important in the choice of analytic approach and interpretation of fMRI results, with clear implications for the surgical management of individual patients when fMRI results are used to delineate specific areas of eloquent cortex.
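The false discovery rate control mentioned above is commonly implemented with the Benjamini-Hochberg step-up procedure; a minimal sketch (the p-values are illustrative, not from the fMRI analysis):

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Boolean mask of hypotheses rejected at false discovery rate q."""
    p = np.asarray(pvals, dtype=float)
    m = p.size
    order = np.argsort(p)
    passing = p[order] <= q * np.arange(1, m + 1) / m
    reject = np.zeros(m, dtype=bool)
    if passing.any():
        k = np.nonzero(passing)[0].max()       # largest rank satisfying the bound
        reject[order[: k + 1]] = True          # reject all smaller p-values too
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.500, 0.800]
rejected = benjamini_hochberg(pvals, q=0.05)
print(rejected)
```

At voxel scale the same idea applies to tens of thousands of p-values at once, trading the strict family-wise control of random field theory for a bound on the expected proportion of false positives.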
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems
Timmis, Jon; Qwarnstrom, Eva E.
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), which acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414
MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M
2016-01-01
The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), which acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.
McLoughlin, M. Padraig M. M.
2008-01-01
The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved, and hence this paper proposes a programme of use of a modified Moore method in a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…
MBB strategy consideration: From microsystem technique to the space transportation system SAENGER 2
Vogels, Hanns Arnt
Microsystems technology, as an example of current technology developments, and the future space transportation system SAENGER 2 are treated. Microelectronics, micromechanics, and microoptics are presented, and the characteristics of the materials used in microsystems technology are discussed. Economic and nonpolluting solar energy systems for future space systems are considered, along with the status and future of hypersonic transportation systems.
Natrella, Mary Gibbons
2005-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Coppieters, Michel W; Butler, David S
2008-06-01
Despite the high prevalence of carpal tunnel syndrome and cubital tunnel syndrome, the quality of clinical practice guidelines is poor and non-invasive treatment modalities are often poorly documented. The aim of this cadaveric biomechanical study was to measure longitudinal excursion and strain in the median and ulnar nerve at the wrist and proximal to the elbow during different types of nerve gliding exercises. The results confirmed the clinical assumption that 'sliding techniques' result in a substantially larger excursion of the nerve than 'tensioning techniques' (e.g., median nerve at the wrist: 12.6 versus 6.1 mm; ulnar nerve at the elbow: 8.3 versus 3.8 mm), and that this larger excursion is associated with a much smaller change in strain (e.g., median nerve at the wrist: 0.8% (sliding) versus 6.8% (tensioning)). The findings demonstrate that different types of nerve gliding exercises have largely different mechanical effects on the peripheral nervous system. Hence, different types of techniques should not be regarded as part of a homogeneous group of exercises, as they may influence neuropathological processes differently. The findings of this study and a discussion of possible beneficial effects of nerve gliding exercises on neuropathological processes may assist the clinician in selecting more appropriate nerve gliding exercises in the conservative and post-operative management of common neuropathies.
A critical review of wetland greenhouse gas measurement techniques and scaling considerations
Allen, S. T.; Krauss, K. W.; Stagg, C. L.; Neubauer, S. C.
2016-12-01
The role of wetlands in terrestrial greenhouse gas fluxes is disproportionately large compared to the relatively small terrestrial area they encompass. There is an established and growing interest in accurately measuring these fluxes, and extrapolating inferences to larger spatial scales. However, a lack of uniformity in measurement approaches impedes progress because it is a challenge to synthesize data, parameterize models, and develop generalizable concepts from disparate data. Furthermore, pairing different methods can result in double-counting and other aggregation errors. Our objective is to review gas flux measurement techniques and synthesize concepts, factors, and constraints associated with measuring and scaling greenhouse gas fluxes. This work will contribute to a conceptual framework designed to aid in the collection and use of gas flux data obtained by different methods. This review focuses specifically on wetlands which have both distinct transport processes and a unique biogeochemical environment, causing gas fluxes that are not prominent in other terrestrial or aquatic systems. We review techniques and implications of measuring at different steps along the soil-plant-atmosphere continuum; an emphasis of this work is identifying pathways and transit times for different fluxes in different wetland hydrogeomorphic settings. Measurement location along the path from source to atmosphere connotes the spatial and temporal scales at which a technique is applied, the spatiotemporal representation, and the factors that constrain extrapolation.
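One concrete instance of the techniques reviewed above is the static-chamber method, where the flux is derived from the slope of headspace concentration versus time inside a closed chamber. The concentrations, chamber dimensions and conversion constants below are assumed for illustration:

```python
import numpy as np

# Chamber headspace CH4 concentrations (ppm) sampled over 30 minutes.
t_min = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
ch4_ppm = np.array([1.9, 2.3, 2.6, 3.0, 3.3, 3.7, 4.0])

# The flux is proportional to the linear slope of concentration vs. time.
slope, intercept = np.polyfit(t_min, ch4_ppm, 1)   # ppm per minute

chamber_volume = 0.050    # m^3 (assumed)
chamber_area = 0.25       # m^2 (assumed)
molar_volume = 0.0245     # m^3/mol, ideal gas near 25 degrees C
molar_mass_ch4 = 16.04    # g/mol

# ppm/min -> mol/min in the chamber -> g CH4 per m^2 per day.
flux = (slope * 1e-6 * chamber_volume / molar_volume
        / chamber_area * molar_mass_ch4 * 60 * 24)
print(f"slope = {slope:.3f} ppm/min, flux = {flux:.4f} g CH4 m^-2 d^-1")
```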
Marzahn, Philip; Rieke-Zapp, Dirk; Ludwig, Ralf
2010-05-01
Micro-scale soil surface roughness is a crucial parameter in many environmental applications. Recent soil erosion studies have shown the impact of micro-topography on soil erosion rates as well as on overland flow generation due to soil crusting effects. Besides the above, it is widely recognized that the backscattered signal in SAR remote sensing is strongly influenced by soil surface roughness and by regular higher-order tillage patterns. However, there is ambiguity in the appropriate measurement technique and scale for roughness studies and SAR backscatter model parametrization. While different roughness indices depend on their measurement length, no satisfying roughness parametrization and measurement technique has been found yet, introducing large uncertainty into the interpretation of the radar backscatter. In the presented study, we computed high-resolution digital elevation models (DEMs) using a consumer-grade digital camera in the frame of photogrammetric imaging techniques to represent soil micro-topography for different soil surfaces (ploughed, harrowed, seedbed and crusted). The retrieved DEMs showed sufficient accuracy, with an RMSE of 1.64 mm compared to highly accurate reference points. For roughness characterization, we calculated different roughness indices (RMS height (s), autocorrelation length (l), tortuosity index (TB)). In an extensive statistical investigation we show the behaviour of the roughness indices for different acquisition sizes. Compared to results from profile measurements taken from the literature and profiles generated out of the dataset, results indicate that, by using a three-dimensional measuring device, the calculated roughness indices are more robust against outliers and even saturate faster with increasing acquisition size. Depending on the roughness condition, the calculated values for the RMS height saturate for ploughed fields at 2.3 m, for harrowed fields at 2.0 m and for crusted fields at 1.2 m. Results also
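The two roughness indices discussed above, RMS height s and autocorrelation length l, can be computed from a height profile as follows. The profile here is synthetic, and the 1/e criterion for l is one common convention:

```python
import numpy as np

rng = np.random.default_rng(2)
dx = 5.0                                          # sampling interval in mm (assumed)
z = 0.1 * rng.standard_normal(400).cumsum()       # synthetic height profile

# RMS height s: standard deviation of the de-meaned heights.
z_d = z - z.mean()
s = float(np.sqrt(np.mean(z_d**2)))

# Autocorrelation length l: first lag at which the normalized
# autocorrelation function drops below 1/e (one common convention).
acf = np.correlate(z_d, z_d, mode="full")[z_d.size - 1:]
acf = acf / acf[0]
l = float(np.argmax(acf < 1.0 / np.e)) * dx
print(f"s = {s:.3f}, l = {l:.0f} mm")
```

Computing both indices over windows of increasing size is how the saturation behaviour described in the abstract is assessed.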
Directory of Open Access Journals (Sweden)
Lyons MD
2015-07-01
Matthew D Lyons, Culley C Carson III, Robert M Coward Department of Urology, University of North Carolina, Chapel Hill, NC, USA Abstract: Placement of an inflatable penile prosthesis (IPP is the mainstay of surgical treatment for patients with Peyronie's disease (PD and concomitant medication-refractory erectile dysfunction. Special considerations and adjunctive surgical techniques during the IPP procedure are often required for patients with PD to improve residual penile curvature, as well as postoperative penile length. The surgical outcomes and various adjunctive techniques are not significantly different from one another, and selection of the appropriate technique must be tailored to patient-specific factors including the extent of the deformity, the degree of penile shortening, and preoperative patient expectations. The aims of this review were to assess the current literature on published outcomes and surgical techniques involving IPP placement in the treatment of PD. Patient satisfaction and preferences are reported, along with the description and patient selection for surgical techniques that include manual penile modeling, management of refractory curvature with concurrent plication, and correction of severe residual curvature and penile shortening with tunica release and plaque incision and grafting. A thorough description of the available techniques and their associated outcomes may help guide surgeons to the most appropriate choice for their patients. Keywords: Peyronie's disease, outcomes, inflatable penile prosthesis, patient expectation, patient satisfaction
Manzanas, R., Sr.; Brands, S.; San Martin, D., Sr.; Gutiérrez, J. M., Sr.
2014-12-01
This work shows that local-scale climate projections obtained by means of statistical downscaling are sensitive to the choice of reanalysis used for calibration. To this aim, a Generalized Linear Model (GLM) approach is applied to downscale daily precipitation in the Philippines. First, the GLMs are trained and tested (under a cross-validation scheme) separately for two distinct reanalyses (ERA-Interim and JRA-25) for the period 1981-2000. When the observed and downscaled time-series are compared, the attained performance is found to be sensitive to the reanalysis considered if climate-change-signal-bearing variables (temperature and/or specific humidity) are included in the predictor field. Moreover, performance differences are shown to correspond to the disagreement found between the raw predictors from the two reanalyses. Second, the regression coefficients calibrated either with ERA-Interim or JRA-25 are subsequently applied to the output of a Global Climate Model (MPI-ECHAM5) in order to assess the sensitivity of local-scale climate change projections (up to 2100) to reanalysis choice. In this case, the differences detected in present climate conditions are considerably amplified, leading to "delta-change" estimates differing by up to 35% (on average for the entire country) depending on the reanalysis used for calibration. Therefore, reanalysis choice is shown to contribute importantly to the uncertainty of local-scale climate change projections and, consequently, should be treated with the same care as other well-known sources of uncertainty, e.g., the choice of the GCM and/or downscaling method. Implications of the results for the entire tropics, as well as for the Model Output Statistics downscaling approach, are also briefly discussed.
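A minimal sketch of the "delta-change" idea referenced above: the observed baseline is scaled by the GCM-projected relative change. All numbers are illustrative, not values from the study:

```python
# All numbers are illustrative, not values from the study.
obs_baseline = 7.4    # mm/day, observed present-day mean precipitation
gcm_baseline = 6.1    # mm/day, GCM present-day mean
gcm_future = 5.2      # mm/day, GCM end-of-century mean

# Relative change ("delta") projected by the GCM, applied to observations.
delta = gcm_future / gcm_baseline
projected = obs_baseline * delta
print(f"delta = {delta:.3f}, projected mean = {projected:.2f} mm/day")
```

Because `delta` inherits whatever predictors and calibration data went into the downscaling, two reanalyses can yield two different deltas for the same GCM, which is the sensitivity the study quantifies.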
Brooner, W. G.; Nichols, D. A.
1972-01-01
Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management program applications. The scheme utilizes remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.
Goetz, J. N.; Brenning, A.; Petschko, H.; Leopold, P.
2015-08-01
Statistical and, more recently, machine learning prediction methods have been gaining popularity in the field of landslide susceptibility modeling. In particular, these data-driven approaches show promise when tackling the challenge of mapping landslide-prone areas for large regions, which may not have sufficient geotechnical data for physically-based methods. Currently, there is no best method for empirical susceptibility modeling. Therefore, this study presents a comparison of traditional statistical and novel machine learning models applied to regional-scale landslide susceptibility modeling. These methods were evaluated by spatial k-fold cross-validation estimation of the predictive performance, by assessment of variable importance for gaining insights into model behavior, and by the appearance of the prediction (i.e. susceptibility) map. The modeling techniques applied were logistic regression (GLM), generalized additive models (GAM), weights of evidence (WOE), the support vector machine (SVM), random forest classification (RF), and bootstrap aggregated classification trees (bundling) with penalized discriminant analysis (BPLDA). These modeling methods were tested for three areas in the province of Lower Austria, Austria, characterized by different geological and morphological settings. Random forest and bundling classification techniques had the overall best predictive performances. However, the performances of all modeling techniques were for the most part not significantly different from each other; depending on the area of interest, the overall median estimated area under the receiver operating characteristic curve (AUROC) differences ranged from 2.9 to 8.9 percentage points, and the overall median estimated true positive rate (TPR) differences, measured at a 10% false positive rate (FPR), ranged from 11 to 15 percentage points. The relative importance of each predictor was generally different between the modeling methods. However, slope angle, surface roughness and plan
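Predictive performance above is summarised by the area under the ROC curve; AUROC can be computed directly from ranks via the Mann-Whitney identity (scores and labels below are illustrative):

```python
import numpy as np

def auroc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.

    Assumes no tied scores; ties would need midranks.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    ranks = scores.argsort().argsort() + 1          # 1-based ranks
    n_pos = int(labels.sum())
    n_neg = labels.size - n_pos
    return float((ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))

scores = [0.90, 0.80, 0.70, 0.60, 0.55, 0.50, 0.40, 0.30]   # susceptibility scores
labels = [1, 1, 0, 1, 0, 0, 1, 0]                           # 1 = observed landslide
print(f"AUROC = {auroc(scores, labels):.3f}")
```

In the spatial k-fold scheme described above, this statistic would be computed once per held-out spatial fold, and the fold-wise values compared across models.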
Towards a Statistical Methodology to Evaluate Program Speedups and their Optimisation Techniques
Touati, Sid
2009-01-01
The community of program optimisation and analysis, code performance evaluation, parallelisation and optimising compilation has published hundreds of research and engineering articles in major conferences and journals over many decades. These articles study efficient algorithms, strategies and techniques to accelerate program execution times, or to optimise other performance metrics (MIPS, code size, energy/power, MFLOPS, etc.). Many speedups are published, but nobody is able to reproduce them exactly. The non-reproducibility of our research results is a dark point of the art, and we cannot be qualified as computer scientists if we do not provide a rigorous experimental methodology. This article provides a first effort towards a correct statistical protocol for analysing and measuring speedups. As we will see, some common mistakes are made by the community in published articles, explaining part of the non-reproducibility of the results. Our current article is not sufficient on its own to deliver a comp...
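In the spirit of the statistical protocol proposed above, a speedup should be reported with a measure of its uncertainty rather than as a single ratio. A sketch using the ratio of medians with a percentile-bootstrap confidence interval (the timings are simulated, and this is one possible protocol, not the article's exact one):

```python
import random
import statistics

random.seed(3)
# Simulated repeated wall-clock timings (seconds): 30 runs before and
# after a hypothetical optimisation. Single runs are too noisy to compare.
base = [random.gauss(10.0, 0.4) for _ in range(30)]
opt = [random.gauss(8.0, 0.4) for _ in range(30)]

speedup = statistics.median(base) / statistics.median(opt)

# Percentile-bootstrap 95% confidence interval for the speedup of medians.
boots = sorted(
    statistics.median(random.choices(base, k=len(base)))
    / statistics.median(random.choices(opt, k=len(opt)))
    for _ in range(2000)
)
low, high = boots[int(0.025 * len(boots))], boots[int(0.975 * len(boots))]
print(f"speedup = {speedup:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```

Reporting the interval alongside the point estimate is what makes a claimed speedup falsifiable by an independent re-run.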
Adams, Jennifer R.; Quartiroli, Alessandro
2010-01-01
The authors note that although the "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; American Psychiatric Association, 2000) provides a useful tool for assessment and treatment planning, there has been debate over the lack of attention to issues of diversity. The elements of this debate are presented, along with…
New Statistical Techniques in the Measurement of the inclusive Top Pair Production Cross Section
Franc, Jiří; Štěpánek, Michal; Kůs, Václav
2014-01-01
We present several different types of multivariate statistical techniques used in the measurement of the inclusive top pair production cross section in $p \bar{p}$-collisions at $\sqrt{s} = 1.96 \text{TeV}$, employing the full Run II data ($9.7\textrm{fb}^{-1}$) collected with the D0 detector at the Fermilab Tevatron Collider. We consider the final state of top quark pair decays containing one electron or muon and at least two jets. We perform various statistical homogeneity tests, such as the Anderson-Darling, Kolmogorov-Smirnov, and $\varphi$-divergence tests, to determine which variables have good data-MC agreement as well as good separation power. We adjusted all tests to use weighted empirical distribution functions. Further, we separate the $t\bar{t}$ signal from the background by the application of Generalized Linear Models, Gaussian Mixture Models, and Neural Networks with Switching Units, and confront them with familiar methods from the ROOT TMVA package such as Boosted Decision Trees and Multi-layer Per...
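The two-sample Kolmogorov-Smirnov statistic used above compares empirical distribution functions; a sketch supporting optional event weights, as needed for weighted MC samples (the data are synthetic):

```python
import numpy as np

def ks_statistic(x, y, wx=None, wy=None):
    """Two-sample Kolmogorov-Smirnov statistic for optionally weighted samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    wx = np.ones_like(x) if wx is None else np.asarray(wx, float)
    wy = np.ones_like(y) if wy is None else np.asarray(wy, float)
    # Evaluate both weighted ECDFs on the pooled grid and take the sup distance.
    grid = np.sort(np.concatenate([x, y]))
    cdf_x = np.array([wx[x <= g].sum() for g in grid]) / wx.sum()
    cdf_y = np.array([wy[y <= g].sum() for g in grid]) / wy.sum()
    return float(np.abs(cdf_x - cdf_y).max())

rng = np.random.default_rng(4)
a = rng.normal(0.0, 1.0, 500)          # e.g. a data distribution
b = rng.normal(0.5, 1.0, 500)          # e.g. a shifted MC distribution
print(f"D = {ks_statistic(a, b):.3f}")
```

With unit weights this reduces to the textbook statistic; passing per-event MC weights gives the weighted-ECDF variant the abstract refers to.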
Design of U-Geometry Parameters Using Statistical Analysis Techniques in the U-Bending Process
Directory of Open Access Journals (Sweden)
Wiriyakorn Phanitwong
2017-06-01
The various U-geometry parameters in the U-bending process result in processing difficulties in the control of the spring-back characteristic. In this study, the effects of U-geometry parameters, including channel width, bend angle, material thickness, tool radius, as well as workpiece length, and their design, were investigated using a combination of finite element method (FEM) simulation and statistical analysis techniques. Based on stress distribution analyses, the FEM simulation results clearly identified the different bending mechanisms and effects of U-geometry parameters on the spring-back characteristic in the U-bending process, with and without pressure pads. The statistical analyses elucidated that the bend angle and channel width have a major influence in cases with and without pressure pads, respectively. The experiments were carried out to validate the FEM simulation results. Additionally, the FEM simulation results were in agreement with the experimental results, in terms of the bending forces and bending angles.
Carbon nanotubes for supercapacitors: Consideration of cost and chemical vapor deposition techniques
Institute of Scientific and Technical Information of China (English)
Chao Zheng; Weizhong Qian; Chaojie Cui; Guanghui Xu; Mengqiang Zhao; Guili Tian; Fei Wei
2012-01-01
In this topic, we first discuss the requirements and performance of supercapacitors using carbon nanotubes (CNTs) as the electrode, including specific surface area, purity and cost. We then review techniques for the relatively large-scale preparation of single-walled CNTs (SWNTs) by the chemical vapor deposition method, discussing catalysis of the decomposition of methane and other carbon sources, reactor types, and process control strategies. Special focus is placed on how to increase the yield, selectivity and purity of SWNTs and how to inhibit the formation of impurities, including amorphous carbon, multiwalled CNTs and carbon-encapsulated metal particles, since these impurities seriously degrade the performance of SWNTs in supercapacitors. We hope this review helps to further decrease product cost and to enable the commercial use of SWNTs in supercapacitors.
Statistically-constrained shallow text marking: techniques, evaluation paradigm and results
Murphy, Brian; Vogel, Carl
2007-02-01
We present three natural language marking strategies based on fast and reliable shallow parsing techniques, and on widely available lexical resources: lexical substitution, adjective conjunction swaps, and relativiser switching. We test these techniques on a random sample of the British National Corpus. Individual candidate marks are checked for goodness of structural and semantic fit, using both lexical resources, and the web as a corpus. A representative sample of marks is given to 25 human judges to evaluate for acceptability and preservation of meaning. This establishes a correlation between corpus based felicity measures and perceived quality, and makes qualified predictions. Grammatical acceptability correlates with our automatic measure strongly (Pearson's r = 0.795, p = 0.001), allowing us to account for about two thirds of variability in human judgements. A moderate but statistically insignificant (Pearson's r = 0.422, p = 0.356) correlation is found with judgements of meaning preservation, indicating that the contextual window of five content words used for our automatic measure may need to be extended.
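Pearson's r, the correlation used above to relate the automatic felicity measure to human judgements, can be computed as follows; the paired scores are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired scores: automatic felicity measure vs. mean human rating.
auto = [0.91, 0.85, 0.78, 0.70, 0.66, 0.52, 0.45, 0.30]
human = [4.8, 4.5, 4.6, 3.9, 3.7, 3.1, 2.8, 2.2]
print(f"r = {pearson_r(auto, human):.3f}")
```

Squaring r gives the fraction of variance in human judgements accounted for by the automatic measure, which is the "about two thirds of variability" framing used in the abstract.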
Koltick, David; Wang, Haoyu; Liu, Shih-Chieh; Heim, Jordan; Nistor, Jonathan
2016-03-01
Typical nuclear decay constants are measured at an accuracy level of 10^-2. Numerous applications, including tests of unconventional theories, dating of materials, and long-term inventory evolution, require decay constant accuracies at a level of 10^-4 to 10^-5. The statistical and systematic errors associated with precision measurements of decays using the counting technique are presented. Precision requires high count rates, which introduce time-dependent dead-time and pile-up corrections. An approach to overcome these issues by continuously recording the detector current is presented. Other systematic corrections are also presented, including the time-dependent dead time due to background radiation, control of target motion and radiation flight-path variation due to environmental conditions, and time-dependent effects caused by scattered events. The incorporation of blind experimental techniques can help make measurements independent of past results. A spectrometer design and data analysis that can accomplish these goals are reviewed. The author would like to thank TechSource, Inc. and Advanced Physics Technologies, LLC. for their support in this work.
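A minimal sketch of one correction mentioned above: the non-paralyzable dead-time model, in which the true rate n relates to the measured rate m and dead time tau by n = m / (1 - m*tau). The rates and dead time below are assumed, not values from the work:

```python
def true_rate(measured_rate, dead_time):
    """Non-paralyzable dead-time correction (rates in counts/s, dead time in s)."""
    loss = measured_rate * dead_time      # fraction of time the detector is blind
    if loss >= 1.0:
        raise ValueError("detector saturated: m * tau >= 1")
    return measured_rate / (1.0 - loss)

m = 9.0e4       # measured counts/s (hypothetical)
tau = 2.0e-6    # 2 microsecond dead time (hypothetical)
n = true_rate(m, tau)
print(f"true rate = {n:.0f} counts/s, correction = {100 * (n / m - 1):.1f}%")
```

At the high count rates that precision demands, the correction itself becomes large and time-dependent, which motivates the current-recording approach described in the abstract.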
Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell
2017-01-01
In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.
Institute of Scientific and Technical Information of China (English)
Pan Jiayi; Chin-Pang Jack Cheng; Gloria T. Lau; Kincho H. Law
2008-01-01
The objective of this paper is to introduce three semi-automated approaches for ontology mapping using relatedness analysis techniques. In the architecture, engineering, and construction (AEC) industry, there exist a number of ontological standards to describe the semantics of building models. Although the standards share similar scopes of interest, the task of comparing and mapping concepts among standards is challenging due to their differences in terminologies and perspectives. Ontology mapping is therefore necessary to achieve information interoperability, which allows two or more information sources to exchange data and to re-use the data for further purposes. The attribute-based approach, corpus-based approach, and name-based approach presented in this paper adopt the statistical relatedness analysis techniques to discover related concepts from heterogeneous ontologies. A pilot study is conducted on IFC and CIS/2 ontologies to evaluate the approaches. Preliminary results show that the attribute-based approach outperforms the other two approaches in terms of precision and F-measure.
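Approaches like these are typically scored against a gold-standard mapping using precision, recall and F-measure. A minimal sketch, with hypothetical IFC/CIS/2 concept names invented for illustration:

```python
def precision_recall_f1(true_pairs, predicted_pairs):
    """Precision, recall and F-measure for a gold set of concept mappings
    versus the mappings proposed by an automated approach."""
    true_pairs, predicted_pairs = set(true_pairs), set(predicted_pairs)
    tp = len(true_pairs & predicted_pairs)  # correctly proposed mappings
    precision = tp / len(predicted_pairs) if predicted_pairs else 0.0
    recall = tp / len(true_pairs) if true_pairs else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical gold-standard and automatically discovered mappings:
gold = {("ifc:Beam", "cis2:beam"), ("ifc:Column", "cis2:column"), ("ifc:Slab", "cis2:slab")}
found = {("ifc:Beam", "cis2:beam"), ("ifc:Column", "cis2:girder")}
p, r, f = precision_recall_f1(gold, found)
print(round(p, 2), round(r, 2), round(f, 2))  # 0.5 0.33 0.4
```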
Energy Technology Data Exchange (ETDEWEB)
Young, Carolyn; Taylor, Andrew M. [UCL, Institute of Child Health, Cardiorespiratory Unit, London (United Kingdom); Great Ormond Street Hospital for Children, Cardiorespiratory Unit, London (United Kingdom); Owens, Catherine M. [UCL, Institute of Child Health, Cardiorespiratory Unit, London (United Kingdom)
2011-03-15
The significant challenges involved in imaging the heart in small children (<15 kg) have been addressed, and partially resolved, by the improvements in temporal and spatial resolution brought by new multi-detector CT technology. This has enabled both retrospective and prospective ECG-gated imaging in children, even at high heart rates (over 100 bpm), without the need for beta blockers. Recent studies have highlighted that the radiation burden associated with cardiac CT can be reduced using prospective ECG-gating. Our experience shows that the resultant dose reduction can be optimised to a level equivalent to that of a non-gated study. This article reviews the different aspects of ECG-gating and the preferred technique for cardiac imaging in the young child (<15 kg). We summarize our evidence-based recommendations for readers, referencing recent articles and using our in-house data, protocols and dose measurements, and discussing the various methods available for dose calculations and their inherent biases. (orig.)
Energy Technology Data Exchange (ETDEWEB)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavioral equivalence classes in MPI jobs and highlights cases in which elements of a class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two
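The STAT-style grouping of ranks into behavior equivalence classes can be illustrated with a toy sketch; the ranks and stack traces below are invented for illustration, not STAT's actual data model:

```python
from collections import defaultdict

def behavior_classes(stack_traces):
    """Group MPI ranks into equivalence classes by identical stack traces,
    in the spirit of STAT; a class with very few members often flags
    divergent (possibly erroneous) behavior."""
    classes = defaultdict(list)
    for rank, trace in stack_traces.items():
        classes[tuple(trace)].append(rank)
    return dict(classes)

# Hypothetical per-rank stack traces; rank 3 is the odd one out.
traces = {
    0: ["main", "solve", "MPI_Allreduce"],
    1: ["main", "solve", "MPI_Allreduce"],
    2: ["main", "solve", "MPI_Allreduce"],
    3: ["main", "io_write"],
}
for trace, ranks in behavior_classes(traces).items():
    print(len(ranks), "->", "/".join(trace))
```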
Muhammad, Wazir; Lee, Sang Hoon
2013-01-01
Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs), and their advantages and shortcomings in calculating elastic scattering cross sections, can be found in the literature. However, the issues related to their implementation in Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method to draw the integrated values of squared RFFs/MFFs (i.e. A(Z, v_i²)) over squared momentum transfer (v_i² = v_1², ..., v_59²). In the current study, the roles and issues of RFFs/MFFs and the LIT in MC sampling for coherent scattering were analyzed. The results showed that the relative probability density curves sampled on the basis of MFFs are unable to reveal any extra scientific information, as both the RFFs and MFFs produced the same MC-sampled curves. Furthermore, no relationship was established between the multiple small peaks and irregular step shapes (i.e. statistical noise) in the PDFs and either the RFFs or the MFFs. In fact, the noise in the PDFs appeared due to the use of the LIT. The density of the noise depends upon the interval length between two consecutive points in the input data table of A(Z, v_i²) and has no scientific background. The probability density function curves became smoother as the interval lengths were decreased. In conclusion, these statistical noises can be efficiently removed by introducing more data points into the A(Z, v_i²) data tables.
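The link between table granularity and step-shaped noise can be illustrated by inverse-transform sampling from a tabulated cumulative distribution with linear interpolation. A hedged sketch with a hypothetical four-point table (the real tables use 59 points of A(Z, v_i²)):

```python
import bisect
import random

def sample_from_table(vs, cumulative, u):
    """Inverse-transform sampling from a tabulated, normalized cumulative
    distribution using linear interpolation between consecutive table points.
    A coarse table makes the implied density piecewise constant, which shows
    up as step-shaped 'statistical noise' in the sampled PDF."""
    j = bisect.bisect_left(cumulative, u)
    j = max(1, min(j, len(vs) - 1))
    c0, c1 = cumulative[j - 1], cumulative[j]
    t = (u - c0) / (c1 - c0)
    return vs[j - 1] + t * (vs[j] - vs[j - 1])

vs = [0.0, 1.0, 2.0, 4.0]   # hypothetical v_i^2 grid
cum = [0.0, 0.4, 0.7, 1.0]  # hypothetical normalized A(Z, v_i^2) values
random.seed(0)
samples = [sample_from_table(vs, cum, random.random()) for _ in range(5)]
print(all(0.0 <= s <= 4.0 for s in samples))  # True
```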
A statistical technique for defining rainfall forecast probabilities in southern Africa
Husak, G. J.; Magadzire, T.
2010-12-01
Probabilistic forecasts are currently produced by many climate centers and by just as many processes. These models use a number of inputs to generate the probability of rainfall falling in the lower/middle/upper tercile (frequently termed "below-normal"/"normal"/"above-normal") of the historical rainfall distribution. Generation of forecasts for a season may be the result of a dynamic climate model, a statistical model, a consensus of a panel of experts, or a combination of some of the afore-mentioned techniques, among others. This last method is the one most commonly accepted in southern Africa, resulting from the Southern Africa Regional Climate Outlook Forum (SARCOF). While it has been noted that there is a reasonable chance of polygons having a dominant tercile of 60% probability or more, this has seldom been the case. Indeed, over the last four years, the SARCOF process has not produced any polygons with such a likelihood. In fact, the terciles in all of the SARCOFs since 2007 have been some combination of 40%, 35% and 25%. Discussions with SARCOF scientists suggest that the SARCOF process is currently using the probabilistic format to define relatively qualitative, rank-ordered outcomes in the format "most likely", "second-most likely" and "least likely" terciles. While this rank-ordered classification has its merits, it limits the sort of downstream quantitative statistical analysis that could potentially be of assistance to various decision makers. In this study we build a simple statistical model to analyze the probabilistic outcomes for the coming rainfall season, and analyze their resulting probabilities. The prediction model takes a similar approach to that already used in the SARCOF process: namely, using available historic rainfall data and SST information to create a linear regression between rainfall and SSTs, define a likely rainfall outcome, and analyze the cross-validation errors over the most recent 30 years. The cross
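One simple way to turn a point forecast plus cross-validated regression errors into tercile probabilities is to count how often forecast-plus-historical-error would have fallen in each tercile of the record. This is a hedged sketch of the general idea, not the authors' model; all numbers are invented:

```python
import statistics

def tercile_probabilities(history, forecast, cv_errors):
    """Estimate below/normal/above-normal probabilities by perturbing a point
    forecast with cross-validated prediction errors and counting which
    historical tercile each perturbed outcome lands in."""
    lo, hi = statistics.quantiles(history, n=3)  # tercile boundaries
    outcomes = [forecast + e for e in cv_errors]
    n = len(outcomes)
    below = sum(x < lo for x in outcomes) / n
    above = sum(x > hi for x in outcomes) / n
    return below, 1.0 - below - above, above

history = [55, 60, 62, 70, 75, 80, 85, 90, 95, 100, 110, 120]  # invented rainfall record
cv_errors = [-20, -10, -5, 0, 5, 10, 20]                        # invented CV residuals
print(tercile_probabilities(history, forecast=95.0, cv_errors=cv_errors))
```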
Saugel, Bernd; Grothe, Oliver; Wagner, Julia Y
2015-08-01
When comparing 2 technologies for measuring hemodynamic parameters with regard to their ability to track changes, 2 graphical tools are omnipresent in the literature: the 4-quadrant plot and the polar plot recently proposed by Critchley et al. The polar plot is thought to be the more advanced statistical tool, but care should be taken when it comes to its interpretation. The polar plot excludes possibly important measurements from the data. The polar plot also transforms the data nonlinearly, which may prevent the data from being seen clearly. In this article, we compare the 4-quadrant and the polar plot in detail and thoroughly describe the advantages and limitations of each. We also discuss pitfalls of both methods to prepare the researcher for their sound use. Finally, we briefly revisit the Bland-Altman plot for use in this context.
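A common formulation of the polar transform uses the mean of the paired changes as the radius and the deviation of the change vector from the 45° line of identity as the angle; this is a hedged, illustrative sketch of that nonlinear transformation, not the canonical implementation:

```python
import math

def to_polar(delta_ref, delta_test):
    """Map a pair of measured changes (reference, test) to polar-plot
    coordinates: radius = mean change, angle = deviation (in degrees) of the
    change vector from the 45-degree line (0 deg = perfect trending
    agreement). One common formulation, stated here as an assumption."""
    radius = (delta_ref + delta_test) / 2.0
    angle = math.degrees(math.atan2(delta_test, delta_ref)) - 45.0
    return radius, angle

# Identical changes lie on the line of identity (angle ~ 0 degrees);
# opposite-direction changes land far from it:
for pair in [(1.0, 1.0), (1.0, -1.0)]:
    radius, angle = to_polar(*pair)
    print(round(radius, 3), round(angle, 3))
```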
Jain, S; Srinath, Ms; Narendra, C; Reddy, Sn; Sindhu, A
2010-10-01
The objective of this study was to evaluate, using a statistical optimization technique, the effect of formulation variables on the release properties, floating lag time, and hardness of floating tablets of ranitidine hydrochloride. The formulations were prepared based on a 3² factorial design, with the polymer ratio (HPMC 100 KM : xanthan gum) and the amount of Aerosil as the two independent formulation variables. The four dependent (response) variables considered were: percentage of drug release at the first hour, T50% (time taken to release 50% of the drug), floating lag time, and hardness of the tablet. The release-profile data were subjected to a curve-fitting analysis to describe the release mechanism of the drug from the floating tablet. An increase in drug release was observed with an increase in the polymer ratio, and as the amount of Aerosil increased, the hardness of the tablet also increased, without causing any change in the floating lag time. The desirability function was used to optimize the response variables, each having a different target, and the observed responses were in accordance with the experimental values. The results demonstrate the feasibility of the model in the development of floating tablets containing ranitidine hydrochloride.
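A 3² full factorial design enumerates every combination of two factors at three levels, giving nine runs. A sketch with coded levels; the factor names are illustrative stand-ins for the two variables in the study:

```python
from itertools import product

# A 3^2 full factorial design: two formulation factors at three coded
# levels (-1, 0, +1), giving nine experimental runs.
levels = (-1, 0, 1)
design = list(product(levels, repeat=2))
for run, (polymer_ratio, aerosil) in enumerate(design, start=1):
    print(f"run {run}: polymer ratio level {polymer_ratio:+d}, aerosil level {aerosil:+d}")
print(len(design))  # 9
```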
Classification of human colonic tissues using FTIR spectra and advanced statistical techniques
Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.
2010-04-01
One of the major public health hazards is colon cancer. There is a great necessity to develop new methods for early detection of cancer. If colon cancer is detected and treated early, a cure rate of more than 90% can be achieved. In this study we used FTIR microspectroscopy (MSP), which has shown good potential in the last 20 years in the fields of medical diagnostics and early detection of abnormal tissues. A large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five different subgroups were included in our database: normal and cancer tissues, as well as three stages of benign colonic polyps, namely mild, moderate and severe polyps, which are precursors of carcinoma. In this study we applied advanced mathematical and statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), to the human colonic FTIR spectra in order to differentiate among the tissues of the mentioned subgroups. Good classification accuracy between the normal, polyp and cancer groups was achieved, with an approximately 85% success rate. Our results showed that there is great potential for developing FTIR microspectroscopy as a simple, reagent-free, viable tool for early detection of colon cancer, in particular the early stages of premalignancy among benign colonic polyps.
Ofoghi, Bahadorreza; Zeleznikow, John; Dwyer, Dan; Macmahon, Clare
2013-01-01
This article describes the utilisation of an unsupervised machine learning technique and statistical approaches (e.g., the Kolmogorov-Smirnov test) that assist cycling experts in the crucial decision-making processes for athlete selection, training, and strategic planning in the track cycling Omnium. The Omnium is a multi-event competition that will be included in the summer Olympic Games for the first time in 2012. Presently, selectors and cycling coaches make decisions based on experience and intuition. They rarely have access to objective data. We analysed both the old five-event (first raced internationally in 2007) and new six-event (first raced internationally in 2011) Omniums and found that the addition of the elimination race component to the Omnium has, contrary to expectations, not favoured track endurance riders. We analysed the Omnium data and also determined the inter-relationships between different individual events as well as between those events and the final standings of riders. In further analysis, we found that there is no maximum ranking (poorest performance) in each individual event that riders can afford whilst still winning a medal. We also found the required times for riders to finish the timed components that are necessary for medal winning. The results of this study consider the scoring system of the Omnium and inform decision-making toward successful participation in future major Omnium competitions.
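The two-sample Kolmogorov-Smirnov test mentioned above compares distributions via the maximum gap between their empirical CDFs; a self-contained sketch of the statistic itself (significance thresholds are omitted):

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical cumulative distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # Fraction of the sample less than or equal to x.
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in sorted(set(a) | set(b)))

# Identical samples are at distance 0; fully separated ones at distance 1.
print(ks_statistic([1, 2, 3], [1, 2, 3]))     # 0.0
print(ks_statistic([1, 2, 3], [10, 11, 12]))  # 1.0
```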
Directory of Open Access Journals (Sweden)
Vujović Svetlana R.
2013-01-01
This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water-quality data sets and the identification of pollution sources/factors, with a view to getting better information about water quality and designing a monitoring network for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and for the interpretation of a water-quality data set of natural water bodies, obtained during the 2010 monitoring year for 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, summing to more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding the data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounting for 28% of the total variance, represents the hydrochemical dimension of water quality. The second factor, F2, accounting for 18% of the total variance, may be taken as a factor of water eutrophication. The third factor, F3, accounting for 17% of the total variance, represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounting for 13% of the total variance, may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
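The loading thresholds quoted above (strong > 0.75, moderate 0.50-0.75) are easy to operationalize; a sketch with hypothetical parameter loadings, not the study's values:

```python
def classify_loading(loading):
    """Categorize an absolute factor loading using the thresholds quoted in
    the abstract: strong > 0.75, moderate 0.50-0.75, otherwise weak."""
    a = abs(loading)
    if a > 0.75:
        return "strong"
    if a >= 0.50:
        return "moderate"
    return "weak"

# Hypothetical loadings of water-quality parameters on one factor:
for name, loading in [("conductivity", 0.88), ("nitrate", -0.61), ("pH", 0.23)]:
    print(name, classify_loading(loading))
```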
Statistical Mechanics Ideas and Techniques Applied to Selected Problems in Ecology
Directory of Open Access Journals (Sweden)
Hugo Fort
2013-11-01
Ecosystem dynamics provides an interesting arena for the application of a plethora of concepts and techniques from statistical mechanics. Here I review three examples, each corresponding to an important problem in ecology. First, I start with an analytical derivation of the clumpy patterns for species relative abundances (SRA) empirically observed in several ecological communities involving a high number n of species, a phenomenon which has puzzled ecologists for decades. An interesting point is that this derivation uses results obtained from a statistical mechanics model for ferromagnets. Second, going beyond the mean-field approximation, I study the spatial version of a popular ecological model involving just one species representing vegetation. The goal is to address the phenomenon of catastrophic shifts—gradual cumulative variations in some control parameter that suddenly lead to an abrupt change in the system—illustrating it by means of the process of desertification of arid lands. The focus is on the aggregation processes and the effects of diffusion that, combined, lead to the formation of non-trivial spatial vegetation patterns. It is shown that different quantities—like the variance, the two-point correlation function and the patchiness—may serve as early warnings for the desertification of arid lands. Remarkably, at the onset of a desertification transition the distribution of vegetation patches exhibits the scale invariance typical of many physical systems in the vicinity of a phase transition. I comment on similarities of and differences between these catastrophic shifts and paradigmatic thermodynamic phase transitions like the liquid-vapor change of state for a fluid. Third, I analyze the case of many species interacting in space. I choose tropical forests, which are mega-diverse ecosystems that exhibit remarkable dynamics. Therefore these ecosystems represent a research paradigm both for studies of complex systems dynamics as well as to
Marinović Ruždjak, Andrea; Ruždjak, Domagoj
2015-04-01
For the evaluation of seasonal and spatial variations and the interpretation of a large and complex water quality dataset obtained during a 7-year monitoring program of the Sava River in Croatia, different multivariate statistical techniques were applied in this study. Basic statistical properties and correlations of 18 water quality parameters (variables) measured at 18 sampling sites (a total of 56,952 values) were examined. Correlations between air temperature and some water quality parameters were found in agreement with the previous studies of relationship between climatic and hydrological parameters. Principal component analysis (PCA) was used to explore the most important factors determining the spatiotemporal dynamics of the Sava River. PCA has determined a reduced number of seven principal components that explain over 75 % of the data set variance. The results revealed that parameters related to temperature and organic pollutants (CODMn and TSS) were the most important parameters contributing to water quality variation. PCA analysis of seasonal subsets confirmed this result and showed that the importance of parameters is changing from season to season. PCA of the four seasonal data subsets yielded six PCs with eigenvalues greater than one explaining 73.6 % (spring), 71.4 % (summer), 70.3 % (autumn), and 71.3 % (winter) of the total variance. To check the influence of the outliers in the data set whose distribution strongly deviates from the normal one, in addition to standard principal component analysis algorithm, two robust estimates of covariance matrix were calculated and subjected to PCA. PCA in both cases yielded seven principal components explaining 75 % of the total variance, and the results do not differ significantly from the results obtained by the standard PCA algorithm. With the implementation of robust PCA algorithm, it is demonstrated that the usage of standard algorithm is justified for data sets with small numbers of missing data
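The eigenvalue-greater-than-one (Kaiser) retention rule used in analyses like this one can be sketched as follows; the eigenvalues below are hypothetical, not the Sava River values:

```python
def retained_components(eigenvalues):
    """Kaiser criterion: keep principal components with eigenvalue > 1 and
    report the share of total variance they explain."""
    kept = [ev for ev in eigenvalues if ev > 1.0]
    explained = sum(kept) / sum(eigenvalues)
    return len(kept), explained

# Hypothetical eigenvalues for 10 standardized variables (total variance 10):
eigs = [3.1, 2.0, 1.4, 1.1, 0.9, 0.6, 0.4, 0.3, 0.15, 0.05]
n, share = retained_components(eigs)
print(n, round(share, 2))  # 4 0.76
```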
Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.
2017-02-01
Multivariate statistical techniques and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes that are responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using the Varimax with Kaiser Normalization method of extraction, for both rotated space and the component matrix, was applied to the data. Results show that Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin. The three principal components accounted for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural inputs, as well as trace-metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relation among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal
Abbas Alkarkhi, F M; Ismail, Norli; Easa, Azhar Mat
2008-02-11
Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points at each location, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques, such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA), were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy-metal contents in cockles. DA gave the best result in identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd), affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy-metal contents in cockles, and that the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy-metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
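The DA result (classification driven by only Zn and Cd) can be illustrated with a minimal nearest-centroid discriminant; the centroids and samples below are invented, not the study's measurements:

```python
import math

def nearest_centroid_assign(samples, centroids):
    """Minimal discriminant sketch: assign each (Zn, Cd) measurement to the
    river whose class centroid is nearest in Euclidean distance; the fraction
    of correct assignments plays the role of DA's 'correct assignations'."""
    correct = 0
    for (zn, cd), river in samples:
        best = min(centroids, key=lambda r: math.dist((zn, cd), centroids[r]))
        correct += best == river
    return correct / len(samples)

centroids = {"Juru": (50.0, 1.2), "Jejawi": (30.0, 0.5)}  # hypothetical class means
samples = [((48, 1.1), "Juru"), ((52, 1.3), "Juru"),
           ((31, 0.6), "Jejawi"), ((42, 0.9), "Jejawi")]
print(nearest_centroid_assign(samples, centroids))  # 0.75
```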
GIS-based statistical mapping technique for block-and-ash pyroclastic flow and surge hazards
Widiwijayanti, C.; Voight, B.; Hidayat, D.; Schilling, S.
2008-12-01
Assessments of pyroclastic flow (PF) hazards are commonly based on mapping of PF and surge deposits and estimations of inundation limits, and/or computer models of varying degrees of sophistication. In volcanic crises a PF hazard map may be sorely needed, but limited time, exposures, or safety aspects may preclude fieldwork, and insufficient time or baseline data may be available for reliable dynamic simulations. We have developed a statistically constrained simulation model for block-and-ash PFs to estimate potential areas of inundation by adapting the methodology of Iverson et al. (1998) for lahars. The predictive equations for block-and-ash PFs are calibrated with data from many volcanoes and given by A = (0.05-0.1)V^(2/3) and B = (35-40)V^(2/3), where A is the cross-sectional area of inundation, B is the planimetric area and V is the deposit volume. The proportionality coefficients were obtained from regression analyses and comparison of simulations to mapped deposits. The method embeds the predictive equations in a GIS program coupled with DEM topography, using the LAHARZ program of Schilling (1998). Although the method is objective and reproducible, any PF hazard zone so computed should be considered as an approximate guide only, due to uncertainties in the coefficients applicable to individual PFs, DEM details, and release volumes. Gradational nested hazard maps produced by these simulations reflect these uncertainties in a sense. The model does not explicitly consider dynamic behavior, which can be important. Surge impacts must be extended beyond PF hazard zones, and we have explored several approaches to do this. The method has been used to supply PF hazard maps in two crises: Merapi 2006 and Montserrat 2006-2007. We have also compared our hazard maps to actual recent PF deposits and to maps generated by several other model techniques.
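The predictive scaling A = (0.05-0.1)V^(2/3), B = (35-40)V^(2/3) is straightforward to apply; a sketch using the midpoints of the quoted coefficient ranges (the choice of midpoint is an assumption for illustration):

```python
def inundation_areas(volume_m3, a_coeff=0.075, b_coeff=37.5):
    """Predictive scaling from the abstract, A = a * V^(2/3) and
    B = b * V^(2/3), with coefficients taken from the middle of the quoted
    calibration ranges (a in 0.05-0.1, b in 35-40). Returns cross-sectional
    and planimetric inundation areas in m^2."""
    v23 = volume_m3 ** (2.0 / 3.0)
    return a_coeff * v23, b_coeff * v23

# A 1e6 m^3 block-and-ash flow:
cross_section, planimetric = inundation_areas(1.0e6)
print(round(cross_section), round(planimetric))  # 750 375000
```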
Directory of Open Access Journals (Sweden)
M.A. Delavar
2016-02-01
Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food-safety issues, potential health risks, and detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root-uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them into human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide some essential information for better management of polluted soils to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of the probable sources that control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, the Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sample locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via atomic absorption spectroscopy (AAS). Data were statistically analyzed using the SPSS software, version 17.0, for Windows. Correlation matrix (CM), principal component analysis (PCA) and factor analysis (FA) techniques were performed in order to identify the probable sources of HMs in the studied soils. Results and
Vandebosch, An; Mogg, Robin; Goeyvaerts, Nele; Truyers, Carla; Greenwood, Brian; Watson-Jones, Debby; Herrera-Taracena, Guillermo; Parys, Wim; Vangeneugden, Tony
2016-02-01
Starting in December 2013, West Africa was overwhelmed with the deadliest outbreak of Ebola virus known to date, resulting in more than 27,500 cases and 11,000 deaths. In response to the epidemic, development of a heterologous prime-boost vaccine regimen was accelerated and involved preparation of a phase 3 effectiveness study. While individually randomized controlled trials are widely acknowledged as the gold standard for demonstrating the efficacy of a candidate vaccine, there was considerable debate on the ethical appropriateness of these designs in the context of an epidemic. A suitable phase 3 trial must convincingly ensure unbiased evaluation with sufficient statistical power. In addition, efficient evaluation of a vaccine candidate is required so that an effective vaccine can be immediately disseminated. This manuscript aims to present the statistical and modeling considerations, design rationale and challenges encountered due to the emergent, epidemic setting that led to the selection of a cluster-randomized phase 3 study design under field conditions.
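One standard statistical consideration in cluster-randomized designs is the design effect, DEFF = 1 + (m - 1) * ICC, which inflates the sample size an individually randomized trial would need; this is textbook methodology, not a formula taken from the trial described here:

```python
def design_effect(cluster_size, icc):
    """Design effect for a cluster-randomized trial with average cluster
    size m and intracluster correlation coefficient ICC:
    DEFF = 1 + (m - 1) * ICC."""
    return 1.0 + (cluster_size - 1) * icc

# 50-person clusters with an intracluster correlation of 0.02 nearly double
# the required sample size relative to individual randomization:
deff = design_effect(50, 0.02)
print(round(deff, 2))      # 1.98
print(round(1000 * deff))  # 1980
```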
Kwon, Heejin; Cho, Jinhan; Oh, Jongyeong; Kim, Dongwon; Cho, Junghyun; Kim, Sanghyun; Lee, Sangyun; Lee, Jihyun
2015-10-01
To investigate whether reduced radiation dose abdominal CT images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) compromise the depiction of clinically competent features when compared with the currently used routine radiation dose CT images reconstructed with ASIR. 27 consecutive patients (mean body mass index: 23.55 kg m(-2)) underwent CT of the abdomen at two time points. At the first time point, abdominal CT was scanned at 21.45 noise index levels of automatic current modulation at 120 kV. Images were reconstructed with 40% ASIR, the routine protocol of Dong-A University Hospital. At the second time point, follow-up scans were performed at 30 noise index levels. Images were reconstructed with filtered back projection (FBP), 40% ASIR, 30% ASIR-V, 50% ASIR-V and 70% ASIR-V for the reduced radiation dose. Both quantitative and qualitative analyses of image quality were conducted. The CT dose index was also recorded. At the follow-up study, the mean dose reduction relative to the currently used routine radiation dose was 35.37% (range: 19-49%). The overall subjective image quality and diagnostic acceptability scores of 50% ASIR-V at the reduced radiation dose were nearly identical to those recorded when using the initial routine-dose CT with 40% ASIR. Subjective ratings of the qualitative analysis revealed that, of all the reduced radiation dose CT series reconstructed, 30% ASIR-V and 50% ASIR-V were associated with higher image quality, with lower noise and artefacts as well as good sharpness, when compared with 40% ASIR and FBP. However, the sharpness score at 70% ASIR-V was considered to be worse than that at 40% ASIR. Objective image noise for 50% ASIR-V was 34.24% and 46.34% lower than that for 40% ASIR and FBP, respectively. Abdominal CT images reconstructed with ASIR-V facilitate radiation dose reductions of up to 35% when compared with ASIR. This study represents the first clinical research experiment to use ASIR-V, the newest version of
Neiroukh, Osama
2011-01-01
A new approach for enhancing the process-variation tolerance of digital circuits is described. We extend recent advances in statistical timing analysis into an optimization framework. Our objective is to reduce the performance variance of a technology-mapped circuit where delays across elements are represented by random variables which capture the manufacturing variations. We introduce the notion of statistical critical paths, which account for both means and variances of performance variation. An optimization engine is used to size gates with a goal of reducing the timing variance along the statistical critical paths. We apply a pair of nested statistical analysis methods, deploying a slower, more accurate approach for tracking statistical critical paths and a fast engine for evaluating gate size assignments. We derive a new approximation for the max operation on random variables which is deployed for the faster inner engine. Circuit optimization is carried out using a gain-based algorithm that terminates w...
Novel application of a statistical technique, Random Forests, in a bacterial source tracking study.
Smith, Amanda; Sterba-Boatwright, Blair; Mott, Joanna
2010-07-01
In this study, data from a bacterial source tracking (BST) analysis using antibiotic resistance analysis (ARA) profiles were examined using two statistical techniques, Random Forests (RF) and discriminant analysis (DA), to determine sources of fecal contamination of a Texas water body. Cow Trap and Cedar Lakes are potential oyster harvesting waters located in Brazoria County, Texas, that have been listed as impaired for bacteria on the 2004 Texas 303(d) list. Unknown source Escherichia coli were isolated from water samples collected in the study area during two sampling events. Isolates were confirmed as E. coli using carbon source utilization profiles and then analyzed via ARA, following the Kirby-Bauer disk diffusion method. Zone diameters from ARA profiles were analyzed with both DA and RF. Using a two-way classification (human vs. nonhuman), both DA and RF categorized over 90% of the 299 unknown source isolates as a nonhuman source. The average rates of correct classification (ARCCs) for the library of 1172 isolates using DA and RF were 74.6% and 82.3%, respectively. ARCCs from RF ranged from 7.7 to 12.0% higher than those from DA. Rates of correct classification (RCCs) for individual sources classified with RF ranged from 0.2 to 23.2% higher than those of DA, with a mean difference of 9.0%. Additional evidence for the outperformance of DA by RF was found in the comparison of training and test set ARCCs and the examination of specific disputed isolates; RF produced higher ARCCs (ranging from 8 to 13% higher) than DA for all 1000 trials (excluding the two-way classification, in which RF outperformed DA 999 out of 1000 times). This is of practical significance for the analysis of bacterial source tracking data. Overall, based on both DA and RF results, migratory birds were found to be the source of the largest portion of the unknown E. coli isolates. This study is the first known published application of Random Forests in the field of BST. Copyright 2010 Elsevier Ltd. All rights reserved.
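As a hedged illustration of the RF-versus-DA comparison described above, the sketch below uses scikit-learn on synthetic stand-in data (not the study's 1172-isolate ARA library; the sample size, feature count and class separation are invented) to compute cross-validated average rates of correct classification for both classifiers.

```python
# Sketch: comparing Random Forests and linear discriminant analysis on
# antibiotic-resistance zone-diameter profiles (synthetic stand-in data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Rows = isolates, columns = inhibition-zone diameters (mm) for 8 antibiotics
X = rng.normal(loc=20, scale=5, size=(300, 8))
y = rng.integers(0, 2, size=300)  # 0 = nonhuman source, 1 = human source
X[y == 1] += 2.0  # give the two classes some separation

rf = RandomForestClassifier(n_estimators=200, random_state=0)
da = LinearDiscriminantAnalysis()

# Average rates of correct classification (ARCC) via 5-fold cross-validation
arcc_rf = cross_val_score(rf, X, y, cv=5).mean()
arcc_da = cross_val_score(da, X, y, cv=5).mean()
print(f"ARCC RF: {arcc_rf:.3f}  ARCC DA: {arcc_da:.3f}")
```

Cross-validated ARCCs computed this way are directly comparable between the two methods, which is the core of the study's evaluation design.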
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
Statistical and Modeling Techniques for Studying the Sudden Infant Death Syndrome
Lindsey, Helen L.
1976-01-01
The intention of this research is to contribute additional data, hopefully bearing on the solution to some of the problems and indirectly, the cause(s) of Sudden Infant Death Syndrome, and to present ideas for consideration for future SIDS research. (Author/RK)
Jeong, Min Sook; Yu, Kyeong-Nam; Chung, Hyun Hoon; Park, Soo Jin; Lee, Ah Young; Song, Mi Ryoung; Cho, Myung-Haing; Kim, Jun Sung
2016-05-01
Qualitative and quantitative analyses of reactive oxygen species (ROS) generated on the surfaces of nanomaterials are important for understanding their toxicity and toxic mechanisms, which are in turn beneficial for manufacturing more biocompatible nanomaterials in many industrial fields. Electron spin resonance (ESR) is a useful tool for detecting ROS formation. However, using this technique without first considering the physicochemical properties of nanomaterials and proper conditions of the spin trapping agent (such as incubation time) may lead to misinterpretation of the resulting data. In this report, we suggest methodological considerations for ESR as pertains to magnetism, sample preparation and proper incubation time with spin trapping agents. Based on our results, each spin trapping agent should be given the proper incubation time. For nanomaterials having magnetic properties, it is useful to remove these nanomaterials via centrifugation after reacting with spin trapping agents. Sonication for the purpose of sample dispersion and sample light exposure should be controlled during ESR in order to enhance the obtained ROS signal. This report will allow researchers to better design ESR spin trapping applications involving nanomaterials.
An analytic technique for statistically modeling random atomic clock errors in estimation
Fell, P. J.
1981-01-01
Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System, and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
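The building block of the approximation above is the first-order (Gauss-)Markov process, an exponentially correlated random process whose power spectral density has a single corner frequency; summing several with different correlation times shapes the spectrum toward the one implied by the Allan variance. A minimal simulation of one such process (all parameter values invented for illustration) might look like:

```python
# Sketch: simulating one first-order Gauss-Markov process, the building
# block used (in a sum of five) to approximate clock-noise spectra.
import numpy as np

def gauss_markov(n, tau, sigma, dt, rng):
    """x_{k+1} = exp(-dt/tau) * x_k + w_k, with stationary std sigma."""
    phi = np.exp(-dt / tau)
    q = sigma * np.sqrt(1 - phi**2)  # driving-noise std for stationarity
    x = np.empty(n)
    x[0] = rng.normal(0, sigma)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + rng.normal(0, q)
    return x

rng = np.random.default_rng(1)
# Hypothetical correlation time of 100 s, sampled at 1 s intervals
x = gauss_markov(100_000, tau=100.0, sigma=1.0, dt=1.0, rng=rng)
print("sample std ~", x.std().round(2))  # should be near sigma = 1.0
```

Fitting five such (tau, sigma) pairs so that their summed spectra match the oscillator's Allan-variance model is the essence of the paper's procedure.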
The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading
Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.
2016-01-01
Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
Statistical Techniques for Analyzing Process or "Similarity" Data in TID Hardness Assurance
Ladbury, R.
2010-01-01
We investigate techniques for estimating the contributions to TID hardness variability for families of linear bipolar technologies, determining how part-to-part and lot-to-lot variability change for different part types in the process.
Valev, D.; Werner, R.; Atanassov, A.; Kostadinov, I.; Giovanelli, G.; Ravegnani, F.; Petritoli, A.; Bortoli, D.; Palazzi, E.; Markova, T.
2008-12-01
By means of the GASCOD spectrometer at Stara Zagora station (42.8 N, 26.1 E), data series of monthly NO2am (at sunrise) and NO2pm (at sunset) slant column densities (SCD) were obtained, covering the interval from September 1999 to the end of 2006. After removing the seasonal cycle, relationships between NO2am and the F10.7 solar flux, and between NO2pm and F10.7, were sought. The monthly deseasonalized NO2am show a statistically significant positive correlation with F10.7, with r = 0.41 at level p = 0.01. No statistically significant correlations were found between monthly NO2pm and F10.7 unless the QBO phase was taken into consideration. The original data series of the quasi-biennial oscillation (QBO) at the 30 hPa level were used to study the relation between NO2 and F10.7 taking the QBO phase into account. The data were separated into two different groups: positive (westerly) and negative (easterly) QBO phase. As a result, during the negative QBO phase, the monthly NO2pm show a negative correlation with F10.7 (r = -0.37 and p = 0.01). During the positive QBO phase, the monthly NO2pm show no correlation with F10.7 (r = 0.02). The separation of NO2am into two groups according to the sign of the QBO phase practically does not change the results: the correlation coefficient remains between 0.40 and 0.43 at level p = 0.01. The statistical significance of the relationships found was determined by means of Student's t-test.
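The significance test used above follows the standard result that for a Pearson correlation r computed from n pairs, t = r*sqrt(n-2)/sqrt(1-r^2) follows Student's t-distribution with n-2 degrees of freedom under the null hypothesis of no correlation. A minimal sketch (the sample size of 84 months is an assumption, roughly seven years of monthly data):

```python
# Sketch: t-statistic for the significance of a Pearson correlation,
# t = r * sqrt(n - 2) / sqrt(1 - r^2), compared against t-distribution
# critical values with n - 2 degrees of freedom.
import math

def corr_t_stat(r, n):
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

# Assumed n = 84 monthly values (roughly the 1999-2006 series)
n = 84
for r in (0.41, -0.37, 0.02):
    print(f"r = {r:+.2f}  t = {corr_t_stat(r, n):+.2f}")
```

With n = 84 the two-sided critical value at p = 0.01 is about 2.64, so |t| values near 4 and 3.6 are significant while r = 0.02 is not, consistent with the abstract's conclusions.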
Directory of Open Access Journals (Sweden)
Emma Lightfoot
Oxygen isotope analysis of archaeological skeletal remains is an increasingly popular tool to study past human migrations. It is based on the assumption that human body chemistry preserves the δ18O of precipitation in such a way as to be a useful technique for identifying migrants and, potentially, their homelands. In this study, the first such global survey, we draw on published human tooth enamel and bone bioapatite data to explore the validity of using oxygen isotope analyses to identify migrants in the archaeological record. We use human δ18O results to show that there are large variations in human oxygen isotope values within a population sample. This may relate to physiological factors influencing the preservation of the primary isotope signal, or to human activities (such as brewing, boiling, stewing, differential access to water sources, and so on) causing variation in ingested water and food isotope values. We compare the number of outliers identified using various statistical methods. We determine that the most appropriate method for identifying migrants is dependent on the data but is likely to be the IQR or the median absolute deviation from the median under most archaeological circumstances. Finally, through a spatial assessment of the dataset, we show that the degree of overlap in human isotope values from different locations across Europe is such that identifying individuals' homelands on the basis of oxygen isotope analysis alone is not possible for the regions analysed to date. Oxygen isotope analysis is a valid method for identifying first-generation migrants from an archaeological site when used appropriately; however, it is difficult to identify migrants using statistical methods for a sample size of less than c. 25 individuals. In the absence of local previous analyses, each sample should be treated as an individual dataset and statistical techniques can be used to identify migrants, but in most cases pinpointing a specific
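The two outlier rules the study recommends are easy to state concretely. A minimal sketch, with made-up δ18O values (the 1.4826 factor scales the MAD to the standard deviation of a normal distribution; cutoffs k = 1.5 and k = 3 are the conventional defaults, not values from the paper):

```python
# Sketch: flagging isotopic outliers (possible migrants) with the IQR rule
# and the median absolute deviation (MAD); the delta18O values are made up.
import numpy as np

def iqr_outliers(x, k=1.5):
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def mad_outliers(x, k=3.0):
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    # 1.4826 scales MAD to the std of a normal distribution
    return np.abs(x - med) > k * 1.4826 * mad

d18O = np.array([26.1, 26.4, 25.9, 26.2, 26.0, 26.3, 25.8, 26.1, 29.5])
print("IQR outliers:", np.where(iqr_outliers(d18O))[0])
print("MAD outliers:", np.where(mad_outliers(d18O))[0])
```

Both rules use robust location and spread estimates, so (unlike mean ± 2 sd) a single extreme value cannot inflate the cutoff enough to hide itself, which matters for the small samples typical of archaeological sites.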
Energy Technology Data Exchange (ETDEWEB)
Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)
2015-04-15
Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high frequency stress wave signals captured from a newly developed non-invasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals were introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube serving as the reference and four steel tubes with through-hole artificial defects at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of a defect in carbon steel tubes using AE signals captured with the non-invasive VIAE technique.
Ariew, André
2007-03-01
Charles Darwin, James Clerk Maxwell, and Francis Galton were all aware, by various means, of Adolphe Quetelet's pioneering work in statistics. Darwin, Maxwell, and Galton all had reason to be interested in Quetelet's work: all were working on some instance of how large-scale regularities emerge from individual events that vary from one another; all were rejecting the divine interventionist theories of their contemporaries; and Quetelet's techniques provided them with a way forward. Maxwell and Galton both explicitly endorse Quetelet's techniques in their work; Darwin does not incorporate any of Quetelet's statistical ideas, although natural selection theory since the twentieth-century synthesis has. Why not Darwin? My answer is that by the time Darwin encountered Malthus's law of excess reproduction he had all he needed to answer questions about large-scale regularities in extinctions, speciation, and adaptation. He didn't need Quetelet.
2012-01-01
Glyphosate quantification methods are complex and expensive, and its control in natural water bodies is becoming more important year after year. In order to find a new system that facilitates the detection of glyphosate, we present a comparison between two models to predict glyphosate concentration in aqueous dissolutions. One of them is done by an artificial neural network (ANN) embedded in a microcontroller and the other one is done by statistical methods (Partial Least Squares) in a computer...
Julien, Denyse
1998-01-01
The study reported on in this thesis was an empirical investigation into the implementation and use of Statistical Process Control (SPC) techniques and tools in a product development environment. The data used originated from four different business units in the European flavour division of a large international company belonging to the flavour and fragrance industry. The study highlights many of the problems related to the use of real data, and working with individuals throughout an organi...
Development of Evaluation Technique of GMAW Welding Quality Based on Statistical Analysis
Institute of Scientific and Technical Information of China (English)
FENG Shengqiang; TERASAKI Hidenri; KOMIZO Yuichi; HU Shengsun; CHEN Donggao; MA Zhihua
2014-01-01
Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality controllability and prediction of the GMAW process. On-line welding quality controllability and prediction have several disadvantages such as high cost, low efficiency, complication and being greatly affected by the environment. An enhanced, efficient evaluation technique for evaluating welding faults based on Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as Euclidean distance because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from welding current and arc voltage are assumed to follow a normal distribution. The normal distribution has two parameters: the mean m and standard deviation s of the data. In the proposed evaluation technique used by the WQT, values of MD located in the range from zero to m+3s are regarded as "good". Two experiments, which involve changing the flow of shielding gas and smearing paint on the surface of the substrate, were conducted in order to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality controllability and prediction, which is of great importance for designing novel equipment for weld quality detection.
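The m+3s decision rule above can be sketched directly. The following is an illustrative stand-in, not the WQT implementation: the current/voltage means, the covariance, and the "faulty" reading are all invented numbers chosen only to show the mechanics of the Mahalanobis-distance threshold.

```python
# Sketch: Mahalanobis-distance weld-quality check. MD values up to
# mean + 3*std of a reference (good-weld) set are rated "good".
import numpy as np

rng = np.random.default_rng(7)
# Reference set: welding current (A) and arc voltage (V) from good welds
ref = rng.multivariate_normal([200.0, 24.0], [[25.0, 5.0], [5.0, 4.0]], 500)

mu = ref.mean(0)
cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def mahalanobis(x):
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Threshold m + 3s estimated from the reference-set MDs
md_ref = np.array([mahalanobis(x) for x in ref])
threshold = md_ref.mean() + 3 * md_ref.std()

sample = np.array([230.0, 18.0])  # hypothetical faulty-weld reading
print("MD =", round(mahalanobis(sample), 2), "threshold =", round(threshold, 2))
print("good" if mahalanobis(sample) <= threshold else "fault")
```

Because the inverse covariance matrix accounts for the current-voltage correlation, a reading can be flagged even when each variable individually stays within its marginal 3-sigma band, which is the advantage over Euclidean distance noted in the abstract.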
Statistical techniques using NURE airborne geophysical data and NURE geochemical data
Campbell, Katherine
Some standard techniques in multivariate analysis are used to describe the relationships among remotely sensed observations (Landsat and airborne geophysical data) and between these variables and hydrogeochemical and stream sediment analyses. Gray-level pictures of such factors make the analytic results more accessible and easier to interpret.
Using Candy Samples to Learn about Sampling Techniques and Statistical Data Evaluation
Canaes, Larissa S.; Brancalion, Marcel L.; Rossi, Adriana V.; Rath, Susanne
2008-01-01
A classroom exercise for undergraduate and beginning graduate students that takes about one class period is proposed and discussed. It is an easy, interesting exercise that demonstrates important aspects of sampling techniques (sample amount, particle size, and the representativeness of the sample in relation to the bulk material). The exercise…
Tobalem, Frank; Dugert, Eric; Verdun, Francis R; Dunet, Vincent; Ott, Julien G; Rudiger, Hannes A; Cherix, Stephane; Meuli, Reto; Becce, Fabio
2014-12-01
The purpose of this article is to assess the effect of the adaptive statistical iterative reconstruction (ASIR) technique on image quality in hip MDCT arthrography and to evaluate its potential for reducing radiation dose. Thirty-seven patients examined with hip MDCT arthrography were prospectively randomized into three different protocols: one with a regular dose (volume CT dose index [CTDIvol], 38.4 mGy) and two with a reduced dose (CTDIvol, 24.6 or 15.4 mGy). Images were reconstructed using filtered back projection (FBP) and four increasing percentages of ASIR (30%, 50%, 70%, and 90%). Image noise and contrast-to-noise ratio (CNR) were measured. Two musculoskeletal radiologists independently evaluated several anatomic structures and image quality parameters using a 4-point scale. They also jointly assessed acetabular labrum tears and articular cartilage lesions. With decreasing radiation dose level, image noise statistically significantly increased (p=0.0009) and CNR statistically significantly decreased (p=0.001). We also found a statistically significant reduction in noise (p=0.0001) and increase in CNR (p≤0.003) with increasing percentage of ASIR; in addition, we noted statistically significant increases in image quality scores for the labrum and cartilage, subchondral bone, overall diagnostic quality (up to 50% ASIR), and subjective noise (p≤0.04), and statistically significant reductions for the trabecular bone and muscles (p≤0.03). Regardless of the radiation dose level, there were no statistically significant differences in the detection and characterization of labral tears (n=24; p=1) and cartilage lesions (n=40; p≥0.89) depending on the ASIR percentage. The use of up to 50% ASIR in hip MDCT arthrography helps to reduce radiation dose by approximately 35-60%, while maintaining diagnostic image quality comparable to that of a regular-dose protocol using FBP.
Siderius, Daniel W; Mahynski, Nathan A; Shen, Vincent K
2017-05-01
Measurement of the pore-size distribution (PSD) via gas adsorption and the so-called "kernel method" is a widely used characterization technique for rigid adsorbents. Yet, standard techniques and analytical equipment are not appropriate to characterize the emerging class of flexible adsorbents that deform in response to the stress imparted by an adsorbate gas, as the PSD is a characteristic of the material that varies with the gas pressure and any other external stresses. Here, we derive the PSD for a flexible adsorbent using statistical mechanics in the osmotic ensemble to draw analogy to the kernel method for rigid materials. The resultant PSD is a function of the ensemble constraints including all imposed stresses and, most importantly, the deformation free energy of the adsorbent material. Consequently, a pressure-dependent PSD is a descriptor of the deformation characteristics of an adsorbent and may be the basis of future material characterization techniques. We discuss how, given a technique for resolving pressure-dependent PSDs, the present statistical mechanical theory could enable a new generation of analytical tools that measure and characterize certain intrinsic material properties of flexible adsorbents via otherwise simple adsorption experiments.
Downscaling Statistical Model Techniques for Climate Change Analysis Applied to the Amazon Region
Directory of Open Access Journals (Sweden)
David Mendes
2014-01-01
The Amazon is an area covered predominantly by dense tropical rainforest with relatively small inclusions of several other types of vegetation. In the last decades, scientific research has suggested a strong link between the health of the Amazon and the integrity of the global climate: tropical forests and woodlands (e.g., savannas) exchange vast amounts of water and energy with the atmosphere and are thought to be important in controlling local and regional climates. Considering the importance of the Amazon biome to global climate change impacts, the role of protected areas in the conservation of biodiversity, and the state of the art of downscaling techniques for climate models, this work calibrates and runs a downscaling model based on an Artificial Neural Network (ANN), applied to the Amazon region in order to obtain regional and local predicted climate data (e.g., precipitation). The ANN results show good similarity with observations in the cities of Belém and Manaus, with correlations of approximately 88.9% and 91.3%, respectively, and in the spatial distribution, especially in the correction process, representing a good fit.
Benson, Nsikak U; Asuquo, Francis E; Williams, Akan B; Essien, Joseph P; Ekong, Cyril I; Akpabio, Otobong; Olajire, Abaas A
2016-01-01
Trace metals (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in Niger Delta (Nigeria). The degree of contamination was assessed using the individual contamination factors (ICF) and global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation test were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. Ecological risk index by ICF showed significant potential mobility and bioavailability for Cu, Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metals contamination in the ecosystems was influenced by multiple pollution sources.
A Novel Statistical Technique for Determining the Properties of Extrasolar Planets
Starr Henderson, Cassandra; Skemer, Andrew; Morley, Caroline; Fortney, Jonathan J.
2017-01-01
By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systematic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which allows some of the data to have underestimated error bars. This method compares the photometry of the substellar companion GJ 758b to custom atmospheric models to determine the exoplanet's atmospheric properties. It also demonstrates that some of the data are inconsistent with the models, and produces a probability distribution of atmospheric properties, including temperature, gravity, cloud thickness, and chemical abundance, for GJ 758b which automatically weights the photometry by the probability that it is correct at each wavelength.
Enlow, Elizabeth M; Kennedy, Jennifer L; Nieuwland, Alexander A; Hendrix, James E; Morgan, Stephen L
2005-08-01
Nylons are an important class of synthetic polymers, from an industrial as well as a forensic perspective. A spectroscopic method, such as Fourier transform infrared (FT-IR) spectroscopy, is necessary to determine the nylon subclass (e.g., nylon 6 or nylon 6,6). Library searching using absolute difference and absolute derivative difference algorithms gives inconsistent results for identifying nylon subclasses. The objective of this study was to evaluate the usefulness of peak ratio analysis and multivariate statistics for the identification of nylon subclasses using attenuated total reflection (ATR) spectral data. Many nylon subclasses could not be distinguished by the ratio of the N-H stretch to the sp3 C-H2 stretch intensities. Linear discriminant analysis, however, provided a graphical visualization of differences between nylon subclasses and was able to correctly classify a set of 270 spectra from eight different subclasses with 98.5% cross-validated accuracy.
Improving techniques for statistical and physical modelling of wind resources in complex terrain
Energy Technology Data Exchange (ETDEWEB)
Croba, D.; Tryfonopoulos, D. [CINAP S.A., Athens (Greece); Bunn, J. [Rutherford Appleton Lab., Chilton (United Kingdom); Casanova, M. [ECOTECNIA, Barcelona (Spain); Martin, F. [CIEMAT-IER, Madrid (Spain); Morgana, B. [CONPHOEBUS s.c.r.l., Catania (Italy); Rodrigues, A. [University of Porto, DEMEGI, Porto (Portugal); Schmid, J. [University of Karlsruhe, IEH, Karlsruhe (Germany); Voutsinas, S. [National Technical Univ. of Athens, Fluids Section, Athens (Greece)
1996-12-31
The objective of this work was to improve the accuracy of estimates of the expected wind energy production with emphasis given to complex terrain sites, where the commonly used methods usually fail to give reliable predictions. An improved wind park siting methodology was devised and validated for both gentle and complex terrain sites. This methodology can be divided into the statistical methodology called Matrix method and the physical methodology, which constitutes the AIOLOS-T wind modelling code. An inventory of potential sites for wind park installation in Southern Europe and Germany was produced, aiming at a wind energy penetration of up to 5% of the annual electricity demand of each country. The improved methodology was applied to several of the identified sites in order to estimate the expected wind energy production. Then the possible sources of error in the assessment of the expected wind energy production were examined. (Author)
Directory of Open Access Journals (Sweden)
Petrus H.A.J.M van Gelder
2013-05-01
The dam-break induced loads and their effects on buildings are of vital importance for assessing the vulnerability of buildings in flood-prone areas. A comprehensive methodology for the risk assessment of buildings subject to flooding is nevertheless still missing. This research aims to take a step forward, building on previous work. To this aim, (1) five statistical procedures, including simple correlation analysis, multiple linear regression, stepwise multiple linear regression, principal component analysis and cluster analysis, are used to study the relationship between the mean normalized force on a structure and other related variables; (2) a new and efficient variable that accounts for both the shape of the structure and the flow conditions is proposed; and (3) a new and practical formula for predicting the mean normalized force is suggested for different types of obstacles, which was missing in previous research.
Directory of Open Access Journals (Sweden)
Mohiuddin Ahmed
2015-07-01
There is significant interest in the data mining and network management communities in efficiently analysing huge amounts of network traffic, given the volume of traffic generated even in small networks. Summarization is a primary data mining task for generating a concise yet informative summary of given data, and creating such a summary from network traffic data is a research challenge. Existing clustering-based summarization techniques lack the ability to create a summary suitable for further data mining tasks such as anomaly detection, and they require the summary size as an external input. Additionally, for complex and high-dimensional network traffic datasets, there is often no single clustering solution that explains the structure of the data. In this paper, we investigate the use of multiview clustering to efficiently create a meaningful summary from original data instances of network traffic data. We develop a mathematically sound approach to selecting the summary size using a sampling technique. We compare our proposed approach with regular clustering-based summarization incorporating the summary-size calculation method, and with a random approach. We validate our proposed approach using a benchmark network traffic dataset and state-of-the-art summary evaluation metrics.
Lee, Young-Ran; Lee, Sang-Jun; Kim, Jung-Chul; Ogawa, Hideoki
2006-11-01
Pubic atrichosis or hypotrichosis is a common condition in Korean women of Mongolian origin. As a result, many patients receive hair restoration surgery, which is currently thought to be the only definitive therapy. To pursue more natural and realistic-appearing results, to define patient characteristics, and to estimate the survival rate of transplanted pubic hair through restoration surgery, we examined our cases of pubic hair restoration surgery with single-hair grafts. We selected 507 patients with pubic atrichosis or hypotrichosis who visited for pubic hair transplantation between March 1, 2001, and February 28, 2005. We reviewed the medical charts of the 507 patients and performed statistical analysis. We also carried out a detailed evaluation of our surgical technique in 100 patients. In addition, 20 patients who agreed to participate in the survival-rate study received transplantation of 40 hairs in a 1.5 x 1.5-cm area after the angular points were tattooed. The number of hairs grown 1 year after transplantation was counted in each case. Among the 507 subjects, 169 (33.3%) were in their 40s. The mean (±SD) patient age was 41.3 ± 10.8 years. Of these, 115 patients (22.7%) had pubic atrichosis, and 392 patients (77.3%) had pubic hypotrichosis. In addition, 81.7% of atrichosis patients had a family history of atrichosis or hypotrichosis. Pubic atrichosis accompanied axillary atrichosis or hypotrichosis in 60.0 and 38.2% of the cases, respectively. The most common reason for the hair restorative procedure was the subject's sense of inferiority to the same sex (73.8%). The mean number of transplanted hairs was 929.3 ± 76.6. The most common design pattern that we used was the modified horizontal type (87.0%). The mean survival rate of single-hair grafts on the pubis was 73.6 ± 7.6%. This study suggested that pubic hair transplantation surgery is a suitable cosmetic procedure to address the inferiority complex of patients with pubic atrichosis or hypotrichosis.
An Efficient Statistical Computation Technique for Health Care Big Data using R
Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.
2017-08-01
Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosing a problem at an earlier stage increases the chances of survival and fast recovery, reducing both the recovery time and the cost of treatment. One such medical issue is cancer, and breast cancer has been identified as the second leading cause of cancer death. If detected at an early stage, it can be cured. Once a patient is found to have a breast tumor, it should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. Day to day, new records are added, which leads to an increase in the data to be classified, and this tends toward a big data problem. The algorithm is implemented in R, which is the most popular platform for applying machine learning algorithms to statistical computing. Experimentation is conducted using various classification evaluation metrics over various values of k. The results show that the KNN algorithm performs better than existing models.
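A minimal sketch of the instance-based classification the abstract describes; the paper itself works in R on clinical records, while this Python toy uses invented two-feature points standing in for tumor measurements:

```python
from collections import Counter
import math

def knn_classify(train, query, k):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (feature_vector, label) pairs; distance is Euclidean.
    """
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy stand-in for tumor feature vectors (e.g. radius, texture), not real data.
train = [
    ((1.0, 1.2), "benign"), ((1.1, 0.9), "benign"), ((0.9, 1.0), "benign"),
    ((3.0, 3.1), "malignant"), ((3.2, 2.9), "malignant"), ((2.8, 3.0), "malignant"),
]
print(knn_classify(train, (1.05, 1.1), k=3))   # -> benign
print(knn_classify(train, (3.1, 3.0), k=3))    # -> malignant
```

On real data the features would be standardized first, and k chosen by comparing evaluation metrics across values, as the abstract describes.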
Directory of Open Access Journals (Sweden)
Qing Gu
2016-03-01
Qiandao Lake (Xin’an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April–July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations between different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
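The variance-apportioning step behind PCA can be sketched as follows; the array below is a synthetic stand-in for correlated water-quality readings (two columns share a common factor), not the Qiandao Lake data:

```python
import numpy as np

def pca_explained_variance(X):
    """Fraction of total variance captured by each principal component."""
    Xc = X - X.mean(axis=0)                      # center each parameter column
    s = np.linalg.svd(Xc, compute_uv=False)      # singular values, descending
    var = s ** 2                                 # component variances (unnormalized)
    return var / var.sum()

# Hypothetical monthly readings for three parameters (e.g. DO, pH, chl-a);
# the first two are driven by the same latent factor t, so one component dominates.
rng = np.random.default_rng(0)
t = rng.normal(size=100)
X = np.column_stack([t + 0.1 * rng.normal(size=100),
                     2 * t + 0.1 * rng.normal(size=100),
                     rng.normal(size=100)])
ratios = pca_explained_variance(X)
print(ratios)   # first ratio is largest, since two columns share the factor t
```

In the study, the leading components were then interpreted as pollution sources (organic, nutrient, sewage, runoff) from their parameter loadings.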
Selective Ring Oscillator PUF with Statistics Correction Technique and its Evaluation
Yoshikawa, Masaya; Asai, Toshiya; Shiozaki, Mitsuru; Fujino, Takeshi
The physical unclonable function (PUF) is a method of deriving ID information peculiar to a device by detecting random physical features that cannot be controlled during the device’s manufacture. Because such ID information is difficult to replicate, PUFs are used as a technique to prevent forgery. PUFs have two major application fields: authentication schemes, and protection of the semiconductor industry’s intellectual property. Several circuit designs for implementing PUFs have been reported. This study proposes a new PUF based on the ring oscillator PUF. By incorporating a mechanism to correct the oscillation frequency dispersion that results from the layout, the proposed PUF can generate IDs accurately. We verified the proposed PUF’s validity by conducting experiments using an FPGA that incorporates the proposed PUF.
Teleni, Vicki; Baldauf, Richard B., Jr.
A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning," "Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…
Study of the Severity of Accidents in Tehran Using Statistical Modeling and Data Mining Techniques
Directory of Open Access Journals (Sweden)
Hesamaldin Razi
2013-01-01
Background and Aims: Tehran province was subject to the second-highest incidence of fatalities due to traffic accidents in 1390 (Iranian calendar). Most studies in this field examine rural traffic accidents, but this study uses logit models and artificial neural networks to evaluate the factors that affect the severity of accidents within the city of Tehran. Materials and Methods: Among the various types of crashes, head-on collisions are the most serious type, and they are investigated in this study using Tehran’s accident data. In the modeling process, accident severity is the dependent variable, defined as a binary covariate with the categories non-injury accidents and injury accidents. The independent variables are parameters such as driver characteristics, time of the accident, and traffic and environmental characteristics. In addition to comparing the prediction accuracy of the two models, the elasticity of the logit model is compared with a sensitivity analysis of the neural network. Results: The results show that the proposed model provides a good estimate of accident severity. The explanatory variables determined to be significant in the final models are the driver’s gender, age and education, along with negligence of the traffic rules, inappropriate acceleration, deviation to the left, type of vehicle, pavement conditions, time of the crash and street width. Conclusion: An artificial neural network model can be useful as a statistical model in the analysis of factors that affect the severity of accidents. According to the results, human error and driver illiteracy increase the severity of crashes; therefore, educating drivers is the main strategy for reducing accident severity in Iran. Special attention should be given to a driver’s age group, with particular care taken when drivers are very young.
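The logit side of such an analysis can be sketched as logistic regression fitted by gradient descent; the feature encoding and the six crash records below are invented for illustration, not the Tehran data:

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=2000):
    """Fit a binary logit model P(injury) = sigmoid(X @ w + b) by gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted injury probability
        grad_w = X.T @ (p - y) / len(y)           # gradient of the mean log-loss
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Hypothetical encoded crash records: [scaled driver age, speeding flag].
X = np.array([[0.2, 0.0], [0.3, 0.0], [0.8, 1.0],
              [0.9, 1.0], [0.7, 1.0], [0.1, 0.0]])
y = np.array([0, 0, 1, 1, 1, 0])                  # 1 = injury accident
w, b = fit_logit(X, y)
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print((p > 0.5).astype(int))                      # recovers the labels on this toy set
```

The elasticities the abstract mentions would then be computed from the fitted coefficients w, while the neural network's sensitivity analysis perturbs inputs numerically.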
Sedlack, Jeffrey D
2010-01-01
Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur in surgical services.
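The process control charting mentioned above can be sketched with a Shewhart individuals chart, where control limits sit three estimated standard deviations from the mean (sigma taken from the average moving range with d2 = 1.128); the turnover times below are hypothetical, not the audited hospital data:

```python
def individuals_control_limits(values):
    """Shewhart individuals-chart limits: mean +/- 3 sigma, with sigma
    estimated from the average moving range (d2 = 1.128 for n = 2)."""
    n = len(values)
    mean = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma, mean, mean + 3 * sigma

# Hypothetical room-turnover times (minutes) between consecutive cases.
waits = [48, 55, 51, 60, 47, 53, 49, 58, 52, 50]
lcl, center, ucl = individuals_control_limits(waits)
out_of_control = [w for w in waits if not lcl <= w <= ucl]
print(round(center, 1), round(lcl, 1), round(ucl, 1), out_of_control)
```

Points falling outside the limits, or systematic runs within them, would signal the special-cause variation that the study found in both turnover time and LOS.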
Statistical techniques for diagnosing CIN using fluorescence spectroscopy: SVD and CART.
Atkinson, E N; Mitchell, M F; Ramanujam, N; Richards-Kortum, R
1995-01-01
A quantitative measure of intraepithelial neoplasia that can be made in vivo without tissue removal would be clinically significant in chemoprevention studies. Our group is working to develop such a technique based on fluorescence spectroscopy. Using empirically based algorithms, we have demonstrated that fluorescence can discriminate normal cervix from low- and high-grade cervical dysplasias with performance similar to colposcopy in expert hands. These measurements can be made in vivo, in near real time, and results can be obtained without biopsy. This paper describes a new method using automated analysis of fluorescence emission spectra to classify cervical tissue into multiple diagnostic categories. First, the data are reduced using the singular value decomposition (SVD), yielding a set of orthogonal basis vectors. Each patient's emission spectrum is then fit by linear least squares regression to the basis vectors, producing a set of coefficients for each patient. Based on these coefficient values, the classification and regression tree (CART) method predicts the patient's classification. These results suggest that laser-induced fluorescence can be used to automatically recognize and differentially diagnose cervical intraepithelial neoplasia (CIN) at colposcopy. This method of analysis is general in nature and can analyze fluorescence spectra of suspected intraepithelial neoplasms from other organ sites. As a more complete understanding of the biochemical and morphologic basis of tissue spectroscopy is developed, it may also become possible to use fluorescence spectroscopy of the cervix as a surrogate endpoint biomarker in Phase I and II chemoprevention trials.
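The SVD-then-regression pipeline (orthogonal basis vectors from training spectra, then least-squares coefficients per patient) can be sketched as below; the Gaussian "spectra" are synthetic placeholders and the final CART classifier is omitted:

```python
import numpy as np

# Hypothetical training matrix: each row is a fluorescence emission spectrum
# sampled at 50 wavelengths (Gaussian peaks stand in for real spectra).
wavelengths = np.linspace(0, 1, 50)
spectra = np.array([np.exp(-((wavelengths - c) ** 2) / 0.02)
                    for c in (0.3, 0.5, 0.7, 0.4, 0.6)])

# SVD of the training spectra yields orthogonal basis vectors (rows of Vt).
_, _, Vt = np.linalg.svd(spectra, full_matrices=False)
basis = Vt[:3]                        # keep the 3 dominant components

# Fit a new patient's spectrum to the basis by linear least squares; the
# resulting coefficients are the features a classifier (CART in the paper) would use.
new_spectrum = np.exp(-((wavelengths - 0.45) ** 2) / 0.02)
coeffs, *_ = np.linalg.lstsq(basis.T, new_spectrum, rcond=None)
reconstruction = basis.T @ coeffs
print(coeffs.shape, float(np.linalg.norm(new_spectrum - reconstruction)))
```

Keeping only the dominant components compresses each spectrum to a handful of coefficients, which is what makes the downstream tree-based classification tractable.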
Zeng, W; Liu, B
1999-01-01
Digital watermarking has been proposed as a means of copyright protection for multimedia data. Many existing watermarking schemes have focused on robust means of marking an image invisibly without really addressing the ends these schemes serve. This paper first discusses some scenarios in which many current watermarking schemes fail to resolve the rightful ownership of an image. The key problems are then identified, and some crucial requirements for valid invisible watermark detection are discussed. In particular, we show that, for the particular application of resolving rightful ownership using invisible watermarks, it may be crucial to require that the original image not be directly involved in the watermark detection process. A general framework for validly detecting invisible watermarks is then proposed. Some requirements on the claimed signatures/watermarks to be used for detection are discussed, to prevent the existence of any counterfeit scheme. The optimal detection strategy within the framework is derived. We show the effectiveness of this technique on some visual-model-based watermark encoding schemes.
Directory of Open Access Journals (Sweden)
Gledsneli Maria Lima Lins
2010-12-01
Water has a decisive influence on populations’ quality of life, specifically in areas like urban supply, drainage, and effluent treatment, due to its strong impact on public health. Rational water use constitutes the greatest challenge faced by water demand management, mainly with regard to urban household water consumption. This makes it important to develop research that assists water managers and public policy-makers in planning and formulating water demand measures that allow rational urban water use to be achieved. This work used the multivariate techniques of factor analysis and multiple linear regression analysis to determine the contribution of socioeconomic and climatic variables to monthly changes in urban household consumption, applying them to two districts of the city of Campina Grande (State of Paraíba, Brazil). The districts were chosen based on a socioeconomic criterion (income level) so as to evaluate their water consumers’ behavior. A 9-year monthly data series (2000 to 2008) was used, comprising family income, water tariff, and number of household connections (economies) as socioeconomic variables, and average temperature and precipitation as climatic variables. For both selected districts of Campina Grande, the results point to the variables “water tariff” and “family income” as indicators of household consumption in these districts.
Garrett, John; Li, Yinsheng; Li, Ke; Chen, Guang-Hong
2017-03-01
Digital breast tomosynthesis (DBT) is a three dimensional (3D) breast imaging modality in which projections are acquired over a limited angular span around the compressed breast and reconstructed into image slices parallel to the detector. DBT has been shown to help alleviate the breast tissue overlapping issues of two dimensional (2D) mammography. Since overlapping tissues may simulate cancer masses or obscure true cancers, this improvement is critically important for improved breast cancer screening and diagnosis. In this work, a model-based image reconstruction method is presented to show that spatial resolution in DBT volumes can be maintained at reduced dose, compared with a state-of-the-art commercial reconstruction technique. Spatial resolution was measured in phantom images and assessed subjectively in a clinical dataset. Noise characteristics were explored in a cadaver study. In both the quantitative and subjective results, image sharpness and overall image quality were maintained at reduced doses when the model-based iterative reconstruction was used to reconstruct the volumes.
Kostenko, Yuri T.; Shkvarko, Yuri V.
1994-06-01
The aim of this presentation is to address a new theoretical approach to the development of nonlinear remote sensing imaging (RSI) techniques that exploit the idea of fusing experiment design and statistical regularization theory-based methods for the solution of inverse problems, optimal/suboptimal in the mixed Bayesian-regularization setting. The basic purpose of this information-fusion methodology is twofold: to design an appropriate system-oriented finite-dimensional model of the RSI experiment in terms of projection schemes for wavefield inversion problems, and to derive two-stage estimation techniques that provide optimal/suboptimal restoration of the power distribution in the environment from a limited number of wavefield measurements. We also discuss issues concerning the available control of some additional degrees of freedom while such an RSI experiment is conducted.
Tahmasebi, Amir M; Sharifi, Reza; Agarwal, Harsh K; Turkbey, Baris; Bernardo, Marcelino; Choyke, Peter; Pinto, Peter; Wood, Bradford; Kruecker, Jochen
2012-01-01
In prostate brachytherapy procedures, combining high-resolution endorectal coil (ERC)-MRI with Computed Tomography (CT) images has been shown to improve the diagnostic specificity for malignant tumors. Despite this advantage, a major complication in fusing the two imaging modalities is the deformation of the prostate shape in ERC-MRI. Conventionally, nonlinear deformable registration techniques have been used to account for this deformation. In this work, we present a model-based technique to account for the deformation of the prostate gland in ERC-MR imaging, in which a unique deformation vector is estimated for every point within the prostate gland. Modes of deformation for every point in the prostate are statistically identified using an MR-based training set (with and without ERC-MRI). Deformation of the prostate from a deformed (ERC-MRI) to a non-deformed state in a different modality (CT) is then realized by first calculating partial deformation information for a limited number of points (such as surface points or anatomical landmarks) and then using the deformation calculated from this subset of points to determine the coefficient values for the modes of deformation provided by the statistical deformation model. Using leave-one-out cross-validation, our results demonstrated a mean estimation error of 1 mm for MR-to-MR registration.
Directory of Open Access Journals (Sweden)
Farooq Ahmad
2011-12-01
Multivariate statistical techniques, such as factor analysis (FA), cluster analysis (CA) and discriminant analysis (DA), were applied for the evaluation of spatial variations and the interpretation of a large, complex water quality data set from three cities (Lahore, Gujranwala and Sialkot) in Punjab, Pakistan. Sixteen parameters of water samples collected from nine different sampling stations in each city were determined. Factor analysis indicates five factors, which explained 74% of the total variance in the water quality data set. The five factors are salinization, alkalinity, temperature, domestic waste and chloride, which explained 31.1%, 14.3%, 10.6%, 10.0% and 8.0% of the total variance, respectively. Hierarchical cluster analysis grouped the nine sampling stations of each city into three clusters, i.e., relatively less polluted (LP), moderately polluted (MP) and highly polluted (HP) sites, based on the similarity of water quality characteristics. Discriminant analysis (DA) identified ten significant parameters (calcium (Ca), ammonia, sulphate, sodium (Na), electrical conductivity (EC), chloride, temperature (Temp), total hardness (TH), turbidity) that discriminate the groundwater quality of the three cities, with close to 100.0% correct assignment for spatial variations. This study illustrates the benefit of multivariate statistical techniques for interpreting complex data sets in the analysis of spatial variations in water quality and for planning future studies.
Khan, Firdos; Pilz, Jürgen
2016-04-01
South Asia is under severe impacts of changing climate and global warming. The last two decades showed that climate change or global warming is happening, and the first decade of the 21st century is considered the warmest decade ever recorded over Pakistan, where the temperature reached 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are badly affected, causing floods, cyclones and hurricanes in the region, which in turn have impacts on agriculture, water, health, etc. To cope with the situation, it is important to conduct impact assessment studies and take adaptation and mitigation remedies. For impact assessment studies, we need climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan. One reason for choosing this area is that the contribution of monsoon rains in this area is more than 80% of the total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers than ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can be used to combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological stations.
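The quantile-regression half of such a method rests on the check (pinball) loss, whose minimizing constant is the corresponding empirical quantile; a sketch with invented rainfall values (the copula and EOF steps are outside this snippet):

```python
def pinball_loss(tau, residual):
    """Check loss used in quantile regression: penalizes under- and
    over-prediction asymmetrically according to the target quantile tau."""
    return tau * residual if residual >= 0 else (tau - 1) * residual

def best_constant_predictor(tau, ys):
    """Among the observed values, the constant minimizing mean pinball loss
    is an empirical tau-quantile of the sample."""
    return min(ys, key=lambda q: sum(pinball_loss(tau, y - q) for y in ys))

# Hypothetical daily monsoon rainfall amounts (mm), not observed data.
rain = [0, 0, 2, 5, 7, 9, 12, 20, 35, 60]
print(best_constant_predictor(0.5, rain))    # -> 7, an empirical median
print(best_constant_predictor(0.75, rain))   # -> 20, the 75th-percentile value
```

Full quantile regression replaces the constant with a linear predictor of the covariates and minimizes the same loss, which is what makes it robust to outliers relative to least squares.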
Statistical signal processing techniques for coherent transversal beam dynamics in synchrotrons
Energy Technology Data Exchange (ETDEWEB)
Alhumaidi, Mouhammad
2015-03-04
identifying and analyzing the betatron oscillation sourced from the kick based on its mixing and temporal patterns. The accelerator magnets can generate unwanted spurious linear and non-linear fields due to fabrication errors or aging. These error fields in the magnets can excite undesired resonances which, together with the space charge tune spread, lead to long-term beam losses and reduced dynamic aperture. Therefore, knowledge of the linear and non-linear magnet errors in circular accelerator optics is crucial for controlling and compensating resonances and their consequent beam losses and beam quality deterioration. This is indispensable, especially for high beam intensity machines. Fortunately, the relationship between the beam offset oscillation signals recorded at the BPMs is a manifestation of the accelerator optics, and can therefore be exploited in determining the linear and non-linear components of the optics. Thus, beam transversal oscillations can be excited deliberately for diagnostic purposes during accelerator operation. In this thesis, we propose a novel method for detecting and estimating the non-linear optics lattice components located between the locations of two BPMs by analyzing the beam offset oscillation signals of a BPM triple containing these two BPMs. Depending on the non-linear components between the locations of the BPM triple, the relationship between the beam offsets follows a corresponding multivariate polynomial. After calculating the covariance matrix of the polynomial terms, the Generalized Total Least Squares method is used to find the model parameters, and thus the non-linear components. A bootstrap technique is used to detect the existing polynomial model orders by means of multiple hypothesis testing, and to determine confidence intervals for the model parameters.
Bonetti, Jennifer; Quarino, Lawrence
2014-05-01
This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
Paul, Michael; Arora, Karunesh; Sumita, Eiichiro
This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
Muhammad, Said; Tahir Shah, M; Khan, Sardar
2010-10-01
The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from springs, streams and the Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physio-chemical parameters with the permissible limits set by the Pakistan environmental protection agency and the world health organization. Most of the studied parameters were found within their respective permissible limits. However, in some samples the iron and arsenic concentrations exceeded their permissible limits. For the health risk assessment of arsenic, the average daily dose, hazard quotient (HQ) and cancer risk were calculated using statistical formulas. The values of HQ were found to be >1 in the samples collected from Jabba and Dubair, while HQ values were <1 in the remaining samples. This level of contamination should pose low chronic risk and medium cancer risk when compared with US EPA guidelines. Furthermore, the inter-dependence of physio-chemical parameters and pollution load was also assessed using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principal component analysis.
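The average daily dose and hazard quotient follow the standard risk-assessment formulas; a sketch with a hypothetical water sample, assuming the common drinking-water defaults (2 L/day intake, 70 kg body weight) and the US EPA oral reference dose for arsenic, 0.0003 mg/kg/day:

```python
def average_daily_dose(conc_mg_per_l, intake_l_per_day, body_weight_kg):
    """Average daily dose (mg/kg/day) from drinking-water ingestion."""
    return conc_mg_per_l * intake_l_per_day / body_weight_kg

def hazard_quotient(add, reference_dose):
    """HQ = ADD / RfD; HQ > 1 flags potential non-carcinogenic risk."""
    return add / reference_dose

# Hypothetical sample: arsenic at 0.05 mg/L (not a measured Kohistan value).
add = average_daily_dose(0.05, 2.0, 70.0)
hq = hazard_quotient(add, 0.0003)
print(round(add, 5), round(hq, 2))   # HQ well above 1 -> flagged for chronic risk
```

A sample with HQ > 1, like the Jabba and Dubair samples in the study, is flagged for potential chronic (non-carcinogenic) effects; cancer risk uses a separate slope-factor calculation.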
Hillyer, Margot M; Finch, Lauren E; Cerel, Alisha S; Dattelbaum, Jonathan D; Leopold, Michael C
2014-08-01
A wide spectrum and large number of children's toys and toy jewelry items were purchased from both bargain and retail vendors and analyzed for arsenic, cadmium, and lead content using multiple analytical techniques, including flame and furnace atomic absorption spectroscopy as well as X-ray fluorescence spectroscopy. Because these metals are particularly dangerous for young children, metal concentrations in toys/toy jewelry were assessed for compliance with current Consumer Product Safety Commission (CPSC) regulations (F963-11). A conservative metric involving multiple analytical techniques was used to categorize compliance: confirmation by one technique of metal in excess of CPSC limits indicated a "suspect" item, while confirmation by two different techniques warranted a non-compliant designation. Sample matrix-based standard addition provided additional confirmation of non-compliant and suspect products. Results suggest that origin of purchase, rather than cost, is a significant factor in the risk assessment of these materials, with 57% of toys/toy jewelry items from bargain stores non-compliant or suspect compared with only 15% from retail outlets, and 13% if only low-cost items from the retail stores are compared. While jewelry was found to be the most problematic product (73% of non-compliant/suspect samples), lead (45%) and arsenic (76%) were the most dominant toxins found in non-compliant/suspect samples. Using the greater Richmond area as a model, the discrepancy between bargain and retail children's products, along with growing numbers of bargain stores in low-income and urban areas, exemplifies an emerging socioeconomic public health issue.
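The matrix-based standard addition used for confirmation extrapolates a spiked calibration line back to zero signal; a minimal sketch with noise-free, invented readings (units and sensitivity are made up):

```python
def standard_addition_concentration(added, signals):
    """Estimate analyte concentration by the method of standard addition:
    fit signal vs. added concentration by ordinary least squares and take
    the magnitude of the x-intercept as the unknown concentration."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signals) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(added, signals)) / \
            sum((x - mx) ** 2 for x in added)
    intercept = my - slope * mx
    return intercept / slope        # |x-intercept| of the calibration line

# Hypothetical spiked readings: true concentration 2.0 (arbitrary units),
# instrument sensitivity 10 signal units per concentration unit, no noise.
added = [0.0, 1.0, 2.0, 3.0]
signals = [10 * (2.0 + a) for a in added]       # 20, 30, 40, 50
print(standard_addition_concentration(added, signals))   # -> 2.0
```

Because the spikes are made into the sample matrix itself, this approach corrects for the matrix effects that make toy substrates hard to quantify against external standards.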
Goodpaster, John V; Sturdevant, Amanda B; Andrews, Kristen L; Brun-Conti, Leanora
2007-05-01
Comparisons of polyvinyl chloride electrical tape typically rely upon evaluating class characteristics such as physical dimensions, surface texture, and chemical composition. Given the various techniques that are available for this purpose, a comprehensive study has been undertaken to establish an optimal analytical scheme for electrical tape comparisons. Of equal importance is the development of a quantitative means for sample discrimination. In this study, 67 rolls of black electrical tape representing 34 different nominal brands were analyzed via scanning electron microscopy and energy dispersive spectroscopy. Differences in surface roughness, calendering marks, and filler particle size were readily apparent, including between some rolls of the same nominal brand. The relative amounts of magnesium, aluminum, silicon, sulfur, lead, chlorine, antimony, calcium, titanium, and zinc varied greatly between brands and, in some cases, could be linked to the year of manufacture. For the first time, quantitative differentiation of electrical tapes was achieved through multivariate statistical techniques, with 36 classes identified within the sample population. A single-blind study was also completed where questioned tape samples were correctly associated with known exemplars. Finally, two case studies are presented where tape recovered from an improvised explosive device is compared with tape recovered from a suspect.
Lu, Tao; Chen, Mingli; Du, Yaping; Qiu, Zongxu
2017-02-01
Lightning location networks (LLNs) with the combined DF/TOA (direction-finder/time-of-arrival) technique have been widely used around the world. However, the accuracy of lightning data from such LLNs is still limited by "site error", especially for strokes detected by only two DF/TOA sensors. In this paper we present a statistical approach for the evaluation and correction of "site error" in a DF/TOA-type LLN based on its lightning data. By comparing lightning locations recorded by at least 4 sensors between the DF and TOA techniques, the spatial characteristics of the "site error" for each sensor in the network can be obtained. The obtained "site error" can then be used to improve the accuracy of lightning locations, especially those recorded by only 2 sensors. With this approach, the "site error" patterns for 23 sensors in the Yunnan LLN are obtained. The features of these site error patterns are in good agreement with those in the literature. Significant differences in lightning locations before and after "site error" corrections indicate that the proposed approach works effectively.
Statistical modeling for degradation data
Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru
2017-01-01
This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.
Cohen, Stephanie; Krueger, Thomas; Fine, Maoz
2017-01-01
As the oceans become less alkaline due to rising CO2 levels, deleterious consequences are expected for calcifying corals. Predicting how coral calcification will be affected by on-going ocean acidification (OA) requires an accurate assessment of CaCO3 deposition and an understanding of the relative importance that decreasing calcification and/or increasing dissolution play for the overall calcification budget of individual corals. Here, we assessed the compatibility of the (45)Ca-uptake and total alkalinity (TA) anomaly techniques as measures of gross and net calcification (GC, NC), respectively, to determine coral calcification at pHT 8.1 and 7.5. Considering the differing buffering capacity of seawater at both pH values, we were also interested in how strongly coral calcification alters the seawater carbonate chemistry under prolonged incubation in sealed chambers, potentially interfering with physiological functioning. Our data indicate that NC estimates by TA are erroneously ∼5% and ∼21% higher than GC estimates from (45)Ca for ambient and reduced pH, respectively. Considering also previous data, we show that the consistent discrepancy between both techniques across studies is not constant, but largely depends on the absolute value of CaCO3 deposition. Deriving rates of coral dissolution from the difference between NC and GC was not possible and we advocate a more direct approach for the future by simultaneously measuring skeletal calcium influx and efflux. Substantial changes in carbonate system parameters for incubation times beyond two hours in our experiment demonstrate the necessity to test and optimize experimental incubation setups when measuring coral calcification in closed systems, especially under OA conditions.
Energy Technology Data Exchange (ETDEWEB)
Kwon, K.S.; Kim, I.H.; Cho, W.J.; Song, W.K.; Synn, J.H.; Choi, S.O.; Yoon, C.H.; Hong, K.P.; Park, C. [Korea Institute of Geology Mining and Materials, Taejon (Korea)
1998-12-01
The ground stability assessment technique for subsidence-prone areas, and the corresponding restoration plans, need to be developed to secure ground stability around mines that have been idle or closed since the 1980s. Up to the present, assessment of subsidence risk has been conducted only after statements from residents or the observation of subsidence symptoms. Generally, the first stage of assessment proceeds through analysis of surface and mining maps, geological survey, and interviews with residents. Drilling surveys, rock property tests, geotechnical rock and ground surveys, and numerical analyses belong to the second stage. After completion of this procedure, the stability of buildings and the severity of subsidence are determined. The acquisition of accurate in-situ data, the estimation of the mechanical properties of the rock mass, and the analysis of the basic mechanism can greatly affect the assessment of subsidence risk. In this study, the development of the subsidence risk assessment method was incorporated with GIS techniques that will be used to produce risk information maps of subsidence. Numerical analyses in 2D and 3D using PFC and FLAC were conducted to estimate the ground stability of the Moo-Geuk Mine area. The displacement behavior of the ground and the development of the failed zone due to the cavity were studied by numerical modelling. The result of the ground stability assessment for the area in question shows that the subsidence risk is relatively small. It is, however, necessary to fill the cavity with suitable materials when new construction of buildings or roads is planned. Finally, measures to prevent subsidence and some case studies are presented; in particular, a case study on the measurement of ground movement in a mine is described in detail. (author). 27 refs., 27 tabs., 62 figs.
Directory of Open Access Journals (Sweden)
Z. Pasandidehfard
2014-03-01
Nonpoint source (NPS) pollution is a major surface water contaminant commonly caused by agricultural runoff. The purpose of this study was to assess seasonal variation in water quality parameters in the Gorganrood watershed (Golestan Province, Iran), and to clarify the effects of agricultural practices and NPS pollution on them. Water quality parameters including potassium, sodium, pH, water flow rate, total dissolved solids (TDS), electrical conductivity (EC), hardness, sulfate, bicarbonate, chlorine, magnesium, and calcium ions during 1966-2010 were evaluated using multivariate statistical techniques. Multivariate analysis of variance (MANOVA) was implemented to determine the significance of differences between mean seasonal values. Discriminant analysis (DA) was also carried out to identify correlations between seasons and the water quality parameters. Water quality index parameters were measured through principal component analysis (PCA) and factor analysis (FA). Based on the results of the statistical tests, climate (freezing, weathering and rainfall) and human activities such as agriculture had crucial effects on water quality. The most important parameters in differentiating between seasons were, in descending order, potassium, pH, carbonic acid, calcium, and magnesium. According to the factor loadings, chlorine, calcium, and potassium were the most important parameters in spring and summer, indicating the application of fertilizers (especially potassium chloride fertilizer) and the existence of NPS pollution during these seasons. In the next stage, the months during which crops had excessive water requirements were detected using the CROPWAT software. Almost all water requirements of the area's major crops, i.e. cotton, rice, soya, wheat, and oat, occur from late spring until mid/late summer. According to our findings, agricultural practices had a great impact on water pollution; the CROPWAT analysis also confirmed this.
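The discriminant-analysis step described above can be sketched with scikit-learn: fit a linear discriminant on water-quality measurements labeled by season and check how well the seasons separate. The data below are synthetic stand-ins (two "seasons" shifted in potassium, pH, and TDS), not the Gorganrood measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Synthetic seasonal samples: columns are potassium, pH, TDS (assumed units).
spring = rng.normal([5.0, 7.8, 300.0], 0.5, size=(40, 3))
summer = rng.normal([7.0, 7.2, 320.0], 0.5, size=(40, 3))
X = np.vstack([spring, summer])
y = np.array([0] * 40 + [1] * 40)  # 0 = spring, 1 = summer

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print(lda.score(X, y))  # in-sample accuracy; high for well-separated seasons
```

With real data one would inspect the discriminant coefficients to rank which parameters (potassium, pH, etc.) drive the seasonal separation, as the abstract does.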
Directory of Open Access Journals (Sweden)
M. Geyer
2013-06-01
The aim of this study is to derive a realistic estimation of the Surface Mass Balance (SMB) of the Greenland ice sheet (GrIS) through statistical downscaling of Global Coupled Model (GCM) outputs. To this end, climate simulations performed with the CNRM-CM5.1 Atmosphere-Ocean GCM within the CMIP5 (Coupled Model Intercomparison Project phase 5) framework are used for the period 1850–2300. From the year 2006, two different emission scenarios are considered (RCP4.5 and RCP8.5). Simulations of SMB performed with the detailed snowpack model Crocus driven by CNRM-CM5.1 surface atmospheric forcings serve as a reference. On the basis of these simulations, statistical relationships between total precipitation, snow ratio, snowmelt, sublimation and near-surface air temperature are established. This leads to the formulation of SMB variation as a function of temperature variation. Based on this function, a downscaling technique is proposed in order to refine the 150 km horizontal resolution SMB output from CNRM-CM5.1 to a 15 km resolution grid. This leads to a much better estimation of SMB along the GrIS margins, where steep topographic gradients are not correctly represented at low resolution. For the recent past (1989–2008), the integrated SMB over the GrIS is respectively 309 and 243 Gt yr−1 for raw and downscaled CNRM-CM5.1. In comparison, the Crocus snowpack model forced with ERA-Interim yields a value of 245 Gt yr−1. The major part of the remaining discrepancy between Crocus and downscaled CNRM-CM5.1 SMB is due to the different snow albedo representations. The difference between the raw and the downscaled SMB tends to increase with near-surface air temperature via an increase in snowmelt.
Directory of Open Access Journals (Sweden)
M. V. Ninu Krishnan
2015-01-01
In the present study, the Information Value (InfoVal) and Multiple Logistic Regression (MLR) methods, based on bivariate and multivariate statistical analysis, have been applied for shallow landslide initiation susceptibility assessment in a selected subwatershed in the Western Ghats, Kerala, India, to determine the suitability of geographical information system (GIS) assisted statistical landslide susceptibility assessment methods in data-constrained regions. The landslide conditioning terrain variables considered in the analysis are geomorphology, land use/land cover, soil thickness, slope, aspect, relative relief, plan curvature, profile curvature, drainage density, distance from drainages, lineament density and distance from lineaments. Landslide Susceptibility Index (LSI) maps were produced by integrating the weighted themes and divided into five landslide susceptibility zones (LSZ) by correlating the LSI with general terrain conditions. The predictive performances of the models were evaluated through success and prediction rate curves. The area under the success rate curves (AUC) for the InfoVal and MLR generated susceptibility maps is 84.11% and 68.65%, respectively. The prediction rate curves show good to moderate correlation between the distribution of the validation group of landslides and the LSZ maps, with AUC values of 0.648 and 0.826 for the MLR and InfoVal produced LSZ maps respectively. Considering the fit and suitability of the models in the study area in terms of quantitative prediction accuracy, the LSZ map produced by the InfoVal technique shows higher accuracy, i.e. 82.60%, than the MLR model, is more realistic when compared in the field, and is considered the best suited model for the assessment of landslide susceptibility in areas similar to the study area. The LSZ map produced for the area can be utilised for regional planning and assessment processes, by incorporating the generalised rainfall conditions in the area.
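The MLR step above amounts to fitting landslide presence/absence against terrain variables and scoring the model with the area under the ROC curve (the AUC the abstract reports). A hypothetical sketch with scikit-learn, on synthetic data where landslides are made more likely on steep, densely drained cells (the variable names and coefficients are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 500
slope = rng.uniform(0, 45, n)            # slope angle in degrees
drainage_density = rng.uniform(0, 5, n)  # km of channel per km^2

# Synthetic ground truth: steeper, better-drained cells slide more often.
logit = 0.15 * slope + 0.6 * drainage_density - 5.0
p = 1.0 / (1.0 + np.exp(-logit))
y = rng.random(n) < p                    # observed landslide occurrence

X = np.column_stack([slope, drainage_density])
model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]    # susceptibility per cell
print(round(roc_auc_score(y, scores), 2))
```

Binning the fitted susceptibility scores into five classes would give the LSZ zonation described in the abstract.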
Directory of Open Access Journals (Sweden)
Draenert Florian G
2010-06-01
Background: Mandibular reconstruction by means of fibula transplants is the standard therapy for severe bone loss after subtotal mandibulectomy. Venous failure still represents the most common complication in free flap surgery. We present the injection of heparin into the arterial pedicle as a modification of revising both anastomoses in these cases, and illustrate the application with a clinical case example. Methods: The method consists of immediate revision surgery with clot removal, heparin perfusion by direct injection into the arterial vessel of the pedicle, subsequent high-dose low-molecular-weight heparin therapy, and leeches. At 6 hours postoperatively, images of early flap recovery showed the first signs of recovery as the livid skin color faded. Results: The application of this technique in a patient with venous thrombosis resulted in complete recovery of the flap 60 hours postoperatively. Other cases achieved similar success without additional lysis therapy or revision of the arterial anastomosis. Conclusion: Rescue of fibular flaps is possible even in patients with massive thrombosis if surgical revision is done quickly.
Directory of Open Access Journals (Sweden)
Ferdinand eKöckerling
2015-10-01
Introduction: In a recent Cochrane review the authors concluded that there is an urgent need for well-powered, well-conducted randomized controlled trials comparing various modes of treatment of fistula-in-ano. Ten randomized controlled trials were available for analysis: there were no significant differences in recurrence rates or incontinence rates in any of the studied comparisons. The following article reviews all studies available on the treatment of fistula-in-ano with a fistula plug. Material and Methods: PubMed, Medline, Embase and the Cochrane medical database were searched up to December 2014; 47 articles were relevant for this review. Results: Healing rates of 50-60% can be expected for treatment of complex anal fistula with a fistula plug, with a plug-extrusion rate of 10-20%. Such results can be achieved not only with plugs made of porcine intestinal submucosa, but also with those made of other biological mesh materials, such as acellular dermal matrix. Important technical steps in the performance of a complex anal fistula plug repair need to be followed. Summary: Treatment of a complex fistula-in-ano with a fistula plug is an option with a success rate of 50-60% and a low complication rate. Further improvements in technique and better studies are needed.
Gaus, Wilhelm
2014-09-02
The US National Toxicology Program (NTP) is assessed from a statistician's perspective. In the NTP program, groups of rodents are fed for a certain period of time with different doses of the substance under investigation. The animals are then sacrificed and all organs are examined pathologically. Such an investigation permits many statistical tests. Technical Report TR 578 on Ginkgo biloba is used as an example. More than 4800 statistical tests are possible with the investigations performed, and a thought experiment leads us to expect >240 falsely significant tests; in actuality, 209 significant pathological findings were reported. The readers of Toxicology Letters should carefully distinguish between confirmative and explorative statistics. A confirmative interpretation of a significant test rejects the null hypothesis and delivers "statistical proof". It is only allowed if (i) a precise hypothesis was established independently of the data used for the test, and (ii) the computed p-values are adjusted for multiple testing if more than one test was performed. Otherwise, an explorative interpretation generates a hypothesis. We conclude that NTP reports - including TR 578 on Ginkgo biloba - deliver explorative statistics, i.e. they generate hypotheses, but do not prove them.
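The arithmetic behind the abstract's thought experiment is simple: with k independent tests each run at significance level alpha, roughly k × alpha false significances are expected by chance, which a Bonferroni correction counteracts by testing at alpha / k. The alpha = 0.05 below is a conventional assumption, not stated in the abstract.

```python
k = 4800                    # number of possible tests, from the abstract
alpha = 0.05                # conventional level (assumed)

expected_false_positives = k * alpha   # chance findings if no correction
bonferroni_level = alpha / k           # per-test level after correction

print(expected_false_positives)        # matches the abstract's ">240" figure
print(bonferroni_level)
```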
Directory of Open Access Journals (Sweden)
Abdelbaset Buhmeida
2011-05-01
The role of DNA content as a prognostic factor in colorectal cancer (CRC) is highly controversial. Some of these controversies are due to purely technical reasons, e.g. variable practices in interpreting DNA histograms, which is problematic particularly in advanced cases. In this report, we give a detailed account of various options for how these histograms could be optimally interpreted, with the idea of establishing the potential value of DNA image cytometry in prognosis and in the selection of proper treatment. The material consists of nuclei isolated from 50 µm paraffin sections from 160 patients with stage II, III or IV CRC diagnosed, treated and followed up in our clinic. The nuclei were stained with the Feulgen stain. Nuclear DNA was measured using computer-assisted image cytometry. We applied 4 different approaches to analyse the DNA histograms: (1) appearance of the histogram (ABCDE approach), (2) range of DNA values, (3) peak evaluation, and (4) events present at high DNA values. Intra-observer reproducibility of these four histogram interpretations was 89%, 95%, 96%, and 100%, respectively. We depict selected histograms to illustrate the four analytical approaches in cases with different stages of CRC and variable disease outcome. In our analysis, the range of DNA values was the best prognosticator, i.e., the tumours with the widest histograms had the most ominous prognosis. These data implicate that DNA cytometry based on isolated nuclei is valuable in predicting the prognosis of CRC. The different interpretation techniques differed in their reproducibility, but the method showing the best prognostic value also had high reproducibility in our analysis.
Directory of Open Access Journals (Sweden)
José L. Valencia
2015-11-01
Rainfall, one of the most important climate variables, is commonly studied due to its great heterogeneity, which occasionally causes negative economic, social, and environmental consequences. Modeling the spatial distributions of rainfall patterns over watersheds has become a major challenge for water resources management. Multifractal analysis can be used to reproduce the scale invariance and intermittency of rainfall processes. To identify which factors are the most influential on the variability of multifractal parameters and, consequently, on the spatial distribution of rainfall patterns at different time scales, in this study universal multifractal (UM) analysis—the C1, α, and γs UM parameters—was combined with non-parametric statistical techniques that allow spatial-temporal comparisons of distributions by gradients. The proposed combined approach was applied to a daily rainfall dataset of 132 time series from 1931 to 2009, homogeneously spatially distributed across a 25 km × 25 km grid covering the Ebro River Basin. A homogeneous increase in C1 over the watershed and a decrease in α, mainly in the western regions, were detected, suggesting an increase in the frequency of dry periods at different scales and an increase in rainfall process variability over the last decades.
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Directory of Open Access Journals (Sweden)
Ruchi Tiwari
2009-12-01
The present study investigated a novel extended-release system of promethazine hydrochloride (PHC) with the acrylic polymers Eudragit RL100 and Eudragit S100 in different weight ratios (1:1 and 1:5) and in combination (0.5+1.5), using freeze-drying and spray-drying techniques. Solid dispersions were characterized by Fourier-transform infrared spectroscopy (FT-IR), differential scanning calorimetry (DSC), powder X-ray diffractometry (PXRD), nuclear magnetic resonance (NMR) and scanning electron microscopy (SEM), as well as by solubility and in vitro dissolution studies in 0.1 N HCl (pH 1.2), double-distilled water and phosphate buffer (pH 7.4). Adsorption tests from drug solution onto the solid polymers were also performed. A selected solid dispersion system was developed into a capsule dosage form and evaluated in in vitro dissolution studies. The progressive disappearance of drug peaks in the thermotropic profiles of spray-dried dispersions was related to the increasing amount of polymer, while SEM studies suggested homogeneous dispersion of the drug in the polymer. Eudragit RL100 had a greater adsorptive capacity than Eudragit S100, and thus the combination (0.5+1.5 for S100 and RL100) exhibited a higher dissolution rate, with 97.14% drug release over twelve hours. Among the different formulations, capsules prepared with the combination of acrylic polymers using spray-drying (1:0.5+1.5) displayed extended release of the drug for twelve hours with 96.87% release, following zero-order kinetics (r² = 0.9986).
Kissling, Grace E; Haseman, Joseph K; Zeiger, Errol
2015-09-02
A recent article by Gaus (2014) demonstrates a serious misunderstanding of the NTP's statistical analysis and interpretation of rodent carcinogenicity data as reported in Technical Report 578 (Ginkgo biloba) (NTP, 2013), as well as a failure to acknowledge the abundant literature on false positive rates in rodent carcinogenicity studies. The NTP reported Ginkgo biloba extract to be carcinogenic in mice and rats. Gaus claims that, in this study, 4800 statistical comparisons were possible, that 209 of them were statistically significant, and that therefore the carcinogenicity of Ginkgo biloba extract cannot be definitively established. However, his assumptions and calculations are flawed, since he incorrectly assumes that the NTP uses no correction for multiple comparisons, and that significance tests for discrete data operate at exactly the nominal level. He also misrepresents the NTP's decision-making process, overstates the number of statistical comparisons made, and ignores the fact that the mouse liver tumor effects were so striking (e.g., p < 0.0000000000001) that it is virtually impossible that they could be false positive outcomes. Gaus' conclusion that such obvious responses merely "generate a hypothesis" rather than demonstrate a real carcinogenic effect has no scientific credibility. Moreover, his claims regarding the high frequency of false positive outcomes in carcinogenicity studies are misleading because of his methodological misconceptions and errors.
Pompe, L.; Clausen, B. L.; Morton, D. M.
2015-12-01
Multi-component statistical techniques and GIS visualization are emerging trends in understanding large data sets. Our research applies these techniques to a large igneous geochemical data set from southern California to better understand magmatic and plate tectonic processes. A set of 480 granitic samples collected by Baird from this area were analyzed for 39 geochemical elements. Of these samples, 287 are from the Peninsular Ranges Batholith (PRB) and 164 from part of the Transverse Ranges (TR). Principal component analysis (PCA) summarized the 39 variables into 3 principal components (PCs) by matrix multiplication; for the PRB these are interpreted as follows: PC1, with about 30% of the variation, included mainly compatible elements and SiO2 and indicates extent of differentiation; PC2, with about 20% of the variation, included HFS elements and may indicate crustal contamination, as usually identified by Sri; PC3, with about 20% of the variation, included mainly HREE and may indicate magma source depth, as often displayed using REE spider diagrams and possibly Sr/Y. Several elements did not fit well in any of the three components: Cr, Ni, U, and Na2O. For the PRB, the PC1 correlation with SiO2 was r = -0.85, the PC2 correlation with Sri was r = 0.80, and the PC3 correlation with Gd/Yb was r = -0.76 and with Sr/Y was r = -0.66. Extending this method to the TR, the correlations were r = -0.85, -0.21, -0.06, and -0.64, respectively. A similar extent of correlation for both areas was visually evident using GIS interpolation. PC1 seems to serve well as a differentiation index for both the PRB and TR and correlates very well with SiO2, Al2O3, MgO, FeO*, CaO, K2O, Sc, V, and Co, but poorly with Na2O and Cr. If the crustal component is represented by Sri, PC2 correlates well, and less expensively, with this indicator in the PRB, but not in the TR. Source depth has been related to the slope of REE spidergrams, and PC3, based on only the HREE and the Sr/Y ratios, gives a reasonable indication of it.
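The PCA step described above (summarizing 39 element concentrations into a few components by matrix multiplication) can be sketched with plain NumPy: standardize each variable, then take components from the SVD of the centered data matrix. The data here are random stand-ins with the study's shape (480 samples × 39 elements), not the Baird geochemistry.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(480, 39))             # placeholder: 480 samples x 39 elements
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each element

# PCA via SVD of the standardized data: rows of Vt are component loadings.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)            # variance fraction per component
scores = X @ Vt[:3].T                      # first 3 PC scores per sample

print(scores.shape, round(float(explained[:3].sum()), 2))
```

On real data, the loadings in `Vt[:3]` would show which elements dominate each component (compatible elements for PC1, HFSE for PC2, HREE for PC3, per the abstract's interpretation), and the scores could be interpolated in a GIS as the authors describe.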
Akifuddin, Syed; Khatoon, Farheen
2015-12-01
Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures, and to reduce the incidence of complications by introduction of the six sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most recurrent complications. A paired z-sample test using Minitab statistical inference and Fisher's exact test were used to statistically analyse the obtained data. The p-value indicated a statistically significant improvement. The six sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
Energy Technology Data Exchange (ETDEWEB)
Dios, R.A.
1984-01-01
This dissertation focuses upon the field of probabilistic risk assessment and its development. It investigates the development of probabilistic risk assessment in nuclear engineering. To provide background for its development, the related areas of population dynamics (demography), epidemiology and actuarial science are studied by presenting information upon how risk has been viewed in these areas over the years. A second major problem involves presenting an overview of the mathematical models related to risk analysis to mathematics educators and making recommendations for presenting this theory in classes of probability and statistics for mathematics and engineering majors at the undergraduate and graduate levels.
The purpose of this memorandum is to inform recipients of concerns regarding Army Corps of Engineers statistical techniques, to provide a list of installations and FWS where SiteStat/GridStats (SS/GS) have been used, and to provide direction on communicating with the public about the use of these 'tools' by USACE.
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
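The two estimators compared above can be sketched for exponentially distributed power samples (the magnitude-squared of complex Gaussian noise): an order-statistic estimator based on the sample median, and a single-pass threshold-and-count estimator that needs only one comparison per sample. The threshold and sample count are illustrative assumptions, not HRMS system parameters.

```python
import numpy as np

rng = np.random.default_rng(7)
true_power = 2.0
x = rng.exponential(true_power, size=100_000)  # noise power samples

# Order-statistic estimator: the median of an exponential is power * ln(2),
# so dividing the sample median by ln(2) recovers the noise power.
power_from_median = np.median(x) / np.log(2)

# Threshold-and-count estimator: P(x > T) = exp(-T / power) for exponential
# samples, so power = -T / ln(fraction above T). This needs only a single
# pass with one threshold comparison per sample, as the abstract notes.
T = 1.0
frac_above = np.mean(x > T)
power_from_count = -T / np.log(frac_above)

print(round(power_from_median, 2), round(power_from_count, 2))
```

Running several such counters with different thresholds in parallel, and selecting the one whose count falls in a usable range, is the dynamic-range extension the abstract describes.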
Statistical considerations on missing data in clinical trials
Institute of Scientific and Technical Information of China (English)
王骏; 韩景静; 黄钦
2016-01-01
Missing data are a common and important issue in clinical trials. They can make the results difficult to interpret, and can even affect the inference and conclusions of the trial; however, the problem has not yet received enough emphasis in China. In the real world of statistical practice in dealing with clinical trial missing data, sponsors often apply various methods without any rationale, which introduces many difficulties for evaluating and confirming the safety and efficacy of new drugs. In this paper, three real cases are brought out for illustration and analysis; the prevention of missing data, the choice of estimator, statistical analysis methods and sensitivity analysis are also discussed. The aim is to draw sponsors' attention to the prevention and treatment of missing data in clinical trials.
Tackett, Jennifer L; Balsis, Steve; Oltmanns, Thomas F; Krueger, Robert F
2009-01-01
Proposed changes in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-V) include replacing current personality disorder (PD) categories on Axis II with a taxonomy of dimensional maladaptive personality traits. Most of the work on dimensional models of personality pathology, and on personality disorders per se, has been conducted on young and middle-aged adult populations. Numerous questions remain regarding the applicability and limitations of applying various PD models to early and later life. In the present paper, we provide an overview of such dimensional models and review current proposals for conceptualizing PDs in DSM-V. Next, we extensively review existing evidence on the development, measurement, and manifestation of personality pathology in early and later life focusing on those issues deemed most relevant for informing DSM-V. Finally, we present overall conclusions regarding the need to incorporate developmental issues in conceptualizing PDs in DSM-V and highlight the advantages of a dimensional model in unifying PD perspectives across the life span.
Sibille, Louis; Chambert, Benjamin; Alonso, Sandrine; Barrau, Corinne; D'Estanque, Emmanuel; Al Tabaa, Yassine; Collombier, Laurent; Demattei, Christophe; Kotzki, Pierre-Olivier; Boudousq, Vincent
2016-07-01
The purpose of this study was to compare a routine bone SPECT/CT protocol using CT reconstructed with filtered backprojection (FBP) with an optimized protocol using low-dose CT images reconstructed with adaptive statistical iterative reconstruction (ASiR). In this prospective study, enrolled patients underwent bone SPECT/CT, with 1 SPECT acquisition followed by 2 randomized CT acquisitions: FBP CT (FBP; noise index, 25) and ASiR CT (70% ASiR; noise index, 40). The image quality of both attenuation-corrected SPECT and CT images was assessed visually (5-point Likert scale, 2 interpreters) and quantitatively (contrast ratio [CR] and signal-to-noise ratio [SNR]). The CT dose index volume, dose-length product, and effective dose were compared. Seventy-five patients were enrolled in the study. Quantitative attenuation-corrected SPECT evaluation showed no inferiority for contrast ratio and SNR derived from FBP CT or ASiR CT (respectively, 13.41 ± 7.83 vs. 13.45 ± 7.99 and 2.33 ± 0.83 vs. 2.32 ± 0.84). Qualitative image analysis showed no difference between attenuation-corrected SPECT images derived from FBP CT or ASiR CT for either interpreter (respectively, 3.5 ± 0.6 vs. 3.5 ± 0.6 and 3.6 ± 0.5 vs. 3.6 ± 0.5). Quantitative CT evaluation showed no inferiority for SNR between FBP and ASiR CT images (respectively, 0.93 ± 0.16 and 1.07 ± 0.17). Qualitative image analysis showed no quality difference between FBP and ASiR CT images for either interpreter (respectively, 3.8 ± 0.5 vs. 3.6 ± 0.5 and 4.0 ± 0.1 vs. 4.0 ± 0.2). Mean CT dose index volume, dose-length product, and effective dose for ASiR CT (3.0 ± 2.0 mGy, 148 ± 85 mGy⋅cm, and 2.2 ± 1.3 mSv) were significantly lower than for FBP CT (8.5 ± 3.7 mGy, 365 ± 160 mGy⋅cm, and 5.5 ± 2.4 mSv). The use of 70% ASiR blending in bone SPECT/CT can reduce the CT radiation dose by 60%, with no sacrifice in attenuation-corrected SPECT and CT image quality, compared with the conventional protocol using FBP CT.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
Karthik, M. N.; Davis, Moshe
2004-01-01
Searching techniques for Case Based Reasoning systems involve extensive methods of elimination. In this paper, we look at a new method of arriving at the right solution by performing a series of transformations upon the data. These involve N-gram based comparison and deduction of the input data with the case data, using Morphemes and Phonemes as the deciding parameters. A similar technique for eliminating possible errors using a noise removal function is performed. The error tracking and elim...
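A minimal illustration of N-gram based comparison for case retrieval. The character-bigram representation and the Dice similarity score are this sketch's assumptions, not the authors' system, which additionally uses morphemes and phonemes as parameters.

```python
def ngrams(text, n=2):
    """Set of character n-grams of a string (bigrams by default)."""
    text = text.lower()
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def dice_similarity(a, b, n=2):
    """Dice coefficient over n-gram sets: 2|A ∩ B| / (|A| + |B|).
    A simple way to score candidate cases against an input string."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga and not gb:
        return 1.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))

def best_match(query, cases, n=2):
    """Return the stored case with the highest n-gram similarity,
    replacing exhaustive elimination with a ranked comparison."""
    return max(cases, key=lambda c: dice_similarity(query, c, n))

match = best_match("statistcal", ["statistical", "optimization", "clinical"])
```

Because n-gram overlap degrades gracefully under typos, the misspelled query above still retrieves the intended case.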
Energy Technology Data Exchange (ETDEWEB)
Aguado Garcia, D.; Ferrer Riquelme, A. J.; Seco Torrecillas, A.; Ferrer Polo, J.
2006-07-01
Due to the increasingly stringent effluent quality requirements imposed by regulations, monitoring wastewater treatment plants (WWTPs) has become extremely important for achieving efficient process operation. Nowadays, modern WWTPs collect a large number of online process variables, and these variables are usually highly correlated. Therefore, appropriate techniques are required to extract information from the huge amount of collected data. In this work, the application of multivariate statistical projection techniques is presented as an effective strategy for monitoring a sequencing batch reactor (SBR) operated for enhanced biological phosphorus removal. (Author)
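A minimal sketch of how such a multivariate projection monitor can work: fit a PCA model on normal-operation data and flag new observations whose squared prediction error (SPE, or Q statistic) is large. The synthetic data, the number of retained components, and the fault magnitude are illustrative assumptions, not the plant's.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "normal operation" data: 6 correlated process variables
# driven by 2 latent factors (a stand-in for real WWTP measurements).
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 6))

# Fit PCA on mean-centered training data via SVD.
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 2                # retained principal components
P = Vt[:k].T         # loading matrix, shape (6, k)

def spe(x):
    """Squared prediction error (Q statistic): the residual left after
    projecting an observation onto the retained PCA subspace."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return float(r @ r)

normal_obs = X[0]
faulty_obs = X[0] + np.array([0, 0, 3.0, 0, 0, 0])  # fault on variable 3
```

A fault that breaks the normal correlation structure inflates the SPE far above its normal-operation level, which is what makes the projection approach useful for monitoring.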
Bouderbala, Abdelkader; Remini, Boualem; Saaed Hamoudi, Abdelamir; Pulido-Bosch, Antonio
2016-06-01
The study focuses on the characterization of groundwater salinity in the Nador coastal aquifer (Algeria). The groundwater quality has undergone serious deterioration due to overexploitation. Groundwater samples were collected at high and low water in 2013, in order to study the evolution of groundwater hydrochemistry from the recharge area to the coast. Different kinds of statistical analysis were performed to identify the main hydrogeochemical processes occurring in the aquifer and to discriminate between different groups of groundwater. These statistical methods provide a better understanding of the aquifer hydrochemistry and reveal a hydrochemical classification of wells, showing that the area of higher salinity is located close to the coast, within the first two kilometers, where salinity gradually increases as one approaches the sea, suggesting groundwater salinization by seawater intrusion.
Energy Technology Data Exchange (ETDEWEB)
Kawano, Toshihiko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-11-10
This theoretical treatment of low-energy compound nucleus reactions begins with the Bohr hypothesis, with corrections, and various statistical theories. The author investigates the statistical properties of the scattering matrix containing a Gaussian Orthogonal Ensemble (GOE) Hamiltonian in the propagator. The following conclusions are reached: For all parameter values studied, the numerical average of MC-generated cross sections coincides with the result of the Verbaarschot, Weidenmueller, Zirnbauer triple-integral formula. Energy average and ensemble average agree reasonably well when the width Γ is one or two orders of magnitude larger than the average resonance spacing d. In the strong-absorption limit, the channel degree of freedom ν_a is 2. The direct reaction increases the inelastic cross sections while the elastic cross section is reduced.
Energy Technology Data Exchange (ETDEWEB)
Helton, J.C.; Kleijnen, J.P.C.
1999-03-24
Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
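Steps (i) and (ii) of the procedure, linear relationships via correlation coefficients and monotonic relationships via rank correlations, can be sketched as follows. The synthetic data stand in for the scatterplots; they are not the authors' two-phase flow model.

```python
import numpy as np

def pearson(x, y):
    """(i) Linear relationship: ordinary correlation coefficient."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    """(ii) Monotonic relationship: Pearson correlation of the ranks
    (valid here because the synthetic data contain no ties)."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(np.asarray(x)), rank(np.asarray(y)))

rng = np.random.default_rng(0)
x = rng.uniform(0, 3, 500)
y_monotone = np.exp(x) + 0.1 * rng.normal(size=500)  # nonlinear, monotonic

r = pearson(x, y_monotone)     # understates the nonlinear dependence
rho = spearman(x, y_monotone)  # near 1: detects the monotone trend
```

The gap between r and rho illustrates why the procedure escalates from correlation to rank correlation: a variable with a strong but nonlinear monotone effect can look only moderately correlated in step (i).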
2011-11-20
symmetric and positive semi-definite (Amari and Nagaoka 2000). If, in addition, G(n) is positive definite, then a Riemannian metric (see Spivak) ... statistical manifold S. This results in a system of second-order differential equations (Eq. 10). By substituting the values of the Christoffel symbols Cijk (Spivak) ... Lauret O, Maheu C, Milagro M, Picot N (2009) In: Radar altimetry tutorial. Benveniste J, Picot N (eds) http://www.altimetry.info Spivak M (1965)
Asfahani, Jamal
2014-02-01
Factor analysis is proposed in this research for interpreting the combination of nuclear well logging, including natural gamma ray, density and neutron porosity, and the electrical well logging of long and short normal, in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish the lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and altered basalt products (clay). The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large volumes of well logging data with a high number of variables need to be interpreted.
Wittenberg, P; Sever, K; Knoth, S; Sahin, N; Bondarenko, J
2013-01-01
Due to substantial progress made in road safety in the last ten years, the European Union (EU) renewed the ambitious agreement of halving the number of persons killed on the roads within the next decade. In this paper we develop a method that aims to find a target for each nation that is as achievable as possible while still reaching the cumulative EU target. Targets, an important component of road safety policy, are given either as a reduction rate or as an absolute number of road traffic deaths. Determination of these quantitative road safety targets (QRST) is done by a top-down approach, formalized in a multi-stage adjustment procedure. Different QRST are derived under consideration of recent research. The paper presents a method to break the national target further down into regional targets in the case of the German Federal States. Generalized linear models are fitted to data from the period 1991-2010. Our model selection procedure chooses various models for the EU and solely log-linear models for the German Federal States. If the proposed targets for the EU Member States are attained, the sum of fatalities should not exceed the total value of 15,465 per year by 2020. Both the mean level and the range of mortality rates within the EU could be lowered from 28-113 in 2010 to 17-41 per million inhabitants in 2020. This study provides an alternative to the determination of safety targets by political commitments only, taking the history of road fatality trends and population into consideration.
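A log-linear trend model of the kind fitted for the German Federal States can be sketched as follows. The fatality counts are hypothetical, and the least-squares fit on log counts is a simple stand-in for the paper's GLM fits, not its exact method.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(20)                 # t = 0..19 maps to years 1991..2010
true_mean = 11000 * np.exp(-0.05 * t)   # assumed 5%/year decline
counts = rng.poisson(true_mean)         # hypothetical annual fatalities

# Log-linear model: log E[deaths] = a + b * t, fitted by least squares
# on the log counts.
b, a = np.polyfit(t, np.log(counts), 1)

halving_years = np.log(0.5) / b         # years needed to halve fatalities
forecast_2020 = np.exp(a + b * 29)      # extrapolate to t = 29 (year 2020)
```

With a fitted decline rate near 5% per year, halving takes roughly ln(2)/0.05 ≈ 14 years, which is exactly the kind of arithmetic behind "halving within a decade" target-setting: the trend alone is not quite enough, so policy measures must push b further down.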
RF Calibration of On-Chip DfT Chain by DC Stimuli and Statistical Multivariate Regression Technique
Ramzan, Rashad; Dabrowski, Jerzy
2015-01-01
The problem of parameter variability in RF and analog circuits is escalating with CMOS scaling. Consequently every RF chip produced in nano-meter CMOS technologies needs to be tested. On-chip Design for Testability (DfT) features, which are meant to reduce test time and cost also suffer from parameter variability. Therefore, RF calibration of all on-chip test structures is mandatory. In this paper, Artificial Neural Networks (ANN) are employed as a multivariate regression technique to archite...
Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol
2013-12-01
The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in the chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at ASIR levels of 0-50 %, and then the noise power spectrum (NPS), signal and noise, and the degree of distortion (peak signal-to-noise ratio [PSNR] and root-mean-square error [RMSE]) were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, the resolution, latitude and degree of distortion of five lesions in the chest phantom were evaluated and the results compiled statistically. The NPS value decreased as the frequency increased. The lowest noise and deviation were at the 20 % ASIR level (mean 126.15 ± 22.21). In the distortion analysis, the signal-to-noise ratio and PSNR at the 20 % ASIR level were the highest, at 31.0 and 41.52, whereas the maximum absolute error and RMSE showed the lowest values, at 11.2 and 16. In the ACR phantom study, all ASIR levels were within the acceptable allowance of the guidelines. The 20 % ASIR level performed best in the qualitative evaluation of the five chest-phantom lesions, with a resolution score of 4.3, latitude of 3.47 and degree of distortion of 4.25. The 20 % ASIR level thus proved best in all experiments: noise, distortion evaluation using ImageJ, and qualitative evaluation of five lesions of a chest phantom. Therefore, optimal images as well as a reduced radiation dose can be achieved when the 20 % ASIR level is applied in thoracic CT.
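The PSNR and RMSE figures of merit used in such phantom studies are straightforward to compute. This is a generic sketch on a random synthetic "phantom", not the study's images or its ImageJ workflow.

```python
import numpy as np

def rmse(ref, img):
    """Root-mean-square error between a reference and a test image."""
    ref, img = np.asarray(ref, float), np.asarray(img, float)
    return float(np.sqrt(np.mean((ref - img) ** 2)))

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means less distortion.
    PSNR = 20 * log10(peak / RMSE)."""
    e = rmse(ref, img)
    return float("inf") if e == 0 else 20 * np.log10(peak / e)

rng = np.random.default_rng(3)
phantom = rng.integers(0, 256, size=(64, 64)).astype(float)
noisy = phantom + rng.normal(0, 10, size=(64, 64))  # sigma = 10 noise

r = rmse(phantom, noisy)   # about the noise sigma, ~10
p = psnr(phantom, noisy)   # about 20*log10(255/10) ≈ 28 dB
```

In a reconstruction comparison like the one above, the RMSE tracks the added noise level directly, while PSNR expresses the same error on a logarithmic scale relative to the display range.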
Chorus wave-normal statistics in the Earth's radiation belts from ray tracing technique
Directory of Open Access Journals (Sweden)
H. Breuillard
2012-08-01
Discrete ELF/VLF (Extremely Low Frequency/Very Low Frequency) chorus emissions are among the most intense electromagnetic plasma waves observed in the radiation belts and in the outer terrestrial magnetosphere. These waves play a crucial role in the dynamics of radiation belts, and are responsible for the loss and the acceleration of energetic electrons. The objective of our study is to reconstruct the realistic distribution of chorus wave-normals in the radiation belts for all magnetic latitudes. To achieve this aim, data from the electric and magnetic field measurements onboard the Cluster satellite are used to determine the wave-vector distribution of the chorus signal around the equator region. The propagation of such a wave packet is then modeled using a three-dimensional ray tracing technique, which employs K. Rönnmark's WHAMP to solve the hot plasma dispersion relation along the wave packet trajectory. The observed chorus wave distributions close to the wave source are first fitted to form the initial conditions, which are then propagated numerically through the inner magnetosphere in the frame of the WKB approximation. The ray tracing technique allows one to reconstruct wave packet properties (electric and magnetic fields, width of the wave packet in k-space, etc.) along the propagation path. The calculations show the spatial spreading of the signal energy due to propagation in the inhomogeneous and anisotropic magnetized plasma. Comparison of the wave-normal distribution obtained from the ray tracing technique with Cluster observations up to 40° latitude demonstrates the reliability of our approach and the applied numerical schemes.
Islam, Rafiqul; Zaidan, A A; Zaidan, B B
2010-01-01
Previously, traditional methods were sufficient to protect information: the simpler systems of the past did not require complicated methods. With the progress of information technology, however, it has become easy to attack systems, and detection of encryption methods has become necessary so that embedding methods can be kept under surveillance by system managers in organizations that require a high level of security. This fact calls for research on new hiding methods and on the cover objects in which hidden information is embedded. One outcome of such research is embedding information in executable files, but using an executable file as a cover raises challenges that must be taken into consideration: first, any changes made to the file will be detected by antivirus software; second, the file must remain functional. In this paper, a new information hiding system is presented. The aim of the proposed system is to ...
Classical Statistics and Statistical Learning in Imaging Neuroscience
Directory of Open Access Journals (Sweden)
Danilo Bzdok
2017-10-01
Brain-imaging research has predominantly generated insight by means of classical statistics, including regression-type analyses and null-hypothesis testing using t-tests and ANOVA. Throughout recent years, statistical learning methods have enjoyed increasing popularity, especially for applications to rich and complex data, including cross-validated out-of-sample prediction using pattern classification and sparsity-inducing regression. This concept paper discusses the implications of inferential justifications and algorithmic methodologies in common data analysis scenarios in neuroimaging. It retraces how classical statistics and statistical learning originated from different historical contexts, build on different theoretical foundations, make different assumptions, and evaluate different outcome metrics to permit differently nuanced conclusions. The present considerations should help reduce current confusion between model-driven classical hypothesis testing and data-driven learning algorithms for investigating the brain with imaging techniques.
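The contrast between in-sample fitting and cross-validated out-of-sample prediction can be made concrete with a small sketch. The nearest-centroid classifier and the synthetic two-class data are this example's assumptions; the point is only that accuracy is always scored on held-out folds.

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Training step: one centroid per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    """Assign each row of X to the class with the nearest centroid."""
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

def cv_accuracy(X, y, k=5, seed=0):
    """k-fold cross-validation: the model is fit on k-1 folds and scored
    on the held-out fold, so accuracy is always out-of-sample."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = nearest_centroid_fit(X[train], y[train])
        accs.append(np.mean(nearest_centroid_predict(model, X[test]) == y[test]))
    return float(np.mean(accs))

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1, (100, 10)), rng.normal(1.5, 1, (100, 10))])
y = np.array([0] * 100 + [1] * 100)
acc = cv_accuracy(X, y)
```

Unlike a classical in-sample fit statistic, this estimate speaks directly to how well the pattern generalizes to unseen subjects, which is the statistical-learning perspective the paper contrasts with hypothesis testing.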
Energy Technology Data Exchange (ETDEWEB)
Kapur, G.S.; Sastry, M.I.S.; Jaiswal, A.K.; Sarpal, A.S
2004-03-17
The present paper describes various classification techniques, like cluster analysis and principal component (PC)/factor analysis, used to classify different types of base stocks. The API classification of base oils (Groups I-III) has been compared to a more detailed NMR-derived classification based on chemical compositional and molecular structural parameters, in order to point out the similarities of the base oils within the same group and the differences between oils placed in different groups. The detailed compositional parameters have been generated using {sup 1}H and {sup 13}C nuclear magnetic resonance (NMR) spectroscopic methods. Further, oxidation stability, measured in terms of rotating bomb oxidation test (RBOT) life, of non-conventional base stocks and their blends with conventional base stocks, has been quantitatively correlated with their {sup 1}H NMR and elemental (sulphur and nitrogen) data with the help of multiple linear regression (MLR) and artificial neural network (ANN) techniques. The MLR-based model developed using NMR and elemental data showed a high correlation between the 'measured' and 'estimated' RBOT values for both training (R=0.859) and validation (R=0.880) data sets. The ANN-based model, developed using a smaller number of input variables (only {sup 1}H NMR data), also showed high correlation between the 'measured' and 'estimated' RBOT values for the training (R=0.881), validation (R=0.860) and test (R=0.955) data sets.
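An MLR model of the measured-vs-estimated kind reported above can be sketched generically. The four predictors stand in for NMR and elemental parameters, and the coefficients and noise level are invented for illustration; only the least-squares machinery is the real technique.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical predictors standing in for 1H NMR compositional parameters
# plus sulphur/nitrogen content; the response stands in for RBOT life.
X = rng.normal(size=(40, 4))
true_coef = np.array([30.0, -12.0, 8.0, 5.0])
rbot = 250 + X @ true_coef + rng.normal(0, 5, 40)

# MLR by ordinary least squares: design matrix with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, rbot, rcond=None)

fitted = A @ coef
r = np.corrcoef(rbot, fitted)[0, 1]  # 'measured' vs 'estimated' correlation
```

The correlation R between measured and fitted responses is exactly the quantity quoted in the abstract for the training set; a proper validation R would require scoring the model on data held out of the fit.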
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz
2017-10-01
Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, integration of hydrochemical investigations involving chemical and statistical analyses are conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
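Cluster analysis of water samples, as used here to delineate groundwater zones, can be sketched with a plain k-means on synthetic data. The two variables, the cluster count, and the facies centers are illustrative assumptions, not the Dhurma aquifer data.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: a common first step for grouping water samples
    into hydrochemical zones before interpreting each cluster."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

rng = np.random.default_rng(6)
# Two synthetic water-quality variables (think salinity and nitrate),
# drawn around two distinct hydrochemical facies.
fresh = rng.normal([0.5, 1.0], 0.2, (30, 2))
saline = rng.normal([3.0, 4.0], 0.3, (30, 2))
X = np.vstack([fresh, saline])
labels, centers = kmeans(X, 2)
```

In a real study the clusters would then be validated against hydrochemical evidence, as the abstract describes; here the two synthetic facies are recovered exactly.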
Directory of Open Access Journals (Sweden)
Cláudio Roberto Rosário
2012-07-01
The purpose of this research is to improve the practice of customer satisfaction analysis. The article presents a model to analyze the answers of a customer satisfaction evaluation in a systematic way with the aid of multivariate statistical techniques, specifically exploratory analysis with PCA (principal component analysis) and HCA (hierarchical cluster analysis). The applicability of the model was evaluated as a tool to help the company under study identify the value chain perceived by the customer when the customer satisfaction questionnaire is applied. With the assistance of multivariate statistical analysis, similar behavior among customers was observed. The analysis also allowed the company to review the questions in its questionnaires, using the degree of correlation between questions, which was not a company practice before this research.
Percutaneous vertebroplasty: technical considerations
Institute of Scientific and Technical Information of China (English)
Gao-jun TENG; Shi-cheng HE
2005-01-01
Percutaneous vertebroplasty (PVP) is a relatively new interventional technique widely used in the treatment of vertebral collapse caused by vertebral neoplasms and osteoporotic compression fractures. In this article, the general technical considerations of PVP are discussed based on the authors' experience with over 400 patients in past years, including preparation of PMMA, PVP instruments, guidance and puncture approaches, and the technique of the procedure itself. The conclusion is that PVP is a safe procedure if physicians handle it properly.
Directory of Open Access Journals (Sweden)
R. Matos Cruz
2011-12-01
considered as an adjunct to cause-related periodontal therapy. The kind of surgery performed, the number of sites included and the moment at which it should be performed are decided after evaluating the results of the initial cause-related therapy. The ultimate objective of periodontal surgical treatment is the long-term preservation of the periodontium. Periodontal surgery can contribute to this end by creating accessibility for adequate scaling and root planing and by restoring a gingival morphology that facilitates the patient's own plaque control. Surgical techniques must be evaluated on the basis of their potential to facilitate the elimination of subgingival deposits and plaque control, and thus improve the long-term preservation of the periodontium. In the present article the authors review the basis of surgical periodontal treatment as well as related concepts and considerations, objectives, indications and contraindications, procedures, and the factors that determine the selection of one or another surgical technique.
Boisrobert, Loic; Laclaustra, Martin; Bossa, Matias; Frangi, Andres G.; Frangi, Alejandro F.
2005-04-01
Clinical studies report that impaired endothelial function is associated with cardiovascular diseases (CVD) and their risk factors. One commonly used means of assessing endothelial function is flow-mediated dilation (FMD). Classically, FMD is quantified using local indexes, e.g. maximum peak dilation. Although such parameters have been successfully linked to CVD risk factors and other clinical variables, this description does not consider all the information contained in the complete vasodilation curve. Moreover, the relation between the flow impulse and the vessel's vasodilation response to this stimulus, although not clearly known, seems to be important and is not taken into account in the majority of studies. In this paper we propose a novel global parameterization for the vasodilation and flow curves of an FMD test. This parameterization uses Principal Component Analysis (PCA) to describe independently and jointly the variability of flow and FMD curves. These curves are obtained using computerized techniques (based on edge detection and image registration, respectively) to analyze the ultrasound image sequences. The global description obtained through PCA yields a detailed characterization of the morphology of such curves, allowing the extraction of intuitive quantitative information about the vasodilation process and its interplay with flow changes. This parameterization is consistent with traditional measurements and, in a database of 177 subjects, seems to correlate more strongly (and with more clinical parameters) than classical measures with CVD risk factors and clinical parameters such as LDL- and HDL-cholesterol.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Institute of Scientific and Technical Information of China (English)
Liu Wenyun; Ding Xiaobo; Kong Boyu; Fan Baoyan; Chen Liang
2014-01-01
Background: Currently there is a trend towards reducing radiation dose while maintaining image quality during computed tomography (CT) examinations. This results from concerns about radiation exposure from CT and the potential increase in the incidence of radiation-induced carcinogenesis. This study aimed to investigate the lowest radiation dose that maintains good image quality in adult chest scanning using GE CT equipment. Methods: Seventy-two adult patients were examined by Gemstone Spectral CT. They were randomly divided into six groups. We set a different value of the noise index (NI) for each group, at every other number from 13.0 to 23.0. The original images were acquired with a slice thickness of 5 mm. For each group, several image series were reconstructed using different levels of adaptive statistical iterative reconstruction (ASIR) (30%, 50%, and 70%), giving a total of 18 image sequences with different combinations of NI and ASIR percentage. On one hand, quantitative indicators, such as CT value and standard deviation (SD), were assessed at the region of interest; the signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated, and the volume CT dose index (CTDI) and dose-length product (DLP) were recorded. On the other hand, two radiologists with >5 years of experience blindly reviewed the subjective image quality using standards we had previously set. Results: The different combinations of noise index and ASIR were assessed. There was no significant difference in CT values among the 18 image sequences. The SD value was reduced as the noise index decreased or the ASIR increased. There was a trend towards gradually lower SNR and CNR with an NI increase. The CTDI and DLP diminished as the NI increased. The scores from the subjective image quality evaluation were reduced in all groups as the ASIR increased. Conclusions: Increasing NI can reduce radiation dose. With the premise of maintaining the same image quality, using a suitable percentage of
Wienke, B R; O'Leary, T R
2008-05-01
Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), its dynamical principles, and its correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, and helitrox no-decompression time limits, repetitive dive tables, and selected mixed-gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm; NAUI tables; University of Wisconsin Seafood Diver tables; comparative NAUI, PADI, and Oceanic NDLs and repetitive dives; comparative nitrogen and helium mixed-gas risks; the USS Perry deep rebreather (RB) exploration dive; a world-record open circuit (OC) dive; and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed-gas diving, in both recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with an L2 error norm. Appendices sketch the numerical methods and list reports from field testing for (real) mixed-gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance-reduction technique and an additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep-stop data. Supercomputing resources are requisite to connect model and data in application.
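The likelihood-based risk fitting described above can be illustrated in miniature. The one-parameter risk function, the synthetic dive outcomes, and the grid search standing in for the Levenberg-Marquardt routine are all assumptions of this sketch, not the LANL model or its data bank.

```python
import math
import random

# Toy risk function r(g) = 1 - exp(-beta * g), where g is a
# supersaturation-like gradient and beta is the parameter to fit.
random.seed(7)
true_beta = 0.8
gradients = [random.uniform(0.1, 2.0) for _ in range(2000)]
hits = [1 if random.random() < 1 - math.exp(-true_beta * g) else 0
        for g in gradients]  # synthetic binary outcomes (DCS hit / no hit)

def neg_log_likelihood(beta):
    """Negative Bernoulli log-likelihood of the outcomes under beta."""
    ll = 0.0
    for g, h in zip(gradients, hits):
        p = 1 - math.exp(-beta * g)
        ll += math.log(p) if h else -beta * g  # log(1-p) = -beta*g
    return -ll

# A coarse grid search stands in for the Levenberg-Marquardt step:
# pick the beta that minimizes the negative log-likelihood.
betas = [b / 100 for b in range(10, 200)]
beta_hat = min(betas, key=neg_log_likelihood)
```

With enough profiles, the maximum-likelihood estimate lands close to the generating parameter, which is the sense in which table and meter risks are "obtained from likelihood analysis" against recorded dive outcomes.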
Directory of Open Access Journals (Sweden)
Shi Xiaoyan
2010-10-01
selection techniques with classification methods to develop assays using cell line genomic measurements that performed well in patient data. In both case studies, we constructed parsimonious models that generalized well from cell lines to patients.
Herojeet, Rajkumar; Rishi, Madhuri S.; Lata, Renu; Dolma, Konchok
2017-09-01
multivariate techniques for reliable quality characterization of surface water quality to develop effective pollution reduction strategies and maintain a fine balance between the industrialization and ecological integrity.
Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel
2017-09-01
In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG) have been recently applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be very efficient and a low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. In this study, we analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
Energy Technology Data Exchange (ETDEWEB)
Somerville, Richard
2013-08-22
The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).
Chandrasekaran, A; Ravisankar, R; Rajalakshmi, A; Eswaran, P; Vijayagopal, P; Venkatraman, B
2015-02-01
Gamma-ray and Fourier Transform Infrared (FTIR) spectroscopic techniques were used to evaluate the natural radioactivity due to natural radionuclides and to characterize the mineralogy of soils of the Yelagiri hills, Tamilnadu, India. Various radiological parameters were calculated to assess the radiation hazards associated with the soil. The distribution pattern of activity due to natural radionuclides is explained by the Kriging method of mapping. Using the FTIR spectroscopic technique, minerals such as quartz, microcline feldspar, orthoclase feldspar, kaolinite, montmorillonite, illite, and organic carbon were identified and characterized. Extinction coefficient values were calculated to determine the relative distribution of the major minerals quartz, microcline feldspar, orthoclase feldspar and kaolinite. The calculated values indicate that the amount of quartz is higher than that of orthoclase feldspar and microcline feldspar, and much higher than that of kaolinite. A crystallinity index was calculated to assess the crystalline nature of the quartz; the results indicate the presence of disordered crystalline quartz in the soils. The relation between minerals and radioactivity was assessed by multivariate statistical analysis (Pearson's correlation and cluster analysis), which confirms that the clay mineral kaolinite and the non-clay mineral quartz contribute more than the other major minerals to the key radioactivity variables and to the concentrations of uranium and thorium.
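The multivariate step described above (Pearson's correlation plus cluster analysis) can be sketched on synthetic data. The mineral and radionuclide values below are invented, and Ward's linkage is used as one common choice of hierarchical clustering; nothing here reproduces the study's actual measurements.

```python
# Illustrative sketch (synthetic data): relate mineral abundance to
# radioactivity with Pearson correlation, then group samples with Ward's
# hierarchical clustering.
import numpy as np
from scipy.stats import pearsonr
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
n = 50
quartz = rng.normal(60, 5, n)           # invented abundance proxies
kaolinite = rng.normal(10, 2, n)
uranium = 0.5 * quartz + 2.0 * kaolinite + rng.normal(0, 3, n)

r, p = pearsonr(kaolinite, uranium)
print(f"kaolinite-U correlation r = {r:.2f} (p = {p:.3g})")

# Ward clustering of the samples on standardized variables
X = np.column_stack([quartz, kaolinite, uranium])
X = (X - X.mean(axis=0)) / X.std(axis=0)
Z = linkage(X, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```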
Velasco-Tapia, Fernando
2014-01-01
Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
Directory of Open Access Journals (Sweden)
Fernando Velasco-Tapia
2014-01-01
Full Text Available Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
Freund, Rudolf J; Wilson, William J
2010-01-01
Statistical Methods, 3e provides students with a working introduction to statistical methods offering a wide range of applications that emphasize the quantitative skills useful across many academic disciplines. This text takes a classic approach emphasizing concepts and techniques for working out problems and interpreting results. The book includes research projects, real-world case studies, numerous examples and data exercises organized by level of difficulty. This text requires that a student be familiar with algebra. New to this edition: NEW expansion of exercises a
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Statistical methods in translational medicine.
Chow, Shein-Chung; Tse, Siu-Keung; Lin, Min
2008-12-01
This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.
Statistical Methods in Translational Medicine
Directory of Open Access Journals (Sweden)
Shein-Chung Chow
2008-12-01
Full Text Available This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
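A minimal sketch of the MOS comparison idea: a linear-regression baseline against one of the machine-learning methods named above (a random forest). The "raw forecast" and "observed generation" series are synthetic stand-ins, not the California data, and the error scores are purely illustrative.

```python
# Hedged sketch: compare MOS-style statistical corrections on synthetic
# "raw NWP forecast vs. observation" pairs.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(2)
raw = rng.uniform(0, 20, 500)                          # invented raw forecast
obs = 0.8 * raw + 2.0 * np.sin(raw / 3.0) + rng.normal(0, 0.5, raw.size)

X_train, X_test = raw[:400, None], raw[400:, None]
y_train, y_test = obs[:400], obs[400:]

models = {
    "linear (baseline MOS)": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=100, random_state=0),
}
mae = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    mae[name] = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: MAE = {mae[name]:.2f}")
```

Because the synthetic error pattern is nonlinear, the tree-based method should outperform the linear baseline here, echoing the motivation for trying machine-learning MOS variants.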
Directory of Open Access Journals (Sweden)
Georgescu Daniel Ștefan
2014-09-01
Full Text Available This paper presents appreciations and contributions regarding the use of psychological techniques to stimulate technical creativity, with special reference to the consonant association technique and the inversion technique. The study is performed in the field of TISR transformers and electric motors with limited movement, starting from the analogy between a transformer and an electric motor with a shorted coil. It approaches a particular aspect of the inversion technique in relation to the transformation of negative effects and results of laws, phenomena and processes into useful applications. The issue addressed is related to the question: "why disadvantages and not advantages?". The paper concludes by presenting and discussing several experimental models produced and studied by the authors in the Research Laboratory of Machines, Equipment and Drives at the University of Suceava, along with conclusions drawn from the experimental study and directions for future research.
Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik
2016-04-01
Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. To overcome this limitation, multivariate statistical techniques can combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, the asymptotic properties of extremal dependence were first investigated. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour extreme precipitation event (or vice versa) can be twice as great as it would be under the assumption of independence. Presuming independence between these two variables would therefore result in severe underestimation of the flooding risk in the study area.
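The Partial Duration Series step (fitting a Generalized Pareto distribution to threshold exceedances) can be sketched as follows. The rainfall series, threshold choice, and return-level horizon are illustrative assumptions, not values from the study.

```python
# Minimal sketch (synthetic data): fit a Generalized Pareto distribution to
# exceedances over a high threshold, then compute a return level.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
rainfall = rng.exponential(scale=10.0, size=5000)    # invented hourly record
threshold = np.quantile(rainfall, 0.95)
excesses = rainfall[rainfall > threshold] - threshold

# Fit the GPD to the excesses; loc is fixed at 0 for exceedance data
shape, loc, scale = genpareto.fit(excesses, floc=0)
print(f"shape = {shape:.3f}, scale = {scale:.2f}")

# 100-observation return level above the threshold
p_exceed = excesses.size / rainfall.size
return_level = threshold + genpareto.ppf(1 - 1 / (100 * p_exceed),
                                         shape, loc=0, scale=scale)
print(f"100-observation return level ~ {return_level:.1f}")
```

For exponentially distributed data the true GPD shape parameter is 0, so the fitted shape should land near zero, a quick sanity check on the fit.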
Directory of Open Access Journals (Sweden)
I. Arismendi
2014-05-01
Full Text Available Central tendency statistics may not capture relevant or desired characteristics about the variability of continuous phenomena and thus, they may not completely track temporal patterns of change. Here, we present two methodological approaches to identify long-term changes in environmental regimes. First, we use higher statistical moments (skewness and kurtosis to examine potential changes of empirical distributions at decadal scale. Second, we adapt an outlier detection procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability, patterns in variability through time, and spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal a differentiated vulnerability to both the climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
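The first approach above, tracking higher moments (skewness and kurtosis) of a continuous record across periods, can be sketched with synthetic "stream temperature" regimes. The two regimes below are invented so that their central tendencies match while their distributional shapes differ, which is exactly the situation the abstract argues mean-based statistics miss.

```python
# Illustrative sketch: compare skewness and kurtosis of two synthetic
# decadal records whose means are similar but whose shapes differ.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(4)
decade1 = rng.normal(12.0, 2.0, 3650)                     # symmetric regime
decade2 = rng.gamma(shape=2.0, scale=1.5, size=3650) + 9  # right-skewed regime

for label, series in [("decade 1", decade1), ("decade 2", decade2)]:
    print(f"{label}: mean = {series.mean():.1f}, "
          f"skew = {skew(series):.2f}, kurtosis = {kurtosis(series):.2f}")
```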
Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi
2015-03-15
Remote sensing has been widely used for water quality monitoring, but most of these monitoring studies have only focused on a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which have typically been considered optically active variables. Remote sensing presents a challenge in estimating the phosphorus concentration in water. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations, primarily using the simple individual band ratio or its natural logarithm and the statistical regression method based on the field TP data and the spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the spatial modeling scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. The correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme had a higher accuracy for the TP concentration estimation in the large lake compared with the traditional individual band ratio method and the whole-lake scale regression-modeling scheme. The TP concentration values showed a clear spatial variability and were high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest TP concentration values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggested that the proposed modeling scheme, i.e., the band combinations and the regional multivariate
Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert
2016-05-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~20,000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
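The simpler of the two analyses, averaging weighted by the aggregate misfit score, can be sketched as follows. The per-run sea-level values, misfits, and Gaussian weighting kernel are invented for illustration; the real study's scoring combines several geologic data types.

```python
# Simple sketch: score-weighted averaging over a large model ensemble.
# Runs with smaller model-data misfit receive larger weight.
import numpy as np

rng = np.random.default_rng(5)
n_runs = 625
sea_level_rise = rng.normal(3.3, 1.0, n_runs)   # invented per-run estimates
misfit = np.abs(sea_level_rise - 3.0) + rng.normal(0, 0.1, n_runs)

# Convert misfit to a weight (smaller misfit -> larger weight)
weights = np.exp(-0.5 * (misfit / misfit.std()) ** 2)
weighted_mean = np.average(sea_level_rise, weights=weights)
print(f"unweighted mean = {sea_level_rise.mean():.2f}, "
      f"weighted mean = {weighted_mean:.2f}")
```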
Directory of Open Access Journals (Sweden)
Eduardo de Rezende Francisco
2010-08-01
Full Text Available Given the growing importance of integrating marketing and operations indicators to enhance business performance, and the availability of sophisticated geospatial statistical techniques, this paper draws on these concepts to develop an indicator of propensity to commercial energy losses. Loss management is a strategic topic among energy distribution companies, in particular for AES Eletropaulo. In this context, the work's objectives are: (i) to apply spatial auto-regressive models and geographically weighted regression (GWR) to measure the cultural influence of the neighborhood on customer behavior in the act of energy fraud; (ii) to replace slum coverage areas with a regional social vulnerability index; and (iii) to associate energy loss with customer satisfaction indicators in a spatial-temporal approach. Spatial regression techniques are reviewed, followed by a discussion of social vulnerability and customer satisfaction indicators. Operational data obtained from AES Eletropaulo's geographical information systems were combined with secondary data to generate predictive regression models, with energy loss as the response variable. Results show that incorporating market- and social-oriented data about customers contributes substantially to explaining energy loss: the coefficient of determination in the regression models rose from 17.76% to 63.29% when the simpler model was compared with the more complex one. Suggestions are made for future work, and opportunities for replicating the methodology in comparable contexts are discussed.
Van Schependom, Jeroen; Yu, Weiping; Gielen, Jeroen; Laton, Jorne; De Keyser, Jacques; De Hert, Marc; Nagels, Guy
2015-10-01
Metabolic and cardiovascular diseases in patients with schizophrenia have gained a lot of interest in recent years. Developing an algorithm to detect the metabolic syndrome based on readily available variables would eliminate the need for blood sampling, which is considered expensive and inconvenient in this population. All patients fulfilled DSM-IV diagnosis of schizophrenia or schizoaffective disorder. We used the International Diabetes Federation criteria (European population) to diagnose the metabolic syndrome. We used logistic regression and optimized artificial neural networks and support vector machines to detect the metabolic syndrome in a cohort of schizophrenic patients of the University Psychiatric Center Kortenberg, KU Leuven, Belgium. Testing was done on one-third of the included cohort (202 patients); training was performed using a 10-fold stratified cross-validation scheme. The data were collected between 2000 and 2008. All 3 methods yielded similar results, with satisfying accuracies of about 80%. However, none of the advanced statistical methods could improve on the results obtained using a very simple and naive model including only central obesity and information on blood pressure. Although so-called pattern recognition techniques hold great promise for improving clinical decision making, the results should be presented with caution and preferably in comparison with a less complicated technique. © Copyright 2015 Physicians Postgraduate Press, Inc.
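The kind of simple baseline the authors favor, a two-variable model on central obesity and blood pressure, can be sketched with logistic regression. The variables, coefficients, and labels below are entirely synthetic and do not reproduce the IDF criteria or the Kortenberg cohort.

```python
# Hedged sketch: a two-feature logistic-regression detector on invented data,
# illustrating the "simple model as baseline" comparison described above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 600
waist = rng.normal(95, 12, n)                    # cm, invented
systolic = rng.normal(130, 15, n)                # mmHg, invented
logit = 0.08 * (waist - 95) + 0.04 * (systolic - 130) - 0.3
has_mets = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([waist, systolic])
X = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize features
X_tr, X_te, y_tr, y_te = train_test_split(X, has_mets, test_size=0.33,
                                          random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy = {acc:.2f}")
```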
Li, R.; Wang, S.-Y.; Gillies, R. R.
2016-04-01
Large biases associated with climate projections are problematic when it comes to their regional application in the assessment of water resources and ecosystems. Here, we demonstrate a method that can reduce systematic biases in regional climate projections. The global and regional climate models employed to demonstrate the technique are the Community Climate System Model (CCSM) and the Weather Research and Forecasting (WRF) model. The method first utilized a statistical regression technique and a global reanalysis dataset to correct biases in the CCSM-simulated variables (e.g., temperature, geopotential height, specific humidity, and winds) that are subsequently used to drive the WRF model. The WRF simulations were conducted for the western United States and were driven with (a) global reanalysis, (b) original CCSM, and (c) bias-corrected CCSM data. The bias-corrected CCSM data led to a more realistic regional climate simulation of precipitation and associated atmospheric dynamics, as well as snow water equivalent (SWE), in comparison to the original CCSM-driven WRF simulation. Since most climate applications rely on existing global model output as the forcing data (i.e., they cannot re-run or change the global model), which often contain large biases, this method provides an effective and economical tool to reduce biases in regional climate downscaling simulations of water resource variables.
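The first stage of the method, a statistical regression mapping biased GCM output toward a reanalysis "truth" over a training period, can be sketched as a simple linear correction. The temperature series and the bias structure here are invented; the actual study corrects several 3-D driving fields, not one scalar.

```python
# Minimal sketch: learn a linear bias correction from model output to
# reanalysis, then apply it to new (uncorrected) model output.
import numpy as np

rng = np.random.default_rng(7)
reanalysis = rng.normal(280, 8, 200)                    # K, invented "truth"
gcm = 0.9 * reanalysis + 35 + rng.normal(0, 1, 200)     # biased model output

# Fit corrected = a * gcm + b by least squares on the training period
a, b = np.polyfit(gcm, reanalysis, deg=1)

gcm_new = 0.9 * rng.normal(280, 8, 100) + 35            # new model output
corrected = a * gcm_new + b
raw_bias = np.mean(gcm - reanalysis)
print(f"raw bias = {raw_bias:.1f} K, correction slope a = {a:.2f}")
```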
Mathematical and statistical analysis
Houston, A. Glen
1988-01-01
The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.
Experiment in Elementary Statistics
Fernando, P. C. B.
1976-01-01
Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)
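The exercise described, verifying aspects of the Gaussian distribution empirically, might look like this in outline; the sample size and distribution parameters are arbitrary choices for the sketch.

```python
# Sketch of the laboratory exercise: check the 68-95-99.7 rule on a
# simulated Gaussian measurement sample.
import numpy as np

rng = np.random.default_rng(8)
sample = rng.normal(loc=50.0, scale=4.0, size=10000)

mean, sd = sample.mean(), sample.std()
for k, expected in [(1, 0.683), (2, 0.954), (3, 0.997)]:
    frac = np.mean(np.abs(sample - mean) < k * sd)
    print(f"within {k} sd: {frac:.3f} (Gaussian: {expected})")
```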
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry.Explores
Ertemi, Hani; Khetrapal, Pramit; Pavithran, Nevil M; Mumtaz, Faiz
2017-02-03
Nonmodifiable factors including pre-operative renal function and amount of healthy renal tissue preserved are the most important predictive factors that determine renal function after partial nephrectomy. Ischaemia time is an important modifiable risk factor and cold ischaemia time should be used if longer ischaemia time is anticipated. New techniques may have a role in maximising postoperative kidney function, but more robust studies are required to understand their potential benefits and risks.
Hamchevici, Carmen; Udrea, Ion
2013-11-01
The concept of a basin-wide Joint Danube Survey (JDS) was launched by the International Commission for the Protection of the Danube River (ICPDR) as a tool for investigative monitoring under the Water Framework Directive (WFD), with a frequency of 6 years. The first JDS was carried out in 2001, and its success in providing key information for characterisation of the Danube River Basin District as required by the WFD led to the organisation of the second JDS in 2007, which was the world's biggest river research expedition in that year. The present paper presents an approach for improving the survey strategy for the next planned survey, JDS3 (2013), by means of several multivariate statistical techniques. In order to design the optimum structure in terms of parameters and sampling sites, principal component analysis (PCA), factor analysis (FA) and cluster analysis were applied to JDS2 data for 13 selected physico-chemical elements and one biological element measured in 78 sampling sites located on the main course of the Danube. Results from PCA/FA showed that most of the dataset variance (above 75%) was explained by five varifactors loaded with 8 out of 14 variables: physical (transparency and total suspended solids), relevant nutrients (N-nitrates and P-orthophosphates), feedback effects of primary production (pH, alkalinity and dissolved oxygen) and algal biomass. Taking into account the representation of the factor scores given by FA versus sampling sites and the major groups generated by the clustering procedure, the spatial network of the next survey could be carefully tailored, leading to a decrease in sampling sites of more than 30%. The target-oriented sampling strategy based on the selected multivariate statistics can provide a strong reduction in the dimensionality of the original data, and in the corresponding costs as well, without any loss of information.
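The PCA step above, finding how few components capture most of the variance in a sites-by-parameters matrix, can be sketched on a simulated 78-site by 14-variable dataset. The latent-factor structure below is invented; only the matrix dimensions echo the JDS2 setup.

```python
# Illustrative sketch: PCA on a simulated water-quality matrix, counting the
# components needed to explain 75% of the variance.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
n_sites, n_vars = 78, 14
# Two latent drivers (e.g. "nutrients", "primary production") plus noise
factors = rng.normal(size=(n_sites, 2))
loadings = rng.normal(size=(2, n_vars))
X = factors @ loadings + 0.3 * rng.normal(size=(n_sites, n_vars))

X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize before PCA
pca = PCA().fit(X)
explained = np.cumsum(pca.explained_variance_ratio_)
n_keep = int(np.searchsorted(explained, 0.75)) + 1
print(f"components needed for 75% of variance: {n_keep}")
```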
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Porter, W. C.; Heald, C. L.; Cooley, D. S.; Russell, B. T.
2013-12-01
Episodes of air-quality extremes are known to be heavily influenced by meteorological conditions, but traditional statistical analysis techniques focused on means and standard deviations may not capture important relationships at the tails of these two respective distributions. Using quantile regression (QR) and extreme value theory (EVT), methodologies specifically developed to examine the behavior of heavy-tailed phenomena, we analyze extremes in the multi-decadal record of ozone (O3) and fine particulate matter (PM2.5) in the United States. We investigate observations from the Air Quality System (AQS) and Interagency Monitoring of Protected Visual Environments (IMPROVE) networks for connections to meteorological drivers, as provided by the National Center for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) product. Through regional characterization by quantile behavior and EVT modeling of the meteorological covariates most responsible for extreme levels of O3 and PM2.5, we estimate pollutant exceedance frequencies and uncertainties in the United States under current and projected future climates, highlighting those meteorological covariates and interactions whose influence on air-quality extremes differs most significantly from the behavior of the bulk of the distribution. As current policy may be influenced by air-quality projections, we then compare these estimated frequencies to those produced by NCAR's Community Earth System Model (CESM) identifying regions, covariates, and species whose extreme behavior may not be adequately captured by current models.
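Quantile regression, the first of the two methodologies named above, can be sketched by minimizing the pinball (check) loss directly. The ozone-temperature relationship and its heteroscedastic spread are invented for illustration; the point is that the fitted slope at the 95th percentile can differ from the slope at the median.

```python
# Hedged sketch: quantile regression of a synthetic "ozone" response on a
# "temperature" covariate by direct minimization of the pinball loss.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)
temp = rng.uniform(10, 35, 800)
# Spread grows with temperature, so upper quantiles steepen
ozone = 20 + 1.5 * temp + (0.2 * temp) * rng.standard_normal(800)

def pinball(params, tau):
    a, b = params
    resid = ozone - (a + b * temp)
    # Asymmetric check loss: under-predictions cost tau, over-predictions 1-tau
    return np.mean(np.where(resid >= 0, tau * resid, (tau - 1) * resid))

slope = {}
for tau in (0.5, 0.95):
    res = minimize(pinball, x0=[10.0, 1.0], args=(tau,), method="Powell")
    slope[tau] = res.x[1]
    print(f"tau = {tau}: slope = {slope[tau]:.2f}")
```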
Blondeau-Patissier, David; Gower, James F. R.; Dekker, Arnold G.; Phinn, Stuart R.; Brando, Vittorio E.
2014-04-01
The need for more effective environmental monitoring of the open and coastal ocean has recently led to notable advances in satellite ocean color technology and algorithm research. Satellite ocean color sensors' data are widely used for the detection, mapping and monitoring of phytoplankton blooms because earth observation provides a synoptic view of the ocean, both spatially and temporally. Algal blooms are indicators of marine ecosystem health; thus, their monitoring is a key component of effective management of coastal and oceanic resources. Since the late 1970s, a wide variety of operational ocean color satellite sensors and algorithms have been developed. The comprehensive review presented in this article captures the details of the progress and discusses the advantages and limitations of the algorithms used with the multi-spectral ocean color sensors CZCS, SeaWiFS, MODIS and MERIS. Present challenges include overcoming the severe limitation of these algorithms in coastal waters and refining detection limits in various oceanic and coastal environments. To understand the spatio-temporal patterns of algal blooms and their triggering factors, it is essential to consider the possible effects of environmental parameters, such as water temperature, turbidity, solar radiation and bathymetry. Hence, this review will also discuss the use of statistical techniques and additional datasets derived from ecosystem models or other satellite sensors to characterize further the factors triggering or limiting the development of algal blooms in coastal and open ocean waters.
Cosmic Statistics of Statistics
Szapudi, I.; Colombi, S.; Bernardeau, F.
1999-01-01
The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...
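The factorial moments underlying the error theory of Szapudi & Colombi can be computed directly from counts in cells. A minimal sketch, using a synthetic Poisson field as the clustering-free baseline (for a Poisson field the k-th factorial moment equals lambda**k, so clustering shows up as an excess over that value):

```python
import numpy as np

rng = np.random.default_rng(2)

def factorial_moment(counts, k):
    """k-th factorial moment <N(N-1)...(N-k+1)> of integer counts-in-cells."""
    n = counts.astype(float)
    prod = np.ones_like(n)
    for j in range(k):
        prod *= n - j
    return prod.mean()

lam = 3.0
counts = rng.poisson(lam, size=200_000)  # synthetic unclustered counts
F1 = factorial_moment(counts, 1)
F2 = factorial_moment(counts, 2)
print(round(F1, 2), round(F2, 2))  # close to lam = 3.0 and lam**2 = 9.0
```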
Considerations for the Ideas and Techniques of Painless Medical Care
Institute of Scientific and Technical Information of China (English)
孙焱芫
2015-01-01
"Painless medical care" is becoming popular in China, but behind it may lie misreadings of what the painless concept entails, and its delivery often lacks sufficient trained staff and technical support. Anesthesiologists should play the central role in the construction of a "painless hospital". In building one, it must be recognized that updating the concept depends on in-depth basic scientific research, while implementing it depends on standardized system construction and staff technical training. With the ongoing medical transformation and the deepening of the comfortable-medical-service concept, the scope of painless medical work will continue to expand. It is therefore essential to attend to the construction of the painless concept and to considerations of technical safety, so that the discipline of anesthesiology can play its important role in modern medical care.
Energy Technology Data Exchange (ETDEWEB)
Lombardi, S. [Rome, Univ. `La Sapienza` (Italy). Dept. of Earth Sciences; Serafini, S.; Zarlenga, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dept. of Environment; Ciotoli, G.
1997-12-31
A comparative approach to neotectonic studies is presented, which encompasses the integration of geochemical, morphological and structural analyses. Nine hundred nineteen soil gas samples were collected in the Era basin (Tuscany, Central Italy) and their helium contents were measured. The helium distribution was compared with the location and orientation of known brittle deformations (faults and fractures) and with morphological features obtained by air-photo interpretation and drainage network analyses. The data were statistically compared by means of rose-diagram plots of the investigated parameters and studied locally by associating the observed helium anomaly ridges with the known morphological and structural elements. The statistical approach showed a good convergence between the applied methodologies. Data from the geomorphological, mesostructural and geochemical surveys are consistent with the NE-SW and NW-SE orientations, i.e. the Apennine and anti-Apennine trends of the known structural pattern. Moreover, the apparent N-S and E-W trending helium anomalies are thought to be due to the Middle Pleistocene deformation phase along these directions. The relationship between helium distribution and the strain field is strengthened by the good correspondence, at the local scale, between the geochemical data and the structural and geomorphological features (Orciatico-Montecatini Val di Cecina and Peccioli areas). The helium soil-gas technique thus proved to be a sensitive tool for neotectonic studies in clay basins, as soil gas reveals the leakage of deep-seated gas along tectonic discontinuities even where they have no surface evidence and where the clay deposit is hundreds of metres thick.
Pueyo Anchuela, Ó.; Pocoví Juan, A.; Casas-Sainz, A. M.; Ansón-López, D.; Gil-Garbi, H.
2013-05-01
Aerial photographs, historical cartographies, and field inspection are useful tools in urban planning design on mantled karst because they permit a wide time interval to be analyzed. In the case of Zaragoza city, several works have confirmed the interest of these approaches in configuring the urban planning code and therefore represent a promising technique. Nevertheless, some caveats should be taken into account when using this kind of information. A detailed analysis is presented comparing (in a case study from the surroundings of Zaragoza) geomorphological, historical analysis, and field inspection with geophysical data. Field inspection in a noncultivated area permits the constraint of the presence of karst indicators below the geomorphological resolution of aerial photographs and shows results consistent with geophysical surveys. The studied case shows an inner zone coinciding with the sinkhole mapped from aerial photographs that correlates with changes in the position of the substratum and changes in thickness of alluvial sediments. The integrated analysis permits us to define an external subsidence ring around the geomorphological sinkhole whose surface is twice the size of the inner zone. This outer ring is indicated by geometrical changes in GPR profiles, increases of thickness of the conductive shallower unit toward the collapse, and small collapses on marginal cracks. These results support the higher extension of karst hazards linked to sinkholes with respect to their geomorphological expression and the needed detailed analysis to constrain the real sinkhole size or the use of security radii surrounding this surficial evidence when geomorphological data is used for the hazard analyses or the urban planning at karstic zones.
Al Furajii, Hazar; Kennedy, Niall; Cahill, Ronan A.
2017-01-01
PURPOSE: Transanal minimally invasive surgery using single port instrumentation is now well described for the performance of total mesorectal excision with restorative colorectal/anal anastomosis, most often in conjunction with transabdominal multiport assistance. While non-restorative abdomino-endoscopic perineal excision of the anorectum is conceptually similar, it has been less detailed in the literature. METHODS: Consecutive patients undergoing non-restorative ano-proctectomy including a transperineal endoscopic component were analysed. All cases commenced laparoscopically with initial medial to lateral mobilisation of any left colon and upper rectum. The lower anorectal dissection started via an intersphincteric or extrasphincteric incision for benign and malignant pathology, respectively, and following suture closure and circumferential mobilisation of the anorectum, a single port (GelPOINT Path, Applied Medical) was positioned, allowing the procedure to progress endoscopically in all quadrants up to the cephalad dissection level. Standard laparoscopic instrumentation was used. Specimens were removed perineally. RESULTS: Of the 13 patients (median age 55 years, median BMI 28.75 kg/m2, median follow-up 17 months, 6 males), ten needed completion proctectomy for ulcerative colitis following prior total colectomy (three with concomitant parastomal hernia repair) while three required abdominoperineal resection for locally advanced rectal cancer following neoadjuvant chemoradiotherapy. Median operative time was 190 min; median post-operative discharge day was 7. Eleven specimens were of high quality. Four patients developed perineal wound complications (one chronic sinus, two abscesses needing drainage) within the median 17-month follow-up. CONCLUSION: Convergence of transabdominal and transanal technology and technique allows accuracy in combination operative performance. Nuanced appreciation of transperineal operative access should allow specified standardisation and
Energy Technology Data Exchange (ETDEWEB)
Flampouri, S; Li, Z; Hoppe, B [University of Florida Health Proton Therapy Institute, Jacksonville, FL (United States)
2015-06-15
Purpose: To develop a treatment planning method for passively-scattered involved-node proton therapy of mediastinal lymphoma robust to breathing and cardiac motions. Methods: Beam-specific planning treatment volumes (bsPTV) are calculated for each proton field to incorporate pertinent uncertainties. Geometric margins are added laterally to each beam while margins for range uncertainty due to setup errors, breathing, and calibration curve uncertainties are added along each beam. The calculation of breathing motion and deformation effects on proton range includes all 4DCT phases. The anisotropic water equivalent margins are translated to distances on average 4DCT. Treatment plans are designed so each beam adequately covers the corresponding bsPTV. For targets close to the heart, cardiac motion effects on dosemaps are estimated by using a library of anonymous ECG-gated cardiac CTs (cCT). The cCT, originally contrast-enhanced, are partially overridden to allow meaningful proton dose calculations. Targets similar to the treatment targets are drawn on one or more cCT sets matching the anatomy of the patient. Plans based on the average cCT are calculated on individual phases, then deformed to the average and accumulated. When clinically significant dose discrepancies occur between planned and accumulated doses, the patient plan is modified to reduce the cardiac motion effects. Results: We found that bsPTVs as planning targets create dose distributions similar to the conventional proton planning distributions, while they are a valuable tool for visualization of the uncertainties. For large targets with variability in motion and depth, integral dose was reduced because of the anisotropic margins. In most cases, heart motion has a clinically insignificant effect on target coverage. Conclusion: A treatment planning method was developed and used for proton therapy of mediastinal lymphoma. The technique incorporates bsPTVs compensating for all common sources of uncertainties
Directory of Open Access Journals (Sweden)
Pascual Izquierdo-Egea
2015-03-01
A statistical technique for measuring social conflict through the mortuary record is presented here. It grew out of the contextual valuation method used in the analysis of grave goods since 1993. It is a fundamental tool for the development of the archaeology of social phenomena, whose relevant empirical results support its theoretical significance. After its conceptualization in terms of social inequality and relative wealth, the two classes of social conflict defined are explained: structural or static, and conjunctural or dynamic. Finally, its connections with the Malthusian demographic law through that law's two parameters, population and resources, are included. This theoretical framework is illustrated with applications to ancient civilizations, including Iberian protohistory, pre-Hispanic Mesoamerica, and early imperial Rome.
Hong, Haoyuan; Pourghasemi, Hamid Reza; Pourtaghi, Zohre Sadat
2016-04-01
Landslides are an important natural hazard that causes a great amount of damage around the world every year, especially during the rainy season. The Lianhua area is located in the middle of China's southern mountainous area, west of Jiangxi Province, and is known to be an area prone to landslides. The aim of this study was to evaluate and compare landslide susceptibility maps produced using the random forest (RF) data mining technique with those produced by bivariate (evidential belief function and frequency ratio) and multivariate (logistic regression) statistical models for Lianhua County, China. First, a landslide inventory map was prepared using aerial photograph interpretation, satellite images, and extensive field surveys. In total, 163 landslide events were recognized in the study area, with 114 landslides (70%) used for training and 49 landslides (30%) used for validation. Next, the landslide conditioning factors-including the slope angle, altitude, slope aspect, topographic wetness index (TWI), slope-length (LS), plan curvature, profile curvature, distance to rivers, distance to faults, distance to roads, annual precipitation, land use, normalized difference vegetation index (NDVI), and lithology-were derived from the spatial database. Finally, the landslide susceptibility maps of Lianhua County were generated in ArcGIS 10.1 based on the random forest (RF), evidential belief function (EBF), frequency ratio (FR), and logistic regression (LR) approaches and were validated using a receiver operating characteristic (ROC) curve. The ROC plot assessment results showed that for landslide susceptibility maps produced using the EBF, FR, LR, and RF models, the area under the curve (AUC) values were 0.8122, 0.8134, 0.7751, and 0.7172, respectively. Therefore, we can conclude that all four models have an AUC of more than 0.70 and can be used in landslide susceptibility mapping in the study area; meanwhile, the EBF and FR models had the best performance for Lianhua
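The ROC/AUC validation used to compare the four susceptibility models reduces to a rank statistic: the AUC is the probability that a randomly chosen landslide cell receives a higher susceptibility score than a randomly chosen non-landslide cell. A sketch with invented scores (not data from the Lianhua study):

```python
import numpy as np

rng = np.random.default_rng(3)

def auc(scores_pos, scores_neg):
    """ROC AUC as the probability a positive outscores a negative (ties count 1/2)."""
    diff = scores_pos[:, None] - scores_neg[None, :]
    return float((np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size)

# Hypothetical susceptibility scores: landslide cells score higher on average.
pos = rng.normal(1.0, 1.0, size=300)   # observed landslide locations
neg = rng.normal(0.0, 1.0, size=300)   # non-landslide locations
a = auc(pos, neg)
print(round(a, 3))
```

An AUC of 0.5 means the scores carry no discriminating information; values above about 0.7, as reported for all four models in the abstract, indicate useful skill.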
Manchikanti, Laxmaiah; Falco, Frank J E; Singh, Vijay; Benyamin, Ramsin M; Racz, Gabor B; Helm, Standiford; Caraway, David L; Calodney, Aaron K; Snook, Lee T; Smith, Howard S; Gupta, Sanjeeva; Ward, Stephen P; Grider, Jay S; Hirsch, Joshua A
2013-04-01
disproportionate number of challenges compared to established medical specialties, including the inappropriate utilization of ineffective and unsafe techniques. In 2000, the American Society of Interventional Pain Physicians (ASIPP) created treatment guidelines to help practitioners; there have been 5 subsequent updates. These guidelines address the issues of systematic evaluation and ongoing care of chronic or persistent pain, and provide information about the scientific basis of recommended procedures. They are expected to increase patient compliance, dispel misconceptions among providers and patients, manage patient expectations reasonably, and form the basis of a therapeutic partnership between the patient, the provider, and payers.
Institute of Scientific and Technical Information of China (English)
袁卫; 刘畅; 张云
2004-01-01
The paper systematically reviews the compilation of statistical teaching materials over the past one hundred years and offers suggestions for solving the existing problems with these materials.
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics Stymied by statistics? No fear ? this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more.Tracks to a typical first semester statistics cou
Zghibi, Adel; Merzougui, Amira; Zouhri, Lahcen; Tarhouni, Jamila
2014-01-01
the dissolution of gypsum, dolomite and halite, as well as contamination by nitrate caused mainly by extensive irrigation activity. The application of multivariate statistical techniques based on principal component analysis and hierarchical cluster analysis has led to the corroboration of the hypotheses developed from the previous hydrochemical study. Two factors were found that explain the major hydrochemical processes in the aquifer. These factors reveal the existence of intensive seawater intrusion and the mechanisms of nitrate contamination of the groundwater.
Kim, Moon H.; Ritz, Christian T.; Arvin, Donald V.
2012-01-01
Potential wetland extents were estimated for a 14-mile reach of the Wabash River near Terre Haute, Indiana. This pilot study was completed by the U.S. Geological Survey in cooperation with the U.S. Department of Agriculture, Natural Resources Conservation Service (NRCS). The study showed that potential wetland extents can be estimated by analyzing streamflow statistics with the available streamgage data, calculating the approximate water-surface elevation along the river, and generating maps by use of flood-inundation mapping techniques. Planning successful restorations for Wetland Reserve Program (WRP) easements requires a determination of areas that show evidence of being in a zone prone to sustained or frequent flooding. Zone determinations of this type are used by WRP planners to define the actively inundated area and make decisions on restoration-practice installation. According to WRP planning guidelines, a site needs to show evidence of being in an "inundation zone" that is prone to sustained or frequent flooding for a period of 7 consecutive days at least once every 2 years on average in order to meet the planning criteria for determining a wetland for a restoration in agricultural land. By calculating the annual highest 7-consecutive-day mean discharge with a 2-year recurrence interval (7MQ2) at a streamgage on the basis of available streamflow data, one can determine the water-surface elevation corresponding to the calculated flow that defines the estimated inundation zone along the river. By using the estimated water-surface elevation ("inundation elevation") along the river, an approximate extent of potential wetland for a restoration in agricultural land can be mapped. As part of the pilot study, a set of maps representing the estimated potential wetland extents was generated in a geographic information system (GIS) application by combining (1) a digital water-surface plane representing the surface of inundation elevation that sloped in the downstream
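The 7MQ2 statistic described above can be approximated from a daily discharge record by taking the annual maxima of the 7-consecutive-day moving mean and reading off the 2-year recurrence level. Using the median of the annual-maximum series as that level is a simplifying assumption made here (a formal frequency analysis would fit a distribution to the series); the discharge record is synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)

def annual_max_7day_mean(daily_q):
    """Highest 7-consecutive-day mean discharge within one year of daily data."""
    kernel = np.ones(7) / 7.0
    return np.convolve(daily_q, kernel, mode="valid").max()

# Hypothetical 30-year daily discharge record, one row per year (365 days).
years = rng.lognormal(mean=6.0, sigma=0.5, size=(30, 365))
series = np.array([annual_max_7day_mean(y) for y in years])

# A 2-year recurrence interval means the value is exceeded in about half of
# all years, i.e. roughly the median of the annual-maximum series.
q7m2 = float(np.median(series))
print(round(q7m2, 1))
```

The water-surface elevation corresponding to this discharge then defines the estimated inundation zone along the reach.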
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
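A minimal example of a statistical decision rule of the kind the book covers, here a nearest-centroid classifier on synthetic two-class data (an illustration of the general idea, not an example taken from the book):

```python
import numpy as np

rng = np.random.default_rng(7)

def nearest_centroid(X_train, y_train, X_test):
    """Assign each test point to the class whose mean vector is closest,
    a basic statistical pattern-recognition decision rule."""
    classes = np.unique(y_train)
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Two well-separated hypothetical classes in a 2-D feature space.
X0 = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
X1 = rng.normal([3.0, 3.0], 0.5, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

pred = nearest_centroid(X, y, X)
accuracy = float((pred == y).mean())
print(accuracy)
```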
Postmus, Douwe; Demissei, Biniyam G; Hillege, Hans L
2016-01-01
In cardiovascular studies, it is common to assess the association between the risk of experiencing an event, such as death or hospitalisation, and one or more exposure variables, such as different treatment regimens when the study under consideration is an intervention study. A frequently applied
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Norén, Patrik
2013-01-01
Algebraic statistics brings together ideas from algebraic geometry, commutative algebra, and combinatorics to address problems in statistics and its applications. Computer algebra provides powerful tools for the study of algorithms and software. However, these tools are rarely prepared to address statistical challenges and therefore new algebraic results need often be developed. This way of interplay between algebra and statistics fertilizes both disciplines. Algebraic statistics is a relativ...
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
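The counting techniques the article applies can be stated directly: with g states and n indistinguishable particles, Bose-Einstein statistics count multisets (particles may share a state) while Fermi-Dirac statistics count subsets (at most one particle per state). A quick check:

```python
from math import comb

def bose_einstein(n, g):
    """Ways to place n indistinguishable particles in g states, sharing allowed."""
    return comb(n + g - 1, n)

def fermi_dirac(n, g):
    """Ways to place n indistinguishable particles in g states, one per state."""
    return comb(g, n)

n, g = 3, 5
print(bose_einstein(n, g), fermi_dirac(n, g))  # → 35 10
```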
Richfield, Jon; bookfeller
2016-07-01
In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.
Wallis, W Allen
2014-01-01
Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. ""W. Allen Wallis and Harry V. Roberts have made statistics fascinating."" - The New York Times ""The authors have set out with considerable success, to write a text which would be of interest and value to the student who,
新家, 健精
2013-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Estimation and inferential statistics
Sahu, Pradip Kumar; Das, Ajit Kumar
2015-01-01
This book focuses on the meaning of statistical inference and estimation. Statistical inference is concerned with the problems of estimation of population parameters and testing hypotheses. Primarily aimed at undergraduate and postgraduate students of statistics, the book is also useful to professionals and researchers in statistical, medical, social and other disciplines. It discusses current methodological techniques used in statistics and related interdisciplinary areas. Every concept is supported with relevant research examples to help readers to find the most suitable application. Statistical tools have been presented by using real-life examples, removing the “fear factor” usually associated with this complex subject. The book will help readers to discover diverse perspectives of statistical theory followed by relevant worked-out examples. Keeping in mind the needs of readers, as well as constantly changing scenarios, the material is presented in an easy-to-understand form.
Experimental Mathematics and Computational Statistics
Energy Technology Data Exchange (ETDEWEB)
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These includes both applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.
Energy Technology Data Exchange (ETDEWEB)
Kawano, Toshihiko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-11-10
This theoretical treatment of low-energy compound nucleus reactions begins with the Bohr hypothesis, with corrections, and various statistical theories. The author investigates the statistical properties of the scattering matrix containing a Gaussian Orthogonal Ensemble (GOE) Hamiltonian in the propagator. The following conclusions are reached: For all parameter values studied, the numerical average of MC-generated cross sections coincides with the result of the Verbaarschot, Weidenmueller, Zirnbauer triple-integral formula. Energy average and ensemble average agree reasonably well when the width Γ is one or two orders of magnitude larger than the average resonance spacing d. In the strong-absorption limit, the channel degree-of-freedom ν_a is 2. The direct reaction increases the inelastic cross sections while the elastic cross section is reduced.
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
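The defining property of the harmonic intensity can be verified numerically: a Poisson process with intensity c/x has mean count c·ln(b/a) over an interval (a, b), which depends only on the ratio b/a, hence the scale invariance the abstract lists. A small sketch (the constant c = 2 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)

def harmonic_count_mean(c, a, b):
    """Mean count of a Poisson process with intensity c/x over (a, b),
    i.e. the integral of c/x from a to b."""
    return c * np.log(b / a)

c = 2.0
m1 = harmonic_count_mean(c, 1.0, 10.0)    # interval (1, 10)
m2 = harmonic_count_mean(c, 5.0, 50.0)    # same ratio b/a, stretched by 5
print(round(m1, 4), round(m2, 4))         # identical: scale invariance

# Simulation check: Poisson counts with that mean average out to it.
counts = rng.poisson(m1, size=100_000)
print(round(float(counts.mean()), 2))
```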
Lead - nutritional considerations
Lead poisoning - nutritional considerations; Toxic metal - nutritional considerations
Applied statistical inference with MINITAB
Lesik, Sally
2009-01-01
Through clear, step-by-step mathematical calculations, Applied Statistical Inference with MINITAB enables students to gain a solid understanding of how to apply statistical techniques using a statistical software program. It focuses on the concepts of confidence intervals, hypothesis testing, validating model assumptions, and power analysis.Illustrates the techniques and methods using MINITABAfter introducing some common terminology, the author explains how to create simple graphs using MINITAB and how to calculate descriptive statistics using both traditional hand computations and MINITAB. Sh
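The confidence-interval calculations the book walks through in MINITAB look like this when done by hand. This sketch uses the normal approximation with an invented sample, not MINITAB's own t-based procedure (for n = 200 the two are nearly identical):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical sample of 200 measurements.
x = rng.normal(loc=50.0, scale=8.0, size=200)

mean = float(x.mean())
se = float(x.std(ddof=1) / np.sqrt(x.size))  # standard error of the mean
z = 1.96                                     # 95% two-sided normal critical value
ci = (mean - z * se, mean + z * se)
print(round(ci[0], 2), round(ci[1], 2))
```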
Directory of Open Access Journals (Sweden)
A. Moberg
2014-06-01
Practical issues arise when applying a statistical framework for unbiased ranking of alternative forced climate model simulations by comparison with climate observations from instrumental and proxy data (Part 1 in this series). Given a set of model and observational data, several decisions need to be made, e.g. concerning the region that each proxy series represents, the weighting of different regions, and the time resolution to use in the analysis. Objective selection criteria cannot be given here, but we argue for studying how sensitive the results are to the choices made. The framework is improved by the relaxation of two assumptions: allowing autocorrelation in the statistical model for simulated climate variability, and enabling direct comparison of alternative simulations to test whether any of them fit the observations significantly better. The extended framework is applied to a set of simulations driven with forcings for the pre-industrial period 1000–1849 CE and fifteen tree-ring based temperature proxy series. Simulations run with only one external forcing (land-use, volcanic, small-amplitude solar, or large-amplitude solar) do not significantly capture the variability in the tree-ring data, although the simulation with volcanic forcing does so for some experiment settings. When all forcings are combined (using either the small- or large-amplitude solar forcing, together with orbital, greenhouse-gas and non-volcanic aerosol forcing) and additionally used to produce small simulation ensembles starting from slightly different initial ocean conditions, the resulting simulations are highly capable of capturing some observed variability. Nevertheless, for some choices in the experiment design, they are not significantly closer to the observations than unforced simulations, due to highly variable results between regions. It is also not possible to tell whether the small-amplitude or large-amplitude solar forcing causes the multiple
Energy Technology Data Exchange (ETDEWEB)
Velazquez, J. C.; Caleyo, F.; Valorm, A.; Hallen, J. M.
2011-07-01
New deterministic and stochastic predictive models are proposed for external pitting corrosion in underground pipelines. The deterministic model takes into consideration the local chemical and physical properties of the soil as well as the pipeline coating to predict the time dependence of pitting depth and rate in a range of soils. This model, based on results from a field study, was used to conduct Monte Carlo simulations that established the probability distribution of pitting depth and growth rate in the studied soils and their evolution over the life of the pipeline. In the last stage of the study, an empirical Markov chain-based stochastic model was developed for predicting the evolution of pitting corrosion depth and rate distributions from the observed properties of the soil. (Author) 18 refs.
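The Monte Carlo stage described above can be sketched as follows. This is a minimal illustration rather than the authors' model: the power-law growth form, the parameter values, and their distributions are all assumptions for the example.

```python
import random
import statistics

def pit_depth(t, k, nu, t0=2.9):
    """Empirical power-law pit growth d(t) = k * (t - t0)^nu (a common form
    for soil corrosion; t0 is the pit initiation time in years)."""
    return k * max(t - t0, 0.0) ** nu

def monte_carlo_depths(t, n=10_000, seed=42):
    """Sample soil-dependent parameters and propagate each draw through the
    deterministic growth law to build a pit-depth distribution at time t."""
    rng = random.Random(seed)
    depths = []
    for _ in range(n):
        k = rng.gauss(0.6, 0.1)    # hypothetical proportionality constant
        nu = rng.gauss(0.4, 0.05)  # hypothetical growth exponent
        depths.append(pit_depth(t, k, nu))
    return depths

depths = monte_carlo_depths(t=20.0)
print(round(statistics.mean(depths), 2), round(statistics.stdev(depths), 2))
```

Repeating this at several values of t traces how the depth distribution evolves over the pipeline's life, which is the quantity the study feeds into its Markov chain model.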
Paediatric pharmacokinetics: key considerations
Batchelor, Hannah Katharine; Marriott, John Francis
2015-01-01
A number of anatomical and physiological factors determine the pharmacokinetic profile of a drug. Differences in physiology in paediatric populations compared with adults can influence the concentration of drug within the plasma or tissue. Healthcare professionals need to be aware of anatomical and physiological changes that affect pharmacokinetic profiles of drugs to understand consequences of dose adjustments in infants and children. Pharmacokinetic clinical trials in children are complicated owing to the limitations on blood sample volumes and perception of pain in children resulting from blood sampling. There are alternative sampling techniques that can minimize the invasive nature of such trials. Population based models can also limit the sampling required from each individual by increasing the overall sample size to generate robust pharmacokinetic data. This review details key considerations in the design and development of paediatric pharmacokinetic clinical trials. PMID:25855821
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Plant, Emma L; Smernik, Ronald J; van Leeuwen, John; Greenwood, Paul; Macdonald, Lynne M
2014-03-01
The paper-making process can produce large amounts of wastewater (WW) with high particulate and dissolved organic loads. Generally, in developed countries, stringent international regulations for environmental protection require pulp and paper mill WW to be treated to reduce the organic load prior to discharge into the receiving environment. This can be achieved by primary and secondary treatments involving both chemical and biological processes. These processes result in complex changes in the nature of the organic material, as some components are mineralised and others are transformed. In this study, changes in the nature of organics through different stages of secondary treatment of pulp and paper mill WW were followed using three advanced characterisation techniques: solid-state (13)C nuclear magnetic resonance (NMR) spectroscopy, pyrolysis-gas chromatography mass spectrometry (py-GCMS) and high-performance size-exclusion chromatography (HPSEC). Each technique provided a different perspective on the changes that occurred. To compare the different chemical perspectives in terms of the degree of similarity/difference between samples, we employed non-metric multidimensional scaling. Results indicate that NMR and HPSEC provided strongly correlated perspectives, with 86 % of the discrimination between the organic samples common to both techniques. Conversely, py-GCMS was found to provide a unique, and thus complementary, perspective.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
... Foodborne, Waterborne, and Environmental Diseases, Mycotic Diseases Branch: Histoplasmosis Statistics. How common is histoplasmosis? In the United States, an estimated 60% to ...
Forbes, Catherine; Hastings, Nicholas; Peacock, Brian J.
2010-01-01
A new edition of the trusted guide on commonly used statistical distributions Fully updated to reflect the latest developments on the topic, Statistical Distributions, Fourth Edition continues to serve as an authoritative guide on the application of statistical methods to research across various disciplines. The book provides a concise presentation of popular statistical distributions along with the necessary knowledge for their successful use in data modeling and analysis. Following a basic introduction, forty popular distributions are outlined in individual chapters that are complete with re
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
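A minimal sketch of sampling the harmonic Poisson process described here, assuming an intensity c/x on a finite interval. Under y = log(x) the process becomes homogeneous with rate c, so drawing exponential gaps in log-space and mapping back through exp gives the harmonic process.

```python
import math
import random

def harmonic_poisson_points(c, x_min, x_max, seed=7):
    """Sample a Poisson process with harmonic intensity c/x on [x_min, x_max]
    by drawing a homogeneous rate-c process on [log x_min, log x_max] and
    exponentiating the points."""
    rng = random.Random(seed)
    lo, hi = math.log(x_min), math.log(x_max)
    points, y = [], lo
    while True:
        y += rng.expovariate(c)   # exponential gaps => homogeneous process
        if y > hi:
            break
        points.append(math.exp(y))
    return points

# Expected count in [a, b] is c * log(b / a): it depends only on the ratio
# b/a, which is the scale invariance the abstract connects to this process.
pts = harmonic_poisson_points(c=200.0, x_min=1.0, x_max=math.e)
```

With c = 200 and b/a = e, the expected number of points is exactly c, illustrating how the harmonic intensity produces counts governed by ratios rather than lengths.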
Directory of Open Access Journals (Sweden)
Hélène eColineaux
2015-10-01
Full Text Available INTRODUCTION. The use of genetic predictive markers in medical practice does not necessarily bear the same kind of medical and ethical consequences as that of genes directly involved in monogenic diseases. However, the French bioethics law framed the production and use of any genetic information in the same way. It therefore seems necessary to explore the practical and ethical context of the actual use of predictive markers in order to highlight their specific stakes. In this study, we document the uses of HLA-B*27, an interesting example of the multiple features of a genetic predictive marker in general medical practice. MATERIAL & METHODS. The aims of this monocentric and qualitative study were to identify the concrete and ethical issues of using the HLA-B*27 marker and the interests and limits of the legal framework as perceived by prescribers. To this end, a thematic and descriptive analysis of five rheumatologists' semi-structured, face-to-face interviews was performed. RESULTS. According to most of the interviewees, HLA-B*27 is an over-framed test: they considered that it is not really genetic, or at least does not have the same nature as classical genetic tests; that HLA-B*27 is not concerned by the ethical challenges of genetic testing; and that the major ethical stake of this marker is not linked to its genetic nature but rather to the complexity of the probabilistic information. This study also shows that HLA-B*27, validated for a certain usage, may be used in different ways in practice. DISCUSSION. This marker and its clinical uses underline the challenges of translating both statistical concepts and a unifying legal framework into clinical practice. This study identifies some new aspects and stakes of genetics in medicine and shows the need for additional studies on the use of predictive genetic markers, in order to provide a better basis for decisions and legal frameworks regarding these practices.
Kumar, Manoj; Ramanathan, A L; Tripathi, Ritu; Farswan, Sandhya; Kumar, Devendra; Bhattacharya, Prosun
2017-01-01
This study is an investigation of the spatio-chemical characteristics, contamination sources (using multivariate statistics), and health risk arising from the consumption of groundwater contaminated with trace and toxic elements in the Chhaprola Industrial Area, Gautam Buddha Nagar, Uttar Pradesh, India. In this study, 33 tubewell water samples were analyzed for 28 elements using ICP-OES. Concentrations of some trace and toxic elements such as Al, As, B, Cd, Cr, Mn, Pb and U exceeded the corresponding WHO (2011) guidelines and BIS (2012) standards, while the other analyzed elements remained below those values. Background γ and β radiation levels were measured and found to be within acceptable limits. Multivariate statistics, PCA (explaining a cumulative 82.07 percent of the variance over a total of 6 factors) and CA, indicated a mixed origin: natural and anthropogenic activities such as industrial effluent and agricultural runoff are responsible for the degradation of groundwater quality in the research area. In this study area, an adult consumes 3.0 L (median value) of water per day, therefore ingesting 39, 1.94, 1461, 0.14, 11.1, 292.6, 13.6 and 23.5 μg of Al, As, B, Cd, Cr, Mn, Pb and U, respectively, from drinking water each day. The hazard quotient (HQ) value exceeded the safe limit of 1 for As, B, Al, Cr, Mn, Cd, Pb and U at a few locations, while a hazard index (HI) > 5 was observed in about 30% of the samples, which indicated a potential health risk from these tubewells for the local population if the groundwater is consumed. Copyright © 2016 Elsevier Ltd. All rights reserved.
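The intake arithmetic reported above (concentration × 3.0 L/day) and the hazard-quotient idea can be sketched as follows. The body weight and the reference doses (RfDs) are illustrative assumptions, not values from the abstract; the concentrations are back-calculated from the reported daily intakes.

```python
# Daily intake follows the abstract's arithmetic:
# intake (ug/day) = concentration (ug/L) * 3.0 L/day (the reported median).
INTAKE_L_PER_DAY = 3.0
BODY_WEIGHT_KG = 60.0   # assumed adult body weight; not stated in the abstract

# Concentrations back-calculated from the reported intakes; the RfDs below
# are illustrative oral reference doses (ug/kg/day), not from the study.
conc_ug_per_L = {"As": 1.94 / 3.0, "Cd": 0.14 / 3.0, "Mn": 292.6 / 3.0}
rfd_ug_per_kg_day = {"As": 0.3, "Cd": 0.5, "Mn": 140.0}

def hazard_quotients(conc, rfd):
    """HQ = chronic daily intake / reference dose; HI = sum of the HQs."""
    hq = {}
    for element, c in conc.items():
        cdi = c * INTAKE_L_PER_DAY / BODY_WEIGHT_KG   # ug/kg/day
        hq[element] = cdi / rfd[element]
    return hq

hq = hazard_quotients(conc_ug_per_L, rfd_ug_per_kg_day)
hi = sum(hq.values())
```

An HQ above 1 for any single element, or an HI well above 1 across elements, is the screening signal the study uses to flag tubewells as potentially unsafe.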
Modern engineering statistics, solutions manual
Ryan, Thomas P
2012-01-01
An introductory perspective on statistical applications in the field of engineering Modern Engineering Statistics presents state-of-the-art statistical methodology germane to engineering applications. With a nice blend of methodology and applications, this book provides and carefully explains the concepts necessary for students to fully grasp and appreciate contemporary statistical techniques in the context of engineering. With almost thirty years of teaching experience, many of which were spent teaching engineering statistics courses, the author has successfully developed a
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Lyons, L
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
Directory of Open Access Journals (Sweden)
R. Soundararajan
2015-01-01
Full Text Available An Artificial Neural Network (ANN) approach was used for predicting and analyzing the mechanical properties of A413 aluminum alloy produced by the squeeze casting route. The experiments were carried out with different controlled input variables, namely squeeze pressure, die preheating temperature, and melt temperature, as per a Full Factorial Design (FFD). Appropriate settings of these process variables produce castings that are pore-free, with an ideal fine-grained dendritic structure, resulting in good mechanical properties such as hardness, ultimate tensile strength, and yield strength. As a primary objective, a feed-forward back-propagation ANN model was developed with different architectures to ensure reliable predictions. The developed model and its predictions were in good agreement with the experimental data, confirming the strong performance of the optimal model. From this work it was ascertained that, for castings produced by the squeeze casting route, the ANN is an alternative method for predicting the mechanical properties: appropriate results can be estimated rather than measured, thereby reducing testing time and cost. As a secondary objective, quantitative and statistical analysis was performed in order to evaluate the effect of the process parameters on the mechanical properties of the castings.
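A feed-forward back-propagation network of the kind described can be sketched in a few lines. The data here are synthetic stand-ins (the study's actual inputs were the FFD runs of squeeze pressure, die preheating temperature and melt temperature), and the single-hidden-layer architecture is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: [squeeze pressure, die temperature, melt temperature].
X = rng.uniform([40.0, 150.0, 650.0], [120.0, 300.0, 750.0], size=(64, 3))
y = 0.3 * X[:, 0] + 0.1 * X[:, 1] + 0.05 * X[:, 2]   # hypothetical hardness proxy

Xs = (X - X.mean(0)) / X.std(0)   # standardize inputs
ys = (y - y.mean()) / y.std()     # and the target

# One tanh hidden layer, trained by plain gradient descent (back-propagation).
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(Xs @ W1 + b1)                  # forward pass
    pred = (h @ W2 + b2).ravel()
    g_pred = (pred - ys)[:, None] / len(Xs)    # gradient of MSE/2 w.r.t. pred
    gW2, gb2 = h.T @ g_pred, g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)     # back-propagate through tanh
    gW1, gb1 = Xs.T @ g_h, g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

pred = (np.tanh(Xs @ W1 + b1) @ W2 + b2).ravel()
mse = float(((pred - ys) ** 2).mean())
```

In the study's setting the trained network replaces destructive testing: once fitted to the FFD runs, properties at new parameter settings are estimated rather than measured.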
Ross, Sheldon M
2005-01-01
In this revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this goal through a coherent mix of mathematical analysis, intuitive discussions and examples. Ross's clear writin
Ross, Sheldon M
2010-01-01
In this 3rd edition revised text, master expositor Sheldon Ross has produced a unique work in introductory statistics. The text's main merits are the clarity of presentation, contemporary examples and applications from diverse areas, and an explanation of intuition and ideas behind the statistical methods. Concepts are motivated, illustrated and explained in a way that attempts to increase one's intuition. To quote from the preface, "It is only when a student develops a feel or intuition for statistics that she or he is really on the path toward making sense of data." Ross achieves this
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Wannier, Gregory H
2010-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
Eliciting spatial statistics from geological experts using genetic algorithms
Walker, Matthew; Curtis, Andrew
2014-07-01
A new method to obtain the statistics of a geostatistical model is introduced. The method elicits the statistical information from a geological expert directly, by iteratively updating a population of vectors of statistics, based on the expert's subjective opinion of the corresponding geological simulations. Thus, it does not require the expert to have knowledge of the mathematical and statistical details of the model. The process uses a genetic algorithm to generate new vectors. We demonstrate the methodology for a particular geostatistical model used to model rock pore-space, which simulates the spatial distribution of matrix and pores over a 2-D grid, using multipoint statistics specified by conditional probabilities. Experts were asked to use the algorithm to estimate the statistics of a given target pore-space image with known statistics; thus, their numerical rates of convergence could be calculated. Convergence was measured for all experts, showing that the algorithm can be used to find appropriate probabilities given the expert's subjective input. However, considerable and apparently irreducible residual misfit was found between the true statistics and the estimates of statistics obtained by the experts, with the root-mean-square error on the conditional probabilities typically >0.1. This is interpreted as the limit of the experts' abilities to distinguish between realizations of different spatial statistics using the algorithm. More accurate discrimination is therefore likely to require complementary elicitation techniques or sources of information independent of expert opinion.
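The elicitation loop can be sketched as a standard genetic algorithm. Here a programmatic fitness function stands in for the expert's subjective ranking of simulations, and the three "conditional probabilities", population size, and mutation scale are all assumptions for illustration.

```python
import random

random.seed(3)
TARGET = [0.2, 0.5, 0.8]   # hidden "true" conditional probabilities

def expert_score(v):
    """Stand-in for the expert's subjective judgment: closeness to the target.
    In the elicitation setting a human would rank the geological simulations."""
    return -sum((a - b) ** 2 for a, b in zip(v, TARGET))

def evolve(pop_size=30, gens=60, mut=0.05):
    """Iteratively update a population of statistic vectors: keep the
    best-ranked half, then refill by one-point crossover plus mutation."""
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=expert_score, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 3)
            child = a[:cut] + b[cut:]   # one-point crossover
            child = [min(1.0, max(0.0, g + random.gauss(0, mut)))
                     for g in child]    # mutate, clipped to valid probabilities
            children.append(child)
        pop = parents + children
    return max(pop, key=expert_score)

best = evolve()
rmse = (sum((a - b) ** 2 for a, b in zip(best, TARGET)) / 3) ** 0.5
```

With a perfectly consistent fitness the loop converges close to the target; the study's point is that with human judgment in the loop, a residual RMSE above 0.1 remained.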
Sammani, Mohamad Subhi; Clavijo, Sabrina; Portugal, Lindomar; Suárez, Ruth; Seddik, Hassan; Cerdà, Víctor
2017-05-15
A new method for the separation and determination of four flavonoids: hesperidin (HES), diosmin (DIO), hesperitin (HTIN), and diosmetin (DTIN) in pure form and in pharmaceutical formulations has been developed using high performance liquid chromatography (HPLC) with UV-DAD detection. Multivariate statistics (2(k) full factorial and Box-Behnken designs) were used for the multiresponse optimization of the chromatographic separation, which was completed in 22 min and carried out on a Symmetry® C18 column (250×3 mm; 5 µm) as stationary phase. Separation was conducted in gradient elution mode using a mixture of methanol, acetonitrile and water at pH 2.5 (CH3COOH) as mobile phase. Analytes were separated with the column set at 22°C, with a flow rate of 0.58 mL min(-1), and detected at 285 nm. Under the optimized conditions, the flavonoids showed retention times of 8.62, 11.53, 18.55 and 19.94 min for HES, DIO, HTIN and DTIN, respectively. Limits of detection and quantification were <0.0156 µg mL(-1) and <0.100 µg mL(-1), respectively. Linearity was achieved with good correlation coefficient values (r(2)=0.999; n=5). Intra-day and inter-day precision were found to be less than 3.44% (n=7). Finally, the proposed method was successfully applied to determine the target flavonoids in pharmaceutical preparations with satisfactory recoveries (between 95.2% and 107.9%), demonstrating that it should also find application in quality control, as well as in pharmacokinetic studies of these drugs.
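The Box-Behnken design mentioned above can be generated directly. This sketch builds the coded design for k factors (each pair of factors at ±1 with the remaining factors at the centre, plus centre points); the number of centre points is an assumption.

```python
from itertools import combinations

def box_behnken(k, center_points=3):
    """Box-Behnken design in coded units: for each pair of factors, a 2^2
    factorial at +/-1 with all other factors held at 0, plus centre runs."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * k
                run[i], run[j] = a, b
                runs.append(run)
    runs += [[0] * k for _ in range(center_points)]
    return runs

design = box_behnken(3)   # 3 pairs * 4 runs + 3 centre points = 15 runs
```

Each coded level would then be mapped to a real setting (e.g. column temperature or flow rate) before fitting the response-surface model used for multiresponse optimization.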
Energy Technology Data Exchange (ETDEWEB)
Roberts, O. W.; Li, X.; Jeska, L., E-mail: o.wyn.roberts@gmail.com, E-mail: xxl@aber.ac.uk [Department of Physics, Aberystwyth University, Aberystwyth SY23 3BZ (United Kingdom)
2015-03-20
Plasma turbulence at ion kinetic scales in the solar wind is investigated using the multi-point magnetometer data from the Cluster spacecraft. By applying the k-filtering method, we are able to estimate the full three-dimensional power spectral density P(ω{sub sc}, k) at a certain spacecraft frequency ω{sub sc} in wavevector (k) space. By using the wavevector at the maximum power in P(ω{sub sc}, k) at each sampling frequency ω{sub sc} and the Doppler shifted frequency ω{sub pla} in the solar wind frame, the dispersion plot ω{sub pla} = ω{sub pla}(k) is found. Previous studies have been limited to very few intervals and have been hampered by large errors, which motivates a statistical study of 52 intervals of solar wind. We find that the turbulence is predominantly highly oblique to the magnetic field, k{sub ⊥} >> k{sub ∥}, and propagates slowly in the plasma frame, with most points having frequencies smaller than the proton gyrofrequency, ω{sub pla} < Ω{sub p}. Weak agreement is found that turbulence at the ion kinetic scales consists of kinetic Alfvén waves and coherent structures advected with the plasma bulk velocity, plus some minor, more compressible components. The results suggest that anti-sunward and sunward propagating magnetic fluctuations are of a similar nature in both the fast and slow solar wind at ion kinetic scales. The fast wind has significantly more anti-sunward flux than sunward flux, and the slow wind appears to be more balanced.
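The Doppler shift used above to move from the spacecraft frame to the plasma frame, ω_pla = ω_sc − k·V_sw, can be sketched directly; all numerical values here are illustrative, not taken from the study.

```python
import math

def plasma_frame_frequency(omega_sc, k_vec, v_sw):
    """Doppler shift: omega_pla = omega_sc - k . V_sw (angular frequencies,
    SI units; k in rad/m, V_sw in m/s)."""
    return omega_sc - sum(ki * vi for ki, vi in zip(k_vec, v_sw))

# Illustrative numbers: 0.3 Hz in the spacecraft frame, a quasi-perpendicular
# wavevector (k_perp >> k_parallel), and a 400 km/s radial solar wind.
omega_sc = 2 * math.pi * 0.3          # rad/s
k = (1.2e-6, 0.2e-6, 0.0)             # rad/m
v = (4.0e5, 0.0, 0.0)                 # m/s
omega_pla = plasma_frame_frequency(omega_sc, k, v)
```

Comparing omega_pla against the local proton gyrofrequency is then the test the abstract describes for deciding whether the fluctuations propagate slowly in the plasma frame.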
The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political Arithmetic and more recent work... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Flipping the statistics classroom in nursing education.
Schwartz, Todd A
2014-04-01
Flipped classrooms are so named because they substitute the traditional lecture that commonly encompasses the entire class period with active learning techniques, such as small-group work. The lectures are delivered instead by using an alternative mode--video recordings--that are made available for viewing online outside the class period. Due to this inverted approach, students are engaged with the course material during the class period, rather than participating only passively. This flipped approach is gaining popularity in many areas of education due to its enhancement of student learning and represents an opportunity for utilization by instructors of statistics courses in nursing education. This article presents the author's recent experiences with flipping a statistics course for nursing students in a PhD program, including practical considerations and student outcomes and reaction. This transformative experience deepened the level of student learning in a way that may not have occurred using a traditional format.
Affum, Andrews Obeng; Osae, Shiloh Dede; Nyarko, Benjamin Jabez Botwe; Afful, Samuel; Fianko, Joseph Richmond; Akiti, Tetteh Thomas; Adomako, Dickson; Acquaah, Samuel Osafo; Dorleku, Micheal; Antoh, Emmanuel; Barnes, Felix; Affum, Enoch Acheampong
2015-02-01
In recent times, the surface water resource in the Western Region of Ghana has been found to be inadequate in supply and polluted by various anthropogenic activities. As a result of these problems, the demand for groundwater by the human populations in the peri-urban communities for domestic, municipal and irrigation purposes has increased without prior knowledge of its water quality. Water samples were collected from 14 public hand-dug wells during the rainy season in 2013 and investigated for total coliforms, Escherichia coli, mercury (Hg), arsenic (As), cadmium (Cd) and physicochemical parameters. Multivariate statistical analysis of the dataset and a linear stoichiometric plot of major ions were applied to group the water samples and to identify the main factors and sources of contamination. Hierarchical cluster analysis revealed four clusters from the hydrochemical variables (R-mode) and three clusters in the case of water samples (Q-mode) after z-score standardization. Principal component analysis after a varimax rotation of the dataset indicated that the four factors extracted explained 93.3 % of the total variance, highlighting salinity, toxic elements and hardness pollution as the dominant factors affecting groundwater quality. Cation exchange, mineral dissolution and silicate weathering influenced groundwater quality. The ranking order of major ions was Na(+) > Ca(2+) > K(+) > Mg(2+) and Cl(-) > SO4 (2-) > HCO3 (-). Based on the Piper plot and the hydrogeology of the study area, sodium chloride (86 %) and sodium hydrogen carbonate and sodium carbonate (14 %) water types were identified. Although E. coli were absent in the water samples, 36 % of the wells contained total coliforms (Enterobacter species), which exceeded the WHO guideline limit of zero colony-forming units (CFU)/100 mL of drinking water. With the exception of Hg, the concentrations of As and Cd in 79 and 43 % of the water samples exceeded the WHO guideline limits of 10 and 3
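The z-score standardization and principal component step described above can be sketched with a correlation-matrix eigendecomposition. The data matrix here is synthetic (14 wells × 6 hypothetical variables), not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic hydrochemical matrix: 14 wells x 6 variables (e.g. Na, Cl, EC, ...).
X = rng.normal(size=(14, 6))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=14)   # a correlated "salinity" pair

Z = (X - X.mean(0)) / X.std(0, ddof=1)          # z-score standardization
R = (Z.T @ Z) / (len(Z) - 1)                    # correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)            # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()      # variance fraction per component
```

Retaining the leading components and applying a varimax rotation to their loadings, as the study does, would follow this step; the rotation redistributes loadings so each factor is easier to label (salinity, toxic elements, hardness).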
Institute of Scientific and Technical Information of China (English)
唐哲
2011-01-01
Through statistical analysis of the relevant techniques in the 19th FIFA World Cup, this article examines the data characterizing excellent attacking and defensive technique in modern football, and the intrinsic factors related to winning or losing a match, with the aim of providing a reference of value for football teaching and training. It will also serve as part of the historical database of the World Cup, supplying teaching material of reference value for football instruction and training.
Advances in statistical multisource-multitarget information fusion
Mahler, Ronald PS
2014-01-01
This is the sequel to the 2007 Artech House bestselling title, Statistical Multisource-Multitarget Information Fusion. That earlier book was a comprehensive resource for an in-depth understanding of finite-set statistics (FISST), a unified, systematic, and Bayesian approach to information fusion. The cardinalized probability hypothesis density (CPHD) filter, which was first systematically described in the earlier book, has since become a standard multitarget detection and tracking technique, especially in research and development. Since 2007, FISST has inspired a considerable amount of research
Open statistical issues in particle physics
Lyons, Louis
2008-01-01
Many statistical issues arise in the analysis of Particle Physics experiments. We give a brief introduction to Particle Physics, before describing the techniques used by Particle Physicists for dealing with statistical problems, and also some of the open statistical questions.
Jana, Madhusudan
2015-01-01
Statistical Mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate potential future scenarios in an Alpine catchment from historical data and the available climate model simulations performed within the framework of the EU CORDEX project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested to four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction; first and second moment correction; regression functions; quantile mapping using distribution-derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios to be employed in studying potential impacts. In this work we propose a non-equifeasible combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic...
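The two downscaling approaches named in this abstract can be sketched in their simplest form, first-moment correction. This is an illustrative sketch only: function and variable names are invented, and the study's variance-correction, regression, and quantile-mapping variants are not shown.

```python
# Minimal sketch of first-moment delta change vs. bias correction.
# obs_hist: historical observations; mod_hist/mod_fut: model control and
# scenario runs. All names and numbers are illustrative.

def mean(xs):
    return sum(xs) / len(xs)

def delta_change(obs_hist, mod_hist, mod_fut):
    """Perturb the observed series by the model's projected change in the mean."""
    delta = mean(mod_fut) - mean(mod_hist)
    return [x + delta for x in obs_hist]

def bias_correction(obs_hist, mod_hist, mod_fut):
    """Shift the future model series by the model's historical mean bias."""
    bias = mean(mod_hist) - mean(obs_hist)
    return [x - bias for x in mod_fut]

obs  = [10.0, 12.0, 11.0]   # e.g., historical observations (Spain02-like)
hist = [9.0, 11.0, 10.0]    # RCM control run, same period
fut  = [12.0, 14.0, 13.0]   # RCM scenario run (e.g., RCP 8.5)

print(delta_change(obs, hist, fut))    # obs shifted by the +3.0 model change
print(bias_correction(obs, hist, fut)) # fut shifted by the +1.0 bias removal
```

Delta change keeps the observed variability and adds the modeled change; bias correction keeps the modeled variability and removes the model's historical bias.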
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
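The diagnostic-test measures this review covers follow directly from a 2x2 confusion matrix. A small sketch, using hypothetical counts (the function name and numbers are illustrative, not from the paper):

```python
# Sensitivity, specificity, accuracy, and likelihood ratios from a
# hypothetical confusion matrix (tp/fp/fn/tn = true/false positives/negatives).

def diagnostic_measures(tp, fp, fn, tn):
    sens = tp / (tp + fn)                  # sensitivity: true positive rate
    spec = tn / (tn + fp)                  # specificity: true negative rate
    acc = (tp + tn) / (tp + fp + fn + tn)  # overall accuracy
    lr_pos = sens / (1 - spec)             # positive likelihood ratio
    lr_neg = (1 - sens) / spec             # negative likelihood ratio
    return sens, spec, acc, lr_pos, lr_neg

sens, spec, acc, lr_pos, lr_neg = diagnostic_measures(tp=90, fp=10, fn=10, tn=90)
print(sens, spec, acc)  # 0.9 0.9 0.9
```

A positive likelihood ratio of 9 here means a positive result makes disease nine times more likely than under the pre-test odds.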
WAIS-IV Subtest Covariance Structure: Conceptual and Statistical Considerations
Ward, L. Charles; Bergman, Maria A.; Hebert, Katina R.
2012-01-01
D. Wechsler (2008b) reported confirmatory factor analyses (CFAs) with standardization data (ages 16-69 years) for 10 core and 5 supplemental subtests from the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV). Analyses of the 15 subtests supported 4 hypothesized oblique factors (Verbal Comprehension, Working Memory, Perceptual Reasoning,…
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding official statistics.
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1997, Statistics Finland, Helsinki 1998, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-September 1998, Energy exports by recipient country in January-September 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-12-31
For the years 1997 and 1998, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1996, Statistics Finland, Helsinki 1997, ISSN 0784-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1998, Energy exports by recipient country in January-June 1998, Consumer prices of liquid fuels, Consumer prices of hard coal, Natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, Value added taxes and fiscal charges and fees included in consumer prices of some energy sources, Energy taxes and precautionary stock fees, pollution fees on oil products
Gallavotti, Giovanni
2011-01-01
C. Cercignani: A sketch of the theory of the Boltzmann equation.- O.E. Lanford: Qualitative and statistical theory of dissipative systems.- E.H. Lieb: many particle Coulomb systems.- B. Tirozzi: Report on renormalization group.- A. Wehrl: Basic properties of entropy in quantum mechanics.
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Meneghetti, M; Dahle, H; Limousin, M
2013-01-01
The existence of an arc statistics problem has been at the center of a strong debate over the last fifteen years. With the aim of clarifying whether the optical depth for giant gravitational arcs by galaxy clusters in the so-called concordance model is compatible with observations, several studies were carried out which helped to significantly improve our knowledge of strong lensing clusters, unveiling their extremely complex internal structure. In particular, the abundance and frequency of strong lensing events like gravitational arcs turned out to be a potentially very powerful tool to trace structure formation. However, given the limited size of observational and theoretical datasets, the power of arc statistics as a cosmological tool has been only minimally exploited so far. On the other hand, the last years have been characterized by significant advancements in the field, and several cluster surveys that are ongoing or planned for the near future seem to have the potential to make arc statistics a competitive cosmo...
When Mathematics and Statistics Collide in Assessment Tasks
Bargagliotti, Anna; Groth, Randall
2016-01-01
Because the disciplines of mathematics and statistics are naturally intertwined, designing assessment questions that disentangle mathematical and statistical reasoning can be challenging. We explore the writing of statistics assessment tasks that take into consideration the mathematical reasoning they may inadvertently activate.
Regulatory considerations for biosimilars
Directory of Open Access Journals (Sweden)
Ranjani Nellore
2010-01-01
Currently there is considerable interest in the legislative debate around generic biological drugs or "biosimilars" in the EU and US due to the large, lucrative market that they offer to the industry. While some countries have issued a few regulatory guidelines as well as product-specific requirements, there is no general consensus as to a single, simple mechanism, similar to the bioequivalence determination that leads to approval of generic small molecules all over the world. The inherently complex nature of the molecules, along with the complicated manufacturing and analytical techniques to characterize them, makes it difficult to rely on a single human pharmacokinetic study for assurance of safety and efficacy. In general, the concept of comparability has been used for evaluation of the currently approved "similar" biologicals, where a step-by-step assessment of the quality, preclinical, and clinical aspects is made. In India, the focus is primarily on the availability and affordability of life-saving drugs. In this context every product needs to be evaluated on its own merit, irrespective of the innovator brand. The formation of the National Biotechnology Regulatory Authority may provide a step in the right direction for regulation of these complex molecules. However, in order to have efficient machinery for initial approval and ongoing oversight with a country-specific focus, cooperation with international authorities for granting approvals and continuous risk-benefit review is essential. Several steps are still needed for India to be perceived as a country that leads the world in providing quality biological products.
2012-01-01
In 1975 John Tukey proposed a multivariate median which is the 'deepest' point in a given data cloud in R^d. Later, in measuring the depth of an arbitrary point z with respect to the data, David Donoho and Miriam Gasko considered hyperplanes through z and determined its 'depth' by the smallest portion of data that are separated by such a hyperplane. Since then, these ideas have proved extremely fruitful. A rich statistical methodology has developed that is based on data depth and, more general...
Sheffield, Scott
2009-01-01
In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.
Search Databases and Statistics
DEFF Research Database (Denmark)
Refsgaard, Jan C; Munk, Stephanie; Jensen, Lars J
2016-01-01
…the vast amounts of raw data. This task is tackled by computational tools implementing algorithms that match the experimental data to databases, providing the user with lists for downstream analysis. Several platforms for such automated interpretation of mass spectrometric data have been developed, each having strengths and weaknesses that must be considered for the individual needs. These are reviewed in this chapter. Equally critical for generating highly confident output datasets is the application of sound statistical criteria to limit the inclusion of incorrect peptide identifications from database searches. Additionally, careful filtering and use of appropriate statistical tests on the output datasets affect the quality of all downstream analyses and interpretation of the data. Our considerations and general practices on these aspects of phosphoproteomics data processing are presented here.
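One widely used statistical criterion of the kind this chapter discusses is target-decoy filtering, where the false discovery rate (FDR) is estimated from the fraction of decoy hits above a score threshold. The sketch below is illustrative only: the function, the toy scores, and the threshold are invented, and real search platforms use more elaborate scoring.

```python
# Target-decoy FDR filtering sketch: keep target identifications down to the
# lowest score at which (#decoys / #targets) stays within the FDR bound.

def filter_at_fdr(hits, max_fdr=0.01):
    """hits: list of (score, is_decoy). Returns target scores passing the bound."""
    hits = sorted(hits, key=lambda h: h[0], reverse=True)
    kept, best = [], []
    decoys = targets = 0
    for score, is_decoy in hits:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
            kept.append(score)
        if decoys / max(targets, 1) <= max_fdr:
            best = list(kept)  # largest prefix still satisfying the FDR bound
    return best

# Toy example with a deliberately loose 50% FDR so the effect is visible.
hits = [(50, False), (48, False), (45, True), (44, False), (30, True)]
print(filter_at_fdr(hits, max_fdr=0.5))  # [50, 48, 44]
```

The decoy at score 30 pushes the estimated FDR above the bound, so only the three higher-scoring targets survive.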
Statistics of the galaxy distribution
Martinez, Vicent J
2001-01-01
Over the last decade, statisticians have developed new statistical tools in the field of spatial point processes. At the same time, observational efforts have yielded a huge amount of new cosmological data to analyze. Although the main tools in astronomy for comparing theoretical results with observation are statistical, in recent years, cosmologists have not been generally aware of the developments in statistics and vice versa.Statistics of the Galaxy Distribution describes both the available observational data on the distribution of galaxies and the applications of spatial statistics in cosmology. It gives a detailed derivation of the statistical methods used to study the galaxy distribution and the cosmological physics needed to formulate the statistical models. Because the prevalent approach in cosmological statistics has been frequentist, the authors focus on the most widely used of these methods, but they also explore Bayesian techniques that have become popular in large-scale structure studies.Describi...
Paine, Gregory Harold
1982-03-01
The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better
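The maximum-entropy step described in this abstract has a standard closed form. Maximizing the entropy subject to a single constant of motion $I$ as a constraint yields a Gibbs-type distribution; this is the textbook result, stated here for illustration rather than the thesis's specific network Lagrangian:

```latex
p(x) \;=\; \frac{1}{Z(\lambda)}\, e^{-\lambda I(x)},
\qquad
Z(\lambda) \;=\; \int e^{-\lambda I(x)}\, dx,
```

where the Lagrange multiplier $\lambda$ is fixed by requiring that the expectation $\langle I \rangle$ under $p$ match the conserved value of the constant of motion.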
Statistical analysis of management data
Gatignon, Hubert
2013-01-01
This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.
The Malpractice of Statistical Interpretation
Fraas, John W.; Newman, Isadore
1978-01-01
Problems associated with the use of gain scores, analysis of covariance, multicollinearity, part and partial correlation, and the lack of rectilinearity in regression are discussed. Particular attention is paid to the misuse of statistical techniques. (JKS)
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Statistical mechanics of learning
Engel, Andreas
2001-01-01
The effort to build machines that are able to learn and undertake tasks such as datamining, image processing and pattern recognition has led to the development of artificial neural networks in which learning from examples may be described and understood. The contribution to this subject made over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently only found scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.
Celiac disease - nutritional considerations
Celiac disease is an immune disorder passed down through families. ...
Energy Technology Data Exchange (ETDEWEB)
Chen, Gang; Lin, Yuehe
2008-07-20
Sensitive and selective detection techniques are of crucial importance for capillary electrophoresis (CE), microfluidic chips, and other microfluidic systems. Electrochemical detectors have attracted considerable interest for microfluidic systems with features that include high sensitivity, inherent miniaturization of both the detection and control instrumentation, low cost and power demands, and high compatibility with microfabrication technology. The commonly used electrochemical detectors can be classified into three general modes: conductimetry, potentiometry, and amperometry.
DOE handbook: Design considerations
Energy Technology Data Exchange (ETDEWEB)
NONE
1999-04-01
The Design Considerations Handbook includes information and suggestions for the design of systems typical to nuclear facilities, information specific to various types of special facilities, and information useful to various design disciplines. The handbook is presented in two parts. Part 1, which addresses design considerations, includes two sections. The first addresses the design of systems typically used in nuclear facilities to control radiation or radioactive materials. Specifically, this part addresses the design of confinement systems and radiation protection and effluent monitoring systems. The second section of Part 1 addresses the design of special facilities (i.e., specific types of nonreactor nuclear facilities). The specific design considerations provided in this section were developed from review of DOE 6430.1A and are supplemented with specific suggestions and considerations from designers with experience designing and operating such facilities. Part 2 of the Design Considerations Handbook describes good practices and design principles that should be considered in specific design disciplines, such as mechanical systems and electrical systems. These good practices are based on specific experiences in the design of nuclear facilities by design engineers with related experience. This part of the Design Considerations Handbook contains five sections, each of which applies to a particular engineering discipline.
Considerations and Algorithms for Compression of Sets
DEFF Research Database (Denmark)
Larsson, Jesper
We consider compression of unordered sets of distinct elements. After a discussion of the general problem, we focus on compressing sets of fixed-length bitstrings in the presence of statistical information. We survey techniques from previous work, suggesting some adjustments, and propose a novel compression algorithm that allows transparent incorporation of various estimates for the probability distribution. Our experimental results allow the conclusion that set compression can benefit from incorporating statistics, using our method or variants of previously known techniques.
CONSIDERATIONS ON CONSUMER PERCEIVED RISK
Directory of Open Access Journals (Sweden)
Laura Catalina Timiras
2014-12-01
In this article we identified a number of factors influencing consumers’ perceived risk. In the first part we conducted a review of the main issues that define the risk perceived by the consumer at the time of purchase, some of the lines of action organizations take to diminish this risk perception, and a number of influencing factors presented in the literature with significant impact on the intensity with which risk is perceived by consumers. The second part of the article is based on statistical information regarding the e-commerce market, a market in which perceived risk plays an important role in the purchasing decision. Thus, based on available official statistics provided by Eurostat, we have revealed the existence of certain links between electronic commerce and orientation towards risk, income levels, age, and consumer educational level. The review is not intended to be exhaustive, the study taking into consideration only those links that can be identified using official statistical data.
T1 VSAT Fade Compensation Statistical Results
Johnson, Sandra K.; Acosta, Roberto; Ugweje, Oke
2000-01-01
New satellite communication systems are steadily seeking to use higher frequency bands to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communication Technology Satellite (ACTS), launched in September 1993, is the first U.S. communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including on-board baseband processing, multiple beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this paper is to characterize the method used by the ACTS T1 Very Small Aperture Terminal (T1 VSAT) ground stations in detecting the presence of fade in the communication signal and adaptively compensating for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program were used to validate the compensation technique. A software process was developed and demonstrated to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are offered.
Arslan, Hakan; Ayyildiz Turan, Nazlı
2015-08-01
Monitoring of heavy metal concentrations in groundwater potentially used for drinking and irrigation is very important. This study collected groundwater samples from 78 wells in July 2012 and analyzed them for 17 heavy metals (Pb, Zn, Cr, Mn, Fe, Cu, Cd, Co, Ni, Al, As, Mo, Se, B, Ti, V, Ba). Spatial distributions of these elements were identified using three different interpolation methods [inverse distance weighting (IDW), radial basis function (RBF), and ordinary kriging (OK)]. Root mean squared error (RMSE) and mean absolute error (MAE) under cross-validation were used to select the best interpolation method for each parameter. Multivariate statistical analyses [cluster analysis (CA) and factor analysis (FA)] were used to identify similarities among sampling sites and the contribution of variables to groundwater pollution. Fe and Mn levels exceeded World Health Organization (WHO) recommended limits for drinking water in almost all of the study area, and some locations had Fe and Mn levels that exceeded Food and Agriculture Organization (FAO) guidelines for drip irrigation systems. Al, As, and Cd levels also exceeded WHO guidelines for drinking water. Cluster analysis classified groundwater in the study area into three groups, and factor analysis identified five factors that explained 73.39% of the total variation in groundwater, as follows: factor 1: Se, Ti, Cr, Mo; factor 2: Ni, Mn, Co, Ba; factor 3: Pb, Cd; factor 4: B, V, Fe, Cu; and factor 5: As, Zn. The results show that interpolation methods and multivariate statistical techniques give very useful results for identifying pollution sources.
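The model-selection step described in this abstract, cross-validation scored by RMSE and MAE, can be sketched for the simplest of the three interpolators (IDW). The function names, the `power` parameter, and the well coordinates and concentrations below are illustrative assumptions, not data from the study:

```python
import math

def idw(x, y, points, power=2):
    """Inverse distance weighted estimate at (x, y) from (xi, yi, value) samples."""
    num = den = 0.0
    for xi, yi, v in points:
        d = math.hypot(x - xi, y - yi)
        if d == 0:
            return v  # exact hit on a sample location
        w = d ** -power
        num += w * v
        den += w
    return num / den

def loo_errors(points, power=2):
    """Leave-one-out cross-validation: RMSE and MAE of IDW predictions."""
    errs = []
    for i, (xi, yi, v) in enumerate(points):
        others = points[:i] + points[i + 1:]
        errs.append(idw(xi, yi, others, power) - v)
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    mae = sum(abs(e) for e in errs) / len(errs)
    return rmse, mae

# Hypothetical wells: (x, y, concentration); the method with the lowest
# RMSE/MAE for a given element would be chosen for its spatial map.
wells = [(0, 0, 1.0), (1, 0, 1.2), (0, 1, 0.9), (1, 1, 1.1), (0.5, 0.5, 1.05)]
rmse, mae = loo_errors(wells)
```

The same leave-one-out loop would be run per element for each of IDW, RBF, and OK, keeping whichever interpolator minimizes the error.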
Dimensional enrichment of statistical linked open data
DEFF Research Database (Denmark)
Varga, Jovan; Vaisman, Alejandro; Romero, Oscar
2016-01-01
On-Line Analytical Processing (OLAP) is a data analysis technique typically used for local and well-prepared data. However, initiatives like Open Data and Open Government bring new and publicly available data on the web that are to be analyzed in the same way. The use of semantic web technologies for this context is especially encouraged by the Linked Data initiative. There is already a considerable amount of statistical linked open data sets published using the RDF Data Cube Vocabulary (QB), which is designed for these purposes. However, QB lacks some essential schema constructs (e.g., dimension levels) to support OLAP. Thus, the QB4OLAP vocabulary has been proposed to extend QB with the necessary constructs and be fully compliant with OLAP. In this paper, we focus on the enrichment of an existing QB data set with QB4OLAP semantics. We first thoroughly compare the two vocabularies and outline the benefits…
Techniques for Wireless Applications
Gaaloul, Fakhreddine
2012-05-01
Switching techniques were first proposed as spatial diversity techniques. They have been shown to reduce the processing load considerably while letting multi-antenna systems achieve a specific target performance. In this thesis, we take a different look at switching schemes by applying them to other wireless applications. More specifically, this thesis consists of three main parts. The first part considers a multiuser environment and an adaptive scheduling algorithm based on the switching-with-post-selection scheme for statistically independent but non-identically distributed channel conditions. The performance of this switch-based scheduler is investigated and a multitude of performance metrics are presented. In the second part, we propose and analyze the performance of three switch-based algorithms for interference reduction in the downlink of overloaded femtocells. Performance metrics are derived in closed form and used to compare the three proposed schemes. Finally, in the third part, a switch-based opportunistic channel access scheme is proposed for a cognitive radio system and its performance is analyzed in terms of two newly proposed metrics, namely the average cognitive radio access and the waiting time duration.
Institute of Scientific and Technical Information of China (English)
高虹
2012-01-01
Modern volleyball competition is increasingly intense, and techniques and tactics have become varied. "Comprehensive" and "stereoscopic" play has become the leading attack tactic, and a reasonable, concise, practical "fast-changing" style is the development trend of volleyball technique and tactics. Using video observation, statistics, and logical analysis, this paper analyzes the points won and lost on serving, first passing, blocking, smashing, and defending by the top eight teams in the finals of the 11th National Games men's volleyball tournament. The results indicate that each team's active scoring comes mainly from the smash, so scoring means are limited; serving and spiking are not strongly offensive and their stability still needs to be raised; and the success rate of blocking against power attacks is low.
A statistical manual for chemists
Bauer, Edward
1971-01-01
A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect
Statistical methods for ranking data
Alvo, Mayer
2014-01-01
This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
Directory of Open Access Journals (Sweden)
Adrion Christine
2012-09-01
Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. Writing a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. It is therefore of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered, and the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to data from an open clinical trial on vertigo attacks; these data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions, which are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g., the logarithmic score). Results: The instruments under study…
Finkelstein, Michael O
2015-01-01
This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...
Your Chi-Square Test Is Statistically Significant: Now What?
Directory of Open Access Journals (Sweden)
Donald Sharpe
2015-04-01
Applied researchers have employed chi-square tests for more than one hundred years. This paper addresses the question of how one should follow up a statistically significant chi-square test result in order to determine the source of that result. Four approaches were evaluated: calculating residuals, comparing cells, ransacking, and partitioning. Data from two recent journal articles were used to illustrate these approaches. A call is made for greater consideration of foundational techniques such as the chi-square tests.
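Of the four follow-up approaches the paper evaluates, calculating residuals is the most widely used: adjusted standardized residuals are approximately standard normal under independence, so cells with values beyond about ±1.96 are the likely source of a significant result. A minimal sketch, using a hypothetical 2x2 table rather than the paper's data:

```python
import math

def chi_square_residuals(table):
    """Pearson chi-square statistic plus Haberman's adjusted standardized
    residuals for an r x c contingency table given as a list of rows."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    chi2 = 0.0
    resid = []
    for i, row in enumerate(table):
        rrow = []
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / n
            chi2 += (obs - exp) ** 2 / exp
            # Adjusted residual: ~N(0, 1) under the independence hypothesis
            adj = (obs - exp) / math.sqrt(exp * (1 - rows[i] / n) * (1 - cols[j] / n))
            rrow.append(adj)
        resid.append(rrow)
    return chi2, resid

# Hypothetical table: 2 groups x 2 outcomes
chi2, resid = chi_square_residuals([[30, 10], [20, 40]])
```

Cells whose `|resid[i][j]|` exceed the critical value then identify where the association comes from, rather than just whether it exists.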
Institute of Scientific and Technical Information of China (English)
巩庆波; 徐兰君
2014-01-01
Based on a statistical analysis of the attack and defense techniques of men's basketball at the Beijing and London Olympic Games, overall offensive and defensive quality improved in London compared with Beijing, but the performance of the Chinese men's basketball team declined significantly. Using factor and cluster analysis, the technical statistical indices can be divided into four factors: an offense factor, a response factor, a psychological factor, and a physical factor; these four factors are the main determinants of a team's results. In the cluster analysis, the US team forms a group of its own, while the grouping of the other teams varies between the two Olympics. The Chinese team is grouped together with top European teams such as Spain because of their apparent similarities. The urgent problems, therefore, are to increase exchanges with these top European teams, improve players' strength, and raise technical and tactical capabilities under strongly antagonistic conditions.
Introductory statistics for engineering experimentation
Nelson, Peter R; Coffin, Marie
2003-01-01
The Accreditation Board for Engineering and Technology (ABET) introduced a criterion, starting with their 1992-1993 site visits, that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy the needs of a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...
New Considerations for Spectral Classification of Boolean Switching Functions
Directory of Open Access Journals (Sweden)
J. E. Rice
2011-01-01
This paper presents some new considerations for spectral techniques for the classification of Boolean functions, including a discussion of the feasibility of extending this classification technique beyond n=5. A new implementation is presented along with a basic analysis of the complexity of the problem. We also note a correction to results in this area that were reported in previous work.
System design considerations for fast-neutron interrogation systems
Energy Technology Data Exchange (ETDEWEB)
Micklich, B.J.; Curry, B.P.; Fink, C.L.; Smith, D.L.; Yule, T.J.
1993-10-01
Nonintrusive interrogation techniques that employ fast neutrons are of interest because of their sensitivity to light elements such as carbon, nitrogen, and oxygen. The primary requirement of a fast-neutron inspection system is to determine the value of atomic densities, or their ratios, over a volumetric grid superimposed on the object being interrogated. There are a wide variety of fast-neutron techniques that can provide this information. The differences between the various nuclear systems can be considered in light of the trade-offs relative to the performance requirements for each system's components. Given a set of performance criteria, the operational requirements of the proposed nuclear systems may also differ. For instance, resolution standards will drive scanning times and tomographic requirements, both of which vary for the different approaches. We are modelling a number of the fast-neutron interrogation techniques currently under consideration, including Fast Neutron Transmission Spectroscopy (FNTS), Pulsed Fast Neutron Analysis (PFNA), and its variant, 14-MeV Associated Particle Imaging (API). The goals of this effort are to determine the component requirements for each technique, identify trade-offs that system performance standards impose upon those component requirements, and assess the relative advantages and disadvantages of the different approaches. In determining the component requirements, we will consider how they are driven by system performance standards, such as image resolution, scanning time, and statistical uncertainty. In considering the trade-offs between system components, we concentrate primarily on those which are common to all approaches, for example: source characteristics versus detector array requirements. We will then use the analysis to propose some figures-of-merit that enable performance comparisons between the various fast-neutron systems under consideration. The status of this ongoing effort is presented.
Statistical physics of vaccination
Wang, Zhen; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaz; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-01-01
Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination - one of the most important preventive measures of modern times - is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated ana...
Energy Technology Data Exchange (ETDEWEB)
Hernandez M, B. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)
1997-07-01
The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over the annual cycle of 1990, based on concentrations obtained through the PIXE technique; to identify suitable statistical methods to apply to the concentration data for metals found as total suspended particles (PST); and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. The results are intended to support the decision making and control measures planned by the various institutions addressing atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)
Common pitfalls in statistical analysis: Logistic regression.
Ranganathan, Priya; Pramesh, C S; Aggarwal, Rakesh
2017-01-01
Logistic regression analysis is a statistical technique to evaluate the relationship between various predictor variables (either categorical or continuous) and an outcome which is binary (dichotomous). In this article, we discuss logistic regression analysis and the limitations of this technique.
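The relationship this article describes, a binary outcome modeled through the log-odds of predictor variables, can be sketched with a minimal one-predictor fit. The gradient-ascent routine, step size, and toy data below are illustrative assumptions, not content from the article:

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit a one-predictor logistic model P(y=1) = sigmoid(b0 + b1*x)
    by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. the intercept
            g1 += (y - p) * x    # gradient w.r.t. the slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Hypothetical data: the dichotomous outcome becomes likelier as x grows
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
# exp(b1) is then interpreted as the odds ratio per unit increase in x
```

In practice one would use a statistics package rather than hand-rolled optimization, but the interpretation of the fitted coefficients as log-odds ratios is the same.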
Implant treatment planning considerations.
Kao, Richard T
2008-04-01
As dental implants become a more accepted treatment modality, there is a need for all parties involved with implant dentistry to be familiar with various treatment planning issues. Though the success can be highly rewarding, failure to forecast treatment planning issues can result in an increase of surgical needs, surgical cost, and even case failure. In this issue, the focus is on implant treatment planning considerations.
On quantum statistical inference
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various…
Al-Tufail, M; Akram, M; Haq, A
1999-03-01
The method previously used in the Toxicology Laboratories of King Faisal Specialist Hospital and Research Center for determining the zinc concentration in serum by Zeeman atomic absorption spectrometry was improved by modifying the matrix modifier and by changing the heated graphite furnace atomization (HGA) program. After trying several methods, we had failed to achieve the required precision and accuracy for serum zinc determination. The matrix modifier was therefore changed to a fifty percent (v/v) mixture of 3.90 grams per liter of ammonium phosphate in Type 1 water with 0.2% nitric acid, and 1.0 gram per liter of magnesium nitrate in acidic water (0.2% HNO3) with 0.1% Triton X-100. A twenty-five-fold dilution of the sample in matrix modifier was injected onto the L'vov platform of the furnace, and the furnace program was modified to reduce the high sensitivity of Zn. The method was found to be very robust: the average inter-run and intra-run reproducibility is better than 1.59%, with a high degree of accuracy. Two levels of controls were used, i.e., a normal (low) level and an abnormal (high) level. The linearity and the detection limit of the assay were 0.9992 and 0.010 micromol/L, respectively, and the average recovery of the analyte was 98.65%. X-bar and R charts were constructed using Shewhart's statistical analysis technique to assess the test methodology, and the assay was found capable and stable for routine clinical and research analysis. The capability index (Cp) of the assay, an indicator of precision, was also calculated.
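The Shewhart X-bar and R charts mentioned above track the subgroup mean and range of repeated control measurements against limits derived from standard control-chart constants. A minimal sketch; the function names and the duplicate zinc readings below are illustrative assumptions, not the laboratory's data:

```python
def xbar_r_limits(subgroups):
    """Shewhart X-bar and R chart control limits for equal-size subgroups.
    A2, D3, D4 are the standard control-chart constants for n = 2..5."""
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}
    D3 = {2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0}
    D4 = {2: 3.267, 3: 2.574, 4: 2.282, 5: 2.114}
    n = len(subgroups[0])
    xbars = [sum(g) / n for g in subgroups]       # subgroup means
    ranges = [max(g) - min(g) for g in subgroups]  # subgroup ranges
    xbarbar = sum(xbars) / len(xbars)              # grand mean (center line)
    rbar = sum(ranges) / len(ranges)               # mean range (center line)
    return {
        "xbar": (xbarbar - A2[n] * rbar, xbarbar, xbarbar + A2[n] * rbar),
        "R": (D3[n] * rbar, rbar, D4[n] * rbar),
    }

# Hypothetical duplicate control measurements (micromol/L), subgroup size 2
runs = [(12.1, 12.3), (12.0, 12.2), (12.4, 12.1), (12.2, 12.2)]
limits = xbar_r_limits(runs)
```

A new control run falling outside the (LCL, UCL) pair on either chart would flag the assay as out of statistical control.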
Energy Technology Data Exchange (ETDEWEB)
Georg, Dietmar, E-mail: Dietmar.Georg@akhwien.at; Hopfgartner, Johannes; Gòra, Joanna; Kuess, Peter; Kragl, Gabriele; Berger, Daniel; Hegazy, Neamat; Goldner, Gregor; Georg, Petra [Department of Radiation Oncology, Medical University of Vienna/Allgemeines Krankenhaus der Stadt Wien, Vienna (Austria); several authors also at the Christian Doppler Laboratory for Medical Radiation Research for Radiation Oncology, Medical University of Vienna, Vienna (Austria)]
2014-03-01
Purpose: To assess the dosimetric differences among volumetric modulated arc therapy (VMAT), scanned proton therapy (intensity-modulated proton therapy, IMPT), scanned carbon-ion therapy (intensity-modulated carbon-ion therapy, IMIT), and low-dose-rate (LDR) and high-dose-rate (HDR) brachytherapy (BT) treatment of localized prostate cancer. Methods and Materials: Ten patients were considered for this planning study. For external beam radiation therapy (EBRT), the planning target volume was created by adding a margin of 5 mm (lateral/anterior–posterior) and 8 mm (superior–inferior) to the clinical target volume. Bladder wall (BW), rectal wall (RW), femoral heads, urethra, and pelvic tissue were considered as organs at risk. For VMAT and IMPT, 78 Gy(relative biological effectiveness, RBE)/2 Gy were prescribed. The IMIT was based on 66 Gy(RBE)/20 fractions. The clinical target volume planning aims for HDR-BT (¹⁹²Ir) and LDR-BT (¹²⁵I) were D90% ≥34 Gy in 8.5 Gy per fraction and D90% ≥145 Gy. Both physical and RBE-weighted dose distributions for protons and carbon ions were converted to dose distributions based on 2-Gy(IsoE) fractions. From these dose distributions various dose and dose–volume parameters were extracted. Results: Rectal wall exposure at 30-70 Gy(IsoE) was reduced for IMIT, LDR-BT, and HDR-BT when compared with VMAT and IMPT. The high-dose region of the BW dose–volume histogram above 50 Gy(IsoE) of IMPT resembled the VMAT shape, whereas all other techniques showed a significantly lower high-dose region. For all 3 EBRT techniques similar urethra Dmean values around 74 Gy(IsoE) were obtained. The LDR-BT results were approximately 30 Gy(IsoE) higher, HDR-BT 10 Gy(IsoE) lower. Normal tissue and femoral head sparing was best with BT. Conclusion: Despite the different EBRT prescription and fractionation schemes, the high-dose regions of BW and RW expressed in Gy(IsoE) were on the same order of magnitude. Brachytherapy techniques
Energy Technology Data Exchange (ETDEWEB)
Ostrowsky, A.; Daures, J
2008-07-01
Calorimetry is the most direct dosimetric technique to reach absorbed dose. A calorimeter gives direct access to the energy imparted to matter by ionizing radiation per mass unit by measuring the heat quantity Q produced under irradiation in its sensitive element which is thermally insulated. Graphite was chosen as construction material because all the energy imparted to graphite by ionizing radiation is converted into heat. Thermistors are used for temperature measurements as well as for the electrical heating of the different bodies of the calorimeter. The construction of a calorimeter is the result of a compromise between dosimetric requirements and mechanical constraints. The difficulties encountered are examined and the solutions chosen are detailed. All technical data are gathered in this document. The aim is to provide a practical operative instruction and guidance document, which can help interested laboratories in designing such an instrument. The electrical and thermal tests have shown a good behaviour of the GR9 calorimeter.
Single-case research design in pediatric psychology: considerations regarding data analysis.
Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E
2014-03-01
Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are becoming increasingly evaluated and used to augment data interpretation.
Cultural Considerations in Translation
Institute of Scientific and Technical Information of China (English)
陈嫔荣
2009-01-01
Language is the expression of human communication through which knowledge, belief, and behavior can be experienced, explained, and shared. It influences the way speakers perceive the world. But as has long been taken for granted, translation deals only with language; the cultural perspective has rarely been brought into the discussion. This paper first analyses the definitions of translation and culture, then discusses why we should take culture into consideration, and in the end introduces two translating strategies: domestication and foreignization.
Mc Leod, Roger D.; Mc Leod, David M.
2002-10-01
Archimedes articulated an applied physics experience of many children who observe the upward movement of floating objects when they get into their "tubs." This same principle can effectively allow massive Egyptian construction blocks and obelisks to be elevated and erected. Platform bases at Giza were leveled by means of water channels that were cut into the rock. There is a canal behind the pyramids. The bathtub technique can elevate or transport the water-borne block (or obelisk) to sites involved, including the Sphinx temple. Water outflow from the barge locks (tubs) can erode Sphinx surrounds, without invoking 7000+ year-ago rainy weather. Our previously detailed account of how constellations, Canis Major, Phoenix, Leo can be detected at sites like America's Stonehenge, while they are below the local horizon, also indicates ancient Egyptians may have done likewise. Orion, or Leo the Sphinx could have been detected while they were in the "underground," around BCE 2500, in alignments otherwise requiring a date of BCE 1050.
Characterizing Financial and Statistical Literacy
DEFF Research Database (Denmark)
Di Girolamo, Amalia; Harrison, Glenn W.; Lau, Morten
We characterize the literacy of an individual in a domain by their elicited subjective belief distribution over the possible responses to a question posed in that domain. We consider literacy across several financial, economic and statistical domains. We find considerable demographic heterogeneity… approach to characterize financial capability, the consequences of non-literacy, social literacy, and the information content of hypothetical survey measures of literacy.
Elements of Statistical Mechanics
Sachs, Ivo; Sen, Siddhartha; Sexton, James
2006-05-01
This textbook provides a concise introduction to the key concepts and tools of modern statistical mechanics. It also covers advanced topics such as non-relativistic quantum field theory and numerical methods. After introducing classical analytical techniques, such as cluster expansion and Landau theory, the authors present important numerical methods with applications to magnetic systems, Lennard-Jones fluids and biophysics. Quantum statistical mechanics is discussed in detail and applied to Bose-Einstein condensation and topics in astrophysics and cosmology. In order to describe emergent phenomena in interacting quantum systems, canonical non-relativistic quantum field theory is introduced and then reformulated in terms of Feynman integrals. Combining the authors' many years' experience of teaching courses in this area, this textbook is ideal for advanced undergraduate and graduate students in physics, chemistry and mathematics. Analytical and numerical techniques in one text, including sample codes and solved problems on the web at www.cambridge.org/0521841984 Covers a wide range of applications including magnetic systems, turbulence astrophysics, and biology Contains a concise introduction to Markov processes and molecular dynamics
Reliability Considerations for the Operation of Large Accelerator User Facilities
Willeke, F J
2016-01-01
The lecture provides an overview of considerations relevant for achieving highly reliable operation of accelerator based user facilities. The article starts with an overview of statistical reliability formalism which is followed by high reliability design considerations with examples. The article closes with operational aspects of high reliability such as preventive maintenance and spares inventory.
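Not from the lecture itself; a minimal sketch of the standard statistical reliability formalism it refers to, assuming the common exponential failure model. The function names, MTBF and MTTR values are illustrative, not taken from the article.

```python
import math

def reliability(t, mtbf):
    """Survival probability R(t) = exp(-t / MTBF) under an exponential failure model."""
    return math.exp(-t / mtbf)

def availability(mtbf, mttr):
    """Steady-state availability: long-run fraction of time the facility is up."""
    return mtbf / (mtbf + mttr)

# Hypothetical subsystem: MTBF = 500 h, mean time to repair = 4 h
print(round(availability(500.0, 4.0), 4))   # → 0.9921
print(round(reliability(24.0, 500.0), 4))   # → 0.9531
```

For a chain of independent subsystems, availabilities multiply, which is why user facilities chasing high uptime invest in preventive maintenance (raising MTBF) and spares inventory (lowering MTTR).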
Ethical Dimensions of Diagnosing: Considerations for Clinical Mental Health Counselors
Kress, Victoria E.; Hoffman, Rachel M.; Eriksen, Karen
2010-01-01
There are numerous ethical considerations inherent within the process of assigning a "Diagnostic and Statistical Manual of Mental Disorders" (4th ed., text rev.; "DSM-IV-TR"; American Psychiatric Association, 2000) diagnosis. In this article, general ethics considerations such as informed consent and confidentiality, accuracy of diagnosis, and…
A primer of multivariate statistics
Harris, Richard J
2014-01-01
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why
Plastic Surgery Statistics: plastic surgery procedural statistics from the American Society of Plastic Surgeons (annual reports, e.g., 2015 and 2016).
MQSA National Statistics: scorecard statistics from the FDA Mammography Quality Standards Act and Program (archived scorecards, 2015-2017).
Institute of Scientific and Technical Information of China (English)
朱亚乔; 刘元波
2013-01-01
Rain drop size distribution (DSD) is one of the key parameters of the micro-physical processes and macro-dynamical structure of precipitation. It provides useful information for understanding the mechanisms of precipitation formation and development. Conventional measurement techniques include the momentum method, flour method, filter paper, raindrop camera and immersion method. In general, these techniques suffer from large measurement errors, heavy workload, and low efficiency. The disdrometer is a remarkable advance in DSD observation. To date, the major techniques are classified into impacting, optical and acoustic disdrometers, which are automated, more convenient and more accurate. The impacting disdrometer transforms the momentum of raindrops into electric impulses; it is easy to operate and quality-assured, but has large errors for extremely large or small raindrops. The optical disdrometer measures raindrop diameter and velocity at the same time, but cannot distinguish particles passing through the sampling area simultaneously. The acoustic disdrometer determines DSD from raindrop impacts on a water body with high temporal resolution, but is easily affected by wind. In addition, Doppler radar with polarimetric techniques can provide DSD over large areas, though it is affected by updrafts, downdrafts and horizontal winds. DSD has meteorological features that can be described with the Marshall-Palmer (M-P), Gamma, lognormal or normalized models. The M-P model is suitable for steady rainfall and is usually used for weak and moderate rainfall. The Gamma model is proposed for DSD at high rain rates. The lognormal model is widely applied to cloud droplet analysis, but is not appropriate for DSD with a broad spectrum. The normalized model is free of assumptions about the shape of the DSD. For practical application, statistical comparison is necessary to select the most suitable model. Meteorologically, convective rain has a relatively narrow and smooth DSD spectrum usually described by the M-P …
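Not part of the abstract; a minimal sketch of the Marshall-Palmer model it names, using the classic M-P parameter values (intercept N0 = 8000 m⁻³ mm⁻¹ and slope Λ = 4.1 R⁻⁰·²¹ mm⁻¹). The function names are illustrative.

```python
import math

N0 = 8000.0  # Marshall-Palmer intercept, m^-3 mm^-1

def lam(rain_rate):
    """Slope parameter Lambda (mm^-1) as a function of rain rate R (mm/h)."""
    return 4.1 * rain_rate ** -0.21

def mp_dsd(diameter, rain_rate):
    """Marshall-Palmer drop concentration N(D) = N0 * exp(-Lambda * D), m^-3 mm^-1."""
    return N0 * math.exp(-lam(rain_rate) * diameter)

# Concentration of 1 mm drops at a rain rate of 5 mm/h
print(round(mp_dsd(1.0, 5.0), 1))
```

Heavier rain gives a smaller Λ and hence a flatter spectrum with relatively more large drops, which is why the exponential M-P form works for steady rain but underestimates the curvature of convective DSDs that the Gamma model captures.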
Einstein's cosmological considerations
Janzen, Daryl
2014-01-01
The objective of this paper is not simply to present an historical overview of Einstein's cosmological considerations, but to discuss the central role they played in shaping the paradigm of relativistic cosmology. This, we'll show, was a result of both his actions and, perhaps more importantly, his inactions. Accordingly, discussion won't simply be restricted to Einstein's considerations, as we'll analyse relevant contributions to the relativistic expansion paradigm during the approximately twenty years following Slipher's first redshift measurements in 1912. Our aim is to shed some light on why we think some of the things we do, with the idea that a better understanding of the reasoning that fundamentally influenced the common idea of our expanding universe might help to resolve some of the significant problems that modern cosmology now faces; and we eventually use this knowledge to probe the foundations of the standard model. Much of the information we present, including many of the historical details, we e...
Expansions and Asymptotics for Statistics
Small, Christopher G
2010-01-01
Providing a broad toolkit of analytical methods, this book shows how asymptotics, when coupled with numerical methods, becomes a powerful way to acquire a deeper understanding of the techniques used in probability and statistics. It describes core ideas in statistical asymptotics; covers Laplace approximation, the saddle-point method, and summation of series; and includes vignettes of various people from statistics and mathematics whose ideas have been instrumental in the development of the subject. The author also supplements some topics with relevant Maple commands and provides a list of c…
Consideration of Dynamical Balances
Errico, Ronald M.
2015-01-01
The quasi-balance of extra-tropical tropospheric dynamics is a fundamental aspect of nature. If an atmospheric analysis does not reflect such balance sufficiently well, the subsequent forecast will exhibit unrealistic behavior associated with spurious fast-propagating gravity waves. Even if these eventually damp, they can create poor background fields for a subsequent analysis or interact with moist physics to create spurious precipitation. The nature of this problem will be described along with the reasons for atmospheric balance and techniques for mitigating imbalances. Attention will be focused on fundamental issues rather than on recipes for various techniques.
[Ledderhose disease (case considerations)].
Bottinelli, N F
1982-12-01
After a preliminary note, the author gives a case survey of 30 patients treated surgically over a period of about 12 years. A percentage analysis of the different presentations of Ledderhose's disease in the cases considered is also given. In conclusion, the author recalls the operative technique that gave the best results: wide partial aponeurectomy.
Microbiology--Safety Considerations.
Hoffmann, Sheryl K.
This paper discusses the risk assessment associated with microbiology instruction based on grade level, general control measures, appropriate activities for middle school and high school students, the preparation and sterilization of equipment, and safe handling techniques. Appended are instructions and figures on making wire loops and the…
For the last 30 years static chamber methodologies have been most commonly used to measure N2O fluxes from agricultural soils. The main advantages of this technique are that it is relatively inexpensive, versatile in the field, and the technology is very easy to adopt. Consequently, the majority of ...
Permutation statistical methods an integrated approach
Berry, Kenneth J; Johnston, Janis E
2016-01-01
This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
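Not from the monograph; a minimal illustration of the core permutation idea it describes: compute a test statistic from the data at hand, reshuffle group labels many times, and judge significance against the shuffled distribution rather than a theoretical one. The function name and sample data are hypothetical.

```python
import random

def perm_test_mean_diff(x, y, n_perm=10000, seed=0):
    """Two-sided permutation test for a difference in means.

    Pools the samples, repeatedly reshuffles the group labels, and counts how
    often the shuffled |mean difference| is at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    n, hits = len(x), 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        d = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / len(y))
        if d >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p strictly positive

x = [12.1, 11.8, 13.0, 12.4, 12.9]
y = [10.2, 10.9, 10.5, 11.1, 10.4]
print(perm_test_mean_diff(x, y))  # small p-value: the groups clearly differ
```

Note that nothing here assumes normality or equal variances; the reference distribution is built entirely from the observed data, which is the monograph's central point.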
Statistics in biomedical research
Directory of Open Access Journals (Sweden)
González-Manteiga, Wenceslao
2007-06-01
Full Text Available The discipline of biostatistics is nowadays a fundamental scientific component of biomedical, public health and health services research. Traditional and emerging areas of application include clinical trials research, observational studies, physiology, imaging, and genomics. The present article reviews the current situation of biostatistics, considering the statistical methods traditionally used in biomedical research, as well as the ongoing development of new methods in response to the new problems arising in medicine. Clearly, the successful application of statistics in biomedical research requires appropriate training of biostatisticians. This training should aim to give due consideration to emerging new areas of statistics, while at the same time retaining full coverage of the fundamentals of statistical theory and methodology. In addition, it is important that students of biostatistics receive formal training in relevant biomedical disciplines, such as epidemiology, clinical trials, molecular biology, genetics, and neuroscience.
Implementing Statistical Multiplexing in DVB-H
Directory of Open Access Journals (Sweden)
Mehdi Rezaei
2009-01-01
Full Text Available A novel technique for implementing statistical multiplexing (StatMux) of broadcast services over Digital Video Broadcasting for Handhelds (DVB-H) channels is proposed. DVB-H uses a time-sliced transmission scheme to reduce the power consumed by the radio reception part of DVB-H receivers. Due to the time-sliced transmission scheme, the implementation of known StatMux methods for DVB-H presents some challenges, which are addressed in this paper. The proposed StatMux technique is implemented in conjunction with the time-slicing transmission scheme, in a combination similar to a time division multiplexing (TDM) scheme. The proposed StatMux method considerably decreases the end-to-end delay of DVB-H services while maximizing the usage of available bandwidth. Moreover, the proposed method can effectively decrease the channel switching delay of DVB-H services. Simulation results show a high performance for the proposed StatMux method.
Predict! Teaching Statistics Using Informational Statistical Inference
Makar, Katie
2013-01-01
Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…
Intermediate statistics a modern approach
Stevens, James P
2007-01-01
Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.
Applying statistics in behavioural research
Ellis, J.L.
2016-01-01
Applying Statistics in Behavioural Research is written for undergraduate students in the behavioural sciences, such as Psychology, Pedagogy, Sociology and Ethology. The topics range from basic techniques, like correlation and t-tests, to moderately advanced analyses, like multiple regression and MANOVA.
Statistical physics of vaccination
Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei
2016-12-01
Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination-one of the most important preventive measures of modern times-is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
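Not from the review; a minimal sketch of the classical mean-field (homogeneously mixing) SIR model with vaccination that the review takes as its starting point, before behavioral feedback and network structure are added. Parameter values and function names are illustrative.

```python
def sir_with_vaccination(beta, gamma, p_vacc, i0=1e-3, dt=0.01, steps=20000):
    """Euler integration of mean-field SIR with a fraction p_vacc vaccinated at t=0.

    Returns the final epidemic size, i.e. the fraction infected over the outbreak
    (recovered at the end, minus those protected by vaccination from the start).
    """
    s, i, r = 1.0 - p_vacc - i0, i0, p_vacc
    for _ in range(steps):
        new_inf = beta * s * i * dt   # S -> I transitions
        rec = gamma * i * dt          # I -> R transitions
        s, i, r = s - new_inf, i + new_inf - rec, r + rec
    return r - p_vacc

# Herd immunity: with R0 = beta/gamma = 2, vaccinating more than half the
# population pushes the effective reproduction number below 1
print(sir_with_vaccination(2.0, 1.0, 0.0) > sir_with_vaccination(2.0, 1.0, 0.6))  # → True
```

The review's point is that this mean-field picture breaks down once vaccinating behavior feeds back on disease prevalence or contacts are structured as a network; lattice and network models from statistical physics are then the natural replacement.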
Modern statistics for the social and behavioral sciences a practical introduction
Wilcox, Rand
2011-01-01
Relative advantages/disadvantages of various techniques are presented so that the reader can be helped to understand the choices they make on using the techniques. … A considerable number of illustrations are included and the book focuses on using R for its computer software application. … A useful text for … postgraduate students in the social science disciplines.-Susan Starkings, International Statistical Review, 2012. This is an interesting and valuable book … By gathering a mass of results on that topic into a single volume with references, alternative procedures, and supporting software, th…
Scoliosis and anaesthetic considerations
Directory of Open Access Journals (Sweden)
Anand H Kulkarni
2007-01-01
Full Text Available Scoliosis may be of varied etiology and tends to cause a restrictive ventilatory defect, along with ventilation-perfusion mismatch and hypoxemia. There is also cardiovascular involvement in the form of raised right heart pressures, mitral valve prolapse or congenital heart disease. Thus a careful pre-anaesthetic evaluation and optimization should be done. Intraoperatively temperature and fluid balance, positioning, spinal cord integrity testing and blood conservation techniques are to be kept in mind. Postoperatively, intensive respiratory therapy and pain management are prime concerns.
Ainsbury, Elizabeth A; Barquinero, J Francesc
2009-01-01
Consideration of statistical methodology is essential for the application of cytogenetic and other biodosimetry techniques to triage for mass casualty situations. This is because the requirement for speed and accuracy in biodosimetric triage necessarily introduces greater uncertainties than would be acceptable in day-to-day biodosimetry. Additionally, in a large scale accident type situation, it is expected that a large number of laboratories from around the world will assist and it is likely that each laboratory will use one or more different dosimetry techniques. Thus issues arise regarding combination of results and the associated errors. In this article we discuss the statistical and computational aspects of radiation biodosimetry for triage in a large scale accident-type situation. The current status of statistical analysis techniques is reviewed and suggestions are made for improvements to these methods which will allow first responders to estimate doses quickly and reliably for suspected exposed persons.
The foundations of statistics
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
Adrenal Gland Tumors: Statistics
A primary adrenal gland tumor is very uncommon, and exact statistics are not available for this type of tumor (approved by the Cancer.Net Editorial Board).
Algebraic statistics computational commutative algebra in statistics
Pistone, Giovanni; Wynn, Henry P
2000-01-01
Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model.As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.
Acid gas injection : reservoir engineering considerations
Energy Technology Data Exchange (ETDEWEB)
Pooladi-Darvish, M. [Fekete Associates Inc., Calgary, AB (Canada); Calgary Univ., AB (Canada)
2009-07-01
This study discussed reservoir engineering considerations related to acid gas injection, including the effects of pressure. A map of acid gas injection sites in Alberta was presented. The WASP Nisku acid gas project is a carbon dioxide (CO{sub 2}) sequestration project located in a dolomitized aquifer close to coal-fired power plants. Analytical solutions developed at the site include a multi-well injectivity procedure for infinite reservoirs. Analytical considerations at the site included low water compressibility, strong interference, and a lack of flow boundaries. Chromatographic separation techniques were used to address the compositional effects of the reservoir in relation to the injection wells. Techniques developed at the CO{sub 2} sequestration sites are being used to develop procedures for acid gas storage in depleted gas pools and beneath the ocean floor. tabs., figs.
Consideration of smoothing techniques for hyperspectral remote sensing
Vaiphasa, C.
2006-01-01
Spectral smoothing filters are popularly used in a large number of modern hyperspectral remote sensing studies for removing noise from the data. However, most of these studies subjectively apply ad hoc measures to select filter types and their parameters. We argue that this subjectively minded approach …
Short clinical crowns (SCC) – treatment considerations and techniques
Sharma, Ashu; Rahul, G. R.; Poduval, Soorya T.; Shetty, Karunakar
2012-01-01
When the clinical crowns of teeth are dimensionally inadequate, esthetically and biologically acceptable restoration of these dental units is difficult. Often an acceptable restoration cannot be accomplished without first surgically increasing the length of the existing clinical crowns; therefore, successful management requires an understanding of both the dental and periodontal parameters of treatment. The complications presented by teeth with short clinical crowns demand a comprehensive treatment plan and proper sequencing of therapy to ensure a satisfactory result. Visualization of the desired result is a prerequisite of successful therapy. This review examines the periodontal and restorative factors related to restoring teeth with short clinical crowns. Modes of therapy are usually combined to meet the biologic, restorative, and esthetic requirements imposed by short clinical crowns. In this study various methods for treating short clinical crowns are reviewed; the role that restoration margin location plays in the maintenance of periodontal and dental symbiosis, and the effects of violation of the supracrestal gingivae by improper full-coverage restorations, are also discussed. Key words: Short clinical crown, surgical crown lengthening, forced eruption, diagnostic wax-up, alveoloplasty, gingivectomy. PMID:24558561
[Facial epitheliomas: general considerations, surgical techniques and indications].
Martin, D; Barthélémy, I; Mondie, J M; Grangier, Y; Pélissier, P; Loddé, J P
1998-08-01
Carcinoma of the face is the skin disease most frequently encountered by plastic surgeons in everyday practice. Although basal cell carcinomas and squamous cell carcinomas are generally easy to recognize, their treatment remains subject to various schools of thought, or even individual practices, which are often difficult to define. This article defines a general plan of management of these tumours; their histological duality corresponds to a therapeutic duality. Resection of a basal cell carcinoma requires safety margins of 3 to 4 mm, versus at least 5 mm for a squamous cell carcinoma. In a high-risk subject, with a sclerodermiform carcinoma or undifferentiated squamous cell carcinoma, this safety margin may be as much as 10 mm or more. Frozen section examination is preferable in these situations. Six anatomical regions are studied selectively to define the main rules of reconstruction: nasal region, orbitopalpebral region, labial region, malar region, frontal region and auricular region. Each region is subdivided into several subterritories, each requiring different strategies. The objectives, methods and indications of each reconstruction are selectively defined. The final strategy proposed is based not only on the author's personal experience, but also on the results of the national survey on carcinomas. As a complement to these therapeutic guidelines, the authors raise the problem of incomplete resection, which requires the definition of a peripheral infiltration index predictive of the recurrence rate. Surgery obviously cannot constitute the exclusive treatment of carcinomas, hence the value of presenting other methods currently available in the therapeutic armamentarium. Surveillance is essential in every case, determined by the patient's risk of recurrence or even metastatic dissemination.
Consideration of impedance matching techniques for efficient piezoelectric energy harvesting.
Kim, Hyeoungwoo; Priya, Shashank; Stephanou, Harry; Uchino, Kenji
2007-09-01
This study investigates multiple levels of impedance-matching methods for piezoelectric energy harvesting in order to enhance the conversion of mechanical to electrical energy. First, the transduction rate was improved by using a ceramic material with a high piezoelectric voltage constant (g), having a magnitude of g33 = 40 × 10⁻³ V·m/N. Second, a transducer structure, the cymbal, was optimized and fabricated to match the mechanical impedance of the vibration source to that of the piezoelectric transducer. The cymbal transducer was found to exhibit an effective strain coefficient approximately 40 times higher than that of the piezoelectric ceramics. Third, electrical impedance matching for the energy harvesting circuit was considered to allow the transfer of generated power to a storage medium. It was found that, by using 10-layer ceramics instead of a single layer, the output current can be increased by 10 times and the output load can be reduced by 40 times. Furthermore, by using the multilayer ceramics the output power was found to increase by 100%. A DC-DC buck converter was fabricated to transfer the accumulated electrical energy in a capacitor to a lower output load. The converter was optimized such that it required less than 5 mW for operation.
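Not from the paper; a minimal illustration of the electrical impedance-matching principle behind its third step, using the textbook maximum-power-transfer result for a resistive source and load. The source voltage and resistance values are hypothetical.

```python
def delivered_power(v_source, r_source, r_load):
    """Power dissipated in a resistive load driven by a source with
    internal resistance r_source: P = I^2 * R_load with I = V / (R_s + R_L)."""
    i = v_source / (r_source + r_load)
    return i ** 2 * r_load

# Sweep the load: delivered power peaks when r_load equals r_source
r_s = 50.0
powers = {r: delivered_power(10.0, r_s, r) for r in (10.0, 50.0, 250.0)}
best = max(powers, key=powers.get)
print(best)  # → 50.0
```

A piezoelectric harvester typically presents a high source impedance, which is why the paper stacks layers (lowering the optimal load by the layer count squared) and adds a buck converter to reach practically useful load impedances.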
Keywords: statistical analysis, probability, information theory, differential equations, statistical processes, stochastic processes, multivariate analysis, distribution theory, decision theory, measure theory, optimization.
Human Factors Considerations in System Design
Mitchell, C. M. (Editor); Vanbalen, P. M. (Editor); Moe, K. L. (Editor)
1983-01-01
Human factors considerations in systems design was examined. Human factors in automated command and control, in the efficiency of the human computer interface and system effectiveness are outlined. The following topics are discussed: human factors aspects of control room design; design of interactive systems; human computer dialogue, interaction tasks and techniques; guidelines on ergonomic aspects of control rooms and highly automated environments; system engineering for control by humans; conceptual models of information processing; information display and interaction in real time environments.
Engineering radioecology: Methodological considerations
Energy Technology Data Exchange (ETDEWEB)
Nechaev, A.F.; Projaev, V.V. [St. Petersburg State Inst. of Technology (Russian Federation); Sobolev, I.A.; Dmitriev, S.A. [United Ecologo-Technological and Research Center on Radioactive Waste Management and Environmental Remediation, Moscow (Russian Federation)
1995-12-31
The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodological basis, fixed subjects of investigation, etc. In other words, radioecology is a vast and important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement with biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the end of the Cold War and the remarkable political changes in the world, it has become possible to recast the problem of environmental restoration from the scientific sphere into practical terms. Already the first steps clearly showed the imperfection of existing technologies, managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; and uncertainties in the methodology of decision-making. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.
Explorations in statistics: statistical facets of reproducibility.
Curran-Everett, Douglas
2016-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.
Statistics using R
Purohit, Sudha G; Deshmukh, Shailaja R
2015-01-01
Statistics Using R will be useful at different levels, from an undergraduate course in statistics, through graduate courses in biological sciences, engineering, management and so on. The book introduces statistical terminology and defines it for the benefit of a novice. For a practicing statistician, it will serve as a guide to the R language for statistical analysis. For a researcher, it is a dual guide, simultaneously explaining appropriate statistical methods for the problems at hand and indicating how these methods can be implemented using the R language. For a software developer, it is a guide to a variety of statistical methods for the development of a suite of statistical procedures.
Statistical principles for prospective study protocols:
DEFF Research Database (Denmark)
Christensen, Robin; Langberg, Henning
2012-01-01
In the design of scientific studies it is essential to decide which scientific questions one aims to answer, just as it is important to decide on the correct statistical methods to use to answer these questions. The correct use of statistical methods is crucial in all aspects of research...... to quantify relationships in data. Despite an increased focus on statistical content and the complexity of biomedical research, these topics remain difficult for most researchers. Statistical methods enable researchers to condense large spreadsheets with data into means, proportions, and differences between means...... are "statistically significant" or not. In the present paper we outline considerations and suggestions on how to build a trial protocol, with an emphasis on having a rigorous protocol stage, always leading to a full article manuscript, independent of statistical findings. We conclude that authors, who find......
Statistical Analysis by Statistical Physics Model for the Stock Markets
Wang, Tiansong; Wang, Jun; Fan, Bingli
A new stochastic stock price model of stock markets, based on the contact process of statistical physics systems, is presented in this paper. The contact model is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and the Shenzhen Stock Exchange (SZSE) are studied. In the present paper, the data of the SSE Composite Index and of the SZSE Component Index are analyzed, and the corresponding simulations are carried out by computer. Further, we investigate the statistical properties, fat-tail phenomena, power-law distributions, and long memory of returns for these indices. The techniques of the skewness-kurtosis test, the Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characteristics of the stock price returns.
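For readers unfamiliar with the R/S (rescaled range) analysis mentioned in this abstract, a minimal sketch is given below. This is a generic textbook illustration on synthetic, uncorrelated returns, not the authors' code; the window sizes and sample length are arbitrary choices.

```python
import numpy as np

def rescaled_range(series, window):
    """Average R/S statistic over non-overlapping windows of one size."""
    rs_values = []
    for i in range(len(series) // window):
        chunk = series[i * window:(i + 1) * window]
        deviations = np.cumsum(chunk - chunk.mean())
        r = deviations.max() - deviations.min()  # range of cumulative deviations
        s = chunk.std()                          # standard deviation of the chunk
        if s > 0:
            rs_values.append(r / s)
    return np.mean(rs_values)

def hurst_exponent(returns, windows=(16, 32, 64, 128, 256)):
    """Slope of log(R/S) versus log(window) estimates the Hurst exponent H."""
    log_w = np.log(list(windows))
    log_rs = np.log([rescaled_range(returns, w) for w in windows])
    slope, _ = np.polyfit(log_w, log_rs, 1)
    return slope

rng = np.random.default_rng(0)
iid_returns = rng.normal(size=4096)  # uncorrelated returns: H should be near 0.5
h = hurst_exponent(iid_returns)
print(round(h, 2))
```

Long memory in real index returns would show up as a slope noticeably above 0.5; small-sample bias alone can push the estimate slightly above 0.5 even for uncorrelated data.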
Introduction to statistics using Microsoft Excel
Remenyi, Dan; English, Joseph
2011-01-01
This book explains the statistical concepts and then uses Microsoft Excel functions to illustrate how to get results using the appropriate techniques which will help researchers directly with their research.
Institute of Scientific and Technical Information of China (English)
Jin, Yinghua; Wu, Yaohua
2009-01-01
Log-linear models under product-multinomial sampling are considered: discrete data are distributed according to a product-multinomial distribution whose probabilities follow a log-linear model. Under this model, Ref. [Jin Y H, Wu Y H. Minimum φ-divergence estimator and hierarchical testing in log-linear models under product-multinomial sampling. Journal of Statistical Planning and Inference, 2009, 139: 3488-3500] considered hypothesis-testing problems, including hierarchical tests, using φ-divergence test statistics built on the minimum φ-divergence estimator (MφE), which is a generalization of the maximum likelihood estimator. Building on those results, an approximation to the power function of one of these tests is given, and the asymptotic distributions of the test statistics under a contiguous sequence of hypotheses are derived. A simulation study investigates which member of the power-divergence family performs best: the Cressie-Read test statistic is an attractive alternative to the Pearson-based and likelihood-ratio-based test statistics in terms of simulated sizes and powers.
International Conference on Robust Statistics 2015
Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta
2016-01-01
This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata during 12-16 January 2015. The book explores the applicability of robust methods in non-traditional areas, including the use of new techniques such as skew and mixtures of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...
Applied statistics for social and management sciences
Miah, Abdul Quader
2016-01-01
This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often encountered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is used throughout the application-oriented material. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.
Statistical perspectives on inverse problems
DEFF Research Database (Denmark)
Andersen, Kim Emil
of the interior of an object from electrical boundary measurements. One part of this thesis concerns statistical approaches for solving, possibly non-linear, inverse problems. Thus inverse problems are recast in a form suitable for statistical inference. In particular, a Bayesian approach for regularisation...... is obtained by assuming that the a priori beliefs about the solution, before having observed any data, can be described by a prior distribution. The solution to the statistical inverse problem is then given by the posterior distribution obtained by Bayes' formula. Hence the solution of an ill-posed inverse...... problem is given in terms of probability distributions. Posterior inference is obtained by Markov chain Monte Carlo methods, and new, powerful simulation techniques based on e.g. coupled Markov chains and simulated tempering are developed to improve the computational efficiency of the overall simulation......
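The Bayesian recipe sketched in this abstract (prior + likelihood, posterior explored by Markov chain Monte Carlo) can be illustrated on a deliberately tiny toy problem. Everything below (the one-parameter forward model, noise level, prior width, proposal step) is an invented illustration, not the thesis's method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ill-posed problem: recover x from noisy, weakly informative data y = a*x + noise.
a, sigma, true_x = 0.1, 0.05, 2.0
y = a * true_x + rng.normal(0, sigma, size=20)

def log_posterior(x):
    log_prior = -0.5 * x**2 / 10.0                     # Gaussian prior regularises the solution
    log_lik = -0.5 * np.sum((y - a * x)**2) / sigma**2  # Gaussian measurement noise
    return log_prior + log_lik

# Random-walk Metropolis sampling of the posterior distribution
samples, x = [], 0.0
for _ in range(20000):
    prop = x + rng.normal(0, 0.5)
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(x):
        x = prop
    samples.append(x)
post = np.array(samples[5000:])  # discard burn-in
print(post.mean(), post.std())
```

The point of the sketch is the abstract's: the "solution" is not a single number but the whole posterior, summarized here by its mean and spread.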
The Statistical Loop Analyzer (SLA)
Lindsey, W. C.
1985-01-01
The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements, the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop-out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.
Effectiveness and Limitations of Statistical Spam Filters
Banday, M Tariq
2009-01-01
In this paper we discuss the techniques involved in the design of well-known statistical spam filters, including Naive Bayes, Term Frequency-Inverse Document Frequency, K-Nearest Neighbor, Support Vector Machine, and Bayes Additive Regression Tree. We compare these techniques with each other in terms of accuracy, recall, precision, etc. Further, we discuss the effectiveness and limitations of statistical filters in filtering out various types of spam from legitimate e-mails.
On Quantum Statistical Inference, II
Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P.E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...
Practical Statistics for Particle Physicists
Lista, Luca
2016-01-01
These three lectures provide an introduction to the main concepts of statistical data analysis useful for precision measurements and searches for new signals in High Energy Physics. The frequentist and Bayesian approaches to probability theory will be introduced and, for both approaches, inference methods will be presented. Hypothesis tests will be discussed, then significance and upper-limit evaluation will be presented with an overview of the modern and most advanced techniques adopted for data analysis at the Large Hadron Collider.
Advanced LBB methodology and considerations
Energy Technology Data Exchange (ETDEWEB)
Olson, R.; Rahman, S.; Scott, P. [Battelle, Columbus, OH (United States)] [and others]
1997-04-01
LBB applications have existed in many industries and more recently have been applied in the nuclear industry under limited circumstances. Research over the past 10 years has evolved the technology so that more advanced consideration of LBB can now be given. Some of the advanced considerations for nuclear plants subjected to seismic loading evaluations are summarized in this paper.
Ethical Considerations in Technology Transfer.
Froehlich, Thomas J.
1991-01-01
Examines ethical considerations involved in the transfer of appropriate information technology to less developed countries. Approaches to technology are considered; two philosophical frameworks for studying ethical considerations are discussed, i.e., the Kantian approach and the utilitarian perspective by John Stuart Mill; and integration of the…
Revealed preference with limited consideration
Demuynck, T.; Seel, C.
2014-01-01
We derive revealed preference tests for models where individuals use consideration sets to simplify their consumption problem. Our basic test provides necessary and sufficient conditions for consistency of observed choices with the existence of consideration set restrictions. The same conditions can
Pathway Model and Nonextensive Statistical Mechanics
Mathai, A. M.; Haubold, H. J.; Tsallis, C.
2015-12-01
The established technique of eliminating upper or lower parameters in a general hypergeometric series is profitably exploited to create pathways among confluent hypergeometric functions, binomial functions, Bessel functions, and exponential series. One such pathway, from the mathematical statistics point of view, results in distributions which naturally emerge within nonextensive statistical mechanics and Beck-Cohen superstatistics, as pursued in generalizations of Boltzmann-Gibbs statistics.
Statistical and thermal physics with computer applications
Gould, Harvey
2010-01-01
This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the
Marrakesh International Conference on Probability and Statistics
Ouassou, Idir; Rachdi, Mustapha
2015-01-01
This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.
Mad Libs Statistics: A "Happy" Activity
Trumpower, David
2010-01-01
This article describes a fun activity that can be used to help students make links between statistical analyses and their real-world implications. Although an illustrative example is provided using analysis of variance, the activity may be adapted for use with other statistical techniques.
Statistical Analysis of Data for Timber Strengths
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard
2003-01-01
...... The statistical fits have generally been made using all data and the lower tail of the data. The Maximum Likelihood Method and the Least Squares Technique have been used to estimate the statistical parameters in the selected distributions. The results show that the 2-parameter Weibull distribution gives the best...
Statistical mechanics of nucleosomes
Chereji, Razvan V.
Eukaryotic cells contain long DNA molecules (about two meters for a human cell) which are tightly packed inside micrometric nuclei. Nucleosomes are the basic packaging unit of the DNA which allows this millionfold compaction. A longstanding puzzle is to understand the principles which allow cells both to organize their genomes into chromatin fibers in the crowded space of their nuclei and to keep the DNA accessible to many factors and enzymes. With nucleosomes covering about three quarters of the DNA, their positions are essential because they influence which genes can be regulated by the transcription factors and which cannot. We study physical models which predict the genome-wide organization of the nucleosomes and also the relevant energies which dictate this organization. In the last five years, the study of chromatin has seen many important advances. In particular, in the field of nucleosome positioning, new techniques for identifying nucleosomes and the competing DNA-binding factors have appeared, such as chemical mapping with hydroxyl radicals and ChIP-exo, among others; the resolution of nucleosome maps has increased through paired-end sequencing; and the price of sequencing an entire genome has decreased. We present a rigorous statistical mechanics model which is able to explain the recent experimental results by taking into account nucleosome unwrapping, competition between different DNA-binding proteins, and both the interaction between histones and DNA and that between neighboring histones. We show a series of predictions of our new model, all in agreement with the experimental observations.
Contributions to sampling statistics
Conti, Pier; Ranalli, Maria
2014-01-01
This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Neuroendocrine Tumor: Statistics
Overweight and Obesity Statistics
Uterine Cancer Statistics
School Violence: Data & Statistics
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Software for Spatial Statistics
Directory of Open Access Journals (Sweden)
Edzer Pebesma
2015-02-01
We give an overview of the papers published in this special issue on spatial statistics of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.
Software for Spatial Statistics
Edzer Pebesma; Roger Bivand; Paulo Justiniano Ribeiro
2015-01-01
We give an overview of the papers published in this special issue on spatial statistics, of the Journal of Statistical Software. 21 papers address issues covering visualization (micromaps, links to Google Maps or Google Earth), point pattern analysis, geostatistics, analysis of areal aggregated or lattice data, spatio-temporal statistics, Bayesian spatial statistics, and Laplace approximations. We also point to earlier publications in this journal on the same topic.
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Statistical methods for astronomical data analysis
Chattopadhyay, Asis Kumar
2014-01-01
This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...
Dealing with statistics what you need to know
Brown, Reva Berman
2007-01-01
A guide to the essential statistical skills needed for success in assignments, projects or dissertations. It explains why it is impossible to avoid using statistics in analysing data. It also describes the language of statistics to make it easier to understand the various terms used for statistical techniques.
Statistical physics in foreign exchange currency and stock markets
Ausloos, M.
2000-09-01
Problems in economy and finance have attracted the interest of statistical physicists all over the world. Fundamental problems pertain to the existence or not of long-, medium-, or short-range power-law correlations in various economic systems, to the presence of financial cycles, and to economic considerations, including economic policy. A method like the detrended fluctuation analysis is recalled, emphasizing its value in sorting out correlation ranges and thereby leading to predictability at short horizons. The (m, k)-Zipf method is presented for sorting out short-range correlations in the sign and amplitude of the fluctuations. A well-known financial analysis technique, the so-called moving average, is shown to raise questions for physicists about fractional Brownian motion properties. Among spectacular results, the possibility of crash predictions has been demonstrated through the log-periodicity of financial index oscillations.
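The detrended fluctuation analysis recalled in this abstract can be sketched as follows. This is a generic DFA-1 implementation applied to synthetic white noise (for which the scaling exponent should be near 0.5), not the author's code; the window sizes are arbitrary.

```python
import numpy as np

def dfa(signal, window_sizes):
    """Detrended fluctuation analysis: slope alpha of log F(n) vs log n."""
    profile = np.cumsum(signal - np.mean(signal))  # integrated series
    fluctuations = []
    for n in window_sizes:
        f2 = []
        for i in range(len(profile) // n):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)         # local linear trend in each window
            f2.append(np.mean((seg - np.polyval(coeffs, t))**2))
        fluctuations.append(np.sqrt(np.mean(f2)))  # RMS detrended fluctuation F(n)
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return alpha

rng = np.random.default_rng(42)
white = rng.normal(size=8192)   # uncorrelated noise: alpha should be near 0.5
alpha = dfa(white, [16, 32, 64, 128, 256])
print(round(alpha, 2))
```

An exponent above 0.5 indicates persistent long-range correlations, which is precisely the diagnostic the abstract credits DFA with providing.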
Statistical properties of cloud lifecycles in cloud-resolving models
Directory of Open Access Journals (Sweden)
R. S. Plant
2008-12-01
A new technique is described for the analysis of cloud-resolving model simulations, which allows one to investigate the statistics of the lifecycles of cumulus clouds. Clouds are tracked from timestep to timestep within the model run. This allows for a very simple method of tracking, but one which is both comprehensive and robust. An approach for handling cloud splits and mergers is described which allows clouds with simple and complicated time histories to be compared within a single framework. This is found to be important for the analysis of an idealized simulation of radiative-convective equilibrium, in which the moist, buoyant updrafts (i.e., the convective cores) were tracked. Around half of all such cores were subject to splits and mergers during their lifecycles. For cores without any such events, the average lifetime is 30 min, but events can lengthen the typical lifetime considerably.
Selling statistics [Statistics in scientific progress]
Energy Technology Data Exchange (ETDEWEB)
Bridle, S. [Astrophysics Group, University College London (United Kingdom)]. E-mail: sarah@star.ucl.ac.uk
2006-09-15
From Cosmos to Chaos - Peter Coles, 2006, Oxford University Press, 224pp. To confirm or refute a scientific theory you have to make a measurement. Unfortunately, however, measurements are never perfect: the rest is statistics. Indeed, statistics is at the very heart of scientific progress, but it is often poorly taught and badly received; for many, the very word conjures up half-remembered nightmares of 'null hypotheses' and 'Student's t-tests'. From Cosmos to Chaos by Peter Coles, a cosmologist at Nottingham University, is an approachable antidote that places statistics in a range of catchy contexts. Using this book you will be able to calculate the probabilities in a game of bridge or in a legal trial based on DNA fingerprinting, impress friends by talking confidently about entropy, and stretch your mind thinking about quantum mechanics. (U.K.)
Statistical motor number estimation assuming a binomial distribution.
Blok, Joleen H; Visser, Gerhard H; de Graaf, Sándor; Zwarts, Machiel J; Stegeman, Dick F
2005-02-01
The statistical method of motor unit number estimation (MUNE) uses the natural stochastic variation in a muscle's compound response to electrical stimulation to obtain an estimate of the number of recruitable motor units. The current method assumes that this variation follows a Poisson distribution. We present an alternative that instead assumes a binomial distribution. Results of computer simulations and of a pilot study on 19 healthy subjects showed that the binomial MUNE values are considerably higher than those of the Poisson method, and in better agreement with the results of other MUNE techniques. In addition, simulation results predict that the performance in patients with severe motor unit loss will be better for the binomial than Poisson method. The adapted method remains closer to physiology, because it can accommodate the increase in activation probability that results from rising stimulus intensity. It does not need recording windows as used with the Poisson method, and is therefore less user-dependent and more objective and quicker in its operation. For these reasons, we believe that the proposed modifications may lead to significant improvements in the statistical MUNE technique.
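The binomial assumption in this abstract lends itself to a short method-of-moments sketch: if the number of activated motor units is binomial(N, p), then N can be recovered from the sample mean and variance of the responses. The simulation below is a deliberately idealised toy (equal, noise-free unit amplitudes, fixed activation probability), not the authors' MUNE procedure:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical setup: at a fixed stimulus intensity each of N motor units fires
# independently with probability p, so the count of activated units is binomial.
N_true, p, unit_size = 50, 0.3, 1.0   # unit_size: amplitude contributed per unit
counts = rng.binomial(N_true, p, size=500)
responses = counts * unit_size        # idealised compound muscle response

# Method of moments for a binomial: mean = N*p*size, var = N*p*(1-p)*size^2
m, v = responses.mean(), responses.var()
p_hat = 1.0 - v / (m * unit_size)     # estimate activation probability
N_hat = m / (unit_size * p_hat)       # estimate of the motor unit count
print(round(N_hat))
```

The same moment relations explain the abstract's point that a binomial model can accommodate an activation probability that rises with stimulus intensity, which a Poisson model cannot.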
Reproducible statistical analysis with multiple languages
DEFF Research Database (Denmark)
Lenth, Russell; Højsgaard, Søren
2011-01-01
This paper describes the system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use them together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.
Uncertainty the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
On the statistical properties and tail risk of violent conflicts
Cirillo, Pasquale; Taleb, Nassim Nicholas
2016-06-01
We examine statistical pictures of violent conflicts over the last 2000 years, providing techniques for dealing with the unreliability of historical data. We make use of a novel approach to deal with fat-tailed random variables with a remote but nonetheless finite upper bound, by defining a corresponding unbounded dual distribution (given that potential war casualties are bounded by the world population). This approach can also be applied to other fields of science where power laws play a role in modeling, like geology, hydrology, statistical physics and finance. We apply methods from extreme value theory on the dual distribution and derive its tail properties. The dual method allows us to calculate the real tail mean of war casualties, which proves to be considerably larger than the corresponding sample mean for large thresholds, meaning severe underestimation of the tail risks of conflicts from naive observation. We analyze the robustness of our results to errors in historical reports. We study inter-arrival times between tail events and find that no particular trend can be asserted. All the statistical pictures obtained are at variance with the prevailing claims about "long peace", namely that violence has been declining over time.
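The "dual distribution" device described in this abstract can be illustrated with a simple log-transformation. One natural choice of such a map, shown below, sends the bounded support [L, H) onto the unbounded [L, ∞) and is close to the identity far from the bound; the constants here (L and an H of roughly the world population) are illustrative assumptions, not the paper's exact calibration:

```python
import numpy as np

# Map a variable bounded in [L, H] (e.g. war casualties, H ~ world population)
# to an unbounded "dual" variable suitable for extreme value analysis.
L, H = 1.0, 7.2e9
phi = lambda x: L - H * np.log((H - x) / (H - L))        # [L, H) -> [L, inf)
phi_inv = lambda y: H - (H - L) * np.exp((L - y) / H)    # exact inverse

x = np.array([1e3, 1e6, 1e8, 7e9])
y = phi(x)          # dual values: ~x for small x, diverging near the bound H
back = phi_inv(y)   # round-trip check
print(np.allclose(back, x))
```

Tail quantities (such as the tail mean discussed in the abstract) can then be estimated on the unbounded dual variable and mapped back through the inverse.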
Statistics Essentials For Dummies
Rumsey, Deborah
2010-01-01
Statistics Essentials For Dummies not only provides students enrolled in Statistics I with an excellent high-level overview of key concepts, but it also serves as a reference or refresher for students in upper-level statistics courses. Free of review and ramp-up material, Statistics Essentials For Dummies sticks to the point, with content focused on key course topics only. It provides discrete explanations of essential concepts taught in a typical first semester college-level statistics course, from odds and error margins to confidence intervals and conclusions. This guide is also a perfect re