WorldWideScience

Sample records for sophisticated statistical techniques

  1. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advanc...

  2. Sophisticated Players and Sophisticated Agents

    NARCIS (Netherlands)

    Rustichini, A.

    1998-01-01

    A sophisticated player is an individual who takes the action of the opponents, in a strategic situation, as determined by decision of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic

  3. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management, then explores how to temper quantitative analysis with qualitati...

  4. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    Science.gov (United States)

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and the relationships between the true and estimated local fields for REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP, respectively. The REV-SHARP result exhibited the highest correlation between the true and estimated local fields. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In the human experiments, no obvious artifact-related errors were present in REV-SHARP. The proposed REV-SHARP is a new method combining variable spherical kernel sizes with Tikhonov regularization. This technique may enable more accurate background field removal and help achieve better QSM accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
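    The core SHARP-family operation the abstract describes, spherical mean value (SMV) filtering followed by a Tikhonov-regularized deconvolution, can be sketched in a few lines. This is a minimal single-kernel illustration, not the authors' REV-SHARP implementation; the kernel radius and regularization weight below are arbitrary choices:

```python
import numpy as np

def sphere_kernel(shape, radius):
    # Normalized spherical mean value (SMV) kernel on the image grid.
    grids = np.meshgrid(*[np.arange(n) - n // 2 for n in shape], indexing="ij")
    k = (np.sqrt(sum(g.astype(float) ** 2 for g in grids)) <= radius).astype(float)
    return k / k.sum()

def sharp_like(phase, radius=4, reg=0.05):
    """SMV high-pass filtering, then Tikhonov-regularized deconvolution.

    A harmonic background field satisfies the spherical mean value
    property, so the (identity - SMV) operation suppresses it, while the
    local field survives the regularized inverse.
    """
    K = np.fft.fftn(np.fft.fftshift(sphere_kernel(phase.shape, radius)))
    D = 1.0 - K                                      # (delta - SMV) in k-space
    high_pass = np.fft.fftn(phase) * D
    inv = np.conj(D) / (np.abs(D) ** 2 + reg ** 2)   # Tikhonov inverse of D
    return np.real(np.fft.ifftn(high_pass * inv))
```

    REV-SHARP additionally repeats this with kernel sizes gradually reduced toward the brain boundary, which is omitted in this sketch.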

  5. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

    This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics the Fokker-Planck and the master equation approach are described together with the response theory. Then, as applications the damped harmonic oscillator, simple fluids, and the spin relaxation are considered. (HSI)

  6. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  7. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Statistical and computational techniques have therefore found several applications in manufacturing, namely modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  8. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
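    The detection-performance quantification mentioned in the record, probability of detection versus probability of false alarm, amounts to sweeping a threshold over the detector output. A small illustrative sketch (the scores below are hypothetical, not data from the study):

```python
import numpy as np

def roc_points(healthy_scores, faulty_scores):
    """Probability of detection vs. probability of false alarm as a
    detection threshold sweeps over all observed scores."""
    healthy = np.asarray(healthy_scores, float)
    faulty = np.asarray(faulty_scores, float)
    points = []
    for t in np.sort(np.concatenate([healthy, faulty])):
        pfa = float(np.mean(healthy >= t))   # healthy machine flagged anyway
        pd = float(np.mean(faulty >= t))     # fault correctly flagged
        points.append((pfa, pd))
    return points
```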

  9. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets
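    As one concrete instance of the "combination and propagation of uncertainties" topics listed above, the first-order error on a ratio can be written down directly (a standard textbook formula, not code from the chapter):

```python
import math

def ratio_error(a, sigma_a, b, sigma_b, cov_ab=0.0):
    """Small-error (first-order) propagation for r = a / b:
    (sigma_r/r)^2 ~ (sigma_a/a)^2 + (sigma_b/b)^2 - 2*cov_ab/(a*b)."""
    r = a / b
    rel2 = (sigma_a / a) ** 2 + (sigma_b / b) ** 2 - 2.0 * cov_ab / (a * b)
    return r, abs(r) * math.sqrt(rel2)
```

    For large relative errors this linearization breaks down, which is exactly the "propagation of large errors" case the chapter treats separately.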

  10. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik]

    1997-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models, by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
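    The analog method that the survey uses as its baseline is simple enough to state in full: for each GCM field, find the most similar large-scale pattern in the observational archive and predict the regional value recorded alongside it. A minimal sketch (arrays and similarity measure are illustrative assumptions):

```python
import numpy as np

def analog_downscale(gcm_field, archive_fields, archive_local):
    """Analog downscaling: return the regional observation paired with
    the archived large-scale pattern nearest (Euclidean) to the GCM field."""
    distances = np.linalg.norm(archive_fields - gcm_field, axis=1)
    return archive_local[np.argmin(distances)]
```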

  11. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E; Storch, H von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik]

    1998-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models, by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)

  12. Review of the Statistical Techniques in Medical Sciences | Okeh ...

    African Journals Online (AJOL)

    ... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp.

  13. In Praise of the Sophists.

    Science.gov (United States)

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  14. Testing of statistical techniques used in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.; Edwards, H.; Prust, J.

    1984-01-01

    Analysis of the SYVAC (SYstems Variability Analysis Code) output adopted four techniques to provide a cross comparison of their performance. The techniques used were: examination of scatter plots; correlation/regression; Kruskal-Wallis one-way analysis of variance by ranks; comparison of cumulative distribution functions and risk estimates between sub-ranges of parameter values. The analysis was conducted for the case of a single nuclide chain and was based mainly on simulated dose after 500,000 years. The results from this single SYVAC case showed that site parameters had the greatest influence on dose to man. The techniques of correlation/regression and Kruskal-Wallis were both successful and consistent in their identification of important parameters. Both techniques ranked the eight most important parameters in the same order when analysed for maximum dose. The results from a comparison of cdfs and risks in sub-ranges of the parameter values were not entirely consistent with other techniques. Further sampling of the high dose region is recommended in order to improve the accuracy of this method. (author)
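    The Kruskal-Wallis test used in the comparison reduces, in the tie-free case, to comparing mean ranks across groups. A minimal sketch of the H statistic (illustrative only; no tie correction, and the example data are hypothetical, not SYVAC output):

```python
import numpy as np

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic, assuming no tied observations:
    rank all values jointly, then compare group mean ranks."""
    data = np.concatenate([np.asarray(g, float) for g in groups])
    ranks = np.argsort(np.argsort(data)) + 1.0   # ranks 1..N
    n = len(data)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += len(g) * r.mean() ** 2
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)
```

    Under the null hypothesis H is approximately chi-squared with (number of groups - 1) degrees of freedom; a large H for doses grouped by sub-ranges of a sampled parameter flags that parameter as influential.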

  15. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
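    The Widrow-Hoff adaptive filter mentioned in the record is the LMS algorithm; a one-step-ahead predictor fits in a few lines (an illustrative sketch, with arbitrary filter order and step size, not the authors' configuration):

```python
import numpy as np

def lms_predict(series, order=4, mu=0.1):
    """Widrow-Hoff (LMS) one-step-ahead predictor: predict x[t] from the
    previous `order` samples, then nudge the weights along the negative
    gradient of the squared prediction error."""
    w = np.zeros(order)
    preds = []
    for t in range(order, len(series)):
        x = series[t - order:t]
        y = float(w @ x)          # prediction from current weights
        e = series[t] - y         # prediction error
        w = w + mu * e * x        # Widrow-Hoff weight update
        preds.append(y)
    return np.array(preds), w
```

    On a stationary narrowband series the weights converge and the late prediction error becomes small, which matches the paper's observation that linear filters suffice for near-linear processes.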

  16. Predicting radiotherapy outcomes using statistical learning techniques

    International Nuclear Information System (INIS)

    El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O; Lindsay, Patricia E; Hope, Andrew J

    2009-01-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model...
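    The leave-one-out testing used for model comparison is easy to make concrete. The sketch below uses a trivial nearest-centroid stand-in learner rather than the paper's SVM kernel machine; the data and function names are hypothetical:

```python
import numpy as np

def loo_accuracy(X, y, fit_predict):
    """Leave-one-out testing: hold out each case in turn, train on the
    remainder, and score the prediction for the held-out case."""
    hits = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i
        hits += int(fit_predict(X[mask], y[mask], X[i]) == y[i])
    return hits / len(X)

def nearest_centroid(X_train, y_train, x_test):
    # Stand-in learner; an SVM with a nonlinear kernel would plug in here.
    centroids = {c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}
    return min(centroids, key=lambda c: np.linalg.norm(x_test - centroids[c]))
```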

  17. "Statistical Techniques for Particle Physics" (2/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
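    The Neyman-Pearson lemma referenced in the outline says the likelihood ratio is the most powerful test statistic between two simple hypotheses. A toy numeric illustration with two unit-width Gaussians (the means are chosen arbitrarily, not taken from the lectures):

```python
import math

def gaussian_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x, mu_sig=2.0, mu_bkg=0.0):
    """Neyman-Pearson test statistic: f(x | signal) / f(x | background).
    For these two Gaussians it reduces to exp(2x - 2), monotone in x,
    so cutting on x is equivalent to cutting on the ratio."""
    return gaussian_pdf(x, mu_sig) / gaussian_pdf(x, mu_bkg)
```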

  18. "Statistical Techniques for Particle Physics" (1/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  19. "Statistical Techniques for Particle Physics" (4/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  20. "Statistical Techniques for Particle Physics" (3/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  1. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    ... decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic...

  2. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their spec... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  3. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  4. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR type fuel plates by applying statistical techniques of process control. The work was divided into four single steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; applying statistical tools and standard specifications to perform a comparative study of these processes. (author)
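    A basic statistical process control tool of the kind applied here is the Shewhart X-bar chart. A simplified sketch (illustrative only: production charts usually estimate sigma with bias-correction constants such as c4 or R-bar/d2, which are omitted):

```python
import numpy as np

def xbar_limits(subgroups):
    """Center line and 3-sigma control limits for subgroup means.
    `subgroups` is an (m, n) array: m rational subgroups of n plates."""
    means = subgroups.mean(axis=1)
    center = means.mean()
    # Simplified sigma estimate: average within-subgroup standard deviation.
    se = subgroups.std(axis=1, ddof=1).mean() / np.sqrt(subgroups.shape[1])
    return center - 3.0 * se, center, center + 3.0 * se
```

    A subgroup mean falling outside the limits signals a special cause in the fabrication or rolling step.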

  5. The statistical chopper in the time-of-flight technique

    International Nuclear Information System (INIS)

    Albuquerque Vieira, J. de.

    1975-12-01

    A detailed study of the 'statistical' chopper and of the method of analysis of the data obtained by this technique is made. The study includes the basic ideas behind correlation methods applied in time-of-flight techniques; comparisons with the conventional chopper made by an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; the presentation of the related fields of work to demonstrate the potential of this method; and suggestions for future study, together with the criteria for a time-of-flight experiment using the method being studied.
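    The correlation idea behind the statistical chopper can be demonstrated numerically: with a pseudo-random (±1 m-sequence) chopping pattern, the detector counts are the circular convolution of the time-of-flight spectrum with the pattern, and cross-correlating the counts with the pattern recovers the spectrum up to a scale factor and a constant offset. A toy sketch (the sequence and spectrum are hypothetical, unrelated to the report's data):

```python
import numpy as np

def crosscorrelate_recover(counts, sequence):
    """Cross-correlate detector counts with the chopper sequence; for an
    m-sequence of length n the result is (n+1)*spectrum - total counts,
    i.e. the spectrum up to scale and offset."""
    n = len(sequence)
    return np.array([float(np.dot(counts, np.roll(sequence, t))) for t in range(n)])

# Length-7 m-sequence mapped to +/-1; its circular autocorrelation is 7 at
# lag 0 and -1 at every other lag, which is what makes the inversion work.
seq = np.array([1.0, 1.0, 1.0, -1.0, 1.0, -1.0, -1.0])
```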

  6. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    Science.gov (United States)

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  7. Cognitive Load and Strategic Sophistication

    OpenAIRE

    Allred, Sarah; Duffy, Sean; Smith, John

    2013-01-01

    We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game, and then they are asked to recall...

  8. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    Science.gov (United States)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. But such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e. the pollution maximum and minimum cut-offs cannot be determined with such an approach, and they are an inefficient route to better forecasts. With the advancement in technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting of air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
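    The four skill scores named in the record have standard definitions in the air-quality evaluation literature (sign conventions for FB vary between authors); a direct sketch of those common forms:

```python
import numpy as np

def forecast_stats(obs, pred):
    """Correlation coefficient (R), normalized mean square error (NMSE),
    fractional bias (FB) and index of agreement (IOA), in common forms."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    r = float(np.corrcoef(obs, pred)[0, 1])
    nmse = float(np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean()))
    fb = float(2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean()))
    ioa = float(1.0 - np.sum((obs - pred) ** 2) /
                np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2))
    return {"R": r, "NMSE": nmse, "FB": fb, "IOA": ioa}
```

    A perfect forecast gives R = 1, NMSE = 0, FB = 0 and IOA = 1, so competing models can be ranked by how close they come to those values.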

  9. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving application scientists a greater overall amount of time to pursue the scientific objectives for which the systems are purchased.
We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis...

  10. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  11. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  12. The application of statistical techniques to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Annibal, P.S.; Roberts, P.D.

    1990-02-01

    Over the past decade much theoretical research has been carried out on the development of statistical methods for nuclear materials accountancy. In practice plant operation may differ substantially from the idealized models often cited. This paper demonstrates the importance of taking account of plant operation in applying the statistical techniques, to improve the accuracy of the estimates and the knowledge of the errors. The benefits are quantified either by theoretical calculation or by simulation. Two different aspects are considered: firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an accountancy tank is investigated. Secondly, a means of improving the knowledge of the 'Material Unaccounted For' (the difference between the inventory calculated from input/output data, and the measured inventory), using information about the plant measurement system, is developed and compared with existing general techniques. (author)
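The first aspect above, using redundant measurements to reduce the error on a tank's heavy-metal mass, is classically done by inverse-variance weighting of independent estimates. A minimal sketch with hypothetical numbers (not from the paper):

```python
def combine(measurements):
    """Inverse-variance weighted combination of independent measurements.

    measurements: list of (value, standard_deviation) pairs.
    Returns (combined_value, combined_standard_deviation).
    """
    weights = [1.0 / (s * s) for _, s in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, (1.0 / total) ** 0.5

# Hypothetical redundant estimates of heavy-metal mass (kg) in one tank:
# a level/density estimate and an independent tracer-based estimate.
mass, sigma = combine([(102.0, 1.5), (99.5, 2.0)])
```

The combined standard deviation (1.2 kg here) is always smaller than that of the best single measurement, which is exactly the benefit the record quantifies.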

  13. Combining heuristic and statistical techniques in landslide hazard assessments

    Science.gov (United States)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
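The weights-of-evidence step described above can be made concrete: for each class of a susceptibility factor, a positive weight W+ is computed from the landslide density inside the class and a negative weight W- from the density outside it. The cell counts below are hypothetical, not El Salvador data.

```python
import math

def weights_of_evidence(n_class_slide, n_class, n_slide, n_total):
    """W+ and W- for one factor class from landslide cell counts.

    n_class_slide: landslide cells inside the class
    n_class:       all cells inside the class
    n_slide:       landslide cells in the whole map
    n_total:       all cells in the map
    """
    # Probabilities of being in / out of the class, given slide / no slide
    p_c_s = n_class_slide / n_slide
    p_c_ns = (n_class - n_class_slide) / (n_total - n_slide)
    w_plus = math.log(p_c_s / p_c_ns)
    w_minus = math.log((1.0 - p_c_s) / (1.0 - p_c_ns))
    return w_plus, w_minus

# Hypothetical steep-slope class: 60 of 100 landslide cells fall inside it,
# although the class covers only 2000 of 10000 map cells.
w_plus, w_minus = weights_of_evidence(60, 2000, 100, 10000)
```

A positive W+ with a negative W- (as here) means the class is positive evidence for landslides; the contrast W+ - W- is a common single-number summary of the class's predictive power.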

  14. Line identification studies using traditional techniques and wavelength coincidence statistics

    International Nuclear Information System (INIS)

    Cowley, C.R.; Adelman, S.J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results are to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.

  15. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.

  16. Statistical optimisation techniques in fatigue signal editing problem

    International Nuclear Information System (INIS)

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-01-01

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelling segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  17. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelling segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  18. Application of multivariate statistical techniques in microbial ecology.

    Science.gov (United States)

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
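Among the exploratory procedures such reviews cover, principal component analysis is the workhorse. For two variables the eigendecomposition has a closed form, so a complete PCA sketch fits in a few lines; the two "taxa" below are toy data, not from any study.

```python
import math

def pca_2d(xs, ys):
    """PCA of two variables (e.g. two taxa's abundances across samples).

    Returns the eigenvalues (descending) of the sample covariance matrix
    and the fraction of total variance captured by the first component.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Sample covariance matrix [[a, b], [b, c]]
    a = sum((x - mx) ** 2 for x in xs) / (n - 1)
    c = sum((y - my) ** 2 for y in ys) / (n - 1)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Eigenvalues of a symmetric 2x2 matrix, in closed form
    half_trace = (a + c) / 2.0
    radius = math.sqrt(((a - c) / 2.0) ** 2 + b * b)
    lam1, lam2 = half_trace + radius, half_trace - radius
    return (lam1, lam2), lam1 / (lam1 + lam2)

# Toy data: two taxa that rise and fall together across six samples.
taxon_a = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
taxon_b = [1.1, 2.0, 2.9, 4.2, 5.1, 5.9]
eigs, explained = pca_2d(taxon_a, taxon_b)
```

Because the two taxa are strongly correlated, the first component captures nearly all the variance; in real community data with hundreds of taxa the same computation is done via SVD on the full abundance matrix.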

  19. Statistical methods of evaluating and comparing imaging techniques

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, a need has developed to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted and many reports have appeared in the literature. These studies cover a range of clinical situations. Likewise, the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested

  20. Sophisticating a naive Liapunov function

    International Nuclear Information System (INIS)

    Smith, D.; Lewins, J.D.

    1985-01-01

    The art of the direct method of Liapunov to determine system stability is to construct a suitable Liapunov or V function, where V is to be positive definite (PD) and to shrink to a center, which may be conveniently chosen as the origin, and where the time derivative V̇ is to be negative definite (ND). One aid to the art is to solve an approximation to the system equations in order to provide a candidate V function. It can happen, however, that V̇ is not strictly ND but vanishes at a finite number of isolated points. Naively, one anticipates that stability has been demonstrated, since the trajectory of the system at such points is only momentarily tangential and immediately enters a region of inward-directed trajectories. To demonstrate stability rigorously requires the construction of a sophisticated Liapunov function from what can be called the naive original choice. In this paper, the authors demonstrate the method of perturbing the naive function in the context of the well-known second-order oscillator and then apply the method to a more complicated problem based on a prompt jump model for a nuclear fission reactor.
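The perturbation idea can be made concrete with the textbook damped oscillator (a standard illustration, not taken from the paper itself): the naive quadratic V has a derivative that only vanishes off the origin, and a small cross term repairs it.

```latex
% Damped oscillator \ddot{x} + \dot{x} + x = 0, written with v = \dot{x}.
% Naive choice: V_0 is PD, but \dot{V}_0 is only negative semidefinite,
% vanishing on the whole line v = 0.
V_0(x,v) = \tfrac{1}{2}\bigl(x^2 + v^2\bigr), \qquad
\dot{V}_0 = x v + v(-v - x) = -v^2 \le 0 .

% Sophisticated choice: perturb with a small cross term \varepsilon x v
% (V_\varepsilon remains PD for |\varepsilon| < 1).
V_\varepsilon(x,v) = \tfrac{1}{2}\bigl(x^2 + v^2\bigr) + \varepsilon x v, \qquad
\dot{V}_\varepsilon = -\varepsilon x^2 - \varepsilon x v - (1-\varepsilon) v^2 .

% For 0 < \varepsilon < 4/5, the quadratic form -\dot{V}_\varepsilon has matrix
% \begin{pmatrix} \varepsilon & \varepsilon/2 \\ \varepsilon/2 & 1-\varepsilon \end{pmatrix}
% with determinant \varepsilon\,(1 - 5\varepsilon/4) > 0 and positive trace,
% so \dot{V}_\varepsilon is strictly negative definite.
```

This is exactly the move from the "naive" to the "sophisticated" function that the record describes: the perturbed V̇ no longer vanishes anywhere except the origin.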

  1. Statistical precision of delayed-neutron nondestructive assay techniques

    International Nuclear Information System (INIS)

    Bayne, C.K.; McNeany, S.R.

    1979-02-01

    A theoretical analysis of the statistical precision of delayed-neutron nondestructive assay instruments is presented. Such instruments measure the fissile content of nuclear fuel samples by neutron irradiation and delayed-neutron detection. The precision of these techniques is limited by the statistical nature of the nuclear decay process, but the precision can be optimized by proper selection of system operating parameters. Our method is a three-part analysis. We first present differential-difference equations describing the fundamental physics of the measurements. We then derive and present complete analytical solutions to these equations. Final equations governing the expected number and variance of delayed-neutron counts were computer programmed to calculate the relative statistical precision of specific system operating parameters. Our results show that Poisson statistics do not govern the number of counts accumulated in multiple irradiation-count cycles and that, in general, maximum count precision does not correspond with maximum count as first expected. Covariance between the counts of individual cycles must be considered in determining the optimum number of irradiation-count cycles and the optimum irradiation-to-count time ratio. For the assay system in use at ORNL, covariance effects are small, but for systems with short irradiation-to-count transition times, covariance effects force the optimum number of irradiation-count cycles to be half those giving maximum count. We conclude that the equations governing the expected value and variance of delayed-neutron counts have been derived in closed form. These have been computerized and can be used to select optimum operating parameters for delayed-neutron assay devices

  2. Pension fund sophistication and investment policy

    NARCIS (Netherlands)

    de Dreu, J.|info:eu-repo/dai/nl/364537906; Bikker, J.A.|info:eu-repo/dai/nl/06912261X

    This paper assesses the sophistication of pension funds’ investment policies using data on 748 Dutch pension funds during the 1999–2006 period. We develop three indicators of sophistication: gross rounding of investment choices, investments in alternative sophisticated asset classes and ‘home bias’.

  3. Studies on coal flotation in flotation column using statistical technique

    Energy Technology Data Exchange (ETDEWEB)

    M.S. Jena; S.K. Biswal; K.K. Rao; P.S.R. Reddy [Institute of Minerals & Materials Technology (IMMT), Orissa (India)

    2009-07-01

    Flotation of Indian high ash coking coal fines to obtain clean coal has been reported earlier by many authors. Here an attempt has been made to systematically analyse the factors influencing the flotation process using the statistical design of experiments technique. Studies carried out in a 100 mm diameter column using a factorial design to establish the weightage of factors such as feed rate, air rate and collector dosage indicated that all three parameters have an equal influence on the flotation process. Subsequently an RSM-CCD design was used to obtain the best result, and it was observed that 94% of combustibles can be recovered with 82.5% weight recovery at 21.4% ash from a feed containing 31.3% ash content.
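The factorial-design step above can be sketched numerically: in a full 2³ design, each factor's main effect is the mean response at its high level minus the mean at its low level. The responses below are synthetic (constructed so the three factors really do have equal influence, echoing the record's finding), not the IMMT data.

```python
from itertools import product

def main_effects(results):
    """Main effects from a full 2^3 factorial experiment.

    results: dict mapping a run (a, b, c), levels in {-1, +1}, to its
    measured response (e.g. combustible recovery, %).
    """
    effects = {}
    for k in range(3):
        hi = [y for run, y in results.items() if run[k] == +1]
        lo = [y for run, y in results.items() if run[k] == -1]
        effects[k] = sum(hi) / len(hi) - sum(lo) / len(lo)
    return effects

# Synthetic recoveries (%) for the 8 runs of a 2^3 design:
# factors 0, 1, 2 = feed rate, air rate, collector dosage (hypothetical).
runs = list(product((-1, +1), repeat=3))
response = {r: 80 + 3 * r[0] + 3 * r[1] + 3 * r[2] for r in runs}
effects = main_effects(response)
```

Equal main effects (6 percentage points each here) are what "all three parameters have an equal influence" means operationally; the RSM-CCD stage then fits a quadratic surface around the promising region.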

  4. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    Science.gov (United States)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of the Bengaluru city using multivariate statistical techniques. Water quality index rating was calculated for pre and post monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show signs of poor quality in drinking purpose compared to pre-monsoon. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between two clusters and accounting for 89% spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which evolved three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock water interactions in the aquifer, effect of anthropogenic activities and ion exchange processes in water.
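The hierarchical cluster analysis step, grouping sampling stations into a high-pollution and a low-pollution cluster, can be sketched with a naive single-linkage agglomeration. The six "wells" below carry hypothetical standardized values for two parameters (not the Bengaluru data).

```python
def single_linkage(points, k):
    """Naive agglomerative clustering (single linkage) down to k clusters.

    points: list of numeric tuples (e.g. standardized water-quality values).
    Returns a list of clusters, each a list of point indices.
    """
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best            # merge the closest pair of clusters
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical standardized (Cl, NO3) values for six wells,
# three lightly polluted and three heavily polluted.
wells = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0),
         (2.1, 1.9), (2.0, 2.2), (1.9, 2.0)]
groups = single_linkage(wells, 2)
```

Production analyses would standardize all 14 parameters first and usually use Ward's linkage rather than single linkage, but the merge-the-closest-pair loop is the same idea.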

  5. Statistical and particle physics: Common problems and techniques

    International Nuclear Information System (INIS)

    Bowler, K.C.; Mc Kane, A.J.

    1984-01-01

    These proceedings contain statistical mechanical studies in condensed matter physics; interfacial problems in statistical physics; string theory; general Monte Carlo methods and their application to lattice gauge theories; topological excitations in field theory; phase transformation kinetics; and studies of chaotic systems

  6. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available The article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. The technique is defined, the stages of the analysis are identified, and the mathematical and statistical tools are considered.

  7. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    ...100 kHz, 1 MHz ... 100 MHz–1 GHz ... 3. Statistical Processing. 3.1 Statistical Analysis. Statistical analysis is the mathematical science... quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques that are mainly used for alarm... Balakrishnan N, editors. Handbook of statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555–602 (Order statistics and their applications)

  8. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  9. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  10. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing
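The sample-size and finite-population topics above combine in one standard formula: the sample size for estimating a proportion within a margin of error, shrunk by the finite population correction when the inspected population is small. The inspection scenario below is hypothetical.

```python
import math

def sample_size(p, e, z=1.96, population=None):
    """Sample size for estimating a proportion p to within +/- e at the
    confidence level implied by z (z = 1.96 for ~95%), with an optional
    finite population correction for small inspection populations.
    """
    n = (z * z * p * (1.0 - p)) / (e * e)
    if population is not None:
        n = n / (1.0 + (n - 1.0) / population)   # finite population correction
    return math.ceil(n)

# Hypothetical inspection: estimate a compliance rate expected near 90%
# to within 5 percentage points, from a finite population of 500 records.
n_infinite = sample_size(0.9, 0.05)
n_finite = sample_size(0.9, 0.05, population=500)
```

The correction matters exactly in the situations the paper targets: sampling 109 of 500 classified documents instead of 139 is a meaningful reduction in inspection effort at the same confidence level.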

  11. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)


    This study shows the potency of two GIS-based data-driven bivariate techniques namely ... In the view of these weaknesses, there is a strong requirement for reassessment of ... ... West Bengal (India) using remote sensing, geographical information system and multi-.

  12. Territories typification technique with use of statistical models

    Science.gov (United States)

    Galkin, V. I.; Rastegaev, A. V.; Seredin, V. V.; Andrianov, A. V.

    2018-05-01

    Typification of territories is required for the solution of many problems. The results of geological zoning obtained by various methods do not always agree. That is why the main goal of the given research is to develop a technique for obtaining a multidimensional standard classified indicator for geological zoning. In the course of the research, a probabilistic approach was used. In order to increase the reliability of geological information classification, the authors suggest using the complex multidimensional probabilistic indicator P_K as a criterion of the classification. The second criterion chosen is the multidimensional standard classified indicator Z. These can serve as characteristics of classification in geological-engineering zoning. The above-mentioned indicators P_K and Z are in good correlation. Correlation coefficient values for the entire territory, regardless of structural solidity, equal r = 0.95, so each indicator can be used in geological-engineering zoning. The method suggested has been tested and a schematic zoning map has been drawn.

  13. Application of Statistical Potential Techniques to Runaway Transport Studies

    International Nuclear Information System (INIS)

    Eguilior, S.; Castejon, F.; Parrondo, J. M.

    2001-01-01

    A method for computing the runaway production rate based on techniques of noise-activated escape in a potential is presented in this work. A generalised potential in 2D momentum space is obtained from the deterministic or drift terms of the Langevin equations. The diffusive or stochastic terms, which arise directly from the stochastic nature of collisions, play the role of the noise that activates barrier crossings. The runaway electron source is given by the escape rate in such a potential, which is obtained from an Arrhenius-like relation. Runaway electrons are those that skip the potential barrier due to the effect of stochastic collisions. In terms of computation time, this method allows one to quickly obtain the source term for a runaway electron transport code. (Author) 11 refs
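The Arrhenius-like relation the record invokes has the generic form rate ∝ exp(-ΔΦ/D), with ΔΦ the barrier height of the generalised potential and D the collisional noise strength. The numbers below are hypothetical, chosen only to show the exponential sensitivity of the runaway source to the barrier.

```python
import math

def escape_rate(prefactor, barrier, noise):
    """Arrhenius-like escape rate over a potential barrier.

    rate ~ prefactor * exp(-barrier / noise), where 'noise' plays the role
    of the collisional diffusion strength in the Langevin picture.
    """
    return prefactor * math.exp(-barrier / noise)

# Hypothetical numbers: halving the barrier (e.g. via a stronger electric
# field) raises the runaway source exponentially, not linearly.
low_field = escape_rate(1.0, barrier=10.0, noise=1.0)
high_field = escape_rate(1.0, barrier=5.0, noise=1.0)
```

Halving the barrier multiplies the rate by exp(5) ≈ 148, which is why a fast analytic escape-rate formula is so attractive as a source term compared with brute-force orbit simulation.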

  14. Statistical techniques for the identification of reactor component structural vibrations

    International Nuclear Information System (INIS)

    Kemeny, L.G.

    1975-01-01

    The identification, on-line and in near real-time, of the vibration frequencies, modes and amplitudes of selected key reactor structural components and the visual monitoring of these phenomena by nuclear power plant operating staff will serve to further the safety and control philosophy of nuclear systems and lead to design optimisation. The School of Nuclear Engineering has developed a data acquisition system for vibration detection and identification. The system is interfaced with the HIFAR research reactor of the Australian Atomic Energy Commission. The reactor serves to simulate noise and vibrational phenomena which might be pertinent in power reactor situations. The data acquisition system consists of a small computer interfaced with a digital correlator and a Fourier transform unit. An incremental tape recorder is utilised as a backing store and as a means of communication with other computers. A small analogue computer and an analogue statistical analyzer can be used in the pre- and post-computational analysis of signals which are received from neutron and gamma detectors, thermocouples, accelerometers, hydrophones and strain gauges. Investigations carried out to date include a study of the role of local and global pressure fields due to turbulence in coolant flow and pump impeller induced perturbations on (a) control absorbers, (b) fuel element and (c) coolant external circuit and core tank structure component vibrations. (Auth.)

  15. Statistical mechanics of sensing and communications: Insights and techniques

    International Nuclear Information System (INIS)

    Murayama, T; Davis, P

    2008-01-01

    In this article we review a basic model for analysis of large sensor networks from the point of view of collective estimation under bandwidth constraints. We compare different sensing aggregation levels as alternative 'strategies' for collective estimation: moderate aggregation from a moderate number of sensors for which communication bandwidth is enough that data encoding can be reversible, and large scale aggregation from very many sensors - in which case communication bandwidth constraints require the use of nonreversible encoding. We show the non-trivial trade-off between sensing quality, which can be increased by increasing the number of sensors, and communication quality under bandwidth constraints, which decreases if the number of sensors is too large. From a practical standpoint, we verify that such a trade-off exists in constructively defined communications schemes. We introduce a probabilistic encoding scheme and define rate distortion models that are suitable for analysis of the large network limit. Our description shows that the methods and ideas from statistical physics can play an important role in formulating effective models for such schemes

  16. Statistical classification techniques in high energy physics (SDDT algorithm)

    International Nuclear Information System (INIS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-01-01

    We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb⁻¹ recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)

  17. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.; Harrington, Heather A.; Stumpf, Michael P. H.; Byrne, Helen M.

    2015-01-01

    mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since

  18. Does underground storage still require sophisticated studies?

    International Nuclear Information System (INIS)

    Marsily, G. de

    1997-01-01

    Most countries agree on the necessity of burying high- or medium-level wastes in geological layers situated a few hundred meters below ground level. The advantages and disadvantages of different types of rock such as salt, clay, granite and volcanic material are examined. Sophisticated studies are conducted to determine the best geological confinement, but questions arise about the time for which safety must be ensured. France has chosen 3 possible sites. These sites are geologically described in the article. The final place will be proposed after a testing phase of about 5 years in an underground facility. (A.C.)

  19. STOCK EXCHANGE LISTING INDUCES SOPHISTICATION OF CAPITAL BUDGETING

    Directory of Open Access Journals (Sweden)

    Wesley Mendes-da-Silva

    2014-08-01

    Full Text Available This article compares capital budgeting techniques employed in listed and unlisted companies in Brazil. We surveyed the Chief Financial Officers (CFOs) of 398 listed companies and 300 large unlisted companies, and based on 91 respondents, the results suggest that the CFOs of listed companies tend to use less simplistic methods, such as NPV and CAPM, more often, and that CFOs of unlisted companies are less likely to estimate the cost of equity, despite being large companies. These findings indicate that stock exchange listing may require greater sophistication of the capital budgeting process.
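
    For reference, the two 'less simplistic' methods named above reduce to short calculations; the cash flows, rates and beta below are hypothetical placeholders, not survey data:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: 100 invested now, 40 per year for 4 years.
project = [-100.0, 40.0, 40.0, 40.0, 40.0]
npv_at_10pct = npv(0.10, project)   # positive, so accept under the NPV rule

# CAPM cost of equity: risk-free rate + beta * market risk premium
cost_of_equity = 0.06 + 1.2 * (0.11 - 0.06)
```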

  20. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1978-01-01

    Surveillance techniques that extend the sophistication of existing automated systems for monitoring industrial rotating equipment are described. The monitoring system automatically established limiting criteria during an initial learning period of a few days; subsequently, while monitoring the test rotor during an extended period of normal operation, it experienced a false alarm rate of 0.5%. At the same time, the monitoring system successfully detected all fault types that were introduced into the test setup. Tests on real equipment are needed to provide final verification of the monitoring techniques. There are areas that would profit from additional investigation in the laboratory environment. A comparison of the relative value of alternate descriptors under given fault conditions would be worthwhile. This should be pursued in conjunction with extending the set of fault types available, e.g., bearing problems. Other tests should examine the effects of using fewer (more coarse) intervals to define the lumped operational states. Finally, techniques to diagnose the most probable fault should be developed by drawing upon the extensive data automatically logged by the monitoring system.
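
    A minimal sketch of the learning-period idea (synthetic data and a hypothetical 3-sigma criterion; the actual system used richer descriptors and lumped operational states):

```python
import numpy as np

rng = np.random.default_rng(2)

# "Learning period": the monitor observes a vibration descriptor under
# normal operation and derives limiting criteria from its mean and spread.
learning = rng.normal(5.0, 0.2, 2000)
mu, sd = learning.mean(), learning.std()
lower, upper = mu - 3 * sd, mu + 3 * sd     # 3-sigma limiting criteria

def alarm(x):
    """Flag a reading outside the automatically established limits."""
    return x < lower or x > upper

# Surveillance phase: normal data should rarely alarm; a shifted reading
# (simulated fault) should trip the limits.
normal_rate = np.mean([alarm(v) for v in rng.normal(5.0, 0.2, 2000)])
fault_detected = alarm(6.5)
```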

  1. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  2. Statistical evaluation of recorded knowledge in nuclear and other instrumental analytical techniques

    International Nuclear Information System (INIS)

    Braun, T.

    1987-01-01

    The main points addressed in this study are the following: Statistical distribution patterns of published literature on instrumental analytical techniques 1981-1984; structure of scientific literature and heuristics for identifying active specialities and emerging hot spot research areas in instrumental analytical techniques; growth and growth rates of the literature in some of the identified hot research areas; quality and quantity in instrumental analytical research output. (orig.)

  3. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items
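
    For the statistical side, a common attribute-sampling calculation (assuming zero expected deviations; a textbook formula, not taken from the article) determines the sample size from the desired confidence level and the tolerable deviation rate:

```python
import math

def attribute_sample_size(confidence, tolerable_rate):
    """Smallest n such that, if the true deviation rate equals the
    tolerable rate, a sample of n items contains at least one deviation
    with the required confidence (zero expected deviations)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - tolerable_rate))

n = attribute_sample_size(0.95, 0.05)   # classic 95% confidence / 5% tolerable rate
```

    The classic 95% / 5% case yields the familiar 59-item sample.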

  4. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  5. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
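
    A compact sketch of the PCA-plus-clustering pipeline on a synthetic evaluation matrix (numpy only; the criteria and group structure are hypothetical, and the study itself used a much richer BSM2 evaluation matrix):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy evaluation matrix: rows = control strategies, columns = criteria
# (e.g. effluent quality, energy use, violations). Two behavioural groups.
group_a = rng.normal([1.0, 5.0, 0.2], 0.1, (10, 3))
group_b = rng.normal([3.0, 2.0, 1.0], 0.1, (10, 3))
X = np.vstack([group_a, group_b])

# PCA via SVD on the centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T[:, :2]          # strategies in the PC1-PC2 plane

# A one-dimensional split on PC1 recovers the two natural clusters
labels = (scores[:, 0] > scores[:, 0].mean()).astype(int)
```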

  6. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  7. Curve fitting and modeling with splines using statistical variable selection techniques

    Science.gov (United States)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
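
    A rough sketch of backward knot elimination using a cubic truncated-power basis (synthetic data and a hypothetical stopping rule; the cited programs use the B-spline basis and formal statistical variable selection criteria):

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)

def design(knots):
    """Cubic truncated-power basis: 1, x, x^2, x^3, plus (x - k)_+^3 per knot."""
    cols = [x**p for p in range(4)] + [np.clip(x - k, 0, None)**3 for k in knots]
    return np.column_stack(cols)

def rss(knots):
    """Residual sum of squares of the least-squares spline fit."""
    A = design(knots)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ beta) ** 2))

# Backward elimination: repeatedly drop the knot whose removal increases
# the residual sum of squares the least, while the increase stays small.
knots = list(np.linspace(0.1, 0.9, 9))
while len(knots) > 1:
    current = rss(knots)
    best_rss, i = min((rss(knots[:j] + knots[j+1:]), j) for j in range(len(knots)))
    if best_rss > 1.5 * current:   # removal would degrade the fit too much
        break
    knots.pop(i)
n_final = len(knots)
```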

  8. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    International Nuclear Information System (INIS)

    Ren, Qingguo; Dewan, Sheilesh Kumar; Li, Ming; Li, Jianying; Mao, Dingbiao; Wang, Zhenglei; Hua, Yanqing

    2012-01-01

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) underwent brain scans at different tube current–time products (300 and 200 mAs) in 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (please note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so this research did not contain cerebral tumors but as a discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All the data were analyzed by using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image quality at 200 mAs with the 50% ASIR blending technique and that at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between the image quality at 200 mAs and at 300 mAs, both with the FBP technique. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique

  9. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) underwent brain scans at different tube current–time products (300 and 200 mAs) in 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (please note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so this research did not contain cerebral tumors but as a discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDI{sub vol}) and dose-length product (DLP) were recorded. All the data were analyzed by using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image quality at 200 mAs with the 50% ASIR blending technique and that at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between the image quality at 200 mAs and at 300 mAs, both with the FBP technique. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  10. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  11. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  12. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  13. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
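
    The three parameters can be computed with formulas of the following shape (the motor rating, running hours, efficiencies, price premium and tariff below are hypothetical placeholders, not the surveyed Pakistani industry data):

```python
def annual_energy_saving_kwh(power_kw, hours, eff_std, eff_hi):
    """Energy saved per year by replacing a standard-efficiency motor with a
    high-efficiency one delivering the same shaft power."""
    return power_kw * hours * (1 / eff_std - 1 / eff_hi)

def simple_payback_years(extra_cost, annual_saving_kwh, tariff):
    """Years for the annual cost saving to repay the motor's price premium."""
    return extra_cost / (annual_saving_kwh * tariff)

saving = annual_energy_saving_kwh(power_kw=15, hours=6000, eff_std=0.88, eff_hi=0.93)
payback = simple_payback_years(extra_cost=350.0, annual_saving_kwh=saving, tariff=0.12)
```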

  14. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    International Nuclear Information System (INIS)

    Park, Jinyong; Balasingham, P.; McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W.

    2004-01-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for the Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater

  15. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  16. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  17. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Connection between perturbation theory, projection-operator techniques, and statistical linearization for nonlinear systems

    International Nuclear Information System (INIS)

    Budgor, A.B.; West, B.J.

    1978-01-01

    We employ the equivalence between Zwanzig's projection-operator formalism and perturbation theory to demonstrate that the approximate-solution technique of statistical linearization for nonlinear stochastic differential equations corresponds to the lowest-order β truncation in both the consolidated perturbation expansions and in the ''mass operator'' of a renormalized Green's function equation. Other consolidated equations can be obtained by selectively modifying this mass operator. We particularize the results of this paper to the Duffing anharmonic oscillator equation

  19. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
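
    A toy version of the regression-plus-control-chart combination (synthetic temperatures and a simulated thermocouple drift; the NDMAS implementation described above is considerably more elaborate):

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated data: calculated fuel temperature vs a thermocouple reading
# that tracks it linearly, then starts drifting (failing) at sample 150.
calc_temp = 1100 + 30 * np.sin(np.linspace(0, 6, 200))
reading = 0.95 * calc_temp - 40 + rng.normal(0, 1.0, 200)
reading[150:] += np.linspace(0, 25, 50)          # failing thermocouple

# Regression on a known-healthy portion, then a 3-sigma control chart
# on the residuals to flag the deterioration.
slope, intercept = np.polyfit(calc_temp[:100], reading[:100], 1)
residual = reading - (slope * calc_temp + intercept)
sigma = residual[:100].std()
out_of_control = np.abs(residual) > 3 * sigma
```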

  20. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  1. A comparison of linear and nonlinear statistical techniques in performance attribution.

    Science.gov (United States)

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on the standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
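
    The flavour of the comparison can be sketched with a toy cross-section in which returns load nonlinearly on a single factor (all numbers hypothetical): a linear attribution underfits where even a simple polynomial term, a crude stand-in for the additive models above, captures the payoff:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy cross-section: one factor exposure with a nonlinear (quadratic) payoff.
exposure = rng.uniform(-2, 2, 500)
ret = 0.02 * exposure**2 - 0.01 * exposure + 0.005 * rng.standard_normal(500)

def fit_rss(degree):
    """Residual sum of squares of a polynomial factor model."""
    coef = np.polyfit(exposure, ret, degree)
    return float(np.sum((ret - np.polyval(coef, exposure)) ** 2))

rss_linear = fit_rss(1)      # standard linear attribution
rss_additive = fit_rss(2)    # simple nonlinear alternative
improvement = 1 - rss_additive / rss_linear
```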

  2. ISOLATED SPEECH RECOGNITION SYSTEM FOR TAMIL LANGUAGE USING STATISTICAL PATTERN MATCHING AND MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    VIMALA C.

    2015-05-01

Full Text Available In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as the Hidden Markov Model (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM), and Decision Trees (DT) are most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the developed system, the highest word recognition accuracy is achieved with the HMM technique. It offered 100% accuracy during the training process and 97.92% during testing.
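As a concrete illustration of the template-matching family named in this record, the classic DTW alignment cost can be sketched in a few lines of Python. This is a generic sketch, not the paper's implementation; the absolute-difference local cost is an assumption.

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping cost between two 1-D feature sequences.

    Fills the classic dynamic-programming table: D[i][j] holds the cheapest
    alignment cost of the prefixes a[:i] and b[:j].
    """
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])  # local distance (assumed: absolute difference)
            D[i][j] = cost + min(D[i - 1][j],      # step in a only
                                 D[i][j - 1],      # step in b only
                                 D[i - 1][j - 1])  # step in both
    return D[n][m]
```

Identical sequences score 0, and a sequence aligned against a time-stretched copy of itself also scores 0, which is exactly the tempo invariance that makes DTW attractive for isolated-word matching.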

  3. Statistical uncertainty of extreme wind storms over Europe derived from a probabilistic clustering technique

    Science.gov (United States)

    Walz, Michael; Leckebusch, Gregor C.

    2016-04-01

Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only about 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events from observations alone. Over the last decade, seasonal ensemble forecasts have become indispensable for quantifying the uncertainty of weather prediction on seasonal timescales. In this study, seasonal forecasts are used in a climatological context: by making use of up to 51 ensemble members, a broad and physically consistent statistical base can be created, which can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (based either on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal, or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features such as average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.
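The quadratic-curve representation of a storm track can be sketched as an ordinary least-squares fit of position against time, solved here via the 3x3 normal equations. This is a generic sketch; the paper's regression mixture model estimates such polynomial coefficients per cluster rather than per track in isolation.

```python
def fit_quadratic(ts, ys):
    """Least-squares fit of y ~ c0 + c1*t + c2*t^2 via the 3x3 normal
    equations, solved with Cramer's rule."""
    S = [sum(t ** k for t in ts) for k in range(5)]               # power sums of t
    rhs = [sum(y * t ** k for t, y in zip(ts, ys)) for k in range(3)]
    A = [[S[i + j] for j in range(3)] for i in range(3)]

    def det3(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    d = det3(A)
    coeffs = []
    for k in range(3):
        Ak = [row[:] for row in A]     # replace column k with the right-hand side
        for i in range(3):
            Ak[i][k] = rhs[i]
        coeffs.append(det3(Ak) / d)
    return coeffs  # [c0, c1, c2]
```

Fitting data that lie exactly on a parabola recovers its coefficients, which is the property the clustering step relies on when it summarizes each track by a small coefficient vector.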

  4. Does Investors' Sophistication Affect Persistence and Pricing of Discretionary Accruals?

    OpenAIRE

    Lanfeng Kao

    2007-01-01

This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choices, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms faced with naive investors is lower than that for firms faced with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

  5. Application of statistical downscaling technique for the production of wine grapes (Vitis vinifera L.) in Spain

    Science.gov (United States)

    Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.

    2012-04-01

Climate and soil are two of the most important limiting factors for agricultural production. Climate change has now been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools for simulating the most relevant aspects of the climate expected for the 21st century under climate change. These models are able to reproduce the general features of atmospheric dynamics, but their low resolution (about 200 km) prevents proper simulation of lower-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international settings, yielding excellent results for future climate scenarios. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, particularly grapes, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to the negative impacts of climatic change in the different regions (mainly those with protected designation of origin), and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.

  6. Leak detection and localization in a pipeline system by application of statistical analysis techniques

    International Nuclear Information System (INIS)

    Fukuda, Toshio; Mitsuoka, Toyokazu.

    1985-01-01

The detection of leaks in piping systems is an important diagnostic technique for preventing accidents and planning maintenance, since the occurrence of a leak lowers productivity and causes environmental damage. The first step is to detect the occurrence of a leak without delay; the second is to estimate the location of the leak in the piping system, which makes countermeasures easier. Pressure-based detection is usually used for large leaks. Because pressure-based methods are simple and advantageous, this study examined extending the pressure gradient method to the detection of smaller leaks in a pipeline under steady operation, using statistical analysis techniques. Since the flow in a pipe varies irregularly during pumping, statistical means are required to detect small leaks from pressure data. The index proposed in this paper for detecting leaks is the difference between the pressure gradients at the two ends of the pipeline. Experimental results on water and air in nylon tubes are reported. (Kako, I.)

  7. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

Full Text Available Abstract Background When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling, representing a parametric approach. The SL technique consisted of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
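The LR baseline this record compares against can be sketched minimally: for a single binary exposure, the fitted coefficient exponentiates to the familiar 2x2-table odds ratio. The counts below are invented for illustration; this is a generic sketch, not the paper's simulation.

```python
import math

def fit_logistic(xs, ys, lr=0.5, iters=5000):
    """Full-batch gradient ascent on the log-likelihood of
    logit P(y=1 | x) = b0 + b1*x (one covariate)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # score contribution for the intercept
            g1 += (y - p) * x    # score contribution for the exposure
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Invented 2x2 table: exposed -> 30 cases / 10 controls; unexposed -> 20 cases / 40 controls.
xs = [1] * 40 + [0] * 60
ys = [1] * 30 + [0] * 10 + [1] * 20 + [0] * 40
b0, b1 = fit_logistic(xs, ys)
# exp(b1) recovers the table odds ratio (30 * 40) / (10 * 20) = 6
```

This equivalence between exp(coefficient) and the table odds ratio is the "important epidemiologic interpretation" that the SL method in the record was modified to reproduce.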

  8. Flow prediction models using macroclimatic variables and multivariate statistical techniques in the Cauca River Valley

    International Nuclear Information System (INIS)

    Carvajal Escobar Yesid; Munoz, Flor Matilde

    2007-01-01

This project centred on a review of the state of the art of the ocean-atmospheric phenomena that affect Colombian hydrology, especially the ENSO phenomenon, which causes a first-order socioeconomic impact in the country and has not been sufficiently studied. It is therefore important to include the macroclimatic variables associated with ENSO in water-planning analyses. The analyses include a review of statistical techniques for testing the consistency of hydrological data, with the objective of building a reliable and homogeneous database of monthly flows of the Cauca River. Multivariate statistical methods, specifically principal component analysis, are used in the development of models for predicting monthly mean flows in the Cauca River, involving both linear approaches (the autoregressive AR, ARX, and ARMAX models) and a nonlinear approach (artificial neural networks).
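The simplest member of the AR family mentioned above can be sketched as an ordinary least-squares regression of each flow value on its predecessor. This is a generic AR(1) sketch with made-up numbers, not the study's calibrated model.

```python
def fit_ar1(series):
    """Ordinary least-squares AR(1) fit: x[t] ~ c + phi * x[t-1]."""
    xs, ys = series[:-1], series[1:]        # predictor: lagged value; response: next value
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    phi = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
           / sum((x - mx) ** 2 for x in xs))
    c = my - phi * mx
    return c, phi

# Made-up series generated exactly by x[t] = 2 + 0.5 * x[t-1],
# so the fit should recover c = 2 and phi = 0.5.
series = [0, 2, 3, 3.5, 3.75, 3.875]
```

ARX and ARMAX extend this scheme with exogenous inputs (here, macroclimatic indices) and moving-average error terms, but the least-squares backbone is the same.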

  9. Statistics and error considerations at the application of SSND T-technique in radon measurement

    International Nuclear Information System (INIS)

    Jonsson, G.

    1993-01-01

Plastic films are used for the detection of alpha particles from disintegrating radon and radon daughter nuclei. After etching, tracks (cones) or holes remain in the film as a result of the exposure. The step from a counted number of tracks or holes per unit surface of the film to a reliable value of the radon and radon daughter level involves statistical considerations of several kinds, among them the number of counted tracks, the length and season of the exposure period, the etching technique, and the method of counting the tracks or holes. The number of background tracks on an unexposed film increases the error of the measured radon level. Some of these statistical effects are discussed in the report. (Author)

  10. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objectives of this work include (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that the regression models relating calculated fuel temperatures to thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
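Of the three techniques named, control charting is the most direct to sketch: estimate limits from an in-control baseline and flag readings outside mean plus or minus three sigma. This is a generic Shewhart-style individuals-chart sketch with invented numbers, not the NDMAS implementation.

```python
import statistics

def control_limits(baseline, k=3.0):
    """Mean +/- k sigma control limits estimated from an in-control baseline sample."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - k * sd, mean + k * sd

def out_of_control(baseline, new_points, k=3.0):
    """Return the new readings falling outside the control limits,
    e.g. thermocouple values drifting away from their established behaviour."""
    lo, hi = control_limits(baseline, k)
    return [x for x in new_points if not lo <= x <= hi]
```

A failing thermocouple typically shows up as a run of readings outside the limits, which is the warning capability the paper attributes to control charting.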

  11. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in the Terengganu River Basin

    International Nuclear Information System (INIS)

    Aminu Ibrahim; Hafizan Juahir; Mohd Ekhwan Toriman; Mustapha, A.; Azman Azid; Isiyaka, H.A.

    2015-01-01

Multivariate statistical techniques, including cluster analysis, discriminant analysis, and principal component analysis/factor analysis, were applied to investigate the spatial variation and pollution sources in the Terengganu river basin during 5 years of monitoring 13 water quality parameters at thirteen different stations. Cluster analysis (CA) classified the 13 stations into 2 clusters, low polluted (LP) and moderately polluted (MP), based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH3-N, PO4 and EC) and a correct assignation of 95.80%. PCA/FA applied to the data sets yielded five latent factors accounting for 72.42% of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic waste, industry, runoff and agriculture (anthropogenic activities). Multivariate techniques are therefore important in environmental management. (author)
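The PCA step in studies like this compresses correlated water quality parameters into a few latent factors. A minimal sketch of extracting the leading component by power iteration on the sample covariance matrix follows; it is a generic sketch with toy data, not the study's analysis.

```python
def first_pc(data, iters=200):
    """Leading principal component of a list of observation rows, found by
    power iteration on the sample covariance matrix. Returns the component
    vector and the fraction of total variance it explains."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    X = [[row[j] - means[j] for j in range(p)] for row in data]   # centre the data
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)      # covariance matrix
          for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):                                        # power iteration
        w = [sum(C[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[a] * sum(C[a][b] * v[b] for b in range(p)) for a in range(p))
    trace = sum(C[a][a] for a in range(p))
    return v, lam / trace
```

On perfectly collinear toy data the first component explains all the variance; on real monitoring data, the analogous "fraction explained" per factor is what the record's 72.42% figure summarizes across five factors.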

  12. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  13. The role of sophisticated accounting system in strategy management

    OpenAIRE

    Naranjo Gil, David

    2004-01-01

    Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

  14. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  15. The First Sophists and the Uses of History.

    Science.gov (United States)

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  16. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the capability of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. The validation of outcomes displayed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which shows that the SI method has a slightly better performance than the DST technique. SI and DST methods are therefore advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, permitting investigation of both systemic and stochastic uncertainty. These techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
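The SI weight for one class of a conditioning factor is the log-ratio of the spring density inside the class to the overall density. The sketch below uses the standard formula with invented counts; the study's actual class boundaries and counts are not reproduced.

```python
import math

def statistical_index(springs_in_class, area_in_class, springs_total, area_total):
    """SI weight for one factor class: ln(class spring density / overall density).
    Positive values mean springs are over-represented in that class."""
    class_density = springs_in_class / area_in_class
    overall_density = springs_total / area_total
    return math.log(class_density / overall_density)

# Invented example: a class holding 20 of 50 springs on 100 of 500 area cells
# has twice the overall spring density, so its SI weight is ln 2.
```

Summing such class weights over all conditioning factors at each map cell yields the groundwater potential map that the AUC validation then scores.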

  17. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.

    2015-12-16

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
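Since ODE models are the chapter's focus, a minimal integration-and-steady-state sketch is useful: the classical fourth-order Runge-Kutta stepper applied to a toy production-degradation equation dx/dt = a - b*x, whose steady state a/b can be checked analytically. This is a generic sketch, not a Wnt pathway model.

```python
def rk4(f, y0, t0, t1, steps):
    """Integrate dy/dt = f(t, y) with the classical 4th-order Runge-Kutta scheme."""
    h = (t1 - t0) / steps
    y, t = list(y0), t0
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
        y = [yi + h / 6 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        t += h
    return y

# Toy production-degradation model dx/dt = a - b*x; its steady state is a/b = 4.
a_, b_ = 2.0, 0.5
f = lambda t, y: [a_ - b_ * y[0]]
```

Verifying the numerical trajectory against the analytic steady state is exactly the kind of sanity check that precedes the steady-state and sensitivity analyses described in the record.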

  18. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
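The four-to-one requirement and the combination of independent error sources described above can be sketched directly. This is a generic root-sum-of-squares sketch with invented values, not the United Force Machine's actual error budget.

```python
def combined_uncertainty(components):
    """Root-sum-of-squares combination of independent standard uncertainties."""
    return sum(u * u for u in components) ** 0.5

def meets_ratio(spec_tolerance, standard_uncertainty, required=4.0):
    """Check the four-to-one rule: instrument specification tolerance
    versus calibrating-standard uncertainty."""
    return spec_tolerance / standard_uncertainty >= required

# Invented numbers: three independent error sources of 3, 4 and 12 units
# combine to 13; a 40-unit instrument tolerance then fails the 4:1 test,
# while a 60-unit tolerance passes it.
```

When the ratio test fails, as in the force-machine case in the record, the fallback is exactly this kind of explicit uncertainty computation followed by a hunt for the dominant component.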

  19. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, under the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, discriminating the small differences in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is based on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in the gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs when compared with the traditional symmetry index method. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis.

  20. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    Science.gov (United States)

    Wu, Jianning; Wu, Bin

    2015-01-01

The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, under the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, discriminating the small differences in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is based on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in the gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs when compared with the traditional symmetry index method. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis.
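The binary SVM at the core of both records can be sketched as sub-gradient descent on the L2-regularized hinge loss. This is a generic linear-kernel sketch with toy 2-D data; the papers' actual kernels, features, and training procedure differ.

```python
def train_linear_svm(X, y, lam=0.01, lr=0.05, iters=2000):
    """Full-batch sub-gradient descent on the L2-regularized hinge loss.
    X: list of feature vectors; y: labels in {-1, +1}."""
    p, n = len(X[0]), len(X)
    w = [0.0] * p
    b = 0.0
    for _ in range(iters):
        gw = [lam * wj for wj in w]          # gradient of the regularizer
        gb = 0.0
        for xi, yi in zip(X, y):
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) < 1:
                for j in range(p):           # hinge sub-gradient for margin violators
                    gw[j] -= yi * xi[j] / n
                gb -= yi / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy linearly separable data standing in for left/right gait feature vectors.
X = [[-2, -2], [-2, -1], [-1, -2], [2, 2], [2, 1], [1, 2]]
y = [-1, -1, -1, 1, 1, 1]
w, b = train_linear_svm(X, y)
```

In the gait setting, the two classes would be feature vectors derived from the left and right limbs; a classifier that separates them well is evidence of asymmetry, which is the quantification idea in the abstract.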

  1. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average options

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco

We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ… Exchanging the Gaussian innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.

  2. A Modified Moore Approach to Teaching Mathematical Statistics: An Inquiry Based Learning Technique to Teaching Mathematical Statistics

    Science.gov (United States)

    McLoughlin, M. Padraig M. M.

    2008-01-01

    The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved, and hence this paper proposes a programme of use of a modified Moore method in a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…

  3. Coronary artery plaques: Cardiac CT with model-based and adaptive-statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Scheffel, Hans; Stolzmann, Paul; Schlett, Christopher L.; Engel, Leif-Christopher; Major, Gyöngi Petra; Károlyi, Mihály; Do, Synho; Maurovich-Horvat, Pál; Hoffmann, Udo

    2012-01-01

    Objectives: To compare image quality of coronary artery plaque visualization at CT angiography with images reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model based iterative reconstruction (MBIR) techniques. Methods: The coronary arteries of three ex vivo human hearts were imaged by CT and reconstructed with FBP, ASIR and MBIR. Coronary cross-sectional images were co-registered between the different reconstruction techniques and assessed for qualitative and quantitative image quality parameters. Readers were blinded to the reconstruction algorithm. Results: A total of 375 triplets of coronary cross-sectional images were co-registered. Using MBIR, 26% of the images were rated as having excellent overall image quality, which was significantly better as compared to ASIR and FBP (4% and 13%, respectively, all p < 0.001). Qualitative assessment of image noise demonstrated a noise reduction by using ASIR as compared to FBP (p < 0.01) and further noise reduction by using MBIR (p < 0.001). The contrast-to-noise-ratio (CNR) using MBIR was better as compared to ASIR and FBP (44 ± 19, 29 ± 15, 26 ± 9, respectively; all p < 0.001). Conclusions: Using MBIR improved image quality, reduced image noise and increased CNR as compared to the other available reconstruction techniques. This may further improve the visualization of coronary artery plaque and allow radiation reduction.
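The CNR figures reported above follow the usual definition: the mean attenuation difference between a region of interest and background, divided by image noise. A minimal sketch on invented pixel values follows; it is not the study's measurement protocol.

```python
import statistics

def cnr(roi_pixels, background_pixels):
    """Contrast-to-noise ratio: |mean ROI - mean background| over background SD."""
    contrast = statistics.fmean(roi_pixels) - statistics.fmean(background_pixels)
    noise = statistics.stdev(background_pixels)   # noise estimated from background
    return abs(contrast) / noise
```

Because the denominator is the noise estimate, an iterative reconstruction that lowers noise at constant contrast raises CNR directly, which is the mechanism behind the MBIR > ASIR > FBP ordering in the record.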

  4. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2015-01-15

    Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 Water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling

  5. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Directory of Open Access Journals (Sweden)

    Daniel Müllensiefen

Full Text Available Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  6. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Science.gov (United States)

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  7. [Comments on] Statistical techniques for the development and application of SYVAC. (Document by Stephen Howe Ltd.)

    International Nuclear Information System (INIS)

    Beale, E.M.L.

    1983-05-01

    The Department of the Environment has embarked on a programme to develop computer models to help with assessment of sites suitable for the disposal of nuclear wastes. The first priority is to produce a system, based on the System Variability Analysis Code (SYVAC) obtained from Atomic Energy of Canada Ltd., suitable for assessing radioactive waste disposal in land repositories containing non-heat-producing wastes from typical UK sources. The requirements of the SYVAC system development were so diverse that each portion of the development was contracted to a different company. Scicon are responsible for software coordination, system integration and user interface. Their present report contains comments on 'Statistical techniques for the development and application of SYVAC'. (U.K.)

  8. Development of statistical and analytical techniques for use in national quality control schemes for steroid hormones

    International Nuclear Information System (INIS)

    Wilson, D.W.; Gaskell, S.J.; Fahmy, D.R.; Joyce, B.G.; Groom, G.V.; Griffiths, K.; Kemp, K.W.; Nix, A.B.J.; Rowlands, R.J.

    1979-01-01

    Adopting the rationale that the improvement of intra-laboratory performance of immunometric assays will enable the assessment of national QC schemes to become more meaningful, the group of participating laboratories has developed statistical and analytical techniques for the improvement of accuracy, precision and monitoring of error for the determination of steroid hormones. These developments are now described and their relevance to NQC schemes discussed. Attention has been focussed on some of the factors necessary for improving standards of quality in immunometric assays, and on their relevance to laboratories participating in NQC schemes. These have included the 'accuracy', precision and robustness of assay procedures as well as improved methods for internal quality control. (Auth.)

  9. Water quality, Multivariate statistical techniques, submarine out fall, spatial variation, temporal variation

    International Nuclear Information System (INIS)

    Garcia, Francisco; Palacio, Carlos; Garcia, Uriel

    2012-01-01

    Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality at the Santa Marta coastal area, where a submarine outfall discharges 1 m3/s of domestic wastewater. Two-way analysis of variance (ANOVA), cluster analysis, principal component analysis, and kriging interpolation were considered for this report. Temporal variation showed two heterogeneous periods: from December to April, plus July, when the concentrations of the water quality parameters were higher, and the rest of the year (May, June, August-November), when they were significantly lower. The spatial variation revealed two areas of differing water quality; the difference is related to proximity to the submarine outfall discharge.

  10. Application of multivariate statistical techniques in the water quality assessment of Danube river, Serbia

    Directory of Open Access Journals (Sweden)

    Voza Danijela

    2015-12-01

    Full Text Available The aim of this article is to evaluate the quality of the Danube River in its course through Serbia as well as to demonstrate the possibilities for using three statistical methods: Principal Component Analysis (PCA), Factor Analysis (FA) and Cluster Analysis (CA) in surface water quality management. Given that the Danube is an important trans-boundary river, thorough water quality monitoring by sampling at different distances during shorter and longer periods of time is not only an ecological, but also a political issue. Monitoring was carried out at monthly intervals from January to December 2011, at 17 sampling sites. The obtained data set was treated by multivariate techniques in order, firstly, to identify the similarities and differences between sampling periods and locations, secondly, to recognize variables that affect the temporal and spatial water quality changes and thirdly, to present the anthropogenic impact on water quality parameters.

  11. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques, tabular CUSUM, standardized CUSUM and EWMA, all known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
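An EWMA control chart of the kind evaluated here is only a few lines of code. The sketch below is a generic textbook version, not the authors' implementation; the "transfer times" are a deterministic toy series with a sustained 2-second slowdown injected on day 51:

```python
import math

lam, L = 0.2, 3.0        # EWMA smoothing weight and control-limit width
mu0, sigma = 10.0, 0.5   # assumed in-control mean and sd of daily transfer time (s)

# toy daily transfer times: 50 in-control days, then a sustained 2 s slowdown
times = [mu0 + 0.4 * math.sin(t) for t in range(50)]
times += [mu0 + 2.0 + 0.4 * math.sin(t) for t in range(50, 100)]

z, first_alarm = mu0, None
for t, x in enumerate(times, start=1):
    z = lam * x + (1 - lam) * z                      # EWMA update
    # exact, time-varying control limit around the in-control mean
    limit = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    if first_alarm is None and abs(z - mu0) > limit:
        first_alarm = t

print("first alarm on day", first_alarm)
```

Because the EWMA pools recent observations, the shift is flagged within a few days of onset, which is the small-shift sensitivity the abstract credits EWMA with.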

  12. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    International Nuclear Information System (INIS)

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-01-01

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; then, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation of nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be determined accurately from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.

  13. Jsub(Ic)-testing of A-533 B - statistical evaluation of some different testing techniques

    International Nuclear Information System (INIS)

    Nilsson, F.

    1978-01-01

    The purpose of the present study was to compare statistically some different methods for the evaluation of fracture toughness of the nuclear reactor material A-533 B. Since linear elastic fracture mechanics is not applicable to this material at the temperature of interest (275 °C), the so-called Jsub(Ic) testing method was employed. Two main difficulties are inherent in this type of testing. The first is to determine the quantity J as a function of the deflection of the three-point bend specimens used. Three different techniques were used, the first two based on the experimentally observed input of energy to the specimen and the third employing finite element calculations. The second main problem is to determine the point when crack growth begins. For this, two methods were used, a direct electrical method and the indirect R-curve method. A total of forty specimens were tested at two laboratories. No statistically significant differences were found between the results from the respective laboratories. The three methods of calculating J yielded somewhat different results, although the discrepancy was small. The two methods of determining the growth initiation point also yielded consistent results. The R-curve method, however, exhibited a larger uncertainty as measured by the standard deviation. The resulting Jsub(Ic) value also agreed well with earlier presented results. The relative standard deviation was of the order of 25%, which is quite small for this type of experiment. (author)

  14. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    Science.gov (United States)

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, change of image parameters by ASIR as compared to filtered back projection (FBP) may influence quantification of coronary calcium. To investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone, FBP combined with ASIR 30%, 50%, 70%, and ASIR 100% based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP. ASIR reduced Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, ASIR may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
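The dependence of the Agatston score on peak plaque attenuation explains why ASIR's noise reduction lowers it: the standard score multiplies lesion area by a density weight derived from the lesion's maximum CT value, so shaving the peak below a 400/300/200 HU boundary drops a whole weight tier. A minimal sketch of the standard calculation (toy arrays, hypothetical pixel size):

```python
import numpy as np

def density_weight(peak_hu):
    # standard Agatston density factor from the lesion's peak attenuation
    if peak_hu >= 400:
        return 4
    if peak_hu >= 300:
        return 3
    if peak_hu >= 200:
        return 2
    return 1  # 130-199 HU

def agatston(slices, pixel_area_mm2):
    """Sum over slices: lesion area (>= 130 HU) times the density weight."""
    score = 0.0
    for hu in slices:
        mask = hu >= 130
        if mask.any():
            area_mm2 = mask.sum() * pixel_area_mm2
            score += area_mm2 * density_weight(hu[mask].max())
    return score

# toy two-slice plaque: four pixels peaking at 450 HU, then two at 250 HU
s1 = np.array([[450, 300], [135, 140]])
s2 = np.array([[250, 210], [90, 50]])
print(agatston([s1, s2], pixel_area_mm2=0.25))
```

If noise reduction lowered the 450 HU peak to 390 HU, the first slice's contribution would fall by a quarter with no change in plaque area.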

  15. PAUL AND SOPHISTIC RHETORIC: A PERSPECTIVE ON HIS ...

    African Journals Online (AJOL)

    use of modern rhetorical theories but analyses the letter in terms of the clas- ..... If a critical reader would have had the traditional anti-sophistic arsenal ..... pressions and that 'rhetoric' is mainly a matter of communicating these thoughts.

  16. Sophistication and Performance of Italian Agri‐food Exports

    Directory of Open Access Journals (Sweden)

    Anna Carbone

    2012-06-01

    Full Text Available Nonprice competition is increasingly important in world food markets. Recently, the expression ‘export sophistication’ has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Haussmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, moving from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.

  17. Obfuscation, Learning, and the Evolution of Investor Sophistication

    OpenAIRE

    Bruce Ian Carlin; Gustavo Manso

    2011-01-01

    Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

  18. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    Science.gov (United States)

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we have demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window values; however, most real systems are nonlinear, which makes the linear PCA method unable to tackle the issue of nonlinearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this may further improve fault detection performance by reducing the FAR using the exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
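The kernel PCA ingredient of the method can be sketched with plain NumPy (RBF kernel, double centring in feature space, eigendecomposition). This is only the KPCA building block, not the authors' full EWMA-GLRT pipeline, and the ring-shaped data are hypothetical:

```python
import numpy as np

def rbf_kernel_pca(X, gamma, n_components):
    """Kernel PCA sketch: RBF kernel, centring in feature space, eigensolve."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one  # double centring
    vals, vecs = np.linalg.eigh(Kc)             # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # nonlinear component scores of the training points
    return vecs * np.sqrt(np.maximum(vals, 0)), vals

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 80)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(80, 2))
scores, vals = rbf_kernel_pca(X, gamma=2.0, n_components=2)
```

Residuals of new samples with respect to such nonlinear components are what a KPCA-based detector would then feed into an exponentially weighted GLRT statistic.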

  19. Radiation dose reduction on multidetector abdominal CT using adaptive statistical iterative reconstruction technique in children

    International Nuclear Information System (INIS)

    Zhang Qifeng; Peng Yun; Duan Xiaomin; Sun Jihang; Yu Tong; Han Zhonglong

    2013-01-01

    Objective: To investigate the feasibility of reducing radiation dose in pediatric multidetector abdominal CT using the adaptive statistical iterative reconstruction technique (ASIR) combined with the automated tube current modulation technique (ATCM). Methods: Thirty patients underwent abdominal CT with ATCM alone and a follow-up scan with ATCM combined with 40% ASIR. ATCM was used with age-dependent noise index (NI) settings: NI = 9 for 0-5 years old and NI = 11 for > 5 years old in the simple ATCM group; NI = 11 for 0-5 years old and NI = 15 for > 5 years old in the ATCM combined with 40% ASIR group (ASIR group). Two radiologists independently evaluated images for diagnostic quality and image noise, using subjective image quality and image noise scores on a 5-point scale. Interobserver agreement was assessed by Kappa test. The volume CT dose indexes (CTDIvol) for the two groups were recorded. Statistical significance of the CTDIvol values was analyzed by paired-sample t test. Results: The average CTDIvol for the ASIR group was (1.38 ± 0.64) mGy, about 60% lower than (3.56 ± 1.23) mGy for the simple ATCM group, a statistically significant difference (t = 33.483, P < 0.05). The subjective image quality scores for the simple ATCM group were 4.43 ± 0.57 and 4.37 ± 0.61 (Kappa = 0.878, P < 0.01), and for the ASIR group 4.70 ± 0.47 and 4.60 ± 0.50 (Kappa = 0.783, P < 0.01), by the two observers. The image noise scores for the simple ATCM group were 4.03 ± 0.56 and 3.83 ± 0.53 (Kappa = 0.572, P < 0.01), and for the ASIR group 4.20 ± 0.48 and 4.10 ± 0.48 (Kappa = 0.748, P < 0.01), by the two observers. All images had acceptable diagnostic image quality. Conclusion: Lower radiation dose can be achieved by elevating NI with ASIR in pediatric abdominal CT studies, while maintaining diagnostically acceptable images. (authors)

  20. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Science.gov (United States)

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human practice related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  1. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

    Full Text Available In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human practice related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  2. Machine learning and statistical techniques : an application to the prediction of insolvency in Spanish non-life insurance companies

    OpenAIRE

    Díaz, Zuleyka; Segovia, María Jesús; Fernández, José

    2005-01-01

    Prediction of insurance companies' insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques which use financial ratios as explanatory variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of the mentioned methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...

  3. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    Science.gov (United States)

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods included experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…

  4. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminry Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool.
In the first two

  5. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    Science.gov (United States)

    Fenta Mekonnen, Dagnenet; Disse, Markus

    2018-04-01

    Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and coarse spatial resolutions make it difficult to use the outputs of GCMs directly, especially for sustainable water management at regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) of the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation result illustrates that both downscaling techniques (LARS-WG and SDSM) have shown comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done by equally weighted and varying weights of statistical indexes for precipitation only. The evaluation result showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation but LARS-WG performed best in capturing the extreme events and distribution of daily precipitation in the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3 GCMs, were used for downscaling climate scenarios by the LARS-WG model. The result from the ensemble mean of the six GCM showed an increasing trend for precipitation, Tmax and Tmin. 
The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase from 0.4 to 4.3

  6. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    Directory of Open Access Journals (Sweden)

    D. Fenta Mekonnen

    2018-04-01

    Full Text Available Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and coarse spatial resolutions make it difficult to use the outputs of GCMs directly, especially for sustainable water management at regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) of the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation result illustrates that both downscaling techniques (LARS-WG and SDSM) have shown comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done by equally weighted and varying weights of statistical indexes for precipitation only. The evaluation result showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation but LARS-WG performed best in capturing the extreme events and distribution of daily precipitation in the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3 GCMs, were used for downscaling climate scenarios by the LARS-WG model. The result from the ensemble mean of the six GCM showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase

  7. A Review of Statistical Techniques for 2x2 and RxC Categorical Data Tables In SPSS

    Directory of Open Access Journals (Sweden)

    Cengiz BAL

    2009-11-01

    Full Text Available In this study, statistical techniques for RxC categorical data tables are reviewed in detail. Emphasis is given to the association between techniques and their corresponding data considerations. Suggestions on how to handle specific categorical data tables in SPSS are given, and common mistakes in the interpretation of SPSS outputs are shown.
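The core computation behind most of those RxC techniques, the Pearson chi-square test, is short enough to sketch directly; the counts below are hypothetical. The expected-count matrix it produces is also what the usual data consideration (expected cell counts of at least 5) is checked against:

```python
import numpy as np

def chi_square_rxc(table):
    """Pearson chi-square statistic and df for an RxC contingency table."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    df = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, df, expected

# hypothetical 2x2 table: rows = groups, columns = outcomes
stat, df, expected = chi_square_rxc([[20, 30], [40, 10]])
print(stat, df)
```

The statistic is then compared against a chi-square distribution with `df` degrees of freedom; SPSS reports the corresponding p value.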

  8. Statistical techniques for modeling extreme price dynamics in the energy market

    International Nuclear Information System (INIS)

    Mbugua, L N; Mwita, P N

    2013-01-01

    Extreme events have a large impact throughout the span of engineering, science and economics, because they often lead to failure and losses, owing to the unobservable nature of extraordinary occurrences. In this context, this paper focuses on appropriate statistical methods, combining a quantile regression approach with extreme value theory to model the excesses; this plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when one knows little about the functional form of the object being estimated. Conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the lack of adequate observations at the tail of the distribution function. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modeling.
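The threshold-model step can be illustrated with a peaks-over-threshold fit of the generalized Pareto distribution (GPD). The method-of-moments estimator below is a deliberately simple stand-in for the estimation used in the paper, and the data are synthetic exponential draws, for which the true GPD shape of the excesses is 0 by memorylessness:

```python
import numpy as np

rng = np.random.default_rng(42)
# hypothetical price-change magnitudes with exponential tails
losses = rng.exponential(1.0, 5000)

u = np.quantile(losses, 0.95)     # high threshold
excess = losses[losses > u] - u   # peaks over threshold

# method-of-moments GPD fit: shape xi and scale sigma from the excess moments
m, v = excess.mean(), excess.var(ddof=1)
xi = 0.5 * (1 - m**2 / v)         # tail index (0 for exponential tails)
sigma = 0.5 * m * (m**2 / v + 1)

print(f"xi = {xi:.3f}, sigma = {sigma:.3f}")
```

Quantiles of the fitted GPD above the threshold then give tail risk measures such as high-confidence value-at-risk.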

  9. Renormalization group theory outperforms other approaches in statistical comparison between upscaling techniques for porous media

    Science.gov (United States)

    Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.

    2017-09-01

    Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation, with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly due to its ability to address the tensorial nature of the coefficients. As we have implemented it, MG places low computational demands, and accurate flow-rate estimates are obtained when using MG-upscaled permeabilities that approach or exceed the percolation threshold.
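A toy flavour of permeability upscaling helps fix ideas: geometric block averaging of a scalar field. This is only a crude stand-in for the tensorial mode-elimination scheme the paper introduces, though the geometric mean is known to be the exact effective value for 2-D isotropic lognormal media:

```python
import numpy as np

def upscale_geometric(perm, block=2):
    """Coarse-grain a square 2-D permeability field by geometric block means."""
    n = perm.shape[0] // block
    coarse = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            cell = perm[i*block:(i+1)*block, j*block:(j+1)*block]
            coarse[i, j] = np.exp(np.log(cell).mean())  # geometric mean
    return coarse

fine = np.full((8, 8), 3.0)       # homogeneous field: upscaling must preserve it
coarse = upscale_geometric(fine)
```

Any defensible upscaler must at least reproduce a homogeneous field exactly and keep each coarse value between the harmonic and arithmetic means of its block, which the geometric mean does.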

  10. Financial Literacy and Financial Sophistication in the Older Population

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  11. The conceptualization and measurement of cognitive health sophistication.

    Science.gov (United States)

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  12. Financial Literacy and Financial Sophistication in the Older Population.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  13. Evaluation of significantly modified water bodies in Vojvodina by using multivariate statistical techniques

    Directory of Open Access Journals (Sweden)

    Vujović Svetlana R.

    2013-01-01

    Full Text Available This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of monitoring networks for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied to evaluate variations in, and to interpret, a water quality data set of natural water bodies obtained during the 2010 monitoring year for 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors that are not directly observable. Factor analysis was applied to the physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, together explaining more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounts for 28% of the total variance and represents the hydrochemical dimension of water quality. The second factor, F2, accounts for 18% of the total variance and may be taken as a factor of water eutrophication. The third factor, F3, accounts for 17% of the total variance and represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounts for 13% of the total variance and may be taken as an ecological dimension of water quality. 
Cluster analysis (CA) is an
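The eigenvalue-based factor retention used in this study (the Kaiser criterion: keep components of the correlation matrix with eigenvalues greater than 1) can be sketched as follows. The 33 x 13 shape mirrors the monitoring design, but the values are random stand-ins, not the Vojvodina data:

```python
import numpy as np

rng = np.random.default_rng(1)
# stand-in for 33 sites x 13 water-quality parameters
X = rng.normal(size=(33, 13))
X[:, 1] += 2.0 * X[:, 0]            # induce correlation so one factor dominates

Z = (X - X.mean(0)) / X.std(0)      # standardize each parameter
eigvals = np.linalg.eigh(np.corrcoef(Z, rowvar=False))[0][::-1]  # descending

retained = eigvals[eigvals > 1.0]   # Kaiser criterion: eigenvalues > 1
explained = retained.sum() / eigvals.sum()   # fraction of total variance kept
```

In the paper this step yields four retained factors covering more than 78% of the variance; here the numbers depend on the synthetic data.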

  14. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    Science.gov (United States)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods" a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. 
The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
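All of these methods elaborate on the same MOS idea: learn the systematic error in the raw NWP output and correct for it. A minimal sketch of the baseline approach, a simple linear regression mapping raw forecasts to observations; the data are synthetic with an assumed bias, and none of the California wind data or the six methods' implementations are reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
raw = rng.uniform(0, 15, n)                    # raw NWP forecast (hypothetical)
obs = 0.8 * raw + 1.5 + rng.normal(0, 1.0, n)  # observations with systematic bias

# linear regression as the MOS baseline: fit obs as a function of raw
a, b = np.polyfit(raw, obs, 1)
mos = a * raw + b                              # bias-corrected forecast

rmse_raw = np.sqrt(np.mean((obs - raw) ** 2))
rmse_mos = np.sqrt(np.mean((obs - mos) ** 2))  # lower: the bias is removed
```

The machine-learning methods listed above aim to capture error patterns too complex for this linear map, which is why the comparison experiment is informative.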

  15. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

    A summary of a field soil sampling study conducted around the Rocky Flats Colorado plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques and how the techniques compared with the proposed EPA technique of sampling to a 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction

  16. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Abbas Alkarkhi, F.M.; Ismail, Norli; Easa, Azhar Mat

    2008-01-01

    Cockles (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg) using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points at each location, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied to analyze the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result in identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd), affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers
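A two-variable discriminant like the Zn/Cd one reported here can be sketched with Fisher's linear discriminant, which finds the direction best separating the two groups. The concentrations below are invented for illustration and are not the Penang measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
# hypothetical (Zn, Cd) concentrations for two rivers, 20 samples each
juru   = rng.normal([2.0, 0.5], 0.3, size=(20, 2))
jejawi = rng.normal([3.0, 1.0], 0.3, size=(20, 2))

# Fisher's linear discriminant: w = Sw^-1 (m1 - m2)
m1, m2 = juru.mean(0), jejawi.mean(0)
Sw = np.cov(juru, rowvar=False) + np.cov(jejawi, rowvar=False)
w = np.linalg.solve(Sw, m1 - m2)

# classify by which side of the midpoint the projection falls on
thresh = w @ (m1 + m2) / 2
pred_juru = juru @ w > thresh
pred_jejawi = jejawi @ w > thresh
accuracy = (pred_juru.sum() + (~pred_jejawi).sum()) / 40
```

The fraction of "correct assignations" reported in the abstract corresponds to this kind of accuracy figure, computed there on the real data.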

  17. Correlation analysis of energy indicators for sustainable development using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Carneiro, Alvaro Luiz Guimaraes; Santos, Francisco Carlos Barbosa dos

    2007-01-01

    Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at the local, regional and global levels: combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)

  18. Application of multivariate statistical technique for hydrogeochemical assessment of groundwater within the Lower Pra Basin, Ghana

    Science.gov (United States)

    Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.

    2017-06-01

    A multivariate statistical technique and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using the Varimax with Kaiser Normalization method of extraction, for both the rotated space and the component matrix, was applied to the data. Results show that Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering, within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin; together, these three principal components account for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural inputs as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relation among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal

  19. Source Identification of Heavy Metals in Soils Surrounding the Zanjan Zinc Town by Multivariate Statistical Techniques

    Directory of Open Access Journals (Sweden)

    M.A. Delavar

    2016-02-01

    Full Text Available Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food safety issues, potential health risks, and the detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them into human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide essential information for better management of polluted soils to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of probable sources that can control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, the Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sample locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via atomic absorption spectroscopy (AAS). Data were statistically analyzed using the SPSS software version 17.0 for Windows. Correlation matrix (CM), principal component analysis (PCA) and factor analysis (FA) techniques were performed in order to identify the probable sources of HMs in the studied soils. Results and

  20. Finding the Fabulous Few: Why Your Program Needs Sophisticated Research.

    Science.gov (United States)

    Pfizenmaier, Emily

    1981-01-01

    Fund raising, it is argued, needs sophisticated prospect research. Professional prospect researchers play an important role in helping to identify prospective donors and also in helping to stimulate interest in gift giving. A sample of an individual work-up on a donor and a bibliography are provided. (MLW)

  1. Procles the Carthaginian: A North African Sophist in Pausanias’ Periegesis

    Directory of Open Access Journals (Sweden)

    Juan Pablo Sánchez Hernández

    2010-11-01

    Full Text Available Procles, cited by Pausanias (in the imperfect tense about a display in Rome and for an opinion about Pyrrhus of Epirus, probably was not a historian of Hellenistic date, but a contemporary sophist whom Pausanias encountered in person in Rome.

  2. SMEs and new ventures need business model sophistication

    DEFF Research Database (Denmark)

    Kesting, Peter; Günzel-Jensen, Franziska

    2015-01-01

    , and Spreadshirt, this article develops a framework that introduces five business model sophistication strategies: (1) uncover additional functions of your product, (2) identify strategic benefits for third parties, (3) take advantage of economies of scope, (4) utilize cross-selling opportunities, and (5) involve...

  3. The adaptive statistical iterative reconstruction-V technique for radiation dose reduction in abdominal CT: comparison with the adaptive statistical iterative reconstruction technique.

    Science.gov (United States)

    Kwon, Heejin; Cho, Jinhan; Oh, Jongyeong; Kim, Dongwon; Cho, Junghyun; Kim, Sanghyun; Lee, Sangyun; Lee, Jihyun

    2015-10-01

    To investigate whether reduced radiation dose abdominal CT images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) compromise the depiction of clinically competent features when compared with the currently used routine radiation dose CT images reconstructed with ASIR. 27 consecutive patients (mean body mass index: 23.55 kg m(-2)) underwent CT of the abdomen at two time points. At the first time point, abdominal CT was scanned at a noise index level of 21.45 with automatic current modulation at 120 kV. Images were reconstructed with 40% ASIR, the routine protocol of Dong-A University Hospital. At the second time point, follow-up scans were performed at a noise index level of 30. Images were reconstructed with filtered back projection (FBP), 40% ASIR, 30% ASIR-V, 50% ASIR-V and 70% ASIR-V for the reduced radiation dose. Both quantitative and qualitative analyses of image quality were conducted. The CT dose index was also recorded. At the follow-up study, the mean dose reduction relative to the currently used common radiation dose was 35.37% (range: 19-49%). The overall subjective image quality and diagnostic acceptability scores of 50% ASIR-V at the reduced radiation dose were nearly identical to those recorded when using the initial routine-dose CT with 40% ASIR. Subjective ratings of the qualitative analysis revealed that, of all the reduced radiation dose CT series reconstructed, 30% ASIR-V and 50% ASIR-V were associated with higher image quality, with lower noise and artefacts as well as good sharpness, when compared with 40% ASIR and FBP. However, the sharpness score at 70% ASIR-V was considered to be worse than that at 40% ASIR. Objective image noise for 50% ASIR-V was 34.24% and 46.34% lower than that for 40% ASIR and FBP, respectively. Abdominal CT images reconstructed with ASIR-V facilitate radiation dose reductions of up to 35% when compared with ASIR. This study represents the first clinical research experiment to use ASIR-V, the newest version of

  4. Statistical zonation technique and its application to the San Andres reservoir in the Poza Rica area, Vera Cruz, Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Campa, M F; Romero, M R

    1969-01-01

    A statistical zonation technique developed by J.D. Testerman is presented, with reference to its application to the San Andres reservoir in the Poza Rica area in Veracruz, Mexico. The method is based on a statistical technique which permits grouping of similar values of a certain parameter, i.e., porosity, for individual wells within a field. The resulting groups, or zones, are used in a correlation analysis to deduce whether there is continuity of porosity in any direction. In the San Andres reservoir, there is continuity of the porous media in the NE-SW direction. This is an important fact for the waterflooding project being carried out.
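Testerman's zonation groups similar parameter values down a well so that between-zone variance is large relative to within-zone variance. The sketch below is a simplified, single-split version of that idea (the full method iterates splits and merges zones across wells); the porosity log is hypothetical:

```python
import numpy as np

def best_split(values):
    """Pick the depth index that maximizes the ratio of between-zone
    variance to total variance for a two-zone split (a simplified form
    of Testerman's zonation index)."""
    values = np.asarray(values, float)
    total = ((values - values.mean()) ** 2).sum()
    best_r, best_i = -1.0, None
    for i in range(1, len(values)):
        a, b = values[:i], values[i:]
        between = (len(a) * (a.mean() - values.mean()) ** 2
                   + len(b) * (b.mean() - values.mean()) ** 2)
        r = between / total
        if r > best_r:
            best_r, best_i = r, i
    return best_i, best_r

# hypothetical porosity log with two visually distinct zones
por = [0.10, 0.11, 0.09, 0.10, 0.21, 0.20, 0.22, 0.19]
i, r = best_split(por)   # splits the log at the zone boundary
```

A ratio `r` near 1 means the two zones are internally homogeneous, which is exactly the property exploited when correlating zones between wells.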

  5. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included

  6. Lies, damn lies and statistics

    International Nuclear Information System (INIS)

    Jones, M.D.

    2001-01-01

    Statistics are widely employed within archaeological research. This is becoming increasingly so as user-friendly statistical packages make increasingly sophisticated analyses available to non-statisticians. However, all statistical techniques are based on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of the underlying assumptions, there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology, and here this is illustrated with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab

  7. Assessment of roadside surface water quality of Savar, Dhaka, Bangladesh using GIS and multivariate statistical techniques

    Science.gov (United States)

    Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir

    2017-11-01

    In this study, multivariate statistical techniques, in combination with GIS, are used to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season and 15 water quality parameters, including TSS, TDS, pH, DO, BOD, Cl-, F-, NO3-, NO2-, SO4 2-, Ca, Mg, K, Zn and Pb, were measured. The univariate overview of the water quality parameters is: TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl- 51.494 ± 28.095 mg/l, F- 0.771 ± 0.153 mg/l, NO3- 2.211 ± 0.878 mg/l, NO2- 4.692 ± 5.971 mg/l, SO4 2- 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick-field dust settles (TDS, TSS) in the nearby water bodies. PC2 explains 22.123% of the total variance and indicates agricultural influence (K, Ca, and NO2-). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO4 2-, Cl-, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yields three major clusters for both the water parameters and the sampling sites. The site-based clusters showed a grouping pattern similar to that of the R-mode factor score map. The present work reveals a new scope for monitoring roadside water quality in future research in Bangladesh.

  8. A Comparison of Selected Statistical Techniques to Model Soil Cation Exchange Capacity

    Science.gov (United States)

    Khaledian, Yones; Brevik, Eric C.; Pereira, Paulo; Cerdà, Artemi; Fattah, Mohammed A.; Tazikeh, Hossein

    2017-04-01

    Cation exchange capacity (CEC) measures the soil's ability to hold positively charged ions and is an important indicator of soil quality (Khaledian et al., 2016). However, other soil properties are more commonly determined and reported, such as texture, pH, organic matter and biology. We attempted to predict CEC using different advanced statistical methods, including monotone analysis of variance (MONANOVA), artificial neural networks (ANNs), principal components regression (PCR), and particle swarm optimization (PSO), in order to compare the utility of these approaches and identify the best predictor. We analyzed 170 soil samples from four different nations (USA, Spain, Iran and Iraq) under three land uses (agriculture, pasture, and forest). Seventy percent of the samples (120 samples) were selected as the calibration set and the remaining 50 samples (30%) were used as the prediction set. The results indicated that MONANOVA (R2 = 0.82 and root mean squared error (RMSE) = 6.32) and ANNs (R2 = 0.82 and RMSE = 5.53) were the best models to estimate CEC; PSO (R2 = 0.80 and RMSE = 5.54) and PCR (R2 = 0.70 and RMSE = 6.48) also worked well, and the overall results were very similar to each other. Clay (positively correlated) and sand (negatively correlated) were the most influential variables for predicting CEC for the entire data set, while the most influential variables for the various countries and land uses differed, and CEC was affected by different variables in different situations. Although MONANOVA and ANNs provided good predictions of the entire dataset, PSO gives a formula to estimate soil CEC using commonly tested soil properties. Therefore, PSO shows promise as a technique to estimate soil CEC. Establishing effective pedotransfer functions to predict CEC would be productive where there are limitations of time and money, and other commonly analyzed soil properties are available. References Khaledian, Y., Kiani, F., Ebrahimi, S., Brevik, E.C., Aitkenhead
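Principal components regression, one of the four methods compared in this study, can be sketched as: standardize the predictors, project them onto the leading principal components, then run ordinary least squares on the component scores. Everything below (sample size, coefficients) is synthetic; it is not the soil data set of the study:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 120, 6
X = rng.normal(size=(n, p))            # stand-in soil properties (clay, sand, ...)
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 0.5, n)  # CEC-like response

# PCR: standardize, take scores on the first k principal components, then OLS
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3
T = Z @ Vt[:k].T                       # scores on the first k components

A = np.column_stack([np.ones(n), T])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

Dropping the trailing components regularizes the regression, which is why PCR is attractive when predictors are strongly intercorrelated, as soil texture fractions typically are.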

  9. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    Science.gov (United States)

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  10. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs, and teaches statistical and non-statistical skills in the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  11. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    OpenAIRE

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions between gait variables from the left and right sides of the lower limbs; that is, the discrimination of...

  12. Development Strategies for Tourism Destinations: Tourism Sophistication vs. Resource Investments

    OpenAIRE

    Rainer Andergassen; Guido Candela

    2010-01-01

    This paper investigates the effectiveness of development strategies for tourism destinations. We argue that resource investments unambiguously increase tourism revenues and that increasing the degree of tourism sophistication, that is, increasing the variety of tourism-related goods and services, increases tourism activity and decreases the perceived quality of the destination's resource endowment, leading to an ambiguous effect on tourism revenues. We disentangle these two effects and charact...

  13. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparative analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization
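    The closing mention of Monte Carlo simulation can be illustrated with a minimal process-performance sketch. Everything below (the task durations, the triangular distributions, the 30-day deadline) is an invented example, not material from the study:

```python
import random

def simulate_completion_probability(tasks, deadline, n_trials=20_000, seed=7):
    """Estimate the probability that a process completes by `deadline`.

    `tasks` is a list of (optimistic, most_likely, pessimistic) durations;
    each trial draws every task from a triangular distribution and sums them.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        total = sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        if total <= deadline:
            hits += 1
    return hits / n_trials

# Hypothetical three-task process with a 30-day deadline.
tasks = [(4, 6, 10), (8, 10, 15), (6, 9, 14)]
p = simulate_completion_probability(tasks, deadline=30)
```

    A process performance model of this kind turns per-step estimates into a distribution of outcomes, which is the quantitative-management use the abstract alludes to.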

  14. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    Energy Technology Data Exchange (ETDEWEB)

    Solaimani, Mohiuddin [Univ. of Texas-Dallas, Richardson, TX (United States); Iftekhar, Mohammed [Univ. of Texas-Dallas, Richardson, TX (United States); Khan, Latifur [Univ. of Texas-Dallas, Richardson, TX (United States); Thuraisingham, Bhavani [Univ. of Texas-Dallas, Richardson, TX (United States); Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
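    The window-based statistical detection step described above can be sketched as follows; this is a single-stream simplification with invented data, not the paper's Spark implementation:

```python
from collections import deque
from statistics import mean, stdev

def window_anomalies(stream, window=20, threshold=3.0):
    """Flag points lying more than `threshold` standard deviations from
    the mean of the preceding `window` observations."""
    buf = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(buf) == buf.maxlen:
            m, s = mean(buf), stdev(buf)
            if s > 0 and abs(x - m) > threshold * s:
                flagged.append(i)
        buf.append(x)
    return flagged

# A steady synthetic signal with one injected spike at index 30.
data = [10.0 + 0.1 * ((i * 7) % 5) for i in range(60)]
data[30] = 25.0
```

    In a distributed setting each VM's metric stream would be windowed and tested in parallel; the statistical core is the same.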

  15. Visual Analysis of North Atlantic Hurricane Trends Using Parallel Coordinates and Statistical Techniques

    National Research Council Canada - National Science Library

    Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E

    2008-01-01

    ... for a particular dependent variable. These capabilities are combined into a unique visualization system that is demonstrated via a North Atlantic hurricane climate study using a systematic workflow. This research corroborates the notion that enhanced parallel coordinates coupled with statistical analysis can be used for more effective knowledge discovery and confirmation in complex, real-world data sets.

  16. Spectral deformation techniques applied to the study of quantum statistical irreversible processes

    International Nuclear Information System (INIS)

    Courbage, M.

    1978-01-01

    A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)

  17. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  18. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    Science.gov (United States)

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

  19. Visual Analysis of North Atlantic Hurricane Trends Using Parallel Coordinates and Statistical Techniques

    Science.gov (United States)

    2008-07-07

    analyzing multivariate data sets. The system was developed using the Java Development Kit (JDK) version 1.5; and it yields interactive performance on a... script and captures output from MATLAB's "regress" and "stepwisefit" utilities that perform simple and stepwise regression, respectively. The MATLAB... Statistical Association, vol. 85, no. 411, pp. 664-675, 1990. [9] H. Hauser, F. Ledermann, and H. Doleisch, "Angular brushing of extended parallel coordinates

  20. Vibration impact acoustic emission technique for identification and analysis of defects in carbon steel tubes: Part A Statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-04-15

    Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high frequency stress wave signals captured from a newly developed noninvasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals have been introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube as the reference and four steel tubes with a through-hole artificial defect at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of defects in carbon steel tubes using AE signals captured with the non-invasive VIAE technique.
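    The statistical features named above (r.m.s., energy, crest factor) have compact definitions. A sketch, with a toy signal standing in for a captured AE burst:

```python
import math

def ae_features(signal):
    """Root mean square, energy, and crest factor of a sampled AE burst."""
    energy = sum(x * x for x in signal)          # sum of squared amplitudes
    rms = math.sqrt(energy / len(signal))
    crest = max(abs(x) for x in signal) / rms    # peak-to-r.m.s. ratio
    return rms, energy, crest

# Toy square-wave burst; a real AE signal would be a sampled voltage trace.
rms, energy, crest = ae_features([1.0, -1.0, 1.0, -1.0])
```

    A crest factor near 1 indicates a flat, square-like waveform; impulsive defect signatures push the peak well above the r.m.s. level.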

  1. Strategic sophistication of individuals and teams. Experimental evidence

    Science.gov (United States)

    Sutter, Matthias; Czermak, Simon; Feri, Francesco

    2013-01-01

    Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first-order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability of playing strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher-order beliefs. PMID:24926100

  2. Few remarks on chiral theories with sophisticated topology

    International Nuclear Information System (INIS)

    Golo, V.L.; Perelomov, A.M.

    1978-01-01

    Two classes of two-dimensional Euclidean chiral field theories are singled out: 1) the field phi(x) takes values in a compact Hermitian symmetric space; 2) the field phi(x) takes values in an orbit of the adjoint representation of a compact Lie group. The theories have sophisticated topological and rich analytical structures. They are considered with the help of topological invariants (topological charges). Explicit formulae for the topological charges are indicated, and a lower bound estimate for the action is given

  3. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    Science.gov (United States)

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
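    Two-group discriminant analysis of the kind applied here can be sketched for the two-feature case via Fisher's linear discriminant. The magnesium/sodium values below are invented for illustration; the study's data are not reproduced:

```python
def fisher_lda_2d(class_a, class_b):
    """Two-class Fisher discriminant for 2-feature samples.

    Returns a scoring function; positive scores side with class A.
    """
    def mean2(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    ma, mb = mean2(class_a), mean2(class_b)
    # Pooled within-class scatter matrix, accumulated entry by entry.
    sxx = sxy = syy = 0.0
    for pts, m in ((class_a, ma), (class_b, mb)):
        for x, y in pts:
            dx, dy = x - m[0], y - m[1]
            sxx += dx * dx; sxy += dx * dy; syy += dy * dy
    det = sxx * syy - sxy * sxy
    # w = S_w^{-1} (m_a - m_b), written out for the 2x2 case.
    d0, d1 = ma[0] - mb[0], ma[1] - mb[1]
    w = ((syy * d0 - sxy * d1) / det, (sxx * d1 - sxy * d0) / det)
    mid = ((ma[0] + mb[0]) / 2, (ma[1] + mb[1]) / 2)
    return lambda p: w[0] * (p[0] - mid[0]) + w[1] * (p[1] - mid[1])

# Hypothetical (magnesium, sodium) measurements for two flour types.
class_a = [(1.0, 2.0), (1.2, 1.8), (0.9, 2.2), (1.1, 2.1)]
class_b = [(3.0, 0.5), (3.2, 0.4), (2.9, 0.7), (3.1, 0.6)]
score = fisher_lda_2d(class_a, class_b)
```

    With well-separated groups, every training sample falls on its own side of the discriminant, which is what a "100% correct assignation" result means.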

  4. Generic Techniques for the Calibration of Robots with Application of the 3-D Fixtures and Statistical Technique on the PUMA 500 and ARID Robots

    Science.gov (United States)

    Tawfik, Hazem

    1991-01-01

    A relatively simple, inexpensive, and generic technique that can be used both in laboratories and in some operational site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). This report also gives a detailed explanation of the setup procedure, data collection, and analysis using this new technique, which was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate PUMA 500 industrial robot. The data were statistically analyzed to provide insight into the performance of the systems and components of the robot. The same technique was also used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots, such as the ASEA, the high-speed controller, and the Automated Radiator Inspection Device (ARID). Recommendations were also made to develop and establish other calibration techniques more suitable for site calibration environments and robot certification.

  5. On some surprising statistical properties of a DNA fingerprinting technique called AFLP

    NARCIS (Netherlands)

    Gort, G.

    2010-01-01

    AFLP is a widely used DNA fingerprinting technique, resulting in band absence - presence profiles, like a bar code. Bands represent DNA fragments, sampled from the genome of an individual plant or other organism. The DNA fragments travel through a lane of an electrophoretic gel or microcapillary

  6. Statistical Analysis of Reactor Pressure Vessel Fluence Calculation Benchmark Data Using Multiple Regression Techniques

    International Nuclear Information System (INIS)

    Carew, John F.; Finch, Stephen J.; Lois, Lambros

    2003-01-01

    The calculated >1-MeV pressure vessel fluence is used to determine the fracture toughness and integrity of the reactor pressure vessel. It is therefore of the utmost importance to ensure that the fluence prediction is accurate and unbiased. In practice, this assurance is provided by comparing the predictions of the calculational methodology with an extensive set of accurate benchmarks. A benchmarking database is used to provide an estimate of the overall average measurement-to-calculation (M/C) bias in the calculations ( ). This average is used as an ad hoc multiplicative adjustment to the calculations to correct for the observed calculational bias. However, this average only provides a well-defined and valid adjustment of the fluence if the M/C data are homogeneous; i.e., the data are statistically independent and there is no correlation between subsets of M/C data. Typically, the identification of correlations between the errors in the database M/C values is difficult because the correlation is of the same magnitude as the random errors in the M/C data and varies substantially over the database. In this paper, an evaluation of a reactor dosimetry benchmark database is performed to determine the statistical validity of the adjustment to the calculated pressure vessel fluence. Physical mechanisms that could potentially introduce a correlation between the subsets of M/C ratios are identified and included in a multiple regression analysis of the M/C data. Rigorous statistical criteria are used to evaluate the homogeneity of the M/C data and determine the validity of the adjustment. For the database evaluated, the M/C data are found to be strongly correlated with dosimeter response threshold energy and dosimeter location (e.g., cavity versus in-vessel). It is shown that because of the inhomogeneity in the M/C data, for this database, the benchmark data do not provide a valid basis for adjusting the pressure vessel fluence. The statistical criteria and methods employed in
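    A multiple regression of M/C ratios on candidate explanatory variables can be sketched as follows. The M/C values, threshold energies, and cavity indicator below are fabricated so the fit is exact; nonzero coefficients on the covariates are the kind of signal that reveals the inhomogeneity discussed above:

```python
import numpy as np

# Hypothetical M/C ratios with two candidate explanatory variables:
# dosimeter response threshold energy (MeV) and a 0/1 cavity-location flag.
threshold = np.array([0.5, 1.0, 2.3, 3.0, 0.5, 1.0, 2.3, 3.0])
cavity    = np.array([0,   0,   0,   0,   1,   1,   1,   1  ])
mc_ratio  = np.array([0.975, 0.990, 1.029, 1.050, 0.915, 0.930, 0.969, 0.990])

# Design matrix with an intercept; coefficients via ordinary least squares.
X = np.column_stack([np.ones_like(threshold), threshold, cavity])
beta, *_ = np.linalg.lstsq(X, mc_ratio, rcond=None)
intercept, slope_energy, cavity_offset = beta
```

    If `slope_energy` and `cavity_offset` were both statistically indistinguishable from zero, a single pooled mean M/C would be a defensible adjustment; here they are not, mirroring the paper's conclusion for its database.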

  7. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    Directory of Open Access Journals (Sweden)

    Nsikak U Benson

    Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.
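    The contamination factors can be sketched compactly. The definition assumed here (which may differ in detail from the paper's) takes each metal's ICF as the sum of its non-residual fractions divided by the residual fraction, and the GCF as the sum of the ICFs; all concentrations below are invented:

```python
def individual_cf(fractions, residual):
    """ICF: mobile (non-residual) metal relative to the residual fraction."""
    return sum(fractions) / residual

# Hypothetical fractionation results (mg/kg) per metal:
# (exchangeable, reducible, oxidisable) fractions, then the residual fraction.
sediment = {
    "Cu": ((4.1, 2.2, 1.5), 2.0),
    "Pb": ((0.8, 0.5, 0.3), 3.2),
}
icf = {m: individual_cf(parts, res) for m, (parts, res) in sediment.items()}
gcf = sum(icf.values())  # global contamination factor for the site
```

    An ICF well above 1 (as for Cu here) means most of the metal sits in mobile fractions and is potentially bioavailable.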

  8. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1979-01-01

    The level of technology utilized in automated systems that monitor industrial rotating equipment and the potential of alternative surveillance methods are assessed. It is concluded that changes in surveillance methodology would upgrade ongoing programs and yet still be practical for implementation. An improved anomaly recognition methodology is formulated and implemented on a minicomputer system. The effectiveness of the monitoring system was evaluated in laboratory tests on a small rotor assembly, using vibrational signals from both displacement probes and accelerometers. Time and frequency domain descriptors are selected to compose an overall signature that characterizes the monitored equipment. Limits for normal operation of the rotor assembly are established automatically during an initial learning period. Thereafter, anomaly detection is accomplished by applying an approximate statistical test to each signature descriptor. As demonstrated over months of testing, this monitoring system is capable of detecting anomalous conditions while exhibiting a false alarm rate below 0.5%
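    The learn-then-test scheme described above can be sketched as follows; the descriptor names and baseline values are invented, and the k-sigma limit check stands in for the approximate statistical test mentioned:

```python
from statistics import mean, stdev

class DescriptorMonitor:
    """Learn normal limits for each signature descriptor during a learning
    period, then test new signatures descriptor-by-descriptor."""

    def __init__(self, k=3.0):
        self.k = k          # width of the normal band, in standard deviations
        self.limits = {}

    def learn(self, baseline):
        # baseline: {descriptor_name: [values seen during the learning period]}
        for name, values in baseline.items():
            self.limits[name] = (mean(values), stdev(values))

    def anomalies(self, signature):
        flagged = []
        for name, x in signature.items():
            m, s = self.limits[name]
            if abs(x - m) > self.k * s:
                flagged.append(name)
        return flagged

# Hypothetical learning-period data for two vibration descriptors.
baseline = {"rms_velocity": [2.0, 2.1, 1.9, 2.0, 2.05, 1.95],
            "peak_1x": [5.0, 5.2, 4.8, 5.1, 4.9, 5.0]}
monitor = DescriptorMonitor(k=3.0)
monitor.learn(baseline)
```

    Widening `k` trades sensitivity for a lower false alarm rate, the figure of merit quoted in the abstract.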

  9. The sophisticated control of the tram bogie on track

    Directory of Open Access Journals (Sweden)

    Radovan DOLECEK

    2015-09-01

    The paper deals with the control algorithms for a new conception of tram vehicle bogie. The main goals of these research activities are to reduce the wear of rail wheels and tracks, reduce traction energy losses and increase running comfort. An experimental tram vehicle with a special bogie construction, powered by a traction battery, is utilized for these purposes. This vehicle has a rotary bogie with independent rotating wheels driven by permanent magnet synchronous motors and a solid axle. The wheel forces in the bogie are measured by a large number of sensors placed on the experimental tram vehicle. The designed control algorithms have now been implemented in the vehicle control system. The traction requirements and track characteristics affect these control algorithms. This sophisticated routing control brings further improvements, which are verified and corrected according to individual traction and driving characteristics, and opens new possibilities.

  10. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    Science.gov (United States)

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month long monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed at sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentration less than 1 mg/L in the upstream plug-flow reactor (middle of oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants.
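    Spearman's rank correlation, used above to track the fluctuating dependencies, is simply the Pearson correlation of the ranks. A self-contained sketch with tie handling:

```python
def _ranks(xs):
    """Average ranks (ties share the mean of their rank positions)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

    Because it works on ranks, the coefficient captures any monotone dependency (e.g., between N2O flux and nitrite concentration), not just a linear one.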

  11. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Qiandao Lake (Xin’an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April–July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
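    The PCA step, which reports the percentage of total variance explained, can be sketched from the correlation matrix. The synthetic data below (two nearly collinear columns plus an independent one) are illustrative only, not the lake data:

```python
import numpy as np

def explained_variance(data):
    """Percent of total variance carried by each principal component,
    computed from the correlation matrix of (samples x variables) data."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]   # descending
    return 100.0 * eigvals / eigvals.sum()

# Three synthetic "water quality" variables; the first two are nearly
# collinear, so the first component should dominate.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
data = np.column_stack([x, x + 0.05 * rng.normal(size=200),
                        rng.normal(size=200)])
pct = explained_variance(data)
```

    Figures like the 84.58% quoted above are the cumulative share of the leading components retained for a group.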

  12. An Efficient Statistical Computation Technique for Health Care Big Data using R

    Science.gov (United States)

    Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.

    2017-08-01

    Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosing a problem at an earlier stage increases the chances of survival and fast recovery, reducing the recovery time and the cost associated with treatment. One such medical issue is cancer, and breast cancer has been identified as the second leading cause of cancer death. If detected at an early stage it can be cured. Once a patient is found to have a breast tumor, it should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. Day by day, new records are added, which leads to an increase in the data to be classified, and this tends toward a big data problem. The algorithm is implemented in R, which is the most popular platform for applying machine learning algorithms to statistical computing. Experimentation is conducted using various classification evaluation metrics on various values of k. The results show that the KNN algorithm outperforms existing models.
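    The paper's classifier is KNN in R; an equivalent sketch in Python, with invented two-feature "tumour" records, shows the idea:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest neighbours.

    `train` is a list of (feature_vector, label) pairs; Euclidean distance.
    """
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda fv_lab: dist(fv_lab[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical two-feature tumour records: (radius, texture) -> class.
train = [((1.0, 1.2), "benign"), ((1.1, 0.9), "benign"),
         ((0.9, 1.0), "benign"), ((3.0, 3.1), "malignant"),
         ((3.2, 2.9), "malignant"), ((2.8, 3.0), "malignant")]
```

    Varying `k` and scoring the predictions against held-out labels reproduces the kind of evaluation-metric sweep the abstract describes.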

  13. Utilizing Healthcare Developments, Demographic Data with Statistical Techniques to Estimate the Diarrhoea Prevalence in India.

    Science.gov (United States)

    Srivastava, Shweta; Vatsalya, Vatsalya; Arora, Ashoo; Arora, Kashmiri L; Karch, Robert

    2012-03-22

    Diarrhoea is one of the leading causes of morbidity and mortality in developing countries in Africa and South Asia, such as India. The prevalence of diarrheal diseases in those countries is higher than in the developed western world and has largely been associated with socio-economic and sanitary conditions. However, the presently available data have not been sufficiently evaluated to study the role of other factors, such as healthcare development, population density, sex and regional influence, on diarrheal prevalence patterns. A study was performed to understand the relationship of diarrheal prevalence with specific measures, namely healthcare services development, demographics, population density, socio-economic conditions, sex, and regional prevalence patterns in India. Data from annual national health reports and other epidemiological studies were included and statistically analyzed. Our results demonstrate significant correlation of the disease prevalence pattern with certain measures such as healthcare centers, population growth rate, sex and region-specific morbidity. Available information on sanitation (water supply and toilet availability) and socioeconomic conditions (poverty and literacy measures) could only be associated as trends of significance. This study can be valuable for devising appropriate strategies focused on important measures such as healthcare resources, population growth and regional significance to evaluate prevalence patterns and manage diarrhoea locally and globally.

  14. Quantitative evaluation of ASiR image quality: an adaptive statistical iterative reconstruction technique

    Science.gov (United States)

    Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan

    2012-03-01

    Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This new reconstruction method combines the idealized system representation, as we know it from the standard Filtered Back Projection (FBP) algorithm, and the strength of iterative reconstruction by including a noise model in the reconstruction scheme. It studies how noise propagates through the reconstruction steps, feeds this model back into the loop and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper the effect of ASiR on the contrast to noise ratio is studied using the low contrast module of the Catphan phantom. The experiments were done on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast to noise ratio the images from ASiR can be obtained using 60% less current, leading to a reduction in dose of the same amount.
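    The contrast-to-noise ratio used in the comparison can be sketched directly. The ROI and background pixel values below are invented, and the definition used (contrast normalised by background noise) is one common convention:

```python
from statistics import mean, stdev

def contrast_to_noise(roi, background):
    """CNR: contrast between a low-contrast insert and its background,
    normalised by the background noise (standard deviation)."""
    return abs(mean(roi) - mean(background)) / stdev(background)

# Hypothetical pixel samples from an insert ROI and its background.
roi = [110.0, 110.4, 109.6, 110.0]
background = [100.0, 102.0, 98.0, 100.0, 101.0, 99.0]
cnr = contrast_to_noise(roi, background)
```

    Matching CNR at lower tube current, as reported above, is precisely how the 60% current (and hence dose) reduction figure is derived.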

  15. Proficiency Testing for Determination of Water Content in Toluene of Chemical Reagents by iteration robust statistic technique

    Science.gov (United States)

    Wang, Hao; Wang, Qunwei; He, Ming

    2018-05-01

    In order to investigate and improve the level of detection technology for water content in liquid chemical reagents in domestic laboratories, the proficiency testing provider PT0031 (CNAS) organized a proficiency testing program for water content in toluene; 48 laboratories from 18 provinces/cities/municipalities took part in the PT. This paper introduces the implementation process of proficiency testing for determination of water content in toluene, including sample preparation and homogeneity and stability testing; presents and analyzes the statistical results obtained with the iterative robust statistic technique; summarizes and analyzes the different test standards widely used in the laboratories; and puts forward technical suggestions for improving the quality of water content testing. Satisfactory results were obtained by 43 laboratories, amounting to 89.6% of the total participating laboratories.
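    An iterative robust mean and standard deviation in the spirit of ISO 13528's Algorithm A can be sketched as follows; the winsorisation width 1.5 and the consistency constants 1.483 and 1.134 are the usual choices, and the participant results below are invented:

```python
from statistics import median

def robust_mean_sd(values, tol=1e-6, max_iter=100):
    """Iterative robust mean and standard deviation: start from the median
    and scaled MAD, winsorise at x* +/- 1.5 s*, and iterate to convergence."""
    x = median(values)
    s = 1.483 * median(abs(v - x) for v in values)
    for _ in range(max_iter):
        lo, hi = x - 1.5 * s, x + 1.5 * s
        w = [min(max(v, lo), hi) for v in values]   # winsorised copy
        n = len(w)
        new_x = sum(w) / n
        new_s = 1.134 * (sum((v - new_x) ** 2 for v in w) / (n - 1)) ** 0.5
        converged = abs(new_x - x) < tol and abs(new_s - s) < tol
        x, s = new_x, new_s
        if converged:
            break
    return x, s

# Hypothetical water-content results (%) from 8 labs, one gross outlier.
values = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 25.0]
x_rob, s_rob = robust_mean_sd(values)
```

    The robust mean stays near the bulk of the results, so z-scores for participant grading are not dragged toward the outlier the way a plain mean would be.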

  16. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site, reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in "clean" areas of the site ranged from 45 to 60%.
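    A statistically determined minimum number of random samples can be derived from a simple detection argument: if a fraction p of an area is contaminated, n random samples all miss it with probability (1-p)^n. This sketch assumes that model, which may differ from the screening design actually used:

```python
import math

def min_samples(hot_fraction, confidence=0.95):
    """Smallest n such that, if a fraction `hot_fraction` of an area is
    contaminated, at least one of n random samples hits it with the
    requested confidence: n >= ln(1 - confidence) / ln(1 - hot_fraction)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - hot_fraction))
```

    For example, detecting a hot spot covering 10% of an area with 95% confidence requires 29 random samples; larger hot spots need far fewer, which is where the sampling savings come from.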

  17. Statistical Techniques Used in Three Applied Linguistics Journals: "Language Learning,""Applied Linguistics" and "TESOL Quarterly," 1980-1986: Implications for Readers and Researchers.

    Science.gov (United States)

    Teleni, Vicki; Baldauf, Richard B., Jr.

    A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…

  18. DETERMINING INDICATORS OF URBAN HOUSEHOLD WATER CONSUMPTION THROUGH MULTIVARIATE STATISTICAL TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Gledsneli Maria Lima Lins

    2010-12-01

    Water has a decisive influence on populations’ life quality – specifically in areas like urban supply, drainage, and effluent treatment – due to its sound impact on public health. Rational water use constitutes the greatest challenge faced by water demand management, mainly with regard to urban household water consumption. This makes it important to develop research to assist water managers and public policy-makers in planning and formulating water demand measures that allow rational urban water use to be achieved. This work utilized the multivariate techniques of Factor Analysis and Multiple Linear Regression Analysis – in order to determine the participation level of socioeconomic and climatic variables in monthly urban household consumption changes – applying them to two districts of Campina Grande city (State of Paraíba, Brazil). The districts were chosen based on a socioeconomic criterion (income level) so as to evaluate their water consumers’ behavior. A 9-year monthly data series (from 2000 up to 2008) was utilized, comprising family income, water tariff, and quantity of household connections (economies) – as socioeconomic variables – and average temperature and precipitation, as climatic variables. For both selected districts of Campina Grande city, the obtained results point to the variables “water tariff” and “family income” as indicators of the districts’ household consumption.
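    The participation level of each variable can be compared through standardised regression weights. For two predictors these follow in closed form from the pairwise correlations; the correlation values below are invented, not the study's:

```python
def standardized_betas(r_y1, r_y2, r_12):
    """Standardised regression weights for two predictors, computed from
    the pairwise correlations (closed form for the two-predictor case)."""
    denom = 1 - r_12 ** 2
    b1 = (r_y1 - r_y2 * r_12) / denom
    b2 = (r_y2 - r_y1 * r_12) / denom
    return b1, b2

# Hypothetical correlations of consumption with tariff (r_y1) and
# income (r_y2), and between tariff and income (r_12).
b_tariff, b_income = standardized_betas(-0.5, 0.6, -0.2)
```

    Because the betas are on a common standardised scale, their magnitudes can be compared directly to rank each variable's participation in consumption changes.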

  19. Denoised ordered subset statistically penalized algebraic reconstruction technique (DOS-SPART) in digital breast tomosynthesis

    Science.gov (United States)

    Garrett, John; Li, Yinsheng; Li, Ke; Chen, Guang-Hong

    2017-03-01

    Digital breast tomosynthesis (DBT) is a three-dimensional (3D) breast imaging modality in which projections are acquired over a limited angular span around the compressed breast and reconstructed into image slices parallel to the detector. DBT has been shown to help alleviate the breast tissue overlap issues of two-dimensional (2D) mammography. Since overlapping tissues may simulate cancer masses or obscure true cancers, this improvement is critically important for improved breast cancer screening and diagnosis. In this work, a model-based image reconstruction method is presented to show that spatial resolution in DBT volumes can be maintained while dose is reduced, when compared with a state-of-the-art commercial reconstruction technique. Spatial resolution was measured quantitatively in phantom images and subjectively in a clinical dataset. Noise characteristics were explored in a cadaver study. In both the quantitative and subjective results, image sharpness and overall image quality were maintained at reduced doses when the model-based iterative reconstruction was used to reconstruct the volumes.

  20. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
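    The second method described above (a principal components rotation followed by k-means) can be sketched as below. Everything here is illustrative: the synthetic three-channel "radiometric" data, the group centres, and the use of scikit-learn's standard KMeans in place of the report's convergent k-means.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Three "background" groups plus a small anomalous group (three count channels).
groups = [rng.normal(loc, 0.3, size=(200, 3)) for loc in
          ([1.0, 1.0, 1.0], [2.0, 1.5, 1.0], [1.5, 2.5, 2.0])]
outliers = rng.normal([1.5, 6.0, 1.5], 0.3, size=(30, 3))  # anomalously high channel
X = np.vstack(groups + [outliers])

# Decorrelate the three channels, then run k-means in PC space.
scores = PCA(n_components=3).fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

# The small anomalous group should end up isolated in its own cluster.
outlier_label = np.bincount(labels[-30:]).argmax()
purity = float(np.mean(labels[-30:] == outlier_label))
print(f"outlier-cluster purity: {purity:.2f}")
```

    As in the report, well-separated background groups and an outlier group fall into distinct clusters without any geologic input.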

  1. Multivariate statistical techniques for the evaluation of surface water quality of the Himalayan foothills streams, Pakistan

    Science.gov (United States)

    Malik, Riffat Naseem; Hashmi, Muhammad Zaffar

    2017-10-01

    Himalayan foothills streams in Pakistan play an important role in drinking water supply and irrigation of farmlands; thus, their water quality is closely related to public health. Multivariate techniques were applied to check spatial and seasonal trends and metal contamination sources of the Himalayan foothills streams, Pakistan. Grab surface water samples were collected from different sites (5-15 cm water depth) in pre-washed polyethylene containers. A Fast Sequential Atomic Absorption Spectrophotometer (Varian FSAA-240) was used to measure the metal concentrations. Concentrations of Ni, Cu, and Mn were higher in the pre-monsoon season than in the post-monsoon season. Cluster analysis identified impaired, moderately impaired and least impaired clusters based on water parameters. Discriminant function analysis indicated that spatial variability in water was due to temperature, electrical conductivity, nitrates, iron and lead, whereas seasonal variations were correlated with 16 physicochemical parameters. Factor analysis identified municipal and poultry waste, automobile activities, surface runoff, and soil weathering as major sources of contamination. Levels of Mn, Cr, Fe, Pb, Cd, Zn and alkalinity were above the WHO and USEPA standards for surface water. The results of the present study will help higher authorities in the management of the Himalayan foothills streams.
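    The factor-analysis step described above, reading principal-component loadings on standardized water-quality variables as contamination "sources", can be sketched as follows. The variables, latent factors, and loadings are synthetic stand-ins, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 120
waste = rng.normal(size=n)   # latent "municipal/poultry waste" factor
geo = rng.normal(size=n)     # latent "soil weathering" factor
data = np.column_stack([
    1.0 * waste + 0.2 * rng.normal(size=n),   # nitrates     (waste-driven)
    0.9 * waste + 0.3 * rng.normal(size=n),   # conductivity (waste-driven)
    0.7 * geo + 0.5 * rng.normal(size=n),     # Fe           (geogenic)
    0.6 * geo + 0.5 * rng.normal(size=n),     # Mn           (geogenic)
])

# Loadings: eigenvectors of the correlation matrix scaled by sqrt(eigenvalue).
corr = np.corrcoef(data, rowvar=False)
evals, evecs = np.linalg.eigh(corr)
order = np.argsort(evals)[::-1]
loadings = evecs[:, order[:2]] * np.sqrt(evals[order[:2]])
print(np.round(loadings, 2))   # each factor loads on one pair of variables
```

    Each retained factor loads heavily on one group of co-varying variables, which is what licenses the "source" interpretation in the abstract.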

  2. Nonparametric statistical techniques used in dose estimation for beagles exposed to inhaled plutonium nitrate

    International Nuclear Information System (INIS)

    Stevens, D.L.; Dagle, G.E.

    1986-01-01

    Retention and translocation of inhaled radionuclides are often estimated from the sacrifice of multiple animals at different time points. The data for each time point can be averaged and a smooth curve fitted to the mean values, or a smooth curve may be fitted to the entire data set. However, an analysis based on means may not be the most appropriate if there is substantial variation in the initial amount of the radionuclide inhaled or if the data are subject to outliers. A method has been developed that takes account of these problems. The body burden is viewed as a compartmental system, with the compartments identified with body organs. A median polish is applied to the multiple logistic transform of the compartmental fractions (compartment burden/total burden) at each time point. A smooth function is fitted to the results of the median polish. This technique was applied to data from beagles exposed to an aerosol of ²³⁹Pu(NO₃)₄. Models of retention and translocation for lungs, skeleton, liver, kidneys, and tracheobronchial lymph nodes were developed and used to estimate dose. 4 refs., 3 figs., 4 tabs
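    The robust step named above, a Tukey median polish, can be sketched as follows. The compartment fractions and the simple per-compartment logistic transform are invented for illustration; they are not the beagle data, and the paper's multiple logistic transform may differ in detail.

```python
import numpy as np

def median_polish(table, n_iter=10):
    """Robust decomposition: table ≈ overall + row effect + column effect + residual."""
    resid = np.asarray(table, dtype=float).copy()
    row_eff = np.zeros(resid.shape[0])
    col_eff = np.zeros(resid.shape[1])
    overall = 0.0
    for _ in range(n_iter):
        rmed = np.median(resid, axis=1)           # sweep out row medians
        resid -= rmed[:, None]
        row_eff += rmed
        m = np.median(row_eff)                    # re-centre row effects
        row_eff -= m
        overall += m
        cmed = np.median(resid, axis=0)           # sweep out column medians
        resid -= cmed[None, :]
        col_eff += cmed
        m = np.median(col_eff)                    # re-centre column effects
        col_eff -= m
        overall += m
    return overall, row_eff, col_eff, resid

# Toy table: rows = sacrifice time points, columns = compartments.
frac = np.array([[0.70, 0.20, 0.10],
                 [0.55, 0.30, 0.15],
                 [0.40, 0.38, 0.22],
                 [0.30, 0.45, 0.25]])
logit = np.log(frac / (1 - frac))   # a simple logistic transform of compartment fractions
overall, row_eff, col_eff, resid = median_polish(logit)
print(np.round(resid, 3))           # robust residuals, insensitive to outlying animals
```

    Because medians rather than means are swept out, a single outlying animal perturbs only the residual cell, not the fitted time and compartment effects.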

  3. Statistical signal processing techniques for coherent transversal beam dynamics in synchrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Alhumaidi, Mouhammad

    2015-03-04

    identifying and analyzing the betatron oscillation sourced from the kick based on its mixing and temporal patterns. The accelerator magnets can generate unwanted spurious linear and non-linear fields due to fabrication errors or aging. These error fields in the magnets can excite undesired resonances which, together with the space-charge tune spread, lead to long-term beam losses and reduced dynamic aperture. Therefore, knowledge of the linear and non-linear magnet errors in circular accelerator optics is crucial for controlling and compensating resonances and their consequent beam losses and beam quality deterioration. This is indispensable, especially for high beam intensity machines. Fortunately, the relationship between the beam offset oscillation signals recorded at the BPMs is a manifestation of the accelerator optics, and can therefore be exploited in the determination of the linear and non-linear optics components. Thus, beam transversal oscillations can be excited deliberately for diagnostic purposes in the operation of particle accelerators. In this thesis, we propose a novel method for detecting and estimating the non-linear optics lattice components located between the locations of two BPMs by analyzing the beam offset oscillation signals of a BPM triple containing these two BPMs. Depending on the non-linear components between the locations of the BPM triple, the relationship between the beam offsets follows a corresponding multivariate polynomial. After calculating the covariance matrix of the polynomial terms, the Generalized Total Least Squares method is used to find the model parameters, and thus the non-linear components. A bootstrap technique is used to detect the existing polynomial model orders by means of multiple hypothesis testing, and to determine confidence intervals for the model parameters.
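    The core estimation step, fitting a polynomial model by total least squares when every column is noisy, can be sketched as below. This is the plain (unweighted) TLS special case via SVD, not the generalized TLS with the covariance matrix of the polynomial terms used in the thesis, and the data and coefficients are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 500)            # stand-in for a BPM offset reading
A = np.column_stack([x, x**2])             # linear term plus a sextupole-like x^2 term
y = A @ np.array([0.8, 0.3]) + rng.normal(0, 0.01, x.size)
A_noisy = A + rng.normal(0, 0.01, A.shape) # all BPM readings are noisy, hence TLS

# TLS solution: the right singular vector of [A | y] belonging to the
# smallest singular value, rescaled so its last entry is -1.
v = np.linalg.svd(np.column_stack([A_noisy, y]))[2][-1]
params = -v[:-1] / v[-1]
print(np.round(params, 3))                 # close to the true [0.8, 0.3]
```

    Unlike ordinary least squares, this treats the polynomial terms themselves as noisy, which matches the situation where the regressors are built from measured beam offsets.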

  4. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    Science.gov (United States)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

    By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of counterfeited silver Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin-layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to systematically manipulate alloys, chemically and metallurgically, at the micro scale to produce adherent precious metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as far as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some information on corrosion products has been obtained, which is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  5. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Science.gov (United States)

    de Sá, Fábio P; Zina, Juliana; Haddad, Célio F B

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  6. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Directory of Open Access Journals (Sweden)

    Fábio P de Sá

    Full Text Available Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  7. Multi-Site and Multi-Variables Statistical Downscaling Technique in the Monsoon Dominated Region of Pakistan

    Science.gov (United States)

    Khan, Firdos; Pilz, Jürgen

    2016-04-01

    South Asia is under severe impacts of changing climate and global warming. The last two decades have shown that climate change is happening, and the first decade of the 21st century was the warmest decade on record over Pakistan, where the temperature reached 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are badly affected, causing floods, cyclones and hurricanes in the region, which further have impacts on agriculture, water, health etc. To cope with the situation, it is important to conduct impact assessment studies and take adaptation and mitigation remedies. For impact assessment studies, we need climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan, chosen because monsoon rains contribute more than 80% of its total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers than ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can be used to combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological
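    One of the two ingredients named above, quantile regression, can be sketched by minimizing the pinball (check) loss directly. The single predictor, the skewed response, and the Nelder-Mead optimizer are illustrative choices; the copula step of the method is not shown.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 10.0, 400)              # synthetic predictor (e.g. an EOF score)
y = 1.5 * x + rng.gamma(2.0, 1.0, 400)       # skewed, precipitation-like response

def pinball_loss(beta, tau):
    """Check (pinball) loss for the tau-th conditional quantile of y given x."""
    r = y - (beta[0] + beta[1] * x)
    return float(np.mean(np.maximum(tau * r, (tau - 1.0) * r)))

beta_med = minimize(pinball_loss, x0=[0.0, 1.0], args=(0.5,), method="Nelder-Mead").x
beta_hi = minimize(pinball_loss, x0=[0.0, 1.0], args=(0.9,), method="Nelder-Mead").x
print(np.round(beta_med, 2), np.round(beta_hi, 2))
```

    Fitting several values of tau this way yields the full conditional distribution rather than just the mean, which is what makes the approach robust to outliers and able to handle non-Gaussian variables.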

  8. Identification of heavy metals sources in the Mexico city atmosphere, using the proton induced x-ray analytical technique and multifactorial statistics techniques

    International Nuclear Information System (INIS)

    Hernandez M, B.

    1997-01-01

    The objectives of this work are: to identify the heavy metals present in the air, and their concentrations; to characterize the behavior of the polluting chemical elements over the annual cycle of 1990, based on their concentrations obtained through the PIXE technique; to identify suitable statistical methods for treating the metal concentration data, in the form of total suspended particles (PST), found in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. The results obtained are intended to serve as a basis for the decision-making and control measures planned by the various institutions addressing atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)

  9. Library of sophisticated functions for analysis of nuclear spectra

    Science.gov (United States)

    Morháč, Miroslav; Matoušek, Vladislav

    2009-10-01

    In this paper we present a compact library for the analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data. Program summary: Program title: SpecAnalysLib 1.1. Catalogue identifier: AEDZ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 42 154. No. of bytes in distributed program, including test data, etc.: 2 379 437. Distribution format: tar.gz. Programming language: C++. Computer: Pentium 3 PC 2.4 GHz or higher, Borland C++ Builder v. 6; a precompiled Windows version is included in the distribution package. Operating system: Windows 32-bit versions. RAM: 10 MB. Word size: 32 bits. Classification: 17.6. Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to giving physicists the possibility to use the advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for the analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. Solution method: The background estimation algorithms are based on the Sensitive Non-linear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and algorithms based on discrete
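    The SNIP clipping idea named in the summary can be sketched as below: at each window size p, every channel is replaced by the minimum of its value and the average of its two neighbours p channels away, which erodes peaks while preserving the smooth continuum. The decreasing-window variant and the toy spectrum here are illustrative, not the library's implementation.

```python
import numpy as np

def snip_background(spectrum, iterations=20):
    """SNIP with a decreasing clipping window: erodes peaks, keeps smooth background."""
    b = np.asarray(spectrum, dtype=float).copy()
    n = b.size
    for p in range(iterations, 0, -1):
        clipped = b.copy()
        # Replace each channel by min(value, mean of neighbours p channels away).
        clipped[p:n - p] = np.minimum(b[p:n - p], 0.5 * (b[:n - 2 * p] + b[2 * p:]))
        b = clipped
    return b

x = np.arange(512)
background = 50.0 * np.exp(-x / 300.0)                       # smooth continuum
peak = 200.0 * np.exp(-0.5 * ((x - 256.0) / 4.0) ** 2)       # one gamma line
spectrum = background + peak

est = snip_background(spectrum)
net = spectrum - est
print(f"net peak area: {net.sum():.0f} (true {peak.sum():.0f})")
```

    The estimated background never exceeds the data, so subtracting it leaves a non-negative net peak suitable for subsequent peak searching and fitting.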

  10. Impact of sophisticated fog spray models on accident analyses

    International Nuclear Information System (INIS)

    Roblyer, S.P.; Owzarski, P.C.

    1978-01-01

    The N-Reactor confinement system release dose to the public in a postulated accident is reduced by washing the confinement atmosphere with fog sprays. This allows a low-pressure release of confinement atmosphere containing fission products through filters and out an elevated stack. The current accident analysis required revision of the CORRAL code and other codes such as CONTEMPT to properly model the N Reactor confinement as a system of multiple fog-sprayed compartments. In revising these codes, more sophisticated models for the fog sprays and iodine plateout were incorporated, removing some of the conservatism in the steam condensing rate, fission product washout and iodine plateout used in previous studies. The CORRAL code, which was used to describe the transport and deposition of airborne fission products in LWR containment systems for the Rasmussen Study, was revised to describe fog spray removal of molecular iodine (I₂) and particulates in multiple compartments, for sprays having individual characteristics of on-off times, flow rates, fall heights, and drop sizes in changing containment atmospheres. During postulated accidents, the code determined the fission product removal rates internally rather than from input decontamination factors. A discussion is given of how the calculated plateout and washout rates vary with time throughout the analysis. The results of the accident analyses indicated that more credit could be given to fission product washout and plateout. An important finding was that the release of fission products to the atmosphere and the adsorption of fission products on the filters were significantly lower than previous studies had indicated

  11. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  12. Analysis of tribological behaviour of zirconia reinforced Al-SiC hybrid composites using statistical and artificial neural network technique

    Science.gov (United States)

    Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.

    2018-05-01

    The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%), fabricated through the powder metallurgy technique, was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA approach suggested that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composite. A very close correlation between experimental and ANN output was achieved by implementing the model. Finally, the ANN model was effectively used to find the influence of various control factors on the wear behaviour of the hybrid composites.
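    The ANOVA percentage-contribution idea can be sketched as below: each factor's main-effect sum of squares as a share of the total sum of squares over a full factorial grid. The factor levels and the synthetic wear response (with sliding distance dominating, as the abstract reports) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
zirconia = np.array([0.0, 3.0, 6.0, 9.0])     # nano-zirconia, wt%
distance = np.array([500.0, 1000.0, 1500.0])  # sliding distance, m (hypothetical levels)
load = np.array([10.0, 20.0, 30.0])           # applied load, N (hypothetical levels)

# Full factorial grid with a synthetic wear response where sliding distance dominates.
Z, D, L = np.meshgrid(zirconia, distance, load, indexing="ij")
wear = 0.002 * D - 0.05 * Z + 0.01 * L + rng.normal(0.0, 0.05, Z.shape)

grand = wear.mean()

def factor_ss(axis_keep):
    """Sum of squares of one main effect, computed from its level means."""
    other_axes = tuple(i for i in range(3) if i != axis_keep)
    level_means = wear.mean(axis=other_axes)
    n_per_level = wear.size / level_means.size
    return n_per_level * float(np.sum((level_means - grand) ** 2))

ss = np.array([factor_ss(i) for i in range(3)])
contrib = 100.0 * ss / float(np.sum((wear - grand) ** 2))
print(dict(zip(["zirconia", "distance", "load"], np.round(contrib, 1))))
```

    The remainder of the total sum of squares not captured by the three main effects is attributable to interactions and experimental error.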

  13. Post-fire debris flow prediction in Western United States: Advancements based on a nonparametric statistical technique

    Science.gov (United States)

    Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.

    2017-12-01

    Fire disasters affect modern societies at a global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact in a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over recent years a number of efforts from the United States Geological Survey (USGS) and the National Weather Service (NWS) have focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from USGS that reports a total of 1500 storms that did or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
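    A minimal sketch of such a framework, a random forest classifier scored with the threat score (critical success index), is given below. The storm and land-surface predictors and the trigger rule are synthetic stand-ins for the USGS database, so the resulting score is not comparable to the paper's.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 1500  # mirrors the number of storms in the USGS database
X = np.column_stack([
    rng.gamma(2.0, 10.0, n),     # storm accumulation (mm)  -- synthetic
    rng.gamma(2.0, 5.0, n),      # max intensity (mm/h)     -- synthetic
    rng.uniform(0.0, 0.6, n),    # soil erodibility index   -- synthetic
    rng.uniform(5.0, 40.0, n),   # local slope (deg)        -- synthetic
])
# Hypothetical trigger rule: intensity and erodibility jointly drive occurrence.
p = 1.0 / (1.0 + np.exp(-(0.15 * X[:, 1] + 6.0 * X[:, 2] - 4.0)))
y = rng.random(n) < p

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
pred = rf.predict(Xte).astype(bool)

# Threat score (CSI) = hits / (hits + misses + false alarms).
hits = int(np.sum(pred & yte))
misses = int(np.sum(~pred & yte))
false_alarms = int(np.sum(pred & ~yte))
threat_score = hits / (hits + misses + false_alarms)
print(f"threat score (CSI): {threat_score:.2f}")
```

    Unlike a single power-law threshold, the forest can exploit nonlinear interactions among storm and land-surface predictors, which is the advantage the abstract highlights.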

  14. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques

    Science.gov (United States)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John

    2016-02-01

    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
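    One of the validation statistics used above, the frequencies of wet and dry spells, can be computed as below. The two random occurrence series are placeholders for an observed and a downscaled record.

```python
import numpy as np

def spell_lengths(wet):
    """Lengths of consecutive wet (True) and dry (False) runs in a daily series."""
    change = np.flatnonzero(np.diff(wet.astype(int))) + 1
    runs = np.split(wet, change)
    wet_runs = [len(r) for r in runs if r[0]]
    dry_runs = [len(r) for r in runs if not r[0]]
    return wet_runs, dry_runs

rng = np.random.default_rng(5)
obs = rng.random(3650) < 0.40    # stand-in for an observed wet-day series
sim = rng.random(3650) < 0.38    # stand-in for a downscaled series, slightly drier

wet_obs, dry_obs = spell_lengths(obs)
wet_sim, dry_sim = spell_lengths(sim)
print(f"wet-day frequency: obs {obs.mean():.3f}, sim {sim.mean():.3f}")
print(f"mean dry spell:    obs {np.mean(dry_obs):.2f}, sim {np.mean(dry_sim):.2f}")
```

    Comparing the cumulative distributions of these spell lengths between observed and simulated series is what the paper uses to judge whether a method reproduces wet and dry day persistence.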

  15. Arsenic health risk assessment in drinking water and source apportionment using multivariate statistical techniques in Kohistan region, northern Pakistan.

    Science.gov (United States)

    Muhammad, Said; Tahir Shah, M; Khan, Sardar

    2010-10-01

    The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from the springs, streams and Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physico-chemical parameters with the permissible limits set by the Pakistan environmental protection agency and the world health organization. Most of the studied parameters were found within their respective permissible limits; however, in some samples the iron and arsenic concentrations exceeded their permissible limits. For the health risk assessment of arsenic, the average daily dose, hazard quotient (HQ) and cancer risk were calculated by using statistical formulas. The values of HQ were found >1 in the samples collected from Jabba and Dubair. The pollution load was also calculated by using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
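    The health-risk quantities named above are commonly computed with the standard USEPA formulas, sketched below. The sample concentration is hypothetical, and the intake rate, body weight, reference dose and slope factor are commonly used adult defaults for arsenic in drinking water, shown for illustration only.

```python
# Hypothetical sample and commonly used adult default exposure parameters.
c_as = 0.015        # arsenic concentration in water, mg/L (hypothetical)
ir = 2.0            # daily water intake, L/day
bw = 72.0           # body weight, kg
rfd = 3.0e-4        # oral reference dose for As, mg/kg/day
sf = 1.5            # oral cancer slope factor for As, (mg/kg/day)^-1

add = c_as * ir / bw            # average daily dose, mg/kg/day
hq = add / rfd                  # hazard quotient; HQ > 1 flags non-carcinogenic risk
cancer_risk = add * sf          # lifetime excess cancer risk estimate
print(f"ADD={add:.2e} mg/kg/day  HQ={hq:.2f}  cancer risk={cancer_risk:.1e}")
```

    With these inputs the hazard quotient exceeds 1, which is the criterion the study uses to flag samples such as those from Jabba and Dubair.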

  16. The problem of sexual imbalance and techniques of the self in the Diagnostic and Statistical Manual of Mental Disorders.

    Science.gov (United States)

    Flore, Jacinthe

    2016-09-01

    This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination. © The Author(s) 2016.

  17. Dynamic re-weighted total variation technique and statistical iterative reconstruction method for x-ray CT metal artifact reduction

    Science.gov (United States)

    Peng, Chengtao; Qiu, Bensheng; Zhang, Cheng; Ma, Changyu; Yuan, Gang; Li, Ming

    2017-07-01

    Over the years, X-ray computed tomography (CT) has been successfully used in clinical diagnosis. However, when the body of the patient to be examined contains metal objects, the reconstructed image is polluted by severe metal artifacts, which affect the doctor's diagnosis of disease. In this work, we proposed a dynamic re-weighted total variation (DRWTV) technique combined with the statistical iterative reconstruction (SIR) method to reduce the artifacts. The DRWTV method is based on the total variation (TV) and re-weighted total variation (RWTV) techniques, but it provides a sparser representation than TV and protects tissue details better than RWTV. Besides, the DRWTV can suppress artifacts and noise, and the SIR convergence speed is also accelerated. The performance of the algorithm is tested on both a simulated phantom dataset and a clinical dataset, namely a teeth phantom with two metal implants and a skull with three metal implants, respectively. The proposed algorithm (SIR-DRWTV) is compared with two traditional iterative algorithms, SIR and SIR constrained by RWTV regularization (SIR-RWTV). The results show that the proposed algorithm has the best performance in reducing metal artifacts and protecting tissue details.

  18. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever-increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, and building tools for data analysis. Statistical datasets curated by National

  19. Application of Statistical Downscaling Techniques to Predict Rainfall and Its Spatial Analysis Over Subansiri River Basin of Assam, India

    Science.gov (United States)

    Barman, S.; Bhattacharjya, R. K.

    2017-12-01

    The River Subansiri is the major north-bank tributary of the river Brahmaputra. It originates from the range of Himalayas beyond the Great Himalayan range at an altitude of approximately 5340 m. The Subansiri basin extends from tropical to temperate zones and hence exhibits a great diversity in rainfall characteristics. In the Northern and Central Himalayan tracts, precipitation is scarce on account of high altitudes. On the other hand, the Southeast part of the Subansiri basin, comprising the sub-Himalayan and plain tracts in Arunachal Pradesh and Assam, lies in the tropics. Due to the Northeast as well as the Southwest monsoon, precipitation occurs in this region in abundant quantities; the Southwest monsoon in particular causes very heavy precipitation over the entire Subansiri basin from May to October. In this study, the rainfall over the Subansiri basin has been studied at 24 different locations by multiple linear and non-linear regression based statistical downscaling techniques and by an Artificial Neural Network based model. APHRODITE's gridded rainfall data of 0.25° × 0.25° resolution and climatic parameters of the HadCM3 GCM of resolution 2.5° × 3.75° (latitude by longitude) have been used in this study. It has been found that the multiple non-linear regression based statistical downscaling technique outperformed the other techniques. Using this method, the future rainfall pattern over the Subansiri basin has been analyzed up to the year 2099 for four different time periods, viz., 2020-39, 2040-59, 2060-79, and 2080-99, at all 24 locations. On the basis of historical rainfall, the months have been categorized as wet months, months with moderate rainfall, and dry months. The spatial changes in rainfall patterns for all three types of months have also been analyzed over the basin. A potential decrease of rainfall in the wet and moderate-rainfall months and an increase of rainfall in the dry months are observed in the future rainfall pattern of the Subansiri basin.
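
    A minimal sketch of the regression-based downscaling step, using synthetic stand-ins for the GCM predictors and station rainfall (the actual study used HadCM3 outputs and APHRODITE gridded data); ordinary least squares links the coarse-resolution predictors to local rainfall:

```python
import numpy as np

# Synthetic stand-ins for coarse GCM predictors (e.g. pressure, humidity,
# temperature) and observed station rainfall; values are illustrative only.
rng = np.random.default_rng(42)
n_months = 240
predictors = rng.standard_normal((n_months, 3))
true_coefs = np.array([2.0, -1.0, 0.5])
rainfall = 10.0 + predictors @ true_coefs + 0.1 * rng.standard_normal(n_months)

# Multiple linear regression downscaling: fit an intercept plus one
# coefficient per predictor by least squares.
X = np.column_stack([np.ones(n_months), predictors])
beta, *_ = np.linalg.lstsq(X, rainfall, rcond=None)

# The fitted model is then applied to future GCM output to project rainfall.
future_month = np.array([1.0, 0.5, -0.2, 0.3])   # [1, p1, p2, p3]
projected = future_month @ beta
```

    A non-linear variant, as preferred in the study, would replace the design matrix with polynomial or other transformed predictor terms; the fitting step is otherwise identical.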

  20. Sophisticated Search Capabilities in the ADS Abstract Service

    Science.gov (United States)

    Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Henneken, E.; Kurtz, M. J.; Murray, S. S.

    2003-12-01

    The ADS provides access to over 940,000 references from astronomy and planetary sciences publications and 1.5 million records from physics publications. It is funded by NASA and provides free access to these references, as well as to 2.4 million scanned pages from the astronomical literature. These include most of the major astronomy and several planetary sciences journals, as well as many historical observatory publications. The references now include the abstracts from all volumes of the Journal of Geophysical Research (JGR) since the beginning of 2002, which we receive on a regular basis. The Kluwer journal Solar Physics has been scanned back to volume 1 and is available through the ADS. We have extracted the reference lists from this and many other journals and included them in the reference and citation database of the ADS. We have recently scanned Earth, Moon and Planets, another Kluwer journal, and will scan further Kluwer journals in the future; we plan to extract references from these journals as well. The ADS has many sophisticated query features that allow the user to formulate complex queries. Using results lists to get further information about the selected articles provides the means to quickly find important and relevant articles in the database. Three advanced feedback queries are available from the bottom of the ADS results list (in addition to the regular feedback queries already available from the abstract page and from the bottom of the results list): 1. Get reference list for selected articles: This query returns all known references for the selected articles (or for all articles in the first list). The resulting list is ranked according to how often each article is referred to, showing the most referenced articles in the field of study that created the first list; these are presumably the most important articles in that field. 2. Get citation list for selected articles: This returns all known articles

  1. Water quality assessment and apportionment of pollution sources of Gomti river (India) using multivariate statistical techniques--a case study

    International Nuclear Information System (INIS)

    Singh, Kunwar P.; Malik, Amrita; Sinha, Sarita

    2005-01-01

    Multivariate statistical techniques, such as cluster analysis (CA), factor analysis (FA), principal component analysis (PCA) and discriminant analysis (DA), were applied to a data set on water quality of the Gomti river (India) generated during three years (1999-2001) of monitoring at eight different sites for 34 parameters (9792 observations). This study presents the usefulness of multivariate statistical techniques for the evaluation and interpretation of large, complex water-quality data sets and the apportionment of pollution sources/factors, with a view to obtaining better information about the water quality and designing a monitoring network for effective management of water resources. Three significant groups of sampling sites, upper catchment (UC), middle catchment (MC) and lower catchment (LC), were obtained through CA on the basis of similarity between them. FA/PCA applied to the data sets pertaining to the three catchment regions of the river resulted in seven, seven and six latent factors, respectively, responsible for the data structure, explaining 74.3, 73.6 and 81.4% of the total variance of the respective data sets. These included a trace-metals group (leaching from soil and industrial waste disposal sites), an organic pollution group (municipal and industrial effluents), a nutrients group (agricultural runoff), and an alkalinity, hardness, EC and solids group (soil leaching and runoff processes). DA showed the best results for data reduction and pattern recognition in both temporal and spatial analysis. It rendered five parameters (temperature, total alkalinity, Cl, Na and K) affording more than 94% correct assignations in temporal analysis, and 10 parameters (river discharge, pH, BOD, Cl, F, PO4, NH4-N, NO3-N, TKN and Zn) affording 97% correct assignations in spatial analysis of the three different regions in the basin. Thus, DA allowed reduction in dimensionality of the large data set, delineating a few indicator parameters responsible for large variations in water quality. Further
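
    A minimal PCA sketch in the spirit of the FA/PCA step above, run on a synthetic stand-in for the water-quality matrix (the names, sizes, and data are illustrative, not the study's):

```python
import numpy as np

# Synthetic samples x parameters matrix driven by two hidden pollution
# factors, standing in for the study's 34-parameter monitoring data.
rng = np.random.default_rng(1)
latent = rng.standard_normal((60, 2))                 # two hidden factors
loadings = rng.standard_normal((2, 6))                # factor loadings
data = latent @ loadings + 0.1 * rng.standard_normal((60, 6))

# PCA on the correlation matrix, i.e. on standardized variables.
z = (data - data.mean(axis=0)) / data.std(axis=0)
corr = (z.T @ z) / len(z)
eigvals = np.linalg.eigvalsh(corr)[::-1]              # eigenvalues, descending
explained = eigvals / eigvals.sum()                   # variance fraction per PC

# With two latent factors, the first two components dominate the variance,
# mirroring how FA/PCA condenses many parameters into a few latent factors.
```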

  2. Sophisticated visualization algorithms for analysis of multidimensional experimental nuclear spectra

    International Nuclear Information System (INIS)

    Morhac, M.; Kliman, J.; Matousek, V.; Turzo, I.

    2004-01-01

    This paper describes graphical models for the visualization of 2-, 3-, and 4-dimensional scalar data used in the nuclear data acquisition, processing and visualization system developed at the Institute of Physics, Slovak Academy of Sciences. It focuses on the presentation of nuclear spectra (histograms); however, it can also be applied successfully to the visualization of arrays of other data types. In the paper we present conventional as well as newly developed surface and volume rendering visualization techniques. (Authors)

  3. Particle tracking in sophisticated CAD models for simulation purposes

    International Nuclear Information System (INIS)

    Sulkimo, J.; Vuoskoski, J.

    1995-01-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT. (orig.)

  4. Particle tracking in sophisticated CAD models for simulation purposes

    Science.gov (United States)

    Sulkimo, J.; Vuoskoski, J.

    1996-02-01

    The transfer of physics detector models from computer aided design systems to physics simulation packages like GEANT suffers from certain limitations. In addition, GEANT is not able to perform particle tracking in CAD models. We describe an application which is able to perform particle tracking in boundary models constructed in CAD systems. The transfer file format used is the new international standard, STEP. The design and implementation of the application was carried out using object-oriented techniques. It will be integrated in the future object-oriented version of GEANT.

  5. Characterization of stress degradation products of benazepril by using sophisticated hyphenated techniques.

    Science.gov (United States)

    Narayanam, Mallikarjun; Sahu, Archana; Singh, Saranjit

    2013-01-04

    Benazepril, an anti-hypertensive drug, was subjected to forced degradation studies. The drug was unstable under hydrolytic conditions, yielding benazeprilat, which is a known major degradation product (DP) and an active metabolite. It also underwent photochemical degradation under acidic and neutral pH conditions, resulting in multiple minor DPs. The products were separated on a reversed-phase (C18) column in gradient mode and subjected to LC-MS and LC-NMR studies. Initially, a comprehensive mass fragmentation pathway of the drug was established with the support of high-resolution mass spectrometric (HR-MS) and multistage tandem mass spectrometric (MS(n)) data. The DPs were also subjected to LC-MS/TOF studies to obtain their accurate masses. In addition, on-line H/D exchange data were obtained to ascertain the number of exchangeable hydrogens in each molecule. LC-(1)H NMR and LC-2D NMR data were additionally acquired in a fraction-loop mode. Together, this information was successfully employed for the characterization of all the DPs, and a complete degradation pathway of the drug was established. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  7. More Sophisticated Fits of the Orbits of Haumea's Interacting Moons

    Science.gov (United States)

    Oldroyd, William Jared; Ragozzine, Darin; Porter, Simon

    2018-04-01

    Since the discovery of Haumea's moons, it has been a challenge to model the orbits of its moons, Hi’iaka and Namaka. With many precision HST observations, Ragozzine & Brown 2009 succeeded in calculating a three-point-mass model, which was essential because Keplerian orbits were not a statistically acceptable fit. New data obtained in 2010 could be fit by adding a J2 and spin pole to Haumea, but new data from 2015 were far from the predicted locations, even after an extensive exploration using Bayesian Markov Chain Monte Carlo methods (using emcee). Here we report on continued investigations as to why our model cannot fit the full 10-year baseline of data. We note that ignoring Haumea and instead examining the relative motion of the two moons in the Hi’iaka-centered frame leads to adequate fits for the data. This suggests there are additional parameters connected to Haumea that will be required in a full model. These parameters are potentially related to photocenter-barycenter shifts, which could be significant enough to affect the fitting process; they are unlikely to be caused by the newly discovered ring (Ortiz et al. 2017) or by unknown satellites (Burkhart et al. 2016). Additionally, we have developed a new SPIN+N-bodY integrator called SPINNY that self-consistently calculates the interactions between n quadrupoles and is designed to test the importance of other possible effects (Haumea C22, satellite torques on the spin pole, the Sun, etc.) on our astrometric fits. By correctly determining the orbits of Haumea’s satellites, we develop a better understanding of the physical properties of each of the objects, with implications for the formation of Haumea, its moons, and its collisional family.

  8. Lexical Sophistication as a Multidimensional Phenomenon: Relations to Second Language Lexical Proficiency, Development, and Writing Quality

    Science.gov (United States)

    Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher

    2018-01-01

    This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…

  9. The predictors of economic sophistication: media, interpersonal communication and negative economic experiences

    NARCIS (Netherlands)

    Kalogeropoulos, A.; Albæk, E.; de Vreese, C.H.; van Dalen, A.

    2015-01-01

    In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

  10. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    Science.gov (United States)

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

  11. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    Science.gov (United States)

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  12. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees.

    Science.gov (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A

    2018-02-01

    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE-model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has rarely been applied in the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses not already explained by the risk adjusters in this RE-model. The interaction terms identified were used as additional risk adjusters in the RE-model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency, or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
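
    A toy sketch of why regression trees can surface interaction terms that a main-effects model misses; the data and parameters are hypothetical, this is not the authors' RE-model, and scikit-learn availability is assumed:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

# Hypothetical data: two binary risk adjusters whose *interaction* drives
# expenses (costs are high only when both flags are set).
rng = np.random.default_rng(7)
X = rng.integers(0, 2, size=(1000, 2)).astype(float)
y = 100.0 * X[:, 0] * X[:, 1] + rng.normal(0.0, 1.0, 1000)

# A main-effects-only linear model cannot represent the interaction ...
linear_r2 = LinearRegression().fit(X, y).score(X, y)

# ... while a depth-2 regression tree splits on one adjuster and then the
# other, recovering the interaction almost exactly.
tree_r2 = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y).score(X, y)
```

    Each root-to-leaf path of the fitted tree corresponds to a candidate interaction term that could then be added to the linear RE-model as an extra risk adjuster.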

  13. The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.

    Science.gov (United States)

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L

    2017-06-01

    To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey from January 2014 to July 2015 to NHs in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) across 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location, and mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimension-domain combinations. Least-squares means and Tukey's method were used for multiple comparisons. The survey yielded 815/1,799 responses (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistically significant differences in facility ITS occurred between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, the results are encouraging, as ITS in the other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.

  14. The use of adaptive statistical iterative reconstruction (ASiR) technique in evaluation of patients with cervical spine trauma: impact on radiation dose reduction and image quality.

    Science.gov (United States)

    Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan

    2016-01-01

    The aim of this study was to evaluate the impact of the adaptive statistical iterative reconstruction (ASiR) technique on image quality and radiation dose reduction. The comparison was made with the traditional filtered back-projection (FBP) technique. We retrospectively reviewed 78 patients who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using the traditional FBP technique and the remaining 30 patients using the ASiR technique. The patient demographics, radiation dose, and objective image signal and noise were recorded, while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. We found that the ASiR technique was able to reduce the volume CT dose index, dose-length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) or signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group, but not statistically significantly so. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. The use of the ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of the cervical spine without degrading image quality. The present study highlights that the ASiR technique is extremely helpful in reducing patient radiation exposure while maintaining image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions.

  15. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on their application to problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives of a given field. Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  16. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.
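
    The first three steps of this sequence can be sketched with standard SciPy routines on a synthetic monotonic (but non-linear) relation; the data and the binning into equal-count classes are illustrative assumptions:

```python
import numpy as np
from scipy import stats

# Synthetic monotonic but non-linear input/output relation, standing in for
# a Monte Carlo sensitivity-analysis scatterplot.
rng = np.random.default_rng(3)
x = rng.uniform(0.0, 4.0, 500)
y = np.exp(x) + rng.normal(0.0, 1.0, 500)

pearson, _ = stats.pearsonr(x, y)     # step (i): linear relationship
spearman, _ = stats.spearmanr(x, y)   # step (ii): monotonic relationship

# Step (iii): trend in central tendency across equal-count classes of x,
# tested with the Kruskal-Wallis statistic.
classes = np.array_split(np.argsort(x), 5)
kw_stat, kw_p = stats.kruskal(*[y[c] for c in classes])

# The rank correlation captures the monotonic trend better than the linear
# correlation here, and Kruskal-Wallis flags the trend in central tendency.
```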

  17. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from the analyses include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples.
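
    For readers unfamiliar with Latin hypercube sampling, a minimal sketch of how such samples are generated (illustrative, not the authors' implementation): each of the n strata per dimension receives exactly one point, so independent replicates cover the input space evenly and can be used to check the stability of which inputs are flagged as important.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Latin hypercube sample of n points in [0,1]^d: each dimension is
    split into n equal strata and each stratum receives exactly one point."""
    u = rng.uniform(size=(n, d))                          # position within each stratum
    strata = np.array([rng.permutation(n) for _ in range(d)]).T
    return (strata + u) / n

rng = np.random.default_rng(0)
sample = latin_hypercube(10, 3, rng)

# Calling latin_hypercube(10, 3, rng) again yields an independent replicate,
# as used above to assess the stability of sensitivity-analysis results.
```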

  18. Are conventional statistical techniques exhaustive for defining metal background concentrations in harbour sediments? A case study: The Coastal Area of Bari (Southeast Italy).

    Science.gov (United States)

    Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio

    2015-11-01

    Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through conventional statistical techniques (such as linear regression analysis, construction of cumulative frequency curves, and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with the statistical equations), it should nevertheless be avoided, because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
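
    A minimal sketch of the iterative 2σ technique mentioned above, run on synthetic concentrations (the values, units, and convergence tolerance are illustrative assumptions): observations outside mean ± 2σ are discarded repeatedly until the mean stabilizes, and the surviving mean estimates the background value.

```python
import numpy as np

def iterative_2sigma(values, tol=1e-6, max_iter=100):
    """Iterative 2-sigma technique: repeatedly discard values outside
    mean +/- 2*std until the mean stabilizes; the surviving mean is the
    background estimate."""
    v = np.asarray(values, dtype=float)
    for _ in range(max_iter):
        m, s = v.mean(), v.std()
        kept = v[np.abs(v - m) <= 2 * s]
        if len(kept) == len(v) or abs(kept.mean() - m) < tol:
            break
        v = kept
    return v.mean()

# Background population around 20 mg/kg plus a few contaminated outliers:
# the iterative trimming removes the outliers and recovers the background.
rng = np.random.default_rng(5)
background = rng.normal(20.0, 2.0, 200)
samples = np.concatenate([background, [80.0, 95.0, 120.0]])
estimate = iterative_2sigma(samples)
```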

  19. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  20. Application of a Multivariate Statistical Technique to Interpreting Data from Multichannel Equipment for the Example of the KLEM Spectrometer

    International Nuclear Information System (INIS)

    Podorozhnyi, D.M.; Postnikov, E.B.; Sveshnikova, L.G.; Turundaevsky, A.N.

    2005-01-01

    A multivariate statistical procedure for solving problems of estimating physical parameters on the basis of data from measurements with multichannel equipment is described. Within the multivariate procedure, an algorithm is constructed for estimating the energy of primary cosmic rays and the exponent of their power-law spectrum. These are investigated using the KLEM spectrometer (NUCLEON project) as a specific example of measuring equipment. The results of computer experiments simulating the operation of the multivariate procedure for this equipment are given, and the proposed approach is compared with the one-parameter approach presently used in data processing.

  1. Evaluation of a dimension-reduction-based statistical technique for Temperature, Water Vapour and Ozone retrievals from IASI radiances

    Science.gov (United States)

    Amato, Umberto; Antoniadis, Anestis; De Feis, Italia; Masiello, Guido; Matricardi, Marco; Serio, Carmine

    2009-03-01

    Remote sensing of the atmosphere is changing rapidly thanks to the development of high-spectral-resolution infrared space-borne sensors. The aim is to provide ever more accurate information on the lower atmosphere, as requested by the World Meteorological Organization (WMO), to improve the reliability and time span of weather forecasts and Earth monitoring. In this paper we show the results we have obtained on a set of Infrared Atmospheric Sounding Interferometer (IASI) observations using a new statistical strategy based on dimension reduction. Retrievals have been compared to time-space colocated ECMWF analyses for temperature, water vapour and ozone.

  2. Influence of manufacturing parameters on the strength of PLA parts using Layered Manufacturing technique: A statistical approach

    Science.gov (United States)

    Jaya Christiyan, K. G.; Chandrasekhar, U.; Mathivanan, N. Rajesh; Venkateswarlu, K.

    2018-02-01

    3D printing was successfully used to fabricate samples of Polylactic Acid (PLA). Processing parameters such as lay-up speed, lay-up thickness, and printing nozzle diameter were varied. All samples were tested for flexural strength using a three-point load test. A statistical mathematical model was developed to correlate the processing parameters with flexural strength. The results clearly demonstrated that lay-up thickness and nozzle diameter influenced flexural strength significantly, whereas lay-up speed hardly influenced it.

  3. Design and performance characteristics of solar adsorption refrigeration system using parabolic trough collector: Experimental and statistical optimization technique

    International Nuclear Information System (INIS)

    Abu-Hamdeh, Nidal H.; Alnefaie, Khaled A.; Almitani, Khalid H.

    2013-01-01

    Highlights: • The success of using olive waste/methanol as an adsorbent/adsorbate pair. • The experimental gross cycle coefficient of performance obtained was COPa = 0.75. • Optimization showed that expanding the adsorbent mass within a certain range increases the COP. • The statistical optimization led to an optimum tank volume between 0.2 and 0.3 m³. • Increasing the collector area within a certain range increased the COP. - Abstract: The current work demonstrates a developed model of a solar adsorption refrigeration system with specific requirements and specifications. The scheme can be employed as a refrigerator and cooler unit suitable for remote areas. The unit runs on a parabolic trough solar collector (PTC) and uses olive waste as adsorbent with methanol as adsorbate. Cooling production, COP (coefficient of performance), and COPa (cycle gross coefficient of performance) were used to assess the system performance. The system's optimum design parameters in this study were arrived at through statistical and experimental methods. The lowest temperature attained in the refrigerated space was 4 °C, with a corresponding ambient temperature of 27 °C. The temperature started to decrease steadily at 20:30, when the actual cooling started, until it reached 4 °C at 01:30 the next day, after which it rose again. The highest COPa obtained was 0.75.

  4. Moral foundations and political attitudes: The moderating role of political sophistication.

    Science.gov (United States)

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.

  5. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    Directory of Open Access Journals (Sweden)

    Marie Devaine

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.
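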

  6. Model-based iterative reconstruction technique for radiation dose reduction in chest CT: comparison with the adaptive statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 ± 3.00) than low-dose ASIR (49.24 ± 9.11, P < 0.01) and reference-dose ASIR images (24.93 ± 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)
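The pair-wise Student's t-test used to compare objective image noise between reconstructions of the same patients reduces to a one-sample test on the per-patient differences. A self-contained sketch with hypothetical noise readings (not the study's data):

```python
# Paired Student's t-test: test whether per-patient differences between two
# reconstructions have mean zero. Toy noise readings for five patients.
import math

def paired_t(a, b):
    """Return the t statistic for paired samples a and b (n-1 df)."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical objective-noise readings, MBIR vs. low-dose ASIR
noise_mbir = [16.1, 17.5, 16.8, 17.0, 16.4]
noise_asir = [48.0, 50.2, 49.5, 48.8, 49.7]

t = paired_t(noise_mbir, noise_asir)
print(t < 0)  # True: MBIR noise is systematically lower
```

The t statistic would then be compared against the t distribution with n-1 degrees of freedom to obtain the P value.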

  7. Model-based iterative reconstruction technique for radiation dose reduction in chest CT: comparison with the adaptive statistical iterative reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Bunkyo-ku, Tokyo (Japan)

    2012-08-15

    To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 {+-} 3.00) than low-dose ASIR (49.24 {+-} 9.11, P < 0.01) and reference-dose ASIR images (24.93 {+-} 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)

  8. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of the knowledge about the cosmos, the scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approach. (Edited abstract).

  9. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques

    International Nuclear Information System (INIS)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-01-01

    Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock–water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. - Highlights: • Hydrochemical investigations were carried out in Dhurma aquifer in Saudi Arabia. • The factors controlling potential groundwater pollution in an arid region were studied. • Chemical and statistical analyses are integrated to assess these factors. • Five main factors were extracted, which explain >77% of the total data variance. • The chemical characteristics of the groundwater were influenced by rock–water interactions
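The ">77% of total variance" figure comes from the standard factor-analysis bookkeeping: with standardized variables the total variance equals the number of variables, so the share explained is the sum of the retained factors' eigenvalues over that count. A sketch with hypothetical eigenvalues (the paper does not list them):

```python
# Percentage of total variance explained by retained factors. For standardized
# data, total variance = number of variables. Eigenvalues below are invented
# for illustration: five factors over 21 physicochemical variables.

def explained_variance_pct(eigenvalues, n_variables):
    return sum(eigenvalues) / n_variables * 100.0

eigs = [7.1, 4.0, 2.5, 1.7, 1.1]  # hypothetical eigenvalues of five factors
pct = explained_variance_pct(eigs, 21)
print(pct > 77)  # True for these illustrative values
```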

  10. Evaluation of image quality and radiation dose by adaptive statistical iterative reconstruction technique level for chest CT examination.

    Science.gov (United States)

    Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol

    2013-12-01

    The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in a chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at ASIR levels of 0-50 %, and then the noise power spectrum (NPS), signal and noise, and the degree of distortion in terms of peak signal-to-noise ratio (PSNR) and root-mean-square error (RMSE) were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, the resolution, latitude and degree of distortion of five lesions in the chest phantom were evaluated and the results compiled statistically. The NPS value decreased as the frequency increased. The lowest noise and deviation occurred at the 20 % ASIR level (mean 126.15 ± 22.21). Regarding the degree of distortion, the signal-to-noise ratio and PSNR were highest at the 20 % ASIR level, at 31.0 and 41.52, while the maximum absolute error and RMSE were lowest, at 11.2 and 16. In the ACR phantom study, all ASIR levels were within the acceptable allowance of the guidelines. The 20 % ASIR level also performed best in the qualitative evaluation of the five chest-phantom lesions, with a resolution score of 4.3, latitude of 3.47 and degree of distortion of 4.25. The 20 % ASIR level thus proved best in all experiments: noise and distortion evaluation using ImageJ and qualitative evaluation of five lesions of a chest phantom. Therefore, optimal images as well as a reduced radiation dose would be obtained when the 20 % ASIR level is applied in thoracic CT.
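The two distortion measures used to rank ASIR levels, RMSE and PSNR, are standard image-fidelity metrics and easy to state precisely. A minimal sketch on tiny flattened toy "images" (illustrative values, not the phantom data):

```python
# RMSE and PSNR between a reference image and a reconstruction, both treated
# as flat lists of pixel values. Higher PSNR / lower RMSE = less distortion.
import math

def rmse(ref, img):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(ref, img)) / len(ref))

def psnr(ref, img, max_val=255.0):
    e = rmse(ref, img)
    return float("inf") if e == 0 else 20 * math.log10(max_val / e)

ref = [100, 120, 140, 160]
img = [102, 118, 141, 158]  # small reconstruction errors

print(round(rmse(ref, img), 3), round(psnr(ref, img), 2))
```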

  11. Establishing structure-property correlations and classification of base oils using statistical techniques and artificial neural networks

    International Nuclear Information System (INIS)

    Kapur, G.S.; Sastry, M.I.S.; Jaiswal, A.K.; Sarpal, A.S.

    2004-01-01

    The present paper describes various classification techniques like cluster analysis, principal component (PC)/factor analysis to classify different types of base stocks. The API classification of base oils (Group I-III) has been compared to a more detailed NMR derived chemical compositional and molecular structural parameters based classification in order to point out the similarities of the base oils in the same group and the differences between the oils placed in different groups. The detailed compositional parameters have been generated using 1H and 13C nuclear magnetic resonance (NMR) spectroscopic methods. Further, oxidation stability, measured in terms of rotating bomb oxidation test (RBOT) life, of non-conventional base stocks and their blends with conventional base stocks, has been quantitatively correlated with their 1H NMR and elemental (sulphur and nitrogen) data with the help of multiple linear regression (MLR) and artificial neural networks (ANN) techniques. The MLR based model developed using NMR and elemental data showed a high correlation between the 'measured' and 'estimated' RBOT values for both training (R=0.859) and validation (R=0.880) data sets. The ANN based model, developed using fewer number of input variables (only 1H NMR data) also showed high correlation between the 'measured' and 'estimated' RBOT values for training (R=0.881), validation (R=0.860) and test (R=0.955) data sets
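The reported R values are Pearson correlations between measured and model-estimated RBOT life. A single-predictor least-squares sketch of that workflow, on synthetic numbers chosen only for illustration:

```python
# Fit a least-squares line from one spectroscopic index to RBOT life, then
# report Pearson R between measured and fitted values (the paper uses multiple
# predictors; this one-predictor version shows the same bookkeeping).

def fit_line(xs, ys):
    """Return (intercept, slope) of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def pearson_r(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

nmr_index = [0.10, 0.15, 0.22, 0.30, 0.41, 0.50]   # hypothetical NMR-derived index
rbot_life = [180, 210, 260, 300, 370, 420]          # minutes, hypothetical

a, b = fit_line(nmr_index, rbot_life)
estimated = [a + b * x for x in nmr_index]
r = pearson_r(rbot_life, estimated)
print(r > 0.9)  # near-linear toy data gives R close to 1
```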

  12. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight

    Science.gov (United States)

    Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.

    2013-01-01

    Children born very low birth weight (VLBW; birth weight below 1,500 g) are at increased risk for difficulties in the development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish speaking Hispanic, English speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed by play sophistication. 
Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as

  13. A statistical forecast model using the time-scale decomposition technique to predict rainfall during flood period over the middle and lower reaches of the Yangtze River Valley

    Science.gov (United States)

    Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao

    2018-04-01

    In this paper, a statistical forecast model using the time-scale decomposition method is established for seasonal prediction of the rainfall during flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, the interannual component with the period less than 8 years, the interdecadal component with the period from 8 to 30 years, and the component with the period larger than 30 years. Then, the predictors are selected for the three time-scale components of FPR through the correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. The hindcast of FPR during 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than the model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
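A time-scale decomposition of this kind can be sketched with running means: smoothing removes the fast variability, and differencing the original series against successively longer smooths yields components that add back exactly to the original. The filter below is a simple moving-average stand-in; the paper's actual filtering method may differ:

```python
# Split a yearly series into an interannual (<~8 yr), an 8-30 yr, and a slow
# (>~30 yr) component using running means. By construction the three
# components sum exactly to the original series. Illustrative sketch only.

def running_mean(x, window):
    """Centered moving average with shrinking windows at the edges."""
    half = window // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def decompose(x):
    smooth8 = running_mean(x, 9)          # removes periods shorter than ~8 yr
    smooth30 = running_mean(smooth8, 31)  # keeps only the slowest variation
    interannual = [a - b for a, b in zip(x, smooth8)]
    interdecadal = [a - b for a, b in zip(smooth8, smooth30)]
    return interannual, interdecadal, smooth30

series = [float((i * 7) % 13) for i in range(60)]  # arbitrary toy rainfall series
ia, idec, slow = decompose(series)
recon = [a + b + c for a, b, c in zip(ia, idec, slow)]
print(max(abs(r - s) for r, s in zip(recon, series)) < 1e-9)  # True: exact sum
```

Each component would then get its own multiple-linear-regression model from predictors selected by correlation analysis, and the three forecasts are summed.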

  14. Cognitive ability rivals the effect of political sophistication on ideological voting

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig

    2016-01-01

    This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals the effect of the traditionally strongest predictor of ideological voting, political sophistication. Furthermore, the results...... are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability remains however and is not explained by differences in education or Openness to experience either. The implications of these results for democratic theory are discussed....

  15. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Directory of Open Access Journals (Sweden)

    Adrion Christine

    2012-09-01

    provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. Conclusions The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
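The mean logarithmic score used here for model ranking is just the average log predictive density assigned to the observed counts; the better-calibrated model scores higher (less negatively). A self-contained sketch for two candidate Poisson predictive means, on toy counts rather than the trial's data:

```python
# Mean logarithmic score for count models: average log predictive density of
# the observed counts under each candidate model; higher is better.
import math

def poisson_logpmf(k, lam):
    """log P(K = k) for a Poisson(lam) distribution."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def mean_log_score(counts, lam):
    return sum(poisson_logpmf(k, lam) for k in counts) / len(counts)

counts = [2, 3, 2, 4, 3, 2, 3]            # toy vertigo-attack counts
score_good = mean_log_score(counts, 2.7)  # predictive mean close to the data
score_bad = mean_log_score(counts, 9.0)   # badly mis-calibrated predictive mean

print(score_good > score_bad)  # True: the better-calibrated model wins
```

In the full Bayesian workflow the predictive density would come from the posterior predictive distribution of each candidate mixed model rather than a fixed Poisson mean.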

  16. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: an example from a vertigo phase III study with longitudinal count data as primary endpoint.

    Science.gov (United States)

    Adrion, Christine; Mansmann, Ulrich

    2012-09-10

    within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.

  17. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques.

    Science.gov (United States)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-10-01

    Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Paediatric lower limb deformity correction using the Ilizarov technique: a statistical analysis of factors affecting the complication rate.

    Science.gov (United States)

    Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M

    2014-01-01

    Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction. In the literature, information is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. An analysis was carried out on several patient factors that may complicate the treatment or end result, using a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of the treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor for the occurrence of complications. Concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and risk for complications. This study shows that only relative LLD is a predictor of the risk for complications. The additional value of this analysis is the production of a simple graph. Construction of this graph using data of a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
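The graph the authors describe is the fitted logistic curve of complication risk against proportionate LLD. A minimal sketch of that kind of fit, using synthetic outcomes and a plain gradient-descent estimator rather than the study's data or software:

```python
# Logistic regression of complication occurrence on proportionate LLD,
# fitted by plain gradient descent on the logistic log-likelihood.
# All data below are synthetic, for illustration only.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, steps=5000):
    w0, w1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        errs = [sigmoid(w0 + w1 * x) - y for x, y in zip(xs, ys)]
        w0 -= lr * sum(errs) / n
        w1 -= lr * sum(e * x for e, x in zip(errs, xs)) / n
    return w0, w1

lld_pct = [4, 6, 8, 10, 14, 18, 22, 26, 30, 38]  # proportionate LLD, %
complication = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]    # toy outcomes

xs = [x / 100.0 for x in lld_pct]  # rescale predictor for stable descent
w0, w1 = fit_logistic(xs, complication)
risk = [sigmoid(w0 + w1 * x) for x in xs]
print(w1 > 0 and risk[-1] > risk[0])  # True: estimated risk rises with LLD
```

Plotting `risk` against `lld_pct` reproduces the shape of the simple risk graph described in the abstract.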

  19. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

    The aims of our study were to evaluate the effect of application of Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In ASIR group, significantly lower tube current was used compared with FBP group, 550mA (450–600) vs. 650mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was 27% effective radiation dose reduction in the ASIR group compared with FBP group, 4.29mSv (2.84–6.02) vs. 5.84mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ±standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.

  20. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT.

    Science.gov (United States)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-06-01

    The aims of our study were to evaluate the effect of application of Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In ASIR group, significantly lower tube current was used compared with FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was 27% effective radiation dose reduction in the ASIR group compared with FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.

  1. Reducing abdominal CT radiation dose with the adaptive statistical iterative reconstruction technique in children: a feasibility study

    Energy Technology Data Exchange (ETDEWEB)

    Vorona, Gregory A. [The Children's Hospital of Pittsburgh of UPMC, Department of Radiology, Pittsburgh, PA (United States); Allegheny General Hospital, Department of Radiology, Pittsburgh, PA (United States); Ceschin, Rafael C.; Clayton, Barbara L.; Sutcavage, Tom; Tadros, Sameh S.; Panigrahy, Ashok [The Children's Hospital of Pittsburgh of UPMC, Department of Radiology, Pittsburgh, PA (United States)

    2011-09-15

    The use of the adaptive statistical iterative reconstruction (ASIR) algorithm has been shown to reduce radiation doses in adults undergoing abdominal CT studies while preserving image quality. To our knowledge, no studies have been done to validate the use of ASIR in children. To retrospectively evaluate differences in radiation dose and image quality in pediatric CT abdominal studies utilizing 40% ASIR compared with filtered-back projection (FBP). Eleven patients (mean age 8.5 years, range 2-17 years) had separate 40% ASIR and FBP enhanced abdominal CT studies on different days between July 2009 and October 2010. The ASIR studies utilized a 38% mA reduction in addition to our pediatric protocol mAs. Study volume CT dose indexes (CTDIvol) and dose-length products (DLP) were recorded. A consistent representative image was obtained from each study. The images were independently evaluated by two radiologists in a blinded manner for diagnostic utility, image sharpness and image noise. The average CTDIvol and DLP for the 40% ASIR studies were 4.25 mGy and 185.04 mGy-cm, compared with 6.75 mGy and 275.79 mGy-cm for the FBP studies, representing 37% and 33% reductions in both, respectively. The radiologists' assessments of subjective image quality did not demonstrate any significant differences between the ASIR and FBP images. In our experience, the use of 40% ASIR with a 38% decrease in mA lowers the radiation dose for children undergoing enhanced abdominal examinations by an average of 33%, while maintaining diagnostically acceptable images. (orig.)
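The quoted 37% and 33% savings follow directly from the tabulated mean dose indices; the reduction is (reference − reduced) / reference. A quick check:

```python
# Percentage dose reduction from the mean FBP and 40% ASIR values reported
# in the abstract: CTDIvol in mGy, DLP in mGy-cm.

def pct_reduction(reference, reduced):
    return (reference - reduced) / reference * 100.0

ctdi_red = pct_reduction(6.75, 4.25)     # CTDIvol: 6.75 -> 4.25 mGy
dlp_red = pct_reduction(275.79, 185.04)  # DLP: 275.79 -> 185.04 mGy-cm

print(round(ctdi_red), round(dlp_red))  # 37 33, matching the abstract
```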

  2. Reducing abdominal CT radiation dose with the adaptive statistical iterative reconstruction technique in children: a feasibility study

    International Nuclear Information System (INIS)

    Vorona, Gregory A.; Ceschin, Rafael C.; Clayton, Barbara L.; Sutcavage, Tom; Tadros, Sameh S.; Panigrahy, Ashok

    2011-01-01

    The use of the adaptive statistical iterative reconstruction (ASIR) algorithm has been shown to reduce radiation doses in adults undergoing abdominal CT studies while preserving image quality. To our knowledge, no studies have been done to validate the use of ASIR in children. To retrospectively evaluate differences in radiation dose and image quality in pediatric CT abdominal studies utilizing 40% ASIR compared with filtered-back projection (FBP). Eleven patients (mean age 8.5 years, range 2-17 years) had separate 40% ASIR and FBP enhanced abdominal CT studies on different days between July 2009 and October 2010. The ASIR studies utilized a 38% mA reduction in addition to our pediatric protocol mAs. Study volume CT dose indexes (CTDIvol) and dose-length products (DLP) were recorded. A consistent representative image was obtained from each study. The images were independently evaluated by two radiologists in a blinded manner for diagnostic utility, image sharpness and image noise. The average CTDIvol and DLP for the 40% ASIR studies were 4.25 mGy and 185.04 mGy-cm, compared with 6.75 mGy and 275.79 mGy-cm for the FBP studies, representing 37% and 33% reductions in both, respectively. The radiologists' assessments of subjective image quality did not demonstrate any significant differences between the ASIR and FBP images. In our experience, the use of 40% ASIR with a 38% decrease in mA lowers the radiation dose for children undergoing enhanced abdominal examinations by an average of 33%, while maintaining diagnostically acceptable images. (orig.)

  3. Statistical optimization of cell disruption techniques for releasing intracellular X-prolyl dipeptidyl aminopeptidase from Lactococcus lactis spp. lactis.

    Science.gov (United States)

    Üstün-Aytekin, Özlem; Arısoy, Sevda; Aytekin, Ali Özhan; Yıldız, Ece

    2016-03-01

    X-prolyl dipeptidyl aminopeptidase (PepX) is an intracellular enzyme from the Gram-positive bacterium Lactococcus lactis spp. lactis NRRL B-1821, and it has commercial importance. The objective of this study was to compare the effects of several cell disruption methods on the activity of PepX. Statistical optimization methods were performed for two cavitation methods, hydrodynamic (high-pressure homogenization) and acoustic (sonication), to determine the more appropriate disruption method. Two-level factorial design (2FI), with the parameters of number of cycles and pressure, and Box-Behnken design (BBD), with the parameters of cycle, sonication time, and power, were used for the optimization of the high-pressure homogenization and sonication methods, respectively. In addition, disruption methods consisting of lysozyme, bead milling, heat treatment, freeze-thawing, liquid nitrogen, ethylenediaminetetraacetic acid (EDTA), Triton-X, sodium dodecyl sulfate (SDS), chloroform, and antibiotics were performed and compared with the high-pressure homogenization and sonication methods. The optimized values of high-pressure homogenization were one cycle at 130 MPa, providing activity of 114.47 mU ml(-1), while sonication afforded an activity of 145.09 mU ml(-1) at 28 min with 91% power and three cycles. In conclusion, sonication was the more effective disruption method, and its optimal operating parameters were established for the release of the intracellular enzyme from a L. lactis spp. lactis strain, which is a Gram-positive bacterium. Copyright © 2015 Elsevier B.V. All rights reserved.
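A Box-Behnken design like the one used for the three sonication factors follows a standard construction: for every pair of factors, take the four ±1 combinations with all other factors at the centre level, then append centre points. A generator sketch (independent of the paper's software):

```python
# Box-Behnken design generator: edge-midpoint runs for each factor pair plus
# replicated centre points, with factors coded at -1, 0, +1.
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

# Three factors, e.g. cycle, sonication time, power (coded levels)
design = box_behnken(3)
print(len(design))  # 15 runs: 3 pairs x 4 corners + 3 centre points
```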

  4. MODEL APPLICATION MULTIVARIATE ANALYSIS OF STATISTICAL TECHNIQUES PCA AND HCA ASSESSMENT QUESTIONNAIRE ON CUSTOMER SATISFACTION: CASE STUDY IN A METALLURGICAL COMPANY OF METAL CONTAINERS

    Directory of Open Access Journals (Sweden)

    Cláudio Roberto Rosário

    2012-07-01

    Full Text Available The purpose of this research is to improve the practice of customer satisfaction analysis. The article presents a model to analyze the answers of a customer satisfaction evaluation in a systematic way with the aid of multivariate statistical techniques, specifically exploratory analysis with PCA (Principal Component Analysis) combined with HCA (Hierarchical Cluster Analysis). The applicability of the model was evaluated as a tool to assist the company in question in identifying the value chain perceived by the customer when the customer satisfaction questionnaire is applied. With the assistance of multivariate statistical analysis, similar behavior was observed among customers. The model also allowed the company to review the questions of its questionnaires using the degree of correlation between questions, which was not the company's practice before this research.
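    A minimal sketch of the PCA-plus-HCA workflow described above, on synthetic questionnaire data (the respondent count, answer scale, and cluster count are assumptions, and PCA is done here via SVD rather than any particular library routine):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# 40 respondents answering 8 questions on a 1-5 scale (synthetic data).
answers = rng.integers(1, 6, size=(40, 8)).astype(float)

# Standardize columns, then PCA via SVD of the standardized data matrix.
Z = (answers - answers.mean(0)) / answers.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                      # projections onto first two PCs
explained = S[:2]**2 / (S**2).sum()        # variance explained by PC1, PC2

# Ward hierarchical clustering of respondents in PC space.
labels = fcluster(linkage(scores, method="ward"), t=3, criterion="maxclust")

# Correlations between questions, used to flag redundant questionnaire items.
corr = np.corrcoef(answers, rowvar=False)
```

    Highly correlated question pairs in `corr` are candidates for merging or removal, which is the questionnaire-review step the abstract mentions.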

  5. Comparison of NDA and DA measurement techniques for excess plutonium powders at the Hanford Site: Statistical design and heterogeneity testing

    International Nuclear Information System (INIS)

    Welsh, T.L.; McRae, L.P.; Delegard, C.H.; Liebetrau, A.M.; Johnson, W.C.; Theis, W.; Lemaire, R.J.; Xiao, J.

    1995-06-01

    Quantitative physical measurements are an important component of the International Atomic Energy Agency (IAEA) nuclear material safeguards verification regime. In December 1994, IAEA safeguards were initiated on an inventory of excess plutonium powder items at the Plutonium Finishing Plant, Vault 3, on the US Department of Energy's Hanford Site. The material originated from the US nuclear weapons complex. The diversity of the chemical forms and the heterogeneous physical form of this inventory were anticipated to challenge the precision and accuracy of quantitative destructive analytical techniques. A sampling design was used to estimate the degree of heterogeneity of the plutonium content of a variety of inventory items. Plutonium concentration, item net weight, and 240Pu content were among the variables considered in the design. Samples were obtained from randomly selected locations within each item. Each sample was divided into aliquots and analyzed chemically. Operator measurements by calorimetry and IAEA measurements by coincident neutron nondestructive analysis were also performed for the initial physical inventory verification materials and for similar items not yet under IAEA safeguards. The heterogeneity testing confirmed that part of the material is indeed significantly heterogeneous; this means that precautionary measures must be taken to obtain representative samples for destructive analysis. In addition, the sampling variability due to material heterogeneity was found to be comparable with, or greater than, the variability of the operator's calorimetric measurements.
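    The heterogeneity testing described above amounts to asking whether between-location variation within an item exceeds the analytical scatter of repeat measurements; a one-way ANOVA sketch with hypothetical aliquot concentrations (not the Hanford data):

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical Pu concentrations (wt%) of aliquots taken at three
# randomly selected locations within one inventory item.
loc_a = [86.1, 86.3, 86.0]
loc_b = [84.2, 84.5, 84.1]
loc_c = [87.0, 86.8, 87.2]

# One-way ANOVA: does between-location variance exceed within-location
# (analytical) variance?
stat, p = f_oneway(loc_a, loc_b, loc_c)
heterogeneous = p < 0.05
print(f"F = {stat:.1f}, p = {p:.4f}, heterogeneous = {heterogeneous}")
```

    A significant result, as here, would flag the item as heterogeneous and trigger the precautionary multi-location sampling the abstract recommends.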

  6. A scanning electron microscope study and statistical analysis of adipocyte morphology in lipofilling: comparing the effects of harvesting and purification procedures with 2 different techniques.

    Science.gov (United States)

    Rubino, Corrado; Mazzarello, Vittorio; Faenza, Mario; Montella, Andrea; Santanelli, Fabio; Farace, Francesco

    2015-06-01

    The aim of this study was to evaluate the effects on adipocyte morphology of 2 techniques of fat harvesting and of fat purification in lipofilling, considering that the number of viable healthy adipocytes is important for fat survival in recipient areas of lipofilling. Fat harvesting was performed in 10 female patients from the flanks, on one side with a 2-mm Coleman cannula and on the other side with a 3-mm Mercedes cannula. Thirty milliliters of fat tissue from each side was collected and divided into three 10 mL syringes: A, B, and C. The fat inside syringe A was left untreated, the fat in syringe B underwent simple sedimentation, and the fat inside syringe C underwent centrifugation at 3000 rpm for 3 minutes. Each fat graft specimen was processed for examination under a low-vacuum scanning electron microscope. The diameter (μm), the number of adipocytes per square millimeter, and the number of altered adipocytes per square millimeter were evaluated. Untreated specimens harvested with the 2 different techniques were compared first; then sedimented versus centrifuged specimens harvested with the same technique were compared. Statistical analysis was performed using the Wilcoxon signed rank test. The number of adipocytes per square millimeter was statistically higher in specimens harvested with the 3-mm Mercedes cannula (P = 0.0310). The number of altered cells was statistically higher in centrifuged specimens than in sedimented ones using both methods of fat harvesting: P = 0.0080 with the 2-mm Coleman cannula and P = 0.0050 with the 3-mm Mercedes cannula. Alterations in adipocyte morphology consisted of wrinkling of the membrane, opening of pores with leakage of oily material, reduction of cellular diameter, and total collapse of the cellular membrane. Fat harvesting with a 3-mm cannula results in a higher number of adipocytes, and centrifugation of the harvested fat results in a higher number of morphologically altered cells than sedimentation.
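    The paired comparison above relies on the Wilcoxon signed-rank test; a hedged sketch with invented per-patient counts (the study's raw data are not reproduced here):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical adipocytes/mm^2 for the same 10 patients, one flank per cannula.
coleman_2mm  = np.array([110, 95, 102, 120, 98, 105, 111, 99, 108, 101])
mercedes_3mm = np.array([125, 104, 118, 131, 99, 117, 126, 104, 119, 112])

# Paired, non-parametric comparison (two-sided).
stat, p = wilcoxon(coleman_2mm, mercedes_3mm)
print(f"W = {stat}, p = {p:.4f}")
```

    The signed-rank test is appropriate here because the two measurements come from the same patients and no normality assumption is needed for the counts.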

  7. Impact of the adaptive statistical iterative reconstruction technique on image quality in ultra-low-dose CT

    International Nuclear Information System (INIS)

    Xu, Yan; He, Wen; Chen, Hui; Hu, Zhihai; Li, Juan; Zhang, Tingting

    2013-01-01

    Aim: To evaluate the relationship between different noise indices (NIs) and radiation dose and to compare the effect of different reconstruction algorithms for ultra-low-dose chest computed tomography (CT) on image quality improvement and the accuracy of volumetric measurement of ground-glass opacity (GGO) nodules using a phantom study. Materials and methods: An 11 cm thick transverse phantom section with a chest wall, mediastinum, and 14 artificial GGO nodules with known volumes (919.93 ± 64.05 mm³) was constructed. The phantom was scanned on a Discovery CT 750HD scanner with five different NIs (NI = 20, 30, 40, 50, and 60). All data were reconstructed with a 0.625 mm section thickness using the filtered back-projection (FBP), 50% adaptive statistical iterative reconstruction (ASiR), and Veo model-based iterative reconstruction algorithms. Image noise was measured in six regions of interest (ROIs). Nodule volumes were measured using a commercial volumetric software package. The image quality and the volume measurement errors were analysed. Results: Image noise increased dramatically from 30.7 HU at NI 20 to 122.4 HU at NI 60 with FBP reconstruction. Conversely, Veo reconstruction effectively controlled the noise increase, from 9.97 HU at NI 20 to only 15.1 HU at NI 60. Image noise at NI 60 with Veo was even lower (by 50.8%) than that at NI 20 with FBP. The contrast-to-noise ratio (CNR) of Veo at NI 40 was similar to that of FBP at NI 20. All artificial GGO nodules were successfully identified and measured, with an average relative volume measurement error with Veo at NI 60 of 4.24%, comparable to 10.41% with FBP at NI 20. At NI 60, the radiation dose was only one-tenth of that at NI 20. Conclusion: The Veo reconstruction algorithm very effectively reduced image noise compared with conventional FBP reconstruction. Using ultra-low-dose CT scanning and Veo reconstruction, GGOs can be detected and quantified with an acceptable…
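    The image-quality metrics used above (noise as the standard deviation within a ROI, contrast-to-noise ratio, and relative volume error) can be sketched as follows. The image data are synthetic; only the 919.93 mm³ reference volume comes from the text, and the "measured" volume is invented to land near the quoted error range.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic HU patches: background ~40 HU, nodule ~120 HU, Gaussian noise.
background = rng.normal(40, 30, size=(64, 64))
nodule     = rng.normal(120, 30, size=(16, 16))

noise = background.std()                           # image noise (HU)
cnr = (nodule.mean() - background.mean()) / noise  # contrast-to-noise ratio

measured, truth = 959.0, 919.93                    # nodule volumes (mm^3)
rel_error = 100 * abs(measured - truth) / truth    # relative volume error (%)
print(f"noise = {noise:.1f} HU, CNR = {cnr:.2f}, error = {rel_error:.2f}%")
```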

  8. Assessing Epistemic Sophistication by Considering Domain-Specific Absolute and Multiplicistic Beliefs Separately

    Science.gov (United States)

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-01-01

    Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

  9. The Relationship between Logistics Sophistication and Drivers of the Outsourcing of Logistics Activities

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2008-10-01

    Full Text Available A strong link has been established between operational excellence and the degree of sophistication of logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT] and the formalization of logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have been increasingly outsourcing their logistics activities to third-party providers. This paper, based on a survey of large Brazilian shippers, addresses a gap in the literature by investigating the relationship between dimensions of logistics organization sophistication and drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to characteristics of their logistics organization, which may be particularly useful to logistics service providers.

  10. Reacting to Neighborhood Cues?: Political Sophistication Moderates the Effect of Exposure to Immigrants

    DEFF Research Database (Denmark)

    Danckert, Bolette; Dinesen, Peter Thisted; Sønderskov, Kim Mannemar

    2017-01-01

    is founded on politically sophisticated individuals having a greater comprehension of news and other mass-mediated sources, which makes them less likely to rely on neighborhood cues as sources of information relevant for political attitudes. Based on a unique panel data set with fine-grained information...

  11. Sophistic Ethics in the Technical Writing Classroom: Teaching "Nomos," Deliberation, and Action.

    Science.gov (United States)

    Scott, J. Blake

    1995-01-01

    Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

  12. Close to the Clothes: Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  14. Lexical Complexity Development from Dynamic Systems Theory Perspective: Lexical Density, Diversity, and Sophistication

    Directory of Open Access Journals (Sweden)

    Reza Kalantari

    2017-10-01

    Full Text Available This longitudinal case study explored Iranian EFL learners’ lexical complexity (LC) through the lenses of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to only lexical sophistication. Positive and significant relationships were found between time and mean values in the Academic Word List and Beyond-2000 indices as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among LC indices indicated that lexical density enjoyed positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that the DST perspective provides a viable foundation for analyzing lexical complexity.
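    The three dimensions of lexical complexity named above can be illustrated with toy token counts; the word lists below are tiny stand-ins, not the Coh-Metrix, Lexical Complexity Analyzer, or Vocabprofile indices themselves:

```python
# Toy stand-ins for a content-word list and an academic word list (AWL).
CONTENT_WORDS = {"learners", "wrote", "essays", "improved", "vocabulary"}
ACADEMIC_WORDS = {"analyse", "hypothesis", "significant"}

def lexical_indices(tokens):
    """Return (density, diversity, sophistication) for a token list.

    density        = content words / total tokens
    diversity      = types / tokens (type-token ratio)
    sophistication = academic-list words / total tokens
    """
    types = set(tokens)
    density = sum(t in CONTENT_WORDS for t in tokens) / len(tokens)
    diversity = len(types) / len(tokens)
    sophistication = sum(t in ACADEMIC_WORDS for t in tokens) / len(tokens)
    return density, diversity, sophistication

text = "learners wrote essays and learners improved significant vocabulary".split()
print(lexical_indices(text))
# → (0.75, 0.875, 0.125)
```

    Real instruments differ mainly in the size and provenance of these word lists and in corrections for text length, but the ratios have this basic shape.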

  15. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)

  16. Assessment of Groundwater Quality of Udayagiri area, Nellore District, Andhra Pradesh, South India Using Multivariate Statistical Techniques

    Directory of Open Access Journals (Sweden)

    Arveti Nagaraju

    2016-10-01

    Full Text Available Hydrogeochemical studies were carried out in and around the Udayagiri area of Andhra Pradesh in order to assess the chemistry of the groundwater and to identify the dominant hydrogeochemical processes and mechanisms responsible for the evolution of the chemical composition of the groundwater. Descriptive statistics, correlation matrices, principal component analysis (PCA), and cluster analysis (CA) were used to gain an understanding of the hydrogeochemical processes in the study area. PCA identified four main processes influencing the groundwater chemistry: mineral precipitation and dissolution, seawater intrusion, cation exchange, and carbonate balance. Further, three clusters, C1, C2 and C3, were obtained. Samples from C1 contain high levels of Cl−, which may be due to intensive evaporation and contamination from landfill leachate. Most of the samples from C2 are located closer to the sea, and the high levels of Na+ + K+ in these samples may be attributed to seawater intrusion. The geochemistry of the water samples in C3 is more likely to originate from rock weathering, as supported by the Gibbs diagram. The groundwater geochemistry in the study area is mostly of natural origin, but is influenced to some degree by human activity.

  17. Quality characterization and pollution source identification of surface water using multivariate statistical techniques, Nalagarh Valley, Himachal Pradesh, India

    Science.gov (United States)

    Herojeet, Rajkumar; Rishi, Madhuri S.; Lata, Renu; Dolma, Konchok

    2017-09-01

    …multivariate techniques for reliable characterization of surface water quality, in order to develop effective pollution reduction strategies and maintain a fine balance between industrialization and ecological integrity.

  18. Reactive polymer coatings: A robust platform towards sophisticated surface engineering for biotechnology

    Science.gov (United States)

    Chen, Hsien-Yeh

    Functionalized poly(p-xylylenes), or so-called reactive polymers, can be synthesized via chemical vapor deposition (CVD) polymerization. The resulting ultra-thin coatings are pinhole-free and can be conformally deposited onto a wide range of substrates and materials. More importantly, the equipped functional groups can serve as anchoring sites for tailoring the surface properties, making these reactive coatings a robust platform for the sophisticated challenges faced at biointerfaces. In the work presented herein, surface coatings presenting various functional groups were prepared by the CVD process. Such surfaces include an aldehyde-functionalized coating to precisely immobilize saccharide molecules onto well-defined areas and an alkyne-functionalized coating to click azide-modified molecules via the Huisgen 1,3-dipolar cycloaddition reaction. Moreover, CVD copolymerization was conducted to prepare multifunctional coatings, and their specific functions were demonstrated by the immobilization of biotin and NHS-ester molecules. By using a photodefinable coating, poly(ethylene oxide) was immobilized onto a wide range of substrates through photo-immobilization. Spatially controlled protein-resistant properties were characterized by selective adsorption of fibrinogen and bovine serum albumin as model systems. Alternatively, surface-initiator coatings were used for polymer grafting of poly(ethylene glycol) methyl ether methacrylate, and the resulting protein- and cell-resistant properties were characterized by adsorption of kinesin motor proteins, fibrinogen, and murine fibroblasts (NIH3T3). The accessibility of reactive coatings within confined microgeometries was systematically studied, and the preparation of homogeneous polymer thin films on the inner surface of microchannels was demonstrated. Moreover, these advanced coatings were applied to develop a dry adhesion process for microfluidic devices. This process provides (i) excellent bonding strength, (ii) extended…

  19. Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508

    International Nuclear Information System (INIS)

    Hayek, A; Al Bokhaiti, M; Schwarz, M H; Boercsoek, J

    2012-01-01

    With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and the mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and sophisticated calculations for the required parameters are introduced. The 1oo4 architecture is an advanced safety architecture based on on-chip redundancy which is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
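    As a rough illustration (not the paper's own derivation), a commonly used low-demand approximation for a 1-out-of-n architecture with identical channels is PFD_avg ≈ (λ_DU·T_I)^n/(n+1), ignoring common-cause failures and diagnostic coverage. The failure rate and proof-test interval below are assumed values:

```python
# Simplified low-demand PFD_avg for a 1oo-n architecture of identical
# channels: (lam_du * TI)^n / (n + 1). Common-cause failures (beta factor)
# and diagnostics are deliberately ignored in this sketch.
def pfd_avg_1oon(n, lam_du, ti_hours):
    return (lam_du * ti_hours) ** n / (n + 1)

lam_du = 1e-6          # dangerous undetected failure rate per hour (assumed)
ti = 8760              # one-year proof-test interval (hours)
for n in (1, 2, 3, 4):
    print(f"1oo{n}: PFD_avg = {pfd_avg_1oon(n, lam_du, ti):.3e}")
```

    The steep drop from 1oo1 to 1oo4 is the quantitative motivation for channel redundancy; a full IEC 61508 calculation would add the common-cause (beta) contribution, which in practice dominates for highly redundant architectures.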

  20. Assessment of statistical agreement of three techniques for the study of cut marks: 3D digital microscope, laser scanning confocal microscopy and micro-photogrammetry.

    Science.gov (United States)

    Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel

    2017-09-01

    In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG) have been recently applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be very efficient and a low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. In this study, we analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  1. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents a mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way for the solution of central problems of multivariate statistics that until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a systematic technique for implementing new, more efficient versions of statistical procedures. ...

  2. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  3. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    Science.gov (United States)

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994

  4. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt

    Directory of Open Access Journals (Sweden)

    Fernando Velasco-Tapia

    2014-01-01

    Full Text Available Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward’s linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
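    The binary mass-balance evaluation described above can be sketched as a least-squares estimate of the end-member proportion; the oxide compositions below are hypothetical, not the SC data:

```python
import numpy as np

# Hypothetical whole-rock compositions (wt%): SiO2, Al2O3, MgO, CaO.
dacite   = np.array([65.0, 16.0, 2.0, 4.5])   # felsic end member
andesite = np.array([58.0, 17.5, 4.5, 7.0])   # mafic end member
mixed    = np.array([62.2, 16.6, 3.0, 5.5])   # comingled lava

# Binary mixing: mixed ~ x*dacite + (1-x)*andesite.
# Rearranged: (mixed - andesite) ~ x * (dacite - andesite),
# solved for the scalar fraction x by least squares over all oxides.
A = (dacite - andesite).reshape(-1, 1)
b = mixed - andesite
x = float(np.linalg.lstsq(A, b, rcond=None)[0])
print(f"dacite fraction ≈ {x:.2f}")
```

    With these invented numbers the fit is exact (x = 0.60); with real analyses the residual of the least-squares fit indicates how well a simple two-end-member model explains the mixed composition.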

  5. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    Directory of Open Access Journals (Sweden)

    S. Ars

    2017-12-01

    Full Text Available This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping

  6. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    Science.gov (United States)

    Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe

    2017-12-01

    This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances
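    The classical tracer-ratio estimate that this record builds on can be sketched with a Gaussian plume forward model: when tracer and pollutant sources are colocated, the unknown emission rate is the known tracer release rate scaled by the ratio of the cross-plume concentration integrals. All parameter values below are invented for illustration.

```python
import numpy as np

def gaussian_plume(q, u, sigma_y, sigma_z, y, z, h):
    """Ground-reflecting Gaussian plume concentration at cross-wind offset y,
    height z, for a point source of strength q at release height h."""
    return (q / (2 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2 * sigma_y**2))
            * (np.exp(-(z - h)**2 / (2 * sigma_z**2))
               + np.exp(-(z + h)**2 / (2 * sigma_z**2))))

y = np.linspace(-200, 200, 401)        # cross-wind transect (m)
u, sy, sz, h = 3.0, 25.0, 12.0, 1.0    # wind speed, dispersion, release height

q_tracer = 0.5                         # known acetylene release rate (g/s)
q_ch4_true = 2.0                       # methane rate to be recovered (g/s)

c_tracer = gaussian_plume(q_tracer, u, sy, sz, y, 0.0, h)
c_ch4 = gaussian_plume(q_ch4_true, u, sy, sz, y, 0.0, h)

# Tracer-ratio estimate: Q_CH4 = Q_tracer * (integral CH4 / integral tracer).
q_ch4_est = q_tracer * c_ch4.sum() / c_tracer.sum()
print(f"estimated CH4 rate = {q_ch4_est:.2f} g/s")
```

    With perfectly colocated sources the ratio recovers the true rate exactly; the statistical inversion proposed in the study is designed precisely for the realistic case where the sources are separated and this simple ratio breaks down.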

  7. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody…

  8. Sophisticated Fowl: The Complex Behaviour and Cognitive Skills of Chickens and Red Junglefowl

    Directory of Open Access Journals (Sweden)

    Laura Garnham

    2018-01-01

    Full Text Available The world’s most numerous bird, the domestic chicken, and their wild ancestor, the red junglefowl, have long been used as model species for animal behaviour research. Recently, this research has advanced our understanding of the social behaviour, personality, and cognition of fowl, and demonstrated their sophisticated behaviour and cognitive skills. Here, we overview some of this research, starting by describing research into the well-developed senses of fowl, before presenting how socially and cognitively complex they can be. The realisation that domestic chickens, our most abundant production animal, are behaviourally and cognitively sophisticated should encourage greater general appreciation of, and fascination with, them. In turn, this should inspire increased use of them as both research and hobby animals, as well as improvements in their unfortunately often poor welfare.

  9. The relation between maturity and sophistication shall be properly dealt with in nuclear power development

    International Nuclear Information System (INIS)

    Li Yongjiang

    2009-01-01

    The paper analyses the advantages and disadvantages, in terms of safety and economy, of the second-generation improved technologies and the third-generation technologies mainly developed in China. It also discusses the maturity of the second-generation improved technologies and the sophistication of the third-generation technologies. The paper proposes that the strengths and weaknesses of both should be carefully weighed, and that the relation between maturity and sophistication be properly dealt with at the current stage. A two-step strategy should be adopted: solve the problem of insufficient nuclear power capacity while tracking and developing the third-generation technologies, so as to ensure the sound and fast development of nuclear power. (authors)

  10. Financial Sophistication and the Distribution of the Welfare Cost of Inflation

    OpenAIRE

    Paola Boel; Gabriele Camera

    2009-01-01

    The welfare cost of anticipated inflation is quantified in a calibrated model of the U.S. economy that exhibits tractable equilibrium dispersion in wealth and earnings. Inflation does not generate large losses in societal welfare, yet its impact varies noticeably across segments of society depending also on the financial sophistication of the economy. If money is the only asset, then inflation hurts mostly the wealthier and more productive agents, while those poorer and less productive may ev...

  11. Putin’s Russia: Russian Mentality and Sophisticated Imperialism in Military Policies

    OpenAIRE

    Szénási, Lieutenant-Colonel Endre

    2016-01-01

    According to my experiences, the Western world hopelessly fails to understand Russian mentality, or misinterprets it. During my analysis of the Russian way of thinking I devoted special attention to the examination of military mentality. I have connected the issue of the Russian way of thinking to the contemporary imperial policies of Putin’s Russia.  I have also attempted to prove the level of sophistication of both. I hope that a better understanding of both the Russian mentality and imperi...

  12. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues since in Finnish, vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers either in the auditory brainstem response or in behavioral tasks, but they do show an enhanced pitch discrimination compared to Finnish speakers with less musical experience and show greater duration modulation in a complex task. These results are consistent with a ceiling effect set for certain sound features which corresponds to the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in a real world musical situation. These results have implications for research into the specificity of plasticity in the auditory system as well as to the effects of interaction of specific language features with musical experiences. PMID:28450829

  14. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
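
    As a minimal illustration of the fundamentals this record reviews, the closed-form straight-line fit, with parameter variances and covariance derived from the residual variance, can be written without any libraries. The calibration-style data points below are invented for the sketch.

```python
def line_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x.

    Returns (a, b, var_a, var_b, cov_ab); the variances come from the
    residual variance s^2 with n - 2 degrees of freedom.
    """
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    d = n * sxx - sx * sx                  # determinant of the normal equations
    b = (n * sxy - sx * sy) / d            # slope
    a = (sy * sxx - sx * sxy) / d          # intercept
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    s2 = sum(r * r for r in resid) / (n - 2)
    return a, b, s2 * sxx / d, s2 * n / d, -s2 * sx / d

# Hypothetical calibration data with a small amount of scatter.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.0, 8.9]
a, b, var_a, var_b, cov_ab = line_fit(xs, ys)
```

The negative `cov_ab` for positive abscissas is the textbook slope-intercept anticorrelation that matters when propagating error through a calibration curve.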

  15. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques

    International Nuclear Information System (INIS)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D.; Rodriguez G, N. L.

    2009-01-01

    The elemental composition of archaeological ceramic fragments obtained during the explorations in San Miguel Ixtapan, Mexico State, was determined by neutron activation analysis. The samples were irradiated in the TRIGA Mark III research reactor with a neutron flux of 1×10¹³ n·cm⁻²·s⁻¹ for 2 hours. Before acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal component analysis, allowed three different origins of the archaeological ceramic to be identified, designated as local, foreign and regional. (Author)

  16. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  17. Temporal and spatial assessment of river surface water quality using multivariate statistical techniques: a study in Can Tho City, a Mekong Delta area, Vietnam.

    Science.gov (United States)

    Phung, Dung; Huang, Cunrui; Rutherford, Shannon; Dwirahmadi, Febi; Chu, Cordia; Wang, Xiaoming; Nguyen, Minh; Nguyen, Nga Huy; Do, Cuong Manh; Nguyen, Trung Hieu; Dinh, Tuan Anh Diep

    2015-05-01

    The present study is an evaluation of temporal/spatial variations of surface water quality using multivariate statistical techniques, comprising cluster analysis (CA), principal component analysis (PCA), factor analysis (FA) and discriminant analysis (DA). Eleven water quality parameters were monitored at 38 different sites in Can Tho City, a Mekong Delta area of Vietnam, from 2008 to 2012. Hierarchical cluster analysis grouped the 38 sampling sites into three clusters, representing mixed urban-rural areas, agricultural areas and an industrial zone. FA/PCA resulted in three latent factors for the entire research location, three for cluster 1, four for cluster 2, and four for cluster 3, explaining 60, 60.2, 80.9, and 70% of the total variance in the respective water-quality data. The varifactors from FA indicated that the parameters responsible for water quality variations are related to erosion from disturbed land or inflow of effluent from sewage plants and industry, discharges from wastewater treatment plants and domestic wastewater, agricultural activities and industrial effluents, and contamination by sewage waste with faecal coliform bacteria through sewer and septic systems. Discriminant analysis (DA) revealed that nephelometric turbidity units (NTU), chemical oxygen demand (COD) and NH₃ are the discriminating parameters in space, affording 67% correct assignation in spatial analysis; pH and NO₂ are the discriminating parameters according to season, assigning approximately 60% of cases correctly. The findings suggest a possible revised sampling strategy that can reduce the number of sampling sites and the indicator parameters responsible for large variations in water quality. This study demonstrates the usefulness of multivariate statistical techniques for evaluation of temporal/spatial variations in water quality assessment and management.
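
    The grouping step of such an analysis (hierarchical clustering of sampling sites by their water-quality profiles) can be sketched with a small stdlib-only average-linkage agglomeration. The six "sites" and their two standardised parameters below are hypothetical stand-ins, not data from the study.

```python
def average_linkage(points, k):
    """Agglomerative clustering with average linkage.

    Repeatedly merges the pair of clusters with the smallest mean
    pairwise Euclidean distance until k clusters remain.
    Returns clusters as sorted lists of point indices.
    """
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best, pair = None, None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = sum(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best:
                    best, pair = d, (i, j)
        i, j = pair
        clusters[i] += clusters[j]
        del clusters[j]
    return sorted(sorted(c) for c in clusters)

# Six hypothetical sites described by two standardised parameters
# (e.g. COD and turbidity); three groups are clearly separated.
sites = [(0.1, 0.2), (0.0, 0.0), (5.0, 5.1), (5.2, 4.9), (9.8, 0.1), (10.0, 0.0)]
groups = average_linkage(sites, 3)
print(groups)  # [[0, 1], [2, 3], [4, 5]]
```

In practice the parameters are standardised first so that no single variable dominates the distance, which is why studies like this one report results on transformed data.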

  18. Short-term Local Forecasting by Artificial Intelligence Techniques and Assess Related Social Effects from Heterogeneous Data

    OpenAIRE

    Gong, Bing

    2017-01-01

    This work aims to use the sophisticated artificial intelligence and statistic techniques to forecast pollution and assess its social impact. To achieve the target of the research, this study is divided into several research sub-objectives as follows: First research sub-objective: propose a framework for relocating and reconfiguring the existing pollution monitoring networks by using feature selection, artificial intelligence techniques, and information theory. Second research sub-objective: c...

  19. Remote sensing estimation of the total phosphorus concentration in a large lake using band combinations and regional multivariate statistical modeling techniques.

    Science.gov (United States)

    Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi

    2015-03-15

    Remote sensing has been widely used for water quality monitoring, but most of these monitoring studies have only focused on a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which have typically been considered optically active variables. Remote sensing presents a challenge in estimating the phosphorus concentration in water. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations, primarily using the simple individual band ratio or their natural logarithm and the statistical regression method based on the field TP data and the spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the spatial modeling scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. The correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme had a higher accuracy for the TP concentration estimation in the large lake compared with the traditional individual band ratio method and the whole-lake scale regression-modeling scheme. The TP concentration values showed a clear spatial variability and were high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest TP concentration values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggested that the proposed modeling scheme, i.e., the band combinations and the regional multivariate
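
    The regression core of such a scheme, fitting measured TP on band combinations via the normal equations, fits in a short stdlib sketch. The reflectances, the two predictors (a band ratio and a log ratio), and the coefficients are all synthetic assumptions; a real scheme selects whichever combinations survive the correlation screening against field TP data.

```python
import math

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_tp(bands, tp):
    """Least-squares fit of TP on [1, B3/B2, ln(B4/B1)] (assumed predictors)."""
    X = [[1.0, b3 / b2, math.log(b4 / b1)] for b1, b2, b3, b4 in bands]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, tp)) for i in range(3)]
    return solve(XtX, Xty)

# Synthetic pixels: TP generated exactly from beta = (0.05, 0.30, 0.20),
# so the fit should recover those coefficients.
bands = [(0.10, 0.20, 0.30, 0.15), (0.12, 0.18, 0.25, 0.20),
         (0.09, 0.22, 0.35, 0.11), (0.11, 0.25, 0.28, 0.18),
         (0.08, 0.21, 0.33, 0.16)]
tp = [0.05 + 0.30 * (b3 / b2) + 0.20 * math.log(b4 / b1)
      for b1, b2, b3, b4 in bands]
beta = fit_tp(bands, tp)
```

Fitting one such model per region, rather than one for the whole lake, is what the record means by "regional multivariate statistical modeling".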

  20. [Combined use of wide-detector and adaptive statistical iterative reconstruction-V technique in abdominal CT with low radiation dose].

    Science.gov (United States)

    Wang, H X; Lü, P J; Yue, S W; Chang, L Y; Li, Y; Zhao, H P; Li, W R; Gao, J B

    2017-12-05

    Objective: To investigate the image quality and radiation dose of contrast-enhanced abdominal CT using a wide detector (80 mm) combined with the adaptive statistical iterative reconstruction-V (ASIR-V) technique. Methods: In a first phantom experiment, the pre-ASIR-V percentage at which the combined wide-detector and ASIR-V technique halves the dose of the standard-detector (40 mm) technique was determined. The human study was performed on the basis of the phantom study: 160 patients who underwent contrast-enhanced abdominal CT were prospectively collected and divided, by random number table, into a control group (n=40) reconstructed with 40% ASIR (group A) and a study group (n=120). According to the pre-ASIR-V percentage, the study group was assigned to three groups [40 cases each; group B: 0 pre-ASIR-V scan with image reconstruction at 0-100% post-ASIR-V (10% intervals, subgroups B0-B10); group C: 20% pre-ASIR-V with 20%, 40% and 60% post-ASIR-V (subgroups C1-C3); group D: 40% pre-ASIR-V with 40% and 60% post-ASIR-V (subgroups D1-D2)]. Image noise, CT attenuation values and CNR of the liver, pancreas, aorta and portal vein were compared using the two-sample t test and one-way ANOVA. Qualitative visual parameters (overall image quality graded on a 5-point scale) were compared using the Mann-Whitney U test and the Kruskal-Wallis H test. Results: The phantom experiment showed that the pre-ASIR-V percentage for half dose was 40%. With 40% pre-ASIR-V, radiation dose in the study group was reduced by 35.5% compared with the control group. Image noise in subgroups B2-B10, C2-C3 and D1-D2 was lower (t = -14.681 to -3.046, all P < 0.05). The subjective image quality scores increased gradually in the range of 0-60% post-ASIR-V and decreased with post-ASIR-V above 70%. The overall image quality of subgroups B3-B8, C2-C3 and D1-D2 was higher than that of group A (Z = -2.229 to -6.533, all P < 0.05). Conclusion: Compared with the ASIR technique, wide-detector combined with 40% pre

  1. Do organizations adopt sophisticated capital budgeting practices to deal with uncertainty in the investment decision? : A research note

    NARCIS (Netherlands)

    Verbeeten, Frank H M

    This study examines the impact of uncertainty on the sophistication of capital budgeting practices. While the theoretical applications of sophisticated capital budgeting practices (defined as the use of real option reasoning and/or game theory decision rules) have been well documented, empirical

  2. "SOCRATICS" AS ADDRESSES OF ISOCRATES’ EPIDEICTIC SPEECHES (Against the Sophists, Encomium of Helen, Busiris

    Directory of Open Access Journals (Sweden)

    Anna Usacheva

    2012-06-01

    Full Text Available This article analyses the three epideictic orations of Isocrates, which are in themselves a precious testimony to the quality of intellectual life at the close of the fourth century before Christ. To this period also belong the Socratics, who are generally seen as an important link between Socrates and Plato. The author of this article proposes a more productive approach to the study of Antisthenes, Euclid of Megara and other so-called Socratics, revealing them not as independent thinkers but rather as adherents of the sophistic school and also as teachers, thereby including them among those who took part in the educative activity of their time.

  3. Low Level RF Including a Sophisticated Phase Control System for CTF3

    CERN Document Server

    Mourier, J; Nonglaton, J M; Syratchev, I V; Tanner, L

    2004-01-01

    CTF3 (CLIC Test Facility 3), currently under construction at CERN, is a test facility designed to demonstrate the key feasibility issues of the CLIC (Compact LInear Collider) two-beam scheme. When completed, this facility will consist of a 150 MeV linac followed by two rings for bunch-interleaving, and a test stand where 30 GHz power will be generated. In this paper, the work that has been carried out on the linac's low power RF system is described. This includes, in particular, a sophisticated phase control system for the RF pulse compressor to produce a flat-top rectangular pulse over 1.4 µs.

  4. Do we need statistics when we have linguistics?

    Directory of Open Access Journals (Sweden)

    Cantos Gómez Pascual

    2002-01-01

    Full Text Available Statistics is known to be a quantitative approach to research. However, most of the research done in the fields of language and linguistics is of a different kind, namely qualitative. Succinctly, qualitative analysis differs from quantitative analysis in that in the former no attempt is made to assign frequencies, percentages and the like to the linguistic features found or identified in the data. In quantitative research, linguistic features are classified and counted, and even more complex statistical models are constructed in order to explain these observed facts. In qualitative research, however, we use the data only for identifying and describing features of language usage and for providing real occurrences/examples of particular phenomena. In this paper, we shall try to show how quantitative methods and statistical techniques can supplement qualitative analyses of language. We shall attempt to present some mathematical and statistical properties of natural languages, and introduce some of the quantitative methods which are of the most value in working empirically with texts and corpora, illustrating the various issues with numerous examples and moving from the most basic descriptive techniques (frequency counts and percentages) to decision-taking techniques (chi-square and z-score) and to more sophisticated statistical language models (Type-Token/Lemma-Token/Lemma-Type formulae, cluster analysis and discriminant function analysis).
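
    The decision-taking techniques mentioned here are easy to make concrete. The sketch below computes the chi-square statistic for a 2×2 contingency table comparing a word's frequency across two corpora; the counts are invented, and 3.841 is the standard 5% critical value at one degree of freedom.

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 contingency table:

                 corpus A   corpus B
        word         a          b
        other        c          d
    """
    n = a + b + c + d
    # Shortcut form of sum((obs - exp)^2 / exp) over the four cells.
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: "whom" appears 40 times in 10,000 tokens of
# corpus A and 15 times in 10,000 tokens of corpus B.
chi2 = chi_square_2x2(40, 15, 9960, 9985)
significant = chi2 > 3.841  # 5% critical value, 1 degree of freedom
print(round(chi2, 2), significant)
```

A significant result only says the two corpora differ in their use of the word; interpreting why remains a qualitative task, which is exactly the complementarity the article argues for.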

  5. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.

  6. A Third-Generation Adaptive Statistical Iterative Reconstruction Technique: Phantom Study of Image Noise, Spatial Resolution, Lesion Detectability, and Dose Reduction Potential.

    Science.gov (United States)

    Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan

    2018-06-01

    The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.

  7. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

    This study developed a detailed reliability estimate for TRU waste repository concepts in crystalline rock by applying a comprehensive sensitivity analysis program. We examined each component and groundwater scenario of the geological repository and prepared a systematic basis for examining reliability comprehensively. Models and data were refined for the reliability examination. Based on an existing TRU waste repository concept, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which are to be determined quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of those specifications, reliability was re-examined for combinations of those parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including fractured and permeable matrix media; refinement of tools to develop reliable combinations of parameters. It is important to continue this study because the disposal concepts and specifications for TRU-nuclide-containing waste at various sites shall be determined rationally and safely through these studies. (author)

  8. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores

  9. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  10. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

    Full Text Available Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients or linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction that 2 genes’ expression levels are clustered in different space locations under the control of a third gene’s expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data based on an optimization procedure involving the usage of UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html.

  11. When not to copy: female fruit flies use sophisticated public information to avoid mated males

    Science.gov (United States)

    Loyau, Adeline; Blanchet, Simon; van Laere, Pauline; Clobert, Jean; Danchin, Etienne

    2012-10-01

    Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs to female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible given that semen limitation is temporary. We first demonstrate that the number of offspring produced by males Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given the time to replenish their sperm reserves, females did not avoid the males they previously saw copulating anymore. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

  12. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suitable to solve problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe the straight evolution driven by the need of software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.) [de

  13. UNA TECNICA ESTADISTICA PARA MEDIR LA CONFLICTIVIDAD SOCIAL A TRAVES DEL REGISTRO ARQUEOLOGICO (A Statistical Technique to Measure Social Conflict through the Archaeological Record)

    Directory of Open Access Journals (Sweden)

    Pascual Izquierdo-Egea

    2015-03-01

    Full Text Available A statistical technique to measure social conflict through the mortuary record is presented here. It arose from the contextual valuation method used in the analysis of grave goods since 1993. It is a fundamental tool for the development of the archaeology of social phenomena, whose relevant empirical results support its theoretical significance. After conceptualizing it in terms of social inequality and relative wealth, the two classes of social conflict defined are explained: structural or static, and conjunctural or dynamic. Finally, its connections with the Malthusian demographic law through its two parameters, population and resources, are included. This theoretical framework is illustrated with applications to ancient civilizations, including Iberian protohistory, prehispanic Mesoamerica and early imperial Rome.

  14. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; the statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect-gas theory of liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics.

  15. Statistics for wildlifers: how much and what kind?

    Science.gov (United States)

    Johnson, D.H.; Shaffer, T.L.; Newton, W.E.

    2001-01-01

    Quantitative methods are playing increasingly important roles in wildlife ecology and, ultimately, management. This change poses a challenge for wildlife practitioners and students who are not well educated in mathematics and statistics. Here we give our opinions on what wildlife biologists should know about statistics, while recognizing that not everyone is mathematically inclined. For those who are, we recommend mathematics coursework at least through calculus and linear algebra, and statistics courses that are conceptually focused, stressing the why rather than the how of doing statistics. For less mathematically oriented wildlifers, introductory classes in statistical techniques will furnish some useful background in basic methods but may provide little appreciation of when the methods are appropriate; these wildlifers will have to rely much more on advice from statisticians. Far more important than knowing how to analyze data is an understanding of how to obtain and recognize good data. Regardless of the statistical education they receive, all wildlife biologists should appreciate the importance of controls, replication, and randomization in the studies they conduct. Understanding these concepts requires little mathematical sophistication, but is critical to advancing the science of wildlife ecology.

  16. The Impact of Services on Economic Complexity: Service Sophistication as Route for Economic Growth.

    Science.gov (United States)

    Stojkoski, Viktor; Utkovski, Zoran; Kocarev, Ljupco

    2016-01-01

    Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country's productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. Adopting a terminology that classifies both manufactured goods and delivered services as products, we investigate the influence of services on a country's productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with a developed service sector higher than countries whose economies are centred on the manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that the diversification of service exports, and their sophistication, can provide an additional route to economic growth in both developing and developed countries.

  17. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.

  18. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    Science.gov (United States)

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-12-01

    The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.

  19. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task

    Science.gov (United States)

    Akam, Thomas; Costa, Rui; Dayan, Peter

    2015-01-01

    The recently developed ‘two-step’ behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects’ investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues. PMID:26657806

  20. Simple Plans or Sophisticated Habits? State, Transition and Learning Interactions in the Two-Step Task.

    Directory of Open Access Journals (Sweden)

    Thomas Akam

    2015-12-01

    Full Text Available The recently developed 'two-step' behavioural task promises to differentiate model-based from model-free reinforcement learning, while generating neurophysiologically-friendly decision datasets with parametric variation of decision variables. These desirable features have prompted its widespread adoption. Here, we analyse the interactions between a range of different strategies and the structure of transitions and outcomes in order to examine constraints on what can be learned from behavioural performance. The task involves a trade-off between the need for stochasticity, to allow strategies to be discriminated, and a need for determinism, so that it is worth subjects' investment of effort to exploit the contingencies optimally. We show through simulation that under certain conditions model-free strategies can masquerade as being model-based. We first show that seemingly innocuous modifications to the task structure can induce correlations between action values at the start of the trial and the subsequent trial events in such a way that analysis based on comparing successive trials can lead to erroneous conclusions. We confirm the power of a suggested correction to the analysis that can alleviate this problem. We then consider model-free reinforcement learning strategies that exploit correlations between where rewards are obtained and which actions have high expected value. These generate behaviour that appears model-based under these, and also more sophisticated, analyses. Exploiting the full potential of the two-step task as a tool for behavioural neuroscience requires an understanding of these issues.

  1. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    Science.gov (United States)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.

  2. Nurturing Opportunity Identification for Business Sophistication in a Cross-disciplinary Study Environment

    Directory of Open Access Journals (Sweden)

    Karine Oganisjana

    2012-12-01

    Full Text Available Opportunity identification is the key element of the entrepreneurial process; therefore, developing this skill in students is a crucial task in contemporary European education, which has recognized entrepreneurship as one of the key competences for lifelong learning. The earlier opportunity identification becomes a habitual way of thinking and behaving across a broad range of contexts, the more likely it is that an entrepreneurial disposition will take steady root in students. In order to nurture opportunity identification in students so that they are able to build sophisticated businesses in the future, certain demands must also be placed on the teacher, the person who is to foster these qualities in their students. The paper reflects some findings of research conducted within the framework of a workplace learning project for the teachers of one of Riga's secondary schools (Latvia). The main goal of the project was to teach the teachers to identify hidden inner links between apparently unrelated things, phenomena and events within the 10th-grade study curriculum, to connect them, and to create new opportunities. The creation and solution of cross-disciplinary tasks were the means for achieving this goal.

  3. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Science.gov (United States)

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  4. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    2011-03-01

    Full Text Available We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  5. Method for estimating potential wetland extent by utilizing streamflow statistics and flood-inundation mapping techniques: Pilot study for land along the Wabash River near Terre Haute, Indiana

    Science.gov (United States)

    Kim, Moon H.; Ritz, Christian T.; Arvin, Donald V.

    2012-01-01

    Potential wetland extents were estimated for a 14-mile reach of the Wabash River near Terre Haute, Indiana. This pilot study was completed by the U.S. Geological Survey in cooperation with the U.S. Department of Agriculture, Natural Resources Conservation Service (NRCS). The study showed that potential wetland extents can be estimated by analyzing streamflow statistics with the available streamgage data, calculating the approximate water-surface elevation along the river, and generating maps by use of flood-inundation mapping techniques. Planning successful restorations for Wetland Reserve Program (WRP) easements requires a determination of areas that show evidence of being in a zone prone to sustained or frequent flooding. Zone determinations of this type are used by WRP planners to define the actively inundated area and make decisions on restoration-practice installation. According to WRP planning guidelines, a site needs to show evidence of being in an "inundation zone" that is prone to sustained or frequent flooding for a period of 7 consecutive days at least once every 2 years on average in order to meet the planning criteria for determining a wetland for a restoration in agricultural land. By calculating the annual highest 7-consecutive-day mean discharge with a 2-year recurrence interval (7MQ2) at a streamgage on the basis of available streamflow data, one can determine the water-surface elevation corresponding to the calculated flow that defines the estimated inundation zone along the river. By using the estimated water-surface elevation ("inundation elevation") along the river, an approximate extent of potential wetland for a restoration in agricultural land can be mapped. 
As part of the pilot study, a set of maps representing the estimated potential wetland extents was generated in a geographic information system (GIS) application by combining (1) a digital water-surface plane representing the surface of inundation elevation that sloped in the downstream
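
    The 7MQ2 statistic described above is straightforward to compute from a daily discharge record. The sketch below is an illustrative computation only, not the USGS procedure; the function names and the Weibull plotting-position interpolation are assumptions for the example.

```python
def seven_day_annual_max(daily_discharge):
    """Highest 7-consecutive-day mean discharge within one year of daily values."""
    means = [sum(daily_discharge[i:i + 7]) / 7.0
             for i in range(len(daily_discharge) - 6)]
    return max(means)


def recurrence_discharge(annual_maxima, T=2.0):
    """Discharge with recurrence interval T years, using Weibull plotting
    positions: the value of rank r (descending) recurs every (n + 1) / r years."""
    ranked = sorted(annual_maxima, reverse=True)
    n = len(ranked)
    target_rank = (n + 1) / T      # rank whose recurrence interval equals T
    lo = int(target_rank)
    if lo >= n:                    # recurrence interval too short for this record
        return ranked[-1]
    frac = target_rank - lo        # interpolate linearly between adjacent ranks
    return ranked[lo - 1] + frac * (ranked[lo] - ranked[lo - 1])


# 7MQ2: the annual highest 7-day mean discharge equalled or exceeded,
# on average, once every 2 years (toy record of three years of constant flow).
annual_maxima = [seven_day_annual_max(year)
                 for year in ([300.0] * 30, [480.0] * 30, [520.0] * 30)]
q7_2 = recurrence_discharge(annual_maxima, T=2.0)
```

    With a real record, `daily_discharge` would be one year of streamgage data, and the inundation elevation would then be read from the stage-discharge relation at the computed 7MQ2.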

  6. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics Stymied by statistics? No fear ? this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more.Tracks to a typical first semester statistics cou

  7. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in ... success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in ...

  8. Sophisticated Online Learning Scheme for Green Resource Allocation in 5G Heterogeneous Cloud Radio Access Networks

    KAUST Repository

    Alqerm, Ismail

    2018-01-23

    5G is the upcoming evolution of current cellular networks, aimed at satisfying the future demand for data services. Heterogeneous cloud radio access networks (H-CRANs) are envisioned as a new trend of 5G that exploits the advantages of heterogeneous and cloud radio access networks to enhance spectral and energy efficiency. Remote radio heads (RRHs) are small cells utilized to provide high data rates for users with high quality-of-service (QoS) requirements, while a high-power macro base station (BS) is deployed for coverage maintenance and service to low-QoS users. Inter-tier interference between macro BSs and RRHs, and energy efficiency, are critical challenges that accompany resource allocation in H-CRANs. Therefore, we propose an efficient resource allocation scheme using online learning, which mitigates interference and maximizes energy efficiency while maintaining QoS requirements for all users. The resource allocation includes resource blocks (RBs) and power. The proposed scheme is implemented using two approaches: centralized, where the resource allocation is processed at a controller integrated with the baseband processing unit, and decentralized, where macro BSs cooperate to achieve the optimal resource allocation strategy. To foster the performance of such a sophisticated scheme with model-free learning, we consider users' priority in RB allocation and a compact state-representation learning methodology to improve the speed of convergence and account for the curse of dimensionality during the learning process. The proposed scheme, in both approaches, is implemented using a software-defined-radio testbed. The testbed and simulation results confirm that the proposed resource allocation solution for H-CRANs increases energy efficiency significantly and maintains users' QoS.
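
    The online learning at the heart of such a scheme can be illustrated, in model-free tabular form, with the standard Q-learning update. The toy encoding below (interference level as state, RB index as action, an energy-efficiency reward) is an assumption for illustration, not the scheme evaluated in the paper.

```python
import random


def q_update(Q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
    """One Q-learning step: move Q[s][a] toward reward + gamma * max_a' Q[s_next][a']."""
    best_next = max(Q[s_next].values())
    Q[s][a] += alpha * (reward + gamma * best_next - Q[s][a])


def choose_rb(Q, s, epsilon=0.1, rng=random):
    """Epsilon-greedy choice of a resource block in state s."""
    if rng.random() < epsilon:
        return rng.choice(list(Q[s]))          # explore
    return max(Q[s], key=Q[s].get)             # exploit the learned values


# Toy setup: 2 interference states x 2 resource blocks, values initialised to zero.
Q = {s: {a: 0.0 for a in range(2)} for s in range(2)}
q_update(Q, s=0, a=1, reward=1.0, s_next=1)    # RB 1 earned reward 1 in state 0
```

    Compact state representations, as mentioned above, replace the raw state index with a lower-dimensional encoding to keep such a table tractable as the number of users and RBs grows.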

  9. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    Science.gov (United States)

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

    The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impact on communication between caregivers and on patient outcomes in these settings deserves further attention. In this research, we describe a mixed-method approach to explore the communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups, to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze the data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions: the more processes in resident care and administrative activities are supported by technology, the lower the number of observed unique interactions. Additionally, four thematic areas emerged from the staff focus groups that demonstrate how important IT is to resident care in these facilities: providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery. 
These results provide evidence for improving the design and implementation of IT in long term care systems to support

  10. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  11. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    Science.gov (United States)

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
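
    At the core of the CART technique evaluated above is a split criterion: choose the threshold that minimizes the weighted Gini impurity of the two resulting groups. The single-feature stump below is a minimal sketch of that criterion; the toy data and function names are assumptions, not the study's variables.

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 1.0 - p * p - (1.0 - p) * (1.0 - p)


def best_split(xs, ys):
    """Threshold on a single feature minimizing weighted Gini impurity (CART criterion)."""
    best_score, best_t = float("inf"), None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_score, best_t = score, t
    return best_t


# A pure split exists at 2: at or below it both labels are 0, above it both are 1.
threshold = best_split([1, 2, 3, 4], [0, 0, 1, 1])
```

    A full CART model recurses on each side of the split; CHAID instead merges categories using chi-square tests, and discriminant analysis fits a linear decision boundary.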

  12. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  13. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  14. Statistical comparisons of Savannah River anemometer data applied to quality control of instrument networks

    International Nuclear Information System (INIS)

    Porch, W.M.; Dickerson, M.H.

    1976-08-01

    Continuous monitoring of extensive meteorological instrument arrays is a requirement in the study of important mesoscale atmospheric phenomena. These phenomena include pollution transport from continuous area sources or one-time releases of toxic materials, and wind-energy prospecting in areas of topographic enhancement of the wind. We investigated quality control techniques that can be applied to these data to determine whether the instruments are operating within their prescribed tolerances. Savannah River Plant data were analyzed with both independent and comparative statistical techniques. The independent techniques calculate the mean, standard deviation, moments about the mean, kurtosis, skewness, probability density distribution, cumulative probability and power spectra. The comparative techniques include covariance, cross-spectral analysis and two-dimensional probability density. At present, the calculating and plotting routines for these statistical techniques do not reside in a single code, so it is difficult to ascribe independent memory size and computation time accurately. However, given the flexibility of a data system that includes simple and fast-running statistics at the instrument end of the data network (ASF) and more sophisticated techniques at the computational end (ACF), a proper balance will be attained. These techniques are described in detail and preliminary results are presented.
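
    The independent, per-instrument statistics listed above are cheap enough to run at the instrument end of the network. A minimal sketch of the moment-based ones (mean, standard deviation, skewness, excess kurtosis) plus a simple tolerance check follows; the function names and the flagging rule are assumptions for illustration, not the Savannah River codes.

```python
import math


def moment_stats(x):
    """Mean, standard deviation, skewness and excess kurtosis of a sample."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n   # central moments
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    sd = math.sqrt(m2)
    return mean, sd, m3 / sd ** 3, m4 / m2 ** 2 - 3.0


def flag_outliers(x, tol=3.0):
    """Quality control check: readings more than tol standard deviations from the mean."""
    mean, sd, _, _ = moment_stats(x)
    return [v for v in x if abs(v - mean) > tol * sd]


# An anemometer trace with one suspect reading.
anemometer = [4.9, 5.1, 5.0, 5.2, 4.8, 25.0]
bad = flag_outliers(anemometer, tol=2.0)
```

    Skewness or kurtosis drifting far from the values expected for a healthy sensor can be an early sign of a sticking or misaligned instrument, which makes the higher moments natural independent checks.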

  15. Integrated statistical learning of metabolic ion mobility spectrometry profiles for pulmonary disease identification

    DEFF Research Database (Denmark)

    Hauschild, A.C.; Baumbach, Jan; Baumbach, J.

    2012-01-01

    sophisticated statistical learning techniques for VOC-based feature selection and supervised classification into patient groups. We analyzed breath data from 84 volunteers, each of them either suffering from chronic obstructive pulmonary disease (COPD), or both COPD and bronchial carcinoma (COPD + BC), as well...... as from 35 healthy volunteers, comprising a control group (CG). We standardized and integrated several statistical learning methods to provide a broad overview of their potential for distinguishing the patient groups. We found that there is strong potential for separating MCC/IMS chromatograms of healthy...... patients from healthy controls. We conclude that these statistical learning methods have a generally high accuracy when applied to well-structured, medical MCC/IMS data....
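
    A simple instance of the VOC-based feature selection mentioned above is to rank each peak-intensity feature by its variance-normalized mean separation between two patient groups and keep the top k. The sketch below is illustrative only; the scoring function and toy data are assumptions, not the study's integrated pipeline.

```python
import math


def separation(x0, x1):
    """Absolute mean difference over pooled standard deviation for one feature."""
    m0, m1 = sum(x0) / len(x0), sum(x1) / len(x1)
    v0 = sum((v - m0) ** 2 for v in x0) / len(x0)
    v1 = sum((v - m1) ** 2 for v in x1) / len(x1)
    return abs(m0 - m1) / math.sqrt((v0 + v1) / 2.0 + 1e-12)


def top_k_features(group0, group1, k):
    """Rank feature columns by separation score and keep the k best indices."""
    n_feat = len(group0[0])
    scores = [separation([row[j] for row in group0],
                         [row[j] for row in group1])
              for j in range(n_feat)]
    return sorted(range(n_feat), key=lambda j: -scores[j])[:k]


# Feature 0 separates the two groups perfectly; feature 1 carries no signal.
selected = top_k_features([[0.0, 5.0], [0.0, 6.0]],
                          [[10.0, 5.0], [10.0, 6.0]], k=1)
```

    The selected peaks would then feed a supervised classifier, with accuracy estimated by cross-validation so that selecting features on the full data set does not inflate the result.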

  16. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  17. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
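    The counting techniques behind these distributions reduce to three standard state counts, which can be checked directly. A minimal sketch (function names are illustrative):

    ```python
    from math import comb

    def fermi_dirac_states(g, n):
        """Fermi-Dirac: n indistinguishable particles over g states,
        at most one particle per state."""
        return comb(g, n)

    def bose_einstein_states(g, n):
        """Bose-Einstein: n indistinguishable particles over g states,
        unlimited occupancy (stars and bars)."""
        return comb(g + n - 1, n)

    def maxwell_boltzmann_states(g, n):
        """Classical distinguishable particles: each of n particles may
        occupy any of g states independently."""
        return g ** n
    ```

    For example, two particles in four states give 6 Fermi-Dirac, 10 Bose-Einstein, and 16 classical microstates, which is exactly the kind of tally the sorting game produces.
    
    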

  18. Sophisticated Approval Voting, Ignorance Priors, and Plurality Heuristics: A Behavioral Social Choice Analysis in a Thurstonian Framework

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia

    2007-01-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…

  19. Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5

    Science.gov (United States)

    Candiotto, Laura

    2018-01-01

    This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…

  20. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures and to reduce their incidence by introducing the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma, together with failure mode and effect analysis, was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures. Pareto analysis was used to identify the most frequently recurring complications. A paired z-sample test using Minitab Statistical Inference and Fisher's exact test were used to statistically analyse the obtained data. The resulting p-values indicated a statistically significant reduction in complications. The Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
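    The Pareto analysis step named above amounts to ranking complication categories and keeping the "vital few" that account for most occurrences. A minimal sketch; the complication names and counts are invented for illustration, not taken from the study:

    ```python
    def pareto_vital_few(counts, threshold=0.8):
        """Sort categories by frequency and return those that together
        account for `threshold` (e.g. 80%) of all occurrences."""
        total = sum(counts.values())
        vital, cumulative = [], 0.0
        for cause, n in sorted(counts.items(), key=lambda kv: -kv[1]):
            vital.append(cause)
            cumulative += n / total
            if cumulative >= threshold:
                break
        return vital
    ```

    The Improve phase of DMAIC then targets only the categories this returns, which is what makes the 80/20 ranking useful in practice.
    
    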

  1. Usage Statistics

    Science.gov (United States)


  2. Frog Statistics

    Science.gov (United States)


  3. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
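    The first observation above, that statistical models correspond to algebraic sets, has a one-line instance: a 2 × 2 probability table lies on the independence model exactly when its determinant vanishes. The sketch below (helper names are illustrative) checks this constraint and builds rank-one tables from marginals:

    ```python
    def is_independent(table, tol=1e-9):
        """A 2x2 probability table lies on the independence model exactly
        when its determinant vanishes: p00*p11 - p01*p10 = 0."""
        (p00, p01), (p10, p11) = table
        return abs(p00 * p11 - p01 * p10) < tol

    def outer(row_margins, col_margins):
        """Rank-one table built from marginal distributions; such tables
        always satisfy the determinant constraint."""
        return [[r * c for c in col_margins] for r in row_margins]
    ```

    For larger tables the independence model is cut out by all 2 × 2 minors, which is precisely the determinantal-variety viewpoint the lectures develop.
    
    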

  4. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These includes both applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.

  5. Old tools for sophisticated diagnosis: Electrocardiography for the assessment of myocardial viability

    International Nuclear Information System (INIS)

    Margonato, A.; Chierchia, S.

    1996-01-01

    The identification of residual myocardial viability in patients with a previous myocardial infarction has important clinical implications. Various methods have been developed for the detection of viable myocardium; however, most of them are expensive and available only to high-tech centers. In the attempt to obtain reliable information at low cost, exercise-ECG has been proposed as a useful technique. The results of a series of studies show that ST-segment elevation and ventricular arrhythmias elicited by exercise are reliable signs of the presence of reversible myocardial damage.

  6. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: an intra-individual comparison.

    Science.gov (United States)

    Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung

    2012-09-01

    To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  7. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hyun, E-mail: circle1128@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, Myung-Joon, E-mail: mjkim@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Yoon, Choon-Sik, E-mail: yooncs58@yuhs.ac [Department of Radiology, Gangnam Severance Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Lee, Mi-Jung, E-mail: mjl1213@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of)

    2012-09-15

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality.

  8. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    International Nuclear Information System (INIS)

    Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung

    2012-01-01

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality
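    The paired Student's t-test used in the study above reduces to one formula on per-patient differences. A minimal hand-rolled sketch (the dose pairs in the test are invented for illustration; in practice one would take the p-value from the t distribution with the returned degrees of freedom, e.g. via `scipy.stats`):

    ```python
    import math

    def paired_t(before, after):
        """Paired Student's t statistic: t = mean(d) / (sd(d)/sqrt(n)),
        where d holds the per-subject differences and df = n - 1."""
        d = [b - a for b, a in zip(before, after)]
        n = len(d)
        mean = sum(d) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))
        return mean / (sd / math.sqrt(n)), n - 1
    ```

    Pairing each patient with themselves, as in the Routine-versus-ASIR comparison, removes between-patient variability from the denominator, which is why the intra-individual design is statistically efficient.
    
    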

  9. A Case Study on E - Banking Security – When Security Becomes Too Sophisticated for the User to Access Their Information

    OpenAIRE

    Aaron M. French

    2012-01-01

    While eBanking security continues to increase in sophistication to protect against threats, the usability of eBanking decreases, resulting in poor security behaviors by the users. The current research evaluates security risks and measures taken for eBanking solutions. A case study is presented describing how increased complexity decreases vulnerabilities online but increases vulnerabilities from internal threats and eBanking users.

  10. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  11. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced…

  12. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
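    The harmonic Poisson process described above can be simulated directly on a finite window [a, b] of the positive half-line: the intensity λ(t) = c/t integrates to c·ln(b/a), and given the count, points are i.i.d. with density proportional to 1/t, sampled by inversion as t = a·(b/a)^U. A simulation sketch (the function name and parameters are illustrative):

    ```python
    import math, random

    def sample_harmonic_poisson(c, a, b, rng):
        """Poisson process with harmonic intensity lambda(t) = c/t on
        [a, b]: draw a Poisson count with mean c*ln(b/a), then place the
        points i.i.d. with density proportional to 1/t via inversion."""
        mean = c * math.log(b / a)
        # Poisson count by inverting the CDF term by term
        u, k, p = rng.random(), 0, math.exp(-mean)
        cdf = p
        while u > cdf:
            k += 1
            p *= mean / k
            cdf += p
        return sorted(a * (b / a) ** rng.random() for _ in range(k))
    ```

    The scale invariance noted in the abstract is visible here: rescaling [a, b] to [sa, sb] leaves the mean count c·ln(b/a) unchanged.
    
    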

  13. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  14. Histoplasmosis Statistics

    Science.gov (United States)


  15. Engineering FKBP-Based Destabilizing Domains to Build Sophisticated Protein Regulation Systems.

    Directory of Open Access Journals (Sweden)

    Wenlin An

    Full Text Available Targeting protein stability with small molecules has emerged as an effective tool to control protein abundance in a fast, scalable and reversible manner. The technique involves tagging a protein of interest (POI) with a destabilizing domain (DD) specifically controlled by a small molecule. The successful construction of such fusion proteins may, however, be limited by functional interference of the DD epitope with electrostatic interactions required for full biological function of proteins. Another drawback of this approach is the remaining endogenous protein. Here, we combined the Cre-LoxP system with an advanced DD and generated a protein regulation system in which the loss of an endogenous protein, in our case the tumor suppressor PTEN, can be coupled directly with a conditionally fine-tunable DD-PTEN. This new system will consolidate and extend the use of DD-technology to control protein function precisely in living cells and animal models.

  16. Towards sophisticated learning from EHRs: increasing prediction specificity and accuracy using clinically meaningful risk criteria.

    Science.gov (United States)

    Vasiljeva, Ieva; Arandjelovic, Ognjen

    2016-08-01

    Computer based analysis of Electronic Health Records (EHRs) has the potential to provide major novel insights of benefit both to specific individuals in the context of personalized medicine, as well as on the level of population-wide health care and policy. The present paper introduces a novel algorithm that uses machine learning for the discovery of longitudinal patterns in the diagnoses of diseases. Two key technical novelties are introduced: one in the form of a novel learning paradigm which enables greater learning specificity, and another in the form of a risk driven identification of confounding diagnoses. We present a series of experiments which demonstrate the effectiveness of the proposed techniques, and which reveal novel insights regarding the most promising future research directions.

  17. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  18. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  19. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  20. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  1. Discovery and characterisation of dietary patterns in two Nordic countries. Using non-supervised and supervised multivariate statistical techniques to analyse dietary survey data

    DEFF Research Database (Denmark)

    Edberg, Anna; Freyhult, Eva; Sand, Salomon

    - and inter-national data excerpts. For example, major PCA loadings helped decipher both shared and disparate features, relating to food groups, across Danish and Swedish preschool consumers. Data interrogation, reliant on the above-mentioned composite techniques, disclosed one outlier dietary prototype...... prototype with the latter property was identified also in the Danish data material, but without low consumption of Vegetables or Fruit & berries. The second MDA-type of data interrogation involved Supervised Learning, also known as Predictive Modelling. These exercises involved the Random Forest (RF...... not elaborated on in-depth, output from several analyses suggests a preference for energy-based consumption data for Cluster Analysis and Predictive Modelling, over those appearing as weight....

  2. Effects of statistical quality, sampling rate and temporal filtering techniques on the extraction of functional parameters from the left ventricular time-activity curves

    Energy Technology Data Exchange (ETDEWEB)

    Guignard, P.A.; Chan, W. (Royal Melbourne Hospital, Parkville (Australia). Dept. of Nuclear Medicine)

    1984-09-01

    Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated. They were three and five point time smoothing. Fourier filtering preserving one to four harmonics (H), truncated curve Fourier filtering, and third degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third degree polynomial curve fittings and truncated Fourier filters exhibited very high sensitivity to noise. Three and five point time smoothing had moderate sensitivity to noise, but were highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned.

  3. Effects of statistical quality, sampling rate and temporal filtering techniques on the extraction of functional parameters from the left ventricular time-activity curves

    International Nuclear Information System (INIS)

    Guignard, P.A.; Chan, W.

    1984-01-01

    Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated. They were three and five point time smoothing. Fourier filtering preserving one to four harmonics (H), truncated curve Fourier filtering, and third degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third degree polynomial curve fittings and truncated Fourier filters exhibited very high sensitivity to noise. Three and five point time smoothing had moderate sensitivity to noise, but were highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned. (author)
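    The best-performing filter in the two records above, Fourier filtering that preserves only the first few harmonics, can be sketched with an FFT. This is an illustration of the general technique, not the original nuclear-stethoscope code; the function name is invented:

    ```python
    import numpy as np

    def fourier_filter(curve, n_harmonics):
        """Keep the DC term plus the first n_harmonics Fourier harmonics
        of a periodic time-activity curve and discard the rest."""
        spectrum = np.fft.rfft(curve)
        spectrum[n_harmonics + 1:] = 0.0
        return np.fft.irfft(spectrum, n=len(curve))
    ```

    Because the retained harmonics are defined relative to the cardiac cycle rather than the sampling grid, the filter's behaviour is largely independent of sampling rate, consistent with the finding that 2H or 3H filtering was the best compromise.
    
    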

  4. Modeling and Analysis of Mechanical Properties of Aluminium Alloy (A413 Processed through Squeeze Casting Route Using Artificial Neural Network Model and Statistical Technique

    Directory of Open Access Journals (Sweden)

    R. Soundararajan

    2015-01-01

    Full Text Available Artificial Neural Network (ANN) approach was used for predicting and analyzing the mechanical properties of A413 aluminum alloy produced by squeeze casting route. The experiments are carried out with different controlled input variables such as squeeze pressure, die preheating temperature, and melt temperature as per Full Factorial Design (FFD). The accounted absolute process variables produce a casting with pore-free and ideal fine grain dendritic structure resulting in good mechanical properties such as hardness, ultimate tensile strength, and yield strength. As a primary objective, a feed forward back propagation ANN model has been developed with different architectures for ensuring the definiteness of the values. The developed model along with its predicted data was in good agreement with the experimental data, inferring the valuable performance of the optimal model. From the work it was ascertained that, for castings produced by the squeeze casting route, the ANN is an alternative method for predicting the mechanical properties, and appropriate results can be estimated rather than measured, thereby reducing the testing time and cost. As a secondary objective, quantitative and statistical analysis was performed in order to evaluate the effect of process parameters on the mechanical properties of the castings.

  5. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    Science.gov (United States)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-09-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
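    The six texture parameters named above are classical histogram-based descriptors and can be computed directly from the gray-level histogram of a region of analysis. A sketch under the usual definitions (mean, standard deviation as contrast, relative smoothness R = 1 − 1/(1 + σ²), third central moment for skewness, sum of squared probabilities for uniformity, Shannon entropy); the function name is illustrative:

    ```python
    import math

    def texture_features(pixels, levels=256):
        """Histogram-based statistical texture descriptors of a region of
        analysis: average gray level, average contrast, relative
        smoothness, skewness, uniformity and entropy."""
        hist = [0] * levels
        for v in pixels:
            hist[v] += 1
        n = len(pixels)
        p = [h / n for h in hist]
        m = sum(i * pi for i, pi in enumerate(p))
        var = sum((i - m) ** 2 * pi for i, pi in enumerate(p))
        return {
            "average_gray": m,
            "average_contrast": math.sqrt(var),
            "smoothness": 1 - 1 / (1 + var),
            "skewness": sum((i - m) ** 3 * pi for i, pi in enumerate(p)),
            "uniformity": sum(pi ** 2 for pi in p),
            "entropy": -sum(pi * math.log2(pi) for pi in p if pi > 0),
        }
    ```

    A perfectly flat region scores maximal uniformity and zero entropy, while a noisy ROA moves in the opposite direction, which is why these values can separate tissue textures.
    
    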

  6. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
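    The distinction drawn above between measures of location and spread is easy to demonstrate with the standard library; the sample values here are invented for illustration:

    ```python
    import statistics as st

    # A small sample with one extreme value: the mean is pulled toward
    # the outlier, while the median, a robust measure of location, is not.
    values = [2.1, 2.4, 2.7, 3.0, 9.8]

    location = {"mean": st.mean(values), "median": st.median(values)}
    spread = {"stdev": st.stdev(values)}  # sample standard deviation
    ```

    Plotting such a sample, or applying a log transformation before summarizing it, follows the same workflow the chapter describes for skewed data.
    
    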

  7. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    International Nuclear Information System (INIS)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-01-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 x 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.

  8. Assessment of myocardial metabolism by PET - a sophisticated dream or clinical reality

    International Nuclear Information System (INIS)

    Schelbert, H.R.

    1986-01-01

    This symposium reviewed radionuclide techniques for the noninvasive study of regional myocardial metabolism and spanned a wide range of topics. New radiotracers for probing different metabolic pathways or selected biochemical reaction steps were presented. New information on tracers already in use was forthcoming. Because the imaging device can measure only concentrations of radiolabel in tissue, other studies examined relationships between uptake and turnover of radioactivity in tissue as an externally observed signal, the chemical fate of the label, and the biologic process under study. Other studies formulated these relationships through tracer compartment models, which are fundamental to quantifying regional physiologic processes externally. Other investigations applied radiotracer methods to experimental models of cardiac disease and to patients. They described findings of regional or global alterations in substrate metabolism. These observations highlighted the potential clinical value of this new methodology. At the same time, several of these observations remain at present without mechanistic explanation; yet they form the foundation on which working hypotheses can be built, which in turn can be tested in vivo. (orig.)

  9. Toward sophisticated basal ganglia neuromodulation: Review on basal ganglia deep brain stimulation.

    Science.gov (United States)

    Da Cunha, Claudio; Boschen, Suelen L; Gómez-A, Alexander; Ross, Erika K; Gibson, William S J; Min, Hoon-Ki; Lee, Kendall H; Blaha, Charles D

    2015-11-01

    This review presents state-of-the-art knowledge about the roles of the basal ganglia (BG) in action-selection, cognition, and motivation, and how this knowledge has been used to improve deep brain stimulation (DBS) treatment of neurological and psychiatric disorders. Such pathological conditions include Parkinson's disease, Huntington's disease, Tourette syndrome, depression, and obsessive-compulsive disorder. The first section presents evidence supporting current hypotheses of how the cortico-BG circuitry works to select motor and emotional actions, and how defects in this circuitry can cause symptoms of the BG diseases. Emphasis is given to the role of striatal dopamine in motor performance, motivated behaviors, and the learning of procedural memories. Next, the use of cutting-edge electrochemical techniques in animal and human studies of BG functioning under normal and disease conditions is discussed. Finally, functional neuroimaging studies are reviewed; these works have shown the relationship between cortico-BG structures activated during DBS and improvement of disease symptoms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Crop to wild gene flow: Does more sophisticated research provide better risk assessment?

    International Nuclear Information System (INIS)

    Jong, Tom J. de; Rong, Jun

    2013-01-01

    Highlights: ► Genes can sometimes flow from genetically modified crops to wild plants. ► The probability can be predicted from seed production of hybrids and backcrosses. ► Nevertheless, predictions about introgression remain uncertain. ► One should be reluctant to ask for too much detail in Environmental Risk Assessment. ► Instead, possible harm should have a more central place. -- Abstract: Research into introgression, the permanent incorporation of alleles of one species into another, is flourishing and gives new insights into evolution and speciation. The possible transfer of transgenes from crop species to wild relatives is of major concern for regulators. Applicants who want to introduce a genetically modified (GM) crop on the European market need to indicate the likelihood of introgression and its anticipated effects in an Environmental Risk Analysis (ERA). The European Food Safety Authority (EFSA) and competent authorities of different countries evaluate the ERA. Predicting which crop alleles will or will not be permanently incorporated into wild populations requires, apart from information on seed production of hybrids, information on how these crop alleles are associated with fitness. Advances in genetics open new avenues to address this question in more detail. We argue, however, that, even with the best techniques, predicting introgression from crop to wild species will always have a considerable margin of uncertainty. One must therefore be cautious in demanding more detailed research for the ERA, especially since the possible harm of transgenes in natural populations remains so poorly defined by regulators.

  11. Assessment of myocardial metabolism by PET - a sophisticated dream or clinical reality

    Energy Technology Data Exchange (ETDEWEB)

    Schelbert, H R

    1986-08-01

    This symposium reviewed radionuclide techniques for the noninvasive study of regional myocardial metabolism and spanned a wide range of topics. New radiotracers for probing different metabolic pathways or selected biochemical reaction steps were presented. New information on tracers already in use was forthcoming. Because the imaging device can measure only concentrations of radiolabel in tissue, other studies examined relationships between uptake and turnover of radioactivity in tissue as an externally observed signal, the chemical fate of the label, and the biologic process under study. Other studies formulated these relationships through tracer compartment models, which are fundamental to quantifying regional physiologic processes externally. Other investigations applied radiotracer methods to experimental models of cardiac disease and to patients. They described findings of regional or global alterations in substrate metabolism. These observations highlighted the potential clinical value of this new methodology. At the same time, several of these observations remain at present without mechanistic explanation; yet they form the foundation on which working hypotheses can be built, which in turn can be tested in vivo.

  12. Experimental study on seismic behaviors of two-storied sophisticated model for nuclear reactor building

    Energy Technology Data Exchange (ETDEWEB)

    Higashiura, Akira; Sato, Kazuhide; Muramoto, Michiya; Yanagase, Takahito; Watanabe, Satoshi

    1987-03-01

    In this paper, the authors investigate the seismic behavior of a nuclear reactor building by a pseudo-dynamic test using a substructuring technique and by a lateral static loading test. The results obtained from these tests are as follows. 1) The maximum response displacements obtained by the pseudo-dynamic test are equivalent to those given by dynamic analytical procedures using the approximate earthquake ground motion. 2) In the final stage of the pseudo-dynamic test, the natural period of the system increased to about three times that in the elastic region. 3) Some shear cracks were observed on the web portion of the box and the truncated conical wall at the end of the pseudo-dynamic test. 4) The maximum shear forces in the test specimen obtained by the pseudo-dynamic test are about one third of its ultimate shear strength obtained by the static loading test. 5) At the ultimate strength of the test specimen in the static loading test, many shear cracks and crushing of concrete were observed on the web portion of the box and the truncated conical wall.

  13. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co

  14. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  15. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  16. Total coliforms, arsenic and cadmium exposure through drinking water in the Western Region of Ghana: application of multivariate statistical technique to groundwater quality.

    Science.gov (United States)

    Affum, Andrews Obeng; Osae, Shiloh Dede; Nyarko, Benjamin Jabez Botwe; Afful, Samuel; Fianko, Joseph Richmond; Akiti, Tetteh Thomas; Adomako, Dickson; Acquaah, Samuel Osafo; Dorleku, Micheal; Antoh, Emmanuel; Barnes, Felix; Affum, Enoch Acheampong

    2015-02-01

    In recent times, surface water resource in the Western Region of Ghana has been found to be inadequate in supply and polluted by various anthropogenic activities. As a result of these problems, the demand for groundwater by the human populations in the peri-urban communities for domestic, municipal and irrigation purposes has increased without prior knowledge of its water quality. Water samples were collected from 14 public hand-dug wells during the rainy season in 2013 and investigated for total coliforms, Escherichia coli, mercury (Hg), arsenic (As), cadmium (Cd) and physicochemical parameters. Multivariate statistical analysis of the dataset and a linear stoichiometric plot of major ions were applied to group the water samples and to identify the main factors and sources of contamination. Hierarchical cluster analysis revealed four clusters from the hydrochemical variables (R-mode) and three clusters in the case of water samples (Q-mode) after z score standardization. Principal component analysis after a varimax rotation of the dataset indicated that the four factors extracted explained 93.3 % of the total variance, which highlighted salinity, toxic elements and hardness pollution as the dominant factors affecting groundwater quality. Cation exchange, mineral dissolution and silicate weathering influenced groundwater quality. The ranking order of major ions was Na(+) > Ca(2+) > K(+) > Mg(2+) and Cl(-) > SO4 (2-) > HCO3 (-). Based on piper plot and the hydrogeology of the study area, sodium chloride (86 %), sodium hydrogen carbonate and sodium carbonate (14 %) water types were identified. Although E. coli were absent in the water samples, 36 % of the wells contained total coliforms (Enterobacter species) which exceeded the WHO guidelines limit of zero colony-forming unit (CFU)/100 mL of drinking water. With the exception of Hg, the concentration of As and Cd in 79 and 43 % of the water samples exceeded the WHO guideline limits of 10 and 3
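
    A sketch of the R-mode workflow described above (z-score standardization followed by principal component analysis and the fraction of variance explained), using synthetic data in place of the study's hydrochemical measurements; the hierarchical clustering and varimax rotation steps are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical data matrix: 14 wells x 8 hydrochemical variables
# (stand-ins for Na+, Ca2+, K+, Mg2+, Cl-, SO4^2-, HCO3-, As) -- synthetic
X = rng.normal(size=(14, 8))
X[:, 4] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=14)  # Cl- tracks Na+ (a "salinity" factor)

# z-score standardization, as used before the R-mode analysis
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# PCA via SVD of the standardized matrix
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)           # fraction of variance per PC
loadings = Vt.T * s / np.sqrt(len(Z) - 1)     # variable loadings per PC

print("variance explained by first 4 PCs: %.1f%%" % (100 * explained[:4].sum()))
```

    In the study itself, the loadings of the rotated factors were inspected to label the extracted factors (salinity, toxic elements, hardness); the same inspection can be done on the `loadings` matrix here.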

  17. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  18. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
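
    A sketch of the idea, assuming gray-level images and using histogram entropy as one such per-image statistic (the choice of statistic is an illustrative assumption, not the report's method):

```python
import numpy as np

def image_interest(img, levels=256):
    """Shannon entropy of the gray-level histogram: one cheap statistic
    for flagging images that carry more information than their neighbours."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def top_k(images, k=3):
    """Rank a batch by the statistic and keep the k most informative images."""
    scores = [image_interest(im) for im in images]
    order = np.argsort(scores)[::-1]
    return [images[i] for i in order[:k]]
```

    A flat image scores zero while a detailed one scores high, so sorting by the statistic surfaces the images worth looking at first.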

  19. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  20. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  1. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  2. Gonorrhea Statistics

    Science.gov (United States)


  3. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  4. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
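
    The prior-plus-likelihood updating that defines the Bayesian approach can be shown with the smallest possible example, a conjugate Beta-Binomial update (a standard textbook case, not drawn from the book; the numbers are illustrative):

```python
# Conjugate Beta-Binomial updating: with a Beta(a, b) prior on a success
# probability and k successes observed in n Bernoulli trials, the posterior
# is again a Beta distribution with updated parameters.
def posterior(a, b, k, n):
    """Return the parameters of the Beta posterior."""
    return a + k, b + n - k

a_post, b_post = posterior(2, 2, 7, 10)  # prior Beta(2, 2), observe 7/10
post_mean = a_post / (a_post + b_post)   # posterior mean = 9/14
```

    The Monte Carlo techniques the book emphasizes (MCMC, RJMCMC, variational methods) exist precisely for the many models where no such closed-form conjugate update is available.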

  5. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  6. Statistical learning and selective inference.

    Science.gov (United States)

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
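
    The cost of cherry-picking that motivates selective inference can be seen in a small simulation (not from the paper): with 20 pure-noise predictors, testing the one chosen after seeing the data rejects far more often than its nominal 5% level. The critical value 0.279 below is an approximate two-sided 5% threshold for a sample correlation at n = 50.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, reps = 50, 20, 500
r_crit = 0.279   # approx. two-sided 5% critical value for |r| at n = 50

naive_hits = 0   # a predictor fixed in advance
cherry_hits = 0  # the predictor chosen *after* seeing the data
for _ in range(reps):
    y = rng.standard_normal(n)
    X = rng.standard_normal((n, p))  # pure-noise predictors
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    r = Xc.T @ yc / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    naive_hits += bool(abs(r[0]) > r_crit)
    cherry_hits += bool(np.max(np.abs(r)) > r_crit)

print("pre-chosen predictor rejects:", naive_hits / reps)   # near 0.05
print("cherry-picked predictor rejects:", cherry_hits / reps)  # far above 0.05
```

    Selective inference methods correct for exactly this: the bar for significance is raised to account for the search that produced the association.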

  7. Angular-momentum nonclassicality by breaking classical bounds on statistics

    Energy Technology Data Exchange (ETDEWEB)

    Luis, Alfredo [Departamento de Optica, Facultad de Ciencias Fisicas, Universidad Complutense, E-28040 Madrid (Spain); Rivas, Angel [Departamento de Fisica Teorica I, Facultad de Ciencias Fisicas, Universidad Complutense, E-28040 Madrid (Spain)

    2011-10-15

    We derive simple practical procedures revealing the quantum behavior of angular momentum variables by the violation of classical upper bounds on the statistics. Data analysis is minimum and definite conclusions are obtained without evaluation of moments, or any other more sophisticated procedures. These nonclassical tests are very general and independent of other typical quantum signatures of nonclassical behavior such as sub-Poissonian statistics, squeezing, or oscillatory statistics, being insensitive to the nonclassical behavior displayed by other variables.

  8. Counting statistics in radioactivity measurements

    International Nuclear Information System (INIS)

    Martin, J.

    1975-01-01

    The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, Poisson's distribution law and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; statistical techniques in gamma spectrometry [fr]
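
    The Poisson statistics of radioactive counts lead directly to a standard uncertainty formula: the variance of a Poisson count equals the count itself, and independent variances add. A minimal sketch of net-count-rate propagation (a textbook formula, not taken from this reference; the function name and values are illustrative):

```python
import math

def counting_uncertainty(gross_counts, bkg_counts, t_sample, t_bkg):
    """Net count rate and its 1-sigma uncertainty under Poisson counting.
    Var(N) = N for a Poisson count, and the gross and background
    measurements are independent, so their rate variances add."""
    net_rate = gross_counts / t_sample - bkg_counts / t_bkg
    sigma = math.sqrt(gross_counts / t_sample**2 + bkg_counts / t_bkg**2)
    return net_rate, sigma

# e.g. 10000 gross counts in 100 s against 400 background counts in 100 s
rate, err = counting_uncertainty(10000, 400, 100, 100)
```

    The relative uncertainty of a single count falls as 1/sqrt(N), which is why longer counting times or stronger sources tighten the result.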

  9. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Science.gov (United States)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential for designing adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested within four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathways (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first-moment correction; first- and second-moment correction; regression functions; quantile mapping using distribution-derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts.
    In this work we propose an unequally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide better approximation to the basic
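
    Two of the scenario-generation techniques named above can be sketched in miniature: an additive delta change and empirical quantile mapping (the function names and synthetic series are illustrative; the abstract's other three transformation techniques are not shown):

```python
import numpy as np

def delta_change(obs_hist, mod_hist, mod_fut):
    """Additive delta change: shift the observed series by the model's
    mean climate-change signal (future minus historical)."""
    return obs_hist + (mod_fut.mean() - mod_hist.mean())

def empirical_quantile_mapping(obs_hist, mod_hist, mod_fut):
    """Bias correction by empirical quantile mapping: each future model
    value is replaced by the observed value at the same quantile of the
    historical model distribution."""
    q = np.searchsorted(np.sort(mod_hist), mod_fut) / len(mod_hist)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_hist, q)
```

    Delta change perturbs the observed series (so its variability stays observed), while quantile mapping corrects the modeled series (so changes in variability are retained); that difference is one reason ensembles over both approaches are useful.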

  10. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  11. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  12. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical Mechanics is self-sufficient and written in a lucid manner, keeping the university examination system in mind. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their ranges of application and limitations. The non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain the phase transition. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  13. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review.

    Science.gov (United States)

    Lamb, Karen E; Thornton, Lukar E; Cerin, Ester; Ball, Kylie

    2015-01-01

    Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.
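
    The check the review recommends, testing for spatial autocorrelation rather than assuming independent residuals, can be sketched with Moran's I (a textbook formula; the weights matrix and data below are illustrative):

```python
import numpy as np

def morans_i(values, W):
    """Moran's I statistic for spatial autocorrelation, given a spatial
    weights matrix W (W[i, j] > 0 when areas i and j are neighbours).
    Values near 0 suggest no autocorrelation; positive values suggest
    clustering of similar values among neighbours."""
    x = values - values.mean()
    num = len(x) * np.sum(W * np.outer(x, x))
    den = W.sum() * np.sum(x ** 2)
    return num / den

# four neighbourhoods in a chain, with similar values clustered together
W = np.zeros((4, 4))
for i in range(3):
    W[i, i + 1] = W[i + 1, i] = 1.0
I = morans_i(np.array([1.0, 1.0, -1.0, -1.0]), W)
```

    A clearly positive I on regression residuals is the warning sign the review describes: standard errors from ANOVA or Poisson regression that assume independence will be too small.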

  14. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Karen E. Lamb

    2015-07-01

    Full Text Available. Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000-2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results.

  15. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition E. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  16. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  17. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  18. Statistical Mechanics of Turbulent Flows

    International Nuclear Information System (INIS)

    Cambon, C

    2004-01-01

    This is a handbook for a computational approach to reacting flows, including background material on statistical mechanics. In this sense, the title is somewhat misleading with respect to other books dedicated to the statistical theory of turbulence (e.g. Monin and Yaglom). In the present book, emphasis is placed on modelling (engineering closures) for computational fluid dynamics. The probabilistic (pdf) approach is applied to the local scalar field, motivated first by the nonlinearity of chemical source terms which appear in the transport equations of reacting species. The probabilistic and stochastic approaches are also used for the velocity field and particle position; nevertheless they are essentially limited to Lagrangian models for a local vector, with only single-point statistics, as for the scalar. Accordingly, conventional techniques, such as single-point closures for RANS (Reynolds-averaged Navier-Stokes) and subgrid-scale models for LES (large-eddy simulations), are described and in some cases reformulated using underlying Langevin models and filtered pdfs. Even if the theoretical approach to turbulence is not discussed in general, the essentials of probabilistic and stochastic-processes methods are described, with a useful reminder concerning statistics at the molecular level. The book comprises 7 chapters. Chapter 1 briefly states the goals and contents, with a very clear synoptic scheme on page 2. Chapter 2 presents definitions and examples of pdfs and related statistical moments. Chapter 3 deals with stochastic processes, pdf transport equations, from Kramers-Moyal to Fokker-Planck (for Markov processes), and moment equations. Stochastic differential equations are introduced and their relationship to pdfs described. This chapter ends with a discussion of stochastic modelling. The equations of fluid mechanics and thermodynamics are addressed in chapter 4. Classical conservation equations (mass, velocity, internal energy) are derived from their
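
    The link between Langevin models and pdf evolution mentioned above can be illustrated with a minimal sketch: an Ornstein-Uhlenbeck process integrated by Euler-Maruyama, whose stationary pdf is Gaussian with variance sigma^2 / (2*theta). The parameters are arbitrary and not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(2)

# Euler-Maruyama integration of the Langevin (Ornstein-Uhlenbeck) model
#   dX = -theta * X dt + sigma dW
theta, sigma, dt, nsteps = 1.0, 0.5, 0.01, 100_000
x = np.empty(nsteps)
x[0] = 0.0
noise = rng.normal(scale=np.sqrt(dt), size=nsteps - 1)
for i in range(nsteps - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * noise[i]

# after a burn-in, the sample variance approaches sigma**2 / (2 * theta) = 0.125
print(x[20_000:].var())
```

    This one-to-one correspondence between a stochastic differential equation and the Fokker-Planck equation for its pdf is exactly the machinery the book uses to reformulate single-point closures.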

  19. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
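    The diagnostic-test measures reviewed above follow directly from a 2x2 table of test results against true disease status. As a hedged sketch (the counts below are invented for illustration, not taken from the article):

```python
# Illustrative sketch: core diagnostic-test metrics computed from a
# hypothetical 2x2 table (true/false positives and negatives).

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, accuracy, and likelihood ratios."""
    sensitivity = tp / (tp + fn)               # P(test+ | disease present)
    specificity = tn / (tn + fp)               # P(test- | disease absent)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

# Hypothetical study: 100 diseased and 200 healthy subjects.
sens, spec, acc, lrp, lrn = diagnostic_metrics(tp=90, fp=20, fn=10, tn=180)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"accuracy={acc:.2f} LR+={lrp:.1f} LR-={lrn:.2f}")
# -> sensitivity=0.90 specificity=0.90 accuracy=0.90 LR+=9.0 LR-=0.11
```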

  20. Statistical Computing

    Indian Academy of Sciences (India)

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  1. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.

  2. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  3. Learning Strategic Sophistication

    NARCIS (Netherlands)

    Blume, A.; DeJong, D.V.; Maier, M.

    2005-01-01

    We experimentally investigate coordination games in which cognition plays an important role, i.e. where outcomes are affected by the agents' level of understanding of the game and the beliefs they form about each other's understanding. We ask whether and when repeated exposure permits agents to learn

  4. Statistics techniques applied to electron probe microanalysis

    International Nuclear Information System (INIS)

    Brizuela, H.; Del Giorgio, M.; Budde, C.; Briozzo, C.; Riveros, J.

    1987-01-01

    A description of Montroll-West's general theory for a tridimensional random walk of a particle with internal degrees of freedom is given, connecting this problem with the solution of the master equation. The possibility of its application to EPMA is discussed. Numerical solutions are given for thick or collimated beams at several energies interacting with samples of different shape and size. The spatial distribution of particles within the sample, for a stationary state, is analysed, as well as the electron backscattering coefficient. (Author) [es

  5. Statistically tuned Gaussian background subtraction technique for ...

    Indian Academy of Sciences (India)

    temporal median method and mixture of Gaussian model and performance evaluation ... to process the videos captured by unmanned aerial vehicle (UAV). ..... The output is obtained by simulation using MATLAB 2010 in a standalone PC with ...

  6. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  7. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  8. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  9. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...

  10. Statistical mechanical analysis of LMFBR fuel cladding tubes

    International Nuclear Information System (INIS)

    Poncelet, J.-P.; Pay, A.

    1977-01-01

    The most important design requirement on fuel pin cladding for LMFBRs is its mechanical integrity. Disruptive factors include internal pressure from mixed oxide fuel fission gas release, thermal stresses and high temperature creep, neutron-induced differential void-swelling as a source of stress in the cladding and irradiation creep of stainless steel material, and corrosion by fission products. Under irradiation these load-restraining mechanisms are accentuated by stainless steel embrittlement and strength alterations. To account for the numerous uncertainties involved in the analysis by theoretical models and computer codes, statistical tools such as Monte Carlo simulation methods are unavoidable. Thanks to these techniques, uncertainties in nominal characteristics, material properties and environmental conditions can be linked up in a correct way and used for a more accurate conceptual design. First, a thermal creep damage index is set up through a sufficiently sophisticated clad physical analysis including arbitrary time dependence of power and neutron flux as well as effects of sodium temperature, burnup and steel mechanical behavior. Although this strain-limit approach implies a more general but time-consuming model, the net output is improved and e.g. clad temperature, stress and strain maxima may be easily assessed. A full spectrum of variables is statistically treated to account for their probability distributions. Creep damage probability may be obtained and can contribute to a quantitative estimate of fuel failure probability
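    The Monte Carlo idea described above, sampling uncertain inputs from their probability distributions and counting how often a damage limit is exceeded, can be sketched as follows. The damage-index formula, the parameter values, and their distributions below are invented stand-ins, not the paper's actual clad model:

```python
# Toy Monte Carlo sketch (assumed toy model, not the paper's): propagate
# uncertainty in clad temperature and stress through a made-up creep
# damage index and estimate the probability of exceeding a limit of 1.0.
import random

random.seed(42)

def damage_index(temp_k, stress_mpa):
    # Hypothetical index, increasing in temperature and stress.
    return (temp_k / 900.0) ** 4 * (stress_mpa / 100.0)

N = 100_000
exceed = 0
for _ in range(N):
    temp = random.gauss(870.0, 15.0)    # clad temperature, K (assumed)
    stress = random.gauss(95.0, 8.0)    # hoop stress, MPa (assumed)
    if damage_index(temp, stress) > 1.0:
        exceed += 1

print(f"estimated exceedance probability: {exceed / N:.3f}")
```

    The same loop generalises to any number of uncertain inputs; the price is simply more samples for a stable tail estimate.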

  11. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  12. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  13. Initiating statistical maintenance optimization

    International Nuclear Information System (INIS)

    Doyle, E. Kevin; Tuomi, Vesa; Rowley, Ian

    2007-01-01

    Since the 1980s maintenance optimization has been centered around various formulations of Reliability Centered Maintenance (RCM). Several such optimization techniques have been implemented at the Bruce Nuclear Station. Further cost refinement of the Station preventive maintenance strategy includes evaluation of statistical optimization techniques. A review of successful pilot efforts in this direction is provided as well as initial work with graphical analysis. The present situation regarding data sourcing, the principal impediment to the use of stochastic methods in previous years, is discussed. The use of Crow/AMSAA (Army Materiel Systems Analysis Activity) plots is demonstrated from the point of view of justifying expenditures in optimization efforts. (author)
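    A Crow/AMSAA plot models cumulative failures as N(t) = lambda * t^beta, which is linear in log-log space; a fitted slope beta below 1 indicates reliability growth (failures arriving ever more slowly). A minimal sketch with invented failure times, not station data:

```python
# Hedged sketch of a Crow/AMSAA reliability-growth fit: regress
# log(cumulative failures) on log(cumulative time); the slope is beta.
import math

# Invented cumulative operating times (hours) of successive failures.
failure_times = [35, 110, 240, 450, 790, 1200, 1800, 2600]
xs = [math.log(t) for t in failure_times]
ys = [math.log(n) for n in range(1, len(failure_times) + 1)]

# Ordinary least squares for slope (beta) and intercept (log lambda).
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
beta = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
       sum((x - xbar) ** 2 for x in xs)
log_lambda = ybar - beta * xbar

print(f"beta = {beta:.2f} (beta < 1 indicates reliability growth)")
```

    Plotting the same points on log-log axes gives the visual version used to argue that an optimization effort is paying off.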

  14. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  15. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...... that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical...... and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...

  16. Yersinia virulence factors - a sophisticated arsenal for combating host defences [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Steve Atkinson

    2016-06-01

    Full Text Available The human pathogens Yersinia pseudotuberculosis and Yersinia enterocolitica cause enterocolitis, while Yersinia pestis is responsible for pneumonic, bubonic, and septicaemic plague. All three share an infection strategy that relies on a virulence factor arsenal to enable them to enter, adhere to, and colonise the host while evading host defences to avoid untimely clearance. Their arsenal includes a number of adhesins that allow the invading pathogens to establish a foothold in the host and to adhere to specific tissues later during infection. When the host innate immune system has been activated, all three pathogens produce a structure analogous to a hypodermic needle. In conjunction with the translocon, which forms a pore in the host membrane, the channel that is formed enables the transfer of six ‘effector’ proteins into the host cell cytoplasm. These proteins mimic host cell proteins but are more efficient than their native counterparts at modifying the host cell cytoskeleton, triggering the host cell suicide response. Such a sophisticated arsenal ensures that yersiniae maintain the upper hand despite the best efforts of the host to counteract the infecting pathogen.

  17. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    a good collection of official statistics of that time. With more .... statistical agencies and institutions to provide details of statistical activities .... ing several training programmes. .... ful completion of Indian Statistical Service examinations, the.

  18. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author covers tests of the assumptions of randomness and normality and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal

  19. Measured and simulated effects of sophisticated drainage techniques on groundwater level and runoff hydrochemistry in areas of boreal acid sulphate soils

    Directory of Open Access Journals (Sweden)

    I. BÄRLUND

    2008-12-01

    Full Text Available To abate the environmental problems caused by the severe acidity and high metal concentrations in rivers draining acid sulphate (AS) soils of Western Finland, control drainage (CD) and lime filter drainage (LFD), and their combination, were investigated. The effectiveness of these best management practices (BMPs) on drainage water quality was studied at plot scale in two locations. In Ilmajoki, where the sulphidic materials are more than 2 m below the soil surface, CD efficiently reduced the sulphate, aluminium, manganese and iron concentrations and to some extent also increased the pH of the drainage waters. LFD, in contrast, effectively reduced the drainage water acidity and raised the pH level. Decrease of the groundwater level owing to strong evapotranspiration in summer could, however, not be properly prevented by CD. In Mustasaari, where sulphidic materials were as shallow as 1 m below the soil surface, the positive effects of LFD recognised in Ilmajoki were hardly seen. This shows that the tested BMPs work properly, and can thus be recommended, for intensively artificially drained AS soils like those in Ilmajoki, where most of the acidity has already been transported to watercourses. LFD cannot, however, be recommended for as yet poorly leached and thus particularly problematic AS soils like those in Mustasaari. This is, of course, a drawback of the tested BMP, as it is not effective for the soils which would need it most. The field data were tentatively utilised to test the performance of the HAPSU (Ionic Flow Model for Acid Sulphate Soils) simulation model developed to estimate the loads of harmful substances from AS soils.

  20. Comparative evaluation of scatter correction techniques in 3D positron emission tomography

    CERN Document Server

    Zaidi, H

    2000-01-01

    Much research and development has been concentrated on the scatter compensation required for quantitative 3D PET. Increasingly sophisticated scatter correction procedures are under investigation, particularly those based on accurate scatter models, and iterative reconstruction-based scatter compensation approaches. The main difference among the correction methods is the way in which the scatter component in the selected energy window is estimated. Monte Carlo methods give further insight and might in themselves offer a possible correction procedure. Methods: Five scatter correction methods are compared in this paper where applicable: the dual-energy window (DEW) technique, the convolution-subtraction (CVS) method, two variants of the Monte Carlo-based scatter correction technique (MCBSC1 and MCBSC2), and our newly developed statistical reconstruction-based scatter correction (SRBSC) method. These scatter correction techniques are evaluated using Monte Carlo simulation studies, experimental phantom measurements...

  1. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  2. A Snapshot of Serial Rape: An Investigation of Criminal Sophistication and Use of Force on Victim Injury and Severity of the Assault.

    Science.gov (United States)

    de Heer, Brooke

    2016-02-01

    Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers to serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the level of criminal sophistication of the rapist and use of force used were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes between a White rapist and White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.

  3. Quantitative Portfolio Optimization Techniques Applied to the Brazilian Stock Market

    Directory of Open Access Journals (Sweden)

    André Alves Portela Santos

    2012-09-01

    Full Text Available In this paper we assess the out-of-sample performance of two alternative quantitative portfolio optimization techniques - mean-variance and minimum-variance optimization - and compare their performance with respect to a naive 1/N (or equally weighted) portfolio and also to the market portfolio given by the Ibovespa. We focus on short-selling-constrained portfolios and consider alternative estimators for the covariance matrix: the sample covariance matrix, RiskMetrics, and three covariance estimators proposed by Ledoit and Wolf (2003), Ledoit and Wolf (2004a) and Ledoit and Wolf (2004b). Taking into account alternative portfolio re-balancing frequencies, we compute out-of-sample performance statistics which indicate that the quantitative approaches delivered improved results in terms of lower portfolio volatility and better risk-adjusted returns. Moreover, the use of more sophisticated estimators for the covariance matrix generated optimal portfolios with lower turnover over time.
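    The contrast between the naive 1/N rule and minimum-variance optimization can be sketched in a few lines. With a covariance estimate S, the unconstrained minimum-variance weights have the closed form w = S^{-1}1 / (1'S^{-1}1). The returns below are synthetic, not Ibovespa data, and the sketch omits the short-selling constraint and the shrinkage estimators studied in the paper:

```python
# Minimal sketch: equal weights vs. analytic minimum-variance weights
# built from a sample covariance matrix of synthetic daily returns.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.001, 0.02, size=(250, 4))  # 250 days, 4 assets
returns[:, 3] *= 2.5                              # one deliberately noisy asset

S = np.cov(returns, rowvar=False)                 # sample covariance matrix
ones = np.ones(S.shape[0])
w_mv = np.linalg.solve(S, ones)
w_mv /= w_mv.sum()                                # minimum-variance weights
w_eq = ones / ones.size                           # naive 1/N weights

def vol(w):
    """In-sample portfolio volatility under covariance S."""
    return float(np.sqrt(w @ S @ w))

print(f"1/N vol={vol(w_eq):.4f}  min-var vol={vol(w_mv):.4f}")
```

    In-sample, the minimum-variance portfolio is guaranteed to have volatility no higher than 1/N under the same S; the paper's point is that this advantage can survive out-of-sample when the covariance matrix is estimated well.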

  4. Engaging with the Art & Science of Statistics

    Science.gov (United States)

    Peters, Susan A.

    2010-01-01

    How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…

  5. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    Science.gov (United States)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  6. Using statistical sensitivities for adaptation of a best-estimate thermo-hydraulic simulation model

    International Nuclear Information System (INIS)

    Liu, X.J.; Kerner, A.; Schaefer, A.

    2010-01-01

    On-line adaptation of best-estimate simulations of NPP behaviour to time-dependent measurement data can be used to ensure that simulations performed in parallel to plant operation develop synchronously with the real plant behaviour even over extended periods of time. This opens a range of applications, including operator support in non-standard situations and improving diagnostics and validation of measurements in real plants or experimental facilities. A number of adaptation methods have been proposed and successfully applied to control problems. However, these methods are difficult to apply to best-estimate thermal-hydraulic codes, such as TRACE and ATHLET, with their large nonlinear differential equation systems and sophisticated time integration techniques. This paper presents techniques that use statistical sensitivity measures to overcome those problems by reducing the number of parameters subject to adaptation. It describes how to identify the most significant parameters for adaptation and how this information can be used by combining: decomposition techniques that split the system into a small set of component parts with clearly defined interfaces, where boundary conditions can be derived from the measurement data; filtering techniques to ensure that the time frame for adaptation is meaningful; and numerical sensitivities to find minimal-error conditions. The suitability of combining those techniques is shown by application to an adaptive simulation of the PKL experiment.

  7. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  8. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  9. Statistical methods for ranking data

    CERN Document Server

    Alvo, Mayer

    2014-01-01

    This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
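    The distance-function view of rank correlation mentioned above can be made concrete with the Kendall distance, which counts item pairs that two rankings order differently; normalising it recovers the familiar Kendall tau coefficient. A small sketch with hypothetical rankings (not an example from the book):

```python
# Kendall distance between two rankings, and the tau coefficient
# obtained by rescaling it to the [-1, 1] range.
from itertools import combinations

def kendall_distance(r1, r2):
    """Number of item pairs ordered differently by the two rankings."""
    pos1 = {item: i for i, item in enumerate(r1)}
    pos2 = {item: i for i, item in enumerate(r2)}
    return sum(
        1
        for a, b in combinations(r1, 2)
        if (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0
    )

judge_a = ["A", "B", "C", "D"]   # hypothetical rankings by two judges
judge_b = ["B", "A", "D", "C"]
d = kendall_distance(judge_a, judge_b)
n = len(judge_a)
tau = 1 - 4 * d / (n * (n - 1))  # tau = 1 when identical, -1 when reversed
print(f"distance={d}, tau={tau:.2f}")
# -> distance=2, tau=0.33
```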

  10. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...

  11. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  12. Identification of heavy metals sources in the Mexico city atmosphere, using the proton induced x-ray analytical technique and multifactorial statistics techniques; Identificacion de fuentes de metales pesados en la atmosfera de la Ciudad de Mexico, usando la tecnica de analisis por induccion de rayos X con proton y tecnicas estadisticas multifactoriales

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez M, B [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)

    1997-07-01

    The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of these polluting chemical elements over the annual cycle of 1990, based on their concentrations obtained with the PIXE technique; to identify suitable statistical methods for the metal concentration data, measured as total suspended particles (PST), obtained in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to support the decision-making and control measures planned by the various institutions addressing atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)

  13. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. The LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was then accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins; therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
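
    The Monte Carlo side of such an analysis can be sketched generically: sample the uncertain inputs, evaluate a response surface, and count states violating an acceptance limit. The surface coefficients, input uncertainties, and the 1.3 limit below are all hypothetical, chosen only to illustrate the propagation step, and are not taken from the report:

    ```python
    import random
    import statistics

    # Hypothetical quadratic response surface for minimum DNBR as a function of
    # two normalized inputs (e.g. power and flow perturbations); the coefficients
    # are illustrative, not the LYNX-fitted surface from the report.
    def dnbr_surface(power, flow):
        return 2.0 - 0.8 * power + 0.5 * flow - 0.1 * power * power

    random.seed(1)
    N = 100_000
    # Propagate assumed input uncertainties (Gaussian perturbations about the
    # nominal condition) through the surface by Monte Carlo sampling.
    samples = [dnbr_surface(random.gauss(0.0, 0.05), random.gauss(0.0, 0.05))
               for _ in range(N)]

    mean_dnbr = statistics.fmean(samples)
    # Fraction of sampled states below a hypothetical DNBR acceptance limit.
    p_below = sum(s < 1.3 for s in samples) / N
    print(f"mean DNBR ~ {mean_dnbr:.3f}, P(DNBR < 1.3) ~ {p_below:.5f}")
    ```

    The same sampling loop works for any response surface; only the surface function and the input distributions change.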

  14. Introductory statistics for engineering experimentation

    CERN Document Server

    Nelson, Peter R; Coffin, Marie

    2003-01-01

    The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often results in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. In developing a one-semester course whose purpose was to introduce engineering/scientific students to the most useful statistical methods, this book was created to satisfy those needs. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...

  15. Sophisticated Epistemologies of Physics versus High-Stakes Tests: How Do Elite High School Students Respond to Competing Influences about How to Learn Physics?

    Science.gov (United States)

    Yerdelen-Damar, Sevda; Elby, Andrew

    2016-01-01

    This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to…

  16. On quantum statistical inference

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. ... Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...

  17. Plasma transport studies using transient techniques

    International Nuclear Information System (INIS)

    Simonen, T.C.; Brower, D.L.; Efthimion, P.

    1991-01-01

    Selected topics from the Transient Transport sessions of the Transport Task Force Workshop, held February 19-23, 1990, in Hilton Head, South Carolina, are summarized. Presentations on sawtooth propagation, ECH modulation, particle modulation, and H-mode transitions are included. The research results presented indicated a growing theoretical understanding and experimental sophistication in the application of transient techniques to transport studies. (Author)

  18. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  19. Statistical assessment of the learning curves of health technologies.

    Science.gov (United States)

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)

  20. A primer of multivariate statistics

    CERN Document Server

    Harris, Richard J

    2014-01-01

    Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why

  1. Isotopic safeguards statistics

    International Nuclear Information System (INIS)

    Timmerman, C.L.; Stewart, K.B.

    1978-06-01

    The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are the paired comparison and regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories. Paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model (y = α + βx) and the initial point model [y − y₀ = β(x − x₀)]. The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models: (1) the usual least squares approach, in which x is assumed to be measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and treats both x and y as measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The isotopic function illustrated is the ratio of Pu/U versus ²³⁵D, where ²³⁵D is the amount of depleted ²³⁵U expressed in weight percent. Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach, by simple comparison of the given regression variances with the random variance from the paired comparison results.
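
    The two fitting methods contrasted in the record can be sketched with synthetic data: ordinary least squares, which assumes x is error-free, versus Deming regression, which allows for measurement error in both coordinates. The data below are invented for illustration (not the Yankee Rowe measurements), and `delta` is the assumed ratio of the y-error variance to the x-error variance:

    ```python
    import numpy as np

    def ols_slope_intercept(x, y):
        """Usual least squares fit, treating x as measured without error."""
        b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
        a = y.mean() - b * x.mean()
        return a, b

    def deming_slope_intercept(x, y, delta=1.0):
        """Deming regression: both x and y measured with error.
        delta = (y-error variance) / (x-error variance)."""
        sxx = np.var(x)
        syy = np.var(y)
        sxy = np.cov(x, y, bias=True)[0, 1]
        b = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
        a = y.mean() - b * x.mean()
        return a, b

    # Synthetic paired-comparison-style data: a true linear relation observed
    # with noise in both coordinates (values are illustrative only).
    rng = np.random.default_rng(0)
    x_true = np.linspace(0, 10, 50)
    x = x_true + rng.normal(0, 0.2, x_true.size)
    y = 0.5 + 0.3 * x_true + rng.normal(0, 0.2, x_true.size)

    print("OLS:   ", ols_slope_intercept(x, y))
    print("Deming:", deming_slope_intercept(x, y, delta=1.0))
    ```

    With error in both coordinates, OLS attenuates the slope toward zero, while the Deming estimate remains consistent; with this much x-spread the difference is small but systematic.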

  2. Do sophisticated epistemic beliefs predict meaningful learning? Findings from a structural equation model of undergraduate biology learning

    Science.gov (United States)

    Lee, Silvia Wen-Yu; Liang, Jyh-Chong; Tsai, Chin-Chung

    2016-10-01

    This study investigated the relationships among college students' epistemic beliefs in biology (EBB), conceptions of learning biology (COLB), and strategies of learning biology (SLB). EBB includes four dimensions, namely 'multiple-source,' 'uncertainty,' 'development,' and 'justification.' COLB is further divided into 'constructivist' and 'reproductive' conceptions, while SLB represents deep strategies and surface learning strategies. Questionnaire responses were gathered from 303 college students. The results of the confirmatory factor analysis and structural equation modelling showed acceptable model fits. Mediation testing further revealed two paths with complete mediation. In sum, students' epistemic beliefs of 'uncertainty' and 'justification' in biology were statistically significant in explaining the constructivist and reproductive COLB, respectively; and 'uncertainty' was statistically significant in explaining the deep SLB as well. The results of mediation testing further revealed that 'uncertainty' predicted surface strategies through the mediation of 'reproductive' conceptions; and the relationship between 'justification' and deep strategies was mediated by 'constructivist' COLB. This study provides evidence for the essential roles some epistemic beliefs play in predicting students' learning.

  3. MQSA National Statistics

    Science.gov (United States)

    MQSA National Statistics ... but should level off with time. Archived Scorecard Statistics: 2018 Scorecard Statistics, 2017 Scorecard Statistics, 2016 Scorecard ...

  4. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  5. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined, both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. Exact analytical results for the ideal gas show that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree in its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
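
    The equivalence in the q → 1 limit rests on a standard property of the Rényi entropy; the following derivation is a textbook sketch included for context, not material from the paper itself:

    ```latex
    S_q^{R} = \frac{1}{1-q}\,\ln \sum_i p_i^{\,q}, \qquad q > 0,\; q \neq 1 .
    ```

    Both the logarithm and the denominator of the prefactor vanish as q → 1, so by L'Hôpital's rule

    ```latex
    \lim_{q \to 1} S_q^{R}
      = \lim_{q \to 1} \frac{\sum_i p_i^{\,q} \ln p_i}{-\sum_i p_i^{\,q}}
      = -\sum_i p_i \ln p_i ,
    ```

    which is the Boltzmann-Gibbs (Shannon) entropy.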

  6. Sampling, Probability Models and Statistical Reasoning Statistical Inference

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 5. Sampling, Probability Models and Statistical Reasoning Statistical Inference. Mohan Delampady V R Padmawar. General Article Volume 1 Issue 5 May 1996 pp 49-58 ...

  7. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  8. BOOK REVIEW: Statistical Mechanics of Turbulent Flows

    Science.gov (United States)

    Cambon, C.

    2004-10-01

    This is a handbook for a computational approach to reacting flows, including background material on statistical mechanics. In this sense, the title is somewhat misleading with respect to other books dedicated to the statistical theory of turbulence (e.g. Monin and Yaglom). In the present book, emphasis is placed on modelling (engineering closures) for computational fluid dynamics. The probabilistic (pdf) approach is applied to the local scalar field, motivated first by the nonlinearity of chemical source terms which appear in the transport equations of reacting species. The probabilistic and stochastic approaches are also used for the velocity field and particle position; nevertheless they are essentially limited to Lagrangian models for a local vector, with only single-point statistics, as for the scalar. Accordingly, conventional techniques, such as single-point closures for RANS (Reynolds-averaged Navier-Stokes) and subgrid-scale models for LES (large-eddy simulations), are described and in some cases reformulated using underlying Langevin models and filtered pdfs. Even if the theoretical approach to turbulence is not discussed in general, the essentials of probabilistic and stochastic-processes methods are described, with a useful reminder concerning statistics at the molecular level. The book comprises 7 chapters. Chapter 1 briefly states the goals and contents, with a very clear synoptic scheme on page 2. Chapter 2 presents definitions and examples of pdfs and related statistical moments. Chapter 3 deals with stochastic processes and pdf transport equations, from Kramers-Moyal to Fokker-Planck (for Markov processes), and moment equations. Stochastic differential equations are introduced and their relationship to pdfs described. This chapter ends with a discussion of stochastic modelling. The equations of fluid mechanics and thermodynamics are addressed in chapter 4. Classical conservation equations (mass, velocity, internal energy) are derived from their

  9. Investigation of HIV-1 infected and uninfected cells using the optical trapping technique

    CSIR Research Space (South Africa)

    Ombinda-Lemboumba, Saturnin

    2017-02-01

    Full Text Available Optical trapping has emerged as an essential tool for manipulating single biological materials and performing sophisticated spectroscopic analysis on individual cells. The optical trapping technique has been used to grab and immobilize cells from a...

  10. Intermediate statistics a modern approach

    CERN Document Server

    Stevens, James P

    2007-01-01

    Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.

  11. A review on diagnostic techniques for brucellosis | Kaltungo | African ...

    African Journals Online (AJOL)

    ... but has not been validated for standard laboratory use. This paper highlights useful samples and, especially the different conventional to more sophisticated molecular techniques for the diagnosis of brucellosis. Keywords: Brucellosis, diagnosis, techniques. African Journal of Biotechnology, Vol. 13(1), pp. 1-10, 1 January, ...

  12. Chemometric and Statistical Analyses of ToF-SIMS Spectra of Increasingly Complex Biological Samples

    Energy Technology Data Exchange (ETDEWEB)

    Berman, E S; Wu, L; Fortson, S L; Nelson, D O; Kulp, K S; Wu, K J

    2007-10-24

    Characterizing and classifying molecular variation within biological samples is critical for determining the fundamental mechanisms of biological processes, leading to new insights including improved disease understanding. Towards these ends, time-of-flight secondary ion mass spectrometry (ToF-SIMS) was used to examine increasingly complex samples of biological relevance, including monosaccharide isomers, pure proteins, complex protein mixtures, and mouse embryo tissues. The complex mass spectral data sets produced were analyzed using five common statistical and chemometric multivariate analysis techniques: principal component analysis (PCA), linear discriminant analysis (LDA), partial least squares discriminant analysis (PLS-DA), soft independent modeling of class analogy (SIMCA), and decision tree analysis by recursive partitioning. PCA was found to be a valuable first step in multivariate analysis, providing insight both into the relative groupings of samples and into the molecular basis for those groupings. For the monosaccharide, pure protein, and protein mixture samples, LDA, PLS-DA, and SIMCA all produced excellent classification, provided a sufficient number of compound variables was calculated. For the mouse embryo tissues, however, SIMCA did not produce as accurate a classification. The decision tree analysis was the least successful for all the data sets, providing neither as accurate a classification nor chemical insight for any of the tested samples. Based on these results, we conclude that as the complexity of the sample increases, so must the sophistication of the multivariate technique used to classify the samples. PCA is a preferred first step for understanding ToF-SIMS data, and can be followed by either LDA or PLS-DA for effective classification analysis. This study demonstrates the strength of ToF-SIMS combined with multivariate statistical and chemometric techniques to classify increasingly complex biological samples.
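
    As a minimal illustration of PCA as a first step, the following numpy sketch projects toy "spectra" from two simulated groups onto their leading principal components. The data, channel count, and group structure are invented for illustration and have nothing to do with the actual ToF-SIMS measurements:

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """PCA via eigendecomposition of the covariance matrix of mean-centered
        data; returns component scores and explained-variance ratios."""
        Xc = X - X.mean(axis=0)
        cov = np.cov(Xc, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
        order = np.argsort(eigvals)[::-1][:n_components]  # take the largest
        scores = Xc @ eigvecs[:, order]
        ratios = eigvals[order] / eigvals.sum()
        return scores, ratios

    # Toy stand-in for spectral data: two groups of "spectra" (rows) over six
    # mass channels, differing mainly in the first two channels.
    rng = np.random.default_rng(42)
    group_a = rng.normal([5, 1, 0, 0, 0, 0], 0.1, size=(20, 6))
    group_b = rng.normal([1, 5, 0, 0, 0, 0], 0.1, size=(20, 6))
    X = np.vstack([group_a, group_b])

    scores, ratios = pca_scores(X)
    # The first principal component captures the between-group difference.
    sep = scores[:20, 0].mean() - scores[20:, 0].mean()
    print(f"explained variance ratios: {ratios.round(3)}, separation: {sep:.2f}")
    ```

    In a real workflow the PC scores, rather than the raw channels, would then feed a classifier such as LDA or PLS-DA, as the study describes.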

  13. Study of intramolecular isotope heterogeneity of organic oxy acids in order to detect sophisticated wines and juice drinks

    Directory of Open Access Journals (Sweden)

    Kuzmina Helen

    2014-01-01

    Full Text Available According to the International Code of Oenological Practices, the use of L(+)-tartaric acid for wine acidification is allowed, while the use of synthetic dihydroxysuccinic (tartaric) acid is forbidden. At present it is impossible to differentiate natural dihydroxysuccinic acid from synthetic acid by standard techniques. Even with the very sensitive method of isotope mass spectrometry, difficulties arise because the total carbon isotope characteristics of dihydroxysuccinic acid of different origins have the same values. However, the carbon isotope characteristics of the intramolecular structural groups of dihydroxysuccinic acid made from different raw materials differ significantly. This makes it possible to establish the origin of the dihydroxysuccinic acid used in making wines and juice drinks. In Russia, the scientific and research institute of the beer brewing and wine-making industry carried out a study of the intramolecular isotope heterogeneity of dihydroxysuccinic acid of different origins in order to identify wines and juice drinks. The isotope characteristics of organic oxy acids of different origins, including those obtained synthetically, were studied, and the numeric range of the value δ¹³C, ‰, was specified. The results obtained allow identification tests of wines and juice drinks to detect products containing additives not permitted in the production process.

  14. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  15. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  16. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  17. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/ ... a primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  18. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  19. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  20. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 01/ ... the body. It is important to remember that statistics on the survival rates for people with a ...

  1. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  2. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  3. A Comparison of Several Statistical Tests of Reciprocity of Self-Disclosure.

    Science.gov (United States)

    Dindia, Kathryn

    1988-01-01

    Reports the results of a study that used several statistical tests of reciprocity of self-disclosure. Finds little evidence for reciprocity of self-disclosure, and concludes that either reciprocity is an illusion, or that different or more sophisticated methods are needed to detect it. (MS)

  4. The CEO performance effect : Statistical issues and a complex fit perspective

    NARCIS (Netherlands)

    Blettner, D.P.; Chaddad, F.R.; Bettis, R.

    2012-01-01

    How CEOs affect strategy and performance is important to strategic management research. We show that sophisticated statistical analysis alone is problematic for establishing the magnitude and causes of CEO impact on performance. We discuss three problem areas that substantially distort the

  5. EU-Korea FTA and Its Impact on V4 Economies. A Comparative Analysis of Trade Sophistication and Intra-Industry Trade

    Directory of Open Access Journals (Sweden)

    Michalski Bartosz

    2018-03-01

    Full Text Available This paper investigates selected short- and mid-term effects on trade in goods between the Visegrad countries (V4: the Czech Republic, Hungary, Poland and the Slovak Republic) and the Republic of Korea under the framework of the Free Trade Agreement between the European Union and the Republic of Korea. This Agreement is described in the “Trade for All” (2015: 9) strategy as the most ambitious trade deal ever implemented by the EU. The primary purpose of our analysis is to identify, compare, and evaluate the evolution of the technological sophistication of bilateral exports and imports. Another dimension of the paper concentrates on developments within intra-industry trade. Moreover, these objectives are approached taking into account the context of the South Korean direct investment inflow to the V4. The evaluation of technological sophistication is based on UNCTAD's methodology, while the intensity of intra-industry trade is measured by the GL-index and the identification of its subcategories (horizontal and vertical trade). The analysis covers the timespan 2001–2015. The novelty of the paper lies in the fact that the study of South Korean–V4 trade relations has not so far been carried out from this perspective. This paper thus investigates interesting phenomena identified in the trade between the Republic of Korea (ROK) and the V4 economies. The main findings imply an impact of South Korean direct investments on trade, represented by the trade deficit of the V4 with the ROK and the structure of bilateral trade in terms of its technological sophistication. South Korean investments might also have had positive consequences for the evolution of IIT, particularly in the machinery sector. The political interpretation indicates that they may strengthen common threats associated with the middle-income trap, particularly the technological gap and the emphasis placed on lower costs of production.
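
    The GL-index (Grubel-Lloyd index) used in the paper to measure intra-industry trade intensity has a simple closed form; a sketch with hypothetical trade flows (not the actual V4-Korea data):

    ```python
    def grubel_lloyd(exports, imports):
        """Grubel-Lloyd index of intra-industry trade for one industry:
        GL = 1 - |X - M| / (X + M), where 0 means purely inter-industry
        trade and 1 means fully balanced intra-industry trade."""
        return 1.0 - abs(exports - imports) / (exports + imports)

    # Hypothetical bilateral flows in the same currency units, for illustration.
    flows = {"machinery": (120.0, 100.0), "vehicles": (30.0, 90.0)}
    for industry, (x, m) in flows.items():
        print(industry, round(grubel_lloyd(x, m), 3))
    ```

    Nearly balanced two-way trade (machinery) yields a GL-index close to 1, while a large one-sided imbalance (vehicles) pushes it toward 0.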

  6. Statistics in Schools

    Science.gov (United States)

    Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real-life data. Explore the site for standards-aligned, classroom-ready activities.

  7. Transport Statistics - Transport - UNECE

    Science.gov (United States)

    UNECE Transport – Areas of Work: Transport Statistics. Working Party on Transport Statistics (WP.6): Terms of Reference, Meetings and Events ...

  8. Analytical system availability techniques

    NARCIS (Netherlands)

    Brouwers, J.J.H.; Verbeek, P.H.J.; Thomson, W.R.

    1987-01-01

    Analytical techniques are presented to assess the probability distributions and related statistical parameters of loss of production from equipment networks subject to random failures and repairs. The techniques are based on a theoretical model for system availability, which was further developed
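
    A minimal sketch of the kind of availability arithmetic such network models build on, assuming independent, repairable units in series; the MTBF/MTTR figures are hypothetical and the formula is the standard steady-state result, not the paper's own model:

    ```python
    def availability(mtbf, mttr):
        """Steady-state availability of a single repairable unit:
        A = MTBF / (MTBF + MTTR)."""
        return mtbf / (mtbf + mttr)

    # Hypothetical units in series: the system is up only if every unit is up,
    # so (assuming independent failures and repairs) availabilities multiply.
    units = [(1000.0, 10.0), (500.0, 5.0), (2000.0, 40.0)]  # (MTBF, MTTR) hours
    system_availability = 1.0
    for mtbf, mttr in units:
        system_availability *= availability(mtbf, mttr)
    print(f"system availability ~ {system_availability:.4f}")
    ```

    The analytical techniques in the paper go further, deriving the full probability distribution of lost production rather than just this long-run mean.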

  9. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single-particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description, and the corresponding statistical mechanics.

  10. Multivariate Statistical Analysis of Orthogonal Mass Spectral Data for the Identification of Chemical Attribution Signatures of 3-Methylfentanyl

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, B. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Valdez, C. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); DeHope, A. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Spackman, P. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sanner, R. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Martinez, H. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Williams, A. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-11-28

    Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process relies heavily on the identification of compounds indicative of clandestine or commercial production. The results of these studies can yield detailed information on the method of manufacture, the sophistication of the synthesis operation, the starting material source, and the final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, the number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As in our previous work on CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
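    PLS-DA itself requires a dedicated library; as a stdlib-only stand-in, the sketch below classifies a "blind" sample by nearest centroid over simulated signature vectors. The route names, signature dimensions, and noise level are all invented for illustration, not taken from the paper.

```python
# Toy route classification from chemical signature vectors.
# Stand-in for PLS-DA: nearest-centroid on simulated training data.
import math
import random

random.seed(0)
ROUTES = {"route_A": [1.0, 0.2, 0.0], "route_B": [0.1, 1.0, 0.5]}

def simulate(route, n=20, noise=0.05):
    """Generate n noisy signature vectors around a route's true centre."""
    centre = ROUTES[route]
    return [[x + random.gauss(0, noise) for x in centre] for _ in range(n)]

def centroid(samples):
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def classify(sample, centroids):
    """Assign the sample to the route with the nearest centroid."""
    return min(centroids, key=lambda r: math.dist(sample, centroids[r]))

train = {r: simulate(r) for r in ROUTES}
cents = {r: centroid(s) for r, s in train.items()}
blind = [1.02, 0.18, 0.03]          # a hypothetical blind crude sample
print(classify(blind, cents))       # → route_A
```

    A real workflow would project the 240-dimensional data onto discriminant components first; the nearest-centroid step here only illustrates the final classification idea.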

  11. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.

  12. Understanding search trees via statistical physics

    Indian Academy of Sciences (India)

    We study the m-ary search tree model (where m stands for the number of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.

  13. Telling the truth with statistics

    CERN Multimedia

    CERN. Geneva; CERN. Geneva. Audiovisual Unit

    2002-01-01

    This course of lectures will cover probability, distributions, fitting, errors and confidence levels, for practising High Energy Physicists who need to use statistical techniques to express their results. Concentrating on these specialist techniques means that they can be covered in proper depth, while assuming only the knowledge and experience of a typical Particle Physicist. The different definitions of probability will be explained, and it will become apparent why this basic subject is so controversial; there are several viewpoints and it is important to understand them all, rather than abusing the adherents of different beliefs. Distributions will be covered: the situations they arise in, their useful properties, and the amazing result of the Central Limit Theorem. Fitting a parametrisation to a set of data is one of the most widespread uses of statistics: there are lots of ways of doing this and these will be presented, with discussion of which is appropriate in different circumstances. This t...
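    The Central Limit Theorem mentioned in the abstract is easy to see numerically: sample means of a decidedly non-Gaussian uniform variable concentrate around the true mean, with spread close to sigma/sqrt(n). A minimal sketch (sample sizes chosen arbitrarily):

```python
# CLT demonstration: means of Uniform(0,1) samples behave normally,
# with standard error close to sigma/sqrt(n).
import random
import statistics

random.seed(1)
n, trials = 50, 2000
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(trials)]

true_se = (1 / 12) ** 0.5 / n ** 0.5   # sigma/sqrt(n) for Uniform(0,1)
print(round(statistics.fmean(means), 3))   # close to 0.5
print(round(statistics.stdev(means), 3))   # close to true_se ≈ 0.041
```

    The same experiment with any distribution of finite variance gives the same picture, which is the content of the theorem.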

  14. Structural reliability in context of statistical uncertainties and modelling discrepancies

    International Nuclear Information System (INIS)

    Pendola, Maurice

    2000-01-01

    Structural reliability methods have been greatly improved in recent years and have shown their ability to deal with uncertainties during the design stage or to optimize the functioning and the maintenance of industrial installations. They are based on a mechanical modeling of the structural behavior according to the considered failure modes and on a probabilistic representation of the input parameters of this modeling. In practice, only limited statistical information is available to build the probabilistic representation, and different sophistication levels of the mechanical modeling may be introduced. Thus, besides the physical randomness, other uncertainties occur in such analyses. The aim of this work is threefold: 1. first, to propose a methodology able to characterize the statistical uncertainties due to the limited number of data, in order to take them into account in the reliability analyses; the obtained reliability index measures the confidence in the structure given the statistical information available. 2. Then, to show a methodology leading to reliability results evaluated from a particular mechanical modeling but by using a less sophisticated one; the objective is to decrease the computational effort required by the reference modeling. 3. Finally, to propose partial safety factors that evolve as a function of the number of statistical data available and of the sophistication level of the mechanical modeling that is used. The concepts are illustrated in the case of a welded pipe and in the case of a natural draught cooling tower. The results show the interest of the methodologies in an industrial context. [fr]
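    The reliability-index idea can be illustrated with a crude Monte Carlo sketch: failure is the event R − S < 0 for a random resistance R and load effect S, and the index is beta = −Φ⁻¹(pf). The normal distributions below are invented, not taken from the pipe or cooling-tower studies.

```python
# Monte Carlo estimate of a failure probability and reliability index,
# for a simple limit state g = R - S with invented normal inputs.
import random
from statistics import NormalDist

random.seed(2)
N = 100_000
fails = 0
for _ in range(N):
    R = random.gauss(300.0, 30.0)   # hypothetical resistance (MPa)
    S = random.gauss(200.0, 25.0)   # hypothetical load effect (MPa)
    if R - S < 0:
        fails += 1

pf = fails / N
beta = -NormalDist().inv_cdf(pf)
print(pf, round(beta, 2))
```

    For two independent normals the exact answer is beta = 100 / sqrt(30² + 25²) ≈ 2.56, which the simulation should approach as N grows.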

  15. Applied statistics for social and management sciences

    CERN Document Server

    Miah, Abdul Quader

    2016-01-01

    This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.

  16. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015) held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in other, non-traditional areas; these include the use of new techniques such as skew and mixture of skew distributions, scaled Bregman divergences, and multilevel functional data methods, with application areas being circular data models and prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  17. On Quantum Statistical Inference, II

    OpenAIRE

    Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P. E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...

  18. Practical Statistics for Particle Physicists

    CERN Document Server

    Lista, Luca

    2017-01-01

    These three lectures provide an introduction to the main concepts of statistical data analysis useful for precision measurements and searches for new signals in High Energy Physics. The frequentist and Bayesian approaches to probability theory will be introduced and, for both approaches, inference methods will be presented. Hypothesis tests will be discussed, then significance and upper limit evaluation will be presented with an overview of the modern and most advanced techniques adopted for data analysis at the Large Hadron Collider.

  19. Surveillance of arthropod vector-borne infectious diseases using remote sensing techniques: a review.

    Directory of Open Access Journals (Sweden)

    Satya Kalluri

    2007-10-01

    Full Text Available Epidemiologists are adopting new remote sensing techniques to study a variety of vector-borne diseases. Associations between satellite-derived environmental variables such as temperature, humidity, and land cover type and vector density are used to identify and characterize vector habitats. The convergence of factors such as the availability of multi-temporal satellite data and georeferenced epidemiological data, collaboration between remote sensing scientists and biologists, and the availability of sophisticated statistical, geographic information system, and image processing algorithms in a desktop environment creates a fertile research environment. The use of remote sensing techniques to map vector-borne diseases has evolved significantly over the past 25 years. In this paper, we review the status of remote sensing studies of arthropod vector-borne diseases due to mosquitoes, ticks, blackflies, tsetse flies, and sandflies, which are responsible for the majority of vector-borne diseases in the world. Examples of simple image classification techniques that associate land use and land cover types with vector habitats, as well as complex statistical models that link satellite-derived multi-temporal meteorological observations with vector biology and abundance, are discussed here. Future improvements in remote sensing applications in epidemiology are also discussed.
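    A toy version of the "simple image classification" idea above: label each pixel of a tiny satellite-style grid as potential vector habitat wherever vegetation and surface temperature thresholds co-occur. All values and thresholds are invented for illustration.

```python
# Threshold-based habitat classification on a toy 3x3 "satellite" grid.
# NDVI (vegetation index) and surface temperature per pixel, both invented.
ndvi = [[0.1, 0.6, 0.7],
        [0.2, 0.5, 0.8],
        [0.1, 0.1, 0.6]]
temp = [[31, 27, 26],
        [30, 28, 25],
        [33, 32, 27]]

def habitat_mask(ndvi, temp, ndvi_min=0.4, temp_max=29):
    """1 where vegetation is dense enough AND it is cool enough."""
    return [[1 if v >= ndvi_min and t <= temp_max else 0
             for v, t in zip(vr, tr)]
            for vr, tr in zip(ndvi, temp)]

for row in habitat_mask(ndvi, temp):
    print(row)
```

    The complex statistical models mentioned in the abstract replace these fixed thresholds with fitted relationships between meteorological time series and observed vector abundance.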

  20. Statistical and thermal physics with computer applications

    CERN Document Server

    Gould, Harvey

    2010-01-01

    This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the

  1. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  2. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  3. Plato's patricide in the sophist

    Directory of Open Access Journals (Sweden)

    Deretić Irina J.

    2012-01-01

    Full Text Available In this paper, the author attempts to elucidate the validity of Plato's criticism of Parmenides' simplified monistic ontology, as well as his concept of non-being. In contrast to Parmenides, Plato introduces a more complex ontology of the megista gene and redefines Parmenides' concept of non-being as something absolutely different from being. According to Plato, not all things are in the same sense, i.e. they have different ontological status. Additionally, he redefines Parmenides' concept of absolute non-being as 'difference' or 'otherness'.

  4. Sophisticated fuel handling system evolved

    International Nuclear Information System (INIS)

    Ross, D.A.

    1988-01-01

    The control systems at the Sellafield fuel handling plant are described. The requirements called for built-in diagnostic features as well as the ability to handle a large sequencing application. Speed was also important; responses better than 50 ms were required. The control systems are used to automate operations within each of the three main process caves - two Magnox fuel decanners and an advanced gas-cooled reactor fuel dismantler. The fuel route within the fuel handling plant is illustrated and described. ASPIC (Automated Sequence Package for Industrial Control), which was developed as a controller for the plant processes, is described. (U.K.)

  5. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.
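    The model-checking step the authors emphasize can be sketched as a posterior predictive check: draw replicated datasets from the fitted model and compare a test statistic with its observed value. Everything below (data, prior, test statistic) is an invented minimal example, not from the paper.

```python
# Posterior predictive check for a binomial model with a Beta(1,1) prior,
# using the longest run of successes as the test statistic.
import random

random.seed(3)
data = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]  # hypothetical observations

def longest_run(xs):
    best = cur = 0
    for x in xs:
        cur = cur + 1 if x == 1 else 0
        best = max(best, cur)
    return best

n, k = len(data), sum(data)
obs = longest_run(data)

# Replicated datasets from the posterior predictive distribution.
reps = []
for _ in range(2000):
    theta = random.betavariate(1 + k, 1 + n - k)   # posterior draw
    reps.append(longest_run([1 if random.random() < theta else 0
                             for _ in range(n)]))

p_value = sum(r >= obs for r in reps) / len(reps)
print(obs, round(p_value, 2))
```

    A p-value near 0 or 1 would signal that the model fails to reproduce this aspect of the data, which is exactly the hypothetico-deductive check the paper argues for.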

  6. Recreational Boating Statistics 2012

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  7. Recreational Boating Statistics 2013

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  8. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  9. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  10. Recreational Boating Statistics 2011

    Data.gov (United States)

    Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...

  11. Uterine Cancer Statistics

    Science.gov (United States)

    Uterine cancer is the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations Tool makes ...

  12. Tuberculosis Data and Statistics

    Science.gov (United States)


  13. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...

  14. National Transportation Statistics 2008

    Science.gov (United States)

    2009-01-08

    Compiled and published by the U.S. Department of Transportations Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...

  15. Mental Illness Statistics

    Science.gov (United States)

    Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability.

  16. School Violence: Data & Statistics

    Science.gov (United States)


  17. Caregiver Statistics: Demographics

    Science.gov (United States)

    ... needs and services are wide-ranging and complex, statistics may vary from study to study.

  18. Aortic Aneurysm Statistics

    Science.gov (United States)


  19. Alcohol Facts and Statistics

    Science.gov (United States)


  20. National Transportation Statistics 2009

    Science.gov (United States)

    2010-01-21

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  1. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  2. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  3. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  4. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  5. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  6. Youth Sports Safety Statistics

    Science.gov (United States)


  7. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  8. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S
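    The transfer-matrix approach that the book compares against reduces, in its simplest instance, to a 2×2 matrix for the 1D Ising ring: Z = Tr(T^N). A minimal sketch, checked against the closed-form eigenvalues (J is the coupling, beta the inverse temperature; this toy case is standard, not specific to the book):

```python
# Transfer-matrix computation of the 1D Ising ring partition function,
# Z = Tr(T^N), verified against Z = lambda1^N + lambda2^N.
import math

def transfer_matrix(beta, J):
    e = math.exp
    return [[e(beta * J), e(-beta * J)],
            [e(-beta * J), e(beta * J)]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def partition_function(beta, J, N):
    T = transfer_matrix(beta, J)
    P = [[1.0, 0.0], [0.0, 1.0]]      # identity
    for _ in range(N):
        P = matmul(P, T)
    return P[0][0] + P[1][1]          # trace of T^N

beta, J, N = 0.5, 1.0, 10
Z = partition_function(beta, J, N)
# Eigenvalues of T: 2*cosh(beta*J) and 2*sinh(beta*J).
l1, l2 = 2 * math.cosh(beta * J), 2 * math.sinh(beta * J)
print(abs(Z - (l1 ** N + l2 ** N)) < 1e-6)  # → True
```

    Decay of correlations is governed by the ratio of these two eigenvalues, which is the quantity the Witten Laplacian techniques generalize to continuous spins.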

  9. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  10. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  11. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  12. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  13. Adaptive RAC codes employing statistical channel evaluation ...

    African Journals Online (AJOL)

    An adaptive encoding technique using row and column array (RAC) codes employing a different number of parity columns that depends on the channel state is proposed in this paper. The trellises of the proposed adaptive codes and a statistical channel evaluation technique employing these trellises are designed and ...
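    The non-adaptive core of a row-and-column array code can be sketched in a few lines: append an even-parity bit to each row, then a parity row over the columns. The adaptive, trellis-based channel evaluation described in the paper is not shown; this is only the basic array structure.

```python
# Row-and-column even-parity array code: encode a bit block and
# verify all row and column parities on decode.
def encode(block):
    rows = [r + [sum(r) % 2] for r in block]      # row parity bits
    cols = [sum(c) % 2 for c in zip(*rows)]       # column parity row
    return rows + [cols]

def check(coded):
    *rows, cols = coded
    row_ok = all(sum(r) % 2 == 0 for r in rows)
    col_ok = all(sum(c) % 2 == 0 for c in zip(*rows, cols))
    return row_ok and col_ok

data = [[1, 0, 1], [0, 1, 1]]
coded = encode(data)
print(check(coded))          # clean block passes
coded[0][1] ^= 1             # flip one bit
print(check(coded))          # parity violation detected
```

    Any single bit error violates one row and one column parity, which locates it; an adaptive scheme like the paper's varies the number of parity columns with the channel state.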

  14. Statistical Methods for Environmental Pollution Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation techniques, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
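    As a flavour of the nonparametric trend tests covered in Chapters 16 and 17, here is a minimal Mann-Kendall sketch: the S statistic counts concordant minus discordant time-ordered pairs. No tie correction is included, and the concentration series is invented.

```python
# Mann-Kendall trend test (no ties): S statistic and its normal
# approximation with continuity correction.
import math

def mann_kendall_s(xs):
    s = 0
    n = len(xs)
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += (xs[j] > xs[i]) - (xs[j] < xs[i])
    return s

def z_score(xs):
    """Normal approximation, continuity-corrected, no-ties variance."""
    n, s = len(xs), mann_kendall_s(xs)
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        return (s - 1) / math.sqrt(var)
    if s < 0:
        return (s + 1) / math.sqrt(var)
    return 0.0

concentrations = [2.1, 2.4, 2.3, 2.9, 3.1, 3.0, 3.6, 3.8]  # invented
print(mann_kendall_s(concentrations), round(z_score(concentrations), 2))
```

    A |z| above roughly 1.96 rejects "no monotone trend" at the 5% level; the book's versions also handle ties and multiple monitoring stations.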

  15. Experimental techniques

    Energy Technology Data Exchange (ETDEWEB)

    Roussel-Chomaz, P. [GANIL CNRS/IN2P3, CEA/DSM, 14 - Caen (France)

    2007-07-01

    This lecture presents the experimental techniques, developed in the last 10 or 15 years, in order to perform a new class of experiments with exotic nuclei, where the reactions induced by these nuclei allow to get information on their structure. A brief review of the secondary beams production methods will be given, with some examples of facilities in operation or under project. The important developments performed recently on cryogenic targets will be presented. The different detection systems will be reviewed, both the beam detectors before the targets, and the many kinds of detectors necessary to detect all outgoing particles after the reaction: magnetic spectrometer for the heavy fragment, detection systems for the target recoil nucleus, gamma detectors. Finally, several typical examples of experiments will be detailed, in order to illustrate the use of each detector either alone, or in coincidence with others. (author)

  16. statistical tests for frequency distribution of mean gravity anomalies

    African Journals Online (AJOL)

    ES Obe

    1980-03-01

    Mar 1, 1980 ... approach. Kaula [1,2] discussed the method of applying statistical techniques in the ... mathematical foundation of physical ...

  17. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  18. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
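The inverse-variance-weighting idea behind the sufficient-summary-statistic approach can be sketched on simulated nested data. This is a minimal fixed-effects-style illustration, not the authors' exact procedure; the simulation parameters and variable names are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Nested data: each subject contributes a different number of trials,
# with subject-specific trial-level noise.
n_subjects = 20
true_effect = 0.3
means, sem2, = [], []
for _ in range(n_subjects):
    n_trials = int(rng.integers(20, 200))
    trials = rng.normal(true_effect, 1.0 + 2.0 * rng.random(), size=n_trials)
    means.append(trials.mean())
    sem2.append(trials.var(ddof=1) / n_trials)  # variance of the subject mean

m = np.array(means)
v = np.array(sem2)

# "Naive" approach: one mean per subject into a group-level t-test,
# discarding the within-subject variances.
t_naive, p_naive = stats.ttest_1samp(m, 0.0)

# Sufficient-summary-statistic sketch: weight each subject mean by the
# inverse of its estimated variance, so precisely measured subjects count more.
w = 1.0 / v
group_mean = np.sum(w * m) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
z = group_mean / se
p_weighted = 2 * stats.norm.sf(abs(z))
```

Both tests target the same group-level null; the weighted variant uses the within-subject variance information the naive t-test throws away, which is the source of the power gain the abstract describes.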

  19. Introduction to statistical data analysis for the life sciences

    CERN Document Server

    Ekstrom, Claus Thorn

    2014-01-01

    This text provides a computational toolbox that enables students to analyze real datasets and gain the confidence and skills to undertake more sophisticated analyses. Although accessible with any statistical software, the text encourages a reliance on R. For those new to R, an introduction to the software is available in an appendix. The book also includes end-of-chapter exercises as well as an entire chapter of case exercises that help students apply their knowledge to larger datasets and learn more about approaches specific to the life sciences.

  20. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.