WorldWideScience

Sample records for underutilized statistical technique

  1. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advances …

  2. Optimization techniques in statistics

    CERN Document Server

    Rustagi, Jagdish S

    1994-01-01

    Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimization …

  3. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitative …

  4. 21 CFR 820.250 - Statistical techniques.

    Science.gov (United States)

    2010-04-01

    21 CFR Part 820 (Food and Drugs; Medical Devices; Quality System Regulation), § 820.250 Statistical techniques: (a) Where appropriate, each manufacturer shall establish and maintain procedures for identifying valid statistical techniques required for establishing, controlling, and verifying the acceptability of process capability and product characteristics …

  5. Genetic improvement of under-utilized and neglected crops in low income food deficit countries through irradiation and related techniques. Proceedings of a final research coordination meeting

    International Nuclear Information System (INIS)

    2004-11-01

    The majority of the world's food is produced from only a few crops, and yet many neglected and under-utilized crops are extremely important for food production in low income food deficit countries (LIFDCs). As the human population of LIFDCs grows at an alarming rate, food availability has declined, affected by environmental factors, the lack of improvement of local crop species, the erosion of genetic diversity, and dependence on a few crop species for the food supply. Neglected crops are traditionally grown by farmers in their centres of origin or centres of diversity, where they are still important for the subsistence of local communities and are maintained by socio-cultural preferences and traditional uses. These crops remain inadequately characterised and, until very recently, have been largely ignored by research and conservation. Farmers are losing these crops because they are less competitive than improved major crop species. Radiation-induced mutation techniques have been used successfully for the genetic improvement of 'major crops', and this know-how has great potential for enhancing the use of under-utilized and neglected species and for speeding up their domestication and improvement. The FAO/IAEA efforts on genetic improvement of under-utilized and neglected species play a strategic role in complementing the work being carried out worldwide to promote them. This CRP, entitled Genetic Improvement of Under-utilized and Neglected Crops in LIFDCs through Irradiation and Related Techniques, was initiated in 1998 with the overall objective of improving food security, enhancing nutritional balance, and promoting sustainable agriculture in LIFDCs. Specific objectives addressed major constraints to the productivity of neglected and under-utilized crops through genetic improvement with radiation-induced mutations and biotechnology, in order to enhance economic viability, sustain crop species diversity, and in future benefit small farmers. …

  6. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

    This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics, the Fokker-Planck and master equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)

  7. Statistically tuned Gaussian background subtraction technique for ...

    Indian Academy of Sciences (India)

    … background, small objects, moving background and multiple objects are considered for evaluation. The technique is statistically compared with the frame differencing technique, the temporal median method and the mixture-of-Gaussians model, and a performance evaluation is carried out to check the effectiveness of the proposed technique after …

  8. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have found several applications, namely the modelling and simulation of manufacturing processes, the optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final-year undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  9. The maximum entropy technique. System's statistical description

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Sulejmanov, M.K.

    2002-01-01

    The maximum entropy technique (MENT) is applied to the search for the distribution functions of physical quantities. MENT naturally takes into account the requirement of maximum entropy, the characteristics of the system, and the constraint conditions, which makes it applicable to the statistical description of both closed and open systems. Examples are considered in which MENT was used to describe equilibrium states, non-equilibrium states, and states far from thermodynamic equilibrium.
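
    As a hedged illustration of the idea (not code from the paper): among all distributions over a state space that satisfy given constraints, MENT selects the one of maximum entropy. A minimal Python sketch follows; the six-state space, the mean constraint and the SciPy solver are illustrative assumptions.

        # Maximum-entropy sketch: the distribution over six states with a fixed
        # mean that maximizes Shannon entropy (Gibbs-like form p_i ~ exp(l*x_i)).
        import numpy as np
        from scipy.optimize import minimize

        x = np.arange(1, 7)              # states (a loaded-die toy example)
        target_mean = 4.5                # constraint: <x> = 4.5

        def neg_entropy(p):
            p = np.clip(p, 1e-12, 1.0)
            return np.sum(p * np.log(p))

        constraints = [
            {"type": "eq", "fun": lambda p: p.sum() - 1.0},        # normalization
            {"type": "eq", "fun": lambda p: p @ x - target_mean},  # fixed mean
        ]
        p0 = np.full(x.size, 1.0 / x.size)
        res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * x.size,
                       constraints=constraints)
        print(res.x)                     # tilts probability toward higher faces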

  10. Underutilization of Influenza Vaccine

    Directory of Open Access Journals (Sweden)

    Marshall K. Cheney

    2013-04-01

    Yearly influenza vaccination continues to be underutilized by those who would most benefit from it. The Health Belief Model was used to explain differences in beliefs about influenza vaccination among at-risk individuals resistant to influenza vaccination. Survey data were collected from 74 members of at-risk groups who were not vaccinated for influenza during the previous flu season. Accepting individuals were more likely to perceive flu as a threat to health and to perceive access barriers, and cues to action were the most important influence on whether they planned to get vaccinated. In comparison, resistant individuals did not feel threatened by the flu, access barriers were not a problem, and they did not respond favorably to cues to action. Perceived threat, perceived access barriers, and cues to action were significantly associated with plans to be vaccinated for influenza in the next flu season. Participants who saw influenza as a threat to their health had 5.4 times the odds of planning to be vaccinated than those who did not. Participants reporting barriers to accessing influenza vaccination had 7.5 times the odds of reporting plans to be vaccinated. Those responding positively to cues to action had 12.2 times the odds of planning to be vaccinated in the next flu season than those who did not. Accepting and resistant individuals have significant differences in their beliefs, which require different intervention strategies to increase vaccination rates. These findings provide important information to researchers and practitioners working to increase influenza vaccination rates.
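
    Odds ratios like those reported above are typically obtained from a logistic regression on the survey responses; the sketch below shows the mechanics on invented data (the variable coding and dataset are hypothetical, not the study's).

        # Fit a logistic regression and convert coefficients to odds ratios.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 74
        # binary indicators: perceived threat, perceived barriers, cues to action
        X = rng.integers(0, 2, size=(n, 3)).astype(float)
        true_logit = -1.0 + X @ np.array([1.7, 2.0, 2.5])
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

        model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print(np.exp(model.params[1:]))  # exp(beta) = odds ratio per predictor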

  11. Review of the Statistical Techniques in Medical Sciences | Okeh ...

    African Journals Online (AJOL)

    ... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp. …

  12. Statistically tuned Gaussian background subtraction technique for ...

    Indian Academy of Sciences (India)

    Keywords. Tuning factor; background segmentation; unmanned aerial vehicle; aerial surveillance; thresholding. Abstract. Background subtraction is one of the efficient techniques to segment the targets from non-informative background of a video. The traditional background subtraction technique suits for videos with static ...

  13. Statistical Techniques in Electrical and Computer Engineering

    Indian Academy of Sciences (India)

    Stochastic models and statistical inference from them have been popular methodologies in a variety of engineering disciplines, notably in electrical and computer engineering. Recent years have seen explosive growth in this area, driven by technological imperatives. These now go well beyond their traditional domain of ...

  14. Testing of statistical techniques used in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.; Edwards, H.; Prust, J.

    1984-01-01

    Analysis of the SYVAC (SYstems Variability Analysis Code) output adopted four techniques to provide a cross comparison of their performance. The techniques used were: examination of scatter plots; correlation/regression; Kruskal-Wallis one-way analysis of variance by ranks; comparison of cumulative distribution functions and risk estimates between sub-ranges of parameter values. The analysis was conducted for the case of a single nuclide chain and was based mainly on simulated dose after 500,000 years. The results from this single SYVAC case showed that site parameters had the greatest influence on dose to man. The techniques of correlation/regression and Kruskal-Wallis were both successful and consistent in their identification of important parameters. Both techniques ranked the eight most important parameters in the same order when analysed for maximum dose. The results from a comparison of cdfs and risks in sub-ranges of the parameter values were not entirely consistent with other techniques. Further sampling of the high dose region is recommended in order to improve the accuracy of this method. (author)
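
    For a flavour of two of the techniques compared above, a minimal sketch on synthetic Monte Carlo output (the parameter, the dose model and the sub-range boundaries are invented for illustration):

        # Rank-based sensitivity checks: Spearman correlation of dose against a
        # sampled parameter, and Kruskal-Wallis across parameter sub-ranges.
        import numpy as np
        from scipy.stats import kruskal, spearmanr

        rng = np.random.default_rng(1)
        param = rng.uniform(0.0, 1.0, 1000)             # sampled site parameter
        dose = param**2 + 0.05 * rng.normal(size=1000)  # simulated dose

        rho, p = spearmanr(param, dose)
        print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")

        # split the parameter range into three sub-ranges and compare dose ranks
        groups = [dose[(param >= lo) & (param < hi)]
                  for lo, hi in [(0.0, 1/3), (1/3, 2/3), (2/3, 1.001)]]
        H, p = kruskal(*groups)
        print(f"Kruskal-Wallis H = {H:.1f} (p = {p:.1e})")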

  15. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near-linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
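
    As a hedged sketch of the simplest linear system mentioned, a Widrow-Hoff (LMS) adaptive filter doing one-step-ahead prediction; the filter order, step size and test signal are illustrative choices, not values from the paper.

        # Widrow-Hoff (LMS) adaptive filter for one-step-ahead prediction.
        import numpy as np

        rng = np.random.default_rng(2)
        t = np.arange(2000)
        x = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)

        order, mu = 8, 0.05                # filter length and LMS step size
        w = np.zeros(order)
        pred = np.zeros_like(x)
        for n in range(order, x.size):
            u = x[n - order:n][::-1]       # most recent samples first
            pred[n] = w @ u                # predict x[n]
            e = x[n] - pred[n]             # prediction error
            w += mu * e * u                # LMS weight update

        print("MSE:", np.mean((x[order:] - pred[order:]) ** 2))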

  16. Predicting radiotherapy outcomes using statistical learning techniques

    Energy Technology Data Exchange (ETDEWEB)

    El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O [Washington University, Saint Louis, MO (United States); Lindsay, Patricia E; Hope, Andrew J [Department of Radiation Oncology, Princess Margaret Hospital, Toronto, ON (Canada)

    2009-09-21

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and to generalize beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among …
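
    A hedged sketch of this style of analysis (PCA inspection, then leave-one-out comparison of a kernel SVM against logistic regression), on synthetic stand-in data rather than the institutional datasets used in the paper:

        # Kernel SVM vs. logistic regression under leave-one-out cross-validation.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.normal(size=(80, 6))                  # stand-in prognostic variables
        y = (X[:, 0] * X[:, 1] + X[:, 2] > 0).astype(int)  # nonlinear response

        print(PCA(n_components=2).fit(X).explained_variance_ratio_)

        loo = LeaveOneOut()
        for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                          ("SVM-RBF", SVC(kernel="rbf", C=1.0))]:
            acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=loo)
            print(name, acc.mean())       # the RBF kernel captures the interaction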

  17. Predicting radiotherapy outcomes using statistical learning techniques

    Science.gov (United States)

    El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.

    2009-09-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and to generalize beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model …

  18. Predicting radiotherapy outcomes using statistical learning techniques

    International Nuclear Information System (INIS)

    El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O; Lindsay, Patricia E; Hope, Andrew J

    2009-01-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and to generalize beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal components analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model …

  19. "Statistical Techniques for Particle Physics" (4/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
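
    As a toy illustration of the Neyman-Pearson lemma mentioned in the outline (a Gaussian signal-vs-background example in plain Python; the lectures themselves use ROOT, RooFit and RooStats):

        # Neyman-Pearson: for simple hypotheses, the likelihood ratio is the
        # optimal test statistic; cutting on it trades size against power.
        import numpy as np
        from scipy.stats import norm

        x = np.linspace(-3.0, 6.0, 1000)   # observable
        sig = norm(loc=2.0, scale=1.0)     # H1: signal hypothesis
        bkg = norm(loc=0.0, scale=1.0)     # H0: background hypothesis

        lam = sig.pdf(x) / bkg.pdf(x)      # likelihood ratio (monotonic in x here)
        x_cut = x[lam > 1.0][0]            # acceptance region {lambda > 1}
        alpha = 1.0 - bkg.cdf(x_cut)       # size: P(accept H1 | H0)
        power = 1.0 - sig.cdf(x_cut)       # power: P(accept H1 | H1)
        print(f"cut at x = {x_cut:.2f}: size = {alpha:.3f}, power = {power:.3f}")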

  20. "Statistical Techniques for Particle Physics" (1/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  1. "Statistical Techniques for Particle Physics" (2/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  2. "Statistical Techniques for Particle Physics" (3/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  23. The statistical chopper in the time-of-flight technique

    International Nuclear Information System (INIS)

    Albuquerque Vieira, J. de.

    1975-12-01

    A detailed study is made of the 'statistical' chopper and of the method of analysis of the data obtained with this technique. The study includes the basic ideas behind correlation methods applied to time-of-flight techniques; comparisons with the conventional chopper through an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; a presentation of related fields of work to demonstrate the potential of this method; and suggestions for future study, together with the criteria for a time-of-flight experiment using the method under study. [pt]
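
    The correlation idea behind the statistical chopper can be sketched as follows (a hedged toy model: the pseudorandom pattern, spectrum shape and noise level are invented for illustration). The beam is modulated by a pseudorandom on/off pattern, and the time-of-flight spectrum is recovered by cross-correlating the detector counts with that pattern.

        # Correlation time-of-flight: recover the spectrum by cross-correlation.
        import numpy as np

        rng = np.random.default_rng(4)
        N = 255
        seq = rng.integers(0, 2, N).astype(float)   # pseudorandom chopper pattern
        tof = np.exp(-0.5 * ((np.arange(N) - 60) / 8.0) ** 2)  # true TOF spectrum

        # detector counts: circular convolution of spectrum with pattern, plus noise
        counts = np.real(np.fft.ifft(np.fft.fft(seq) * np.fft.fft(tof)))
        counts += 0.5 * rng.normal(size=N)

        # cross-correlate with the mean-removed pattern; for a pseudorandom
        # sequence the autocorrelation is nearly a delta, so the spectrum returns
        s = seq - seq.mean()
        rec = np.real(np.fft.ifft(np.fft.fft(counts) * np.conj(np.fft.fft(s))))
        rec /= s @ s
        print(np.argmax(rec), np.argmax(tof))       # peak positions coincide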

  24. Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR

    Science.gov (United States)

    2014-07-12

    Only report front-matter was recovered for this record: the title, the standard disclaimer ('The views, opinions and/or findings contained in this report are those of the author(s) and should not ...'), an address block (Jesse Hall, Columbia, MO 65211), and partial citation fragments (e.g., D. Ho, P. Gader, J. Wilson, H. Frigui, 'Subspace Processing ...'); no abstract text survived.

  25. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Ali Haghizadeh

    2017-11-23

    … which permits investigation of both systemic and stochastic uncertainty. Finally, it can be seen that these techniques are very beneficial for analyzing groundwater potential and can be practical for water-resource management experts. Keywords: Groundwater; statistical index; Dempster–Shafer theory; water resource management; ...

  26. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Groundwater potential analysis prepares better comprehension of hydrological settings of different regions. This study shows the potency of two GIS-based data driven bivariate techniques namely statistical index (SI) and Dempster–Shafer theory (DST) to analyze groundwater potential in Broujerd region of Iran.

  27. 1980 Summer Study on Statistical Techniques in Army Testing.

    Science.gov (United States)

    1980-07-01

    Only cover-page text and a fragment of the findings were recovered for this record: Army Science Board, Washington, D.C. 20310, '1980 Summer Study on Statistical Techniques in Army Testing', July 1980; '... statisticians is adequate, and in some cases, excellent. In the areas of education and the dissemination of information, the Study Group found that the ...'

  28. Techniques in teaching statistics : linking research production and research use.

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Moyano, I.; Smith, A. (Decision and Information Sciences); (Univ. of Massachusetts at Boston)

    2012-01-01

    In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.

  29. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    … decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic processes describing the ambient measurements. The Vector Random Decrement functions are linked to the correlation functions of the stochastic processes provided they are stationary and Gaussian distributed. Furthermore, a new approach for quality assessment of the Vector Random Decrement functions is given on the basis of the derived results. The work presented in this paper makes the theory of the Vector Random Decrement technique equivalent to the theory of the Random Decrement technique. The theoretical derivations are illustrated by the analysis of the response of a 3DOF system loaded by white noise. ...

  30. Lightweight and Statistical Techniques for PetaScale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving application scientists more time to pursue the scientific objectives for which the systems were purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root cause …

  31. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    Science.gov (United States)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase of harmful air pollutants in the ambient atmosphere. In this study, different statistical and artificial intelligence techniques are used for the forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. But such methods suffer from disadvantages: they provide limited accuracy because they are unable to predict the extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined with such approaches, making them an inefficient route to better forecasts. With advances in technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model compare favourably with those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over urban areas.
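
    The four evaluation measures named above have standard definitions in air-quality model evaluation; a hedged sketch (the formulas follow common usage and may differ in detail from the paper's exact conventions):

        # R, NMSE, FB and IOA for observed (o) vs. predicted (p) concentrations.
        import numpy as np

        def metrics(o, p):
            R = np.corrcoef(o, p)[0, 1]                          # correlation
            nmse = np.mean((o - p) ** 2) / (o.mean() * p.mean()) # normalized MSE
            fb = 2.0 * (o.mean() - p.mean()) / (o.mean() + p.mean())  # fractional bias
            ioa = 1.0 - np.sum((o - p) ** 2) / np.sum(
                (np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)   # index of agreement
            return R, nmse, fb, ioa

        rng = np.random.default_rng(5)
        obs = rng.gamma(4.0, 20.0, 365)          # synthetic daily concentrations
        pred = 0.9 * obs + rng.normal(0.0, 10.0, 365)
        print(metrics(obs, pred))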

  32. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain …

  33. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  34. Statistical and Economic Techniques for Site-specific Nematode Management

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L.

    2014-01-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes. PMID:24643451

  35. The application of statistical techniques to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Annibal, P.S.; Roberts, P.D.

    1990-02-01

    Over the past decade much theoretical research has been carried out on the development of statistical methods for nuclear materials accountancy. In practice plant operation may differ substantially from the idealized models often cited. This paper demonstrates the importance of taking account of plant operation in applying the statistical techniques, to improve the accuracy of the estimates and the knowledge of the errors. The benefits are quantified either by theoretical calculation or by simulation. Two different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an accountancy tank is investigated. Secondly, a means of improving the knowledge of the 'Material Unaccounted For' (the difference between the inventory calculated from input/output data, and the measured inventory), using information about the plant measurement system, is developed and compared with existing general techniques. (author)

  36. Statistical Theory of the Vector Random Decrement Technique

    Science.gov (United States)

    ASMUSSEN, J. C.; BRINCKER, R.; IBRAHIM, S. R.

    1999-09-01

    The Vector Random Decrement technique has previously been introduced as an efficient method to transform ambient responses of linear structures into Vector Random Decrement functions which are equivalent to free decays of the current structure. The modal parameters can be extracted from the free decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic processes describing the ambient measurements. The Vector Random Decrement functions are linked to the correlation functions of the stochastic processes provided they are stationary and Gaussian distributed. Furthermore, a new approach for quality assessment of the Vector Random Decrement functions is given on the basis of the derived results. The work presented in this paper makes the theory of the Vector Random Decrement technique equivalent to the theory of the Random Decrement technique. The theoretical derivations are illustrated by the analysis of the response of a 3DOF system loaded by white noise.
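
    A hedged sketch of the core Random Decrement idea in the scalar case (the AR(2) response model, trigger level and segment length are illustrative choices): average the response segments that follow each triggering condition; for a stationary, Gaussian ambient response the average is related to the correlation function, i.e., to a free decay.

        # Scalar Random Decrement: segment averaging at level crossings.
        import numpy as np

        rng = np.random.default_rng(6)
        n = 100_000
        # lightly damped oscillator response to white noise (AR(2) stand-in)
        a1, a2 = 2 * 0.995 * np.cos(0.2), -0.995**2
        x = np.zeros(n)
        for k in range(2, n):
            x[k] = a1 * x[k - 1] + a2 * x[k - 2] + rng.normal()

        level = x.std()   # trigger: upward crossings of one standard deviation
        seg = 300         # segment length (samples)
        starts = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
        starts = starts[starts + seg < n]
        rd = np.mean([x[s:s + seg] for s in starts], axis=0)  # RD function
        print(len(starts), rd[:3])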

  37. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.

  38. Application of multivariate statistical techniques in microbial ecology.

    Science.gov (United States)

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.

  39. Statistical Techniques for Assessing water‐quality effects of BMPs

    Science.gov (United States)

    Walker, John F.

    1994-01-01

    Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water‐quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm‐mass‐transport data as a means of improving the ability to detect BMP effects on stream‐water quality. Statistical techniques were applied to suspended‐sediment records from three rural watersheds in Illinois for the period 1981–84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm‐mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.

  40. Statistical techniques for noise removal from visual images

    Science.gov (United States)

    Allred, Lloyd G.; Kelly, Gary E.

    1992-07-01

    The median operator has been demonstrated to be a very effective method for restoring recognizable images from very noisy image data. The power of the median operator stems from its non-algebraic formulation, which prevents erroneous data corrupting the final color computation. A principal drawback is that the median operator replaces all data, erroneous or not, the result being a net loss of information. This paper presents alternative statistical outlier techniques by which erroneous data is readily recognized, but valid data usually remains unchanged. The result is an effective noise removal algorithm with reduced loss of information.
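
    A hedged sketch of such an outlier-based alternative (the 3x3 window, the MAD noise estimate and the z-threshold are illustrative choices): a pixel is replaced by the local median only when its residual marks it as a statistical outlier, so valid pixels remain unchanged.

        # Replace only statistically outlying pixels with the local median.
        import numpy as np
        from scipy.ndimage import median_filter

        def outlier_median(img, k=3, z=3.0):
            med = median_filter(img, size=k)
            resid = img - med
            sigma = 1.4826 * np.median(np.abs(resid))  # robust noise level (MAD)
            out = img.copy()
            mask = np.abs(resid) > z * sigma           # pixels far from local median
            out[mask] = med[mask]                      # ...are the only ones replaced
            return out

        rng = np.random.default_rng(7)
        img = 100.0 + rng.normal(0.0, 2.0, (64, 64))
        img[rng.random(img.shape) < 0.02] = 255.0      # 2% impulsive noise
        print(np.count_nonzero(outlier_median(img) == 255.0))  # impulses removed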

  41. Sustainable Production of Underutilized Vegetables to Enhance ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Research Fund (CIFSRF), a joint program of IDRC and the Canadian International Development Agency (CIDA) - aims to increase the food security and economic empowerment of resource-poor rural women farmers in Nigeria through the cultivation, processing, consumption and marketing of underutilized vegetables.

  42. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    Science.gov (United States)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in the groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% of the seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% of the spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that the variation in groundwater quality is mainly explained by the dissolution of minerals through rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in water.
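
    A hedged sketch of this CA/DA/PCA workflow on synthetic stand-in data (the study measured 14 parameters at 67 sites; nothing below reproduces its data or results):

        # Hierarchical clustering, discriminant analysis of the clusters, and PCA.
        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(8)
        X = StandardScaler().fit_transform(rng.normal(size=(67, 14)))

        labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
        lda = LinearDiscriminantAnalysis().fit(X, labels)
        print("DA assignation rate:", lda.score(X, labels))

        pca = PCA(n_components=3).fit(X[labels == 1])   # PCA within one cluster
        print("variance explained:", pca.explained_variance_ratio_.sum())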

  43. Techniques for the Statistical Analysis of Observer Data

    National Research Council Canada - National Science Library

    Bennett, John G

    2001-01-01

    .... The two techniques are as follows: (1) fitting logistic curves to the vehicle data, and (2) using the Fisher Exact Test to compare the probability of detection of the two vehicles at each range...

  44. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical …

  45. Velocity field statistics and tessellation techniques : Unbiased estimators of Omega

    NARCIS (Netherlands)

    Van de Weygaert, R; Bernardeau, F; Muller,; Gottlober, S; Mucket, JP; Wambsganss, J

    1998-01-01

    We describe two new - stochastic-geometrical - methods to obtain reliable velocity field statistics from N-body simulations and from any general density and velocity fluctuation field sampled at a discrete set of locations. These methods, the Voronoi tessellation method and the Delaunay tessellation …

  46. Velocity Field Statistics and Tessellation Techniques : Unbiased Estimators of Omega

    NARCIS (Netherlands)

    Weygaert, R. van de; Bernardeau, F.

    1998-01-01

    We describe two new, stochastic-geometrical, methods to obtain reliable velocity field statistics from N-body simulations and from any general density and velocity fluctuation field sampled at a discrete set of locations. These methods, the Voronoi tessellation method and the Delaunay …

  47. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  48. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for the interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
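
    One standard sample-size question in this setting: how many items must be sampled so that, if the true noncompliance rate were p0, at least one noncompliant item would appear with a given confidence? A hedged sketch (the 5% rate, 95% confidence and lot size are illustrative numbers, not values from the paper):

        # Attribute-sampling sizes: infinite-population (binomial) and
        # finite-population (hypergeometric) versions.
        from math import ceil, log

        from scipy.stats import hypergeom

        def n_infinite(p0=0.05, c=0.95):
            return ceil(log(1.0 - c) / log(1.0 - p0))

        def n_finite(N, D, c=0.95):
            # smallest n with P(zero noncompliant items in sample) <= 1 - c
            for n in range(1, N + 1):
                if hypergeom(N, D, n).pmf(0) <= 1.0 - c:
                    return n
            return N

        print(n_infinite())             # 59 items at a 5% rate, 95% confidence
        print(n_finite(N=500, D=25))    # same rate in a finite lot of 500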

  49. Statistical techniques for the characterization of partially observed epidemics.

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin; Ray, Jaideep; Crary, David (Applied Research Associates, Inc, Arlington, VA); Cheng, Karen (Applied Research Associates, Inc, Arlington, VA)

    2010-11-01

    The techniques appear promising for constructing an integrated, automated detect-and-characterize capability for epidemics: working off biosurveillance data, they provide information on the particular ongoing outbreak. Potential uses include crisis management, planning and resource allocation; the parameter estimation capability is ideal for providing the input parameters of an agent-based model (index cases, time of infection, infection rate). Non-communicable diseases are easier to characterize than communicable ones: a small anthrax outbreak can be characterized well with 7-10 days of post-detection data, plague takes longer, and large attacks are very easy.

  50. Understanding Summary Statistics and Graphical Techniques to Compare Michael Jordan versus LeBron James

    Science.gov (United States)

    Williams, Immanuel James; Williams, Kelley Kim

    2016-01-01

    Understanding summary statistics and graphical techniques are building blocks to comprehending concepts beyond basic statistics. It's known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
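
    A hedged sketch of the kind of comparison the article motivates, using five-number summaries; the numbers are invented for illustration, not actual Jordan or James statistics.

        # Summary statistics for two players' (hypothetical) points per game.
        import numpy as np

        rng = np.random.default_rng(9)
        jordan = rng.normal(30.0, 8.0, 82).clip(min=0)
        lebron = rng.normal(27.0, 7.0, 82).clip(min=0)

        for name, pts in [("Jordan", jordan), ("LeBron", lebron)]:
            q1, med, q3 = np.percentile(pts, [25, 50, 75])
            print(f"{name}: mean={pts.mean():.1f} sd={pts.std():.1f} "
                  f"min={pts.min():.1f} Q1={q1:.1f} median={med:.1f} "
                  f"Q3={q3:.1f} max={pts.max():.1f}")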

  51. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    ... complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be realized that these techniques are very beneficial for groundwater potential analyzing and can be practical for water-resource management ...

  52. GIS-based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Ali Haghizadeh

    2017-11-23

    … So, these models are known as computational intelligence and machine learning techniques, used to replace physically based models. In contrast, knowledge-driven methods (KDM) use rich prior knowledge for model building, based on knowledge engineering and management technologies (Azkune …

  53. (NHIS) using data mining technique as a statistical model

    African Journals Online (AJOL)

    2014-05-23

    … Scheme (NHIS) claims in the Awutu-Effutu-Senya District using data mining techniques, with a specific focus on .... transform them into a format that is friendly to data mining algorithms, such as .... many groups to access the data, facilitate updating the data, and improve the efficiency of checking the data for ...

  54. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    … data-driven and knowledge-driven models (Corsini ... In addition, the application of the GIS-based SI technique in groundwater potential mapping .... the lithology of a given area affects the drainage density and can be of great value for evaluating ...

  55. Visual Analysis of North Atlantic Hurricane Trends Using Parallel Coordinates and Statistical Techniques

    National Research Council Canada - National Science Library

    Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E

    2008-01-01

    .... Innovative visual interaction techniques such as dynamic axis scaling, conjunctive parallel coordinates, statistical indicators, and aerial perspective shading are exploited to enhance the utility...

  56. Application of Statistical Potential Techniques to Runaway Transport Studies

    Energy Technology Data Exchange (ETDEWEB)

    Eguilior, S.; Castejon, F. [Ciemat.Madrid (Spain); Parrondo, J. M. [Universidad Complutense. Madrid (Spain)

    2001-07-01

    A method for computing the runaway production rate, based on techniques of noise-activated escape in a potential, is presented in this work. A generalised potential in 2D momentum space is obtained from the deterministic, or drift, terms of the Langevin equations. The diffusive, or stochastic, terms, which arise directly from the stochastic nature of collisions, play the role of the noise that activates barrier crossings. The runaway electron source is given by the escape rate in such a potential, which is obtained from an Arrhenius-like relation. Runaway electrons are those that cross the potential barrier due to the effect of stochastic collisions. In terms of computation time, this method allows one to quickly obtain the source term for a runaway electron transport code. (Author) 11 refs.
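
    As a hedged sketch of the Arrhenius-like relation invoked above (a Kramers-type escape rate; the symbols and the omitted prefactor are generic, not taken from the paper):

        \Gamma \;\propto\; \exp\!\left( -\frac{\Delta U}{D} \right)

    where \Delta U is the height of the barrier of the generalised potential built from the drift terms of the Langevin equations and D is the intensity of the collisional noise; the runaway source term is then proportional to the escape rate \Gamma.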

  57. Statistical techniques for the identification of reactor component structural vibrations

    International Nuclear Information System (INIS)

    Kemeny, L.G.

    1975-01-01

    The identification, on-line and in near real-time, of the vibration frequencies, modes and amplitudes of selected key reactor structural components, and the visual monitoring of these phenomena by nuclear power plant operating staff, will serve to further the safety and control philosophy of nuclear systems and lead to design optimisation. The School of Nuclear Engineering has developed a data acquisition system for vibration detection and identification. The system is interfaced with the HIFAR research reactor of the Australian Atomic Energy Commission. The reactor serves to simulate noise and vibrational phenomena which might be pertinent in power reactor situations. The data acquisition system consists of a small computer interfaced with a digital correlator and a Fourier transform unit. An incremental tape recorder is utilised as a backing store and as a means of communication with other computers. A small analogue computer and an analogue statistical analyzer can be used in the pre- and post-computational analysis of signals which are received from neutron and gamma detectors, thermocouples, accelerometers, hydrophones and strain gauges. Investigations carried out to date include a study of the role of local and global pressure fields due to turbulence in coolant flow and pump impeller induced perturbations on (a) control absorbers, (b) fuel element and (c) coolant external circuit and core tank structure component vibrations. (Auth.)

  58. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
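
    A hedged sketch of commonality analysis in the two-predictor case (synthetic data; the partition formulas are the standard ones): the regression effect R^2 is decomposed into each predictor's unique contribution and the contribution they share.

        # Commonality analysis for two predictors from full- and reduced-model R^2.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(10)
        n = 300
        x1 = rng.normal(size=n)
        x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)   # correlated predictors
        y = x1 + x2 + rng.normal(size=n)

        def r2(*cols):
            X = np.column_stack(cols)
            return LinearRegression().fit(X, y).score(X, y)

        R2_full, R2_1, R2_2 = r2(x1, x2), r2(x1), r2(x2)
        unique1 = R2_full - R2_2                   # variance only x1 explains
        unique2 = R2_full - R2_1                   # variance only x2 explains
        common = R2_full - unique1 - unique2       # shared contribution
        print(unique1, unique2, common)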

  19. Art Therapy: An Underutilized, yet Effective Tool.

    Science.gov (United States)

    Bitonte, Robert A; De Santo, Marisa

    2014-03-04

    Art therapy has been recognized as beneficial and effective since first described by Adrian Hill in 1942. Even before this time, art therapy was utilized for moral reinforcement and psychoanalysis. Art therapy aids patients with, but not limited to, chronic illness, physical challenges, and cancer in both pediatric and adult scenarios. Although effective in patient care, the practice of art therapy is extremely underutilized, especially in suburban areas. While conducting our own study in northeastern Ohio, USA, we found that only one out of the five inpatient institutions in the suburban area of Mahoning County, Ohio, that we contacted provided continuous art therapy to its patients. In the metropolitan area of Cuyahoga County, Ohio, only eight of the twenty-two inpatient institutions in the area provided art therapy. There could be many reasons as to why art therapy is not frequently used in these areas, and in medical institutions in general. One cause could be the limited amount of research done on the practice. Although it is difficult to conduct formal research on such a broad field, the American Art Therapy Association has succeeded in doing so, with studies showing emotional and mental improvement in patient groups across many case types.

  20. Art therapy: an underutilized, yet effective tool

    Directory of Open Access Journals (Sweden)

    Robert A. Bitonte

    2014-03-01

    Full Text Available Art therapy has been recognized as beneficial and effective since first described by Adrian Hill in 1942. Even before this time, art therapy was utilized for moral reinforcement and psychoanalysis. Art therapy aids patients with, but not limited to, chronic illness, physical challenges, and cancer in both pediatric and adult scenarios. Although effective in patient care, the practice of art therapy is extremely underutilized, especially in suburban areas. While conducting our own study in northeastern Ohio, USA, we found that only one out of the five inpatient institutions in the suburban area of Mahoning County, Ohio, that we contacted provided continuous art therapy to its patients. In the metropolitan area of Cuyahoga County, Ohio, only eight of the twenty-two inpatient institutions in the area provided art therapy. There could be many reasons as to why art therapy is not frequently used in these areas, and in medical institutions in general. One cause could be the limited amount of research done on the practice. Although it is difficult to conduct formal research on such a broad field, the American Art Therapy Association has succeeded in doing so, with studies showing emotional and mental improvement in patient groups across many case types.

  1. Comparison of statistical accuracy between the 'direct' and the 'reverse' time-of-flight techniques

    International Nuclear Information System (INIS)

    Kudryashev, V.A.; Hartung, U.

    1992-01-01

    The statistical accuracy of two neutron time-of-flight (TOF) diffraction techniques, the classic 'forward' TOF and the 'reverse' TOF technique, is compared. The problem is discussed in dependence on the diffracted spectrum, the background and some special device parameters. In general, the 'reverse' TOF method yields better statistics in the spectrum's range above the medium channel content; with the classic TOF method this is achieved in the lower range. For that reason, the reverse TOF measurement is especially recommended for structure problems and the forward TOF technique for studying the background (e.g. the inelastically scattered portion). (orig.)

  2. Pattern recognition in remote-sensing imagery using data mining and statistical techniques

    Science.gov (United States)

    Singh, Rajesh Kumar

    The remote sensing image classification domain has been explored and examined by scientists in the past using classical statistical and machine-learning techniques. Statistical techniques like Bayesian classifiers are good when the data are noise-free or normalized, while implicit models, or machine learning algorithms, such as artificial neural networks (ANN), are more of a "black box", relying on iterative training to adjust parameters using transfer functions to improve their predictive ability relative to training data for which the outputs are known. The statistical approach performs better when a priori information about categories is available, but it has limitations in the case of objective classification and when the distribution of data points is not known, as is the case with remote sensing satellite data. Data mining algorithms, which have potential advantages over classical statistical classifiers in analyzing remote sensing imagery data, were examined for use in land use classification of remote sensing data. Spectral classifications of LANDSAT(TM) imagery from 1989 were conducted using data mining and statistical techniques. The site selected for this research was NASA's Kennedy Space Center (KSC) in Florida. The raw satellite data used in classification were obtained using feature-extraction image processing techniques. The classification effort can broadly be divided into two major categories: (a) supervised classification with subjectively defined prior known classes, and (b) unsupervised classification with objectively categorized natural groups of similar attributes. Several predictive models and segmentation classification schemes were developed. The techniques used for evaluation of spectral patterns were based on both statistical and data mining algorithms. The statistical technique involved the k-nearest neighbor method, while the data mining algorithms included: (1) the back-propagation artificial neural network technique for two
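
    As a concrete sketch of the statistical baseline mentioned in this record, the k-nearest neighbor method can be applied to pixel feature vectors in a few lines. The band values and class labels below are synthetic stand-ins, not the KSC LANDSAT data:

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import train_test_split

        # Toy stand-in for pixel spectra: rows are pixels, columns are TM bands
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (100, 6)), rng.normal(3, 1, (100, 6))])
        y = np.repeat([0, 1], 100)     # two hypothetical land-use classes

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))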

  3. Talons and beaks are viable but underutilized samples for detecting ...

    African Journals Online (AJOL)

    Talons and beaks are viable but underutilized samples for detecting organophosphorus and carbamate pesticide poisoning in raptors. Ngaio Richards, Irene Zorrilla, Joseph Lalah, Peter Otieno, Isabel Fernandez, Monica Calvino, Joaquin Garcia ...

  4. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  5. Direct Percutaneous Jejunostomy-An Underutilized Interventional Technique?

    International Nuclear Information System (INIS)

    Sparrow, Patrick; David, Elizabeth; Pugash, Robyn

    2008-01-01

    Our aim in this study was to report our single-center experience with direct percutaneous jejunostomy over a 4-year period with regard to technical success rate, immediate and late complications, and patient tolerance of the procedure. Institutional records of 22 consecutive patients who underwent radiological insertion of a percutaneous jejunostomy for a variety of indications were reviewed. The proximal jejunum was punctured under either fluoroscopic or ultrasonic guidance, and following placement of retention sutures, a 10- to 12-Fr catheter was inserted. There was a 100% technical success rate in placement involving a total of seven operators. The indications for placement were prior gastric resection, newly diagnosed resectable esophageal or gastric carcinoma, unresectable gastric carcinoma with outlet obstruction, and palliative drainage of bowel obstruction. Mean duration of follow-up was 100 days, and mean duration of catheter placement was 57.7 days. There were six minor early complications, consisting of loss of two retention anchors requiring repuncture, three cases of localized excessive postprocedural pain, and one failed relief of symptoms of small bowel obstruction. Four tubes developed late complications (two blocked, one catheter cracked, and one inadvertently pulled out). Three of the four were successfully replaced through the existing tracts. One patient subsequently developed a minor skin infection, while another developed late pericatheter leakage from ascites. We conclude that direct percutaneous jejunostomy is a valuable treatment modality applicable to a number of clinical scenarios, with a high technical success rate and a low serious complication rate.

  6. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  7. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...

  8. Statistical designs and response surface techniques for the optimization of chromatographic systems.

    Science.gov (United States)

    Ferreira, Sergio Luis Costa; Bruns, Roy Edward; da Silva, Erik Galvão Paranhos; Dos Santos, Walter Nei Lopes; Quintella, Cristina Maria; David, Jorge Mauricio; de Andrade, Jailson Bittencourt; Breitkreitz, Marcia Cristina; Jardim, Isabel Cristina Sales Fontes; Neto, Benicio Barros

    2007-07-27

    This paper describes fundamentals and applications of multivariate statistical techniques for the optimization of chromatographic systems. The response surface methodologies central composite design, Doehlert matrix and Box-Behnken design are discussed, and applications of these techniques for the optimization of sample preparation steps (extractions) and the determination of experimental conditions for chromatographic separations are presented. The use of mixture designs for the optimization of mobile phases is also discussed. An optimization example involving a real separation process is described in detail. A discussion about model validation is presented. Some applications of other multivariate techniques for the optimization of chromatographic methods are also summarized.
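
    To make the response-surface workflow concrete, a central composite design for two factors can be constructed by hand and fitted with an ordinary least-squares quadratic model. A Python sketch, with placeholder responses standing in for measured chromatographic outcomes:

        import numpy as np

        # Two-factor central composite design: 4 factorial points,
        # 4 axial points at distance alpha, and centre replicates
        alpha = np.sqrt(2)
        factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
        axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
        centre = [(0, 0)] * 3
        design = np.array(factorial + axial + centre)

        # Quadratic response-surface model y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        def model_matrix(d):
            x1, x2 = d[:, 0], d[:, 1]
            return np.column_stack([np.ones(len(d)), x1, x2, x1 * x2, x1**2, x2**2])

        y = np.random.default_rng(2).normal(size=len(design))   # placeholder responses
        coef, *_ = np.linalg.lstsq(model_matrix(design), y, rcond=None)
        print(coef)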

  9. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use of audit sampling techniques by internal audit functions in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.
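
    One quantitative piece of this topic is determining an attribute sample size. A common zero-expected-deviation approximation, shown here only as a sketch and a simplification of the tables auditors actually use, divides a Poisson confidence factor by the tolerable deviation rate:

        import math

        def attribute_sample_size(confidence=0.95, tolerable_rate=0.05):
            # Zero-expected-deviation case: n = -ln(1 - confidence) / tolerable rate
            return math.ceil(-math.log(1.0 - confidence) / tolerable_rate)

        print(attribute_sample_size())   # -> 60 items at 95% confidence, 5% tolerable rate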

  10. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  11. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  12. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  13. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  14. Assumptions for well-known statistical techniques: Disturbing explanations for why they are seldom checked

    Directory of Open Access Journals (Sweden)

    Rink eHoekstra

    2012-05-01

    Full Text Available A valid interpretation of most statistical techniques requires that the criteria for one or more assumptions are met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another, more disquieting, explanation would be that violations of assumptions are hardly checked for in the first place. In this article a study is presented on whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. They were asked to analyze the data as they would their own data, for which often used and well-known techniques like the t-procedure, ANOVA and regression were required. It was found that they hardly ever checked for violations of assumptions. Interviews afterwards revealed that mainly lack of knowledge and nonchalance, rather than more rational reasons like being aware of the robustness of a technique or unfamiliarity with an alternative, seem to account for this behavior. These data suggest that merely encouraging people to check for violations of assumptions will not lead them to do so, and that the use of statistics is opportunistic.
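
    The assumption checks that the interviewed researchers skipped are inexpensive to run. A brief Python sketch with SciPy, on synthetic two-group data, showing normality and equal-variance checks ahead of a t-test, plus Welch's variant, which drops the equal-variance assumption:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        group_a = rng.normal(0.0, 1.0, 40)
        group_b = rng.normal(0.3, 1.5, 40)

        # Normality within each group (Shapiro-Wilk) ...
        print("Shapiro A:", stats.shapiro(group_a).pvalue)
        print("Shapiro B:", stats.shapiro(group_b).pvalue)
        # ... and homogeneity of variances (Levene) before a t-test
        print("Levene  :", stats.levene(group_a, group_b).pvalue)

        # Welch's t-test drops the equal-variance assumption entirely
        print("Welch t :", stats.ttest_ind(group_a, group_b, equal_var=False).pvalue)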

  15. Statistical Approaches in GIS-Based Techniques for Sustainable Planning: Kayaçukuru Case

    OpenAIRE

    Aygun Erdogan

    2003-01-01

    The purpose of this study is to make both a summary and additional descriptive and inferential statistical analyses for a completed thesis on "Sustainable/Environment Friendly Development Planning of Fethiye-Kayaçukuru Using GIS-Based Techniques" (M.Sc. in the Graduate School of Geodetic and Geographic Information Technologies, Middle East Technical University, Supervisor: Assoc.Prof.Dr. Oğuz Işık, September 2000, 214 pages). The statistical analyses explained in this paper comprise a part of...

  16. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, some of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  17. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, some of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  19. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
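
    The confidence bounds referenced in [1] are, at their core, standard t-based intervals on a sample mean. A Python sketch with hypothetical savings figures; the numbers below are invented for illustration, not taken from the dataset described:

        import numpy as np
        from scipy import stats

        # Hypothetical annual energy savings (kWh) from a sample of retrofitted motors
        savings = np.array([1210, 1185, 1302, 1254, 1198, 1276, 1231, 1260])

        n = savings.size
        mean, sem = savings.mean(), stats.sem(savings)
        lo, hi = stats.t.interval(0.95, n - 1, loc=mean, scale=sem)
        print(f"95% confidence bounds: [{lo:.1f}, {hi:.1f}] kWh")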

  20. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    International Nuclear Information System (INIS)

    Park, Jinyong; Balasingham, P.; McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W.

    2004-01-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater

  1. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    ARL-TR-8272 ● JAN 2018 ● US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques.

  2. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  3. [Statistical study of the wavelet-based lossy medical image compression technique].

    Science.gov (United States)

    Puniene, Jūrate; Navickas, Ramūnas; Punys, Vytenis; Jurkevicius, Renaldas

    2002-01-01

    Medical digital images have informational redundancy. Both the amount of memory for image storage and their transmission time could be reduced if image compression techniques are applied. The techniques are divided into two groups: lossless (compression ratio does not exceed 3 times) and lossy ones. Compression ratio of lossy techniques depends on visibility of distortions. It is a variable parameter and it can exceed 20 times. A compression study was performed to evaluate the compression schemes, which were based on the wavelet transform. The goal was to develop a set of recommendations for an acceptable compression ratio for different medical image modalities: ultrasound cardiac images and X-ray angiographic images. The acceptable image quality after compression was evaluated by physicians. Statistical analysis of the evaluation results was used to form a set of recommendations.
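
    The wavelet compression scheme under study can be emulated with a transform-threshold-reconstruct loop. A sketch using PyWavelets on a random stand-in image, keeping only the largest 10% of coefficients; the wavelet, decomposition level, and threshold are illustrative choices, not those of the study:

        import numpy as np
        import pywt

        rng = np.random.default_rng(4)
        image = rng.normal(size=(64, 64))            # stand-in for a medical image

        # Multilevel 2-D wavelet transform, then discard the smallest coefficients
        coeffs = pywt.wavedec2(image, "db4", level=3)
        arr, slices = pywt.coeffs_to_array(coeffs)
        threshold = np.percentile(np.abs(arr), 90)   # keep roughly the largest 10%
        arr[np.abs(arr) < threshold] = 0.0

        # Reconstruct the lossy image from the thresholded coefficients
        coeffs_t = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
        recon = pywt.waverec2(coeffs_t, "db4")[: image.shape[0], : image.shape[1]]
        print("RMS error:", np.sqrt(np.mean((image - recon) ** 2)))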

  4. ISOLATED SPEECH RECOGNITION SYSTEM FOR TAMIL LANGUAGE USING STATISTICAL PATTERN MATCHING AND MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    VIMALA C.

    2015-05-01

    Full Text Available In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as the Hidden Markov Model (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM), and Decision Trees (DT) are most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the developed system, the highest word recognition accuracy is achieved with the HMM technique. It offered 100% accuracy during the training process and 97.92% during the testing process.
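
    Of the techniques surveyed, DTW is the simplest to show end to end: it aligns two utterances of different speeds before measuring their distance. A compact Python sketch on synthetic 1-D feature sequences; real systems would compare MFCC frames, not raw sinusoids:

        import numpy as np

        def dtw_distance(a, b):
            """Classic dynamic time warping between two 1-D feature sequences."""
            n, m = len(a), len(b)
            d = np.full((n + 1, m + 1), np.inf)
            d[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
            return d[n, m]

        # A template matches a time-stretched copy far better than a different word
        template = np.sin(np.linspace(0, 3, 50))
        same_word = np.sin(np.linspace(0, 3, 70))    # same shape, different speed
        other_word = np.cos(np.linspace(0, 5, 60))
        print(dtw_distance(template, same_word), dtw_distance(template, other_word))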

  5. Ignoring Functionality as a Correlate of the Underutilization of ...

    African Journals Online (AJOL)

    Ignoring Functionality as a Correlate of the Underutilization of Computer and Information Technology in Rwandan Higher Education Institutions. ... Data were collected on the institutions' expenditure on components of the TCO and the findings contrasted with documented experiences from CIT-savvy settings, to establish ...

  6. Antioxidant Activities and Food Value of Five Underutilized Green ...

    African Journals Online (AJOL)

    ... iron, calcium and vitamin C. The antioxidant activities and phenolic antioxidant contents of the vegetables were also high. The health claims associated with some of these food values and bioactive compounds are noteworthy, thereby underlining the potential role of these underutilized vegetables as functional foods.

  7. Assessment of nutritional values of three underutilized indigenous ...

    African Journals Online (AJOL)

    The nutritional values of three underutilized indigenous leafy vegetables of Izzi land in Ebonyi State of Nigeria; Zanthoxylum zanthoyloides Herms, Vitex doniana Sweet and Adenia cissamploides Zepernick, were investigated. Their proximate and mineral values (Ca, P, Na, Mg, Zn, K, Fe, Cu, Pb) were determined. Results of ...

  8. Application of statistical downscaling technique for the production of wine grapes (Vitis vinifera L.) in Spain

    Science.gov (United States)

    Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.

    2012-04-01

    Climate and soil are two of the most important limiting factors for agricultural production. Nowadays climate change has been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools to simulate the more relevant aspects of the climate expected for the XXI century in the context of climate change. These models are able to reproduce the general features of atmospheric dynamics, but their low resolution (about 200 km) prevents proper simulation of lower-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international environments, leading to excellent results on future climate models. In a collaboration project, this statistical downscaling technique was applied to predict future scenarios for grape growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, mainly for grapes, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to negative impacts produced by any modification of climate in the different regions, mainly those protected with a protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
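
    The first stage of an analogue method like the one described can be sketched as a nearest-neighbour search over archived large-scale fields. The Python sketch below is a generic illustration under invented data, not FIC's actual two-step implementation:

        import numpy as np

        def analogue_downscale(target_field, historical_fields, local_obs, k=5):
            """Find the k archived large-scale patterns closest to the GCM field
            and average the local observations recorded on those days."""
            dists = np.linalg.norm(historical_fields - target_field, axis=1)
            nearest = np.argsort(dists)[:k]
            return local_obs[nearest].mean()

        rng = np.random.default_rng(10)
        hist = rng.normal(size=(1000, 20))     # 1000 archived pressure patterns
        obs = rng.normal(15, 5, 1000)          # station temperature on those days
        print(analogue_downscale(rng.normal(size=20), hist, obs))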

  9. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Full Text Available Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.

  10. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression.

    Science.gov (United States)

    Heine, John J; Land, Walker H; Egan, Kathleen M

    2011-01-27

    When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique was comprised of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.

  11. Flow prediction models using macroclimatic variables and multivariate statistical techniques in the Cauca River Valley

    International Nuclear Information System (INIS)

    Carvajal Escobar Yesid; Munoz, Flor Matilde

    2007-01-01

    This project centres on a review of the state of the art of the ocean-atmospheric phenomena that affect Colombian hydrology, especially the ENSO (El Niño-Southern Oscillation) phenomenon, which causes a first-order socioeconomic impact in the country and has not been sufficiently studied; it is therefore important to address this topic, including the macroclimatic variables associated with ENSO in water-planning analyses. The analyses include a review of statistical techniques for checking the consistency of hydrological data, with the objective of building a reliable and homogeneous database of monthly flows of the Cauca River. Statistical methods (multivariate data analysis), specifically principal component analysis, are used in the development of models for predicting monthly mean flows in the Cauca River, involving both linear approaches, namely the autoregressive AR, ARX and ARMAX models, and a nonlinear approach, artificial neural networks.
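
    The linear AR family mentioned here is straightforward to demonstrate. A Python sketch fitting an AR(1) model to synthetic monthly flows by least squares; real applications would add exogenous ENSO indices for the ARX/ARMAX variants:

        import numpy as np

        def fit_ar1(series):
            """Least-squares fit of an AR(1) model x_t = a * x_{t-1} + b + e_t."""
            x_prev, x_next = series[:-1], series[1:]
            A = np.column_stack([x_prev, np.ones_like(x_prev)])
            (a, b), *_ = np.linalg.lstsq(A, x_next, rcond=None)
            return a, b

        rng = np.random.default_rng(8)
        flow = np.empty(240)                 # 20 years of synthetic monthly flows
        flow[0] = 100.0
        for t in range(1, 240):
            flow[t] = 0.7 * flow[t - 1] + 30 + rng.normal(0, 5)

        a, b = fit_ar1(flow)
        print(f"one-step forecast: {a * flow[-1] + b:.1f}  (a={a:.2f}, b={b:.1f})")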

  12. TECHNIQUE OF CARRYING OUT THE PRACTICAL TRAINING ON MATHEMATICAL STATISTICS ABOUT USE OF INFORMATION TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    В А Бубнов

    2017-12-01

    Full Text Available This article presents the content of a technique for practical training in a computer lab, using as an example the statistical analysis of the dollar price in rubles during March 2017 with the Microsoft Excel program. The analysis makes it possible to move from the traditional data describing the dynamics of the dollar price by date to identifying the days of the month on which the price clusters around the monthly mean, as well as the so-called rare days on which the dollar price differs strongly from the mean, both downward and upward.

  13. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides better comprehension of the hydrological settings of different regions. This study shows the capability of two GIS-based data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of outcomes displayed that the area under the curve (AUC) of the SI and DST techniques is 81.23% and 79.41%, respectively, which shows that the SI method has slightly better performance than the DST technique. Therefore, SI and DST methods are advantageous for analyzing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, which permits investigation of both systemic and stochastic uncertainty. Finally, it can be realized that these techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
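
    The statistical index is a weights-of-evidence-style measure comparing spring density within a factor class to the overall density. A Python sketch with hypothetical class counts; only the total of 496 springs comes from the record:

        import numpy as np

        def statistical_index(springs_in_class, cells_in_class, springs_total, cells_total):
            """SI = ln( (springs_in_class / cells_in_class) / (springs_total / cells_total) ).

            Positive values mean springs are over-represented in the class.
            """
            class_density = springs_in_class / cells_in_class
            overall_density = springs_total / cells_total
            return np.log(class_density / overall_density)

        # Hypothetical slope classes: (spring count, cell count) per class
        classes = {"0-5 deg": (120, 40000), "5-15 deg": (250, 30000), ">15 deg": (126, 50000)}
        total_springs, total_cells = 496, 120000
        for name, (s, c) in classes.items():
            print(name, statistical_index(s, c, total_springs, total_cells))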

  14. A Superposition Technique for Deriving Photon Scattering Statistics in Plane-Parallel Cloudy Atmospheres

    Science.gov (United States)

    Platnick, S.

    1999-01-01

    Photon transport in a multiple scattering medium is critically dependent on scattering statistics, in particular the average number of scatterings. A superposition technique is derived to accurately determine the average number of scatterings encountered by reflected and transmitted photons within arbitrary layers in plane-parallel, vertically inhomogeneous clouds. As expected, the resulting scattering number profiles are highly dependent on cloud particle absorption and solar/viewing geometry. The technique uses efficient adding and doubling radiative transfer procedures, avoiding traditional time-intensive Monte Carlo methods. Derived superposition formulae are applied to a variety of geometries and cloud models, and selected results are compared with Monte Carlo calculations. Cloud remote sensing techniques that use solar reflectance or transmittance measurements generally assume a homogeneous plane-parallel cloud structure. The scales over which this assumption is relevant, in both the vertical and horizontal, can be obtained from the superposition calculations. Though the emphasis is on photon transport in clouds, the derived technique is applicable to any scattering plane-parallel radiative transfer problem, including arbitrary combinations of cloud, aerosol, and gas layers in the atmosphere.

  15. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.

    2015-12-16

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
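
    As a minimal instance of the ODE framework discussed, a single production-degradation equation already illustrates steady-state analysis. The sketch below is a generic stand-in for one node of a pathway model, not a published Wnt model:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Minimal production-degradation ODE, a hypothetical stand-in for one
        # pathway node: dx/dt = k_prod - k_deg * x
        def rhs(t, x, k_prod=1.0, k_deg=0.5):
            return k_prod - k_deg * x

        sol = solve_ivp(rhs, (0.0, 20.0), [0.0], dense_output=True)
        print("steady state ~", sol.y[0, -1], "(analytic k_prod / k_deg = 2.0)")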

  16. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Science.gov (United States)

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with the different statistical distributions of gait variables from the left and right sides of the lower limbs; that is, the discrimination of small differences in similarity between the lower limbs is considered as the reorganization of their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on an advanced statistical learning algorithm, the support vector machine algorithm for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more intrinsic dynamic information hidden in gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, our proposed technique could identify small significant differences between the lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for early identification of elderly gait asymmetry in clinical diagnosis. PMID:25705672

  17. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

    Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with the different statistical distributions of gait variables from the left and right sides of the lower limbs; that is, the discrimination of small differences in similarity between the lower limbs is considered as the reorganization of their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on an advanced statistical learning algorithm, the support vector machine algorithm for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more intrinsic dynamic information hidden in gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, our proposed technique could identify small significant differences between the lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for early identification of elderly gait asymmetry in clinical diagnosis.
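
    The classification step described in both versions of this record can be sketched with an RBF-kernel support vector machine and cross-validation. The gait features below are synthetic stand-ins for the left/right ground-reaction descriptors:

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Toy stand-in for kinetic gait features from left/right lower limbs
        rng = np.random.default_rng(5)
        symmetric = rng.normal(0.0, 1.0, (60, 4))
        asymmetric = rng.normal(0.4, 1.0, (60, 4))
        X = np.vstack([symmetric, asymmetric])
        y = np.repeat([0, 1], 60)

        # RBF-kernel SVM, scored with 5-fold cross-validation
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())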

  18. Network Models: An Underutilized Tool in Wildlife Epidemiology?

    Directory of Open Access Journals (Sweden)

    Meggan E. Craft

    2011-01-01

    Full Text Available Although the approach of contact network epidemiology has been increasing in popularity for studying transmission of infectious diseases in human populations, it has generally been an underutilized approach for investigating disease outbreaks in wildlife populations. In this paper we explore the differences between the type of data that can be collected on human and wildlife populations, provide an update on recent advances that have been made in wildlife epidemiology by using a network approach, and discuss why networks might have been underutilized and why networks could and should be used more in the future. We conclude with ideas for future directions and a call for field biologists and network modelers to engage in more cross-disciplinary collaboration.

  19. [Comments on] Statistical techniques for the development and application of SYVAC. (Document by Stephen Howe Ltd.)

    International Nuclear Information System (INIS)

    Beale, E.M.L.

    1983-05-01

    The Department of the Environment has embarked on a programme to develop computer models to help with assessment of sites suitable for the disposal of nuclear wastes. The first priority is to produce a system, based on the System Variability Analysis Code (SYVAC) obtained from Atomic Energy of Canada Ltd., suitable for assessing radioactive waste disposal in land repositories containing non heat producing wastes from typical UK sources. The requirements of the SYVAC system development were so diverse that each portion of the development was contracted to a different company. Scicon are responsible for software coordination, system integration and user interface. Their present report contains comments on 'Statistical techniques for the development and application of SYVAC'. (U.K.)

  20. Neutron multiplicity equation and its application for (n,2n) multiplication measurements by statistical correlation techniques

    International Nuclear Information System (INIS)

    Kumar, A.; Srinivasan, M.

    1986-01-01

    A new equation, called the neutron multiplicity equation (NME), has been derived starting from basic physics principles. Neutron multiplicity v is defined as the integral number of neutrons leaking from a neutron multiplying system for a source neutron introduced into it. The probability distribution of neutron multiplicities (PDNM) gives the probability of leakage of neutrons as a function of their multiplicity v. The PDNM is directly measurable through statistical correlation techniques. In a specific application, the NME has been solved for the PDNM as a function of v for ⁹Be spheres of varying radii driven by a centrally located 14-MeV deuterium-tritium neutron source. The potential of the NME for sensitivity analysis is demonstrated through a particular modification of the secondary neutron transfer cross sections of ⁹Be. It turns out that the PDNM is very sensitive, even as the 'average' neutron leakage is practically insensitive to it

  1. Source of statistical noises in the Monte Carlo sampling techniques for coherently scattered photons

    Science.gov (United States)

    Muhammad, Wazir; Lee, Sang Hoon

    2013-01-01

    Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs) and their advantages and shortcomings in calculating elastic scattering cross sections can be found in the literature. However, the issues related to their implementation in the Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method to draw the integrated values of squared RFFs/MFFs over squared momentum transfer. In the current study, the role/issues of RFFs/MFFs and LIT in the MC sampling for coherent scattering were analyzed. The results showed that the relative probability density curves sampled on the basis of MFFs are unable to reveal any extra scientific information, as both the RFFs and MFFs produced the same MC sampled curves. Furthermore, no relationship was established between the multiple small peaks and irregular step shapes (i.e. statistical noise) in the PDFs and either RFFs or MFFs. In fact, the noise in the PDFs appeared due to the use of the LIT. The density of the noise depends upon the interval length between two consecutive points in the input data table and has no scientific background. The probability density function curves became smoother as the interval lengths were decreased. In conclusion, this statistical noise can be efficiently removed by introducing more data points into the data tables. PMID:22984278

  2. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    Science.gov (United States)

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window values; however, most real systems are nonlinear, which makes the linear PCA method unable to tackle the issue of nonlinearity to a great extent. Thus, in this paper, first, we apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model. The KPCA is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this might further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, which is called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
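
    The exponential weighting at the centre of the EWMA-GLRT idea is easy to isolate. A Python sketch of the EWMA recursion on synthetic residuals with an induced mean shift; the smoothing parameter is an illustrative choice:

        import numpy as np

        def ewma(residuals, lam=0.2):
            """Exponentially weighted moving average of a residual sequence.

            z_t = lam * r_t + (1 - lam) * z_{t-1}; recent samples dominate,
            which is the weighting idea behind the EWMA-GLRT statistic.
            """
            z = np.zeros_like(residuals, dtype=float)
            for t, r in enumerate(residuals):
                z[t] = lam * r + (1 - lam) * (z[t - 1] if t else 0.0)
            return z

        # A small mean shift at t = 60 shows up clearly in the smoothed statistic
        rng = np.random.default_rng(6)
        r = rng.normal(0, 1, 120)
        r[60:] += 1.5
        print(np.abs(ewma(r))[55:65].round(2))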

  3. Statistical downscaling of temperature using three techniques in the Tons River basin in Central India

    Science.gov (United States)

    Duhan, Darshana; Pandey, Ashish

    2015-08-01

    In this study, downscaling models were developed for the projection of monthly maximum and minimum air temperature at three stations, namely, Allahabad, Satna, and Rewa in the Tons River basin, which is a sub-basin of the Ganges River in Central India. Three downscaling techniques, namely, multiple linear regression (MLR), artificial neural network (ANN), and least square support vector machine (LS-SVM), were used for the development of the models, and the best identified model was used for simulations of the future predictand (temperature) using the third-generation Canadian Coupled Global Climate Model (CGCM3) simulation of the A2 emission scenario for the period 2001-2100. The performance of the models was evaluated based on four statistical performance indicators. To reduce the bias in the monthly projected temperature series, a bias correction technique was employed. The results show that all the models are able to simulate temperature; however, the LS-SVM models perform slightly better than ANN and MLR. The best identified LS-SVM models were then employed to project future temperature. The results of the future projections show increasing trends in maximum and minimum temperature for the A2 scenario. Further, it is observed that minimum temperature will increase at a greater rate than maximum temperature.
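
    The bias correction step can be illustrated in its simplest additive ("delta") form: shift the projected series by the historical model-minus-observation bias. A Python sketch with invented monthly temperature series; the study's actual correction may differ in detail:

        import numpy as np

        def mean_bias_correction(model_hist, model_future, obs_hist):
            """Simple additive bias correction for downscaled temperature.

            Shifts the projected series by the historical model-minus-observation
            bias; a fuller implementation would compute the bias month by month.
            """
            bias = model_hist.mean() - obs_hist.mean()
            return model_future - bias

        rng = np.random.default_rng(7)
        obs = rng.normal(25.0, 2.0, 360)       # observed monthly Tmax (deg C)
        hist = rng.normal(26.1, 2.0, 360)      # model output over the same period
        future = rng.normal(27.5, 2.0, 360)    # raw A2-scenario projection
        print("corrected future mean:", mean_bias_correction(hist, future, obs).mean())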

  4. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques such as remote sensing and spatial statistics tools can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies and gives suggestions on monitoring and surveillance methods for designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  5. Machine learning and statistical techniques : an application to the prediction of insolvency in Spanish non-life insurance companies

    OpenAIRE

    Díaz, Zuleyka; Segovia, María Jesús; Fernández, José

    2005-01-01

    Prediction of insurance company insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques that use financial ratios as explanatory variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of the mentioned methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...

  6. Improving the statistics of quantitative ultrasound techniques with deformation compounding: an experimental study.

    Science.gov (United States)

    Herd, Maria-Teresa; Hall, Timothy J; Jiang, Jingfeng; Zagzebski, James A

    2011-12-01

    Many quantitative ultrasound (QUS) techniques are based on estimates of the radio-frequency (RF) echo signal power spectrum. Historically, reliable spectral estimates required spatial averaging over large regions of interest (ROIs). Spatial compounding techniques have been used to obtain robust spectral estimates for data acquired over small ROIs. A new technique referred to as "deformation compounding" is another method for providing robust spectral estimates over smaller regions of interest. Motion tracking software is used to follow an ROI while the tissue is deformed (typically by pressing with the transducer). The deformation spatially reorganizes the scatterers so that the resulting echo signal is decorrelated. The RF echo signal power spectrum for the ROI is then averaged over several frames of RF echo data as the tissue is deformed; hence the name deformation compounding. More specifically, averaging spectral estimates among the uncorrelated RF data acquired following small deformations reduces the variance of the power spectral density estimates and thereby improves the accuracy of spectrum-based tissue property estimation. The viability of deformation compounding has been studied using phantoms with known attenuation and backscatter coefficients. Data from these phantoms demonstrate that a deformation of about 2% frame-to-frame average strain is sufficient to obtain statistically independent echo signals (with correlations of less than 0.2). Averaging five such frames, where local scatterer reorganization has taken place due to mechanical deformations, reduces the average percent standard deviation among power spectra by 26%, and averaging 10 frames reduces it by 49%. Deformation compounding is used in this study to improve measurements of backscatter coefficients. These tests show deformation compounding is a promising method to improve the accuracy of spectrum-based quantitative ultrasound
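
    The statistical core of deformation compounding is plain spectral averaging over decorrelated frames. A minimal sketch with simulated stand-ins for the RF frames (in the study these would be echo frames acquired at roughly 2% strain increments): averaging per-frame periodograms reduces the variance of the spectral estimate roughly as 1/N.

        import numpy as np

        def averaged_power_spectrum(rf_frames, fs):
            # Per-frame periodograms, then the compounded (averaged) estimate.
            rf_frames = np.asarray(rf_frames)
            spectra = np.abs(np.fft.rfft(rf_frames, axis=1)) ** 2
            freqs = np.fft.rfftfreq(rf_frames.shape[1], d=1.0 / fs)
            return freqs, spectra.mean(axis=0)

        rng = np.random.default_rng(2)
        frames = rng.normal(size=(10, 1024))   # 10 decorrelated RF segments (simulated)
        freqs, psd = averaged_power_spectrum(frames, fs=40e6)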

  7. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two

  8. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    Directory of Open Access Journals (Sweden)

    D. Fenta Mekonnen

    2018-04-01

    Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and coarse spatial resolutions make it difficult to use the outputs of GCMs directly, especially for sustainable water management at regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) of the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation result illustrates that both downscaling techniques (LARS-WG and SDSM) have shown comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done by equally weighted and varying weights of statistical indexes for precipitation only. The evaluation result showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and distribution of daily precipitation in the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios by the LARS-WG model. The result from the ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase

  9. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    Science.gov (United States)

    Fenta Mekonnen, Dagnenet; Disse, Markus

    2018-04-01

    Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and coarse spatial resolutions make it difficult to use the outputs of GCMs directly, especially for sustainable water management at regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) of the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation result illustrates that both downscaling techniques (LARS-WG and SDSM) have shown comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done by equally weighted and varying weights of statistical indexes for precipitation only. The evaluation result showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and distribution of daily precipitation in the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios by the LARS-WG model. The result from the ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase from 0.4 to 4.3
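
    The ensemble-mean relative change reported at the end of this record is straightforward to compute once the downscaled series exist. A sketch with hypothetical numbers (six GCMs, baseline versus future annual precipitation); the values are synthetic, not the study's data.

        import numpy as np

        rng = np.random.default_rng(12)
        baseline = rng.normal(1200.0, 60.0, size=(6, 30))        # 6 GCMs x 30 years (mm)
        future = baseline * rng.uniform(1.01, 1.14, size=(6, 1)) # hypothetical wetter future

        # Relative change per GCM, then the ensemble mean reported in such studies
        rel_change = 100.0 * (future.mean(axis=1) - baseline.mean(axis=1)) / baseline.mean(axis=1)
        print("per-GCM change (%):", np.round(rel_change, 1))
        print("ensemble-mean change (%):", round(rel_change.mean(), 1))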

  10. Study of a 5 kW PEMFC using experimental design and statistical analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wahdame, B.; Francois, X.; Kauffmann, J.M. [Laboratory of Electrical Engineering and Systems (L2ES), Unite mixte de recherche UTBM and UFC - EA 3898, L2ES-UTBM Batiment F, rue Thierry Mieg, 90010 BELFORT Cedex (France); Candusso, D.; Harel, F.; De Bernardinis, A.; Coquery, G. [The French National Institute for Transport and Safety Research (INRETS), 2 avenue du General Malleret-Joinville, 94 114 ARCUEIL Cedex (France)

    2007-02-15

    Within the framework of the French inter-laboratory SPACT project (Fuel Cell Systems for Transportation Applications), the behavior of a 5 kW PEM fuel cell stack, fed by humidified hydrogen and compressed air, is investigated in a test platform at Belfort, France. A set of polarization curves is recorded under various conditions of stack temperature, gas pressure, and stoichiometry rate, in order to obtain a kind of cartography representing the static stack performance. Initially, the tests are defined using experimental design techniques. In order to study the relative impacts of the physical factors on the fuel cell voltage, some polarization curve results are selected from the available static tests by applying experimental design methodology. First, several analyses are used to estimate the impact of the stack temperature, gas pressure, and stoichiometry rate on the fuel cell voltage. Statistical sensitivity analyses (ANOVA) are used to compute, from the available data, the effects and respective contributions of the various physical factors on the stack voltage. The potential for detecting interactions between the different parameters is shown. Graphic representations are also used to display the results of the statistical analyses made for different current values of the polarization curves. Then, the experimental design method and its associated statistical tools are employed to identify the influence of the stack temperature and gas pressure on the fuel cell voltage. Moreover, it is shown how the number of experiments needed can be reduced and how certain optimizations of the fuel cell operating parameters, leading to higher performance, can be achieved. The work presented aims at showing the suitability of the experimental design method for the characterization, analysis, and improvement of a complex system like a fuel cell generator. The future outlook is proposed in the final part of the paper. The methodologies
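
    The ANOVA step described here can be reproduced with standard tools. A minimal two-factor sketch using statsmodels, with hypothetical voltage readings at two temperature and two pressure levels (the real study uses more levels and adds the stoichiometry rate):

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        df = pd.DataFrame({
            "voltage": [3.31, 3.35, 3.40, 3.44, 3.29, 3.33, 3.41, 3.46],  # hypothetical
            "temp":    ["low", "low", "high", "high"] * 2,
            "press":   ["low", "high"] * 4,
        })
        model = ols("voltage ~ C(temp) * C(press)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))   # main effects and interaction contributions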

  11. Statistical techniques for modeling extreme price dynamics in the energy market

    International Nuclear Information System (INIS)

    Mbugua, L N; Mwita, P N

    2013-01-01

    Extreme events have a large impact throughout the span of engineering, science and economics, because they often lead to failure and losses arising from the unobservable nature of extraordinary occurrences. In this context, this paper focuses on appropriate statistical methods combining a quantile regression approach with extreme value theory to model the excesses, which plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when one knows little about the functional form of the object being estimated. Conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the problem of inadequate observations at the tail of the distribution function. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method used can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modeling.
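
    The peaks-over-threshold piece of such an analysis is easy to sketch. Below, a generalized Pareto distribution is fitted to excesses over a high empirical quantile of a synthetic series (a stand-in for fuel-price data), and a tail quantile is reconstructed; the quantile-regression component of the paper is omitted.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        prices = rng.lognormal(mean=0.0, sigma=0.4, size=5000)   # synthetic price data
        u = np.quantile(prices, 0.95)                            # threshold choice
        excesses = prices[prices > u] - u

        shape, _, scale = stats.genpareto.fit(excesses, floc=0.0)
        p = 0.999                                                # target tail probability
        n, nu = len(prices), len(excesses)
        q999 = u + stats.genpareto.ppf(1.0 - (1.0 - p) * n / nu, shape, scale=scale)
        print("POT estimate of the 99.9% quantile:", round(q999, 3))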

  12. Statistical inference on associated fertility life table parameters using jackknife technique: computational aspects.

    Science.gov (United States)

    Maia, A de H; Luiz, A J; Campanhola, C

    2000-04-01

    Knowledge of population growth potential is crucial for studying population dynamics and for establishing management tactics for pest control. Estimation of population growth can be achieved with fertility life tables because they synthesize data on reproduction and mortality of a population. The five main parameters associated with a fertility life table are as follows: (1) the net reproductive rate (Ro), (2) the intrinsic rate of increase (rm), (3) the mean generation time (T), (4) the doubling time (Dt), and (5) the finite rate of increase (lambda). Jackknife and bootstrap techniques are used to calculate the variance of the rm estimate, which can be extended to the other parameters of life tables. Those methods are computer-intensive, their application requires the development of efficient algorithms, and their implementation is based on a programming language that combines speed and reliability. The objectives of this article are to discuss statistical and computational aspects related to estimation of life table parameters and to present a SAS program that uses the jackknife to estimate parameters for fertility life tables. The SAS program presented here allows the calculation of confidence intervals for all estimated parameters, as well as one-sided and two-sided t-tests to perform pairwise or multiple comparisons between groups, with their respective P values.
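
    The delete-one jackknife behind such programs is compact enough to sketch. The paper itself provides a SAS program; below is a rough Python analogue under strong simplifications (a cohort matrix of daily egg counts per female, with NaN after an individual's death), and all data names are hypothetical.

        import numpy as np
        from scipy.optimize import brentq

        def rm_estimate(fecundity):
            # fecundity: hypothetical (n_females x n_ages) daily egg counts, NaN after death.
            alive = ~np.isnan(fecundity)
            lx = alive.mean(axis=0)                                  # survivorship
            counts = np.maximum(alive.sum(axis=0), 1)
            mx = np.where(alive.any(axis=0),
                          np.nansum(fecundity, axis=0) / counts, 0.0)  # age-specific fecundity
            ages = np.arange(1, fecundity.shape[1] + 1)
            euler_lotka = lambda r: np.sum(np.exp(-r * ages) * lx * mx) - 1.0
            return brentq(euler_lotka, -1.0, 2.0)                    # solve for rm

        def jackknife(estimator, data):
            # Delete-one pseudovalues give a bias-corrected estimate and its standard error.
            n = len(data)
            theta = estimator(data)
            theta_i = np.array([estimator(np.delete(data, i, axis=0)) for i in range(n)])
            pseudo = n * theta - (n - 1) * theta_i
            return pseudo.mean(), pseudo.std(ddof=1) / np.sqrt(n)

        # Usage with a hypothetical cohort matrix:
        # rm_hat, rm_se = jackknife(rm_estimate, fecundity)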

  13. Modeling and monitoring of a high pressure polymerization process using multivariate statistical techniques

    Science.gov (United States)

    Sharmin, Rumana

    This thesis explores the use of multivariate statistical techniques in developing tools for property modeling and monitoring of a high pressure ethylene polymerization process. In the polymer industry, many researchers have shown, mainly in simulation studies, the potential of multivariate statistical methods in the identification and control of polymerization processes. However, very few, if any, of these strategies have been implemented. This work was done using data collected from a commercial high pressure LDPE/EVA reactor located at AT Plastics, Edmonton. The models and methods developed in the course of this research have been validated with real data and, in most cases, implemented in real time. One main objective of this PhD project was to develop and implement a data-based inferential sensor to estimate the melt flow index of LDPE and EVA resins using regularly measured process variables. A steady-state PLS method was used to develop the soft sensor model. A detailed description of the data preprocessing steps that should be followed in the analysis of industrial data is given. Models developed for two of the most frequently produced polymer grades at AT Plastics have been implemented. The models were tested on many sets of data and showed acceptable performance when applied with an online bias-updating scheme. One observation from many validation exercises was that the model prediction becomes poorer with time as operators use new process conditions in the plant to produce the same resin with the same specification. During the implementation of the soft sensors, we suggested a simple bias update scheme as a remedy to this problem. An alternative and more rigorous approach is to recursively update the model with new data, which is also more suitable for handling grade transitions. Two existing recursive PLS methods, one based on the NIPALS algorithm and the other based on the kernel algorithm, were reviewed. In addition, we proposed a novel RPLS algorithm which is based on the
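
    A steady-state PLS soft sensor with the simple bias-update scheme described above can be sketched as follows; the process variables, melt flow index values, and update gain are hypothetical stand-ins, not AT Plastics data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(4)
        X = rng.normal(size=(500, 12))                                           # process variables
        y = X[:, :3] @ np.array([0.8, -0.4, 0.3]) + 0.1 * rng.normal(size=500)   # lab MFI (synthetic)

        pls = PLSRegression(n_components=4).fit(X[:400], y[:400])

        bias = 0.0
        for x_new, y_lab in zip(X[400:], y[400:]):
            y_hat = pls.predict(x_new.reshape(1, -1))[0, 0] + bias   # online inference
            bias += 0.1 * (y_lab - y_hat)    # simple bias update when a lab sample arrives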

  14. Proteolytic activities in fillets of selected underutilized Australian fish species.

    Science.gov (United States)

    Ahmed, Z; Donkor, O; Street, W A; Vasiljevic, T

    2013-09-01

    The hydrolytic activity of major endogenous proteases, responsible for proteolysis of myofibrillar proteins during post-mortem storage, may be an indicator of the textural quality of fish which influences consumer purchasing behaviour and thus market value of the final product. Furthermore, it may also influence the type and bioactive properties of the peptides released during post-mortem proteolysis of myofibrillar proteins. This study compared the activities of cathepsins B, B+L, D, H and calpain-like enzymes in crude muscle extracted from 16 Australian underutilized fish species. Fish species had a significant effect on the activity of these enzymes with barracouta showing the highest cathepsins B, B+L, D and H activities. Activities of cathepsins B and B+L were higher than cathepsin H for all studied species. The more commercially important rock ling and tiger flathead demonstrated higher cathepsin B+L activity, whereas gemfish and eastern school whiting showed higher activity towards cathepsin B. Underutilized fish species showing higher endogenous protease activities may be suitable for fish sauce production, whereas those with lower protease activities for surimi processing. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Multiple statistical analysis techniques corroborate intratumor heterogeneity in imaging mass spectrometry datasets of myxofibrosarcoma.

    Directory of Open Access Journals (Sweden)

    Emrys A Jones

    MALDI mass spectrometry can generate profiles that contain hundreds of biomolecular ions directly from tissue. Spatially-correlated analysis, MALDI imaging MS, can simultaneously reveal how each of these biomolecular ions varies in clinical tissue samples. The use of statistical data analysis tools to identify regions containing correlated mass spectrometry profiles is referred to as imaging MS-based molecular histology because of its ability to annotate tissues solely on the basis of the imaging MS data. Several reports have indicated that imaging MS-based molecular histology may be able to complement established histological and histochemical techniques by distinguishing between pathologies with overlapping/identical morphologies and revealing biomolecular intratumor heterogeneity. A data analysis pipeline that identifies regions of imaging MS datasets with correlated mass spectrometry profiles could lead to the development of novel methods for improved diagnosis (differentiating subgroups within distinct histological groups) and annotating the spatio-chemical makeup of tumors. Here it is demonstrated that highlighting the regions within imaging MS datasets whose mass spectrometry profiles were found to be correlated by five independent multivariate methods provides a consistently accurate summary of the spatio-chemical heterogeneity. The corroboration provided by using multiple multivariate methods, efficiently applied in an automated routine, provides assurance that the identified regions are indeed characterized by distinct mass spectrometry profiles, a crucial requirement for its development as a complementary histological tool. When simultaneously applied to imaging MS datasets from multiple patient samples of intermediate-grade myxofibrosarcoma, a heterogeneous soft tissue sarcoma, nodules with mass spectrometry profiles found to be distinct by five different multivariate methods were detected within morphologically identical regions of

  16. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Abbas Alkarkhi, F.M.; Ismail, Norli; Easa, Azhar Mat

    2008-01-01

    Cockles (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) while affording more than 72% correct assignments. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers
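
    The DA step can be reproduced with standard tools. A minimal sketch with synthetic concentrations, loosely patterned on the record (two rivers, 20 samples each, Zn and Cd separating the groups); none of the numbers are the study's data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(5)
        # Columns: Cr, Cd, Zn, Cu, Pb, As, Hg (synthetic concentrations)
        juru   = rng.normal([1.0, 0.8, 60.0, 5.0, 0.5, 2.0, 0.05], 0.3, size=(20, 7))
        jejawi = rng.normal([1.1, 1.4, 85.0, 5.0, 0.5, 2.1, 0.05], 0.3, size=(20, 7))
        X = np.vstack([juru, jejawi])
        y = np.array(["Juru"] * 20 + ["Jejawi"] * 20)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("correct assignments:", lda.score(X, y))          # cf. >72% in the record
        print("discriminant loadings:", lda.scalings_.ravel())  # Zn, Cd should dominate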

  17. Correlation analysis of energy indicators for sustainable development using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Carneiro, Alvaro Luiz Guimaraes [Instituto de Pesquisas Energeticas e Nucleares (IPEN-CNEN/SP), Sao Paulo, SP (Brazil)], E-mail: carneiro@ipen.br; Santos, Francisco Carlos Barbosa dos [Fundacao Instituto de Pesquisas Economicas (FIPE/USP), Sao Paulo, SP (Brazil)], E-mail: fcarlos@fipe.org.br

    2007-07-01

    Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at all levels (local, regional and global), such as combustion of fossil fuels causing air pollution, hydropower often causing environmental damage due to the submergence of large areas of land, and global climate change associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of the IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework Energy Indicators for Sustainable Development. (author)

  18. Correlation analysis of energy indicators for sustainable development using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Carneiro, Alvaro Luiz Guimaraes; Santos, Francisco Carlos Barbosa dos

    2007-01-01

    Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at all levels (local, regional and global), such as combustion of fossil fuels causing air pollution, hydropower often causing environmental damage due to the submergence of large areas of land, and global climate change associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of the IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework Energy Indicators for Sustainable Development. (author)
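
    The correlation analysis itself is a one-liner once the indicator table is assembled. A sketch with hypothetical indicator names and synthetic values (the IBGE data are not reproduced here):

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(13)
        # Rows = regions, columns = illustrative energy indicators
        df = pd.DataFrame(rng.normal(size=(27, 4)),
                          columns=["energy_use_per_capita", "energy_intensity",
                                   "co2_per_capita", "household_income"])
        print(df.corr(method="pearson"))   # pairwise correlations among indicators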

  19. Application of multivariate statistical technique for hydrogeochemical assessment of groundwater within the Lower Pra Basin, Ghana

    Science.gov (United States)

    Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.

    2017-06-01

    Multivariate statistical techniques and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using the Varimax with Kaiser Normalization method of extraction, for both the rotated space and the component matrix, was applied to the data. Results show that the Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin, accounting for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural inputs as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relation among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal
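
    PCA with a varimax rotation, as used here, can be sketched with scikit-learn plus a small rotation routine (the standard varimax algorithm); the borehole data below are synthetic placeholders, and Kaiser normalization is omitted for brevity.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
            # Standard varimax rotation of a p x k loading matrix.
            p, k = loadings.shape
            R = np.eye(k)
            d_old = 0.0
            for _ in range(max_iter):
                L = loadings @ R
                u, s, vt = np.linalg.svd(
                    loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0))))
                R = u @ vt
                d = s.sum()
                if d_old != 0.0 and d / d_old < 1.0 + tol:
                    break
                d_old = d
            return loadings @ R

        rng = np.random.default_rng(6)
        X = StandardScaler().fit_transform(rng.normal(size=(54, 10)))  # 54 boreholes (synthetic)
        pca = PCA(n_components=3).fit(X)
        loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
        rotated = varimax(loadings)   # rotated loadings for interpretation
        print("variance explained:", round(pca.explained_variance_ratio_.sum(), 3))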

  20. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques.

    Science.gov (United States)

    Abbas Alkarkhi, F M; Ismail, Norli; Easa, Azhar Mat

    2008-02-11

    Cockles (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) while affording more than 72% correct assignments. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.

  1. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Abbas Alkarkhi, F.M. [School of Industrial Technology, Environmental Technology Division, Universiti Sains Malaysia, 11800 Penang (Malaysia)], E-mail: abbas@usm.my; Ismail, Norli [School of Industrial Technology, Environmental Technology Division, Universiti Sains Malaysia, 11800 Penang (Malaysia)], E-mail: norlii@usm.my; Easa, Azhar Mat [School of Industrial Technology, Food Technology Division, Universiti Sains Malaysia, 11800 Penang (Malaysia)], E-mail: azhar@usm.my

    2008-02-11

    Cockles (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) while affording more than 72% correct assignments. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.

  2. Source Identification of Heavy Metals in Soils Surrounding the Zanjan Zinc Town by Multivariate Statistical Techniques

    Directory of Open Access Journals (Sweden)

    M.A. Delavar

    2016-02-01

    Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food safety issues, potential health risks, and detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them to human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide essential information for better management of polluted soils to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of probable sources that can control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, the Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sample locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via Atomic Absorption Spectroscopy (AAS). Data were statistically analyzed using the SPSS software version 17.0 for Windows. Correlation Matrix (CM), Principal Component Analysis (PCA) and Factor Analysis (FA) techniques were performed in order to identify the probable sources of HMs in the studied soils. Results and

  3. Assessment of roadside surface water quality of Savar, Dhaka, Bangladesh using GIS and multivariate statistical techniques

    Science.gov (United States)

    Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir

    2017-11-01

    In this study, multivariate statistical techniques in combination with GIS are used to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season, and 15 water quality parameters, including TSS, TDS, pH, DO, BOD, Cl-, F-, NO3-, NO2-, SO42-, Ca, Mg, K, Zn and Pb, were measured. The univariate overview of the water quality parameters is TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl- 51.494 ± 28.095 mg/l, F- 0.771 ± 0.153 mg/l, NO3- 2.211 ± 0.878 mg/l, NO2- 4.692 ± 5.971 mg/l, SO42- 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick-field dust settles (TDS, TSS) in the nearby water bodies. PC2 explains 22.123% of the total variance and indicates the agricultural influence (K, Ca, and NO2-). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO42-, Cl-, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yielded three major clusters for both the water parameters and the sampling sites. The site clusters showed a grouping pattern similar to that of the R-mode factor score map. The present work opens a new scope for monitoring roadside water quality for future research in Bangladesh.

  4. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    Science.gov (United States)

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  5. Improving the Statistics of Quantitative Ultrasound Techniques with Deformation Compounding: An Experimental Study

    OpenAIRE

    Herd, Maria-Teresa; Hall, Timothy J; Jiang, Jingfeng; Zagzebski, James A

    2011-01-01

    Many quantitative ultrasound (QUS) techniques are based on estimates of the radio frequency (RF) echo signal power spectrum. Historically, reliable spectral estimates required spatial averaging over large regions of interest (ROIs). Spatial compounding techniques have been used to obtain robust spectral estimates for data acquired over small regions of interest. A new technique referred to as “deformation compounding” is another method for providing robust spectral estimates over smaller regio...

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
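
    Process-performance modeling of the kind implied by CMMI high maturity is often demonstrated with a Monte Carlo simulation. A sketch with hypothetical phase-effort distributions (the distributions and parameters are illustrative, not from the study):

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000
        design = rng.lognormal(mean=2.0, sigma=0.3, size=n)   # phase efforts in hours
        code   = rng.lognormal(mean=2.3, sigma=0.4, size=n)   # (illustrative distributions)
        test   = rng.triangular(4.0, 8.0, 20.0, size=n)

        total = design + code + test
        print("P(total effort <= 40 h):", round((total <= 40).mean(), 3))
        print("90th percentile (h):", round(np.percentile(total, 90), 1))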

  7. Statistical Techniques for Analyzing Process or "Similarity" Data in TID Hardness Assurance

    Science.gov (United States)

    Ladbury, R.

    2010-01-01

    We investigate techniques for estimating the contributions to TID hardness variability for families of linear bipolar technologies, determining how part-to-part and lot-to-lot variability change for different part types in the process.

  8. The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading

    Science.gov (United States)

    Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.

    2016-01-01

    Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
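
    The situation this record describes (many collinear predictors, few observations) is easy to emulate. A sketch using scikit-learn with synthetic stand-ins for reading-skill data; out-of-bag scoring and permutation importance are the usual safeguards in this setting.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(8)
        n, p = 60, 25                                  # few observations, many predictors
        latent = rng.normal(size=(n, 3))
        X = latent @ rng.normal(size=(3, p)) + 0.3 * rng.normal(size=(n, p))  # collinear
        y = latent[:, 0] - 0.5 * latent[:, 1] + 0.2 * rng.normal(size=n)

        rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0).fit(X, y)
        print("OOB R^2:", round(rf.oob_score_, 3))     # honest fit estimate without a test set
        imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
        print("top predictors:", np.argsort(imp.importances_mean)[::-1][:5])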

  9. Visual Analysis of North Atlantic Hurricane Trends Using Parallel Coordinates and Statistical Techniques

    National Research Council Canada - National Science Library

    Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E

    2008-01-01

    ... for a particular dependent variable. These capabilities are combined into a unique visualization system that is demonstrated via a North Atlantic hurricane climate study using a systematic workflow. This research corroborates the notion that enhanced parallel coordinates coupled with statistical analysis can be used for more effective knowledge discovery and confirmation in complex, real-world data sets.

  10. On some surprising statistical properties of a DNA fingerprinting technique called AFLP

    NARCIS (Netherlands)

    Gort, G.

    2010-01-01

    AFLP is a widely used DNA fingerprinting technique, resulting in band absence - presence profiles, like a bar code. Bands represent DNA fragments, sampled from the genome of an individual plant or other organism. The DNA fragments travel through a lane of an electrophoretic gel or microcapillary

  11. Underutilization of A2 ABO incompatible kidney transplantation.

    Science.gov (United States)

    Redfield, Robert R; Parsons, Ronald F; Rodriguez, Eduardo; Mustafa, Moiz; Cassuto, James; Vivek, Kumar; Noorchashm, Hooman; Naji, Ali; Levine, Matthew H; Abt, Peter L

    2012-01-01

    ABO compatibility creates a disadvantage for O and B renal allograft candidates. A2 ABO incompatible transplant may decrease waiting times and generate equivalent graft survival to an ABO compatible transplant. Death-censored graft survival was compared between A recipients and O, B, and AB recipients of an A2 allograft with multivariate Cox regression models utilizing data from the United Network of Organ Sharing (UNOS) between 1997 and 2007. Eighty-five percent of A2 kidneys were transplanted into ABO compatible recipients vs. 15% into ABO incompatible recipients. Rates of A2 incompatible kidney transplants did not increase over the study period (14.8% to 14.6%). Mean wait time for A2→O kidneys was 337 vs. 684 d for O→O and for A2→B kidneys, 542 vs. 734 d for B→B. Adjusted relative risk of graft loss at five-yr was similar between O, B, and AB recipients compared to A recipients of an A2 allograft, corresponding to a five-yr graft survival of 84%, 86.2%, 86.1%, and 86.1%, respectively. A2 incompatible kidney transplantation is underutilized. Graft outcomes are similar among A2 compatible and incompatible recipients. Shorter waiting time and improved access might be achieved if A2 kidneys are considered in all blood groups. © 2011 John Wiley & Sons A/S.

  12. Statistical Analysis of Reactor Pressure Vessel Fluence Calculation Benchmark Data Using Multiple Regression Techniques

    International Nuclear Information System (INIS)

    Carew, John F.; Finch, Stephen J.; Lois, Lambros

    2003-01-01

    The calculated >1-MeV pressure vessel fluence is used to determine the fracture toughness and integrity of the reactor pressure vessel. It is therefore of the utmost importance to ensure that the fluence prediction is accurate and unbiased. In practice, this assurance is provided by comparing the predictions of the calculational methodology with an extensive set of accurate benchmarks. A benchmarking database is used to provide an estimate of the overall average measurement-to-calculation (M/C) bias in the calculations. This average is used as an ad hoc multiplicative adjustment to the calculations to correct for the observed calculational bias. However, this average only provides a well-defined and valid adjustment of the fluence if the M/C data are homogeneous; i.e., the data are statistically independent and there is no correlation between subsets of M/C data. Typically, the identification of correlations between the errors in the database M/C values is difficult because the correlation is of the same magnitude as the random errors in the M/C data and varies substantially over the database. In this paper, an evaluation of a reactor dosimetry benchmark database is performed to determine the statistical validity of the adjustment to the calculated pressure vessel fluence. Physical mechanisms that could potentially introduce a correlation between the subsets of M/C ratios are identified and included in a multiple regression analysis of the M/C data. Rigorous statistical criteria are used to evaluate the homogeneity of the M/C data and determine the validity of the adjustment. For the database evaluated, the M/C data are found to be strongly correlated with dosimeter response threshold energy and dosimeter location (e.g., cavity versus in-vessel). It is shown that, because of the inhomogeneity in the M/C data for this database, the benchmark data do not provide a valid basis for adjusting the pressure vessel fluence. The statistical criteria and methods employed in
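
    The kind of multiple regression described here, M/C ratios regressed on candidate inhomogeneity factors, can be sketched as follows; the database values are synthetic and the factor names are hypothetical stand-ins patterned on the record (threshold energy, cavity versus in-vessel location).

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(9)
        df = pd.DataFrame({
            "mc": rng.normal(1.0, 0.05, size=120),          # hypothetical M/C ratios
            "e_thresh": rng.uniform(0.5, 8.0, size=120),    # dosimeter threshold energy (MeV)
            "location": rng.choice(["cavity", "in_vessel"], size=120),
        })
        fit = smf.ols("mc ~ e_thresh + C(location)", data=df).fit()
        print(fit.summary())   # significant coefficients would flag inhomogeneous M/C data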

  13. Source of statistical noises in the Monte Carlo sampling techniques for coherently scattered photons

    OpenAIRE

    Muhammad, Wazir; Lee, Sang Hoon

    2012-01-01

    Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs), and of their advantages and shortcomings in calculating elastic scattering cross sections, can be found in the literature. However, the issues related to their implementation in Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method to draw the integrated values of squared RFFs/MFF...

  14. Application of the Statistical ICA Technique in the DANCE Data Analysis

    Science.gov (United States)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the sum energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify the contributions to the Esum spectra from different isotopes with similar Q-values. Recently, we have tested the applicability of modern statistical methods, such as Independent Component Analysis (ICA), to identify and separate the (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present some results of the application of ICA algorithms and their modification for DANCE experimental data analysis. This research is supported by the U.S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
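
    The separation idea can be illustrated with scikit-learn's FastICA on synthetic stand-ins for the Esum spectra: a few Gaussian "isotope" components mixed with random weights across runs. This is an illustration of ICA itself, not the collaboration's actual algorithm or data.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(10)
        bins = 512
        centers = (180, 260, 340)                      # hypothetical per-isotope Esum peaks
        sources = np.vstack([np.exp(-0.5 * ((np.arange(bins) - mu) / 25.0) ** 2)
                             for mu in centers])
        mixing = rng.uniform(0.2, 1.0, size=(8, 3))    # run-to-run mixing weights
        observed = mixing @ sources + 0.01 * rng.normal(size=(8, bins))

        ica = FastICA(n_components=3, random_state=0)
        recovered = ica.fit_transform(observed.T).T    # statistically independent components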

  15. An assessment of water quality in the Coruh Basin (Turkey) using multivariate statistical techniques.

    Science.gov (United States)

    Bilgin, Ayla

    2015-11-01

    The purpose of this study was to assess water quality in the Coruh Basin (Turkey) on the basis of 24 parameters measured semi-annually between 2011 and 2013. The study utilised analysis of variance (ANOVA), principal component analysis (PCA) and factor analysis (FA). The water-quality data were obtained from a total of four sites by the 26th Regional Directorate of the State Hydraulic Works (DSI). ANOVA was carried out to identify the differences between the parameters at the different measuring sites. The variables were classified using factor analysis, and at the end of the ANOVA test it was established that there was a statistically significant difference in water quality between the downstream and upstream wastewaters released by the Black Sea copper companies, while no statistically significant difference was observed between the Murgul and Borcka Dams. It was determined through factor analysis that five factors explained 81.3% of the total variance. It was concluded that domestic, industrial and agricultural activities, in combination with physicochemical properties, were the factors affecting the quality of the water in the Coruh Basin.

  16. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1978-01-01

    Surveillance techniques that extend the sophistication of existing automated systems for monitoring industrial rotating equipment are described. The monitoring system automatically established limiting criteria during an initial learning period of a few days and subsequently, while monitoring the test rotor during an extended period of normal operation, experienced a false alarm rate of 0.5%. At the same time, the monitoring system successfully detected all fault types that were introduced into the test setup. Tests on real equipment are needed to provide final verification of the monitoring techniques. There are areas that would profit from additional investigation in the laboratory environment. A comparison of the relative value of alternate descriptors under given fault conditions would be worthwhile. This should be pursued in conjunction with extending the set of fault types available, e.g., bearing problems. Other tests should examine the effects of using fewer (more coarse) intervals to define the lumped operational states. Finally, techniques to diagnose the most probable fault should be developed by drawing upon the extensive data automatically logged by the monitoring system

  17. Relationship between Pore-size Distribution and Flexibility of Adsorbent Materials: Statistical Mechanics and Future Material Characterization Techniques.

    Science.gov (United States)

    Siderius, Daniel W; Mahynski, Nathan A; Shen, Vincent K

    2017-05-01

    Measurement of the pore-size distribution (PSD) via gas adsorption and the so-called "kernel method" is a widely used characterization technique for rigid adsorbents. Yet, standard techniques and analytical equipment are not appropriate to characterize the emerging class of flexible adsorbents that deform in response to the stress imparted by an adsorbate gas, as the PSD is a characteristic of the material that varies with the gas pressure and any other external stresses. Here, we derive the PSD for a flexible adsorbent using statistical mechanics in the osmotic ensemble to draw analogy to the kernel method for rigid materials. The resultant PSD is a function of the ensemble constraints including all imposed stresses and, most importantly, the deformation free energy of the adsorbent material. Consequently, a pressure-dependent PSD is a descriptor of the deformation characteristics of an adsorbent and may be the basis of future material characterization techniques. We discuss how, given a technique for resolving pressure-dependent PSDs, the present statistical mechanical theory could enable a new generation of analytical tools that measure and characterize certain intrinsic material properties of flexible adsorbents via otherwise simple adsorption experiments.
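
    For reference, the rigid-material kernel method that this paper generalizes solves an adsorption integral equation of the following standard form; the notation here is assumed (N the measured isotherm, rho the kernel isotherm for pore width H, f the PSD), and in the paper's osmotic-ensemble treatment f additionally depends on the gas pressure and any imposed external stress.

        % Adsorption integral equation underlying the kernel method (rigid case)
        \[
          N(P) \;=\; \int_{H_{\min}}^{H_{\max}} f(H)\, \rho(P; H)\, \mathrm{d}H
        \]
        % Flexible adsorbents (osmotic ensemble): the PSD itself becomes
        % state-dependent, f = f(H; P, \sigma_{\mathrm{ext}}), so the recovered
        % distribution varies with pressure and imposed stress.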

  18. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1979-01-01

    The level of technology utilized in automated systems that monitor industrial rotating equipment and the potential of alternative surveillance methods are assessed. It is concluded that changes in surveillance methodology would upgrade ongoing programs and yet still be practical for implementation. An improved anomaly recognition methodology is formulated and implemented on a minicomputer system. The effectiveness of the monitoring system was evaluated in laboratory tests on a small rotor assembly, using vibrational signals from both displacement probes and accelerometers. Time and frequency domain descriptors are selected to compose an overall signature that characterizes the monitored equipment. Limits for normal operation of the rotor assembly are established automatically during an initial learning period. Thereafter, anomaly detection is accomplished by applying an approximate statistical test to each signature descriptor. As demonstrated over months of testing, this monitoring system is capable of detecting anomalous conditions while exhibiting a false alarm rate below 0.5%
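
    The learn-then-test scheme described in these two records reduces to estimating normal-band statistics for each signature descriptor during a learning period and applying an approximate test thereafter. A minimal sketch (a simple k-sigma test; the papers' actual descriptors and test are more elaborate):

        import numpy as np

        class DescriptorMonitor:
            # Learn normal limits for one signature descriptor, then flag anomalies.
            def __init__(self, k=4.0):
                self.k = k                     # wide limits keep the false-alarm rate low
                self.mu = self.sigma = None

            def learn(self, baseline):
                self.mu = float(np.mean(baseline))
                self.sigma = float(np.std(baseline, ddof=1))

            def check(self, value):
                return abs(value - self.mu) > self.k * self.sigma

        rng = np.random.default_rng(11)
        mon = DescriptorMonitor()
        mon.learn(rng.normal(10.0, 0.5, size=5000))    # initial learning period
        print(mon.check(10.4), mon.check(14.0))        # normal vs. anomalous reading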

  19. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    Science.gov (United States)

    Benson, Nsikak U; Asuquo, Francis E; Williams, Akan B; Essien, Joseph P; Ekong, Cyril I; Akpabio, Otobong; Olajire, Abaas A

    2016-01-01

    Trace metals (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using the individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu, Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.

  20. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    Directory of Open Access Journals (Sweden)

    Nsikak U Benson

    Full Text Available Trace metals (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using the individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu, Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.
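    A minimal sketch of the multivariate step described in the two records above: standardize the metal concentrations, run PCA, and read metals that load on the same component as sharing a likely source. The data are synthetic; only the metal list comes from the abstract.

```python
# PCA-based source grouping of sediment metal concentrations (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
metals = ["Cd", "Cr", "Cu", "Ni", "Pb"]

# Two hypothetical sources: one drives Cd/Pb, the other Cr/Cu/Ni.
s1, s2 = rng.lognormal(size=(2, 40))
X = np.column_stack([
    0.8 * s1 + 0.1 * rng.random(40),   # Cd
    0.9 * s2 + 0.1 * rng.random(40),   # Cr
    0.7 * s2 + 0.1 * rng.random(40),   # Cu
    0.8 * s2 + 0.1 * rng.random(40),   # Ni
    0.9 * s1 + 0.1 * rng.random(40),   # Pb
])

pca = PCA(n_components=2)
pca.fit(StandardScaler().fit_transform(X))
print("explained variance:", pca.explained_variance_ratio_.round(2))
for name, loading in zip(metals, pca.components_.T.round(2)):
    print(name, loading)   # metals loading on the same component share a source
```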

  1. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Full Text Available Qiandao Lake (Xin’an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April-July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
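    The discriminant-analysis step lends itself to a short sketch: given water-quality measurements labeled by sampling period, fit a linear discriminant model and report the rate of correct assignments, the figure the abstract quotes (e.g., 87.8%). Variables and values here are invented for illustration.

```python
# LDA for discriminating the two sampling periods found by cluster analysis.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
# Columns: dissolved oxygen, pH, chlorophyll-a (the discriminating variables).
apr_jul = rng.normal([7.5, 8.1, 12.0], [0.5, 0.2, 3.0], size=(60, 3))
other = rng.normal([9.0, 7.8, 6.0], [0.5, 0.2, 2.0], size=(60, 3))
X = np.vstack([apr_jul, other])
y = np.array([0] * 60 + [1] * 60)  # 0 = April-July, 1 = remaining months

lda = LinearDiscriminantAnalysis().fit(X, y)
print("correct assignments: %.1f%%" % (100 * lda.score(X, y)))
```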

  2. An Efficient Statistical Computation Technique for Health Care Big Data using R

    Science.gov (United States)

    Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.

    2017-08-01

    Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosis at an earlier stage increases the chances of survival and fast recovery, reducing both the recovery time and the cost of treatment. One such medical issue is cancer, and breast cancer has been identified as the second leading cause of cancer death. If detected at an early stage it can be cured. Once a patient is found to have a breast tumor, it should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. Day-to-day new records are added, which leads to an increase in the data to be classified, and this tends to become a big data problem. The algorithm is implemented in R, which is the most popular platform applied to machine learning algorithms for statistical computing. Experimentation is conducted using various classification evaluation metrics on various values of k. The results show that the KNN algorithm performs better than existing models.
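    The paper implements KNN in R; an equivalent Python sketch with scikit-learn is given below, using the library's bundled Wisconsin breast-cancer data (not the paper's dataset) and sweeping several values of k as the abstract describes.

```python
# KNN tumor classification with a sweep over k.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for k in (1, 3, 5, 7, 9):
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    print(f"k={k}: accuracy={accuracy_score(y_te, knn.predict(X_te)):.3f}")
```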

  3. Steepest-descent technique and stellar equilibrium statistical mechanics. I. Newtonian clusters in a box

    International Nuclear Information System (INIS)

    Horwitz, G.; Katz, J.

    1977-01-01

    A microcanonical statistical mechanics formulation has been developed for nonrotating systems of stars of equal mass. The system, placed in a confining volume and with a cutoff of the interparticle gravitational interaction at short distances, can have thermodynamic equilibrium states. Sequences of equilibrium states are presumed to simulate slowly evolving, near-equilibrium configurations of real star clusters. An exact functional expression for the entropy of such systems is derived which has also a relativistic counterpart. The entropy is evaluated in an approximation which is mean field plus fluctuations. Evaluations beyond this approximation can readily be carried out. We obtain the necessary and sufficient conditions for spherically symmetric clusters to be thermodynamically stable about a mean field solution, with respect to arbitrary fluctuations in the microcanonical ensemble. The stability conditions amount to the following quantities having definite signs: (i) a functional form, quadratic in "mean field" fluctuations, (ii) the derivative of the gravito-chemical potential with respect to the number of particles, at fixed temperature, being positive definite, and (iii) the heat capacity C_V, at fixed number of particles, being positive definite. In a sequence of equilibrium configurations in which the ratio of densities between the center and the boundary of the cluster is progressively increased, conditions (i) and (ii) break down simultaneously when this density contrast is equal to 1.58. Condition (i) remains unsatisfied for higher density contrasts. The limit 1.58 on the density contrast is much more stringent than that given by condition (iii), which breaks down only for a value of 32.1. Our results are in sharp contrast to those of Antonov's criterion, according to which instabilities appear when the density contrast is higher than 709. Time scales of evolutions of various unstable configurations are not considered in this work.

  4. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site, reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in "clean" areas of the site ranged from 45 to 60%.
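    The abstract does not state the report's sample-size formula, but a common screening rule of this kind asks how many random samples are needed so that, if a fraction p of an area is contaminated, at least one sample hits contaminated ground with confidence C. A hedged sketch of that rule:

```python
# P(miss all n samples) = (1 - p)^n  =>  n >= ln(1 - C) / ln(1 - p).
# This is a generic hot-spot screening rule, not necessarily the report's.
import math

def min_samples(p: float, confidence: float = 0.95) -> int:
    """Minimum random samples to detect contamination covering fraction p."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

for p in (0.10, 0.20, 0.30):
    print(f"contaminated fraction {p:.0%}: n >= {min_samples(p)}")
```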

  5. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
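    A sketch of coupling hierarchical clustering with convergent k-means in the spirit of the report: a hierarchical pass on a subsample suggests the cluster count and starting centroids, and k-means then classifies every observation. Three synthetic columns stand in for the three radiometric variables; the four-cluster structure mirrors the abstract's finding.

```python
# Hierarchical clustering on a subsample seeds a k-means pass over all data.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(mu, 0.3, size=(200, 3))
               for mu in ([0, 0, 0], [2, 2, 0], [0, 2, 2], [5, 5, 5])])

sub = X[rng.choice(len(X), 150, replace=False)]          # hierarchical pass
labels = fcluster(linkage(sub, method="ward"), t=4, criterion="maxclust")
seeds = np.array([sub[labels == c].mean(axis=0) for c in range(1, 5)])

km = KMeans(n_clusters=4, init=seeds, n_init=1).fit(X)   # convergent k-means pass
print(np.bincount(km.labels_))                            # cluster sizes
```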

  6. Multivariate statistical techniques for the evaluation of surface water quality of the Himalayan foothills streams, Pakistan

    Science.gov (United States)

    Malik, Riffat Naseem; Hashmi, Muhammad Zaffar

    2017-10-01

    The Himalayan foothills streams of Pakistan play an important role in drinking water supply and irrigation of farmlands; thus, their water quality is closely related to public health. Multivariate techniques were applied to check spatial and seasonal trends, and metals contamination sources of the Himalayan foothills streams, Pakistan. Grab surface water samples were collected from different sites (5-15 cm water depth) in pre-washed polyethylene containers. A Fast Sequential Atomic Absorption Spectrophotometer (Varian FSAA-240) was used to measure the metals concentration. Concentrations of Ni, Cu, and Mn were higher in the pre-monsoon season than in the post-monsoon season. Cluster analysis identified impaired, moderately impaired and least impaired clusters based on water parameters. Discriminant function analysis indicated that spatial variability in water was due to temperature, electrical conductivity, nitrates, iron and lead, whereas seasonal variations were correlated with 16 physicochemical parameters. Factor analysis identified municipal and poultry waste, automobile activities, surface runoff, and soil weathering as major sources of contamination. Levels of Mn, Cr, Fe, Pb, Cd, Zn and alkalinity were above the WHO and USEPA standards for surface water. The results of the present study will help higher authorities manage the Himalayan foothills streams.
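    The source-apportionment step can be sketched with a factor-analysis model: variables that load heavily on the same factor are read as the signature of a common source (e.g., municipal waste vs. soil weathering). Variable names and data below are invented.

```python
# Factor analysis on standardized water-quality variables (synthetic data).
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 120
waste, weathering = rng.normal(size=(2, n))            # two latent sources
X = np.column_stack([
    1.0 * waste + 0.2 * rng.normal(size=n),            # nitrates
    0.9 * waste + 0.2 * rng.normal(size=n),            # Pb
    1.0 * weathering + 0.2 * rng.normal(size=n),       # Fe
    0.8 * weathering + 0.2 * rng.normal(size=n),       # Mn
])

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(StandardScaler().fit_transform(X))
print(np.round(fa.components_, 2))   # rows = factors, columns = variables
```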

  7. Production of bio-oil from underutilized forest biomass using an auger reactor

    Science.gov (United States)

    H. Ravindran; S. Thangalzhy-Gopakumar; S. Adhikari; O. Fasina; M. Tu; B. Via; E. Carter; S. Taylor

    2015-01-01

    Conversion of underutilized forest biomass to bio-oil could be a niche market for energy production. In this work, bio-oil was produced from underutilized forest biomass at selected temperatures between 425–500°C using an auger reactor. Physical properties of bio-oil, such as pH, density, heating value, ash, and water, were analyzed and compared with an ASTM standard...

  8. Conservation and Use of Genetic Resources of Underutilized Crops in the Americas—A Continental Analysis

    OpenAIRE

    Gea Galluzzi; Isabel López Noriega

    2014-01-01

    Latin America is home to dramatically diverse agroecological regions which harbor a high concentration of underutilized plant species, whose genetic resources hold the potential to address challenges such as sustainable agricultural development, food security and sovereignty, and climate change. This paper examines the status of an expert-informed list of underutilized crops in Latin America and analyses how the most common features of underuse apply to these. The analysis pays special attent...

  9. Statistical Techniques Used in Three Applied Linguistics Journals: "Language Learning,""Applied Linguistics" and "TESOL Quarterly," 1980-1986: Implications for Readers and Researchers.

    Science.gov (United States)

    Teleni, Vicki; Baldauf, Richard B., Jr.

    A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…

  10. Beyond landraces: developing improved germplasm resources for underutilized species - a case for Bambara groundnut.

    Science.gov (United States)

    Aliyu, Siise; Massawe, Festo; Mayes, Sean

    2014-10-01

    The potential for underutilized crops (also known as minor, neglected or orphan crops) to improve food and nutrition security has been gaining prominence within the research community in recent years. This is due to their significance for diversified agricultural systems, which are a necessary component of future agriculture to address the food and nutritional security concerns posed by a changing climate and a growing world population. Developing workable value chain systems for underutilized crop species, coupled with comparative trait studies with major crops, potentially allows us to identify suitable agricultural modalities for such species. Bambara groundnut (Vigna subterranea L. Verdc.), an underutilized leguminous species, is of interest for its reported high levels of drought tolerance in particular, which contributes to environmental resilience in semi-arid environments. Here, we present a synopsis of suitable strategies for the genetic improvement of Bambara groundnut as a guide for other underutilized crop species. Underutilized crops have often been adapted over thousands of years in particular regions by farmers and largely still exist as landraces, with little or no genetic knowledge of key phenotypic traits. Breeding in these species is fundamentally different from breeding in major crops, where significant pedigree structures and history allow highly directed improvement. In this regard, deploying new integrated germplasm development approaches for variety development and genetic analysis, such as multi-parent advanced generation inter-crosses (MAGIC), within breeding programmes of underutilized species will be important in order to fully utilize such crops.

  11. A two step linear statistical technique using leaps and bounds procedure for retrieval of geophysical parameters from microwave radiometric data

    Science.gov (United States)

    Kakar, R. K.; Pandey, P. C.

    1983-01-01

    A linear statistical technique using a 'leaps and bounds' procedure (Furnival and Wilson, 1974) is developed for retrieving geophysical parameters from remote measurements. It is used for retrieving sea surface temperatures from the Scanning Multichannel Microwave Radiometer (SMMR) on Seasat. The technique uses an efficient algorithm to select the best fixed-size subset of the 10 SMMR channels for linearly retrieving a given geophysical parameter. The 5-channel subset (6.6V, 6.6H, 10H, 18V, 21H), where V and H refer, respectively, to the vertical and horizontal polarizations and the numbers are the channel frequencies in gigahertz, gives the minimum rms error in estimating the sea surface temperature. A comparison with ground truth indicates that the algorithm infers the temperature with an rms accuracy of better than 1.5 K under most environmental conditions. A quality control procedure, which is seen as holding promise for further improving the accuracy, is proposed.
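    The leaps-and-bounds algorithm prunes the subset search tree, but with only 10 channels an exhaustive search is cheap, so this sketch brute-forces the best fixed-size channel subset by least-squares rms error. The channel list follows SMMR's five frequencies by two polarizations; the data are synthetic stand-ins for brightness temperatures and sea surface temperature.

```python
# Best fixed-size subset selection by exhaustive least-squares search.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(5)
channels = ["6.6V", "6.6H", "10V", "10H", "18V", "18H", "21V", "21H", "37V", "37H"]
X = rng.normal(size=(300, 10))                       # brightness temperatures
beta = np.array([2.0, 1.5, 0.0, 1.0, 0.8, 0.0, 0.0, 0.6, 0.0, 0.0])
y = X @ beta + rng.normal(scale=0.5, size=300)       # "true" SST signal

def rms_error(cols):
    A = np.column_stack([X[:, cols], np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.sqrt(np.mean((A @ coef - y) ** 2))

best = min(combinations(range(10), 5), key=rms_error)
print("best 5-channel subset:", [channels[i] for i in best],
      "rms = %.2f" % rms_error(best))
```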

  12. Combining statistical techniques to predict postsurgical risk of 1-year mortality for patients with colon cancer

    Directory of Open Access Journals (Sweden)

    Arostegui I

    2018-03-01

    Full Text Available Inmaculada Arostegui,1–3 Nerea Gonzalez,2,4 Nerea Fernández-de-Larrea,5,6 Santiago Lázaro-Aramburu,7 Marisa Baré,2,8 Maximino Redondo,2,9 Cristina Sarasqueta,2,10 Susana Garcia-Gutierrez,2,4 José M Quintana2,4 On behalf of the REDISSEC CARESS-CCR Group2 1Department of Applied Mathematics, Statistics and Operations Research, University of the Basque Country UPV/EHU, Leioa, Bizkaia, Spain; 2Health Services Research on Chronic Patients Network (REDISSEC), Galdakao, Bizkaia, Spain; 3Basque Center for Applied Mathematics – BCAM, Bilbao, Bizkaia, Spain; 4Research Unit, Galdakao-Usansolo Hospital, Galdakao, Bizkaia, Spain; 5Environmental and Cancer Epidemiology Unit, National Center of Epidemiology, Instituto de Salud Carlos III, Madrid, Spain; 6Consortium for Biomedical Research in Epidemiology and Public Health (CIBERESP), Madrid, Spain; 7General Surgery Service, Galdakao-Usansolo Hospital, Galdakao, Bizkaia, Spain; 8Clinical Epidemiology and Cancer Screening Unit, Parc Taulí Sabadell-Hospital Universitari, UAB, Sabadell, Barcelona, Spain; 9Research Unit, Costa del Sol Hospital, Marbella, Malaga, Spain; 10Research Unit, Donostia Hospital, Donostia-San Sebastián, Gipuzkoa, Spain Introduction: Colorectal cancer is one of the most frequently diagnosed malignancies and a common cause of cancer-related mortality. The aim of this study was to develop and validate a clinical predictive model for 1-year mortality among patients with colon cancer who survive for at least 30 days after surgery. Methods: Patients diagnosed with colon cancer who had surgery for the first time and who survived 30 days after the surgery were selected prospectively. The outcome was mortality within 1 year. Random forest, genetic algorithms and classification and regression trees were combined in order to identify the variables and partition points that optimally classify patients by risk of mortality. The resulting decision tree was categorized into four risk categories

  13. Statistical signal processing techniques for coherent transversal beam dynamics in synchrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Alhumaidi, Mouhammad

    2015-03-04

    identifying and analyzing the betatron oscillation sourced from the kick based on its mixing and temporal patterns. The accelerator magnets can generate unwanted spurious linear and non-linear fields due to fabrication errors or aging. These error fields in the magnets can excite undesired resonances, leading, together with the space-charge tune spread, to long-term beam losses and a reduced dynamic aperture. Therefore, knowledge of the linear and non-linear magnet errors in circular accelerator optics is crucial for controlling and compensating resonances and their consequent beam losses and beam quality deterioration. This is indispensable, especially for high beam intensity machines. Fortunately, the relationship between the beam offset oscillation signals recorded at the BPMs is a manifestation of the accelerator optics, and can therefore be exploited in the determination of the linear and non-linear optics components. Thus, transversal beam oscillations can be excited deliberately for diagnostic purposes in the operation of particle accelerators. In this thesis, we propose a novel method for detecting and estimating the non-linear components of the optics lattice located between the locations of two BPMs by analyzing the beam offset oscillation signals of a BPM triple containing these two BPMs. Depending on the non-linear components between the locations of the BPM triple, the relationship between the beam offsets follows a corresponding multivariate polynomial. After calculating the covariance matrix of the polynomial terms, the Generalized Total Least Squares method is used to find the model parameters, and thus the non-linear components. A bootstrap technique is used to detect the existing polynomial model orders by means of multiple hypothesis testing, and to determine confidence intervals for the model parameters.

  14. Alternative calibration techniques for counteracting the matrix effects in GC-MS-SPE pesticide residue analysis - a statistical approach.

    Science.gov (United States)

    Rimayi, Cornelius; Odusanya, David; Mtunzi, Fanyana; Tsoka, Shepherd

    2015-01-01

    This paper investigates the efficiency of application of four different multivariate calibration techniques, namely matrix-matched internal standard (MMIS), matrix-matched external standard (MMES), solvent-only internal standard (SOIS) and solvent-only external standard (SOES), on the detection and quantification of 20 organochlorine compounds from high-, low- and blank-matrix water samples by Gas Chromatography-Mass Spectrometry (GC-MS) coupled to solid phase extraction (SPE). Further statistical testing, using the Statistical Package for the Social Sciences (SPSS) by applying MANOVA, T-tests and Levene's F tests, indicates that matrix composition has a more significant effect on the efficiency of the analytical method than the calibration method of choice. Matrix effects are widely described as one of the major sources of error in GC-MS multiresidue analysis. Descriptive and inferential statistics proved that matrix-matched internal standard calibration was the best approach to use for samples of varying matrix composition, as it produced the most precise average mean recovery of 87% across all matrices tested. The use of an internal standard calibration overall produced more precise total recoveries than external standard calibration, with mean values of 77% and 64%, respectively. The internal standard calibration technique produced a particularly high overall standard deviation of 38% at the 95% confidence level, indicating that it is less robust than the external standard calibration method, which had an overall standard error of 32% at the 95% confidence level. Overall, the matrix-matched external standard calibration proved to be the best calibration approach for analysis of low-matrix samples, which consisted of the real sample matrix, as it had the most precise recovery of 98% compared to other calibration approaches for the low-matrix samples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. [Estimation of a nationwide statistics of hernia operation applying data mining technique to the National Health Insurance Database].

    Science.gov (United States)

    Kang, Sunghong; Seon, Seok Kyung; Yang, Yeong-Ja; Lee, Aekyung; Bae, Jong-Myon

    2006-09-01

    The aim of this study was to develop a methodology for estimating nationwide statistics for hernia operations using the claim database of the Korea Health Insurance Cooperation (KHIC). According to the insurance claim procedures, the claim database was divided into the electronic data interchange database (EDI_DB) and the sheet database (Paper_DB). Although the EDI_DB has operation and management codes showing the facts and kinds of operations, the Paper_DB does not. Using the hernia-matched management code in the EDI_DB, the cases of hernia surgery were extracted. For drawing the potential cases from the Paper_DB, which does not have the code, a predictive model was developed using the data mining technique called SEMMA. The claim sheets of cases whose predictive probability of an operation exceeded a threshold, as decided by the ROC curve, were identified in order to obtain the positive predictive value as an index of usefulness of the predictive model. Of the claim databases in 2004, 14,386 cases had hernia-related management codes under the EDI system. For fitting the models with the data mining technique, logistic regression was chosen rather than the neural network method or the decision tree method. From the Paper_DB, 1,019 cases were extracted as potential cases. Direct review of the sheets of the extracted cases showed that the positive predictive value was 95.3%. The results suggest that applying the data mining technique to the claim database of the KHIC to estimate nationwide surgical statistics would be useful in terms of feasibility and cost-effectiveness.
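    The study's screening logic, a logistic model, an ROC-derived threshold, and then the positive predictive value of the extracted cases, can be sketched as follows. The features and data are invented placeholders for claim-sheet variables; only the workflow follows the abstract.

```python
# Logistic screening model with an ROC-chosen threshold and PPV check.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, roc_curve

rng = np.random.default_rng(6)
n = 2000
X = rng.normal(size=(n, 4))                          # e.g. age, department, coded items
logit = 1.5 * X[:, 0] - 1.0 * X[:, 1] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # 1 = hernia operation

model = LogisticRegression().fit(X, y)
prob = model.predict_proba(X)[:, 1]

fpr, tpr, thr = roc_curve(y, prob)
threshold = thr[np.argmax(tpr - fpr)]                # Youden-style choice from the ROC
pred = (prob >= threshold).astype(int)
print("threshold=%.2f  PPV=%.3f" % (threshold, precision_score(y, pred)))
```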

  16. Identification of heavy metals sources in the Mexico city atmosphere, using the proton induced x-ray analytical technique and multifactorial statistics techniques

    International Nuclear Information System (INIS)

    Hernandez M, B.

    1997-01-01

    The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over the annual cycle of 1990, based on their concentrations obtained through the PIXE technique; to identify suitable statistical methods to apply to the data on metal concentrations in the form of total suspended particles (PST) found in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to serve as a basis for decision making and control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)

  17. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    Science.gov (United States)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
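    The lexical-approximation idea is easy to sketch: replace each out-of-vocabulary source word with its closest spelling or inflectional variant found in the training vocabulary before translation. difflib's string similarity stands in for the paper's matching techniques, which are not specified here; the vocabulary and cutoff are illustrative.

```python
# Replace OOV tokens with their closest in-vocabulary variants.
from difflib import get_close_matches

train_vocab = {"translate", "translated", "translation", "statistics", "machine"}

def approximate_oov(tokens, vocab):
    out = []
    for tok in tokens:
        if tok in vocab:
            out.append(tok)
        else:
            match = get_close_matches(tok, sorted(vocab), n=1, cutoff=0.75)
            out.append(match[0] if match else tok)   # keep OOV word if no variant
    return out

print(approximate_oov("machine translations statistic".split(), train_vocab))
# ['machine', 'translation', 'statistics']
```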

  18. Conservation and Use of Genetic Resources of Underutilized Crops in the Americas—A Continental Analysis

    Directory of Open Access Journals (Sweden)

    Gea Galluzzi

    2014-02-01

    Full Text Available Latin America is home to dramatically diverse agroecological regions which harbor a high concentration of underutilized plant species, whose genetic resources hold the potential to address challenges such as sustainable agricultural development, food security and sovereignty, and climate change. This paper examines the status of an expert-informed list of underutilized crops in Latin America and analyses how the most common features of underuse apply to these. The analysis pays special attention to if and how existing international policy and legal frameworks on biodiversity and plant genetic resources effectively support or not the conservation and sustainable use of underutilized crops. Results show that not all minor crops are affected by the same degree of neglect, and that the aspects under which any crop is underutilized vary greatly, calling for specific analyses and interventions. We also show that current international policy and legal instruments have so far provided limited stimulus and funding for the conservation and sustainable use of the genetic resources of these crops. Finally, the paper proposes an analytical framework for identifying and evaluating a crop’s underutilization, in order to define the most appropriate type and levels of intervention (international, national, local) for improving its status.

  19. Post-fire debris flow prediction in Western United States: Advancements based on a nonparametric statistical technique

    Science.gov (United States)

    Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.

    2017-12-01

    Fire disasters affect modern societies at the global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact on a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over recent years a number of efforts from the United States Geological Survey (USGS) and the National Weather Service (NWS) have focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from the USGS that reports a total of 1500 storms that triggered or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
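    A sketch of the nonparametric framework: a random-forest classifier trained on storm and land-surface predictors, scored with the threat score (critical success index) that the abstract reports. The predictor names follow the database fields mentioned; the data themselves are synthetic.

```python
# Random-forest PFDF occurrence prediction scored by threat score (CSI).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1500
# Columns: rain duration, accumulation, max intensity, soil erodibility, slope.
X = rng.random((n, 5))
y = ((0.5 * X[:, 2] + 0.3 * X[:, 4] + 0.2 * rng.random(n)) > 0.55).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = rf.predict(X_te)

hits = np.sum((pred == 1) & (y_te == 1))
misses = np.sum((pred == 0) & (y_te == 1))
false_alarms = np.sum((pred == 1) & (y_te == 0))
print("threat score = %.2f" % (hits / (hits + misses + false_alarms)))
```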

  20. The problem of sexual imbalance and techniques of the self in the Diagnostic and Statistical Manual of Mental Disorders.

    Science.gov (United States)

    Flore, Jacinthe

    2016-09-01

    This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination. © The Author(s) 2016.

  1. Application of Statistical Downscaling Techniques to Predict Rainfall and Its Spatial Analysis Over Subansiri River Basin of Assam, India

    Science.gov (United States)

    Barman, S.; Bhattacharjya, R. K.

    2017-12-01

    The River Subansiri is the major north-bank tributary of the river Brahmaputra. It originates from the range of the Himalayas beyond the Great Himalayan range at an altitude of approximately 5340 m. The Subansiri basin extends from tropical to temperate zones and hence exhibits a great diversity in rainfall characteristics. In the Northern and Central Himalayan tracts, precipitation is scarce on account of high altitudes. On the other hand, the Southeast part of the Subansiri basin, comprising the sub-Himalayan and the plain tract in Arunachal Pradesh and Assam, lies in the tropics. Due to the Northeast as well as the Southwest monsoon, precipitation occurs in this region in abundant quantities. In particular, the Southwest monsoon causes very heavy precipitation over the entire Subansiri basin from May to October. In this study, the rainfall over the Subansiri basin has been studied at 24 different locations by multiple linear and non-linear regression based statistical downscaling techniques and by an Artificial Neural Network based model. APHRODITE's gridded rainfall data of 0.25˚ x 0.25˚ resolution and climatic parameters of the HadCM3 GCM of resolution 2.5˚ x 3.75˚ (latitude by longitude) have been used in this study. It has been found that the multiple non-linear regression based statistical downscaling technique outperformed the other techniques. Using this method, the future rainfall pattern over the Subansiri basin has been analyzed up to the year 2099 for four different time periods, viz., 2020-39, 2040-59, 2060-79, and 2080-99, at all the 24 locations. On the basis of historical rainfall, the months have been categorized as wet months, months with moderate rainfall, and dry months. The spatial changes in rainfall patterns for all these three types of months have also been analyzed over the basin. A potential decrease of rainfall in the wet months and months with moderate rainfall, and an increase of rainfall in the dry months, are observed in the future rainfall pattern of the Subansiri basin.

  2. Use of statistical and GIS techniques to assess and predict concentrations of heavy metals in soils of Lahore City, Pakistan.

    Science.gov (United States)

    Alam, Nayab; Ahmad, Sajid Rashid; Qadir, Abdul; Ashraf, Muhammad Imran; Lakhan, Calvin; Lakhan, V Chris

    2015-10-01

    Soils from different land use areas in Lahore City, Pakistan, were analyzed for concentrations of the heavy metals cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb). One hundred one samples were randomly collected from six land use areas categorized as park, commercial, agricultural, residential, urban, and industrial. Each sample was analyzed in the laboratory with the tri-acid digestion method. Metal concentrations in each sample were obtained with the use of an atomic absorption spectrophotometer. The statistical techniques of analysis of variance, correlation analysis, and cluster analysis were used to analyze all data. In addition, kriging, a geostatistical procedure supported by ArcGIS, was used to model and predict the spatial concentrations of the four heavy metals Cd, Cr, Ni, and Pb. The results demonstrated significant correlation among the heavy metals in the urban and industrial areas. The dendrogram, and the results associated with the cluster analysis, indicated that the agricultural, commercial, and park areas had high concentrations of Cr, Ni, and Pb. High concentrations of Cd and Ni were also observed in the residential and industrial areas, respectively. The maximum concentrations of both Cd and Pb exceeded world toxic limit values. The kriging method demonstrated increasing spatial diffusion of both Cd and Pb concentrations throughout and beyond the Lahore City area.
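    The abstract's kriging was done in ArcGIS, which cannot be reproduced here; since kriging is formally equivalent to Gaussian-process regression, this sketch interpolates point measurements of one metal over a grid with scikit-learn's GP regressor. Coordinates and concentrations are invented.

```python
# GP regression as a kriging stand-in for spatial prediction of Pb.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(8)
xy = rng.uniform(0, 10, size=(101, 2))                 # sample locations (km)
pb = 50 + 30 * np.exp(-((xy[:, 0] - 5) ** 2 + (xy[:, 1] - 5) ** 2) / 8)
pb += rng.normal(scale=2, size=101)                    # noisy Pb, mg/kg

gp = GaussianProcessRegressor(kernel=RBF(2.0) + WhiteKernel(1.0),
                              normalize_y=True).fit(xy, pb)

gx, gy = np.meshgrid(np.linspace(0, 10, 25), np.linspace(0, 10, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z, sd = gp.predict(grid, return_std=True)              # prediction + uncertainty
print("max predicted Pb: %.1f mg/kg" % z.max())
```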

  3. Case studies on sugar production from underutilized woody biomass using sulfite chemistry

    Science.gov (United States)

    J.Y. Zhu; M. Subhosh Chandra; Roland Gleisner; William Gilles; Johnway Gao; Gevan Marrs; Dwight Anderson; John Sessions

    2015-01-01

    We examined two case studies to demonstrate the advantages of sulfite chemistry for pretreating underutilized woody biomass to produce sugars through enzymatic saccharification. In the first case study, we evaluated knot rejects from a magnesium-based sulfite mill for direct enzymatic sugar production. We found that the sulfite mill rejects are an excellent feedstock for...

  4. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  5. Factors responsible for under-utilization of postnatal care services in Maiduguri, north-eastern Nigeria

    Directory of Open Access Journals (Sweden)

    Idris Usman Takai

    2015-01-01

    Full Text Available Background: In Maiduguri, the utilization of available postnatal care services is still very low. This may be influenced by demographic, socioeconomic, cultural, and obstetric factors among others. Objective: The aim of this study is to understand the current status of utilization of maternal postnatal health care services and identify factors responsible for under-utilization of available postnatal care services in Maiduguri. Materials and Methods: A cross-sectional, questionnaire-based study was conducted involving 350 women in their reproductive age group (15-49 years), who had delivered previously, residing in Maiduguri and who came to access any of the available maternal health care services at the State Specialist Hospital, Maiduguri over a 3-month period. The Chi-squared statistic and multivariate logistic regression analysis were used. Results: Out of the grand total of 350 questionnaires that were distributed during the study period, 18 questionnaires were excluded from analysis due to incomplete responses; 332 with complete responses were therefore analyzed, giving a response rate of 94.9%. The results showed that only 16.9% of the respondents attended postnatal care services within 42 days after delivery. Most of the mothers (60.9%) were not knowledgeable about postnatal care services. A very high proportion of participants (69.4%) did not attend antenatal clinics, and over 70% of the study population had delivered at home. The study has identified some factors that have an important influence on utilization of postnatal care services in Maiduguri. These included awareness of postnatal care services (odds ratio [OR] 12.04, 95% confidence interval [CI]: 10.26, P = 0.000), higher educational status of the woman (OR 7.15, 95% CI: 5.19, P = 0.000), lower parity (OR 5.22, 95% CI: 3.21, P = 0.001) and marital status (married woman: OR 3.44, 95% CI: 2.17, P = 0.002). Educational attainment of the husband also significantly affected the

  6. Hormone therapy might be underutilized in women with early menopause.

    Science.gov (United States)

    Lindh-Åstrand, L; Hoffmann, M; Järvstråt, L; Fredriksson, M; Hammar, M; Spetz Holm, A-C

    2015-04-01

    Are Swedish women age 40-44 years with assumed early menopause 'undertreated' by hormone therapy (HT)? Many women with probable early menopause discontinue their HT after a short period of time. Thus, they fail to complete the recommended replacement up to age 51-52 years, the average age of menopause. Spontaneous early menopause occurs in ∼5% of women age 40-45 years. Regardless of the cause, women who experience hormonal menopause due to bilateral oophorectomy before the median age of spontaneous menopause are at increased risk of cardiovascular disease, neurological disease, osteoporosis, psychiatric illness and even death. The study is descriptive and epidemiological, and was based on the use of national registers of dispensed drug prescriptions (HT), linking registers from the National Board of Health and Welfare and Statistics Sweden from 1 July 2005 until 31 December 2011. The study population consisted of 310,404 women, 40-44 years old on 31 December 2005, who were followed from 1 July 2005 until 31 December 2011. Only 0.9% of women 40-44 years old started HT during the study period, and a majority of these women discontinued HT after a short period of time. Because the study is a retrospective study of registers, we can only speculate on the reasons for most of the women in this group discontinuing HT. Another limitation of this study is the rather short observation time; however, data could only be collected and combined from July 2005 onwards. As the occurrence of spontaneous early menopause in women age 40-45 is reported to be ∼5%, the fact that only 0.9% started HT suggests undertreatment. Some women may have used combined contraceptives as supplementation therapy, but in Sweden HT is the recommended treatment for early menopause, so any such women are not following this recommendation. Women who experience early menopause are at increased risk for overall morbidity and mortality, and can expect to benefit from HT until they have reached at least the median age of

  7. Groundwater quality assessment of the shallow aquifers west of the Nile Delta (Egypt) using multivariate statistical and geostatistical techniques

    Science.gov (United States)

    Masoud, Alaa A.

    2014-07-01

    Extensive urban, agricultural and industrial expansion on the western fringe of the Nile Delta of Egypt has exerted much load on the water needs and led to groundwater quality deterioration. Documenting the spatial variation of groundwater quality and its controlling factors is vital to ensure sustainable water management and safe use. A comprehensive dataset of 451 shallow groundwater samples was collected in 2011 and 2012. On-site field measurements of the total dissolved solids (TDS), electric conductivity (EC), pH, and temperature, as well as lab-based ionic composition of the major and trace components, were performed. Groundwater types were derived and the suitability for irrigation use was evaluated. Multivariate statistical techniques of factor analysis and K-means clustering were integrated with geostatistical semi-variogram modeling for evaluating the spatial hydrochemical variations and the driving factors, as well as for hydrochemical pattern recognition. Most hydrochemical parameters showed very wide ranges; TDS (201-24,400 mg/l), pH (6.72-8.65), Na+ (28.30-7774 mg/l), and Cl- (7-12,186 mg/l), suggesting complex hydrochemical processes with multiple sources. TDS violated the limit (1200 mg/l) of the Egyptian standards for drinking water quality in many localities. Extreme concentrations of Fe2+, Mn2+, Zn2+, Cu2+, and Ni2+ are mostly related to their natural content in the water-bearing sediments and/or to contamination from industrial leakage. Very high nitrate concentrations exceeding the permissible limit (50 mg/l) were potentially maximized toward hydrologic discharge zones and related to wastewater leakage. Three main water types, NaCl (29%), Na2SO4 (26%), and NaHCO3 (20%), formed 75% of the groundwater, dominating in the saline depressions, the sloping sides of the coastal ridges of the depressions, and the cultivated/newly reclaimed lands intensely covered by irrigation canals, respectively. Water suitability for irrigation use clarified that the

  8. UNDER-UTILIZATION OF COMMUNITY HEALTH CENTERS IN PURWOREJO REGENCY, CENTRAL JAVA

    OpenAIRE

    Atik Triratnawati

    2006-01-01

    The basic strategy of the Ministry of Health to achieve Health For All In Indonesia 2010 is through health paradigm, decentralization, professionalism and health service management. Community health centers play an important role to achieve the goal. Unfortunately, underutilization of community health centers is still a problem in Purworejo. The purpose of this study was to know the utilization of community health centers using a sociological health approach. Qualitative research by observati...

  9. UNDER-UTILIZATION OF COMMUNITY HEALTH CENTERS IN PURWOREJO REGENCY, CENTRAL JAVA

    Directory of Open Access Journals (Sweden)

    Atik Triratnawati

    2006-06-01

    Full Text Available The basic strategy of the Ministry of Health to achieve Health For All in Indonesia 2010 is through the health paradigm, decentralization, professionalism and health service management. Community health centers play an important role in achieving the goal. Unfortunately, under-utilization of community health centers is still a problem in Purworejo. The purpose of this study was to assess the utilization of community health centers using a sociological health approach. Qualitative research by observation, in-depth interview and focus group discussion was done among different types of groups. The study was done in Purworejo District in February and March 2000. The main problems related to under-utilization of community health centers mostly concern administration (poor-quality services, inefficiency, long waiting hours), a strong bureaucratic system (the physician has a dominant power, overlapping programs, poor coordination and integration with other divisions) and the cultural behavior of the community (labeling/stigma, dominance of self-care, lack of community participation). To overcome under-utilization of community health centers, the administration and bureaucracy should be changed to become more efficient and less bureaucratic. In addition, social change of the community culture is needed. As a consequence of these changes, the staff of the health centers will be more efficient and effective.

  10. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  11. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
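    The paper's sequence of increasingly complex tests can be sketched for a single (input, output) scatterplot: a linear correlation, a rank correlation, and a Kruskal-Wallis test for trends in central tendency across bins of the input. The example data are synthetic.

```python
# Sequence of pattern tests on one scatterplot, in order of complexity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
x = rng.uniform(size=400)
y = np.sin(3 * x) + 0.3 * rng.normal(size=400)         # nonlinear, monotone-ish

r, p = stats.pearsonr(x, y)                            # (i) linear relationship
print(f"pearson r  = {r:.2f} (p={p:.1e})")
rho, p = stats.spearmanr(x, y)                         # (ii) monotonic relationship
print(f"spearman r = {rho:.2f} (p={p:.1e})")

# (iii) Partition x into 5 classes and test for a trend in central tendency.
bins = np.digitize(x, np.quantile(x, [0.2, 0.4, 0.6, 0.8]))
groups = [y[bins == b] for b in range(5)]
H, p = stats.kruskal(*groups)
print(f"kruskal H  = {H:.1f} (p={p:.1e})")
```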

  12. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples

  13. Influence of manufacturing parameters on the strength of PLA parts using Layered Manufacturing technique: A statistical approach

    Science.gov (United States)

    Jaya Christiyan, K. G.; Chandrasekhar, U.; Mathivanan, N. Rajesh; Venkateswarlu, K.

    2018-02-01

    3D printing was successfully used to fabricate samples of Polylactic Acid (PLA). Processing parameters such as lay-up speed, lay-up thickness, and printing nozzle diameter were varied. All samples were tested for flexural strength using a three-point load test. A statistical mathematical model was developed to correlate the processing parameters with flexural strength. The results clearly demonstrated that lay-up thickness and nozzle diameter influenced flexural strength significantly, whereas lay-up speed hardly influenced it.
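    The statistical model in the abstract can be sketched as an ordinary least-squares regression of flexural strength on the three processing parameters. Factor levels, units and strengths below are invented; the small speed coefficient is built in to mirror the reported finding.

```python
# OLS regression of flexural strength on three printing parameters.
import numpy as np

rng = np.random.default_rng(10)
speed = rng.uniform(20, 60, 27)        # mm/s
thickness = rng.uniform(0.1, 0.3, 27)  # mm
nozzle = rng.uniform(0.3, 0.5, 27)     # mm
strength = 80 - 120 * thickness + 40 * nozzle + 0.02 * speed \
           + rng.normal(scale=2, size=27)  # MPa, thickness dominating

A = np.column_stack([np.ones(27), speed, thickness, nozzle])
coef, *_ = np.linalg.lstsq(A, strength, rcond=None)
for name, c in zip(["intercept", "speed", "thickness", "nozzle"], coef):
    print(f"{name:>9}: {c:+.2f}")      # small speed coefficient mirrors the finding
```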

  14. Elemental analysis of ancient potteries of Vellore Dist, Tamil Nadu, India by ED-XRF technique with statistical approach

    Directory of Open Access Journals (Sweden)

    A. Naseerutheen

    2014-03-01

    Full Text Available In the analysis of archaeological pottery, Energy Dispersive X-ray fluorescence analysis has been utilized to establish the concentrations of up to fourteen chemical elements for each of 14 archaeological pottery samples from Vellore District, Tamil Nadu, India. The EDXRF results have been processed using two multivariate statistical methods, cluster analysis and principal component analysis (PCA), in order to determine the similarities and correlations between the selected samples based on their elemental composition. The methodology successfully separates the samples, and two distinct chemical groups were discerned.

  15. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).

  16. A statistical forecast model using the time-scale decomposition technique to predict rainfall during flood period over the middle and lower reaches of the Yangtze River Valley

    Science.gov (United States)

    Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao

    2018-04-01

    In this paper, a statistical forecast model using the time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, the interannual component with a period less than 8 years, the interdecadal component with a period from 8 to 30 years, and the component with a period larger than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. The hindcast of FPR for the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
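    A sketch of the decompose-then-regress scheme under stated assumptions: moving averages split a rainfall series into the three time-scale components, and each component is regressed on its own predictors. The smoothing windows and the single SST-like predictor are illustrative choices, not the paper's.

```python
# Time-scale decomposition via moving averages, then per-component regression.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
years = np.arange(1951, 2015)
rain = 500 + 50 * np.sin(2 * np.pi * years / 4) \
           + 30 * np.sin(2 * np.pi * years / 20) + 0.5 * (years - 1951) \
           + 10 * rng.normal(size=years.size)

def smooth(x, window):
    return np.convolve(x, np.ones(window) / window, mode="same")

long_term = smooth(rain, 31)                   # period > 30 yr
interdecadal = smooth(rain, 9) - long_term     # roughly 8-30 yr
interannual = rain - smooth(rain, 9)           # period < 8 yr

# Regress one component on a hypothetical climate predictor (an SST-like index).
sst_index = np.sin(2 * np.pi * years / 4) + 0.2 * rng.normal(size=years.size)
model = LinearRegression().fit(sst_index.reshape(-1, 1), interannual)
print("R^2 of interannual fit: %.2f"
      % model.score(sst_index.reshape(-1, 1), interannual))
```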

  17. Airport surveys at travel destinations--underutilized opportunities in travel medicine research?

    Science.gov (United States)

    Bauer, Irmgard L

    2015-01-01

    Research in destination airports, especially in resource-poor areas, allows unique immediate access to travelers at the conclusion of their trip. Response rates are high and the recall gap small. Trip-related health matters can be elicited relatively easily. An insight into travelers' decision-making processes on location would fill large gaps in our knowledge regarding travel health advice provision; yet, this approach is still much underutilized. Using PubMed, ScienceDirect, Google Scholar, and ProQuest, a review of the literature on airport surveys was conducted to determine where they were used, their response rates and purpose, and location-relevant methodological information. The lack of methodological guidelines in the reviewed literature resulted in recommendations for planning and conducting an airport survey at a destination airport. Millions of travelers in airports around the world represent an underutilized sample of potential study participants for topics that cannot be studied adequately in other settings. Benefiting from close cooperation between travel health professionals and airport authorities, researchers can expect not only large-scale convenience samples for surveys, but also opportunities to explore exciting and creative research topics to broaden our understanding of travel medicine and health. © 2014 International Society of Travel Medicine.

  18. Diversity of the Neglected and Underutilized Crop Species of Importance in Benin

    Directory of Open Access Journals (Sweden)

    A. Dansi

    2012-01-01

    Full Text Available Many of the plant species that are cultivated for food across the world are neglected and underutilized. To assess their diversity in Benin, identify the priority species and establish their research needs, a survey was conducted in 50 villages distributed throughout the country. The study revealed 41 neglected and underutilized crop species (NUCS), among which 19 were identified as priorities based on 10 criteria, including their extent and degree of consumption. Reasons for neglect vary between producers and agricultural technicians. Market surveys revealed that NUCS are an important source of household income and substantially contribute to poverty reduction. Review of the available literature revealed that most of the species are rich in nutrients and have some proven medicinal value, and the promotion of their use would help in combating malnutrition and improving the health status of local populations. The knowledge gaps and research needs are immense for most of the species identified, as no concrete scientific data are available nationally. In terms of research, almost everything has to be done, starting from basic ethnobotanical investigation. The results will help scientists and students willing to conduct research on NUCS in Benin to better orient their research programs.

  19. Transformation of Food Habits through Promotion of Under-Utilized Cereals in High Hills of Nepal

    International Nuclear Information System (INIS)

    Koirala, Pramod; Bajracharya, Keshari; Chalise, Ananda

    2014-01-01

    Full text: Malnutrition is a persistent social problem in Nepal. The high hills of Nepal are considered the heartland of malnutrition, holding almost twice the national average of stunted children. Food insecurity is the major cause of malnutrition, as agricultural production is low and is compounded by difficult terrain and poor road connectivity. Nevertheless, several types of locally produced cereals are under-utilized because of the traditional food habit of eating rice. To change the food habits of high-hill residents, attempts were made to process these under-utilized cereals. Six cereals were processed into super flour, porridge, cookies, flakes and traditional sweets using locally accessible home-level processing technology. Sorghum (Sorghum bicolor), Foxtail Millet (Setaria italica), Proso Millet (Panicum miliaceum), Buckwheat (Fagopyrum esculentum), Amaranth (Amaranthus caudatus) and Naked Barley (Hordeum vulgare) were processed into diverse products suited to local taste. The processing steps were standardized and laboratory analysis was carried out. The products were then distributed to local development partners through trainers' training. Local people have now started processing and consuming these products. It is anticipated that promoting the processed products will help to relieve food insecurity to some extent and contribute to reducing malnutrition among children below two years of age. (author)

  20. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques.

    Science.gov (United States)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-10-01

    Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, integrated hydrochemical investigations involving chemical and statistical analyses were conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicate that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated against the hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
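
    The two multivariate steps named here, clustering of samples into zones and extraction of a few explanatory factors, are brief to sketch. The code below is a hedged illustration on synthetic data whose 54 x 21 shape mirrors the study; Ward linkage, four clusters, and five factors follow the abstract, but nothing else is taken from the paper.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
samples = rng.lognormal(size=(54, 21))        # placeholder hydrochemical data
Z = StandardScaler().fit_transform(samples)   # standardize before clustering

# Ward linkage, then cut the dendrogram into four groundwater zones
clusters = fcluster(linkage(Z, method="ward"), t=4, criterion="maxclust")

# Five factors, as in the study; loadings hint at geogenic vs anthropogenic drivers
fa = FactorAnalysis(n_components=5, random_state=0).fit(Z)
print("cluster sizes:", np.bincount(clusters)[1:])
print("loadings shape:", fa.components_.shape)  # (5 factors, 21 variables)
```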

  1. A Multivariate Statistical Analysis to Guide Classification of 2MASS/DPOSS Galaxies Using Data Mining Techniques

    Science.gov (United States)

    Mazzarella, J.; Jarrett, T.; Odewahn, S.; Cutri, R.; Chester, T.; Schmitz, M.; Monkewitz, S.; Madore, B.

    1999-05-01

    The Spring 1999 Incremental Release of the Two Micron All-Sky Survey (2MASS) Extended Source Catalog (XSC) contains new near-infrared measurements for about eighty thousand extended objects, most of which are previously uncatalogued galaxies. Likewise, the Second Generation Digital Palomar Observatory Sky Survey (DPOSS) provides a rich archive of new visual measurements over the same regions of the sky. Concise graphical and statistical summary data are used to systematically quantify the source densities in various slices of the 2MASS+DPOSS parameter space, including BRIJHK color space, concentration indices, central and average surface brightnesses, and isophotal parameters. Results are also presented for a global principal components analysis of this merged 2MASS+DPOSS dataset for the Spring 1999 XSC sample, with the primary goal of identifying the most important linear combinations of variables to feed into a decision-tree algorithm which will be applied in a follow-up study to attempt supervised classification of previously uncatalogued galaxies. An initial cross-comparison with the current NASA/IPAC Extragalactic Database (NED) shows that approximately 10% of the Spring 1999 XSC sample are previously catalogued objects. Distributions of 2MASS/DPOSS sources with published morphological types and nuclear activity levels (starburst, LINER, Seyfert) available in NED are summarized in the context of forming a training set for a machine learning classifier.
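
    The workflow the abstract describes, dimensionality reduction feeding a supervised classifier, can be outlined in a few lines. The sketch below assumes synthetic photometry and invented class labels; only the PCA-then-decision-tree structure comes from the record.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
colors = rng.normal(size=(2000, 8))     # e.g. BRIJHK colors plus surface brightness
types = rng.integers(0, 3, size=2000)   # placeholder morphological classes

pcs = PCA(n_components=4).fit_transform(colors)   # keep the leading components
X_tr, X_te, y_tr, y_te = train_test_split(pcs, types, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {tree.score(X_te, y_te):.2f}")  # ~chance on random data
```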

  2. QbD-driven development and evaluation of nanostructured lipid carriers (NLCs) of Olmesartan medoxomil employing multivariate statistical techniques.

    Science.gov (United States)

    Beg, Sarwar; Saini, Sumant; Bandopadhyay, Shantanu; Katare, O P; Singh, Bhupinder

    2018-03-01

    This research work entails quality by design (QbD)-based systematic development of nanostructured lipid carriers (NLCs) of Olmesartan medoxomil (OLM) with improved biopharmaceutical attributes. The quality target product profile (QTPP) was defined and critical quality attributes (CQAs) were earmarked. Drug solubility was determined in various lipids to screen them. NLCs were prepared by the hot-microemulsion method using the solid lipids, liquid lipids and surfactants with maximal solubility. Failure mode and effect analysis (FMEA) was carried out to identify high-risk formulation and process parameters. Principal component analysis (PCA) was then applied to the high-risk parameters to evaluate the effect of the type and concentration of lipids and surfactants on the CQAs. Systematic optimization of the critical material attributes (CMAs) was subsequently carried out using a face-centered cubic design, and the optimized formulation was identified within the design space. FMEA and PCA suggested the suitability of stearic acid, oleic acid and Tween 80 as the CMAs for the NLCs. Response surface optimization helped in identifying the optimized NLC formulation, with a particle size of ∼250 nm, a satisfactory zeta potential, drug entrapment >75%, and in vitro drug release >80% within 6 h. Release kinetic modeling indicated drug release through a Fickian-diffusion mechanism. Overall, these studies indicated the successful development of NLCs using multivariate statistical approaches for improved product and process understanding.

  3. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

    The aims of our study were to evaluate the effect of applying the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose and image quality of coronary computed tomography angiography (CCTA), and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used than in the FBP group: 550 mA (450–600) vs. 650 mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group: 4.29 mSv (2.84–6.02) vs. 5.84 mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean±standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting image quality.

  4. Statistical optimization of cell disruption techniques for releasing intracellular X-prolyl dipeptidyl aminopeptidase from Lactococcus lactis spp. lactis.

    Science.gov (United States)

    Üstün-Aytekin, Özlem; Arısoy, Sevda; Aytekin, Ali Özhan; Yıldız, Ece

    2016-03-01

    X-prolyl dipeptidyl aminopeptidase (PepX) is an intracellular enzyme from the Gram-positive bacterium Lactococcus lactis spp. lactis NRRL B-1821, and it has commercial importance. The objective of this study was to compare the effects of several cell disruption methods on the activity of PepX. Statistical optimization was performed for two cavitation methods, hydrodynamic (high-pressure homogenization) and acoustic (sonication), to determine the more appropriate disruption method. A two-level factorial design (2FI), with the number of cycles and the pressure as parameters, and a Box-Behnken design (BBD), with the cycle, sonication time, and power as parameters, were used for the optimization of the high-pressure homogenization and sonication methods, respectively. In addition, disruption methods consisting of lysozyme, bead milling, heat treatment, freeze-thawing, liquid nitrogen, ethylenediaminetetraacetic acid (EDTA), Triton-X, sodium dodecyl sulfate (SDS), chloroform, and antibiotics were performed and compared with the high-pressure homogenization and sonication methods. The optimized values for high-pressure homogenization were one cycle at 130 MPa, providing an activity of 114.47 mU ml⁻¹, while sonication afforded an activity of 145.09 mU ml⁻¹ at 28 min with 91% power and three cycles. In conclusion, sonication was the more effective disruption method, and its optimal operation parameters were established for the release of an intracellular enzyme from a L. lactis spp. lactis strain, which is a Gram-positive bacterium. Copyright © 2015 Elsevier B.V. All rights reserved.
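
    Behind a Box-Behnken optimization sits a fitted quadratic response surface whose maximum gives the recommended settings. The sketch below illustrates that step with an invented grid of (cycles, time, power) points and synthetic activities centred near the reported optimum; it uses a plain grid rather than the actual Box-Behnken design matrix, so treat it as a schematic only.

```python
import itertools
import numpy as np

rng = np.random.default_rng(10)
# Illustrative settings grid: (cycles, sonication time in min, power in %)
D = np.array(list(itertools.product([1, 2, 3], [10, 20, 30], [60, 80, 100])), float)
# Synthetic activities peaking near the reported optimum (3 cycles, 28 min, 91%)
activity = (-(D[:, 0] - 3) ** 2 - 0.01 * (D[:, 1] - 28) ** 2
            - 0.005 * (D[:, 2] - 91) ** 2 + rng.normal(scale=0.2, size=len(D)))

def quad_terms(d):
    """Full quadratic model matrix: intercept, linear, interaction, square terms."""
    c, t, p = d.T
    return np.column_stack([np.ones_like(c), c, t, p,
                            c * t, c * p, t * p, c ** 2, t ** 2, p ** 2])

beta, *_ = np.linalg.lstsq(quad_terms(D), activity, rcond=None)

grid = np.array(list(itertools.product(np.linspace(1, 3, 5),
                                       np.linspace(10, 30, 21),
                                       np.linspace(60, 100, 21))))
best = grid[np.argmax(quad_terms(grid) @ beta)]
print(f"predicted optimum: {best[0]:.1f} cycles, {best[1]:.0f} min, {best[2]:.0f}% power")
```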

  5. Paediatric lower limb deformity correction using the Ilizarov technique: a statistical analysis of factors affecting the complication rate.

    Science.gov (United States)

    Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M

    2014-01-01

    Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction, but the information in the literature is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. Several patient factors that may complicate the treatment or end result were analysed using a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of the corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre of lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor of the occurrence of complications; concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and the risk of complications. This study shows that only relative LLD predicts the risk of complications. The additional value of this analysis is the production of a simple graph; constructing this graph using data from a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
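
    The "simple graph" the authors describe is essentially the fitted risk curve of a logistic model with proportionate LLD as the predictor. Below is a minimal sketch of that construction on invented data (37 cases, an assumed coefficient, and simulated outcomes); it reproduces the idea, not the paper's numbers.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
lld = rng.uniform(4, 42, size=37).reshape(-1, 1)       # proportionate LLD (%)
p_true = 1 / (1 + np.exp(-(0.15 * lld.ravel() - 2)))   # hypothetical true risk
complication = rng.binomial(1, p_true)                 # simulated outcomes

model = LogisticRegression().fit(lld, complication)
grid = np.linspace(4, 42, 5).reshape(-1, 1)
for x, p in zip(grid.ravel(), model.predict_proba(grid)[:, 1]):
    print(f"LLD {x:5.1f}% -> predicted complication risk {p:.2f}")
```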

  6. Combined statistical analysis of vasodilation and flow curves in brachial ultrasonography: technique and its connection to cardiovascular risk factors

    Science.gov (United States)

    Boisrobert, Loic; Laclaustra, Martin; Bossa, Matias; Frangi, Andres G.; Frangi, Alejandro F.

    2005-04-01

    Clinical studies report that impaired endothelial function is associated with cardiovascular diseases (CVD) and their risk factors. One commonly used means of assessing endothelial function is Flow-Mediated Dilation (FMD). Classically, FMD is quantified using local indices, e.g. maximum peak dilation. Although such parameters have been successfully linked to CVD risk factors and other clinical variables, this description does not consider all the information contained in the complete vasodilation curve. Moreover, the relation between the flow impulse and the vessel's vasodilation response to this stimulus, although not clearly understood, seems to be important and is not taken into account in the majority of studies. In this paper we propose a novel global parameterization of the vasodilation and flow curves of an FMD test. This parameterization uses Principal Component Analysis (PCA) to describe, independently and jointly, the variability of the flow and FMD curves. These curves are obtained using computerized techniques (based on edge detection and image registration, respectively) to analyze the ultrasound image sequences. The global description obtained through PCA yields a detailed characterization of the morphology of such curves, allowing the extraction of intuitive quantitative information about the vasodilation process and its interplay with flow changes. This parameterization is consistent with traditional measurements and, in a database of 177 subjects, appears to correlate more strongly (and with more clinical parameters) with CVD risk factors and clinical parameters such as LDL- and HDL-cholesterol than classical measures do.

  7. A nonparametric statistical technique for combining global precipitation datasets: development and hydrological evaluation over the Iberian Peninsula

    Science.gov (United States)

    Abul Ehsan Bhuiyan, Md; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Quintana-Seguí, Pere; Barella-Ortiz, Anaïs

    2018-02-01

    This study investigates the use of a nonparametric, tree-based model, quantile regression forests (QRF), for combining multiple global precipitation datasets and characterizing the uncertainty of the combined product. We used the Iberian Peninsula as the study area, with a study period spanning 11 years (2000-2010). Inputs to the QRF model included three satellite precipitation products, CMORPH, PERSIANN, and 3B42 (V7); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. We calibrated the QRF model for two seasons and two terrain elevation categories and used it to generate ensembles for these conditions. Evaluation of the combined product was based on a high-resolution, ground-reference precipitation dataset (SAFRAN) available at 5 km/1 h resolution. Furthermore, to evaluate relative improvements and the overall impact of the combined product on hydrological response, we used the generated ensemble to force a distributed hydrological model (the SURFEX land surface model and the RAPID river routing scheme) and compared its streamflow simulation results with the corresponding simulations from the individual global precipitation and reference datasets. We concluded that the proposed technique could generate realizations that successfully encapsulate the reference precipitation and provide significant improvement in streamflow simulations, with reductions in systematic and random error on the order of 20-99 and 44-88 %, respectively, when considering the ensemble mean.

  8. Estimation of both optical and nonoptical surface water quality parameters using Landsat 8 OLI imagery and statistical techniques

    Science.gov (United States)

    Sharaf El Din, Essam; Zhang, Yun

    2017-10-01

    Traditional surface water quality assessment is costly, labor intensive, and time consuming; remote sensing, however, has the potential to assess surface water quality because of its spatiotemporal consistency. Estimating concentrations of surface water quality parameters (SWQPs) from satellite imagery is therefore attractive. Remote sensing estimation of nonoptical SWQPs, such as chemical oxygen demand (COD), biochemical oxygen demand (BOD), and dissolved oxygen (DO), has not yet been performed because these variables are less likely to affect the signals measured by satellite sensors. However, concentrations of nonoptical variables may be correlated with optical variables, such as turbidity and total suspended sediments, which do affect the reflected radiation. In this context, an indirect relationship between satellite multispectral data and COD, BOD, and DO can be assumed. This research therefore attempts to develop an integrated approach combining Landsat 8 band ratios and stepwise regression to estimate concentrations of both optical and nonoptical SWQPs. Compared with previous studies, a significant correlation between Landsat 8 surface reflectance and concentrations of SWQPs was achieved, with a coefficient of determination (R²) > 0.85. These findings demonstrate the possibility of using our technique to develop models that estimate concentrations of SWQPs and to generate spatiotemporal maps of SWQPs from Landsat 8 imagery.
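
    The core recipe, forming band ratios and letting a stepwise procedure pick the informative ones, can be sketched compactly. The code below is an illustrative forward-selection loop on synthetic reflectances; the ratio set, stopping rule, and target variable are all assumptions of this sketch, not the paper's calibration.

```python
import itertools
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
bands = rng.uniform(0.01, 0.3, size=(120, 4))   # surface reflectance, 4 bands
target = bands[:, 2] / bands[:, 0] + 0.05 * rng.normal(size=120)  # e.g. turbidity proxy

# All pairwise band ratios as candidate predictors
ratios = {f"B{i+1}/B{j+1}": bands[:, i] / bands[:, j]
          for i, j in itertools.permutations(range(4), 2)}

selected, best_r2 = [], -np.inf
while True:
    remaining = set(ratios) - set(selected)
    if not remaining:
        break
    gains = {}
    for name in remaining:
        X = np.column_stack([ratios[n] for n in selected + [name]])
        gains[name] = LinearRegression().fit(X, target).score(X, target)
    name, r2 = max(gains.items(), key=lambda kv: kv[1])
    if r2 - best_r2 < 0.01:   # stop once the gain in R^2 is marginal
        break
    selected, best_r2 = selected + [name], r2

print(selected, f"R^2 = {best_r2:.3f}")
```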

  9. Impact of the adaptive statistical iterative reconstruction technique on image quality in ultra-low-dose CT

    International Nuclear Information System (INIS)

    Xu, Yan; He, Wen; Chen, Hui; Hu, Zhihai; Li, Juan; Zhang, Tingting

    2013-01-01

    Aim: To evaluate the relationship between different noise indices (NIs) and radiation dose, and to compare the effect of different reconstruction algorithms for ultra-low-dose chest computed tomography (CT) on image quality improvement and the accuracy of volumetric measurement of ground-glass opacity (GGO) nodules, using a phantom study. Materials and methods: An 11 cm thick transverse phantom section with a chest wall, mediastinum, and 14 artificial GGO nodules with known volumes (919.93 ± 64.05 mm³) was constructed. The phantom was scanned on a Discovery CT 750HD scanner with five different NIs (NI = 20, 30, 40, 50, and 60). All data were reconstructed with a 0.625 mm section thickness using the filtered back-projection (FBP), 50% adaptive statistical iterative reconstruction (ASiR), and Veo model-based iterative reconstruction algorithms. Image noise was measured in six regions of interest (ROIs). Nodule volumes were measured using a commercial volumetric software package. The image quality and the volume measurement errors were analysed. Results: Image noise increased dramatically from 30.7 HU at NI 20 to 122.4 HU at NI 60 with FBP reconstruction. Conversely, Veo reconstruction effectively controlled the noise increase, from 9.97 HU at NI 20 to only 15.1 HU at NI 60. Image noise at NI 60 with Veo was even lower (by 50.8%) than that at NI 20 with FBP. The contrast-to-noise ratio (CNR) of Veo at NI 40 was similar to that of FBP at NI 20. All artificial GGO nodules were successfully identified and measured, with an average relative volume measurement error with Veo at NI 60 of 4.24%, comparable to a value of 10.41% with FBP at NI 20. At NI 60, the radiation dose was only one-tenth that at NI 20. Conclusion: The Veo reconstruction algorithm very effectively reduced image noise compared with the conventional FBP reconstructions. Using ultra-low-dose CT scanning and Veo reconstruction, GGOs can be detected and quantified with an acceptable

  10. A nonparametric statistical technique for combining global precipitation datasets: development and hydrological evaluation over the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    M. A. E. Bhuiyan

    2018-02-01

    Full Text Available This study investigates the use of a nonparametric, tree-based model, quantile regression forests (QRF), for combining multiple global precipitation datasets and characterizing the uncertainty of the combined product. We used the Iberian Peninsula as the study area, with a study period spanning 11 years (2000–2010). Inputs to the QRF model included three satellite precipitation products, CMORPH, PERSIANN, and 3B42 (V7); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. We calibrated the QRF model for two seasons and two terrain elevation categories and used it to generate ensembles for these conditions. Evaluation of the combined product was based on a high-resolution, ground-reference precipitation dataset (SAFRAN) available at 5 km/1 h resolution. Furthermore, to evaluate relative improvements and the overall impact of the combined product on hydrological response, we used the generated ensemble to force a distributed hydrological model (the SURFEX land surface model and the RAPID river routing scheme) and compared its streamflow simulation results with the corresponding simulations from the individual global precipitation and reference datasets. We concluded that the proposed technique could generate realizations that successfully encapsulate the reference precipitation and provide significant improvement in streamflow simulations, with reductions in systematic and random error on the order of 20–99 and 44–88 %, respectively, when considering the ensemble mean.
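
    The QRF idea in the two records above is to let a forest deliver a predictive distribution rather than a single value. scikit-learn has no built-in quantile regression forest, so the sketch below uses a common rough stand-in: pool the per-tree predictions of an ordinary random forest and read off empirical quantiles (a true QRF instead weights the training targets stored in each leaf). All inputs are synthetic placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 6))   # e.g. satellite rain, reanalysis T, soil moisture, elevation
y = X[:, 0] + 0.5 * rng.normal(size=500)   # placeholder reference precipitation

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
per_tree = np.stack([t.predict(X) for t in rf.estimators_])   # (trees, samples)

# Empirical 10th/50th/90th percentiles across the tree ensemble
q10, q50, q90 = np.quantile(per_tree, [0.1, 0.5, 0.9], axis=0)
print(f"mean 10-90% ensemble width: {np.mean(q90 - q10):.2f}")
```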

  11. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  12. Statistical techniques to construct assays for identifying likely responders to a treatment under evaluation from cell line genomic data

    International Nuclear Information System (INIS)

    Huang, Erich P; Fridlyand, Jane; Lewin-Koh, Nicholas; Yue, Peng; Shi, Xiaoyan; Dornan, David; Burington, Bart

    2010-01-01

    Developing the right drugs for the right patients has become a mantra of drug development. In practice, it is very difficult to identify subsets of patients who will respond to a drug under evaluation. Most of the time, no single diagnostic will be available, and more complex decision rules will be required to define a sensitive population, using, for instance, mRNA expression, protein expression or DNA copy number. Moreover, diagnostic development will often begin with in-vitro cell-line data and a high-dimensional exploratory platform, only later to be transferred to a diagnostic assay for use with patient samples. In this manuscript, we present a novel approach to developing robust genomic predictors that are not only capable of generalizing from in-vitro to patient, but are also amenable to clinically validated assays such as qRT-PCR. Using our approach, we constructed a predictor of sensitivity to dacetuzumab, an investigational drug for CD40-expressing malignancies such as lymphoma using genomic measurements of cell lines treated with dacetuzumab. Additionally, we evaluated several state-of-the-art prediction methods by independently pairing the feature selection and classification components of the predictor. In this way, we constructed several predictors that we validated on an independent DLBCL patient dataset. Similar analyses were performed on genomic measurements of breast cancer cell lines and patients to construct a predictor of estrogen receptor (ER) status. The best dacetuzumab sensitivity predictors involved ten or fewer genes and accurately classified lymphoma patients by their survival and known prognostic subtypes. The best ER status classifiers involved one or two genes and led to accurate ER status predictions more than 85% of the time. The novel method we proposed performed as well or better than other methods evaluated. We demonstrated the feasibility of combining feature selection techniques with classification methods to develop assays
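
    The study's central device, independently pairing a feature-selection step with a classifier and comparing the pairs, maps naturally onto a cross-validated pipeline. Below is a hedged sketch on a synthetic expression matrix; the k = 10 cap echoes the "ten or fewer genes" finding, while the selector, classifiers, and data are illustrative choices rather than the paper's.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
expr = rng.normal(size=(80, 5000))        # cell lines x genes (synthetic)
sensitive = rng.integers(0, 2, size=80)   # placeholder drug-sensitivity labels

for clf_name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                      ("linear SVM", SVC(kernel="linear"))]:
    # Selection inside the pipeline so each CV fold re-selects genes (no leakage)
    pipe = Pipeline([("select", SelectKBest(f_classif, k=10)),
                     ("clf", clf)])
    scores = cross_val_score(pipe, expr, sensitive, cv=5)
    print(f"{clf_name}: CV accuracy {scores.mean():.2f}")
```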

  13. Quality characterization and pollution source identification of surface water using multivariate statistical techniques, Nalagarh Valley, Himachal Pradesh, India

    Science.gov (United States)

    Herojeet, Rajkumar; Rishi, Madhuri S.; Lata, Renu; Dolma, Konchok

    2017-09-01

    ...multivariate techniques for reliable quality characterization of surface water, to develop effective pollution reduction strategies, and to maintain a fine balance between industrialization and ecological integrity.

  14. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which until now have not been solved. Traditional statistical methods based on the idea of an infinite sample often break down in the solution of real problems, and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a systematic technique for implementing new, more efficient versions of statistical procedures. ...

  15. Assessment of statistical agreement of three techniques for the study of cut marks: 3D digital microscope, laser scanning confocal microscopy and micro-photogrammetry.

    Science.gov (United States)

    Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel

    2017-09-01

    In the last few years, the study of cut marks on bone surfaces has become fundamental to the interpretation of prehistoric butchery practices. Because of the difficulty of identifying cut marks correctly, many criteria for their description and classification have been suggested. Different techniques, such as the three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG), have recently been applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be a very efficient and low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and it generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. We analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  16. COLLABORATIVE RESEARCH: USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    Energy Technology Data Exchange (ETDEWEB)

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the project reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well suited to diagnosing model cloud simulations using ARM observations (Shen).

  17. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    Science.gov (United States)

    Velasco-Tapia, Fernando

    2014-01-01

    Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features for the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994

  18. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    Directory of Open Access Journals (Sweden)

    Fernando Velasco-Tapia

    2014-01-01

    Full Text Available Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features for the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).

  19. Collaborative strategies are underutilized for mental health promotion: For the motion

    Directory of Open Access Journals (Sweden)

    Mohan Isaac

    2016-01-01

    Full Text Available Interventions for mental health promotion have to be initiated not just by the traditional mental health sector but by numerous other sectors and stakeholders involved in dealing with the social determinants of mental health. Collaboration would be the most appropriate and effective approach to dealing with the social determinants of mental health. However, collaborative strategies are grossly underutilized, or almost entirely unutilized, at regional, national, and international levels. There are several reasons for this. Foremost among them is the continuing struggle of mental health services all over the world, in both resource-rich and resource-poor settings, to effectively fill the treatment gap and provide services of adequate quality for the mentally unwell population. There is a need to expand the evidence base for mental health promotion and identify effective interventions that can be collaboratively implemented.

  20. Phenolic constituents and antioxidant capacity of four underutilized fruits from the Amazon region.

    Science.gov (United States)

    Gordon, Andre; Jungfer, Elvira; da Silva, Bruno Alexandre; Maia, Jose Guilherme S; Marx, Friedhelm

    2011-07-27

    The Amazon region comprises a plethora of fruit-bearing species, of which a large number are still agriculturally unimportant. Because fruit consumption has been associated with enhanced physical well-being, interest in the chemical composition of underexplored exotic fruits has increased in recent years. This paper provides a comprehensive identification of the polyphenolic constituents of four underutilized fruits from the Amazon region by HPLC/DAD-ESI-MS(n). Araçá (Psidium guineense), jambolão (Syzygium cumini), muruci (Byrsonima crassifolia), and cutite (Pouteria macrophylla) turned out to be primarily good sources of hydrolyzable tannins and/or flavonols. Additionally, different flavanonols and proanthocyanidins were identified in some fruits. The antioxidant capacity was determined using the total oxidant scavenging capacity (TOSC) assay. Cutite showed the highest antioxidant capacity, followed by jambolão, araçá, and muruci.

  1. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    Science.gov (United States)

    Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe

    2017-12-01

    This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances

  2. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    Directory of Open Access Journals (Sweden)

    S. Ars

    2017-12-01

    Full Text Available This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping
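
    At the heart of the implementation described in the two records above is a linear relationship: a Gaussian plume model maps unit source rates to receptor concentrations, and inverting that map recovers the emission rates. The sketch below illustrates this with an invented two-source, five-receptor geometry and a plain least-squares inversion; the dispersion coefficients are arbitrary, and the tracer-calibrated error statistics the study emphasizes are omitted.

```python
import numpy as np

def plume(x, y, q=1.0, u=3.0, sy=0.08, sz=0.06):
    """Ground-level Gaussian plume concentration at (x, y) downwind of a
    surface point source of rate q (g/s) in a wind of speed u (m/s)."""
    sig_y, sig_z = sy * x, sz * x   # crude linear growth of the plume sigmas
    return q / (np.pi * u * sig_y * sig_z) * np.exp(-y**2 / (2 * sig_y**2))

# Two sources, five receptors along a crosswind transect 200 m downwind
src_xy = [(0.0, 0.0), (0.0, 30.0)]
rec = [(200.0, y) for y in (-40, -20, 0, 20, 40)]
H = np.array([[plume(rx - sx, ry - sy_) for (sx, sy_) in src_xy]
              for (rx, ry) in rec])   # unit-rate transport matrix

true_q = np.array([2.0, 0.5])        # hypothetical emission rates (g/s)
obs = H @ true_q + 1e-6 * np.random.default_rng(7).normal(size=len(rec))
q_hat, *_ = np.linalg.lstsq(H, obs, rcond=None)
print("recovered rates (g/s):", np.round(q_hat, 3))
```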

  3. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  4. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    Science.gov (United States)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation of wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids, which can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There is an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
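
    Most of the listed MOS candidates are available off the shelf, so the comparison framework reduces to scoring them under a common cross-validation. The sketch below does exactly that on synthetic NWP-like features; the analog ensemble (method 6) has no standard scikit-learn implementation and is omitted, and all data and settings are placeholders rather than the experiment's.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
X = rng.normal(size=(1000, 10))                        # raw NWP predictors (wind, T, ...)
y = X[:, 0] ** 3 + rng.normal(scale=0.5, size=1000)    # nonlinear "generation" target

candidates = {
    "linear regression": LinearRegression(),
    "neural network": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "random forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
    "support vectors": SVR(),
}
for name, model in candidates.items():
    # Cross-validated RMSE as a common yardstick across MOS methods
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name:18s} RMSE {rmse:.2f}")
```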

  5. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques

    International Nuclear Information System (INIS)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D.; Rodriguez G, N. L.

    2009-01-01

    The elemental composition of archaeological ceramic fragments obtained during the excavations at San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the TRIGA Mark III research reactor with a neutron flux of 1·10¹³ n·cm⁻²·s⁻¹. The irradiation time was 2 hours. Prior to the acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal components analysis, allowed three different origins of the archaeological ceramics to be identified, designated as local, foreign and regional. (Author)

  6. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    Science.gov (United States)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Multivariate statistical techniques have the potential to overcome this limitation by combining different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (annual maximum and partial duration series), and different probability distribution functions were fitted to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation of the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour-intensity extreme precipitation event (or vice versa) can be twice as great as would be expected when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
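
    The peaks-over-threshold step reported here (Generalized Pareto fits to partial duration series) is brief to sketch. The code below fits a GPD to exceedances of a synthetic hourly rainfall record over a high threshold; the threshold choice is arbitrary, and declustering of dependent exceedances is omitted for brevity.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(9)
hourly_rain = rng.gamma(shape=0.3, scale=4.0, size=20 * 365 * 24)  # placeholder record

threshold = np.quantile(hourly_rain, 0.999)          # illustrative high threshold
excess = hourly_rain[hourly_rain > threshold] - threshold

shape, loc, scale = genpareto.fit(excess, floc=0.0)  # fix location at 0 for POT
level = threshold + genpareto.ppf(0.99, shape, loc=0.0, scale=scale)
print(f"GPD shape {shape:.2f}, scale {scale:.2f}; "
      f"99th-percentile exceedance level {level:.1f}")
```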

  7. Remote sensing estimation of the total phosphorus concentration in a large lake using band combinations and regional multivariate statistical modeling techniques.

    Science.gov (United States)

    Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi

    2015-03-15

    Remote sensing has been widely used for water quality monitoring, but most such studies have focused on only a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which are typically considered optically active. Estimating the phosphorus concentration in water by remote sensing is a challenge. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations primarily using simple individual band ratios or their natural logarithms, together with statistical regression based on field TP data and spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the spatial modeling scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. The correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme had higher accuracy for TP concentration estimation in the large lake than the traditional individual band ratio method and the whole-lake-scale regression-modeling scheme. The TP concentration values showed clear spatial variability: they were high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest TP concentration values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggest that the proposed modeling scheme, i.e., the band combinations and the regional multivariate

  8. Factors influencing the underutilization of mental health services among Asian American women with a history of depression and suicide.

    Science.gov (United States)

    Augsberger, Astraea; Yeung, Albert; Dougher, Meaghan; Hahm, Hyeouk Chris

    2015-12-08

    Despite the substantially high prevalence of depression, suicidal ideation and suicide attempts among Asian American women who are children of immigrants, little is known about the prevalence of mental health service utilization and the perceived barriers to accessing care. The data were from the Asian American Women's Sexual Health Initiative Project (AWSHIP), a 5-year mixed methods study at Boston University. The quantitative analysis examined the differential proportion of mental health service utilization among 701 survey participants based on their mental health risk profile, determined by current moderate to severe depression symptoms and lifetime history of suicidality. Mental health risk groups were created based on participants' current depression symptoms and history of suicidal behaviors: Group 1, low risk; Group 2, medium risk; Group 3, high risk. Mental health care utilization outcomes were measured by any mental health care, minimally adequate mental health care, and intensive mental health care. The qualitative analysis explored the perceived barriers to mental health care among 17 participants from the medium- and high-risk groups. Among the 701 participants, 43% of women (n = 299) reported either current moderate to severe depression symptoms or a lifetime history of suicidal ideation or a suicide attempt. Although the high-risk group demonstrated statistically significantly higher mental health utilization than the low- and medium-risk groups, more than 60% of the high-risk group did not access any mental health care, and more than 80% did not receive minimally adequate care. The qualitative analysis identified three underutilization factors: Asian family contributions to mental health stigma, Asian community contributions to mental health stigma, and a mismatch between cultural needs and available services. Despite the high prevalence of depression and suicidal behaviors among young Asian American women in the sample, the proportion of mental

  9. Experimental evolution as an underutilized tool for studying beneficial animal-microbe interactions

    Directory of Open Access Journals (Sweden)

    Kim Loan Hoang

    2016-09-01

    Full Text Available Microorganisms play a significant role in the evolution and functioning of the eukaryotes with which they interact. Much of our understanding of beneficial host-microbe interactions stems from studying already established associations; we often infer the genotypic and environmental conditions that led to the existing host-microbe relationships. However, several outstanding questions remain, including understanding how host and microbial (internal) traits, and ecological and evolutionary (external) processes, influence the origin of beneficial host-microbe associations. Experimental evolution has helped address a range of evolutionary and ecological questions across different model systems; however, it has been greatly underutilized as a tool to study beneficial host-microbe associations. In this review, we suggest ways in which experimental evolution can further our understanding of the proximate and ultimate mechanisms shaping mutualistic interactions between eukaryotic hosts and microbes. By tracking beneficial interactions under defined conditions or evolving novel associations among hosts and microbes with little prior evolutionary interaction, we can link specific genotypes to phenotypes that can be directly measured. Moreover, this approach will help address existing puzzles in beneficial symbiosis research: how symbioses evolve, how symbioses are maintained, and how both host and microbe influence their partner's evolutionary trajectories. By bridging theoretical predictions and empirical tests, experimental evolution provides us with another approach to test hypotheses regarding the evolution of beneficial host-microbe associations.

  10. Development and Examination of Sweet Potato Flour Fortified with Indigenous Underutilized Seasonal Vegetables

    Directory of Open Access Journals (Sweden)

    Ernest Teye

    2018-01-01

    Full Text Available Developing nutrient-rich vegetable flour using locally under-utilized food crops in Africa would improve rural household nutrition. This study sought to develop nutrient-dense vegetable flour from different proportions of Sweet potato (Sp, 40–100%), Avocado pear (Avo, 10–40%), and Turkey berry (Tor, 10–40%), using a completely randomized design (CRD) with 14 treatment combinations and three replications. The proximate composition, mineral composition, and functional properties of the composite flours were investigated. The results showed significant differences in all the parameters analyzed for the various composite flours. As Avo and Tor were added to the Sp, the proximate composition was enhanced, except for the percentage carbohydrate, which decreased from 83.92 to 54.59 g/100 g. The mineral composition was also improved by the incorporation of Avo and Tor. Favourable functional properties were also obtained. The optimal composite flour was made up of 40% Sp, 35% Avo, and 25% Tor. The functional properties of the composite flours were better than those of the control (Sweet potato flour). Fortifying Sp flour with Avo and Tor is feasible and could be an easy and affordable means to improve rural nutrition, as it requires only simple logistics for an ordinary rural household to produce a composite of the desired choice.

  11. Epidemiology of meningitis with a negative CSF Gram stain: under-utilization of available diagnostic tests.

    Science.gov (United States)

    Nesher, L; Hadi, C M; Salazar, L; Wootton, S H; Garey, K W; Lasco, T; Luce, A M; Hasbun, R

    2016-01-01

    Meningitis with a negative cerebrospinal fluid Gram stain (CSF-GS) poses a diagnostic challenge, as more than 50% of patients remain without an aetiology. The introduction of polymerase chain reaction (PCR) and arboviral serologies has increased diagnostic capabilities, yet large-scale epidemiological studies evaluating their use in clinical practice are lacking. We conducted a prospective observational study in New Orleans between November 1999 and September 2008 (early era), when PCR was not widely available, and in Houston between November 2008 and June 2013 (modern era), when PCR was commonly used. Patients presenting with meningitis and a negative CSF-GS were followed for 4 weeks. All investigations, PCR used, and results were recorded as they became available. In the 323 patients enrolled, PCR provided the highest diagnostic yield (24·2%) but was ordered for only 128 (39·6%) patients; it was followed by serology for arboviruses (yield 15%), which was ordered for 100 (31%) of all patients. The yield of blood cultures was 10·3% and that of CSF cultures 4%; the yield of all other tests was lower. PCR and arboviral serologies thus offer the greatest diagnostic yield in meningitis with a negative CSF-GS, but both tests are being under-utilized.

  12. Parathyroidectomy is Underutilized in Patients with Tertiary Hyperparathyroidism after Renal Transplantation

    Science.gov (United States)

    Lou, Irene; Schneider, David F; Leverson, Glen; Foley, David; Sippel, Rebecca; Chen, Herbert

    2015-01-01

    Background Parathyroidectomy is the only curative treatment for tertiary hyperparathyroidism (3HPT). With the introduction of calcimimetics (cinacalcet), parathyroidectomy can sometimes be delayed or avoided. The purpose of this study was to determine the current incidence of utilization of parathyroidectomy in patients with post-transplant 3HPT with the advent of cinacalcet. Method We evaluated renal transplant patients between 1/1/2004-6/30/2012 with a minimum of 24 months follow-up who had persistent allograft function. Patients with an increased serum level of parathyroid hormone (PTH) one year after successful renal transplantation with normocalcemia or hypercalcemia were defined as having 3HPT. A multivariate logistic regression model was constructed to determine factors associated with undergoing parathyroidectomy. Results We identified 618 patients with 3HPT, only 41 (6.6%) of whom underwent parathyroidectomy. Patients with higher levels of serum calcium (p<0.001) and PTH (p=0.002) post-transplant were more likely to be referred for parathyroidectomy. Importantly, those who underwent parathyroidectomy had serum calcium and PTH values distributed more closely to the normal range on most recent follow-up. Parathyroidectomy was not associated with rejection (p=0.400) or with worsened allograft function (p=0.163). Conclusion Parathyroidectomy appears to be underutilized in patients with 3HPT at our institution. Parathyroidectomy is associated with high cure rates, improved serum calcium and PTH levels, and is not associated with rejection. PMID:26603850

  13. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry; explores…

  14. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  15. Design of a Seismic Reflection Multi-Attribute Workflow for Delineating Karst Pore Systems Using Neural Networks and Statistical Dimensionality Reduction Techniques

    Science.gov (United States)

    Ebuna, D. R.; Kluesner, J.; Cunningham, K. J.; Edwards, J. H.

    2016-12-01

    An effective method for determining the approximate spatial extent of karst pore systems is critical for hydrological modeling in such environments. When using geophysical techniques, karst features are especially challenging to constrain due to their inherent heterogeneity and complex seismic signatures. We present a method for mapping these systems using three-dimensional seismic reflection data by combining applications of machine learning and modern data science. Supervised neural networks (NN) have been successfully implemented in seismic reflection studies to produce multi-attributes (or meta-attributes) for delineating faults, chimneys, salt domes, and slumps. Using a seismic reflection dataset from southeast Florida, we develop an objective multi-attribute workflow for mapping karst in which potential interpreter bias is minimized by applying linear and non-linear data transformations for dimensionality reduction. This statistical approach yields a reduced set of input seismic attributes to the NN by eliminating irrelevant and overly correlated variables, while still preserving the vast majority of the observed data variance. By initiating the supervised NN from an eigenspace that maximizes the separation between classes, the convergence time and accuracy of the computations are improved, since the NN only needs to recognize small perturbations to the provided decision boundaries. We contend that this data-driven 3D seismic reflection method for defining the spatial bounds of karst pore systems provides great value as a standardized preliminary step for hydrological characterization and modeling in these complex geological environments.
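
    A hedged sketch of the dimensionality-reduction-then-NN pattern the abstract describes (Python, scikit-learn); the attribute matrix, labels, and network size are illustrative assumptions, not the authors' workflow:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # X: hypothetical (n_samples, n_attributes) matrix of seismic attributes;
        # y: interpreter-supplied labels (1 = karst, 0 = background).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 20))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        # Reduce correlated attributes to the components carrying most variance,
        # then train a small supervised network on the reduced inputs.
        clf = make_pipeline(
            StandardScaler(),
            PCA(n_components=0.95),        # keep 95% of the observed variance
            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
        )
        clf.fit(X, y)
        print("training accuracy:", clf.score(X, y))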

  16. Cosmic Statistics of Statistics

    OpenAIRE

    Szapudi, I.; Colombi, S.; Bernardeau, F.

    1999-01-01

    The errors on statistics measured in finite galaxy catalogs are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi (1996) is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly nonlinear to weakly nonlinear scales. The final analytic formu...

  17. Injury Statistics

    Science.gov (United States)

    Injury statistics and technical reports based on NEISS, the National Electronic Injury Surveillance System.

  18. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters covering the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, the statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and non-ideal lattice models, imperfect gas theory and liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics.

  19. Modification of RDX and HMX crystals in procedure of solvent/anti-solvent by statistical methods of Taguchi analysis design and MLR technique

    Directory of Open Access Journals (Sweden)

    Hamid Reza Pouretedal

    2018-02-01

    Full Text Available Many of the physical and functional properties of the explosives RDX and HMX are related to their crystalline structure. Crystalline defects affect the quality of the explosives; therefore, in order to enhance the quality of these materials, it is necessary to form crystals with the fewest defects. In this research, we report the optimization of the recrystallization process of RDX and HMX by statistical techniques. The solvent/anti-solvent procedure was used for recrystallization of HMX and RDX particles. Four parameters, (i) the ratio of anti-solvent to solvent, (ii) the ratio of solute to solvent, (iii) the aging time, and (iv) the cooling rate of the mixture, were optimized by Taguchi design analysis. A Taguchi L16 orthogonal array was used, with sixteen rows corresponding to the number of tests and four columns at four levels. The apparent density of the recrystallized RDX and HMX particles was taken as the quality characteristic, with the concept of "the larger the better". The resulting graphs showed that the studied parameters were optimized at a 1:1 ratio of anti-solvent to solvent, a solute-to-solvent ratio of 0.1 g·mL⁻¹, an aging time of 2 h, and a cooling rate of 1 °C·min⁻¹. The correlation between the investigated parameters and the apparent density of the crystals was also studied by the multiple linear regression (MLR) method to obtain a model for predicting apparent density. The P-values indicated that, at a confidence level of 95%, the null hypothesis is rejected and the proposed model adds meaningful explanatory power.
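
    A minimal illustration of the larger-the-better signal-to-noise analysis used in Taguchi designs (Python); the array layout and density values below are placeholders, not the paper's measurements:

        import numpy as np
        import pandas as pd

        # Hypothetical L16-style table (two of the four factors shown for brevity):
        # each factor has four levels; 'density' is the measured apparent density.
        df = pd.DataFrame({
            "anti_solvent_ratio": np.repeat([1, 2, 3, 4], 4),
            "solute_ratio":       np.tile([1, 2, 3, 4], 4),
            "density":            np.random.default_rng(1).uniform(1.6, 1.9, 16),
        })

        # Larger-the-better S/N ratio: S/N = -10 * log10(mean(1 / y^2));
        # with one replicate per run this reduces to 20 * log10(y).
        df["sn"] = -10 * np.log10(1.0 / df["density"] ** 2)

        # Mean S/N per factor level; the best level of each factor maximizes S/N.
        for factor in ["anti_solvent_ratio", "solute_ratio"]:
            print(df.groupby(factor)["sn"].mean())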

  20. DHI evaluation by combining rock physics simulation and statistical techniques for fluid identification of Cambrian-to-Cretaceous clastic reservoirs in Pakistan

    Science.gov (United States)

    Ahmed, Nisar; Khalid, Perveiz; Shafi, Hafiz Muhammad Bilal; Connolly, Patrick

    2017-10-01

    The use of seismic direct hydrocarbon indicators (DHIs) is very common in exploration and reservoir development to minimise exploration risk and to optimise the location of production wells. DHIs can be enhanced using AVO methods to calculate seismic attributes that approximate relative elastic properties. In this study, we analyse the sensitivity to pore-fluid changes of a range of elastic properties by combining rock physics studies and statistical techniques, and determine which provide the best basis for DHIs. Gassmann fluid substitution is applied to the well log data, and various elastic properties are evaluated by measuring the degree of separation that they achieve between gas sands and wet sands. The method has been applied successfully to well log data from proven reservoirs in three different siliciclastic environments of Cambrian, Jurassic, and Cretaceous ages. We have quantified the sensitivity of various elastic properties such as acoustic and extended elastic (EEI) impedances, elastic moduli (Ksat and Ksat − μ), the lambda-mu-rho method (λρ and μρ), the P-to-S-wave velocity ratio (VP/VS), and Poisson's ratio (σ) under full gas and full water saturation scenarios. The results are strongly dependent on the local geological settings, and our modeling demonstrates that for the Cambrian and Cretaceous reservoirs, Ksat − μ, EEI, VP/VS, and σ are more sensitive to pore fluids (gas/water). For the Jurassic reservoir, the sensitivity of all elastic and seismic properties to pore fluid is reduced due to high overburden pressure and the resultant low porosity. Fluid indicators are evaluated using two metrics: a fluid indicator coefficient based on a Gaussian model and an overlap coefficient which makes no assumptions about a distribution model. This study provides a potential way to identify gas sand zones in future exploration.
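
    For reference, a compact sketch of the Gassmann fluid substitution step the abstract relies on (Python); the mineral, dry-rock, and fluid moduli below are invented round numbers, not values from the study:

        import numpy as np

        def gassmann_ksat(k_dry, k_min, k_fl, phi):
            """Saturated bulk modulus from Gassmann's equation.

            k_dry: dry-rock bulk modulus, k_min: mineral (grain) modulus,
            k_fl: pore-fluid modulus, phi: porosity (fraction). Units: GPa.
            """
            b = 1.0 - k_dry / k_min
            return k_dry + b**2 / (phi / k_fl + (1 - phi) / k_min - k_dry / k_min**2)

        # Illustrative numbers only: quartz grains ~37 GPa, gas ~0.05 GPa, brine ~2.6 GPa.
        k_gas = gassmann_ksat(k_dry=12.0, k_min=37.0, k_fl=0.05, phi=0.20)
        k_wet = gassmann_ksat(k_dry=12.0, k_min=37.0, k_fl=2.60, phi=0.20)
        print(f"Ksat (gas) = {k_gas:.2f} GPa, Ksat (brine) = {k_wet:.2f} GPa")
        # The shear modulus is unchanged by pore fluid, so Ksat - mu tracks
        # fluid sensitivity, consistent with the attributes compared above.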

  1. UNA TECNICA ESTADISTICA PARA MEDIR LA CONFLICTIVIDAD SOCIAL A TRAVES DEL REGISTRO ARQUEOLOGICO (A Statistical Technique to Measure Social Conflict through the Archaeological Record)

    Directory of Open Access Journals (Sweden)

    Pascual Izquierdo-Egea

    2015-03-01

    Full Text Available A statistical technique to measure social conflict through the mortuary record is presented here. It arose from the contextual valuation method used in the analysis of grave goods since 1993, and it is a fundamental tool for the development of the archaeology of social phenomena, whose relevant empirical results support its theoretical significance. After conveying its conceptualization in terms of social inequality and relative wealth, the two classes of social conflict defined are explained: structural or static, and conjunctural or dynamic. Finally, its connections with the Malthusian demographic law through its two parameters, population and resources, are included. This theoretical framework is illustrated with applications to ancient civilizations, including Iberian protohistory, prehispanic Mesoamerica, and early imperial Rome.

  2. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first-semester statistics course.

  3. Understanding groundwater chemistry using multivariate statistics techniques to the study of contamination in the Korba unconfined aquifer system of Cap-Bon (North-east of Tunisia)

    Science.gov (United States)

    Zghibi, Adel; Merzougui, Amira; Zouhri, Lahcen; Tarhouni, Jamila

    2014-01-01

    … the dissolution of gypsum, dolomite and halite, as well as contamination by nitrate caused mainly by extensive irrigation activity. The application of multivariate statistical techniques based on Principal Component Analysis and Hierarchical Cluster Analysis has led to the corroboration of the hypotheses developed from the previous hydrochemical study. Two factors were found that explain the major hydrochemical processes in the aquifer. These factors reveal the existence of intensive seawater intrusion and the mechanisms of nitrate contamination of groundwater.
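
    A brief sketch of the PCA-plus-hierarchical-clustering pattern used here (Python, scikit-learn and SciPy); the file name and ion columns are hypothetical stand-ins for the study's hydrochemical variables:

        import pandas as pd
        from scipy.cluster.hierarchy import fcluster, linkage
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Hypothetical table: one row per well, columns of major-ion concentrations.
        df = pd.read_csv("korba_wells.csv")          # e.g. Na, Ca, Mg, K, Cl, SO4, NO3
        X = StandardScaler().fit_transform(df)       # z-scores before PCA/HCA

        # Principal components summarizing the hydrochemical variables.
        pca = PCA(n_components=2).fit(X)
        print("variance explained:", pca.explained_variance_ratio_)

        # Ward-linkage clustering of the samples.
        Z = linkage(X, method="ward")
        print(fcluster(Z, t=3, criterion="maxclust"))   # three sample clusters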

  4. Underutilization of dental care when it is freely available: a prospective study of the New England Children's Amalgam Trial.

    Science.gov (United States)

    Maserejian, Nancy Nairi; Trachtenberg, Felicia; Link, Carol; Tavares, Mary

    2008-01-01

    This study aims to prospectively examine the trends in, and reasons for, the underutilization of free semiannual preventive dental care provided to children with unmet dental needs who participated in the 5-year New England Children's Amalgam Trial. Children aged 6 to 10 at baseline (1997-99) with two or more posterior carious teeth were recruited from rural Maine (n = 232) and urban Boston (n = 266). Interviewer-administered questionnaires assessed demographic and personal characteristics. Reasons for missed appointments were recorded during follow-up and are presented descriptively. We used ordinal logistic regression to analyze the utilization of semiannual dental visits. On average, urban children utilized 69 percent of the visits and rural children 82 percent. For both sites, utilization steadily decreased until the end of the 5-year trial. Significant predictors of underutilization in the multivariate model for urban children were non-White race, household welfare use, deep debt, and distance to the dental clinic. Among the relatively less diverse rural children, caregiver education level and a greater number of decayed tooth surfaces at baseline (i.e., need for care) were significantly associated with underutilization. Among all children, common reasons for missed visits included guardian scheduling and transportation difficulties; reasons among urban participants also indicated a low priority placed on dental care. Among these children with unmet dental needs, the provision of free preventive dental care was insufficient to remove the disparities in utilization and did not consistently result in high utilization through follow-up. Differences between educational levels, ethnicities, and rural/urban locations suggest that public health programs need to target the social settings in which financial burdens exist.
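
    A hedged sketch of an ordinal logistic regression of visit utilization (Python, statsmodels, whose OrderedModel is available from version 0.12 onward); the file name and predictor names are invented for illustration:

        import pandas as pd
        from statsmodels.miscmodels.ordinal_model import OrderedModel

        # Hypothetical data: utilization coded as an ordered category
        # (low / medium / high share of semiannual visits attended).
        df = pd.read_csv("amalgam_trial_visits.csv")
        df["utilization"] = pd.Categorical(
            df["utilization"], categories=["low", "medium", "high"], ordered=True
        )

        # Proportional-odds model with a few predictors echoing the abstract.
        model = OrderedModel(
            df["utilization"],
            df[["urban", "welfare", "distance_km", "caregiver_education"]],
            distr="logit",
        )
        res = model.fit(method="bfgs", disp=False)
        print(res.summary())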

  5. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society. Cancer statistics describe what happens in large groups of people and measure the success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer.

  6. Caregiving Statistics

    Science.gov (United States)

    Statistics on family caregivers and family caregiving: the caregiving population, health care, and the value of the services family caregivers provide.

  7. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  8. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    Science.gov (United States)

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on the proportion of correctly classified cases.
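
    A small sketch comparing a tree-based and a discriminant classifier in the spirit of the abstract (Python, scikit-learn; CHAID has no scikit-learn implementation, so a CART-style decision tree and linear discriminant analysis stand in, and the features are synthetic):

        from sklearn.datasets import make_classification
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical recipient features predicting future employment (0/1).
        X, y = make_classification(n_samples=400, n_features=8, random_state=0)

        for name, clf in [
            ("CART", DecisionTreeClassifier(max_depth=4, random_state=0)),
            ("discriminant analysis", LinearDiscriminantAnalysis()),
        ]:
            acc = cross_val_score(clf, X, y, cv=5).mean()
            print(f"{name}: mean CV proportion correctly classified = {acc:.3f}")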

  9. Emissions dispatch under the underutilization provision of the 1990 U.S. Clean Air Act Amendments: Models and analysis

    International Nuclear Information System (INIS)

    Hobbs, B.F.

    1993-01-01

    The acid rain title of the new Clean Air Act will impact utility planning and operations in many ways. One important provision of the title will constrain the operation of coal-fired generating units that are subject to SO2 limitations during Phase 1 of the Act (1995-99). Because only SO2 emissions from those units will require emissions allowances during that time, utilities will be motivated to shift production to non-Phase 1 units whose SO2 emissions are not limited until the year 2000. To prevent this from happening, the Act mandates that utilities maintain the fuel use rate of Phase 1 units at or above their 1985-87 level. This paper summarizes methods for including such underutilization constraints in probabilistic production costing models and real-time dispatch. As an input to the US Environmental Protection Agency's rulemaking process, a production costing study has been conducted that compares two alternative rules that would define when underutilization of Phase 1 units occurs. It is concluded that significant cost savings could be realized if the more flexible of the two proposed rules were adopted.
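
    To make the idea of an underutilization constraint concrete, here is a toy economic-dispatch linear program (Python, SciPy) with a floor on a Phase 1 unit's output; the costs, capacities, and floor are invented, not the study's data:

        from scipy.optimize import linprog

        # Two units serving 900 MW of load: a cheap non-Phase 1 unit and a
        # Phase 1 coal unit whose fuel use must stay at or above a baseline.
        cost = [20.0, 35.0]              # $/MWh for units 1 and 2 (Phase 1)
        A_eq = [[1.0, 1.0]]              # generation must meet load
        b_eq = [900.0]
        bounds = [
            (0, 600),                    # non-Phase 1 unit: 0-600 MW
            (400, 500),                  # Phase 1 unit: 400 MW floor stands in
        ]                                # for the 1985-87 fuel-use requirement

        res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print("dispatch (MW):", res.x, "cost ($/h):", res.fun)
        # Without the 400 MW floor the cheap unit would be loaded first and the
        # Phase 1 unit backed down -- exactly the shift the Act's rule prevents.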

  10. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  11. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations…

  12. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures, and to reduce the incidence of those complications by introducing the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was applied, using failure mode and effect analysis, to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures. Pareto analysis was used to identify the most recurrent complications. A paired z-sample test using Minitab statistical inference and Fisher's exact test were used to statistically analyse the obtained data. The p-values indicated that applying the Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
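
    A minimal sketch of the Pareto step and the Fisher's exact comparison described above (Python, SciPy); the complication tallies and contingency table are made up for illustration:

        import pandas as pd
        from scipy.stats import fisher_exact

        # Hypothetical complication tallies before the DMAIC improvements.
        counts = pd.Series(
            {"pain at site": 42, "haematoma": 21, "trismus": 12, "paraesthesia": 5}
        ).sort_values(ascending=False)

        # Pareto analysis: the cumulative share identifies the 'vital few' causes.
        print(counts.cumsum() / counts.sum())

        # Fisher's exact test on complications before vs. after the intervention:
        # rows = (complication, no complication), columns = (before, after).
        table = [[80, 30], [920, 970]]
        odds, p = fisher_exact(table)
        print(f"odds ratio = {odds:.2f}, p = {p:.4f}")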

  13. Random-matrix approach to the statistical compound nuclear reaction at low energies using the Monte-Carlo technique [PowerPoint]

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, Toshihiko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2015-11-10

    This theoretical treatment of low-energy compound nucleus reactions begins with the Bohr hypothesis, with corrections, and various statistical theories. The author investigates the statistical properties of the scattering matrix containing a Gaussian Orthogonal Ensemble (GOE) Hamiltonian in the propagator. The following conclusions are reached: for all parameter values studied, the numerical average of Monte-Carlo-generated cross sections coincides with the result of the Verbaarschot, Weidenmueller, Zirnbauer triple-integral formula. Energy average and ensemble average agree reasonably well when the width Γ is one or two orders of magnitude larger than the average resonance spacing d. In the strong-absorption limit, the channel degree of freedom ν_a is 2. The direct reaction increases the inelastic cross sections while the elastic cross section is reduced.
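
    A small illustration of the Monte Carlo ingredient, sampling GOE matrices and inspecting ensemble-averaged eigenvalue spacings (Python, NumPy); the matrix dimension and sample count are arbitrary choices, not the talk's parameters:

        import numpy as np

        rng = np.random.default_rng(42)

        def goe_sample(n):
            """Draw one n x n matrix from the Gaussian Orthogonal Ensemble."""
            a = rng.normal(size=(n, n))
            return (a + a.T) / 2.0       # symmetrize: real symmetric Gaussian matrix

        # Ensemble average over many realizations, as in the MC cross-section study.
        spacings = []
        for _ in range(200):
            ev = np.linalg.eigvalsh(goe_sample(100))
            mid = ev[40:60]              # central eigenvalues, away from the edges
            spacings.extend(np.diff(mid))

        s = np.array(spacings) / np.mean(spacings)
        print("mean normalized spacing:", s.mean())   # 1 by construction
        print("P(s < 0.2):", (s < 0.2).mean())        # small: GOE level repulsion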

  14. Do parents who smoke underutilize health care services for their children? A cross sectional study within the longitudinal PIAMA study

    Directory of Open Access Journals (Sweden)

    Baan Caroline A

    2007-06-01

    Full Text Available Abstract Background A higher prevalence of respiratory symptoms, and an associated increase in health care utilization, is to be expected among children whose parents smoke. From previous studies, however, it appears that parents who smoke may underutilize health services for their children, especially with respect to respiratory care. This study explores the validity and generalizability of that assumption. Methods Data were obtained from a Dutch birth-cohort study, the Prevention and Incidence of Asthma and Mite Allergy (PIAMA) project. Information regarding parental smoking, the child's respiratory symptoms and health care use, and potential confounders was obtained by postal questionnaires. Multivariate logistic models were used to relate parental smoking to the child's respiratory symptoms and health care use. Results The study comprised 3,564 4-year-old children. In the crude analysis, respiratory symptoms were more frequent among children with a parent who smoked, while health care utilization for respiratory symptoms was not significantly different between children with or without a parent who smoked. In the multivariate analyses, maternal smoking had a larger impact on the child's respiratory symptoms and health care use than paternal smoking. Maternal smoking was positively associated with mild respiratory symptoms in the child, adjusted odds ratio [AOR] 1.50 (1.19-1.91), but not with severe respiratory symptoms, AOR 1.03 (0.75-1.40). Among children with mild respiratory symptoms, children with a mother who smoked were less likely to be taken to the general practitioner (GP) for respiratory symptoms than children with mothers who did not smoke, AOR 0.58 (0.33-1.01). This finding was less pronounced among children with severe respiratory symptoms, AOR 0.86 (0.49-1.52). Neither GP visits for non-respiratory symptoms nor specialized care for respiratory disease was significantly associated with parental smoking.

  15. Evaluating selected demographic factors related to consumer preferences for furniture from commercial and from underutilized species.

    Science.gov (United States)

    David Nicholls; Matthew Bumgardner

    2007-01-01

    This technical note describes consumer preferences within selected demographic categories in two major Pacific Northwest markets for six domestic wood species. These woods were considered for construction of four furniture pieces. Chi-square tests were performed to determine species preferences based on gender, age, and income. Age and income were statistically significant…

  16. Harmonic statistics

    Science.gov (United States)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
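
    A speculative numerical sketch of the harmonic Poisson process described above (Python); it uses the fact that an intensity c/x integrates to c·ln(b/a) over [a, b], so points can be drawn by inversion. The constants are arbitrary and the construction is an illustration, not the paper's derivation:

        import numpy as np

        rng = np.random.default_rng(7)

        def harmonic_poisson(c, a, b):
            """Sample points of a Poisson process with intensity c/x on [a, b]."""
            mean_count = c * np.log(b / a)        # integral of c/x over [a, b]
            n = rng.poisson(mean_count)
            u = rng.random(n)
            return a * (b / a) ** u               # inverse cumulative intensity

        pts = harmonic_poisson(c=50.0, a=1.0, b=1000.0)
        # Scale invariance: the expected count in [x, 2x] is c*ln(2) for every x.
        for x in (1.0, 10.0, 100.0):
            print(f"count in [{x}, {2*x}]: {np.sum((pts >= x) & (pts < 2*x))}")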

  17. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  18. Changes in the nature of dissolved organics during pulp and paper mill wastewater treatment: a multivariate statistical study combining data from three analytical techniques.

    Science.gov (United States)

    Plant, Emma L; Smernik, Ronald J; van Leeuwen, John; Greenwood, Paul; Macdonald, Lynne M

    2014-03-01

    The paper-making process can produce large amounts of wastewater (WW) with high particulate and dissolved organic loads. Generally, in developed countries, stringent international regulations for environmental protection require pulp and paper mill WW to be treated to reduce the organic load prior to discharge into the receiving environment. This can be achieved by primary and secondary treatments involving both chemical and biological processes. These processes result in complex changes in the nature of the organic material, as some components are mineralised and others are transformed. In this study, changes in the nature of organics through different stages of secondary treatment of pulp and paper mill WW were followed using three advanced characterisation techniques: solid-state ¹³C nuclear magnetic resonance (NMR) spectroscopy, pyrolysis-gas chromatography mass spectrometry (py-GCMS) and high-performance size-exclusion chromatography (HPSEC). Each technique provided a different perspective on the changes that occurred. To compare the different chemical perspectives in terms of the degree of similarity/difference between samples, we employed non-metric multidimensional scaling. Results indicate that NMR and HPSEC provided strongly correlated perspectives, with 86 % of the discrimination between the organic samples common to both techniques. Conversely, py-GCMS was found to provide a unique, and thus complementary, perspective.
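
    A short sketch of non-metric multidimensional scaling for comparing sample similarity across techniques (Python, scikit-learn); the feature table is a random stand-in for the pooled NMR/py-GCMS/HPSEC descriptors:

        import numpy as np
        from sklearn.manifold import MDS
        from sklearn.preprocessing import StandardScaler

        # Hypothetical matrix: rows are WW treatment stages, columns are
        # descriptors pooled from the three analytical techniques.
        rng = np.random.default_rng(3)
        X = StandardScaler().fit_transform(rng.normal(size=(12, 30)))

        # Non-metric MDS preserves the rank order of pairwise dissimilarities.
        nmds = MDS(n_components=2, metric=False, dissimilarity="euclidean",
                   random_state=3)
        coords = nmds.fit_transform(X)
        print("stress:", nmds.stress_)      # lower stress = better ordination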

  19. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  20. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  1. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems. It covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, the statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced in this edition.

  2. An evaluation of single-site statistical downscaling techniques in terms of indices of climate extremes for the Midwest of Iran

    Science.gov (United States)

    Farajzadeh, M.; Oji, R.; Cannon, A. J.; Ghavidel, Y.; Massah Bavani, A.

    2015-04-01

    Seven single-site statistical downscaling methods for daily temperature and precipitation, including four deterministic algorithms [analog model (ANM), quantile mapping with delta method extrapolation (QMD), cumulative distribution function transform (CDFt), and model-based recursive partitioning (MOB)] and three stochastic algorithms [generalized linear model (GLM), Conditional Density Estimation Network Creation and Evaluation (CaDENCE), and Statistical Downscaling Model-Decision Centric (SDSM-DC)], are evaluated at nine stations located in the mountainous region of Iran's Midwest. The methods are of widely varying complexity, with input requirements that range from single-point predictors of temperature and precipitation to multivariate synoptic-scale fields. The period 1981-2000 is used for model calibration and 2001-2010 for validation, with performance assessed in terms of 27 Climate Extremes Indices (CLIMDEX). The sensitivity of the methods to large-scale anomalies and their ability to replicate the observed data distribution in the validation period are separately tested for each index by Pearson correlation and Kolmogorov-Smirnov (KS) tests, respectively. Combined tests are used to assess overall model performances. MOB performed best, passing 14.5 % (49.6 %) of the combined (single) tests, respectively, followed by SDSM, CaDENCE, and GLM [14.5 % (46.5 %), 13.2 % (47.1 %), and 12.8 % (43.2 %), respectively], and then by QMD, CDFt, and ANM [7 % (45.7 %), 4.9 % (45.3 %), and 1.6 % (37.9 %), respectively]. Correlation tests were passed less frequently than KS tests. All methods downscaled temperature indices better than precipitation indices. Some indices, notably R20, R25, SDII, CWD, and TNx, were not successfully simulated by any of the methods. Model performance varied widely across the study region.
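
    A condensed sketch of the two-part evaluation described above, testing correlation with observations and distributional agreement per index (Python, SciPy); the arrays are placeholders for a CLIMDEX index series, not the study's data:

        import numpy as np
        from scipy.stats import ks_2samp, pearsonr

        rng = np.random.default_rng(5)
        observed = rng.gamma(2.0, 8.0, size=120)          # e.g. an annual R10 series
        downscaled = observed * 0.9 + rng.normal(0, 3, 120)

        # Sensitivity to large-scale anomalies: Pearson correlation test.
        r, p_corr = pearsonr(observed, downscaled)

        # Ability to replicate the observed distribution: two-sample KS test.
        ks, p_ks = ks_2samp(observed, downscaled)

        # 'Combined' pass: significantly correlated AND no detectable
        # distributional difference.
        passed = (p_corr < 0.05) and (p_ks > 0.05)
        print(f"r = {r:.2f} (p = {p_corr:.3f}), KS p = {p_ks:.3f}, pass: {passed}")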

  3. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hyun, E-mail: circle1128@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, Myung-Joon, E-mail: mjkim@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Yoon, Choon-Sik, E-mail: yooncs58@yuhs.ac [Department of Radiology, Gangnam Severance Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Lee, Mi-Jung, E-mail: mjl1213@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of)]

    2012-09-15

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality.
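
    A minimal sketch of the paired comparisons used for such intra-individual dose data (Python, SciPy); the per-patient dose vectors are invented, not the study's measurements:

        import numpy as np
        from scipy.stats import ttest_rel, wilcoxon

        # Hypothetical per-patient CTDIvol (mGy): routine FBP vs. 50% ASIR protocol.
        routine = np.array([18.1, 19.4, 17.8, 20.2, 18.9, 18.5])
        asir    = np.array([ 7.2,  7.9,  7.1,  8.0,  7.5,  7.4])

        t, p_t = ttest_rel(routine, asir)     # paired Student's t-test (dose)
        w, p_w = wilcoxon(routine, asir)      # Wilcoxon signed-rank (ordinal scores)
        print(f"paired t: p = {p_t:.4f}; Wilcoxon: p = {p_w:.4f}")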

  4. Underutilization of aspirin in people living with human immunodeficiency virus at increased risk for acute myocardial infarction: Systematic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Stella Pak

    2017-01-01

    Full Text Available Context: With the increased availability of potent combination antiretroviral therapies, the life expectancy of people living with human immunodeficiency virus (PLHIV) has greatly increased. This rapid improvement in lifespan has served as a catalyst for a paradigm shift in human immunodeficiency virus (HIV) care. The focus of HIV care models has transitioned from the sole treatment of acute opportunistic infections to comprehensive management of chronic diseases, such as cardiovascular disease (CVD). Multiple studies have demonstrated that PLHIV are 50% more likely to develop acute myocardial infarction (AMI) compared to the general population. Cardiovascular risk prevention is becoming an essential component of the overarching HIV treatment plan. Aims: This meta-analysis aims to compare the rate of aspirin use for AMI prevention in indicated patients between PLHIV and the general population. Methods: The PubMed, EMBASE, Web of Science, Cochrane Library, CINAHL, and MEDLINE databases were used to identify observational cohort trials. Studies were assessed by two reviewers against the inclusion criteria. Two separate random-effects meta-analysis models were fitted using the DerSimonian and Laird method. Heterogeneity was assessed using the I² value. Meta-regression with study-level variables was used to explore potential sources of heterogeneity. The funnel-plot-based trim-and-fill method was applied to detect and adjust for potential publication bias. Statistical tests were two-sided and P < 0.05 was considered statistically significant. Results: A total of 13 studies were included for analysis. In these trials, 30.4% of PLHIV with increased risk for coronary heart disease (CHD) used aspirin for AMI prevention, compared to 36.9% of patients at risk of CHD in the general population. Conclusions: The results of this meta-analysis provide evidence that aspirin is underutilized in both PLHIV and the general population across broad geographical zones. Aspirin use…
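
    For readers unfamiliar with the DerSimonian-Laird estimator named above, a compact sketch (Python, NumPy); the per-study effects and variances are fabricated for illustration:

        import numpy as np

        # Hypothetical per-study effect estimates (logit scale) and variances.
        y = np.array([-0.9, -0.7, -1.1, -0.5, -0.8])   # study effects
        v = np.array([0.04, 0.06, 0.05, 0.08, 0.03])   # within-study variances

        # DerSimonian-Laird: moment estimate of between-study variance tau^2.
        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)
        tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

        # Random-effects pooled estimate with the inflated weights.
        w_re = 1.0 / (v + tau2)
        pooled = np.sum(w_re * y) / np.sum(w_re)
        i2 = max(0.0, (Q - (len(y) - 1)) / Q) * 100    # I^2 heterogeneity (%)
        print(f"tau^2 = {tau2:.3f}, pooled = {pooled:.3f}, I^2 = {i2:.1f}%")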

  5. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  6. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit... by accounting for the significance of the materials and the equipment that enter into the production of statistics. Key words: reversible statistics, diverse materials, constructivism, economics, science, and technology.

  7. Factors associated with underutilization of antenatal care services in Indonesia: results of Indonesia Demographic and Health Survey 2002/2003 and 2007

    Directory of Open Access Journals (Sweden)

    Titaley Christiana R

    2010-08-01

    Full Text Available Abstract Background Antenatal care aims to prevent maternal and perinatal mortality and morbidity. In Indonesia, at least four antenatal visits are recommended during pregnancy. However, this service has been underutilized. This study aimed to examine factors associated with underutilization of antenatal care services in Indonesia. Methods We used data from the Indonesia Demographic and Health Survey (IDHS) 2002/2003 and 2007. Information on 26,591 singleton live-born infants of the mothers' most recent birth within five years preceding each survey was examined. Twenty-three potential risk factors were identified and categorized into four main groups: external environment, predisposing, enabling, and need factors. Logistic regression models were used to examine the association between all potential risk factors and underutilization of antenatal services. The population attributable risk (PAR) was calculated for selected significant factors associated with the outcome. Results Factors strongly associated with underutilization of antenatal care services were infants from rural areas and from the outer Java-Bali region, infants from households with a low wealth index and with a low maternal education level, and high-birth-rank infants with a short birth interval of less than two years. Other associated factors included mothers reporting distance to health facilities as a major problem, mothers less exposed to mass media, and mothers reporting no obstetric complications during pregnancy. The PAR showed that 55% of the total risk for underutilization of antenatal care services was attributable to the combined low household wealth index and low maternal education level. Conclusions Strategies to increase the accessibility and availability of health care services are important, particularly for communities in rural areas. Financial support that enables mothers from poor households to use health services will be beneficial. Health promotion programs targeting…
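
    A tiny worked sketch of the standard population attributable risk formula, PAR = p(RR − 1) / (1 + p(RR − 1)) (Python); the prevalence and relative risk are invented numbers, and the paper's exact combined-factor computation may differ:

        # Population attributable risk for underutilization of antenatal care,
        # illustrated with made-up inputs: p = exposure prevalence, rr = relative risk.
        def par(p, rr):
            return p * (rr - 1.0) / (1.0 + p * (rr - 1.0))

        # e.g. combined low household wealth and low maternal education.
        print(f"PAR = {par(p=0.45, rr=3.2):.0%}")   # share of total risk attributable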

  8. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    Science.gov (United States)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-09-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest for the average gray level, relatively high for the skewness and the entropy, and relatively low for the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
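
    A hedged sketch of the six first-order texture descriptors named in the abstract, computed from a region-of-analysis histogram (Python, NumPy); the ROA here is random data standing in for a 50 × 50 CT window, and the formulas are the standard first-order definitions rather than the paper's exact implementation:

        import numpy as np

        def texture_features(roa, levels=256):
            """First-order statistical texture measures of a grayscale region."""
            hist = np.bincount(roa.ravel(), minlength=levels).astype(float)
            p = hist / hist.sum()                    # gray-level probabilities
            g = np.arange(levels, dtype=float)
            mean = np.sum(g * p)                     # average gray level
            var = np.sum((g - mean) ** 2 * p)
            contrast = np.sqrt(var)                  # average contrast (std. dev.)
            smoothness = 1 - 1 / (1 + var / (levels - 1) ** 2)   # relative smoothness
            skewness = np.sum(((g - mean) / np.sqrt(var)) ** 3 * p)
            uniformity = np.sum(p ** 2)
            entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
            return mean, contrast, smoothness, skewness, uniformity, entropy

        roa = np.random.default_rng(9).integers(0, 256, size=(50, 50))
        print(texture_features(roa))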

  9. Modeling and Analysis of Mechanical Properties of Aluminium Alloy (A413 Processed through Squeeze Casting Route Using Artificial Neural Network Model and Statistical Technique

    Directory of Open Access Journals (Sweden)

    R. Soundararajan

    2015-01-01

    Full Text Available An Artificial Neural Network (ANN) approach was used for predicting and analyzing the mechanical properties of A413 aluminum alloy produced by the squeeze casting route. The experiments were carried out with different controlled input variables, squeeze pressure, die preheating temperature, and melt temperature, as per a Full Factorial Design (FFD). The controlled process variables produce castings with a pore-free, ideal fine-grained dendritic structure, resulting in good mechanical properties such as hardness, ultimate tensile strength, and yield strength. As a primary objective, a feed-forward back-propagation ANN model was developed with different architectures to ensure the definiteness of the values. The developed model and its predictions were in good agreement with the experimental data, indicating the valuable performance of the optimal model. From this work it was ascertained that, for castings produced by the squeeze casting route, the ANN is an alternative method for predicting the mechanical properties: appropriate results can be estimated rather than measured, thereby reducing testing time and cost. As a secondary objective, quantitative and statistical analysis was performed in order to evaluate the effect of the process parameters on the mechanical properties of the castings.
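
    A small sketch of a feed-forward network of the kind described, mapping the three process variables to a mechanical property (Python, scikit-learn); the architecture and the synthetic data are illustrative assumptions, not the paper's model:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical runs: squeeze pressure (MPa), die temp (C), melt temp (C)
        # -> measured hardness, loosely following the full factorial design idea.
        rng = np.random.default_rng(4)
        X = rng.uniform([50, 150, 650], [150, 300, 750], size=(27, 3))
        y = 60 + 0.2 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 2, 27)

        # Feed-forward back-propagation network with one hidden layer.
        ann = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=4),
        )
        ann.fit(X, y)
        print("predicted hardness:", ann.predict([[100, 200, 700]]))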

  10. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  11. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  12. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co…

  13. COAGULATION ASSESSMENT: UNDERUTILIZED DIAGNOSTIC TOOLS IN ZOO AND AQUATIC ANIMAL MEDICINE.

    Science.gov (United States)

    Gerlach, Trevor J; Barratclough, Ashley; Conner, Bobbi

    2017-12-01

    Veterinarians specializing in nondomestic species are faced with unique challenges regarding research and diagnostic capabilities given the wild and frequently dangerous nature of their patients. Standard diagnostic techniques used in small or large animal practice are not always possible due to anatomical constraints, size, tractability, or the inherent risk of anesthesia in highly valued, rare species. Diagnostic modalities that utilize simple, relatively noninvasive techniques show promise in evaluating nondomestic species and elucidating the pathophysiology behind poorly characterized disease processes in both wild and captive populations. Coagulation profiles, which may include prothrombin time (PT), partial thromboplastin time (PTT), D-dimer concentration, platelet count, and thromboelastography (TEG) are frequently used in domestic species but often overlooked in exotic medicine due to lack of normal reference values and/or availability. Whenever possible, coagulation profiles should be utilized in the evaluation of various disease processes including neoplasia, sepsis, trauma, inflammation, toxin exposure, and envenomation. There are several reports of coagulopathies in both wild and captive species; however, few studies on coagulation profiles have been published on nondomestic species. Clinicians should consider coagulation testing as part of the diagnostic work-up in nondomestic species. A review of available coagulation diagnostic tests is provided here in addition to summarizing the pertinent coagulation disorders currently established in the literature.

  14. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    International Nuclear Information System (INIS)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-01-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 x 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.

  15. Laser-Plasma Instabilities by Avoiding the Strong Ion Landau Damping Limit: The Central Role of Statistical, Ultrafast, Nonlinear Optical Laser Techniques (SUNOL)

    Science.gov (United States)

    Afeyan, Bedros; Hüller, Stefan; Montgomery, David; Moody, John; Froula, Dustin; Hammer, James; Jones, Oggie; Amendt, Peter

    2014-10-01

    In mid-Z and high-Z plasmas, it is possible to control crossed-beam energy transfer (CBET) and subsequently occurring single- or multiple-beam instabilities such as Stimulated Raman Scattering (SRS) by novel means. These new techniques are inoperative when the ion acoustic waves are in their strong damping limit, as occurs in low-Z plasmas with comparable electron and ion temperatures. For mid-Z plasmas, such as Z = 10, and near the Mach 1 surface, the strong coupling regime (SCR) can be exploited for LPI mitigation, while at higher Z values it is thermal filamentation in conjunction with nonlocal heat transport that is useful to exploit. In both settings, the strategy is to induce laser-hot-spot-intensity-dependent, and thus spatially dependent, frequency shifts to the ion acoustic waves in the transient response of wave-wave interactions. The latter is achieved by the on-off nature of spike trains of uneven duration and delay (STUD pulses). The least taxing use of STUD pulses is to modulate the beams at the 10 ps time scale and to choose which crossing beams overlap in time and which do not. Work supported by a grant from the DOE NNSA-OFES joint program on HEDP.

  16. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  17. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  18. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...

  19. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  20. Practical Statistics

    OpenAIRE

    Lyons, L.

    2017-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  1. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques. Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of distributions…

  2. Total coliforms, arsenic and cadmium exposure through drinking water in the Western Region of Ghana: application of multivariate statistical technique to groundwater quality.

    Science.gov (United States)

    Affum, Andrews Obeng; Osae, Shiloh Dede; Nyarko, Benjamin Jabez Botwe; Afful, Samuel; Fianko, Joseph Richmond; Akiti, Tetteh Thomas; Adomako, Dickson; Acquaah, Samuel Osafo; Dorleku, Micheal; Antoh, Emmanuel; Barnes, Felix; Affum, Enoch Acheampong

    2015-02-01

    In recent times, the surface water resources of the Western Region of Ghana have been found to be inadequate in supply and polluted by various anthropogenic activities. As a result of these problems, the demand for groundwater by the human populations in the peri-urban communities for domestic, municipal and irrigation purposes has increased, without prior knowledge of its water quality. Water samples were collected from 14 public hand-dug wells during the rainy season in 2013 and investigated for total coliforms, Escherichia coli, mercury (Hg), arsenic (As), cadmium (Cd) and physicochemical parameters. Multivariate statistical analysis of the dataset and a linear stoichiometric plot of major ions were applied to group the water samples and to identify the main factors and sources of contamination. Hierarchical cluster analysis revealed four clusters from the hydrochemical variables (R-mode) and three clusters in the case of water samples (Q-mode) after z-score standardization. Principal component analysis after a varimax rotation of the dataset indicated that the four factors extracted explained 93.3 % of the total variance, highlighting salinity, toxic elements and hardness pollution as the dominant factors affecting groundwater quality. Cation exchange, mineral dissolution and silicate weathering also influenced groundwater quality. The ranking order of major ions was Na+ > Ca2+ > K+ > Mg2+ and Cl− > SO42− > HCO3−. Based on the Piper plot and the hydrogeology of the study area, sodium chloride (86 %) and sodium hydrogen carbonate/sodium carbonate (14 %) water types were identified. Although E. coli were absent in the water samples, 36 % of the wells contained total coliforms (Enterobacter species), which exceeded the WHO guideline limit of zero colony-forming units (CFU)/100 mL of drinking water. With the exception of Hg, the concentrations of As and Cd in 79 and 43 % of the water samples exceeded the WHO guideline limits of 10 and 3 μg/L, respectively.
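
    A brief sketch of the varimax-rotated factor extraction step (Python, scikit-learn, whose FactorAnalysis gained a rotation option in version 0.24); the file name and variable set are placeholders for the measured hydrochemical parameters:

        import pandas as pd
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        # Hypothetical wells-by-parameters table (major ions, EC, pH, As, Cd, ...).
        df = pd.read_csv("hand_dug_wells.csv")
        X = StandardScaler().fit_transform(df)

        # Four varimax-rotated factors, as in the study's interpretation.
        fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
        fa.fit(X)
        loadings = pd.DataFrame(fa.components_.T, index=df.columns)
        print(loadings.round(2))   # high loadings flag salinity/toxicity/hardness factors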

  3. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  4. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  5. Male circumcision: a globally relevant but under-utilized method for the prevention of HIV and other sexually transmitted infections.

    Science.gov (United States)

    Tobian, Aaron A R; Kacker, Seema; Quinn, Thomas C

    2014-01-01

    Randomized trials have demonstrated that male circumcision (MC) reduces heterosexual acquisition of HIV, herpes simplex virus type 2, human papillomavirus (HPV), and genital ulcer disease among men, and it reduces HPV, genital ulcer disease, bacterial vaginosis, and trichomoniasis among female partners. The pathophysiology behind these effects is multifactorial, relying on anatomic and cellular changes. MC is cost effective and potentially cost saving in both the United States and Africa. The World Health Organization and Joint United Nations Program on HIV/AIDS proposed reaching 80% MC coverage in HIV endemic countries, but current rates fall far behind targets. Barriers to scale-up include supply-side and demand-side challenges. In the United States, neonatal MC rates are decreasing, but the American Academy of Pediatrics now recognizes the medical benefits of MC and supports insurance coverage. Although MC is a globally valuable tool to prevent HIV and other sexually transmitted infections, it is underutilized. Further research is needed to address barriers to MC uptake.

  6. Tapping the Potential of Neglected and Underutilized Food Crops for Sustainable Nutrition Security in the Mountains of Pakistan and Nepal

    Directory of Open Access Journals (Sweden)

    Lipy Adhikari

    2017-02-01

    Full Text Available Neglected and underutilized food crops (NUFCs) have high nutritional value, but their role in achieving nutrition security is not adequately understood, and they do not feature in food and nutrition policies and programs of the countries of the Hindu-Kush Himalayan (HKH) region. Drawing examples from Pakistan and Nepal, this study investigates the importance of NUFCs in achieving nutrition security in the mountains and identifies key underlying reasons for the decline in their cultivation and use. The study found that the prevalence of malnutrition is significantly higher in the mountains than nationally in both Pakistan and Nepal and identifies the decline in the cultivation and use of micronutrient-rich NUFCs as one of the key reasons for this. The deterioration of local food systems, changing food habits, lack of knowledge about the cultivation, use and nutritional value of NUFCs, and lack of attention to NUFCs in programs and policies are the key reasons for the abandoning of NUFCs by mountain communities. There is an urgent need to mainstream these crops into national programs and policies and to integrate them into local food systems. This will not only improve the nutrition security of mountain areas, but also support biodiversity and local mountain economies.

  7. Comparison of Herbarium Label Data and Published Medicinal Use: Herbaria as an Underutilized Source of Ethnobotanical Information.

    Science.gov (United States)

    Souza, E N F; Hawkins, J A

    2017-01-01

    The use of herbarium specimens as vouchers to support ethnobotanical surveys is well established. However, herbaria may be underutilized resources for ethnobotanical research that depends on the analysis of large datasets compiled across multiple sites. Here, we compare two medicinal use datasets, one sourced from published papers and the other from online herbaria to determine whether herbarium and published data are comparable and to what extent herbarium specimens add new data and fill gaps in our knowledge of geographical extent of plant use. Using Brazilian legumes as a case study, we compiled 1400 use reports from 105 publications and 15 Brazilian herbaria. Of the 319 species in 107 genera with cited medicinal uses, 165 (51%) were recorded only in the literature and 55 (17%) only on herbarium labels. Mode of application, plant part used, or therapeutic use was less often documented by herbarium specimen labels (17% with information) than publications (70%). However, medicinal use of 21 of the 128 species known from only one report in the literature was substantiated from independently collected herbarium specimens, and 58 new therapeutic applications, 25 new plant parts, and 16 new modes of application were added for species known from the literature. Thus, when literature reports are few or information-poor, herbarium data can both validate and augment these reports. Herbarium data can also provide insights into the history and geographical extent of use that are not captured in publications.

  8. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  9. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    Statistical Mechanics is self-sufficient and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed gradually and thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, and transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...

  10. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  11. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  12. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient

  13. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Science.gov (United States)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing impacts of potential future climate change scenarios on precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine Catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches are the historical climatic data (taken from the Spain02 project for the period 1971-2000 with a spatial resolution of 12.5 km) and the future series provided by climatic models in the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional Climate Models (RCM) nested in four different Global Climate Models (GCM)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathways (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution derived transformation and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose an unequally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combination of models and techniques that provide a better approximation to the basic
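
    To make the two families of approaches concrete, here is a hedged sketch of a first-moment delta-change perturbation and an empirical quantile-mapping bias correction; the gamma-distributed series below are invented stand-ins for the Spain02 observations and the RCM runs.

```python
import numpy as np

rng = np.random.default_rng(1)
obs_hist = rng.gamma(2.0, 2.0, 1000)  # observed historical series (stand-in for Spain02)
mod_hist = rng.gamma(2.2, 2.1, 1000)  # RCM historical run
mod_fut  = rng.gamma(2.5, 2.3, 1000)  # RCM future run (e.g., RCP8.5, 2071-2100)

# Delta change, first-moment version: perturb the observations by the
# model's mean change signal.
delta_series = obs_hist * (mod_fut.mean() / mod_hist.mean())

# Empirical quantile mapping: send each future model value through the
# model's historical empirical CDF onto the observed historical quantiles.
def quantile_map(x, model_ref, obs_ref):
    probs = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    return np.quantile(obs_ref, np.clip(probs, 0.0, 1.0))

bias_corrected_future = quantile_map(mod_fut, mod_hist, obs_hist)
```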

  14. Statistical Computing

    Indian Academy of Sciences (India)

    Statistical Computing - Understanding Randomness and Random Numbers. Sudhakar Kunte. Series Article, Resonance – Journal of Science Education, Volume 4, Issue 10, October 1999, pp 16-21.

  15. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics - classical, quantum, Bose-Einstein, Fermi-Dirac, and more.The work also includes discussions of Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, problem of radiation, much more.

  16. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
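
    As a quick illustration of the diagnostic-test summaries reviewed (sensitivity, specificity, accuracy, likelihood ratios), here is a small sketch with made-up 2x2 counts:

```python
def test_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)               # P(test positive | disease)
    specificity = tn / (tn + fp)               # P(test negative | no disease)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    lr_neg = (1 - sensitivity) / specificity   # negative likelihood ratio
    return sensitivity, specificity, accuracy, lr_pos, lr_neg

print(test_metrics(tp=90, fp=20, fn=10, tn=180))  # (0.9, 0.9, 0.9, 9.0, 0.111...)
```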

  17. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder on how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens’ access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding of official statistics.

  18. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  19. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  20. Statistical finite element analysis.

    Science.gov (United States)

    Khalaji, Iman; Rahemifar, Kaamran; Samani, Abbas

    2008-01-01

    A novel technique is introduced for tissue deformation and stress analysis. Compared to the conventional Finite Element method, this technique is orders of magnitude faster and yet still very accurate. The proposed technique uses preprocessed data obtained from FE analyses of a number of similar objects in a Statistical Shape Model framework as described below. This technique takes advantage of the fact that the body organs have limited variability, especially in terms of their geometry. As such, it is well suited for calculating tissue displacements of body organs. The proposed technique can be applied in many biomedical applications such as image guided surgery, or virtual reality environment development where tissue behavior is simulated for training purposes.
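
    A rough sketch of the underlying idea, under the assumption that precomputed FE displacement fields can be compressed with PCA and a new field approximated in the reduced basis; the random data and the `approximate` helper are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
U = rng.normal(size=(50, 3000))  # 50 precomputed FE displacement fields (hypothetical)

mean_u = U.mean(axis=0)
_, s, Vt = np.linalg.svd(U - mean_u, full_matrices=False)  # PCA via SVD
basis = Vt[:10]                                            # keep 10 principal modes

def approximate(known_idx, known_vals):
    """Least-squares fit of mode weights from a few known displacements."""
    A = basis[:, known_idx].T
    w, *_ = np.linalg.lstsq(A, known_vals - mean_u[known_idx], rcond=None)
    return mean_u + w @ basis  # full-field estimate, no new FE solve needed

idx = [0, 1, 2]
estimate = approximate(idx, U[0, idx])  # reconstruct a field from three sampled points
```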

  1. Statistically tuned Gaussian background subtraction technique for ...

    Indian Academy of Sciences (India)

    The non-parametric background modelling approach proposed by Martin Hofmann et al (2012) involves modelling of foreground by the history of recently ... background subtraction system with mixture of Gaussians, deviation scaling factor and max–min background model for outdoor environment. Selection of detection ...
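
    A simplified per-pixel running-Gaussian background model, sketching the general technique rather than the paper's exact system (its deviation scaling factor and max–min model are omitted):

```python
import numpy as np

class GaussianBackground:
    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mu = first_frame.astype(float)       # per-pixel background mean
        self.var = np.full_like(self.mu, 15.0 ** 2)
        self.alpha, self.k = alpha, k             # learning rate, deviation threshold

    def apply(self, frame):
        frame = frame.astype(float)
        d2 = (frame - self.mu) ** 2
        foreground = d2 > (self.k ** 2) * self.var
        bg = ~foreground                          # update statistics only on background pixels
        self.mu[bg] += self.alpha * (frame[bg] - self.mu[bg])
        self.var[bg] += self.alpha * (d2[bg] - self.var[bg])
        return foreground

bg = GaussianBackground(np.zeros((4, 4)))
mask = bg.apply(np.full((4, 4), 40.0))  # large deviation -> flagged as foreground
```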

  2. Statistics techniques applied to electron probe microanalysis

    International Nuclear Information System (INIS)

    Brizuela, H.; Del Giorgio, M.; Budde, C.; Briozzo, C.; Riveros, J.

    1987-01-01

    A description of Montroll-West's general theory for a three-dimensional random walk of a particle with internal degrees of freedom is given, connecting this problem with the master equation solution. The possibility of its application to EPMA is discussed. Numerical solutions are given for thick or collimated beams at several energies interacting with samples of different shape and size. The spatial distribution of particles within the sample (for a stationary state) is analyzed, as well as the electron backscattering coefficient. (Author) [es
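
    The backscattering coefficient mentioned at the end can be estimated by far cruder means than the Montroll-West formalism; the toy Monte Carlo below, with an invented isotropic step model, only illustrates the kind of quantity involved.

```python
import numpy as np

rng = np.random.default_rng(7)
n_electrons, n_steps, step = 10_000, 200, 1.0
backscattered = 0
for _ in range(n_electrons):
    z = step                                 # electron enters one step-length deep
    for _ in range(n_steps):
        z += step * rng.uniform(-1.0, 1.0)   # crude isotropic z-displacement per collision
        if z < 0.0:                          # electron re-crosses the surface
            backscattered += 1
            break
print("toy backscatter coefficient:", backscattered / n_electrons)
```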

  3. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensible for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  4. Juice blends--a way of utilization of under-utilized fruits, vegetables, and spices: a review.

    Science.gov (United States)

    Bhardwaj, Raju Lal; Pandey, Shruti

    2011-07-01

    The post-harvest shelf life of most fruits and vegetables is very limited due to their perishable nature. In India more than 20-25 percent of fruits and vegetables are spoiled before utilization. Despite being the world's second largest producer of fruits and vegetables, India processes only 1.5 percent of the total fruits and vegetables it produces. Many fruit and vegetable juices turn bitter after extraction due to the conversion of chemical compounds. The utilization of several highly nutritive fruits and vegetables thus remains very limited, owing to high acidity, astringency, bitterness, and other factors. Blending can improve the flavor, palatability, and nutritive and medicinal value of juices from various fruits such as aonla, mango, papaya, pineapple, citrus, ber, pear, apple, and watermelon; vegetables including bottle gourd, carrot, beet root, and bitter gourd; medicinal plants like aloe vera; and spices. All these natural products are valued very highly for their refreshing juice, nutritional value, pleasant flavor, and medicinal properties. Fruits and vegetables are also a rich source of sugars, vitamins, and minerals. However, some fruits and vegetables have an off flavor and bitterness although they are an excellent source of vitamins, enzymes, and minerals. Therefore, blending of two or more fruit and vegetable juices with spice extracts for the preparation of nutritive ready-to-serve (RTS) beverages is thought to be a convenient and economic alternative for utilization of these fruits and vegetables. Moreover, one could think of new product development through blending, in the form of a natural health drink which may also serve as an appetizer. The present review focuses on the blending of fruits, under-utilized fruits, vegetables, medicinal plants, and spices in appropriate proportions for the preparation of natural fruit- and vegetable-based nutritive beverages.

  5. ACE inhibitor and angiotensin II type 1 receptor antagonist therapies in elderly patients with diabetes mellitus: are they underutilized?

    Science.gov (United States)

    Pappoe, Lamioko Shika; Winkelmayer, Wolfgang C

    2010-02-01

    Diabetes mellitus is highly prevalent in older adults in the industrialized world. These patients are at high risk of complications from diabetes, including diabetic kidney disease. ACE inhibitors and their newer cousins, angiotensin II type 1 receptor antagonists (angiotensin receptor blockers [ARBs]), are powerful medications for the prevention of progression of diabetic renal disease. Unfortunately, among the elderly, these medications have been underutilized. The reasons for this include physician concerns regarding patient age and limited life expectancy and potential complications of ACE inhibitor or ARB use, specifically an increase in creatinine levels and hyperkalaemia. As discussed in this article, there have been several studies that show that the effects of inhibition of the renin-angiotensin system can be beneficial for the treatment of cardiovascular disease and renal disease among elderly patients with diabetes and that the potential risks mentioned above are no greater in this group than in the general population. For these reasons, several professional societies recommend that elderly patients with diabetes and hypertension (systolic blood pressure ≥140 mmHg or diastolic blood pressure ≥90 mmHg) be treated with an ACE inhibitor or ARB (as is recommended for younger diabetics). Use of ACE inhibitors or ARBs is also recommended for those with cardiovascular disease or those who are at risk of cardiovascular disease. Furthermore, in the management of diabetic kidney disease in elderly patients, treatment with ACE inhibitors or ARBs is also recommended to reduce the risk or slow the progression of nephropathy. Renal function and potassium levels should be monitored within the first 12 weeks of initiation of these medications, with each dose increase, and on a yearly basis thereafter. This article summarizes the current guidelines on the use of ACE inhibitors and ARBs in older adults with diabetes, reviews the evidence for their use in the elderly

  6. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  7. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  8. Advanced Techniques for Monitoring, Simulation and Optimization of Machining Processes

    OpenAIRE

    Keshari, Anupam

    2011-01-01

    In today’s manufacturing industry, the pressure for productivity, higher quality, and cost saving is heavier than ever. Surviving in today’s highly competitive world is not an easy task: continual technology updates and heavy investments in state-of-the-art machinery and modern cutting tool systems are needed. If machining resources are underutilized, feasible techniques are needed to utilize those resources efficiently. The new enhancements in the machine tools sector have enabled opportunit...

  9. Statistical mechanics of learning

    CERN Document Server

    Engel, Andreas

    2001-01-01

    The effort to build machines that are able to learn and undertake tasks such as datamining, image processing and pattern recognition has led to the development of artificial neural networks in which learning from examples may be described and understood. The contribution to this subject made over the past decade by researchers applying the techniques of statistical mechanics is the subject of this book. The authors provide a coherent account of various important concepts and techniques that are currently only found scattered in papers, supplement this with background material in mathematics and physics, and include many examples and exercises.

  10. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  11. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...

  12. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests on the assumption of randomness and normality, and provides nonparametric methods when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal

  13. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
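
    For instance, weighted linear least squares in matrix notation, with the variance-covariance matrix giving the parameter standard errors, can be written out directly (a generic sketch on synthetic data, not the article's code):

```python
import numpy as np

x = np.linspace(0, 10, 20)
y = 2.0 + 0.5 * x + np.random.default_rng(3).normal(0, 0.2, x.size)
sigma = np.full_like(x, 0.2)               # known measurement uncertainties

X = np.column_stack([np.ones_like(x), x])  # design matrix for y = a + b*x
W = np.diag(1.0 / sigma ** 2)              # weights = 1 / variance
cov = np.linalg.inv(X.T @ W @ X)           # variance-covariance matrix of [a, b]
beta = cov @ X.T @ W @ y                   # weighted least-squares estimates
print("parameters:", beta, "standard errors:", np.sqrt(np.diag(cov)))
```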

  14. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  15. Optimization of solvent mixtures for extraction from bark of Schinus terebinthifolius by a statistical mixture-design technique and development of a UV-Vis spectrophotometric method for analysis of total polyphenols in the extract

    Directory of Open Access Journals (Sweden)

    Maria Cristina DiCiaula

    2014-01-01

    Full Text Available A statistical mixture-design technique was used to study the effects of different solvents and their mixtures on the yield, total polyphenol content, and antioxidant capacity of the crude extracts from the bark of Schinus terebinthifolius Raddi (Anacardiaceae). The experimental results and their response-surface models showed that ternary mixtures with equal portions of all the three solvents (water, ethanol and acetone) were better than the binary mixtures in generating crude extracts with the highest yield (22.04 ± 0.48%), total polyphenol content (29.39 ± 0.39%), and antioxidant capacity (6.38 ± 0.21). An analytical method was developed and validated for the determination of total polyphenols in the extracts. Optimal conditions for the various parameters in this analytical method, namely, the time for the chromophoric reaction to stabilize, wavelength of the absorption maxima to be monitored, the reference standard and the concentration of sodium carbonate were determined to be 5 min, 780 nm, pyrogallol, and 14.06% w v-1, respectively. UV-Vis spectrophotometric monitoring of the reaction under these conditions proved the method to be linear, specific, precise, accurate, reproducible, robust, and easy to perform.
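
    A hedged sketch of fitting a quadratic Scheffé polynomial, the usual response-surface model for a three-component mixture design; the proportions and responses below are invented, not the paper's data.

```python
import numpy as np

# columns: water, ethanol, acetone proportions (each row sums to 1)
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [.5, .5, 0], [.5, 0, .5], [0, .5, .5],
              [1/3, 1/3, 1/3]])
y = np.array([10.2, 14.1, 12.8, 17.0, 16.2, 18.1, 22.0])  # e.g., extraction yield (%)

x1, x2, x3 = X.T
D = np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])  # Scheffe quadratic terms
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print(coef)  # positive interaction coefficients indicate solvent synergy
```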

  16. Engaging with the Art & Science of Statistics

    Science.gov (United States)

    Peters, Susan A.

    2010-01-01

    How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…

  17. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Science.gov (United States)

    2012-01-01

    Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results The instruments under study provide excellent

  18. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Directory of Open Access Journals (Sweden)

    Adrion Christine

    2012-09-01

    Full Text Available Abstract Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results The instruments under study
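
    A minimal sketch of the PIT idea used above for model assessment: with leave-one-out predictive CDFs F_i, the values F_i(y_i) should look roughly uniform if the model is adequate. A plain normal model on synthetic data stands in here for the GLMM/INLA machinery.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
y = rng.normal(5, 2, 200)

pit = []
for i in range(len(y)):
    rest = np.delete(y, i)  # crude leave-one-out refit of the normal model
    pit.append(stats.norm.cdf(y[i], rest.mean(), rest.std(ddof=1)))

# A flat histogram supports calibration; U-shapes or humps point to
# under- or over-dispersion of the predictive distributions.
counts, _ = np.histogram(pit, bins=10, range=(0, 1))
print(counts)
```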

  19. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...

  1. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  2. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  3. Statistical methods for ranking data

    CERN Document Server

    Alvo, Mayer

    2014-01-01

    This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
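
    As an illustration of defining rank correlation through a distance function, the sketch below (our example, not the book's code) computes the Kendall distance, i.e. the number of pairwise disagreements, and the corresponding tau coefficient.

```python
from itertools import combinations

def kendall_distance(r1, r2):
    """Number of item pairs on which the two rankings disagree."""
    return sum(1 for i, j in combinations(range(len(r1)), 2)
               if (r1[i] - r1[j]) * (r2[i] - r2[j]) < 0)

def kendall_tau(r1, r2):
    n = len(r1)
    return 1 - 4 * kendall_distance(r1, r2) / (n * (n - 1))

print(kendall_tau([1, 2, 3, 4], [1, 3, 2, 4]))  # 0.666..., one discordant pair
```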

  4. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  5. A Holistic Approach to Enhance the Use of Neglected and Underutilized Species: The Case of Andean Grains in Bolivia and Peru

    Directory of Open Access Journals (Sweden)

    Stefano Padulosi

    2014-03-01

    Full Text Available The IFAD-NUS project, implemented over the course of a decade in two phases, represents the first UN-supported global effort on neglected and underutilized species (NUS). This initiative deployed and tested a holistic and innovative value chain framework using multi-stakeholder, participatory, inter-disciplinary, pro-poor, gender- and nutrition-sensitive approaches. The project has been linking aspects often dealt with separately by R&D, such as genetic diversity, selection, cultivation, harvest, value addition, marketing, and final use, with the goal to contribute to conservation, better incomes, improved nutrition, and strengthened livelihood resilience. The project contributed to the greater conservation of Andean grains and their associated indigenous knowledge, through promoting wider use of their diversity by value chain actors, adoption of best cultivation practices, development of improved varieties, dissemination of high quality seed, and capacity development. Reduced drudgery in harvest and postharvest operations, and increased food safety were achieved through technological innovations. Development of innovative food products and inclusion of Andean grains in school meal programs is projected to have had a positive nutrition outcome for targeted communities. Increased income was recorded for all value chain actors, along with strengthened networking skills and self-reliance in marketing. The holistic approach taken in this study is advocated as an effective strategy to enhance the use of other neglected and underutilized species for conservation and livelihood benefits.

  6. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, depending on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins, which are expected to avoid DNB
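
    The propagation of input uncertainties through a response surface can be sketched generically with Monte Carlo sampling; the placeholder surface and distributions below are illustrative assumptions, not the report's model.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
flow    = rng.normal(1.0, 0.03, n)  # normalized inputs with assumed uncertainties
power   = rng.normal(1.0, 0.02, n)
inlet_t = rng.normal(1.0, 0.01, n)

# placeholder response surface for minimum DNBR (not the report's LYNX-based model)
min_dnbr = 2.0 * flow ** 0.8 / (power ** 1.2 * inlet_t ** 0.5)

limit = 1.3
print("P(min DNBR > limit):", (min_dnbr > limit).mean())
```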

  7. Computer intensive statistical methods

    Science.gov (United States)

    Yakowitz, S.

    The special session “Computer-Intensive Statistical Methods” was held in morning and afternoon parts at the 1985 AGU Fall Meeting in San Francisco, Calif. Its mission was to provide a forum for hydrologists and statisticians who are active in bringing unconventional, algorithmic-oriented statistical techniques to bear on problems of hydrology. Statistician Emanuel Parzen (Texas A&M University, College Station, Tex.) opened the session by relating recent developments in quantile estimation methods and showing how properties of such methods can be used to advantage to categorize runoff data previously analyzed by I. Rodriguez-Iturbe (Universidad Simon Bolivar, Caracas, Venezuela). Statistician Eugene Schuster (University of Texas, El Paso) discussed recent developments in nonparametric density estimation which enlarge the framework for convenient incorporation of prior and ancillary information. These extensions were motivated by peak annual flow analysis. Mathematician D. Myers (University of Arizona, Tucson) gave a brief overview of “kriging” and outlined some recently developed methodology.
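
    As one concrete computer-intensive example in the spirit of the session, a Gaussian kernel density estimate written out directly on synthetic data:

```python
import numpy as np

def kde(grid, data, bandwidth):
    """Gaussian kernel density estimate evaluated on a grid."""
    u = (grid[:, None] - data[None, :]) / bandwidth
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * bandwidth * np.sqrt(2 * np.pi))

data = np.random.default_rng(5).normal(0, 1, 300)
grid = np.linspace(-4, 4, 81)
density = kde(grid, data, bandwidth=0.4)  # smooth estimate of the underlying density
```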

  8. Morphological Analysis for Statistical Machine Translation

    National Research Council Canada - National Science Library

    Lee, Young-Suk

    2004-01-01

    We present a novel morphological analysis technique which induces a morphological and syntactic symmetry between two languages with highly asymmetrical morphological structures to improve statistical...

  9. Identification of heavy metal sources in the Mexico City atmosphere using the proton-induced X-ray (PIXE) analytical technique and multifactorial statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez M, B. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)

    1997-07-01

    The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of these polluting chemical elements over an annual cycle (1990), based on their concentrations as obtained through the PIXE technique; to identify suitable statistical methods to apply to the metal concentration data, measured as total suspended particles (PST), obtained in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to support the decision making and control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author) [es

  10. Introductory statistics for engineering experimentation

    CERN Document Server

    Nelson, Peter R; Coffin, Marie

    2003-01-01

    The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. In developing a one-semester course whose purpose was to introduce engineering/scientific students to the most useful statistical methods, this book was created to satisfy those needs. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...

  11. On quantum statistical inference

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics....... Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...

  12. Ultra-low gossypol cottonseed: gene-silencing opens up a vast, but underutilized protein resource for human nutrition

    Science.gov (United States)

    Cotton, grown mainly for its fiber, is a major crop in several developing and developed countries across the globe. In 2012, 48.8 million metric tons (MMT) of cottonseed was produced worldwide as a by-product of the 25.9 MMT of cotton lint production (FAO Production Statistics). This amount of cot...

  13. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data are misinterpreted or not used to their fullest extent. Using the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview details the features of JMP® and how they were used to advance a project, resulting in time and cost savings as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold-coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  14. A primer of multivariate statistics

    CERN Document Server

    Harris, Richard J

    2014-01-01

    Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why

  15. MQSA National Statistics

    Science.gov (United States)

    National statistics for the Mammography Quality Standards Act (MQSA) program, with archived scorecard statistics for 2016, 2017 and 2018.

  16. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference, by Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  18. Statistics for dental researchers: descriptive statistics

    OpenAIRE

    Mohammad Reza Baneshi PhD; Amir Reza Ghassemi DDS; Arash Shahravan DDS, MS

    2012-01-01

    Descriptive statistics is the process of summarizing raw data gathered in a study and producing useful statistics that help in better understanding the data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, while mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics used for quantitative data....

  19. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    existence in July 2006, is mandated, among its functions, to exercise statistical coordination between Ministries, Departments and other agencies of the Central government; ... between the Directorate General of Commercial Intelligence and Statistics ... in some states do not play a nodal role in the coordination of statistical ...

  20. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    Advanced Institute of Maths, Stats and Computer Science, UoH Campus, Hyderabad. His research interests include theory and practice of sample surveys ... other agencies of the Central government; and to exercise statistical audit over the statistical activities to ensure quality and integrity of the statistical products.

  1. Statistics for dental researchers: descriptive statistics

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Baneshi PhD

    2012-09-01

    Full Text Available Descriptive statistics is the process of summarizing raw data gathered in a study and producing useful statistics that help in better understanding the data. According to the types of variables, which consist of qualitative and quantitative variables, some descriptive statistics have been introduced. Frequency percentage is used for qualitative data, while mean, median, mode, standard deviation, standard error, variance, and range are some of the statistics used for quantitative data. In the health sciences, the majority of continuous variables follow a normal distribution. Skewness and kurtosis are two statistics which help to compare a given distribution with the normal distribution.
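
    As a quick illustration of the statistics listed in this record, the following sketch computes each of them on a made-up sample (Python; any numeric vector would do):

        import numpy as np
        from scipy import stats

        x = np.array([5.1, 4.8, 5.6, 5.0, 4.9, 5.3, 5.2, 4.7, 5.4, 5.0])  # illustrative

        print("mean:", np.mean(x), "median:", np.median(x))
        print("mode:", stats.mode(x, keepdims=False).mode)   # SciPy >= 1.9
        print("sample std:", np.std(x, ddof=1), "variance:", np.var(x, ddof=1))
        print("standard error:", stats.sem(x), "range:", np.ptp(x))
        # Skewness and excess kurtosis are both near 0 for normally
        # distributed data, which is the comparison the record describes.
        print("skewness:", stats.skew(x), "kurtosis:", stats.kurtosis(x))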

  2. Intermediate statistics a modern approach

    CERN Document Server

    Stevens, James P

    2007-01-01

    Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.

  3. Statistical validation of stochastic models

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, N.F. [Los Alamos National Lab., NM (United States). Engineering Science and Analysis Div.; Barney, P.; Paez, T.L. [Sandia National Labs., Albuquerque, NM (United States). Experimental Structural Dynamics Dept.; Ferregut, C.; Perez, L. [Univ. of Texas, El Paso, TX (United States). Dept. of Civil Engineering

    1996-12-31

    It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
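
    A bare-bones percentile bootstrap of the kind the paper builds on can be sketched as follows (Python; the data and the statistic are placeholders, not the authors' structural-dynamics measures):

        import numpy as np

        def bootstrap_ci(data, statistic, n_boot=5000, alpha=0.05, seed=1):
            # Percentile bootstrap confidence interval for an arbitrary statistic.
            rng = np.random.default_rng(seed)
            reps = np.array([statistic(rng.choice(data, size=len(data), replace=True))
                             for _ in range(n_boot)])
            return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

        # Skewed, non-Gaussian data -- the setting the bootstrap is meant for.
        sample = np.random.default_rng(0).exponential(scale=2.0, size=50)
        print("95% CI for the median:", bootstrap_ci(sample, np.median))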

  4. Algebraic statistics computational commutative algebra in statistics

    CERN Document Server

    Pistone, Giovanni; Wynn, Henry P

    2000-01-01

    Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model.As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

  5. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Statistics on adrenal gland tumors, approved by the Cancer.Net Editorial Board. A primary adrenal gland tumor is very uncommon; exact statistics are not available for this type of tumor ...

  6. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Statistics on neuroendocrine tumors, approved by the Cancer.Net Editorial Board. It is important to remember that statistics on the survival rates for people with a ...

  7. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District of Col...

  8. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  9. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  10. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood cancer statistics: graphs and infographics covering the number of diagnoses, incidence rates over time, cancer deaths per year, and the 5-year survival rate. Each year, the ...

  11. Underutilized Luffa cylindrica sponge: A local bio-adsorbent for the removal of Pb(II pollutant from water system

    Directory of Open Access Journals (Sweden)

    Adewale Adewuyi

    2017-06-01

    Full Text Available Biosorption of Pb2+ ions from aqueous solution onto Luffa cylindrica sponge as adsorbent (LCSA) was investigated in a batch adsorption system. LCSA was characterized by X-ray diffraction (XRD), Scanning Electron Microscopy (SEM) coupled with energy dispersive spectroscopy (EDS), Fourier Transform Infrared spectrometry (FTIR), particle size dispersion, zeta potential, thermogravimetric analysis (TGA), and a Brunauer-Emmett-Teller (BET) surface area analyzer. The sorption of Pb2+ ions by LCSA was subjected to equilibrium, thermodynamic and kinetic studies, carried out by considering the effects of pH, initial metal ion concentration, contact time and temperature. The BET surface area of LCSA was 6.00 m2/g with a mean distribution size of 0.02 µm, while the zeta potential was found to increase as the pH increased from 4 to 14. The findings revealed a maximum equilibrium adsorption capacity of 75.853 mg/g. The process is chemisorptive and controlled by intra-particle diffusion. The values of thermodynamic parameters such as ΔG°, ΔH° and ΔS° showed a stable adsorbent-adsorbate (LCSA-Pb) configuration which is exothermic. The adsorption capacity of LCSA compares favorably with some natural biosorbents found in the literature.
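
    The abstract quotes ΔG°, ΔH° and ΔS° without the working equations; a standard route (assumed here, not spelled out in the record) is the van't Hoff relation ln K = -ΔH°/(RT) + ΔS°/R fitted across temperatures, with ΔG° = ΔH° - TΔS°:

        import numpy as np

        R = 8.314                                 # J mol^-1 K^-1
        T = np.array([298.0, 308.0, 318.0])       # K, illustrative temperatures
        K = np.array([12.0, 9.5, 7.8])            # made-up equilibrium constants

        # ln K vs 1/T is linear: slope = -dH/R, intercept = dS/R.
        slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)
        dH = -slope * R                           # < 0 here: exothermic, as reported
        dS = intercept * R
        dG = dH - T * dS                          # Gibbs energy at each temperature
        print(f"dH = {dH:.0f} J/mol, dS = {dS:.1f} J/(mol K), dG = {np.round(dG)}")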

  12. Market-Based Instruments for the Conservation of Underutilized Crops: In-Store Experimental Auction of Native Chili Products in Bolivia

    Directory of Open Access Journals (Sweden)

    Jaqueline Garcia-Yi

    2014-11-01

    Full Text Available Native chilies (Capsicum spp.) are currently underutilized in Bolivia, one of this crop's centers of diversity. Fewer local farmers cultivate native chilies annually due to low market demand. Increasing the crop's private use value can lead to its in-situ conservation. The objective of the paper is to evaluate the market acceptability of three native chili products: (a) chili marmalade; (b) chili cooking paste; and (c) pickled chilies. Multi-product Becker-DeGroot-Marschak experimental auctions and hedonic tests were conducted with 337 participants in La Paz and Santa Cruz. Data were analyzed using seemingly unrelated regressions. Results suggest that consumers are willing to pay price premiums of about 25-50 percent. Statements about biodiversity conservation and improvements in farmers' quality of life would influence not first purchase decisions but rather repurchase decisions, and therefore consumers' product loyalty. This in turn could lead to sustainable agro-biodiversity conservation, centered on consumers' purchase of these products over time.

  13. A dual-purpose silver nanoparticles biosynthesized using aqueous leaf extract of Detarium microcarpum: An under-utilized species.

    Science.gov (United States)

    Labulo, Ayomide H; Adesuji, Elijah T; Dedeke, Oyinade A; Bodede, Olusola S; Oseghale, Charles O; Moodley, Roshila; Nyamori, Vincent O; Dare, Enock O; Adegoke, Olajire A

    2016-11-01

    The need for green synthesis of emerging industrial materials has led to the biosynthesis of nanoparticles from plants to circumvent the adverse by-products of chemical synthesis. In this study, the leaf extract of Detarium microcarpum Guill & Perr, a small tree belonging to the family Fabaceae (Legume), was used to synthesize silver nanoparticles (DAgNPs). DAgNPs were characterized using spectroscopic techniques (Ultraviolet-Visible spectroscopy and Fourier Transform Infrared spectroscopy), which showed hydroxyl and carbonyl functional groups to be responsible for their synthesis. DAgNPs were observed to be crystalline and spherical, with an average size of 17.05 nm as determined by transmission electron microscopy (TEM). The antioxidant activity of DAgNPs ranked from moderate to good. The ability of DAgNPs to sense Hg2+ and Fe3+ ions in aqueous medium was also investigated. The quenching of the SPR peak at 430 nm was used to monitor these toxic and heavy metal ions, with linear ranges of 20-70 µg/mL and 10-40 µg/mL for Hg2+ and Fe3+, respectively. The limit of detection (LOD) and limit of quantification (LOQ) were 2.05 µg/mL and 6.21 µg/mL for Hg2+, and 5.01 µg/mL and 15.21 µg/mL for Fe3+. The intra- and inter-day assessments of accuracy and repeatability gave relative errors of less than 1% in all instances. DAgNPs can therefore provide a convenient method for sensing these toxic metals.
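
    The record reports LOD and LOQ values without the formula; a common convention (an assumption here, not a statement from the paper) takes LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation of the calibration fit and S its slope:

        import numpy as np

        # Hypothetical calibration: ion concentration (ug/mL) vs. SPR quenching.
        conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
        signal = np.array([0.11, 0.21, 0.32, 0.40, 0.52])

        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        sigma = resid.std(ddof=2)          # 2 fitted parameters -> n-2 dof

        print(f"LOD = {3.3 * sigma / slope:.2f} ug/mL")
        print(f"LOQ = {10.0 * sigma / slope:.2f} ug/mL")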

  14. Generalized quantum statistics

    International Nuclear Information System (INIS)

    Chou, C.

    1992-01-01

    In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics

  15. Statistical Analysis by Statistical Physics Model for the STOCK Markets

    Science.gov (United States)

    Wang, Tiansong; Wang, Jun; Fan, Bingli

    A new stochastic stock price model of stock markets based on the contact process of statistical physics systems is presented in this paper. The contact model is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection. Through this model, the statistical properties of the Shanghai Stock Exchange (SSE) and the Shenzhen Stock Exchange (SZSE) are studied. In the present paper, the data of the SSE Composite Index and of the SZSE Component Index are analyzed, and the corresponding simulation is made by computer computation. Further, we investigate the statistical properties, fat-tail phenomena, power-law distributions, and long memory of returns for these indices. The techniques of the skewness-kurtosis test, the Kolmogorov-Smirnov test, and R/S analysis are applied to study the fluctuation characteristics of the stock price returns.
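
    Of the three techniques named, rescaled-range (R/S) analysis is the least routine; a compact sketch of the Hurst-exponent estimate it yields (Python, run on synthetic returns rather than SSE/SZSE data) follows:

        import numpy as np

        def hurst_rs(returns, window_sizes=(8, 16, 32, 64, 128)):
            # Estimate the Hurst exponent H from the scaling of R/S with window size.
            rs_means = []
            for w in window_sizes:
                rs_vals = []
                for i in range(len(returns) // w):
                    chunk = returns[i * w:(i + 1) * w]
                    dev = np.cumsum(chunk - chunk.mean())
                    s = chunk.std(ddof=1)
                    if s > 0:
                        rs_vals.append((dev.max() - dev.min()) / s)
                rs_means.append(np.mean(rs_vals))
            # log(R/S) ~ H * log(w): the slope of the log-log fit estimates H.
            return np.polyfit(np.log(window_sizes), np.log(rs_means), 1)[0]

        returns = np.random.default_rng(0).normal(size=4096)
        print("Hurst exponent:", round(hurst_rs(returns), 3))  # ~0.5 for i.i.d. data

    Long memory in real return series would show up as an H estimate well above 0.5.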

  16. Telling the truth with statistics

    CERN Multimedia

    CERN. Geneva; CERN. Geneva. Audiovisual Unit

    2002-01-01

    This course of lectures will cover probability, distributions, fitting, errors and confidence levels, for practising High Energy Physicists who need to use statistical techniques to express their results. Concentrating on these appropriate specialist techniques means that they can be covered in appropriate depth, while assuming only the knowledge and experience of a typical Particle Physicist. The different definitions of probability will be explained, and it will become apparent why this basic subject is so controversial; there are several viewpoints, and it is important to understand them all, rather than abusing the adherents of different beliefs. Distributions will be covered: the situations they arise in, their useful properties, and the amazing result of the Central Limit Theorem. Fitting a parametrisation to a set of data is one of the most widespread uses of statistics: there are lots of ways of doing this, and these will be presented, with discussion of which is appropriate in different circumstances. This t...

  17. Understanding search trees via statistical physics

    Indian Academy of Sciences (India)

    the m-ary search tree model (where m stands for the number of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.

  18. Data Mining: Going beyond Traditional Statistics

    Science.gov (United States)

    Zhao, Chun-Mei; Luan, Jing

    2006-01-01

    The authors provide an overview of data mining, giving special attention to the relationship between data mining and statistics to unravel some misunderstandings about the two techniques. (Contains 1 figure.)

  19. Nickel speciation in several serpentine (ultramafic) topsoils via bulk synchrotron-based techniques

    Energy Technology Data Exchange (ETDEWEB)

    Siebecker, Matthew G.; Chaney, Rufus L.; Sparks, Donald L.

    2017-07-01

    Serpentine soils have elevated concentrations of trace metals, including nickel, cobalt, and chromium, compared to non-serpentine soils. Identifying the nickel-bearing minerals allows for prediction of the potential mobility of nickel. Synchrotron-based techniques can identify the solid-phase chemical forms of nickel with minimal sample treatment. Element concentrations are known to vary among soil particle sizes in serpentine soils, and sonication is a useful method to physically disperse sand, silt and clay particles. Synchrotron-based techniques and sonication were employed to identify nickel species in discrete particle size fractions in several serpentine (ultramafic) topsoils to better understand solid-phase nickel geochemistry. Nickel commonly resided in primary serpentine parent material such as layered phyllosilicate and chain-inosilicate minerals and was associated with iron oxides. In the clay fractions, nickel was associated with iron oxides and primary serpentine minerals, such as lizardite. Linear combination fitting (LCF) was used to characterize nickel species. Total metal concentration did not correlate with nickel speciation and is not an indicator of the major nickel species in the soil. Differences in soil texture were related to different nickel speciation for several particle-size-fractionated samples. A discussion on LCF illustrates the importance of choosing standards based not only on statistical methods such as Target Transformation but also on sample mineralogy and particle size. Results from the F-test (Hamilton test), an underutilized tool in the LCF literature for soils, highlight its usefulness in determining the appropriate number of standards to use for LCF. EXAFS shell fitting illustrates that the destructive interference commonly found between light and heavy elements in layered double hydroxides and phyllosilicates can also occur in inosilicate minerals, causing similar structural features and leading to false positive results in ...
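
    As a rough illustration of the LCF-plus-F-test workflow described here (Python; synthetic Gaussian "spectra" stand in for the nickel standards, and the F-test form below is one common Hamilton-style variant, not necessarily the authors' exact implementation):

        import numpy as np
        from scipy.optimize import nnls
        from scipy.stats import f as f_dist

        # Columns of `standards` = reference spectra on a common grid (made up).
        grid = np.linspace(0.0, 1.0, 200)
        standards = np.column_stack([np.exp(-(grid - c) ** 2 / 0.02)
                                     for c in (0.3, 0.5, 0.7)])
        rng = np.random.default_rng(0)
        sample = (0.6 * standards[:, 0] + 0.4 * standards[:, 1]
                  + rng.normal(0.0, 0.01, grid.size))

        def lcf(A, b):
            # Non-negative linear combination fit; returns weights, residual SS.
            w, _ = nnls(A, b)
            return w, np.sum((b - A @ w) ** 2)

        w2, rss2 = lcf(standards[:, :2], sample)   # fit with 2 standards
        w3, rss3 = lcf(standards, sample)          # nested fit with 3 standards

        # Does the extra standard improve the fit more than chance would?
        n, p2, p3 = grid.size, 2, 3
        F = ((rss2 - rss3) / (p3 - p2)) / (rss3 / (n - p3))
        print("weights:", np.round(w3 / w3.sum(), 3),
              "p =", 1.0 - f_dist.cdf(F, p3 - p2, n - p3))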

  20. On Quantum Statistical Inference, II

    OpenAIRE

    Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P. E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes and inte...

  1. Practical Statistics for Particle Physicists

    CERN Document Server

    Lista, Luca

    2017-01-01

    These three lectures provide an introduction to the main concepts of statistical data analysis useful for precision measurements and searches for new signals in High Energy Physics. The frequentist and Bayesian approaches to probability theory will be introduced and, for both approaches, inference methods will be presented. Hypothesis tests will be discussed, then significance and upper limit evaluation will be presented with an overview of the modern and most advanced techniques adopted for data analysis at the Large Hadron Collider.

  2. Applied statistics for social and management sciences

    CERN Document Server

    Miah, Abdul Quader

    2016-01-01

    This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.

  3. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata during 12-16 January 2015. The book explores the applicability of robust methods in non-traditional areas, which include the use of new techniques such as skew and mixture-of-skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  4. Contributions to sampling statistics

    CERN Document Server

    Conti, Pier; Ranalli, Maria

    2014-01-01

    This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...

  5. Understanding search trees via statistical physics

    Indian Academy of Sciences (India)

    Other applications of statistical physics (networks, traffic flows, algorithmic problems, econophysics, astrophysical applications, etc.) ... the m-ary search tree model (where m stands for the number of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.

  6. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  7. Statistical and thermal physics with computer applications

    CERN Document Server

    Gould, Harvey

    2010-01-01

    This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the

  8. National transportation statistics 2010

    Science.gov (United States)

    2010-01-01

    National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...

  9. National transportation statistics 2011

    Science.gov (United States)

    2011-04-01

    Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...

  10. Mental Illness Statistics

    Science.gov (United States)

    Research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics topics: Mental Illness, Any Anxiety Disorder ...

  11. Blood Facts and Statistics

    Science.gov (United States)

    Facts about blood needs and about American Red Cross Blood Services. Every two seconds someone in the U.S. ...

  12. CMS Program Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...

  13. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...

  14. Developments in Statistical Education.

    Science.gov (United States)

    Kapadia, Ramesh

    1980-01-01

    The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)

  15. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    .... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  16. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  17. Ethics in Statistics

    Science.gov (United States)

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  18. Fisher's Contributions to Statistics

    Indian Academy of Sciences (India)

    T Krishnan received his Ph.D. from the Indian Statistical Institute. He joined the faculty of ISI in 1965 and has been with the Institute ever since. He is at present a professor in the Applied Statistics, Surveys and Computing Division of the Institute. Krishnan's research interests are in Statistical Pattern Recognition ...

  19. Fermi–Dirac Statistics

    Indian Academy of Sciences (India)

    IAS Admin

    Dirac statistics, identical and indistinguishable particles, Fermi gas. ... They obey Fermi–Dirac statistics. In contrast, those with integer spin, such as photons, mesons and 7Li atoms, are called bosons and they obey Bose–Einstein statistics. ... hypothesis (which later was extended as the third law of thermodynamics) was ...

  20. Statistical Methods in Translational Medicine

    Directory of Open Access Journals (Sweden)

    Shein-Chung Chow

    2008-12-01

    Full Text Available This study focuses on strategies and statistical considerations for the assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for the selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.

  1. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente

  2. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  3. Statistics for Research

    CERN Document Server

    Dowdy, Shirley; Chilko, Daniel

    2011-01-01

    Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f

  4. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  5. Statistics for economics

    CERN Document Server

    Naghshpour, Shahdad

    2012-01-01

    Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...

  6. STATISTICAL ANALYSIS OF MONETARY POLICY INDICATORS VARIABILITY

    Directory of Open Access Journals (Sweden)

    ANAMARIA POPESCU

    2016-10-01

    Full Text Available This paper attempts to characterize, through statistical indicators, the statistical data that we have available. The purpose of this paper is to present statistical indicators, primary and secondary, simple and synthetic, that are frequently used for the statistical characterization of statistical series. We can thus analyze central tendency, data variability, and the form and concentration of distributions, using analytical tools in Microsoft Excel that enable automatic calculation of descriptive statistics via the Data Analysis option on the Tools menu. The links which exist between statistical variables can be studied using two techniques, correlation and regression. From the analysis of monetary policy in the period 2003-2014 and the information provided by the website of the National Bank of Romania (BNR), there seems to be a certain tendency towards eccentricity and asymmetry in the financial data series.
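
    The two techniques named, correlation and regression, reduce to a few lines outside Excel as well; a sketch with hypothetical series standing in for the BNR indicators:

        import numpy as np

        # Hypothetical annual series, e.g. a policy rate and an inflation measure.
        x = np.array([5.25, 6.00, 6.50, 6.25, 5.75, 5.00, 4.50, 4.00])
        y = np.array([4.80, 5.90, 6.70, 6.10, 5.40, 4.70, 4.20, 3.90])

        r = np.corrcoef(x, y)[0, 1]                # Pearson correlation
        slope, intercept = np.polyfit(x, y, 1)     # least-squares regression line
        print(f"r = {r:.3f}; y = {slope:.3f}x + {intercept:.3f}")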

  7. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  8. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S

  9. National Statistical Commission and Indian Official Statistics

    Indian Academy of Sciences (India)

    T J Rao, C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS), University of Hyderabad Campus, Central University Post Office, Prof. C. R. Rao Road, Hyderabad 500 046, AP, India. Resonance – Journal of Science Education.

  10. Baseline Statistics of Linked Statistical Data

    NARCIS (Netherlands)

    Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe

    2014-01-01

    We are surrounded by an ever increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: we design data models, develop infrastructures for data sharing, and build tools for data analysis. Statistical datasets curated by National

  11. RNA interference: a promising technique for the improvement of traditional crops.

    Science.gov (United States)

    Katoch, Rajan; Thakur, Neelam

    2013-03-01

    RNA interference (RNAi) is a homology-dependent gene-silencing technology that involves double-stranded RNA directed against a target gene. This technique has emerged as a powerful tool for understanding the functions of a number of genes in recent years. For improving the nutritional status of plants and reducing the level of antinutrients, conventional breeding methods have not been completely successful in achieving tissue-specific regulation of some genes. RNAi has shown successful results in a number of plant species for nutritional improvement, change in morphology and alteration in metabolite synthesis. This technology has been applied mostly in the genetic engineering of important crop plants, and to date there are no reports of its application to the improvement of traditional/underutilized crops. In this study, we discuss current knowledge of RNAi function and concepts, and strategies for the improvement of traditional crops. Practical application: Although RNAi has been extensively used for the improvement of popular crops, no attention has been given to the use of this technology for the improvement of underutilized crops. This study describes the importance of using this technology for the improvement of underutilized crops.

  12. Adaptive RAC codes employing statistical channel evaluation ...

    African Journals Online (AJOL)

    An adaptive encoding technique using row and column array (RAC) codes employing a different number of parity columns that depends on the channel state is proposed in this paper. The trellises of the proposed adaptive codes and a statistical channel evaluation technique employing these trellises are designed and ...

  13. Statistical feature extraction based iris recognition system

    Indian Academy of Sciences (India)

    Atul Bansal

    Iris recognition systems have been proposed by numerous researchers using different feature extraction techniques for accurate and reliable biometric authentication. In this paper, a statistical feature extraction technique based on the correlation between adjacent pixels has been proposed and implemented. Ham...

  14. Dealing with statistics what you need to know

    CERN Document Server

    Brown, Reva Berman

    2007-01-01

    A guide to the essential statistical skills needed for success in assignments, projects or dissertations. It explains why it is impossible to avoid using statistics in analysing data. It also describes the language of statistics to make it easier to understand the various terms used for statistical techniques.

  15. Statistical Methods for Environmental Pollution Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    1987-01-01

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs, and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
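
    The nonparametric trend tests referenced in Chapters 16 and 17 are typically of the Mann-Kendall type; a minimal single-station sketch (Python, with no correction for ties or serial correlation) is:

        import numpy as np
        from scipy.stats import norm

        def mann_kendall(x):
            # Mann-Kendall S statistic with a normal-approximation p-value.
            x = np.asarray(x, dtype=float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            return s, 2.0 * (1.0 - norm.cdf(abs(z)))   # two-sided test

        yearly_means = [2.1, 2.4, 2.2, 2.9, 3.1, 3.0, 3.4, 3.8]  # made-up data
        print(mann_kendall(yearly_means))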

  16. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics.

    Science.gov (United States)

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This "naive" approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment.
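
    One concrete instance of the sufficient-summary-statistic idea is to weight each subject's mean by the inverse of that mean's variance rather than treating all subjects equally; the sketch below (Python, simulated data, a fixed-effect combination that may differ in detail from the authors' procedure) contrasts it with the naive t-test:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Nested data: subjects with unequal sample sizes and variances.
        subjects = [rng.normal(0.3, sd, size=n)
                    for sd, n in zip([0.5, 1.0, 2.0, 0.8, 1.5],
                                     [20, 50, 10, 40, 15])]

        # Naive approach: one mean per subject, group-level one-sample t-test.
        means = np.array([s.mean() for s in subjects])
        _, p_naive = stats.ttest_1samp(means, 0.0)

        # Weighted approach: inverse-variance weights n_i / s_i^2, then a z-test.
        w = np.array([len(s) / s.var(ddof=1) for s in subjects])
        z = (np.sum(w * means) / np.sum(w)) * np.sqrt(np.sum(w))
        p_weighted = 2.0 * (1.0 - stats.norm.cdf(abs(z)))
        print(f"naive p = {p_naive:.4f}, weighted p = {p_weighted:.4f}")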

  18. Improved biomass utilization through the use of nuclear techniques

    International Nuclear Information System (INIS)

    1988-10-01

    Biomass is a major by-product resource of agriculture and food manufacturing, but it is under-utilized as a source of food, fibre, and chemicals. Nuclear techniques provide unique tools for studies of the capabilities of micro-organisms in methane digestor operation and in the transformation of lignocellulosic materials to useful products. Nuclear techniques have also been effectively employed as mutagenic agents in the preparation of more efficient microbial strains for the conversion of biomass. This report reviews the variety and diversity of such applications with focus on the development of microbial processes to utilize agricultural wastes and by-products. The value of nuclear techniques is manifestly demonstrated in the production of efficient microbial mutant strains, in the tracing of metabolic pathways, in the monitoring of lignin degradation and also of fermenter operation. Refs, figs and tabs

  19. Statistical Physics An Introduction

    CERN Document Server

    Yoshioka, Daijiro

    2007-01-01

    This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville’s theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of the statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.

  20. The statistical stability phenomenon

    CERN Document Server

    Gorban, Igor I

    2017-01-01

    This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...

  1. Statistical baseline assessment in cardiotocography.

    Science.gov (United States)

    Agostinelli, Angela; Braccili, Eleonora; Marchegiani, Enrico; Rosati, Riccardo; Sbrollini, Agnese; Burattini, Luca; Morettini, Micaela; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura

    2017-07-01

    Cardiotocography (CTG) is the most common non-invasive diagnostic technique to evaluate fetal well-being. It consists in the recording of fetal heart rate (FHR; bpm) and maternal uterine contractions. Among the main parameters characterizing FHR, the baseline (BL) is fundamental to determine fetal hypoxia and distress. In computerized applications, BL is typically computed as mean FHR ± ΔFHR, with ΔFHR = 8 bpm or ΔFHR = 10 bpm, both values being experimentally fixed. In this context, the present work aims: to propose a statistical procedure for ΔFHR assessment; to quantitatively determine the ΔFHR value by applying such procedure to clinical data; and to compare the statistically-determined ΔFHR value against the experimentally-determined ΔFHR values. To these aims, the 552 recordings of the "CTU-UHB intrapartum CTG database" from Physionet were submitted to an automatic procedure, which consisted of an FHR preprocessing phase and a statistical BL assessment. During preprocessing, FHR time series were divided into 20-min sliding windows, in which missing data were removed by linear interpolation. Only windows with a correction rate lower than 10% were further processed for BL assessment, according to which ΔFHR was computed as the FHR standard deviation. The total number of accepted windows was 1192 (38.5%) over 383 recordings (69.4%) with at least one accepted window. The statistically-determined ΔFHR value was 9.7 bpm. Such value was statistically different from 8 bpm (P < 10^-19) but not from 10 bpm (P = 0.16). Thus, ΔFHR = 10 bpm is preferable over 8 bpm because it is both experimentally and statistically validated.
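
    The windowing-and-rejection step described above can be sketched as follows (Python; the 4 Hz sampling rate and the non-overlapping window step are assumptions for illustration, with NaN marking missing FHR samples):

        import numpy as np

        def baseline_windows(fhr, fs=4.0, win_min=20, max_missing=0.10):
            # Returns (BL, delta-FHR) per accepted 20-min window, with BL the
            # window mean and delta-FHR the standard deviation, as in the paper.
            win = int(win_min * 60 * fs)
            out = []
            for start in range(0, len(fhr) - win + 1, win):
                seg = np.asarray(fhr[start:start + win], dtype=float)
                missing = np.isnan(seg)
                if missing.mean() > max_missing:
                    continue                     # correction rate over 10%: reject
                idx = np.arange(win)
                seg[missing] = np.interp(idx[missing], idx[~missing], seg[~missing])
                out.append((seg.mean(), seg.std(ddof=1)))
            return out

        rng = np.random.default_rng(0)
        fhr = 140 + rng.normal(0, 5, size=4 * 60 * 80)   # 80 min of synthetic FHR
        fhr[rng.random(fhr.size) < 0.05] = np.nan        # 5% missing samples
        print(baseline_windows(fhr))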

  2. Applied statistics for economists

    CERN Document Server

    Lewis, Margaret

    2012-01-01

    This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.

  3. Introduction to Statistics

    Directory of Open Access Journals (Sweden)

    Mirjam Nielen

    2017-01-01

    Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.

  4. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  5. Equilibrium statistical mechanics

    CERN Document Server

    Mayer, J E

    1968-01-01

    The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanical is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t

  6. Contributions to statistics

    CERN Document Server

    Mahalanobis, P C

    1965-01-01

    Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt

  7. Annual Statistical Supplement, 2001

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  8. Annual Statistical Supplement, 2011

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  9. Annual Statistical Supplement, 2003

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  10. Annual Statistical Supplement, 2015

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  11. Annual Statistical Supplement, 2000

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  12. Annual Statistical Supplement, 2005

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  13. Annual Statistical Supplement, 2014

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  14. Annual Statistical Supplement, 2009

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  15. Annual Statistical Supplement, 2017

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  16. Annual Statistical Supplement, 2008

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  17. Annual Statistical Supplement, 2010

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  18. Annual Statistical Supplement, 2016

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  19. Annual Statistical Supplement, 2004

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  20. Annual Statistical Supplement, 2002

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  1. Annual Statistical Supplement, 2007

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  2. Annual Statistical Supplement, 2006

    Data.gov (United States)

    Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...

  3. Principles of statistics

    CERN Document Server

    Bulmer, M G

    1979-01-01

    There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo

  4. Fundamental statistical theories

    International Nuclear Information System (INIS)

    Demopoulos, W.

    1976-01-01

    Einstein argued that since quantum mechanics is not a fundamental theory it cannot be regarded as in any sense final. The pure statistical states of the quantum theory are not dispersion-free. In this sense, the theory is significantly statistical. The problem investigated in this paper is to determine under what conditions a significantly statistical theory is correctly regarded as fundamental. The solution developed in this paper is that a statistical theory is fundamental only if it is complete; moreover the quantum theory is complete. (B.R.H.)

  5. Plague Maps and Statistics

    Science.gov (United States)

    Plague in the United States ...

  6. Valley Fever (Coccidioidomycosis) Statistics

    Science.gov (United States)


  7. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
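
    The sampling idea described here, drawing from a posterior known only up to its shape, is exactly what random-walk Metropolis does. A minimal standard-library Python sketch, assuming a standard-normal-shaped target (this is an illustration, not an example taken from the book):

        import math
        import random

        def log_post(theta):
            # Unnormalised log-posterior: any function proportional to the
            # posterior density will do; a standard-normal shape is assumed here.
            return -0.5 * theta * theta

        def metropolis(n_samples, step=1.0, theta=0.0):
            samples, lp = [], log_post(theta)
            for _ in range(n_samples):
                prop = theta + random.gauss(0.0, step)        # random-walk proposal
                lp_prop = log_post(prop)
                if math.log(random.random()) < lp_prop - lp:  # accept or reject
                    theta, lp = prop, lp_prop
                samples.append(theta)
            return samples

        draws = metropolis(10_000)
        print(sum(draws) / len(draws))   # posterior-mean estimate, close to 0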

  8. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  9. Statistics in a Nutshell

    CERN Document Server

    Boslaugh, Sarah

    2008-01-01

    Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat

  10. Statistics is Easy

    CERN Document Server

    Shasha, Dennis

    2010-01-01

    Statistics is the activity of inferring results about a population given a sample. Historically, statistics books assume an underlying distribution to the data (typically, the normal distribution) and derive results under that assumption. Unfortunately, in real life, one cannot normally be sure of the underlying distribution. For that reason, this book presents a distribution-independent approach to statistics based on a simple computational counting idea called resampling. This book explains the basic concepts of resampling, then systematically presents the standard statistical measures along
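
    The resampling idea can be made concrete in a few lines. A percentile-bootstrap sketch for a mean, with the data values and the 95% level chosen purely for illustration:

        import random
        import statistics

        data = [12.1, 9.8, 11.4, 10.2, 13.5, 9.9, 10.8, 12.6]   # made-up sample

        # Resample with replacement many times, recording the statistic each time.
        boot_means = sorted(
            statistics.mean(random.choices(data, k=len(data)))
            for _ in range(10_000)
        )

        # Percentile bootstrap: the middle 95% of the resampled means.
        lo, hi = boot_means[249], boot_means[9749]
        print(f"mean = {statistics.mean(data):.2f}, 95% CI ~ ({lo:.2f}, {hi:.2f})")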

  11. Experimental techniques; Techniques experimentales

    Energy Technology Data Exchange (ETDEWEB)

    Roussel-Chomaz, P. [GANIL CNRS/IN2P3, CEA/DSM, 14 - Caen (France)

    2007-07-01

    This lecture presents the experimental techniques developed in the last 10 or 15 years in order to perform a new class of experiments with exotic nuclei, where the reactions induced by these nuclei make it possible to obtain information on their structure. A brief review of the secondary beam production methods will be given, with some examples of facilities in operation or planned. The important developments performed recently on cryogenic targets will be presented. The different detection systems will be reviewed, both the beam detectors before the targets and the many kinds of detectors necessary to detect all outgoing particles after the reaction: the magnetic spectrometer for the heavy fragment, detection systems for the target recoil nucleus, and gamma detectors. Finally, several typical examples of experiments will be detailed, in order to illustrate the use of each detector either alone or in coincidence with others. (author)

  12. Demystifying EQA statistics and reports.

    Science.gov (United States)

    Coucke, Wim; Soumali, Mohamed Rida

    2017-02-15

    Reports act as an important feedback tool in External Quality Assessment (EQA). Their main role is to score laboratories for their performance in an EQA round. The most common scores that apply to quantitative data are Q- and Z-scores. To calculate these scores, EQA providers need to have an assigned value and standard deviation for the sample. Both assigned values and standard deviations can be derived chemically or statistically. When derived statistically, various departures of the data from the normal distribution have to be handled. Various procedures for evaluating laboratories are able to handle these anomalies. Formal tests and graphical representation techniques are discussed, and suggestions are given to help choose between the different evaluation techniques. In order to obtain reliable estimates for calculating performance scores, a sufficient number of data points is needed. There is no general agreement about the minimal number that is needed. A solution for very small numbers is proposed by changing the limits of evaluation.
    Apart from analyte- and sample-specific laboratory evaluation, supplementary information can be obtained by combining results for different analytes and samples. Various techniques are reviewed. It is shown that combining results leads to supplementary information, not only for quantitative, but also for qualitative and semi-quantitative analytes.
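
    A minimal sketch of the Z-score calculation referred to above; the median-based assigned value, the target SD, and the |z| <= 2 flagging rule are illustrative assumptions, not the paper's recommendations:

        import statistics

        def z_score(result, assigned, sd):
            """EQA Z-score: how many SDs a result lies from the assigned value."""
            return (result - assigned) / sd

        # Statistically derived assigned value: here simply the participants'
        # median (a robust choice; real schemes may use other estimators).
        results = [4.8, 5.1, 5.0, 5.3, 4.9, 5.2, 7.9, 5.0]   # one outlier
        assigned = statistics.median(results)
        sd = 0.2   # assumed target SD (chemically or statistically derived)

        for r in results:
            z = z_score(r, assigned, sd)
            print(f"{r:4.1f}  z = {z:+5.1f}  {'ok' if abs(z) <= 2 else 'review'}")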

  13. Radar imaging using statistical orthogonality

    Science.gov (United States)

    Falconer, David G.

    2000-08-01

    Statistical orthogonality provides a mathematical basis for imaging scattering data with an inversion algorithm that is both robust and economic. The statistical technique is based on the approximate orthogonality of vectors whose elements are exponential functions with imaginary arguments and random phase angles. This orthogonality allows one to image radar data without first inverting a matrix whose dimensionality equals or exceeds the number of pixels or voxels in the algorithmic image. Additionally, statistically-based methods are applicable to data sets collected under a wide range of operational conditions, e.g., the random flight paths of the curvilinear SAR, the frequency-hopping emissions of ultra-wideband radar, or the narrowband data collected with a bistatic radar. The statistical approach also avoids the often-challenging and computationally intensive task of converting the collected measurements to a data format that is appropriate for imaging with a fast Fourier transform (FFT) or fast tomography algorithm (FTA), e.g., interpolating from polar to rectangular coordinates, or conversely.
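
    The approximate-orthogonality claim is easy to verify numerically. A small Python check with an arbitrary vector length (this is a generic demonstration, not the paper's imaging algorithm):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 4096   # vector length, chosen arbitrarily

        # Vectors whose elements are unit-modulus exponentials with
        # independent, uniformly random phase angles.
        u = np.exp(1j * rng.uniform(0, 2 * np.pi, n))
        v = np.exp(1j * rng.uniform(0, 2 * np.pi, n))

        # Normalised inner products: 1 for a vector with itself, but only
        # O(1/sqrt(n)) in magnitude for two independent random-phase vectors.
        print(abs(np.vdot(u, u)) / n)   # -> 1.0
        print(abs(np.vdot(u, v)) / n)   # -> ~0.016 for n = 4096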

  14. Statistical perspectives on inverse problems

    DEFF Research Database (Denmark)

    Andersen, Kim Emil

    Inverse problems arise in many scientific disciplines and pertain to situations where inference is to be made about a particular phenomenon from indirect measurements. A typical example, arising in diffusion tomography, is the inverse boundary value problem for non-invasive reconstruction of the interior of an object from electrical boundary measurements. One part of this thesis concerns statistical approaches for solving, possibly non-linear, inverse problems. Thus inverse problems are recast in a form suitable for statistical inference. In particular, a Bayesian approach for regularisation ... problem is given in terms of probability distributions. Posterior inference is obtained by Markov chain Monte Carlo methods, and new, powerful simulation techniques based on e.g. coupled Markov chains and simulated tempering are developed to improve the computational efficiency of the overall simulation ...

  15. Statistics for High Energy Physics

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    The lectures emphasize the frequentist approach used for the Dark Matter search and for the Higgs search, discovery and measurements of its properties. An emphasis is put on hypothesis tests using the asymptotic formulae formalism and its derivation, and on the derivation of the trial factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman-Pearson, Feldman-Cousins, Coverage, CLs, Nuisance Parameters Impact, Look Elsewhere Effect. Selected Bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals", Phys. Rev. D 57, 3873 (1998); A. L. Read, "Presentation of search results: the CL(s) technique", J. Phys. G 28, 2693 (2002); G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics", Eur. Phys. J. C 71, 1554 (2011), Erratum: Eur. Phys. J. C 73...
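
    For orientation, the CLs quantity named in the keywords is conventionally defined as follows (the standard definition from the Read paper cited above, restated here rather than quoted from the lectures):

        \mathrm{CL}_s \;=\; \frac{\mathrm{CL}_{s+b}}{\mathrm{CL}_b}
                      \;=\; \frac{P(q \ge q_{\mathrm{obs}} \mid s+b)}
                                 {P(q \ge q_{\mathrm{obs}} \mid b)},
        \qquad \text{exclusion at the 95\% CL when } \mathrm{CL}_s < 0.05 .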

  16. Energy statistics yearbook 2000

    International Nuclear Information System (INIS)

    2002-01-01

    The Energy Statistics Yearbook 2000 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-third in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  17. Energy statistics yearbook 2001

    International Nuclear Information System (INIS)

    2004-01-01

    The Energy Statistics Yearbook 2001 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-fifth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  18. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes c...

  19. Applied Statistics with SPSS

    Science.gov (United States)

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  20. Cancer Statistics Animator

    Science.gov (United States)

    This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.

  1. Principles of medical statistics

    National Research Council Canada - National Science Library

    Feinstein, Alvan R

    2002-01-01

    ... or limited attention. They are then offered a simple, superficial account of the most common doctrines and applications of statistical theory. The "get-it-over-with-quickly" approach has been encouraged and often necessitated by the short time given to statistics in modern biomedical education. The curriculum is supposed to provide fundament...

  2. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency...

  3. Practical statistics simply explained

    CERN Document Server

    Langley, Dr Russell A

    1971-01-01

    For those who need to know statistics but shy away from math, this book teaches how to extract truth and draw valid conclusions from numerical data using logic and the philosophy of statistics rather than complex formulae. Lucid discussion of averages and scatter, investigation design, more. Problems with solutions.

  4. Thiele. Pioneer in statistics

    DEFF Research Database (Denmark)

    Lauritzen, Steffen Lilholt

    This book studies the brilliant Danish 19th Century astronomer, T.N. Thiele who made important contributions to statistics, actuarial science, astronomy and mathematics. The most important of these contributions in statistics are translated into English for the first time, and the text includes...

  5. Energy statistics yearbook 2002

    International Nuclear Information System (INIS)

    2005-01-01

    The Energy Statistics Yearbook 2002 is a comprehensive collection of international energy statistics prepared by the United Nations Statistics Division. It is the forty-sixth in a series of annual compilations which commenced under the title World Energy Supplies in Selected Years, 1929-1950. It updates the statistical series shown in the previous issue. Supplementary series of monthly and quarterly data on production of energy may be found in the Monthly Bulletin of Statistics. The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from the annual energy questionnaire distributed by the United Nations Statistics Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistics Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  6. Enhancing statistical literacy

    NARCIS (Netherlands)

    Droogers, M.J.S.|info:eu-repo/dai/nl/413392252; Drijvers, P.H.M.|info:eu-repo/dai/nl/074302922

    2017-01-01

    Current secondary school statistics curricula focus on procedural knowledge and pay too little attention to statistical reasoning. As a result, students are not able to apply their knowledge in practice. In addition, education often targets the average student, which may lead to gifted students

  7. Water Quality Statistics

    Science.gov (United States)

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  8. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  9. Practical statistics for educators

    CERN Document Server

    Ravid, Ruth

    2014-01-01

    Practical Statistics for Educators, Fifth Edition, is a clear and easy-to-follow text written specifically for education students in introductory statistics courses and in action research courses. It is also a valuable resource and guidebook for educational practitioners who wish to study their own settings.

  10. Workshop statistics discovery with data and Minitab

    CERN Document Server

    Rossman, Allan J

    1998-01-01

    Shorn of all subtlety and led naked out of the protective fold of educational research literature, there comes a sheepish little fact: lectures don't work nearly as well as many of us would like to think. - George Cobb (1992) This book contains activities that guide students to discover statistical concepts, explore statistical principles, and apply statistical techniques. Students work toward these goals through the analysis of genuine data and through interaction with one another, with their instructor, and with technology. Providing a one-semester introduction to fundamental ideas of statistics for college and advanced high school students, Workshop Statistics is designed for courses that employ an interactive learning environment by replacing lectures with hands-on activities. The text contains enough expository material to stand alone, but it can also be used to supplement a more traditional textbook. Some distinguishing features of Workshop Statistics are its emphases on active learning, conceptu...

  11. Statistical Symbolic Execution with Informed Sampling

    Science.gov (United States)

    Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco

    2014-01-01

    Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that the informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
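
    The Bayesian estimation step can be pictured as a Beta-Bernoulli update over the probability of reaching the target event, one observation per sampled path. A toy sketch in which the "program", the prior, and the sample budget are all invented (it illustrates the estimation idea only, not the informed-sampling algorithm itself):

        import random

        def run_sampled_path():
            """Stand-in for sampling one symbolic path: True if it reaches the
            target event (e.g. an assert violation); entirely invented."""
            x = random.randint(0, 99)      # toy symbolic input
            return x % 7 == 0              # toy target condition, P = 0.15

        # Beta(1, 1) prior over the reach probability; each sampled path is a
        # Bernoulli observation updating it.
        hits = sum(run_sampled_path() for _ in range(5_000))
        a, b = 1 + hits, 1 + (5_000 - hits)

        post_mean = a / (a + b)
        post_var = a * b / ((a + b) ** 2 * (a + b + 1))
        print(f"P(target) ~ {post_mean:.4f} (posterior SD {post_var ** 0.5:.4f})")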

  12. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for the Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
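
    As a taste of the binomial-proportion topic listed above, the conjugate Beta update is one line of algebra. The prior, the data, and the use of scipy are illustrative assumptions:

        from scipy import stats

        # Beta(a0, b0) prior for the proportion; observe s successes in n trials.
        a0, b0 = 1, 1        # uniform prior (assumed)
        n, s = 20, 14        # made-up data

        a, b = a0 + s, b0 + (n - s)    # conjugate posterior: Beta(a, b)
        post = stats.beta(a, b)
        print(post.mean())             # posterior mean, 15/22 ~ 0.68
        print(post.interval(0.95))     # central 95% credible interval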

  13. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  14. Statistical Plasma Physics

    CERN Document Server

    Ichimaru, Setsuo

    2004-01-01

    Plasma physics is an integral part of statistical physics, complete with its own basic theories. Designed as a two-volume set, Statistical Plasma Physics is intended for advanced undergraduate and beginning graduate courses on plasma and statistical physics, and as such, its presentation is self-contained and should be read without difficulty by those with backgrounds in classical mechanics, electricity and magnetism, quantum mechanics, and statistics. Major topics include: plasma phenomena in nature, kinetic equations, plasmas and dielectric media, electromagnetic properties of Vlasov plasmas in thermodynamic equilibria, transient processes, and instabilities. Statistical Plasma Physics, Volume II, treats subjects in the field of condensed plasma physics, with applications to condensed matter physics, atomic physics, nuclear physics, and astrophysics. The aim of this book is to elucidate a number of basic topics in physics of dense plasmas that interface with condensed matter physics, atomic physics, nuclear...

  15. Statistical mechanics of dictionary learning

    Science.gov (United States)

    Sakata, Ayaka; Kabashima, Yoshiyuki

    2013-07-01

    Finding a basis matrix (dictionary) by which objective signals are represented sparsely is of major relevance in various scientific and technological fields. We consider a problem to learn a dictionary from a set of training signals. We employ techniques of statistical mechanics of disordered systems to evaluate the size of the training set necessary to typically succeed in the dictionary learning. The results indicate that the necessary size is much smaller than previously estimated, which theoretically supports and/or encourages the use of dictionary learning in practical situations.

  16. Doing statistical mediation and moderation

    CERN Document Server

    Jose, Paul E

    2013-01-01

    Written in a friendly, conversational style, this book offers a hands-on approach to statistical mediation and moderation for both beginning researchers and those familiar with modeling. Starting with a gentle review of regression-based analysis, Paul Jose covers basic mediation and moderation techniques before moving on to advanced topics in multilevel modeling, structural equation modeling, and hybrid combinations, such as moderated mediation. User-friendly features include numerous graphs and carefully worked-through examples; "Helpful Suggestions" about procedures and pitfalls; "Knowled
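
    The regression-based mediation analysis such books open with reduces to two least-squares fits and a product of coefficients. A minimal numpy sketch on simulated data, with all variable names and effect sizes invented:

        import numpy as np

        rng = np.random.default_rng(2)
        n = 500
        x = rng.standard_normal(n)                        # predictor
        m = 0.5 * x + rng.standard_normal(n)              # mediator (a-path = 0.5)
        y = 0.4 * m + 0.1 * x + rng.standard_normal(n)    # outcome (b = 0.4, c' = 0.1)

        def ols(X, y):
            """Least-squares coefficients with an intercept column prepended."""
            X = np.column_stack([np.ones(len(X)), X])
            return np.linalg.lstsq(X, y, rcond=None)[0]

        a = ols(x, m)[1]                                   # X -> M
        b, c_prime = ols(np.column_stack([m, x]), y)[1:]   # M and X -> Y
        print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")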

  17. An Overview of Short-term Statistical Forecasting Methods

    DEFF Research Database (Denmark)

    Elias, Russell J.; Montgomery, Douglas C.; Kulahci, Murat

    2006-01-01

    An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques...
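
    Of the smoothing methods surveyed, simple exponential smoothing is the shortest to state. A sketch with an arbitrary smoothing constant and a made-up demand series:

        def ses_forecast(series, alpha=0.3):
            """Simple exponential smoothing:
            level = alpha * y_t + (1 - alpha) * level; the final level is the
            one-step-ahead forecast."""
            level = series[0]
            for y in series[1:]:
                level = alpha * y + (1 - alpha) * level
            return level

        demand = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
        print(f"next-period forecast: {ses_forecast(demand):.1f}")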

  18. Statistical mechanics in JINR

    International Nuclear Information System (INIS)

    Tonchev, N.; Shumovskij, A.S.

    1986-01-01

    The history of the investigations conducted at the JINR in the field of statistical mechanics is presented, beginning with the fundamental works by N.N. Bogolyubov on the microscopic theory of superconductivity. The ideas introduced in these works, and the methods developed in them, have largely determined the development of statistical mechanics at the JINR, and the Hartree-Fock-Bogolyubov variational principle has become an important method of modern nuclear theory. A brief review is given of the main achievements connected with the development of statistical mechanics methods and their application in different fields of physical science.

  19. Statistics a complete introduction

    CERN Document Server

    Graham, Alan

    2013-01-01

    Statistics: A Complete Introduction is the most comprehensive yet easy-to-use introduction to using Statistics. Written by a leading expert, this book will help you if you are studying for an important exam or essay, or if you simply want to improve your knowledge. The book covers all the key areas of Statistics including graphs, data interpretation, spreadsheets, regression, correlation and probability. Everything you will need is here in this one book. Each chapter includes not only an explanation of the knowledge and skills you need, but also worked examples and test questions.

  20. AP statistics crash course

    CERN Document Server

    D'Alessio, Michael

    2012-01-01

    AP Statistics Crash Course - Gets You a Higher Advanced Placement Score in Less Time Crash Course is perfect for the time-crunched student, the last-minute studier, or anyone who wants a refresher on the subject. AP Statistics Crash Course gives you: Targeted, Focused Review - Study Only What You Need to Know Crash Course is based on an in-depth analysis of the AP Statistics course description outline and actual Advanced Placement test questions. It covers only the information tested on the exam, so you can make the most of your valuable study time. Our easy-to-read format covers: exploring da

  1. Statistical physics; Physique statistique

    Energy Technology Data Exchange (ETDEWEB)

    Couture, L.; Zitoun, R. [Universite Pierre et Marie Curie, 75 - Paris (France)

    1992-12-31

    The basics of statistical physics are presented. The statistical models of Maxwell-Boltzmann, Bose-Einstein and Fermi-Dirac and their particular fields of application are presented. The statistical theory is applied in different areas of physics: gas characteristics, paramagnetism, thermal properties of crystals and electronic properties of solids. A whole chapter is dedicated to helium and its characteristics such as superfluidity; another deals with superconductivity. Superconductivity is presented both experimentally and theoretically. The Meissner effect and the Josephson effect are described, and the framework of BCS theory is drawn. (A.C.)

  2. The nature of statistics

    CERN Document Server

    Wallis, W Allen

    2014-01-01

    Focusing on everyday applications as well as those of scientific research, this classic of modern statistical methods requires little to no mathematical background. Readers develop basic skills for evaluating and using statistical data. Lively, relevant examples include applications to business, government, social and physical sciences, genetics, medicine, and public health. "W. Allen Wallis and Harry V. Roberts have made statistics fascinating." - The New York Times. "The authors have set out, with considerable success, to write a text which would be of interest and value to the student who,

  3. Statistical deception at work

    CERN Document Server

    Mauro, John

    2013-01-01

    Written to reveal statistical deceptions often thrust upon unsuspecting journalists, this book views the use of numbers from a public perspective. Illustrating how the statistical naivete of journalists often nourishes quantitative misinformation, the author's intent is to make journalists more critical appraisers of numerical data so that in reporting them they do not deceive the public. The book frequently uses actual reported examples of misused statistical data reported by mass media and describes how journalists can avoid being taken in by them. Because reports of survey findings seldom g

  4. Methods of statistical physics

    CERN Document Server

    Akhiezer, Aleksandr I

    1981-01-01

    Methods of Statistical Physics is an exposition of the tools of statistical mechanics, which evaluates the kinetic equations of classical and quantized systems. The book also analyzes the equations of macroscopic physics, such as the equations of hydrodynamics for normal and superfluid liquids and macroscopic electrodynamics. The text gives particular attention to the study of quantum systems. This study begins with a discussion of problems of quantum statistics with a detailed description of the basics of quantum mechanics along with the theory of measurement. An analysis of the asymptotic be

  5. Statistical Group Comparison

    CERN Document Server

    Liao, Tim Futing

    2011-01-01

    An incomparably useful examination of statistical methods for comparison. The nature of doing science, be it natural or social, inevitably calls for comparison. Statistical methods are at the heart of such comparison, for they not only help us gain understanding of the world around us but often define how our research is to be carried out. The need to compare between groups is best exemplified by experiments, which have clearly defined statistical methods. However, true experiments are not always possible. What complicates the matter more is a great deal of diversity in factors that are not inde

  6. Infant Statistical Learning

    Science.gov (United States)

    Saffran, Jenny R.; Kirkham, Natasha Z.

    2017-01-01

    Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812

  7. School Violence: Data & Statistics

    Science.gov (United States)

    The first step in preventing school violence is to understand the extent and nature ...

  8. Medicaid Drug Claims Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Drug Claims Statistics CD is a useful tool that conveniently breaks up Medicaid claim counts and separates them by quarter and includes an annual count.

  9. CMS Statistics Reference Booklet

    Data.gov (United States)

    U.S. Department of Health & Human Services — The annual CMS Statistics reference booklet provides a quick reference for summary information about health expenditures and the Medicare and Medicaid health...

  10. EDI Performance Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — This section contains statistical information and reports related to the percentage of electronic transactions being sent to Medicare contractors in the formats...

  11. Basics of statistical physics

    CERN Document Server

    Müller-Kirsten, Harald J W

    2013-01-01

    Statistics links microscopic and macroscopic phenomena, and requires for this reason a large number of microscopic elements like atoms. The results are values of maximum probability or of averaging. This introduction to statistical physics concentrates on the basic principles, and attempts to explain these in simple terms supplemented by numerous examples. These basic principles include the difference between classical and quantum statistics, a priori probabilities as related to degeneracies, the vital aspect of indistinguishability as compared with distinguishability in classical physics, the differences between conserved and non-conserved elements, the different ways of counting arrangements in the three statistics (Maxwell-Boltzmann, Fermi-Dirac, Bose-Einstein), the difference between maximization of the number of arrangements of elements, and averaging in the Darwin-Fowler method. Significant applications to solids, radiation and electrons in metals are treated in separate chapters, as well as Bose-Eins...
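
    As a concrete anchor for the "different ways of counting arrangements" mentioned above, the standard occupation-counting formulae for the three statistics, with g_i states available at level i and n_i elements in it, are (standard results, stated for orientation rather than quoted from the book):

        W_{\mathrm{MB}} = N! \prod_i \frac{g_i^{\,n_i}}{n_i!}, \qquad
        W_{\mathrm{BE}} = \prod_i \frac{(n_i + g_i - 1)!}{n_i!\,(g_i - 1)!}, \qquad
        W_{\mathrm{FD}} = \prod_i \frac{g_i!}{n_i!\,(g_i - n_i)!}.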

  12. Illinois travel statistics, 2008

    Science.gov (United States)

    2009-01-01

    The 2008 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  13. Illinois travel statistics, 2010

    Science.gov (United States)

    2011-01-01

    The 2010 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  14. Illinois travel statistics, 2009

    Science.gov (United States)

    2010-01-01

    The 2009 Illinois Travel Statistics publication is assembled to provide detailed traffic : information to the different users of traffic data. While most users of traffic data at this level : of detail are within the Illinois Department of Transporta...

  15. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...

  16. Transport statistics 1996

    CSIR Research Space (South Africa)

    Shepperson, L

    1997-12-01

    This publication contains transport and related statistics on roads, vehicles, infrastructure, passengers, freight, rail, air, maritime and road traffic, and international comparisons. The information compiled in this publication has been gathered...

  17. Information theory and statistics

    CERN Document Server

    Kullback, Solomon

    1968-01-01

    Highly useful text studies logarithmic measures of information and their application to testing statistical hypotheses. Includes numerous worked examples and problems. References. Glossary. Appendix. 1968 2nd, revised edition.

  18. Statistical Measures of Marksmanship

    National Research Council Canada - National Science Library

    Johnson, Richard

    2001-01-01

    .... This report describes objective statistical procedures to measure both rifle marksmanship accuracy, the proximity of an array of shots to the center of mass of a target, and marksmanship precision...

  19. CDC WONDER: Cancer Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The United States Cancer Statistics (USCS) online databases in WONDER provide cancer incidence and mortality data for the United States for the years since 1999, by...

  20. Statistical theory of heat

    CERN Document Server

    Scheck, Florian

    2016-01-01

    Scheck’s textbook starts with a concise introduction to classical thermodynamics, including geometrical aspects. Then a short introduction to probabilities and statistics lays the basis for the statistical interpretation of thermodynamics. Phase transitions, discrete models and the stability of matter are explained in great detail. Thermodynamics has a special role in theoretical physics. Due to the general approach of thermodynamics the field has a bridging function between several areas like the theory of condensed matter, elementary particle physics, astrophysics and cosmology. The classical thermodynamics describes predominantly averaged properties of matter, reaching from few particle systems and state of matter to stellar objects. Statistical Thermodynamics covers the same fields, but explores them in greater depth and unifies classical statistical mechanics with quantum theory of multiple particle systems. The content is presented as two tracks: the fast track for master students, providing the essen...

  1. Statistics For Neuroscientists

    Directory of Open Access Journals (Sweden)

    Subbakrishna D.K

    2000-01-01

    The role statistical methods play in medicine in the interpretation of empirical data is well recognized by researchers. With modern computing facilities and software packages there is little need for familiarity with the computational details of statistical calculations. However, for the researcher to understand whether these calculations are valid and appropriate, it is necessary that the user is aware of the rudiments of the statistical methodology. Also, it needs to be emphasized that no amount of advanced analysis can be a substitute for a properly planned and executed study. An attempt is made in this communication to discuss some of the theoretical issues that are important for the valid analysis and interpretation of the precious data that are gathered. The article summarises some of the basic statistical concepts, followed by illustrations from live data generated from various research projects of the Department of Neurology of this Institute.

  2. Elements of statistical thermodynamics

    CERN Document Server

    Nash, Leonard K

    2006-01-01

    Encompassing essentially all aspects of statistical mechanics that appear in undergraduate texts, this concise, elementary treatment shows how an atomic-molecular perspective yields new insights into macroscopic thermodynamics. 1974 edition.

  3. Ehrlichiosis: Statistics and Epidemiology

    Science.gov (United States)

    Holman RC, McQuiston JH, Krebs JW, Swerdlow DL. Epidemiology of human ehrlichiosis and anaplasmosis in the United ...

  4. Anaplasmosis: Statistics and Epidemiology

    Science.gov (United States)

    Holman RC, McQuiston JH, Krebs JW, Swerdlow DL. Epidemiology of human ehrlichiosis and anaplasmosis in the United ...

  5. Boating Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  6. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  7. Statistical Engine Knock Control

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    A new statistical concept of the knock control of a spark ignition automotive engine is proposed. The control aim is associated with the statistical hypothesis test which compares the threshold value to the average value of the maximal amplitude of the knock sensor signal at a given frequency ... which includes generation of the amplitude signals, a threshold value determination and a knock sound model is developed for evaluation of the control concept.
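
    The hypothesis test described, comparing the average maximal knock amplitude to a threshold, can be sketched as a one-sided one-sample t-test. The synthetic amplitudes, the threshold, and the 5% level below are assumptions for illustration, not values from the paper:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        # Maximal knock-sensor amplitudes at the monitored frequency over a
        # batch of engine cycles (synthetic values).
        amplitudes = rng.normal(loc=1.08, scale=0.15, size=40)
        THRESHOLD = 1.0   # assumed knock threshold

        # H0: mean amplitude <= threshold (no knock); H1: mean > threshold.
        t, p = stats.ttest_1samp(amplitudes, THRESHOLD, alternative="greater")
        if p < 0.05:
            print(f"knock detected (p = {p:.3f}): retard ignition timing")
        else:
            print(f"no evidence of knock (p = {p:.3f})")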

  8. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t

  9. Statistical mechanics rigorous results

    CERN Document Server

    Ruelle, David

    1999-01-01

    This classic book marks the beginning of an era of vigorous mathematical progress in equilibrium statistical mechanics. Its treatment of the infinite system limit has not been superseded, and the discussion of thermodynamic functions and states remains basic for more recent work. The conceptual foundation provided by the Rigorous Results remains invaluable for the study of the spectacular developments of statistical mechanics in the second half of the 20th century.

  10. Multilevel statistical models

    CERN Document Server

    Goldstein, Harvey

    2011-01-01

    This book provides a clear introduction to this important area of statistics. The author provides wide coverage of different kinds of multilevel models, and how to interpret different statistical methodologies and algorithms applied to such models. This 4th edition reflects the growth and interest in this area and is updated to include new chapters on multilevel models with mixed response types, smoothing and multilevel data, models with correlated random effects and modeling with variance.

  11. European environmental statistics handbook

    Energy Technology Data Exchange (ETDEWEB)

    Newman, O.; Foster, A. [comps.] [Manchester Business School, Manchester (United Kingdom). Library and Information Service

    1993-12-31

    This book is a compilation of statistical materials on environmental pollution drawn from governmental and private sources. It is divided into ten chapters: air, water and land - monitoring statistics; cities, regions and nations; costs, budgets and expenditures - costs of pollution and its control, including air pollution; effects; general industry and government data; laws and regulations; politics and opinion - including media coverage; pollutants and wastes; pollution control industry; and tools, methods and solutions. 750 tabs.

  12. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  13. A DIY guide to statistical strategy

    International Nuclear Information System (INIS)

    Pregibon, D.

    1986-01-01

    This chapter is a do-it-yourself (DIY) guide to developing statistical strategy for a particular data analysis task. The primary audience of the chapter is research statisticians. The material should also be of interest to nonstatisticians, especially knowledge engineers, who are interested in using statistical data analysis as an application domain for Artificial Intelligence techniques. The do's and don'ts of strategy development are outlined. The "linear regression" task is used to illustrate many of the ideas

  14. Breakthroughs in statistics

    CERN Document Server

    Johnson, Norman

    This is the third volume of a collection of seminal papers in the statistical sciences written during the past 110 years. These papers have each had an outstanding influence on the development of statistical theory and practice over the last century. Each paper is preceded by an introduction written by an authority in the field providing background information and assessing its influence. Volume III concentrates on articles from the 1980s while including some earlier articles not included in Volumes I and II. Samuel Kotz is Professor of Statistics in the College of Business and Management at the University of Maryland. Norman L. Johnson is Professor Emeritus of Statistics at the University of North Carolina. Also available: Breakthroughs in Statistics Volume I: Foundations and Basic Theory, Samuel Kotz and Norman L. Johnson, Editors, 1993, 631 pp. Softcover. ISBN 0-387-94037-5. Breakthroughs in Statistics Volume II: Methodology and Distribution, Samuel Kotz and Norman L. Johnson, Edi...

  15. UN Data- Environmental Statistics: Waste

    Data.gov (United States)

    World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...

  16. UN Data: Environment Statistics: Waste

    Data.gov (United States)

    World Wide Human Geography Data Working Group — The Environment Statistics Database contains selected water and waste statistics by country. Statistics on water and waste are based on official statistics supplied...

  17. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    Haghizadeh, Ali; Davoudi Moghaddam, Davoud; Pourghasemi, Hamid Reza

    an example of Iran). Department of Range and Watershed Management Engineering, Lorestan University, Lorestan, Iran; Department of Natural Resources and ...

  18. Using WSD Techniques for Lexical Selection in Statistical Machine Translation

    Science.gov (United States)

    2005-07-01


  19. A textbook of computer based numerical and statistical techniques

    CERN Document Server

    Jaiswal, AK

    2009-01-01

    About the Book: Application of Numerical Analysis has become an integral part of the life of all modern engineers and scientists. The contents of this book cover both the introductory topics and the more advanced topics such as partial differential equations. This book is different from many other books in a number of ways. Salient Features: Mathematical derivation of each method is given to build the student's understanding of numerical analysis. A variety of solved examples are given. Computer programs for almost all numerical methods discussed have been presented in `C` langu

  20. Analysis of compressive fracture in rock using statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
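
    A toy illustration of the nearest-neighbour stress-redistribution idea described above may help. The sketch below is not Blair's model; it is a minimal load-shedding lattice in which the lattice size, strength distribution, ramp step, and the 50%-failed stopping rule are all invented for illustration.

        import numpy as np

        def lattice_strength(n=40, mean_strength=1.0, heterogeneity=0.2, seed=0):
            # Toy nearest-neighbour load-shedding lattice (illustration only,
            # not the model of the record above). Sites get random strengths;
            # a uniform far-field load is ramped up, and each failed site
            # sheds its load equally onto its intact 4-neighbours.
            rng = np.random.default_rng(seed)
            strength = rng.normal(mean_strength, heterogeneity * mean_strength, (n, n))
            shed = np.zeros((n, n))            # load shed onto sites by failed neighbours
            failed = np.zeros((n, n), dtype=bool)
            applied = 0.0
            while failed.mean() < 0.5:         # crude proxy for macroscopic failure
                applied += 0.01                # ramp the far-field load
                while True:                    # cascade until no site is over-stressed
                    over = ~failed & (applied + shed >= strength)
                    if not over.any():
                        break
                    failed |= over
                    for i, j in zip(*np.nonzero(over)):
                        share = (applied + shed[i, j]) / 4.0
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            a, b = i + di, j + dj
                            if 0 <= a < n and 0 <= b < n and not failed[a, b]:
                                shed[a, b] += share
            return applied

        # Qualitative check: greater local heterogeneity should lower peak
        # strength, echoing the inverse-power-law trend reported above.
        for h in (0.05, 0.1, 0.2, 0.4):
            print(f"heterogeneity {h:.2f} -> strength {lattice_strength(heterogeneity=h):.2f}")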

  1. Learning for Semantic Parsing Using Statistical Syntactic Parsing Techniques

    Science.gov (United States)

    2010-05-01

    Committee: Raymond J. Mooney (Supervisor), Jason M. Baldridge, Risto Miikkulainen, Martha Palmer, Bruce W. Porter, and former member Ben Kuipers. I have a deeper understanding of the research thanks to their insight and suggestions. I am especially grateful to Jason Baldridge for his valuable advice and help on my job hunting besides research. I would

  2. 7th International Workshop on Statistical Simulation

    CERN Document Server

    Mignani, Stefania; Monari, Paola; Salmaso, Luigi

    2014-01-01

    The Department of Statistical Sciences of the University of Bologna in collaboration with the Department of Management and Engineering of the University of Padova, the Department of Statistical Modelling of Saint Petersburg State University, and INFORMS Simulation Society sponsored the Seventh Workshop on Simulation. This international conference was devoted to statistical techniques in stochastic simulation, data collection, analysis of scientific experiments, and studies representing broad areas of interest. The previous workshops took place in St. Petersburg, Russia in 1994, 1996, 1998, 2001, 2005, and 2009. The Seventh Workshop took place in the Rimini Campus of the University of Bologna, which is in Rimini’s historical center.

  3. IBM SPSS statistics 19 made simple

    CERN Document Server

    Gray, Colin D

    2012-01-01

    This new edition of one of the most widely read textbooks in its field introduces the reader to data analysis with the most powerful and versatile statistical package on the market: IBM SPSS Statistics 19. Each new release of SPSS Statistics features new options and other improvements. A core of fundamental operating principles and techniques, however, has continued to apply to all recent releases and has proved worth communicating in a small volume. This practical and informal book combines simplicity and clarity of presentation with a comprehensive trea

  4. Topology for statistical modeling of petascale data.

    Energy Technology Data Exchange (ETDEWEB)

    Pascucci, Valerio (University of Utah, Salt Lake City, UT); Mascarenhas, Ajith Arthur; Rusek, Korben (Texas A&M University, College Station, TX); Bennett, Janine Camille; Levine, Joshua (University of Utah, Salt Lake City, UT); Pebay, Philippe Pierre; Gyulassy, Attila (University of Utah, Salt Lake City, UT); Thompson, David C.; Rojas, Joseph Maurice (Texas A&M University, College Station, TX)

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.

  5. Exotic statistics on surfaces

    International Nuclear Information System (INIS)

    Imbo, T.D.; March-Russel, J.

    1990-01-01

    We investigate the allowed spectrum of statistics for n identical spinless particles on an arbitrary closed two-manifold M, by using a powerful topological approach to the study of quantum kinematics. On a surface of genus g ≥ 1, statistics other than Bose or Fermi can only be obtained by utilizing multi-component state vectors transforming as an irreducible unitary representation of the fundamental group of the n-particle configuration space. These multi-component (or nonscalar) quantizations allow the possibility of fractional statistics, as well as other exotic, nonfractional statistics some of whose properties we discuss. On an orientable surface of genus g ≥ 0, only anyons with rational statistical parameter θ/π = p/q are allowed, and their number is restricted to be sq − g + 1 (s ∈ Z). For nonorientable surfaces only θ = 0, π are allowed. Finally, we briefly comment on systems of spinning particles and make a comparison with the results for solitons in the O(3)-invariant nonlinear sigma model with space manifold M. (orig.)
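
    As a worked instance of the counting rule quoted above (an illustration only; the record gives just the formula), consider the torus:

        % Counting rule from the record: on an orientable surface of genus g,
        % anyons with rational statistical parameter \theta/\pi = p/q occur
        % with multiplicity sq - g + 1 for some integer s.
        \[
          \frac{\theta}{\pi} = \frac{p}{q}, \qquad N = sq - g + 1, \quad s \in \mathbb{Z}.
        \]
        % Example (illustrative): on the torus, g = 1, so N = sq; for
        % \theta = \pi/3 (q = 3) the allowed multiplicities are 3, 6, 9, ...
        \[
          g = 1,\; q = 3 \;\Longrightarrow\; N = 3s = 3, 6, 9, \dots
        \]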

  6. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931, reprinted 1980 by ASQC); in spite of this long history, its use remains moderate. One probable reason for this low uptake is the difficulty designers have in anticipating the risks of the approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the latter characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring-dispersion space).
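
    To make the contrast concrete, here is a minimal sketch of arithmetic (worst-case) versus quadratic (statistical) tolerancing for a linear stack-up, assuming independent, centred characteristics; the numeric tolerances are invented and the root-sum-square formula is the common textbook form, not necessarily the paper's exact formulation.

        import math

        # Hypothetical tolerances of four characteristics in a linear stack-up.
        tolerances = [0.10, 0.05, 0.08, 0.12]

        # Arithmetic (worst-case) tolerancing: the stack tolerance is the
        # plain sum, so conformity is simply "inside the interval".
        t_worst_case = sum(tolerances)

        # Quadratic (statistical) tolerancing: for independent, centred
        # characteristics, variances add, so tolerances combine root-sum-square.
        # The interval is where most assemblies should probably fall, not a
        # hard conformity bound -- hence the extra care the paper calls for.
        t_statistical = math.sqrt(sum(t * t for t in tolerances))

        print(f"worst case : +/- {t_worst_case:.3f}")   # +/- 0.350
        print(f"quadratic  : +/- {t_statistical:.3f}")  # +/- 0.182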

  7. 1992 Energy statistics Yearbook

    International Nuclear Information System (INIS)

    1994-01-01

    The principal objective of the Yearbook is to provide a global framework of comparable data on long-term trends in the supply of mainly commercial primary and secondary forms of energy. Data for each type of fuel and aggregate data for the total mix of commercial fuels are shown for individual countries and areas and are summarized into regional and world totals. The data are compiled primarily from annual questionnaires distributed by the United Nations Statistical Division and supplemented by official national statistical publications. Where official data are not available or are inconsistent, estimates are made by the Statistical Division based on governmental, professional or commercial materials. Estimates include, but are not limited to, extrapolated data based on partial year information, use of annual trends, trade data based on partner country reports, breakdowns of aggregated data as well as analysis of current energy events and activities

  8. Philosophy of statistics

    CERN Document Server

    Forster, Malcolm R

    2011-01-01

    Statisticians and philosophers of science have many common interests but restricted communication with each other. This volume aims to remedy these shortcomings. It provides state-of-the-art research in the area of philosophy of statistics by encouraging numerous experts to communicate with one another without feeling "restricted" by their disciplines or thinking "piecemeal" in their treatment of issues. A second goal of this book is to present work in the field without bias toward any particular statistical paradigm. Broadly speaking, the essays in this Handbook are concerned with problems of induction, statistics and probability. For centuries, foundational problems like induction have been among philosophers' favorite topics; recently, however, non-philosophers have increasingly taken a keen interest in these issues. This volume accordingly contains papers by both philosophers and non-philosophers, including scholars from nine academic disciplines.

  9. Statistics and social critique

    Directory of Open Access Journals (Sweden)

    Alain Desrosières

    2014-07-01

    This paper focuses on the history of the uses of statistics as a tool for social critique. Whereas nowadays statistics are very often conceived as being in the hands of the powerful, there are many historical cases in which they were, on the contrary, used to oppose authority. The author first illustrates Ted Porter's theory that quantification might be a "tool of weakness". He then addresses the fact that statistics were used to document labour and living conditions, thus serving as a resource for the lower classes of society (and presents the statistical theory of Pelloutier, an anarchist activist). Finally comes the question of the conditions of success of these counter-propositions, discussed through the examples of new randomized experiments in public policy and of the measurement of the richest 1%.

  10. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    This RS code is to do Object-by-Object analysis of each Object's sub-objects, e.g. statistical analysis of an object's individual image data pixels. Statistics, such as percentiles (so-called "quartiles"), are derived by the process, but the return can only be a Scene Variable, not an Object Variable. This procedure was developed in order to be able to export objects as ESRI shape data with the 90th percentile of the Hue of each object's pixels as an item in the shape attribute table. The procedure uses a sub-level single-pixel chessboard segmentation, loops over each of the objects, and performs an analysis of the values of the object's pixels in MS-Excel. The shell of the procedure could also be used for purposes other than the derivation of Object/Sub-object statistics, e.g. rule-based assignment processes.
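
    The original eCognition/RS script is not shown in the record, but the per-object statistic it describes can be sketched generically; the NumPy stand-in below (function name and data invented) computes a given percentile of each object's pixel values, e.g. the 90th percentile of Hue.

        import numpy as np

        def per_object_percentile(values, labels, q=90):
            # Return {object id: q-th percentile of that object's pixel values}.
            # values: per-pixel measurements (e.g. Hue); labels: pixel object ids.
            values = np.asarray(values, dtype=float)
            labels = np.asarray(labels)
            return {int(obj): float(np.percentile(values[labels == obj], q))
                    for obj in np.unique(labels)}

        # Hypothetical pixels: three in object 1, two in object 2.
        hue = [10.0, 12.0, 30.0, 200.0, 220.0]
        obj = [1, 1, 1, 2, 2]
        print(per_object_percentile(hue, obj, q=90))   # {1: 26.4, 2: 218.0}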

  11. Energy statistics manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    Detailed, complete, timely and reliable statistics are essential to monitor the energy situation at a country level as well as at an international level. Energy statistics on supply, trade, stocks, transformation and demand are indeed the basis for any sound energy policy decision. For instance, the market for oil -- the largest traded commodity worldwide -- needs to be closely monitored in order for all market players to know at any time what is produced, traded, stocked and consumed, and by whom. In view of the role and importance of energy in world development, one would expect basic energy information to be readily available and reliable. This is not always the case, and one can even observe a decline in the quality, coverage and timeliness of energy statistics over the last few years.

  12. Statistical inferences in phylogeography

    DEFF Research Database (Denmark)

    Nielsen, Rasmus; Beaumont, Mark A

    2009-01-01

    can randomly lead to multiple different genealogies. Likewise, the same gene trees can arise under different demographic models. This problem has led to the emergence of many statistical methods for making phylogeographic inferences. A popular phylogeographic approach based on nested clade analysis is challenged by the fact that a certain amount of the interpretation of the data is left to the subjective choices of the user, and it has been argued that the method performs poorly in simulation studies. More rigorous statistical methods based on coalescence theory have been developed. However, these methods may also be challenged by computational problems or poor model choice. In this review, we describe the development of statistical methods in phylogeographic analysis, and discuss some of the challenges facing these methods.

  13. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the "out-of-control" state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling's T2. For high-dimensional data with excessive ... in conjunction with image data are plagued with various challenges beyond the usual ones encountered in current applications. In this presentation we will introduce the basic ideas of SPC and the multivariate control charts commonly used in industry. We will further discuss the challenges the practitioners...
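
    For readers unfamiliar with the statistic, a minimal T2-chart sketch follows (illustrative only: the data are simulated, and the F-based control limit shown is one common textbook choice rather than a full Phase I/Phase II treatment).

        import numpy as np
        from scipy import stats

        def hotelling_t2(X, alpha=0.0027):
            # T^2 of each observation against the sample mean and covariance,
            # plus an F-based upper control limit (one common textbook form).
            n, p = X.shape
            d = X - X.mean(axis=0)
            S_inv = np.linalg.inv(np.cov(X, rowvar=False))
            t2 = np.einsum('ij,jk,ik->i', d, S_inv, d)   # quadratic form per row
            f = stats.f.ppf(1 - alpha, p, n - p)
            ucl = p * (n - 1) * (n + 1) / (n * (n - p)) * f
            return t2, ucl

        rng = np.random.default_rng(1)
        X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=100)
        t2, ucl = hotelling_t2(X)
        print(f"{(t2 > ucl).sum()} of {len(t2)} points beyond UCL = {ucl:.2f}")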

  14. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  15. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  16. READING STATISTICS AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Reviewed by Yavuz Akbulut

    2008-10-01

    The book demonstrates the best and most conservative ways to decipher and critique research reports, particularly for social science researchers. In addition, new editions of the book are always better organized, effectively structured and meticulously updated in line with developments in the field of research statistics. Even the most trivial issues are revisited and updated in new editions. For instance, purchasers of the previous editions might check the interpretation of skewness and kurtosis indices in the third edition (p. 34) and in the fifth edition (p. 29) to see how the author revisits every single detail. Theory and practice always go hand in hand in all editions of the book. Re-reading previous editions (e.g. the third edition) before reading the fifth edition gives the impression that the author never stops improving his instructional text writing methods. In brief, "Reading Statistics and Research" is among the best sources showing research consumers how to understand and critically assess the statistical information and research results contained in technical research reports. In this respect, the review written by Mirko Savić in Panoeconomicus (2008, 2, pp. 249-252) will help readers get a more detailed overview of each chapter. I cordially urge beginning researchers to pick up a highlighter and conduct a detailed reading of the book. A thorough reading of the source will make researchers quite selective in appreciating the harmony between the data analysis, results and discussion sections of typical journal articles. If interested, beginning researchers might begin with this book to grasp the basics of research statistics, and prop up their critical research reading skills with some statistics package applications through the help of Dr. Andy Field's book, Discovering Statistics Using SPSS (second edition, published by Sage in 2005).

  17. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads/Fogtman; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary-dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...
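
    Because the parameter space is linear, ordinary PCA applies directly; the generic sketch below (simulated parameter vectors, not the authors' data or code) builds the mode basis of such a statistical deformation model.

        import numpy as np

        # Hypothetical training set: one row per registered subject, each row
        # the parameter vector of its deformation field (the linear space above).
        rng = np.random.default_rng(0)
        params = rng.normal(size=(30, 200))        # 30 subjects, 200 parameters

        mean = params.mean(axis=0)
        U, s, Vt = np.linalg.svd(params - mean, full_matrices=False)

        # Rows of Vt are the deformation modes; a new deformation is synthesized
        # as mean + sum_k b_k * Vt[k]. Keep modes covering ~95% of the variance.
        var = s ** 2 / (len(params) - 1)
        k = int(np.searchsorted(np.cumsum(var) / var.sum(), 0.95)) + 1
        print(f"{k} modes cover 95% of the deformation variance")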

  18. Statistics As Principled Argument

    CERN Document Server

    Abelson, Robert P

    2012-01-01

    In this illuminating volume, Robert P. Abelson delves into the too-often dismissed problems of interpreting quantitative data and then presenting them in the context of a coherent story about one's research. Unlike too many books on statistics, this is a remarkably engaging read, filled with fascinating real-life (and real-research) examples rather than with recipes for analysis. It will be of true interest and lasting value to beginning graduate students and seasoned researchers alike. The focus of the book is that the purpose of statistics is to organize a useful argument from quantitative

  19. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)
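
    The report itself is only summarized above, but the standard Poisson counting formulas it refers to can be illustrated; the measurement numbers below are invented.

        import math

        # Poisson counting: the standard deviation of N recorded counts is sqrt(N).
        gross_counts, t_gross = 10000, 600.0   # sample measurement (counts, s)
        bkg_counts, t_bkg = 900, 600.0         # background measurement (counts, s)

        net_rate = gross_counts / t_gross - bkg_counts / t_bkg
        # Variances of the independent counts add after scaling by 1/t**2.
        sigma_net = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)

        print(f"net rate = {net_rate:.3f} +/- {sigma_net:.3f} counts/s")
        # -> net rate = 15.167 +/- 0.174 counts/s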

  20. Statistics for business

    CERN Document Server

    Waller, Derek L

    2008-01-01

    Statistical analysis is essential to business decision-making and management, but the underlying theory of data collection, organization and analysis is one of the most challenging topics for business students and practitioners. This user-friendly text and CD-ROM package will help you to develop strong skills in presenting and interpreting statistical information in a business or management environment. Based entirely on using Microsoft Excel rather than more complicated applications, it includes a clear guide to using Excel with the key functions employed in the book, a glossary of terms and