International Nuclear Information System (INIS)
Hernandez M, B.
1997-01-01
The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over an annual cycle (1990), based on their concentrations as obtained with the PIXE technique; to identify suitable statistical methods for treating the metal concentration data, measured as total suspended particles (PST), obtained in this investigation; to relate the concentrations to the meteorological parameters considered, so as to suggest possible pollution sources; and, on the basis of the results obtained, to serve as input for the decision making and control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)
Applying contemporary statistical techniques
Wilcox, Rand R
2003-01-01
Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advanc...
Energy Technology Data Exchange (ETDEWEB)
Hernandez M, B [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)]
1997-07-01
The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over an annual cycle (1990), based on their concentrations as obtained with the PIXE technique; to identify suitable statistical methods for treating the metal concentration data, measured as total suspended particles (PST), obtained in this investigation; to relate the concentrations to the meteorological parameters considered, so as to suggest possible pollution sources; and, on the basis of the results obtained, to serve as input for the decision making and control measures planned by the various institutions concerned with atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)
Statistical Techniques for Project Control
Badiru, Adedeji B
2012-01-01
A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to assure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management then explores how to temper quantitative analysis with qualitati
Projection operator techniques in nonequilibrium statistical mechanics
International Nuclear Information System (INIS)
Grabert, H.
1982-01-01
This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics, the Fokker-Planck and master equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)
Applicability of statistical process control techniques
Schippers, W.A.J.
1998-01-01
This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some
Statistical and Computational Techniques in Manufacturing
2012-01-01
In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the large number of parameters involved, conventional approaches are no longer sufficient. Statistical and computational techniques have therefore found several applications in manufacturing, namely modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final-year undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...
Statistical evaluation of vibration analysis techniques
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
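The detection-performance quantification described above (probability of detection versus probability of false alarm) can be sketched as follows; this is an editorial illustration with made-up function names and scores, not code from the paper:

```python
def detection_rates(healthy, faulty, threshold):
    """Return (P_false_alarm, P_detection) for a detector that flags any
    score above `threshold`, given scores from healthy and faulty machinery."""
    p_fa = sum(s > threshold for s in healthy) / len(healthy)
    p_d = sum(s > threshold for s in faulty) / len(faulty)
    return p_fa, p_d

def roc_curve(healthy, faulty):
    """Sweep the threshold over all observed scores to trace POD vs PFA."""
    thresholds = sorted(set(healthy) | set(faulty))
    return [detection_rates(healthy, faulty, t) for t in thresholds]
```

For example, with healthy scores [1, 2, 3, 4] and faulty scores [3, 4, 5, 6], a threshold of 3.5 yields a false-alarm probability of 0.25 and a detection probability of 0.75.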
Probability, statistics, and associated computing techniques
International Nuclear Information System (INIS)
James, F.
1983-01-01
This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets
A survey of statistical downscaling techniques
Energy Technology Data Exchange (ETDEWEB)
Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik]
1997-12-31
The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods, and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to a simple downscaling analog method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models, by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
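The analog method mentioned above admits a very small sketch: for each day to be downscaled, find the most similar large-scale circulation state in a historical library and reuse its observed local value. This is an editorial illustration (variable names and the Euclidean distance choice are assumptions, not taken from the paper):

```python
import math

def analog_downscale(pattern, library):
    """library: list of (large_scale_vector, local_observation) pairs.
    Return the local observation attached to the closest historical
    large-scale state (nearest neighbor in Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(library, key=lambda entry: dist(entry[0], pattern))
    return best[1]
```

In practice the large-scale vectors are usually leading EOF coefficients of the circulation field rather than raw grid-point values.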
Review of the Statistical Techniques in Medical Sciences | Okeh ...
African Journals Online (AJOL)
... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp.
Testing of statistical techniques used in SYVAC
International Nuclear Information System (INIS)
Dalrymple, G.; Edwards, H.; Prust, J.
1984-01-01
Analysis of the SYVAC (SYstems Variability Analysis Code) output used four techniques to provide a cross-comparison of their performance: examination of scatter plots; correlation/regression; the Kruskal-Wallis one-way analysis of variance by ranks; and comparison of cumulative distribution functions (CDFs) and risk estimates between sub-ranges of parameter values. The analysis was conducted for the case of a single nuclide chain and was based mainly on simulated dose after 500,000 years. The results from this single SYVAC case showed that site parameters had the greatest influence on dose to man. The techniques of correlation/regression and Kruskal-Wallis were both successful and consistent in their identification of important parameters; both ranked the eight most important parameters in the same order when analysed for maximum dose. The results from a comparison of CDFs and risks in sub-ranges of the parameter values were not entirely consistent with the other techniques. Further sampling of the high-dose region is recommended in order to improve the accuracy of this method. (author)
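The Kruskal-Wallis statistic used in the comparison above is simple enough to sketch; this is a generic textbook implementation (midranks for ties, no tie correction), not the SYVAC analysis code:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic over two or more samples.
    Ranks the pooled data (midranks for ties) and measures how far the
    per-group rank sums deviate from what equal distributions would give."""
    pooled = sorted(v for g in groups for v in g)
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2   # average of 1-based ranks i+1..j
        i = j
    n = len(pooled)
    return 12 / (n * (n + 1)) * sum(
        sum(rank[v] for v in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
```

Under the null hypothesis H is approximately chi-squared with (number of groups - 1) degrees of freedom, which is how the resulting value would be turned into a significance level.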
Time series prediction: statistical and neural techniques
Zahirniak, Daniel R.; DeSimio, Martin P.
1996-03-01
In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near-linear processes, while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to develop, they should be tried before the nonlinear systems when the linearity properties of the time series process are unknown.
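Of the linear filters named above, the Widrow-Hoff (LMS) adaptive filter is the easiest to sketch; the function below is an editorial illustration of one-step-ahead prediction, with assumed parameter names, not the authors' implementation:

```python
def lms_predict(series, order=2, mu=0.01):
    """One-step-ahead prediction with a Widrow-Hoff (LMS) adaptive FIR filter.
    Returns the list of predictions for series[order:]."""
    w = [0.0] * order
    preds = []
    for t in range(order, len(series)):
        x = series[t - order:t]                 # most recent `order` samples
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        preds.append(y_hat)
        e = series[t] - y_hat                   # prediction error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # LMS weight update
    return preds
```

The step size mu trades convergence speed against steady-state error, which is one reason performance depends so strongly on the series being predicted.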
Predicting radiotherapy outcomes using statistical learning techniques
International Nuclear Information System (INIS)
El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O; Lindsay, Patricia E; Hope, Andrew J
2009-01-01
Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for generalizability validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model...
"Statistical Techniques for Particle Physics" (2/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (1/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (4/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
"Statistical Techniques for Particle Physics" (3/4)
CERN. Geneva
2009-01-01
This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
Statistical Theory of the Vector Random Decrement Technique
DEFF Research Database (Denmark)
Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.
1999-01-01
decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic...
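The Random Decrement idea behind the record above can be illustrated in its simplest scalar form: average every segment of the response that starts at an upward crossing of a trigger level; under broad assumptions that average estimates the free decay. This is an editorial sketch of the classical scalar technique, not the vector variant the paper extends:

```python
def random_decrement(x, trigger, length):
    """Average the `length`-sample segments of x that begin where x makes an
    upward crossing of `trigger` (level-crossing triggering condition)."""
    starts = [i for i in range(1, len(x) - length)
              if x[i - 1] < trigger <= x[i]]
    return [sum(x[i + k] for i in starts) / len(starts) for k in range(length)]
```

Other triggering conditions (positive-point, zero-crossing-with-positive-slope) lead to the same averaging scheme with a different set of start indices.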
Statistic techniques of process control for MTR type
International Nuclear Information System (INIS)
Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.
2002-01-01
This work aims at introducing improvements in the fabrication of MTR-type fuel plates by applying statistical process control techniques. The work was divided into four steps and their data were analyzed: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and application of statistical tools and standard specifications to perform a comparative study of these processes. (author)
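The core of statistical process control for a fabrication step like this is a control chart; below is a minimal X-bar chart sketch with 3-sigma trial limits (illustrative numbers and function names, not the plate-fabrication data):

```python
def xbar_limits(subgroup_means, sigma_within, n):
    """3-sigma trial control limits for an X-bar chart with subgroup size n,
    given the within-subgroup standard deviation of individual measurements."""
    center = sum(subgroup_means) / len(subgroup_means)
    margin = 3 * sigma_within / n ** 0.5
    return center - margin, center, center + margin

def out_of_control(subgroup_means, sigma_within, n):
    """Indices of subgroups whose mean falls outside the control limits."""
    lo, _, hi = xbar_limits(subgroup_means, sigma_within, n)
    return [i for i, m in enumerate(subgroup_means) if m < lo or m > hi]
```

In a phase-I study the flagged subgroups would be investigated and, if assignable causes are found, removed before the limits are recomputed.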
The statistical chopper in the time-of-flight technique
International Nuclear Information System (INIS)
Albuquerque Vieira, J. de.
1975-12-01
A detailed study of the 'statistical' chopper and of the method of analysis of the data obtained by this technique is made. The study includes the basic ideas behind correlation methods applied in time-of-flight techniques; comparisons with the conventional chopper made by an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; the presentation of the related fields of work to demonstrate the potential of this method and suggestions for future study together with the criteria for a time-of-flight experiment using the method being studied [pt
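The correlation method behind the statistical chopper can be sketched in idealized, noise-free form: the detector counts are the circular convolution of the spectrum with the pseudo-random chopper sequence, and cross-correlating the counts with the ±1 version of the sequence recovers the spectrum. This is an editorial sketch using a length-7 maximum-length sequence, not the thesis' FORTRAN analysis:

```python
def chop_counts(spectrum, chop):
    """Counts behind the chopper: circular convolution of the time-of-flight
    spectrum with the 0/1 chopper transmission sequence."""
    n = len(chop)
    return [sum(chop[(i - j) % n] * spectrum[j] for j in range(n))
            for i in range(n)]

def correlate_decode(counts, chop):
    """Recover the spectrum by circular cross-correlation with 2*chop - 1;
    for a maximum-length sequence the off-peak correlation vanishes."""
    n, ones = len(chop), sum(chop)
    return [sum((2 * chop[(i - k) % n] - 1) * counts[i] for i in range(n)) / ones
            for k in range(n)]
```

The statistical-error advantage over a conventional chopper comes from the ~50% duty cycle of the pseudo-random sequence versus the single open slot of a conventional one.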
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because of the severe environmental degradation caused by increasing concentrations of harmful air pollutants in the ambient atmosphere. In this study, different statistical and artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods have limited accuracy: they are unable to predict the extreme points, i.e., the pollution maxima and minima cannot be determined using such approaches. As an alternative to these traditional methods, statistical techniques can be coupled with artificial intelligence (AI) for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model show better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over urban areas.
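The four agreement measures named above (R, NMSE, FB, IOA) have standard textbook definitions and can be computed as follows; the IOA here is Willmott's form with deviations about the observed mean (an editorial sketch, not the authors' evaluation code):

```python
import math

def skill_scores(obs, pred):
    """Correlation R, normalized mean square error NMSE, fractional bias FB,
    and index of agreement IOA for paired observed/predicted series."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred)) / n
    so = math.sqrt(sum((o - mo) ** 2 for o in obs) / n)
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred) / n)
    r = cov / (so * sp)
    nmse = sum((o - p) ** 2 for o, p in zip(obs, pred)) / (n * mo * mp)
    fb = 2 * (mo - mp) / (mo + mp)
    ioa = 1 - sum((o - p) ** 2 for o, p in zip(obs, pred)) / sum(
        (abs(p - mo) + abs(o - mo)) ** 2 for o, p in zip(obs, pred))
    return r, nmse, fb, ioa
```

A perfect forecast gives R = 1, NMSE = 0, FB = 0 and IOA = 1; a negative FB indicates overprediction on average.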
Lightweight and Statistical Techniques for Petascale Debugging
Energy Technology Data Exchange (ETDEWEB)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger, or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which had already demonstrated scalability to over one hundred thousand MPI tasks. We also extended the statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems were purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis
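The statistical error-isolation idea from CBI mentioned above ranks instrumented predicates by how much the failure probability rises when a predicate is actually true, relative to runs in which it was merely observed (reached and evaluated). A sketch of that score, with assumed argument names:

```python
def increase_score(f_true, s_true, f_obs, s_obs):
    """Cooperative Bug Isolation style score for a predicate P.
    f_true/s_true: failing/successful runs in which P was true;
    f_obs/s_obs:   failing/successful runs in which P was observed at all."""
    failure = f_true / (f_true + s_true)    # P(fail | P true)
    context = f_obs / (f_obs + s_obs)       # P(fail | P observed)
    return failure - context                # positive => P predicts failure
```

Predicates with the largest positive scores are reported as likely bug locations; scores near zero indicate predicates whose truth is uninformative about failure.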
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develop a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
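As an illustration of "modeling correlations among neighboring observations", the standard spatial autocorrelation statistic is Moran's I; this generic sketch is editorial and is not the authors' econometric model:

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation. weights[i][j] is the spatial weight
    between observations i and j (0 on the diagonal); +1 indicates strong
    positive spatial clustering, values near -1/(n-1) indicate no pattern."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)
```

A significantly positive Moran's I on nematode counts is what would justify site-specific (rather than whole-field) nematicide application.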
Categorical and nonparametric data analysis choosing the best statistical technique
Nussbaum, E Michael
2014-01-01
Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain
The application of statistical techniques to nuclear materials accountancy
International Nuclear Information System (INIS)
Annibal, P.S.; Roberts, P.D.
1990-02-01
Over the past decade much theoretical research has been carried out on the development of statistical methods for nuclear materials accountancy. In practice plant operation may differ substantially from the idealized models often cited. This paper demonstrates the importance of taking account of plant operation in applying the statistical techniques, to improve the accuracy of the estimates and the knowledge of the errors. The benefits are quantified either by theoretical calculation or by simulation. Two different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an accountancy tank is investigated. Secondly, a means of improving the knowledge of the 'Material Unaccounted For' (the difference between the inventory calculated from input/output data, and the measured inventory), using information about the plant measurement system, is developed and compared with existing general techniques. (author)
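The 'Material Unaccounted For' balance discussed above, together with its measurement uncertainty under an independent-errors assumption, can be sketched as follows (editorial illustration with assumed data structures, not the plant model of the paper):

```python
import math

def muf_and_sigma(begin_inv, inputs, outputs, end_inv):
    """Material Unaccounted For and its 1-sigma uncertainty, assuming
    independent measurement errors. Each argument is a list of
    (measured_mass, standard_deviation) pairs."""
    def total(xs):
        return sum(m for m, _ in xs)
    def var(xs):
        return sum(s * s for _, s in xs)
    muf = total(begin_inv) + total(inputs) - total(outputs) - total(end_inv)
    sigma = math.sqrt(var(begin_inv) + var(inputs) + var(outputs) + var(end_inv))
    return muf, sigma
```

A MUF several sigma from zero warrants investigation; the paper's point is that correlated errors and actual plant operation modify this idealized variance.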
Combining heuristic and statistical techniques in landslide hazard assessments
Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni
2014-05-01
As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
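The landslide index method described above weights each class of a susceptibility factor by the log of its landslide density relative to the overall density; a minimal sketch (assumed function and argument names):

```python
import math

def landslide_index_weights(class_areas, class_slides):
    """Weight per class: log of the landslide density inside the class
    relative to the overall landslide density over the whole map."""
    total_area = sum(class_areas)
    total_slides = sum(class_slides)
    overall = total_slides / total_area
    return [math.log((s / a) / overall) if s > 0 else float("-inf")
            for a, s in zip(class_areas, class_slides)]
```

Positive weights mark classes where landslides are over-represented; the weights-of-evidence method extends this by also scoring the absence of landslides outside each class.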
Line identification studies using traditional techniques and wavelength coincidence statistics
International Nuclear Information System (INIS)
Cowley, C.R.; Adelman, S.J.
1990-01-01
Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results are to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum
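A minimal sketch of the WCS idea: count how many observed lines fall within a tolerance of the wavelengths predicted for a species, then estimate by Monte Carlo how often randomly placed lines would do as well. This is an editorial illustration of the statistical logic, not the authors' exact procedure:

```python
import random

def coincidences(observed, predicted, tol):
    """Number of observed lines within `tol` of some predicted wavelength."""
    return sum(any(abs(o - p) <= tol for p in predicted) for o in observed)

def wcs_p_value(observed, predicted, tol, span, trials=2000, seed=1):
    """Monte Carlo significance: fraction of trials in which len(observed)
    random lines uniform on (0, span) match `predicted` at least as well."""
    rng = random.Random(seed)
    hits = coincidences(observed, predicted, tol)
    n = len(observed)
    worse = sum(
        coincidences([rng.uniform(0, span) for _ in range(n)], predicted, tol) >= hits
        for _ in range(trials))
    return worse / trials
```

The "predictable number of spurious results" noted in the abstract corresponds to the nonzero expected coincidence count under this random null.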
Statistical techniques to extract information during SMAP soil moisture assimilation
Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.
2017-12-01
Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
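As a toy illustration of a statistical retrieval calibrated against a model climatology, the following sketch fits a tiny one-hidden-layer network to a synthetic brightness-temperature/soil-moisture relation; the relation, network size and learning rate are all assumptions, not the actual SMAP Level 4 configuration:

```python
import math
import random

random.seed(7)

# Synthetic "model climatology" relation: drier soil -> warmer brightness
# temperature (K). Illustrative only.
def truth(tb):
    return 0.5 - 0.002 * (tb - 200.0)

train = [(tb, truth(tb)) for tb in (random.uniform(200, 300) for _ in range(400))]

H = 4                                            # hidden tanh units
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(tb):
    x = (tb - 250.0) / 50.0                      # normalise input
    h = [math.tanh(w1[i] * x + b1[i]) for i in range(H)]
    return sum(w2[i] * h[i] for i in range(H)) + b2, h, x

lr = 0.05
for _ in range(300):                             # plain SGD on squared error
    for tb, y in train:
        yhat, h, x = forward(tb)
        err = yhat - y
        for i in range(H):
            grad_h = err * w2[i] * (1.0 - h[i] ** 2)
            w2[i] -= lr * err * h[i]
            b1[i] -= lr * grad_h
            w1[i] -= lr * grad_h * x
        b2 -= lr * err

rmse = math.sqrt(sum((forward(tb)[0] - y) ** 2 for tb, y in train) / len(train))
print(f"training RMSE: {rmse:.4f} m3/m3")
```

Because the network is trained against model-derived targets, its outputs inherit the model climatology, which is the property that reduces the need for bias correction before assimilation.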
Statistical optimisation techniques in fatigue signal editing problem
International Nuclear Information System (INIS)
Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.
2015-01-01
Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small-amplitude cycles from the recorded signal. A long recorded signal sometimes renders the cycle-to-cycle editing process daunting, which has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy is considered. In the second section, the fatigue data editing problem is formulated as a single constrained optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
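The constrained segment-selection step can be caricatured in a few lines of Python. Segment lengths, damage values and the single damage-retention constraint below are invented (the paper also constrains root mean square and kurtosis), so this is a sketch of the GA machinery only:

```python
import random

random.seed(0)

# Hypothetical labelled segments: (length in seconds, fatigue damage).
segments = [(random.uniform(1, 10), random.uniform(0.0, 1.0)) for _ in range(30)]
TOTAL_LEN = sum(l for l, _ in segments)
TOTAL_DMG = sum(d for _, d in segments)

def fitness(mask):
    """Minimise retained length, penalising loss of >5% cumulative damage."""
    length = sum(l for (l, _), keep in zip(segments, mask) if keep)
    damage = sum(d for (_, d), keep in zip(segments, mask) if keep)
    return length + 1000.0 * max(0.0, 0.95 * TOTAL_DMG - damage)

def evolve(pop_size=60, gens=150, p_mut=0.02):
    pop = [[True] * len(segments)]                     # keep-all seed is feasible
    pop += [[random.random() < 0.9 for _ in segments] for _ in range(pop_size - 1)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]                   # elitism: best half survives
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(len(segments))      # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([not g if random.random() < p_mut else g
                             for g in child])
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print(f"kept {sum(best)}/{len(best)} segments, "
      f"fitness {fitness(best):.1f} (total length {TOTAL_LEN:.1f} s)")
```

Elitism plus the feasible keep-all seed guarantees the result is never worse than retaining the whole signal; the penalty term steers the search toward short selections that still retain the required damage.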
Statistical optimisation techniques in fatigue signal editing problem
Energy Technology Data Exchange (ETDEWEB)
Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)
2015-02-03
Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small-amplitude cycles from the recorded signal. A long recorded signal sometimes renders the cycle-to-cycle editing process daunting, which has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy is considered. In the second section, the fatigue data editing problem is formulated as a single constrained optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
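As a flavour of the exploratory procedures reviewed, here is a minimal principal component analysis of a synthetic community table: two sample groups with shifted taxon abundances, invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy community table: 20 samples x 6 taxa, two groups of samples with shifted
# mean abundances (synthetic data for illustration, not from the review).
group_a = rng.normal(loc=[5, 4, 3, 1, 1, 1], scale=0.5, size=(10, 6))
group_b = rng.normal(loc=[1, 1, 1, 3, 4, 5], scale=0.5, size=(10, 6))
X = np.vstack([group_a, group_b])

# PCA by eigen-decomposition of the covariance matrix of the centred data.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]                # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

scores = Xc @ eigvecs                            # sample coordinates on the PCs
explained = eigvals / eigvals.sum()
print(f"PC1 explains {explained[0]:.0%} of the variance")
# The two groups separate along PC1 (opposite-sign scores).
```

With a clear group structure, the first component absorbs most of the variance and an ordination plot of the scores separates the communities, which is the typical use of PCA in these studies.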
Statistical methods of evaluating and comparing imaging techniques
International Nuclear Information System (INIS)
Freedman, L.S.
1987-01-01
Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, a need has developed to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted, and many reports have appeared in the literature. These studies cover a range of clinical situations, and likewise the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested.
Statistical precision of delayed-neutron nondestructive assay techniques
International Nuclear Information System (INIS)
Bayne, C.K.; McNeany, S.R.
1979-02-01
A theoretical analysis of the statistical precision of delayed-neutron nondestructive assay instruments is presented. Such instruments measure the fissile content of nuclear fuel samples by neutron irradiation and delayed-neutron detection. The precision of these techniques is limited by the statistical nature of the nuclear decay process, but the precision can be optimized by proper selection of system operating parameters. Our method is a three-part analysis. We first present differential-difference equations describing the fundamental physics of the measurements. We then derive and present complete analytical solutions to these equations. The final equations governing the expected number and variance of delayed-neutron counts were computer programmed to calculate the relative statistical precision of specific system operating parameters. Our results show that Poisson statistics do not govern the number of counts accumulated in multiple irradiation-count cycles and that, in general, maximum count precision does not correspond with maximum count as first expected. Covariance between the counts of individual cycles must be considered in determining the optimum number of irradiation-count cycles and the optimum irradiation-to-count time ratio. For the assay system in use at ORNL, covariance effects are small, but for systems with short irradiation-to-count transition times, covariance effects force the optimum number of irradiation-count cycles to be half those giving maximum count. We conclude that the equations governing the expected value and variance of delayed-neutron counts have been derived in closed form. These have been computerized and can be used to select optimum operating parameters for delayed-neutron assay devices.
Studies on coal flotation in flotation column using statistical technique
Energy Technology Data Exchange (ETDEWEB)
M.S. Jena; S.K. Biswal; K.K. Rao; P.S.R. Reddy [Institute of Minerals & Materials Technology (IMMT), Orissa (India)
2009-07-01
Flotation of Indian high-ash coking coal fines to obtain clean coal has been reported earlier by many authors. Here an attempt has been made to systematically analyse the factors influencing the flotation process using the statistical design-of-experiments technique. Studies carried out in a 100 mm diameter column using a factorial design to establish the weightage of factors such as feed rate, air rate and collector dosage indicated that all three parameters have an equal influence on the flotation process. Subsequently, an RSM-CCD design was used to obtain the best result, and it was observed that 94% of combustibles can be recovered with 82.5% weight recovery at 21.4% ash from a feed containing 31.3% ash.
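In a two-level factorial design, the weightage of each factor reduces to a simple contrast: the mean response at the high level minus the mean at the low level. The coded runs and recovery values below are made up for illustration, not the study's measurements:

```python
from itertools import product

# Illustrative 2^3 full factorial: coded levels (-1, +1) for feed rate, air
# rate and collector dosage; the recovery values are invented.
runs = list(product([-1, 1], repeat=3))          # 8 treatment combinations
yields = [69, 73, 74, 77, 73, 78, 78, 82]        # combustible recovery (%), same order

def main_effect(factor):
    """Mean response at the +1 level minus mean response at the -1 level."""
    hi = [y for r, y in zip(runs, yields) if r[factor] == 1]
    lo = [y for r, y in zip(runs, yields) if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [round(main_effect(i), 2) for i in range(3)]
print(effects)  # comparable magnitudes -> comparable influence of the factors
```

Effects of similar magnitude across the three factors correspond to the "equal influence" conclusion; an RSM-CCD stage would then fit a quadratic surface around the promising region.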
Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques
Gulgundi, Mohammad Shahid; Shetty, Amba
2018-03-01
Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% of seasonal assignations of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% of spatial assignations of cases. Principal component analysis was applied to the data sets obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by the dissolution of minerals through rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in the water.
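The water quality index step can be illustrated with a weighted-arithmetic sketch; the parameters, drinking-water standards and weights below are common textbook choices, not the values used in the study:

```python
# Illustrative weighted-arithmetic water quality index. Standards and weights
# are assumptions for the sketch, not the study's values.
standards = {"pH": 8.5, "TDS": 500.0, "NO3": 45.0, "Cl": 250.0}  # mg/L except pH

unit = {p: 1.0 / s for p, s in standards.items()}                 # inverse weights
W = {p: u / sum(unit.values()) for p, u in unit.items()}          # normalised

def wqi(sample):
    """Weighted sum of sub-indices (100 * measured / standard, simplified)."""
    return sum(W[p] * 100.0 * sample[p] / standards[p] for p in standards)

pre = {"pH": 7.4, "TDS": 420.0, "NO3": 30.0, "Cl": 180.0}   # pre-monsoon sample
post = {"pH": 7.6, "TDS": 610.0, "NO3": 55.0, "Cl": 260.0}  # post-monsoon sample
print(round(wqi(pre), 1), round(wqi(post), 1))  # higher index = worse quality
```

With concentrations of several parameters exceeding their standards after the monsoon, the post-monsoon index comes out higher, mirroring the seasonal degradation the study reports.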
Statistical and particle physics: Common problems and techniques
International Nuclear Information System (INIS)
Bowler, K.C.; Mc Kane, A.J.
1984-01-01
These proceedings contain statistical mechanical studies in condensed matter physics; interfacial problems in statistical physics; string theory; general Monte Carlo methods and their application to lattice gauge theories; topological excitations in field theory; phase transformation kinetics; and studies of chaotic systems.
TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION
Directory of Open Access Journals (Sweden)
А. А. Vershinina
2014-01-01
Full Text Available The article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. A definition of the statistical analysis technique is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.
An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques
2018-01-09
Statistical analysis is the mathematical science ... quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques that are mainly used for alarm ... Balakrishnan N, editors. Handbook of Statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555-602 (order statistics and their applications).
Statistical techniques for sampling and monitoring natural resources
Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado
2004-01-01
We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical
Statistical sampling techniques as applied to OSE inspections
International Nuclear Information System (INIS)
Davis, J.J.; Cote, R.W.
1987-01-01
The need has been recognized for statistically valid methods for gathering information during OSE inspections, and for the interpretation of results from performance testing, records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data-gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
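A standard computation behind the sample-size requirements mentioned above can be sketched as follows (generic attribute-sampling formula at 95% confidence; the numbers are not OSE-specific):

```python
import math

# Sample size for estimating a compliance proportion at a given confidence and
# margin of error, with finite-population correction. z = 1.96 corresponds to
# 95% confidence; p = 0.5 is the conservative (worst-case) assumption.
def sample_size(population, margin=0.05, p=0.5, z=1.96):
    n0 = z ** 2 * p * (1 - p) / margin ** 2     # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))

for N in (100, 1000, 10000):
    print(N, sample_size(N))  # smaller populations need a larger sampled fraction
```

The finite-population correction matters for small document sets: a population of 100 items requires sampling most of them, whereas 10,000 items need only a few hundred.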
GIS-Based bivariate statistical techniques for groundwater potential ...
Indian Academy of Sciences (India)
24
This study shows the potency of two GIS-based data-driven bivariate techniques, namely ... In view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-...
Territories typification technique with use of statistical models
Galkin, V. I.; Rastegaev, A. V.; Seredin, V. V.; Andrianov, A. V.
2018-05-01
Territories typification is required for the solution of many problems. The results of geological zoning obtained by various methods do not always agree. The main goal of this research is therefore to develop a technique for obtaining a multidimensional standard classified indicator for geological zoning. A probabilistic approach was used in the course of the research. To increase the reliability of the classification of geological information, the authors suggest using the complex multidimensional probabilistic indicator P_K as a criterion of the classification. The second criterion chosen is the multidimensional standard classified indicator Z. Both can serve as characteristics of classification in geological-engineering zoning. The indicators P_K and Z are in good correlation: correlation coefficient values for the entire territory, regardless of structural solidity, equal r = 0.95, so each indicator can be used in geological-engineering zoning. The suggested method has been tested and a schematic zoning map has been drawn.
Application of Statistical Potential Techniques to Runaway Transport Studies
International Nuclear Information System (INIS)
Eguilior, S.; Castejon, F.; Parrondo, J. M.
2001-01-01
A method for computing the runaway production rate, based on techniques of noise-activated escape in a potential, is presented in this work. A generalised potential in 2D momentum space is obtained from the deterministic or drift terms of the Langevin equations. The diffusive or stochastic terms, which arise directly from the stochastic nature of collisions, play the role of the noise that activates barrier crossings. The runaway electron source is given by the escape rate in such a potential, which is obtained from an Arrhenius-like relation. Runaway electrons are those that skip the potential barrier due to the effect of stochastic collisions. In terms of computation time, this method allows one to quickly obtain the source term for a runaway electron transport code. (Author) 11 refs
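The Arrhenius-like relation for the escape rate has the generic Kramers form shown below; all parameter values are illustrative, not tokamak runaway parameters:

```python
import math

# Arrhenius-type (Kramers) escape rate for noise-activated barrier crossing in
# an overdamped 1-D potential. Every number here is illustrative.
def escape_rate(barrier, noise, omega_well=1.0, omega_barrier=1.0):
    """rate ~ (omega_well * omega_barrier / (2*pi)) * exp(-dU / D)."""
    return omega_well * omega_barrier / (2 * math.pi) * math.exp(-barrier / noise)

# The runaway source falls exponentially as the potential barrier grows
# relative to the collisional noise strength.
for dU in (1.0, 2.0, 4.0):
    print(dU, f"{escape_rate(dU, noise=0.5):.3e}")
```

The exponential sensitivity to the barrier-to-noise ratio is what makes the escape-rate formulation cheap: the source term follows from a single evaluation rather than from tracking many stochastic trajectories.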
Statistical techniques for the identification of reactor component structural vibrations
International Nuclear Information System (INIS)
Kemeny, L.G.
1975-01-01
The identification, on-line and in near real-time, of the vibration frequencies, modes and amplitudes of selected key reactor structural components, and the visual monitoring of these phenomena by nuclear power plant operating staff, will serve to further the safety and control philosophy of nuclear systems and lead to design optimisation. The School of Nuclear Engineering has developed a data acquisition system for vibration detection and identification. The system is interfaced with the HIFAR research reactor of the Australian Atomic Energy Commission. The reactor serves to simulate noise and vibrational phenomena which might be pertinent in power reactor situations. The data acquisition system consists of a small computer interfaced with a digital correlator and a Fourier transform unit. An incremental tape recorder is utilised as a backing store and as a means of communication with other computers. A small analogue computer and an analogue statistical analyzer can be used in the pre- and post-computational analysis of signals which are received from neutron and gamma detectors, thermocouples, accelerometers, hydrophones and strain gauges. Investigations carried out to date include a study of the role of local and global pressure fields due to turbulence in coolant flow and pump-impeller-induced perturbations on (a) control absorber, (b) fuel element and (c) coolant external circuit and core tank structure component vibrations. (Auth.)
Statistical mechanics of sensing and communications: Insights and techniques
International Nuclear Information System (INIS)
Murayama, T; Davis, P
2008-01-01
In this article we review a basic model for analysis of large sensor networks from the point of view of collective estimation under bandwidth constraints. We compare different sensing aggregation levels as alternative 'strategies' for collective estimation: moderate aggregation from a moderate number of sensors for which communication bandwidth is enough that data encoding can be reversible, and large scale aggregation from very many sensors - in which case communication bandwidth constraints require the use of nonreversible encoding. We show the non-trivial trade-off between sensing quality, which can be increased by increasing the number of sensors, and communication quality under bandwidth constraints, which decreases if the number of sensors is too large. From a practical standpoint, we verify that such a trade-off exists in constructively defined communications schemes. We introduce a probabilistic encoding scheme and define rate distortion models that are suitable for analysis of the large network limit. Our description shows that the methods and ideas from statistical physics can play an important role in formulating effective models for such schemes
Statistical classification techniques in high energy physics (SDDT algorithm)
International Nuclear Information System (INIS)
Bouř, Petr; Kůs, Václav; Franc, Jiří
2016-01-01
We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb⁻¹ recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)
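The divergence-driven variable selection can be sketched on toy data. The symmetric Kullback-Leibler divergence below stands in for whichever divergence measure is used, and the two variables and their distributions are invented:

```python
import math
import random

random.seed(3)

# Variable pre-selection by divergence: rank variables by how far apart their
# signal and background distributions are. All data are synthetic stand-ins.
def sym_kl(p, q, eps=1e-9):
    p = [x + eps for x in p]; q = [x + eps for x in q]
    sp, sq = sum(p), sum(q)
    p = [x / sp for x in p]; q = [x / sq for x in q]
    return sum(a * math.log(a / b) + b * math.log(b / a) for a, b in zip(p, q))

def hist(values, lo=-5.0, hi=5.0, bins=20):
    h = [0] * bins
    for v in values:
        h[min(bins - 1, max(0, int((v - lo) / (hi - lo) * bins)))] += 1
    return h

signal = {"mass": [random.gauss(1.5, 1.0) for _ in range(2000)],
          "angle": [random.gauss(0.0, 1.0) for _ in range(2000)]}
background = {"mass": [random.gauss(-1.5, 1.0) for _ in range(2000)],
              "angle": [random.gauss(0.0, 1.0) for _ in range(2000)]}

ranked = sorted(signal, key=lambda v: -sym_kl(hist(signal[v]), hist(background[v])))
print(ranked)  # the discriminating variable comes first
```

A variable whose signal and background distributions coincide contributes essentially zero divergence and is dropped, leaving the tree to split only on the informative ones.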
MacLean, Adam L.; Harrington, Heather A.; Stumpf, Michael P. H.; Byrne, Helen M.
2015-01-01
mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since
Multifactorial QT Interval Prolongation and Takotsubo Cardiomyopathy
Directory of Open Access Journals (Sweden)
Michael Gysel
2014-01-01
Full Text Available A 71-year-old woman collapsed while working as a grocery store cashier. CPR was performed and an AED revealed torsades de pointes (TdP. She was subsequently defibrillated resulting in restoration of sinus rhythm with a QTc interval of 544 msec. Further evaluation revealed a diagnosis of Takotsubo Cardiomyopathy (TCM contributing to the development of a multifactorial acquired long QT syndrome (LQTS. The case highlights the role of TCM as a cause of LQTS in the setting of multiple risk factors including old age, female gender, hypokalemia, and treatment with QT prolonging medications. It also highlights the multifactorial nature of acquired LQTS and lends support to growing evidence of an association with TCM.
Multifactorial analysis of fatigue scale among nurses in Poland
Directory of Open Access Journals (Sweden)
Kwiecień-Jaguś Katarzyna
2016-01-01
Full Text Available Significant progress in the field of nursing has contributed to the widening of the range of functions and professional duties of nurses. The increasingly frequent shortage of nursing personnel has a negative impact on the perception of work: it decreases the sense of professional satisfaction and increases the level of burden and fatigue. Methods. The study applied a non-experimental method, a descriptive comparative study without a control group. The data were collected using a Polish-language version of a Japanese questionnaire. To evaluate the level of physical fatigue, a pedometer was used. Results. 158 respondents out of a group of 160 were included in the statistical analysis. The study group was internally diversified. The research project assessed the usefulness of multifactorial analysis in evaluating the main components of nursing fatigue. Multifactorial analysis has shown that mental fatigue, together with changes in activeness and motivation, and physical fatigue are strongly correlated with age, professional experience and education. Conclusion. Nursing is a profession of a special character and mission. Regardless of the place of work, nursing staff should be given the possibility of pursuing their profession under conditions ensuring a sense of security and protecting them from harmful effects on health.
Hayslett, H T
1991-01-01
Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the
Statistical evaluation of recorded knowledge in nuclear and other instrumental analytical techniques
International Nuclear Information System (INIS)
Braun, T.
1987-01-01
The main points addressed in this study are the following: Statistical distribution patterns of published literature on instrumental analytical techniques 1981-1984; structure of scientific literature and heuristics for identifying active specialities and emerging hot spot research areas in instrumental analytical techniques; growth and growth rates of the literature in some of the identified hot research areas; quality and quantity in instrumental analytical research output. (orig.)
Directory of Open Access Journals (Sweden)
Yuan Wang
Full Text Available With the use of the iTRAQ technique, a multifactorial comparative proteomic study can be performed. In this study, to obtain an overview of ethanol, CYP2E1 and gender effects on liver injury and gain more insight into the underlying molecular mechanism, mouse liver proteomes were quantitatively analyzed using iTRAQ under eight conditions, including mice of different genders, wild type versus CYP2E1 knockout, and normal versus alcohol diet. A series of statistical and bioinformatic analyses were explored to simplify and clarify the multifactorial comparative proteomic data. First, with Principal Component Analysis, six proteins, CYP2E1, FAM25, CA3, BHMT, HIBADH and ECHS1, involved in oxidation-reduction, energy and lipid metabolism and amino acid metabolism, were identified as the most differentially expressed gene products across all of the experimental conditions of our chronic alcoholism model. Second, hierarchical clustering analysis showed that CYP2E1 knockout played a primary role in the overall differential protein expression compared with the ethanol and gender factors. Furthermore, pair-wise multiple comparisons revealed that the only significant expression difference lay between wild-type and CYP2E1-knockout mice, both treated with ethanol. Third, K-means clustering analysis indicated that CYP2E1 knockout had the reverse effect on ethanol-induced oxidative stress and lipid oxidation. More importantly, IPA analysis of the proteomic data inferred that two upstream regulators, NRF2 and PPARα, regulated by chronic alcohol feeding and CYP2E1 knockout, are involved in ethanol-induced oxidative stress and lipid oxidation. The present study provides an effective, comprehensive data analysis strategy for comparing multiple biological factors contributing to the biochemical effects of alcohol on the liver. The mass spectrometry proteomics data have been deposited to ProteomeXchange with the data set identifier PXD000635.
Directory of Open Access Journals (Sweden)
D.P. van der Nest
2015-03-01
Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.
Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.
Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis
DEFF Research Database (Denmark)
Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...... analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii......) to find and interpret hidden, complex and casual relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
ATHEROSCLEROSIS DISEASE: A MULTI-FACTORIAL PATHOLOGY
Directory of Open Access Journals (Sweden)
Marcieli da Luz Giroldo; Arienne Serrano Alves; Francielle Baptista
2007-06-01
Atherosclerosis, or arterial stiffening, is a gradual disease that restricts normal blood flow in different areas of the body and may lead to secondary illnesses such as myocardial infarction and cerebral stroke. Innumerable factors are related to the development of atherosclerosis, among them dyslipidemia, genetic factors, arterial hypertension, diabetes mellitus, obesity, smoking, lack of exercise, pulmonary infection by Chlamydia, and stress. Owing to the multifactorial character of atherosclerosis, numerous drugs with differentiated mechanisms of action are being developed for use in the prevention and control of this disease. However, beyond pharmacological therapy, a balanced diet, physical activity and the elimination of risk habits such as smoking are also needed to control the progression of atherosclerosis, as well as to increase life expectancy and quality of life.
International Nuclear Information System (INIS)
2005-01-01
For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees.
Curve fitting and modeling with splines using statistical variable selection techniques
Smith, P. L.
1982-01-01
The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
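The knot-elimination idea can be sketched compactly. This uses a truncated-power basis rather than the B-spline basis of the paper, and the data and candidate knots are invented; the true curve has a single kink at x = 0.5, so only that knot should survive:

```python
import numpy as np

# Synthetic data with one genuine slope change at x = 0.5
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 60)
y = np.where(x < 0.5, x, 1 - x) + rng.normal(0, 0.01, x.size)

def design(knots):
    # Intercept, linear term, and one truncated-power term per knot
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0) for k in knots]
    return np.column_stack(cols)

knots = [0.25, 0.5, 0.75]
while len(knots) > 1:
    base_rss = np.linalg.lstsq(design(knots), y, rcond=None)[1][0]
    rss = [np.linalg.lstsq(design(knots[:i] + knots[i + 1:]), y, rcond=None)[1][0]
           for i in range(len(knots))]
    i = int(np.argmin(rss))
    if rss[i] > 2.0 * base_rss:   # crude stopping rule: no redundant knot left
        break
    knots.pop(i)                  # drop the knot whose removal hurts least
```

The surviving knot list marks where the fitted spline genuinely needs a break; a production version would use a proper F-test or information criterion as the stopping rule.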
International Nuclear Information System (INIS)
Ren, Qingguo; Dewan, Sheilesh Kumar; Li, Ming; Li, Jianying; Mao, Dingbiao; Wang, Zhenglei; Hua, Yanqing
2012-01-01
Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and the two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so this study did not include cerebral tumors except as a point of discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose–length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs and at 300 mAs, both with the FBP technique. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.
Energy Technology Data Exchange (ETDEWEB)
Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)
2012-10-15
Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and the two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so this study did not include cerebral tumors except as a point of discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose–length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs and at 300 mAs, both with the FBP technique. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.
International Nuclear Information System (INIS)
2001-01-01
For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
1999-01-01
For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor
2016-09-01
In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
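The three parameters mentioned above follow from simple arithmetic on motor ratings and efficiencies. A back-of-envelope sketch for a single motor is shown below; every number here is an assumption for illustration, not taken from the dataset:

```python
# Assumed figures for one motor comparison (illustrative only)
rating_kw = 7.5                  # shaft output rating
hours_per_year = 6000            # assumed annual duty
tariff = 0.12                    # assumed energy price per kWh
eff_std, eff_hi = 0.88, 0.93     # standard vs high-efficiency motor
price_premium = 250.0            # assumed extra purchase cost of efficient motor

# Electrical input energy needed to deliver the rated output at each efficiency
energy_std = rating_kw / eff_std * hours_per_year   # kWh per year
energy_hi = rating_kw / eff_hi * hours_per_year

annual_energy_saving = energy_std - energy_hi       # kWh per year
annual_cost_saving = annual_energy_saving * tariff
payback_years = price_premium / annual_cost_saving
```

Repeating this over many surveyed motors yields the samples from which the confidence bounds in [1] are constructed.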
International Nuclear Information System (INIS)
Park, Jinyong; Balasingham, P.; McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W.
2004-01-01
Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater
Multifactorial etiology of Torus mandibularis: study of twins.
Auškalnis, Adomas; Rutkūnas, Vygandas; Bernhardt, Olaf; Šidlauskas, Mantas; Šalomskienė, Loreta; Basevičienė, Nomeda
2015-01-01
The aim of this study is to investigate the multifactorial etiology of mandibular tori analyzing the influence of genetics, occlusal overload, various clinical variables and their interactions. Overall, plaster casts of 162 twins (81 twin pairs) were analyzed for the presence or absence of mandibular tori. Atypical wear facets on canine tips or incisors were recorded to diagnose bruxism. Angle Class, any kind of anterior open bite and positive, negative or flat curve of Wilson were recorded. Zygosity determination was carried out using a DNA test. Mandibular tori were found in 56.8% of the cases. In 93.6% of all monozygotic twin pairs both individuals had or did not have mandibular tori (κ=0.96±0.04; p<0.001), compared to 79.4% concordance of mandibular tori in dizygotic co-twins (κ=0.7±0.12; p<0.001). Prevalence of mandibular tori was significantly higher in the group of bruxers (67.5%) compared to non-bruxers (31.3%) (p<0.001). Significant association between mandibular tori and negative or flat curve of Wilson in the maxillary second premolars and first molars was found (OR=2.55, 95% CI (1.19-5.46), p=0.016). In all monozygotic bruxers, 97.1% showed concordance of mandibular tori presence in both co-twins compared to 78.9% dizygotic bruxers, and this difference is statistically significant (p=0.007). Our results suggest that the mandibular tori are of a multifactorial origin. Mandibular tori seem to have genetic predisposition, and may be associated with teeth grinding as well as with negative or flat CW in region of maxillary second premolar and first molar.
International Nuclear Information System (INIS)
2003-01-01
For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products.
International Nuclear Information System (INIS)
2004-01-01
For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees.
International Nuclear Information System (INIS)
2000-01-01
For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products.
International Nuclear Information System (INIS)
Budgor, A.B.; West, B.J.
1978-01-01
We employ the equivalence between Zwanzig's projection-operator formalism and perturbation theory to demonstrate that the approximate-solution technique of statistical linearization for nonlinear stochastic differential equations corresponds to the lowest-order β truncation in both the consolidated perturbation expansions and in the ''mass operator'' of a renormalized Green's function equation. Other consolidated equations can be obtained by selectively modifying this mass operator. We particularize the results of this paper to the Duffing anharmonic oscillator equation
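The statistical-linearization step for the Duffing oscillator can be stated compactly; this is the standard textbook form, given here for orientation rather than quoted from the paper. Starting from

```latex
\ddot{x} + \beta \dot{x} + \omega_0^2 x + \epsilon x^3 = f(t),
```

the cubic term is replaced by an equivalent linear stiffness $k_{\mathrm{eq}} x$ chosen to minimize the mean-square replacement error:

```latex
k_{\mathrm{eq}} \;=\; \arg\min_{k}\; \mathbb{E}\!\left[\left(\epsilon x^3 - k x\right)^2\right]
\;=\; \frac{\epsilon \,\langle x^4 \rangle}{\langle x^2 \rangle}
\;=\; 3\,\epsilon\,\langle x^2 \rangle \quad \text{for Gaussian } x .
```

It is this Gaussian closure that the paper identifies with the lowest-order truncation of the consolidated perturbation expansion.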
The statistical analysis techniques to support the NGNP fuel performance experiments
Energy Technology Data Exchange (ETDEWEB)
Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.
2013-10-15
This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
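The control-charting technique mentioned first can be illustrated with a Shewhart-style 3-sigma chart on a single thermocouple channel; all readings below are synthetic stand-ins, not AGR data:

```python
import statistics

# In-control baseline window used to estimate the control limits (°C)
baseline = [1000.2, 999.8, 1000.5, 999.6, 1000.1, 1000.3, 999.9, 1000.0]
# New readings; a drift mimicking sensor deterioration begins at index 2
stream = [1000.4, 999.7, 1003.9, 1004.6, 1005.2]

mu = statistics.fmean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma   # upper/lower control limits

# Indices of readings outside the control limits trigger a warning
alarms = [i for i, t in enumerate(stream) if not (lcl <= t <= ucl)]
```

The correlation and regression analyses then complement this chart by checking each channel against its neighbours and against calculated temperatures.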
Ratner, Bruce
2011-01-01
The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has
A comparison of linear and nonlinear statistical techniques in performance attribution.
Chan, N H; Genovese, C R
2001-01-01
Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on standard linear multifactor model and three nonlinear techniques-model selection, additive models, and neural networks-are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
Directory of Open Access Journals (Sweden)
VIMALA C.
2015-05-01
In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as Hidden Markov Models (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM) and Decision Trees (DT) are the most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the system developed, the highest word recognition accuracy was achieved with the HMM technique. It offered 100% accuracy during the training process and 97.92% during the testing process.
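The template-matching idea behind DTW fits in a short function: two sequences are aligned by a warping path that absorbs differences in speaking rate. The toy sequences below are invented 1-D features, not real speech frames:

```python
# Minimal dynamic time warping distance between two 1-D feature sequences
def dtw(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = cost of best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# A time-stretched copy of a pattern matches better than a different pattern
template = [0, 1, 2, 3, 2, 1, 0]
stretched = [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0]
other = [3, 3, 0, 0, 3, 3, 0]
```

Here `dtw(template, stretched)` is zero despite the length mismatch, while `dtw(template, other)` is large, which is exactly why DTW suits isolated-word matching.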
Walz, Michael; Leckebusch, Gregor C.
2016-04-01
Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events based on observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created. This base can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (based either on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared across different seasonal forecast products.
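The curve-summary step above can be sketched simply: each track is reduced to the coefficients of a fitted quadratic, and those coefficients are what get clustered. The track shapes below are invented, and the threshold grouping is a crude stand-in for the regression mixture model:

```python
import numpy as np

# Toy storm tracks: latitude as a function of longitude
lon = np.linspace(-40, 10, 20)
tracks = [
    50 + 0.30 * (lon + 40),   # north-eastward ("diagonal") track
    55 + 0.28 * (lon + 40),   # another diagonal track
    60 + 0.0 * lon,           # zonal ("horizontal") track
]

# Fit a quadratic to each whole track so the entire lifetime is summarised
coefs = np.array([np.polyfit(lon, lat, 2) for lat in tracks])

# Crude grouping on the linear coefficient (the track's mean slope)
labels = (coefs[:, 1] > 0.15).astype(int)
```

A proper mixture model would additionally yield soft cluster memberships, from which the per-cluster spread (the uncertainty of interest) follows.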
How Stuttering Develops: The Multifactorial Dynamic Pathways Theory
Smith, Anne; Weber, Christine
2017-01-01
Purpose: We advanced a multifactorial, dynamic account of the complex, nonlinear interactions of motor, linguistic, and emotional factors contributing to the development of stuttering. Our purpose here is to update our account as the multifactorial dynamic pathways theory. Method: We review evidence related to how stuttering develops, including…
Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.
2012-04-01
Climate and soil are two of the most important limiting factors for agricultural production. Climate change has now been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools to simulate the most relevant aspects of the climate expected for the 21st century in the frame of climate change. These models are able to reproduce the general features of the atmospheric dynamics, but their low resolution (about 200 km) prevents a proper simulation of lower-scale meteorological effects. Downscaling techniques allow this problem to be overcome by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international settings, leading to excellent results on future climate models. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, mainly grape, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to negative impacts produced by any modification of climate in the different regions, mainly those with protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
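The analogue idea at the heart of such downscaling can be shown in miniature: for a target day's large-scale predictor vector, find the most similar historical day and use its observed local value. All numbers and dates below are invented, and a real two-step method would refine the analogue pool rather than take a single nearest neighbour:

```python
import math

# Hypothetical history: date -> ((large-scale predictors), observed local temp)
history = {
    "1990-01-03": ((1012.0, 5.1), 2.3),
    "1991-02-10": ((998.5, 12.4), 8.9),
    "1995-12-22": ((1020.3, 2.0), -1.5),
}

def downscale(predictors):
    # Nearest historical day in predictor space supplies the local estimate
    def dist(rec):
        return math.dist(predictors, rec[1][0])
    day, (_, local_value) = min(history.items(), key=dist)
    return day, local_value

day, value = downscale((999.0, 12.0))
```

Fed with GCM output for a future day, the same lookup yields a locally calibrated projection.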
International Nuclear Information System (INIS)
Fukuda, Toshio; Mitsuoka, Toyokazu.
1985-01-01
The detection of leaks in piping systems is an important diagnostic technique for preventing accidents and planning maintenance, since a leak lowers productivity and causes environmental damage. The first step is to detect the occurrence of a leak without delay; the second is to locate it, since accident countermeasures become easier when the place of leak occurrence in the piping system can be estimated. Detection by pressure is usually used for large leaks, but because a pressure-based method is simple and advantageous, this study examined extending the pressure gradient method, combined with statistical analysis techniques, to the detection of smaller leaks in a pipeline in steady operation. Since the flow in a pipe varies irregularly during pumping, statistical means are required to detect a small leak by pressure. The index for detecting leaks proposed in this paper is the difference between the pressure gradients at the two ends of the pipeline. Experimental results on water and air in nylon tubes are reported. (Kako, I.)
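The proposed index, the difference between the pressure gradients at the two ends of the pipeline, lends itself to a simple statistical test: under no-leak steady pumping the difference fluctuates around zero, and a leak biases it away from zero. The sketch below uses synthetic gradient readings with Gaussian noise; the noise level and the 3-sigma threshold are illustrative assumptions, not the paper's values:

```python
import numpy as np

def leak_statistic(grad_inlet, grad_outlet):
    """t-like statistic for the leak index: mean of the gradient
    difference divided by its standard error.  Large |values| suggest
    a systematic gradient mismatch, i.e. a possible leak."""
    d = grad_inlet - grad_outlet
    return d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))

rng = np.random.default_rng(1)
n = 200
noise = lambda: rng.normal(scale=0.05, size=n)

# No leak: both ends see the same underlying gradient.
no_leak = leak_statistic(-1.0 + noise(), -1.0 + noise())
# Leak between the measurement points: inlet gradient steepens,
# outlet gradient flattens (synthetic illustration).
leak = leak_statistic(-1.2 + noise(), -0.8 + noise())

print(abs(no_leak) < 3, abs(leak) > 3)
```

Averaging over many samples is what makes the small-leak case detectable despite the irregular pumping fluctuations the abstract mentions.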
Directory of Open Access Journals (Sweden)
Land Walker H
2011-01-01
Background: When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels can address nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information-embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling, representing a parametric approach. The SL technique comprised a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results: The SL approach is capable of generating odds ratios for main effects and risk-factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions: The integration of SL methods in epidemiology may improve both the understanding and the interpretation of complex exposure/disease relationships.
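The epidemiologic interpretation the SL method was modified to reproduce is the odds ratio obtained by exponentiating a logistic regression coefficient. A minimal sketch of that baseline LR step, fitted by Newton-Raphson on simulated case-control data (the data and the true effect size are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
exposure = rng.binomial(1, 0.4, n).astype(float)
true_beta = np.log(2.0)                       # true odds ratio of 2
logit = -1.0 + true_beta * exposure
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit logistic regression by Newton-Raphson; the exponentiated slope
# is the odds ratio for the exposure.
X = np.column_stack([np.ones(n), exposure])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)                           # IRLS weights
    grad = X.T @ (y - p)
    hess = (X * W[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

odds_ratio = np.exp(beta[1])
print(round(odds_ratio, 2))
```

The kernel-based SL approach in the paper replaces the linear predictor with a kernel mapping, while this exponentiation step is what yields the comparable odds ratios.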
International Nuclear Information System (INIS)
Carvajal Escobar Yesid; Munoz, Flor Matilde
2007-01-01
This project centres on a review of the state of the art of the ocean-atmospheric phenomena that affect Colombian hydrology, especially the ENSO phenomenon, which causes a first-order socioeconomic impact in our country and has not been sufficiently studied; it is therefore important to address this topic by including the macroclimatic variables associated with ENSO in water-planning analyses. The analyses include a review of statistical techniques for testing the consistency of hydrological data, with the objective of building a reliable and homogeneous database of monthly flows of the Cauca River. Multivariate statistical methods, specifically principal component analysis, are used in the development of models for predicting monthly mean flows in the Cauca River, involving both linear approaches, such as the autoregressive models AR, ARX and ARMAX, and a nonlinear approach, artificial neural networks.
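An ARX model of the kind mentioned, monthly flow regressed on its own past value and an exogenous macroclimatic index, can be fitted by ordinary least squares. This sketch uses synthetic data standing in for the Cauca River series and an ENSO index (the coefficients and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 400
enso = rng.normal(size=T)                 # stand-in for an ENSO index
flow = np.zeros(T)
for t in range(1, T):
    # True generating process: ARX(1) with a negative ENSO effect.
    flow[t] = 0.7 * flow[t - 1] - 0.5 * enso[t] + rng.normal(scale=0.3)

# ARX(1) fit: flow[t] = a*flow[t-1] + b*enso[t] + e, by least squares.
A = np.column_stack([flow[:-1], enso[1:]])
coef, *_ = np.linalg.lstsq(A, flow[1:], rcond=None)
a_hat, b_hat = coef
print(round(a_hat, 2), round(b_hat, 2))
```

An ARMAX fit additionally models the noise term with moving-average lags, which plain least squares does not capture; dedicated time-series routines are normally used for that case.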
Statistics and error considerations in the application of the SSNTD technique to radon measurement
International Nuclear Information System (INIS)
Jonsson, G.
1993-01-01
Plastic films are used for the detection of alpha particles from disintegrating radon and radon daughter nuclei. After etching, tracks (cones) or holes appear in the film as a result of the exposure. The step from a counted number of tracks or holes per unit surface of the film to a reliable value of the radon and radon daughter level involves statistical considerations of different kinds, among them the number of counted tracks, the length and the season of the exposure, the etching technique, and the method of counting the tracks or holes. The background tracks of an unexposed film increase the error of the measured radon level. Some of these statistical effects are discussed in the report. (Author)
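The counting-statistics part of the error budget follows directly from the Poisson character of track counts: the variance of a count equals the count, and subtracting the background of an unexposed film adds the errors in quadrature. A short illustration (the counts are invented):

```python
import numpy as np

def track_count_error(n_exposed, n_background):
    """Net track count and its standard error under Poisson counting
    statistics: var(N) = N, and the background subtraction adds
    uncertainties in quadrature."""
    net = n_exposed - n_background
    err = np.sqrt(n_exposed + n_background)
    return net, err

net, err = track_count_error(400, 25)
print(net, round(err, 1), f"{100 * err / net:.1f}%")  # → 375 20.6 5.5%
```

The relative error shrinks as 1/sqrt(N), which is why longer exposures (more counted tracks) tighten the radon estimate, while a high background count inflates it.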
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
International Nuclear Information System (INIS)
Pham, Bihn T.; Einerson, Jeffrey J.
2010-01-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three statistical analysis techniques, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objectives of this work include (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that regression models relating calculated fuel temperatures to thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
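Of the three techniques, control charting is the most direct warning mechanism for thermocouple failure: a reading outside limits derived from in-control data is flagged. A minimal Shewhart-style sketch on synthetic temperature readings (the actual SAS-based NDMAS implementation is not reproduced here):

```python
import numpy as np

def control_limits(baseline):
    """3-sigma control limits estimated from in-control baseline data."""
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    return mu - 3 * sigma, mu + 3 * sigma

rng = np.random.default_rng(4)
# Synthetic in-control thermocouple readings (deg C) around 900.
baseline = 900 + rng.normal(scale=2.0, size=100)
lo, hi = control_limits(baseline)

# Third value simulates a drifting/failing thermocouple.
new_readings = np.array([901.0, 899.5, 935.0, 902.1])
flags = (new_readings < lo) | (new_readings > hi)
print(flags.tolist())
```

Correlation and regression analysis complement this by cross-checking each thermocouple against calculated temperatures and neighbouring sensors, so a failure is flagged even when the faulty reading stays inside its own historical limits.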
International Nuclear Information System (INIS)
Aminu Ibrahim; Hafizan Juahir; Mohd Ekhwan Toriman; Mustapha, A.; Azman Azid; Isiyaka, H.A.
2015-01-01
Multivariate statistical techniques, including cluster analysis, discriminant analysis, and principal component analysis/factor analysis, were applied to investigate the spatial variation and pollution sources in the Terengganu river basin during 5 years of monitoring of 13 water quality parameters at thirteen different stations. Cluster analysis (CA) classified the 13 stations into 2 clusters, low polluted (LP) and moderately polluted (MP), based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH3-N, PO4 and EC) and a correct assignation of 95.80%. PCA/FA applied to the data sets yielded five latent factors accounting for 72.42% of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic waste, industry, runoff and agriculture (anthropogenic activities). Multivariate techniques are therefore important in environmental management. (author)
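The PCA/FA step, extracting latent factors and their share of the total variance from the correlation matrix of standardized water quality data, can be sketched as follows, with synthetic data standing in for the Terengganu measurements:

```python
import numpy as np

def pca_explained_variance(data):
    """PCA via eigendecomposition of the correlation matrix; returns the
    fraction of total variance carried by each component, largest first."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
    return eigvals / eigvals.sum()

rng = np.random.default_rng(5)
# Five synthetic water-quality parameters sharing one pollution signal.
common = rng.normal(size=(200, 1))
data = np.hstack([common + 0.3 * rng.normal(size=(200, 1)) for _ in range(5)])

ratios = pca_explained_variance(data)
print(round(ratios[0], 2))
```

Because all five synthetic parameters load on one underlying signal, the first component dominates; in the basin data five factors were needed to reach 72.42% of the variance, reflecting several distinct pollution sources.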
GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)
Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza
2017-12-01
Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potency of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analyzing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by the poor zone of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by the good and very good potential zones, respectively. Validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, indicating that the SI method performs slightly better than the DST technique. The SI and DST methods are therefore advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and the groundwater conditioning factors, permitting investigation of both systemic and stochastic uncertainty. These techniques are very useful for groundwater potential analysis and can be practical for water-resource management experts.
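The statistical index (SI) is a bivariate measure computed per class of each conditioning factor as the log ratio of the class's spring density to the overall density; positive values mark classes where springs are over-represented. A small sketch with hypothetical class counts (the slope classes and cell counts are invented, only the 496-spring total follows the abstract):

```python
import numpy as np

def statistical_index(springs_per_class, cells_per_class):
    """SI_i = ln(class spring density / overall spring density) for each
    class of a conditioning factor."""
    springs = np.asarray(springs_per_class, dtype=float)
    cells = np.asarray(cells_per_class, dtype=float)
    class_density = springs / cells
    overall_density = springs.sum() / cells.sum()
    return np.log(class_density / overall_density)

# Hypothetical slope classes: gentle slopes hold most of the 496 springs.
si = statistical_index([300, 150, 46], [4000, 5000, 6000])
print(np.round(si, 2).tolist())  # → [0.82, -0.1, -1.46]
```

Summing the SI values of the classes a map cell falls into, over all 11 conditioning factors, gives the groundwater potential score from which the GPM is classified.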
MacLean, Adam L.
2015-12-16
The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
Multifactorial modelling of high-temperature treatment of timber in the saturated water steam medium
Prosvirnikov, D. B.; Safin, R. G.; Ziatdinova, D. F.; Timerbaev, N. F.; Lashkov, V. A.
2016-04-01
The paper analyses experimental data obtained in studies of the high-temperature treatment of softwood and hardwood in a saturated water steam environment. The data were processed in the CurveExpert software for the purpose of statistically modelling the processes and phenomena occurring during this treatment. The multifactorial modelling resulted in empirical dependences that allow the main parameters of this type of hydrothermal treatment to be determined with high accuracy.
Navard, Sharon E.
1989-01-01
In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
Directory of Open Access Journals (Sweden)
Jianning Wu
2015-01-01
The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method, designed around an advanced statistical learning algorithm (a support vector machine for binary classification), is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in the gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that the traditional symmetry index method for gait misses. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis.
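The core classification step, a support vector machine separating left-right gait feature distributions, can be sketched with a linear SVM trained by Pegasos-style sub-gradient descent on synthetic gait features (the paper's kernel choice and real force-platform data are not reproduced here; the feature names are invented for illustration):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Minimal linear SVM: Pegasos-style stochastic sub-gradient descent
    on the regularised hinge loss.  Labels must be +1 / -1."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1 / (lam * t)
            if y[i] * (X[i] @ w) < 1:       # margin violated
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                w = (1 - eta * lam) * w
    return w

rng = np.random.default_rng(6)
# Toy gait features: (left-right step-time difference,
# left-right peak-force difference); class +1 = asymmetric gait.
sym = rng.normal(0.0, 0.05, size=(100, 2))
asym = rng.normal(0.4, 0.05, size=(100, 2))
X = np.column_stack([np.vstack([sym, asym]), np.ones(200)])  # bias feature
y = np.array([-1] * 100 + [1] * 100)

w = train_linear_svm(X, y)
preds = np.sign(X @ w)
print((preds == y).mean())
```

A practical analysis would use a kernelised SVM with cross-validation rather than this linear toy, but the margin-based decision rule is the same.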
McLoughlin, M. Padraig M. M.
2008-01-01
The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved, and hence this paper proposes a programme of use of a modified Moore method in a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…
International Nuclear Information System (INIS)
Scheffel, Hans; Stolzmann, Paul; Schlett, Christopher L.; Engel, Leif-Christopher; Major, Gyöngi Petra; Károlyi, Mihály; Do, Synho; Maurovich-Horvat, Pál; Hoffmann, Udo
2012-01-01
Objectives: To compare the image quality of coronary artery plaque visualization at CT angiography with images reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. Methods: The coronary arteries of three ex vivo human hearts were imaged by CT and reconstructed with FBP, ASIR and MBIR. Coronary cross-sectional images were co-registered between the different reconstruction techniques and assessed for qualitative and quantitative image quality parameters. Readers were blinded to the reconstruction algorithm. Results: A total of 375 triplets of coronary cross-sectional images were co-registered. Using MBIR, 26% of the images were rated as having excellent overall image quality, significantly better than with ASIR and FBP (4% and 13%, respectively, all p < 0.001). Qualitative assessment of image noise demonstrated a noise reduction with ASIR compared to FBP (p < 0.01) and a further noise reduction with MBIR (p < 0.001). The contrast-to-noise ratio (CNR) using MBIR was better than with ASIR and FBP (44 ± 19, 29 ± 15, 26 ± 9, respectively; all p < 0.001). Conclusions: MBIR improved image quality, reduced image noise and increased CNR compared with the other available reconstruction techniques. This may further improve the visualization of coronary artery plaque and allow radiation reduction.
Energy Technology Data Exchange (ETDEWEB)
Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)
2015-01-15
Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 water chemistry parameters were sampled over 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
International Nuclear Information System (INIS)
Beale, E.M.L.
1983-05-01
The Department of the Environment has embarked on a programme to develop computer models to help with assessment of sites suitable for the disposal of nuclear wastes. The first priority is to produce a system, based on the System Variability Analysis Code (SYVAC) obtained from Atomic Energy of Canada Ltd., suitable for assessing radioactive waste disposal in land repositories containing non heat producing wastes from typical UK sources. The requirements of the SYVAC system development were so diverse that each portion of the development was contracted to a different company. Scicon are responsible for software coordination, system integration and user interface. Their present report contains comments on 'Statistical techniques for the development and application of SYVAC'. (U.K.)
International Nuclear Information System (INIS)
Wilson, D.W.; Gaskell, S.J.; Fahmy, D.R.; Joyce, B.G.; Groom, G.V.; Griffiths, K.; Kemp, K.W.; Nix, A.B.J.; Rowlands, R.J.
1979-01-01
Adopting the rationale that the improvement of intra-laboratory performance of immunometric assays will enable the assessment of national QC schemes to become more meaningful, the group of participating laboratories has developed statistical and analytical techniques for the improvement of accuracy, precision and monitoring of error for the determination of steroid hormones. These developments are now described and their relevance to NQC schemes discussed. Attention has been focussed on some of the factors necessary for improving standards of quality in immunometric assays and their relevance to laboratories participating in NQC schemes as described. These have included the 'accuracy', precision and robustness of assay procedures as well as improved methods for internal quality control. (Auth.)
International Nuclear Information System (INIS)
Garcia, Francisco; Palacio, Carlos; Garcia, Uriel
2012-01-01
Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality in the Santa Marta coastal area, where a submarine outfall discharges 1 m3/s of domestic wastewater. Two-way analysis of variance (ANOVA), cluster and principal component analysis, and Kriging interpolation were considered for this report. The temporal variation showed two heterogeneous periods: from December to April, and July, when the concentrations of the water quality parameters were higher, and the rest of the year (May, June, August-November), when they were significantly lower. The spatial analysis identified two areas with different water quality; the difference is related to the proximity to the submarine outfall discharge.
Directory of Open Access Journals (Sweden)
Voza Danijela
2015-12-01
The aim of this article is to evaluate the quality of the Danube River in its course through Serbia and to demonstrate the possibilities of using three statistical methods, Principal Component Analysis (PCA), Factor Analysis (FA) and Cluster Analysis (CA), in surface water quality management. Given that the Danube is an important trans-boundary river, thorough water quality monitoring by sampling at different distances over shorter and longer periods of time is not only an ecological but also a political issue. Monitoring was carried out at monthly intervals from January to December 2011 at 17 sampling sites. The obtained data set was treated by multivariate techniques in order, firstly, to identify the similarities and differences between sampling periods and locations; secondly, to recognize the variables that affect temporal and spatial water quality changes; and thirdly, to present the anthropogenic impact on water quality parameters.
Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom
2015-01-01
It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, which is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques (tabular CUSUM, standardized CUSUM and EWMA), known for their ability to detect small shifts in data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
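The EWMA chart that performed best smooths each new transfer time into an exponentially weighted average and flags when that average drifts outside control limits. A minimal sketch on a noise-free toy series, to keep the arithmetic transparent (the smoothing weight lam and width L are conventional textbook defaults, not the paper's settings):

```python
import numpy as np

def ewma_detect(series, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA control chart: z_t = lam*x_t + (1-lam)*z_{t-1}, flagged when
    z_t leaves mu0 +/- L*sigma0*sqrt(lam/(2-lam)) (asymptotic limits).
    Returns the index of the first out-of-control sample, or -1."""
    z = mu0
    limit = L * sigma0 * np.sqrt(lam / (2 - lam))
    for t, x in enumerate(series):
        z = lam * x + (1 - lam) * z
        if abs(z - mu0) > limit:
            return t
    return -1

# Daily sit-to-stand transfer times (s): stable for 30 days, then a
# sustained 1 s slowdown that single-point rules would take longer to catch.
times = np.array([10.0] * 30 + [11.0] * 30)
print(ewma_detect(times, mu0=10.0, sigma0=0.5))  # → 33
```

The shift begins at day 30 and the smoothed statistic crosses the limit on day 33: the memory of the EWMA accumulates evidence for a small shift across several days instead of requiring a single extreme value.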
International Nuclear Information System (INIS)
Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois
2008-01-01
Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; and second, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation by nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be accurately determined from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.
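Statistical nanoindentation techniques typically deconvolute the histogram of measured indentation moduli into mechanically distinct phases; one common formulation fits a Gaussian mixture by expectation-maximisation. This sketch uses invented moduli for a two-phase mixture, not the paper's UHPC data:

```python
import numpy as np

def em_two_phase(x, iters=200):
    """Deconvolute a 1-D modulus histogram into two Gaussian phases by
    expectation-maximisation; returns (means, stds, weights)."""
    mu = np.array([x.min(), x.max()], dtype=float)
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each phase for each indent.
        pdf = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) \
            / (sd * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate phase means, spreads, and volume fractions.
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n)
        w = n / len(x)
    return mu, sd, w

rng = np.random.default_rng(8)
# Toy indentation moduli (GPa): 60% of indents on a phase around 20,
# 40% on a stiffer phase around 30 (hypothetical values).
x = np.concatenate([rng.normal(20, 2, 300), rng.normal(30, 2, 200)])
mu, sd, w = em_two_phase(x)
print(np.round(np.sort(mu), 1).tolist())
```

The recovered mixture weights estimate the surface (volume) fractions of the phases, which is the quantity the micromechanical upscaling step consumes.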
J(Ic)-testing of A-533 B - statistical evaluation of some different testing techniques
International Nuclear Information System (INIS)
Nilsson, F.
1978-01-01
The purpose of the present study was to statistically compare some different methods for evaluating the fracture toughness of the nuclear reactor material A-533 B. Since linear elastic fracture mechanics is not applicable to this material at the temperature of interest (275 °C), the so-called J(Ic) testing method was employed. Two main difficulties are inherent in this type of testing. The first is determining the quantity J as a function of the deflection of the three-point bend specimens used. Three different techniques were applied: the first two based on the experimentally observed input of energy to the specimen, and the third employing finite element calculations. The second main problem is determining the point at which crack growth begins. For this, two methods were used: a direct electrical method and the indirect R-curve method. A total of forty specimens were tested at two laboratories. No statistically significant differences were found between the results of the respective laboratories. The three methods of calculating J yielded somewhat different results, although the discrepancy was small. The two methods of determining the growth initiation point also yielded consistent results, though the R-curve method exhibited a larger uncertainty as measured by the standard deviation. The resulting J(Ic) value also agreed well with earlier presented results. The relative standard deviation was of the order of 25%, which is quite small for this type of experiment. (author)
Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi
2016-01-01
Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters introduced by ASIR, as compared with filtered back projection (FBP), may influence the quantification of coronary calcium. To investigate the influence of ASIR on calcium quantification in comparison with FBP, CT images of 352 patients were reconstructed from the same raw data using FBP alone and FBP combined with ASIR 30%, 50%, 70%, and ASIR 100%. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared with FBP; ASIR reduced the Agatston score by 10.5% to 31.0%. In the calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison with FBP, adaptive statistical iterative reconstruction (ASIR) may therefore significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N
2017-09-01
In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The fault detection algorithm developed there provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) over different window sizes; however, most real systems are nonlinear, and the linear PCA method cannot handle nonlinearity to a great extent. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model, which is among the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), as this may further improve fault detection performance by reducing the FAR through an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory, enabling better decision making with respect to fault detection. Therefore, in this paper, a KPCA-based EWMA-GLRT method is developed and applied in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance the monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
International Nuclear Information System (INIS)
Zhang Qifeng; Peng Yun; Duan Xiaomin; Sun Jihang; Yu Tong; Han Zhonglong
2013-01-01
Objective: To investigate the feasibility of reducing radiation doses in pediatric multidetector abdominal CT using the adaptive statistical iterative reconstruction technique (ASIR) combined with the automated tube current modulation technique (ATCM). Methods: Thirty patients underwent abdominal CT with ATCM and a follow-up scan with ATCM combined with 40% ASIR. ATCM was used with age-dependent noise index (NI) settings: NI = 9 for 0-5 years old and NI = 11 for > 5 years old in the simple ATCM group; NI = 11 for 0-5 years old and NI = 15 for > 5 years old in the ATCM combined with 40% ASIR group (ASIR group). Two radiologists independently evaluated the images for diagnostic quality and image noise, assigning subjective image quality and image noise scores on a 5-point scale. Interobserver agreement was assessed by the Kappa test. The volume CT dose indexes (CTDIvol) for the two groups were recorded. Statistical significance of the CTDIvol values was analyzed by the paired-sample t test. Results: The average CTDIvol for the ASIR group was (1.38 ± 0.64) mGy, about 60% lower than the (3.56 ± 1.23) mGy for the simple ATCM group, and the CTDIvol of the two groups differed significantly (t = 33.483, P < 0.05). The subjective image quality scores for the simple ATCM group were 4.43 ± 0.57 and 4.37 ± 0.61, Kappa = 0.878, P < 0.01 (ASIR group: 4.70 ± 0.47 and 4.60 ± 0.50, Kappa = 0.783, P < 0.01), by the two observers. The image noise scores for the simple ATCM group were 4.03 ± 0.56 and 3.83 ± 0.53, Kappa = 0.572, P < 0.01 (ASIR group: 4.20 ± 0.48 and 4.10 ± 0.48, Kappa = 0.748, P < 0.01), by the two observers. All images had acceptable diagnostic image quality. Conclusion: A lower radiation dose can be achieved by raising the NI with ASIR in pediatric abdominal CT studies, while maintaining diagnostically acceptable images. (authors)
Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell
2017-01-01
In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of the related literature revealed that there are very few studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies and gives suggestions on monitoring and surveillance methods for designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.
Directory of Open Access Journals (Sweden)
Khalifa M. Al-Kindi
2017-08-01
Full Text Available In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse its current biogeographical patterns and predict its future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of the related literature revealed that there are very few studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies and gives suggestions on monitoring and surveillance methods for designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.
Díaz, Zuleyka; Segovia, María Jesús; Fernández, José
2005-01-01
Prediction of insurance company insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques that use financial ratios as explanatory variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of those methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...
Karadag, Engin
2010-01-01
To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…
Energy Technology Data Exchange (ETDEWEB)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shortening the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new, complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award winner, identifies behavioral equivalence classes in MPI jobs and highlights cases in which elements of a class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as the analysis of memory usage in the Memgrind tool. In the first two
Fenta Mekonnen, Dagnenet; Disse, Markus
2018-04-01
Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and their coarse spatial resolutions make it difficult to use GCM outputs directly, especially for sustainable water management at the regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) for the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation results illustrate that both downscaling techniques (LARS-WG and SDSM) have comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done with equally weighted and varying weights of statistical indexes for precipitation only. The evaluation showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and the distribution of daily precipitation over the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios with the LARS-WG model. The ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 %, while the change in mean annual Tmax may increase from 0.4 to 4.3
Directory of Open Access Journals (Sweden)
D. Fenta Mekonnen
2018-04-01
Full Text Available Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and their coarse spatial resolutions make it difficult to use GCM outputs directly, especially for sustainable water management at the regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) for the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation results illustrate that both downscaling techniques (LARS-WG and SDSM) have comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done with equally weighted and varying weights of statistical indexes for precipitation only. The evaluation showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and the distribution of daily precipitation over the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios with the LARS-WG model. The ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 %, while the change in mean annual Tmax may increase
A Review of Statistical Techniques for 2x2 and RxC Categorical Data Tables In SPSS
Directory of Open Access Journals (Sweden)
Cengiz BAL
2009-11-01
Full Text Available In this study, a review of statistical techniques for RxC categorical data tables is explained in detail. Emphasis is given to the association between techniques and their corresponding data considerations. Suggestions on how to handle specific categorical data tables in SPSS, and common mistakes in the interpretation of SPSS outputs, are presented.
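The workhorse analysis for an RxC table is the chi-square test of independence. The review works in SPSS; the sketch below shows the equivalent computation in Python with a made-up 2x3 table, including the common data consideration that all expected cell counts should be at least 5 for the approximation to be trusted.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x3 (RxC) contingency table: rows are two groups,
# columns are three response categories.
table = np.array([[30, 15, 5],
                  [10, 25, 15]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")

# Rule of thumb: the chi-square approximation is usually trusted only
# when every expected cell count is at least 5; otherwise an exact test
# (e.g. Fisher's) is preferred.
all_cells_ok = bool((expected >= 5).all())
```

For this table the degrees of freedom are (2-1)(3-1) = 2, and `expected` holds the counts implied by the marginal totals under independence.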
International Nuclear Information System (INIS)
Tatsukawa, Yoshimi; Yamada, Michiko; Ohishi, Waka; Hida, Ayumi; Akahoshi, Masazumi; Fujiwara, Saeko; Cologne, John B; Hsu, Wan-Ling; Furukawa, Kyoji; Takahashi, Norio; Nakamura, Nori; Suyama, Akihiko; Ozasa, Kotaro; Shore, Roy
2013-01-01
There is no convincing evidence regarding radiation-induced heritable risks of adult-onset multifactorial diseases in humans, although it is important from the standpoint of protection and management of populations exposed to radiation. The objective of the present study was to examine whether parental exposure to atomic-bomb (A-bomb) radiation led to an increased risk of common polygenic, multifactorial diseases—hypertension, hypercholesterolaemia, diabetes mellitus, angina pectoris, myocardial infarction or stroke—in the first-generation (F1) offspring of A-bomb survivors. A total of 11 951 F1 offspring of survivors in Hiroshima or Nagasaki, conceived after the bombing, underwent health examinations to assess disease prevalence. We found no evidence that paternal or maternal A-bomb radiation dose, or the sum of their doses, was associated with an increased risk of any multifactorial diseases in either male or female offspring. None of the 18 radiation dose–response slopes, adjusted for other risk factors for the diseases, was statistically significantly elevated. However, the study population is still in mid-life (mean age 48.6 years), and will express much of its multifactorial disease incidence in the future, so ongoing longitudinal follow-up will provide increasingly informative risk estimates regarding hereditary genetic effects for incidence of adult-onset multifactorial disease. (paper)
Statistical techniques for modeling extreme price dynamics in the energy market
International Nuclear Information System (INIS)
Mbugua, L N; Mwita, P N
2013-01-01
Extreme events have a large impact across engineering, science and economics, because they often lead to failures and losses owing to the unobservable nature of extraordinary occurrences. In this context, this paper focuses on appropriate statistical methods that combine a quantile regression approach with extreme value theory to model the excesses; this plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when little is known about the functional form of the object being estimated. Conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the lack of adequate observations at the tail of the distribution function. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method used can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modeling.
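The threshold model referred to above is the peaks-over-threshold approach: excesses over a high threshold are fitted with a generalized Pareto distribution (GPD). The sketch below uses synthetic heavy-tailed data and an assumed 90th-percentile threshold; neither is taken from the paper.

```python
import numpy as np
from scipy.stats import genpareto

# Simulate heavy-tailed data and fit a generalized Pareto distribution
# (GPD) to the excesses over a high threshold -- the peaks-over-threshold
# model. The 90th-percentile threshold is an illustrative assumption.
rng = np.random.default_rng(42)
data = rng.pareto(3.0, 5000)           # synthetic heavy-tailed sample

u = np.quantile(data, 0.90)            # threshold choice
excesses = data[data > u] - u          # exceedances over the threshold

# Maximum-likelihood GPD fit with the location fixed at zero.
shape, loc, scale = genpareto.fit(excesses, floc=0)
print(f"threshold={u:.3f}, shape={shape:.3f}, scale={scale:.3f}")
```

A positive fitted shape parameter signals a heavy tail (here the Pareto parent implies a true shape near 1/3); in practice the threshold itself is a tuning choice, often checked with mean-excess plots.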
Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.
2017-09-01
Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation, with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly due to its ability to address the tensorial nature of the coefficients. MG places a low computational demand, in the manner in which we have implemented it, and accurate flow-rate estimates are obtained when using MG-upscaled permeabilities that approach or are beyond the percolation threshold.
Directory of Open Access Journals (Sweden)
Vujović Svetlana R.
2013-01-01
Full Text Available This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and for the identification of pollution sources/factors, with a view to obtaining better information about water quality and the design of monitoring networks for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and the interpretation of a water quality data set of natural water bodies obtained during the 2010 monitoring year, covering 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis is applied to the physico-chemical parameters of natural water bodies with the aim of classification and data summarization, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong (absolute loading >0.75) and moderate (0.50-0.75). Four principal factors were obtained with eigenvalues >1, together explaining more than 78% of the total variance in the water data sets, which is adequate to give good prior information regarding the data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor, F1, accounts for 28% of the total variance and represents the hydrochemical dimension of water quality. The second factor, F2, accounts for 18% of the total variance and may be taken as a factor of water eutrophication. The third factor, F3, accounts for 17% of the total variance and represents the influence of point sources of pollution on water quality. The fourth factor, F4, accounts for 13% of the total variance and may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
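The eigenvalue >1 retention rule used above (the Kaiser criterion) is easy to demonstrate. The matrix below is synthetic, generated to mirror the study's 33 sites x 13 parameters shape; it is not the study's data.

```python
import numpy as np

# Synthetic stand-in for the study's data: 33 sites x 13 water-quality
# parameters, with one shared latent factor added so that at least one
# strong component exists.
rng = np.random.default_rng(7)
X = rng.normal(size=(33, 13))
X[:, :4] += rng.normal(size=(33, 1))   # common factor on 4 parameters

Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize
eigvals = np.linalg.eigvalsh(np.cov(Xs, rowvar=False))[::-1]

# Kaiser criterion: keep components with eigenvalue > 1, then report
# the share of total variance they explain.
n_retain = int((eigvals > 1).sum())
explained = eigvals[:n_retain].sum() / eigvals.sum()
print(n_retain, round(100 * explained, 1))
```

Because the data are standardized, the eigenvalues sum to the number of parameters, so "eigenvalue > 1" means a component explains more variance than a single original variable would.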
La productividad multifactorial: concepto, medición y significado..
Enrique Hernández Laos
2007-01-01
The empirical literature on economic growth often refers to the so-called "sources of growth." This entails the measurement and conceptualization of so-called multifactor productivity, which is not a simple task. This article presents some of the main problems in measuring multifactor productivity and advances the discussion of its conceptualization for the purposes of empirical analysis of countries' growth. Finally, ...
International Nuclear Information System (INIS)
Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.
1983-01-01
A summary of a field soil sampling study conducted around the Rocky Flats, Colorado plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques, and how the techniques compared with the proposed EPA technique of sampling to a 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less-than-10-micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction.
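The reproducibility comparison above rests on the coefficient of variation (sample SD relative to the mean, in percent). A minimal sketch, with made-up replicate concentrations rather than the study's measurements:

```python
import numpy as np

def coeff_variation(x):
    """Coefficient of variation in percent: sample SD over the mean."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Hypothetical replicate Pu concentrations (arbitrary units) measured at
# one site with two different sampling techniques -- illustrative only.
technique_a = [1.10, 0.85, 1.30, 0.95, 1.05]
technique_b = [0.40, 0.70, 0.55, 0.80, 0.35]

cv_a = coeff_variation(technique_a)
cv_b = coeff_variation(technique_b)
print(f"CV(a) = {cv_a:.1f}%, CV(b) = {cv_b:.1f}%")
```

The CV is unit-free, which is what makes it suitable for comparing scatter across techniques whose average recovered concentrations differ.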
International Nuclear Information System (INIS)
Abbas Alkarkhi, F.M.; Ismail, Norli; Easa, Azhar Mat
2008-01-01
Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied to analyze the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result for identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) while affording more than 72% correct assignments. Results indicated that the two rivers differed in terms of As and heavy metal contents in cockles, and that the major difference was due to the contributions of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for the large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
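The discriminant-analysis step, classifying samples by river from Zn and Cd alone, can be sketched with scikit-learn. The concentrations below are invented to illustrate the method and do not reproduce the study's measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical Zn and Cd concentrations in cockles from two rivers,
# 20 sampling points each -- made-up values standing in for the
# study's data.
rng = np.random.default_rng(3)
juru = rng.normal([60.0, 1.2], [8.0, 0.3], size=(20, 2))
jejawi = rng.normal([45.0, 0.6], [8.0, 0.3], size=(20, 2))

X = np.vstack([juru, jejawi])
y = np.array([0] * 20 + [1] * 20)          # river labels

lda = LinearDiscriminantAnalysis().fit(X, y)
accuracy = lda.score(X, y)                 # fraction correctly assigned
print(round(accuracy, 2))
```

`lda.coef_` then plays the role of the discriminant function: its signs and magnitudes indicate how each metal contributes to separating the two rivers.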
International Nuclear Information System (INIS)
Carneiro, Alvaro Luiz Guimaraes; Santos, Francisco Carlos Barbosa dos
2007-01-01
Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at the local, regional and global levels: combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)
Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.
2017-06-01
A multivariate statistical technique and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes that are responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using the Varimax with Kaiser normalization method of extraction, for both the rotated space and the component matrix, was applied to the data. Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin, together accounting for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics; Component 2 delineates the incongruent dissolution of silicates/aluminosilicates; and Component 3 delineates the prevalence of pollution, principally from agricultural inputs as well as trace metal mobilization, in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relations among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal
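The varimax rotation applied to the PCA loadings can be implemented directly. The sketch below is plain varimax via the standard SVD-based iteration, without the Kaiser row normalization the study's SPSS setting adds, and the toy loading matrix is invented for illustration.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Varimax rotation of a loading matrix via the standard SVD-based
    iteration (plain varimax, without Kaiser row normalization)."""
    p, k = loadings.shape
    R = np.eye(k)
    crit_old = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # Gradient of the varimax criterion with respect to the rotation
        grad = loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt                     # nearest orthogonal rotation
        crit_new = s.sum()
        if crit_new < crit_old * (1.0 + tol):
            break
        crit_old = crit_new
    return loadings @ R

# Toy loading matrix: 4 variables on 2 components (illustrative values).
A = np.array([[0.8, 0.3],
              [0.7, 0.4],
              [0.2, 0.9],
              [0.3, 0.8]])
Ar = varimax(A)
```

Because the rotation matrix is orthogonal, each variable's communality (row sum of squared loadings) is unchanged; the rotation only redistributes loadings so each variable loads strongly on fewer components, which is what makes the components interpretable as "dimensions" of water chemistry.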
Directory of Open Access Journals (Sweden)
M.A. Delavar
2016-02-01
Full Text Available Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food safety issues, potential health risks, and detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them to human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide essential information for better management of polluted soils to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of probable sources that can control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sample locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via atomic absorption spectroscopy (AAS). Data were statistically analyzed using the SPSS software version 17.0 for Windows. Correlation matrix (CM), principal component analysis (PCA) and factor analysis (FA) techniques were performed in order to identify the probable sources of the HMs in the studied soils. Results and
Kwon, Heejin; Cho, Jinhan; Oh, Jongyeong; Kim, Dongwon; Cho, Junghyun; Kim, Sanghyun; Lee, Sangyun; Lee, Jihyun
2015-10-01
To investigate whether reduced-radiation-dose abdominal CT images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) compromise the depiction of clinically relevant features when compared with the currently used routine-radiation-dose CT images reconstructed with ASIR. 27 consecutive patients (mean body mass index: 23.55 kg m(-2)) underwent CT of the abdomen at two time points. At the first time point, abdominal CT was scanned at a noise index level of 21.45 with automatic current modulation at 120 kV. Images were reconstructed with 40% ASIR, the routine protocol of Dong-A University Hospital. At the second time point, follow-up scans were performed at a noise index level of 30. Images were reconstructed with filtered back projection (FBP), 40% ASIR, 30% ASIR-V, 50% ASIR-V and 70% ASIR-V for the reduced radiation dose. Both quantitative and qualitative analyses of image quality were conducted. The CT dose index was also recorded. At the follow-up study, the mean dose reduction relative to the currently used common radiation dose was 35.37% (range: 19-49%). The overall subjective image quality and diagnostic acceptability scores of 50% ASIR-V at the reduced radiation dose were nearly identical to those recorded when using the initial routine-dose CT with 40% ASIR. Subjective ratings of the qualitative analysis revealed that, of all the reduced-radiation-dose CT series reconstructed, 30% ASIR-V and 50% ASIR-V were associated with higher image quality, with lower noise and artefacts as well as good sharpness, when compared with 40% ASIR and FBP. However, the sharpness score at 70% ASIR-V was considered worse than that at 40% ASIR. Objective image noise for 50% ASIR-V was 34.24% and 46.34% lower than for 40% ASIR and FBP, respectively. Abdominal CT images reconstructed with ASIR-V facilitate radiation dose reductions of up to 35% when compared with ASIR. This study represents the first clinical research experiment to use ASIR-V, the newest version of
Energy Technology Data Exchange (ETDEWEB)
Campa, M F; Romero, M R
1969-01-01
A statistical zonation technique developed by J.D. Testerman is presented, with reference to its application to the San Andres reservoir in the Poza Rica area of Veracruz, Mexico. The method is based on a statistical technique which permits grouping of similar values of a certain parameter, e.g., porosity, for individual wells within a field. The resulting groups or zones are used in a correlation analysis to deduce whether there is continuity of porosity in any direction. In the San Andres reservoir, there is continuity of the porous media in the NE-SW direction. This is an important fact for the waterflooding project being carried out.
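The core of such zonation is splitting a well log into internally homogeneous zones. The sketch below shows only a single split chosen by maximizing the between-zone sum of squares; it is a one-step illustration of the idea, not Testerman's full recursive algorithm, and the porosity values are invented.

```python
import numpy as np

def best_split(porosity):
    """Index that best splits a porosity log into two zones by maximizing
    the between-zone sum of squares -- a single-step sketch of the
    zonation idea, not Testerman's full recursive algorithm."""
    x = np.asarray(porosity, dtype=float)
    n, mean = len(x), x.mean()
    best_i, best_ss = 1, -np.inf
    for i in range(1, n):
        a, b = x[:i], x[i:]
        ss = i * (a.mean() - mean) ** 2 + (n - i) * (b.mean() - mean) ** 2
        if ss > best_ss:
            best_i, best_ss = i, ss
    return best_i

# A log with two clearly different porosity zones: the best split falls
# at the boundary between them.
log = [0.08, 0.09, 0.10, 0.09, 0.21, 0.22, 0.20, 0.23]
print(best_split(log))  # 4
```

Applying the same split recursively within each zone, with a stopping test on the variance reduction, yields a full zonation whose zone means can then be correlated between wells.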
Does the Use of Multifactorial Training Methods Increase Practitioners' Competence?
Pittman, Corinthus Omari; Lawdis, Katina
2017-01-01
Skilled therapy practitioners are required by their governing associations to seek professional development per licensure requirements. These requirements facilitate clinical reasoning and confidence during patient care. There are limited online professional development workshops, especially ones that offer multifactorial training as an…
Predictive value of testing for multiple genetic variants in multifactorial
A.C.J.W. Janssens (Cécile); M.J. Khoury (Muin Joseph)
2009-01-01
Multifactorial diseases such as type 2 diabetes, osteoporosis, and cardiovascular disease are caused by a complex interplay of many genetic and nongenetic factors, each of which conveys a minor increase in the risk of disease. Unraveling the genetic origins of these diseases is
Presbyastasis: a multifactorial cause of balance problems in the ...
African Journals Online (AJOL)
Presbyastasis: a multifactorial cause of balance problems in the elderly. C Rogers. Abstract. Presbyastasis is the result of age-related physiological changes in the three sensory systems and their central connections that contribute to balance. In all likelihood, presbyastasis is a complex condition involving many intertwined ...
Vitamin D Deficiency : Universal Risk Factor for Multifactorial Diseases?
de Borst, Martin H.; de Boer, Rudolf A.; Stolk, Ronald P.; Slaets, Joris P. J.; Wolffenbuttel, Bruce H. R.; Navis, Gerjan
In the Western world, the majority of morbidity and mortality are caused by multifactorial diseases. Some risk factors are related to more than one type of disease. These so-called universal risk factors are highly relevant to the population, as reduction of universal risk factors may reduce the
International Nuclear Information System (INIS)
Pirkle, F.L.
1981-04-01
STAARS is a new series which is being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of a particular data interpretation technique to geologic understanding for delineating regions favorable to uranium deposition is the primary concern of STAARS. Statements concerning the utility of a technique on aerial reconnaissance data as well as detailed aerial survey data will be included
Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir
2017-11-01
In this study, multivariate statistical techniques in combination with GIS are used to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season and 15 water quality parameters including TSS, TDS, pH, DO, BOD, Cl-, F-, NO3-, NO2-, SO42-, Ca, Mg, K, Zn and Pb were measured. The univariate overview of the water quality parameters is TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl- 51.494 ± 28.095 mg/l, F- 0.771 ± 0.153 mg/l, NO3- 2.211 ± 0.878 mg/l, NO2- 4.692 ± 5.971 mg/l, SO42- 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick field dust settles (TDS, TSS) in the nearby water body. PC2 explains 22.123% of the total variance and indicates agricultural influence (K, Ca, and NO2-). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO42-, Cl-, and K). PC4 is heavily and positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yields three major clusters for both the water parameters and the sampling sites. The site clusters showed a grouping pattern similar to that of the R-mode factor score map. The present work reveals a new scope for monitoring roadside water quality for future research in Bangladesh.
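The R-mode PCA workflow described above can be sketched in a few lines. The site-by-parameter matrix below is synthetic stand-in data, not the Savar measurements; only the procedure (standardize, eigendecompose the correlation matrix, rank components by explained variance) mirrors the study.

```python
import numpy as np

# Hypothetical matrix: rows = 19 sampling sites, columns = 5 water quality
# parameters (standing in for TSS, TDS, pH, ...). Values are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(19, 5))

# Standardize each parameter; R-mode PCA works on the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
R = np.corrcoef(Z, rowvar=False)

# Eigendecomposition, sorted so PC1 explains the most variance.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()   # fraction of total variance per PC
scores = Z @ eigvecs                  # component scores per site
print(explained.round(3))
```

Loadings (columns of `eigvecs`) would then be inspected to interpret each component, as the paper does for PC1 through PC5.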
A Comparison of Selected Statistical Techniques to Model Soil Cation Exchange Capacity
Khaledian, Yones; Brevik, Eric C.; Pereira, Paulo; Cerdà, Artemi; Fattah, Mohammed A.; Tazikeh, Hossein
2017-04-01
Cation exchange capacity (CEC) measures the soil's ability to hold positively charged ions and is an important indicator of soil quality (Khaledian et al., 2016). However, other soil properties are more commonly determined and reported, such as texture, pH, organic matter and biology. We attempted to predict CEC using different advanced statistical methods including monotone analysis of variance (MONANOVA), artificial neural networks (ANNs), principal components regression (PCR), and particle swarm optimization (PSO) in order to compare the utility of these approaches and identify the best predictor. We analyzed 170 soil samples from four different nations (USA, Spain, Iran and Iraq) under three land uses (agriculture, pasture, and forest). Seventy percent of the samples (120 samples) were selected as the calibration set and the remaining 50 samples (30%) were used as the prediction set. The results indicated that MONANOVA (R2 = 0.82 and root mean squared error (RMSE) = 6.32) and ANNs (R2 = 0.82 and RMSE = 5.53) were the best models to estimate CEC; PSO (R2 = 0.80 and RMSE = 5.54) and PCR (R2 = 0.70 and RMSE = 6.48) also worked well, and the overall results were very similar to each other. Clay (positively correlated) and sand (negatively correlated) were the most influential variables for predicting CEC for the entire data set, while the most influential variables for the various countries and land uses differed; CEC was affected by different variables in different situations. Although MONANOVA and ANNs provided good predictions of the entire dataset, PSO gives a formula to estimate soil CEC using commonly tested soil properties. Therefore, PSO shows promise as a technique to estimate soil CEC. Establishing effective pedotransfer functions to predict CEC would be productive where there are limitations of time and money, and other commonly analyzed soil properties are available. References Khaledian, Y., Kiani, F., Ebrahimi, S., Brevik, E.C., Aitkenhead
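Of the methods compared, principal components regression is the simplest to illustrate: project standardized predictors onto the leading components, then fit ordinary least squares on the scores. The soil properties and coefficients below are invented for the sketch; only the sample split (120 calibration samples) echoes the study.

```python
import numpy as np

# Synthetic "soil" data: clay and sand are strongly collinear, which is
# exactly the situation PCR is meant to handle.
rng = np.random.default_rng(1)
n = 120
clay = rng.uniform(5, 60, n)
sand = 100 - clay + rng.normal(0, 5, n)
om = rng.uniform(0.5, 6, n)                      # organic matter, made up
X = np.column_stack([clay, sand, om])
y = 0.5 * clay - 0.1 * sand + 2.0 * om + rng.normal(0, 2, n)  # fake "CEC"

# Standardize predictors and keep the two leading principal components.
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 2
T = Z @ Vt[:k].T                                 # component scores

# Ordinary least squares on the component scores (plus an intercept).
A = np.column_stack([np.ones(n), T])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
rmse = np.sqrt(np.mean((y - pred) ** 2))
print(round(rmse, 2))
```

Because the components are orthogonal, the regression is numerically stable even though clay and sand are nearly redundant.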
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
Experimental design techniques in statistical practice a practical software-based approach
Gardiner, W P
1998-01-01
Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training for use of designs and teaches statistical and non-statistical skills in design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.
Wu, Jianning; Wu, Bin
2015-01-01
The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables from the left and right lower limbs; that is, the discrimination of...
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
Individual differences in gendered person perception: a multifactorial study
2008-01-01
The psychological study of gender has evolved to comprise both dispositional and social cognitive perspectives (Morawski, 1987). Recent theoretical debates within these fields have centred on multifactorial and unifactorial conceptions of gendered factors (Spence, 1993), and the cognitive representation of gender (Howard & Hollander, 1997). This study aimed to investigate specific phenomena implicated in the above approaches. Firstly, it assessed the influence of using gender as a bas...
Energy Technology Data Exchange (ETDEWEB)
Solaimani, Mohiuddin [Univ. of Texas-Dallas, Richardson, TX (United States); Iftekhar, Mohammed [Univ. of Texas-Dallas, Richardson, TX (United States); Khan, Latifur [Univ. of Texas-Dallas, Richardson, TX (United States); Thuraisingham, Bhavani [Univ. of Texas-Dallas, Richardson, TX (United States); Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-09-01
Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint, with higher accuracy, by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
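A minimal single-metric version of the window-based statistical idea (learn a mean/std model during training, then z-test each window's mean) might look like the sketch below. The window size, threshold, and injected group anomaly are illustrative choices, not the paper's settings, and a real deployment would run this per metric over Spark streams.

```python
import numpy as np

# Synthetic stream with a group anomaly injected at samples 400-429.
rng = np.random.default_rng(2)
stream = rng.normal(0.0, 1.0, 600)
stream[400:430] += 6.0

window = 30
mu, sigma = stream[:300].mean(), stream[:300].std()   # training phase

anomalous = []                                        # flagged window starts
for start in range(300, len(stream), window):
    w = stream[start:start + window]
    # z-score of the window mean under the trained model
    z = abs(w.mean() - mu) / (sigma / np.sqrt(len(w)))
    if z > 3.0:
        anomalous.append(start)
print(anomalous)
```

Windows overlapping the injected burst produce window-mean z-scores far above the threshold, so the group anomaly is flagged even though individual samples may look unremarkable.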
National Research Council Canada - National Science Library
Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E
2008-01-01
... for a particular dependent variable. These capabilities are combined into a unique visualization system that is demonstrated via a North Atlantic hurricane climate study using a systematic workflow. This research corroborates the notion that enhanced parallel coordinates coupled with statistical analysis can be used for more effective knowledge discovery and confirmation in complex, real-world data sets.
Spectral deformation techniques applied to the study of quantum statistical irreversible processes
International Nuclear Information System (INIS)
Courbage, M.
1978-01-01
A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)
Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]
Warner, Rebecca M.
2007-01-01
This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…
2008-07-07
analyzing multivariate data sets. The system was developed using the Java Development Kit (JDK) version 1.5, and it yields interactive performance on a... script and captures output from MATLAB’s “regress” and “stepwisefit” utilities, which perform simple and stepwise regression, respectively. The MATLAB... Statistical Association, vol. 85, no. 411, pp. 664–675, 1990. [9] H. Hauser, F. Ledermann, and H. Doleisch, “Angular brushing of extended parallel coordinates
Energy Technology Data Exchange (ETDEWEB)
Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)
2015-04-15
Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high frequency stress wave signals captured from a newly developed noninvasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals were introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube as the reference and four steel tubes with a through-hole artificial defect at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of a defect in carbon steel tubes from AE signals captured using the non-invasive VIAE technique.
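The statistical AE features named above (r.m.s., energy, crest factor) are straightforward to compute from a sampled signal. The decaying sinusoid below is a stand-in for a captured stress-wave burst, and the sampling rate is an assumption for the sketch, not a VIAE parameter.

```python
import numpy as np

# Simulated AE burst: a decaying 20 kHz sinusoid sampled at 100 kHz.
fs = 100_000                                   # assumed sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)
signal = np.exp(-400 * t) * np.sin(2 * np.pi * 20_000 * t)

rms = np.sqrt(np.mean(signal ** 2))            # root mean square
energy = np.sum(signal ** 2)                   # total signal energy
crest = np.max(np.abs(signal)) / rms           # peak-to-RMS ratio
print(round(rms, 4), round(energy, 2), round(crest, 2))
```

A defect that prolongs ringing or changes peak amplitude shifts these three numbers, which is why they can serve as automatic defect indicators.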
Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat
2009-01-01
Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
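A two-class Fisher discriminant of the kind used to separate the flour types can be sketched as follows. The "magnesium" and "sodium" values are synthetic, chosen only to mimic two separable groups; the study's 100% correct assignment is a property of its real data, not guaranteed by this toy.

```python
import numpy as np

# Synthetic mineral measurements (Mg, Na) for two hypothetical flours.
rng = np.random.default_rng(3)
cavendish = rng.normal([30.0, 10.0], 1.5, size=(20, 2))
dream = rng.normal([24.0, 14.0], 1.5, size=(20, 2))

# Fisher's linear discriminant: w = Sw^-1 (m1 - m2).
m1, m2 = cavendish.mean(0), dream.mean(0)
Sw = np.cov(cavendish, rowvar=False) + np.cov(dream, rowvar=False)
w = np.linalg.solve(Sw, m1 - m2)
threshold = w @ (m1 + m2) / 2                  # midpoint between class means

# Classify: projection above the threshold -> Cavendish.
correct = np.sum(cavendish @ w > threshold) + np.sum(dream @ w <= threshold)
print(correct, "of", 40)
```

The discriminant weights in `w` play the same interpretive role as the study's finding that magnesium and sodium contribute most to the separation.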
Tawfik, Hazem
1991-01-01
A relatively simple, inexpensive, and generic technique that could be used in both laboratories and some operational site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the setup procedure, data collection, and analysis using this new technique, which was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate PUMA 500 industrial robot. The data were statistically analyzed to provide insight into the performance of the systems and components of the robot. The same technique was also used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of currently existing robots such as the ASEA, the high speed controller, the Automated Radiator Inspection Device (ARID), etc. Recommendations were also made to develop and establish other calibration techniques more suitable for site calibration environments and robot certification.
On some surprising statistical properties of a DNA fingerprinting technique called AFLP
Gort, G.
2010-01-01
AFLP is a widely used DNA fingerprinting technique, resulting in band absence - presence profiles, like a bar code. Bands represent DNA fragments, sampled from the genome of an individual plant or other organism. The DNA fragments travel through a lane of an electrophoretic gel or microcapillary
International Nuclear Information System (INIS)
Carew, John F.; Finch, Stephen J.; Lois, Lambros
2003-01-01
The calculated >1-MeV pressure vessel fluence is used to determine the fracture toughness and integrity of the reactor pressure vessel. It is therefore of the utmost importance to ensure that the fluence prediction is accurate and unbiased. In practice, this assurance is provided by comparing the predictions of the calculational methodology with an extensive set of accurate benchmarks. A benchmarking database is used to provide an estimate of the overall average measurement-to-calculation (M/C) bias in the calculations. This average is used as an ad hoc multiplicative adjustment to the calculations to correct for the observed calculational bias. However, this average only provides a well-defined and valid adjustment of the fluence if the M/C data are homogeneous; i.e., the data are statistically independent and there is no correlation between subsets of M/C data. Typically, the identification of correlations between the errors in the database M/C values is difficult because the correlation is of the same magnitude as the random errors in the M/C data and varies substantially over the database. In this paper, an evaluation of a reactor dosimetry benchmark database is performed to determine the statistical validity of the adjustment to the calculated pressure vessel fluence. Physical mechanisms that could potentially introduce a correlation between the subsets of M/C ratios are identified and included in a multiple regression analysis of the M/C data. Rigorous statistical criteria are used to evaluate the homogeneity of the M/C data and determine the validity of the adjustment. For the database evaluated, the M/C data are found to be strongly correlated with dosimeter response threshold energy and dosimeter location (e.g., cavity versus in-vessel). It is shown that because of the inhomogeneity in the M/C data, for this database, the benchmark data do not provide a valid basis for adjusting the pressure vessel fluence. The statistical criteria and methods employed in
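The multiple-regression homogeneity check can be illustrated on fabricated M/C data: regress the ratios on threshold energy plus a cavity/in-vessel indicator, then ask whether either coefficient differs significantly from zero. All numbers and effect sizes here are invented; significant coefficients would signal exactly the kind of inhomogeneity the paper reports.

```python
import numpy as np

# Fabricated benchmark data: 60 M/C ratios with a built-in dependence on
# threshold energy and dosimeter location (1 = cavity, 0 = in-vessel).
rng = np.random.default_rng(6)
n = 60
energy = rng.uniform(0.5, 6.0, n)              # MeV, response threshold
cavity = rng.integers(0, 2, n)
mc = 0.95 + 0.02 * energy + 0.05 * cavity + rng.normal(0, 0.03, n)

# Ordinary least squares: [intercept, energy slope, cavity shift].
X = np.column_stack([np.ones(n), energy, cavity])
beta, *_ = np.linalg.lstsq(X, mc, rcond=None)
resid = mc - X @ beta
var = np.sum(resid ** 2) / (n - 3)             # residual variance
se = np.sqrt(var * np.diag(np.linalg.inv(X.T @ X)))
t_stats = beta / se                            # |t| >> 2 -> inhomogeneity
print(np.round(t_stats, 1))
```

If the M/C data were homogeneous, the energy and cavity t-statistics would be small and a single average adjustment would be defensible.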
Statistical techniques for automating the detection of anomalous performance in rotating machinery
International Nuclear Information System (INIS)
Piety, K.R.; Magette, T.E.
1978-01-01
Surveillance techniques which extend the sophistication of existing automated systems for monitoring industrial rotating equipment are described. The monitoring system automatically established limiting criteria during an initial learning period of a few days; subsequently, while monitoring the test rotor during an extended period of normal operation, it experienced a false alarm rate of 0.5%. At the same time, the monitoring system successfully detected all fault types that were introduced into the test setup. Tests on real equipment are needed to provide final verification of the monitoring techniques. There are areas that would profit from additional investigation in the laboratory environment. A comparison of the relative value of alternate descriptors under given fault conditions would be worthwhile. This should be pursued in conjunction with extending the set of fault types available, e.g., bearing problems. Other tests should examine the effects of using fewer (more coarse) intervals to define the lumped operational states. Finally, techniques to diagnose the most probable fault should be developed by drawing upon the extensive data automatically logged by the monitoring system
Directory of Open Access Journals (Sweden)
Nsikak U Benson
Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation tests were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.
Statistical techniques for automating the detection of anomalous performance in rotating machinery
International Nuclear Information System (INIS)
Piety, K.R.; Magette, T.E.
1979-01-01
The level of technology utilized in automated systems that monitor industrial rotating equipment and the potential of alternative surveillance methods are assessed. It is concluded that changes in surveillance methodology would upgrade ongoing programs and yet still be practical for implementation. An improved anomaly recognition methodology is formulated and implemented on a minicomputer system. The effectiveness of the monitoring system was evaluated in laboratory tests on a small rotor assembly, using vibrational signals from both displacement probes and accelerometers. Time and frequency domain descriptors are selected to compose an overall signature that characterizes the monitored equipment. Limits for normal operation of the rotor assembly are established automatically during an initial learning period. Thereafter, anomaly detection is accomplished by applying an approximate statistical test to each signature descriptor. As demonstrated over months of testing, this monitoring system is capable of detecting anomalous conditions while exhibiting a false alarm rate below 0.5%.
Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E
2018-04-26
Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month long monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentrations less than 1 mg/L in the upstream plug-flow reactor (middle of the oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights on the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants.
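Spearman's rank correlation, used above to track the N2O-nitrite dependency within each sub-period, reduces to the Pearson correlation of ranks. The series below are synthetic hourly values, not the plant's sensor data, and the simple rank computation ignores ties for brevity.

```python
import numpy as np

def spearman(x, y):
    """Spearman's rank correlation (ties ignored for brevity)."""
    rx = np.argsort(np.argsort(x))        # ranks of x
    ry = np.argsort(np.argsort(y))        # ranks of y
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic hourly series: N2O flux loosely tied to nitrite concentration.
rng = np.random.default_rng(4)
nitrite = rng.gamma(2.0, 0.5, 200)                 # mg/L, illustrative
n2o = 0.8 * nitrite + rng.normal(0, 0.3, 200)

rho = spearman(nitrite, n2o)
print(round(rho, 2))
```

Computing `rho` separately per sub-period, as the study does, is what reveals that the strength of the dependency fluctuates over the campaign.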
Directory of Open Access Journals (Sweden)
Qing Gu
2016-03-01
Qiandao Lake (Xin’an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April–July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations of different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.
An Efficient Statistical Computation Technique for Health Care Big Data using R
Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.
2017-08-01
Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosis of a problem at an earlier stage increases the chances of survival and fast recovery, reducing the time of recovery and the cost associated with treatment. One such medical issue is cancer, and breast cancer has been identified as the second leading cause of cancer death. If detected in the early stage it can be cured. Once a patient is detected with a breast cancer tumor, it should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. Day-to-day, new records are added, which increases the data to be classified, and this tends to become a big data problem. The algorithm is implemented in R, which is the most popular platform for applying machine learning algorithms to statistical computing. Experimentation is conducted using various classification evaluation metrics on various values of k. The results show that the KNN algorithm performs better than existing models.
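The KNN classification step (implemented in R in the paper) can be sketched in a few lines of pure NumPy. The toy 2-D points below stand in for tumor feature records; `k = 3` is an illustrative choice, and real use would tune k as the experiments above describe.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest neighbors."""
    d = np.linalg.norm(X_train - x, axis=1)    # Euclidean distances
    nearest = np.argsort(d)[:k]                # indices of the k closest
    votes = y_train[nearest]
    return np.bincount(votes).argmax()         # majority class

# Toy data: class 0 clustered near (0, 0), class 1 near (5, 5).
X_train = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]], float)
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([0.5, 0.5])))   # near class 0
print(knn_predict(X_train, y_train, np.array([5.5, 5.5])))   # near class 1
```

Because KNN stores all training records and scans them at query time, each day's added records increase classification cost directly, which is the big-data pressure the paper notes.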
Srivastava, Shweta; Vatsalya, Vatsalya; Arora, Ashoo; Arora, Kashmiri L; Karch, Robert
2012-03-22
Diarrhoea is one of the leading causes of morbidity and mortality in developing countries in Africa and South Asia, such as India. The prevalence of diarrheal diseases in those countries is higher than in the developed Western world and has largely been associated with socio-economic and sanitary conditions. However, presently available data have not been sufficiently evaluated to study the role of other factors like healthcare development, population density, sex and regional influence on diarrheal prevalence patterns. This study was performed to understand the relationship of diarrheal prevalence with specific measures, namely healthcare services development, demographics, population density, socio-economic conditions, sex, and regional prevalence patterns in India. Data from annual national health reports and other epidemiological studies were included and statistically analyzed. Our results demonstrate significant correlation of the disease prevalence pattern with certain measures like healthcare centers, population growth rate, sex and region-specific morbidity. Available information on sanitation, like water supply and toilet availability, and socioeconomic conditions, like poverty and literacy measures, could only be associated as trends of significance. This study can be valuable for the formulation of appropriate strategies focused on important measures like healthcare resources, population growth and regional significance to evaluate prevalence patterns and management of diarrhoea locally and globally.
Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan
2012-03-01
Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This new reconstruction method combines the idealized system representation, as we know it from the standard Filtered Back Projection (FBP) algorithm, and the strength of iterative reconstruction by including a noise model in the reconstruction scheme. It studies how noise propagates through the reconstruction steps, feeds this model back into the loop and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper the effect of ASiR on the contrast to noise ratio is studied using the low contrast module of the Catphan phantom. The experiments were done on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast to noise ratio the images from ASiR can be obtained using 60% less current, leading to a reduction in dose of the same amount.
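The contrast-to-noise comparison underlying the phantom study reduces to a simple ratio: the difference between the insert and background means divided by the background noise. The pixel statistics below are simulated, not LightSpeed VCT measurements; only the metric itself is standard.

```python
import numpy as np

# Simulated pixel values for a low-contrast insert and its background
# (roughly HU-like numbers; purely illustrative).
rng = np.random.default_rng(5)
roi = rng.normal(110.0, 8.0, 500)          # low-contrast insert region
background = rng.normal(100.0, 8.0, 500)

# CNR = (mean_roi - mean_background) / background standard deviation.
cnr = (roi.mean() - background.mean()) / background.std()
print(round(cnr, 2))
```

In the paper's terms, a reconstruction that lowers the background noise (the denominator) at fixed contrast raises CNR, which is how ASiR achieves equal CNR at reduced current.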
Wang, Hao; Wang, Qunwei; He, Ming
2018-05-01
In order to investigate and improve the level of detection technology for water content in liquid chemical reagents in domestic laboratories, proficiency testing provider PT0031 (CNAS) organized a proficiency testing program for water content in toluene; 48 laboratories from 18 provinces/cities/municipalities took part in the PT. This paper introduces the implementation process of the proficiency testing for determination of water content in toluene, including sample preparation, homogeneity and stability testing, and the statistical treatment of results using the iterative robust statistics technique; it also summarizes and analyzes the different test standards widely used in the laboratories and puts forward technical suggestions for improving the quality of water content testing. Satisfactory results were obtained by 43 laboratories, amounting to 89.6% of the total participating laboratories.
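An iterative robust mean and standard deviation in the spirit of ISO 13528 Algorithm A (a common choice for "iterative robust statistics" in proficiency testing, assumed here rather than confirmed by the paper) can be sketched as follows. The laboratory results are invented; the winsorizing constant 1.5 and the factors 1.483 and 1.134 follow the standard.

```python
import numpy as np

def robust_mean_sd(x, tol=1e-6, max_iter=100):
    """Iterative robust mean/SD in the style of ISO 13528 Algorithm A."""
    x = np.asarray(x, float)
    m = np.median(x)
    s = 1.483 * np.median(np.abs(x - m))       # scaled-MAD starting value
    for _ in range(max_iter):
        lo, hi = m - 1.5 * s, m + 1.5 * s
        w = np.clip(x, lo, hi)                 # winsorize extreme results
        m_new, s_new = w.mean(), 1.134 * w.std(ddof=1)
        if abs(m_new - m) < tol and abs(s_new - s) < tol:
            break
        m, s = m_new, s_new
    return m, s

# Simulated lab results for water content in toluene (%), one outlying lab.
results = [0.051, 0.049, 0.050, 0.052, 0.048, 0.050, 0.090]
m, s = robust_mean_sd(results)
z_scores = (np.array(results) - m) / s         # |z| > 3 is unsatisfactory
print(round(m, 4), round(s, 4))
```

The outlying laboratory barely influences the assigned value and robust SD, yet receives a large z-score, which is the whole point of the robust treatment.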
International Nuclear Information System (INIS)
Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.
1986-01-01
High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site, reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified those site areas which would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in “clean” areas of the site ranged from 45 to 60%.
Teleni, Vicki; Baldauf, Richard B., Jr.
A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…
Directory of Open Access Journals (Sweden)
Gledsneli Maria Lima Lins
2010-12-01
Water has a decisive influence on populations' quality of life – specifically in areas like urban supply, drainage, and effluent treatment – due to its strong impact on public health. Rational water use constitutes the greatest challenge faced by water demand management, mainly with regard to urban household water consumption. This makes it important to develop research to assist water managers and public policy-makers in planning and formulating water demand measures that allow rational urban water use to be achieved. This work applied the multivariate techniques Factor Analysis and Multiple Linear Regression Analysis – in order to determine the contribution of socioeconomic and climatic variables to monthly changes in urban household consumption – to two districts of Campina Grande city (State of Paraíba, Brazil). The districts were chosen on a socioeconomic criterion (income level) so as to evaluate their water consumers' behavior. A 9-year monthly data series (from 2000 up to 2008) was used, comprising family income, water tariff, and number of household connections (economies) – as socioeconomic variables – and average temperature and precipitation as climatic variables. For both selected districts of Campina Grande city, the obtained results point to the variables "water tariff" and "family income" as indicators of these districts' household consumption.
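The multiple linear regression step can be sketched with numpy least squares on synthetic data; the predictor names and coefficients below are assumptions for illustration, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 108  # 9 years of monthly observations

# Hypothetical predictors (names assumed, not the study's data)
income = rng.normal(1500, 300, n)   # family income
tariff = rng.normal(2.0, 0.3, n)    # water tariff
temp   = rng.normal(26, 3, n)       # average temperature

# Synthetic monthly household consumption (m^3): tariff depresses it,
# income and temperature raise it
consumption = (10 + 0.004 * income - 2.0 * tariff + 0.1 * temp
               + rng.normal(0, 0.5, n))

# Multiple linear regression via ordinary least squares
X = np.column_stack([np.ones(n), income, tariff, temp])
beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
# beta ~ [intercept, income, tariff, temperature] coefficients
```

The sign and size of the fitted coefficients indicate each variable's contribution, which is how "water tariff" and "family income" emerge as the dominant indicators.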
Garrett, John; Li, Yinsheng; Li, Ke; Chen, Guang-Hong
2017-03-01
Digital breast tomosynthesis (DBT) is a three dimensional (3D) breast imaging modality in which projections are acquired over a limited angular span around the compressed breast and reconstructed into image slices parallel to the detector. DBT has been shown to help alleviate the breast tissue overlapping issues of two dimensional (2D) mammography. Since the overlapping tissues may simulate cancer masses or obscure true cancers, this improvement is critically important for improved breast cancer screening and diagnosis. In this work, a model-based image reconstruction method is presented to show that spatial resolution in DBT volumes can be maintained while dose is reduced using the presented method when compared to that of a state-of-the-art commercial reconstruction technique. Spatial resolution was measured in phantom images and subjectively in a clinical dataset. Noise characteristics were explored in a cadaver study. In both the quantitative and subjective results the image sharpness was maintained and overall image quality was maintained at reduced doses when the model-based iterative reconstruction was used to reconstruct the volumes.
International Nuclear Information System (INIS)
Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.
1982-11-01
One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables.
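The convergent k-means step can be sketched with a minimal numpy implementation of Lloyd's algorithm; the synthetic three-variable "radiometric" data and cluster layout below are illustrative assumptions, not the Copper Mountain data:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal convergent k-means (Lloyd's algorithm)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # squared distances of every point to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # recompute centers; keep the old center if a cluster empties
        new = np.array([X[labels == j].mean(0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Synthetic data: three "background" groups plus a small outlier group
rng = np.random.default_rng(2)
bg1 = rng.normal([1.0, 2.0, 1.5], 0.1, (200, 3))
bg2 = rng.normal([2.5, 1.0, 1.0], 0.1, (200, 3))
bg3 = rng.normal([1.5, 3.0, 2.5], 0.1, (200, 3))
out = rng.normal([6.0, 2.0, 1.5], 0.1, (20, 3))  # anomalously high values
X = np.vstack([bg1, bg2, bg3, out])
labels, centers = kmeans(X, 4)
```

In practice the k-means result is often seeded from, or cross-checked against, a hierarchical clustering, as the report describes.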
Malik, Riffat Naseem; Hashmi, Muhammad Zaffar
2017-10-01
The Himalayan foothills streams of Pakistan play an important role in drinking water supply and irrigation of farmlands; thus, their water quality is closely related to public health. Multivariate techniques were applied to examine spatial and seasonal trends and the sources of metal contamination in the Himalayan foothills streams, Pakistan. Grab surface water samples were collected from different sites (5-15 cm water depth) in pre-washed polyethylene containers. A Fast Sequential Atomic Absorption Spectrophotometer (Varian FSAA-240) was used to measure the metal concentrations. Concentrations of Ni, Cu, and Mn were higher in the pre-monsoon season than in the post-monsoon season. Cluster analysis identified impaired, moderately impaired and least impaired clusters based on water parameters. Discriminant function analysis indicated that spatial variability in water was due to temperature, electrical conductivity, nitrates, iron and lead, whereas seasonal variations were correlated with 16 physicochemical parameters. Factor analysis identified municipal and poultry waste, automobile activities, surface runoff, and soil weathering as major sources of contamination. Levels of Mn, Cr, Fe, Pb, Cd, Zn and alkalinity were above the WHO and USEPA standards for surface water. The results of the present study will help higher authorities in the management of the Himalayan foothills streams.
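The factor-extraction idea (principal components of the correlation matrix, with each retained factor read as a pollution source) can be sketched in numpy; the two latent "sources" and six parameters below are synthetic assumptions, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300

# Two hypothetical latent pollution sources driving six water parameters
municipal = rng.normal(0, 1, n)   # e.g. municipal/poultry waste
runoff    = rng.normal(0, 1, n)   # e.g. surface runoff / soil weathering

data = np.column_stack([
    2.0 * municipal + rng.normal(0, 0.3, n),   # NH4-N-like
    1.5 * municipal + rng.normal(0, 0.3, n),   # BOD-like
    1.0 * municipal + rng.normal(0, 0.3, n),   # EC-like
    2.0 * runoff + rng.normal(0, 0.3, n),      # Fe-like
    1.5 * runoff + rng.normal(0, 0.3, n),      # Mn-like
    1.0 * runoff + rng.normal(0, 0.3, n),      # turbidity-like
])

# Factor extraction via PCA on the correlation matrix
corr = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = eigvals.argsort()[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
explained = eigvals / eigvals.sum()
# two factors should capture most of the variance here
```

The loadings in the leading eigenvectors group the parameters by source, which is how factor analysis attributes contamination to distinct origins.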
International Nuclear Information System (INIS)
Stevens, D.L.; Dagle, G.E.
1986-01-01
Retention and translocation of inhaled radionuclides are often estimated from the sacrifice of multiple animals at different time points. The data for each time point can be averaged and a smooth curve fitted to the mean values, or a smooth curve may be fitted to the entire data set. However, an analysis based on means may not be the most appropriate if there is substantial variation in the initial amount of the radionuclide inhaled or if the data are subject to outliers. A method has been developed that takes account of these problems. The body burden is viewed as a compartmental system, with the compartments identified with body organs. A median polish is applied to the multiple logistic transform of the compartmental fractions (compartment burden/total burden) at each time point. A smooth function is fitted to the results of the median polish. This technique was applied to data from beagles exposed to an aerosol of ²³⁹Pu(NO₃)₄. Models of retention and translocation for lungs, skeleton, liver, kidneys, and tracheobronchial lymph nodes were developed and used to estimate dose. 4 refs., 3 figs., 4 tabs.
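Tukey's median polish, the robust core of the method described above, can be implemented in a few lines of numpy; the small two-way table below is synthetic, with one outlying value included to show the robustness:

```python
import numpy as np

def median_polish(table, iters=10):
    """Tukey's median polish: decompose a two-way table into
    overall + row effects + column effects + residuals."""
    resid = np.array(table, dtype=float)
    overall = 0.0
    row = np.zeros(resid.shape[0])
    col = np.zeros(resid.shape[1])
    for _ in range(iters):
        rmed = np.median(resid, axis=1)       # sweep out row medians
        row += rmed
        resid -= rmed[:, None]
        overall += np.median(row)
        row -= np.median(row)
        cmed = np.median(resid, axis=0)       # sweep out column medians
        col += cmed
        resid -= cmed[None, :]
        overall += np.median(col)
        col -= np.median(col)
    return overall, row, col, resid

# Rows: animals; columns: time points (logit compartment fractions, synthetic)
table = [[1.2, 0.9, 0.4],
         [1.5, 1.1, 0.7],
         [1.1, 0.8, 0.3],
         [5.0, 1.0, 0.5]]  # one outlying value
overall, row, col, resid = median_polish(table)
```

Unlike a mean-based two-way fit, the outlying cell lands almost entirely in the residual table rather than distorting the row and column effects.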
Statistical signal processing techniques for coherent transversal beam dynamics in synchrotrons
Energy Technology Data Exchange (ETDEWEB)
Alhumaidi, Mouhammad
2015-03-04
identifying and analyzing the betatron oscillation sourced from the kick based on its mixing and temporal patterns. The accelerator magnets can generate unwanted spurious linear and non-linear fields due to fabrication errors or aging. These error fields can excite undesired resonances which, together with the space-charge tune spread, lead to long-term beam losses and reduced dynamic aperture. Therefore, knowledge of the linear and non-linear magnet errors in circular accelerator optics is crucial for controlling and compensating resonances and their consequent beam losses and beam quality deterioration. This is indispensable, especially for high-intensity machines. Fortunately, the relationship between the beam offset oscillation signals recorded at the BPMs is a manifestation of the accelerator optics, and can therefore be exploited to determine the linear and non-linear optics components. Thus, beam transversal oscillations can be excited deliberately for diagnostic purposes during accelerator operation. In this thesis, we propose a novel method for detecting and estimating the non-linear optics lattice components located between the locations of two BPMs by analyzing the beam offset oscillation signals of a BPM triple containing these two BPMs. Depending on the non-linear components between the locations of the BPM triple, the relationship between the beam offsets follows a corresponding multivariate polynomial. After calculating the covariance matrix of the polynomial terms, the Generalized Total Least Squares method is used to find the model parameters, and thus the non-linear components. A bootstrap technique is used to detect the existing polynomial model orders by means of multiple hypothesis testing, and to determine confidence intervals for the model parameters.
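The parameter-estimation step can be illustrated with a basic total least squares fit via SVD, a simplified stand-in for the Generalized Total Least Squares used in the thesis; the quadratic model (a sextupole-like non-linearity) and noise levels below are assumptions for illustration:

```python
import numpy as np

def tls(A, b):
    """Basic total least squares via SVD: errors are allowed in both
    the regressor matrix A and the observations b."""
    n = A.shape[1]
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z)
    V = Vt.T
    return -V[:n, n:] / V[n, n]   # classical TLS solution

rng = np.random.default_rng(4)
x_true = rng.normal(0, 1, 500)          # "BPM offset" samples

# Polynomial model between BPM offsets; noise on both sides
A_clean = np.column_stack([x_true, x_true ** 2])
coeffs_true = np.array([1.0, 0.3])
b_clean = A_clean @ coeffs_true
A = A_clean + rng.normal(0, 0.01, A_clean.shape)
b = (b_clean + rng.normal(0, 0.01, b_clean.shape)).reshape(-1, 1)

coeffs = tls(A, b).ravel()   # ~ [1.0, 0.3]
```

Because both the regressors and the observations are BPM readings with measurement noise, a total-least-squares formulation is the natural choice over ordinary least squares here.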
Khan, Firdos; Pilz, Jürgen
2016-04-01
South Asia is under severe impacts of changing climate and global warming. The last two decades have shown that climate change is happening, and the first decade of the 21st century was the warmest decade ever recorded over Pakistan, where the temperature reached 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are adversely affected, causing floods, cyclones and hurricanes in the region, which in turn have impacts on agriculture, water, health, etc. To cope with the situation, it is important to conduct impact assessment studies and adopt adaptation and mitigation measures. For impact assessment studies, we need climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types: statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan, chosen in part because the contribution of monsoon rains in this area is more than 80% of the total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, i.e. quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers compared with ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can be used to combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological
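The abstract combines quantile regression with copula modeling; a minimal numpy-only sketch of the copula half is given below, assuming a Gaussian copula and exponential marginals as a stand-in for precipitation distributions (variable names and parameter values are illustrative, not the study's):

```python
import numpy as np
from math import erf

rng = np.random.default_rng(5)
n = 5000

def phi(z):
    """Standard normal CDF (probability integral transform)."""
    return np.array([0.5 * (1.0 + erf(v / np.sqrt(2.0))) for v in z])

# Gaussian copula: draw correlated normals, map them to uniforms,
# then to the desired marginals (advantage 2 and 3 of the abstract:
# dependence is preserved while marginals may differ in type)
rho = 0.7
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u1, u2 = phi(z[:, 0]), phi(z[:, 1])

# Exponential marginals via the inverse CDF (hypothetical rainfall, mm)
precip_site_a = -5.0 * np.log(1.0 - u1)   # mean 5 mm
precip_site_b = -8.0 * np.log(1.0 - u2)   # mean 8 mm
```

The rank correlation between the two sites survives the marginal transforms, which is exactly the property that makes copulas attractive for multi-site downscaling.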
Multifactorial Understanding of Ion Abundance in Tandem Mass Spectrometry Experiments.
Fazal, Zeeshan; Southey, Bruce R; Sweedler, Jonathan V; Rodriguez-Zas, Sandra L
2013-01-29
In a bottom-up shotgun approach, the proteins of a mixture are enzymatically digested, separated, and analyzed via tandem mass spectrometry. The mass spectra relating fragment ion intensities (abundance) to the mass-to-charge ratio are used to deduce the amino acid sequence and identify the peptides and proteins. The variables that influence intensity were characterized using a multi-factorial mixed-effects model, ten-fold cross-validation, and stepwise feature selection on 6,352,528 fragment ions from 61,543 peptide ions. Intensity was higher in fragment ions that had no neutral mass loss (relative to any mass loss) or that had a +1 charge state. Peptide ions classified as non-mobile for proton mobility had the lowest intensity of all mobility levels. Higher basic residue (arginine, lysine or histidine) counts in the peptide ion and low counts in the fragment ion were associated with lower fragment ion intensities. Higher counts of proline in peptide and fragment ions were associated with lower intensities. These results are consistent with the mobile proton theory. Opposite trends between peptide and fragment ion counts and intensity may be due to the different impact of the factor under consideration at different stages of the MS/MS experiment, or to the different distribution of observations across peptide and fragment ion levels. The presence of basic residues at all three positions next to the fragmentation site was associated with lower fragment ion intensity. The presence of proline proximal to the fragmentation site enhanced fragmentation and had the opposite trend when located distant from the site. A positive association between fragment ion intensity and the presence of sulfur-containing residues (cysteine and methionine) in the vicinity of the fragmentation site was identified. These results highlight the multi-factorial nature of fragment ion intensity and could improve algorithms for peptide identification and simulation in tandem mass spectrometry experiments.
Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.
2018-05-01
The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%) fabricated through a powder metallurgy technique was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA approach indicated that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composite. A very close correlation between experimental and ANN outputs was achieved by implementing the model. Finally, the ANN model was effectively used to find the influence of various control factors on the wear behaviour of the hybrid composites.
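The ANOVA percentage-contribution calculation can be sketched for a balanced full factorial design; the factor levels and the synthetic wear-loss model below are assumptions for illustration, not the measured data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Full factorial: zirconia wt% x sliding distance, 3 replicates each
zirconia = np.array([0.0, 3.0, 6.0, 9.0])
distance = np.array([500.0, 1000.0, 1500.0])
reps = 3

# Synthetic wear loss: sliding distance dominates, zirconia reduces wear
y = np.empty((len(zirconia), len(distance), reps))
for i, z in enumerate(zirconia):
    for j, d in enumerate(distance):
        y[i, j] = 10 + 0.01 * d - 0.5 * z + rng.normal(0, 0.2, reps)

grand = y.mean()
# Main-effect sums of squares for a balanced design
ss_zirc = len(distance) * reps * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_dist = len(zirconia) * reps * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()
ss_total = ((y - grand) ** 2).sum()

pct_zirc = 100 * ss_zirc / ss_total
pct_dist = 100 * ss_dist / ss_total
# sliding distance contributes the larger share in this synthetic setup
```

The ratio of each factor's sum of squares to the total is the "percentage contribution" that ANOVA tables report for such designs.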
Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.
2017-12-01
Fire disasters affect modern societies at global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact on a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over the last years a number of efforts from the United States Geological Survey (USGS) and National Weather Service (NWS) have focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released USGS database that reports a total of 1500 storms that did or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
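The threat score (critical success index) used to report the accuracy can be computed from a contingency of hits, misses and false alarms; the storm outcomes below are hypothetical:

```python
import numpy as np

def threat_score(pred, obs):
    """Threat score / critical success index:
    hits / (hits + misses + false alarms)."""
    pred, obs = np.asarray(pred, bool), np.asarray(obs, bool)
    hits = np.sum(pred & obs)
    misses = np.sum(~pred & obs)
    false_alarms = np.sum(pred & ~obs)
    return hits / (hits + misses + false_alarms)

# Hypothetical verification of debris-flow predictions for 10 storms
obs  = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
pred = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]
print(threat_score(pred, obs))  # 4 hits, 1 miss, 1 false alarm -> 0.666...
```

Unlike plain accuracy, the threat score ignores correct negatives, which matters when non-triggering storms vastly outnumber triggering ones.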
Mullan, Donal; Chen, Jie; Zhang, Xunchang John
2016-02-01
Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various different SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
Muhammad, Said; Tahir Shah, M; Khan, Sardar
2010-10-01
The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from springs, streams and the Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physico-chemical parameters with permissible limits set by the Pakistan Environmental Protection Agency and the World Health Organization. Most of the studied parameters were found within their respective permissible limits; however, in some samples the iron and arsenic concentrations exceeded them. For health risk assessment of arsenic, the average daily dose, hazard quotient (HQ) and cancer risk were calculated by using statistical formulas. The values of HQ were found to be >1 in the samples collected from Jabba and Dubair. The pollution load was also assessed by using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
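The health-risk arithmetic (average daily dose, hazard quotient, cancer risk) follows standard USEPA-style formulas; the sketch below uses illustrative intake, body-weight and toxicity values, which are assumptions rather than the study's parameters:

```python
def average_daily_dose(conc_ug_l, intake_l=2.0, body_weight_kg=72.0):
    """ADD in mg/kg-day from a drinking-water concentration in ug/L."""
    return (conc_ug_l / 1000.0) * intake_l / body_weight_kg

def hazard_quotient(add, rfd=3.0e-4):
    """HQ = ADD / oral reference dose; HQ > 1 flags potential risk."""
    return add / rfd

def cancer_risk(add, slope_factor=1.5):
    """Lifetime excess cancer risk = ADD x cancer slope factor."""
    return add * slope_factor

add = average_daily_dose(50.0)   # 50 ug/L arsenic in drinking water
print(hazard_quotient(add))      # ~4.6 -> exceeds 1
print(cancer_risk(add))          # ~2.1e-3
```

A concentration well below the guideline value drives HQ under 1, which is the threshold logic the abstract applies to the Jabba and Dubair samples.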
Flore, Jacinthe
2016-09-01
This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination. © The Author(s) 2016.
Peng, Chengtao; Qiu, Bensheng; Zhang, Cheng; Ma, Changyu; Yuan, Gang; Li, Ming
2017-07-01
Over the years, X-ray computed tomography (CT) has been successfully used in clinical diagnosis. However, when the body of the patient to be examined contains metal objects, the reconstructed image can be corrupted by severe metal artifacts, which affect the doctor's diagnosis of disease. In this work, we proposed a dynamic re-weighted total variation (DRWTV) technique combined with the statistical iterative reconstruction (SIR) method to reduce the artifacts. The DRWTV method is based on the total variation (TV) and re-weighted total variation (RWTV) techniques, but it provides a sparser representation than TV and protects tissue details better than RWTV. Besides, the DRWTV can suppress artifacts and noise, and the SIR convergence speed is also accelerated. The performance of the algorithm is tested on both a simulated phantom dataset and a clinical dataset, which are a teeth phantom with two metal implants and a skull with three metal implants, respectively. The proposed algorithm (SIR-DRWTV) is compared with two traditional iterative algorithms, SIR and SIR constrained by RWTV regularization (SIR-RWTV). The results show that the proposed algorithm has the best performance in reducing metal artifacts and protecting tissue details.
Barman, S.; Bhattacharjya, R. K.
2017-12-01
The River Subansiri is the major north-bank tributary of the river Brahmaputra. It originates from the range of Himalayas beyond the Great Himalayan range at an altitude of approximately 5340 m. The Subansiri basin extends from tropical to temperate zones and hence exhibits a great diversity in rainfall characteristics. In the Northern and Central Himalayan tracts, precipitation is scarce on account of high altitudes. On the other hand, the Southeast part of the Subansiri basin, comprising the sub-Himalayan and the plain tract in Arunachal Pradesh and Assam, lies in the tropics. Due to the Northeast as well as Southwest monsoon, precipitation occurs in this region in abundant quantities, and the Southwest monsoon in particular causes very heavy precipitation in the entire Subansiri basin during May to October. In this study, the rainfall over the Subansiri basin has been studied at 24 different locations by multiple linear and non-linear regression based statistical downscaling techniques and by an Artificial Neural Network based model. APHRODITE's gridded rainfall data at 0.25° × 0.25° resolution and climatic parameters of the HadCM3 GCM at 2.5° × 3.75° (latitude by longitude) resolution have been used in this study. It has been found that the multiple non-linear regression based statistical downscaling technique outperformed the other techniques. Using this method, the future rainfall pattern over the Subansiri basin has been analyzed up to the year 2099 for four different time periods, viz., 2020-39, 2040-59, 2060-79, and 2080-99, at all 24 locations. On the basis of historical rainfall, the months have been categorized as wet months, months with moderate rainfall, and dry months. The spatial changes in rainfall patterns for all three types of months have also been analyzed over the basin. A potential decrease of rainfall in the wet and moderate months and an increase of rainfall in the dry months are observed in the projected future rainfall pattern of the Subansiri basin.
DEFF Research Database (Denmark)
and straightforward idea is to interpret effects relative to the residual error and to choose the proper effect size measure. For multi-attribute bar plots of F-statistics this amounts, in balanced settings, to a simple transformation of the bar heights to get them transformed into depicting what can be seen...... on a multifactorial sensory profile data set and compared to actual d-prime calculations based on ordinal regression modelling through the ordinal package. A generic ``plug-in'' implementation of the method is given in the SensMixed package, which again depends on the lmerTest package. We discuss and clarify the bias...
Toward a multifactorial model of expertise: beyond born versus made.
Hambrick, David Z; Burgoyne, Alexander P; Macnamara, Brooke N; Ullén, Fredrik
2018-02-15
The debate over the origins of individual differences in expertise has raged for over a century in psychology. The "nature" view holds that expertise reflects "innate talent", that is, genetically determined abilities. The "nurture" view counters that, if talent even exists, its effects on ultimate performance are negligible. While no scientist takes seriously a strict nature-only view of expertise, the nurture view has gained tremendous popularity over the past several decades. This environmentalist view holds that individual differences in expertise reflect training history, with no important contribution to ultimate performance by innate ability ("talent"). Here, we argue that, despite its popularity, this view is inadequate to account for the evidence concerning the origins of expertise that has accumulated since the view was first proposed. More generally, we argue that the nature versus nurture debate in research on expertise is over, or certainly should be, as it has been in other areas of psychological research for decades. We describe a multifactorial model for research on the nature and nurture of expertise, which we believe will provide a progressive direction for future research on expertise. © 2018 New York Academy of Sciences.
DEFF Research Database (Denmark)
Toft, Ulla; Jakobsen, Iris Marie; Aadahl, Mette
2012-01-01
To investigate whether the effect of an individualised multi-factorial lifestyle intervention on dietary habits differs across socioeconomic groups.
International Nuclear Information System (INIS)
Singh, Kunwar P.; Malik, Amrita; Sinha, Sarita
2005-01-01
Multivariate statistical techniques, such as cluster analysis (CA), factor analysis (FA), principal component analysis (PCA) and discriminant analysis (DA), were applied to the data set on water quality of the Gomti river (India), generated during three years (1999-2001) of monitoring at eight different sites for 34 parameters (9792 observations). This study presents the usefulness of multivariate statistical techniques for evaluation and interpretation of large complex water quality data sets and apportionment of pollution sources/factors, with a view to obtaining better information about the water quality and designing a monitoring network for effective management of water resources. Three significant groups of sampling sites, upper catchment (UC), middle catchment (MC) and lower catchment (LC), were obtained through CA on the basis of similarity between them. FA/PCA applied to the data sets pertaining to the three catchment regions of the river resulted in seven, seven and six latent factors, respectively, responsible for the data structure, explaining 74.3, 73.6 and 81.4% of the total variance of the respective data sets. These included the trace metals group (leaching from soil and industrial waste disposal sites), the organic pollution group (municipal and industrial effluents), the nutrients group (agricultural runoff), and alkalinity, hardness, EC and solids (soil leaching and runoff processes). DA showed the best results for data reduction and pattern recognition during both temporal and spatial analysis. It rendered five parameters (temperature, total alkalinity, Cl, Na and K) affording more than 94% correct assignments in temporal analysis, and 10 parameters (river discharge, pH, BOD, Cl, F, PO4, NH4-N, NO3-N, TKN and Zn) affording 97% correct assignments in spatial analysis of the three different regions in the basin. Thus, DA allowed reduction in the dimensionality of the large data set, delineating a few indicator parameters responsible for large variations in water quality. Further
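The discriminant analysis step can be illustrated with Fisher's two-class linear discriminant in numpy; the two "catchment" groups and parameter values below are synthetic assumptions, not the Gomti river data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two hypothetical catchment groups described by three indicator
# parameters (e.g. temperature, total alkalinity, Cl); synthetic data
upper = rng.normal([20.0, 120.0, 15.0], [2.0, 10.0, 3.0], (100, 3))
lower = rng.normal([26.0, 180.0, 40.0], [2.0, 10.0, 3.0], (100, 3))

# Fisher's linear discriminant: w = Sw^-1 (m1 - m2)
m1, m2 = upper.mean(0), lower.mean(0)
Sw = np.cov(upper, rowvar=False) + np.cov(lower, rowvar=False)
w = np.linalg.solve(Sw, m1 - m2)
threshold = w @ (m1 + m2) / 2.0

# Assign a sample to "upper" when its projection exceeds the threshold
correct = (upper @ w > threshold).mean() + (lower @ w < threshold).mean()
assignation_rate = 100 * correct / 2.0   # percent correctly assigned
```

Projecting onto a single discriminant direction is what lets DA compress many monitored parameters into a handful of indicators with high correct-assignment rates, as the study reports.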
Etiology of Balkan endemic nephropathy: A multifactorial disease?
International Nuclear Information System (INIS)
Toncheva, Draga; Dimitrov, Tzvetan; Stojanova, Stiliana
1998-01-01
Balkan endemic nephropathy (BEN) is of great clinical importance in restricted areas of Bulgaria, Romania, Croatia, Serbia, and Bosnia and Herzegovina. So far, studies on the etiological factors for BEN have not discovered any single environmental causative agent of this puzzling disease. These data reject the possibility of a purely environmental causation of BEN. The pattern of BEN transmission in the risk families is not typical for single-gene disorders. Extensive epidemiological and genetic studies disclose characteristics of multifactorial (polygenic) inheritance of BEN. The evidence of a 'familial tendency'; the variation of the risk for BEN depending on the number of sick parents and the degree of relatedness; the development of BEN in individuals from at-risk families who were born in non-endemic areas; the fact that the disease is not found in the gypsy population; and the expression of the 3q25 cytogenetic marker suggest that genetic factors play an important role as causative factors in BEN development. The possible impact of environmental triggers on individuals genetically predisposed to BEN is suggested by the following data: the cytogenetic results of the increased frequency of folate-sensitive fragile sites and spontaneous or radiation-induced aberrations in several bands in BEN patients; the data from the detailed analysis of breaks in BEN patients and controls that generate structural chromosome aberrations; and the occurrence of BEN in immigrants. Genetic epidemiological approaches to the etiology and prevention of BEN are proposed. The predisposing genes for BEN could be genes localized in the region between 3q25-3q26; transforming growth factor-β (TGF-β); genetically heterogeneous xenobiotic-metabolizing enzymes; or defects in the host's immune system. The predisposing genes for BEN patients with urinary tract tumors could be germline mutations in tumor suppressor genes and acquired somatic mutations in oncogenes
Pestman, Wiebe R
2009-01-01
This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.
Gemne, G
1982-12-01
The article reviews available pathophysiological evidence for a multifactorial etiology of the Raynaud type of peripheral circulation disorder in persons exposed to vibration from handheld tools and discusses the consequences this viewpoint may have for diagnostics, preventive work, and research.
Vranić, Andrea; Španić, Ana Marija; Carretti, Barbara; Borella, Erika
2013-11-01
Several studies have shown an increase in memory performance after teaching mnemonic techniques to older participants. However, transfer effects to non-trained tasks are generally either very small or not found. The present study investigates the efficacy of a multifactorial memory training program for older adults living in a residential care center. The program combines teaching of memory strategies with activities based on metacognitive (metamemory) and motivational aspects. Specific training-related gains in the Immediate list recall task (criterion task), as well as transfer effects on measures of short-term memory, long-term memory, working memory, motivational (need for cognition), and metacognitive aspects (subjective measure of one's memory) were examined. Maintenance of training benefits was assessed after seven months. Fifty-one older adults living in a residential care center, with no cognitive impairments, participated in the study. Participants were randomly assigned to two programs: the experimental group attended the training program, while the active control group was involved in a program in which different psychological issues were discussed. A benefit in the criterion task and substantial general transfer effects were found for the trained group, but not for the active control, and they were maintained at the seven-month follow-up. Our results suggest that training procedures which combine teaching of strategies with metacognitive-motivational aspects can improve cognitive functioning and attitude toward cognitive activities in older adults.
Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan
2016-01-01
The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. We found that the ASiR technique was able to reduce the volume CT dose index, dose-length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions.
Szulc, Stefan
1965-01-01
Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then
MLQ (The Multifactorial Leadership Questionnaire): Preliminary Data for Romania
Directory of Open Access Journals (Sweden)
Dragoş Iliescu
2008-01-01
The Multifactorial Leadership Questionnaire (MLQ), in its latest version (5X), is a complex instrument, created in order to offer (1) a valid measurement of the transformational, transactional and passive components of leadership and also (2) as accurate a profile as possible of a person's leadership potential and leadership-related behavior. The MLQ has often been used in laboratory and field research, being an adequate and very useful tool for the selection, transfer, promotion, development and counseling of individuals, groups or organizations. Various forms of the MLQ have been used in more than 30 countries, in industrial organizations, hospitals, religious institutions, military organizations, governmental agencies, universities, and primary and secondary schools. It has been demonstrated that the efficiency of the MLQ remains constant no matter whether the leader is evaluated by his direct superiors, subordinates, co-workers or customers. An outstanding advantage of the MLQ is thus the possibility of 360° usage (with parallel forms for self-evaluation and peer-evaluation). The adaptation of the MLQ started in Romania in 2005. This paper presents a pilot study on the self-evaluation form of the MLQ. A pilot sample of 229 participants was used, comprising medium-level and top managers recruited from different corporations in Bucharest. Primary statistics, reliability, interscale correlations and the factor analysis of the MLQ are presented and discussed, contrasted with the results reported by the original authors on USA samples. The results of these preliminary studies are encouraging, indicating that, despite an obvious need for further research, the Romanian MLQ is an effective tool so far.
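The reliability analysis reported in such psychometric pilot studies is typically a Cronbach's alpha computation over the scale items. A minimal sketch in Python, using entirely made-up item scores (the sample size of 229 is borrowed from the abstract; the five-item scale and its loadings are hypothetical):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(42)
# hypothetical 5-item scale: one shared latent factor plus item noise
factor = rng.normal(size=(229, 1))
scores = factor + 0.5 * rng.normal(size=(229, 5))
alpha = cronbach_alpha(scores)
print(alpha)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency.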
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
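The first three steps of the sequence (linear, monotonic, and central-tendency patterns) can be sketched with scipy; the scatterplot data below are synthetic stand-ins, not output of the two-phase flow model:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 500)          # sampled input variable
y = x**2 + 0.1 * rng.normal(size=500)   # synthetic model output

# (i) linear relationship: Pearson correlation coefficient
r, _ = stats.pearsonr(x, y)
# (ii) monotonic relationship: Spearman rank correlation coefficient
rho, _ = stats.spearmanr(x, y)
# (iii) trend in central tendency: Kruskal-Wallis test across bins of x
bins = np.digitize(x, [0.25, 0.5, 0.75])
groups = [y[bins == b] for b in range(4)]
h_stat, p_kw = stats.kruskal(*groups)

print(r, rho, p_kw)
```

Trends in variability and deviations from randomness (steps iv and v) follow the same binning idea, comparing variances or a chi-square statistic across the bins.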
International Nuclear Information System (INIS)
Kleijnen, J.P.C.; Helton, J.C.
1999-01-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples
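The stability check with independent Latin hypercube samples can be sketched with scipy's qmc module; the `model` function below is a toy stand-in for the two-phase fluid flow code:

```python
import numpy as np
from scipy.stats import qmc, spearmanr

def model(x):
    # toy stand-in: the first input is important, the second barely matters
    return 3.0 * x[:, 0] + 0.1 * x[:, 1]

top_variable = []
for seed in (0, 1):  # two independent Latin hypercube samples
    x = qmc.LatinHypercube(d=2, seed=seed).random(n=100)
    y = model(x)
    rho = [abs(spearmanr(x[:, i], y)[0]) for i in range(2)]
    top_variable.append(int(np.argmax(rho)))  # most important input per sample

# a stable analysis identifies the same important variable in both samples
print(top_variable)
```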
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through the use of conventional statistical techniques (such as linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with statistical equations), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
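For reference, the iterative 2σ technique mentioned above is commonly formulated as repeated trimming of values outside mean ± 2σ until no further points are removed. A sketch under that assumption, with invented concentration data:

```python
import numpy as np

def iterative_2sigma(values, max_iter=50):
    """Trim values outside mean +/- 2*std until no point is removed;
    the surviving interval approximates the geochemical background range."""
    v = np.asarray(values, dtype=float)
    for _ in range(max_iter):
        mean, std = v.mean(), v.std()
        kept = v[(v >= mean - 2 * std) & (v <= mean + 2 * std)]
        if kept.size == v.size:
            break
        v = kept
    return v.mean() - 2 * v.std(), v.mean() + 2 * v.std()

rng = np.random.default_rng(3)
# invented metal concentrations: natural background plus polluted outliers
background = rng.normal(30.0, 5.0, 300)
polluted = rng.normal(120.0, 10.0, 15)
low, high = iterative_2sigma(np.concatenate([background, polluted]))
print(low, high)
```

The contaminated values are excluded in the first pass, so the returned range reflects only the natural background population.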
DEFF Research Database (Denmark)
Gæde, Peter; Oellgaard, Jens; Carstensen, Bendix
2016-01-01
Aims/hypothesis: The aim of this work was to study the potential long-term impact of a 7.8-year intensified, multifactorial intervention in patients with type 2 diabetes mellitus and microalbuminuria in terms of gained years of life and years free from incident cardiovascular disease. Methods: The original intervention (mean treatment duration 7.8 years) involved 160 patients with type 2 diabetes and microalbuminuria who were randomly assigned (using sealed envelopes) to receive either conventional therapy or intensified, multifactorial treatment including both behavioural and pharmacological ... Results: ... for all microvascular complications was decreased in the intensive-therapy group in the range 0.52 to 0.67, except for peripheral neuropathy (HR 1.12). Conclusions/interpretation: At 21.2 years of follow-up of 7.8 years of intensified, multifactorial, target-driven treatment of type 2 diabetes ...
International Nuclear Information System (INIS)
Podorozhnyi, D.M.; Postnikov, E.B.; Sveshnikova, L.G.; Turundaevsky, A.N.
2005-01-01
A multivariate statistical procedure for solving problems of estimating physical parameters on the basis of data from measurements with multichannel equipment is described. Within the multivariate procedure, an algorithm is constructed for estimating the energy of primary cosmic rays and the exponent of their power-law spectrum. These estimators are investigated by using the KLEM spectrometer (NUCLEON project) as a specific example of measuring equipment. The results of computer experiments simulating the operation of the multivariate procedure for this equipment are given, and the proposed approach is compared in these experiments with the one-parameter approach presently used in data processing.
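For the simpler one-parameter setting mentioned at the end, the exponent of a power-law spectrum can be estimated by maximum likelihood. This is a generic illustration (not the KLEM/NUCLEON algorithm) with simulated event energies:

```python
import numpy as np

def powerlaw_exponent_mle(energies, e_min):
    """Maximum-likelihood estimate of gamma for dN/dE ~ E**(-gamma), E >= e_min."""
    e = np.asarray(energies, dtype=float)
    e = e[e >= e_min]
    return 1.0 + e.size / np.sum(np.log(e / e_min))

rng = np.random.default_rng(7)
gamma_true = 2.7  # a typical cosmic-ray spectral index
u = rng.uniform(size=50_000)
# inverse-transform sampling from a pure power law above E = 1 (arbitrary units)
energies = (1.0 - u) ** (-1.0 / (gamma_true - 1.0))
gamma_hat = powerlaw_exponent_mle(energies, 1.0)
print(gamma_hat)
```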
Amato, Umberto; Antoniadis, Anestis; De Feis, Italia; Masiello, Guido; Matricardi, Marco; Serio, Carmine
2009-03-01
Remote sensing of the atmosphere is changing rapidly thanks to the development of high-spectral-resolution infrared space-borne sensors. The aim is to provide ever more accurate information on the lower atmosphere, as requested by the World Meteorological Organization (WMO), to improve the reliability and time span of weather forecasts as well as Earth monitoring. In this paper we show the results we have obtained on a set of Infrared Atmospheric Sounding Interferometer (IASI) observations using a new statistical strategy based on dimension reduction. Retrievals have been compared to time-space collocated ECMWF analyses for temperature, water vapor and ozone.
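A dimension-reduction retrieval of this general kind can be sketched in a few lines of numpy. The "spectra" below are synthetic, and the three latent state variables merely stand in for quantities like temperature, water vapor and ozone; none of this reproduces the authors' actual IASI strategy:

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic "spectra": 300 observations x 120 channels driven by
# 3 latent atmospheric state variables plus instrument noise
state = rng.normal(size=(300, 3))
loadings = rng.normal(size=(3, 120))
spectra = state @ loadings + 0.05 * rng.normal(size=(300, 120))

# dimension reduction: project spectra onto leading principal components
centered = spectra - spectra.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:5].T               # keep 5 components

# linear retrieval: regress one state variable on the reduced representation
coef, *_ = np.linalg.lstsq(scores, state[:, 0], rcond=None)
pred = scores @ coef
corr = np.corrcoef(pred, state[:, 0])[0, 1]
print(corr)
```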
Jaya Christiyan, K. G.; Chandrasekhar, U.; Mathivanan, N. Rajesh; Venkateswarlu, K.
2018-02-01
3D printing was successfully used to fabricate samples of polylactic acid (PLA). Processing parameters such as lay-up speed, lay-up thickness, and printing nozzle diameter were varied. All samples were tested for flexural strength using a three-point load test. A statistical mathematical model was developed to correlate the processing parameters with flexural strength. The results clearly demonstrated that the lay-up thickness and nozzle diameter influenced flexural strength significantly, whereas lay-up speed hardly influenced the flexural strength.
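A statistical model of this kind is commonly a multiple linear regression of flexural strength on the processing parameters. A sketch with invented process data; the coefficients are assumptions chosen only to mimic the reported finding that thickness and nozzle diameter dominate while speed barely matters:

```python
import numpy as np

rng = np.random.default_rng(5)
# invented runs: columns are [lay-up speed, lay-up thickness, nozzle diameter]
x = rng.uniform([20.0, 0.1, 0.2], [60.0, 0.4, 0.8], size=(40, 3))
# assumed response: thickness hurts, nozzle diameter helps, speed is negligible
strength = (80.0 - 60.0 * x[:, 1] + 25.0 * x[:, 2] + 0.01 * x[:, 0]
            + rng.normal(0.0, 1.0, 40))

# fit strength = b0 + b1*speed + b2*thickness + b3*nozzle by least squares
design = np.column_stack([np.ones(40), x])
coef, *_ = np.linalg.lstsq(design, strength, rcond=None)
b0, b_speed, b_thick, b_nozzle = coef
print(b_speed, b_thick, b_nozzle)
```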
International Nuclear Information System (INIS)
Abu-Hamdeh, Nidal H.; Alnefaie, Khaled A.; Almitani, Khalid H.
2013-01-01
Highlights: • The success of using olive waste/methanol as an adsorbent/adsorbate pair. • The experimental gross cycle coefficient of performance obtained was COPa = 0.75. • Optimization showed that expanding the adsorbent mass to a certain range increases the COP. • The statistical optimization led to an optimum tank volume between 0.2 and 0.3 m³. • Increasing the collector area to a certain range increased the COP. - Abstract: The current work demonstrates a developed model of a solar adsorption refrigeration system with specific requirements and specifications. The scheme can be employed as a refrigerator and cooler unit suitable for remote areas. The unit runs through a parabolic trough solar collector (PTC) and uses olive waste as adsorbent with methanol as adsorbate. Cooling production, COP (coefficient of performance) and COPa (cycle gross coefficient of performance) were used to assess the system performance. The system's optimum design parameters in this study were arrived at through statistical and experimental methods. The lowest temperature attained in the refrigerated space was 4 °C while the corresponding ambient temperature was 27 °C. The temperature started to decrease steadily at 20:30 – when the actual cooling started – until it reached 4 °C at 01:30 the next day, when it rose again. The highest COPa obtained was 0.75
International Nuclear Information System (INIS)
Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni
2012-01-01
To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 ± 3.00) than low-dose ASIR (49.24 ± 9.11, P < 0.01) and reference-dose ASIR images (24.93 ± 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Bunkyo-ku, Tokyo (Japan)
2012-08-15
To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 ± 3.00) than low-dose ASIR (49.24 ± 9.11, P < 0.01) and reference-dose ASIR images (24.93 ± 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)
MacKenzie, Dana
2004-01-01
The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue with the expansion of knowledge about the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, the physicists need to identify the brand of statistics that will be suitable for them, and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).
International Nuclear Information System (INIS)
El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz
2017-01-01
Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, integration of hydrochemical investigations involving chemical and statistical analyses are conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock–water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. - Highlights: • Hydrochemical investigations were carried out in Dhurma aquifer in Saudi Arabia. • The factors controlling potential groundwater pollution in an arid region were studied. • Chemical and statistical analyses are integrated to assess these factors. • Five main factors were extracted, which explain >77% of the total data variance. • The chemical characteristics of the groundwater were influenced by rock–water interactions
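Factor extraction with an "explained variance" criterion, as used in this study, can be sketched via principal components on standardized data. The chemistry matrix below is simulated with two underlying processes, loosely mirroring the geogenic/anthropogenic split; the variable names in the comments are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
# simulated groundwater chemistry: 54 samples x 8 variables driven by
# two latent processes (e.g. mineral dissolution, fertilizer input) plus noise
geogenic = rng.normal(size=(54, 1))
anthropogenic = rng.normal(size=(54, 1))
data = np.hstack([
    geogenic + 0.3 * rng.normal(size=(54, 4)),       # e.g. Ca, Mg, SO4, HCO3
    anthropogenic + 0.3 * rng.normal(size=(54, 4)),  # e.g. NO3, K, PO4, Cl
])

# standardize, then extract factors as principal components
z = (data - data.mean(axis=0)) / data.std(axis=0)
eigvals = np.linalg.svd(z, compute_uv=False) ** 2 / (len(z) - 1)
explained = eigvals / eigvals.sum()
print(explained[:2].sum())  # variance captured by the first two factors
```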
Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol
2013-12-01
The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in the chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at 0-50 % ASIR levels, and then the noise power spectrum (NPS), signal and noise, and the degree of distortion in terms of peak signal-to-noise ratio (PSNR) and root-mean-square error (RMSE) were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, the resolution, latitude and degree of distortion of five lesions of the chest phantom were evaluated and their statistics compiled. The NPS value decreased as the frequency increased. The lowest noise and deviation were at the 20 % ASIR level, mean 126.15 ± 22.21. As to the degree of distortion, the signal-to-noise ratio and PSNR at the 20 % ASIR level were at their highest values, 31.0 and 41.52, whereas the maximum absolute error and RMSE showed the lowest values, 11.2 and 16. In the ACR phantom study, all ASIR levels were within the acceptable allowance of the guidelines. The 20 % ASIR level performed best in the qualitative evaluation of the five lesions of the chest phantom, with a resolution score of 4.3, latitude of 3.47 and degree of distortion of 4.25. The 20 % ASIR level proved the best in all experiments: noise, distortion evaluation using ImageJ, and qualitative evaluation of five lesions of a chest phantom. Therefore, optimal images as well as a reduced radiation dose would be obtained when the 20 % ASIR level is applied in thoracic CT.
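The PSNR and RMSE figures of merit used in this phantom study are straightforward to compute against a reference image. A minimal sketch with a synthetic image; the peak value and noise level are arbitrary choices:

```python
import numpy as np

def rmse(reference, image):
    """Root-mean-square error between an image and its reference."""
    return float(np.sqrt(np.mean((reference - image) ** 2)))

def psnr(reference, image, peak=255.0):
    """Peak signal-to-noise ratio in dB against a reference image."""
    return float(20.0 * np.log10(peak / rmse(reference, image)))

rng = np.random.default_rng(8)
reference = rng.uniform(0, 255, size=(64, 64))
noisy = reference + rng.normal(0, 5.0, size=(64, 64))  # simulated image noise

print(rmse(reference, noisy), psnr(reference, noisy))
```

Lower RMSE and higher PSNR both indicate an image closer to the reference, which is why the two metrics pick out the same optimal ASIR level in the abstract.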
International Nuclear Information System (INIS)
Kapur, G.S.; Sastry, M.I.S.; Jaiswal, A.K.; Sarpal, A.S.
2004-01-01
The present paper describes various classification techniques like cluster analysis, principal component (PC)/factor analysis to classify different types of base stocks. The API classification of base oils (Group I-III) has been compared to a more detailed NMR derived chemical compositional and molecular structural parameters based classification in order to point out the similarities of the base oils in the same group and the differences between the oils placed in different groups. The detailed compositional parameters have been generated using 1 H and 13 C nuclear magnetic resonance (NMR) spectroscopic methods. Further, oxidation stability, measured in terms of rotating bomb oxidation test (RBOT) life, of non-conventional base stocks and their blends with conventional base stocks, has been quantitatively correlated with their 1 H NMR and elemental (sulphur and nitrogen) data with the help of multiple linear regression (MLR) and artificial neural networks (ANN) techniques. The MLR based model developed using NMR and elemental data showed a high correlation between the 'measured' and 'estimated' RBOT values for both training (R=0.859) and validation (R=0.880) data sets. The ANN based model, developed using fewer number of input variables (only 1 H NMR data) also showed high correlation between the 'measured' and 'estimated' RBOT values for training (R=0.881), validation (R=0.860) and test (R=0.955) data sets
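The MLR part of such an analysis, reporting the correlation R between 'measured' and 'estimated' RBOT life on training and validation sets, can be sketched with numpy. The compositional inputs and their assumed linear effect on RBOT below are entirely invented, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(9)
# invented inputs: e.g. three 1H NMR compositional fractions plus sulphur
x = rng.uniform(size=(60, 4))
# assumed linear dependence of oxidation stability (RBOT life, minutes)
rbot = 200.0 + 80.0 * x[:, 0] - 50.0 * x[:, 3] + rng.normal(0.0, 8.0, 60)

train, valid = slice(0, 40), slice(40, 60)
design = np.column_stack([np.ones(60), x])
coef, *_ = np.linalg.lstsq(design[train], rbot[train], rcond=None)

# correlation between 'measured' and 'estimated' RBOT, as in the abstract
r_train = np.corrcoef(design[train] @ coef, rbot[train])[0, 1]
r_valid = np.corrcoef(design[valid] @ coef, rbot[valid])[0, 1]
print(r_train, r_valid)
```

A validation R close to the training R, as reported in the abstract (0.859 vs. 0.880), is the usual check that the regression is not overfitting.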
Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao
2018-04-01
In this paper, a statistical forecast model using a time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). This method decomposes the rainfall over the MLYRV into three time-scale components, namely, the interannual component with a period of less than 8 years, the interdecadal component with a period of 8 to 30 years, and the interdecadal component with a period of more than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. At last, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of the FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variations of FPR. The hindcast of FPR for the 14 years from 2001 to 2014 shows that the FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
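A three-component time-scale decomposition of this kind can be sketched with simple moving-average low-pass filters, a crude stand-in for whatever filter the authors actually used; the rainfall series below is synthetic:

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average with edge padding (a simple low-pass filter)."""
    pad = window // 2
    padded = np.pad(x, pad, mode="edge")
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")

rng = np.random.default_rng(4)
years = np.arange(1951, 2015)
rainfall = 500.0 + 30.0 * rng.normal(size=len(years))  # synthetic FPR, mm

# split into interannual (< 8 yr), interdecadal (8-30 yr) and > 30 yr parts
longterm = moving_average(rainfall, 31)
decadal = moving_average(rainfall, 9) - longterm
interannual = rainfall - moving_average(rainfall, 9)

# the three components sum back to the original series by construction
print(np.allclose(interannual + decadal + longterm, rainfall))
```

Each component can then be regressed on its own predictors and the three forecasts summed, which is the structure the abstract describes.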
El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz
2017-10-01
Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, integration of hydrochemical investigations involving chemical and statistical analyses are conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M
2014-01-01
Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction. In the literature, information is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. Several patient factors that may complicate the treatment or end result were analysed in a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of the corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of the treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre of lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor of the occurrence of complications. Concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and the risk of complications. This study shows that only relative LLD is a predictor of the risk of complications. The additional value of this analysis is the production of a simple graph. Constructing this graph using data from a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
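The relationship behind such a graph is a fitted logistic curve of complication probability against proportionate LLD. A sketch with simulated patients; all coefficients and data are invented for illustration, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(6)
# invented data: proportionate LLD (%) and complication occurrence (0/1)
lld = rng.uniform(4.0, 42.0, 200)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 0.12 * lld)))
complication = rng.binomial(1, p_true)

# fit P(complication | LLD) = sigmoid(b0 + b1*LLD) by gradient ascent
b0, b1 = 0.0, 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * lld)))
    b0 += 0.01 * np.mean(complication - p)
    b1 += 0.0005 * np.mean((complication - p) * lld)

# risk estimates read off the fitted curve, as one would from the graph
risk = 1.0 / (1.0 + np.exp(-(b0 + b1 * np.array([10.0, 30.0]))))
print(risk)
```

Plotting the fitted curve over the observed LLD range yields the kind of risk-versus-discrepancy graph the authors describe.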
International Nuclear Information System (INIS)
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-01-01
The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.
Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus
2013-06-01
The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, a significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.
Energy Technology Data Exchange (ETDEWEB)
Vorona, Gregory A. [The Children's Hospital of Pittsburgh of UPMC, Department of Radiology, Pittsburgh, PA (United States); Allegheny General Hospital, Department of Radiology, Pittsburgh, PA (United States); Ceschin, Rafael C.; Clayton, Barbara L.; Sutcavage, Tom; Tadros, Sameh S.; Panigrahy, Ashok [The Children's Hospital of Pittsburgh of UPMC, Department of Radiology, Pittsburgh, PA (United States)
2011-09-15
The use of the adaptive statistical iterative reconstruction (ASIR) algorithm has been shown to reduce radiation doses in adults undergoing abdominal CT studies while preserving image quality. To our knowledge, no studies have been done to validate the use of ASIR in children. To retrospectively evaluate differences in radiation dose and image quality in pediatric abdominal CT studies utilizing 40% ASIR compared with filtered back projection (FBP). Eleven patients (mean age 8.5 years, range 2-17 years) had separate 40% ASIR and FBP enhanced abdominal CT studies on different days between July 2009 and October 2010. The ASIR studies utilized a 38% mA reduction relative to our pediatric protocol mAs. Study volume CT dose indexes (CTDIvol) and dose-length products (DLP) were recorded. A consistent representative image was obtained from each study. The images were independently evaluated by two radiologists in a blinded manner for diagnostic utility, image sharpness and image noise. The average CTDIvol and DLP for the 40% ASIR studies were 4.25 mGy and 185.04 mGy-cm, compared with 6.75 mGy and 275.79 mGy-cm for the FBP studies, representing 37% and 33% reductions, respectively. The radiologists' assessments of subjective image quality did not demonstrate any significant differences between the ASIR and FBP images. In our experience, the use of 40% ASIR with a 38% decrease in mA lowers the radiation dose for children undergoing enhanced abdominal examinations by an average of 33%, while maintaining diagnostically acceptable images. (orig.)
International Nuclear Information System (INIS)
Vorona, Gregory A.; Ceschin, Rafael C.; Clayton, Barbara L.; Sutcavage, Tom; Tadros, Sameh S.; Panigrahy, Ashok
2011-01-01
The use of the adaptive statistical iterative reconstruction (ASIR) algorithm has been shown to reduce radiation doses in adults undergoing abdominal CT studies while preserving image quality. To our knowledge, no studies have been done to validate the use of ASIR in children. To retrospectively evaluate differences in radiation dose and image quality in pediatric abdominal CT studies utilizing 40% ASIR compared with filtered back projection (FBP). Eleven patients (mean age 8.5 years, range 2-17 years) had separate 40% ASIR and FBP enhanced abdominal CT studies on different days between July 2009 and October 2010. The ASIR studies utilized a 38% mA reduction relative to our pediatric protocol mAs. Study volume CT dose indexes (CTDIvol) and dose-length products (DLP) were recorded. A consistent representative image was obtained from each study. The images were independently evaluated by two radiologists in a blinded manner for diagnostic utility, image sharpness and image noise. The average CTDIvol and DLP for the 40% ASIR studies were 4.25 mGy and 185.04 mGy-cm, compared with 6.75 mGy and 275.79 mGy-cm for the FBP studies, representing 37% and 33% reductions, respectively. The radiologists' assessments of subjective image quality did not demonstrate any significant differences between the ASIR and FBP images. In our experience, the use of 40% ASIR with a 38% decrease in mA lowers the radiation dose for children undergoing enhanced abdominal examinations by an average of 33%, while maintaining diagnostically acceptable images. (orig.)
Üstün-Aytekin, Özlem; Arısoy, Sevda; Aytekin, Ali Özhan; Yıldız, Ece
2016-03-01
X-prolyl dipeptidyl aminopeptidase (PepX) is an intracellular enzyme from the Gram-positive bacterium Lactococcus lactis spp. lactis NRRL B-1821, and it has commercial importance. The objective of this study was to compare the effects of several cell disruption methods on the activity of PepX. Statistical optimization methods were applied to two cavitation methods, hydrodynamic (high-pressure homogenization) and acoustic (sonication), to determine the more appropriate disruption method. A two-level factorial design (2FI), with the parameters number of cycles and pressure, and a Box-Behnken design (BBD), with the parameters cycle, sonication time, and power, were used for the optimization of the high-pressure homogenization and sonication methods, respectively. In addition, disruption methods consisting of lysozyme, bead milling, heat treatment, freeze-thawing, liquid nitrogen, ethylenediaminetetraacetic acid (EDTA), Triton-X, sodium dodecyl sulfate (SDS), chloroform, and antibiotics were performed and compared with the high-pressure homogenization and sonication methods. The optimized values of high-pressure homogenization were one cycle at 130 MPa, providing an activity of 114.47 mU ml(-1), while sonication afforded an activity of 145.09 mU ml(-1) at 28 min with 91% power and three cycles. In conclusion, sonication was the more effective disruption method, and its optimal operation parameters were established for the release of an intracellular enzyme from a L. lactis spp. lactis strain, which is a Gram-positive bacterium. Copyright © 2015 Elsevier B.V. All rights reserved.
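The two-level factorial layout used for the homogenization step can be sketched in a few lines of Python. The factor names and level values below are invented for illustration, not the study's actual settings:

```python
from itertools import product

# Hypothetical sketch of a 2^2 full factorial design: each factor is run at
# a coded low (-1) and high (+1) level, and every combination is tested.
levels = {
    "cycles":   {-1: 1, +1: 3},
    "pressure": {-1: 80, +1: 130},   # MPa (illustrative values)
}

# Enumerate all coded combinations and translate them to physical settings
runs = [
    {name: levels[name][code] for name, code in zip(levels, combo)}
    for combo in product((-1, +1), repeat=len(levels))
]
# runs now lists the 4 treatment combinations of the 2^2 design
```

In practice the runs would be executed in randomized order and the measured enzyme activities fitted with a main-effects-plus-interaction model.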
Directory of Open Access Journals (Sweden)
Cláudio Roberto Rosário
2012-07-01
Full Text Available The purpose of this research is to improve the practice of customer satisfaction analysis. The article presents a model for analyzing the answers to a customer satisfaction survey in a systematic way with the aid of multivariate statistical techniques, specifically exploratory Principal Component Analysis (PCA) combined with Hierarchical Cluster Analysis (HCA). We evaluated the applicability of the model as a tool for the company under study to identify the value chain perceived by the customer when the customer satisfaction questionnaire is applied. With the assistance of multivariate statistical analysis, similar behavior among customers was observed. The analysis also allowed the company to review the questionnaire items using the degree of correlation between questions, which was not a practice of the company before this research.
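The exploratory PCA step described above can be sketched in a few lines. The snippet below uses synthetic Likert-scale answers (invented, not the study's data): it centres the response matrix, eigendecomposes its covariance, and reads off component scores and explained variance, so that correlated question blocks surface as leading components:

```python
import numpy as np

# Hypothetical data: 8 respondents x 4 survey questions (1-5 Likert scale).
# Questions 1-2 probe one theme, questions 3-4 another; PCA should recover
# the dominant contrast between the two blocks as the first component.
X = np.array([
    [5, 5, 2, 1],
    [4, 5, 1, 2],
    [5, 4, 2, 2],
    [4, 4, 1, 1],
    [2, 1, 5, 4],
    [1, 2, 4, 5],
    [2, 2, 5, 5],
    [1, 1, 4, 4],
], dtype=float)

Xc = X - X.mean(axis=0)                  # centre each question
cov = np.cov(Xc, rowvar=False)           # 4x4 covariance of the questions
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]        # reorder to descending variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()      # fraction of variance per component
scores = Xc @ eigvecs                    # respondent coordinates on the PCs
```

The component loadings in `eigvecs` show which questions move together, which is the basis for reviewing redundant questionnaire items.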
International Nuclear Information System (INIS)
Welsh, T.L.; McRae, L.P.; Delegard, C.H.; Liebetrau, A.M.; Johnson, W.C.; Theis, W.; Lemaire, R.J.; Xiao, J.
1995-06-01
Quantitative physical measurements are an important component of the International Atomic Energy Agency (IAEA) nuclear material safeguards verification regime. In December 1994, IAEA safeguards were initiated on an inventory of excess plutonium powder items at the Plutonium Finishing Plant, Vault 3, on the US Department of Energy's Hanford Site. The material originated from the US nuclear weapons complex. The diversity of the chemical form and the heterogeneous physical form of this inventory were anticipated to challenge the precision and accuracy of quantitative destructive analytical techniques. A sampling design was used to estimate the degree of heterogeneity of the plutonium content of a variety of inventory items. Plutonium concentration, the item net weight, and the 240Pu content were among the variables considered in the design. Samples were obtained from randomly selected locations within each item. Each sample was divided into aliquots and analyzed chemically. Operator measurements by calorimetry and IAEA measurements by coincident neutron nondestructive analysis also were performed for the initial physical inventory verification materials and similar items not yet under IAEA safeguards. The heterogeneity testing has confirmed that part of the material is indeed significantly heterogeneous; this means that precautionary measures must be taken to obtain representative samples for destructive analysis. In addition, the sampling variability due to material heterogeneity was found to be comparable with, or greater than, the variability of the operator's calorimetric measurements.
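The heterogeneity argument above amounts to comparing two variance sources. A minimal sketch, with invented concentration values (not the Hanford data), compares the between-location spread within one item against the spread of repeat calorimetry measurements:

```python
import statistics

# Hypothetical plutonium concentrations (wt%) from aliquots drawn at three
# random locations within one powder item, plus invented repeat calorimetry
# results for the same item.
aliquots_by_location = {
    "loc_A": [86.1, 86.3],
    "loc_B": [84.7, 84.9],
    "loc_C": [87.2, 87.0],
}
calorimetry_repeats = [85.9, 86.2, 86.0, 85.8]

# Between-location spread estimates the item's material heterogeneity
location_means = [statistics.mean(v) for v in aliquots_by_location.values()]
heterogeneity_sd = statistics.stdev(location_means)

# Repeatability of the operator's measurement method
measurement_sd = statistics.stdev(calorimetry_repeats)

# If heterogeneity dominates measurement noise, a single-point sample cannot
# represent the item, and composite sampling is required for destructive analysis
heterogeneous = heterogeneity_sd > measurement_sd
```

A full analysis would use a nested ANOVA across items, locations and aliquots, but the comparison of the two standard deviations is the core of the conclusion quoted above.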
Directory of Open Access Journals (Sweden)
Fidel Castro Pérez
2009-03-01
Full Text Available A cross-sectional, quali-quantitative, descriptive, observational study was carried out to determine the prevalence of the timing of mastoid pneumatization, together with an analytical component aimed at studying related factors, in 900 children aged 2, 5 and 10 years (300 in each age group), by means of radiographs of the mastoid process taken with the Shuller technique between January 1992 and 2007. From 2006 onward, the work was incorporated into a research project on a new surgical treatment modality for children with serous otitis media, which made it necessary to study the type of temporal pneumatization. Descriptive statistics and a survey were used to collect histories of acute otitis media or exacerbated chronic otitis media, and the bioethical principle of informed consent was applied. Non-parametric Chi-square tests and odds ratios (OR) were used to analyze the association between qualitative variables at the 95% confidence level. No differences in pneumatization were found across the age groups, but female sex and a history of otitis media were associated with non-pneumatization of the mastoid process. It is concluded that variation in pneumatization is determined by the genetic code and influenced by other factors. (Petromastoid pneumatization: impact of a multifactorial etiological approach. Pinar del Río, 1992-2007.)
Rubino, Corrado; Mazzarello, Vittorio; Faenza, Mario; Montella, Andrea; Santanelli, Fabio; Farace, Francesco
2015-06-01
The aim of this study was to evaluate the effects on adipocyte morphology of 2 techniques of fat harvesting and of fat purification in lipofilling, considering that the number of viable healthy adipocytes is important in fat survival in recipient areas of lipofilling. Fat harvesting was performed in 10 female patients from the flanks, on one side with a 2-mm Coleman cannula and on the other side with a 3-mm Mercedes cannula. Thirty milliliters of fat tissue from each side was collected and divided into three 10 mL syringes: A, B, and C. The fat inside syringe A was left untreated, the fat in syringe B underwent simple sedimentation, and the fat inside syringe C underwent centrifugation at 3000 rpm for 3 minutes. Each fat graft specimen was processed for examination under a low-vacuum scanning electron microscope. The diameter (μm) and number of adipocytes per square millimeter and the number of altered adipocytes per square millimeter were evaluated. Untreated specimens harvested with the 2 different techniques were first compared; then sedimented versus centrifuged specimens harvested with the same technique were compared. Statistical analysis was performed using the Wilcoxon signed rank test. The number of adipocytes per square millimeter was statistically higher in specimens harvested with the 3-mm Mercedes cannula (P = 0.0310). The number of altered cells was statistically higher in centrifuged specimens than in sedimented ones using both methods of fat harvesting: P = 0.0080 with the 2-mm Coleman cannula and P = 0.0050 with the 3-mm Mercedes cannula. Alterations in adipocyte morphology consisted of wrinkling of the membrane, opening of pores with leakage of oily material, reduction of cellular diameter, and total collapse of the cellular membrane. Fat harvesting with a 3-mm cannula results in a higher number of adipocytes, and centrifugation of the harvested fat results in a higher number of morphologically altered cells than sedimentation.
International Nuclear Information System (INIS)
Xu, Yan; He, Wen; Chen, Hui; Hu, Zhihai; Li, Juan; Zhang, Tingting
2013-01-01
Aim: To evaluate the relationship between different noise indices (NIs) and radiation dose and to compare the effect of different reconstruction algorithm applications for ultra-low-dose chest computed tomography (CT) on image quality improvement and the accuracy of volumetric measurement of ground-glass opacity (GGO) nodules using a phantom study. Materials and methods: An 11-cm-thick transverse phantom section with a chest wall, mediastinum, and 14 artificial GGO nodules with known volumes (919.93 ± 64.05 mm³) was constructed. The phantom was scanned on a Discovery CT 750HD scanner with five different NIs (NI = 20, 30, 40, 50, and 60). All data were reconstructed with a 0.625 mm section thickness using the filtered back-projection (FBP), 50% adaptive statistical iterative reconstruction (ASiR), and Veo model-based iterative reconstruction algorithms. Image noise was measured in six regions of interest (ROIs). Nodule volumes were measured using a commercial volumetric software package. The image quality and the volume measurement errors were analysed. Results: Image noise increased dramatically from 30.7 HU at NI 20 to 122.4 HU at NI 60 with FBP reconstruction. Conversely, Veo reconstruction effectively controlled the noise increase, with an increase from 9.97 HU at NI 20 to only 15.1 HU at NI 60. Image noise at NI 60 with Veo was even lower (50.8%) than that at NI 20 with FBP. The contrast-to-noise ratio (CNR) of Veo at NI 40 was similar to that of FBP at NI 20. All artificial GGO nodules were successfully identified and measured, with an average relative volume measurement error with Veo at NI 60 of 4.24%, comparable to a value of 10.41% with FBP at NI 20. At NI 60, the radiation dose was only one-tenth that at NI 20. Conclusion: The Veo reconstruction algorithms very effectively reduced image noise compared with the conventional FBP reconstructions. Using ultra-low-dose CT scanning and Veo reconstruction, GGOs can be detected and quantified with an acceptable
Directory of Open Access Journals (Sweden)
Arveti Nagaraju
2016-10-01
Full Text Available Hydrogeochemical studies were carried out in and around the Udayagiri area of Andhra Pradesh in order to assess the chemistry of the groundwater and to identify the dominant hydrogeochemical processes and mechanisms responsible for the evolution of the chemical composition of the groundwater. Descriptive statistics, correlation matrices, principal component analysis (PCA), together with cluster analysis (CA), were used to gain an understanding of the hydrogeochemical processes in the study area. PCA identified 4 main processes influencing the groundwater chemistry, viz., mineral precipitation and dissolution, seawater intrusion, cation exchange, and carbonate balance. Further, three clusters C1, C2 and C3 were obtained. Samples from C1 contain high levels of Cl−, possibly due to intensive evaporation and contamination from landfill leachate. Most of the samples from C2 are located closer to the sea, and the high levels of Na+ + K+ in these samples may be attributed to seawater intrusion. The geochemistry of water samples in C3 is more likely to originate from rock weathering. This is supported by the Gibbs diagram. The groundwater geochemistry in the study area is mostly of natural origin, but is influenced to some degree by human activity.
Herojeet, Rajkumar; Rishi, Madhuri S.; Lata, Renu; Dolma, Konchok
2017-09-01
multivariate techniques for reliable quality characterization of surface water quality to develop effective pollution reduction strategies and maintain a fine balance between the industrialization and ecological integrity.
Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel
2017-09-01
In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as three-dimensional digital microscopy (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG), have recently been applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be a very efficient and low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. We analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
Serdobolskii, Vadim Ivanovich
2007-01-01
This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics, which up until now have not been solved. Traditional statistical methods based on the idea of infinite sampling often break down in the solution of real problems and, depending on the data, can be inefficient, unstable and even not applicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...
Image and multifactorial statistical analyses were used to evaluate the intensity of fluorescence signal from cells of three strains of A. pullulans and one strain of Rhodosporidium toruloides, as an outgroup, hybridized with either a universal o...
Understanding Statistics - Cancer Statistics
Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.
Suijker, Jacqueline J; MacNeil-Vroomen, Janet L; van Rijn, Marjon; Buurman, Bianca M; de Rooij, Sophia E; Moll van Charante, Eric P; Bosmans, Judith E
2017-01-01
To evaluate the cost-effectiveness of nurse-led multifactorial care to prevent or postpone new disabilities in community-living older people in comparison with usual care. We conducted cost-effectiveness and cost-utility analyses alongside a cluster randomized trial with one-year follow-up. Participants were aged ≥ 70 years and at increased risk of functional decline. Participants in the intervention group (n = 1209) received a comprehensive geriatric assessment and individually tailored multifactorial interventions coordinated by a community-care registered nurse with multiple follow-up visits. The control group (n = 1074) received usual care. Costs were assessed from a healthcare perspective. Outcome measures included disability (modified Katz Activities of Daily Living (ADL) index score) and quality-adjusted life-years (QALYs). Statistical uncertainty surrounding incremental cost-effectiveness ratios (ICERs) was estimated using bootstrapped bivariate regression models while adjusting for confounders. There were no statistically significant differences in Katz-ADL index score and QALYs between the two groups. Total mean costs were significantly higher in the intervention group (EUR 6518 (SE 472)) compared with usual care (EUR 5214 (SE 338)); adjusted mean difference EUR 1457 (95% CI: 572; 2537). Cost-effectiveness acceptability curves showed that the maximum probability of the intervention being cost-effective was 0.14 at a willingness to pay (WTP) of EUR 50,000 per one-point improvement on the Katz-ADL index score and 0.04 at a WTP of EUR 50,000 per QALY gained. The current intervention was not cost-effective compared to usual care to prevent or postpone new disabilities over a one-year period. Based on these findings, implementation of the evaluated multifactorial nurse-led care model is not to be recommended.
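The bootstrap treatment of cost-effectiveness uncertainty can be illustrated as follows. The per-patient cost and effect distributions below are invented stand-ins (not the trial data); the decision rule is the standard net-monetary-benefit criterion, WTP × Δeffect − Δcost > 0:

```python
import random
import statistics

random.seed(42)

# Invented per-patient (cost, effect) samples: a ~EUR 1300 mean cost gap and
# no systematic effect difference, mirroring the qualitative trial pattern.
intervention = [(6500 + random.gauss(0, 400), random.gauss(0.0, 0.05))
                for _ in range(200)]
control = [(5200 + random.gauss(0, 350), random.gauss(0.0, 0.05))
           for _ in range(200)]

def mean_cost_effect(sample):
    costs, effects = zip(*sample)
    return statistics.mean(costs), statistics.mean(effects)

# Bootstrap the joint distribution of (incremental cost, incremental effect)
diffs = []
for _ in range(1000):
    boot_i = [random.choice(intervention) for _ in range(len(intervention))]
    boot_c = [random.choice(control) for _ in range(len(control))]
    ci, ei = mean_cost_effect(boot_i)
    cc, ec = mean_cost_effect(boot_c)
    diffs.append((ci - cc, ei - ec))

# Probability of cost-effectiveness at a willingness-to-pay (WTP) threshold
wtp = 50_000
p_cost_effective = sum(1 for dc, de in diffs if wtp * de - dc > 0) / len(diffs)
```

Evaluating `p_cost_effective` over a range of WTP values traces out the cost-effectiveness acceptability curve reported in the abstract.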
Velasco-Tapia, Fernando
2014-01-01
Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
Directory of Open Access Journals (Sweden)
Fernando Velasco-Tapia
2014-01-01
Full Text Available Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
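A binary-mixture mass balance of the kind named above reduces to a one-parameter least-squares problem: each element of the comingled lava is modelled as x·dacite + (1 − x)·andesite. The end-member compositions below are invented for illustration (in practice the variables would be standardized first so that no single element dominates the fit):

```python
import numpy as np

# Hypothetical end-member and mixed-lava compositions (SiO2 wt%, MgO wt%, Sr ppm)
dacite   = np.array([65.0, 1.5, 450.0])
andesite = np.array([58.0, 4.0, 600.0])
mixed    = np.array([62.2, 2.5, 510.0])

# Model: mixed = x * dacite + (1 - x) * andesite
# Rearranged: mixed - andesite = x * (dacite - andesite)
# Least-squares estimate of the scalar mixing proportion x:
d = dacite - andesite
x = float(d @ (mixed - andesite) / (d @ d))
```

Here the invented "mixed" composition is an exact 60/40 blend, so the recovered proportion is x = 0.6; with real data the residuals of the fit indicate how well a simple binary mixture explains the lava.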
Directory of Open Access Journals (Sweden)
S. Ars
2017-12-01
Full Text Available This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping
Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe
2017-12-01
This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances
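The forward-model/inversion idea in the two abstracts above can be sketched with a bare-bones Gaussian plume. All geometry and dispersion values below are invented for illustration; in the actual method the transport-model configuration and its uncertainty statistics are constrained with the controlled tracer release:

```python
import math

def gaussian_plume(q, y, z, u, h, sigma_y, sigma_z):
    """Concentration (g/m^3) from a point source of rate q (g/s), at
    crosswind offset y and height z, for wind speed u (m/s) and effective
    release height h; sigma_y/sigma_z are the plume widths at the sensor's
    downwind distance. Includes the standard ground-reflection term."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Simulate the plume of a unit (1 g/s) release at three crosswind sensor
# positions (invented geometry: z = 1.5 m, u = 3 m/s, h = 2 m, widths 30/15 m)
sensor_y = [-20.0, 0.0, 15.0]
c_unit = [gaussian_plume(1.0, y, 1.5, 3.0, 2.0, 30.0, 15.0) for y in sensor_y]

# Fake "observations" from a known source rate, then invert: since the
# forward model is linear in q, the least-squares estimate is a projection.
true_q = 0.8   # g/s
c_obs = [true_q * c for c in c_unit]
q_hat = sum(o * c for o, c in zip(c_obs, c_unit)) / sum(c * c for c in c_unit)
```

The statistical inversion described in the study generalizes this scalar projection to several sources and observation-error covariances, but the core step, matching simulated unit plumes to measured concentrations, is the same.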
Davidson, Norman
2003-01-01
Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses. Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody
Directory of Open Access Journals (Sweden)
Fábio M. Pereira
2016-10-01
Full Text Available This paper describes the development of a novel microfluidic platform for multifactorial analysis integrating four label-free detection methods: electrical impedance, refractometry, optical absorption and fluorescence. We present the rationale for the design and the details of the microfabrication of this multifactorial hybrid microfluidic chip. The structure of the platform consists of a three-dimensionally patterned polydimethylsiloxane top part attached to a bottom SU-8 epoxy-based negative photoresist part, where microelectrodes and optical fibers are incorporated to enable impedance and optical analysis. As a proof of concept, the chip functions have been tested and explored, enabling a diversity of applications: (i) impedance-based identification of the size of micro beads, as well as counting and distinguishing of erythrocytes by their volume or membrane properties; (ii) simultaneous determination of the refractive index and optical absorption properties of solutions; and (iii) fluorescence-based bead counting.
Iturria-Medina, Yasser; Carbonell, Félix M; Sotero, Roberto C; Chouinard-Decorte, Francois; Evans, Alan C
2017-05-15
Generative models focused on multifactorial causal mechanisms in brain disorders are scarce and generally based on limited data. Despite the biological importance of the multiple interacting processes, their effects remain poorly characterized from an integrative analytic perspective. Here, we propose a spatiotemporal multifactorial causal model (MCM) of brain (dis)organization and therapeutic intervention that accounts for local causal interactions, effects propagation via physical brain networks, cognitive alterations, and identification of optimum therapeutic interventions. In this article, we focus on describing the model and applying it at the population-based level for studying late-onset Alzheimer's disease (LOAD). By interrelating six different neuroimaging modalities and cognitive measurements, this model accurately predicts spatiotemporal alterations in brain amyloid-β (Aβ) burden, glucose metabolism, vascular flow, resting state functional activity, structural properties, and cognitive integrity. The results suggest that a vascular dysregulation may be the most likely initial pathologic event leading to LOAD. Nevertheless, they also suggest that LOAD is not caused by a unique dominant biological factor (e.g. vascular or Aβ) but by the complex interplay among multiple relevant direct interactions. Furthermore, using theoretical control analysis of the identified population-based multifactorial causal network, we show the crucial advantage of using combinatorial over single-target treatments, explain why one-target Aβ-based therapies might fail to improve clinical outcomes, and propose an efficiency ranking of possible LOAD interventions. Although still requiring further validation at the individual level, this work presents the first analytic framework for dynamic multifactorial brain (dis)organization that may explain both the pathologic evolution of progressive neurological disorders and operationalize the influence of multiple interventional
MLQ (the Multifactorial Leadership Questionnaire). Preliminary Data for Romania
Dragoş Iliescu; Rareş Mocanu; Felicia Beldean
2008-01-01
The Multifactorial Leadership Questionnaire (MLQ), in its latest version (5X), is a complex instrument created to offer (1) a valid measurement of the transformational, transactional and passive components of leadership and (2) as accurate a profile as possible of a person's leadership potential and leadership-related behavior. The MLQ has often been used in laboratory and field research, being an adequate and very useful tool for selection, transfer, promotion, development and...
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment was performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
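The core of any MOS procedure described above is a statistical mapping from raw NWP output to observed generation. As a minimal stdlib-Python sketch of the baseline screening-regression idea (the training data below are invented, not the study's California data), the correction reduces to fitting and applying a least-squares line:

```python
# Minimal sketch of a Model Output Statistics (MOS) correction: raw NWP
# wind power forecasts with a systematic bias are adjusted by an ordinary
# least-squares fit against past observations. Data are illustrative only.

def ols_fit(x, y):
    """Return slope a and intercept b of the least-squares line y = a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Training period: raw forecasts systematically overpredict generation (MW).
raw = [10.0, 20.0, 30.0, 40.0, 50.0]
obs = [7.0, 15.0, 23.0, 31.0, 39.0]

a, b = ols_fit(raw, obs)

def mos_adjust(f):
    """Apply the trained MOS correction to a new raw forecast."""
    return a * f + b

print(mos_adjust(60.0))
```

Real MOS schemes screen many candidate predictors (wind speed, direction, stability indices) rather than a single raw forecast, which is where the machine-learning methods compared in the study come in.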
International Nuclear Information System (INIS)
Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D.; Rodriguez G, N. L.
2009-01-01
The elementary composition of archaeological ceramic fragments obtained during the excavations at San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. The samples were irradiated in the TRIGA Mark III research reactor with a neutron flux of 1×10¹³ n·cm⁻²·s⁻¹. The irradiation time was 2 hours. Prior to acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal component analysis, allowed the identification of three different origins of the archaeological ceramics, designated as: local, foreign and regional. (Author)
Natrella, Mary Gibbons
1963-01-01
Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations
Phung, Dung; Huang, Cunrui; Rutherford, Shannon; Dwirahmadi, Febi; Chu, Cordia; Wang, Xiaoming; Nguyen, Minh; Nguyen, Nga Huy; Do, Cuong Manh; Nguyen, Trung Hieu; Dinh, Tuan Anh Diep
2015-05-01
The present study is an evaluation of temporal/spatial variations of surface water quality using multivariate statistical techniques, comprising cluster analysis (CA), principal component analysis (PCA), factor analysis (FA) and discriminant analysis (DA). Eleven water quality parameters were monitored at 38 different sites in Can Tho City, a Mekong Delta area of Vietnam, from 2008 to 2012. Hierarchical cluster analysis grouped the 38 sampling sites into three clusters, representing mixed urban-rural areas, agricultural areas and an industrial zone. FA/PCA resulted in three latent factors for the entire research location, three for cluster 1, four for cluster 2, and four for cluster 3, explaining 60%, 60.2%, 80.9%, and 70% of the total variance in the respective water quality data. The varifactors from FA indicated that the parameters responsible for water quality variations are related to erosion from disturbed land or inflow of effluent from sewage plants and industry, discharges from wastewater treatment plants and domestic wastewater, agricultural activities and industrial effluents, and contamination by sewage waste with faecal coliform bacteria through sewer and septic systems. Discriminant analysis (DA) revealed that nephelometric turbidity units (NTU), chemical oxygen demand (COD) and NH₃ are the discriminating parameters in space, affording 67% correct assignation in spatial analysis; pH and NO₂ are the discriminating parameters according to season, assigning approximately 60% of cases correctly. The findings suggest a possible revised sampling strategy that can reduce the number of sampling sites and the indicator parameters responsible for large variations in water quality. This study demonstrates the usefulness of multivariate statistical techniques for evaluation of temporal/spatial variations in water quality assessment and management.
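The hierarchical cluster analysis step used to group sampling sites can be sketched in plain Python. The toy two-parameter "sites" below are invented for illustration; the study clustered 38 sites on 11 monitored parameters:

```python
# Illustrative agglomerative (single-linkage) hierarchical clustering,
# the kind of technique used to group monitoring sites by water quality.
# Site names and their two standardized scores are invented.
import math

sites = {
    "S1": (0.1, 0.2), "S2": (0.2, 0.1),   # e.g. an urban-rural-like pair
    "S3": (5.0, 5.1), "S4": (5.2, 4.9),   # e.g. an agricultural-like pair
    "S5": (9.8, 0.1),                     # e.g. an industrial-like outlier
}

def single_linkage(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[name] for name in points]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between closest members.
                d = min(math.dist(points[p], points[q])
                        for p in clusters[i] for q in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return [sorted(c) for c in clusters]

print(single_linkage(sites, 3))
```

With real data the parameters would be standardized first so that no single variable dominates the distance metric.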
Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi
2015-03-15
Remote sensing has been widely used for water quality monitoring, but most of these monitoring studies have only focused on a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which have typically been considered optically active variables. Remote sensing presents a challenge in estimating the phosphorus concentration in water. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations, primarily using the simple individual band ratio or its natural logarithm and the statistical regression method based on the field TP data and the spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the spatial modeling scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. The correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme had a higher accuracy for the TP concentration estimation in the large lake compared with the traditional individual band ratio method and the whole-lake scale regression-modeling scheme. The TP concentration values showed a clear spatial variability and were high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest TP concentration values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggested that the proposed modeling scheme, i.e., the band combinations and the regional multivariate
Wang, H X; Lü, P J; Yue, S W; Chang, L Y; Li, Y; Zhao, H P; Li, W R; Gao, J B
2017-12-05
Objective: To investigate the image quality and radiation dose of abdominal contrast-enhanced CT using a wide detector (80 mm) with the adaptive statistical iterative reconstruction-V (ASIR-V) technique. Methods: In the phantom experiment, the percentage of ASIR-V yielding half the dose with the combined wide-detector/ASIR-V technique, relative to the standard-detector (40 mm) technique, was determined. The human study was performed based on the phantom results: 160 patients who underwent contrast-enhanced abdominal CT were prospectively collected and divided, by random number table, into a control group (n=40) with image reconstruction using 40% ASIR (group A) and a study group (n=120). According to pre-ASIR-V percentage, the study group was assigned to three groups [40 cases in each group; group B: 0 pre-ASIR-V scan with image reconstruction of 0-100% post-ASIR-V (interval 10%, subgroups B0-B10); group C: 20% pre-ASIR-V with 20%, 40% and 60% post-ASIR-V (subgroups C1-C3); group D: 40% pre-ASIR-V with 40% and 60% post-ASIR-V (subgroups D1-D2)]. Image noise, CT attenuation values and CNR of the liver, pancreas, aorta and portal vein were compared using the two-sample t test and one-way ANOVA. Qualitative visual parameters (overall image quality graded on a 5-point scale) were compared by Mann-Whitney U test and Kruskal-Wallis H test. Results: The phantom experiment showed that the percentage of pre-ASIR-V for half dose was 40%. With 40% pre-ASIR-V, radiation dose in the study group was reduced by 35.5% compared with the control group. Image noise in subgroups B2-B10, C2-C3 and D1-D2 was lower (t = -14.681 to -3.046, all P<0.05). The subjective image quality scores increased gradually in the range of 0-60% post-ASIR-V and decreased with post-ASIR-V larger than 70%. The overall image quality of subgroups B3-B8, C2-C3 and D1-D2 was higher than that in group A (Z = -2.229 to -6.533, all P<0.05). ... ASIR technique, wide-detector combined with 40% pre
DEFF Research Database (Denmark)
Stidsen, Jacob Volmer; Nielsen, Jens Steen; Henriksen, Jan Erik
2017-01-01
INTRODUCTION: We present the protocol for a multifactorial intervention study designed to test whether individualised treatment, based on pathophysiological phenotyping and individualised treatment goals, improves type 2 diabetes (T2D) outcomes. METHODS AND ANALYSIS: We will conduct a prospective...
DEFF Research Database (Denmark)
Aadahl, M; Smith, L von Huth; Toft, U
2011-01-01
Aim: To examine the effect of a multifactorial lifestyle intervention on 5-year change in physical activity (PA) and to explore whether length of education had an impact on the effect of the intervention. Methods: Two random samples (high-intervention group A, n=11 708; low-intervention group B, n=...) ... The ...-based multifactorial lifestyle intervention did not influence social inequality in PA. Keywords: Lifestyle, Exercise, Randomised Intervention Study, Ischemic Heart Disease, Socioeconomic Position.
Using a multifactorial approach to determine fall risk profiles in portuguese older adults.
Moniz-Pereira, Vera; Carnide, Filomena; Ramalho, Fátima; André, Helô; Machado, Maria; Santos-Rocha, Rita; Veloso, António P
2013-01-01
The aim of this study was to use a multifactorial approach to characterize the risk profiles of episodic and recurrent fallers among Portuguese older adults. To this end, 1416 Portuguese adults over 65 years were assessed with three different field measurements: 1) a health and falls questionnaire; 2) a physical activity questionnaire; and 3) a set of functional fitness tests. The subjects were divided into three groups according to fall prevalence: non-fallers, who did not report any falls during the previous year; episodic fallers, who reported having fallen only once during the previous year; and recurrent fallers, who fell twice or more during the previous year. Episodic and recurrent faller risk profiles were established using multifactorial logistic regression models in order to avoid confounding effects between the variables. The results showed that age was not a risk factor for either episodic or recurrent falling. In addition, health parameters were shown to be the factors distinguishing recurrent from episodic fallers. This may imply that, compared with episodic falls, recurrent falls are more associated with a higher presence of chronic conditions and are less likely to occur due to external factors. Furthermore, being a woman, having fear of falling and having lower functional fitness levels were determinant factors for both episodic and recurrent falls. It is also important to note that, although total physical activity was only related to episodic falling, promoting physical activity and exercise may be the easiest and cheapest way to improve functional fitness and health levels; therefore, its role in fall prevention should not be underestimated. The results of this study reinforce the importance of using a multifactorial approach, not only focusing on cognitive-behavioral factors but also on promoting physical activity and healthy lifestyles, when assessing fall risk or planning an intervention
Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan
2018-06-01
The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.
Industrial statistics with Minitab
Cintas, Pere Grima; Llabres, Xavier Tort-Martorell
2012-01-01
Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured, along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: Provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry. Explores
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Besseris, George J
2013-01-01
Data screening is an indispensable phase in initiating the scientific discovery process. Fractional factorial designs offer quick and economical options for engineering highly-dense structured datasets. Maximum information content is harvested when a selected fractional factorial scheme is driven to saturation while data gathering is suppressed to no replication. A novel multi-factorial profiler is presented that allows screening of saturated-unreplicated designs by decomposing the examined response to its constituent contributions. Partial effects are sliced off systematically from the investigated response to form individual contrasts using simple robust measures. By isolating each time the disturbance attributed solely to a single controlling factor, the Wilcoxon-Mann-Whitney rank stochastics are employed to assign significance. We demonstrate that the proposed profiler possesses its own self-checking mechanism for detecting a potential influence due to fluctuations attributed to the remaining unexplainable error. Main benefits of the method are: 1) easy to grasp, 2) well-explained test-power properties, 3) distribution-free, 4) sparsity-free, 5) calibration-free, 6) simulation-free, 7) easy to implement, and 8) expanded usability to any type and size of multi-factorial screening designs. The method is elucidated with a benchmarked profiling effort for a water filtration process.
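The Wilcoxon-Mann-Whitney rank statistic at the heart of the profiler can be computed with a few lines of stdlib Python. The factor-level contrasts below are invented for illustration, not taken from the water filtration case study:

```python
# Stdlib sketch of the Mann-Whitney U statistic used to assign significance
# to a factor's contrast in the screening profiler. Data are illustrative.

def mann_whitney_u(xs, ys):
    """U statistic for sample xs versus ys (ties counted as 0.5)."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

low = [1.2, 1.5, 1.1]    # response contrasts at factor level "low"
high = [2.9, 3.1, 2.7]   # response contrasts at factor level "high"

u = mann_whitney_u(high, low)
# A value of u near len(high)*len(low) indicates that the "high" responses
# dominate the "low" ones in rank, i.e. a strong level effect.
print(u)
```

Being rank-based, the statistic is distribution-free, which is what gives the profiler the robustness properties claimed in the abstract.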
Directory of Open Access Journals (Sweden)
Pascual Izquierdo-Egea
2015-03-01
Full Text Available A statistical technique to measure social conflict through the mortuary record is presented here. It was born out of the contextual valuation method used in the analysis of grave goods since 1993. This is a fundamental tool for the development of the archaeology of social phenomena, whose relevant empirical results support its theoretical significance. After conveying its conceptualization in terms of social inequality and relative wealth, the two classes of social conflict defined are explained: structural or static and conjunctural or dynamic. Finally, connections with the Malthusian demographic law through its two parameters, population and resources, are included. This theoretical framework is illustrated with applications to ancient civilizations, including Iberian protohistory, prehispanic Mesoamerica, and early imperial Rome.
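One simple way to operationalize inequality in a mortuary record, shown here purely as our own illustration (it is not the author's contextual-valuation formula), is a Gini coefficient over grave-good wealth values:

```python
# Illustrative inequality index over grave-good "wealth" per burial.
# The Gini coefficient and the toy burial values are our own example.

def gini(values):
    """Gini coefficient of a list of non-negative wealth values."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    # Standard closed form via rank-weighted cumulative sum.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

egalitarian = [10, 10, 10, 10]   # similar grave-good values: no inequality
stratified = [1, 1, 1, 37]       # one very rich burial: high inequality

print(gini(egalitarian), gini(stratified))
```

Tracking such an index across chronological phases of a cemetery is one way a rising-inequality signal, of the kind the technique targets, could be quantified.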
International Nuclear Information System (INIS)
Lim, Gyeong Hui
2008-03-01
This book consists of 15 chapters, covering: the basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory applied to liquids; the theory of solutions; the statistical thermodynamics of interfaces; the statistical thermodynamics of macromolecular systems; and quantum statistics.
DEFF Research Database (Denmark)
Vind, Ane B; Andersen, Hanne E; Pedersen, Kirsten D
2009-01-01
OBJECTIVES: To evaluate the effect of multifactorial fall prevention in community-dwelling people aged 65 and older in Denmark. DESIGN: Randomized, controlled clinical trial. SETTING: Geriatric outpatient clinic at Glostrup University Hospital. PARTICIPANTS: Three hundred ninety-two elderly people..., mean age 74, 73.7% women, who had visited the emergency department or had been hospitalized due to a fall. INTERVENTION: Identification of general medical, cardiovascular, and physical risk factors for falls and individual intervention in the intervention group. Participants in the control group... Follow-up exceeded 90.0%. A total of 422 falls were registered in the intervention group, 398 in the control group. Intention-to-treat analysis revealed no effect of the intervention on fall rates (relative risk=1.06, 95% confidence interval (CI)=0.75-1.51), proportion with falls (odds ratio (OR)=1.20, 95...
Pavas, Edison Gil; Gómez-García, Miguel Angel
2009-01-01
This work deals with the treatment of wastewaters resulting from the process of dyeing flowers. At some flower farms near Medellín (Colombia), wastewater color was found to be one of the main obstacles to meeting local effluent standards. Wastewaters were treated by a photodegradation process (which includes photocatalysis) to achieve degradation of the dye mixture and organic matter in the wastewater. A multifactorial experimental design was proposed, including as experimental factors the following variables: pH and the concentrations of both the catalyst (TiO₂) and hydrogen peroxide (H₂O₂). According to the results obtained, at the optimized variable values it is possible to reach a 99% reduction of dyes, 76.9% mineralization (TOC) and a final biodegradability of 0.834. Kinetic analysis supports a pseudo-first-order reaction for the reduction, mineralization, and biodegradation processes.
Psczolla, M
2013-10-01
In Germany there is a clear deficit in the non-operative treatment of chronic and complex diseases and pain disorders in acute care hospitals. Only about 20% of such treatments are carried out in orthopedic hospitals. Hospitals specialized in manual medicine have therefore formed a working group on non-operative orthopedic manual-medicine acute care clinics (ANOA). The ANOA has developed a multimodal assessment procedure, OPS 8-977, which describes the structure and process quality of multimodal, interdisciplinary diagnosis and treatment of the musculoskeletal system. Patients are treated according to clinical pathways oriented on the clinical findings. The increased duration of treatment in the German diagnosis-related groups (DRG) system is compensated for with supplemental remuneration. Thus, complex and multifactorial orthopedic diseases and pain disorders can be treated conservatively and appropriately in inpatient departments of acute care hospitals.
Gersner, Roman; Gal, Ram; Levit, Ofir; Moshe, Hagar; Zangen, Abraham
2014-06-01
Major depressive disorder (MDD) is a common and devastating mental illness behaviorally characterized by various symptoms, including reduced motivation, anhedonia and psychomotor retardation. Although the etiology of MDD is still obscure, a genetic predisposition appears to play an important role. Here we used, for the first time, a multifactorial selective breeding procedure to generate a distinct 'depressed' rat line (DRL); our selection was based upon mobility in the forced swim test, sucrose preference and home-cage locomotion, three widely used tests associated with core characteristics of MDD. Other behavioral effects of the selection process, as well as changes in brain-derived neurotrophic factor (BDNF) and the response to three antidepressant treatments, were also examined. We show that decreased mobility in the forced swim test and decreased sucrose preference (two directly selected traits), as well as decreased exploration in the open field test (an indirectly selected trait), are hereditary components in DRL rats. In addition, lower BDNF levels are observed in the dorsal hippocampus of DRL rats, complying with the neurotrophic hypothesis of depression. Finally, electroconvulsive shocks (ECS) but not pharmacological treatment normalizes both the depressive-like behavioral impairments and the BDNF-related molecular alterations in DRL rats, highlighting the need for robust treatment when the disease is inherited and not necessarily triggered by salient chronic stress. We therefore provide a novel multifactorial genetic rat model for depression-related behaviors. The model can be used to further study the etiology of the disease and suggest molecular correlates and possible treatments for the disease.
A Multifactorial, Criteria-based Progressive Algorithm for Hamstring Injury Treatment.
Mendiguchia, Jurdan; Martinez-Ruiz, Enrique; Edouard, Pascal; Morin, Jean-Benoît; Martinez-Martinez, Francisco; Idoate, Fernando; Mendez-Villanueva, Alberto
2017-07-01
Given the prevalence of hamstring injuries in football, a rehabilitation program that effectively promotes muscle tissue repair and functional recovery is paramount to minimize reinjury risk and optimize player performance and availability. This study aimed to assess the effectiveness of administering an individualized and multifactorial criteria-based algorithm (rehabilitation algorithm, RA) for hamstring injury rehabilitation in comparison with a general rehabilitation protocol (RP). Implementing a double-blind randomized controlled trial approach, two equal groups of 24 football players (48 total) completed either the RA program or a validated RP program beginning 5 d after an acute hamstring injury. Within 6 months after return to sport, six hamstring reinjuries occurred in RP versus one injury in RA (relative risk = 6, 90% confidence interval = 1-35; clinical inference: very likely beneficial effect). The average time to return to sport was possibly quicker (effect size = 0.34 ± 0.42) in RP (23.2 ± 11.7 d) compared with RA (25.5 ± 7.8 d) (-13.8%, 90% confidence interval = -34.0% to 3.4%; clinical inference: possibly small effect). At the time of return to sport, RA players showed substantially better 10-m time, maximal sprinting speed, and greater mechanical variables related to speed (i.e., maximum theoretical speed and maximal horizontal power) than the RP players. Although return to sport was slower, male football players who underwent an individualized, multifactorial, criteria-based algorithm with a performance- and primary-risk-factor-oriented training program from the early stages of the process markedly decreased the risk of reinjury compared with a general protocol in which long-length strength training exercises were prioritized.
Kim, Moon H.; Ritz, Christian T.; Arvin, Donald V.
2012-01-01
Potential wetland extents were estimated for a 14-mile reach of the Wabash River near Terre Haute, Indiana. This pilot study was completed by the U.S. Geological Survey in cooperation with the U.S. Department of Agriculture, Natural Resources Conservation Service (NRCS). The study showed that potential wetland extents can be estimated by analyzing streamflow statistics with the available streamgage data, calculating the approximate water-surface elevation along the river, and generating maps by use of flood-inundation mapping techniques. Planning successful restorations for Wetland Reserve Program (WRP) easements requires a determination of areas that show evidence of being in a zone prone to sustained or frequent flooding. Zone determinations of this type are used by WRP planners to define the actively inundated area and make decisions on restoration-practice installation. According to WRP planning guidelines, a site needs to show evidence of being in an "inundation zone" that is prone to sustained or frequent flooding for a period of 7 consecutive days at least once every 2 years on average in order to meet the planning criteria for determining a wetland for a restoration in agricultural land. By calculating the annual highest 7-consecutive-day mean discharge with a 2-year recurrence interval (7MQ2) at a streamgage on the basis of available streamflow data, one can determine the water-surface elevation corresponding to the calculated flow that defines the estimated inundation zone along the river. By using the estimated water-surface elevation ("inundation elevation") along the river, an approximate extent of potential wetland for a restoration in agricultural land can be mapped. As part of the pilot study, a set of maps representing the estimated potential wetland extents was generated in a geographic information system (GIS) application by combining (1) a digital water-surface plane representing the surface of inundation elevation that sloped in the downstream
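The 7MQ2 statistic described above can be sketched in stdlib Python: take each year's highest 7-consecutive-day mean discharge, then approximate the 2-year recurrence value as the median of those annual maxima (the value exceeded on average once every 2 years in an annual series). The daily discharge "years" below are short synthetic records, not Wabash River streamgage data:

```python
# Hedged sketch of the 7MQ2 idea: annual highest 7-consecutive-day mean
# discharge with a ~2-year recurrence interval, estimated as the median
# of the annual maxima. Synthetic 10-day "years" stand in for real records.
import statistics

def annual_max_7day_mean(daily):
    """Highest mean over any 7 consecutive days of one year's record."""
    return max(sum(daily[i:i + 7]) / 7 for i in range(len(daily) - 6))

years = [
    [100, 100, 100, 300, 300, 300, 300, 300, 300, 300],
    [150, 150, 150, 150, 150, 150, 150, 150, 150, 150],
    [100, 100, 500, 500, 500, 500, 500, 500, 500, 100],
]

maxima = [annual_max_7day_mean(y) for y in years]
q7m2 = statistics.median(maxima)  # ~2-year recurrence via the annual series
print(maxima, q7m2)
```

The water-surface elevation corresponding to `q7m2` at the streamgage is what would then be carried along the reach to map the estimated inundation zone.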
Rumsey, Deborah
2011-01-01
The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou
van der Pols-Vijlbrief, Rachel; Wijnhoven, Hanneke A H; Bosmans, Judith E; Twisk, Jos W R; Visser, Marjolein
2017-12-01
Undernutrition in old age is associated with increased morbidity, mortality and health care costs. Treatment by caloric supplementation results in weight gain, but compliance is poor in the long run. Few studies have targeted the underlying causes of undernutrition in community-dwelling older adults. This study aimed to evaluate the cost-effectiveness of a multifactorial personalized intervention focused on eliminating or managing the underlying causes of undernutrition, to prevent and reduce undernutrition in comparison with usual care. A randomized controlled trial was performed among 155 community-dwelling older adults receiving home care, with or at risk of undernutrition. The intervention included a personalized action plan and 6 months of support. The control group received usual care. Body weight and secondary outcomes were measured in both groups at baseline and at 6 months follow-up. Multiple imputation, linear regression and generalized estimating equation analyses were used to analyze intervention effects. In the cost-effectiveness analyses, regression models were bootstrapped to estimate statistical uncertainty. The intervention showed no statistically significant effects on body weight, mid-upper arm circumference, grip strength, gait speed or the 12-Item Short-Form Health Survey physical component scale as compared to usual care, but there was an effect on the 12-Item Short-Form Health Survey mental component scale (0-100) (β = 8.940, p = 0.001). Borderline significant intervention effects were found for both objective and subjective physical function measures: Short Physical Performance Battery (0-12) (β = 0.56, p = 0.08) and ADL-Barthel score (0-20) (β = 0.69, p = 0.09). Societal costs in the intervention group were statistically non-significantly lower than in the control group (mean difference -274; 95% CI -1111 to 782). Cost-effectiveness acceptability curves showed that the probability of cost-effectiveness was 0.72 at a willingness-to-pay of 1000
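The bootstrap behind a cost-effectiveness acceptability curve can be sketched as follows. The per-patient cost and effect differences below are invented, and the abstract's regression-based bootstrap is simplified to resampling raw paired differences:

```python
import random

def ceac_probability(cost_diff, effect_diff, wtp, n_boot=2000, seed=1):
    """Probability the intervention is cost-effective at willingness-to-pay wtp.

    Bootstraps paired (cost, effect) differences and counts resamples whose
    net monetary benefit, wtp * effect - cost, is positive.
    """
    rng = random.Random(seed)
    n = len(cost_diff)
    hits = 0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        mean_cost = sum(cost_diff[i] for i in idx) / n
        mean_effect = sum(effect_diff[i] for i in idx) / n
        if wtp * mean_effect - mean_cost > 0:
            hits += 1
    return hits / n_boot

# Hypothetical per-patient differences (intervention minus control)
costs = [-300, 150, -500, 80, -120, -40, 220, -310]
effects = [0.5, -0.1, 0.8, 0.2, 0.3, 0.0, -0.2, 0.6]
p = ceac_probability(costs, effects, wtp=1000)
```

Evaluating `p` over a grid of willingness-to-pay values traces out the acceptability curve reported in the abstract.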
Ferrer, A.; Formiga, F.; Sanz, H.; de Vries, O.J.; Badia, T.; Pujol, R.
2014-01-01
Background: The purpose of this study was to assess the effectiveness of a multifactorial intervention to reduce falls among the oldest-old people, including individuals with cognitive impairment or comorbidities. Methods: A randomized, single-blind, parallel-group clinical trial was conducted from
Suijker, Jacqueline J.; van Rijn, Marjon; Buurman, Bianca M.; ter Riet, Gerben; van Charante, Eric P. Moll; de Rooij, Sophia E.
2016-01-01
Background To evaluate the effects of nurse-led multifactorial care to prevent disability in community-living older people. Methods In a cluster randomized trial, 11 practices (n = 1,209 participants) were randomized to the intervention group, and 13 practices (n = 1,074 participants) were
Statistical Pattern Recognition
Webb, Andrew R
2011-01-01
Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions. It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
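The evaluation criterion used in that comparison, the proportion of held-out cases classified correctly, can be illustrated with a toy nearest-centroid classifier (a minimal stand-in for discriminant analysis; the employment features and labels here are invented):

```python
def nearest_centroid(train_X, train_y, x):
    """Assign x to the class with the nearest mean vector -- a minimal
    stand-in for discriminant-style classification."""
    groups = {}
    for xi, yi in zip(train_X, train_y):
        groups.setdefault(yi, []).append(xi)
    best_label, best_dist = None, float("inf")
    for label, pts in groups.items():
        centroid = [sum(col) / len(pts) for col in zip(*pts)]
        dist = sum((a - b) ** 2 for a, b in zip(x, centroid))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

def accuracy(train_X, train_y, test_X, test_y):
    """Proportion of held-out cases classified correctly."""
    preds = [nearest_centroid(train_X, train_y, x) for x in test_X]
    return sum(p == t for p, t in zip(preds, test_y)) / len(test_y)

# Invented features: (months of prior work, years of education) -> employed?
train_X = [(24, 12), (30, 14), (2, 9), (0, 8), (28, 13), (1, 10)]
train_y = ["employed", "employed", "not", "not", "employed", "not"]
test_X, test_y = [(26, 12), (3, 9)], ["employed", "not"]
acc = accuracy(train_X, train_y, test_X, test_y)  # 1.0 on this toy split
```

CHAID and CART differ in how they grow the decision rule, but all three techniques in the study were scored on exactly this kind of holdout accuracy.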
Proteases and proteolysis in Alzheimer disease: a multifactorial view on the disease process.
De Strooper, Bart
2010-04-01
Alzheimer disease is characterized by the accumulation of abnormally folded protein fragments, i.e., amyloid beta peptide (Abeta) and tau that precipitate in amyloid plaques and neuronal tangles, respectively. In this review we discuss the complicated proteolytic pathways that are responsible for the generation and clearance of these fragments, and how disturbances in these pathways interact and provide a background for a novel understanding of Alzheimer disease as a multifactorial disorder. Recent insights evolve from the static view that the morphologically defined plaques and tangles are disease driving towards a more dynamic, biochemical view in which the intermediary soluble Abeta oligomers and soluble tau fragments are considered as the main mediators of neurotoxicity. The relevance of proteolytic pathways, centered on the generation and clearance of toxic Abeta, on the cleavage and nucleation of tau, and on the general proteostasis of the neurons, then becomes obvious. Blocking or stimulating these pathways provide, or have the potential to provide, interesting drug targets, which raises the hope that we will be able to provide a cure for this dreadful disorder.
Anstey, Kaarin J; Horswill, Mark S; Wood, Joanne M; Hatherly, Christopher
2012-03-01
The current study evaluated part of the Multifactorial Model of Driving Safety to elucidate the relative importance of cognitive function and a limited range of standard measures of visual function in relation to the Capacity to Drive Safely. Capacity to Drive Safely was operationalized using three validated screening measures for older drivers. These included an adaptation of the well validated Useful Field of View (UFOV) and two newer measures, namely a Hazard Perception Test (HPT), and a Hazard Change Detection Task (HCDT). Community dwelling drivers (n=297) aged 65-96 were assessed using a battery of measures of cognitive and visual function. Factor analysis of these predictor variables yielded factors including Executive/Speed, Vision (measured by visual acuity and contrast sensitivity), Spatial, Visual Closure, and Working Memory. Cognitive and Vision factors explained 83-95% of age-related variance in the Capacity to Drive Safely. Spatial and Working Memory were associated with UFOV, HPT and HCDT, Executive/Speed was associated with UFOV and HCDT and Vision was associated with HPT. The Capacity to Drive Safely declines with chronological age, and this decline is associated with age-related declines in several higher order cognitive abilities involving manipulation and storage of visuospatial information under speeded conditions. There are also age-independent effects of cognitive function and vision that determine driving safety. Copyright © 2011 Elsevier Ltd. All rights reserved.
Boslaugh, Sarah
2013-01-01
Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.
Business statistics for dummies
Anderson, Alan
2013-01-01
Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w
On quantum statistical inference
Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have
Playing at Statistical Mechanics
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
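The counting behind those distributions can be reproduced directly: n indistinguishable bosons in g states have C(n+g-1, n) microstates, fermions (at most one per state) have C(g, n), and g^n arrangements exist for distinguishable (Maxwell-Boltzmann) particles:

```python
from math import comb

def microstates(n, g):
    """Number of ways to place n particles in g single-particle states
    under each counting rule of statistical mechanics."""
    return {
        "bose_einstein": comb(n + g - 1, n),  # indistinguishable, unlimited
        "fermi_dirac": comb(g, n),            # indistinguishable, max 1/state
        "maxwell_boltzmann": g ** n,          # distinguishable particles
    }

print(microstates(2, 3))
# {'bose_einstein': 6, 'fermi_dirac': 3, 'maxwell_boltzmann': 9}
```

This is exactly the sorting-game arithmetic: the same two particles in three boxes give different counts depending on which occupancy rules the game allows.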
Akifuddin, Syed; Khatoon, Farheen
2015-12-01
Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures, and to reduce the incidence of complications by introduction of six sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures, using failure mode and effect analysis. Pareto analysis was used to identify the most recurring complications. The paired z-sample test using Minitab Statistical Inference and Fisher's exact test were used to statistically analyse the obtained data. The p-values indicated that the six sigma improvement methodology in healthcare tends to deliver consistently better results to the patients as well as the hospitals, and results in better patient compliance and satisfaction.
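Pareto analysis, as used above, amounts to sorting complication categories by frequency and accumulating percentages until the "vital few" causes stand out. A minimal sketch, with hypothetical complication counts:

```python
def pareto_table(counts):
    """Sort category counts descending and attach cumulative percentages."""
    total = sum(counts.values())
    rows, cumulative = [], 0
    for name, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += n
        rows.append((name, n, round(100 * cumulative / total, 1)))
    return rows

# Hypothetical complication counts after local anaesthesia
complications = {
    "pain at injection site": 40,
    "hematoma": 25,
    "trismus": 20,
    "syncope": 10,
    "paresthesia": 5,
}
rows = pareto_table(complications)
for name, n, cum_pct in rows:
    print(f"{name:24s} {n:3d}  {cum_pct:5.1f}%")
```

In a classic Pareto reading, the categories reaching roughly 80% cumulative share are the ones the DMAIC improvement effort would target first.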
Uric acid as one of the important factors in multifactorial disorders – facts and controversies
Pasalic, Daria; Marinkovic, Natalija; Feher-Turkovic, Lana
2012-01-01
Considering serum concentrations of uric acid in humans, we observe hyperuricemia and possible gout development. Many epidemiological studies have shown a relationship between uric acid and disorders such as obesity, metabolic syndrome, hypertension and coronary artery disease. Clinicians and investigators recognize serum uric acid concentration as an important diagnostic and prognostic factor in many multifactorial disorders. This review presents several clinical conditions that are not directly related to uric acid, but in which uric acid concentrations may have a great impact on observation, monitoring, prognosis and therapy. Uric acid is recognized as a marker of oxidative stress. Production of uric acid involves the enzyme xanthine oxidase, which generates reactive oxygen species (ROS) as by-products. ROS play a significant role in increased vascular oxidative stress and might be involved in atherogenesis. Uric acid may impair endothelial function by inhibiting nitric oxide function under conditions of oxidative stress. Down-regulation of nitric oxide and induction of endothelial dysfunction might also be involved in the pathogenesis of hypertension. The best-evidenced role of uric acid is its possible predictive value for short-term outcome (mortality) in patients with acute myocardial infarction (AMI) and stroke. Nephrolithiasis of uric acid origin is significantly more common among patients with the metabolic syndrome and obesity. In contrast, uric acid also acts as an "antioxidant", a free radical scavenger and a chelator of transitional metal ions, which are converted to poorly reactive forms. PMID:22384520
Koczy, Petra; Becker, Clemens; Rapp, Kilian; Klie, Thomas; Beische, Denis; Büchele, Gisela; Kleiner, Andrea; Guerra, Virginia; Rissmann, Ulrich; Kurrle, Susan; Bredthauer, Doris
2011-02-01
To evaluate the effectiveness of a multifactorial intervention to reduce the use of physical restraints in residents of nursing homes. Cluster-randomized controlled trial. Forty-five nursing homes in Germany. Three hundred thirty-three residents who were being restrained at the start of the intervention. Persons responsible for the intervention in the nursing homes attended a 6-hour training course that included education about the reasons restraints are used, their adverse effects, and alternatives to their use. Technical aids, such as hip protectors and sensor mats, were provided. The training was designed to give the change agents tools for problem-solving to prevent behavioral symptoms and injuries from falls without using physical restraints. The main outcome was the complete cessation of physical restraint use on 3 consecutive days 3 months after the start of the intervention. Secondary outcomes were partial reductions in restraint use, percentage of fallers, number of psychoactive drugs, and occurrence of behavioral symptoms. The probability of being unrestrained in the intervention group (IG) was more than twice that in the control group (CG) at the end of the study (odds ratio=2.16, 95% confidence interval=1.05-4.46). A partial reduction of restraint use was also achieved about twice as often in the IG as in the CG. No negative effect was observed regarding medication or behavioral symptoms. The percentage of fallers was higher in the IG. The intervention reduced restraint use without a significant increase in falling, behavioral symptoms, or medication. © 2011, Copyright the Authors. Journal compilation © 2011, The American Geriatrics Society.
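An odds ratio with its 95% confidence interval, as reported above, can be computed from a 2×2 table with the standard Woolf log-odds method. The cell counts below are hypothetical illustrations, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = events in group 1, b = non-events in group 1,
    c = events in group 2, d = non-events in group 2."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: unrestrained vs. still restrained, IG vs. CG
or_, lo, hi = odds_ratio_ci(40, 120, 20, 130)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI whose lower bound stays above 1, as in the study's 1.05-4.46, is what makes the doubled odds of being unrestrained statistically significant.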
Directory of Open Access Journals (Sweden)
Anja S Euser
Full Text Available BACKGROUND: Although P300 amplitude reductions constitute a persistent finding in children of addicted parents, relatively little is known about the specificity of this finding. The major aim of this study was to investigate the association between parental rearing, adverse life events, stress-reactivity, substance use and psychopathology on the one hand, and P300 amplitude in response to both target and novel distracter stimuli on the other hand. Moreover, we assessed whether risk group status (i.e., having a parental history of Substance Use Disorders [SUD]) uniquely contributed to P300 amplitude variation above and beyond these other variables. METHODS: Event-related potentials were recorded in high-risk adolescents with a parental history of SUD (HR; n=80) and normal-risk controls (NR; n=100) while performing a visual Novelty Oddball paradigm. Stress-evoked cortisol levels were assessed, and parenting, life adversities, substance use and psychopathology were examined by using self-reports. RESULTS: HR adolescents displayed smaller P300 amplitudes in response to novel and to target stimuli than NR controls, although the latter difference only approached significance. Interestingly, the effect of having a parental history of SUD on target-P300 disappeared when all other variables were taken into account. Externalizing problem behavior was a powerful predictor of target-P300. In contrast, risk group status uniquely predicted novelty-P300 amplitude reductions above and beyond all other factors. CONCLUSION: Overall, the present findings suggest that the P300 amplitude reduction to novel stimuli might be a more specific endophenotype for SUD than the target-P300 amplitude. This pattern of results underscores the importance of conducting multifactorial assessments when examining important cognitive processes in at-risk adolescents.
Lectures on algebraic statistics
Drton, Mathias; Sullivant, Seth
2009-01-01
How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.
Experimental Mathematics and Computational Statistics
Energy Technology Data Exchange (ETDEWEB)
Bailey, David H.; Borwein, Jonathan M.
2009-04-30
The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.
Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung
2012-09-01
To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Twenty-six patients (M:F=13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P<0.001), DLP (from 307.42 to 134.51 mGy × cm, P<0.001), and effective dose (from 4.12 to 1.84 mSv, P<0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P=0.004), but was not different in the aorta (18.23 vs. 18.72, P=0.726). The subjective image quality demonstrated no difference between the two studies. A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Lee, Seung Hyun, E-mail: circle1128@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, Myung-Joon, E-mail: mjkim@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Yoon, Choon-Sik, E-mail: yooncs58@yuhs.ac [Department of Radiology, Gangnam Severance Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Lee, Mi-Jung, E-mail: mjl1213@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of)
2012-09-15
Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality.
International Nuclear Information System (INIS)
Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung
2012-01-01
Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality
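The paired Student's t statistic used above to compare same-patient noise measurements can be computed directly. The noise values below are invented, and a full analysis would convert t to a p-value via the t-distribution (e.g. with scipy.stats):

```python
import math
import statistics

def paired_t(x, y):
    """Paired Student's t statistic and degrees of freedom for two series
    of measurements on the same patients (e.g. Routine vs. ASIR noise)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Invented paraspinal-muscle noise values for five patients
routine = [21.0, 19.5, 22.3, 20.1, 20.8]
asir = [16.2, 17.0, 17.5, 16.4, 16.9]
t, df = paired_t(routine, asir)
print(f"t = {t:.2f} on {df} df")
```

The pairing matters: each patient serves as their own control, so between-patient variability drops out of the test statistic.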
Sadovskii, Michael V
2012-01-01
This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics, all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity, and the modern theory of critical phenomena. Beyond that, attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.
Pekkarinen, T; Löyttyniemi, E; Välimäki, M
2013-12-01
Guidelines suggest identifying women at fracture risk by bone density measurement and subsequent pharmacotherapy. However, most women who sustain a hip fracture do not have osteoporosis in terms of bone density. The present non-pharmacological intervention among elderly women unselected for osteoporosis reduced hip fracture risk by 55 %, providing an alternative approach to fracture prevention. Hip fractures are expensive for society and cause disability for those who sustain them. We studied whether a multifactorial non-pharmacological prevention program reduces hip fracture risk in elderly women. A controlled trial concerning 60- to 70-year-old community-dwelling Finnish women was undertaken. A random sample was drawn from the Population Information System and assigned to the intervention group (IG) and control group (CG). Of the 2,547 women invited to the IG, 1,004 (39 %) participated, and of the 2,120 invited to the CG, 1,174 (55 %) participated. The IG took part in a fracture prevention program for 1 week at a rehabilitation center, followed by two review days. The CG received no intervention. During the 10-year follow-up, both groups completed a mailed survey questionnaire. The outcomes of interest were the occurrence of hip fractures and changes in bone-health-related lifestyle. During the follow-up, 12 (1.2 %) women in the IG and 29 (2.5 %) in the CG sustained a hip fracture (P = 0.039). The determinants of hip fractures by stepwise logistic regression were baseline smoking (odds ratio (OR) 4.32, 95 % confidence interval (CI) 2.14-8.71), age (OR 1.15/year, 95 % CI 1.03-1.28), fall history (OR 2.7, 95 % CI 1.24-5.9), stroke history (OR 2.99, 95 % CI 1.19-7.54) and participation in this program (OR 0.45, 95 % CI 0.22-0.93). Starting vitamin D and calcium supplement use was more common in the IG than in the CG. The results suggest that this non-pharmacological fracture prevention program may reduce the risk of hip fractures in elderly women.
Goodman, Joseph W
2015-01-01
This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications. The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i
Energy Technology Data Exchange (ETDEWEB)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
2017-05-15
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
International Nuclear Information System (INIS)
Eliazar, Iddo
2017-01-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
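A harmonic Poisson process of the kind described above can be simulated directly: under intensity c/x the expected number of points on [a, b] is c·ln(b/a), and conditionally on the count each point has density proportional to 1/x, so x = a·(b/a)^U for uniform U. This sketch uses small means, so a simple CDF-inversion Poisson draw suffices:

```python
import math
import random

def harmonic_poisson_sample(c, a, b, rng):
    """One realization of a Poisson process on [a, b] with intensity c/x."""
    mean = c * math.log(b / a)          # integral of c/x over [a, b]
    # Poisson(mean) count by CDF inversion (adequate for small means)
    k, p, u = 0, math.exp(-mean), rng.random()
    s = p
    while u > s:
        k += 1
        p *= mean / k
        s += p
    # Given k points, density proportional to 1/x  =>  x = a * (b/a)**U
    return sorted(a * (b / a) ** rng.random() for _ in range(k))

rng = random.Random(0)
draws = [harmonic_poisson_sample(3.0, 1.0, math.e, rng) for _ in range(2000)]
avg_count = sum(len(d) for d in draws) / len(draws)  # close to 3 = 3*ln(e/1)
```

The logarithmic mean count is the scale-invariance signature the paper discusses: doubling both endpoints of the interval leaves the expected number of points unchanged.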
Directory of Open Access Journals (Sweden)
Hill KD
2014-11-01
Full Text Available Keith D Hill,1,2 Lesley Day,3 Terry P Haines4,5 1School of Physiotherapy and Exercise Science, Faculty of Health Sciences, Curtin University, Perth, WA, Australia; 2National Ageing Research Institute, Royal Melbourne Hospital, Parkville, VIC, Australia; 3Falls Prevention Research Unit, Monash Injury Research Institute, Monash University, VIC, Australia; 4Allied Health Research Unit, Southern Health, Cheltenham, VIC, Australia; 5Physiotherapy Department, Faculty of Medicine, Nursing, and Health Sciences, Monash University, VIC, Australia Purpose: To investigate previous, current, or planned participation in, and perceptions toward, multifactorial fall prevention programs such as those delivered through a falls clinic in the community setting, and to identify factors influencing older people’s intent to undertake these interventions. Design and methods: Community-dwelling people aged >70 years completed a telephone survey. Participants were randomly selected from an electronic residential telephone listing, but purposeful sampling was used to include equal numbers with and without common chronic health conditions associated with fall-related hospitalization. The survey included scenarios for fall prevention interventions, including assessment/multifactorial interventions, such as those delivered through a falls clinic. Participants were asked about previous exposure to, or intent to participate in, the interventions. A path model analysis was used to identify factors associated with intent to participate in assessment/multifactorial interventions. Results: Thirty of 376 participants (8.0%) reported exposure to a multifactorial falls clinic-type intervention in the past 5 years, and 16.0% expressed intention to undertake this intervention. Of the 132 participants who reported one or more falls in the past 12 months, over one-third were undecided or disagreed that a falls clinic type of intervention would be of benefit to them. Four elements
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
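The cost of data snooping is easy to simulate: under a pure-noise null, choosing "the analysis that looks significant" among k candidate tests inflates the chance of a spurious finding from about 5% toward 1 − 0.95^k. A minimal sketch with independent z-statistics:

```python
import random

def snooping_false_positive_rate(k, n_sims=20000, seed=7):
    """Chance that at least one of k independent looks at pure-noise data
    yields |z| > 1.96 -- the inflation produced by data snooping."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        if any(abs(rng.gauss(0, 1)) > 1.96 for _ in range(k)):
            hits += 1
    return hits / n_sims

p1 = snooping_false_positive_rate(1)    # one pre-planned test: about 0.05
p10 = snooping_false_positive_rate(10)  # best of ten looks: about 0.40
print(p1, p10)
```

This is why the column's "fundamental precept" exists: a scheme of analysis fixed in advance keeps the nominal 5% error rate honest.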
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Glaz, Joseph
2009-01-01
Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.
Lyons, L.
2016-01-01
Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.
DEFF Research Database (Denmark)
Edberg, Anna; Freyhult, Eva; Sand, Salomon
- and inter-national data excerpts. For example, major PCA loadings helped deciphering both shared and disparate features, relating to food groups, across Danish and Swedish preschool consumers. Data interrogation, reliant on the above-mentioned composite techniques, disclosed one outlier dietary prototype...... prototype with the latter property was identified also in the Danish data material, but without low consumption of Vegetables or Fruit & berries. The second MDA-type of data interrogation involved Supervised Learning, also known as Predictive Modelling. These exercises involved the Random Forest (RF...... not elaborated on in-depth, output from several analyses suggests a preference for energy-based consumption data for Cluster Analysis and Predictive Modelling, over those appearing as weight....
Energy Technology Data Exchange (ETDEWEB)
Guignard, P.A.; Chan, W. (Royal Melbourne Hospital, Parkville (Australia). Dept. of Nuclear Medicine)
1984-09-01
Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated. They were: three and five point time smoothing, Fourier filtering preserving one to four harmonics (H), truncated curve Fourier filtering, and third degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third degree polynomial curve fittings and truncated Fourier filters exhibited very high sensitivity to noise. Three and five point time smoothing had moderate sensitivity to noise, but were highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned.
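The harmonic-preserving Fourier filter at the heart of this comparison is easy to sketch: transform the periodic curve, zero every coefficient above the chosen harmonic, and transform back. A minimal illustration assuming numpy, with a synthetic curve standing in for the time-activity data:

```python
import numpy as np

def fourier_filter(curve, n_harmonics):
    """Keep only the mean and the first n_harmonics of a periodic curve."""
    coeffs = np.fft.rfft(curve)
    coeffs[n_harmonics + 1:] = 0.0        # zero everything above harmonic H
    return np.fft.irfft(coeffs, n=len(curve))

# Synthetic "time-activity curve": one cardiac-like cycle plus noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 64, endpoint=False)
clean = 1.0 + 0.5 * np.sin(2 * np.pi * t) + 0.2 * np.sin(4 * np.pi * t)
noisy = clean + rng.normal(0, 0.1, t.size)

smoothed = fourier_filter(noisy, n_harmonics=3)
# The 3-harmonic filter should track the clean curve better than the raw data.
print(np.abs(smoothed - clean).mean() < np.abs(noisy - clean).mean())   # → True
```

Keeping only 2H or 3H discards most of the broadband noise while retaining the low harmonics that carry the systolic and diastolic shape, which matches the compromise the abstract reports.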
International Nuclear Information System (INIS)
Guignard, P.A.; Chan, W.
1984-01-01
Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated. They were: three and five point time smoothing, Fourier filtering preserving one to four harmonics (H), truncated curve Fourier filtering, and third degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third degree polynomial curve fittings and truncated Fourier filters exhibited very high sensitivity to noise. Three and five point time smoothing had moderate sensitivity to noise, but were highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned. (author)
Directory of Open Access Journals (Sweden)
R. Soundararajan
2015-01-01
Full Text Available Artificial Neural Network (ANN approach was used for predicting and analyzing the mechanical properties of A413 aluminum alloy produced by squeeze casting route. The experiments were carried out with different controlled input variables such as squeeze pressure, die preheating temperature, and melt temperature as per Full Factorial Design (FFD. The controlled process variables produced castings with a pore-free, fine-grained dendritic structure, resulting in good mechanical properties such as hardness, ultimate tensile strength, and yield strength. As a primary objective, a feed forward back propagation ANN model has been developed with different architectures to ensure the reliability of the predicted values. The developed model along with its predicted data was in good agreement with the experimental data, confirming the performance of the optimal model. From the work it was ascertained that, for castings produced by the squeeze casting route, the ANN is an alternative method for predicting the mechanical properties and appropriate results can be estimated rather than measured, thereby reducing the testing time and cost. As a secondary objective, quantitative and statistical analysis was performed in order to evaluate the effect of process parameters on the mechanical properties of the castings.
Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae
2012-09-01
This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
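The six first-order texture descriptors named in the abstract are standard histogram statistics of a gray-level window. A sketch of how they might be computed, assuming numpy; the window size and gray range follow the abstract, but the exact formulas used by the authors are not given, so these are the common textbook definitions:

```python
import numpy as np

def texture_features(window):
    """Six first-order texture descriptors of a grayscale window (0-255)."""
    g = window.astype(float).ravel()
    hist, _ = np.histogram(g, bins=256, range=(0, 256))
    p = hist / hist.sum()                       # gray-level probabilities
    mean = g.mean()                             # average gray level
    var = g.var()
    contrast = np.sqrt(var)                     # average contrast (std dev)
    smoothness = 1 - 1 / (1 + var / 255.0**2)   # relative smoothness
    skew = ((g - mean) ** 3).mean() / (var ** 1.5 + 1e-12)
    uniformity = (p ** 2).sum()
    entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
    return mean, contrast, smoothness, skew, uniformity, entropy

# Sanity check: a flat window has zero contrast and maximal uniformity.
flat = np.full((50, 50), 128, dtype=np.uint8)
m, c, s, sk, u, e = texture_features(flat)
print(m, c, u, abs(e))   # → 128.0 0.0 1.0 0.0
```

Feature vectors like this, one per ROA, are what a discriminant analysis such as MANOVA would then compare between fatty and normal liver regions.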
Cordell, H J; Todd, J A; Bennett, S T; Kawaguchi, Y; Farrall, M
1995-10-01
To investigate the genetic component of multifactorial diseases such as type 1 (insulin-dependent) diabetes mellitus (IDDM), models involving the joint action of several disease loci are important. These models can give increased power to detect an effect and a greater understanding of etiological mechanisms. Here, we present an extension of the maximum lod score method of N. Risch, which allows the simultaneous detection and modeling of two unlinked disease loci. Genetic constraints on the identical-by-descent sharing probabilities, analogous to the "triangle" restrictions in the single-locus method, are derived, and the size and power of the test statistics are investigated. The method is applied to affected-sib-pair data, and the joint effects of IDDM1 (HLA) and IDDM2 (the INS VNTR) and of IDDM1 and IDDM4 (FGF3-linked) are assessed with relation to the development of IDDM. In the presence of genetic heterogeneity, there is seen to be a significant advantage in analyzing more than one locus simultaneously. Analysis of these families indicates that the effects at IDDM1 and IDDM2 are well described by a multiplicative genetic model, while those at IDDM1 and IDDM4 follow a heterogeneity model.
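The affected-sib-pair framework this method extends rests on identical-by-descent (IBD) sharing: with no linkage, a sib pair shares 0, 1 or 2 alleles IBD with probabilities 1/4, 1/2, 1/4. A sketch of the simple single-locus "means test" (not the authors' two-locus lod score method, which is more involved), assuming numpy and illustrative counts:

```python
import numpy as np

# Under the null (no linkage) the mean sharing proportion is 0.5 and the
# per-pair variance of the sharing proportion is 1/8.
def sharing_z_score(ibd_counts):
    """ibd_counts: observed numbers of pairs sharing 0, 1, 2 alleles IBD."""
    n = sum(ibd_counts)
    pi_hat = (0 * ibd_counts[0] + 1 * ibd_counts[1] + 2 * ibd_counts[2]) / (2 * n)
    se = np.sqrt(0.125 / n)          # Var(pi_hat) = 1/(8n) under the null
    return (pi_hat - 0.5) / se

# Hypothetical excess sharing at a disease locus: many pairs share 2 alleles.
z = sharing_z_score([10, 40, 50])
print(z > 3)   # → True: strong evidence of linkage
```

The paper's contribution is to model sharing at two unlinked loci jointly, with "triangle"-type constraints on the sharing probabilities, which gains power under heterogeneity relative to one-locus tests like this one.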
Energy Technology Data Exchange (ETDEWEB)
Cordell, H.J.; Todd, J.A.; Bennett, S.T. [Univ. of Oxford (United Kingdom)] [and others]
1995-10-01
To investigate the genetic component of multifactorial diseases such as type 1 (insulin-dependent) diabetes mellitus (IDDM), models involving the joint action of several disease loci are important. These models can give increased power to detect an effect and a greater understanding of etiological mechanisms. Here, we present an extension of the maximum lod score method of N. Risch, which allows the simultaneous detection and modeling of two unlinked disease loci. Genetic constraints on the identical-by-descent sharing probabilities, analogous to the "triangle" restrictions in the single-locus method, are derived, and the size and power of the test statistics are investigated. The method is applied to affected-sib-pair data, and the joint effects of IDDM1 (HLA) and IDDM2 (the INS VNTR) and of IDDM1 and IDDM4 (FGF3-linked) are assessed with relation to the development of IDDM. In the presence of genetic heterogeneity, there is seen to be a significant advantage in analyzing more than one locus simultaneously. Analysis of these families indicates that the effects at IDDM1 and IDDM2 are well described by a multiplicative genetic model, while those at IDDM1 and IDDM4 follow a heterogeneity model. 17 refs., 9 tabs.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
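The chapter's core ideas (measures of location and spread for a quantitative variable, plus transformations) can be shown in a few lines. A sketch assuming numpy; the data values are made up for illustration:

```python
import numpy as np

# Describing a quantitative variable: location, spread, and a transformation
# to tame a long right tail.
x = np.array([2.1, 2.4, 2.4, 2.9, 3.1, 3.5, 4.0, 4.8, 6.2, 12.7])

summary = {
    "n": x.size,
    "mean": x.mean(),
    "median": float(np.median(x)),
    "sd": x.std(ddof=1),                                # sample standard deviation
    "iqr": float(np.subtract(*np.percentile(x, [75, 25]))),
}
print(round(summary["median"], 1))   # → 3.3

def skewness(v):
    return ((v - v.mean()) ** 3).mean() / v.std() ** 3

# A log transform reduces the right skew caused by the outlying value 12.7.
print(skewness(x) > skewness(np.log(x)))   # → True
```

Note how the median (3.3) sits well below the mean (4.41) here, the classic signature of a right-skewed distribution that graphics and transformations help reveal.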
International Nuclear Information System (INIS)
Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae
2012-01-01
This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 x 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
Blakemore, J S
1962-01-01
Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co
Wannier, Gregory Hugh
1966-01-01
Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for
Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James
2014-01-01
Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.
Affum, Andrews Obeng; Osae, Shiloh Dede; Nyarko, Benjamin Jabez Botwe; Afful, Samuel; Fianko, Joseph Richmond; Akiti, Tetteh Thomas; Adomako, Dickson; Acquaah, Samuel Osafo; Dorleku, Micheal; Antoh, Emmanuel; Barnes, Felix; Affum, Enoch Acheampong
2015-02-01
In recent times, surface water resources in the Western Region of Ghana have been found to be inadequate in supply and polluted by various anthropogenic activities. As a result of these problems, the demand for groundwater by the human populations in the peri-urban communities for domestic, municipal and irrigation purposes has increased without prior knowledge of its water quality. Water samples were collected from 14 public hand-dug wells during the rainy season in 2013 and investigated for total coliforms, Escherichia coli, mercury (Hg), arsenic (As), cadmium (Cd) and physicochemical parameters. Multivariate statistical analysis of the dataset and a linear stoichiometric plot of major ions were applied to group the water samples and to identify the main factors and sources of contamination. Hierarchical cluster analysis revealed four clusters from the hydrochemical variables (R-mode) and three clusters in the case of water samples (Q-mode) after z score standardization. Principal component analysis after a varimax rotation of the dataset indicated that the four factors extracted explained 93.3 % of the total variance, which highlighted salinity, toxic elements and hardness pollution as the dominant factors affecting groundwater quality. Cation exchange, mineral dissolution and silicate weathering influenced groundwater quality. The ranking order of major ions was Na(+) > Ca(2+) > K(+) > Mg(2+) and Cl(-) > SO4 (2-) > HCO3 (-). Based on piper plot and the hydrogeology of the study area, sodium chloride (86 %), sodium hydrogen carbonate and sodium carbonate (14 %) water types were identified. Although E. coli were absent in the water samples, 36 % of the wells contained total coliforms (Enterobacter species) which exceeded the WHO guideline limit of zero colony-forming unit (CFU)/100 mL of drinking water. With the exception of Hg, the concentration of As and Cd in 79 and 43 % of the water samples exceeded the WHO guideline limits of 10 and 3
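The multivariate workflow used here (z-score standardization followed by principal component analysis with explained-variance fractions) can be sketched with synthetic data. This assumes numpy; the variables are invented stand-ins for hydrochemical measurements, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hydrochemical-style data matrix: samples x variables with very different scales.
n = 60
salinity = rng.normal(500, 100, n)            # e.g. Na+ / Cl- driven, mg/L
hardness = 0.8 * salinity + rng.normal(0, 40, n)   # correlated with salinity
toxics = rng.normal(5, 1, n)                  # independent trace element, ug/L
X = np.column_stack([salinity, hardness, toxics])

# z-score standardization, then PCA via SVD of the standardized matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, s, _ = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()               # variance fraction per component

# The two correlated variables collapse onto PC1, so two components
# should explain most of the total variance.
print(explained[:2].sum() > 0.9)   # → True
```

Standardizing first is essential when, as in groundwater chemistry, the variables carry incommensurate units; otherwise the largest-valued ion dominates every component.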
THE ROLE OF MULTIFACTORIAL APPROACH TO TREATMENT OF OBESITY IN FEMALES
Directory of Open Access Journals (Sweden)
O. L. Andrianova
2015-01-01
Full Text Available Background: Obesity is characterized by an increased risk of diabetes mellitus, coronary heart disease, arterial hypertension and reproductive system disorders, which makes it necessary to implement multifactorial correction of metabolic disturbances. Aim: To analyze diet intake structure of patients with obesity, efficacy and safety of sibutramine monotherapy and sibutramine/metformin combination therapy in the formation of adequate nutritional stereotypes and reduction of bodyweight. Materials and methods: Eighty-two obese women aged 18 to 49 years (mean age 29.7±5.7 years) were included into this observational study. Inclusion criteria: women aged 18 to 49 years, with waist circumference >80 cm, body mass index (BMI) >27 kg/m², triglyceride level >1.7 mmol/L and/or low-density lipoprotein cholesterol >3.8 mmol/L, and/or high-density lipoprotein cholesterol <1.29 mmol/L. Exclusion criteria in this study were presence of severe somatic and endocrine disorders. The control group consisted of 35 healthy women aged 18 to 49 years (mean age 28.7±5.6 years). All patients were recruited within the observational program PrimaVera. Results: Analysis of dietary intake in female obese patients showed an excess of daily energy intake of 650±250 kcal compared to that in subjects with normal BMI and normal waist circumference. Seventy one percent of patients had excessive expectations from treatment duration and desired weight loss. For control of their eating behavior, 52 patients were administered Reduxin® (sibutramine + microcrystalline cellulose 10 mg daily. The other 30 patients (24 women with a history of carbohydrate metabolism disturbances during pregnancy and 6 women who had delivered babies with birth weight of above 4 kg were administered Reduxin® 10 mg and metformin 500 mg daily, with weekly dose increase by 500 mg to the final dose of 1500 mg daily. The treatment lasted for 24 weeks. Daily caloric intake decreased by 24±4% from baseline (p<0
Introductory statistical inference
Mukhopadhyay, Nitis
2014-01-01
This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
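One simple statistic that quantifies "information in an image" in the sense sketched here is the Shannon entropy of its gray-level histogram: featureless frames score near zero, busy ones score high, and images can be ranked by that score. A minimal illustration assuming numpy; entropy is one plausible choice among many such statistics:

```python
import numpy as np

def image_entropy(img, bins=64):
    """Shannon entropy of the gray-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(3)
flat = np.full((128, 128), 0.5)          # featureless image
busy = rng.uniform(size=(128, 128))      # high-information image

# Rank images by entropy and keep the most informative ones.
scores = {"flat": image_entropy(flat), "busy": image_entropy(busy)}
print(scores["flat"] < scores["busy"])   # → True
```

Computing one number per image turns the "impossible to look at everything" problem into a sort over scores.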
Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...
U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...
Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.
Centers for Disease Control and Prevention — Gonorrhea statistics (Sexually Transmitted Diseases surveillance).
DEFF Research Database (Denmark)
Tryggestad, Kjell
2004-01-01
The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work...... within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit...... in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved...
Tada, Sayaka; Ikebe, Kazunori; Matsuda, Ken-Ichi; Maeda, Yoshinobu
2013-12-01
Predicting tooth survival is a great challenge for evidence-based dentistry. To prevent further tooth loss in partially edentulous patients, estimation of the individualized risk and benefit for each residual tooth is important for clinical decision-making. While there are several reports indicating a risk of losing the abutment teeth of removable partial dentures (RPDs), there are no existing reports exploring the cause of abutment loss by multifactorial analysis. The aim of this practice-based longitudinal study was to determine the prognostic factors affecting the survival period of RPD abutments using a multifactorial risk assessment. One hundred and forty-seven patients had been previously provided with a total of 236 new RPDs at the Osaka University Dental Hospital; the 856 abutments for these RPDs were analyzed. Survival of abutment teeth was estimated using the Kaplan-Meier method. Multivariate analysis was conducted by Cox's proportional hazard modelling. The 5-year survival rates were 86.6% for direct abutments and 93.1% for indirect abutments, compared with 95.8% survival in non-abutment teeth. The multivariate analysis showed that abutment survival was significantly associated with crown-root ratio (hazard ratio (HR): 3.13), root canal treatment (HR: 2.93), pocket depth (HR: 2.51), type of abutments (HR: 2.19) and occlusal support (HR: 1.90). From this practice-based longitudinal study, we concluded that RPD abutment teeth are more likely to be lost than other residual teeth. From the multifactorial risk factor assessment, several prognostic factors, such as occlusal support, crown-root ratio, root canal treatment, and pocket depth were suggested. These results could be used to estimate the individualized risk for the residual teeth, to predict the prognosis of RPD abutments and to facilitate evidence-based clinical decision making. Copyright © 2013 Elsevier Ltd. All rights reserved.
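The Kaplan-Meier estimator used for the abutment survival rates multiplies, at each event time, the fraction of at-risk teeth that survive. A minimal sketch assuming numpy; the five teeth below are invented, not the study's data:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up time of each tooth (years)
    events: 1 if the tooth was lost at that time, 0 if censored
    """
    times, events = np.asarray(times, float), np.asarray(events, int)
    survival = 1.0
    curve = []
    for t in np.unique(times[events == 1]):        # each distinct event time
        at_risk = (times >= t).sum()
        lost = ((times == t) & (events == 1)).sum()
        survival *= 1 - lost / at_risk
        curve.append((float(t), float(survival)))
    return curve

# Tiny example: losses at years 1 and 3, censoring at years 2, 4 and 5.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 0])
print([(t, round(s, 3)) for t, s in curve])   # → [(1.0, 0.8), (3.0, 0.533)]
```

Censored teeth (still in place at last follow-up) leave the risk set without counting as losses, which is exactly why Kaplan-Meier, rather than a simple proportion, is the right estimator for this kind of practice-based data.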
Batchelor, Frances A; Hill, Keith D; Mackintosh, Shylie F; Said, Catherine M; Whitehead, Craig H
2012-09-01
To determine whether a multifactorial falls prevention program reduces falls in people with stroke at risk of recurrent falls and whether this program leads to improvements in gait, balance, strength, and fall-related efficacy. A single blind, multicenter, randomized controlled trial with 12-month follow-up. Participants were recruited after discharge from rehabilitation and followed up in the community. Participants (N=156) were people with stroke at risk of recurrent falls being discharged home from rehabilitation. Tailored multifactorial falls prevention program and usual care (n=71) or control (usual care, n=85). Primary outcomes were rate of falls and proportion of fallers. Secondary outcomes included injurious falls, falls risk, participation, activity, leg strength, gait speed, balance, and falls efficacy. There was no significant difference in fall rate (intervention: 1.89 falls/person-year, control: 1.76 falls/person-year, incidence rate ratio=1.10, P=.74) or the proportion of fallers between the groups (risk ratio=.83, 95% confidence interval=.60-1.14). There was no significant difference in injurious fall rate (intervention: .74 injurious falls/person-year, control: .49 injurious falls/person-year, incidence rate ratio=1.57, P=.25), and there were no significant differences between groups on any other secondary outcome. This multifactorial falls prevention program was not effective in reducing falls in people with stroke who are at risk of falls nor was it more effective than usual care in improving gait, balance, and strength in people with stroke. Further research is required to identify effective interventions for this high-risk group. Copyright © 2012 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
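The primary outcome above is reported as an incidence rate ratio (IRR): falls per person-year in the intervention arm divided by falls per person-year in the control arm. A sketch in plain Python; the counts below are made up to roughly reproduce the reported rates, they are not taken from the trial:

```python
# Rate-of-falls comparison: falls per person-year and the incidence rate
# ratio (IRR) of intervention vs control.
def irr(falls_a, person_years_a, falls_b, person_years_b):
    rate_a = falls_a / person_years_a
    rate_b = falls_b / person_years_b
    return rate_a / rate_b

# Hypothetical counts chosen only to illustrate the calculation.
value = irr(falls_a=132, person_years_a=70, falls_b=149, person_years_b=85)
print(round(value, 2))   # → 1.08
```

An IRR near 1 with a non-significant p-value, as in the trial (IRR = 1.10, P = .74), means the observed rates are compatible with no intervention effect.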
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
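The combination of prior beliefs with a likelihood that defines the Bayesian school has a closed form in the simplest conjugate case, before any Monte Carlo machinery is needed: a Beta prior for a success probability updated by binomial data. A sketch in plain Python with illustrative numbers:

```python
# Conjugate Bayesian updating: a Beta(a, b) prior for a success probability,
# combined with the likelihood of k successes in n binomial trials,
# gives the posterior Beta(a + k, b + n - k).
def update_beta(a, b, k, n):
    return a + k, b + (n - k)

a, b = 1, 1                      # uniform prior: no initial preference
a, b = update_beta(a, b, k=7, n=10)
posterior_mean = a / (a + b)
print(a, b, round(posterior_mean, 3))   # → 8 4 0.667
```

Techniques such as RJMCMC and approximate Bayesian computation, covered in the later editions, exist precisely for the many models where no such closed-form posterior is available.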
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Counting statistics in radioactivity measurements
International Nuclear Information System (INIS)
Martin, J.
1975-01-01
The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, Poisson's distribution law and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; statistical techniques in gamma spectrometry. [fr]
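The two distributions named here are easy to exhibit numerically: radioactive counts are Poisson, so a measurement of N counts has standard deviation √N, and for large means the Poisson law is well approximated by the Laplace-Gauss (normal) law. A sketch assuming numpy, with an arbitrary mean rate:

```python
import numpy as np

# Poisson counting statistics: for mean rate λ the standard deviation is
# sqrt(λ), and for large λ the Poisson law approaches the Laplace-Gauss law.
rng = np.random.default_rng(4)
counts = rng.poisson(lam=400, size=100_000)

print(abs(counts.mean() - 400) < 1)           # mean ≈ λ
print(abs(counts.std() - np.sqrt(400)) < 1)   # σ ≈ sqrt(λ) = 20
# Fraction within ±1σ matches the Gaussian ≈ 68 % rule:
within = np.abs(counts - 400) <= 20
print(abs(within.mean() - 0.683) < 0.02)
```

This is why a single counting measurement carries its own uncertainty estimate: quoting N ± √N needs no repeated measurements.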
Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio
2017-04-01
Assessing impacts of potential future climate change scenarios in precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches are the historical climatic data (taken from the Spain02 project for the period 1971-2000 with a spatial resolution of 12.5 Km) and the future series provided by climatic models in the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different Regional climate models (RCM) nested to four different Global Climate Models (GCM)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathways (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution derived transformation and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose a non-equally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic
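Of the transformation techniques listed, quantile mapping with empirical quantiles is the most involved: model values are mapped onto the observed distribution through matched percentile tables. A sketch assuming numpy, with synthetic gamma-distributed "precipitation" and an invented model bias (the delta change alternative would instead add the model's future-minus-historical signal directly to the observations):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy series: observed historical precipitation and a biased climate model.
obs_hist = rng.gamma(shape=2.0, scale=10.0, size=3000)
mod_hist = obs_hist * 1.3 + 5.0                   # model runs too wet
mod_future = rng.gamma(2.0, 10.0, 3000) * 1.3 + 5.0 + 8.0   # + climate signal

def quantile_mapping(x, mod_ref, obs_ref, n_q=99):
    """Empirical quantile mapping: map model values onto observed quantiles."""
    q = np.linspace(1, 99, n_q)
    return np.interp(x, np.percentile(mod_ref, q), np.percentile(obs_ref, q))

corrected_future = quantile_mapping(mod_future, mod_hist, obs_hist)

# Mapping the historical model through its own quantile table should
# essentially reproduce the observed climatology (bias removed).
hist_corrected = quantile_mapping(mod_hist, mod_hist, obs_hist)
bias_after = hist_corrected.mean() - obs_hist.mean()
print(abs(bias_after) < 0.5)   # → True
```

Values beyond the outermost reference quantiles are clipped by `np.interp`, one of the practical choices that makes the different transformation techniques diverge in the tails.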
Goodman, J. W.
This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.
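The random phasor sums treated in the book lead directly to its speckle results: summing many unit phasors with independent uniform phases yields a circular Gaussian field, so the intensity follows a negative-exponential law and fully developed speckle has unit contrast. A simulation sketch assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(6)

# Random phasor sum: many unit phasors with independent uniform phases.
# The resultant field is circular Gaussian, so the intensity I = |A|^2
# follows a negative-exponential law, for which contrast (std/mean) = 1.
n_phasors, n_trials = 100, 20_000
phases = rng.uniform(0, 2 * np.pi, size=(n_trials, n_phasors))
field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_phasors)
intensity = np.abs(field) ** 2

contrast = intensity.std() / intensity.mean()
print(abs(contrast - 1.0) < 0.05)   # → True for fully developed speckle
```

Unit contrast is the statistical signature that distinguishes fully developed speckle from partially coherent or partially polarized cases, where the contrast drops below 1.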
Schwabl, Franz
2006-01-01
The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...
Jana, Madhusudan
2015-01-01
This self-sufficient text on statistical mechanics is written in a lucid manner with university examination systems in mind. The need to study the subject and its relation to thermodynamics is discussed in detail. Starting from the Liouville theorem, statistical mechanics is developed thoroughly. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose gas and Fermi gas are discussed thoroughly. Properties of liquid He-II and the corresponding models have been depicted. White dwarfs and condensed matter physics, transport phenomena - thermal and electrical conductivity, Hall effect, magnetoresistance, viscosity, diffusion, etc. - are discussed. Basic understanding of the Ising model is given to explain the phase transition. The book ends with a detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems ar...
Guénault, Tony
2007-01-01
In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...
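As a rough illustration of the three kinds of statistics the book covers, here is a minimal Python sketch (units and numerical values are arbitrary, not taken from the text) of the mean occupation number of a single-particle energy level under Bose-Einstein, Fermi-Dirac, and classical Maxwell-Boltzmann statistics:

```python
import math

def occupancy(eps, mu, kT, kind):
    """Mean occupation number <n> of a single-particle level at energy eps.

    kind: 'BE' (Bose-Einstein), 'FD' (Fermi-Dirac), or 'MB' (Maxwell-Boltzmann).
    Energies and kT are in the same arbitrary units.
    """
    x = (eps - mu) / kT
    if kind == 'FD':
        return 1.0 / (math.exp(x) + 1.0)
    if kind == 'BE':
        if x <= 0:
            raise ValueError("Bose-Einstein occupancy requires eps > mu")
        return 1.0 / (math.exp(x) - 1.0)
    return math.exp(-x)  # classical (Maxwell-Boltzmann) limit

# In the dilute, high-temperature limit (eps - mu >> kT) the three
# distributions become nearly indistinguishable:
for kind in ('BE', 'FD', 'MB'):
    print(kind, occupancy(eps=10.0, mu=0.0, kT=1.0, kind=kind))
```

In the opposite, degenerate limit the three diverge: at eps = mu the Fermi-Dirac occupancy is exactly 1/2, while the Bose-Einstein occupancy blows up.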
Mandl, Franz
1988-01-01
The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient
Rohatgi, Vijay K
2003-01-01
Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth
Levine-Wissing, Robin
2012-01-01
All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep
Horikawa, Yukio
2018-02-06
Maturity-onset diabetes of the young (MODY) is a form of diabetes classically characterized as having autosomal dominant inheritance, onset before the age of 25 years in at least one family member and partly preserved pancreatic β-cell function. The 14 responsible genes are reported to be MODY type 1~14, of which MODY 2 and 3 might be the most common forms. Although MODY is currently classified as diabetes of a single gene defect, it has become clear that mutations in rare MODYs, such as MODY 5 and MODY 6, have small mutagenic effects and low penetrance. In addition, as there are differences in the clinical phenotypes caused by the same mutation even in the same family, other phenotypic modifying factors are thought to exist; MODY could well have characteristics of type 2 diabetes mellitus, which is of multifactorial origin. Here, we outline the effects of genetic and environmental factors on the known phenotypes of MODY, focusing mainly on the examples of MODY 5 and 6, which have low penetrance, as suggestive models for elucidating the multifactorial origin of type 2 diabetes mellitus. © 2018 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
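A minimal Python sketch of the core diagnostic-test measures the review covers, computed from a 2x2 confusion table. The counts are hypothetical, chosen only for illustration:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Basic diagnostic-test metrics from a 2x2 table of counts:
    tp/fn = diseased patients testing positive/negative,
    fp/tn = disease-free patients testing positive/negative."""
    sens = tp / (tp + fn)          # P(test positive | disease present)
    spec = tn / (tn + fp)          # P(test negative | disease absent)
    acc = (tp + tn) / (tp + fp + fn + tn)
    lr_pos = sens / (1 - spec)     # positive likelihood ratio
    lr_neg = (1 - sens) / spec     # negative likelihood ratio
    return {"sensitivity": sens, "specificity": spec,
            "accuracy": acc, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical study: 100 diseased and 100 disease-free patients.
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
# sensitivity 0.9 and specificity 0.8, so LR+ is about 4.5
```

Likelihood ratios are the bridge to conditional probability discussed in the article: multiplying pre-test odds by LR+ gives the post-test odds of disease.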
Indian Academy of Sciences (India)
inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. … which captain gets an option to decide whether to field first or bat first … may of course not be fair, in the sense that the team which wins … describe two methods of drawing a random number between 0 …
Schrödinger, Erwin
1952-01-01
Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics: classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.
Official Statistics and Statistics Education: Bridging the Gap
Directory of Open Access Journals (Sweden)
Gal Iddo
2017-03-01
This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding official statistics.
Statistics techniques applied to electron probe microanalysis
International Nuclear Information System (INIS)
Brizuela, H.; Del Giorgio, M.; Budde, C.; Briozzo, C.; Riveros, J.
1987-01-01
A description of Montroll and West's general theory for a three-dimensional random walk of a particle with internal degrees of freedom is given, connecting this problem with the solution of the master equation. The possibility of its application to EPMA is discussed. Numerical solutions are given for thick or collimated beams at several energies interacting with samples of different shape and size. The spatial distribution of particles within the sample (for a stationary state) is analyzed, as well as the electron backscattering coefficient. (Author)
Statistically tuned Gaussian background subtraction technique for ...
Indian Academy of Sciences (India)
temporal median method and mixture-of-Gaussians model, and performance evaluation … to process the videos captured by an unmanned aerial vehicle (UAV). … The output is obtained by simulation using MATLAB 2010 on a standalone PC with …
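The record above concerns Gaussian background subtraction for UAV video. As a generic sketch of the underlying idea only (the paper's exact statistical tuning is not reproduced here; the learning rate `alpha` and threshold `k` below are illustrative assumptions), each pixel keeps a running Gaussian model and flags deviations beyond k standard deviations as foreground:

```python
import math

def gaussian_bg_step(pixel, mean, var, alpha=0.05, k=2.5):
    """One per-pixel update of a running-Gaussian background model.

    A pixel is foreground when it deviates from the background mean by
    more than k sigma; only background-like pixels update the model
    (exponential forgetting with rate alpha).
    Returns (is_foreground, new_mean, new_var)."""
    is_fg = abs(pixel - mean) > k * math.sqrt(var)
    if not is_fg:
        mean = (1 - alpha) * mean + alpha * pixel
        var = (1 - alpha) * var + alpha * (pixel - mean) ** 2
    return is_fg, mean, var

# A sudden intensity jump is flagged; ordinary sensor noise is absorbed:
fg, m2, v2 = gaussian_bg_step(pixel=120.0, mean=100.0, var=4.0)
bg, m3, v3 = gaussian_bg_step(pixel=101.0, mean=100.0, var=4.0)
```

Applied independently to every pixel of a frame, this yields the binary foreground mask that moving-object detectors then post-process.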
International Nuclear Information System (INIS)
Anon.
1994-01-01
For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer; and Excise taxes and turnover taxes included in consumer prices of some energy sources.
Statistical mechanics of superconductivity
Kita, Takafumi
2015-01-01
This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau’s Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research
Pivato, Marcus
2013-01-01
We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...
Directory of Open Access Journals (Sweden)
Ene Cristian Roata
2018-05-01
Introduction. Distal Hypoperfusion Ischemic Syndrome (DHIS) is a multifactorial debilitating condition causing peripheral ischemia and potentially tissue necrosis. In an effort to further refine its surgical treatment, we aim to describe a modified, simple and reliable technique for managing DHIS in patients with arteriovenous fistulas. Materials and Methods. Twenty-nine consecutive patients with DHIS operated on by a single surgical team over a period of 7 years were included in the study. All patients underwent the same surgical technique: stenotic ligature. Outcomes were analyzed clinically and the effectiveness of the procedure was tested using the McNemar test. Clinical variables were statistically analyzed in SPSS 17.0 for Windows. Results. The technique we used consists of placing a stenosing ligature on the vein, using a 0-silk suture, and adjusting the suture in order to achieve either a radial pulse or a capillary pulse, while maintaining a good thrill on palpation of the vein. The procedure was successful in 83% of patients, as shown by immediate symptomatic relief. Paired data analysis showed a significant decrease of all symptoms: cold extremity (p = 0.021), paraesthesia (p < 0.001), pain (p < 0.001). A history of coronary artery disease, arteriopathy or the absence of a radial pulse is statistically correlated with an increased risk of developing DHIS. Conclusions. Stenotic ligature is a simple, cheap and reliable technique for managing DHIS, with lower septic risks, which can be easily performed under local anesthesia.
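The McNemar test used in the study above compares paired before/after outcomes, basing its p-value only on the discordant pairs. A stdlib-only sketch of the exact (binomial) version, with hypothetical counts rather than the study's actual data:

```python
from math import comb

def mcnemar_exact_p(b, c):
    """Exact (binomial) McNemar test for paired before/after binary outcomes.

    b = pairs that changed in one direction (e.g. symptom present before
    surgery, absent after), c = pairs that changed in the other direction.
    Under H0 each discordant pair is a fair coin flip, so the two-sided
    p-value is twice the binomial tail probability (capped at 1)."""
    n, k = b + c, min(b, c)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Hypothetical: 12 of the patients lost a symptom after the procedure and
# none gained it; the p-value lands well below 0.001.
p = mcnemar_exact_p(b=12, c=0)
```

With such one-sided improvement (c = 0), the test reduces to asking how likely 12 coin flips all land the same way, which is why even small samples can give very small p-values.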
Hamilton, Jada G; Waters, Erika A
2018-02-01
People who believe that cancer has both genetic and behavioral risk factors have more accurate mental models of cancer causation and may be more likely to engage in cancer screening behaviors than people who do not hold such multifactorial causal beliefs. This research explored possible health cognitions and emotions that might produce such differences. Using nationally representative cross-sectional data from the US Health Information National Trends Survey (N = 2719), we examined whether endorsing a multifactorial model of cancer causation was associated with perceptions of risk and other cancer-related cognitions and affect. Data were analyzed using linear regression with jackknife variance estimation and procedures to account for the complex survey design and weightings. Bivariate and multivariable analyses indicated that people who endorsed multifactorial beliefs about cancer had higher absolute risk perceptions, lower pessimism about cancer prevention, and higher worry about harm from environmental toxins that could be ingested or that emanate from consumer products (Ps < .05). Multifactorial beliefs were also associated with higher feelings of risk, but multivariable analyses suggested that this effect was accounted for by the negative affect associated with reporting a family history of cancer. Multifactorial beliefs were not associated with believing that everything causes cancer or that there are too many cancer recommendations to follow (Ps > .05). Holding multifactorial causal beliefs about cancer is associated with a constellation of risk perceptions, health cognitions, and affect that may motivate cancer prevention and detection behavior. Copyright © 2017 John Wiley & Sons, Ltd.
Fairhall, Nicola; Sherrington, Catherine; Kurrle, Susan E; Lord, Stephen R; Lockwood, Keri; Howard, Kirsten; Hayes, Alison; Monaghan, Noeline; Langron, Colleen; Aggar, Christina; Cameron, Ian D
2015-01-01
To compare the costs and cost-effectiveness of a multifactorial interdisciplinary intervention versus usual care for older people who are frail. Cost-effectiveness study embedded within a randomized controlled trial. Community-based intervention in Sydney, Australia. A total of 241 community-dwelling people 70 years or older who met the Cardiovascular Health Study criteria for frailty. A 12-month multifactorial, interdisciplinary intervention targeting identified frailty characteristics versus usual care. Health and social service use, frailty, and health-related quality of life (EQ-5D) were measured over the 12-month intervention period. The difference between the mean cost per person for 12 months in the intervention and control groups (incremental cost) and the ratio between incremental cost and effectiveness were calculated. A total of 216 participants (90%) completed the study. The prevalence of frailty was 14.7% lower in the intervention group compared with the control group at 12 months (95% CI 2.4%-27.0%; P = .02). There was no significant between-group difference in EQ-5D utility scores. The cost for 1 extra person to transition out of frailty was $A15,955 (at 2011 prices). In the "very frail" subgroup (participants met >3 Cardiovascular Health Study frailty criteria), the intervention was both more effective and less costly than the control. A cost-effectiveness acceptability curve shows that the intervention would be cost-effective with 80% certainty if decision makers were willing to pay $A50,000 per extra person transitioning from frailty. In the very frail subpopulation, this reduced to $25,000. For frail older people residing in the community, a 12-month multifactorial intervention provided better value for money than usual care, particularly for the very frail, in whom it has a high probability of being cost saving, as well as effective. Trial registration: ACTRN12608000250336. Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care
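The trial's headline figure, the cost for one extra person to transition out of frailty, is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental effect. A sketch of that arithmetic with purely illustrative numbers, not the study's data:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (here, per extra person transitioning out of frailty)."""
    d_cost = cost_new - cost_old
    d_effect = effect_new - effect_old
    if d_effect <= 0:
        raise ValueError("new option is not more effective; ICER undefined")
    return d_cost / d_effect

# Hypothetical: the intervention costs $1,500 more per person and moves an
# extra 10 percentage points of participants out of frailty, so the cost per
# extra transition is on the order of $15,000.
example = icer(cost_new=10_000, cost_old=8_500,
               effect_new=0.25, effect_old=0.15)
```

Note the subgroup logic in the abstract: when the intervention is both cheaper and more effective (d_cost < 0, d_effect > 0), as reported for the very frail, it "dominates" and the ICER is no longer the relevant summary.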
FINANCIAL WELL BEING: A MULTIFACTORIAL ANALYSIS OF LUDOVICENSE’S BEHAVIOR
Directory of Open Access Journals (Sweden)
Anna Paula Carvalho Diniz
2015-12-01
The objective of this study is to analyze the influence of behavioral factors (materialism, financial behavior, financial attitude and financial knowledge) and socioeconomic and demographic variables (gender, age, marital status, children, level of schooling, race, occupation and income) on financial well-being. The study was conducted in the city of São Luís, in the state of Maranhão, where 629 questionnaires were collected. To analyze the data, descriptive statistics, factor analysis and multiple linear regression were used. Results indicate that people from Maranhão are not satisfied with their financial situation and present a low level of financial well-being, which is positively influenced by aspects related to investment in savings accounts, control of financial attitude, and age.
Directory of Open Access Journals (Sweden)
Chun-Jen Cheng
2010-06-01
Conclusion: Within the limitations of this study, these four design factors had different contributions to the fracture strength of repaired provisional restorations. Clinicians must be aware of the sequence of importance in determining better problem-solving methods.
Kolomiets, V. I.
2018-03-01
The combined influence of climatic factors (temperature, humidity) and the electrical operating mode (supply voltage) on the corrosion resistance of the metallization of integrated circuits has been considered. The regression dependence of the average time of trouble-free operation t on these factors has been established in the form of a modified Arrhenius equation that is adequate over a wide range of factor values and is suitable for selecting accelerated test modes. A technique for evaluating the corrosion resistance of the aluminum metallization of depressurized CMOS integrated circuits has been proposed.
International Nuclear Information System (INIS)
Anon.
1989-01-01
World data from the United Nations' latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production.
DEFF Research Database (Denmark)
Eliraqi, G M; Vistisen, D; Lauritzen, T
2015-01-01
Aim To investigate whether intensive multifactorial treatment can reverse the predisposed adverse phenotype of people with Type 2 diabetes who have a family history of diabetes. Methods Data from the randomized controlled trial ADDITION-Denmark were used. A total of 1441 newly diagnosed patients with diabetes (598 with a family history of diabetes) were randomized to intensive treatment or routine care. Family history of diabetes was defined as having one parent and/or sibling with diabetes. Linear mixed-effects models were used to assess the changes in risk factors (BMI, waist circumference, blood pressure, lipids and HbA1c) after 5 years of follow-up in participants with and without a family history of diabetes. An interaction term between family history of diabetes and treatment group was included in the models to test for a modifying effect of the intervention. All analyses were adjusted for age …
DEFF Research Database (Denmark)
Oellgaard, Jens; Gæde, Peter; Rossing, Peter
2018-01-01
AIMS/HYPOTHESIS: In type 2 diabetes mellitus, heart failure is a frequent, potentially fatal and often forgotten complication. Glucose-lowering agents and adjuvant therapies modify the risk of heart failure. We recently reported that 7.8 years of intensified compared with conventional … of hospitalisation for heart failure. METHODS: One hundred and sixty individuals were randomised to conventional or intensified multifactorial intervention, using sealed envelopes. The trial was conducted using the Prospective, Randomised, Open, Blinded Endpoints (PROBE) design. After 7.8 years, all individuals were offered intensified therapy and the study continued as an observational follow-up study for an additional 13.4 years. Heart-failure hospitalisations were adjudicated from patient records by an external expert committee blinded for treatment allocation. Event rates were compared using a Cox regression …
DEFF Research Database (Denmark)
Tao, L; Wilson, E C F; Wareham, N J
2015-01-01
Aims To examine the short- and long-term cost-effectiveness of intensive multifactorial treatment compared with routine care among people with screen-detected Type 2 diabetes. Methods Cost–utility analysis in ADDITION-UK, a cluster-randomized controlled trial of early intensive treatment in people … at 3.5%). Adjusted incremental QALYs were 0.0000, –0.0040, 0.0140 and 0.0465 over the same time horizons. Point estimate incremental cost-effectiveness ratios (ICERs) suggested that the intervention was not cost-effective, although the ratio improved over time: the ICER over 10 years was £82 250, falling to £37 500 over 30 years. The ICER fell below £30 000 only when the intervention cost was below £631 per patient: we estimated the cost at £981. Conclusion Given conventional thresholds of cost-effectiveness, the intensive treatment delivered in ADDITION was not cost-effective compared …
DEFF Research Database (Denmark)
Pedersen, Mette B.; Giraldi, Annamaria; Kristensen, Ellids
2015-01-01
OBJECTIVE: Sexual problems are common in people with diabetes. It is unknown whether early detection of diabetes and subsequent intensive multifactorial treatment (IT) are associated with sexual health. We report the prevalence of low sexual desire and low sexual satisfaction among people … of 968 patients with screen-detected type 2 diabetes. MAIN OUTCOME MEASURES: Low sexual desire and low sexual satisfaction. RESULTS: Mean (standard deviation, SD) age was 64.9 (6.9) years. The prevalence of low sexual desire was 53% (RC) and 54% (IT) among women, and 24% (RC) and 25% (IT) among men. The prevalence of low sexual satisfaction was 23% (RC) and 18% (IT) among women, and 27% (RC) and 37% (IT) among men. Among men, the prevalence of low sexual satisfaction was significantly higher in the IT group than in the RC group, p = 0.01. CONCLUSION: Low sexual desire and low satisfaction are frequent among …
Isaranuwatchai, Wanrudee; Perdrizet, Johnna; Markle-Reid, Maureen; Hoch, Jeffrey S
2017-09-01
Falls among older adults can cause serious morbidity and pose economic burdens on society. Older age is a known risk factor for falls, and age has been shown to influence the effectiveness of fall prevention programs. To our knowledge, no studies have explicitly investigated whether the cost-effectiveness of a multifactorial fall prevention intervention (the intervention) is influenced by age. This economic evaluation explores: 1) the cost-effectiveness of a multifactorial fall prevention intervention compared to usual care for community-dwelling adults ≥ 75 years at risk of falling in Canada; and 2) the influence of age on the cost-effectiveness of the intervention. Net benefit regression was used to examine the cost-effectiveness of the intervention with willingness-to-pay values ranging from $0-$50,000. Effects were measured as change in the number of falls, from baseline to 6-month follow-up. Costs were measured using a societal perspective. The cost-effectiveness analysis was conducted for both the total sample and by age subgroups (75-84 and 85+ years). For the total sample, the intervention was not economically attractive. However, the intervention was cost-effective at higher willingness-to-pay (WTP) values (≥ $25,000) for adults 75-84 years and at lower WTP values for adults 85+ years. The cost-effectiveness of the intervention depends on age and decision makers' WTP to prevent falls. Understanding the influence of age on the cost-effectiveness of an intervention may help to target resources to those who benefit most. Retrospectively registered. Clinicaltrials.gov identifier: NCT00463658 (18 April 2007).
A Statistical Primer: Understanding Descriptive and Inferential Statistics
Gillian Byrne
2007-01-01
As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...
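As a small worked example of one technique the article names, here is a Pearson chi-square statistic for a 2x2 contingency table, stdlib only and with hypothetical counts (say, two user groups crossed with use/non-use of a service):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction: the sum of
    (observed - expected)^2 / expected over the four cells."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# With 1 degree of freedom, a statistic above 3.84 is significant at the
# 5% level; these hypothetical counts clear that threshold.
stat = chi_square_2x2(30, 10, 15, 25)
```

The expected counts are what independence of rows and columns would predict; the statistic measures how far the observed table departs from that prediction.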
Initiating statistical maintenance optimization
International Nuclear Information System (INIS)
Doyle, E. Kevin; Tuomi, Vesa; Rowley, Ian
2007-01-01
Since the 1980s, maintenance optimization has been centered around various formulations of Reliability Centered Maintenance (RCM). Several such optimization techniques have been implemented at the Bruce Nuclear Station. Further cost refinement of the station's preventive maintenance strategy includes evaluation of statistical optimization techniques. A review of successful pilot efforts in this direction is provided, as well as initial work with graphical analysis. The present situation regarding data sourcing, the principal impediment to the use of stochastic methods in previous years, is discussed. The use of Crow/AMSAA (Army Materiel Systems Analysis Activity) plots is demonstrated from the point of view of justifying expenditures on optimization efforts. (author)
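A Crow/AMSAA plot tracks cumulative failures N(t) against cumulative operating time t on log-log axes; under the model N(t) = lambda * t^beta, the fitted slope beta indicates reliability growth (beta < 1) or deterioration (beta > 1). A least-squares sketch of that graphical estimate, with hypothetical failure times (in practice the maximum-likelihood estimator is often preferred over the regression slope):

```python
import math

def crow_amsaa_beta(failure_times):
    """Estimate the Crow/AMSAA growth parameter beta as the least-squares
    slope of log N(t) versus log t, i.e. the slope of the log-log plot.

    failure_times: cumulative operating times at each successive failure
    (strictly increasing, hypothetical values here)."""
    xs = [math.log(t) for t in failure_times]
    ys = [math.log(i + 1) for i in range(len(failure_times))]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

# Widening gaps between failures suggest improving reliability (beta < 1):
beta = crow_amsaa_beta([100, 260, 470, 900, 1600, 2900])
```

For justifying optimization expenditures, the point of the plot is that a sustained downward bend (beta dropping) after a maintenance program change is visible evidence that the program is paying off.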
Statistical analysis of management data
Gatignon, Hubert
2013-01-01
This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō’s formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives …
National Statistical Commission and Indian Official Statistics*
Indian Academy of Sciences (India)
IAS Admin
a good collection of official statistics of that time. With more … statistical agencies and institutions to provide details of statistical activities … ing several training programmes … ful completion of Indian Statistical Service examinations, the …
Statistical inference a short course
Panik, Michael J
2012-01-01
A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests of the assumptions of randomness and normality, and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal
Tellinghuisen, Joel
2008-01-01
The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares is presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., the Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
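A compact sketch of the closed-form linear least-squares fit with parameter standard errors, followed by a Monte Carlo check of the slope estimate in the spirit of the review (data and noise level invented for illustration):

```python
import math
import random

def ols_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x using the textbook
    closed-form expressions. Returns (a, b, se_a, se_b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    resid2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    s2 = resid2 / (n - 2)                       # residual variance estimate
    se_b = math.sqrt(s2 / sxx)
    se_a = math.sqrt(s2 * (1.0 / n + mx ** 2 / sxx))
    return a, b, se_a, se_b

# Monte Carlo check: simulate many noisy data sets from a known line and
# confirm that the slope estimates scatter around the true value.
random.seed(1)
xs = [float(i) for i in range(10)]
slopes = []
for _ in range(2000):
    ys = [2.0 + 0.5 * x + random.gauss(0.0, 0.3) for x in xs]
    slopes.append(ols_line(xs, ys)[1])
mean_b = sum(slopes) / len(slopes)   # should sit close to the true slope 0.5
```

The empirical spread of `slopes` can likewise be compared with the analytic `se_b`, which is exactly the kind of Monte Carlo validation of sampling distributions the review describes.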
Intuitive introductory statistics
Wolfe, Douglas A
2017-01-01
This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking feature prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...
Directory of Open Access Journals (Sweden)
Danciu Victor
2009-05-01
Full Text Available The foreign market entry is a strategic choice for any company. Firms take a practical approach to selecting and using the most appropriate market entry strategies and modes. This paper aims at showing how the multi-factorial grid can check up t
Monbaliu, Diethard; Vekemans, Katrien; Hoekstra, Harm; Vaahtera, Lauri; Libbrecht, Louis; Derveaux, Katelijne; Parkkinen, Jaakko; Liu, Qiang; Heedfeld, Veerle; Wylin, Tine; Deckx, Hugo; Zeegers, Marcel; Balligand, Erika; Buurman, Wim; van Pelt, Jos; Porte, Robert J.; Pirenne, Jacques
Objective: To design a multifactorial biological modulation approach targeting ischemia reperfusion injury to augment viability of porcine liver grafts from non-heart-beating donors (NHBD). Background Data: Liver Transplantation (LTx) from NHBD is associated with an increased risk of primary
Suijker, Jacqueline J.; MacNeil-Vroomen, Janet L.; van Rijn, Marjon; Buurman, Bianca M.; de Rooij, Sophia E.; Moll van Charante, Eric P.; Bosmans, Judith E.
2017-01-01
Objective To evaluate the cost-effectiveness of nurse-led multifactorial care to prevent or postpone new disabilities in community-living older people in comparison with usual care. Methods We conducted cost-effectiveness and cost-utility analyses alongside a cluster randomized trial with one-year
Directory of Open Access Journals (Sweden)
Ferrer A
2014-02-01
Full Text Available Assumpta Ferrer,1 Francesc Formiga,2,3 Héctor Sanz,4 Oscar J de Vries,5 Teresa Badia,6 Ramón Pujol2,3 On behalf of the OCTABAIX Study Group 1Primary Healthcare Centre "El Plà" CAP-I, Sant Feliu de Llobregat, 2Geriatric Unit, Internal Medicine Service, Hospital Universitari de Bellvitge, 3Bellvitge Biomedical Research Institute, IDIBELL, L'Hospitalet de Llobregat, 4Support Research Unit, Primary Health Department Costa Ponent, IDIAP Jordi Gol, Barcelona, Spain; 5Department of Internal Medicine, VU University Medical Center, Amsterdam, the Netherlands; 6Primary Healthcare Centre Martorell, Barcelona, Spain Background: The purpose of this study was to assess the effectiveness of a multifactorial intervention to reduce falls among the oldest-old people, including individuals with cognitive impairment or comorbidities. Methods: A randomized, single-blind, parallel-group clinical trial was conducted from January 2009 to December 2010 in seven primary health care centers in Baix Llobregat (Barcelona). Of 696 referred people who were born in 1924, 328 were randomized to an intervention group or a control group. The intervention model used an algorithm and was multifaceted for both patients and their primary care providers. Primary outcomes were risk of falling and time until falls. Data analyses were by intention-to-treat. Results: Sixty-five (39.6%) subjects in the intervention group and 48 (29.3%) in the control group fell during follow-up. The difference in the risk of falls was not significant (relative risk 1.28, 95% confidence interval [CI] 0.94–1.75). Cox regression models with time from randomization to the first fall were not significant. Cox models for recurrent falls showed that the intervention had a negative effect (hazard ratio [HR] 1.46, 95% CI 1.03–2.09) and that functional impairment (HR 1.42, 95% CI 0.97–2.12), previous falls (HR 1.09, 95% CI 0.74–1.60), and cognitive impairment (HR 1.08, 95% CI 0.72–1.60) had no effect on the
Directory of Open Access Journals (Sweden)
Adrion Christine
2012-09-01
Full Text Available Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results: The instruments under study
Adrion, Christine; Mansmann, Ulrich
2012-09-10
A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions
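A first screen for the over-dispersion that motivates the abstract's comparison of count-data models is the sample dispersion index (variance over mean): near 1 is Poisson-like, well above 1 suggests an over-dispersed alternative such as the negative binomial. The counts below are invented, not the trial's vertigo data:

```python
# Dispersion index of count data: roughly 1 for Poisson-like counts,
# substantially > 1 indicates over-dispersion.
def dispersion_index(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
    return var / mean

# Hypothetical daily attack counts showing clear over-dispersion:
counts = [0, 2, 1, 7, 0, 3, 12, 1, 0, 5]
d = dispersion_index(counts)  # well above 1 for these values
```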
Engaging with the Art & Science of Statistics
Peters, Susan A.
2010-01-01
How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…
Bomer, Ilanit; Saure, Carola; Caminiti, Carolina; Ramos, Javier Gonzales; Zuccaro, Graciela; Brea, Mercedes; Bravo, Mónica; Maza, Carmen
2015-11-01
Craniopharyngioma is a histologically benign brain malformation arising in a region with a fundamental role in satiety modulation, causing obesity in up to 52% of patients. To evaluate cardiovascular risk factors, body composition, resting energy expenditure (REE), and energy intake in craniopharyngioma patients and to compare the data with those from children with multifactorial obesity. All obese children and adolescents who underwent craniopharyngioma resection, and a control group of children with multifactorial obesity, in follow-up between May 2012 and April 2013 were included. Anthropometric measurements, bioelectrical impedance, indirect calorimetry, energy intake, homeostatic model assessment of insulin resistance (HOMA-IR), and dyslipidemia were evaluated. Twenty-three patients with craniopharyngioma and 43 controls were included. Children with craniopharyngioma-related obesity had a lower fat-free mass percentage (62.4 vs. 67.5; p=0.01) and a higher fat mass percentage (37.5 vs. 32.5; p=0.01) compared to those with multifactorial obesity. A positive association was found between %REE and %fat-free mass in subjects with multifactorial obesity (68±1% in normal REE vs. 62.6±1% in low REE; p=0.04), but not in craniopharyngioma patients (62±2.7 in normal REE vs. 61.2±1.8% in low REE; p=0.8). No differences were found in metabolic involvement or energy intake. REE was lower in craniopharyngioma patients compared to children with multifactorial obesity regardless of the amount of fat-free mass, suggesting that other factors may be responsible for the lower REE.
Multifactorial analysis of risk factors for low birth weight in Salvador, Bahia
Directory of Open Access Journals (Sweden)
Solla Jorge José Santos Pereira
1997-01-01
Full Text Available This study is a multifactorial analysis of risk factors for low birth weight in a group of newborns in an urban area of Brazil. A total of 1,023 live births, delivered in four maternity hospitals in Salvador, Bahia, between July 1987 and February 1988, were included in the study. The sources of information were the clinical records and interviews with the mothers at the maternity hospital. The analysis was performed by logistic regression. The risk factors included in the final model were: maternal age under 21 or over 35 years; gestational age under 38 weeks; unfavorable outcome of the previous pregnancy; previous interpregnancy interval of 12 months or less; smoking; and hypertension. Population attributable risk values are presented for the risk factors included in the final model. These factors should be used to detect pregnant women at high risk of delivering a low-birth-weight infant, who should receive closer prenatal care.
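The population attributable risk reported for the final model can be computed from exposure prevalence and relative risk with Levin's formula; the numbers below are invented for illustration, not the study's estimates.

```python
# Levin's population attributable risk fraction:
# PAR = p*(RR - 1) / (1 + p*(RR - 1)), where p is the prevalence of
# the risk factor in the population and RR its relative risk.
def attributable_risk(p, rr):
    excess = p * (rr - 1.0)
    return excess / (1.0 + excess)

# E.g. a hypothetical smoking prevalence of 20% with RR = 2
# for low birth weight:
par = attributable_risk(0.20, 2.0)  # about 0.167
```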
Directory of Open Access Journals (Sweden)
Sepideh Parsi
2015-01-01
Full Text Available Alzheimer's disease (AD is a multifactorial, fatal neurodegenerative disorder characterized by the abnormal accumulation of Aβ and Tau deposits in the brain. There is no cure for AD, and failure at different clinical trials emphasizes the need for new treatments. In recent years, significant progress has been made toward the development of miRNA-based therapeutics for human disorders. This study was designed to evaluate the efficiency and potential safety of miRNA replacement therapy in AD, using miR-15/107 paralogues as candidate drug targets. We identified miR-16 as a potent inhibitor of amyloid precursor protein (APP and BACE1 expression, Aβ peptide production, and Tau phosphorylation in cells. Brain delivery of miR-16 mimics in mice resulted in a reduction of AD-related genes APP, BACE1, and Tau in a region-dependent manner. We further identified Nicastrin, a γ-secretase component involved in Aβ generation, as a target of miR-16. Proteomics analysis identified a number of additional putative miR-16 targets in vivo, including α-Synuclein and Transferrin receptor 1. Top-ranking biological networks associated with miR-16 delivery included AD and oxidative stress. Collectively, our data suggest that miR-16 is a good candidate for future drug development by targeting simultaneously endogenous regulators of AD biomarkers (i.e., Aβ and Tau, inflammation, and oxidative stress.
Directory of Open Access Journals (Sweden)
Qiang Lu
Full Text Available CAHs, widely used as cleaning solvents, have contaminated shallow groundwater as manufacturing developed in China's Yangtze River Delta. This study focused on the distribution of CAHs, and on the correlations between CAHs and environmental variables, in shallow groundwater in Shanghai, using kriging interpolation and multifactorial analysis. The results showed that the overall CAHs plume area (above DIV) was approximately 9,000 m² and located 2-4 m underground; DNAPL had accumulated over an area of approximately 1,400 m² in the 6-8 m sandy silt layer on top of the muddy silty clay. A heatmap of PPC for CAHs and environmental variables showed that the correlations between "Fe(2+)" and most CAHs, such as "1,1,1-TCA", "1,1-DCA", "1,1-DCE" and "%TCA", were significantly positive (p<0.001), whereas those with "%CA" and/or "%VC" were not, and "Cl-" was significantly positively correlated with "1,1-DCA" and "1,1-DCE" (p<0.001). The PCA demonstrated that the relative proportions of CAHs in groundwater were mostly controlled by the sources and by natural attenuation. In conclusion, the combination of geographical and chemometric methods was helpful in establishing an aerial perspective of CAHs and in identifying reasons for the accumulation of toxic dechlorination intermediates, and could become a useful tool for characterizing contaminated sites in general.
A statistical manual for chemists
Bauer, Edward
1971-01-01
A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect
Statistical modeling for degradation data
Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru
2017-01-01
This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.
Statistical methods for ranking data
Alvo, Mayer
2014-01-01
This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.
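For complete, untied rankings, the distance-based definition of rank correlation mentioned above reduces to Spearman's classical formula; the sketch below is a generic illustration, not code from the book.

```python
# Spearman's rho from the squared distance between two rankings:
# rho = 1 - 6 * sum(d_i**2) / (n * (n**2 - 1)), valid for untied ranks.
def spearman_rho(rank_a, rank_b):
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

judge1 = [1, 2, 3, 4, 5]  # one judge's ranking of five items
judge2 = [2, 1, 4, 3, 5]  # a second, mostly agreeing ranking
rho = spearman_rho(judge1, judge2)  # 0.8
```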
Computational statistics handbook with Matlab
Martinez, Wendy L
2007-01-01
Contents: Prefaces. Introduction: What Is Computational Statistics?; An Overview of the Book. Probability Concepts: Introduction; Probability; Conditional Probability and Independence; Expectation; Common Distributions. Sampling Concepts: Introduction; Sampling Terminology and Concepts; Sampling Distributions; Parameter Estimation; Empirical Distribution Function. Generating Random Variables: Introduction; General Techniques for Generating Random Variables; Generating Continuous Random Variables; Generating Discrete Random Variables. Exploratory Data Analysis: Introduction; Exploring Univariate Data; Exploring Bivariate and Trivariate Data; Exploring Multidimensional Data. Finding Structure: Introduction; Projecting Data; Principal Component Analysis; Projection Pursuit EDA; Independent Component Analysis; Grand Tour; Nonlinear Dimensionality Reduction. Monte Carlo Methods for Inferential Statistics: Introduction; Classical Inferential Statistics; Monte Carlo Methods for Inferential Statist...
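The chapter on generating random variables in the contents above rests on standard techniques such as inverse-transform sampling; here is a textbook-style sketch for the exponential distribution (illustrative, not code from the handbook):

```python
import math
import random

# Inverse-transform sampling: if U ~ Uniform(0, 1), then
# X = -ln(1 - U) / lam follows an Exponential(lam) distribution.
def exponential_sample(lam, rng):
    u = rng.random()
    return -math.log(1.0 - u) / lam

rng = random.Random(0)  # fixed seed for reproducibility
draws = [exponential_sample(2.0, rng) for _ in range(100_000)]
mean = sum(draws) / len(draws)  # close to 1/lam = 0.5
```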
Finkelstein, Michael O
2015-01-01
This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...
International Nuclear Information System (INIS)
Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.
1978-01-01
The report describes the statistical analysis of the DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
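The Monte Carlo propagation of input uncertainties through a response model, as described above, can be sketched generically; the toy response function and sigma values below are illustrative stand-ins, not the LYNX model.

```python
import math
import random

# Propagate input uncertainties through a response function f by
# sampling the inputs and measuring the spread of the output.
def propagate(f, means, sigmas, n, rng):
    outs = []
    for _ in range(n):
        sample = [rng.gauss(m, s) for m, s in zip(means, sigmas)]
        outs.append(f(*sample))
    mu = sum(outs) / n
    var = sum((o - mu) ** 2 for o in outs) / (n - 1)
    return mu, math.sqrt(var)

# Toy response x + y, whose exact propagated sigma is
# sqrt(0.3**2 + 0.4**2) = 0.5 -- an analytic check on the estimate.
rng = random.Random(42)
mu, sigma = propagate(lambda x, y: x + y, [1.0, 2.0], [0.3, 0.4], 200_000, rng)
```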
Multifactorial Causes of Suicide
Directory of Open Access Journals (Sweden)
Antemir Cristina-Laura
2017-12-01
Full Text Available The science of psychology is well placed to help us understand why some people attempt to take their own lives and others do not. Understanding the psychological processes underlying suicidal ideation and the decision to act on suicidal thoughts is particularly important, especially since interventions should target suicidal ideation when it first appears, before it becomes a suicide attempt. Factors associated with suicidal risk can be classified into four groups: personality and individual differences, cognitive factors, social factors, and negative life events. Each of these factors can contribute to the emergence of suicide risk independently or together with other factors. Some of them are associated with the emergence of suicidal ideation, while others increase the likelihood that these thoughts will be acted upon.
Introductory statistics for engineering experimentation
Nelson, Peter R; Coffin, Marie
2003-01-01
The Accreditation Board for Engineering and Technology (ABET) introduced a criterion starting with their 1992-1993 site visits that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy the needs of a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...
On quantum statistical inference
DEFF Research Database (Denmark)
Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.
Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...
Nixon, Ralph A
2017-07-01
Abnormalities of the endosomal-lysosomal network (ELN) are a signature feature of Alzheimer's disease (AD). These include the earliest known cytopathology that is specific to AD and that affects endosomes and induces the progressive failure of lysosomes, each of which is directly linked by distinct mechanisms to neurodegeneration. The origins of ELN dysfunction and β-amyloidogenesis closely overlap, which reflects their common genetic basis, the established early involvement of endosomes and lysosomes in amyloid precursor protein (APP) processing and clearance, and the pathologic effect of certain APP metabolites on ELN functions. Genes that promote β-amyloidogenesis in AD (APP, PSEN1/2, and APOE4) have primary effects on ELN function. The importance of primary ELN dysfunction to pathogenesis is underscored by the mutations in more than 35 ELN-related genes that, thus far, are known to cause familial neurodegenerative diseases even though different pathogenic proteins may be involved. In this article, I discuss growing evidence that implicates AD gene-driven ELN disruptions as not only the antecedent pathobiology that underlies β-amyloidogenesis but also as the essential partner with APP and its metabolites that drive the development of AD, including tauopathy, synaptic dysfunction, and neurodegeneration. The striking amelioration of diverse deficits in animal AD models by remediating ELN dysfunction further supports a need to integrate APP and ELN relationships, including the role of amyloid-β, into a broader conceptual framework of how AD arises, progresses, and may be effectively therapeutically targeted. Nixon, R. A. Amyloid precursor protein and endosomal-lysosomal dysfunction in Alzheimer's disease: inseparable partners in a multifactorial disease. © FASEB.
Kuete, Victor; Donfack, Arno R Nanfack; Mbaveng, Armelle T; Zeino, Maen; Tane, Pierre; Efferth, Thomas
2015-08-01
Multidrug resistance in cancer represents a major problem in chemotherapy. The present study was designed to assess the cytotoxicity of anthraquinones from Pentas schimperi, namely damnacanthal (1), damnacanthol (2), 3-hydroxy-2-hydroxymethyl anthraquinone (3) and schimperiquinone B (4), against nine drug-sensitive and multidrug-resistant (MDR) cancer cell lines. The resazurin reduction assay was used to evaluate the cytotoxicity of the above compounds, whilst the caspase-Glo assay was used to detect the activation of caspase enzymes by compounds 1 and 2. Cell cycle, mitochondrial membrane potential (MMP) and levels of reactive oxygen species (ROS) were all analyzed via flow cytometry. Anthraquinones 1 and 2 displayed cytotoxic effects with IC50 values below 81 μM against all nine tested cancer cell lines, whilst 3 and 4 displayed selective activities. The recorded IC50 values ranged from 3.12 μM and 12.18 μM (towards leukemia CCRF-CEM cells) to 30.32 μM and 80.11 μM (towards glioblastoma U87MG.ΔEGFR cells) for compounds 1 and 2, respectively, and from 0.20 μM (against CCRF-CEM cells) to 195.12 μM (against CEM/ADR5000 cells) for doxorubicin. Compounds 1 and 2 induced apoptosis in CCRF-CEM leukemia cells, mediated by disruption of the MMP and an increase in ROS production. Anthraquinones from Pentas schimperi, mostly 1 and 2, are potential cytotoxic natural products that deserve further investigation toward developing novel antineoplastic drugs against multifactorial drug-resistant cancers.
Lundgren, Lina E; Tran, Tai T; Nimphius, Sophia; Raymond, Ellen; Secomb, Josh L; Farley, Oliver R L; Newton, Robert U; Steele, Julie R; Sheppard, Jeremy M
2015-11-01
To develop and evaluate a multifactorial model based on landing performance to estimate injury risk for surfing athletes. Five measures were collected from 78 competitive surfing athletes and used to create a model to serve as a screening tool for landing tasks and potential injury risk. In the second part of the study, the model was evaluated using junior surfing athletes (n = 32) with a longitudinal follow-up of their injuries over 26 wk. Two models were compared based on the collected data, and magnitude-based inferences were applied to determine the likelihood of differences between injured and noninjured groups. The study resulted in a model based on 5 measures (ankle-dorsiflexion range of motion, isometric midthigh-pull lower-body strength, time to stabilization during a drop-and-stick [DS] landing, relative peak force during a DS landing, and frontal-plane DS-landing video analysis) for male and female professional surfers and male and female junior surfers. Evaluation of the model showed that a scaled probability score was more likely to detect injuries in junior surfing athletes and reported a correlation of r = .66, P = .001, with a model of equal variable importance. The injured (n = 7) surfers had a lower probability score (0.18 ± 0.16) than the noninjured group (n = 25, 0.36 ± 0.15), with 98% likelihood, Cohen d = 1.04. The proposed model seems sensitive and easy to implement and interpret. Further research is recommended to show full validity for potential adaptations to other sports.
DEFF Research Database (Denmark)
Vind, Ane B; Andersen, Hanne E; Pedersen, Kirsten D
2009-01-01
OBJECTIVES: To address the external validity of a trial of multifactorial fall prevention through an analysis of differences between participants and nonparticipants regarding socioeconomic and morbidity variables. DESIGN: Analysis of nonresponse in a randomized clinical trial. SETTING: Geriatric outpatient department. PARTICIPANTS: One thousand one hundred five community-dwelling adults aged 65 and older who had sustained at least one injurious fall. MEASUREMENTS: Marital status, housing tenure, income, comorbidity, hospitalization, fractures, and drug use before invitation to participate... Nonparticipants of a trial of multifactorial fall prevention differed significantly from participants in terms of socioeconomic and morbidity variables and were more likely to be hospitalized or die during 6 months of follow-up. Because of the differences between the two populations, it is questionable whether...
Petridou, Eleni Th; Manti, Eirini G; Ntinapogias, Athanasios G; Negri, Eva; Szczerbinska, Katarzyna
2009-08-01
To compare and quantify the effectiveness of multifactorial versus exercise-alone interventions in reducing recurrent falls among community-dwelling older people. A meta-analysis of recently published studies on fall prevention interventions was conducted. Measure of the overall effectiveness was the combined risk ratio for recurrent falls, whereas heterogeneity was explored via metaregression analyses. Ten of the 52 identified studies met the preset criteria and were included in the analysis. The exercise-alone interventions were about 5 times more effective compared to multifactorial ones. Short-term interventions, smaller samples, and younger age related to better outcomes. From cost-efficiency and public health perspectives, exercise-alone interventions can be considered valuable, as they are more likely to be implemented in countries with less resources. Further qualitative research is needed, however, to explore determinants of willingness to participate and comply with interventions aiming to prevent recurrent falls among older people.
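The combined risk ratio used above as the overall effectiveness measure is typically obtained by inverse-variance weighting on the log scale; the per-study values in the sketch are invented for illustration, not the meta-analysis data.

```python
import math

# Fixed-effect (inverse-variance) pooling of risk ratios: weight each
# log risk ratio by 1/SE**2, then exponentiate the weighted mean.
def pooled_risk_ratio(rrs, ses):
    weights = [1.0 / (se * se) for se in ses]
    log_pooled = sum(w * math.log(r) for w, r in zip(weights, rrs)) / sum(weights)
    return math.exp(log_pooled)

# Three hypothetical studies of recurrent-fall interventions
# (risk ratios with standard errors on the log scale):
rr = pooled_risk_ratio([0.80, 0.95, 0.70], [0.10, 0.20, 0.15])
```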
Lies, damn lies and statistics
International Nuclear Information System (INIS)
Jones, M.D.
2001-01-01
Statistics are widely employed within archaeological research, increasingly so as user-friendly statistical packages make sophisticated analyses available to non-statisticians. However, all statistical techniques rest on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of those assumptions, highly erroneous inferences may be drawn. This does happen within archaeology, and it is illustrated here with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab
Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...
Directory of Open Access Journals (Sweden)
Mohammad Ali Cheraghi
2017-05-01
Full Text Available Delirium is the most common problem in patients in intensive care units, and its prevention is more important than its treatment. The aim of this study is to determine the effect of a NICE-adjusted multifactorial intervention on the prevention of delirium in open heart surgery patients. Methods: This quasi-experimental study enrolled 88 patients (44 in each group) undergoing open heart surgery in the intensive care unit of Imam Khomeini Hospital, Tehran. In the control group, subjects received usual care and only the incidence of delirium was recorded: patients were followed from the second to the fifth postoperative day, twice a day, by the researcher using the CAM-ICU questionnaire. After sampling in this group was complete, the incidence of delirium in the intervention group was assessed in the same manner, except that multifactorial interventions based on the NICE guidelines, as modified by the researcher, were implemented from the second to the fifth day and followed up on each shift. The CPOT and Pittsburgh Sleep assessment tools were also used to assess pain and sleep quality in the intervention group. Data analysis was done using the SPSS software, version 16; t-tests, chi-square tests, and Fisher's exact tests were carried out. Results: The incidence of delirium was 42.5% in the control group and 22.5% in the intervention group, showing that the multifactorial intervention based on the adjusted NICE guidelines significantly reduced the incidence of delirium in hospitalized open heart surgery patients. Conclusion: The NICE-adjusted multifactorial intervention significantly reduced the incidence of delirium in cardiac surgery patients, so its use as a comprehensive and reliable alternative for preventing delirium in patients hospitalized in cardiac surgery wards is recommended.
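The abstract compares delirium incidence between two groups with a chi-square test. As a hedged sketch (a generic Pearson chi-square for a 2x2 table without continuity correction, not the authors' SPSS procedure), the statistic can be computed as:

```python
# Pearson chi-square for a 2x2 contingency table (no continuity correction).

def chi_square_2x2(a, b, c, d):
    """Table rows: (a, b) = group 1 (event, no event),
                   (c, d) = group 2 (event, no event)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n              # expected count under independence
        stat += (obs - exp) ** 2 / exp
    return stat
```

With 1 degree of freedom, a statistic above 3.84 exceeds the 5% critical value of the chi-square distribution.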
Renfro, Mindy Oxman; Fehrer, Steven
2011-01-01
Unintentional falls are a growing public health problem as the incidence of falls rises and the population ages. The Centers for Disease Control and Prevention reports that 1 in 3 adults aged 65 years and older will experience a fall this year; 20% to 30% of those who fall will sustain a moderate to severe injury. Physical therapists caring for older adults are usually engaged with these patients after the first injurious fall and may have little opportunity to abate fall risk before the injuries occur. This article describes the content selection and development of a simple-to-administer, multifactorial, Fall Risk Assessment & Screening Tool (FRAST), designed specifically for use in primary care settings to identify those older adults with high fall risk. Fall Risk Assessment & Screening Tool incorporates previously validated measures within a new multifactorial tool and includes targeted recommendations for intervention. Development of the multifactorial FRAST used a 5-part process: identification of significant fall risk factors, review of best evidence, selection of items, creation of the scoring grid, and development of a recommended action plan. Fall Risk Assessment & Screening Tool has been developed to assess fall risk in the target population of older adults (older than 65 years) living and ambulating independently in the community. Many fall risk factors were considered and 15 items selected for inclusion. Fall Risk Assessment & Screening Tool includes 4 previously validated measures to assess balance, depression, falls efficacy, and home safety. Reliability and validity studies of FRAST are under way. Fall risk for community-dwelling older adults is an urgent, multifactorial, public health problem. Providing primary care practitioners (PCPs) with a very simple screening tool is imperative. Fall Risk Assessment & Screening Tool was created to allow for safe, quick, and low-cost administration by minimally trained office staff with interpretation and
Directory of Open Access Journals (Sweden)
Suijker Jacqueline J
2012-04-01
Full Text Available Abstract Background Functional decline in community-dwelling older persons is associated with the loss of independence, the need for hospital and nursing-home care and premature death. The effectiveness of multifactorial interventions in preventing functional decline remains controversial. The aim of this study is to investigate whether functional decline in community-dwelling older persons can be delayed or prevented by a comprehensive geriatric assessment, multifactorial interventions and nurse-led care coordination. Methods/Design In a cluster randomized controlled trial, with the general practice as the unit of randomization, 1281 participants from 25 general practices will be enrolled in each condition to compare the intervention with usual care. The intervention will focus on older persons who are at increased risk for functional decline, identified by an Identification of Seniors at Risk Primary Care (ISAR-PC) score (≥ 2). These older persons will receive a comprehensive geriatric assessment, an individually tailored care and treatment plan, consisting of multifactorial, evidence-based interventions and subsequent nurse-led care coordination. The control group will receive 'care as usual' by the general practitioner (GP). The main outcome after 12 months is the level of physical functioning on the modified Katz-15 index score. The secondary outcomes are health-related quality of life, psychological and social functioning, healthcare utilization and institutionalization. Furthermore, a process evaluation and cost-effectiveness analysis will be performed. Discussion This study will provide new knowledge regarding the effectiveness and feasibility of a comprehensive geriatric assessment, multifactorial interventions and nurse-led elderly care in general practice. Trial registration NTR2653. Grant Unrestricted grant 'The Netherlands Organisation for Health Research and Development' no. 313020201
2014-01-01
Background In line with a rapidly ageing global population, the rise in the frequency of falls will lead to increased healthcare and social care costs. This study will be one of the few randomized controlled trials evaluating a multifaceted falls intervention in a low-middle income, culturally-diverse older Asian community. The primary objective of our paper is to evaluate whether individually tailored multifactorial interventions will successfully reduce the number of falls among older adults. Methods Three hundred community-dwelling older Malaysian adults with a history of (i) two or more falls, or (ii) one injurious fall in the past 12 months will be recruited. Baseline assessment will include cardiovascular, frailty, fracture risk, psychological factors, gait and balance, activities of daily living and visual assessments. Fallers will be randomized into 2 groups: to receive tailored multifactorial interventions (intervention group); or given lifestyle advice with continued conventional care (control group). Multifactorial interventions will target 6 specific risk factors. All participants will be re-assessed after 12 months. The primary outcome measure will be fall recurrence, measured with monthly falls diaries. Secondary outcomes include falls risk factors; and psychological measures including fear of falling, and quality of life. Discussion Previous studies evaluating multifactorial interventions in falls have reported variable outcomes. Given likely cultural, personal, lifestyle and health service differences in Asian countries, it is vital that individually-tailored multifaceted interventions are evaluated in an Asian population to determine applicability of these interventions in our setting. If successful, these approaches have the potential for widespread application in geriatric healthcare services, will reduce the projected escalation of falls and fall-related injuries, and improve the quality of life of our older community. Trial registration
M Tontodonati; F Sozio; F Vadini; E Polilli; T Ursini; G Calella; P Di Stefano; E Mazzotta; A Costantini; C D'Amario; G Parruti
2012-01-01
Purpose of the study: There is an increasing need to consider the costs of antiretrovirals (ARVs) for HIV patients. A simple and comprehensive tool weighing comorbidities and ARV-related toxicities could be useful for judging the appropriateness of using more expensive drugs. We conceived a MultiFactorial Risk Score (MFRS) to evaluate the appropriateness of ARV prescription relative to cost. Methods: HIV patients were consecutively enrolled in 2010-2011. We considered socio-demographic character...
Directory of Open Access Journals (Sweden)
Leonardo Eladio Vergara Guillén
2011-08-01
Full Text Available In this research, the thickness of the oxide layer and the microhardness of anodized Al3003 and Al6063 aluminum are modeled from the process parameters. To this end, studies of the microhardness and layer thickness of the anodized surface were carried out, using multifactorial analysis and robust design techniques. The following levels of the process parameters were established: temperature [15 °C, 25 °C], time [30 min, 60 min], electrolyte concentration [1.2 M, 2 M], current density [1 A/dm², 3 A/dm²], aluminum [Al3003, Al6063] and, as a noise variable, plastic deformation [0%, 10%, 20%, 30%]. A mixed fractional 2^(7-2) design was proposed, with which a total of 48 tests were carried out using sulfuric acid electrolytic solutions. Microhardness was measured with a Vickers indenter under a 400 g load; the thickness of the oxide layer was captured by electron microscopy. An analysis of variance (ANOVA) was performed on the results to determine the significant factors and the robustness of the effects. Microhardness results in the range [85.74, 308.87] HV and oxide thicknesses in the range [12.82, 94.69] µm were found. Finally, prediction models for each response are presented as functions of the significant factors; these equations will allow selecting the microhardness and oxide layer thickness to meet the requirements of a particular product through an appropriate selection of the process parameters.
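A toy sketch of the kind of analysis that precedes the ANOVA described above (synthetic data, not the paper's measurements; only three illustrative factors rather than the full mixed 2^(7-2) design): in a two-level factorial design, a factor's main effect is the mean response at its high level minus the mean response at its low level.

```python
# Main-effect estimation in a two-level full factorial design.
from itertools import product

# Full 2^3 design over three illustrative coded factors (-1 / +1)
runs = list(product((-1, 1), repeat=3))

# Synthetic noise-free response: a strong first-factor effect, weak others
def response(t, time_, conc):
    return 100 + 30 * t + 5 * time_ + 1 * conc

ys = [response(*r) for r in runs]

def main_effect(factor):
    """Mean response at +1 minus mean response at -1 for one factor."""
    hi = [y for r, y in zip(runs, ys) if r[factor] == 1]
    lo = [y for r, y in zip(runs, ys) if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

For this linear, noise-free response each main effect is exactly twice the coefficient; with real measurements, ANOVA then judges which effects are significant against the noise.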
A primer of multivariate statistics
Harris, Richard J
2014-01-01
Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why
Isotopic safeguards statistics
International Nuclear Information System (INIS)
Timmerman, C.L.; Stewart, K.B.
1978-06-01
The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are the paired comparison and the regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories. Paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model (y = α + βx) and the initial point model [y − y₀ = β(x − x₀)]. The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models: (1) the usual least squares fitting approach, where x is measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and considers both x and y to be measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The isotopic function illustrated, using actual numbers, is the ratio of Pu/U versus ²³⁵D (in which ²³⁵D is the amount of depleted ²³⁵U expressed in weight percent). Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach by simple comparison of the given regression variances with the random variance from the paired comparison results
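A minimal sketch of the distinction the abstract turns on (standard formulas, not the report's code): ordinary least squares assumes x is error-free, while Deming regression allows measurement error in both x and y, with the error-variance ratio supplied externally, e.g. from paired-comparison results.

```python
# OLS slope versus Deming slope for the intercept-type model y = a + b*x.

def ols_slope(x, y):
    """Least squares slope, treating x as measured without error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

def deming_slope(x, y, delta=1.0):
    """Deming slope; delta = var(y errors) / var(x errors),
    e.g. estimated from paired interlaboratory comparisons."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / n
    syy = sum((yi - my) ** 2 for yi in y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    d = syy - delta * sxx
    return (d + (d * d + 4 * delta * sxy * sxy) ** 0.5) / (2 * sxy)
```

On noise-free data the two estimators agree; they diverge when x carries appreciable measurement error, which is why comparing their regression variances against the paired-comparison variance is informative.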
MQSA National Statistics (Mammography Quality Standards Act and Program) ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...
State Transportation Statistics 2014
2014-12-15
The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...
Saugstad, L F
1994-12-01
An association has been established between the multifactorially inherited rate of physical maturation and the final step in brain development, when some 40% of synapses are eliminated. This may imply that, similarly to endocrine disease entities, we have cerebral disease entities at the extremes of the maturational rate continuum. The restriction of prepubertal pruning to excitatory synapses, leaving the number of inhibitory ones fairly constant, implies changes in cerebral excitability as a function of rate of maturation (age at puberty). In early maturation there will be an excess in excitatory drive due to prematurely abridged pruning, which compounds a synchronization tendency inherent in excessive synaptic density. Lowering the excitatory level with antiepileptics is hypothesized to be a logical treatment in this type of brain dysfunction. In late maturation, a deficit in excitatory drive due to failure to shut down the pruning process, associated with a tendency to the breakdown of circuitry and desynchronization, adds to a similar adversity inherent in reduced synaptic density. Raising the excitatory level with convulsants is hypothesized to be the treatment for this type of CNS dysfunction. The maturational theory of Kraepelin's psychoses holds that they are naturally occurring, contrasting chemical signaling disorders in the brain at the extremes of the maturational rate continuum: manic depressive psychosis is a disorder of the early maturer and comprises raised cerebral excitability and a raised density of synapses; it is successfully treated with antiepileptics like sodium valproate and carbamazepine. Schizophrenia is a disorder of late maturation, with reduced cerebral excitability and reduced synaptic density, and is accordingly treated with convulsants such as typical and atypical neuroleptics. However, the conventional effective treatments in both disorders act on inhibition only, by either lowering or raising the inhibitory level. While the neuroleptics
Directory of Open Access Journals (Sweden)
Valavanis Ioannis K
2010-09-01
Full Text Available Abstract Background Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. Results PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy equal to 61.46% in the 3-CV testing sets
Valavanis, Ioannis K; Mougiakakou, Stavroula G; Grimaldi, Keith A; Nikita, Konstantina S
2010-09-08
Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy equal to 61.46% in the 3-CV testing sets. The ANN based methods revealed factors
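An illustrative sketch of the GA-wrapper idea only (not the authors' GA-ANN): a genetic algorithm searches over binary feature masks, scoring each mask by the holdout accuracy of a classifier trained on the selected features. To keep the example self-contained, the paper's back-propagation ANN is replaced by a simple nearest-centroid classifier, and the data are synthetic (feature 0 carries the class signal, the rest are noise).

```python
# GA-based feature selection around a nearest-centroid classifier.
import random

random.seed(42)

# Synthetic data: feature 0 is informative, features 1-5 are pure noise.
def make_row(i):
    cls = i % 2
    x = [cls * 4 + (i % 5) * 0.1] + [random.gauss(0, 1) for _ in range(5)]
    return x, cls

data = [make_row(i) for i in range(40)]
train, holdout = data[:20], data[20:]

def centroid(rows, mask):
    cols = [j for j, m in enumerate(mask) if m]
    return [sum(r[j] for r in rows) / len(rows) for j in cols]

def fitness(mask):
    """Holdout accuracy of nearest-centroid on the selected features."""
    if not any(mask):
        return 0.0
    cols = [j for j, m in enumerate(mask) if m]
    cents = {c: centroid([x for x, y in train if y == c], mask) for c in (0, 1)}
    hits = 0
    for x, y in holdout:
        v = [x[j] for j in cols]
        pred = min(cents, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(v, cents[c])))
        hits += (pred == y)
    return hits / len(holdout)

def ga(pop_size=20, gens=10, n_feat=6):
    pop = [[random.randint(0, 1) for _ in range(n_feat)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(gens):
        def pick():                       # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = [best[:]]                   # elitism: keep the best mask
        while len(nxt) < pop_size:
            p, q = pick(), pick()
            cut = random.randrange(1, n_feat)
            child = p[:cut] + q[cut:]     # one-point crossover
            if random.random() < 0.2:
                child[random.randrange(n_feat)] ^= 1  # bit-flip mutation
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)
    return best, fitness(best)

mask, acc = ga()
```

In the paper, fitness is instead the 3-CV performance of a trained ANN, which makes the search far more expensive but follows the same select-evaluate-recombine loop.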
Renyi statistics in equilibrium statistical mechanics
International Nuclear Information System (INIS)
Parvan, A.S.; Biro, T.S.
2010-01-01
The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore it satisfies the requirements of the equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations, as those stemming from the Boltzmann-Gibbs statistics in this limit.
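The equivalence discussed above rests on the standard definitions (reproduced here for reference, not from the paper): the Renyi entropy and its Boltzmann-Gibbs limit,

```latex
\begin{align}
  S_q^{\mathrm{R}} &= \frac{1}{1-q}\,\ln \sum_i p_i^{\,q},
      \qquad q > 0,\ q \neq 1, \\
  \lim_{q \to 1} S_q^{\mathrm{R}}
      &= -\sum_i p_i \ln p_i \;=\; S^{\mathrm{BG}},
\end{align}
```

where the limit follows from l'Hopital's rule together with the normalization condition \(\sum_i p_i = 1\).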
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.
A Multidisciplinary Approach for Teaching Statistics and Probability
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Intermediate statistics a modern approach
Stevens, James P
2007-01-01
Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.
Alonso-Domínguez, Rosario; Gómez-Marcos, Manuel A; Patino-Alonso, Maria C; Sánchez-Aguadero, Natalia; Agudo-Conde, Cristina; Castaño-Sánchez, Carmen; García-Ortiz, Luis; Recio-Rodríguez, José I
2017-01-01
Introduction New information and communication technologies (ICTs) may promote lifestyle changes, but no adequate evidence is available on the combined effect of ICTs and multifactorial interventions aimed at improving diet and increasing physical activity in patients with type 2 diabetes mellitus (DM2). The primary objective of this study is to assess the effect of a multifactorial intervention on increasing physical activity and adherence to the Mediterranean diet in DM2. Methods and analysis ...
Savage, Leonard J
1972-01-01
Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.
State Transportation Statistics 2010
2011-09-14
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...
State Transportation Statistics 2012
2013-08-15
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...
Adrenal Gland Tumors: Statistics
Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board. ... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...
State transportation statistics 2009
2009-01-01
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...
State Transportation Statistics 2011
2012-08-08
The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...
Neuroendocrine Tumor: Statistics
Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board. ... the body. It is important to remember that statistics on the survival rates for people with a ...
State Transportation Statistics 2013
2014-09-19
The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...
BTS statistical standards manual
2005-10-01
The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...
Trombetti, A; Hars, M; Herrmann, F; Rizzoli, R; Ferrari, S
2013-03-01
This controlled intervention study in hospitalized oldest old adults showed that a multifactorial fall-and-fracture risk assessment and management program, applied in a dedicated geriatric hospital unit, was effective in improving fall-related physical and functional performances and the level of independence in activities of daily living in high-risk patients. Hospitalization affords a major opportunity for interdisciplinary cooperation to manage fall-and-fracture risk factors in older adults. This study aimed at assessing the effects on physical performances and the level of independence in activities of daily living (ADL) of a multifactorial fall-and-fracture risk assessment and management program applied in a geriatric hospital setting. A controlled intervention study was conducted among 122 geriatric inpatients (mean ± SD age, 84 ± 7 years) admitted with a fall-related diagnosis. Among them, 92 were admitted to a dedicated unit and enrolled into a multifactorial intervention program, including intensive targeted exercise. Thirty patients who received standard usual care in a general geriatric unit formed the control group. Primary outcomes included gait and balance performances and the level of independence in ADL, measured 12 ± 6 days apart. Secondary outcomes included length of stay, incidence of in-hospital falls, hospital readmission, and mortality rates. Compared to the usual care group, the intervention group had significant improvements in Timed Up and Go (adjusted mean difference [AMD] = -3.7 s; 95% CI = -6.8 to -0.7; P = 0.017) and Tinetti (AMD = -1.4; 95% CI = -2.1 to -0.8; P ...) scores. The multifactorial fall-and-fracture risk-based intervention program, applied in a dedicated geriatric hospital unit, was effective and more beneficial than usual care in improving physical parameters related to the risk of fall and disability among high-risk oldest old patients.
Evaluation of blood flow in Allograft Renal Arteries anastomosed with two different techniques
International Nuclear Information System (INIS)
Zomorrodi, A.; Bohluli, A.; Tarzamany, M.K.
2008-01-01
Renal artery stenosis in renal transplantation (TRAS) is an avoidable short- or long-term surgical complication. The etiology is multifactorial, but faulty anastomosis is a major factor. In our transplant center, we evaluated the incidence of TRAS with the use of two different suturing techniques at the anastomosis site between the allograft renal and iliac arteries in two groups of renal transplant recipients. Group A comprised 14 patients (6 males and 8 females, aged 16 to 59 years, mean 38) in whom the allograft arteries were anastomosed with a continuous suture technique, and group B comprised 14 patients (7 males and 7 females, aged 32 to 61 years, mean 46.6) in whom the allograft arteries were anastomosed with a combined suture technique (continuous and interrupted). Post transplantation, the velocity of blood flow in the renal and iliac arteries at the site of anastomosis was measured by color Doppler ultrasound; the ultrasonographer was blinded to the surgical technique in both study groups. A ratio of the maximum velocity of blood at the site of anastomosis to that in the iliac artery of less than 2.5 was considered non-significant stenosis, while a ratio of more than 2.5 was considered significant stenosis. In group A there were 9 cases of non-significant stenosis in comparison to 3 cases in group B, while there were no cases of significant stenosis in group A in comparison to 3 cases in group B; the differences were not statistically significant. We conclude that there was no difference between the compared surgical techniques of anastomosis in our study groups. This suggests that other factors, such as gentle handling of tissue, adequate spatulation, margin eversion and comparable diameter of the anastomosed vessels, may be more important in the prevention of renal allograft artery stenosis than the type of suture technique. (author)
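The grading rule described above can be sketched directly (a toy illustration of the cutoff, with hypothetical velocity values; the function name is invented for this example):

```python
# Grade anastomotic stenosis by the peak-velocity ratio with a 2.5 cutoff.

def grade_stenosis(v_anastomosis, v_iliac, cutoff=2.5):
    """Ratio of peak velocity at the anastomosis to that in the iliac artery."""
    ratio = v_anastomosis / v_iliac
    return "significant" if ratio > cutoff else "non-significant"
```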
Statistics in Schools: Educate your students about the value and everyday use of statistics. The Statistics in Schools program provides resources for teaching and learning with real life data. Explore the site for standards-aligned, classroom-ready activities.
Transport Statistics - Transport - UNECE
UNECE Transport Statistics: Working Party on Transport Statistics (WP.6) ...
Directory of Open Access Journals (Sweden)
Vervoort Gerald
2006-03-01
Full Text Available Abstract Background Patients with chronic kidney disease (CKD) are at a greatly increased risk of developing cardiovascular disease. Recently developed guidelines address multiple risk factors and life-style interventions. However, in current practice few patients reach their targets. A multifactorial approach with the aid of nurse practitioners was effective in achieving treatment goals and reducing vascular events in patients with diabetes mellitus and in patients with heart failure. We propose that this also holds for the CKD population. Design MASTERPLAN is a multicenter randomized controlled clinical trial designed to evaluate whether a multifactorial approach with the aid of nurse practitioners reduces cardiovascular risk in patients with CKD. Approximately 800 patients with a creatinine clearance (estimated by Cockcroft-Gault) between 20 and 70 ml/min will be included. To all patients the same set of guidelines will be applied and specific cardioprotective medication will be prescribed. In the intervention group the nurse practitioner will provide lifestyle advice and actively address treatment goals. Follow-up will be five years. The primary endpoint is the composite of myocardial infarction, stroke and cardiovascular mortality. Secondary endpoints are cardiovascular morbidity, overall mortality, decline of renal function, change in markers of vascular damage and change in quality of life. Enrollment started in April 2004 and the study is on track, with 700 patients included on October 15th, 2005. This article describes the design of the MASTERPLAN study.
Directory of Open Access Journals (Sweden)
A. Cecile J.W. Janssens
2006-12-01
Multifactorial diseases such as type 2 diabetes, osteoporosis, and cardiovascular disease are caused by a complex interplay of many genetic and nongenetic factors, each of which conveys a minor increase in the risk of disease. Unraveling the genetic origins of these diseases is expected to lead to individualized medicine, in which the prevention and treatment strategies are personalized on the basis of the results of predictive genetic tests. This great optimism is counterbalanced by concerns about the ethical, legal, and social implications of genomic medicine, such as the protection of privacy and autonomy, stigmatization, discrimination, and the psychological burden of genetic testing. These concerns are translated from genetic testing in monogenic disorders, but this translation may not be appropriate. Multiple genetic testing (genomic profiling) has essential differences from genetic testing in monogenic disorders. The differences lie in the lower predictive value of the test results, the pleiotropic effects of susceptibility genes, and the low inheritance of genomic profiles. For these reasons, genomic profiling may be more similar to nongenetic tests than to predictive tests for monogenic diseases. Therefore, ethical, legal, and social issues that apply to predictive genetic testing for monogenic diseases may not be relevant for the prediction of multifactorial disorders in genomic medicine.
Directory of Open Access Journals (Sweden)
Llobera Joan
2010-09-01
Background: Lowering of blood pressure by antihypertensive drugs reduces the risks of cardiovascular events, stroke, and total mortality. However, poor adherence to antihypertensive medications reduces their effectiveness and increases the risk of adverse events. In terms of relative risk reduction, an improvement in medication adherence could be as effective as the development of a new drug. Methods/Design: The proposed randomized controlled trial will include patients with low adherence to medication and uncontrolled blood pressure. The intervention group will receive a multifactorial intervention during the first, third, and ninth months to improve adherence. This intervention will include motivational interviews, pill reminders, family support, blood pressure self-recording, and simplification of the dosing regimen. Measurement: The primary outcome is systolic blood pressure. The secondary outcomes are diastolic blood pressure, the proportion of patients with adequately controlled blood pressure, and total cost. Discussion: The trial will evaluate the impact of a multifactorial adherence intervention in routine clinical practice. Ethical approval was given by the Ethical Committee on Human Research of the Balearic Islands, Spain (approval number IB 969/08 PI). Trial registration: Current Controlled Trials ISRCTN21229328.
Pedersen, Mette B; Giraldi, Annamaria; Kristensen, Ellids; Lauritzen, Torsten; Sandbæk, Annelli; Charles, Morten
2015-03-01
Sexual problems are common in people with diabetes. It is unknown whether early detection of diabetes and subsequent intensive multifactorial treatment (IT) are associated with sexual health. We report the prevalence of low sexual desire and low sexual satisfaction among people with screen-detected diabetes and compare the impact of intensive multifactorial treatment with the impact of routine care (RC) on these measures. A cross-sectional analysis of the ADDITION-Denmark trial cohort six years post-diagnosis. 190 general practices around Denmark. A total of 968 patients with screen-detected type 2 diabetes. Low sexual desire and low sexual satisfaction. Mean (standard deviation, SD) age was 64.9 (6.9) years. The prevalence of low sexual desire was 53% (RC) and 54% (IT) among women, and 24% (RC) and 25% (IT) among men. The prevalence of low sexual satisfaction was 23% (RC) and 18% (IT) among women, and 27% (RC) and 37% (IT) among men. Among men, the prevalence of low sexual satisfaction was significantly higher in the IT group than in the RC group, p = 0.01. Low sexual desire and low satisfaction are frequent among men and women with screen-detected diabetes, and IT may negatively impact men's sexual satisfaction.
Thompson, Bryony A; Goldgar, David E; Paterson, Carol; Clendenning, Mark; Walters, Rhiannon; Arnold, Sven; Parsons, Michael T; Walsh, Michael D; Gallinger, Steven; Haile, Robert W; Hopper, John L; Jenkins, Mark A; Lemarchand, Loic; Lindor, Noralane M; Newcomb, Polly A; Thibodeau, Stephen N; Young, Joanne P; Buchanan, Daniel D; Tavtigian, Sean V; Spurdle, Amanda B
2013-01-01
Mismatch repair (MMR) gene sequence variants of uncertain clinical significance are often identified in suspected Lynch syndrome families, and this constitutes a challenge for both researchers and clinicians. Multifactorial likelihood model approaches provide a quantitative measure of MMR variant pathogenicity, but first require input of likelihood ratios (LRs) for different MMR variation-associated characteristics from appropriate, well-characterized reference datasets. Microsatellite instability (MSI) and somatic BRAF tumor data for unselected colorectal cancer probands of known pathogenic variant status were used to derive LRs for tumor characteristics using the Colon Cancer Family Registry (CFR) resource. These tumor LRs were combined with variant segregation within families, and estimates of prior probability of pathogenicity based on sequence conservation and position, to analyze 44 unclassified variants identified initially in Australasian Colon CFR families. In addition, in vitro splicing analyses were conducted on the subset of variants based on bioinformatic splicing predictions. The LR in favor of pathogenicity was estimated to be ~12-fold for a colorectal tumor with a BRAF mutation-negative MSI-H phenotype. For 31 of the 44 variants, the posterior probabilities of pathogenicity were such that altered clinical management would be indicated. Our findings provide a working multifactorial likelihood model for classification that carefully considers mode of ascertainment for gene testing. © 2012 Wiley Periodicals, Inc.
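The multifactorial likelihood calculation summarized above is Bayes' theorem in odds form: a prior probability of pathogenicity is converted to odds and multiplied by likelihood ratios from independent evidence types to give a posterior probability. A minimal sketch follows; the prior below is an illustrative placeholder, and the ~12-fold LR is the tumor-characteristic figure quoted in the abstract, not a per-variant input from the study:

```python
# Hedged sketch of a multifactorial likelihood combination via Bayes'
# rule in odds form. The prior (0.10) is invented for illustration.

def posterior_probability(prior, likelihood_ratios):
    """Posterior odds = prior odds * product of LRs; return a probability."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# A low prior combined with the ~12-fold LR for a BRAF mutation-negative
# MSI-H colorectal tumor quoted above:
p = posterior_probability(0.10, [12.0])
print(round(p, 3))  # 0.571
```

In practice each evidence type (tumor phenotype, segregation, splicing) contributes its own LR, and the prior is derived from sequence conservation and position, as described in the abstract.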
Analytical system availability techniques
Brouwers, J.J.H.; Verbeek, P.H.J.; Thomson, W.R.
1987-01-01
Analytical techniques are presented to assess the probability distributions and related statistical parameters of loss of production from equipment networks subject to random failures and repairs. The techniques are based on a theoretical model for system availability, which was further developed
Generalized quantum statistics
International Nuclear Information System (INIS)
Chou, C.
1992-01-01
In the paper, a non-anyonic generalization of quantum statistics is presented, in which Fermi-Dirac statistics (FDS) and Bose-Einstein statistics (BES) appear as two special cases. The new quantum statistics, which is characterized by the dimension of its single particle Fock space, contains three consistent parts, namely the generalized bilinear quantization, the generalized quantum mechanical description and the corresponding statistical mechanics
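One well-known way to interpolate between FDS and BES by limiting the single-particle occupancy is Gentile-type intermediate statistics; this is a related, standard scheme shown for illustration, not necessarily the paper's own construction:

```python
# Gentile-type intermediate statistics: at most p particles per state.
# p = 1 recovers Fermi-Dirac; p -> infinity recovers Bose-Einstein.
# Illustrative only; not necessarily the generalization in the paper.
import math

def mean_occupation(x, p):
    """Mean occupation at x = (energy - mu)/kT with max occupancy p."""
    return 1.0 / (math.exp(x) - 1.0) - (p + 1.0) / (math.exp((p + 1.0) * x) - 1.0)

x = 0.5
fd = 1.0 / (math.exp(x) + 1.0)   # Fermi-Dirac limit
be = 1.0 / (math.exp(x) - 1.0)   # Bose-Einstein limit
print(abs(mean_occupation(x, 1) - fd) < 1e-12,
      abs(mean_occupation(x, 1000) - be) < 1e-9)  # True True
```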
Energy Technology Data Exchange (ETDEWEB)
Denniston, C. [Laboratory of Genetics, University of Wisconsin-Madison, Madison (United States); Chakraborty, R. [Human Genetics Center, University of Texas School of Public Health, P.O. Box 20334, Houston, TX (United States); Sankaranarayanan, K. [Department of Radiation Genetics and Chemical Mutagenesis, Sylvius Laboratories, Leiden University Medical Centre, Wassenaarseweg 72, 2333 AL Leiden (Netherlands)
1998-08-31
Multifactorial diseases, which include the common congenital abnormalities (incidence: 6%) and chronic diseases with onset predominantly in adults (population prevalence: 65%), contribute substantially to human morbidity and mortality. Their transmission patterns do not conform to Mendelian expectations. The model most frequently used to explain their inheritance and to estimate risks to relatives is a Multifactorial Threshold Model (MTM) of disease liability. The MTM assumes that: (1) the disease is due to the joint action of a large number of genetic and environmental factors, each contributing a small amount of liability, (2) the distribution of liability in the population is Gaussian and (3) individuals whose liability exceeds a certain threshold value are affected by the disease. For most of these diseases, the number of genes involved or the environmental factors are not fully known. In the context of radiation exposures of the population, the question of the extent to which induced mutations will cause an increase in the frequencies of these diseases has remained unanswered. In this paper, we address this problem by using a modified version of the MTM which incorporates mutation and selection as two additional parameters. The model assumes a finite number of gene loci and a threshold of liability (hence the designation Finite-Locus Threshold Model, or FLTM). The FLTM permits one to examine the relationship between the broad-sense heritability of disease liability and the mutation component (MC), the responsiveness of the disease to a change in mutation rate. Through the use of a computer program (in which mutation rate, selection, threshold, recombination rate and environmental variance are input parameters and MC and heritability of liability are output estimates), we studied the MC-heritability relationship for (1) a permanent increase in mutation rate (e.g., when the population sustains radiation exposure in every generation) and (2) a one-time increase in
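The threshold assumption can be made concrete in a few lines: with standard-normal liability, the threshold is the normal quantile matching the population prevalence, and a small upward shift in mean liability raises the fraction affected. This is an illustrative sketch of the MTM's Gaussian assumption only, not the FLTM program described above; the 0.1 SD shift is an invented example value:

```python
# Liability-threshold sketch: liability ~ N(0, 1); individuals with
# liability above threshold T are affected. Values are illustrative.
from statistics import NormalDist

nd = NormalDist()  # standard normal liability distribution

def threshold_for(prevalence):
    """T such that P(liability > T) = prevalence."""
    return nd.inv_cdf(1.0 - prevalence)

T = threshold_for(0.06)           # ~1.55 for the 6% congenital-abnormality figure
shifted = 1.0 - nd.cdf(T - 0.1)   # prevalence after a 0.1 SD shift in mean liability
print(round(T, 2), round(shifted, 3))
```

Even a small shift in mean liability produces a disproportionate change in the affected fraction, which is why threshold models respond nonlinearly to changes in mutation rate.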
National Statistical Commission and Indian Official Statistics
Indian Academy of Sciences (India)
Author Affiliations. T J Rao1. C. R. Rao Advanced Institute of Mathematics, Statistics and Computer Science (AIMSCS) University of Hyderabad Campus Central University Post Office, Prof. C. R. Rao Road Hyderabad 500 046, AP, India.
Understanding search trees via statistical physics
Indian Academy of Sciences (India)
We study the m-ary search tree model (where m stands for the number of branches of the search tree), an important problem for data storage in computer science, using a variety of statistical physics techniques that allow us to obtain exact asymptotic results.
Telling the truth with statistics
CERN. Geneva; CERN. Geneva. Audiovisual Unit
2002-01-01
This course of lectures will cover probability, distributions, fitting, errors and confidence levels, for practising High Energy Physicists who need to use statistical techniques to express their results. Concentrating on these appropriate specialist techniques means that they can be covered in appropriate depth, while assuming only the knowledge and experience of a typical Particle Physicist. The different definitions of probability will be explained, and it will become apparent why this basic subject is so controversial; there are several viewpoints and it is important to understand them all, rather than abusing the adherents of different beliefs. Distributions will be covered: the situations they arise in, their useful properties, and the amazing result of the Central Limit Theorem. Fitting a parametrisation to a set of data is one of the most widespread uses of statistics: there are lots of ways of doing this and these will be presented, with discussion of which is appropriate in different circumstances. This t...
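The "amazing result of the Central Limit Theorem" mentioned above can be demonstrated with a short simulation: means of many uniform draws cluster into a normal shape around the population mean. This is a simple illustration, not material from the lectures:

```python
# Central Limit Theorem demo: sample means of Uniform(0, 1) draws.
import random
import statistics

random.seed(1)  # fixed seed for reproducibility

def sample_means(n, trials=20000):
    """Means of `trials` samples, each of n uniform draws."""
    return [statistics.fmean(random.random() for _ in range(n)) for _ in range(trials)]

means = sample_means(30)
# Uniform(0, 1) has mean 1/2 and variance 1/12, so means of 30 draws
# should cluster near 0.5 with standard deviation sqrt(1/(12*30)) ~ 0.053,
# regardless of the (non-normal) shape of the underlying distribution.
print(round(statistics.fmean(means), 2), round(statistics.stdev(means), 2))
```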
Applied statistics for social and management sciences
Miah, Abdul Quader
2016-01-01
This book addresses the application of statistical techniques and methods across a wide range of disciplines. While its main focus is on the application of statistical methods, theoretical aspects are also provided as fundamental background information. It offers a systematic interpretation of results often discovered in general descriptions of methods and techniques such as linear and non-linear regression. SPSS is also used in all the application aspects. The presentation of data in the form of tables and graphs throughout the book not only guides users, but also explains the statistical application and assists readers in interpreting important features. The analysis of statistical data is presented consistently throughout the text. Academic researchers, practitioners and other users who work with statistical data will benefit from reading Applied Statistics for Social and Management Sciences.
International Conference on Robust Statistics 2015
Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta
2016-01-01
This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015) held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in other non-traditional areas, which includes the use of new techniques such as skew and mixture of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas being circular data models and prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...
On Quantum Statistical Inference, II
Barndorff-Nielsen, O. E.; Gill, R. D.; Jupp, P. E.
2003-01-01
Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, theoretical developments in the theory of quantum measurements have brought the basic mathematical framework for the probability calculations much closer to that of classical probability theory. The present paper reviews this field and proposes an inte...
Practical Statistics for Particle Physicists
Lista, Luca
2017-01-01
These three lectures provide an introduction to the main concepts of statistical data analysis useful for precision measurements and searches for new signals in High Energy Physics. The frequentist and Bayesian approaches to probability theory will be introduced and, for both approaches, inference methods will be presented. Hypothesis tests will be discussed, then significance and upper limit evaluation will be presented with an overview of the modern and most advanced techniques adopted for data analysis at the Large Hadron Collider.
Statistical and thermal physics with computer applications
Gould, Harvey
2010-01-01
This textbook carefully develops the main ideas and techniques of statistical and thermal physics and is intended for upper-level undergraduate courses. The authors each have more than thirty years' experience in teaching, curriculum development, and research in statistical and computational physics. Statistical and Thermal Physics begins with a qualitative discussion of the relation between the macroscopic and microscopic worlds and incorporates computer simulations throughout the book to provide concrete examples of important conceptual ideas. Unlike many contemporary texts on the
Marrakesh International Conference on Probability and Statistics
Ouassou, Idir; Rachdi, Mustapha
2015-01-01
This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.
Contributions to sampling statistics
Conti, Pier; Ranalli, Maria
2014-01-01
This book contains a selection of the papers presented at the ITACOSM 2013 Conference, held in Milan in June 2013. ITACOSM is the bi-annual meeting of the Survey Sampling Group S2G of the Italian Statistical Society, intended as an international forum of scientific discussion on the developments of theory and application of survey sampling methodologies and applications in human and natural sciences. The book gathers research papers carefully selected from both invited and contributed sessions of the conference. The whole book appears to be a relevant contribution to various key aspects of sampling methodology and techniques; it deals with some hot topics in sampling theory, such as calibration, quantile-regression and multiple frame surveys, and with innovative methodologies in important topics of both sampling theory and applications. Contributions cut across current sampling methodologies such as interval estimation for complex samples, randomized responses, bootstrap, weighting, modeling, imputati...
Lee, Hsuei-Chen; Chang, Ku-Chou; Tsauo, Jau-Yih; Hung, Jen-Wen; Huang, Yu-Ching; Lin, Sang-I
2013-04-01
To evaluate effects of a multifactorial fall prevention program on fall incidence and physical function in community-dwelling older adults. Multicenter randomized controlled trial. Three medical centers and adjacent community health centers. Community-dwelling older adults (N=616) who have fallen in the previous year or are at risk of falling. After baseline assessment, eligible subjects were randomly allocated into the intervention group (IG) or the control group (CG), stratified by the Physiological Profile Assessment (PPA) fall risk level. The IG received a 3-month multifactorial intervention program including 8 weeks of exercise training, health education, home hazards evaluation/modification, along with medication review and ophthalmology/other specialty consults. The CG received health education brochures, referrals, and recommendations without direct exercise intervention. Primary outcome was fall incidence within 1 year. Secondary outcomes were PPA battery (overall fall risk index, vision, muscular strength, reaction time, balance, and proprioception), Timed Up & Go (TUG) test, Taiwan version of the International Physical Activity Questionnaire, EuroQol-5D, Geriatric Depression Scale (GDS), and the Falls Efficacy Scale-International at 3 months after randomization. Participants were 76±7 years old and included low risk 25.6%, moderate risk 25.6%, and marked risk 48.7%. The cumulative 1-year fall incidence was 25.2% in the IG and 27.6% in the CG (hazard ratio=.90; 95% confidence interval, .66-1.23). The IG improved more favorably than the CG on overall PPA fall risk index, reaction time, postural sway with eyes open, TUG test, and GDS, especially for those with marked fall risk. The multifactorial fall prevention program with exercise intervention improved functional performance at 3 months for community-dwelling older adults with risk of falls, but did not reduce falls at 1-year follow-up. Fall incidence might have been decreased simultaneously in both
Recreational Boating Statistics 2012
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Recreational Boating Statistics 2013
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
U.S. Department of Health & Human Services — The CMS Office of Enterprise Data and Analytics has developed CMS Program Statistics, which includes detailed summary statistics on national health care, Medicare...
Recreational Boating Statistics 2011
Department of Homeland Security — Every year, the USCG compiles statistics on reported recreational boating accidents. These statistics are derived from accident reports that are filed by the owners...
Uterine Cancer Statistics: uterine cancer is the most commonly diagnosed gynecologic cancer. The U.S. Cancer Statistics Data Visualizations Tool makes ...
Tuberculosis Data and Statistics
Data and Statistics: Morbidity and Mortality Weekly Reports. Decrease in Reported Tuberculosis Cases, MMWR 2010; 59 (...
National transportation statistics 2011
2011-04-01
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics : (BTS), National Transportation Statistics presents information on the U.S. transportation system, including : its physical components, safety reco...
National Transportation Statistics 2008
2009-01-08
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record...
Statistics: research shows that mental illnesses are common in ... of mental illnesses, such as suicide and disability. Statistics Topics: Mental Illness, Any Anxiety Disorder ...
School Violence: Data & Statistics
Caregiver Statistics: Demographics
Selected Long-Term Care Statistics. What is ... needs and services are wide-ranging and complex, statistics may vary from study to study. Sources for ...
Alcohol Facts and Statistics. Alcohol Use in the United States: ... 1238–1245, 2004. PMID: 15010446. National Center for Statistics and Analysis. 2014 Crash Data Key Findings (Traffic ...
National Transportation Statistics 2009
2010-01-21
Compiled and published by the U.S. Department of Transportation's Bureau of Transportation Statistics (BTS), National Transportation Statistics presents information on the U.S. transportation system, including its physical components, safety record, ...
National transportation statistics 2010
2010-01-01
National Transportation Statistics presents statistics on the U.S. transportation system, including its physical components, safety record, economic performance, the human and natural environment, and national security. This is a large online documen...
DEFF Research Database (Denmark)
Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard
Statistics for Finance develops students’ professional skills in statistics with applications in finance. Developed from the authors’ courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics...
Principles of applied statistics
National Research Council Canada - National Science Library
Cox, D. R; Donnelly, Christl A
2011-01-01
.... David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...
Statistical approaches to orofacial pain and temporomandibular disorders research
Manfredini, Daniele; Nardini, Luca Guarda; Carrozzo, Eleonora; Salmaso, Luigi
2014-01-01
This book covers the biostatistical methods utilized to interpret and analyze dental research in the areas of orofacial pain and temporomandibular disorders. It will guide practitioners in these fields who would like to interpret research findings or find examples on the design of clinical investigations. After an introduction dealing with the basic issues, the central sections of the textbook are dedicated to the different types of investigations in light of the specific goals researchers may have. The final section contains more elaborate statistical concepts for expert professionals. The field of orofacial pain and temporomandibular disorders is emerging as one of the most critical areas of clinical research in dentistry. Due to the complexity of clinical pictures, the multifactorial etiology, and the importance of psychosocial factors in all aspects of the TMD practice, clinicians often find it hard to appraise their modus operandi, and researchers must constantly increase their knowledge in epidemiology and ...
Interactive statistics with ILLMO
Martens, J.B.O.S.
2014-01-01
Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured
Lenard, Christopher; McCarthy, Sally; Mills, Terence
2014-01-01
There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…
Youth Sports Safety Statistics
... 6):794-799. American Heart Association. CPR statistics. www.heart.org/HEARTORG/CPRAndECC/WhatisCPR/CPRFactsandStats/CPR%20Statistics_ ... Substance Abuse and Mental Health Services Administration, Center for Behavioral Health Statistics and Quality. (January 10, 2013). The DAWN Report: ...
Statistical methods for astronomical data analysis
Chattopadhyay, Asis Kumar
2014-01-01
This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...
Semiclassical analysis, Witten Laplacians, and statistical mechanics
Helffer, Bernard
2002-01-01
This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S
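The transfer matrix approach compared in the book can be illustrated on the standard zero-field 1D Ising model, where the free energy per spin follows from the largest eigenvalue of the 2x2 transfer matrix. This is a textbook sketch, not tied to the book's own notation:

```python
# Transfer matrix sketch for the zero-field 1D Ising model with
# coupling J at inverse temperature beta. The transfer matrix
# [[e^{bJ}, e^{-bJ}], [e^{-bJ}, e^{bJ}]] has eigenvalues
# 2*cosh(bJ) and 2*sinh(bJ); in the thermodynamic limit the free
# energy per spin is set by the largest eigenvalue.
import math

def free_energy_per_spin(beta, J):
    lam_max = 2.0 * math.cosh(beta * J)   # largest transfer-matrix eigenvalue
    return -math.log(lam_max) / beta

print(round(free_energy_per_spin(1.0, 1.0), 4))  # ~ -1.1269
```

As beta grows the free energy per spin approaches the ground-state energy -J, which is a quick sanity check on the eigenvalue formula.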
Dowdy, Shirley; Chilko, Daniel
2011-01-01
Praise for the Second Edition "Statistics for Research has other fine qualities besides superior organization. The examples and the statistical methods are laid out with unusual clarity by the simple device of using special formats for each. The book was written with great care and is extremely user-friendly."-The UMAP Journal Although the goals and procedures of statistical research have changed little since the Second Edition of Statistics for Research was published, the almost universal availability of personal computers and statistical computing application packages have made it possible f
Statistics & probability for dummies
Rumsey, Deborah J
2013-01-01
Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra
Nonparametric statistical inference
Gibbons, Jean Dickinson
2010-01-01
Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference.-Eugenia Stoimenova, Journal of Applied Statistics, June 2012… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students.-Biometrics, 67, September 2011This excellently presente
Griffiths, Dawn
2009-01-01
Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics
Directory of Open Access Journals (Sweden)
Isoaho Raimo
2007-04-01
Background: This study aimed to assess the effects of a risk-based, multifactorial fall prevention programme on health-related quality of life among the community-dwelling aged who had fallen at least once during the previous 12 months. Methods: The study is part of a single-centre, risk-based, multifactorial randomised controlled trial. The intervention lasted for 12 months and consisted of a geriatric assessment, guidance and treatment, individual instruction in fall prevention, group exercise, lectures on themes related to falling, psychosocial group activities and home exercise. Of the total study population (n = 591; 97% of eligible subjects), 513 (251 in the intervention group and 262 in the control group) participated in this study. The effect of the intervention on quality of life was measured using the 15D health-related quality of life instrument consisting of 15 dimensions. The data were analysed using the chi-square test or Fisher's exact test, the Mann-Whitney U-test and logistic regression. Results: In men, the results showed significant differences in the changes between the intervention and control groups in depression (p = 0.017) and distress (p = 0.029), and marginally significant differences in usual activities (p = 0.058) and sexual activity (p = 0.051). In women, significant differences in the changes between the groups were found in usual activities (p = 0.005) and discomfort/symptoms (p = 0.047). For the subjects aged 65 to 74 years, significant differences in the changes between the groups were seen in distress (p = 0.037) among men and in usual activities (p = 0.011) among women. All improvements were in favour of the intervention group. Conclusion: Fall prevention produced positive effects on some dimensions of health-related quality of life in the community-dwelling aged. Men benefited more than women.
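The Mann-Whitney U statistic used in the analysis above can be computed directly from its pair-count definition, which makes clear why it needs no normality assumption. The data below are invented toy values, not the study's 15D scores:

```python
# Hedged sketch of the Mann-Whitney U statistic: count, over all
# cross-group pairs, how often a value from group x exceeds a value
# from group y (ties count as one half). Toy data, not study data.

def mann_whitney_u(x, y):
    """U for group x against group y."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

intervention = [14, 15, 15, 16, 18]   # hypothetical scores
control = [12, 13, 14, 14, 16]
u = mann_whitney_u(intervention, control)
print(u)  # 20.5, out of len(x) * len(y) = 25 possible pairs
```

A U far from half the number of pairs (here 12.5) suggests one group tends to score higher; the p-value then comes from the U distribution under the null hypothesis.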
Stenhagen, Magnus; Nordell, Eva; Elmståhl, Sölve
2013-04-01
The aim of this study was to describe the prevalence of falls in a general older population, especially among the most elderly, and the risk markers associated with falls. This is a cross-sectional study in which 38 fall risk markers were analysed in non-, occasional- and frequent-fallers. The population was 2,865 individuals (aged 60-93), randomly selected from the general population register. The risk of falling was calculated as age-adjusted odds ratios. The relation between the number of risk markers for an individual and falls was also analysed. About one in ten reported falling during the past 6 months, 35% of which were over 90 years old. Twenty-one risk markers were significantly related to falls, confirming falling as a multifactorial problem. These included a variety of diseases, symptoms, medical and physical functions, life-style factors and the taking of certain drugs. The five risk markers with the highest odds ratios in frequent fallers were 'tendency to fall' (37.9), 'low walking speed' (12.8), consumption of 'neuroleptics' (10.9), 'impaired mobility' (10.0) and 'dementia' (5.4). Subjects with more than four and more than seven risk markers showed 9-fold and 28-fold increases in the risk of falling, respectively, especially among frequent fallers and those aged over 90 years. Falls are common in the elderly population and the risk is multifactorial. The results imply that there is an overrepresentation of fallers in a distinct subgroup of the very elderly and those with multiple risk markers. The self-perceived clinical sign 'tendency to fall' seems highly sensitive as an indicator of individuals at risk. Several risk markers may be treatable. Fall risk seems to increase in a non-linear, almost exponential way with increasing number of risk markers.
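The study's odds ratios are age-adjusted via its own models; the crude calculation underneath can be sketched as a simple odds ratio with a Wald-type 95% confidence interval on an invented 2x2 falls table (the counts below are hypothetical, not from the study).

```python
import math

# Invented 2x2 table: exposure = a hypothetical risk marker
fallers_exp, nonfallers_exp = 30, 70
fallers_unexp, nonfallers_unexp = 10, 190

# Crude odds ratio and a Wald 95% CI computed on the log scale
or_ = (fallers_exp * nonfallers_unexp) / (nonfallers_exp * fallers_unexp)
se_log_or = math.sqrt(1/fallers_exp + 1/nonfallers_exp
                      + 1/fallers_unexp + 1/nonfallers_unexp)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
```

Age adjustment, as in the study, would instead fit a logistic regression with age as a covariate.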
Naghshpour, Shahdad
2012-01-01
Statistics is the branch of mathematics that deals with real-life problems. As such, it is an essential tool for economists. Unfortunately, the way you and many other economists learn the concept of statistics is not compatible with the way economists think and learn. The problem is worsened by the use of mathematical jargon and complex derivations. Here's a book that proves none of this is necessary. All the examples and exercises in this book are constructed within the field of economics, thus eliminating the difficulty of learning statistics with examples from fields that have no relation to business, politics, or policy. Statistics is, in fact, not more difficult than economics. Anyone who can comprehend economics can understand and use statistics successfully within this field, including you! This book utilizes Microsoft Excel to obtain statistical results, as well as to perform additional necessary computations. Microsoft Excel is not the software of choice for performing sophisticated statistical analy...
Adaptive RAC codes employing statistical channel evaluation ...
African Journals Online (AJOL)
An adaptive encoding technique using row and column array (RAC) codes employing a different number of parity columns that depends on the channel state is proposed in this paper. The trellises of the proposed adaptive codes and a statistical channel evaluation technique employing these trellises are designed and ...
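A minimal sketch of the row-and-column parity idea behind array codes: data bits are arranged in a grid and one parity bit is computed per row and per column. The adaptive trellis construction and statistical channel evaluation from the paper are not reproduced here; this only illustrates the basic array-code layout.

```python
import numpy as np

def rac_encode(data_bits, rows, cols):
    """Arrange bits in a rows x cols grid; append row and column parity."""
    grid = np.array(data_bits, dtype=int).reshape(rows, cols)
    row_parity = grid.sum(axis=1) % 2          # one parity bit per row
    col_parity = grid.sum(axis=0) % 2          # one parity bit per column
    return grid, row_parity, col_parity

grid, rp, cp = rac_encode([1, 0, 1, 1, 1, 0], rows=2, cols=3)
print(rp, cp)  # row parities, column parities
```

An adaptive scheme along the paper's lines would vary the number of parity columns with the estimated channel state.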
Baseline Statistics of Linked Statistical Data
Scharnhorst, Andrea; Meroño-Peñuela, Albert; Guéret, Christophe
2014-01-01
We are surrounded by an ever-increasing ocean of information; everybody will agree to that. We build sophisticated strategies to govern this information: designing data models, developing infrastructures for data sharing, building tools for data analysis. Statistical datasets curated by National
Statistical Methods for Environmental Pollution Monitoring
Energy Technology Data Exchange (ETDEWEB)
Gilbert, Richard O. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
1987-01-01
The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
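The nonparametric trend tests of Chapters 16 and 17 can be illustrated with a bare-bones Mann-Kendall test on a short monitoring series. This sketch omits the tie correction and the seasonal variant that the full treatment covers, and the measurements are invented.

```python
import math
from itertools import combinations

def mann_kendall(x):
    """Return the Mann-Kendall S statistic and an approximate two-sided p-value."""
    n = len(x)
    # S counts concordant minus discordant pairs over time
    s = sum((xj > xi) - (xj < xi) for xi, xj in combinations(x, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18      # variance assuming no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)          # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2))        # two-sided normal p-value
    return s, p

# Hypothetical annual pollutant concentrations at one station
s, p = mann_kendall([1.2, 1.5, 1.4, 1.9, 2.3, 2.2, 2.8])
print(s, p)
```

A positive S with a small p-value indicates an upward trend without assuming normality of the data.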
Experimental techniques; Techniques experimentales
Energy Technology Data Exchange (ETDEWEB)
Roussel-Chomaz, P. [GANIL CNRS/IN2P3, CEA/DSM, 14 - Caen (France)
2007-07-01
This lecture presents the experimental techniques developed in the last 10 or 15 years in order to perform a new class of experiments with exotic nuclei, where the reactions induced by these nuclei make it possible to obtain information on their structure. A brief review of the secondary-beam production methods will be given, with some examples of facilities in operation or under project. The important developments performed recently on cryogenic targets will be presented. The different detection systems will be reviewed, both the beam detectors before the targets and the many kinds of detectors necessary to detect all outgoing particles after the reaction: magnetic spectrometers for the heavy fragments, detection systems for the target recoil nucleus, and γ detectors. Finally, several typical examples of experiments will be detailed, in order to illustrate the use of each detector either alone or in coincidence with others. (author)
statistical tests for frequency distribution of mean gravity anomalies
African Journals Online (AJOL)
ES Obe
1980-03-01
Mar 1, 1980 ... STATISTICAL TESTS FOR FREQUENCY DISTRIBUTION OF MEAN GRAVITY ANOMALIES. By ... approach. Kaula [1,2] discussed the method of applying statistical techniques in the ... mathematical foundation of physical ...
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
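The contrast between the naive group-level t-test on subject means and a sufficient-summary-statistic approach can be sketched on simulated nested data. The inverse-variance weighting below is one simple instance of exploiting within-subject variance, not the authors' exact procedure; all data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects = 20
true_effect = 0.3

# Per-subject summary statistics: mean and its standard error
means, sems = [], []
for _ in range(n_subjects):
    n_trials = int(rng.integers(20, 200))       # unequal trial counts
    trials = rng.normal(true_effect, 2.0, n_trials)
    means.append(trials.mean())
    sems.append(trials.std(ddof=1) / np.sqrt(n_trials))
means, sems = np.array(means), np.array(sems)

# Naive approach: one-sample t-test on the subject means alone
t_naive, p_naive = stats.ttest_1samp(means, 0.0)

# Weighted approach: precision-weighted mean and its z-statistic
w = 1.0 / sems**2
z = (w @ means) / np.sqrt(w.sum())
p_weighted = stats.norm.sf(abs(z)) * 2

print(f"naive p = {p_naive:.4f}, weighted p = {p_weighted:.4f}")
```

Subjects with many trials contribute more to the weighted statistic, which is what recovers the power the naive test discards.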
Abdel Messih, Hanaa A; Ishak, Rania A H; Geneidi, Ahmed S; Mansour, Samar
2017-06-01
The aim of the present work is to exclusively optimize and model the effect of phospholipid type either egg phosphatidylcholine (EPC) or soybean phosphatidylcholine (SPC), together with other formulation variables, on the development of nano-ethosomal systems for transdermal delivery of a water-soluble antiemetic drug. Tropisetron HCl (TRO) is available as hard gelatin capsules and IV injections. The transdermal delivery of TRO is considered as a novel alternative route supposing to improve BAV as well as patient convenience. TRO-loaded ethanolic vesicular systems were prepared by hot technique. The effect of formulation variables were optimized through a response surface methodology using 3 × 2 2 -level full factorial design. The concentrations of both PC (A) and ethanol (B) and PC type (C) were the factors, while entrapment efficiency (Y 1 ), vesicle size (Y 2 ), polydispersity index (Y 3 ), and zeta potential (Y 4 ) were the responses. The drug permeation across rat skin from selected formulae was studied. Particle morphology, drug-excipient interactions, and vesicle stability were also investigated. The results proved the critical role of all formulation variables on ethosomal characteristics. The suggested models for all responses showed good predictability. Only the concentration of phospholipid, irrespective to PC type, had a significant effect on the transdermal flux (p transdermal TRO delivery.
Statistical Physics An Introduction
Yoshioka, Daijiro
2007-01-01
This book provides a comprehensive presentation of the basics of statistical physics. The first part explains the essence of statistical physics and how it provides a bridge between microscopic and macroscopic phenomena, allowing one to derive quantities such as entropy. Here the author avoids going into details such as Liouville's theorem or the ergodic theorem, which are difficult for beginners and unnecessary for the actual application of statistical mechanics. In the second part, statistical mechanics is applied to various systems which, although they look different, share the same mathematical structure. In this way readers can deepen their understanding of statistical physics. The book also features applications to quantum dynamics, thermodynamics, the Ising model and the statistical dynamics of free spins.
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
The statistical stability phenomenon
Gorban, Igor I
2017-01-01
This monograph investigates violations of statistical stability of physical events, variables, and processes and develops a new physical-mathematical theory taking into consideration such violations – the theory of hyper-random phenomena. There are five parts. The first describes the phenomenon of statistical stability and its features, and develops methods for detecting violations of statistical stability, in particular when data is limited. The second part presents several examples of real processes of different physical nature and demonstrates the violation of statistical stability over broad observation intervals. The third part outlines the mathematical foundations of the theory of hyper-random phenomena, while the fourth develops the foundations of the mathematical analysis of divergent and many-valued functions. The fifth part contains theoretical and experimental studies of statistical laws where there is violation of statistical stability. The monograph should be of particular interest to engineers...
Equilibrium statistical mechanics
Jackson, E Atlee
2000-01-01
Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t
Applied statistics for economists
Lewis, Margaret
2012-01-01
This book is an undergraduate text that introduces students to commonly-used statistical methods in economics. Using examples based on contemporary economic issues and readily-available data, it not only explains the mechanics of the various methods, it also guides students to connect statistical results to detailed economic interpretations. Because the goal is for students to be able to apply the statistical methods presented, online sources for economic data and directions for performing each task in Excel are also included.
Mineral industry statistics 1975
Energy Technology Data Exchange (ETDEWEB)
1978-01-01
Production, consumption and marketing statistics are given for solid fuels (coal, peat), liquid fuels and gases (oil, natural gas), iron ore, bauxite and other minerals quarried in France in 1975. Accident statistics are also included. Production statistics are presented for the Overseas Departments and Territories (French Guiana, New Caledonia, New Hebrides). An account of modifications in the mining field in 1975 is given. Concessions, exploitation permits, and permits solely for prospecting for mineral products are discussed. (In French)
Lectures on statistical mechanics
Bowler, M G
1982-01-01
Anyone dissatisfied with the almost ritual dullness of many 'standard' texts in statistical mechanics will be grateful for the lucid explanation and generally reassuring tone. Aimed at securing firm foundations for equilibrium statistical mechanics, topics of great subtlety are presented transparently and enthusiastically. Very little mathematical preparation is required beyond elementary calculus and prerequisites in physics are limited to some elementary classical thermodynamics. Suitable as a basis for a first course in statistical mechanics, the book is an ideal supplement to more convent
Directory of Open Access Journals (Sweden)
Mirjam Nielen
2017-01-01
Full Text Available Always wondered why research papers often present rather complicated statistical analyses? Or wondered how to properly analyse the results of a pragmatic trial from your own practice? This talk will give an overview of basic statistical principles and focus on the why of statistics, rather than on the how. This is a podcast of Mirjam's talk at the Veterinary Evidence Today conference, Edinburgh, November 2, 2016.
Equilibrium statistical mechanics
Mayer, J E
1968-01-01
The International Encyclopedia of Physical Chemistry and Chemical Physics, Volume 1: Equilibrium Statistical Mechanics covers the fundamental principles and the development of theoretical aspects of equilibrium statistical mechanics. Statistical mechanics is the study of the connection between the macroscopic behavior of bulk matter and the microscopic properties of its constituent atoms and molecules. This book contains eight chapters, and begins with a presentation of the master equation used for the calculation of the fundamental thermodynamic functions. The succeeding chapters highlight t
Mahalanobis, P C
1965-01-01
Contributions to Statistics focuses on the processes, methodologies, and approaches involved in statistics. The book is presented to Professor P. C. Mahalanobis on the occasion of his 70th birthday. The selection first offers information on the recovery of ancillary information and combinatorial properties of partially balanced designs and association schemes. Discussions focus on combinatorial applications of the algebra of association matrices, sample size analogy, association matrices and the algebra of association schemes, and conceptual statistical experiments. The book then examines latt
Boslaugh, Sarah
2008-01-01
Need to learn statistics as part of your job, or want some help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference that's perfect for anyone with no previous background in the subject. This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrat
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Annual Statistical Supplement, 2002
Social Security Administration — The Annual Statistical Supplement, 2002 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2010
Social Security Administration — The Annual Statistical Supplement, 2010 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2007
Social Security Administration — The Annual Statistical Supplement, 2007 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2001
Social Security Administration — The Annual Statistical Supplement, 2001 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2016
Social Security Administration — The Annual Statistical Supplement, 2016 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2011
Social Security Administration — The Annual Statistical Supplement, 2011 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2005
Social Security Administration — The Annual Statistical Supplement, 2005 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2015
Social Security Administration — The Annual Statistical Supplement, 2015 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2003
Social Security Administration — The Annual Statistical Supplement, 2003 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2017
Social Security Administration — The Annual Statistical Supplement, 2017 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2008
Social Security Administration — The Annual Statistical Supplement, 2008 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2014
Social Security Administration — The Annual Statistical Supplement, 2014 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2004
Social Security Administration — The Annual Statistical Supplement, 2004 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2000
Social Security Administration — The Annual Statistical Supplement, 2000 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2009
Social Security Administration — The Annual Statistical Supplement, 2009 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Annual Statistical Supplement, 2006
Social Security Administration — The Annual Statistical Supplement, 2006 includes the most comprehensive data available on the Social Security and Supplemental Security Income programs. More than...
Bulmer, M G
1979-01-01
There are many textbooks which describe current methods of statistical analysis, while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again fo
Kanji, Gopal K
2006-01-01
This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.
Statistical distribution sampling
Johnson, E. S.
1975-01-01
Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.
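The idea of determining a statistic's distribution by sampling can be sketched as a Monte Carlo experiment: repeatedly draw samples, compute the statistic on each, and inspect the empirical distribution. Here the sampling distribution of the sample variance is compared against its known theory; the setup is illustrative, not the report's actual method.

```python
import numpy as np

rng = np.random.default_rng(42)
n_rep, n = 10_000, 30

# Empirical sampling distribution of the sample variance s^2
# for samples of size n from a standard normal population
stats_ = np.array([rng.normal(0, 1, n).var(ddof=1) for _ in range(n_rep)])
print(f"mean {stats_.mean():.3f}, sd {stats_.std():.3f}")
# theory: E[s^2] = 1 and SD[s^2] = sqrt(2/(n-1)) for normal data
```

The same recipe applies to statistics with no tractable closed-form distribution, which is where sampling earns its keep.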
Statistical perspectives on inverse problems
DEFF Research Database (Denmark)
Andersen, Kim Emil
Inverse problems arise in many scientific disciplines and pertain to situations where inference is to be made about a particular phenomenon from indirect measurements. A typical example, arising in diffusion tomography, is the inverse boundary value problem for non-invasive reconstruction of the interior of an object from electrical boundary measurements. One part of this thesis concerns statistical approaches for solving, possibly non-linear, inverse problems. Thus inverse problems are recast in a form suitable for statistical inference. In particular, a Bayesian approach for regularisation ... problem is given in terms of probability distributions. Posterior inference is obtained by Markov chain Monte Carlo methods, and new, powerful simulation techniques based on e.g. coupled Markov chains and simulated tempering are developed to improve the computational efficiency of the overall simulation ...
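The Markov chain Monte Carlo machinery the thesis builds on can be illustrated with its simplest instance, a random-walk Metropolis sampler, on a toy Gaussian posterior; the coupled-chain and simulated-tempering extensions are not shown, and the data and noise level below are invented.

```python
import math
import random

random.seed(0)
data = [1.8, 2.2, 1.9, 2.4, 2.1]
sigma = 0.5   # known noise level, assumed for illustration

def log_post(mu):
    """Log posterior: N(0,1) prior on mu, Gaussian likelihood."""
    log_prior = -0.5 * mu * mu
    log_lik = sum(-0.5 * ((y - mu) / sigma) ** 2 for y in data)
    return log_prior + log_lik

mu, samples = 0.0, []
for _ in range(20_000):
    prop = mu + random.gauss(0, 0.3)           # random-walk proposal
    # Metropolis acceptance: accept with prob min(1, posterior ratio)
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop
    samples.append(mu)

burned = samples[5000:]                        # discard burn-in
print(f"posterior mean ~ {sum(burned) / len(burned):.2f}")
```

For this conjugate toy model the exact posterior mean is available in closed form, which makes it a convenient sanity check for the sampler.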
Statistics for High Energy Physics
CERN. Geneva
2018-01-01
The lectures emphasize the frequentist approach used for the Dark Matter search and the Higgs search, discovery and measurements of its properties. An emphasis is put on hypothesis testing using the asymptotic formulae formalism and its derivation, and on the derivation of the trial-factor formulae in one and two dimensions. Various test statistics and their applications are discussed. Some keywords: Profile Likelihood, Neyman-Pearson, Feldman-Cousins, Coverage, CLs, Nuisance Parameter Impact, Look Elsewhere Effect... Selected bibliography: G. J. Feldman and R. D. Cousins, "A unified approach to the classical statistical analysis of small signals," Phys. Rev. D 57, 3873 (1998). A. L. Read, "Presentation of search results: the CL(s) technique," J. Phys. G 28, 2693 (2002). G. Cowan, K. Cranmer, E. Gross and O. Vitells, "Asymptotic formulae for likelihood-based tests of new physics," Eur. Phys. J. C 71, 1554 (2011); Erratum: Eur. Phys. J. C 73...
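For a single-bin Poisson counting experiment, the asymptotic discovery test the lectures cover reduces to the profile-likelihood ratio q0 with significance Z = sqrt(q0) (the Cowan-Cranmer-Gross-Vitells asymptotics); the counts below are invented for illustration.

```python
import math

def discovery_z(n, b):
    """Asymptotic significance for rejecting the background-only hypothesis,
    given observed count n and known expected background b."""
    if n <= b:
        return 0.0   # downward fluctuations give zero discovery significance
    q0 = 2 * (n * math.log(n / b) - (n - b))   # profile-likelihood ratio
    return math.sqrt(q0)

z = discovery_z(n=25, b=10)
print(f"Z = {z:.2f} sigma")
```

In a full analysis the background carries nuisance parameters, so q0 is profiled over them rather than computed from a fixed b as here.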
Statistical uncertainties and unrecognized relationships
International Nuclear Information System (INIS)
Rankin, J.P.
1985-01-01
Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures
Workshop statistics discovery with data and Minitab
Rossman, Allan J
1998-01-01
Shorn of all subtlety and led naked out of the protective fold of educational research literature, there comes a sheepish little fact: lectures don't work nearly as well as many of us would like to think. -George Cobb (1992) This book contains activities that guide students to discover statistical concepts, explore statistical principles, and apply statistical techniques. Students work toward these goals through the analysis of genuine data and through interaction with one another, with their instructor, and with technology. Providing a one-semester introduction to fundamental ideas of statistics for college and advanced high school students, Workshop Statistics is designed for courses that employ an interactive learning environment by replacing lectures with hands-on activities. The text contains enough expository material to stand alone, but it can also be used to supplement a more traditional textbook. Some distinguishing features of Workshop Statistics are its emphases on active learning, conceptu...