WorldWideScience

Sample records for traditional statistical techniques

  1. Line identification studies using traditional techniques and wavelength coincidence statistics

    International Nuclear Information System (INIS)

    Cowley, C.R.; Adelman, S.J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected species that could easily be omitted entirely from a traditional study; this is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best, relative to traditional methods, in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
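
    At its core, WCS compares the number of coincidences between observed lines and a species' laboratory wavelengths with the number expected by chance, which also quantifies the spurious results mentioned above. A minimal sketch of that idea (the wavelengths, tolerance and spectral span below are illustrative, not the authors' data or code):

```python
import numpy as np

def wcs_count(observed, predicted, tol):
    """Count observed lines falling within tol (angstroms) of any predicted line."""
    return sum(np.any(np.abs(predicted - w) <= tol) for w in observed)

def expected_spurious(observed, predicted, tol, span, n_trials=2000, seed=0):
    """Monte Carlo estimate of coincidences expected by chance alone:
    redraw the observed wavelengths uniformly over the spectral span."""
    rng = np.random.default_rng(seed)
    lo, hi = span
    counts = [wcs_count(rng.uniform(lo, hi, len(observed)), predicted, tol)
              for _ in range(n_trials)]
    return np.mean(counts), np.std(counts)

# toy example: is O I present in this measured line list?
observed = np.array([3947.3, 4368.2, 5330.7, 6156.8, 7771.9])   # measured lines
o_i = np.array([3947.29, 4368.19, 5330.74, 6156.78, 7771.94])   # species line list
hits = wcs_count(observed, o_i, tol=0.05)
mu, sigma = expected_spurious(observed, o_i, tol=0.05, span=(3900, 7800))
print(f"{hits} coincidences vs {mu:.2f} +/- {sigma:.2f} expected by chance")
```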

  2. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.
    * Assumes no previous training in statistics
    * Explains how and why modern statistical methods provide more accurate results than conventional methods
    * Covers the latest developments on multiple comparisons
    * Includes recent advances ...
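
    A signature example of the robust methods Wilcox advocates is the trimmed mean, which resists the outliers that distort the ordinary mean. A minimal sketch on synthetic contaminated data (illustrative only, not an example from the book):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# 95% of points from N(0, 1), 5% gross outliers: a classic mixed distribution
x = np.concatenate([rng.normal(0, 1, 95), rng.normal(0, 10, 5)])

print("sample mean:      %.3f" % np.mean(x))
print("20%% trimmed mean: %.3f" % stats.trim_mean(x, 0.2))  # drop top/bottom 20%
print("median:           %.3f" % np.median(x))
```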

  3. Statistical Techniques for Project Control

    CERN Document Server

    Badiru, Adedeji B

    2012-01-01

    A project can be simple or complex. In each case, proven project management processes must be followed. In all cases of project management implementation, control must be exercised in order to ensure that project objectives are achieved. Statistical Techniques for Project Control seamlessly integrates qualitative and quantitative tools and techniques for project control. It fills the void that exists in the application of statistical techniques to project control. The book begins by defining the fundamentals of project management, then explores how to temper quantitative analysis with qualitative ...

  4. Timber Elements: Traditional and Modern Strengthening Techniques

    Directory of Open Access Journals (Sweden)

    Raluca Hohan

    2010-01-01

    The main idea of this paper is to analyse the means for rehabilitating our cultural heritage of timber structures. Several methods, together with their application techniques, are described, as well as the reasons why these strengthening operations eventually become necessary. First, the necessity of strengthening timber structural elements is explained through a short presentation of the factors that degrade the material. Then, certain precautions and strengthening procedures are presented, all involving the use of traditional materials such as wood, metal or concrete, and of modern materials such as fiber-reinforced polymer composites.

  5. Projection operator techniques in nonequilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Grabert, H.

    1982-01-01

    This book is an introduction to the application of the projection operator technique to the statistical mechanics of irreversible processes. After a general introduction to the projection operator technique and statistical thermodynamics, the Fokker-Planck and master equation approaches are described, together with response theory. Then, as applications, the damped harmonic oscillator, simple fluids, and spin relaxation are considered. (HSI)

  6. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control (SPC), can be found in the literature. However, some ...
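
    A typical SPC building block is the Shewhart X-bar chart: subgroup means are flagged when they leave 3-sigma control limits estimated from in-control history. A minimal sketch with simulated data (the process values and the injected mean shift are invented):

```python
import numpy as np

def xbar_limits(subgroup_means, sigma, n):
    """3-sigma control limits for an X-bar chart (known-sigma case)."""
    center = np.mean(subgroup_means)
    margin = 3 * sigma / np.sqrt(n)
    return center - margin, center, center + margin

rng = np.random.default_rng(2)
n = 5                                      # subgroup size
data = rng.normal(10.0, 0.2, (30, n))      # 30 subgroups of a stable process
data[25:] += 0.4                           # simulated mean shift at subgroup 25
means = data.mean(axis=1)

lcl, cl, ucl = xbar_limits(means[:20], 0.2, n)   # limits from in-control history
out = np.where((means < lcl) | (means > ucl))[0]
print("out-of-control subgroups:", out)
```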

  7. A statistical approach to traditional Vietnamese medical diagnoses standardization

    International Nuclear Information System (INIS)

    Nguyen Hoang Phuong; Nguyen Quang Hoa; Le Dinh Long

    1990-12-01

    In this paper, the first results of a statistical approach to Cold-Heat diagnosis standardization, a first step in standardizing the ''eight rules diagnoses'' of Traditional Vietnamese Medicine, are briefly described. Some conclusions and suggestions for further work are given. 3 refs, 2 tabs

  8. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters involved, conventional approaches are no longer sufficient. Statistical and computational techniques have therefore found several applications in manufacturing, namely the modelling and simulation of manufacturing processes, the optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final-year undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  9. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small-sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
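
    The detection-performance measure described here is what is commonly summarized as an ROC curve: sweep an alarm threshold over a vibration feature and trace probability of detection against probability of false alarm. A minimal sketch on synthetic healthy/faulty feature distributions (all numbers illustrative):

```python
import numpy as np

def roc_points(healthy, faulty, thresholds):
    """Probability of detection vs probability of false alarm as the
    alarm threshold on a vibration feature is swept."""
    pd = np.array([(faulty >= t).mean() for t in thresholds])
    pfa = np.array([(healthy >= t).mean() for t in thresholds])
    return pfa, pd

rng = np.random.default_rng(3)
healthy = rng.normal(1.0, 0.3, 200)   # RMS level of a healthy machine (toy)
faulty = rng.normal(1.8, 0.5, 200)    # RMS level with a seeded fault (toy)
thr = np.linspace(0, 3, 61)
pfa, pd = roc_points(healthy, faulty, thr)
auc = np.trapz(pd[::-1], pfa[::-1])   # area under the ROC curve
print(f"AUC = {auc:.3f}")
```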

  10. Earth construction: traditional building techniques of Bhutan

    Directory of Open Access Journals (Sweden)

    João M. Guedes

    2018-01-01

    NCREP – Consultancy in Rehabilitation of Built Heritage Ltd. surveyed the constructive features of Bhutan's vernacular rammed earth built heritage, as part of a project financed by the World Bank and commissioned by the Division for the Conservation of Heritage Sites (DCHS) of the Department of Culture, Ministry of Home and Cultural Affairs of Bhutan. This work, which aimed at better understanding the structural behaviour of this heritage and, on that basis, proposing measures to mitigate its seismic risk, included the study of 18 traditional rammed earth buildings in two villages in the Punakha district. The surveys were conducted house-to-house, based on a DCHS script, and included interviews with the artisans responsible for building these constructive typologies, supported by a questionnaire integrated within the project, to collect information on the procedures, rites and practices followed in these constructions. This article focuses only on the first part of the work; it presents the main constructive characteristics assessed from the survey of this built heritage and compiles the results of the interviews with the artisans.

  11. Lightweight and Statistical Techniques for Petascale Debugging

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger, or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which had already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems were purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis
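
    The idea behind STAT is to merge stack traces across all MPI tasks into equivalence classes, so a developer need only attach a traditional debugger to one representative task per class. A toy sketch of that grouping step (not the STAT implementation; the call chains and ranks are invented):

```python
from collections import defaultdict

def group_tasks_by_trace(traces):
    """Map each distinct call chain to the list of MPI ranks exhibiting it."""
    groups = defaultdict(list)
    for rank, trace in traces.items():
        groups[tuple(trace)].append(rank)
    return groups

# invented snapshot: most ranks wait in MPI_Barrier, one is stuck elsewhere
traces = {r: ["main", "solve", "exchange_halo", "MPI_Barrier"] for r in range(1024)}
traces[517] = ["main", "solve", "exchange_halo", "pack_buffer"]  # the outlier

for trace, ranks in group_tasks_by_trace(traces).items():
    print(len(ranks), "task(s):", " > ".join(trace))
```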

  12. Application of microdialysis technique in the traditional chinese medicine

    DEFF Research Database (Denmark)

    Zhang, Shaomin; Zeng, Xianghui; Xu, Xiaohong

    2005-01-01

    The concentration of extracellular neurotransmitters can be measured dynamically by in vivo microdialysis. This technique can be applied to quantitatively evaluate the beneficial effects of Traditional Chinese Medicine (TCM). In the present study, the protective effects of Puerarin (Pur) on cerebral...

  13. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    Science.gov (United States)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of the severe environmental degradation caused by increasing concentrations of harmful pollutants in the ambient atmosphere. In this study, several statistical and artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area: principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN). The forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods suffer from limited accuracy, since they are unable to predict extreme points, i.e., the pollution maxima and minima cannot be determined with these approaches. With advances in technology and research, an alternative to these traditional methods has been proposed: coupling statistical techniques with artificial intelligence (AI) for forecasting. Here, a coupling of PCA, ANN and fuzzy logic is used to forecast air pollutants over the Delhi urban area. The statistical measures of the proposed model, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), compare favourably with those of all the other models. Hence, the coupling of statistical and artificial intelligence methods can be used for forecasting air pollutants over urban areas.
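
    The statistical baseline described here, PCA feeding a multiple linear regression, is straightforward to express. A minimal sketch on synthetic data (the predictors, sizes and target are invented; this is not the authors' model):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
# toy predictors: temperature, wind speed, humidity, mixing height, ... (8 columns)
X = rng.normal(size=(500, 8))
y = 40 + 5 * X[:, 0] - 8 * X[:, 1] + rng.normal(0, 3, 500)   # toy PM concentration

model = make_pipeline(PCA(n_components=3), LinearRegression())
model.fit(X[:400], y[:400])
print("R^2 on held-out days: %.2f" % model.score(X[400:], y[400:]))
```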

  14. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets
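
    One item from this list, the propagation of small errors for the ratio of two variables, is easy to make concrete: for independent x and y, the first-order rule gives (sigma_r/r)^2 ≈ (sigma_x/x)^2 + (sigma_y/y)^2. A minimal sketch checking the linearized formula against a Monte Carlo estimate (all numbers illustrative):

```python
import numpy as np

x, sx = 100.0, 5.0    # measured numerator and its uncertainty
y, sy = 50.0, 2.0     # measured denominator and its uncertainty

r = x / y
sr_linear = r * np.sqrt((sx / x) ** 2 + (sy / y) ** 2)  # first-order propagation

rng = np.random.default_rng(5)
samples = rng.normal(x, sx, 100_000) / rng.normal(y, sy, 100_000)
print(f"linearized:  {r:.3f} +/- {sr_linear:.3f}")
print(f"Monte Carlo: {samples.mean():.3f} +/- {samples.std():.3f}")
```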

  15. Statistical techniques to extract information during SMAP soil moisture assimilation

    Science.gov (United States)

    Kolassa, J.; Reichle, R. H.; Liu, Q.; Alemohammad, S. H.; Gentine, P.

    2017-12-01

    Statistical techniques permit the retrieval of soil moisture estimates in a model climatology while retaining the spatial and temporal signatures of the satellite observations. As a consequence, the need for bias correction prior to an assimilation of these estimates is reduced, which could result in a more effective use of the independent information provided by the satellite observations. In this study, a statistical neural network (NN) retrieval algorithm is calibrated using SMAP brightness temperature observations and modeled soil moisture estimates (similar to those used to calibrate the SMAP Level 4 DA system). Daily values of surface soil moisture are estimated using the NN and then assimilated into the NASA Catchment model. The skill of the assimilation estimates is assessed based on a comprehensive comparison to in situ measurements from the SMAP core and sparse network sites as well as the International Soil Moisture Network. The NN retrieval assimilation is found to significantly improve the model skill, particularly in areas where the model does not represent processes related to agricultural practices. Additionally, the NN method is compared to assimilation experiments using traditional bias correction techniques. The NN retrieval assimilation is found to more effectively use the independent information provided by SMAP resulting in larger model skill improvements than assimilation experiments using traditional bias correction techniques.
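
    The traditional bias correction referred to here is often implemented as CDF matching, which rescales the satellite retrievals so that their distribution matches the model climatology; that this is the specific form compared against is an assumption. A minimal sketch on synthetic data:

```python
import numpy as np

def cdf_match(obs, model_clim):
    """Map each observation to the model value with the same quantile rank."""
    obs_sorted = np.sort(obs)
    model_sorted = np.sort(model_clim)
    ranks = np.searchsorted(obs_sorted, obs) / (len(obs) - 1)
    return np.quantile(model_sorted, np.clip(ranks, 0, 1))

rng = np.random.default_rng(6)
model_clim = rng.beta(2, 5, 2000) * 0.5        # model soil moisture climatology (toy)
obs = 0.1 + 0.6 * rng.beta(2, 4, 2000)         # biased satellite retrievals (toy)
matched = cdf_match(obs, model_clim)
print("means before/after:", obs.mean().round(3), matched.mean().round(3),
      "model:", model_clim.mean().round(3))
```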

  16. A survey of statistical downscaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Storch, H. von [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    1997-12-31

    The derivation of regional information from integrations of coarse-resolution General Circulation Models (GCMs) is generally referred to as downscaling. The most relevant statistical downscaling techniques are described here, and some particular examples are worked out in detail. They are classified into three main groups: linear methods, classification methods and deterministic non-linear methods. Their performance in a particular example, winter rainfall in the Iberian peninsula, is compared to that of a simple analog downscaling method. It is found that the analog method performs as well as the more complicated methods. Downscaling analysis can also be used as a tool to validate the regional performance of global climate models by analyzing the covariability of the simulated large-scale climate and the regional climates. (orig.)
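
    The analog method that performs so well here is also the simplest to state: find the most similar large-scale pattern in the historical archive and reuse the local observation recorded on that day. A minimal sketch (synthetic fields; not the authors' implementation):

```python
import numpy as np

def analog_downscale(gcm_today, gcm_history, local_history):
    """Predict the local variable by finding the most similar large-scale
    pattern in the historical archive and reusing its local observation."""
    dists = np.linalg.norm(gcm_history - gcm_today, axis=1)
    return local_history[np.argmin(dists)]

rng = np.random.default_rng(7)
gcm_history = rng.normal(size=(5000, 20))        # archived large-scale fields (toy)
local_history = gcm_history[:, :3].sum(axis=1) + rng.normal(0, 0.3, 5000)  # local rainfall proxy
gcm_today = rng.normal(size=20)                  # today's simulated field
print("analog forecast of local rainfall:",
      analog_downscale(gcm_today, gcm_history, local_history).round(3))
```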

  17. Review of the Statistical Techniques in Medical Sciences | Okeh ...

    African Journals Online (AJOL)

    ... medical researcher in selecting the appropriate statistical techniques. Of course, all statistical techniques have certain underlying assumptions, which must be checked before the technique is applied. Keywords: Variable, Prospective Studies, Retrospective Studies, Statistical significance. Bio-Research Vol. 6 (1) 2008: pp.

  18. Exploring Traditional Glass Bead Making Techniques in Jewellery ...

    African Journals Online (AJOL)

    Exploring traditional glass bead making techniques in jewellery in some prominent areas of Ghana is a means of exposing the area to metal and ceramic artists, and to related disciplines such as aesthetics and criticism, to complement their forms of ... Keywords: livelihood, vitreous, glass bottles, furnace, threading ...

  19. Comparison of Traditional and Innovative Techniques to Solve Technical Challenges

    Science.gov (United States)

    Perchonok, Michele

    2011-01-01

    This slide presentation reviews the use of traditional and innovative techniques to solve technical challenges in food storage technology. Planning for a mission to Mars is underway, and it requires improvements in food storage technology: current technology is inadequate, refrigerators and freezers are not available for food preservation, and a shelf life of 5 years is expected. A 10-year effort to improve food packaging has not significantly enhanced food packaging capabilities. Two innovation techniques, InnoCentive and Yet2.com, were attempted; both provided good results and are still under due diligence for solver verification.

  1. Multivariate statistical treatment of PIXE analysis of some traditional Chinese medicines

    International Nuclear Information System (INIS)

    Xiaofeng Zhang; Jianguo Ma; Junfa Qin; Lun Xiao

    1991-01-01

    Elements in 30 traditional Chinese medicines of two kinds were analyzed by the PIXE method, and the data were treated with multivariate statistical methods. The results show that the two kinds of traditional Chinese medicines are almost separable according to their elemental contents. The results are consistent with traditional Chinese medicine practice. (author) 7 refs.; 2 figs.; 2 tabs

  2. Testing of statistical techniques used in SYVAC

    International Nuclear Information System (INIS)

    Dalrymple, G.; Edwards, H.; Prust, J.

    1984-01-01

    Analysis of the SYVAC (SYstems Variability Analysis Code) output adopted four techniques to provide a cross-comparison of their performance. The techniques used were: examination of scatter plots; correlation/regression; the Kruskal-Wallis one-way analysis of variance by ranks; and comparison of cumulative distribution functions and risk estimates between sub-ranges of parameter values. The analysis was conducted for the case of a single nuclide chain and was based mainly on simulated dose after 500,000 years. The results from this single SYVAC case showed that site parameters had the greatest influence on dose to man. The techniques of correlation/regression and Kruskal-Wallis were both successful and consistent in their identification of important parameters. Both techniques ranked the eight most important parameters in the same order when analysed for maximum dose. The results from a comparison of CDFs and risks in sub-ranges of the parameter values were not entirely consistent with the other techniques. Further sampling of the high-dose region is recommended in order to improve the accuracy of this method. (author)
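
    As used here, the Kruskal-Wallis test screens a sampled input parameter for importance by asking whether the output distribution differs across groups of the parameter's range. A minimal sketch with one toy parameter and a toy dose output (not the SYVAC code):

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(8)
n = 1000
param = rng.uniform(0, 1, n)                       # one sampled input parameter
dose = np.exp(2 * param) + rng.normal(0, 0.5, n)   # toy simulated dose output

# split the parameter range into quartile groups and test whether the
# dose distribution differs across them (a screening test for importance)
edges = np.quantile(param, [0.25, 0.5, 0.75])
groups = np.digitize(param, edges)
H, p = kruskal(*[dose[groups == g] for g in range(4)])
print(f"Kruskal-Wallis H = {H:.1f}, p = {p:.2e}")
```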

  3. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non- stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
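
    Of the linear systems compared, the Widrow-Hoff (LMS) adaptive filter is the simplest to sketch: a one-step-ahead predictor whose weights follow a stochastic-gradient update. A minimal illustration on a synthetic signal (the filter order and step size are arbitrary choices, not the paper's settings):

```python
import numpy as np

def lms_predict(x, order=4, mu=0.05):
    """One-step-ahead prediction with a Widrow-Hoff (LMS) adaptive filter."""
    w = np.zeros(order)
    preds = np.zeros(len(x))
    for t in range(order, len(x)):
        window = x[t - order:t][::-1]      # most recent samples first
        preds[t] = w @ window
        err = x[t] - preds[t]
        w += mu * err * window             # stochastic-gradient weight update
    return preds

t = np.arange(2000)
x = np.sin(0.07 * t) + 0.1 * np.random.default_rng(9).normal(size=2000)
preds = lms_predict(x)
rmse = np.sqrt(np.mean((x[-500:] - preds[-500:]) ** 2))
print("prediction RMSE over last 500 samples:", rmse.round(4))
```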

  4. Predicting radiotherapy outcomes using statistical learning techniques

    International Nuclear Information System (INIS)

    El Naqa, Issam; Bradley, Jeffrey D; Deasy, Joseph O; Lindsay, Patricia E; Hope, Andrew J

    2009-01-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to unseen data. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used as examples. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in the prediction of the esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with the other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
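
    The evaluation protocol described, a kernel SVM scored by leave-one-out testing, can be sketched in a few lines. The data below are synthetic and the plain RBF kernel stands in for the authors' modified kernel, so this is an illustration of the protocol only:

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(10)
# toy cohort: a dose-volume metric and one clinical covariate, nonlinear boundary
X = rng.normal(size=(80, 2))
y = ((X[:, 0] ** 2 + X[:, 1] ** 2) > 1.2).astype(int)   # toy complication label

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.2f}")
```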

  5. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    Science.gov (United States)

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, which embodies an enormous amount of information, is a complex multi-component system. Applying mathematical statistics methods to compatibility research on TCM formulae has great significance for promoting the modernization of TCM and for improving the clinical efficacy and optimization of formulae. As tools for quantitative analysis, data inference and exploring the inherent rules of substances, mathematical statistics methods can be used to reveal the working mechanisms of the compatibility of TCM formulae both qualitatively and quantitatively. Reviewing studies based on the application of mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacy, and changes in chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides references for further studying and revealing the working mechanisms and connotations of TCM.

  6. "Statistical Techniques for Particle Physics" (2/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...
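
    The Neyman-Pearson lemma referenced in these lectures states that the likelihood ratio is the optimal test statistic for discriminating two simple hypotheses. A minimal numerical illustration in plain Python (rather than the ROOT/RooFit/RooStats tools used in the course; the two Gaussian hypotheses are invented):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)
# two simple hypotheses for one observable x:
# H0 (background): x ~ N(0, 1);  H1 (signal): x ~ N(1, 1)
x_b = rng.normal(0, 1, 100_000)
x_s = rng.normal(1, 1, 100_000)

def llr(x):
    """Log likelihood ratio log L(H1)/L(H0) -- the Neyman-Pearson statistic."""
    return norm.logpdf(x, 1, 1) - norm.logpdf(x, 0, 1)

cut = np.quantile(llr(x_b), 0.95)        # threshold for a 5% false-positive rate
power = (llr(x_s) > cut).mean()          # signal efficiency at that threshold
print(f"power at 5% size: {power:.3f}")
```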

  7. "Statistical Techniques for Particle Physics" (1/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  8. "Statistical Techniques for Particle Physics" (4/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  9. "Statistical Techniques for Particle Physics" (3/4)

    CERN Multimedia

    CERN. Geneva

    2009-01-01

    This series will consist of four 1-hour lectures on statistics for particle physics. The goal will be to build up to techniques meant for dealing with problems of realistic complexity while maintaining a formal approach. I will also try to incorporate usage of common tools like ROOT, RooFit, and the newly developed RooStats framework into the lectures. The first lecture will begin with a review of the basic principles of probability, some terminology, and the three main approaches towards statistical inference (Frequentist, Bayesian, and Likelihood-based). I will then outline the statistical basis for multivariate analysis techniques (the Neyman-Pearson lemma) and the motivation for machine learning algorithms. Later, I will extend simple hypothesis testing to the case in which the statistical model has one or many parameters (the Neyman Construction and the Feldman-Cousins technique). From there I will outline techniques to incorporate background uncertainties. If time allows, I will touch on the statist...

  10. Statistical Theory of the Vector Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune; Ibrahim, S. R.

    1999-01-01

    ... decays. Due to the speed and/or accuracy of the Vector Random Decrement technique, it was introduced as an attractive alternative to the Random Decrement technique. In this paper, the theory of the Vector Random Decrement technique is extended by applying a statistical description of the stochastic ...

  11. A statistical analysis of Chinese traditional sports science master's degree theses

    Directory of Open Access Journals (Sweden)

    SHEN Wenjuan

    2013-04-01

    Through a statistical analysis of 367 master's degree theses on Chinese traditional sports written in the past five years, several conclusions can be drawn: theses on traditional national sports should expand their theoretical depth; broaden the scope of study, in particular focusing on some disappearing traditional national sports; standardize the research methods used; strengthen the depth of data mining; and correct their references. This can further clarify the patterns in traditional sports graduate theses and provide references for postgraduate training.

  12. Characterization of a Viking Blade Fabricated by Traditional Forging Techniques

    Science.gov (United States)

    Vo, H.; Frazer, D.; Bailey, N.; Traylor, R.; Austin, J.; Pringle, J.; Bickel, J.; Connick, R.; Connick, W.; Hosemann, P.

    2016-12-01

    A team of students from the University of California, Berkeley, participated in a blade-smithing competition hosted by the Minerals, Metals, and Materials Society at the TMS 2015 144th annual meeting and exhibition. Motivated by ancient forging methods, the UC Berkeley team chose to fabricate its blade using historical smithing techniques and naturally-occurring deposits of iron ore. This approach earned the "Best Example of a Traditional Blade Process/Ore Smelting Technique" award for the blade, named "Berkelium." First, iron-enriched sand was collected from local beaches. Magnetite (Fe3O4) was then extracted from the sand and smelted into individual high- and low-carbon steel ingots. Layers of high- and low-carbon steel were forge-welded together, predominantly by hand, to form a composite material. Optical microscopy, energy-dispersive spectroscopy, and Vickers hardness testing were conducted at different stages throughout the blade-making process to evaluate the evolution of microstructure and hardness during formation. It was found that the pre-heat-treated blade microstructure was composed of ferrite and pearlite, and contained many nonmetallic inclusions. A final heat treatment was performed, which increased the average hardness of the blade edge by more than a factor of two, indicating a martensitic transformation.

  13. Fusing Data Mining, Machine Learning and Traditional Statistics to Detect Biomarkers Associated with Depression.

    Directory of Open Access Journals (Sweden)

    Joanna F Dipnall

    Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009-2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers was selected: red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin and the Mexican American/Hispanic group (p = 0.016), and current smokers (p < 0.001). The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling
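
    The screening-then-confirmation structure of the hybrid methodology can be sketched compactly. The version below omits the multiple imputation and survey-weighting steps and uses invented data, so it illustrates the idea rather than reproducing the study's pipeline:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(12)
X = rng.normal(size=(2000, 67))                          # 67 toy "biomarkers"
logit = 0.8 * X[:, 3] - 0.6 * X[:, 10] + 0.5 * X[:, 42]
y = (rng.uniform(size=2000) < 1 / (1 + np.exp(-logit))).astype(int)  # toy label

# step 1 (screening): boosted trees rank candidate biomarkers
gbm = GradientBoostingClassifier(random_state=0).fit(X, y)
top = np.argsort(gbm.feature_importances_)[::-1][:5]
print("screened biomarkers:", top)

# step 2 (confirmation): traditional logistic regression on the screened set
lr = LogisticRegression().fit(X[:, top], y)
print("odds ratios:", np.exp(lr.coef_).round(2))
```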

  14. Fusing Data Mining, Machine Learning and Traditional Statistics to Detect Biomarkers Associated with Depression.

    Science.gov (United States)

    Dipnall, Joanna F; Pasco, Julie A; Berk, Michael; Williams, Lana J; Dodd, Seetal; Jacka, Felice N; Meyer, Denny

    2016-01-01

    Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009-2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers was selected: red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin and the Mexican American/Hispanic group (p = 0.016) and current smokers (p < 0.001). The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting three biomarkers associated with depression for future

  15. Statistical techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR-type fuel plates by applying statistical techniques of process control. The work was divided into four steps, and their data were analyzed: fabrication of U₃O₈ fuel plates; fabrication of U₃Si₂ fuel plates; rolling of small lots of fuel plates; and the application of statistical tools and standard specifications to perform a comparative study of these processes. (author)

  16. The statistical chopper in the time-of-flight technique

    International Nuclear Information System (INIS)

    Albuquerque Vieira, J. de.

    1975-12-01

    A detailed study of the 'statistical' chopper and of the method of analysis of the data obtained with this technique is made. The study includes: the basic ideas behind correlation methods applied to time-of-flight techniques; comparisons with the conventional chopper by means of an analysis of statistical errors; the development of a FORTRAN computer programme to analyse experimental results; a presentation of related fields of work to demonstrate the potential of the method; and suggestions for future study, together with the criteria for a time-of-flight experiment using the method under study [pt
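
    The correlation analysis at the heart of the statistical chopper is easy to demonstrate: chop the beam with a pseudo-random open/closed sequence and recover the time-of-flight spectrum by cross-correlating the detector counts with the sequence; for an ideal maximal-length sequence the recovery is exact up to noise. A self-contained sketch with a toy spectrum (not the FORTRAN programme described above):

```python
import numpy as np

def m_sequence(n=7):
    """Maximal-length 0/1 sequence of period 2**n - 1 from a Fibonacci LFSR;
    the recurrence s[t] = s[t-6] XOR s[t-7] has a primitive characteristic
    polynomial, so the period is the full 127."""
    reg = [1] * n
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(reg[-1])
        reg = [reg[-1] ^ reg[-2]] + reg[:-1]
    return np.array(seq)

s = m_sequence()                   # chopper open/closed pattern (1 = open)
N = len(s)                         # 127 time channels
f_true = np.zeros(N)
f_true[30:40] = np.hanning(10)     # toy time-of-flight spectrum

# detector counts: circular convolution of spectrum with chopper pattern + noise
z = np.real(np.fft.ifft(np.fft.fft(s) * np.fft.fft(f_true)))
z += np.random.default_rng(0).normal(0, 0.05, N)

# correlation analysis: cross-correlate counts with the +/-1 chopper pattern;
# for an m-sequence this yields f scaled by (N + 1) / 2 exactly
a = 2 * s - 1
c = np.array([a @ np.roll(z, -k) for k in range(N)])
f_est = 2 * c / (N + 1)
print("spectrum peak recovered at channel", np.argmax(f_est))
```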

  17. Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art.

    Science.gov (United States)

    Redies, Christoph; Brachmann, Anselm

    2017-01-01

    Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) Traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings), (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle), and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole); 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time.
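
    The first-order edge-orientation entropy can be sketched directly from its description: histogram gradient orientations (weighted by edge strength) and take the Shannon entropy. The binning and weighting below are assumptions, not the authors' exact procedure:

```python
import numpy as np
from scipy import ndimage

def edge_orientation_entropy(img, bins=16):
    """First-order entropy of edge orientations: how uniformly the
    orientations of gradient edges are distributed over the image."""
    gy = ndimage.sobel(img, axis=0)
    gx = ndimage.sobel(img, axis=1)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # orientation in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()                   # max = log2(bins) = 4 bits

rng = np.random.default_rng(13)
print("noise image:     ", edge_orientation_entropy(rng.normal(size=(64, 64))).round(2))
print("vertical stripes:", edge_orientation_entropy(
    np.tile(np.sin(np.linspace(0, 12 * np.pi, 64)), (64, 1))).round(2))
```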

  18. Fusing Data Mining, Machine Learning and Traditional Statistics to Detect Biomarkers Associated with Depression

    Science.gov (United States)

    Dipnall, Joanna F.

    2016-01-01

    Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Study (2009–2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. Results After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers was selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin and the Mexican American/Hispanic group (p = 0.016), and current smokers (p<0.001). Conclusion The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and

  19. Statistical and Economic Techniques for Site-specific Nematode Management.

    Science.gov (United States)

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology, applied in the context of site-specific crop yield response, contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method for site-specific nematode control.
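
    A standard statistic for the neighbor correlations mentioned here is Moran's I, a global measure of spatial autocorrelation. A minimal sketch with binary distance-based weights on synthetic field samples (the cutoff distance and data are illustrative):

```python
import numpy as np

def morans_i(values, coords, cutoff):
    """Moran's I: global spatial autocorrelation among neighboring observations.
    Neighbors are pairs of locations closer than `cutoff`; weights are binary."""
    z = values - values.mean()
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = ((d > 0) & (d < cutoff)).astype(float)
    n, s0 = len(values), w.sum()
    return (n / s0) * (z @ w @ z) / (z @ z)

rng = np.random.default_rng(14)
coords = rng.uniform(0, 100, (200, 2))                        # sample locations in a field
values = np.sin(coords[:, 0] / 15) + rng.normal(0, 0.3, 200)  # spatially clustered counts
print("Moran's I:", morans_i(values, coords, cutoff=10).round(3))
```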

  20. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class-tested at the University of Nevada, the book offers clear explanations of the underlying assumptions, computer simulations, and "Exploring the Concept" boxes that help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  1. The application of statistical techniques to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Annibal, P.S.; Roberts, P.D.

    1990-02-01

    Over the past decade much theoretical research has been carried out on the development of statistical methods for nuclear materials accountancy. In practice plant operation may differ substantially from the idealized models often cited. This paper demonstrates the importance of taking account of plant operation in applying the statistical techniques, to improve the accuracy of the estimates and the knowledge of the errors. The benefits are quantified either by theoretical calculation or by simulation. Two different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an accountancy tank is investigated. Secondly, a means of improving the knowledge of the 'Material Unaccounted For' (the difference between the inventory calculated from input/output data, and the measured inventory), using information about the plant measurement system, is developed and compared with existing general techniques. (author)

  2. exploring traditional glass bead making techniques in jewellery

    African Journals Online (AJOL)

    Glass bead making techniques and their mass production will help the individual ... communicate cultural values in a symbolic language which ... Surface of most of the new beads were rough ... tourism potential to be developed further to.

  3. Endoscopic Radiofrequency Ablation-Assisted Resection of Juvenile Nasopharyngeal Angiofibroma: Comparison with Traditional Endoscopic Technique.

    Science.gov (United States)

    McLaughlin, Eamon J; Cunningham, Michael J; Kazahaya, Ken; Hsing, Julianna; Kawai, Kosuke; Adil, Eelam A

    2016-06-01

    To evaluate the feasibility of radiofrequency surgical instrumentation for endoscopic resection of juvenile nasopharyngeal angiofibroma (JNA) and to test the hypothesis that endoscopic radiofrequency ablation-assisted (RFA) resection will have superior intraoperative and/or postoperative outcomes as compared with traditional endoscopic (TE) resection techniques. Case series with chart review. Two tertiary care pediatric hospitals. Twenty-nine pediatric patients who underwent endoscopic transnasal resection of JNA from January 2000 to December 2014. Twenty-nine patients underwent RFA (n = 13) or TE (n = 16) JNA resection over the 15-year study period. Mean patient age was not statistically different between the 2 groups (P = .41); neither was their University of Pittsburgh Medical Center classification stage (P = .79). All patients underwent preoperative embolization. Mean operative times were not statistically different (P = .29). Mean intraoperative blood loss and the need for a transfusion were also not statistically different (P = .27 and .47, respectively). Length of hospital stay was not statistically different (P = .46). Recurrence rates did not differ between groups (P = .99) over a mean follow-up period of 2.3 years. There were no significant differences between RFA and TE resection in intraoperative or postoperative outcome parameters. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.

  4. Machine learning and statistical techniques : an application to the prediction of insolvency in Spanish non-life insurance companies

    OpenAIRE

    Díaz, Zuleyka; Segovia, María Jesús; Fernández, José

    2005-01-01

    Prediction of insurance company insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques that use financial ratios as explanatory variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of those methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...

  5. ANALYSIS OF TRADITIONAL BUILDING TECHNIQUES AND DAMAGE ASSESSMENT OF TRADITIONAL TURKISH HOUSE

    Directory of Open Access Journals (Sweden)

    Mine Tanac Zeren

    2015-03-01

    The western part of Anatolia is one of the most important regions of the world; many civilizations have lived there since ancient times. Kula is an important historical town dating back to the 17th century that hosts important timber-framed structures (mansions), unique in their spatial organization, architectural features and structural systems. This study creates an analysis model, based on a detailed case study, that defines the structural system and the causes of damage for the upcoming restoration works within the region; this methodology can be applied to other traditional regions as well.

  6. Combining heuristic and statistical techniques in landslide hazard assessments

    Science.gov (United States)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
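
    The paper describes its combination rule only as "based on a normalization of the results of each method". A plausible minimal reading, and the sketch below is strictly that assumption, is to min-max rescale each method's raw scores and average the rescaled maps:

```python
import numpy as np

def normalize(scores):
    """Rescale one method's raw hazard scores to [0, 1]."""
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo)

def combine(*score_maps):
    """Combine several normalized hazard maps into one index."""
    return np.mean([normalize(s) for s in score_maps], axis=0)

rng = np.random.default_rng(15)
heuristic = rng.uniform(0, 5, (50, 50))        # Mora & Vahrson-style scores (toy)
landslide_index = rng.uniform(0, 1, (50, 50))  # landslide index weights (toy)
woe = rng.normal(0, 2, (50, 50))               # weights-of-evidence contrasts (toy)
hazard = combine(heuristic, landslide_index, woe)
print("combined hazard map range:", hazard.min().round(2), "-", hazard.max().round(2))
```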

  7. Precision Learning Assessment: An Alternative to Traditional Assessment Techniques.

    Science.gov (United States)

    Caltagirone, Paul J.; Glover, Christopher E.

    1985-01-01

    A continuous and curriculum-based assessment method, Precision Learning Assessment (PLA), which integrates precision teaching and norm-referenced techniques, was applied to a math computation curriculum for 214 third graders. The resulting districtwide learning curves defining average annual progress through the computation curriculum provided…

  8. Traditional boat-building and navigational techniques of southern Orissa

    Digital Repository Service at National Institute of Oceanography (India)

    Tripati, S.

    of the region. No written records on ancient boatbuilding and navigational aids of the region are available for the reconstruction of the technique of boat-building. Boats of this area have been classified into two categories, namely planked and log boats...

  9. Statistical optimisation techniques in fatigue signal editing problem

    International Nuclear Information System (INIS)

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-01-01

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small-amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelling segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection

  10. Statistical optimisation techniques in fatigue signal editing problem

    Energy Technology Data Exchange (ETDEWEB)

    Nopiah, Z. M.; Osman, M. H. [Fundamental Engineering Studies Unit Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia); Baharin, N.; Abdullah, S. [Department of Mechanical and Materials Engineering Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia, 43600 UKM (Malaysia)

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  11. Application of multivariate statistical techniques in microbial ecology.

    Science.gov (United States)

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, a noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
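
    As a toy illustration of one exploratory technique such reviews cover, the sketch below runs a principal component analysis on a small synthetic site-by-taxon abundance table; the data and settings are invented for illustration.

```python
# Hedged sketch: PCA ordination of a synthetic microbial abundance table.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
abundances = rng.poisson(lam=20, size=(8, 5)).astype(float)  # 8 samples, 5 taxa

# Standardize columns before PCA, as is common for ecological data.
X = (abundances - abundances.mean(axis=0)) / abundances.std(axis=0)

pca = PCA(n_components=2)
scores = pca.fit_transform(X)   # sample ordination coordinates
print("explained variance ratio:", pca.explained_variance_ratio_)
print("first two PC scores:\n", scores)
```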

  12. Statistical methods of evaluating and comparing imaging techniques

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, a need has developed to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted and many reports have appeared in the literature. These studies cover a range of clinical situations. Likewise, the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested.

  13. Statistical precision of delayed-neutron nondestructive assay techniques

    International Nuclear Information System (INIS)

    Bayne, C.K.; McNeany, S.R.

    1979-02-01

    A theoretical analysis of the statistical precision of delayed-neutron nondestructive assay instruments is presented. Such instruments measure the fissile content of nuclear fuel samples by neutron irradiation and delayed-neutron detection. The precision of these techniques is limited by the statistical nature of the nuclear decay process, but the precision can be optimized by proper selection of system operating parameters. Our method is a three-part analysis. We first present differential-difference equations describing the fundamental physics of the measurements. We then derive and present complete analytical solutions to these equations. Final equations governing the expected number and variance of delayed-neutron counts were computer programmed to calculate the relative statistical precision of specific system operating parameters. Our results show that Poisson statistics do not govern the number of counts accumulated in multiple irradiation-count cycles and that, in general, maximum count precision does not correspond with maximum count as first expected. Covariance between the counts of individual cycles must be considered in determining the optimum number of irradiation-count cycles and the optimum irradiation-to-count time ratio. For the assay system in use at ORNL, covariance effects are small, but for systems with short irradiation-to-count transition times, covariance effects force the optimum number of irradiation-count cycles to be half those giving maximum count. We conclude that the equations governing the expected value and variance of delayed-neutron counts have been derived in closed form. These have been computerized and can be used to select optimum operating parameters for delayed-neutron assay devices.
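
    The report's central point, that counts accumulated over multiple irradiation-count cycles are not Poisson because the cycles are correlated, follows from the textbook variance decomposition below; the notation (N_i for the count of cycle i over k cycles) is assumed here, not taken from the report.

```latex
\operatorname{Var}\!\left(\sum_{i=1}^{k} N_i\right)
  = \sum_{i=1}^{k}\operatorname{Var}(N_i)
  + 2\sum_{i<j}\operatorname{Cov}(N_i,N_j),
\qquad \text{whereas pure Poisson counting would require }
\operatorname{Cov}(N_i,N_j)=0 .
```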

  14. Tradition

    DEFF Research Database (Denmark)

    Otto, Ton

    2016-01-01

    : beliefs, practices, institutions, and also things. In this sense, the meaning of the term in social research is very close to its usage in common language and is not always theoretically well developed (see Shils, 1971: 123). But the concept of tradition has also been central to major theoretical debates... on the nature of social change, especially in connection with the notion of modernity. Here tradition is linked to various forms of agency as a factor of both stability and intentional change...

  15. Studies on coal flotation in flotation column using statistical technique

    Energy Technology Data Exchange (ETDEWEB)

    M.S. Jena; S.K. Biswal; K.K. Rao; P.S.R. Reddy [Institute of Minerals & Materials Technology (IMMT), Orissa (India)

    2009-07-01

    Flotation of Indian high ash coking coal fines to obtain clean coal has been reported earlier by many authors. Here an attempt has been made to systematically analyse the factors influencing the flotation process using the statistical design of experiments technique. Studies carried out in a 100 mm diameter column using a factorial design to establish the weightage of factors such as feed rate, air rate and collector dosage indicated that all three parameters have an equal influence on the flotation process. Subsequently, an RSM-CCD design was used to obtain the best result, and it was observed that 94% of combustibles can be recovered with 82.5% weight recovery at 21.4% ash from a feed containing 31.3% ash.
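
    A minimal sketch of the factorial-design step follows: main effects of the three factors estimated from a two-level full factorial run. The coded design is standard; the recovery values are invented for illustration.

```python
# Hedged sketch: main effects from a 2^3 factorial flotation experiment.
import numpy as np
from itertools import product

# Coded levels: -1 (low) and +1 (high) for each of the three factors.
design = np.array(list(product([-1, 1], repeat=3)), dtype=float)
recovery = np.array([72.1, 75.4, 74.0, 78.2, 73.5, 76.9, 75.8, 80.3])  # % (toy)

# Main effect = mean response at the high level minus mean at the low level.
for name, col in zip(("feed rate", "air rate", "collector dosage"), design.T):
    effect = recovery[col > 0].mean() - recovery[col < 0].mean()
    print(f"{name:17s} main effect: {effect:+.2f} % recovery")
```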

  16. Reasoning with data an introduction to traditional and Bayesian statistics using R

    CERN Document Server

    Stanton, Jeffrey M

    2017-01-01

    Engaging and accessible, this book teaches readers how to use inferential statistical thinking to check their assumptions, assess evidence about their beliefs, and avoid overinterpreting results that may look more promising than they really are. It provides step-by-step guidance for using both classical (frequentist) and Bayesian approaches to inference. Statistical techniques covered side by side from both frequentist and Bayesian approaches include hypothesis testing, replication, analysis of variance, calculation of effect sizes, regression, time series analysis, and more. Students also get a complete introduction to the open-source R programming language and its key packages. Throughout the text, simple commands in R demonstrate essential data analysis skills using real-data examples. The companion website provides annotated R code for the book's examples, in-class exercises, supplemental reading lists, and links to online videos, interactive materials, and other resources.

  17. The patient relationship and therapeutic techniques of the South Sotho traditional healer.

    Science.gov (United States)

    Pinkoane, M G; Greeff, M; Williams, M J S

    2005-11-01

    Until 1996 the practice of traditional healers was outlawed in South Africa and not afforded a legal position in the community of health care providers. In 1978 the World Health Organization (WHO) identified traditional healers as people forming an essential core of primary health care workers for rural people in Third World countries. However, in 1994 the new South African government identified traditional healers as forming an essential element of the primary health care workforce. It is estimated that 80% of the black population uses traditional medicine because it is deeply rooted in their culture, which is linked to their religion. The traditional healer shares with the patient a world view which is completely alien to biomedical personnel. Therapeutic techniques typically used in traditional healing conflict with the therapeutic techniques used in biomedicine. The patients' perceptions of traditional healing, and their needs and expectations, may be the driving force behind their continued persistence in consulting a traditional healer, even after these patients may have sought the therapeutic techniques of biomedical personnel. The operation of both systems in the same society creates a problem for both providers and recipients of health care. Confusion then arises, and the consumer consequently chooses the services closer to her. The researcher aimed to investigate the characteristics of the relationship between the traditional healers and the patients, to explore the therapeutic techniques that are used in the South Sotho traditional healing process, and to investigate the views of both the traditional healers and the patients about the South Sotho traditional healing process, in order to facilitate incorporation of the traditional healers in the National Health Care Delivery System. A qualitative research design was followed. Participants were identified by means of a non-probability, purposive voluntary sample. Data was collected by means of a video camera and semi

  18. Traditional Vs. Contemporary Managerial/Cost Accounting Techniques Differences Between Opinions Of Educators And Practitioners

    OpenAIRE

    M. A. Ekbatani; M. A. Sangeladji

    2011-01-01

    Since the mid-1980s and the start of new movements in the field of managerial/cost accounting, a gap has emerged between the opinions of academics and practitioners regarding the degree of usefulness of managerial/cost accounting techniques. It is believed that practitioners generally prefer managerial/cost accounting techniques which are simple, practical and economically applicable. On the other hand, many authors and academics believe that the traditional managerial/cost accounting techniques ar...

  19. Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques

    Science.gov (United States)

    Gulgundi, Mohammad Shahid; Shetty, Amba

    2018-03-01

    Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of Bengaluru city using multivariate statistical techniques. A water quality index rating was calculated for the pre- and post-monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lower pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% of the seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% of the spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by the dissolution of minerals from rock-water interactions in the aquifer, the effect of anthropogenic activities and ion exchange processes in water.
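
    The hierarchical clustering step can be sketched as below (Ward linkage on standardized parameters is a common choice; the study's exact settings are not stated here, and the data are synthetic stand-ins for the 67 sites by 14 parameters).

```python
# Hedged sketch: group monitoring sites into two clusters by water quality.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(7)
sites = rng.normal(size=(67, 14))                 # synthetic quality parameters
Z = linkage(zscore(sites, axis=0), method="ward") # Ward linkage on z-scores
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into 2 clusters
print("sites per cluster:", np.bincount(labels)[1:])
```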

  20. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    Science.gov (United States)

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of an observation-based modeling approach using a genetic algorithm with regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting were taken for this study with the aim of understanding the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting and also of optimizing the two variables for maximum result. Observation-based modelling, as well as the traditional approach, could identify NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and was successful in identifying the complex non-linear interaction among the variables, with minimal preliminary data. The presence of charcoal in the culture medium has a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols, as these models have significant potential for saving time and expenditure in plant tissue culture laboratories, and it further reduces the need for a specialised background.
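
    The symbolic-regression step could be sketched as below using the third-party gplearn package; this is an assumption for illustration (the paper's software is not named here), and the factor ranges and response values are invented.

```python
# Hedged sketch: evolve a symbolic expression for a rooting response.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(3)
X = rng.uniform([0.0, 0.0], [2.0, 1.0], size=(30, 2))  # charcoal g/L, NAA mg/L
y = 5 + 3 * X[:, 1] - 2 * X[:, 0] * X[:, 1] + rng.normal(0, 0.2, 30)  # toy response

model = SymbolicRegressor(population_size=1000, generations=15,
                          function_set=("add", "sub", "mul", "div"),
                          random_state=0)
model.fit(X, y)
print(model._program)   # best evolved expression, ideally recovering an X0*X1 term
```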

  1. Statistical and particle physics: Common problems and techniques

    International Nuclear Information System (INIS)

    Bowler, K.C.; Mc Kane, A.J.

    1984-01-01

    These proceedings contain statistical mechanical studies in condensed matter physics; interfacial problems in statistical physics; string theory; general Monte Carlo methods and their application to lattice gauge theories; topological excitations in field theory; phase transformation kinetics; and studies of chaotic systems.

  2. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available This scientific article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. The technique of the statistical analysis is defined, the stages of the analysis are described, and the mathematical-statistical tools are considered.

  3. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    3. Statistical Processing. 3.1 Statistical Analysis. Statistical analysis is the mathematical science ... quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques are mainly used for alarm ... Balakrishnan N, editors. Handbook of Statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555–602 (order statistics and their applications

  4. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jianning Wu

    2015-01-01

    Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify the small but significant differences between the lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in elderly people in clinical diagnosis.
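
    The core idea, training a binary classifier to separate left-limb from right-limb gait variables and reading high separability as asymmetry, can be sketched as follows; the features are synthetic stand-ins, not the study's force-platform data.

```python
# Hedged sketch: SVM-based left/right gait separability as an asymmetry score.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
left = rng.normal(loc=0.00, scale=1.0, size=(60, 6))   # 60 strides, 6 features
right = rng.normal(loc=0.15, scale=1.0, size=(60, 6))  # slight asymmetry
X = np.vstack([left, right])
y = np.array([0] * 60 + [1] * 60)                      # 0 = left, 1 = right

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
accuracy = cross_val_score(clf, X, y, cv=5).mean()
# Accuracy near 0.5 suggests symmetric gait; well above 0.5 suggests asymmetry.
print(f"cross-validated left/right separability: {accuracy:.2f}")
```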

  5. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    Science.gov (United States)

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method is designed based on an advanced statistical learning algorithm, the support vector machine for binary classification, and is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize the right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify the small but significant differences between the lower limbs when compared to the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in elderly people in clinical diagnosis.

  6. Statistical techniques for sampling and monitoring natural resources

    Science.gov (United States)

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  7. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  8. Comparative Accuracy of Facial Models Fabricated Using Traditional and 3D Imaging Techniques.

    Science.gov (United States)

    Lincoln, Ketu P; Sun, Albert Y T; Prihoda, Thomas J; Sutton, Alan J

    2016-04-01

    The purpose of this investigation was to compare the accuracy of facial models fabricated using facial moulage impression methods with three-dimensional printed (3DP) fabrication methods using soft tissue images obtained from cone beam computed tomography (CBCT) and 3D stereophotogrammetry (3D-SPG) scans. A reference phantom model was fabricated using a 3D-SPG image of a human control form with ten fiducial markers placed on common anthropometric landmarks. This image was converted into the investigation control phantom model (CPM) using 3DP methods. The CPM was attached to a camera tripod for ease of image capture. Three CBCT and three 3D-SPG images of the CPM were captured. The DICOM and STL files from the three 3dMD and three CBCT scans were imported for 3DP, and six testing models were made. Reversible hydrocolloid and dental stone were used to make three facial moulages of the CPM, and the impressions/casts were poured in type IV gypsum dental stone. A coordinate measuring machine (CMM) was used to measure the distances between each of the ten fiducial markers. Each measurement was made using one point as a static reference to the other nine points. The same measuring procedures were accomplished on all specimens. All measurements were compared between specimens and the control. The data were analyzed using ANOVA and Tukey pairwise comparison of the raters, methods, and fiducial markers. The ANOVA multiple comparisons showed a significant difference among the three methods (p < 0.05). Models fabricated using 3D-SPG showed a statistical difference in comparison to the models fabricated using the traditional method of facial moulage and 3DP models fabricated from CBCT imaging. 3DP models fabricated using 3D-SPG were less accurate than the CPM and models fabricated using facial moulage and CBCT imaging techniques. © 2015 by the American College of Prosthodontists.

  9. Statistical sampling techniques as applied to OSE inspections

    International Nuclear Information System (INIS)

    Davis, J.J.; Cote, R.W.

    1987-01-01

    The need has been recognized for statistically valid methods for gathering information during OSE inspections and for the interpretation of results, both from performance testing and from records reviews, interviews, etc. Battelle Columbus Division, under contract to DOE OSE, has performed and is continuing to perform work in the area of statistical methodology for OSE inspections. This paper presents some of the sampling methodology currently being developed for use during OSE inspections. Topics include population definition, sample size requirements, level of confidence, and practical logistical constraints associated with the conduct of an inspection based on random sampling. Sequential sampling schemes and sampling from finite populations are also discussed. The methods described are applicable to various data-gathering activities, ranging from the sampling and examination of classified documents to the sampling of Protective Force security inspectors for skill testing.
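
    One calculation such inspections rely on is the sample size needed to estimate a compliance proportion at a given confidence level. The sketch below uses the standard formula with a finite-population correction; the numbers are illustrative, not from the paper.

```python
# Hedged sketch: sample size for estimating a proportion during an inspection.
import math

def sample_size(N, p=0.5, margin=0.05, z=1.96):
    """Items to sample from a population of N to estimate a proportion p within
    +/- margin at ~95% confidence (z = 1.96), with finite-population correction."""
    n0 = z**2 * p * (1 - p) / margin**2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / N)           # finite-population correction
    return math.ceil(n)

# e.g., sampling from a set of 500 classified documents:
print(sample_size(N=500))   # -> 218
```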

  10. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)


    This study shows the potency of two GIS-based data-driven bivariate techniques, namely ... In view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-

  11. Traditional Lecture Versus an Activity Approach for Teaching Statistics: A Comparison of Outcomes

    OpenAIRE

    Loveland, Jennifer L.

    2014-01-01

    Many educational researchers have proposed teaching statistics with less lecture and more active learning methods. However, there are only a few comparative studies that have taught one section of statistics with lectures and one section with activity-based methods; of those studies, the results are contradictory. To address the need for more research on the actual effectiveness of active learning methods in introductory statistics, this research study was undertaken. An introductory, univ...

  12. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    Science.gov (United States)

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…

  13. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminry Report

    Energy Technology Data Exchange (ETDEWEB)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two

  14. Territories typification technique with use of statistical models

    Science.gov (United States)

    Galkin, V. I.; Rastegaev, A. V.; Seredin, V. V.; Andrianov, A. V.

    2018-05-01

    Territories typification is required for the solution of many problems. The results of geological zoning obtained by means of various methods do not always agree. That is why the main goal of this research is to develop a technique for obtaining a multidimensional standard classified indicator for geological zoning. In the course of the research, a probabilistic approach was used. In order to increase the reliability of geological information classification, the authors suggest using the complex multidimensional probabilistic indicator P_K as a criterion of the classification. The second criterion chosen is the multidimensional standard classified indicator Z. These can serve as characteristics of classification in geological-engineering zoning. The above-mentioned indicators P_K and Z are in good correlation. Correlation coefficient values for the entire territory, regardless of structural solidity, equal r = 0.95, so each indicator can be used in geological-engineering zoning. The method suggested has been tested and a schematic zoning map has been drawn.

  15. Application of Statistical Potential Techniques to Runaway Transport Studies

    International Nuclear Information System (INIS)

    Eguilior, S.; Castejon, F.; Parrondo, J. M.

    2001-01-01

    A method for computing the runaway production rate, based on techniques of noise-activated escape in a potential, is presented in this work. A generalised potential in 2D momentum space is obtained from the deterministic or drift terms of the Langevin equations. The diffusive or stochastic terms, which arise directly from the stochastic nature of collisions, play the role of the noise that activates barrier crossings. The runaway electron source is given by the escape rate in such a potential, which is obtained from an Arrhenius-like relation. Runaway electrons are those that cross the potential barrier due to the effect of stochastic collisions. In terms of computation time, this method allows one to quickly obtain the source term for a runaway electron transport code. (Author) 11 refs
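
    The Arrhenius-like relation referred to above has the generic noise-activated-escape form shown below; the symbols (ΔΦ for the generalized potential barrier height and D for the collisional noise strength) are assumed here for illustration.

```latex
S \;\propto\; \exp\!\left(-\frac{\Delta\Phi}{D}\right),
```

    where the runaway source S grows as the collisional noise D increases relative to the barrier height.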

  16. Statistical techniques for the identification of reactor component structural vibrations

    International Nuclear Information System (INIS)

    Kemeny, L.G.

    1975-01-01

    The identification, on-line and in near real-time, of the vibration frequencies, modes and amplitudes of selected key reactor structural components, and the visual monitoring of these phenomena by nuclear power plant operating staff, will serve to further the safety and control philosophy of nuclear systems and lead to design optimisation. The School of Nuclear Engineering has developed a data acquisition system for vibration detection and identification. The system is interfaced with the HIFAR research reactor of the Australian Atomic Energy Commission. The reactor serves to simulate noise and vibrational phenomena which might be pertinent in power reactor situations. The data acquisition system consists of a small computer interfaced with a digital correlator and a Fourier transform unit. An incremental tape recorder is utilised as a backing store and as a means of communication with other computers. A small analogue computer and an analogue statistical analyzer can be used in the pre- and post-computational analysis of signals which are received from neutron and gamma detectors, thermocouples, accelerometers, hydrophones and strain gauges. Investigations carried out to date include a study of the role of local and global pressure fields due to turbulence in coolant flow and pump-impeller-induced perturbations on (a) control absorber, (b) fuel element and (c) coolant external circuit and core tank structure component vibrations. (Auth.)

  17. Statistical mechanics of sensing and communications: Insights and techniques

    International Nuclear Information System (INIS)

    Murayama, T; Davis, P

    2008-01-01

    In this article we review a basic model for analysis of large sensor networks from the point of view of collective estimation under bandwidth constraints. We compare different sensing aggregation levels as alternative 'strategies' for collective estimation: moderate aggregation from a moderate number of sensors for which communication bandwidth is enough that data encoding can be reversible, and large scale aggregation from very many sensors - in which case communication bandwidth constraints require the use of nonreversible encoding. We show the non-trivial trade-off between sensing quality, which can be increased by increasing the number of sensors, and communication quality under bandwidth constraints, which decreases if the number of sensors is too large. From a practical standpoint, we verify that such a trade-off exists in constructively defined communications schemes. We introduce a probabilistic encoding scheme and define rate distortion models that are suitable for analysis of the large network limit. Our description shows that the methods and ideas from statistical physics can play an important role in formulating effective models for such schemes

  18. Statistical classification techniques in high energy physics (SDDT algorithm)

    International Nuclear Information System (INIS)

    Bouř, Petr; Kůs, Václav; Franc, Jiří

    2016-01-01

    We present our proposal of the supervised binary divergence decision tree with nested separation method based on the generalized linear models. A key insight we provide is the clustering driven only by a few selected physical variables. The proper selection consists of the variables achieving the maximal divergence measure between two different classes. Further, we apply our method to Monte Carlo simulations of physics processes corresponding to a data sample of top quark-antiquark pair candidate events in the lepton+jets decay channel. The data sample is produced in pp̅ collisions at √s = 1.96 TeV. It corresponds to an integrated luminosity of 9.7 fb⁻¹ recorded with the D0 detector during Run II of the Fermilab Tevatron Collider. The efficiency of our algorithm achieves 90% AUC in separating signal from background. We also briefly deal with the modification of statistical tests applicable to weighted data sets in order to test homogeneity of the Monte Carlo simulations and measured data. The justification of these modified tests is proposed through the divergence tests. (paper)

  19. Application of metabonomic analytical techniques in the modernization and toxicology research of traditional Chinese medicine.

    Science.gov (United States)

    Lao, Yong-Min; Jiang, Jian-Guo; Yan, Lu

    2009-08-01

    In recent years, a wide range of metabonomic analytical techniques has come into use in modern research on traditional Chinese medicine (TCM). At the same time, the international community has attached increasing importance to TCM toxicity problems, and many studies have been carried out to investigate the toxicity mechanisms of TCM. Among these studies, many metabonomic-based methods have been implemented to facilitate TCM toxicity investigation. At present, the most prevalent methods for TCM toxicity research are single-analysis techniques using only one analytical means. These techniques include nuclear magnetic resonance (NMR), gas chromatography-mass spectrometry (GC-MS), and liquid chromatography-mass spectrometry (LC-MS); with these techniques, some favourable outcomes have been gained in studies of the toxic reactions of TCM, such as assays of the target organs of action, the establishment of action patterns, the elucidation of action mechanisms and the exploration of the material foundation of action. However, every analytical technique has its advantages and drawbacks, and no existing analytical technique can be fully versatile. Multi-analysis techniques can partially overcome the shortcomings of single-analysis techniques. The combination of GC-MS and LC-MS metabolic profiling approaches has unravelled the pathological outcomes of aristolochic acid-induced nephrotoxicity, which could not be achieved by single-analysis techniques. It is believed that with the further development of metabonomic analytical techniques, especially multi-analysis techniques, metabonomics will greatly promote TCM toxicity research and benefit the modernization of TCM in terms of extending the application of modern means in TCM safety assessment, assisting the formulation of TCM safety norms and establishing international standard indicators.

  20. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
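
    The Bayes-theorem expression the abstract alludes to can be written as below; the notation is assumed here: π is the population probability of copying, and f₁, f₀ are the densities of the answer-copying statistic T under copying and under no copying.

```latex
P(\text{copying} \mid T = t)
  = \frac{\pi\, f_1(t)}{\pi\, f_1(t) + (1-\pi)\, f_0(t)} .
```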

  1. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g
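
    As a minimal sketch of the kind of power calculation the book covers, the snippet below uses the statsmodels package to find the per-group sample size for a two-sample t-test; the effect size, alpha and power values are illustrative.

```python
# Hedged sketch: traditional power analysis for a two-sample t-test.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n = analysis.solve_power(effect_size=0.5,  # Cohen's d (medium effect)
                         alpha=0.05,       # two-sided significance level
                         power=0.80)       # desired statistical power
print(f"required sample size per group: {n:.1f}")   # about 64
```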

  2. Characterization of Natural Dyes and Traditional Korean Silk Fabric by Surface Analytical Techniques

    Directory of Open Access Journals (Sweden)

    Yeonhee Lee

    2013-05-01

    Full Text Available Time-of-flight secondary ion mass spectrometry (TOF-SIMS) and X-ray photoelectron spectroscopy (XPS) are well established surface techniques that provide both elemental and organic information from several monolayers of a sample surface, while also allowing depth profiling or image mapping to be carried out. Static TOF-SIMS, with its improved performance, has expanded the application of TOF-SIMS to the study of a variety of organic, polymeric and biological materials. In this work, TOF-SIMS, XPS and Fourier Transform Infrared (FTIR) measurements were used to characterize commercial natural dyes and traditional silk fabric dyed with plant-extract dyes, avoiding the time-consuming and destructive extraction procedures necessary for the spectrophotometric and chromatographic methods previously used. Silk textiles dyed with plant extracts were then analyzed for chemical and functional group identification of their dye components and mordants. TOF-SIMS spectra for the dyed silk fabric showed element ions from metallic mordants, and specific fragment ions and molecular ions from plant-extracted dyes. The results of TOF-SIMS, XPS and FTIR are very useful as a reference database for comparison with data about traditional Korean silk fabric and to provide an understanding of traditional dyeing materials. Therefore, this study shows that surface techniques are useful for micro-destructive analysis of plant-extracted dyes and Korean dyed silk fabric.

  3. Characterization of Natural Dyes and Traditional Korean Silk Fabric by Surface Analytical Techniques

    Science.gov (United States)

    Lee, Jihye; Kang, Min Hwa; Lee, Kang-Bong; Lee, Yeonhee

    2013-01-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) and X-ray photoelectron spectroscopy (XPS) are well established surface techniques that provide both elemental and organic information from several monolayers of a sample surface, while also allowing depth profiling or image mapping to be carried out. Static TOF-SIMS, with its improved performance, has expanded the application of TOF-SIMS to the study of a variety of organic, polymeric and biological materials. In this work, TOF-SIMS, XPS and Fourier Transform Infrared (FTIR) measurements were used to characterize commercial natural dyes and traditional silk fabric dyed with plant-extract dyes, avoiding the time-consuming and destructive extraction procedures necessary for the spectrophotometric and chromatographic methods previously used. Silk textiles dyed with plant extracts were then analyzed for chemical and functional group identification of their dye components and mordants. TOF-SIMS spectra for the dyed silk fabric showed element ions from metallic mordants, and specific fragment ions and molecular ions from plant-extracted dyes. The results of TOF-SIMS, XPS and FTIR are very useful as a reference database for comparison with data about traditional Korean silk fabric and to provide an understanding of traditional dyeing materials. Therefore, this study shows that surface techniques are useful for micro-destructive analysis of plant-extracted dyes and Korean dyed silk fabric. PMID:28809257

  4. Online, Instructional Television and Traditional Delivery: Student Characteristics and Success Factors in Business Statistics

    Science.gov (United States)

    Dotterweich, Douglas P.; Rochelle, Carolyn F.

    2012-01-01

    Distance education has surged in recent years while research on student characteristics and factors leading to successful outcomes has not kept pace. This study examined characteristics of regional university students in undergraduate Business Statistics and factors linked to their success based on three modes of delivery - Online, Instructional…

  5. Combining traditional dietary assessment methods with novel metabolomics techniques: present efforts by the Food Biomarker Alliance

    DEFF Research Database (Denmark)

    Brouwer-Brolsma, Elske M; Brennan, Lorraine; Drevon, Christian A

    2017-01-01

    food metabolomics techniques that allow the quantification of up to thousands of metabolites simultaneously, which may be applied in intervention and observational studies. As biomarkers are often influenced by various other factors than the food under investigation, FoodBAll developed a food intake... in these metabolomics studies, knowledge about available electronic metabolomics resources is necessary and further developments of these resources are essential. Ultimately, present efforts in this research area aim to advance quality control of traditional dietary assessment methods, advance compliance evaluation...

  6. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach.

    Science.gov (United States)

    Park, Jihong; Seeley, Matthew K; Francom, Devin; Reese, C Shane; Hopkins, J Ty

    2017-12-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe functional data analysis and the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.
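
    A much-simplified stand-in for the functional approach is the pointwise comparison below: instead of testing one discrete peak, the two conditions are compared at every increment of the stance phase. The waveforms are synthetic; a full FANOVA would additionally use basis expansions and functional F-tests.

```python
# Hedged sketch: pointwise comparison of two gait-waveform conditions.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(9)
t = np.linspace(0, 100, 101)                 # % of stance phase
base = 20 * np.sin(np.pi * t / 100)          # nominal joint angle (deg, toy)
control = base + rng.normal(0, 2, size=(19, 101))
effusion = base + 3 * np.exp(-((t - 50) / 15) ** 2) + rng.normal(0, 2, size=(19, 101))

# Paired t-test at each 1% increment; a band of low p-values marks where in
# the cycle the conditions differ (mid-stance, in this toy example).
p = np.array([ttest_rel(control[:, i], effusion[:, i]).pvalue for i in range(101)])
print("stance % where p < 0.05:", t[p < 0.05].astype(int))
```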

  7. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    Directory of Open Access Journals (Sweden)

    Park Jihong

    2017-12-01

    Full Text Available In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe functional data analysis and the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  8. Fusing Data Mining, Machine Learning and Traditional Statistics to Detect Biomarkers Associated with Depression

    OpenAIRE

    Dipnall, Joanna F.; Pasco, Julie A.; Berk, Michael; Williams, Lana J.; Dodd, Seetal; Jacka, Felice N.; Meyer, Denny

    2016-01-01

    Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted reg...

  9. An automated technique to identify potential inappropriate traditional Chinese medicine (TCM) prescriptions.

    Science.gov (United States)

    Yang, Hsuan-Chia; Iqbal, Usman; Nguyen, Phung Anh; Lin, Shen-Hsien; Huang, Chih-Wei; Jian, Wen-Shan; Li, Yu-Chuan

    2016-04-01

    Medication errors such as potentially inappropriate prescriptions can induce serious adverse drug events in patients. Information technology has the ability to prevent medication errors; however, the pharmacology of traditional Chinese medicine (TCM) is not as clear as in Western medicine. The aim of this study was to apply the appropriateness of prescription (AOP) model to identify potentially inappropriate TCM prescriptions. We used association rule mining techniques to analyze 14.5 million prescriptions from the Taiwan National Health Insurance Research Database. The disease-TCM (DTCM) and traditional Chinese medicine-traditional Chinese medicine (TCMM) associations are computed from their co-occurrence, and the associations' strength was measured as Q-values, which are often referred to as interestingness or lift values. By considering the number of Q-values, the AOP model was applied to identify the inappropriate prescriptions. Afterwards, three traditional Chinese physicians evaluated 1920 prescriptions and validated the detected outcomes from the AOP model. Out of 1920 prescriptions, the system showed a positive predictive value of 97.1% and a negative predictive value of 19.5% as compared with the experts' evaluations. The sensitivity analysis indicated that the negative predictive value could improve up to 27.5% when the model's threshold was changed to 0.4. We successfully applied the AOP model to automatically identify potentially inappropriate TCM prescriptions. This model could serve as a potential TCM clinical decision support system to improve drug safety and quality of care. Copyright © 2016 John Wiley & Sons, Ltd.
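
    The co-occurrence "interestingness" measure can be sketched as the standard lift of an association rule, shown below; the AOP model's exact Q-value definition may differ, and the counts are invented for illustration.

```python
# Hedged sketch: lift of a disease-herb association from co-occurrence counts.
def lift(n_joint, n_a, n_b, n_total):
    """lift(A,B) = P(A and B) / (P(A) * P(B)); values > 1 mean A and B co-occur
    more often than expected under independence."""
    p_joint = n_joint / n_total
    return p_joint / ((n_a / n_total) * (n_b / n_total))

# e.g., a diagnosis in 40,000 of 14.5M prescriptions, an herb in 250,000,
# and the pair together in 12,000 (illustrative counts):
print(f"lift = {lift(12_000, 40_000, 250_000, 14_500_000):.1f}")   # ~17.4
```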

  10. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.; Harrington, Heather A.; Stumpf, Michael P. H.; Byrne, Helen M.

    2015-01-01

    mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since

  11. Learning from tradition. Construction techniques and repair of historical buildings in the area of Brescia: some examples

    Directory of Open Access Journals (Sweden)

    Barbara Scala

    2016-06-01

    The essay focuses on the area of Brescia where, in spite of significant transformations of buildings and territory over time, many examples of traditional architecture still exist. The aim of the paper is also to suggest, by presenting some case studies, a strategy for conservation which proposes a dialogue between traditional methods, technological innovations and the economic sustainability of interventions. Keywords: Traditional architecture, Brescia, Sustainability, Construction techniques, Protection

  12. RNA interference: a promising technique for the improvement of traditional crops.

    Science.gov (United States)

    Katoch, Rajan; Thakur, Neelam

    2013-03-01

    RNA interference (RNAi) is a homology-dependent gene-silencing technology that involves double-stranded RNA directed against a target gene. This technique has emerged as a powerful tool for understanding the functions of a number of genes in recent years. Conventional breeding methods have not been completely successful in achieving the tissue-specific regulation of some genes needed to improve the nutritional status of plants and reduce their levels of antinutrients. RNAi has shown successful results in a number of plant species for nutritional improvement, change in morphology and alteration in metabolite synthesis. This technology has been applied mostly in the genetic engineering of important crop plants, and to date there are no reports of its application for the improvement of traditional/underutilized crops. In this study, we discuss current knowledge of RNAi function and concepts and strategies for the improvement of traditional crops. Practical application: Although RNAi has been extensively used for the improvement of popular crops, no attention has been given to the use of this technology for the improvement of underutilized crops. This study describes the importance of using this technology for the improvement of underutilized crops.

  13. Characterization of clays used in the fabrication of traditional brazilian ceramic pans: culture and technique

    International Nuclear Information System (INIS)

    Borlini, Monica Castoldi; Aguiar, Mariane Costalonga de; Vieira, Carlos Mauricio Fontes; Monteiro, Sergio Neves

    2009-01-01

    The fabrication process of clay pans in the state of Espirito Santo, southeast Brazil, is a recognized part of the country's popular culture. In Goiabeiras, a district of the state capital Vitoria, the traditional production of these pans is the source of income for many families. The technique used for these ceramic pans is of indigenous origin, characterized by manual molding, outdoor burning and the application of tannin dye. The clay pans are distributed to several Brazilian states and are nowadays conquering the external market. In producing these pans, two types of clay, yellow and gray, are used. The raw material currently comes from the deposit of the Mulemba valley, where there is concern about the possibility of exhaustion. The objective of this study was therefore to characterize these two types of clay and so contribute to the continuity of traditional clay pan production, by establishing the characteristics of the local clays in case of an eventual need for their replacement. Chemical analysis, X-ray diffraction, particle size distribution, plasticity and thermal analysis of the clays were performed. The results showed that the clays are high-plasticity kaolinite with considerable amounts of SiO2 and Al2O3, as well as of alkaline oxides, earth alkaline oxides and Fe2O3. (author)

  14. Comparison of Student Test Scores in a Coordinate Plane Unit Using Traditional Classroom Techniques Versus Traditional Techniques Coupled with an Ethnomathematics Software at Torch Middle School.

    Science.gov (United States)

    Magallanes, Adriana Moreno

    In response to low achievement in mathematics at a middle school, an ethnomathematic approach was used to teach coordinate planes. Whether there were achievement differences between students taught by the culturally sensitive approach and those taught by a traditional method was studied. Data were collected from the coordinate planes unit…

  15. Prepubertal gonadectomy in cats: different surgical techniques and comparison with gonadectomy at traditional age.

    Science.gov (United States)

    Porters, N; Polis, I; Moons, C; Duchateau, L; Goethals, K; Huyghe, S; de Rooster, H

    2014-09-06

    Feasibility, surgical time and complications of different surgical techniques for prepubertal gonadectomy (PPG; 8-12 weeks of age) in cats were studied and compared to gonadectomy at traditional age (TAG; 6-8 months of age). Kittens were randomly assigned to PPG or TAG. Ovarian pedicle haemostasis for PPG was achieved by ligatures (n=47), vascular clips (n=50), bipolar electrocoagulation (n=50), or pedicle tie (n=50); for TAG (n=34) ligatures were used. In male cats, PPG consisted of closed castration by spermatic cord knot (n=92) or ligature (n=91) while TAG (n=34) was an open castration by spermatic cord knot. A linear (surgical time) and a logistic regression (complications) model were designed. Significance was set at 0.05. For female PPG, clips and coagulation were the fastest procedures; placement of ligatures was most time-consuming. In male PPG, knot placement was significantly faster than ligation. In both sexes, very few intraoperative or wound complications were observed, irrespective of the surgical technique used. Surgical times in females (ligatures) as well as in males (knot) were significantly shorter for PPG than for TAG. PPG was as safe as TAG, yet took less time to perform and did not result in a greater rate of postoperative complications. British Veterinary Association.

  16. CHINESE LACQUER – SHORT OVERVIEW OF TRADITIONAL TECHNIQUES AND MATERIAL CHARACTERISATION FOR CONSERVATION PURPOSES

    Directory of Open Access Journals (Sweden)

    Xin You LIU

    2014-12-01

    Chinese lacquer (urushi) is an ancient natural finishing material obtained from the sap of the lacquer tree (Rhus vernicifera). It has been used for millennia to protect and decorate furniture and various artefacts made of wood or other materials. Lacquered objects are important components of Chinese and world cultural heritage, and their conservation requires good knowledge and understanding of the material and of the traditional techniques. This paper presents basic information on Chinese lacquer as a material and on lacquering techniques in their historic evolution through different dynasties. The experimental part examined the physical properties of raw urushi lacquer, namely aspect, solids content and miscibility with different solvents. A limited compatibility with white spirit as a potential thinner was demonstrated. Furthermore, the microstructure of the cured film and the characteristic chemical features of raw urushi lacquer, as liquid and as cured film, were investigated. The cured film of raw urushi presents a characteristic microstructural pattern. FTIR spectroscopy revealed partial oxidation and polymerization following processing by Kuromisation. The further curing accentuated these changes, visible as a decrease of the hydroxyl absorption bands (3400 and 1360 cm-1) occurring in parallel with an increase of the carbonyl band (1740 cm-1). A strong decrease of the absorption band at 1270 cm-1, which nearly disappears for the cured film, was also observed.

  17. In vitro evaluation of prosthodontic impression on natural dentition: a comparison between traditional and digital techniques

    Science.gov (United States)

    MALAGUTI, G.; ROSSI, R.; MARZIALI, B.; ESPOSITO, A.; BRUNO, G.; DARIOL, C.; DI FIORE, A.

    2016-01-01

    SUMMARY Objectives The aim of this in vitro study was to evaluate the marginal and internal fit of zirconia core crowns manufactured following different digital and traditional workflows. Methods An abutment tooth prepared with a 6° taper and shoulder margin was used to produce 20 zirconia core crowns using four different acquisition techniques: direct scanning with an extraoral lab scanner, scanning with an intraoral scanner, a dental impression taken with an individual tray and polyether, and a dental cast poured from a polyether impression. Marginal and internal fits were evaluated with digital photography and the silicone replica method. Results Mean marginal gaps were 76.0 μm ± 28.9 for the extraoral lab scanner, 80.5 μm ± 36.2 for the intraoral scanner, 88.1 μm ± 34.8 for the dental impression scan and 112.4 μm ± 37.2 for the dental cast scan. Mean internal gaps were 23.2 μm ± 10.3 for the extraoral lab scanner, 16.2 μm ± 8.3 for the intraoral scanner, 27.2 μm ± 16.7 for the dental impression scan and 30.2 μm ± 12.7 for the dental cast scan. Conclusion Internal gaps were well below the 70 μm described in the literature. Marginal fit was larger than ideal values for all the techniques but within the limit of clinical success. Intraoral scanners obtained the best results for internal gap. PMID:28280529

  18. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the…

  19. Statistical evaluation of recorded knowledge in nuclear and other instrumental analytical techniques

    International Nuclear Information System (INIS)

    Braun, T.

    1987-01-01

    The main points addressed in this study are the following: Statistical distribution patterns of published literature on instrumental analytical techniques 1981-1984; structure of scientific literature and heuristics for identifying active specialities and emerging hot spot research areas in instrumental analytical techniques; growth and growth rates of the literature in some of the identified hot research areas; quality and quantity in instrumental analytical research output. (orig.)

  20. Improvement of traditional local rice varieties through induced mutations using nuclear techniques

    International Nuclear Information System (INIS)

    Pham Van Ro; Do Huu At

    2001-01-01

    'Improvement of local rice varieties for high yield, resistance to diseases and insect pests (brown planthopper and rice blast) and export quality through induced mutations for the Mekong Delta' started in 1993. After six years, its effects were visible in the field in the Mekong Delta (MD) as well as elsewhere in the south of Vietnam. TNDB-100 is a widely adapted, yield-stable variety. THDB is suitable for deepwater rice regions and coastal areas, where rice cultivation is affected by acid sulphate soils and salinity. Both varieties are good examples of the method. Thanks to good cooperation with provincial extension centers, hundreds of extension classes were organized to recommend the varieties to farmers, and thanks to strong support from the IAEA, nearly 400,000 ha of TNDB-100 are now grown in the south of Vietnam, as well as nearly 15,000 ha of THDB in the coastal and rainfed lowland rice areas of the region. To continue rice improvement by this technique, seeds of six traditional local varieties were exposed to different doses of gamma rays to create new mutants. At present, hundreds of improved breeding lines have been selected, and a dozen uniform lines have been isolated and entered into yield trials as well as the regional testing program. From these, improved varieties will be selected to contribute to rice cultivation in the south of Vietnam in the coming years. (author)

  1. Fungal disease detection in plants: Traditional assays, novel diagnostic techniques and biosensors.

    Science.gov (United States)

    Ray, Monalisa; Ray, Asit; Dash, Swagatika; Mishra, Abtar; Achary, K Gopinath; Nayak, Sanghamitra; Singh, Shikha

    2017-01-15

    Fungal diseases in commercially important plants result in a significant reduction in both quality and yield, often leading to the loss of an entire plant. In order to minimize the losses, it is essential to detect and identify the pathogens at an early stage. Early detection and accurate identification of pathogens can control the spread of infection. The present article provides a comprehensive overview of conventional methods, current trends and advances in fungal pathogen detection, with an emphasis on biosensors. Traditional techniques are the "gold standard" in fungal detection; they rely on symptoms, culture-based methods, morphological observation and biochemical identification. In recent times, with the advancement of biotechnology, molecular and immunological approaches have revolutionized fungal disease detection, but these methods require specific and expensive equipment. Thus, there is an urgent need for rapid, reliable, sensitive, cost-effective and easy-to-use diagnostic methods for fungal pathogen detection. Biosensors are a promising and attractive alternative, but they still have to undergo some modifications, improvements and proper validation for on-field use. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Improvement of traditional local rice varieties through induced mutations using nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pham Van Ro; Do Huu At [Cuu Long Delta Rice Research Institute (Viet Nam)]

    2001-03-01

    'Improvement of local rice varieties for high yield, resistance to diseases and insect pests (brown planthopper and rice blast) and export quality through induced mutations for the Mekong Delta' started in 1993. After six years, its effects were visible in the field in the Mekong Delta (MD) as well as elsewhere in the south of Vietnam. TNDB-100 is a widely adapted, yield-stable variety. THDB is suitable for deepwater rice regions and coastal areas, where rice cultivation is affected by acid sulphate soils and salinity. Both varieties are good examples of the method. Thanks to good cooperation with provincial extension centers, hundreds of extension classes were organized to recommend the varieties to farmers, and thanks to strong support from the IAEA, nearly 400,000 ha of TNDB-100 are now grown in the south of Vietnam, as well as nearly 15,000 ha of THDB in the coastal and rainfed lowland rice areas of the region. To continue rice improvement by this technique, seeds of six traditional local varieties were exposed to different doses of gamma rays to create new mutants. At present, hundreds of improved breeding lines have been selected, and a dozen uniform lines have been isolated and entered into yield trials as well as the regional testing program. From these, improved varieties will be selected to contribute to rice cultivation in the south of Vietnam in the coming years. (author)

  3. Microbial composition of the Korean traditional food "kochujang" analyzed by a massive sequencing technique.

    Science.gov (United States)

    Nam, Young-Do; Park, So-lim; Lim, Seong-Il

    2012-04-01

    Kochujang is a traditional Korean fermented food that is made with red pepper, glutinous rice, salt, and soybean. Kochujang is fermented by naturally occurring microorganisms, through which it obtains various health-promoting properties. In this study, the bacterial diversities of 9 local and 2 commercial brands of kochujang were analyzed with a barcoded pyrosequencing technique targeting the hypervariable regions V1/V2 of the 16S rRNA gene. Through the analysis of 13,524 bacterial pyrosequences, 223 bacterial species were identified, most of which converged on the phylum Firmicutes (average 93.1%). All of the kochujang samples were largely populated (>90.9% abundance) by 12 bacterial families, and Bacillaceae showed the highest abundance in all but one sample. Bacillus subtilis and B. licheniformis were the most dominant bacterial species and were broadly distributed among the kochujang samples. Each sample contained a high abundance of region-specific bacterial species, such as B. sonorensis, B. pumilus, Weissella salipiscis, and diverse unidentified Bacillus species. Phylotype- and phylogeny-based community comparison analysis showed that the microbial communities of the two commercial brands were different from those of the local brands. Moreover, each local brand kochujang sample had a region-specific microbial community reflecting its manufacturing environment. © 2012 Institute of Food Technologists®

  4. Efficacy of proprioceptive neuromuscular facilitation techniques versus traditional prosthetic training for improving ambulatory function in transtibial amputees

    OpenAIRE

    Pallavi Sahay, MPT; Santosh Kr. Prasad, MSc; Shahnawaz Anwer, MPT; P.K. Lenka, PhD; Ratnesh Kumar, MS

    2014-01-01

    The objective of this randomized controlled trial was to evaluate the efficacy of proprioceptive neuromuscular facilitation (PNF) techniques in comparison to traditional prosthetic training (TPT) in improving ambulatory function in transtibial amputees. Thirty study participants (19 men and 11 women) with unilateral transtibial amputation participated in the study. They were randomly allocated to either the traditional training group (i.e., TPT) (n = 15) or the PNF training group (n = 15). Th...

  5. Studying Student Benefits of Assigning a Service-Learning Project Compared to a Traditional Final Project in a Business Statistics Class

    Science.gov (United States)

    Phelps, Amy L.; Dostilio, Lina

    2008-01-01

    The present study addresses the efficacy of using service-learning methods to meet the GAISE guidelines (http://www.amstat.org/education/gaise/GAISECollege.htm) in a second business statistics course and further explores potential advantages of assigning a service-learning (SL) project as compared to the traditional statistics project assignment.…

  6. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    This article explores the use of audit sampling techniques by internal audit functions in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.
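
    As a concrete illustration of the statistical side of such work, the sketch below computes the minimum sample size for a simple attribute (discovery) sampling plan: the smallest n for which finding zero deviations supports the control at a given confidence. The confidence level and tolerable deviation rate are illustrative assumptions, not figures from the article.

        # Minimal sketch, assuming a zero-expected-deviation attribute sampling plan:
        # find the smallest n with P(0 deviations | true rate = tolerable rate) <= 1 - confidence.
        import math

        def attribute_sample_size(confidence=0.95, tolerable_rate=0.05):
            # Solve (1 - tolerable_rate)**n <= 1 - confidence for n.
            return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - tolerable_rate))

        print(attribute_sample_size())  # -> 59 items at 95% confidence, 5% tolerable rate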

  7. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  8. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation…
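
    A minimal sketch of that CA/PCA/DA chain is given below using scikit-learn; the random matrix, cluster count and component count are stand-ins for the BSM2 evaluation matrix, assumed purely for illustration.

        # Minimal sketch of the CA -> PCA/FA -> DA chain on a stand-in evaluation
        # matrix (30 hypothetical control strategies x 8 hypothetical criteria).
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        X = np.random.rand(30, 8)                       # evaluation matrix (synthetic)
        Xs = StandardScaler().fit_transform(X)          # put criteria on comparable scales
        scores = PCA(n_components=3).fit_transform(Xs)  # hidden structure (PCA/FA step)
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(scores)  # cluster analysis
        lda = LinearDiscriminantAnalysis().fit(Xs, labels)            # discriminant analysis
        # Criteria with the largest absolute discriminant weights per group:
        print(abs(lda.coef_).argsort(axis=1)[:, ::-1][:, :3])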

  9. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  10. Curve fitting and modeling with splines using statistical variable selection techniques

    Science.gov (United States)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
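
    As a rough illustration of the backward elimination idea (not the paper's FORTRAN implementation), the sketch below fits a cubic B-spline with scipy and removes interior knots one at a time while the fit barely degrades; the data, the starting knots and the 10% tolerance are all assumptions.

        # Sketch of backward knot elimination with cubic B-splines (scipy).
        import numpy as np
        from scipy.interpolate import LSQUnivariateSpline

        x = np.linspace(0.0, 10.0, 200)
        y = np.sin(x) + 0.1 * np.random.randn(x.size)

        def sse(knots):
            s = LSQUnivariateSpline(x, y, knots, k=3)
            return np.sum((s(x) - y) ** 2)

        knots = list(np.linspace(1.0, 9.0, 9))    # generous initial interior knots
        while len(knots) > 2:
            trials = [sse(knots[:i] + knots[i + 1:]) for i in range(len(knots))]
            best = int(np.argmin(trials))
            if trials[best] > 1.10 * sse(knots):  # stop if removal inflates SSE by >10%
                break
            knots.pop(best)                       # eliminate the least useful knot
        print("retained knots:", np.round(knots, 2))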

  11. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    International Nuclear Information System (INIS)

    Ren, Qingguo; Dewan, Sheilesh Kumar; Li, Ming; Li, Jianying; Mao, Dingbiao; Wang, Zhenglei; Hua, Yanqing

    2012-01-01

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note: our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so cerebral tumors were not included in this study) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), while a statistically significant difference (p < .05) was found between the image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.
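
    The vendor's reconstruction itself is proprietary, but the "ASIR-FBP blending" percentages reported in such studies are commonly described as a linear mix of the iterative and FBP reconstructions. The sketch below illustrates only that mix, on synthetic image arrays; it is an assumption for illustration, not GE's algorithm.

        # Sketch: linear blending of two reconstructions (synthetic stand-in images).
        import numpy as np

        fbp = np.random.rand(512, 512).astype(np.float32)   # stand-in FBP reconstruction
        asir = np.random.rand(512, 512).astype(np.float32)  # stand-in iterative reconstruction

        def blend(asir_img, fbp_img, asir_fraction):
            return asir_fraction * asir_img + (1.0 - asir_fraction) * fbp_img

        img_50 = blend(asir, fbp, 0.50)  # the "50% ASIR" setting compared in the study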

  12. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note: our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so cerebral tumors were not included in this study) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between the image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), while a statistically significant difference (p < .05) was found between the image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  13. The traditional irrigation technique of Lake Garda lemon--houses (Northern Italy)

    Science.gov (United States)

    Barontini, Stefano; Vitale, Nicola; Fausti, Federico; Bettoni, Barbara; Bonati, Sara; Peli, Marco; Pietta, Antonella; Tononi, Marco; Ranzi, Roberto

    2016-04-01

    Between the 16th and 19th centuries the north-western side of Lake Garda was the seat of an important district which, at the time of its maximum splendour between the 18th and 19th centuries, produced and exported lemons and other citrus even to Northern Europe and Russia. The limonaie del Garda (Lake Garda lemon-houses), the local name of the citrus orchards, were settled on terraces built on steep slopes with landfill taken from the eastern side of the lake, and were closed by greenhouses during late autumn and winter in order to protect the cultivations. The terraces were built near streams; they were exposed to the south-east and protected from cold winds by walls. Thanks to the Lake Garda microclimate, lemon trees were not cultivated in pots, as in the typical orangeries of mid-latitude Europe, but directly in the soil. Here the citrus cultivation technique reached a remarkably high degree of standardisation, with local cultivars such as the Madernino (lemon from Maderno), and it involved, as in modern industrial districts, all the surrounding land in order to satisfy the need for the materials required to build the terraces, the walls, the greenhouses and the wooden frames that held the branches laden with fruit. Due to the great water requirement of lemon trees during summer, estimated to range from 150 to 300 ℓ every ten days, water management played a key role in the cultivation technique, and the traditional irrigation technique was standardised as well. During our surveys we observed that most of the lemon-houses still conserve small stone flumes along the walls upslope of the terraces, with a spillway at every adult tree, i.e. about every 4 m. The flumes were filled with water taken from an upstream reservoir built near a stream. The spillways were activated by backwater obtained by means of a sand bag placed within the flume, just downstream of the spillway itself. In order to avoid any excavation, spilled water was driven to the base of each

  14. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  15. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  16. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    Science.gov (United States)

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
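
    A sketch of the formulas behind the three reported parameters is given below; every number in it is an illustrative assumption, not a value from the data set.

        # Sketch: annual energy saving, cost saving and simple payback for a
        # high-efficiency versus standard-efficiency motor (all inputs hypothetical).
        def motor_savings(p_out_kw, eff_std, eff_hi, hours_per_year, tariff, price_premium):
            e_std = p_out_kw / eff_std * hours_per_year   # kWh drawn by standard motor
            e_hi = p_out_kw / eff_hi * hours_per_year     # kWh drawn by high-efficiency motor
            kwh_saved = e_std - e_hi                      # annual energy saving
            cost_saved = kwh_saved * tariff               # annual cost saving
            payback_years = price_premium / cost_saved    # simple payback period
            return kwh_saved, cost_saved, payback_years

        print(motor_savings(p_out_kw=15, eff_std=0.89, eff_hi=0.93,
                            hours_per_year=6000, tariff=0.12, price_premium=400))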

  18. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    International Nuclear Information System (INIS)

    Park, Jinyong; Balasingham, P.; McKenna, Sean Andrew; Kulatilake, Pinnaduwa H. S. W.

    2004-01-01

    Sandia National Laboratories, under contract to Nuclear Waste Management Organization of Japan (NUMO), is performing research on regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistics and geo-statistical interpolation techniques. This report provides results obtained for hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques on the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be incomplete spatially. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. Also the recorded data on volcanic activity for Sengan region were produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater
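
    A small sketch of the gridding and variable-screening steps described above, with synthetic point data standing in for the NUMO variables; the 1 km cell size follows the grid mentioned in the report, and everything else is assumed for illustration.

        # Sketch: aggregate point observations onto a 1 km UTM grid, then screen a
        # gridded variable against a recorded volcanic-activity map (synthetic data).
        import numpy as np
        import pandas as pd

        pts = pd.DataFrame({
            "easting": np.random.uniform(4.0e5, 4.5e5, 2000),    # UTM zone 54 metres
            "northing": np.random.uniform(4.40e6, 4.45e6, 2000),
            "geothermal_gradient": np.random.gamma(2.0, 15.0, 2000),
        })
        pts["ix"] = (pts.easting // 1000).astype(int)            # 1 km cell indices
        pts["iy"] = (pts.northing // 1000).astype(int)
        grid = pts.groupby(["ix", "iy"]).geothermal_gradient.mean().reset_index()
        grid["volcanic"] = np.random.binomial(1, 0.05, len(grid))  # activity map
        # Variable-selection step: keep variables whose grid correlates with activity.
        print(grid.geothermal_gradient.corr(grid.volcanic))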

  19. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees on energy products

  20. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees and oil pollution fees

  1. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  2. Connection between perturbation theory, projection-operator techniques, and statistical linearization for nonlinear systems

    International Nuclear Information System (INIS)

    Budgor, A.B.; West, B.J.

    1978-01-01

    We employ the equivalence between Zwanzig's projection-operator formalism and perturbation theory to demonstrate that the approximate-solution technique of statistical linearization for nonlinear stochastic differential equations corresponds to the lowest-order β truncation in both the consolidated perturbation expansions and in the ''mass operator'' of a renormalized Green's function equation. Other consolidated equations can be obtained by selectively modifying this mass operator. We particularize the results of this paper to the Duffing anharmonic oscillator equation
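
    For the Duffing oscillator mentioned above, statistical linearization under Gaussian closure replaces the cubic stiffness term εx³ by an equivalent linear term 3εσ²x, after which the stationary variance follows from a fixed-point iteration; the sketch below shows that loop with illustrative parameter values.

        # Sketch of statistical linearization for the Duffing oscillator
        #   x'' + 2*beta*x' + w0**2 * x + eps * x**3 = f(t),  E[f(t)f(s)] = 2*D*delta(t-s).
        # Gaussian closure gives the equivalent stiffness w_eq^2 = w0^2 + 3*eps*sigma2,
        # and the linear stationary variance is sigma2 = D / (2*beta*w_eq^2).
        beta, w0, eps, D = 0.2, 1.0, 0.5, 0.1   # illustrative parameters

        sigma2 = D / (2.0 * beta * w0**2)       # linear (eps = 0) starting guess
        for _ in range(100):
            w_eq2 = w0**2 + 3.0 * eps * sigma2  # equivalent linear stiffness
            sigma2_new = D / (2.0 * beta * w_eq2)
            if abs(sigma2_new - sigma2) < 1e-12:
                break
            sigma2 = sigma2_new
        print("stationary variance:", sigma2)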

  3. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
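
    Of the three techniques, control charting is the simplest to sketch; below, hypothetical thermocouple readings are flagged against mean ± 3σ limits estimated from an assumed in-control window. All values are invented, not AGR data.

        # Sketch of the control-charting idea used for data qualification.
        import numpy as np

        readings = np.random.normal(1100.0, 4.0, 500)   # fuel-temperature TC signal, deg C
        readings[420] = 1160.0                           # injected failure-like excursion

        ref = readings[:200]                             # assumed in-control period
        center, sigma = ref.mean(), ref.std(ddof=1)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma
        alarms = np.where((readings > ucl) | (readings < lcl))[0]
        print("out-of-control samples:", alarms)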

  4. Contribution of analytical nuclear techniques in the reconstruction of the Brazilian pre-history analysing archaeological ceramics of Tupiguarani tradition

    International Nuclear Information System (INIS)

    Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida

    2011-01-01

    Due to the high importance of material vestiges for the culture of a nation, the Brazilian Council for the Environment determined that the licence to establish new enterprises is subject to a technical report concerning environmental impact, including archaeological sites affected by the enterprise. Therefore, as part of the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas Impacted by the Installation of the Second Line of the Samarco Mining Pipeline, archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, the main evidence being an interesting ceramic assemblage. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites Hiuton and Bota-Fora were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying R software and a robust multivariate statistical analysis, the results pointed out that the pottery from the two sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Moessbauer spectroscopy was applied to provide information on both the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures and strategies used in pottery production. (author)
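
    The authors performed their robust multivariate analysis in R; the sketch below shows a comparable provenance-style workflow in Python (hierarchical clustering of sherds on standardised log concentrations), on synthetic data that merely stands in for the measured element table.

        # Sketch: group sherds by elemental fingerprint (synthetic concentrations).
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        conc = np.random.lognormal(mean=1.0, sigma=0.5, size=(12, 26))  # 12 sherds x 26 elements
        logc = np.log10(conc)
        z = (logc - logc.mean(axis=0)) / logc.std(axis=0)   # standardise each element
        tree = linkage(z, method="ward")                    # agglomerative clustering
        groups = fcluster(tree, t=2, criterion="maxclust")  # two candidate clay sources
        print(groups)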

  5. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  6. Comparison of a novel real-time SonixGPS needle-tracking ultrasound technique with traditional ultrasound for vascular access in a phantom gel model.

    Science.gov (United States)

    Kopac, Daniel S; Chen, Jerry; Tang, Raymond; Sawka, Andrew; Vaghadia, Himat

    2013-09-01

    Ultrasound-guided percutaneous vascular access for endovascular procedures is well established in surgical practice. Despite this, rates of complications from venous and arterial access procedures remain a significant cause of morbidity. We hypothesized that a new vascular access technique using ultrasound with a novel needle-guidance positioning system (GPS) would lead to improved success rates of vascular puncture, for both in-plane and out-of-plane techniques, compared with traditional ultrasound. A prospective, randomized crossover study of medical students from all years of medical school was conducted using a phantom gel model. Each medical student performed three ultrasound-guided punctures with each of the four modalities (in-plane no GPS, in-plane with GPS, out-of-plane no GPS, out-of-plane with GPS) for a total of 12 attempts. Success or failure was judged by the ability to aspirate a simulated blood solution from the model. The time to successful puncture was also recorded. A poststudy validated NASA Task Load Index workload questionnaire was conducted to assess the students' perceptions of the two different techniques. A total of 30 students completed the study. There was no significant difference in the mean times of vascular access for each of the modalities. Higher success rates for vascular access using the GPS were observed for both the in-plane (94% vs 91%) and the out-of-plane (86% vs 70%) views; however, this was not statistically significant. The students perceived the mental demand (median 12.00 vs 14.00; P = .035) and effort (mean 11.25 vs 14.00; P = .044) to be lower, and the performance to be higher (mean 15.50 vs 14.00; P = .041), for the GPS vs the traditional ultrasound-guided technique. Students also perceived that their ability to access vessels increased with the aid of the GPS (7.00 vs 6.50; P = .007). The majority of students expressed a preference for GPS (26/30, 87%) as opposed to the traditional counterpart.

  7. A comparison of linear and nonlinear statistical techniques in performance attribution.

    Science.gov (United States)

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on standard linear multifactor model and three nonlinear techniques-model selection, additive models, and neural networks-are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
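
    A minimal sketch of the kind of comparison described is given below, with gradient boosting used as a stand-in nonlinear learner for the paper's additive models and neural networks; the simulated factor exposures and returns are assumptions for illustration only.

        # Sketch: linear factor model versus a nonlinear learner on simulated returns.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        F = rng.normal(size=(1000, 3))   # e.g. value, size, momentum exposures
        r = 0.5 * F[:, 0] + np.sin(2.0 * F[:, 1]) + 0.2 * rng.normal(size=1000)  # nonlinear truth

        for model in (LinearRegression(), GradientBoostingRegressor()):
            score = cross_val_score(model, F, r, cv=5, scoring="r2").mean()
            print(type(model).__name__, round(float(score), 3))  # nonlinear fit wins here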

  8. ISOLATED SPEECH RECOGNITION SYSTEM FOR TAMIL LANGUAGE USING STATISTICAL PATTERN MATCHING AND MACHINE LEARNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    VIMALA C.

    2015-05-01

    In recent years, speech technology has become a vital part of our daily lives. Various techniques have been proposed for developing Automatic Speech Recognition (ASR) systems and have achieved great success in many applications. Among them, template matching techniques like Dynamic Time Warping (DTW), statistical pattern matching techniques such as Hidden Markov Models (HMM) and Gaussian Mixture Models (GMM), and machine learning techniques such as Neural Networks (NN), Support Vector Machines (SVM) and Decision Trees (DT) are the most popular. The main objective of this paper is to design and develop a speaker-independent isolated speech recognition system for the Tamil language using the above speech recognition techniques. The background of ASR systems, the steps involved in ASR, the merits and demerits of the conventional and machine learning algorithms, and the observations made based on the experiments are presented in this paper. For the developed system, the highest word recognition accuracy was achieved with the HMM technique: it offered 100% accuracy during training and 97.92% during testing.
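
    Of the techniques listed, DTW is the most compact to sketch; below is the textbook dynamic program for the DTW distance between two one-dimensional feature sequences (a real ASR front end would compare richer acoustic feature vectors).

        # Sketch: O(n*m) Dynamic Time Warping distance between two sequences.
        import numpy as np

        def dtw(a, b):
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])   # local distance (1-D features)
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        print(dtw([1, 2, 3, 4], [1, 1, 2, 3, 4]))  # -> 0.0: the repeat is absorbed by warping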

  9. Statistical uncertainty of extreme wind storms over Europe derived from a probabilistic clustering technique

    Science.gov (United States)

    Walz, Michael; Leckebusch, Gregor C.

    2016-04-01

    Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only 50 years of high-quality observational data, it is difficult to assess the statistical uncertainty of these sparse events on the basis of observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created, which can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (based either on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.
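
    A simplified sketch of the clustering step is shown below: each synthetic track is summarised by quadratic regression coefficients, which a Gaussian mixture then groups. The paper fits full regression mixture models over entire storm lifetimes, so this conveys only the flavour of the approach.

        # Sketch: quadratic track summaries clustered with a Gaussian mixture.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        coeffs = []
        for _ in range(200):                            # 200 synthetic storm tracks
            lon = np.sort(rng.uniform(-40, 20, 15))
            lat = 50 + rng.normal(0, 1, 1) * 0.2 * lon + rng.normal(0, 0.5, 15)
            coeffs.append(np.polyfit(lon, lat, 2))      # quadratic curve per track
        labels = GaussianMixture(n_components=3, random_state=0).fit_predict(np.array(coeffs))
        print(np.bincount(labels))                      # cluster sizes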

  10. Using Alternative Teaching Techniques To Enhance Student Performance in the Traditional Introductory Public Relations Course.

    Science.gov (United States)

    Lubbers, Charles A.

    2002-01-01

    Examines the value of two alternative tools as supplements for the traditional introduction to public relations course. Considers the usage of a study manual, usage of televised review sessions, year in school and major status. Indicates that all four variables are significantly correlated with class performance, but that the study manual explains…

  11. The use of adaptive statistical iterative reconstruction (ASiR) technique in evaluation of patients with cervical spine trauma: impact on radiation dose reduction and image quality.

    Science.gov (United States)

    Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan

    2016-01-01

    The aim of this study was to evaluate the impact of the adaptive statistical iterative reconstruction (ASiR) technique on image quality and radiation dose reduction, in comparison with the traditional filtered back projection (FBP) technique. We retrospectively reviewed 78 patients who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using the traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, and objective image signal and noise were recorded, while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. We found that the ASiR technique was able to reduce the volume CT dose index, dose-length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in image noise (p = 0.39), signal (p = 0.82) or signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was marginally better in the ASiR group, but the difference was not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. The use of the ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of the cervical spine without degrading image quality. The present study highlights that the ASiR technique is extremely helpful in reducing patient radiation exposure while maintaining image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions.

  12. Application of statistical downscaling technique for the production of wine grapes (Vitis vinifera L.) in Spain

    Science.gov (United States)

    Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.

    2012-04-01

    Climate and soil are two of the most important limiting factors for agricultural production. Climate change has now been documented in many geographical locations, affecting different cropping systems. General Circulation Models (GCMs) have become important tools for simulating the most relevant aspects of the climate expected for the 21st century under climate change. These models are able to reproduce the general features of atmospheric dynamics, but their low resolution (about 200 km) prevents proper simulation of smaller-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been broadly tested in national and international settings, yielding excellent results for future climate models. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape growing systems in Spain. The application of such a model is very important for predicting the expected climate for different crops, especially grape, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to the negative impacts produced by any modification of climate in the different regions, mainly those under protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.
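
    FIC's two-step analogue method itself is not reproduced here; the sketch below shows only the generic analogue idea behind such downscaling (the closest historical large-scale patterns supply the local value), on entirely synthetic fields and observations.

        # Sketch: analogue downscaling by nearest historical large-scale patterns.
        import numpy as np

        rng = np.random.default_rng(2)
        hist_fields = rng.normal(size=(5000, 40))    # 5000 historical large-scale patterns
        hist_local = rng.normal(15.0, 5.0, 5000)     # co-registered local observations
        gcm_field = rng.normal(size=40)              # one simulated future pattern

        d = np.linalg.norm(hist_fields - gcm_field, axis=1)
        k = np.argsort(d)[:10]                       # ten closest analogue days
        print("downscaled local estimate:", hist_local[k].mean())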

  13. Leak detection and localization in a pipeline system by application of statistical analysis techniques

    International Nuclear Information System (INIS)

    Fukuda, Toshio; Mitsuoka, Toyokazu.

    1985-01-01

    The detection of leaks in piping systems is an important diagnostic technique for preventing accidents and planning maintenance measures, since the occurrence of a leak lowers productivity and causes environmental damage. As a first step, it is necessary to detect the occurrence of a leak without delay; as a second step, if the location of the leak in the piping system can be estimated, accident countermeasures become easier. Pressure-based detection is usually used for large leaks, but because a method depending on pressure is simple and advantageous, this study examined the extension of the pressure gradient method to the detection of smaller leaks, using statistical analysis techniques, for a pipeline in steady operation. Since the flow in a pipe varies irregularly during pumping, statistical means are required for the detection of small leaks by pressure. The index proposed in this paper for detecting a leak is the difference between the pressure gradients at the two ends of the pipeline. Experimental results on water and air in nylon tubes are reported. (Kako, I.)
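
    A sketch of the proposed index on synthetic data follows: the difference between the gradients at the two pipeline ends is averaged to suppress pumping noise and tested against its no-leak scatter. All magnitudes below are assumptions.

        # Sketch: end-to-end pressure-gradient difference as a leak index.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 2000
        grad_in = -0.50 + 0.02 * rng.normal(size=n)    # inlet-end gradient, bar/km
        grad_out = -0.50 + 0.02 * rng.normal(size=n)   # outlet-end gradient
        grad_out[n // 2:] -= 0.03                      # leak perturbs one gradient

        index = grad_in - grad_out                                  # proposed leak index
        w = 50
        smooth = np.convolve(index, np.ones(w) / w, mode="valid")   # average out pump noise
        mu = index[:500].mean()                                     # no-leak reference window
        se = index[:500].std(ddof=1) / np.sqrt(w)                   # scatter of the window mean
        alarm = np.abs(smooth - mu) > 4 * se                        # conservative threshold
        print("first alarm near sample:", int(np.argmax(alarm)) + w - 1)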

  14. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Abstract Background When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information-embedding characteristics and separation boundaries produced by a specific SL technique with those of logistic regression (LR) modeling, representing a parametric approach. The SL technique comprised a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
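
    A rough sketch of the hybrid idea follows, with an RBF feature-map approximation standing in for the paper's kernel mapping and plain logistic regression for the output layer; the data and parameters are simulated assumptions, and odds ratios are read off the parametric baseline only.

        # Sketch: parametric LR baseline (with odds ratios) versus a kernelised LR.
        import numpy as np
        from sklearn.kernel_approximation import RBFSampler
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        X = rng.normal(size=(500, 2))   # two exposures
        y = ((X[:, 0] * X[:, 1] + 0.5 * rng.normal(size=500)) > 0).astype(int)  # interaction effect

        lr = LogisticRegression().fit(X, y)                  # parametric baseline
        print("LR odds ratios:", np.exp(lr.coef_).round(2))  # exp(beta): main effects only

        Z = RBFSampler(gamma=1.0, random_state=0).fit_transform(X)  # nonlinear embedding
        klr = LogisticRegression(max_iter=1000).fit(Z, y)
        print("kernel-LR accuracy:", round(klr.score(Z, y), 2))     # captures the interaction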

  15. Flow prediction models using macroclimatic variables and multivariate statistical techniques in the Cauca River Valley

    International Nuclear Information System (INIS)

    Carvajal Escobar Yesid; Munoz, Flor Matilde

    2007-01-01

    This project centres on a review of the state of the art of the ocean-atmospheric phenomena that affect Colombian hydrology, especially the ENSO phenomenon, which causes a first-order socioeconomic impact in the country and has not been sufficiently studied. It is therefore important to address this topic by including the macroclimatic variables associated with ENSO in water planning analyses. The analyses include a review of statistical techniques for testing the consistency of hydrological data, with the objective of building a reliable and homogeneous database of monthly flows of the Cauca River. Multivariate statistical methods, specifically principal component analysis, are used in the development of models for predicting mean monthly flows in the Cauca River, involving both linear approaches (the autoregressive models AR, ARX and ARMAX) and a nonlinear approach (artificial neural networks).
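
    As a minimal sketch of the linear branch (an AR(p) model fitted by ordinary least squares; the lag order and the data are assumptions, and the record gives no implementation details):

      import numpy as np

      def fit_ar(y, p=3):
          """Fit y_t = c + a_1*y_{t-1} + ... + a_p*y_{t-p} by least squares."""
          X = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
          X = np.column_stack([np.ones(len(X)), X])
          coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
          return coef  # [c, a_1, ..., a_p]

      def forecast_ar(y, coef, steps=12):
          """Iterate the fitted AR recursion forward from the last p observations."""
          p = len(coef) - 1
          hist = list(y[-p:])
          out = []
          for _ in range(steps):
              nxt = coef[0] + np.dot(coef[1:], hist[::-1])
              out.append(nxt)
              hist = hist[1:] + [nxt]
          return np.array(out)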

  16. Statistics and error considerations in the application of the SSNTD technique in radon measurement

    International Nuclear Information System (INIS)

    Jonsson, G.

    1993-01-01

    Plastic films are used for the detection of alpha particles from disintegrating radon and radon daughter nuclei. After etching, tracks (cones) or holes remain in the film as a result of the exposure. The step from a counted number of tracks/holes per surface unit of the film to a reliable value of the radon and radon daughter level is surrounded by statistical considerations of different kinds. Among them are the number of counted tracks, the length and season of the exposure period, the etching technique and the method of counting the tracks or holes. The number of background tracks in an unexposed film increases the error of the measured radon level. Some of these statistical effects are discussed in the report. (Author)
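
    Track counting is a Poisson process, so the relative statistical error of a net track density falls as the square root of the counts. A minimal sketch of the background-corrected estimate and its propagated uncertainty (all numbers illustrative):

      import numpy as np

      def net_track_density(n_tracks, n_bg, area_cm2):
          """Background-corrected track density with Poisson error propagation.

          n_tracks : tracks counted on the exposed film over area_cm2
          n_bg     : background tracks counted on an unexposed film, same area
          """
          net = (n_tracks - n_bg) / area_cm2
          # independent Poisson counts: var(a - b) = a + b
          err = np.sqrt(n_tracks + n_bg) / area_cm2
          return net, err

      density, sigma = net_track_density(n_tracks=420, n_bg=35, area_cm2=1.0)
      print(f"{density:.0f} +/- {sigma:.0f} tracks/cm^2")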

  17. Dynamic re-weighted total variation technique and statistical iterative reconstruction method for X-ray CT metal artifact reduction

    Science.gov (United States)

    Peng, Chengtao; Qiu, Bensheng; Zhang, Cheng; Ma, Changyu; Yuan, Gang; Li, Ming

    2017-07-01

    Over the years, X-ray computed tomography (CT) has been successfully used in clinical diagnosis. However, when the body of the patient being examined contains metal objects, the reconstructed image is polluted by severe metal artifacts, which affect the diagnosis of disease. In this work, we proposed a dynamic re-weighted total variation (DRWTV) technique combined with a statistical iterative reconstruction (SIR) method to reduce these artifacts. The DRWTV method is based on the total variation (TV) and re-weighted total variation (RWTV) techniques, but it provides a sparser representation than TV and protects tissue details better than RWTV. Besides suppressing artifacts and noise, the DRWTV also accelerates SIR convergence. The performance of the algorithm is tested on both a simulated phantom dataset and a clinical dataset: a teeth phantom with two metal implants and a skull with three metal implants, respectively. The proposed algorithm (SIR-DRWTV) is compared with two traditional iterative algorithms: SIR and SIR constrained by RWTV regularisation (SIR-RWTV). The results show that the proposed algorithm has the best performance in reducing metal artifacts and protecting tissue details.

  18. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three statistical analysis techniques - control charting, correlation analysis, and regression analysis - are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objectives of this work include (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that regression models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.

  19. Assessment of Surface Water Quality Using Multivariate Statistical Techniques in the Terengganu River Basin

    International Nuclear Information System (INIS)

    Aminu Ibrahim; Hafizan Juahir; Mohd Ekhwan Toriman; Mustapha, A.; Azman Azid; Isiyaka, H.A.

    2015-01-01

    Multivariate statistical techniques including cluster analysis, discriminant analysis, and principal component analysis/factor analysis were applied to investigate the spatial variation and pollution sources in the Terengganu river basin, based on 5 years of monitoring 13 water quality parameters at thirteen stations. Cluster analysis (CA) classified the 13 stations into two clusters - low pollution (LP) and moderate pollution (MP) - based on similar water quality characteristics. Discriminant analysis (DA) rendered significant data reduction with 4 parameters (pH, NH3-N, PO4 and EC) and a correct assignation of 95.80%. PCA/FA applied to the data sets yielded five latent factors accounting for 72.42% of the total variance in the water quality data. The obtained varifactors indicate that the parameters responsible for water quality variations are mainly related to domestic waste, industry, runoff and agriculture (anthropogenic activities). Multivariate techniques are therefore important in environmental management. (author)
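
    A minimal sketch of the PCA/FA step on a samples-by-parameters water quality matrix (scikit-learn is used purely for illustration; the record does not say which software was employed, and the data here are placeholders):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # rows = monitoring samples, columns = 13 water quality parameters (assumed layout)
      X = np.random.default_rng(1).normal(size=(65, 13))   # placeholder data

      Z = StandardScaler().fit_transform(X)    # PCA on standardised parameters
      pca = PCA().fit(Z)

      explained = pca.explained_variance_ratio_.cumsum()
      n_factors = int(np.searchsorted(explained, 0.72)) + 1  # factors covering ~72% variance
      print(n_factors, explained[:n_factors].round(3))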

  20. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study shows the potential of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), for analysing groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. Validation of the outcomes showed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, indicating that the SI method performs slightly better than the DST technique. The SI and DST methods are thus advantageous for analysing groundwater capacity and scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, permitting investigation of both systemic and stochastic uncertainty. These techniques are very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
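
    The statistical index for a class of a conditioning factor is commonly computed as the log ratio of spring density within the class to spring density over the whole area; a small sketch under that assumption (class areas and spring counts are illustrative, not taken from the study):

      import numpy as np

      def statistical_index(springs_in_class, area_of_class, total_springs, total_area):
          """SI = ln( (springs_in_class / area_of_class) / (total_springs / total_area) )."""
          class_density = springs_in_class / area_of_class
          overall_density = total_springs / total_area
          return np.log(class_density / overall_density)

      # illustrative slope classes: areas in km^2 and spring counts (assumed)
      areas = np.array([120.0, 240.0, 180.0, 60.0])
      springs = np.array([60, 210, 160, 66])
      si = statistical_index(springs, areas, springs.sum(), areas.sum())
      print(si.round(2))   # positive SI = class favours groundwater occurrence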

  1. The Macuti House, Traditional Building Techniques and Sustainable Development in Ilha de Moçambique

    DEFF Research Database (Denmark)

    Sollien, Silje Erøy

    2011-01-01

    This paper is part of the initial phase of PhD research focusing on conservation of the macuti architecture in the World Heritage City of Ilha de Moçambique. It questions how initiatives to preserve traditional ways of building in this area, of which parts could be described as an urban slum, may...... subordination, poverty and low social status, such initiatives need to be part of a wider programme of strengthening cultural and social capital among the population, avoiding division into tangible and intangible heritage management, and include broad ecological and socio-economic considerations....

  2. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.

    2015-12-16

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
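
    As a toy illustration of the ODE-plus-sensitivity workflow described (not the chapter's Wnt model), the sketch below integrates a two-species activation-degradation system with scipy and estimates the local sensitivity of the steady state to one parameter by central finite differences; the equations and parameter values are invented for illustration.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, x, k):
          a, b = x
          # toy signaling motif: constant production of a, a activates b, linear decay
          return [1.0 - k * a, k * a - 0.5 * b]

      def steady_state(k):
          sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 0.0], args=(k,), rtol=1e-8)
          return sol.y[:, -1]

      k0, dk = 2.0, 1e-4
      sens = (steady_state(k0 + dk) - steady_state(k0 - dk)) / (2 * dk)
      print("d(steady state)/dk ~", sens.round(3))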

  3. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Science.gov (United States)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of instrument calibration. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of instrument specification to calibrating-standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is a machine that performs automatic force calibrations. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified where possible. Suggestions for improvement are made.

  4. Predicting Acquisition of Learning Outcomes: A Comparison of Traditional and Activity-Based Instruction in an Introductory Statistics Course.

    Science.gov (United States)

    Geske, Jenenne A.; Mickelson, William T.; Bandalos, Deborah L.; Jonson, Jessica; Smith, Russell W.

    The bulk of experimental research related to reforms in the teaching of statistics concentrates on the effects of alternative teaching methods on statistics achievement. This study expands on that research by including an examination of the effects of instructor and the interaction between instructor and method on achievement as well as attitudes,…

  5. Use of non-standardised micro-destructive techniques in the characterization of traditional construction materials

    Science.gov (United States)

    Ioannou, Ioannis; Theodoridou, Magdalini; Modestou, Sevasti; Fournari, Revecca; Dagrain, Fabrice

    2013-04-01

    The characterization of material properties and the diagnosis of their state of weathering and conservation are among the most important steps in the field of cultural heritage preservation. Several standardised experimental methods exist, especially for determining material properties and durability. However, they are limited in their application by the required size of test specimens and the controlled laboratory conditions needed to undertake the tests; this is especially true when the materials under study constitute immovable parts of heritage structures. The current use of other advanced methods of analysis, such as imaging techniques, in this field of research offers invaluable results; however, these techniques may not always be accessible to the wider research community due to their complex nature and relatively high cost. This study presents innovative applications of two recently developed cutting techniques: the portable Drilling Resistance Measuring System (DRMS) and the scratch tool. Both methods are micro-destructive, since they destroy only a very small portion of sample material. The general concept of both methods lies in the forces needed to cut a material by a linear (scratch tool) or rotational (DRMS) cutting action; these forces are related to the mechanical properties of the material and the technological parameters applied to the tool. Therefore, for a given testing configuration, the only parameter influencing the applied forces is the strength of the material. These two techniques have been used alongside a series of standardised laboratory tests aiming at the correlation of various stone properties (density, porosity, dynamic elastic modulus and uniaxial compressive strength). The results prove the potential of both techniques in assessing the uniaxial compressive strength of stones. The scratch tool has also been used effectively to estimate the compressive strength of mud bricks.

  6. A Modified Moore Approach to Teaching Mathematical Statistics: An Inquiry Based Learning Technique to Teaching Mathematical Statistics

    Science.gov (United States)

    McLoughlin, M. Padraig M. M.

    2008-01-01

    The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved, and hence this paper proposes a programme of use of a modified Moore method in a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…

  7. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  8. “Spray Technique: Tracing the Sketch Traditions of Limestone Cave in Lenggong, Perak”

    Directory of Open Access Journals (Sweden)

    Yahaya Fatan Hamamah

    2015-01-01

    Full Text Available Archaeological research provides a wide opportunity for researchers to analyse various aspects and disciplines appropriate to the chosen subject and object. Subject and object selection draws on the exploration of artefacts found in particular sites and archaeological heritage. Exploration and excavation of a world heritage site such as Lenggong enable researchers to uncover archaeological artefacts that are rich and meaningful. To find evidence of the strength and benefits of an artefact, further studies on each artefact should be carried out continuously. This essay traces the wisdom that ancient artists used to produce paintings in a limestone cave in Lenggong, Perak, using spray techniques. Some artefacts identified as cave paintings show very interesting sketch techniques that are unique and special. The essay also examines cave paintings in other caves in Perak, and in several other countries, for comparison. Studies involving cave paintings in Malaysia are new compared with those in Western countries. Thus, the study of the spray technique can open the eyes of the audience to acknowledge and recognise this ancient heritage. It is also hoped that this study can expand the body of knowledge beyond the boundaries of the district's arts and the country.

  9. A Retrospective Analysis of Hemostatic Techniques in Primary Total Knee Arthroplasty: Traditional Electrocautery, Bipolar Sealer, and Argon Beam Coagulation.

    Science.gov (United States)

    Rosenthal, Brett D; Haughom, Bryan D; Levine, Brett R

    2016-01-01

    In this retrospective cohort study of 280 primary total knee arthroplasties, clinical outcomes relevant to hemostasis were compared by electrocautery type: traditional electrocautery (TE), bipolar sealer (BS), and argon beam coagulation (ABC). Age, sex, and preoperative diagnosis were not significantly different among the TE, BS, and ABC cohorts. The 3 hemostasis systems were statistically equivalent with respect to estimated blood loss. Wound drainage during the first 48 hours after surgery was equivalent between the BS and ABC cohorts but less for the TE cohort. Transfusion requirements were not significantly different among the cohorts. The 3 hemostasis systems were statistically equivalent with respect to mean change in hemoglobin level during the early postoperative period (levels were measured on postoperative day 1 and on discharge). As BS and ABC are clinically equivalent to TE, their increased cost may not be justified.

  10. Coronary artery plaques: Cardiac CT with model-based and adaptive-statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Scheffel, Hans; Stolzmann, Paul; Schlett, Christopher L.; Engel, Leif-Christopher; Major, Gyöngi Petra; Károlyi, Mihály; Do, Synho; Maurovich-Horvat, Pál; Hoffmann, Udo

    2012-01-01

    Objectives: To compare the image quality of coronary artery plaque visualization at CT angiography with images reconstructed with filtered back projection (FBP), adaptive statistical iterative reconstruction (ASIR), and model-based iterative reconstruction (MBIR) techniques. Methods: The coronary arteries of three ex vivo human hearts were imaged by CT and reconstructed with FBP, ASIR and MBIR. Coronary cross-sectional images were co-registered between the different reconstruction techniques and assessed for qualitative and quantitative image quality parameters. Readers were blinded to the reconstruction algorithm. Results: A total of 375 triplets of coronary cross-sectional images were co-registered. Using MBIR, 26% of the images were rated as having excellent overall image quality, significantly better than with ASIR and FBP (4% and 13%, respectively, all p < 0.001). Qualitative assessment of image noise demonstrated a noise reduction using ASIR as compared to FBP (p < 0.01) and a further noise reduction using MBIR (p < 0.001). The contrast-to-noise ratio (CNR) using MBIR was better than with ASIR and FBP (44 ± 19, 29 ± 15, 26 ± 9, respectively; all p < 0.001). Conclusions: MBIR improved image quality, reduced image noise and increased CNR as compared to the other available reconstruction techniques. This may further improve the visualization of coronary artery plaque and allow radiation reduction.

  11. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Jack, E-mail: jack.wallace@ce.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Champagne, Pascale, E-mail: champagne@civil.queensu.ca [Department of Civil Engineering, Queen’s University, Ellis Hall, 58 University Avenue, Kingston, Ontario K7L 3N6 (Canada); Monnier, Anne-Charlotte, E-mail: anne-charlotte.monnier@insa-lyon.fr [National Institute for Applied Sciences – Lyon, 20 Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)

    2015-01-15

    Highlights: • Performance of a hybrid passive landfill leachate treatment system was evaluated. • 33 Water chemistry parameters were sampled for 21 months and statistically analyzed. • Parameters were strongly linked and explained most (>40%) of the variation in data. • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance. • Eight other parameters were key in modeling system dynamics and criteria. - Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling
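
    A minimal sketch of the PCA-then-PLS workflow on a samples-by-parameters matrix (scikit-learn for illustration; the record does not state the software used, and both the data and the criterion variable below are placeholders):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      X = rng.normal(size=(90, 33))   # 21 months of samples x 33 parameters (assumed layout)
      y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=90)  # a criterion parameter

      pca = PCA(n_components=5).fit(X)
      print("PC1 variance explained:", round(pca.explained_variance_ratio_[0], 2))

      pls = PLSRegression(n_components=3).fit(X, y)
      print("PLS R^2:", round(pls.score(X, y), 2))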

  12. The 'French Fry' VAC technique: hybridisation of traditional open wound NPWT with closed incision NPWT.

    Science.gov (United States)

    Chopra, Karan; Tadisina, Kashyap K; Singh, Devinder P

    2016-04-01

    Surgical site occurrences (SSOs), specifically surgical site infections, represent a significant burden on the US health care system. It has been hypothesised that postoperative dressings can help drive down SSOs. We describe the successful use of a novel technique combining both closed-incision and open negative pressure wound therapy in the management of a high-risk wound associated with lymphoedema of obesity. © 2014 The Authors. International Wound Journal © 2014 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  13. Current applications of molecular imaging and luminescence-based techniques in traditional Chinese medicine.

    Science.gov (United States)

    Li, Jinhui; Wan, Haitong; Zhang, Hong; Tian, Mei

    2011-09-01

    Traditional Chinese medicine (TCM), which is fundamentally different from Western medicine, has been widely investigated using various approaches. Cellular- or molecular-based imaging has been used to investigate and illuminate the various challenges identified, and the progress made, using therapeutic methods in TCM. Insight into the cellular and molecular changes underlying TCM processes, and the ability to image them, will enhance our understanding of the diseases addressed by TCM and provide new tools to diagnose and treat patients. Various TCM therapies, including herbs and formulations, acupuncture and moxibustion, massage, Gua Sha, and diet therapy, have been analyzed using positron emission tomography, single photon emission computed tomography, functional magnetic resonance imaging, ultrasound, and optical imaging. These imaging tools have kept pace with developments in molecular biology, nuclear medicine, and computer technology. We provide an overview of recent developments in demystifying ancient knowledge - like the power of energy flow and blood flow meridians, and serial naturopathies - which are essential to visually and vividly recognize the body using modern technology. In TCM, treatment can be individualized in a holistic or systematic view that is consistent with molecular imaging technologies. Future studies might include using molecular imaging in conjunction with TCM to easily diagnose or monitor patients naturally and noninvasively. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  14. [Comments on] Statistical techniques for the development and application of SYVAC. (Document by Stephen Howe Ltd.)

    International Nuclear Information System (INIS)

    Beale, E.M.L.

    1983-05-01

    The Department of the Environment has embarked on a programme to develop computer models to help with the assessment of sites suitable for the disposal of nuclear wastes. The first priority is to produce a system, based on the System Variability Analysis Code (SYVAC) obtained from Atomic Energy of Canada Ltd., suitable for assessing radioactive waste disposal in land repositories containing non-heat-producing wastes from typical UK sources. The requirements of the SYVAC system development were so diverse that each portion of the development was contracted to a different company. Scicon are responsible for software coordination, system integration and the user interface. Their present report contains comments on 'Statistical techniques for the development and application of SYVAC'. (U.K.)

  15. Development of statistical and analytical techniques for use in national quality control schemes for steroid hormones

    International Nuclear Information System (INIS)

    Wilson, D.W.; Gaskell, S.J.; Fahmy, D.R.; Joyce, B.G.; Groom, G.V.; Griffiths, K.; Kemp, K.W.; Nix, A.B.J.; Rowlands, R.J.

    1979-01-01

    Adopting the rationale that improving the intra-laboratory performance of immunometric assays will make the assessment of national QC schemes more meaningful, the group of participating laboratories has developed statistical and analytical techniques for improving the accuracy, precision and error monitoring of steroid hormone determinations. These developments are described and their relevance to NQC schemes discussed. Attention has been focussed on some of the factors necessary for improving standards of quality in immunometric assays and on their relevance to laboratories participating in NQC schemes. These include the accuracy, precision and robustness of assay procedures, as well as improved methods for internal quality control. (Auth.)

  16. Water quality, Multivariate statistical techniques, submarine out fall, spatial variation, temporal variation

    International Nuclear Information System (INIS)

    Garcia, Francisco; Palacio, Carlos; Garcia, Uriel

    2012-01-01

    Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality in the Santa Marta coastal area, where a submarine outfall discharges 1 m3/s of domestic wastewater. Two-way analysis of variance (ANOVA), cluster and principal component analysis, and kriging interpolation were considered for this report. Temporal variation showed two heterogeneous periods: from December to April, and July, when concentrations of the water quality parameters were higher, and the rest of the year (May, June, August-November), when they were significantly lower. The spatial analysis identified two areas of differing water quality, the difference being related to proximity to the submarine outfall discharge.

  17. Application of multivariate statistical techniques in the water quality assessment of Danube river, Serbia

    Directory of Open Access Journals (Sweden)

    Voza Danijela

    2015-12-01

    Full Text Available The aim of this article is to evaluate the quality of the Danube River in its course through Serbia, and to demonstrate the possibilities for using three statistical methods - Principal Component Analysis (PCA), Factor Analysis (FA) and Cluster Analysis (CA) - in surface water quality management. Given that the Danube is an important trans-boundary river, thorough water quality monitoring by sampling at different distances over shorter and longer periods of time is not only an ecological but also a political issue. Monitoring was carried out at monthly intervals from January to December 2011 at 17 sampling sites. The obtained data set was treated by multivariate techniques in order, firstly, to identify the similarities and differences between sampling periods and locations; secondly, to recognize the variables that affect temporal and spatial water quality changes; and thirdly, to present the anthropogenic impact on water quality parameters.

  18. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, a time-consuming process. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques known for their ability to detect small shifts in the data - tabular CUSUM, standardized CUSUM and EWMA - are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
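
    A minimal sketch of an EWMA control chart of the kind evaluated here (the smoothing constant, control limit width and baseline window are conventional choices, not values from the paper):

      import numpy as np

      def ewma_chart(x, lam=0.2, L=3.0, baseline=30):
          """Return the EWMA statistic and out-of-control flags for series x."""
          mu0, sigma = x[:baseline].mean(), x[:baseline].std(ddof=1)
          z = np.empty(len(x))
          z[0] = mu0
          for t in range(1, len(x)):
              z[t] = lam * x[t] + (1 - lam) * z[t - 1]
          # time-varying control limits of the EWMA statistic
          i = np.arange(1, len(x) + 1)
          half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
          return z, np.abs(z - mu0) > half_width

      # e.g. simulated transfer times (s) drifting upwards after day 60
      rng = np.random.default_rng(3)
      times = np.r_[rng.normal(4.0, 0.3, 60), rng.normal(4.6, 0.3, 40)]
      z, alarm = ewma_chart(times)
      print("first alarm on day", int(np.argmax(alarm)))

    The small smoothing constant is what gives the EWMA its sensitivity to small sustained shifts, which is the property the paper exploits for detecting gradual health changes.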

  19. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    International Nuclear Information System (INIS)

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-01-01

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber-reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; and second, to upscale those nanoscale properties to the macroscopic scale of engineering applications by means of continuum micromechanics. In particular, a combined investigation by nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be determined accurately from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.
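
    Statistical nanoindentation typically deconvolutes a large grid of indentation moduli into mechanically distinct phases. As an illustration only (the paper's exact deconvolution procedure is not given in this record), a Gaussian mixture fit with scikit-learn on simulated moduli:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(4)
      # simulated indentation moduli (GPa) for three phases: LD C-S-H, HD C-S-H, clinker
      E = np.r_[rng.normal(20, 3, 300), rng.normal(30, 3, 400), rng.normal(120, 15, 50)]

      gm = GaussianMixture(n_components=3, random_state=0).fit(E.reshape(-1, 1))
      for k in np.argsort(gm.means_.ravel()):
          print(f"phase: mean E = {gm.means_[k, 0]:5.1f} GPa, fraction = {gm.weights_[k]:.2f}")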

  20. Wrist muscle activity of khatrah approach in Mameluke technique using traditional bow archery

    Science.gov (United States)

    Ariffin, Muhammad Shahimi; Rambely, Azmin Sham; Ariff, Noratiqah Mohd

    2018-04-01

    An investigation of the khatrah technique in archery was carried out. An electromyography (EMG) experiment was conducted on six wrist muscles - the flexor carpi radialis, extensor carpi ulnaris and extensor digitorum communis of both arms. The maximum voluntary contraction (MVC) and activity data were recorded. The bow arm produced a higher muscle force than the draw arm muscles during the release phase; however, the muscle forces produced by the bow arm were consistent in pattern throughout the phases. In conclusion, the forces generated by the professional archer provide a force benchmark at the wrist joint for alleviating the risk of injury.

  1. J(Ic) testing of A-533 B - statistical evaluation of some different testing techniques

    International Nuclear Information System (INIS)

    Nilsson, F.

    1978-01-01

    The purpose of the present study was to compare statistically some different methods for evaluating the fracture toughness of the nuclear reactor material A-533 B. Since linear elastic fracture mechanics is not applicable to this material at the temperature of interest (275 °C), the so-called J(Ic) testing method was employed. Two main difficulties are inherent in this type of testing. The first is to determine the quantity J as a function of the deflection of the three-point bend specimens used. Three different techniques were used: the first two based on the experimentally observed energy input to the specimen, and the third employing finite element calculations. The second main problem is to determine the point at which crack growth begins. For this, two methods were used: a direct electrical method and the indirect R-curve method. A total of forty specimens were tested at two laboratories. No statistically significant differences were obtained between the results of the respective laboratories. The three methods of calculating J yielded somewhat different results, although the discrepancy was small. The two methods of determining the growth initiation point also yielded consistent results; the R-curve method, however, exhibited a larger uncertainty as measured by the standard deviation. The resulting J(Ic) value also agreed well with earlier presented results. The relative standard deviation was of the order of 25%, which is quite small for this type of experiment. (author)

  2. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    Science.gov (United States)

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters caused by ASIR, as compared to filtered back projection (FBP), may influence the quantification of coronary calcium. The aim was to investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone and FBP combined with ASIR 30%, 50%, 70%, and 100%, based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP; ASIR reduced the Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, ASIR may therefore significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  3. Virtual Screening Techniques to Probe the Antimalarial Activity of some Traditionally Used Phytochemicals.

    Science.gov (United States)

    Shibi, Indira G; Aswathy, Lilly; Jisha, Radhakrishnan S; Masand, Vijay H; Gajbhiye, Jayant M

    2016-01-01

    Malaria parasites show resistance to most antimalarial drugs, and hence developing antimalarials that act on multiple targets rather than a single target is a promising drug design strategy. Here we report a new approach in which virtual screening of 292 unique phytochemicals present in 72 traditionally important herbs is used to find inhibitors of plasmepsin-2 and falcipain-2 with antimalarial activity against P. falciparum. Initial screening of the selected molecules by a Random Forest algorithm model in Weka, using the bioassay datasets AID 504850 and AID 2302, identified 120 of the 292 phytochemicals as active against the targets. A Toxtree scan flagged 21 compounds as either carcinogenic or mutagenic, and these were removed from further analysis. Of the remaining 99 compounds, only 46 offered drug-likeness per the 'rule of five' criteria. Of ten antimalarial drug targets, only two target proteins were selected: falcipain-2 (3BPF and 3PNR) and plasmepsin-2 (1PFZ and 2BJU). The potential binding of the selected 46 compounds to the active sites of these four targets was analyzed using MOE software, and the docked conformations and interactions with the binding pocket residues of the target proteins were examined by 'Ligplot' analysis. Eight compounds were found to be dual inhibitors of falcipain-2 and plasmepsin-2 with the best binding energies. Compound 117, (6aR,12aS)-12a-Hydroxy-9-methoxy-2,3-dimethylenedioxy-8-prenylrotenone (Usaratenoid C), present in the plant Millettia usaramensis, showed the maximum molecular docking score.
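
    The 'rule of five' screen mentioned above can be expressed as a simple property filter. A sketch in plain Python, assuming the four Lipinski properties have already been computed for each compound (names and values below are placeholders, not the study's compounds):

      def passes_rule_of_five(mw, logp, h_donors, h_acceptors):
          """Strict Lipinski screen: molecular weight <= 500, logP <= 5,
          H-bond donors <= 5, H-bond acceptors <= 10."""
          return mw <= 500 and logp <= 5 and h_donors <= 5 and h_acceptors <= 10

      compounds = {
          # name: (MW, logP, H-bond donors, H-bond acceptors) -- illustrative values
          "phytochemical_A": (412.4, 3.1, 2, 6),
          "phytochemical_B": (612.7, 5.8, 4, 12),
      }
      drug_like = [name for name, props in compounds.items() if passes_rule_of_five(*props)]
      print(drug_like)   # ['phytochemical_A']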

  4. Comparison of Fit of Dentures Fabricated by Traditional Techniques Versus CAD/CAM Technology.

    Science.gov (United States)

    McLaughlin, J Bryan; Ramos, Van; Dickinson, Douglas P

    2017-11-14

    To compare the shrinkage of denture bases fabricated by three methods - CAD/CAM, compression molding, and injection molding - and to test the effect of arch form and palate depth. Nine titanium casts, representing combinations of tapered, ovoid, and square arch forms and shallow, medium, and deep palate depths, were fabricated using electron beam melting (EBM) technology. For each base fabrication method, three poly(vinyl siloxane) impressions were made from each cast, giving 27 dentures per method. Compression-molded dentures were fabricated using Lucitone 199 poly(methyl methacrylate) (PMMA), and injection-molded dentures with Ivobase's Hybrid Pink PMMA. For CAD/CAM, denture bases were designed and milled by Avadent using their Light PMMA. To quantify the space between the denture and the master cast, silicone duplicating material was placed in the intaglio of the dentures, the titanium master cast was seated under pressure, and the silicone was then trimmed and recovered. Three silicone measurements per denture were recorded, for a total of 243 measurements. Each silicone measurement was weighed and adjusted to the surface area of the respective arch, giving an average and standard deviation for each denture. Comparison of manufacturing methods showed a statistically significant difference (p = 0.0001). Using a ratio of the means, compression molding had on average 41% to 47% more space than injection molding and CAD/CAM. Comparison of arch/palate forms also showed a statistically significant difference (p = 0.023), with shallow palate forms having more space with compression molding; the ovoid shallow form showed more space with CAD/CAM and compression molding than with injection molding. Overall, injection molding and CAD/CAM produced equally well-fitting dentures, both fitting better than compression molding. Shallow palates appear to be more affected by shrinkage than medium or deep palates. Shallow ovoid arch forms appear to benefit from

  5. Improved Statistical Fault Detection Technique and Application to Biological Phenomena Modeled by S-Systems.

    Science.gov (United States)

    Mansouri, Majdi; Nounou, Mohamed N; Nounou, Hazem N

    2017-09-01

    In our previous work, we demonstrated the effectiveness of the linear multiscale principal component analysis (PCA)-based moving window (MW)-generalized likelihood ratio test (GLRT) technique over the classical PCA and multiscale principal component analysis (MSPCA)-based GLRT methods. The developed fault detection algorithm provided optimal properties by maximizing the detection probability for a particular false alarm rate (FAR) with different window values. However, most real systems are nonlinear, and the linear PCA method cannot tackle nonlinearity to a great extent. Thus, in this paper, we first apply a nonlinear PCA to obtain an accurate principal component of a set of data and handle a wide range of nonlinearities using the kernel principal component analysis (KPCA) model, one of the most popular nonlinear statistical methods. Second, we extend the MW-GLRT technique to one that applies exponential weights to the residuals in the moving window (instead of equal weighting), since this may further improve fault detection performance by reducing the FAR using an exponentially weighted moving average (EWMA). The developed detection method, called EWMA-GLRT, provides improved properties, such as smaller missed detection rates and FARs and a smaller average run length. The idea behind the developed EWMA-GLRT is to compute a new GLRT statistic that integrates current and previous data information in a decreasing exponential fashion, giving more weight to the more recent data. This provides a more accurate estimation of the GLRT statistic and a stronger memory that enables better decision making with respect to fault detection. A KPCA-based EWMA-GLRT method is therefore developed and utilized in practice to improve fault detection in biological phenomena modeled by S-systems and to enhance monitoring of the process mean. The idea behind a KPCA-based EWMA-GLRT fault detection algorithm is to
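
    As a rough sketch of the KPCA-plus-EWMA monitoring idea (not the authors' EWMA-GLRT statistic, whose exact form is not given in this record), the code below fits a kernel PCA on fault-free data, scores new samples by their reconstruction error in input space, and smooths that score with an EWMA before thresholding. The kernel choice, gamma and the threshold are assumptions.

      import numpy as np
      from sklearn.decomposition import KernelPCA

      rng = np.random.default_rng(5)
      X_train = rng.normal(size=(200, 6))                 # fault-free operating data (assumed)
      X_test = np.r_[rng.normal(size=(80, 6)),
                     rng.normal(loc=1.5, size=(40, 6))]   # fault injected after sample 80

      kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.1,
                       fit_inverse_transform=True).fit(X_train)

      def residual(X):
          # reconstruction error as a simple nonlinear monitoring statistic
          return np.linalg.norm(X - kpca.inverse_transform(kpca.transform(X)), axis=1)

      lam = 0.3
      r = residual(X_test)
      z = np.empty_like(r)
      z[0] = residual(X_train).mean()
      for t in range(1, len(r)):
          z[t] = lam * r[t] + (1 - lam) * z[t - 1]        # exponentially weighted residual

      threshold = residual(X_train).mean() + 3 * residual(X_train).std()
      print("first alarm at sample", int(np.argmax(z > threshold)))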

  6. Radiation dose reduction on multidetector abdominal CT using adaptive statistical iterative reconstruction technique in children

    International Nuclear Information System (INIS)

    Zhang Qifeng; Peng Yun; Duan Xiaomin; Sun Jihang; Yu Tong; Han Zhonglong

    2013-01-01

    Objective: To investigate the feasibility of reducing radiation dose in pediatric multidetector abdominal CT using the adaptive statistical iterative reconstruction technique (ASIR) combined with automated tube current modulation (ATCM). Methods: Thirty patients underwent abdominal CT with ATCM and a follow-up scan with ATCM combined with 40% ASIR. ATCM was used with age-dependent noise index (NI) settings: NI = 9 for 0-5 years old and NI = 11 for > 5 years old in the simple ATCM group; NI = 11 for 0-5 years old and NI = 15 for > 5 years old in the ATCM combined with 40% ASIR group (ASIR group). Two radiologists independently evaluated the images for diagnostic quality and image noise, scoring subjective image quality and image noise on a 5-point scale. Interobserver agreement was assessed by the Kappa test. The volume CT dose indices (CTDIvol) for the two groups were recorded, and the statistical significance of the CTDIvol values was analyzed by paired-sample t test. Results: The average CTDIvol for the ASIR group was (1.38 ± 0.64) mGy, about 60% lower than the (3.56 ± 1.23) mGy of the simple ATCM group; the difference was statistically significant (t = 33.483, P < 0.05). The subjective image quality scores for the simple ATCM group were 4.43 ± 0.57 and 4.37 ± 0.61 (Kappa = 0.878, P < 0.01) and for the ASIR group 4.70 ± 0.47 and 4.60 ± 0.50 (Kappa = 0.783, P < 0.01) by the two observers. The image noise scores for the simple ATCM group were 4.03 ± 0.56 and 3.83 ± 0.53 (Kappa = 0.572, P < 0.01) and for the ASIR group 4.20 ± 0.48 and 4.10 ± 0.48 (Kappa = 0.748, P < 0.01) by the two observers. All images had acceptable diagnostic image quality. Conclusion: A lower radiation dose can be achieved by raising the NI with ASIR in pediatric abdominal CT studies while maintaining diagnostically acceptable images. (authors)

  7. Laparoscopic cystostomy in pigs: Technique and comparison with traditional open cystostomy for surgical stress

    Directory of Open Access Journals (Sweden)

    Hua Zhang

    2014-01-01

    Full Text Available Cystostomy is a common procedure in veterinary surgery. We describe a technique for laparoscopic cystostomy (LC group; n = 7) in Bama miniature pigs and compare the surgical stress induced by this procedure to open cystostomy (OC group; n = 7). A three-portal approach was used for laparoscopic cystostomy. First, we placed 2 simple interrupted sutures between the ventral body wall and the urinary bladder. Then, a purse-string suture was placed in the urinary bladder wall, approximately 1 cm cranial to the two sutures. A stab incision was made at the center of the purse-string suture and a 12-F Foley catheter advanced into the urinary bladder; the suture was then pulled tight and tied. Again, two interrupted sutures were placed 1 cm cranial to the catheter, between the ventral body wall and the bladder, to establish cystopexy. The extracorporeal portion of the catheter was fixed to the skin by a finger-trap suture. Blood samples were collected to measure the white blood cell count and serum concentrations of cortisol, interleukin-6, and C-reactive protein; follow-up laparoscopy was performed 1 month after surgery. Laparoscopic cystostomy was successfully performed in all the pigs; the mean operating time was 43 ± 5 min. The levels of the stress markers reflected a lower stress response for LC than OC. Thus, LC appears to be better than OC both in terms of technique and the physiological responses elicited, and may be more suitable than OC for creating experimental animal models for investigations of urinary diseases and those requiring diversion of urine flow.

  8. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    Science.gov (United States)

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  9. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    Directory of Open Access Journals (Sweden)

    Khalifa M. Al-Kindi

    2017-08-01

    Full Text Available In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human-practice-related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local- and regional-level integrated pest management strategies for palm trees and other affected cultivated crops.

  10. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    Science.gov (United States)

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods consisted of experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…

  11. An Automated Technique to Construct a Knowledge Base of Traditional Chinese Herbal Medicine for Cancers: An Exploratory Study for Breast Cancer.

    Science.gov (United States)

    Nguyen, Phung Anh; Yang, Hsuan-Chia; Xu, Rong; Li, Yu-Chuan Jack

    2018-01-01

    Traditional Chinese Medicine (TCM) utilization has rapidly increased worldwide. However, only limited databases provide information on TCM herbs and diseases. The study aims to identify and evaluate meaningful associations between TCM herbs and breast cancer using association rule mining (ARM) techniques. We applied ARM techniques to 19.9 million TCM prescriptions from the Taiwan National Health Insurance claims database, covering 1999 to 2013. 364 TCM herb-breast cancer associations were derived from those prescriptions and filtered by a support of 20, leaving 296 associations. These were evaluated against a gold standard of curated information from Chinese Wikipedia using the terms cancer, tumor, and malignant. All 14 TCM herb-breast cancer associations with a confidence of 1% were valid when compared to the gold standard; for other confidence levels, the statistical results showed consistently high precision. We thus succeeded in identifying TCM herb-breast cancer associations with useful techniques.
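
    Support and confidence, the two ARM measures used above, can be computed directly. A minimal sketch over toy prescription data (herb names are placeholders, not findings from the study):

      # toy prescriptions: each is the set of herbs co-prescribed in one visit
      prescriptions = [
          {"herb_A", "herb_B"}, {"herb_A", "herb_C"}, {"herb_A", "herb_B", "herb_C"},
          {"herb_B"}, {"herb_A"},
      ]

      def support(itemset):
          """Fraction of prescriptions containing every herb in itemset."""
          return sum(itemset <= rx for rx in prescriptions) / len(prescriptions)

      def confidence(antecedent, consequent):
          """Estimated P(consequent in rx | antecedent in rx)."""
          return support(antecedent | consequent) / support(antecedent)

      print(support({"herb_A"}))                  # 0.8
      print(confidence({"herb_A"}, {"herb_B"}))   # 0.5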

  12. Multiparametric statistics

    CERN Document Server

    Serdobolskii, Vadim Ivanovich

    2007-01-01

    This monograph presents the mathematical theory of statistical models described by an essentially large number of unknown parameters, comparable with the sample size or even much larger. In this sense, the proposed theory can be called "essentially multiparametric". It is developed on the basis of the Kolmogorov asymptotic approach, in which the sample size increases along with the number of unknown parameters. This theory opens a way to the solution of central problems of multivariate statistics that until now have remained unsolved. Traditional statistical methods based on the idea of infinite sampling often break down when applied to real problems and, depending on the data, can be inefficient, unstable and even inapplicable. In this situation, practical statisticians are forced to use various heuristic methods in the hope that they will find a satisfactory solution. The mathematical theory developed in this book presents a regular technique for implementing new, more efficient versions of statistical procedures. ...

  13. Airfoil shape optimization using non-traditional optimization technique and its validation

    Directory of Open Access Journals (Sweden)

    R. Mukesh

    2014-07-01

    Full Text Available Computational fluid dynamics (CFD) is one of the computer-based solution methods most widely employed in aerospace engineering. The computational power and time required to carry out the analysis increase as the fidelity of the analysis increases. Aerodynamic shape optimization has become a vital part of aircraft design in recent years. Generally, to optimize an airfoil we have to describe it, and that requires at least a hundred points of x and y coordinates; it is really difficult to optimize airfoils with this large number of coordinates. Nowadays many different parameter schemes are used to describe a general airfoil, such as B-splines and PARSEC. The main goal of these parameterization schemes is to reduce the number of needed parameters to as few as possible while controlling the important aerodynamic features effectively. Here the work has been done on the PARSEC geometry representation method. The objective of this work is to introduce a way of describing a general airfoil using twelve parameters by representing its shape as a polynomial function, and to apply a genetic algorithm to optimize the aerodynamic characteristics of a general airfoil for specific conditions. A MATLAB program has been developed to implement PARSEC, the panel technique and the genetic algorithm. This program has been tested on a standard NACA 2411 airfoil and used to improve its coefficient of lift. Pressure distributions and lift coefficients for the airfoil geometries were calculated using the panel method. The optimized airfoil has an improved coefficient of lift compared to the original one, and was validated using wind tunnel data.
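
    A minimal genetic algorithm loop of the kind described - selection, crossover and mutation over a real-valued vector standing in for the twelve PARSEC parameters - written here in Python rather than the authors' MATLAB; the fitness function is a placeholder, not a panel-method lift computation:

      import numpy as np

      rng = np.random.default_rng(6)
      N_PARAMS, POP, GEN = 12, 40, 100    # 12 PARSEC-like parameters, assumed bounds [0, 1]

      def fitness(x):
          # placeholder objective standing in for "lift coefficient from the panel method"
          return -np.sum((x - 0.3) ** 2)

      pop = rng.random((POP, N_PARAMS))
      for _ in range(GEN):
          f = np.array([fitness(ind) for ind in pop])
          parents = pop[np.argsort(f)[-POP // 2:]]            # truncation selection
          kids = []
          while len(kids) < POP - len(parents):
              a, b = parents[rng.integers(len(parents), size=2)]
              cut = rng.integers(1, N_PARAMS)
              child = np.r_[a[:cut], b[cut:]]                 # one-point crossover
              child += rng.normal(scale=0.02, size=N_PARAMS) * (rng.random(N_PARAMS) < 0.1)
              kids.append(np.clip(child, 0.0, 1.0))           # mutation within bounds
          pop = np.vstack([parents, kids])

      best = pop[np.argmax([fitness(ind) for ind in pop])]
      print("best fitness:", round(fitness(best), 4))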

  14. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    Science.gov (United States)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
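
    Sequential Gaussian simulation itself is not reproduced here, but its core idea - drawing spatially correlated realizations whose correlation decays with lag distance according to a variogram model - can be illustrated with a direct covariance-matrix (Cholesky) simulation. The exponential covariance, correlation range and GSI statistics below are assumptions, not values from the Ok Tedi study:

      import numpy as np

      rng = np.random.default_rng(7)
      x = np.linspace(0, 100, 200)        # 1-D transect through the rock mass (m)

      # exponential covariance C(h) = exp(-h / a) with correlation range a (assumed)
      a = 15.0
      h = np.abs(x[:, None] - x[None, :])
      C = np.exp(-h / a) + 1e-10 * np.eye(len(x))    # jitter for numerical stability

      L = np.linalg.cholesky(C)
      gsi_mean, gsi_sd = 55.0, 8.0                   # illustrative GSI statistics
      realizations = gsi_mean + gsi_sd * (L @ rng.normal(size=(len(x), 5)))
      print(realizations.shape)                      # five spatially correlated GSI profiles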

  15. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    Science.gov (United States)

    Fenta Mekonnen, Dagnenet; Disse, Markus

    2018-04-01

    Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and their coarse spatial resolutions make it difficult to use GCM outputs directly, especially for sustainable water management at the regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) for the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation results illustrate that both downscaling techniques (LARS-WG and SDSM) show a comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done with equally weighted and with varying weights of statistical indexes, for precipitation only. The evaluation showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and the distribution of daily precipitation over the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios with the LARS-WG model. The ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 % while the change in mean annual Tmax may increase from 0.4 to 4.3

  16. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    Directory of Open Access Journals (Sweden)

    D. Fenta Mekonnen

    2018-04-01

    Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and their coarse spatial resolutions make it difficult to use GCM outputs directly, especially for sustainable water management at the regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) for the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation results illustrate that both downscaling techniques (LARS-WG and SDSM) show a comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done with equally weighted and with varying weights of statistical indexes, for precipitation only. The evaluation showed that SDSM using the canESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and the distribution of daily precipitation over the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios with the LARS-WG model. The ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin. The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase
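
    As a concrete illustration of the regression step at the core of SDSM-style downscaling, the sketch below fits a local station variable to large-scale predictors and applies the fit to shifted scenario fields. It is a minimal Python sketch with synthetic data, not the SDSM software itself; all arrays and coefficients are placeholders.

        # Regression-based statistical downscaling sketch with synthetic data.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        days = 3650
        predictors = rng.standard_normal((days, 5))   # e.g. reanalysis fields
        tmax_local = (25.0
                      + predictors @ np.array([1.2, -0.4, 0.8, 0.1, 0.3])
                      + rng.normal(0.0, 1.5, days))   # observed local Tmax

        model = LinearRegression().fit(predictors, tmax_local)
        scenario = rng.standard_normal((days, 5)) + 0.3  # shifted GCM scenario
        tmax_future = model.predict(scenario)
        print(f"projected change: {tmax_future.mean() - tmax_local.mean():+.2f} degC")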

  17. Monitoring and Modeling the Impact of Grazers Using Visual, Remote and Traditional Field Techniques

    Science.gov (United States)

    Roadknight, C. M.; Marshall, I. W.; Rose, R. J.

    2009-04-01

    The relationship between wild and domestic animals and the landscape they graze upon is important to soil erosion studies, because grazers strongly influence vegetation cover (a key control on the rate of overland flow runoff), contribute directly to sediment transport via carriage, and contribute indirectly by exposing fresh soil through trampling and burrowing/excavating. Quantifying the impacts of these effects on soil erosion, and their dependence on grazing intensity, in complex semi-natural habitats has proved difficult, owing to a lack of manpower to collect sufficient data and weak standardization of data collection between observers. The advent of cheaper and more sophisticated digital camera technology and GPS tracking devices has led to an increase in the amount of habitat monitoring information being collected. We report on the use of automated trail cameras to continuously capture images of grazer (sheep, rabbit, deer) activity in a variety of habitats at the Moor House nature reserve in northern England. As well as grazer activity, these cameras also give valuable information on key climatic soil erosion factors such as snow, rain and wind, and on plant growth, and thus allow the importance of a range of grazer activities and the grazing intensity to be estimated. GPS collars and more well-established survey methods (erosion monitoring, dung counting and vegetation surveys) are being used to generate a detailed representation of land usage and to plan camera siting. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected, and proposes online and offline systems that can reduce the data processing time and increase focus on important subsets of the collected data. We also present a land usage model that estimates grazing intensity, grazer behaviours and their impact on soil coverage at sites where cameras have not been deployed, based on generalising from camera sites to other

  18. A Review of Statistical Techniques for 2x2 and RxC Categorical Data Tables In SPSS

    Directory of Open Access Journals (Sweden)

    Cengiz BAL

    2009-11-01

    In this study, statistical techniques for 2x2 and RxC categorical data tables are reviewed in detail. Emphasis is given to the association between techniques and their corresponding data considerations. Suggestions on how to handle specific categorical data tables in SPSS are given, and common mistakes in the interpretation of SPSS outputs are shown.
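
    For reference, the two workhorse tests reviewed for such tables are easy to reproduce outside SPSS. The sketch below (Python with scipy; the counts are invented for illustration) runs the chi-square test of independence on an RxC table and Fisher's exact test on a sparse 2x2 table.

        # Chi-square test of independence (RxC) and Fisher's exact test (2x2).
        import numpy as np
        from scipy.stats import chi2_contingency, fisher_exact

        rxc = np.array([[20, 30, 25],
                        [15, 45, 10]])
        chi2, p, dof, expected = chi2_contingency(rxc)
        print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")

        # When expected counts are small, a 2x2 table calls for Fisher's exact test.
        odds, p_exact = fisher_exact([[3, 9], [10, 4]])
        print(f"odds ratio={odds:.2f}, exact p={p_exact:.4f}")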

  19. Statistical techniques for modeling extreme price dynamics in the energy market

    International Nuclear Information System (INIS)

    Mbugua, L N; Mwita, P N

    2013-01-01

    Extreme events have a large impact across engineering, science and economics, because they often lead to failures and losses arising from the unobservable nature of extraordinary occurrences. In this context, this paper focuses on appropriate statistical methods combining a quantile regression approach with extreme value theory to model the excesses; this plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when little is known about the functional form of the object being estimated. Conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the problem of inadequate observations at the tail of the distribution function. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method used can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modelling.
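
    The peaks-over-threshold step described above can be sketched as follows. This is a minimal Python illustration with a synthetic heavy-tailed return series standing in for fuel prices; the 95% quantile threshold is an assumed choice.

        # Peaks-over-threshold: fit a generalized Pareto to threshold excesses.
        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(3)
        returns = rng.standard_t(df=4, size=5000) * 0.01  # heavy-tailed stand-in

        u = np.quantile(returns, 0.95)                    # threshold choice
        excesses = returns[returns > u] - u
        shape, loc, scale = genpareto.fit(excesses, floc=0)

        # Tail-risk estimate: probability of exceeding a level z > u.
        z = u + 0.05
        p_tail = (excesses.size / returns.size) * genpareto.sf(z - u, shape,
                                                               loc=0, scale=scale)
        print(f"P(return > {z:.3f}) = {p_tail:.2e}")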

  20. Renormalization group theory outperforms other approaches in statistical comparison between upscaling techniques for porous media

    Science.gov (United States)

    Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.

    2017-09-01

    Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation, with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly due to its ability to address the tensorial nature of the coefficients. As implemented here, MG places low computational demands, and accurate flow-rate estimates are obtained when using MG-upscaled permeabilities that approach or are beyond the percolation threshold.
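
    The following is a deliberately crude Python sketch of block upscaling, using a geometric mean over 2x2 blocks as a stand-in for the paper's mode-elimination renormalization-group scheme (which, unlike this sketch, handles the tensorial nature of the coefficients).

        # Iterative 2x2 block coarsening of a scalar permeability field.
        import numpy as np

        rng = np.random.default_rng(4)
        k = np.exp(rng.normal(0, 1, size=(64, 64)))  # lognormal permeability

        def coarsen(field):
            a = field[0::2, 0::2]; b = field[1::2, 0::2]
            c = field[0::2, 1::2]; d = field[1::2, 1::2]
            return (a * b * c * d) ** 0.25  # geometric mean per 2x2 block

        levels = [k]
        while levels[-1].shape[0] > 4:
            levels.append(coarsen(levels[-1]))
        print([lvl.shape for lvl in levels])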

  1. Evaluation of significantly modified water bodies in Vojvodina by using multivariate statistical techniques

    Directory of Open Access Journals (Sweden)

    Vujović Svetlana R.

    2013-01-01

    This paper illustrates the utility of multivariate statistical techniques for the analysis and interpretation of water quality data sets and the identification of pollution sources/factors, with a view to obtaining better information about water quality and designing monitoring networks for effective management of water resources. Multivariate statistical techniques, such as factor analysis (FA)/principal component analysis (PCA) and cluster analysis (CA), were applied for the evaluation of variations and the interpretation of a water quality data set of natural water bodies obtained during the 2010 monitoring year, covering 13 parameters at 33 different sites. FA/PCA attempts to explain the correlations between the observations in terms of underlying factors, which are not directly observable. Factor analysis was applied to the physico-chemical parameters of natural water bodies with the aim of classification and data summation, as well as segmentation of heterogeneous data sets into smaller homogeneous subsets. Factor loadings were categorized as strong and moderate, corresponding to absolute loading values of >0.75 and 0.75-0.50, respectively. Four principal factors were obtained with eigenvalues >1, explaining more than 78 % of the total variance in the water data sets, which is adequate to give good prior information regarding the data structure. Each factor that is significantly related to specific variables represents a different dimension of water quality. The first factor F1, accounting for 28 % of the total variance, represents the hydrochemical dimension of water quality. The second factor F2, accounting for 18 % of the total variance, may be taken as a factor of water eutrophication. The third factor F3, accounting for 17 % of the total variance, represents the influence of point sources of pollution on water quality. The fourth factor F4, accounting for 13 % of the total variance, may be taken as an ecological dimension of water quality. Cluster analysis (CA) is an
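
    The FA/PCA step with the eigenvalue > 1 (Kaiser) retention rule can be sketched as follows. This minimal Python example uses synthetic data of the same shape (33 sites x 13 parameters) and unrotated components rather than the rotated factors of a full FA.

        # PCA on standardized water-quality data with the Kaiser criterion.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(5)
        X = rng.standard_normal((33, 13))   # 33 sites x 13 parameters (synthetic)
        Z = StandardScaler().fit_transform(X)

        pca = PCA().fit(Z)
        eigenvalues = pca.explained_variance_
        keep = eigenvalues > 1.0            # Kaiser criterion
        loadings = pca.components_[keep].T * np.sqrt(eigenvalues[keep])
        print(f"retained {keep.sum()} factors explaining "
              f"{pca.explained_variance_ratio_[keep].sum():.0%} of variance")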

  2. From psychotherapy to e-therapy: the integration of traditional techniques and new communication tools in clinical settings.

    Science.gov (United States)

    Castelnuovo, Gianluca; Gaggioli, Andrea; Mantovani, Fabrizia; Riva, Giuseppe

    2003-08-01

    Technology is starting to influence psychological fields. In particular, computer-mediated communication (CMC) is providing new tools that can be fruitfully applied in psychotherapy. These new technologies do not substitute for traditional techniques and approaches, but they can be integrated into the clinical process, enhancing particular steps of it or making them easier. This paper focuses on the concept of e-therapy as a new modality of helping people resolve life and relationship issues, one that utilizes the power and convenience of the Internet to allow synchronous and asynchronous communication between patient and therapist. It is important to underline that e-therapy is not an alternative treatment, but a resource that can be added to traditional psychotherapy. The paper also discusses how different forms of CMC can be fruitfully applied in psychology and psychotherapy, evaluating their effectiveness in clinical practice. To enhance the diffusion of e-therapy, further research is needed to evaluate all the pros and cons.

  3. Are we really measuring what we say we're measuring? Using video techniques to supplement traditional construct validation procedures.

    Science.gov (United States)

    Podsakoff, Nathan P; Podsakoff, Philip M; Mackenzie, Scott B; Klinger, Ryan L

    2013-01-01

    Several researchers have persuasively argued that the most important evidence to consider when assessing construct validity is whether variations in the construct of interest cause corresponding variations in the measures of the focal construct. Unfortunately, the literature provides little practical guidance on how researchers can go about testing this. Therefore, the purpose of this article is to describe how researchers can use video techniques to test whether their scales measure what they purport to measure. First, we discuss how researchers can develop valid manipulations of the focal construct that they hope to measure. Next, we explain how to design a study to use this manipulation to test the validity of the scale. Finally, comparing and contrasting traditional and contemporary perspectives on validation, we discuss the advantages and limitations of video-based validation procedures. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  4. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    Science.gov (United States)

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

    This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment, such as in the Pacific Northwest (r2 ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r2 ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than

  5. TRAN-STAT: statistics for environmental studies, Number 22. Comparison of soil-sampling techniques for plutonium at Rocky Flats

    International Nuclear Information System (INIS)

    Gilbert, R.O.; Bernhardt, D.E.; Hahn, P.B.

    1983-01-01

    A summary of a field soil sampling study conducted around the Rocky Flats, Colorado, plant in May 1977 is presented. Several different soil sampling techniques that had been used in the area were applied at four different sites. One objective was to compare the average 239-240Pu concentration values obtained by the various soil sampling techniques used. There was also interest in determining whether there are differences in the reproducibility of the various techniques and how the techniques compare with the proposed EPA technique of sampling to 1 cm depth. Statistically significant differences in average concentrations between the techniques were found. The differences could be largely related to differences in sampling depth, the primary physical variable between the techniques. The reproducibility of the techniques was evaluated by comparing coefficients of variation. Differences between coefficients of variation were not statistically significant. Average (median) coefficients ranged from 21 to 42 percent for the five sampling techniques. A laboratory study indicated that various sample treatment and particle sizing techniques could increase the concentration of plutonium in the less than 10 micrometer size fraction by up to a factor of about 4 compared to the 2 mm size fraction.

  6. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Abbas Alkarkhi, F.M.; Ismail, Norli; Easa, Azhar Mat

    2008-01-01

    Cockle (Anadara granosa) samples obtained from two rivers in the Penang state of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg), using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points each, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques, namely multivariate analysis of variance (MANOVA) and discriminant analysis (DA), were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result in identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) while affording more than 72% correct assignations. Results indicated that the two rivers were different in terms of As and heavy metal contents in cockles, and that the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd and Cr, whereas negative correlations were exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
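
    The discriminant analysis step can be sketched in a few lines. The Python example below uses synthetic Zn and Cd concentrations for the two sites, standing in for the measured data, and reports the proportion of correct assignations.

        # Linear discriminant analysis separating two sites by metal content.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(6)
        juru = rng.normal([80, 1.2], [10, 0.3], size=(20, 2))    # Zn, Cd site 1
        jejawi = rng.normal([60, 0.6], [10, 0.2], size=(20, 2))  # Zn, Cd site 2
        X = np.vstack([juru, jejawi])
        y = np.array([0] * 20 + [1] * 20)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print(f"correct assignations: {lda.score(X, y):.0%}")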

  7. Correlation analysis of energy indicators for sustainable development using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Carneiro, Alvaro Luiz Guimaraes; Santos, Francisco Carlos Barbosa dos

    2007-01-01

    Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at all levels, local, regional and global: the combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of the IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)
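
    A correlation screen of the kind described can be sketched as below; the indicator names and values are illustrative placeholders, not IBGE data.

        # Pairwise correlation matrix across indicator series with pandas.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(7)
        df = pd.DataFrame({
            "energy_use_per_capita": rng.normal(100, 10, 50),
            "gdp_per_capita": rng.normal(8000, 900, 50),
            "co2_per_capita": rng.normal(2.0, 0.4, 50),
        })
        corr = df.corr(method="spearman")  # rank correlation, robust to outliers
        print(corr.round(2))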

  8. Application of multivariate statistical technique for hydrogeochemical assessment of groundwater within the Lower Pra Basin, Ghana

    Science.gov (United States)

    Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.

    2017-06-01

    Multivariate statistical techniques and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using Varimax with Kaiser normalization as the method of extraction, for both the rotated space and the component matrix, was applied to the data. Results show that Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin, together accounting for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics; Component 2 delineates the incongruent dissolution of silicates/aluminosilicates; and Component 3 delineates the prevalence of pollution, principally from agricultural inputs, as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relations among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal

  9. Source Identification of Heavy Metals in Soils Surrounding the Zanjan Zinc Town by Multivariate Statistical Techniques

    Directory of Open Access Journals (Sweden)

    M.A. Delavar

    2016-02-01

    Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food safety issues, potential health risks, and detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them into human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide essential information for better management of polluted soils, to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of the probable sources that control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, the Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sampling locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via atomic absorption spectroscopy (AAS). Data were statistically analyzed using the SPSS software version 17.0 for Windows. Correlation matrix (CM), principal component analysis (PCA) and factor analysis (FA) techniques were performed in order to identify the probable sources of HMs in the studied soils. Results and

  10. The adaptive statistical iterative reconstruction-V technique for radiation dose reduction in abdominal CT: comparison with the adaptive statistical iterative reconstruction technique.

    Science.gov (United States)

    Kwon, Heejin; Cho, Jinhan; Oh, Jongyeong; Kim, Dongwon; Cho, Junghyun; Kim, Sanghyun; Lee, Sangyun; Lee, Jihyun

    2015-10-01

    To investigate whether reduced radiation dose abdominal CT images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) compromise the depiction of clinically relevant features when compared with the currently used routine radiation dose CT images reconstructed with ASIR. 27 consecutive patients (mean body mass index: 23.55 kg m(-2)) underwent CT of the abdomen at two time points. At the first time point, abdominal CT was scanned at a noise index level of 21.45 with automatic current modulation at 120 kV. Images were reconstructed with 40% ASIR, the routine protocol of Dong-A University Hospital. At the second time point, follow-up scans were performed at a noise index level of 30. Images were reconstructed with filtered back projection (FBP), 40% ASIR, 30% ASIR-V, 50% ASIR-V and 70% ASIR-V for the reduced radiation dose. Both quantitative and qualitative analyses of image quality were conducted. The CT dose index was also recorded. At the follow-up study, the mean dose reduction relative to the currently used routine radiation dose was 35.37% (range: 19-49%). The overall subjective image quality and diagnostic acceptability scores of 50% ASIR-V at the reduced radiation dose were nearly identical to those recorded when using the initial routine-dose CT with 40% ASIR. Subjective ratings of the qualitative analysis revealed that, of all the reduced radiation dose CT series reconstructed, 30% ASIR-V and 50% ASIR-V were associated with higher image quality, with lower noise and artefacts as well as good sharpness, when compared with 40% ASIR and FBP. However, the sharpness score at 70% ASIR-V was considered to be worse than that at 40% ASIR. Objective image noise for 50% ASIR-V was 34.24% and 46.34% lower than that for 40% ASIR and FBP, respectively. Abdominal CT images reconstructed with ASIR-V facilitate radiation dose reductions of up to 35% when compared with ASIR. This study represents the first clinical research experiment to use ASIR-V, the newest version of

  11. A statistical forecast model using the time-scale decomposition technique to predict rainfall during flood period over the middle and lower reaches of the Yangtze River Valley

    Science.gov (United States)

    Hu, Yijia; Zhong, Zhong; Zhu, Yimin; Ha, Yao

    2018-04-01

    In this paper, a statistical forecast model using a time-scale decomposition method is established for seasonal prediction of the rainfall during the flood period (FPR) over the middle and lower reaches of the Yangtze River Valley (MLYRV). The method decomposes the rainfall over the MLYRV into three time-scale components: the interannual component with a period of less than 8 years, the interdecadal component with a period of 8 to 30 years, and the component with a period longer than 30 years. Then, predictors are selected for the three time-scale components of FPR through correlation analysis. Finally, a statistical forecast model is established using the multiple linear regression technique to predict the three time-scale components of FPR, respectively. The results show that this forecast model can capture the interannual and interdecadal variation of FPR. The hindcast of FPR for the 14 years from 2001 to 2014 shows that FPR can be predicted successfully in 11 out of the 14 years. This forecast model performs better than a model using the traditional scheme without time-scale decomposition. Therefore, the statistical forecast model using the time-scale decomposition technique has good skill and application value in the operational prediction of FPR over the MLYRV.
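
    A minimal sketch of the scheme, assuming a simple moving-average filter for the time-scale separation and synthetic predictor indices, is given below; the paper's actual decomposition and predictor selection are more elaborate.

        # Decompose a rainfall series into fast and slow components, then fit
        # separate linear predictor regressions to each and sum the forecasts.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(8)
        years = 60
        rain = rng.normal(500, 80, years)

        def moving_mean(x, w):
            return np.convolve(x, np.ones(w) / w, mode="same")

        slow = moving_mean(rain, 9)       # roughly >= 8-year time scale
        fast = rain - slow                # interannual component

        pred_fast = rng.standard_normal((years, 3))  # e.g. SST indices (synthetic)
        pred_slow = rng.standard_normal((years, 2))  # e.g. decadal indices
        m_fast = LinearRegression().fit(pred_fast, fast)
        m_slow = LinearRegression().fit(pred_slow, slow)
        forecast = m_fast.predict(pred_fast[-1:]) + m_slow.predict(pred_slow[-1:])
        print(f"next-season FPR forecast: {forecast[0]:.1f} mm")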

  12. Statistical zonation technique and its application to the San Andres reservoir in the Poza Rica area, Vera Cruz, Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Campa, M F; Romero, M R

    1969-01-01

    A statistical zonation technique developed by J.D. Testerman is presented, with reference to its application to the San Andres reservoir in the Poza Rica area of Veracruz, Mexico. The method is based on a statistical technique which permits grouping of similar values of a given parameter, e.g., porosity, for individual wells within a field. The resulting groups, or zones, are used in a correlation analysis to deduce whether there is continuity of porosity in any direction. In the San Andres reservoir, there is continuity of the porous media in the NE-SW direction. This is an important fact for the waterflooding project being carried out.
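
    The heart of a Testerman-style zonation is choosing boundaries that maximize between-zone variance relative to total variance. Below is a minimal Python sketch of a single-boundary version on a synthetic porosity log; a full implementation would split recursively and test the zonation index against a stopping criterion.

        # Find the single split of a porosity log that maximizes the
        # zonation index R = 1 - (within-zone SS / total SS).
        import numpy as np

        def best_split(porosity):
            n = len(porosity)
            total = porosity.var(ddof=0) * n      # total sum of squares
            best_r, best_k = -np.inf, None
            for k in range(1, n):                 # boundary after sample k
                z1, z2 = porosity[:k], porosity[k:]
                within = z1.var(ddof=0) * len(z1) + z2.var(ddof=0) * len(z2)
                r = 1.0 - within / total          # zonation index in [0, 1]
                if r > best_r:
                    best_r, best_k = r, k
            return best_k, best_r

        rng = np.random.default_rng(9)
        log = np.concatenate([rng.normal(0.12, 0.01, 30),
                              rng.normal(0.20, 0.01, 25)])
        print(best_split(log))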

  13. Leaf proteome analysis of clematis chinensis: a traditional chinese medicine (tcm) by two-dimensional electrophoresis technique

    International Nuclear Information System (INIS)

    Ishtiaq, M.; Maqbool, M.; Hussaini, T.; Azami, S.

    2014-01-01

    The leaf proteome of Clematis chinensis, a traditional Chinese medicine (TCM), was analyzed by the two-dimensional electrophoresis (2-DE) technique. The samples were extracted by the phenol-SDS method (PSM), with high protein yields of 2.35 and 0.345 mg/g (yield/dw). Proteins were visualized by staining the gels with silver stain and CBB. The gel images of each species were compared with Image Master 2D Platinum software for analysis. The 2-DE profile showed a distribution of 1085 spots, of which only 255 protein spots (23.5%) were common to all analyzed taxa. The visualized protein spots showed a pI range from 3.0 to 10.0 (pH) and Mr from 7 kDa to 70 kDa. Twelve proteins were exclusively specific to C. chinensis when compared with its allies, C. finetiana and C. armandii, and may be used as biomarkers. Thirteen proteins were up-regulated in C. finetiana (0.75-0.95 fold) and twelve proteins in C. armandii (1.05-1.66 fold), whilst seven proteins were down-regulated (0.66-0.94 fold) in the former and three proteins (1.07-1.20 fold) in the latter, in comparison with C. chinensis. Twenty-five differential and common protein spots were picked and analyzed by the LC-MS/MS technique. The identified proteins are related to energy metabolism (ATP synthesis), photosynthesis, environmental stimuli, the regulation of RNA metabolism, growth hormone regulators, evolutionary trends and gene expression. The efficiency and applicability of the proteomic approach as a biomarker for the identification of C. chinensis is discussed from a quality control (QC) perspective. The leaf proteins of Clematis plants are explored for the first time by the 2-DE technique and discussed with respect to their metabolic roles. (author)

  14. Statistical techniques applied to aerial radiometric surveys (STAARS): series introduction and the principal-components-analysis method

    International Nuclear Information System (INIS)

    Pirkle, F.L.

    1981-04-01

    STAARS is a new series being published to disseminate information concerning statistical procedures for interpreting aerial radiometric data. The application of particular data interpretation techniques to geological understanding, for delineating regions favorable to uranium deposition, is the primary concern of STAARS. Statements concerning the utility of each technique on aerial reconnaissance data, as well as on detailed aerial survey data, will be included.

  15. Assessment of roadside surface water quality of Savar, Dhaka, Bangladesh using GIS and multivariate statistical techniques

    Science.gov (United States)

    Ahmed, Fahad; Fakhruddin, A. N. M.; Imam, MD. Toufick; Khan, Nasima; Abdullah, Abu Tareq Mohammad; Khan, Tanzir Ahmed; Rahman, Md. Mahfuzur; Uddin, Mohammad Nashir

    2017-11-01

    In this study, multivariate statistical techniques, in combination with GIS, are used to assess the roadside surface water quality of the Savar region. Nineteen water samples were collected in the dry season, and 15 water quality parameters including TSS, TDS, pH, DO, BOD, Cl⁻, F⁻, NO₃²⁻, NO₂⁻, SO₄²⁻, Ca, Mg, K, Zn and Pb were measured. The univariate overview of the water quality parameters is: TSS 25.154 ± 8.674 mg/l, TDS 840.400 ± 311.081 mg/l, pH 7.574 ± 0.256 pH units, DO 4.544 ± 0.933 mg/l, BOD 0.758 ± 0.179 mg/l, Cl⁻ 51.494 ± 28.095 mg/l, F⁻ 0.771 ± 0.153 mg/l, NO₃²⁻ 2.211 ± 0.878 mg/l, NO₂⁻ 4.692 ± 5.971 mg/l, SO₄²⁻ 69.545 ± 53.873 mg/l, Ca 48.458 ± 22.690 mg/l, Mg 19.676 ± 7.361 mg/l, K 12.874 ± 11.382 mg/l, Zn 0.027 ± 0.029 mg/l, Pb 0.096 ± 0.154 mg/l. The water quality data were subjected to R-mode PCA, which resulted in five major components. PC1 explains 28% of the total variance and indicates that roadside and brick field dust settles (TDS, TSS) in the nearby water bodies. PC2 explains 22.123% of the total variance and indicates agricultural influence (K, Ca, and NO₂⁻). PC3 describes the contribution of nonpoint pollution from agricultural and soil erosion processes (SO₄²⁻, Cl⁻, and K). PC4 is heavily positively loaded by vehicle emissions and diffusion from battery stores (Zn, Pb). PC5 shows strong positive loading of BOD and strong negative loading of pH. Cluster analysis yields three major clusters for both water parameters and sampling sites. The site clusters show a grouping pattern similar to that of the R-mode factor score map. The present work opens a new scope for monitoring roadside water quality for future research in Bangladesh.

  16. A Comparison of Selected Statistical Techniques to Model Soil Cation Exchange Capacity

    Science.gov (United States)

    Khaledian, Yones; Brevik, Eric C.; Pereira, Paulo; Cerdà, Artemi; Fattah, Mohammed A.; Tazikeh, Hossein

    2017-04-01

    Cation exchange capacity (CEC) measures the soil's ability to hold positively charged ions and is an important indicator of soil quality (Khaledian et al., 2016). However, other soil properties are more commonly determined and reported, such as texture, pH, organic matter and biology. We attempted to predict CEC using several advanced statistical methods, including monotone analysis of variance (MONANOVA), artificial neural networks (ANNs), principal components regression (PCR), and particle swarm optimization (PSO), in order to compare the utility of these approaches and identify the best predictor. We analyzed 170 soil samples from four nations (USA, Spain, Iran and Iraq) under three land uses (agriculture, pasture, and forest). Seventy percent of the samples (120 samples) were selected as the calibration set and the remaining 50 samples (30%) were used as the prediction set. The results indicated that MONANOVA (R2 = 0.82 and root mean squared error (RMSE) = 6.32) and ANNs (R2 = 0.82 and RMSE = 5.53) were the best models to estimate CEC; PSO (R2 = 0.80 and RMSE = 5.54) and PCR (R2 = 0.70 and RMSE = 6.48) also worked well, and the overall results were very similar to each other. Clay (positively correlated) and sand (negatively correlated) were the most influential variables for predicting CEC for the entire data set, while the most influential variables for the various countries and land uses differed, with CEC affected by different variables in different situations. Although MONANOVA and ANNs provided good predictions of the entire dataset, PSO yields a formula to estimate soil CEC from commonly tested soil properties. Therefore, PSO shows promise as a technique to estimate soil CEC. Establishing effective pedotransfer functions to predict CEC would be productive where there are limitations of time and money and other commonly analyzed soil properties are available. References: Khaledian, Y., Kiani, F., Ebrahimi, S., Brevik, E.C., Aitkenhead
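
    As an illustration of the ANN branch of this comparison, the sketch below fits a small multilayer perceptron to synthetic clay, sand and organic-matter data with the paper's 70/30 calibration/validation split; the data-generating coefficients are invented.

        # ANN regression of CEC on commonly measured soil properties (synthetic).
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(10)
        n = 170
        clay = rng.uniform(5, 60, n)
        sand = rng.uniform(10, 80, n)
        om = rng.uniform(0.5, 6, n)
        cec = (0.5 * clay - 0.1 * sand + 3.0 * om
               + rng.normal(0, 2, n))  # invented relationship for illustration

        X = np.column_stack([clay, sand, om])
        X_cal, X_val, y_cal, y_val = train_test_split(X, cec, test_size=0.3,
                                                      random_state=0)
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
        ).fit(X_cal, y_cal)
        rmse = np.sqrt(np.mean((model.predict(X_val) - y_val) ** 2))
        print(f"validation RMSE: {rmse:.2f}")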

  17. The Combination of DGT Technique and Traditional Chemical Methods for Evaluation of Cadmium Bioavailability in Contaminated Soils with Organic Amendment

    Science.gov (United States)

    Yao, Yu; Sun, Qin; Wang, Chao; Wang, Pei-Fang; Miao, Ling-Zhan; Ding, Shi-Ming

    2016-01-01

    Organic amendments have been proposed as a means of remediating Cd-contaminated soils. However, understanding the inhibitory effects of organic materials on metal immobilization requires further research. In this study, colza cake, a typical organic amendment material, was investigated in order to elucidate its ability to reduce the toxicity of Cd-contaminated soil. Available concentrations of Cd in soils were measured using an in situ diffusive gradients in thin films (DGT) technique in combination with traditional chemical methods, such as HOAc (aqua regia), EDTA (ethylene diamine tetraacetic acid), NaOAc (sodium acetate), CaCl2, and labile Cd in pore water. These results were applied to predict Cd bioavailability after the addition of colza cake to Cd-contaminated soil. Two commonly grown cash crops, wheat and maize, were selected for Cd accumulation studies and were found to be sensitive to Cd bioavailability. Results showed that the addition of colza cake may inhibit the growth of wheat and maize; increasing colza cake doses led to decreasing shoot and root biomass accumulation. However, increasing colza cake doses did lead to a reduction of Cd accumulation in plant tissues, as indicated by the decreasing Cd concentrations in shoots and roots. The labile concentrations of Cd obtained by DGT measurements and by the traditional chemical extraction methods showed a clear decrease of Cd with increasing colza cake doses. All indicators showed significant positive correlations (p < 0.05), and the capability of Cd resupply from the solid phase to the soil solution decreased with increasing colza cake doses. This was reflected by the decreases in the ratio (R) value of CDGT to Csol. Our study suggests that the sharp decrease in R values could not only reflect the extremely low capability of labile Cd to be released from its solid phase, but may also be applied to evaluate the abnormal growth of the plants. PMID:27314376

  18. The Combination of DGT Technique and Traditional Chemical Methods for Evaluation of Cadmium Bioavailability in Contaminated Soils with Organic Amendment.

    Science.gov (United States)

    Yao, Yu; Sun, Qin; Wang, Chao; Wang, Pei-Fang; Miao, Ling-Zhan; Ding, Shi-Ming

    2016-06-15

    Organic amendments have been proposed as a means of remediating Cd-contaminated soils. However, understanding the inhibitory effects of organic materials on metal immobilization requires further research. In this study, colza cake, a typical organic amendment material, was investigated in order to elucidate its ability to reduce the toxicity of Cd-contaminated soil. Available concentrations of Cd in soils were measured using an in situ diffusive gradients in thin films (DGT) technique in combination with traditional chemical methods, such as HOAc (aqua regia), EDTA (ethylene diamine tetraacetic acid), NaOAc (sodium acetate), CaCl₂, and labile Cd in pore water. These results were applied to predict Cd bioavailability after the addition of colza cake to Cd-contaminated soil. Two commonly grown cash crops, wheat and maize, were selected for Cd accumulation studies and were found to be sensitive to Cd bioavailability. Results showed that the addition of colza cake may inhibit the growth of wheat and maize; increasing colza cake doses led to decreasing shoot and root biomass accumulation. However, increasing colza cake doses did lead to a reduction of Cd accumulation in plant tissues, as indicated by the decreasing Cd concentrations in shoots and roots. The labile concentrations of Cd obtained by DGT measurements and by the traditional chemical extraction methods showed a clear decrease of Cd with increasing colza cake doses. All indicators showed significant positive correlations (p < 0.05), and the capability of Cd resupply from the solid phase to the soil solution decreased with increasing colza cake doses. This was reflected by the decreases in the ratio (R) value of CDGT to Csol. Our study suggests that the sharp decrease in R values could not only reflect the extremely low capability of labile Cd to be released from its solid phase, but may also be applied to evaluate the abnormal growth of the plants.

  19. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    Science.gov (United States)

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
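
    A longitudinal mixed model of this kind can be sketched with statsmodels. The example below fits a random intercept per plot to synthetic repeated disease-severity ratings; all variable names are illustrative.

        # Longitudinal mixed linear model: random intercept per plot.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(11)
        plots, times = 20, 5
        df = pd.DataFrame({
            "plot": np.repeat(np.arange(plots), times),
            "day": np.tile(np.arange(times) * 7, plots),
            "treated": np.repeat(rng.integers(0, 2, plots), times),
        })
        plot_effect = np.repeat(rng.normal(0, 2, plots), times)
        df["severity"] = (10 + 0.8 * df["day"] - 3 * df["treated"]
                          + plot_effect + rng.normal(0, 1, len(df)))

        fit = smf.mixedlm("severity ~ day * treated", df,
                          groups=df["plot"]).fit()
        print(fit.summary())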

  20. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

    Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs, and teaches statistical and non-statistical skills in the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  1. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    OpenAIRE

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of gait variables between the left and right lower limbs; that is, the discrimination of

  2. Blending online techniques with traditional face to face teaching methods to deliver final year undergraduate radiology learning content

    Energy Technology Data Exchange (ETDEWEB)

    Howlett, David, E-mail: david.howlett@esht.nhs.uk [Department of Radiology, Eastbourne District General Hospital, Kings Drive, Eastbourne, East Sussex BN21 2UD (United Kingdom); Vincent, Tim [Department of IT, Brighton and Sussex Medical School (BSMS) (United Kingdom); Watson, Gillian; Owens, Emma [Department of Radiology, Eastbourne District General Hospital, Kings Drive, Eastbourne, East Sussex BN21 2UD (United Kingdom); Webb, Richard; Gainsborough, Nicola [Department of Medicine, Royal Sussex County Hospital, Brighton (United Kingdom); Fairclough, Jil [Department of IT, Brighton and Sussex Medical School (BSMS) (United Kingdom); Taylor, Nick [Department of Medical Illustration, Eastbourne District General Hospital (United Kingdom); Miles, Ken [Department of Imaging, BSMS (United Kingdom); Cohen, Jon [Department of Infectious Diseases, BSMS (United Kingdom); Vincent, Richard [Department of Cardiology, BSMS (United Kingdom)

    2011-06-15

    Aim: To review the initial experience of blending a variety of online educational techniques with traditional face to face or contact-based teaching methods to deliver final year undergraduate radiology content at a UK Medical School. Materials and methods: The Brighton and Sussex Medical School opened in 2003 and offers a 5-year undergraduate programme, with year 5 spent in several regional centres. Year 5 involves several core clinical specialities, with onsite radiology teaching provided at regional centres in the form of small-group tutorials, imaging seminars and a one-day course. An online educational module was introduced in 2007 to facilitate equitable delivery of the year 5 curriculum between the regional centres and to support students on placement. This module had a strong radiological emphasis, with a combination of imaging integrated into clinical cases to reflect everyday practice and also dedicated radiology cases. For the second cohort of year 5 students, in 2008, two additional online media-rich initiatives were introduced to complement the online module, comprising imaging tutorials and an online case discussion room. Results: In the first year, for the 2007/2008 cohort, 490 cases were written, edited and delivered via the Medical School managed learning environment as part of the online module. 253 cases contained some form of image media, of which 195 cases had a radiological component, with a total of 325 radiology images. Important aspects of radiology practice (e.g. consent, patient safety, contrast toxicity, ionising radiation) were also covered. There were 274,000 student hits on cases in the first year, with students completing a mean of 169 cases each. High levels of student satisfaction were recorded in relation to the online module and the additional online radiology teaching initiatives. Conclusion: Online educational techniques can be effectively blended with other forms of teaching to allow successful undergraduate delivery of

  3. Evaluation of organic amendment on the effect of cadmium bioavailability in contaminated soils using the DGT technique and traditional methods.

    Science.gov (United States)

    Yao, Yu; Sun, Qin; Wang, Chao; Wang, Pei-Fang; Ding, Shi-Ming

    2017-03-01

    Organic amendments have been widely proposed as a remediation technology for metal-contaminated soils, but controversial results exist on their effectiveness. In this study, the effect of pig manure addition on cadmium (Cd) bioavailability in Cd-contaminated soils was systematically evaluated by one dynamic, in situ technique, diffusive gradients in thin films (DGT), and four traditional methods based on equilibrium theory (soil solution concentration and three commonly used extractants, i.e., acetic acid (HAc), ethylenediamine tetraacetic acid (EDTA), and calcium chloride (CaCl2)). Wheat and maize were selected for measurement of plant Cd uptake. The results showed that pig manure addition could promote the growth of the two plants, with increasing biomasses of shoots and roots at increasing doses of pig manure. Correspondingly, increasing additions of pig manure reduced plant Cd uptake and accumulation, as indicated by the decreases of Cd concentrations in shoots and roots. The bioavailable concentrations of Cd in Cd-contaminated soils reflected by the DGT technique decreased markedly with increasing doses of pig manure, following the same trend as plant Cd uptake. Changes in soil solution Cd concentration and in Cd extractable by HAc, EDTA, and CaCl2 were similar to the DGT measurements. Meanwhile, the capability of Cd resupply from the solid phase to the soil solution decreased with increasing additions of pig manure, as reflected by the decreases in the ratio (R) value of CDGT to Csol. Positive correlations were observed between the various bioavailable indicators of Cd in soils and Cd concentrations in the tissues of the two plants. These findings provide strong evidence that pig manure amendment is effective in reducing Cd mobility and bioavailability in soils and that it is an ideal organic material for the remediation of Cd-contaminated soils.

  4. Blending online techniques with traditional face to face teaching methods to deliver final year undergraduate radiology learning content.

    Science.gov (United States)

    Howlett, David; Vincent, Tim; Watson, Gillian; Owens, Emma; Webb, Richard; Gainsborough, Nicola; Fairclough, Jil; Taylor, Nick; Miles, Ken; Cohen, Jon; Vincent, Richard

    2011-06-01

    To review the initial experience of blending a variety of online educational techniques with traditional face to face or contact-based teaching methods to deliver final year undergraduate radiology content at a UK Medical School. The Brighton and Sussex Medical School opened in 2003 and offers a 5-year undergraduate programme, with year 5 spent in several regional centres. Year 5 involves several core clinical specialities, with onsite radiology teaching provided at regional centres in the form of small-group tutorials, imaging seminars and a one-day course. An online educational module was introduced in 2007 to facilitate equitable delivery of the year 5 curriculum between the regional centres and to support students on placement. This module had a strong radiological emphasis, with a combination of imaging integrated into clinical cases to reflect everyday practice and also dedicated radiology cases. For the second cohort of year 5 students, in 2008, two additional online media-rich initiatives were introduced to complement the online module, comprising imaging tutorials and an online case discussion room. In the first year, for the 2007/2008 cohort, 490 cases were written, edited and delivered via the Medical School managed learning environment as part of the online module. 253 cases contained some form of image media, of which 195 cases had a radiological component, with a total of 325 radiology images. Important aspects of radiology practice (e.g. consent, patient safety, contrast toxicity, ionising radiation) were also covered. There were 274,000 student hits on cases in the first year, with students completing a mean of 169 cases each. High levels of student satisfaction were recorded in relation to the online module and the additional online radiology teaching initiatives. Online educational techniques can be effectively blended with other forms of teaching to allow successful undergraduate delivery of radiology. Efficient IT links and good image quality

  5. Blending online techniques with traditional face to face teaching methods to deliver final year undergraduate radiology learning content

    International Nuclear Information System (INIS)

    Howlett, David; Vincent, Tim; Watson, Gillian; Owens, Emma; Webb, Richard; Gainsborough, Nicola; Fairclough, Jil; Taylor, Nick; Miles, Ken; Cohen, Jon; Vincent, Richard

    2011-01-01

    Aim: To review the initial experience of blending a variety of online educational techniques with traditional face to face or contact-based teaching methods to deliver final year undergraduate radiology content at a UK Medical School. Materials and methods: The Brighton and Sussex Medical School opened in 2003 and offers a 5-year undergraduate programme, with year 5 spent in several regional centres. Year 5 involves several core clinical specialities, with onsite radiology teaching provided at regional centres in the form of small-group tutorials, imaging seminars and a one-day course. An online educational module was introduced in 2007 to facilitate equitable delivery of the year 5 curriculum between the regional centres and to support students on placement. This module had a strong radiological emphasis, with a combination of imaging integrated into clinical cases to reflect everyday practice and also dedicated radiology cases. For the second cohort of year 5 students, in 2008, two additional online media-rich initiatives were introduced to complement the online module, comprising imaging tutorials and an online case discussion room. Results: In the first year, for the 2007/2008 cohort, 490 cases were written, edited and delivered via the Medical School managed learning environment as part of the online module. 253 cases contained some form of image media, of which 195 cases had a radiological component, with a total of 325 radiology images. Important aspects of radiology practice (e.g. consent, patient safety, contrast toxicity, ionising radiation) were also covered. There were 274,000 student hits on cases in the first year, with students completing a mean of 169 cases each. High levels of student satisfaction were recorded in relation to the online module and the additional online radiology teaching initiatives. Conclusion: Online educational techniques can be effectively blended with other forms of teaching to allow successful undergraduate delivery of

  6. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparative analysis of the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
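
    As a small illustration of the quantitative, high-maturity practice the study points to, the sketch below is a Monte Carlo process-performance model that propagates assumed phase-duration distributions to a distribution of total cycle time; the distributions and thresholds are invented for illustration.

        # Monte Carlo process-performance model: total cycle time from phases.
        import numpy as np

        rng = np.random.default_rng(12)
        runs = 10_000
        design = rng.triangular(5, 8, 15, runs)   # days: min, mode, max (assumed)
        code = rng.triangular(10, 14, 25, runs)
        test = rng.lognormal(mean=2.0, sigma=0.4, size=runs)

        total = design + code + test
        print(f"P50={np.percentile(total, 50):.1f} d, "
              f"P90={np.percentile(total, 90):.1f} d, "
              f"P(total > 40 d)={np.mean(total > 40):.1%}")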

  7. Xenosurveillance reflects traditional sampling techniques for the identification of human pathogens: A comparative study in West Africa.

    Directory of Open Access Journals (Sweden)

    Joseph R Fauver

    2018-03-01

    Novel surveillance strategies are needed to detect the rapid and continuous emergence of infectious disease agents. Ideally, new sampling strategies should be simple to implement, technologically uncomplicated, and applicable to areas where emergence events are known to occur. To this end, xenosurveillance is a technique that makes use of blood collected by hematophagous arthropods to monitor and identify vertebrate pathogens. Mosquitoes are largely ubiquitous animals that often exist in sizable populations, and many domestic or peridomestic species of mosquitoes will preferentially take blood meals from humans, making them a unique and largely untapped reservoir of human blood. We sought to take advantage of this phenomenon by systematically collecting blood-fed mosquitoes during a field trial in Northern Liberia to determine whether pathogen sequences from blood-engorged mosquitoes accurately mirror those obtained directly from humans. Specifically, blood was collected from humans via finger-stick and by aspirating blood-fed mosquitoes from the inside of houses. Shotgun metagenomic sequencing of RNA and DNA derived from these specimens was performed to detect pathogen sequences. Samples obtained from xenosurveillance and from finger-stick blood collection produced a similar number and quality of reads aligning to two human viruses, GB virus C and hepatitis B virus. This study represents the first systematic comparison between xenosurveillance and more traditional sampling methodologies, while also demonstrating the viability of xenosurveillance as a tool to sample human blood for circulating pathogens.

  8. Experimental Assessment on the Hysteretic Behavior of a Full-Scale Traditional Chinese Timber Structure Using a Synchronous Loading Technique

    Directory of Open Access Journals (Sweden)

    XiWang Shi

    2018-01-01

    In traditional Chinese timber structures, few tie beams were used between columns, and the column bases were placed directly on stone bases. In order to study the hysteretic behavior of such structures, a full-scale model was established. The model size was determined according to the requirements of the eighth-grade material system specified in the architectural treatise Ying-zao-fa-shi, written during the Song Dynasty. To accommodate the vertical lift and drop of the test model during horizontal reciprocating motion, the horizontal low-cycle reciprocating loading experiments were conducted using a synchronous loading technique. By analyzing the load-displacement hysteresis curves, envelope curves, deformation capacity, energy dissipation, and change in stiffness under different vertical loads, it is found that the timber frame exhibits obvious signs of self-restoring and favorable plastic deformation capacity. As the horizontal displacement increases, the equivalent viscous damping coefficient generally declines first and then increases, while the stiffness degrades rapidly at first and then decreases slowly. Increasing the vertical load improves the deformation capacity, energy-dissipation capacity, and stiffness of the timber frame.

  9. Do traditional techniques produce better conventional complete dentures than simplified techniques? A 10-year follow-up of a randomized clinical trial.

    Science.gov (United States)

    Kawai, Yasuhiko; Murakami, Hiroshi; Feine, Jocelyne S

    2018-07-01

    The use of a simplified method (S) of fabricating complete dentures has been shown to be more cost-efficient than the traditional method (T), with no negative consequences that detract from the cost savings in the short term. However, it is not clear whether this remains constant over a decade. The objective of this study was to clarify patients' perspectives and determine any differences between dentures fabricated with these two techniques after a decade of use. Edentate individuals participated in a randomized controlled clinical trial and completed a 6-month follow-up from 2001 to 2003 (T group n = 50; S group n = 54). For this 10-year follow-up, they were interviewed by telephone. The assessment included whether the denture was still in use or had been replaced, the condition of the dentures, patient satisfaction and oral health-related quality of life (OHRQoL). Between- and within-group differences and the factors that deteriorate OHRQoL were determined. Among 54 responders (25 T and 29 S), 14 T and 21 S kept the original dentures. Both groups were similar in ratings of satisfaction and OHRQoL (maxilla: T 80.0, S 86.0, p = 0.36; mandible: T 66.1, S 72.3, p = 0.48; OHRQoL: T 111.1, S 108.5, p = 0.46). Irrespective of fabrication method, discomfort, chewing difficulty and esthetics were the factors that deteriorated OHRQoL (adjusted r = 0.76, p < 0.001). The results indicate that the simplified method remains more cost-efficient than the traditional method over a 10-year period. (IRB approval: A09-E71-12B, McGill University; trial registry: ClinicalTrial.org; NCT02289443). Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. The Combination of DGT Technique and Traditional Chemical Methods for Evaluation of Cadmium Bioavailability in Contaminated Soils with Organic Amendment

    Directory of Open Access Journals (Sweden)

    Yu Yao

    2016-06-01

    Organic amendments have been proposed as a means of remediation for Cd-contaminated soils. However, understanding the inhibitory effects of organic materials on metal immobilization requires further research. In this study, colza cake, a typical organic amendment material, was investigated in order to elucidate its ability to reduce the toxicity of Cd-contaminated soil. Available concentrations of Cd in soils were measured using an in situ diffusive gradients in thin films (DGT) technique in combination with traditional chemical methods, such as extraction with HOAc (acetic acid), EDTA (ethylene diamine tetraacetic acid), NaOAc (sodium acetate) and CaCl2, and labile Cd in pore water. These results were applied to predict Cd bioavailability after the addition of colza cake to Cd-contaminated soil. Two commonly grown cash crops, wheat and maize, were selected for Cd accumulation studies and were found to be sensitive to Cd bioavailability. Results showed that the addition of colza cake may inhibit the growth of wheat and maize; increasing colza cake doses led to decreasing shoot and root biomass accumulation. However, increasing colza cake doses did lead to a reduction of Cd accumulation in plant tissues, as indicated by the decreasing Cd concentrations in shoots and roots. The labile concentration of Cd obtained by DGT measurements and by the traditional chemical extraction methods showed a clear decrease of Cd with increasing colza cake doses. All indicators showed significant positive correlations (p < 0.01) with the accumulation of Cd in plant tissues; however, none of the methods could reflect plant growth status. Additionally, the capability of Cd to pass from the solid phase into an available form in soil solution decreased with increasing colza cake doses. This was reflected by the decreases in the ratio (R value) of C_DGT to C_sol. Our study suggests that the sharp decrease in R values could not only

  11. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    Energy Technology Data Exchange (ETDEWEB)

    Solaimani, Mohiuddin [Univ. of Texas-Dallas, Richardson, TX (United States); Iftekhar, Mohammed [Univ. of Texas-Dallas, Richardson, TX (United States); Khan, Latifur [Univ. of Texas-Dallas, Richardson, TX (United States); Thuraisingham, Bhavani [Univ. of Texas-Dallas, Richardson, TX (United States); Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint, with higher accuracy, by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
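
    The paper's Spark-based implementation is not reproduced in the record. The sketch below shows the core idea of a window-based statistical detector on a single machine: summarize each window, model the history of window statistics, and flag windows that deviate strongly. The window length, z-score threshold and synthetic data are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def window_anomalies(stream, window=50, z_thresh=3.0):
    """Flag windows whose mean deviates strongly from the history so far.

    A simplified stand-in for the paper's distributed detector: the model
    here is just the running mean/std of previous window means.
    """
    means, flags = [], []
    for start in range(0, len(stream) - window + 1, window):
        w = stream[start:start + window]
        if len(means) >= 3:  # need some history before testing
            mu, sigma = np.mean(means), np.std(means) + 1e-9
            flags.append(abs(w.mean() - mu) / sigma > z_thresh)
        else:
            flags.append(False)
        means.append(w.mean())
    return flags

# Synthetic CPU-usage-like stream with one injected group anomaly.
rng = np.random.default_rng(0)
data = rng.normal(50, 5, 2000)
data[1200:1250] += 40  # anomalous burst
print(window_anomalies(data))
```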

  12. Visual Analysis of North Atlantic Hurricane Trends Using Parallel Coordinates and Statistical Techniques

    National Research Council Canada - National Science Library

    Steed, Chad A; Fitzpatrick, Patrick J; Jankun-Kelly, T. J; Swan II, J. E

    2008-01-01

    ... for a particular dependent variable. These capabilities are combined into a unique visualization system that is demonstrated via a North Atlantic hurricane climate study using a systematic workflow. This research corroborates the notion that enhanced parallel coordinates coupled with statistical analysis can be used for more effective knowledge discovery and confirmation in complex, real-world data sets.

  13. Spectral deformation techniques applied to the study of quantum statistical irreversible processes

    International Nuclear Information System (INIS)

    Courbage, M.

    1978-01-01

    A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)

  14. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  15. Visual Analysis of North Atlantic Hurricane Trends Using Parallel Coordinates and Statistical Techniques

    Science.gov (United States)

    2008-07-07

    ... analyzing multivariate data sets. The system was developed using the Java Development Kit (JDK) version 1.5, and it yields interactive performance on a... ...captures output from MATLAB's "regress" and "stepwisefit" utilities, which perform simple and stepwise regression, respectively.

  16. Vibration impact acoustic emission technique for identification and analysis of defects in carbon steel tubes: Part A Statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-04-15

    Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high-frequency stress wave signals captured from a newly developed non-invasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals were introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube serving as the reference and four steel tubes with a through-hole artificial defect at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of a defect in carbon steel tubes from AE signals captured using the non-invasive VIAE technique.
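
    The statistical features named in the abstract (r.m.s., energy, crest factor) are standard signal descriptors. A minimal sketch of how they could be computed from a digitized AE burst is given below; the synthetic decaying sinusoid merely stands in for a real hammer-impact signal.

```python
import numpy as np

def ae_features(signal):
    """Statistical features of an acoustic-emission burst, as used in the
    study: r.m.s., energy, and crest factor (peak amplitude / r.m.s.)."""
    signal = np.asarray(signal, dtype=float)
    rms = np.sqrt(np.mean(signal ** 2))
    energy = np.sum(signal ** 2)           # discrete signal energy
    crest = np.max(np.abs(signal)) / rms
    return {"rms": rms, "energy": energy, "crest_factor": crest}

# Synthetic burst: a decaying sinusoid standing in for an impact AE wave.
t = np.linspace(0, 1e-3, 2000)
burst = np.exp(-4000 * t) * np.sin(2 * np.pi * 150e3 * t)
print(ae_features(burst))
```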

  17. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    Science.gov (United States)

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major elements (sodium, potassium, calcium, magnesium), minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) in Cavendish banana flour and Dream banana flour were determined, and the data were analyzed using the multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising manganese and copper; the third factor explained 15.66%, comprising zinc and lead; and the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study demonstrates the usefulness of multivariate statistical techniques for the analysis and interpretation of complex mineral content data from banana flour of different varieties.
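
    The two techniques named in the abstract are available in common statistical libraries. The sketch below applies factor analysis (four factors, as in the paper) and linear discriminant analysis to synthetic element-composition data; the element means and spreads are invented and do not reproduce the study's measurements.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical element concentrations (rows: flour samples; columns:
# Na, K, Ca, Mg, Fe, Cu, Zn, Mn, Pb) -- synthetic, not the study's data.
rng = np.random.default_rng(1)
cavendish = rng.normal([3, 12, 4, 2.0, 0.5, 0.1, 0.3, 0.2, 0.01], 0.2, (30, 9))
dream = rng.normal([4, 12, 4, 1.4, 0.5, 0.1, 0.3, 0.2, 0.01], 0.2, (30, 9))
X = np.vstack([cavendish, dream])
y = np.array([0] * 30 + [1] * 30)

fa = FactorAnalysis(n_components=4).fit(X)   # four factors, as in the paper
lda = LinearDiscriminantAnalysis().fit(X, y)
print("LDA training accuracy:", lda.score(X, y))
```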

  18. Generic Techniques for the Calibration of Robots with Application of the 3-D Fixtures and Statistical Technique on the PUMA 500 and ARID Robots

    Science.gov (United States)

    Tawfik, Hazem

    1991-01-01

    A relatively simple, inexpensive, and generic technique that can be used both in laboratories and in some operational site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the set-up procedure, data collection, and analysis using this new technique, which was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate PUMA 500 industrial robot. The data were statistically analyzed to provide insight into the performance of the systems and components of the robot. The same technique was also used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of its existing robots, such as the ASEA, the high-speed controller, and the Automated Radiator Inspection Device (ARID). Recommendations were also made to develop and establish other calibration techniques more suitable for on-site calibration environments and robot certification.

  19. On some surprising statistical properties of a DNA fingerprinting technique called AFLP

    NARCIS (Netherlands)

    Gort, G.

    2010-01-01

    AFLP is a widely used DNA fingerprinting technique, resulting in band absence - presence profiles, like a bar code. Bands represent DNA fragments, sampled from the genome of an individual plant or other organism. The DNA fragments travel through a lane of an electrophoretic gel or microcapillary

  20. Statistical Analysis of Reactor Pressure Vessel Fluence Calculation Benchmark Data Using Multiple Regression Techniques

    International Nuclear Information System (INIS)

    Carew, John F.; Finch, Stephen J.; Lois, Lambros

    2003-01-01

    The calculated >1-MeV pressure vessel fluence is used to determine the fracture toughness and integrity of the reactor pressure vessel. It is therefore of the utmost importance to ensure that the fluence prediction is accurate and unbiased. In practice, this assurance is provided by comparing the predictions of the calculational methodology with an extensive set of accurate benchmarks. A benchmarking database is used to provide an estimate of the overall average measurement-to-calculation (M/C) bias in the calculations. This average is used as an ad hoc multiplicative adjustment to the calculations to correct for the observed calculational bias. However, this average only provides a well-defined and valid adjustment of the fluence if the M/C data are homogeneous, i.e., the data are statistically independent and there is no correlation between subsets of M/C data. Typically, the identification of correlations between the errors in the database M/C values is difficult because the correlation is of the same magnitude as the random errors in the M/C data and varies substantially over the database. In this paper, an evaluation of a reactor dosimetry benchmark database is performed to determine the statistical validity of the adjustment to the calculated pressure vessel fluence. Physical mechanisms that could potentially introduce a correlation between subsets of M/C ratios are identified and included in a multiple regression analysis of the M/C data. Rigorous statistical criteria are used to evaluate the homogeneity of the M/C data and determine the validity of the adjustment. For the database evaluated, the M/C data are found to be strongly correlated with dosimeter response threshold energy and dosimeter location (e.g., cavity versus in-vessel). It is shown that, because of the inhomogeneity in the M/C data, the benchmark data for this database do not provide a valid basis for adjusting the pressure vessel fluence. The statistical criteria and methods employed in
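
    A minimal sketch of the multiple regression step described above: regress hypothetical M/C ratios on candidate explanatory variables (dosimeter threshold energy and a cavity/in-vessel indicator) and inspect the fitted coefficients for significance. The data-generating model below is an assumption for illustration, not the benchmark database.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical benchmark database: M/C ratio with candidate explanatory
# variables (threshold energy in MeV; 1 = cavity, 0 = in-vessel dosimeter).
rng = np.random.default_rng(7)
threshold = rng.uniform(0.5, 6.0, 120)
cavity = rng.integers(0, 2, 120)
mc = 0.95 + 0.02 * threshold + 0.05 * cavity + rng.normal(0, 0.03, 120)

X = sm.add_constant(np.column_stack([threshold, cavity]))
fit = sm.OLS(mc, X).fit()
print(fit.summary())  # significant slopes would indicate inhomogeneous M/C data
```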

  1. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1978-01-01

    Surveillance techniques which extend the sophistication of existing automated systems for monitoring industrial rotating equipment are described. The monitoring system automatically established limiting criteria during an initial learning period of a few days; subsequently, while monitoring the test rotor during an extended period of normal operation, it experienced a false alarm rate of 0.5%. At the same time, the monitoring system successfully detected all fault types that were introduced into the test setup. Tests on real equipment are needed to provide final verification of the monitoring techniques. There are areas that would profit from additional investigation in the laboratory environment. A comparison of the relative value of alternative descriptors under given fault conditions would be worthwhile. This should be pursued in conjunction with extending the set of fault types available, e.g., bearing problems. Other tests should examine the effects of using fewer (coarser) intervals to define the lumped operational states. Finally, techniques to diagnose the most probable fault should be developed by drawing upon the extensive data automatically logged by the monitoring system.

  2. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    Directory of Open Access Journals (Sweden)

    Nsikak U Benson

    Concentrations of trace metals (Cd, Cr, Cu, Ni and Pb) in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.

  3. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1979-01-01

    The level of technology utilized in automated systems that monitor industrial rotating equipment and the potential of alternative surveillance methods are assessed. It is concluded that changes in surveillance methodology would upgrade ongoing programs and yet still be practical for implementation. An improved anomaly recognition methodology is formulated and implemented on a minicomputer system. The effectiveness of the monitoring system was evaluated in laboratory tests on a small rotor assembly, using vibrational signals from both displacement probes and accelerometers. Time and frequency domain descriptors are selected to compose an overall signature that characterizes the monitored equipment. Limits for normal operation of the rotor assembly are established automatically during an initial learning period. Thereafter, anomaly detection is accomplished by applying an approximate statistical test to each signature descriptor. As demonstrated over months of testing, this monitoring system is capable of detecting anomalous conditions while exhibiting a false alarm rate below 0.5%

  4. Relating N2O emissions during biological nitrogen removal with operating conditions using multivariate statistical techniques.

    Science.gov (United States)

    Vasilaki, V; Volcke, E I P; Nandi, A K; van Loosdrecht, M C M; Katsou, E

    2018-04-26

    Multivariate statistical analysis was applied to investigate the dependencies and underlying patterns between N2O emissions and online operational variables (dissolved oxygen and nitrogen component concentrations, temperature and influent flow-rate) during biological nitrogen removal from wastewater. The system under study was a full-scale reactor, for which hourly sensor data were available. The 15-month monitoring campaign was divided into 10 sub-periods based on the profile of N2O emissions, using Binary Segmentation. The dependencies between operating variables and N2O emissions fluctuated according to Spearman's rank correlation. The correlation between N2O emissions and nitrite concentrations ranged between 0.51 and 0.78. Correlation >0.7 between N2O emissions and nitrate concentrations was observed in sub-periods with average temperature lower than 12 °C. Hierarchical k-means clustering and principal component analysis linked N2O emission peaks with precipitation events and ammonium concentrations higher than 2 mg/L, especially in sub-periods characterized by low N2O fluxes. Additionally, the highest ranges of measured N2O fluxes belonged to clusters corresponding with NO3-N concentrations less than 1 mg/L in the upstream plug-flow reactor (middle of the oxic zone), indicating slow nitrification rates. The results showed that the range of N2O emissions partially depends on the prior behavior of the system. The principal component analysis validated the findings from the clustering analysis and showed that ammonium, nitrate, nitrite and temperature explained a considerable percentage of the variance in the system for the majority of the sub-periods. The applied statistical methods linked the different ranges of emissions with the system variables, provided insights into the effect of operating conditions on N2O emissions in each sub-period, and can be integrated into N2O emissions data processing at wastewater treatment plants.
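
    The correlation and clustering steps described above map onto standard library calls. The sketch below computes a Spearman rank correlation and a k-means partition on synthetic stand-ins for the hourly sensor variables; plain k-means stands in for the paper's hierarchical k-means, and the variable relationships and cluster count are assumptions, not the paper's values.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-ins for hourly sensor data: N2O flux, NO2-N, NH4-N, temperature.
rng = np.random.default_rng(3)
no2 = rng.gamma(2.0, 0.5, 1000)
nh4 = rng.gamma(1.5, 1.0, 1000)
temp = rng.normal(15, 4, 1000)
n2o = 0.8 * no2 + 0.3 * nh4 + rng.normal(0, 0.3, 1000)

rho, p = spearmanr(n2o, no2)
print(f"Spearman rho(N2O, NO2-N) = {rho:.2f} (p = {p:.1e})")

# k-means on standardized variables, mirroring the paper's clustering step.
X = StandardScaler().fit_transform(np.column_stack([n2o, no2, nh4, temp]))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))
```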

  5. Assessment of Reservoir Water Quality Using Multivariate Statistical Techniques: A Case Study of Qiandao Lake, China

    Directory of Open Access Journals (Sweden)

    Qing Gu

    2016-03-01

    Qiandao Lake (Xin'an Jiang reservoir) plays a significant role in drinking water supply for eastern China, and it is an attractive tourist destination. Three multivariate statistical methods were comprehensively applied to assess the spatial and temporal variations in water quality as well as potential pollution sources in Qiandao Lake. Data sets of nine parameters from 12 monitoring sites during 2010–2013 were obtained for analysis. Cluster analysis (CA) was applied to classify the 12 sampling sites into three groups (Groups A, B and C) and the 12 monitoring months into two clusters (April-July, and the remaining months). Discriminant analysis (DA) identified Secchi disc depth, dissolved oxygen, permanganate index and total phosphorus as the significant variables for distinguishing variations between different years, with 79.9% correct assignments. Dissolved oxygen, pH and chlorophyll-a were determined to discriminate between the two sampling periods classified by CA, with 87.8% correct assignments. For spatial variation, DA identified Secchi disc depth and ammonia nitrogen as the significant discriminating parameters, with 81.6% correct assignments. Principal component analysis (PCA) identified organic pollution, nutrient pollution, domestic sewage, and agricultural and surface runoff as the primary pollution sources, explaining 84.58%, 81.61% and 78.68% of the total variance in Groups A, B and C, respectively. These results demonstrate the effectiveness of the integrated use of CA, DA and PCA for reservoir water quality evaluation and could assist managers in improving water resources management.

  6. An Efficient Statistical Computation Technique for Health Care Big Data using R

    Science.gov (United States)

    Sushma Rani, N.; Srinivasa Rao, P., Dr; Parimala, P.

    2017-08-01

    Due to changes in living conditions and other factors, many critical health-related problems are arising. Diagnosing a problem at an earlier stage increases the chances of survival and fast recovery, reducing both the recovery time and the cost associated with treatment. One such medical issue is cancer, and breast cancer has been identified as the second leading cause of cancer death. If detected at an early stage, it can be cured. Once a patient is found to have a breast tumor, the tumor should be classified as cancerous or non-cancerous. The paper therefore uses the k-nearest neighbors (KNN) algorithm, one of the simplest machine learning algorithms and an instance-based learning algorithm, to classify the data. Day to day, new records are added, increasing the amount of data to be classified, and this tends toward a big data problem. The algorithm is implemented in R, which is the most popular platform for applying machine learning algorithms to statistical computing. Experimentation is conducted using various classification evaluation metrics for various values of k. The results show that the KNN algorithm performs better than existing models.
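
    The study's implementation is in R; for consistency with the other sketches in this section, the same idea is shown in Python on the public Wisconsin breast-cancer dataset, evaluating several values of k as the abstract describes. The dataset choice and the specific k values are assumptions for illustration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# KNN classification of tumors, trying several k values as the paper does.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for k in (3, 5, 7, 11):
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    print(f"k={k}: test accuracy = {knn.score(X_te, y_te):.3f}")
```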

  7. Utilizing Healthcare Developments, Demographic Data with Statistical Techniques to Estimate the Diarrhoea Prevalence in India.

    Science.gov (United States)

    Srivastava, Shweta; Vatsalya, Vatsalya; Arora, Ashoo; Arora, Kashmiri L; Karch, Robert

    2012-03-22

    Diarrhoea is one of the leading causes of morbidity and mortality in developing countries in Africa and South Asia, such as India. The prevalence of diarrheal diseases in those countries is higher than in the developed Western world and has largely been associated with socio-economic and sanitary conditions. However, the presently available data have not been sufficiently evaluated to study the role of other factors, such as healthcare development, population density, sex and regional influence, on diarrheal prevalence patterns. This study was performed to understand the relationship of diarrheal prevalence with specific measures, namely healthcare services development, demographics, population density, socio-economic conditions, sex, and regional prevalence patterns in India. Data from annual national health reports and other epidemiological studies were included and statistically analyzed. Our results demonstrate significant correlations of the disease prevalence pattern with certain measures, such as healthcare centers, population growth rate, sex and region-specific morbidity. Available information on sanitation, such as water supply and toilet availability, and on socioeconomic conditions, such as poverty and literacy measures, showed only trends toward significance. This study can be valuable for devising appropriate strategies focused on important measures such as healthcare resources, population growth and regional significance to evaluate prevalence patterns and manage diarrhoea locally and globally.

  8. Quantitative evaluation of ASiR image quality: an adaptive statistical iterative reconstruction technique

    Science.gov (United States)

    Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan

    2012-03-01

    Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This reconstruction method combines the idealized system representation, as known from the standard filtered back projection (FBP) algorithm, with the strength of iterative reconstruction by including a noise model in the reconstruction scheme. It models how noise propagates through the reconstruction steps, feeds this model back into the loop, and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper, the effect of ASiR on the contrast-to-noise ratio is studied using the low-contrast module of the Catphan phantom. The experiments were performed on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast-to-noise ratio, the images from ASiR can be obtained using 60% less current, leading to a dose reduction of the same amount.
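
    The figure of merit in this study, the contrast-to-noise ratio (CNR), is simple to compute from two regions of interest. A minimal sketch follows; the region sizes and HU values are illustrative assumptions, not measurements from the Catphan experiments.

```python
import numpy as np

def cnr(roi_object, roi_background):
    """Contrast-to-noise ratio from two regions of interest, the figure of
    merit compared between ASiR and FBP reconstructions in the study."""
    contrast = np.mean(roi_object) - np.mean(roi_background)
    noise = np.std(roi_background)
    return abs(contrast) / noise

# Synthetic low-contrast insert vs. background (HU values are illustrative).
rng = np.random.default_rng(5)
insert_roi = rng.normal(10, 8, (20, 20))   # 10 HU object, noise sigma = 8
background = rng.normal(0, 8, (20, 20))
print(f"CNR = {cnr(insert_roi, background):.2f}")
```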

  9. [Ideas and methods of two-dimensional zebrafish model combined with chromatographic techniques in high-throughput screening of active anti-osteoporosis components of traditional Chinese medicines].

    Science.gov (United States)

    Wei, Ying-Jie; Jing, Li-Jun; Zhan, Yang; Sun, E; Jia, Xiao-Bin

    2014-05-01

    To break through the restrictions on the evaluation model and the number of compounds, the two-dimensional zebrafish model was combined with chromatographic techniques to establish a new method for the high-throughput screening of active anti-osteoporosis components. Drawing on our research group's studies and the relevant foreign literature, and on the facts that the zebrafish osteoporosis model can efficiently evaluate activity, that the zebrafish metabolism model can efficiently enrich metabolites, and that chromatographic techniques can efficiently separate and analyze components of traditional Chinese medicines, we propose that the inherent combination of the three methods can efficiently decode the in vivo and in vitro efficacious anti-osteoporosis materials of traditional Chinese medicines. The method makes the enrichment, separation and analysis of components of traditional Chinese medicines, particularly micro-components and metabolites, and the screening of anti-osteoporosis activity simple and efficient. It fully reflects that the efficacious materials of traditional Chinese medicines comprise both original components and metabolites, with the characteristics of "multi-components, multi-targets and integral effect", and it provides new ideas and methods for the early and rapid discovery of active anti-osteoporosis components of traditional Chinese medicines.

  10. Proficiency Testing for Determination of Water Content in Toluene of Chemical Reagents by iteration robust statistic technique

    Science.gov (United States)

    Wang, Hao; Wang, Qunwei; He, Ming

    2018-05-01

    In order to investigate and improve the level of detection technology for water content in liquid chemical reagents in domestic laboratories, proficiency testing provider PT0031 (CNAS) organized a proficiency testing program for water content in toluene; 48 laboratories from 18 provinces/cities/municipalities took part in the PT. This paper introduces the implementation process of the proficiency testing for determination of water content in toluene, including sample preparation, homogeneity and stability testing, and the statistical results of the iterative robust statistic technique and their analysis. It also summarizes and analyzes the different test standards widely used in the laboratories and puts forward technical suggestions for improving the quality of water content testing. Satisfactory results were obtained by 43 laboratories, amounting to 89.6% of the participating laboratories.
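
    The record does not spell out the iteration robust statistic technique. Proficiency-testing schemes commonly use an iterative winsorization in the spirit of ISO 13528 Algorithm A, which the following sketch implements; the assumption that this matches the PT0031 procedure, and the example lab results, are both hypothetical.

```python
import numpy as np

def algorithm_a(x, tol=1e-6, max_iter=100):
    """Iterative robust mean/std in the spirit of ISO 13528 Algorithm A.
    Values beyond mean +/- 1.5*s are winsorized on each iteration."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    s = 1.483 * np.median(np.abs(x - mu))     # scaled MAD as starting point
    for _ in range(max_iter):
        delta = 1.5 * s
        xw = np.clip(x, mu - delta, mu + delta)
        mu_new = xw.mean()
        s_new = 1.134 * xw.std(ddof=1)        # 1.134 corrects winsorization bias
        if abs(mu_new - mu) < tol and abs(s_new - s) < tol:
            break
        mu, s = mu_new, s_new
    return mu, s

# Hypothetical water-content results (%) from participating labs, one outlier.
results = [0.021, 0.020, 0.022, 0.019, 0.021, 0.020, 0.035, 0.021]
mu, s = algorithm_a(results)
print(f"robust mean = {mu:.4f}, robust s = {s:.4f}")
```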

  11. A cost-saving statistically based screening technique for focused sampling of a lead-contaminated site

    International Nuclear Information System (INIS)

    Moscati, A.F. Jr.; Hediger, E.M.; Rupp, M.J.

    1986-01-01

    High concentrations of lead in soils along an abandoned railroad line prompted a remedial investigation to characterize the extent of contamination across a 7-acre site. Contamination was thought to be spotty across the site, reflecting its past use in battery recycling operations at discrete locations. A screening technique was employed to delineate the more highly contaminated areas by testing a statistically determined minimum number of random samples from each of seven discrete site areas. The approach not only quickly identified the site areas that would require more extensive grid sampling, but also provided a statistically defensible basis for excluding other site areas from further consideration, thus saving the cost of additional sample collection and analysis. The reduction in the number of samples collected in "clean" areas of the site ranged from 45 to 60%.

  12. Statistical Techniques Used in Three Applied Linguistics Journals: "Language Learning,""Applied Linguistics" and "TESOL Quarterly," 1980-1986: Implications for Readers and Researchers.

    Science.gov (United States)

    Teleni, Vicki; Baldauf, Richard B., Jr.

    A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…

  13. DETERMINING INDICATORS OF URBAN HOUSEHOLD WATER CONSUMPTION THROUGH MULTIVARIATE STATISTICAL TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Gledsneli Maria Lima Lins

    2010-12-01

    Water has a decisive influence on populations' quality of life, specifically in areas like urban supply, drainage, and effluent treatment, due to its strong impact on public health. The rational use of water constitutes the greatest challenge faced by water demand management, mainly with regard to urban household water consumption. This makes it important to develop research to assist water managers and public policy-makers in planning and formulating water demand measures that may allow rational urban water use to be achieved. This work utilized the multivariate techniques factor analysis and multiple linear regression analysis in order to determine the level of participation of socioeconomic and climatic variables in monthly urban household consumption changes, applying them to two districts of Campina Grande city (State of Paraíba, Brazil). The districts were chosen based on a socioeconomic criterion (income level) so as to evaluate their water consumers' behavior. A 9-year monthly data series (from 2000 to 2008) was utilized, comprising family income, water tariff, and number of household connections (economies) as socioeconomic variables, and average temperature and precipitation as climatic variables. For both selected districts of Campina Grande city, the results obtained point to the variables "water tariff" and "family income" as indicators of these districts' household consumption.

  14. Denoised ordered subset statistically penalized algebraic reconstruction technique (DOS-SPART) in digital breast tomosynthesis

    Science.gov (United States)

    Garrett, John; Li, Yinsheng; Li, Ke; Chen, Guang-Hong

    2017-03-01

    Digital breast tomosynthesis (DBT) is a three-dimensional (3D) breast imaging modality in which projections are acquired over a limited angular span around the compressed breast and reconstructed into image slices parallel to the detector. DBT has been shown to help alleviate the breast tissue overlap issues of two-dimensional (2D) mammography. Since overlapping tissues may simulate cancer masses or obscure true cancers, this improvement is critically important for improved breast cancer screening and diagnosis. In this work, a model-based image reconstruction method is presented to show that, compared with a state-of-the-art commercial reconstruction technique, spatial resolution in DBT volumes can be maintained while dose is reduced. Spatial resolution was measured quantitatively in phantom images and subjectively in a clinical dataset. Noise characteristics were explored in a cadaver study. In both the quantitative and subjective results, image sharpness and overall image quality were maintained at reduced doses when the model-based iterative reconstruction was used to reconstruct the volumes.

  15. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high 214Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
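
    A minimal sketch of the composite clustering idea described above, under assumptions: hierarchical (Ward) clustering of a subsample seeds a k-means run over the full data, applied to synthetic three-variable radiometric observations with a small group of high-U-channel outliers. The cluster count and the data are illustrative only.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

# Synthetic (K, U, Th) observations: three background groups plus outliers,
# loosely mimicking the report's setup (not its actual data).
rng = np.random.default_rng(11)
groups = [rng.normal(m, 0.3, (300, 3)) for m in ([2, 2, 6], [3, 2, 8], [1.5, 3, 10])]
outliers = rng.normal([2, 9, 8], 0.3, (15, 3))   # anomalously high U channel
X = np.vstack(groups + [outliers])

# Hierarchical clustering on a subsample seeds k-means, which then refines
# the partition over the full data set (a composite, as in the report).
sub = X[rng.choice(len(X), 200, replace=False)]
seed_labels = fcluster(linkage(sub, method="ward"), t=4, criterion="maxclust")
seeds = np.array([sub[seed_labels == k].mean(axis=0) for k in range(1, 5)])
labels = KMeans(n_clusters=4, init=seeds, n_init=1).fit_predict(X)
print(np.bincount(labels))
```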

  16. Multivariate statistical techniques for the evaluation of surface water quality of the Himalayan foothills streams, Pakistan

    Science.gov (United States)

    Malik, Riffat Naseem; Hashmi, Muhammad Zaffar

    2017-10-01

    The Himalayan foothill streams of Pakistan play an important role in drinking water supply and the irrigation of farmland; thus, their water quality is closely related to public health. Multivariate techniques were applied to examine spatial and seasonal trends and sources of metal contamination in these streams. Grab surface water samples were collected from different sites (5-15 cm water depth) in pre-washed polyethylene containers. A fast sequential atomic absorption spectrophotometer (Varian FSAA-240) was used to measure metal concentrations. Concentrations of Ni, Cu, and Mn were higher in the pre-monsoon season than in the post-monsoon season. Cluster analysis identified impaired, moderately impaired and least impaired clusters based on water parameters. Discriminant function analysis indicated that spatial variability in water was due to temperature, electrical conductivity, nitrates, iron and lead, whereas seasonal variations were correlated with 16 physicochemical parameters. Factor analysis identified municipal and poultry waste, automobile activities, surface runoff, and soil weathering as major sources of contamination. Levels of Mn, Cr, Fe, Pb, Cd, Zn and alkalinity were above the WHO and USEPA standards for surface water. The results of the present study will help the responsible authorities manage the Himalayan foothill streams.

  17. Nonparametric statistical techniques used in dose estimation for beagles exposed to inhaled plutonium nitrate

    International Nuclear Information System (INIS)

    Stevens, D.L.; Dagle, G.E.

    1986-01-01

    Retention and translocation of inhaled radionuclides are often estimated from the sacrifice of multiple animals at different time points. The data for each time point can be averaged and a smooth curve fitted to the mean values, or a smooth curve may be fitted to the entire data set. However, an analysis based on means may not be the most appropriate if there is substantial variation in the initial amount of the radionuclide inhaled or if the data are subject to outliers. A method has been developed that takes account of these problems. The body burden is viewed as a compartmental system, with the compartments identified with body organs. A median polish is applied to the multiple logistic transform of the compartmental fractions (compartment burden/total burden) at each time point. A smooth function is fitted to the results of the median polish. This technique was applied to data from beagles exposed to an aerosol of 239Pu(NO3)4. Models of retention and translocation for lungs, skeleton, liver, kidneys, and tracheobronchial lymph nodes were developed and used to estimate dose. 4 refs., 3 figs., 4 tabs
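
    Tukey's median polish, the core step described above, decomposes a two-way table into overall, row and column effects plus residuals using medians. The sketch below applies it to logit-transformed compartment fractions; a simple per-compartment logit stands in for the paper's multiple logistic transform, and the fraction table is invented for illustration.

```python
import numpy as np

def median_polish(table, n_iter=10):
    """Tukey's median polish on a two-way table: decompose into overall,
    row and column effects plus residuals, using medians throughout."""
    res = np.asarray(table, dtype=float).copy()
    overall, row, col = 0.0, np.zeros(res.shape[0]), np.zeros(res.shape[1])
    for _ in range(n_iter):
        rmed = np.median(res, axis=1)
        row += rmed
        res -= rmed[:, None]
        m = np.median(row)
        overall += m
        row -= m
        cmed = np.median(res, axis=0)
        col += cmed
        res -= cmed[None, :]
        m = np.median(col)
        overall += m
        col -= m
    return overall, row, col, res

# Illustrative compartment fractions (rows: lung, skeleton, liver) at four
# sacrifice times; logit transform before polishing.
frac = np.array([[0.70, 0.55, 0.40, 0.30],
                 [0.20, 0.30, 0.40, 0.45],
                 [0.10, 0.15, 0.20, 0.25]])
overall, row, col, resid = median_polish(np.log(frac / (1 - frac)))
print(overall, row, col, resid, sep="\n")
```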

  18. The Validation of an Interactive Videodisc as an Alternative to Traditional Teaching Techniques: Auscultation of the Heart.

    Science.gov (United States)

    Branck, Charles E.; And Others

    1987-01-01

    This study of 87 veterinary medical students at Auburn University tests the effectiveness and student acceptance of interactive videodisc as an alternative to animal experimentation and other traditional teaching methods in analyzing canine cardiovascular sounds. Results of the questionnaire used are presented, and benefits of interactive video…

  19. Statistical signal processing techniques for coherent transversal beam dynamics in synchrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Alhumaidi, Mouhammad

    2015-03-04

    identifying and analyzing the betatron oscillation sourced from the kick based on its mixing and temporal patterns. The accelerator magnets can generate unwanted spurious linear and non-linear fields due to fabrication errors or aging. These error fields in the magnets can excite undesired resonances which, together with the space charge tune spread, lead to long-term beam losses and reduced dynamic aperture. Therefore, knowledge of the linear and non-linear magnet errors in circular accelerator optics is crucial for controlling and compensating resonances and their consequent beam losses and beam quality deterioration. This is indispensable, especially for high beam intensity machines. Fortunately, the relationship between the beam offset oscillation signals recorded at the BPMs is a manifestation of the accelerator optics, and it can therefore be exploited in the determination of the linear and non-linear optics components. Thus, beam transversal oscillations can be excited deliberately for diagnostic purposes in the operation of particle accelerators. In this thesis, we propose a novel method for detecting and estimating the non-linear components of the optics lattice located between the locations of two BPMs by analyzing the beam offset oscillation signals of a BPM triple containing these two BPMs. Depending on the non-linear components between the locations of the BPM triple, the relationship between the beam offsets follows a corresponding multivariate polynomial. After calculating the covariance matrix of the polynomial terms, the Generalized Total Least Squares method is used to find the model parameters, and thus the non-linear components. A bootstrap technique is used to detect the existing polynomial model orders by means of multiple hypothesis testing, and to determine confidence intervals for the model parameters.
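
    The estimation step named above, Generalized Total Least Squares, reduces in its classic unweighted form to an SVD of the augmented data matrix. The sketch below fits a polynomial relation between noisy BPM offsets this way; the toy lattice model, the noise levels and the simplification from GTLS to plain TLS are all assumptions.

```python
import numpy as np

def total_least_squares(A, b):
    """Classic TLS fit of A @ x ~ b via SVD of the augmented matrix
    (a simplification of the Generalized TLS used in the thesis)."""
    C = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                       # right singular vector of smallest value
    return -v[:-1] / v[-1]

# Toy model: offset at BPM3 as a polynomial in offsets at BPM1 and BPM2,
# with a quadratic term standing in for a non-linear lattice component.
rng = np.random.default_rng(9)
x1, x2 = rng.normal(0, 1, 500), rng.normal(0, 1, 500)
y = 1.2 * x1 - 0.7 * x2 + 0.15 * x1**2 + rng.normal(0, 0.02, 500)

A = np.column_stack([x1, x2, x1**2])
print(total_least_squares(A + rng.normal(0, 0.02, A.shape), y))
```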

  20. Transforming traditional Tai Ji Quan techniques into integrative movement therapy-Tai Ji Quan: Moving for Better Balance.

    Science.gov (United States)

    Li, Fuzhong

    2014-03-01

    Tai Ji Quan, developed as a martial art, has traditionally served multiple purposes, including self-defense, competition/performance, and health promotion. With respect to health, the benefits historically and anecdotally associated with Tai Ji Quan are now being supported by scientific and clinical research, with mounting evidence indicating its potential value in preventing and managing various diseases and improving well-being and quality of life in middle-aged and older adults. The research findings produced to date have both public health significance and clinical relevance. However, because of its roots in the martial arts, transforming traditional Tai Ji Quan movements and training approaches into contemporary therapeutic programs and functional applications is needed to maximize its ultimate utility. This paper addresses this issue by introducing Tai Ji Quan: Moving for Better Balance, a functional therapy that involves the use of Tai Ji Quan principles and Yang-style-based movements to form an innovative, contemporary therapeutic approach that integrates motor, sensory, and cognitive components to improve postural control, gait, and mobility for older adults and those who have neurodegenerative movement impairments. It provides a synergy of traditional and contemporary Tai Ji Quan practice with the ultimate goal of improving balance and gait, enhancing performance of daily functional tasks, and reducing incidence of falls among older adults.

  1. Transforming traditional Tai Ji Quan techniques into integrative movement therapy—Tai Ji Quan: Moving for Better Balance

    Directory of Open Access Journals (Sweden)

    Fuzhong Li

    2014-03-01

    Tai Ji Quan, developed as a martial art, has traditionally served multiple purposes, including self-defense, competition/performance, and health promotion. With respect to health, the benefits historically and anecdotally associated with Tai Ji Quan are now being supported by scientific and clinical research, with mounting evidence indicating its potential value in preventing and managing various diseases and improving well-being and quality of life in middle-aged and older adults. The research findings produced to date have both public health significance and clinical relevance. However, because of its roots in the martial arts, transforming traditional Tai Ji Quan movements and training approaches into contemporary therapeutic programs and functional applications is needed to maximize its ultimate utility. This paper addresses this issue by introducing Tai Ji Quan: Moving for Better Balance, a functional therapy that involves the use of Tai Ji Quan principles and Yang-style-based movements to form an innovative, contemporary therapeutic approach that integrates motor, sensory, and cognitive components to improve postural control, gait, and mobility for older adults and those who have neurodegenerative movement impairments. It provides a synergy of traditional and contemporary Tai Ji Quan practice with the ultimate goal of improving balance and gait, enhancing performance of daily functional tasks, and reducing incidence of falls among older adults.

  2. Multi-Site and Multi-Variables Statistical Downscaling Technique in the Monsoon Dominated Region of Pakistan

    Science.gov (United States)

    Khan, Firdos; Pilz, Jürgen

    2016-04-01

    South Asia is under severe impacts of climate change and global warming. The last two decades have shown that climate change is happening, and the first decade of the 21st century was the warmest decade ever recorded over Pakistan, with temperature reaching 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are badly affected, causing floods, cyclones and hurricanes in the region, which in turn have impacts on agriculture, water, health, etc. To cope with the situation, it is important to conduct impact assessment studies and take adaptation and mitigation remedies. For impact assessment studies, we need climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan. One reason for choosing this area is that the contribution of monsoon rains is more than 80% of the total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, i.e., quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. The advantages of this new method are: (1) it is more robust to outliers than ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can be used to combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological
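
    One of the two building blocks named above, quantile regression, is available in standard libraries. The sketch below fits conditional precipitation quantiles on a single synthetic large-scale predictor, standing in for an EOF score; the copula step and the real predictor set are not reproduced, and the data-generating model is an assumption.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic rain-like (skewed, positive) response driven by one predictor.
rng = np.random.default_rng(13)
eof = rng.normal(0, 1, 800)                                  # stand-in EOF score
precip = np.exp(0.8 + 0.5 * eof + rng.normal(0, 0.4, 800))

# Fit conditional quantiles at several levels, as quantile regression does.
X = sm.add_constant(eof)
for q in (0.5, 0.9, 0.99):
    fit = sm.QuantReg(precip, X).fit(q=q)
    print(f"tau={q}: intercept={fit.params[0]:.2f}, slope={fit.params[1]:.2f}")
```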

  3. Identification of heavy metal sources in the Mexico City atmosphere using the proton-induced X-ray analytical technique and multifactorial statistical techniques

    International Nuclear Information System (INIS)

    Hernandez M, B.

    1997-01-01

    The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of the polluting chemical elements over the annual cycle of 1990, based on their concentrations obtained through the PIXE technique; to identify the suitable statistical methods to apply to the metal concentration data, in the form of total suspended particles (PST), found in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to support the decision-making and control measures planned by the various institutions addressing atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)

  4. Remote sensing estimation of the total phosphorus concentration in a large lake using band combinations and regional multivariate statistical modeling techniques.

    Science.gov (United States)

    Gao, Yongnian; Gao, Junfeng; Yin, Hongbin; Liu, Chuansheng; Xia, Ting; Wang, Jing; Huang, Qi

    2015-03-15

    Remote sensing has been widely used for water quality monitoring, but most such studies have focused on only a few water quality variables, such as chlorophyll-a, turbidity, and total suspended solids, which are typically considered optically active variables. Remote sensing presents a challenge in estimating the phosphorus concentration in water. The total phosphorus (TP) in lakes has been estimated from remotely sensed observations, primarily using a simple individual band ratio or its natural logarithm together with statistical regression based on field TP data and spectral reflectance. In this study, we investigated the possibility of establishing a spatial modeling scheme to estimate the TP concentration of a large lake from multi-spectral satellite imagery using band combinations and regional multivariate statistical modeling techniques, and we tested the applicability of the scheme. The results showed that HJ-1A CCD multi-spectral satellite imagery can be used to estimate the TP concentration in a lake. The correlation and regression analysis showed a highly significant positive relationship between the TP concentration and certain remotely sensed combination variables. The proposed modeling scheme had higher accuracy for TP concentration estimation in the large lake than the traditional individual band ratio method and the whole-lake-scale regression-modeling scheme. The TP concentration values showed clear spatial variability and were high in western Lake Chaohu and relatively low in eastern Lake Chaohu. The northernmost portion, the northeastern coastal zone and the southeastern portion of western Lake Chaohu had the highest TP concentrations, and the other regions had the lowest TP concentration values, except for the coastal zone of eastern Lake Chaohu. These results strongly suggest that the proposed modeling scheme, i.e., the band combinations and the regional multivariate

  5. Analysis of tribological behaviour of zirconia reinforced Al-SiC hybrid composites using statistical and artificial neural network technique

    Science.gov (United States)

    Arif, Sajjad; Tanwir Alam, Md; Ansari, Akhter H.; Bilal Naim Shaikh, Mohd; Arif Siddiqui, M.

    2018-05-01

    The tribological performance of aluminium hybrid composites reinforced with micro SiC (5 wt%) and nano zirconia (0, 3, 6 and 9 wt%), fabricated through a powder metallurgy technique, was investigated using statistical and artificial neural network (ANN) approaches. The influence of zirconia reinforcement, sliding distance and applied load was analyzed with tests based on a full factorial design of experiments. Analysis of variance (ANOVA) was used to evaluate the percentage contribution of each process parameter to wear loss. The ANOVA results suggested that wear loss is mainly influenced by sliding distance, followed by zirconia reinforcement and applied load. Further, a feed-forward back-propagation neural network was applied to the input/output data for predicting and analyzing the wear behaviour of the fabricated composite. A very close correlation between experimental and ANN outputs was achieved by the model. Finally, the ANN model was effectively used to find the influence of the various control factors on the wear behaviour of the hybrid composites.

  6. Post-fire debris flow prediction in Western United States: Advancements based on a nonparametric statistical technique

    Science.gov (United States)

    Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.

    2017-12-01

    Fire disasters affect modern societies at the global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact on a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over recent years a number of efforts from the United States Geological Survey (USGS) and the National Weather Service (NWS) have focused on the development of early warning systems that will help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows prediction of the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from USGS that reports a total of 1500 storms that either triggered or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
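
    A sketch of that formulation, under stated assumptions: a random-forest classifier is trained on hypothetical storm and land-surface predictors, and its skill is summarized with the threat score, hits / (hits + misses + false alarms). The feature set and the toy trigger rule below are illustrative, not the USGS database.

    ```python
    # Random-forest PFDF occurrence classifier scored by threat score (a sketch).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    n = 1500
    X = np.column_stack([
        rng.uniform(0.1, 24, n),     # storm duration (h)
        rng.uniform(1, 120, n),      # rainfall accumulation (mm)
        rng.uniform(1, 80, n),       # max intensity (mm/h)
        rng.uniform(0, 1, n),        # soil erodibility index
        rng.uniform(0, 40, n),       # local slope (deg)
    ])
    y = (X[:, 2] * (0.5 + X[:, 3]) > 40).astype(int)   # toy trigger rule

    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(Xtr, ytr)
    pred = rf.predict(Xte)

    hits = np.sum((pred == 1) & (yte == 1))
    misses = np.sum((pred == 0) & (yte == 1))
    false_alarms = np.sum((pred == 1) & (yte == 0))
    print("threat score =", hits / (hits + misses + false_alarms))
    ```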

  7. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques

    Science.gov (United States)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John

    2016-02-01

    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells, were compared for four different temporal resolutions between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
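
    One of the validation checks described above, sketched under assumptions: compare wet-day frequency and wet-spell lengths between an observed and a downscaled daily precipitation series. The 0.1 mm wet-day threshold and the synthetic gamma-distributed series are illustrative choices, not the study's data.

    ```python
    # Compare wet-day frequency and wet-spell statistics of two daily series.
    import numpy as np

    def wet_spell_lengths(precip, wet_threshold=0.1):
        """Lengths of consecutive runs of wet days (precip >= threshold, mm)."""
        wet = precip >= wet_threshold
        lengths, run = [], 0
        for w in wet:
            if w:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
        return np.array(lengths)

    rng = np.random.default_rng(3)
    observed = rng.gamma(0.4, 6.0, 3650)      # toy daily precipitation (mm)
    downscaled = rng.gamma(0.38, 6.3, 3650)   # toy downscaled series (mm)

    for name, s in [("observed", observed), ("downscaled", downscaled)]:
        print(name, "wet-day freq:", np.mean(s >= 0.1),
              "mean wet spell:", wet_spell_lengths(s).mean())
    ```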

  8. Arsenic health risk assessment in drinking water and source apportionment using multivariate statistical techniques in Kohistan region, northern Pakistan.

    Science.gov (United States)

    Muhammad, Said; Tahir Shah, M; Khan, Sardar

    2010-10-01

    The present study was conducted in the Kohistan region, where mafic and ultramafic rocks (Kohistan island arc and Indus suture zone) and metasedimentary rocks (Indian plate) are exposed. Water samples were collected from springs, streams and the Indus river and analyzed for physical parameters, anions, cations and arsenic (As(3+), As(5+) and total arsenic). The water quality in the Kohistan region was evaluated by comparing the physicochemical parameters with the permissible limits set by the Pakistan Environmental Protection Agency and the World Health Organization. Most of the studied parameters were found within their respective permissible limits; however, in some samples the iron and arsenic concentrations exceeded their permissible limits. For the health risk assessment of arsenic, the average daily dose, hazard quotient (HQ) and cancer risk were calculated by using statistical formulas. The HQ values were found to be >1 in the samples collected from Jabba and Dubair, while they were <1 in the remaining samples. The pollution load was also apportioned by using multivariate statistical techniques such as one-way ANOVA, correlation analysis, regression analysis, cluster analysis and principal component analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
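
    The risk figures named above follow a widely used US EPA form; a minimal sketch, assuming common default exposure parameters (2 L/day intake, 70 kg body weight, 30-year exposure) and the standard oral reference dose and slope factor for arsenic. All numbers are assumptions for illustration, not the study's inputs.

    ```python
    # Average daily dose, hazard quotient and cancer risk (a sketch).
    def average_daily_dose(conc_mg_per_l, intake_l_per_day=2.0, exp_freq=365,
                           exp_dur_yr=30, body_wt_kg=70.0,
                           avg_time_days=365 * 30):
        """ADD in mg/(kg day) = (C * IR * EF * ED) / (BW * AT)."""
        return (conc_mg_per_l * intake_l_per_day * exp_freq * exp_dur_yr) / \
               (body_wt_kg * avg_time_days)

    def hazard_quotient(add, rfd=3e-4):       # assumed oral RfD for As, mg/(kg day)
        return add / rfd                      # HQ > 1 flags potential non-cancer risk

    def cancer_risk(add, slope_factor=1.5):   # assumed oral slope factor for As
        return add * slope_factor

    add = average_daily_dose(conc_mg_per_l=0.02)   # hypothetical As concentration
    print("HQ =", hazard_quotient(add), "cancer risk =", cancer_risk(add))
    ```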

  9. The problem of sexual imbalance and techniques of the self in the Diagnostic and Statistical Manual of Mental Disorders.

    Science.gov (United States)

    Flore, Jacinthe

    2016-09-01

    This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination. © The Author(s) 2016.

  10. Application of Statistical Downscaling Techniques to Predict Rainfall and Its Spatial Analysis Over Subansiri River Basin of Assam, India

    Science.gov (United States)

    Barman, S.; Bhattacharjya, R. K.

    2017-12-01

    The River Subansiri is the major north-bank tributary of the river Brahmaputra. It originates from the range of the Himalayas beyond the Great Himalayan range at an altitude of approximately 5340 m. The Subansiri basin extends from tropical to temperate zones and hence exhibits a great diversity in rainfall characteristics. In the Northern and Central Himalayan tracts, precipitation is scarce on account of high altitudes. On the other hand, the southeast part of the Subansiri basin, comprising the sub-Himalayan and plain tracts in Arunachal Pradesh and Assam, lies in the tropics. Due to the Northeast as well as the Southwest monsoon, precipitation occurs in this region in abundant quantities. In particular, the Southwest monsoon causes very heavy precipitation in the entire Subansiri basin during May to October. In this study, the rainfall over the Subansiri basin has been studied at 24 different locations by multiple linear and non-linear regression based statistical downscaling techniques and by an Artificial Neural Network based model. APHRODITE's gridded rainfall data of 0.25° × 0.25° resolution and climatic parameters of the HadCM3 GCM at a resolution of 2.5° × 3.75° (latitude by longitude) have been used in this study. It has been found that the multiple non-linear regression based statistical downscaling technique outperformed the other techniques. Using this method, the future rainfall pattern over the Subansiri basin has been analyzed up to the year 2099 for four different time periods, viz., 2020-39, 2040-59, 2060-79, and 2080-99, at all 24 locations. On the basis of historical rainfall, the months have been categorized as wet months, months with moderate rainfall and dry months. The spatial changes in rainfall patterns for all three types of months have also been analyzed over the basin. A potential decrease of rainfall in the wet months and months with moderate rainfall, and an increase of rainfall in the dry months, are observed in the future rainfall pattern of the Subansiri basin.

  11. Risk factors for pericardial effusion after chemoradiotherapy for thoracic esophageal cancer-comparison of four-field technique and traditional two opposed fields technique.

    Science.gov (United States)

    Takata, Noriko; Kataoka, Masaaki; Hamamoto, Yasushi; Tsuruoka, Shintaro; Kanzaki, Hiromitsu; Uwatsu, Kotaro; Nagasaki, Kei; Mochizuki, Teruhito

    2018-04-11

    Pericardial effusion is an important late toxicity after concurrent chemoradiotherapy (CCRT) for locally advanced esophageal cancer. We investigated the clinical and dosimetric factors related to pericardial effusion among patients with thoracic esophageal cancer who were treated with definitive CCRT using the two opposed fields technique (TFT) or the four-field technique (FFT), as well as the effectiveness of FFT. During 2007-2015, 169 patients with middle and/or lower thoracic esophageal cancer received definitive CCRT, and 94 patients were evaluable (51 FFT cases and 43 TFT cases). Pericardial effusion was observed in 74 patients (79%) and appeared at 1-18.5 months (median: 5.25 months) after CCRT. The 1-year incidences of pericardial effusion were 73.2% and 76.7% in the FFT and TFT groups, respectively (P = 0.6395). The mean doses to the pericardium were 28.6 Gy and 31.8 Gy in the FFT and TFT groups, respectively (P = 0.0259), and the V40 Gy proportions were 33.5% and 48.2% in the FFT and TFT groups, respectively. Although the incidences of pericardial effusion after CCRT were similar in both groups, symptomatic pericardial effusion was not observed in patients with a low pericardial V40 Gy, suggesting that limiting the dose to the pericardium may reduce the risk of symptomatic effusion.

  12. New technique targeting the C5 nerve root proximal to the traditional interscalene sonoanatomical approach is analgesic for outpatient arthroscopic shoulder surgery.

    Science.gov (United States)

    Dobie, Katherine H; Shi, Yaping; Shotwell, Matthew S; Sandberg, Warren S

    2016-11-01

    Regional anesthesia and analgesia for shoulder surgery is most commonly performed via interscalene nerve block. We developed an ultrasound-guided technique that specifically targets the C5 nerve root proximal to the traditional interscalene block and assessed its efficacy for shoulder analgesia. Prospective case series. Vanderbilt Bone and Joint Surgery Center. Patients undergoing shoulder arthroscopy at an ambulatory surgery center. Thirty-five outpatient shoulder arthroscopy patients underwent an analgesic nerve block using a new technique in which ultrasound visualization of the C5 nerve root served as the primary target at a level proximal to the traditional interscalene approach. The block was performed with 15 mL of 0.5% plain ropivacaine. Post-anesthesia care unit pain scores, opioid consumption, hand strength, and duration of block were recorded. Cadaver dissection after injection with methylene blue confirmed that the primary target under ultrasound visualization was the C5 nerve root. Pain scores revealed that 97% of patients had 0/10 pain on arrival to the PACU, with 91% having a pain score of 3/10 or less at discharge from the PACU. The Medical Research Council (MRC) hand strength mean (SD) score was 4.17 (0.92) on a scale of 1-5. The mean (SD) duration of the block was 13.9 (3.5) hours. A new technique for ultrasound-guided blockade at the level of the C5 nerve root, proximal to the level of the traditional interscalene block, is efficacious for shoulder post-operative pain control. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Traditional genetic improvement and use of biotechnological techniques in searching of resistance to main fungi pathogens of Musa spp.

    Directory of Open Access Journals (Sweden)

    Michel Leiva-Mora

    2006-07-01

    Full Text Available Bananas and plantains are important staple foods in the human diet, whether cooked or consumed fresh. Fungal diseases caused by Fusarium oxysporum f. sp. cubense (Foc) and Mycosphaerella fijiensis have threatened to destroy Musa spp. These crops are difficult to breed genetically because they are sterile, do not produce fertile seeds and are parthenocarpic. Genetic crossing by hybridization has been used successfully in the FHIA and IITA Musa breeding programs, which have released numerous hybrids with improved resistance to those diseases. Plant biotechnology has developed a set of techniques for Musa micropropagation to increase multiplication rates and provide healthy and safe plant material for plantation. Mutagenic techniques, somaclonal variation, somatic embryogenesis and, more recently, genetic transformation have enabled advances in, and complementation of, classical Musa breeding in the search for resistance to the principal fungal pathogens of Musa spp. Field evaluation systems to find Musa genotypes resistant to Foc and M. fijiensis have been demonstrated to be useful but laborious. Nevertheless, to enhance efficacy in the selection of promising genotypes, the development of reproducible early evaluation methodologies using the fungal pathogens or their derivatives is needed. Key words: evaluation and selection, Fusarium oxysporum, improvement

  14. Adapting the Caesium-137 technique to document soil redistribution rates associated with traditional cultivation practices in Haiti.

    Science.gov (United States)

    Velasco, H; Astorga, R Torres; Joseph, D; Antoine, J S; Mabit, L; Toloza, A; Dercon, G; Walling, Des E

    2018-03-01

    Large-scale deforestation, intensive land use and unfavourable rainfall conditions are responsible for significant continuous degradation of the Haitian uplands. To develop soil conservation strategies, simple and cost-effective methods are needed to assess rates of soil loss from farmland in Haiti. The fallout radionuclide caesium-137 (¹³⁷Cs) provides one such means of documenting medium-term soil redistribution rates. In this contribution, the authors report the first use in Haiti of ¹³⁷Cs measurements to document soil redistribution rates and the associated pattern of erosion/sedimentation rates along typical hillslopes within a traditional upland Haitian farming area. The local ¹³⁷Cs reference inventory, measured at an adjacent undisturbed flat area, was 670 Bq m⁻² (SD = 100 Bq m⁻², CV = 15%, n = 7). Within the study area, where cultivation commenced in 1992 after deforestation, three representative downslope transects were sampled. These were characterized by ¹³⁷Cs inventories ranging from 190 to 2200 Bq m⁻². Although the study area was cultivated by the local farmers, the ¹³⁷Cs depth distributions obtained from the area differed markedly from those expected from a cultivated area. They showed little evidence of tillage mixing within the upper part of the soil or, more particularly, of the near-uniform activities normally associated with the plough layer or cultivation horizon. They were very similar to that found at the reference site and were characterized by high ¹³⁷Cs activities at the surface and much lower activities at greater depths. This situation is thought to reflect the traditional manual tillage practices, which cause limited disturbance and mixing of the upper part of the soil. It precluded the use of the conversion models normally used to estimate soil redistribution rates from ¹³⁷Cs measurements on cultivated soils, and the Diffusion and Migration conversion model frequently used for uncultivated soils was modified for

  15. Traditional versus commercial food processing techniques - A comparative study based on chemical analysis of selected foods consumed in rural Zimbabwe.

    Directory of Open Access Journals (Sweden)

    Abraham I. C. Mwadiwa

    2012-01-01

    Full Text Available With the advent of industrialisation, food processors are constantly looking for ways to cut costs, increase production and maximise profits at the expense of quality. Commercial food processors have since shifted their focus from endogenous ways of processing food to more profitable commercial food processing techniques. The aim of this study was to investigate the holistic impact of commercial food processing techniques on nutrition by comparing commercially (industrially) processed food products and endogenously processed food products through chemical analysis of selected foods. Eight food samples, which included commercially processed peanut butter, mealie-meal, dried vegetables (mufushwa) and rice and endogenously processed peanut butter, mealie-meal, dried vegetables (mufushwa) and rice, were randomly sampled from rural communities in the south-eastern and central provinces of Zimbabwe. They were analysed for ash, zinc, iron, copper, magnesium, protein, fat, carbohydrate, energy, crude fibre, vitamin C and moisture contents. The results of the chemical analysis indicate that endogenously processed mealie-meal, dried vegetables and rice contained higher ash values of 2.00 g/100 g, 17.83 g/100 g and 3.28 g/100 g, respectively, than commercially processed mealie-meal, dried vegetables and rice, which had ash values of 1.56 g/100 g, 15.25 g/100 g and 1.46 g/100 g, respectively. The results also show that endogenously processed foods have correspondingly higher iron, zinc and magnesium contents and, on the whole, a higher protein content, while commercially processed foods have higher fat and energy contents. The latter foods are therefore likely to pose a higher risk of causing adverse health conditions, such as obesity and cardiovascular diseases, in susceptible individuals. Based on these findings, it can be concluded that endogenously processed foods have better nutrient value and health implications

  16. Water quality assessment and apportionment of pollution sources of Gomti river (India) using multivariate statistical techniques--a case study

    International Nuclear Information System (INIS)

    Singh, Kunwar P.; Malik, Amrita; Sinha, Sarita

    2005-01-01

    Multivariate statistical techniques, such as cluster analysis (CA), factor analysis (FA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the data set on water quality of the Gomti river (India), generated during three years (1999-2001) monitoring at eight different sites for 34 parameters (9792 observations). This study presents usefulness of multivariate statistical techniques for evaluation and interpretation of large complex water quality data sets and apportionment of pollution sources/factors with a view to get better information about the water quality and design of monitoring network for effective management of water resources. Three significant groups, upper catchments (UC), middle catchments (MC) and lower catchments (LC) of sampling sites were obtained through CA on the basis of similarity between them. FA/PCA applied to the data sets pertaining to three catchments regions of the river resulted in seven, seven and six latent factors, respectively responsible for the data structure, explaining 74.3, 73.6 and 81.4% of the total variance of the respective data sets. These included the trace metals group (leaching from soil and industrial waste disposal sites), organic pollution group (municipal and industrial effluents), nutrients group (agricultural runoff), alkalinity, hardness, EC and solids (soil leaching and runoff process). DA showed the best results for data reduction and pattern recognition during both temporal and spatial analysis. It rendered five parameters (temperature, total alkalinity, Cl, Na and K) affording more than 94% right assignations in temporal analysis, while 10 parameters (river discharge, pH, BOD, Cl, F, PO₄, NH₄-N, NO₃-N, TKN and Zn) to afford 97% right assignations in spatial analysis of three different regions in the basin. Thus, DA allowed reduction in dimensionality of the large data set, delineating a few indicator parameters responsible for large variations in water quality. Further
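
    A minimal sketch of the CA-plus-PCA workflow under assumptions: standardize a site-by-parameter water-quality matrix, group the sampling sites by hierarchical clustering, and extract latent factors by PCA. The 8 × 34 shape mirrors the study's layout, but the data below are synthetic.

    ```python
    # Cluster sampling sites and extract latent factors from water-quality data.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    data = rng.normal(size=(8, 34))          # toy site-by-parameter matrix
    z = StandardScaler().fit_transform(data)

    # Hierarchical cluster analysis on the sites (Ward linkage, a common choice)
    groups = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
    print("site groups:", groups)

    # PCA / factor extraction and cumulative variance explained
    pca = PCA(n_components=6).fit(z)
    print("variance explained:", pca.explained_variance_ratio_.cumsum())
    ```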

  17. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  18. Physical, physicochemical and nutritional characteristics of Bhoja chaul, a traditional ready-to-eat dry heat parboiled rice product processed by an improvised soaking technique.

    Science.gov (United States)

    Dutta, Himjyoti; Mahanta, Charu Lata; Singh, Vasudeva; Das, Barnali Baruah; Rahman, Narzu

    2016-01-15

    Bhoja chaul is a traditional whole rice product, processed by the dry heat parboiling of low amylose/waxy paddy, that is eaten after soaking in water and requires no cooking. The essential steps in Bhoja chaul making are soaking paddy in water, roasting with sand, drying and milling. In this study, the product was prepared from a low amylose variety and a waxy rice variety by an improvised laboratory scale technique. Bhoja chaul prepared in the laboratory by this technique was studied for physical, physicochemical, and textural properties. The improvised method shortened the processing time and gave a product with good textural characteristics. The shape of the rice kernels became bolder on processing. RVA studies and DSC endotherms suggested molecular damage and amylose-lipid complex formation by the linear B-chains of amylopectin, respectively. X-ray diffractography indicated the formation of a partial B-type pattern. The shifting of the crystalline region of the XRD curve towards lower values of Bragg's angle was attributed to the overall increase in inter-planar spacing of the crystalline lamellae. Resistant starch was negligible. Bhoja chaul may be useful for children and people with a poor state of digestibility. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Introductory statistics for engineering experimentation

    CERN Document Server

    Nelson, Peter R; Coffin, Marie

    2003-01-01

    The Accreditation Board for Engineering and Technology (ABET) introduced a criterion, starting with their 1992-1993 site visits, that "Students must demonstrate a knowledge of the application of statistics to engineering problems." Since most engineering curricula are filled with requirements in their own discipline, they generally do not have time for a traditional two semesters of probability and statistics. Attempts to condense that material into a single semester often result in so much time being spent on probability that the statistics useful for designing and analyzing engineering/scientific experiments is never covered. This book was created to satisfy the needs of a one-semester course whose purpose is to introduce engineering/scientific students to the most useful statistical methods. - Provides the statistical design and analysis of engineering experiments & problems - Presents a student-friendly approach through providing statistical models for advanced learning techniques - Cove...

  20. The Multivariate Regression Statistics Strategy to Investigate Content-Effect Correlation of Multiple Components in Traditional Chinese Medicine Based on a Partial Least Squares Method.

    Science.gov (United States)

    Peng, Ying; Li, Su-Ning; Pei, Xuexue; Hao, Kun

    2018-03-01

    A multivariate regression statistics strategy was developed to clarify the content-effect correlation of multiple components of panax ginseng saponins extract and to predict the pharmacological effect from component content. In example 1, firstly, we compared pharmacological effects between panax ginseng saponins extract and individual saponin combinations. Secondly, we examined the anti-platelet aggregation effect in seven different saponin combinations of ginsenoside Rb1, Rg1, Rh, Rd, Ra3 and notoginsenoside R1. Finally, the correlation between anti-platelet aggregation and the content of multiple components was analyzed by a partial least squares algorithm. In example 2, firstly, 18 common peaks were identified in ten different batches of panax ginseng saponins extracts from different origins. Then, we investigated the anti-myocardial ischemia reperfusion injury effects of the ten different panax ginseng saponins extracts. Finally, the correlation between the fingerprints and the cardioprotective effects was analyzed by a partial least squares algorithm. In both examples 1 and 2, the relationship between component content and pharmacological effect was modeled well by the partial least squares regression equations. Importantly, the predicted effect curve was close to the observed data points marked on the partial least squares regression model. This study has given evidence that multi-component content is promising information for predicting the pharmacological effects of traditional Chinese medicine.
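
    A minimal sketch of the partial least squares step, assuming synthetic data: the content of several components (e.g. saponins) is regressed against a single pharmacological response with sklearn's PLSRegression. The component count and matrix sizes are illustrative.

    ```python
    # Partial least squares regression of effect on component content (a sketch).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    content = rng.uniform(0, 1, (10, 6))      # e.g. 10 extracts x 6 components
    effect = content @ np.array([0.8, 0.1, 0.4, -0.2, 0.3, 0.0]) \
             + rng.normal(0, 0.05, 10)

    pls = PLSRegression(n_components=2).fit(content, effect)
    predicted = pls.predict(content).ravel()
    print("R^2 =", pls.score(content, effect))
    print("coefficients:", pls.coef_.ravel())
    ```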

  1. The Multivariate Regression Statistics Strategy to Investigate Content-Effect Correlation of Multiple Components in Traditional Chinese Medicine Based on a Partial Least Squares Method

    Directory of Open Access Journals (Sweden)

    Ying Peng

    2018-03-01

    Full Text Available A multivariate regression statistics strategy was developed to clarify the content-effect correlation of multiple components of panax ginseng saponins extract and to predict the pharmacological effect from component content. In example 1, firstly, we compared pharmacological effects between panax ginseng saponins extract and individual saponin combinations. Secondly, we examined the anti-platelet aggregation effect in seven different saponin combinations of ginsenoside Rb1, Rg1, Rh, Rd, Ra3 and notoginsenoside R1. Finally, the correlation between anti-platelet aggregation and the content of multiple components was analyzed by a partial least squares algorithm. In example 2, firstly, 18 common peaks were identified in ten different batches of panax ginseng saponins extracts from different origins. Then, we investigated the anti-myocardial ischemia reperfusion injury effects of the ten different panax ginseng saponins extracts. Finally, the correlation between the fingerprints and the cardioprotective effects was analyzed by a partial least squares algorithm. In both examples 1 and 2, the relationship between component content and pharmacological effect was modeled well by the partial least squares regression equations. Importantly, the predicted effect curve was close to the observed data points marked on the partial least squares regression model. This study has given evidence that multi-component content is promising information for predicting the pharmacological effects of traditional Chinese medicine.

  2. Statistical Physics

    CERN Document Server

    Wannier, Gregory Hugh

    1966-01-01

    Until recently, the field of statistical physics was traditionally taught as three separate subjects: thermodynamics, statistical mechanics, and kinetic theory. This text, a forerunner in its field and now a classic, was the first to recognize the outdated reasons for their separation and to combine the essentials of the three subjects into one unified presentation of thermal physics. It has been widely adopted in graduate and advanced undergraduate courses, and is recommended throughout the field as an indispensable aid to the independent study and research of statistical physics.Designed for

  3. Fit accuracy of metal partial removable dental prosthesis frameworks fabricated by traditional or light curing modeling material technique: An in vitro study

    Science.gov (United States)

    Anan, Mohammad Tarek M.; Al-Saadi, Mohannad H.

    2015-01-01

    Objective The aim of this study was to compare the fit accuracies of metal partial removable dental prosthesis (PRDP) frameworks fabricated by the traditional technique (TT) or the light-curing modeling material technique (LCMT). Materials and methods A metal model of a Kennedy class III modification 1 mandibular dental arch with two edentulous spaces of different spans, short and long, was used for the study. Thirty identical working casts were used to produce 15 PRDP frameworks each by TT and by LCMT. Every framework was transferred to a metal master cast to measure the gap between the metal base of the framework and the crest of the alveolar ridge of the cast. Gaps were measured at three points on each side by a USB digital intraoral camera at ×16.5 magnification. Images were transferred to a graphics editing program. A single examiner performed all measurements. The two-tailed t-test was performed at the 5% significance level. Results The mean gap value was significantly smaller in the LCMT group compared to the TT group. The mean value of the short edentulous span was significantly smaller than that of the long edentulous span in the LCMT group, whereas the opposite result was obtained in the TT group. Conclusion Within the limitations of this study, it can be concluded that the fit of the LCMT-fabricated frameworks was better than the fit of the TT-fabricated frameworks. The framework fit can differ according to the span of the edentate ridge and the fabrication technique for the metal framework. PMID:26236129

  4. Statistical methods

    CERN Document Server

    Szulc, Stefan

    1965-01-01

    Statistical Methods provides a discussion of the principles of the organization and technique of research, with emphasis on its application to the problems in social statistics. This book discusses branch statistics, which aims to develop practical ways of collecting and processing numerical data and to adapt general statistical methods to the objectives in a given field.Organized into five parts encompassing 22 chapters, this book begins with an overview of how to organize the collection of such information on individual units, primarily as accomplished by government agencies. This text then

  5. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 1: Review and comparison of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked
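
    The five increasingly complex pattern tests can be sketched directly with scipy for one sampled input and one model output. Note one substitution: Levene's test stands in here for the variance/interquartile-range comparison; it is a named alternative, not necessarily the paper's exact statistic.

    ```python
    # Scatterplot pattern tests: linear, monotonic, central tendency,
    # variability, and deviation from randomness (a sketch).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    x = rng.uniform(0, 1, 300)                      # sampled input variable
    y = np.sin(3 * x) + rng.normal(0, 0.3, 300)     # toy model output

    print("(i)   Pearson r:", stats.pearsonr(x, y))
    print("(ii)  Spearman rho:", stats.spearmanr(x, y))

    # Partition x into quartile classes and compare y across them
    edges = np.quantile(x, [0.25, 0.5, 0.75])
    classes = np.digitize(x, edges)
    samples = [y[classes == k] for k in range(4)]
    print("(iii) Kruskal-Wallis:", stats.kruskal(*samples))
    print("(iv)  Levene (variability):", stats.levene(*samples))

    # Chi-square on the x-class by y-class contingency table
    y_classes = np.digitize(y, np.quantile(y, [0.25, 0.5, 0.75]))
    table = np.array([[np.sum((classes == i) & (y_classes == j))
                       for j in range(4)] for i in range(4)])
    print("(v)   chi-square:", stats.chi2_contingency(table)[:2])
    ```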

  6. Statistical analyses of scatterplots to identify important factors in large-scale simulations, 2: robustness of techniques

    International Nuclear Information System (INIS)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-01-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (i) Type I errors are unavoidable, (ii) Type II errors can occur when inappropriate analysis procedures are used, (iii) physical explanations should always be sought for why statistical procedures identify variables as being important, and (iv) the identification of important variables tends to be stable for independent Latin hypercube samples
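
    A hedged sketch of the stability check mentioned above: compute a sensitivity measure on two independent Latin hypercube samples of a stand-in model and compare the resulting variable rankings. The model and sample sizes are illustrative assumptions, not the two-phase flow code.

    ```python
    # Stability of variable rankings across independent Latin hypercube samples.
    import numpy as np
    from scipy.stats import qmc, spearmanr

    def model(x):                   # toy stand-in for the large simulation model
        return 3 * x[:, 0] + np.sin(5 * x[:, 1]) + 0.1 * x[:, 2]

    for seed in (0, 1):             # two independent LHS replicates
        sample = qmc.LatinHypercube(d=3, seed=seed).random(n=200)
        out = model(sample)
        rho = [abs(spearmanr(sample[:, j], out)[0]) for j in range(3)]
        order = np.argsort(rho)[::-1]
        print("replicate", seed, "importance order:", order)
    ```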

  7. A split-mouth randomized clinical trial to evaluate the performance of piezosurgery compared with traditional technique in lower wisdom tooth removal.

    Science.gov (United States)

    Mantovani, Edoardo; Arduino, Paolo Giacomo; Schierano, Gianmario; Ferrero, Luca; Gallesio, Giorgia; Mozzati, Marco; Russo, Andrea; Scully, Crispian; Carossa, Stefano

    2014-10-01

    The surgical removal of mandibular third molars is frequently accompanied by significant postsurgical sequelae, and different protocols have been described to decrease such adverse events. The aim of this study was to investigate the performance of piezosurgery compared with traditional rotating instruments during mandibular third molar removal. A single-center, randomized, split-mouth study was performed using a consecutive series of unrelated healthy patients attending the Oral Surgery Unit of the University of Turin for surgical removal of bilateral mandibular third molar teeth. Each patient was treated, at the same appointment, using bur removal on 1 side of the mandible and a piezoelectric device on the contralateral side. The primary outcomes reported were postoperative pain, objective orofacial swelling, and surgical duration; secondary outcomes were gender, age, and possible adverse events. Analysis of variance or the paired t test was used as appropriate to test any significant differences at baseline according to each treatment subgroup, and categorical variables were analyzed by the χ² test. The study sample consisted of 100 otherwise healthy patients. The mean pain evaluation reported by patients who underwent surgery with piezosurgery was significantly lower than that reported after bur (conventional) removal, reaching statistical difference after 4 days (P = .043). The clinical value of orofacial swelling at day 7, normalized to baseline, was also lower in the piezosurgery group, and overall the outcomes favoured piezosurgery for lower third molar tooth removal. This study also compared surgeons with different degrees of experience. It is evident that using a piezoelectric device can enhance the patient experience and decrease postoperative pain and swelling. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  8. The integrated analyses of digital field mapping techniques and traditional field methods: implications from the Burdur-Fethiye Shear Zone, SW Turkey as a case-study

    Science.gov (United States)

    Elitez, İrem; Yaltırak, Cenk; Zabcı, Cengiz; Şahin, Murat

    2015-04-01

    Precise geological mapping is one of the most important issues in geological studies. Documenting the spatial distribution of geological bodies and their contacts plays a crucial role in interpreting the tectonic evolution of any region. Although traditional field techniques are still accepted as the most fundamental tools in the construction of geological maps, we suggest that the integration of digital technologies into the classical methods significantly increases the resolution and the quality of such products. We follow these steps in integrating digital data with traditional field observations. First, we create the digital elevation model (DEM) of the region of interest by interpolating the digital contours of 1:25000 scale topographic maps to 10 m of ground pixel resolution. The non-commercial Google Earth satellite imagery and geological maps of previous studies are draped over the interpolated DEMs in the second stage. The integration of all spatial data is done using the market-leading GIS software, ESRI ArcGIS. We make the preliminary interpretation of major structures such as tectonic lineaments and stratigraphic contacts. These preliminary maps are checked and precisely coordinated during the field studies by using mobile tablets and/or phablets with GPS receivers. The same devices are also used for measuring and recording the geologic structures of the study region. Finally, all digitally collected measurements and observations are added to the GIS database, and we finalise our geological map with all available information. We applied this integrated method to map the Burdur-Fethiye Shear Zone (BFSZ) in southwest Turkey. The BFSZ is an active sinistral, 60- to 90-km-wide shear zone that extends about 300 km on land between Suhut-Cay in the northeast and Köyceğiz Lake-Kalkan in the southwest. Numerous studies suggest contradictory models not only for the evolution but also for the fault geometry of this

  9. Comparison of a new hydro-surgical technique to traditional methods for the preparation of full-thickness skin grafts from canine cadaveric skin and report of a single clinical case.

    Science.gov (United States)

    Townsend, F I; Ralphs, S C; Coronado, G; Sweet, D C; Ward, J; Bloch, C P

    2012-01-01

    To compare the hydro-surgical technique to traditional techniques for removal of subcutaneous tissue in the preparation of full-thickness skin grafts. Ex vivo experimental study and a single clinical case report. Four canine cadavers and a single clinical case. Four sections of skin were harvested from the lateral flank of recently euthanatized dogs. Traditional preparation methods included both a blade and a scissors technique, each of which was compared to the hydro-surgical technique individually. Preparation methods were compared based on the length of time for removal of the subcutaneous tissue from the graft, histologic grading, and measurable thickness as compared to an untreated sample. The hydro-surgical technique had the shortest skin graft preparation time compared to the traditional techniques (p = 0.002). There was no significant difference in the histological grading or measurable subcutaneous thickness between skin specimens. The hydro-surgical technique provides rapid, effective debridement of subcutaneous tissue in the preparation of full-thickness skin grafts, with no significant changes in histological grade or remaining subcutaneous tissue among all treatment types. Additionally, the hydro-surgical technique was successfully used to prepare a full-thickness meshed free skin graft in the reconstruction of a traumatic medial tarsal wound in a dog.

  10. Are conventional statistical techniques exhaustive for defining metal background concentrations in harbour sediments? A case study: The Coastal Area of Bari (Southeast Italy).

    Science.gov (United States)

    Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio

    2015-11-01

    Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through conventional statistical techniques (such as linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with statistical equations), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Application of a Multivariate Statistical Technique to Interpreting Data from Multichannel Equipment for the Example of the KLEM Spectrometer

    International Nuclear Information System (INIS)

    Podorozhnyi, D.M.; Postnikov, E.B.; Sveshnikova, L.G.; Turundaevsky, A.N.

    2005-01-01

    A multivariate statistical procedure for solving problems of estimating physical parameters on the basis of data from measurements with multichannel equipment is described. Within the multivariate procedure, an algorithm is constructed for estimating the energy of primary cosmic rays and the exponent in their power-law spectrum. They are investigated by using the KLEM spectrometer (NUCLEON project) as a specific example of measuring equipment. The results of computer experiments simulating the operation of the multivariate procedure for this equipment are given, the proposed approach being compared in these experiments with the one-parameter approach presently used in data processing

  12. Evaluation of a dimension-reduction-based statistical technique for Temperature, Water Vapour and Ozone retrievals from IASI radiances

    Science.gov (United States)

    Amato, Umberto; Antoniadis, Anestis; De Feis, Italia; Masiello, Guido; Matricardi, Marco; Serio, Carmine

    2009-03-01

    Remote sensing of the atmosphere is changing rapidly thanks to the development of high spectral resolution infrared space-borne sensors. The aim is to provide increasingly accurate information on the lower atmosphere, as requested by the World Meteorological Organization (WMO), to improve the reliability and time span of weather forecasts as well as Earth monitoring. In this paper we show the results we have obtained on a set of Infrared Atmospheric Sounding Interferometer (IASI) observations using a new statistical strategy based on dimension reduction. Retrievals have been compared to time-space colocated ECMWF analyses for temperature, water vapour and ozone.
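
    The dimension-reduction strategy can be illustrated as a principal-component regression: project high-dimensional radiance spectra onto their leading principal components, then regress the atmospheric state on the scores. The channel count, the linear toy forward model, and the three-variable state below are assumptions, not IASI specifics.

    ```python
    # Principal-component regression retrieval of atmospheric state (a sketch).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(7)
    n_obs, n_channels = 500, 800
    state = rng.normal(size=(n_obs, 3))            # e.g. T, H2O, O3 summary values
    forward = rng.normal(size=(3, n_channels))     # toy linear "forward model"
    radiance = state @ forward + rng.normal(0, 0.1, (n_obs, n_channels))

    # Reduce the 800 channels to 20 principal-component scores, then regress
    retrieval = make_pipeline(PCA(n_components=20), LinearRegression())
    retrieval.fit(radiance, state)
    print("fit R^2 =", retrieval.score(radiance, state))
    ```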

  13. Influence of manufacturing parameters on the strength of PLA parts using Layered Manufacturing technique: A statistical approach

    Science.gov (United States)

    Jaya Christiyan, K. G.; Chandrasekhar, U.; Mathivanan, N. Rajesh; Venkateswarlu, K.

    2018-02-01

    3D printing was successfully used to fabricate samples of polylactic acid (PLA). Processing parameters such as lay-up speed, lay-up thickness, and printing nozzle diameter were varied. All samples were tested for flexural strength using a three-point load test. A statistical mathematical model was developed to correlate the processing parameters with flexural strength. The results clearly demonstrated that lay-up thickness and nozzle diameter influenced flexural strength significantly, whereas lay-up speed hardly influenced it.

  14. Design and performance characteristics of solar adsorption refrigeration system using parabolic trough collector: Experimental and statistical optimization technique

    International Nuclear Information System (INIS)

    Abu-Hamdeh, Nidal H.; Alnefaie, Khaled A.; Almitani, Khalid H.

    2013-01-01

    Highlights: • The success of using olive waste/methanol as an adsorbent/adsorbate pair. • The experimental gross cycle coefficient of performance obtained was COPa = 0.75. • Optimization showed that expanding the adsorbent mass within a certain range increases the COP. • The statistical optimization led to an optimum tank volume between 0.2 and 0.3 m³. • Increasing the collector area within a certain range increased the COP. - Abstract: The current work demonstrates a developed model of a solar adsorption refrigeration system with specific requirements and specifications. The scheme can be employed as a refrigerator and cooler unit suitable for remote areas. The unit runs on a parabolic trough solar collector (PTC) and uses olive waste as the adsorbent with methanol as the adsorbate. Cooling production, COP (coefficient of performance) and COPa (gross cycle coefficient of performance) were used to assess the system performance. The system's optimum design parameters in this study were arrived at through statistical and experimental methods. The lowest temperature attained in the refrigerated space was 4 °C while the corresponding ambient temperature was 27 °C. The temperature started to decrease steadily at 20:30, when the actual cooling started, until it reached 4 °C at 01:30 the next day, when it rose again. The highest COPa obtained was 0.75

  15. Model-based iterative reconstruction technique for radiation dose reduction in chest CT: comparison with the adaptive statistical iterative reconstruction technique

    International Nuclear Information System (INIS)

    Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni

    2012-01-01

    To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 ± 3.00) than low-dose ASIR (49.24 ± 9.11, P < 0.01) and reference-dose ASIR images (24.93 ± 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)

  16. Model-based iterative reconstruction technique for radiation dose reduction in chest CT: comparison with the adaptive statistical iterative reconstruction technique

    Energy Technology Data Exchange (ETDEWEB)

    Katsura, Masaki; Matsuda, Izuru; Akahane, Masaaki; Sato, Jiro; Akai, Hiroyuki; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [University of Tokyo, Department of Radiology, Graduate School of Medicine, Bunkyo-ku, Tokyo (Japan)

    2012-08-15

    To prospectively evaluate dose reduction and image quality characteristics of chest CT reconstructed with model-based iterative reconstruction (MBIR) compared with adaptive statistical iterative reconstruction (ASIR). One hundred patients underwent reference-dose and low-dose unenhanced chest CT with 64-row multidetector CT. Images were reconstructed with 50 % ASIR-filtered back projection blending (ASIR50) for reference-dose CT, and with ASIR50 and MBIR for low-dose CT. Two radiologists assessed the images in a blinded manner for subjective image noise, artefacts and diagnostic acceptability. Objective image noise was measured in the lung parenchyma. Data were analysed using the sign test and pair-wise Student's t-test. Compared with reference-dose CT, there was a 79.0 % decrease in dose-length product with low-dose CT. Low-dose MBIR images had significantly lower objective image noise (16.93 ± 3.00) than low-dose ASIR (49.24 ± 9.11, P < 0.01) and reference-dose ASIR images (24.93 ± 4.65, P < 0.01). Low-dose MBIR images were all diagnostically acceptable. Unique features of low-dose MBIR images included motion artefacts and pixellated blotchy appearances, which did not adversely affect diagnostic acceptability. Diagnostically acceptable chest CT images acquired with nearly 80 % less radiation can be obtained using MBIR. MBIR shows greater potential than ASIR for providing diagnostically acceptable low-dose CT images without severely compromising image quality. (orig.)

  17. Vital statistics

    CERN Document Server

    MacKenzie, Dana

    2004-01-01

    The drawbacks of using 19th-century mathematics in physics and astronomy are illustrated. To continue expanding our knowledge of the cosmos, scientists will have to come to terms with modern statistics. Some researchers have deliberately started importing techniques that are used in medical research. However, physicists need to identify the brand of statistics that is suitable for them and make a choice between the Bayesian and the frequentist approaches. (Edited abstract).

  18. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques

    International Nuclear Information System (INIS)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-01-01

    Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock–water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. - Highlights: • Hydrochemical investigations were carried out in the Dhurma aquifer in Saudi Arabia. • The factors controlling potential groundwater pollution in an arid region were studied. • Chemical and statistical analyses are integrated to assess these factors. • Five main factors were extracted, which explain >77% of the total data variance. • The chemical characteristics of the groundwater were influenced by rock–water interactions
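
    A sketch of the factor-extraction step under stated assumptions: standardize a sample-by-variable hydrochemical matrix (54 × 21, matching the study's layout) and extract five factors with a varimax rotation. The data here are synthetic placeholders, not the Dhurma measurements.

    ```python
    # Factor analysis of a standardized hydrochemical data matrix (a sketch).
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(8)
    data = rng.normal(size=(54, 21))          # toy 54 samples x 21 variables
    z = StandardScaler().fit_transform(data)

    fa = FactorAnalysis(n_components=5, rotation="varimax").fit(z)
    loadings = fa.components_.T               # variable-by-factor loadings
    print("loadings shape:", loadings.shape)
    ```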

  19. Evaluation of image quality and radiation dose by adaptive statistical iterative reconstruction technique level for chest CT examination.

    Science.gov (United States)

    Hong, Sun Suk; Lee, Jong-Woong; Seo, Jeong Beom; Jung, Jae-Eun; Choi, Jiwon; Kweon, Dae Cheol

    2013-12-01

    The purpose of this research is to determine the adaptive statistical iterative reconstruction (ASIR) level that enables optimal image quality and dose reduction in a chest computed tomography (CT) protocol with ASIR. A chest phantom was scanned at ASIR levels of 0-50%, and then the noise power spectrum (NPS), signal and noise, and the degree of distortion in terms of the peak signal-to-noise ratio (PSNR) and the root-mean-square error (RMSE) were measured. In addition, the objectivity of the experiment was verified using the American College of Radiology (ACR) phantom. Moreover, on a qualitative basis, the resolution, latitude and degree of distortion of five lesions in the chest phantom were evaluated and their statistics compiled. The NPS value decreased as the frequency increased. The lowest noise and deviation were at the 20% ASIR level, with a mean of 126.15 ± 22.21. For the degree of distortion, the signal-to-noise ratio and PSNR at the 20% ASIR level were the highest, at 31.0 and 41.52, while the maximum absolute error and RMSE were the lowest, at 11.2 and 16. In the ACR phantom study, all ASIR levels were within the acceptable allowance of the guidelines. The 20% ASIR level also performed best in the qualitative evaluation of the five chest phantom lesions, with a resolution score of 4.3, latitude of 3.47 and degree of distortion of 4.25. The 20% ASIR level proved to be the best in all experiments: noise, distortion evaluation using ImageJ and qualitative evaluation of the five lesions of the chest phantom. Therefore, optimal images as well as a reduced radiation dose can be acquired when the 20% ASIR level is applied in thoracic CT.
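
    The distortion figures used above can be computed directly; a minimal sketch of RMSE and PSNR between a reference image and a degraded one, assuming 8-bit image data. The arrays are placeholders.

    ```python
    # RMSE and PSNR between a reference image and a reconstruction (a sketch).
    import numpy as np

    def rmse(ref, img):
        return np.sqrt(np.mean((ref.astype(float) - img.astype(float)) ** 2))

    def psnr(ref, img, max_val=255.0):
        """PSNR in dB = 20 * log10(max_val / RMSE)."""
        return 20.0 * np.log10(max_val / rmse(ref, img))

    rng = np.random.default_rng(9)
    reference = rng.integers(0, 256, (128, 128))
    noisy = np.clip(reference + rng.normal(0, 5, (128, 128)), 0, 255)
    print("RMSE =", rmse(reference, noisy), "PSNR =", psnr(reference, noisy))
    ```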

  20. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
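
    Kriging predictions rest on a fitted variogram; as a hedged illustration, the sketch below computes an empirical semivariogram, gamma(h) = mean of half squared differences between pairs of observations separated by lag h, for synthetic scattered counts. Coordinates, counts and lag bins are assumptions, not the Vermont data.

    ```python
    # Empirical semivariogram of spatially scattered counts (a sketch).
    import numpy as np

    rng = np.random.default_rng(10)
    coords = rng.uniform(0, 100, (80, 2))                     # sample locations (m)
    counts = 10 + 0.1 * coords[:, 0] + rng.normal(0, 1, 80)   # toy thrips counts

    # Pairwise separation distances and half squared differences
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    g = 0.5 * (counts[:, None] - counts[None, :]) ** 2

    iu = np.triu_indices(len(counts), k=1)                    # each pair once
    for lo, hi in zip(range(0, 50, 10), range(10, 60, 10)):
        vals = g[iu][(d[iu] >= lo) & (d[iu] < hi)]
        if vals.size:
            print(f"lag {lo:2d}-{hi:2d} m: gamma = {vals.mean():.2f}")
    ```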

  1. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015), held in Kolkata during 12-16 January 2015. The book explores the applicability of robust methods in non-traditional areas, including the use of new techniques such as skew and mixtures of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas include circular data models and the prediction of mortality and life expectancy. The contributions are both theoretical and applied in nature. Robust statistics is a relatively young branch of the statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  2. Establishing structure-property correlations and classification of base oils using statistical techniques and artificial neural networks

    International Nuclear Information System (INIS)

    Kapur, G.S.; Sastry, M.I.S.; Jaiswal, A.K.; Sarpal, A.S.

    2004-01-01

The present paper describes various classification techniques, such as cluster analysis and principal component (PC)/factor analysis, to classify different types of base stocks. The API classification of base oils (Groups I-III) has been compared with a more detailed classification based on NMR-derived chemical compositional and molecular structural parameters, in order to point out the similarities of the base oils in the same group and the differences between the oils placed in different groups. The detailed compositional parameters have been generated using 1H and 13C nuclear magnetic resonance (NMR) spectroscopic methods. Further, oxidation stability, measured in terms of rotating bomb oxidation test (RBOT) life, of non-conventional base stocks and their blends with conventional base stocks, has been quantitatively correlated with their 1H NMR and elemental (sulphur and nitrogen) data with the help of multiple linear regression (MLR) and artificial neural network (ANN) techniques. The MLR-based model developed using NMR and elemental data showed a high correlation between the 'measured' and 'estimated' RBOT values for both training (R=0.859) and validation (R=0.880) data sets. The ANN-based model, developed using a smaller number of input variables (only 1H NMR data), also showed a high correlation between the 'measured' and 'estimated' RBOT values for the training (R=0.881), validation (R=0.860) and test (R=0.955) data sets.
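The MLR step can be illustrated with a minimal sketch; the predictor matrix below is a synthetic stand-in for the 1H NMR and elemental (S, N) variables, since the paper's actual data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: columns might be an aromatic-H fraction from 1H NMR,
# sulphur and nitrogen content; response is RBOT life (minutes).
X = rng.uniform(0, 1, size=(40, 3))
rbot = 200 + 150 * X[:, 0] - 80 * X[:, 1] + 40 * X[:, 2] + rng.normal(0, 15, 40)

# Multiple linear regression via least squares on [1 | X].
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, rbot, rcond=None)
estimated = A @ coef

# Pearson correlation between 'measured' and 'estimated' RBOT values,
# the R statistic the abstract reports for its training and validation sets.
r = np.corrcoef(rbot, estimated)[0, 1]
print(f"training R = {r:.3f}")
```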

  3. Suprafascial versus traditional harvesting technique for free anterolateral thigh flap: A case-control study to assess the best functional and aesthetic result in extremity reconstruction.

    Science.gov (United States)

    Maruccia, Michele; Fallico, Nefer; Cigna, Emanuele; Ciudad, Pedro; Nicoli, Fabio; Trignano, Emilio; Nacchiero, Eleonora; Giudice, Giuseppe; Ribuffo, Diego; Chen, Hung-Chi

    2017-11-01

Clinical applications of the ALT flap have currently extended to extremity (hand and foot) as well as oral cavity reconstruction. In these anatomical areas, the traditional harvesting technique presents a few disadvantages, such as bulkiness of the recipient site and potential donor site morbidity, including damage to the deep fascia and skin graft adhesions. The purpose of the present study was to compare the functional and aesthetic outcomes of upper and lower extremity reconstruction with either suprafascially or subfascially harvested anterolateral thigh (ALT) flaps. Sixty patients who underwent hand or foot reconstruction with an ALT flap between January 2013 and January 2015 were included in the study (34 flaps elevated on a subfascial plane and 26 on a suprafascial plane). Group 1 (subfascially harvested ALT flap) was composed of 23 male and 11 female patients with an average age of 53.4 years (range, 36-72 years). Group 2 (suprafascially harvested ALT flap) was composed of 18 male and 8 female patients with an average age of 48.7 years (range, 32-69 years). Surgical indication was tumor resection for 20 patients in group 1 and 16 patients in group 2, chronic ulcer for 8 patients in group 1 and 6 patients in group 2, and trauma for 6 patients in group 1 and 4 patients in group 2. Complications were documented. Aesthetic outcomes were considered in terms of bulkiness of the recipient site, subsequent requests for a debulking procedure, and donor site morbidity. Donor site scars were evaluated for cosmesis using a modified Hollander Wound Evaluation Scale (HWES). Skin graft outcomes were assessed according to the modified Vancouver Scar Scale (VSS). Functional outcome at the recipient site was measured using the Enneking functional outcome score (ESS). Total range of motion (ROM) was recorded. All flaps were successfully elevated with at least one viable perforator with both approaches. The survival rates of suprafascially and subfascially harvested ALT flaps were 96.2 and 97

  4. Assessing the hydrogeochemical processes affecting groundwater pollution in arid areas using an integration of geochemical equilibrium and multivariate statistical techniques.

    Science.gov (United States)

    El Alfy, Mohamed; Lashin, Aref; Abdalla, Fathy; Al-Bassam, Abdulaziz

    2017-10-01

Rapid economic expansion poses serious problems for groundwater resources in arid areas, which typically have high rates of groundwater depletion. In this study, an integration of hydrochemical investigations involving chemical and statistical analyses is conducted to assess the factors controlling hydrochemistry and potential pollution in an arid region. Fifty-four groundwater samples were collected from the Dhurma aquifer in Saudi Arabia, and twenty-one physicochemical variables were examined for each sample. Spatial patterns of salinity and nitrate were mapped using fitted variograms. The nitrate spatial distribution shows that nitrate pollution is a persistent problem affecting a wide area of the aquifer. The hydrochemical investigations and cluster analysis reveal four significant clusters of groundwater zones. Five main factors were extracted, which explain >77% of the total data variance. These factors indicated that the chemical characteristics of the groundwater were influenced by rock-water interactions and anthropogenic factors. The identified clusters and factors were validated with hydrochemical investigations. The geogenic factors include the dissolution of various minerals (calcite, aragonite, gypsum, anhydrite, halite and fluorite) and ion exchange processes. The anthropogenic factors include the impact of irrigation return flows and the application of potassium, nitrate, and phosphate fertilizers. Over time, these anthropogenic factors will most likely contribute to further declines in groundwater quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
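The factor-extraction step can be sketched with standardized PCA; the data matrix below is random noise standing in for the 54-sample by 21-variable hydrochemical table, so it will not reproduce the >77% cumulative variance reported for the real data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder for the 54 samples x 21 physicochemical variables.
rng = np.random.default_rng(2)
data = rng.normal(size=(54, 21))

# Standardize, then extract five factors, mirroring the study's workflow.
z = StandardScaler().fit_transform(data)
pca = PCA(n_components=5).fit(z)
print("variance explained per factor:", np.round(pca.explained_variance_ratio_, 3))
print("cumulative:", round(float(pca.explained_variance_ratio_.sum()), 3))
```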

  5. Paediatric lower limb deformity correction using the Ilizarov technique: a statistical analysis of factors affecting the complication rate.

    Science.gov (United States)

    Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M

    2014-01-01

Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction. In the literature, information is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. Several patient factors that may complicate the treatment or end result were analysed using a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of the treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre of lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor of the occurrence of complications. Concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and the risk of complications. This study shows that only proportionate LLD is a predictor of the risk of complications. The additional value of this analysis is the production of a simple graph. Constructing this graph using data from a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
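A sketch of the underlying model: with proportionate LLD as the single predictor, a fitted logistic regression directly yields the risk-versus-LLD curve behind the graph described above. The data here are simulated under an assumed dose-response relationship, since the original 37-patient data are not public.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Simulated patients: proportionate LLD (%) and complication outcome,
# drawn from an assumed dose-response relationship.
lld = rng.uniform(4, 42, 200)
p_true = 1.0 / (1.0 + np.exp(-0.15 * (lld - 15.0)))
complication = rng.binomial(1, p_true)

model = LogisticRegression().fit(lld.reshape(-1, 1), complication)

# Tabulate predicted risk over the observed LLD range -- the content of
# the simple "risk vs. proportionate LLD" graph.
for x in np.linspace(4, 42, 5):
    p = model.predict_proba([[x]])[0, 1]
    print(f"LLD {x:4.1f}% -> predicted complication risk {p:.2f}")
```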

  6. New scanning technique using Adaptive Statistical lterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT

    International Nuclear Information System (INIS)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-01-01

The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450–600) vs. 650 mA (500–711.25) (median (interquartile range)), respectively, P<0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84–6.02) vs. 5.84 mSv (3.88–8.39) (median (interquartile range)), respectively, P<0.001. Although ASIR was associated with increased image noise compared with FBP (39.93±10.22 vs. 37.63±18.79 (mean ± standard deviation), respectively, P<0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality.

  7. New scanning technique using Adaptive Statistical Iterative Reconstruction (ASIR) significantly reduced the radiation dose of cardiac CT.

    Science.gov (United States)

    Tumur, Odgerel; Soon, Kean; Brown, Fraser; Mykytowycz, Marcus

    2013-06-01

The aims of our study were to evaluate the effect of application of the Adaptive Statistical Iterative Reconstruction (ASIR) algorithm on the radiation dose of coronary computed tomography angiography (CCTA) and its effects on image quality of CCTA, and to evaluate the effects of various patient and CT scanning factors on the radiation dose of CCTA. This was a retrospective study that included 347 consecutive patients who underwent CCTA at a tertiary university teaching hospital between 1 July 2009 and 20 September 2011. Analysis was performed comparing patient demographics, scan characteristics, radiation dose and image quality in two groups of patients in whom conventional Filtered Back Projection (FBP) or ASIR was used for image reconstruction. There were 238 patients in the FBP group and 109 patients in the ASIR group. There was no difference between the groups in the use of prospective gating, scan length or tube voltage. In the ASIR group, significantly lower tube current was used compared with the FBP group, 550 mA (450-600) vs. 650 mA (500-711.25) (median (interquartile range)), respectively, P < 0.001. There was a 27% effective radiation dose reduction in the ASIR group compared with the FBP group, 4.29 mSv (2.84-6.02) vs. 5.84 mSv (3.88-8.39) (median (interquartile range)), respectively, P < 0.001. Although ASIR was associated with increased image noise compared with FBP (39.93 ± 10.22 vs. 37.63 ± 18.79 (mean ± standard deviation), respectively, P < 0.001), it did not affect the signal intensity, signal-to-noise ratio, contrast-to-noise ratio or the diagnostic quality of CCTA. Application of ASIR reduces the radiation dose of CCTA without affecting the image quality. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.

  8. Reducing abdominal CT radiation dose with the adaptive statistical iterative reconstruction technique in children: a feasibility study

    Energy Technology Data Exchange (ETDEWEB)

Vorona, Gregory A. [The Children's Hospital of Pittsburgh of UPMC, Department of Radiology, Pittsburgh, PA (United States); Allegheny General Hospital, Department of Radiology, Pittsburgh, PA (United States); Ceschin, Rafael C.; Clayton, Barbara L.; Sutcavage, Tom; Tadros, Sameh S.; Panigrahy, Ashok [The Children's Hospital of Pittsburgh of UPMC, Department of Radiology, Pittsburgh, PA (United States)

    2011-09-15

The use of the adaptive statistical iterative reconstruction (ASIR) algorithm has been shown to reduce radiation doses in adults undergoing abdominal CT studies while preserving image quality. To our knowledge, no studies have been done to validate the use of ASIR in children. To retrospectively evaluate differences in radiation dose and image quality in pediatric abdominal CT studies utilizing 40% ASIR compared with filtered back projection (FBP). Eleven patients (mean age 8.5 years, range 2-17 years) had separate 40% ASIR and FBP enhanced abdominal CT studies on different days between July 2009 and October 2010. The ASIR studies utilized a 38% mA reduction in addition to our pediatric protocol mAs. Study volume CT dose indexes (CTDIvol) and dose-length products (DLP) were recorded. A consistent representative image was obtained from each study. The images were independently evaluated by two radiologists in a blinded manner for diagnostic utility, image sharpness and image noise. The average CTDIvol and DLP for the 40% ASIR studies were 4.25 mGy and 185.04 mGy-cm, compared with 6.75 mGy and 275.79 mGy-cm for the FBP studies, representing 37% and 33% reductions, respectively. The radiologists' assessments of subjective image quality did not demonstrate any significant differences between the ASIR and FBP images. In our experience, the use of 40% ASIR with a 38% decrease in mA lowers the radiation dose for children undergoing enhanced abdominal examinations by an average of 33%, while maintaining diagnostically acceptable images. (orig.)

  9. Reducing abdominal CT radiation dose with the adaptive statistical iterative reconstruction technique in children: a feasibility study

    International Nuclear Information System (INIS)

    Vorona, Gregory A.; Ceschin, Rafael C.; Clayton, Barbara L.; Sutcavage, Tom; Tadros, Sameh S.; Panigrahy, Ashok

    2011-01-01

The use of the adaptive statistical iterative reconstruction (ASIR) algorithm has been shown to reduce radiation doses in adults undergoing abdominal CT studies while preserving image quality. To our knowledge, no studies have been done to validate the use of ASIR in children. To retrospectively evaluate differences in radiation dose and image quality in pediatric abdominal CT studies utilizing 40% ASIR compared with filtered back projection (FBP). Eleven patients (mean age 8.5 years, range 2-17 years) had separate 40% ASIR and FBP enhanced abdominal CT studies on different days between July 2009 and October 2010. The ASIR studies utilized a 38% mA reduction in addition to our pediatric protocol mAs. Study volume CT dose indexes (CTDIvol) and dose-length products (DLP) were recorded. A consistent representative image was obtained from each study. The images were independently evaluated by two radiologists in a blinded manner for diagnostic utility, image sharpness and image noise. The average CTDIvol and DLP for the 40% ASIR studies were 4.25 mGy and 185.04 mGy-cm, compared with 6.75 mGy and 275.79 mGy-cm for the FBP studies, representing 37% and 33% reductions, respectively. The radiologists' assessments of subjective image quality did not demonstrate any significant differences between the ASIR and FBP images. In our experience, the use of 40% ASIR with a 38% decrease in mA lowers the radiation dose for children undergoing enhanced abdominal examinations by an average of 33%, while maintaining diagnostically acceptable images. (orig.)
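The reported reductions follow directly from the quoted averages; a two-line check:

```python
# Percent reductions implied by the reported averages (40% ASIR vs. FBP).
ctdi_fbp, ctdi_asir = 6.75, 4.25      # CTDIvol, mGy
dlp_fbp, dlp_asir = 275.79, 185.04    # DLP, mGy-cm

print(f"CTDIvol reduction: {100 * (ctdi_fbp - ctdi_asir) / ctdi_fbp:.0f}%")  # ~37%
print(f"DLP reduction:     {100 * (dlp_fbp - dlp_asir) / dlp_fbp:.0f}%")     # ~33%
```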

  10. Statistical optimization of cell disruption techniques for releasing intracellular X-prolyl dipeptidyl aminopeptidase from Lactococcus lactis spp. lactis.

    Science.gov (United States)

    Üstün-Aytekin, Özlem; Arısoy, Sevda; Aytekin, Ali Özhan; Yıldız, Ece

    2016-03-01

X-prolyl dipeptidyl aminopeptidase (PepX) is an intracellular enzyme from the Gram-positive bacterium Lactococcus lactis spp. lactis NRRL B-1821, and it has commercial importance. The objective of this study was to compare the effects of several cell disruption methods on the activity of PepX. Statistical optimization was performed for two cavitation methods, hydrodynamic (high-pressure homogenization) and acoustic (sonication), to determine the more appropriate disruption method. A two-level factorial design (2FI), with the parameters number of cycles and pressure, and a Box-Behnken design (BBD), with the parameters cycles, sonication time, and power, were used for the optimization of the high-pressure homogenization and sonication methods, respectively. In addition, disruption methods consisting of lysozyme, bead milling, heat treatment, freeze-thawing, liquid nitrogen, ethylenediaminetetraacetic acid (EDTA), Triton-X, sodium dodecyl sulfate (SDS), chloroform, and antibiotics were performed and compared with the high-pressure homogenization and sonication methods. The optimum for high-pressure homogenization was one cycle at 130 MPa, providing an activity of 114.47 mU ml^-1, while sonication afforded an activity of 145.09 mU ml^-1 at 28 min with 91% power and three cycles. In conclusion, sonication was the more effective disruption method, and its optimal operation parameters were established for the release of an intracellular enzyme from a L. lactis spp. lactis strain, which is a Gram-positive bacterium. Copyright © 2015 Elsevier B.V. All rights reserved.
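For readers unfamiliar with the designs named above, a coded Box-Behnken design for the three sonication factors (cycles, time, power) can be generated by hand; the construction below is generic, and mapping the coded -1/0/+1 levels to physical units is left to the experimenter.

```python
import itertools
import numpy as np

def box_behnken(n_factors=3, n_center=3):
    """Coded Box-Behnken design: +/-1 combinations for each pair of factors,
    all remaining factors held at 0, plus replicated centre points."""
    runs = []
    for i, j in itertools.combinations(range(n_factors), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * n_factors] * n_center
    return np.array(runs)

design = box_behnken(3)        # factors: cycles, sonication time, power
print(design.shape)            # (15, 3): 12 edge runs + 3 centre points
print(design)
```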

  11. MODEL APPLICATION MULTIVARIATE ANALYSIS OF STATISTICAL TECHNIQUES PCA AND HCA ASSESSMENT QUESTIONNAIRE ON CUSTOMER SATISFACTION: CASE STUDY IN A METALLURGICAL COMPANY OF METAL CONTAINERS

    Directory of Open Access Journals (Sweden)

    Cláudio Roberto Rosário

    2012-07-01

Full Text Available The purpose of this research is to improve the practice of customer satisfaction analysis. The article presents a model for analyzing the answers of a customer satisfaction evaluation in a systematic way with the aid of multivariate statistical techniques, specifically exploratory analysis with PCA (Principal Component Analysis) combined with HCA (Hierarchical Cluster Analysis). The applicability of the model was evaluated as a tool to assist the company in identifying the value chain perceived by the customer when the customer satisfaction questionnaire is applied. With the assistance of multivariate statistical analysis, similar behavior was observed among groups of customers. The analysis also allowed the company to review the questions of its questionnaires, using the degree of correlation between questions, which was not the company's practice before this research.
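A minimal sketch of the PCA-plus-HCA pipeline, assuming a hypothetical questionnaire matrix (customers by items on a 1-5 scale) rather than the company's actual survey data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical answers: 30 customers x 8 satisfaction items (1-5 scale).
rng = np.random.default_rng(4)
answers = rng.integers(1, 6, size=(30, 8)).astype(float)

# PCA exposes the main axes of variation among the questions ...
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(answers))

# ... and hierarchical (Ward) clustering groups customers with similar behavior.
tree = linkage(scores, method="ward")
groups = fcluster(tree, t=3, criterion="maxclust")
print("cluster label per customer:", groups)
```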

  12. Comparison of NDA and DA measurement techniques for excess plutonium powders at the Hanford Site: Statistical design and heterogeneity testing

    International Nuclear Information System (INIS)

    Welsh, T.L.; McRae, L.P.; Delegard, C.H.; Liebetrau, A.M.; Johnson, W.C.; Theis, W.; Lemaire, R.J.; Xiao, J.

    1995-06-01

Quantitative physical measurements are an essential component of the International Atomic Energy Agency (IAEA) nuclear material safeguards verification regime. In December 1994, IAEA safeguards were initiated on an inventory of excess plutonium powder items at the Plutonium Finishing Plant, Vault 3, on the US Department of Energy's Hanford Site. The material originated from the US nuclear weapons complex. The diversity of the chemical form and the heterogeneous physical form of this inventory were anticipated to challenge the precision and accuracy of quantitative destructive analytical techniques. A sampling design was used to estimate the degree of heterogeneity of the plutonium content of a variety of inventory items. Plutonium concentration, the item net weight, and the 240Pu content were among the variables considered in the design. Samples were obtained from randomly selected locations within each item. Each sample was divided into aliquots and analyzed chemically. Operator measurements by calorimetry and IAEA measurements by coincident neutron nondestructive analysis were also performed for the initial physical inventory verification materials and for similar items not yet under IAEA safeguards. The heterogeneity testing has confirmed that part of the material is indeed significantly heterogeneous; this means that precautionary measures must be taken to obtain representative samples for destructive analysis. In addition, the sampling variability due to material heterogeneity was found to be comparable with, or greater than, the variability of the operator's calorimetric measurements.

  13. A scanning electron microscope study and statistical analysis of adipocyte morphology in lipofilling: comparing the effects of harvesting and purification procedures with 2 different techniques.

    Science.gov (United States)

    Rubino, Corrado; Mazzarello, Vittorio; Faenza, Mario; Montella, Andrea; Santanelli, Fabio; Farace, Francesco

    2015-06-01

The aim of this study was to evaluate the effects on adipocyte morphology of 2 techniques of fat harvesting and of fat purification in lipofilling, considering that the number of viable healthy adipocytes is important for fat survival in recipient areas of lipofilling. Fat harvesting was performed in 10 female patients from the flanks, on one side with a 2-mm Coleman cannula and on the other side with a 3-mm Mercedes cannula. Thirty milliliters of fat tissue from each side was collected and divided into three 10-mL syringes: A, B, and C. The fat inside syringe A was left untreated, the fat in syringe B underwent simple sedimentation, and the fat inside syringe C underwent centrifugation at 3000 rpm for 3 minutes. Each fat graft specimen was processed for examination under a low-vacuum scanning electron microscope. The diameter (μm) and number of adipocytes per square millimeter and the number of altered adipocytes per square millimeter were evaluated. Untreated specimens harvested with the 2 different techniques were compared first; then sedimented versus centrifuged specimens harvested with the same technique were compared. Statistical analysis was performed using the Wilcoxon signed rank test. The number of adipocytes per square millimeter was statistically higher in specimens harvested with the 3-mm Mercedes cannula (P = 0.0310). The number of altered cells was statistically higher in centrifuged specimens than in sedimented ones using both methods of fat harvesting: P = 0.0080 with the 2-mm Coleman cannula and P = 0.0050 with the 3-mm Mercedes cannula. Alterations in adipocyte morphology consisted of wrinkling of the membrane, opening of pores with leakage of oily material, reduction of cellular diameter, and total collapse of the cellular membrane. Fat harvesting with a 3-mm cannula results in a higher number of adipocytes, and centrifugation of the harvested fat results in a higher number of morphologically altered cells than sedimentation.
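Because each patient contributed one flank per cannula, the comparison is paired, which is why the Wilcoxon signed rank test is the appropriate choice; a sketch with made-up paired counts:

```python
from scipy.stats import wilcoxon

# Hypothetical paired adipocyte counts (cells/mm^2) for the same 10 patients:
# one flank harvested with the 2-mm Coleman cannula, the other with the
# 3-mm Mercedes cannula, matching the paired design of the study.
coleman = [410, 385, 402, 398, 420, 377, 390, 405, 388, 395]
mercedes = [455, 430, 448, 441, 470, 415, 436, 452, 429, 444]

stat, p = wilcoxon(coleman, mercedes)
print(f"Wilcoxon signed rank: W = {stat}, p = {p:.4f}")
```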

  14. Evolution of Cognitive Rehabilitation After Stroke From Traditional Techniques to Smart and Personalized Home-Based Information and Communication Technology Systems: Literature Review.

    Science.gov (United States)

    Cogollor, José M; Rojo-Lacal, Javier; Hermsdörfer, Joachim; Ferre, Manuel; Arredondo Waldmeyer, Maria Teresa; Giachritsis, Christos; Armstrong, Alan; Breñosa Martinez, Jose Manuel; Bautista Loza, Doris Anabelle; Sebastián, José María

    2018-03-26

Neurological patients after stroke usually present cognitive deficits that cause dependencies in their daily living. These deficits mainly affect the performance of some of their daily activities. For that reason, stroke patients need long-term processes for their cognitive rehabilitation. Considering that classical techniques are focused on acting as guides and are dependent on help from therapists, significant efforts are being made to improve current methodologies and to use eHealth and Web-based architectures to implement information and communication technology (ICT) systems that achieve reliable, personalized, and home-based platforms to increase efficiency and level of attractiveness for patients and carers. The goal of this work was to provide an overview of the practices implemented for the assessment of stroke patients and cognitive rehabilitation. This study puts together traditional methods and the most recent personalized platforms based on ICT technologies and the Internet of Things. A literature review has been distributed to a multidisciplinary team of researchers from the engineering, psychology, and sport science fields. The systematic review has been focused on published scientific research, other European projects, and the most current innovative large-scale initiatives in the area. A total of 3469 results retrieved from Web of Science, 284 studies from the Journal of Medical Internet Research, and 15 European research projects from the Community Research and Development Information Service from the last 15 years were reviewed for classification and selection regarding their relevance. A total of 7 relevant studies on the screening of stroke patients have been presented, with 6 additional methods for the analysis of kinematics and 9 studies on the execution of goal-oriented activities. Meanwhile, the classical methods to provide cognitive rehabilitation have been classified into the 5 main techniques implemented. Finally, the review has been finalized with

  15. Impact of the adaptive statistical iterative reconstruction technique on image quality in ultra-low-dose CT

    International Nuclear Information System (INIS)

    Xu, Yan; He, Wen; Chen, Hui; Hu, Zhihai; Li, Juan; Zhang, Tingting

    2013-01-01

Aim: To evaluate the relationship between different noise indices (NIs) and radiation dose, and to compare the effect of different reconstruction algorithms applied to ultra-low-dose chest computed tomography (CT) on image quality improvement and the accuracy of volumetric measurement of ground-glass opacity (GGO) nodules, using a phantom study. Materials and methods: An 11-cm-thick transverse phantom section with a chest wall, mediastinum, and 14 artificial GGO nodules with known volumes (919.93 ± 64.05 mm^3) was constructed. The phantom was scanned on a Discovery CT750 HD scanner with five different NIs (NI = 20, 30, 40, 50, and 60). All data were reconstructed with a 0.625 mm section thickness using the filtered back-projection (FBP), 50% adaptive statistical iterative reconstruction (ASiR), and Veo model-based iterative reconstruction algorithms. Image noise was measured in six regions of interest (ROIs). Nodule volumes were measured using a commercial volumetric software package. The image quality and the volume measurement errors were analysed. Results: Image noise increased dramatically from 30.7 HU at NI 20 to 122.4 HU at NI 60 with FBP reconstruction. Conversely, Veo reconstruction effectively controlled the noise increase, from 9.97 HU at NI 20 to only 15.1 HU at NI 60. Image noise at NI 60 with Veo was even lower (by 50.8%) than that at NI 20 with FBP. The contrast-to-noise ratio (CNR) of Veo at NI 40 was similar to that of FBP at NI 20. All artificial GGO nodules were successfully identified and measured, with an average relative volume measurement error with Veo at NI 60 of 4.24%, comparable to a value of 10.41% with FBP at NI 20. At NI 60, the radiation dose was only one-tenth that at NI 20. Conclusion: The Veo reconstruction algorithm very effectively reduced image noise compared with the conventional FBP reconstruction. Using ultra-low-dose CT scanning and Veo reconstruction, GGOs can be detected and quantified with an acceptable

  16. Assessment of Groundwater Quality of Udayagiri area, Nellore District, Andhra Pradesh, South India Using Multivariate Statistical Techniques

    Directory of Open Access Journals (Sweden)

    Arveti Nagaraju

    2016-10-01

Full Text Available Hydrogeochemical studies were carried out in and around the Udayagiri area of Andhra Pradesh in order to assess the chemistry of the groundwater and to identify the dominant hydrogeochemical processes and mechanisms responsible for the evolution of the chemical composition of the groundwater. Descriptive statistics, correlation matrices and principal component analysis (PCA), together with cluster analysis (CA), were used to gain an understanding of the hydrogeochemical processes in the study area. PCA identified 4 main processes influencing the groundwater chemistry, viz. mineral precipitation and dissolution, seawater intrusion, cation exchange, and carbonate balance. Further, three clusters, C1, C2 and C3, were obtained. Samples from C1 contain high levels of Cl−, which may be due to intensive evaporation and contamination from landfill leachate. Most of the samples from C2 are located closer to the sea, and the high level of Na+ + K+ in these samples may be attributed to seawater intrusion. The geochemistry of water samples in C3 is more likely to originate from rock weathering. This has been supported by the Gibbs diagram. The groundwater geochemistry in the study area is mostly of natural origin, but is influenced to some degree by human activity.

  17. Quality characterization and pollution source identification of surface water using multivariate statistical techniques, Nalagarh Valley, Himachal Pradesh, India

    Science.gov (United States)

    Herojeet, Rajkumar; Rishi, Madhuri S.; Lata, Renu; Dolma, Konchok

    2017-09-01

... multivariate techniques for reliable quality characterization of surface water, to develop effective pollution reduction strategies and maintain a fine balance between industrialization and ecological integrity.

  18. Assessment of statistical agreement of three techniques for the study of cut marks: 3D digital microscope, laser scanning confocal microscopy and micro-photogrammetry.

    Science.gov (United States)

    Maté-González, Miguel Ángel; Aramendi, Julia; Yravedra, José; Blasco, Ruth; Rosell, Jordi; González-Aguilera, Diego; Domínguez-Rodrigo, Manuel

    2017-09-01

    In the last few years, the study of cut marks on bone surfaces has become fundamental for the interpretation of prehistoric butchery practices. Due to the difficulties in the correct identification of cut marks, many criteria for their description and classification have been suggested. Different techniques, such as three-dimensional digital microscope (3D DM), laser scanning confocal microscopy (LSCM) and micro-photogrammetry (M-PG) have been recently applied to the study of cut marks. Although the 3D DM and LSCM microscopic techniques are the most commonly used for the 3D identification of cut marks, M-PG has also proved to be very efficient and a low-cost method. M-PG is a noninvasive technique that allows the study of the cortical surface without any previous preparation of the samples, and that generates high-resolution models. Despite the current application of microscopic and micro-photogrammetric techniques to taphonomy, their reliability has never been tested. In this paper, we compare 3D DM, LSCM and M-PG in order to assess their resolution and results. In this study, we analyse 26 experimental cut marks generated with a metal knife. The quantitative and qualitative information registered is analysed by means of standard multivariate statistics and geometric morphometrics to assess the similarities and differences obtained with the different methodologies. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  19. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  20. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    Science.gov (United States)

    Velasco-Tapia, Fernando

    2014-01-01

Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994

  1. Multivariate Analysis, Mass Balance Techniques, and Statistical Tests as Tools in Igneous Petrology: Application to the Sierra de las Cruces Volcanic Range (Mexican Volcanic Belt)

    Directory of Open Access Journals (Sweden)

    Fernando Velasco-Tapia

    2014-01-01

Full Text Available Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
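The binary mass-balance step reduces to estimating one mixing fraction by least squares; the end-member and mixed compositions below are hypothetical, not the SC whole-rock analyses.

```python
import numpy as np

# Hypothetical compositions (wt% SiO2, MgO, CaO) of the two end-members
# and of one comingled lava.
dacite = np.array([66.0, 1.5, 3.5])
andesite = np.array([58.0, 4.0, 6.5])
mixed = np.array([63.2, 2.4, 4.6])

# Solve mixed = f*dacite + (1 - f)*andesite in the least-squares sense, i.e.
# mixed - andesite = f * (dacite - andesite).
d = dacite - andesite
f = float(d @ (mixed - andesite) / (d @ d))
print(f"estimated dacitic fraction: f = {f:.2f}")
```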

  2. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    Directory of Open Access Journals (Sweden)

    S. Ars

    2017-12-01

    Full Text Available This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping

  3. Statistical atmospheric inversion of local gas emissions by coupling the tracer release technique and local-scale transport modelling: a test case with controlled methane emissions

    Science.gov (United States)

    Ars, Sébastien; Broquet, Grégoire; Yver Kwok, Camille; Roustan, Yelva; Wu, Lin; Arzoumanian, Emmanuel; Bousquet, Philippe

    2017-12-01

    This study presents a new concept for estimating the pollutant emission rates of a site and its main facilities using a series of atmospheric measurements across the pollutant plumes. This concept combines the tracer release method, local-scale atmospheric transport modelling and a statistical atmospheric inversion approach. The conversion between the controlled emission and the measured atmospheric concentrations of the released tracer across the plume places valuable constraints on the atmospheric transport. This is used to optimise the configuration of the transport model parameters and the model uncertainty statistics in the inversion system. The emission rates of all sources are then inverted to optimise the match between the concentrations simulated with the transport model and the pollutants' measured atmospheric concentrations, accounting for the transport model uncertainty. In principle, by using atmospheric transport modelling, this concept does not strongly rely on the good colocation between the tracer and pollutant sources and can be used to monitor multiple sources within a single site, unlike the classical tracer release technique. The statistical inversion framework and the use of the tracer data for the configuration of the transport and inversion modelling systems should ensure that the transport modelling errors are correctly handled in the source estimation. The potential of this new concept is evaluated with a relatively simple practical implementation based on a Gaussian plume model and a series of inversions of controlled methane point sources using acetylene as a tracer gas. The experimental conditions are chosen so that they are suitable for the use of a Gaussian plume model to simulate the atmospheric transport. In these experiments, different configurations of methane and acetylene point source locations are tested to assess the efficiency of the method in comparison to the classic tracer release technique in coping with the distances
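A toy version of the two-step idea described in this record: a Gaussian plume forward model, a controlled tracer release to estimate a transport bias factor, then a least-squares inversion of the methane emission rate. Wind speed, dispersion power laws and all measurements below are assumptions for illustration only.

```python
import numpy as np

def gaussian_plume(q, x, y, z, u=3.0, h=2.0):
    """Concentration at (x, y, z) m for emission rate q (kg/s), wind speed u (m/s)
    and release height h (m); the sigma power laws are simple stand-ins for a
    stability-class parameterization."""
    sy, sz = 0.08 * x**0.9, 0.06 * x**0.9
    return (q / (2 * np.pi * u * sy * sz) * np.exp(-y**2 / (2 * sy**2))
            * (np.exp(-(z - h)**2 / (2 * sz**2)) + np.exp(-(z + h)**2 / (2 * sz**2))))

y = np.linspace(-40, 40, 9)     # crosswind transect 200 m downwind, 1.5 m height
rng = np.random.default_rng(5)

# 1) Tracer step: acetylene released at a known, controlled rate constrains
#    the transport model (alpha ~ 1 means transport is unbiased).
q_tracer = 0.002                # kg/s
sim = gaussian_plume(q_tracer, 200.0, y, 1.5)
obs = sim * (1 + rng.normal(0, 0.05, y.size))        # pretend measurements
alpha = float(sim @ obs / (sim @ sim))

# 2) Inversion step: scale a unit-rate simulated methane plume to the
#    (synthetic) methane observations, using the tracer-calibrated transport.
obs_ch4 = gaussian_plume(0.005, 200.0, y, 1.5) * (1 + rng.normal(0, 0.05, y.size))
sim_unit = alpha * gaussian_plume(1.0, 200.0, y, 1.5)
q_ch4 = float(sim_unit @ obs_ch4 / (sim_unit @ sim_unit))
print(f"inverted methane rate: {q_ch4:.4f} kg/s (true value 0.0050)")
```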

  4. COMPARISON OF THE TRADITIONAL CHALK AND BOARD LECTURE SYSTEM VERSUS POWER POINT PRESENTATION AS A TEACHING TECHNIQUE FOR TEACHING GROSS ANATOMY TO THE FIRST PROFESSIONAL MEDICAL STUDENTS

    OpenAIRE

    Nusrat; Abdul

    2015-01-01

Traditionally and conventionally, gross anatomy is taught by lectures and cadaveric dissection, and in India the lectures are delivered with the chalk and board (C&B), or chalk and talk, method. But there is always a debate over the most effective method of lecture delivery. AIM: The aim of this study was to compare the role and effecti...

  5. Interactive statistics with ILLMO

    NARCIS (Netherlands)

    Martens, J.B.O.S.

    2014-01-01

    Progress in empirical research relies on adequate statistical analysis and reporting. This article proposes an alternative approach to statistical modeling that is based on an old but mostly forgotten idea, namely Thurstone modeling. Traditional statistical methods assume that either the measured

  6. Statistical mechanics

    CERN Document Server

    Davidson, Norman

    2003-01-01

    Clear and readable, this fine text assists students in achieving a grasp of the techniques and limitations of statistical mechanics. The treatment follows a logical progression from elementary to advanced theories, with careful attention to detail and mathematical development, and is sufficiently rigorous for introductory or intermediate graduate courses.Beginning with a study of the statistical mechanics of ideal gases and other systems of non-interacting particles, the text develops the theory in detail and applies it to the study of chemical equilibrium and the calculation of the thermody

  7. Keeping Tradition

    NARCIS (Netherlands)

    Zenhong, C.; Buwalda, P.L.

    2011-01-01

    Chinese dumplings such as Jiao Zi and Bao Zi are two of the popular traditional foods in Asia. They are usually made from wheat flour dough (rice flour or starch is sometimes used) that contains fillings. They can be steamed, boiled and fried and are consumed either as a main meal or dessert. As

  8. [Statistics and analysis on acupuncture and moxibustion projects of the National Natural Science Foundation of China of traditional Chinese medicine universities and colleges in recent 10 years: taking the General Program and National Science Fund for Young Scholars as examples].

    Science.gov (United States)

    Li, Qingling; Ma, Qiang; Li, Dan; Liu, Nana; Yang, Jiahui; Sun, Chun; Cheng, Cheng; Jia, Xuezhao; Wang, Jing; Zeng, Yonglei

    2018-03-12

To statistically analyze the projects funded by the National Natural Science Foundation of China (NSFC) from 2007 to 2016 in the field of acupuncture and moxibustion at the national universities and colleges of traditional Chinese medicine, taking the General Program (GP) and the National Science Fund for Young Scholars (NSFYS) as examples. In view of the aspects of funding, supporting units, key words, therapeutic method, disorder and signal pathway, the differences between GP and NSFYS were compared, and the following characteristics were summarized. ① Funding increased from 2007 through 2013 and decreased from 2013 through 2016; over the past ten years it fluctuated but showed a generally increasing tendency. ② In some TCM universities, projects in the same research direction were approved continuously for more than 3 years, and research continuity was a hot topic. ③ Regarding therapeutic methods, acupuncture was the chief therapy; electroacupuncture, moxibustion and acupoints were involved as well. ④ The disorders most often studied were cerebral ischemia, myocardial ischemia and reperfusion injury, suggesting that ischemic disorders predominate in the research. ⑤ Signal pathways occupied the main research index system, including cell proliferation, metabolism, immunity, apoptosis and autophagy; research on other aspects was less common.

  9. Subgingival microbiome in smokers and non-smokers in periodontitis: an exploratory study using traditional targeted techniques and a next-generation sequencing

    NARCIS (Netherlands)

    Bizzarro, S.; Loos, B.G.; Laine, M.L.; Crielaard, W.; Zaura, E.

    2013-01-01

    Aim To compare the results of two targeted techniques to an open-ended technique in periodontitis patients, differentiated on the basis of smoking habit. Materials & Methods Thirty periodontitis patients (15 smokers and 15 non-smokers) provided subgingival plaque samples for 16S rRNA gene amplicon

  10. Workshop statistics discovery with data and Minitab

    CERN Document Server

    Rossman, Allan J

    1998-01-01

Shorn of all subtlety and led naked out of the protective fold of educational research literature, there comes a sheepish little fact: lectures don't work nearly as well as many of us would like to think. -George Cobb (1992) This book contains activities that guide students to discover statistical concepts, explore statistical principles, and apply statistical techniques. Students work toward these goals through the analysis of genuine data and through interaction with one another, with their instructor, and with technology. Providing a one-semester introduction to fundamental ideas of statistics for college and advanced high school students, Workshop Statistics is designed for courses that employ an interactive learning environment by replacing lectures with hands-on activities. The text contains enough expository material to stand alone, but it can also be used to supplement a more traditional textbook. Some distinguishing features of Workshop Statistics are its emphases on active learning, conceptu...

  11. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    Science.gov (United States)

    Zack, J. W.

    2015-12-01

Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There is an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
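The common core of all six methods is the same supervised set-up: raw NWP predictors in, observed generation out. Below is a sketch comparing the baseline (screening multiple linear regression) with one of the machine-learning alternatives (random forests) on synthetic data; the predictors and their relationship to generation are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)

# Synthetic training set: raw NWP hub-height wind speed plus two extra
# predictors (e.g., direction, hour of day) vs. observed plant generation.
raw_wind = rng.uniform(2, 14, 500)
extra = rng.uniform(0, 1, (500, 2))
gen_obs = 0.9 * raw_wind + 3.0 * extra[:, 0] - 1.5 + rng.normal(0, 1.0, 500)

X = np.column_stack([raw_wind, extra])
X_tr, X_te, y_tr, y_te = X[:400], X[400:], gen_obs[:400], gen_obs[400:]

mos_lin = LinearRegression().fit(X_tr, y_tr)                       # baseline MOS
mos_rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

for name, m in [("linear MOS", mos_lin), ("random-forest MOS", mos_rf)]:
    err = np.sqrt(np.mean((m.predict(X_te) - y_te) ** 2))
    print(f"{name}: test RMSE = {err:.2f}")
```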

  12. Determination of the archaeological origin of ceramic fragments characterized by neutron activation analysis, by means of the application of multivariable statistical analysis techniques

    International Nuclear Information System (INIS)

    Almazan T, M. G.; Jimenez R, M.; Monroy G, F.; Tenorio, D.; Rodriguez G, N. L.

    2009-01-01

The elementary composition of archaeological ceramic fragments obtained during the explorations in San Miguel Ixtapan, Mexico State, was determined by the neutron activation analysis technique. Sample irradiation was carried out in the TRIGA Mark III research reactor with a neutron flux of 1×10^13 n·cm^-2·s^-1. The irradiation time was 2 hours. Prior to the acquisition of the gamma-ray spectra, the samples were allowed to decay for 12 to 14 days. The analyzed elements were: Nd, Ce, Lu, Eu, Yb, Pa(Th), Tb, La, Cr, Hf, Sc, Co, Fe, Cs, Rb. The statistical treatment of the data, consisting of cluster analysis and principal component analysis, allowed the identification of three different origins of the archaeological ceramics, designated as local, foreign and regional. (Author)

  13. Experimental statistics

    CERN Document Server

    Natrella, Mary Gibbons

    1963-01-01

    Formulated to assist scientists and engineers engaged in army ordnance research and development programs, this well-known and highly regarded handbook is a ready reference for advanced undergraduate and graduate students as well as for professionals seeking engineering information and quantitative data for designing, developing, constructing, and testing equipment. Topics include characterizing and comparing the measured performance of a material, product, or process; general considerations in planning experiments; statistical techniques for analyzing extreme-value data; use of transformations

  14. Temporal and spatial assessment of river surface water quality using multivariate statistical techniques: a study in Can Tho City, a Mekong Delta area, Vietnam.

    Science.gov (United States)

    Phung, Dung; Huang, Cunrui; Rutherford, Shannon; Dwirahmadi, Febi; Chu, Cordia; Wang, Xiaoming; Nguyen, Minh; Nguyen, Nga Huy; Do, Cuong Manh; Nguyen, Trung Hieu; Dinh, Tuan Anh Diep

    2015-05-01

The present study is an evaluation of temporal/spatial variations of surface water quality using multivariate statistical techniques, comprising cluster analysis (CA), principal component analysis (PCA), factor analysis (FA) and discriminant analysis (DA). Eleven water quality parameters were monitored at 38 different sites in Can Tho City, a Mekong Delta area of Vietnam, from 2008 to 2012. Hierarchical cluster analysis grouped the 38 sampling sites into three clusters, representing mixed urban-rural areas, agricultural areas and an industrial zone. FA/PCA resulted in three latent factors for the entire research location, three for cluster 1, four for cluster 2, and four for cluster 3, explaining 60, 60.2, 80.9, and 70% of the total variance in the respective water quality. The varifactors from FA indicated that the parameters responsible for water quality variations are related to erosion from disturbed land or inflow of effluent from sewage plants and industry, discharges from wastewater treatment plants and domestic wastewater, agricultural activities and industrial effluents, and contamination by sewage waste with faecal coliform bacteria through sewer and septic systems. Discriminant analysis (DA) revealed that turbidity (in nephelometric turbidity units, NTU), chemical oxygen demand (COD) and NH₃ are the discriminating parameters in space, affording 67% correct assignation in spatial analysis; pH and NO₂ are the discriminating parameters according to season, assigning approximately 60% of cases correctly. The findings suggest a possible revised sampling strategy that can reduce the number of sampling sites and the indicator parameters responsible for large variations in water quality. This study demonstrates the usefulness of multivariate statistical techniques for the evaluation of temporal/spatial variations in water quality assessment and management.
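The discriminant-analysis step can be sketched as follows, with invented seasonal samples of the three spatially discriminating parameters (turbidity, COD, NH3); the score is the "correct assignation" rate analogous to the ~60-67% figures above.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(7)

# Invented samples: columns = turbidity (NTU), COD (mg/l), NH3 (mg/l),
# for a wet and a dry season.
wet = np.column_stack([rng.normal(60, 15, 50), rng.normal(25, 6, 50),
                       rng.normal(1.2, 0.4, 50)])
dry = np.column_stack([rng.normal(35, 10, 50), rng.normal(18, 5, 50),
                       rng.normal(0.7, 0.3, 50)])
X = np.vstack([wet, dry])
season = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis().fit(X, season)
print(f"correct assignation: {100 * lda.score(X, season):.0f}%")
```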

  15. [Combined use of wide-detector and adaptive statistical iterative reconstruction-V technique in abdominal CT with low radiation dose].

    Science.gov (United States)

    Wang, H X; Lü, P J; Yue, S W; Chang, L Y; Li, Y; Zhao, H P; Li, W R; Gao, J B

    2017-12-05

Objective: To investigate the image quality and radiation dose of abdominal contrast-enhanced CT scans using the wide-detector (80 mm) and adaptive statistical iterative reconstruction-V (ASIR-V) technique. Methods: In the first, phantom part of the experiment, the percentage of ASIR-V giving half the dose with the combined wide-detector and ASIR-V technique, as compared with the standard-detector (40 mm) technique, was determined. The human study was performed based on the phantom study: 160 patients who underwent contrast-enhanced abdominal CT were prospectively collected and divided by random number table into the control group (n=40), with image reconstruction using 40% ASIR (group A), and the study group (n=120). According to the pre-ASIR-V percentage, the study group was assigned to three groups [40 cases in each group; group B: 0 pre-ASIR-V scan with image reconstruction at 0-100% post-ASIR-V (10% intervals, subgroups B0-B10); group C: 20% pre-ASIR-V with 20%, 40% and 60% post-ASIR-V (subgroups C1-C3); group D: 40% pre-ASIR-V with 40% and 60% post-ASIR-V (subgroups D1-D2)]. Image noise, CT attenuation values and CNR of the liver, pancreas, aorta and portal vein were compared by two-sample t test and one-way ANOVA. Qualitative visual parameters (overall image quality graded on a 5-point scale) were compared by Mann-Whitney U test and Kruskal-Wallis H test. Results: The phantom experiment showed that the percentage of pre-ASIR-V for half dose was 40%. With 40% pre-ASIR-V, the radiation dose in the study group was reduced by 35.5% compared with the control group. Image noise in subgroups B2-B10, C2-C3 and D1-D2 was lower (t = -14.681 to -3.046, all P < 0.05). The subjective image quality scores increased gradually in the range of 0-60% post-ASIR-V and decreased with post-ASIR-V above 70%. The overall image quality of subgroups B3-B8, C2-C3 and D1-D2 was higher than that in group A (Z = -2.229 to -6.533, all P < 0.05). Conclusion: Compared with the conventional ASIR technique, wide-detector combined with 40% pre

  16. Sadum: Traditional and Contemporary

    Directory of Open Access Journals (Sweden)

    Ratna Panggabean

    2009-07-01

Full Text Available Sadum is one of the traditional cloths of the Batak people in North Sumatra. It is woven on a back-strap loom with a supplementary weft technique. Sadum is a warp-faced weaving made of cotton, with beads woven into the cloth. Ritually it is used as a shoulder cloth and in gift exchanges and dances. It also bears symbols of good tidings and blessings for the receiver. The cloth has changed over time in technique, color and patterns, as well as in function, but its use as a ritual cloth has stayed the same. The basic weaving techniques and equipment used to create it have not changed, but new materials and added techniques have made this cloth richer in color, pattern, and texture. Most changes began when the Europeans came to Indonesia and introduced new materials such as synthetic fibers and colors. In the 70s, traditional cloth in Indonesia got a boost when the government declared batik the Indonesian national attire. This encouraged other traditional weavings to develop into contemporary clothing. Later, new techniques and materials were introduced to Sadum weavings, including embroidery and silk and golden threads, which had never been used before.

  17. [Traditional nostrum].

    Science.gov (United States)

    Sugiyama, Shigeru

    2006-01-01

The commercialization of drugs started toward the end of the Heian period (794-1192), when not only the aristocrats and monks who were the traditional patrons of drug makers, but also the local clans and landlords who became powerful as a result of the disbanding of aristocratic manors, accumulated enough wealth to spend money on medicine. Although traveling around the country was still a dangerous endeavor, merchants assembled groups to bring lucrative foreign drugs (mainly Chinese) to remote areas. The spread of commercial drugs to common people, however, did not happen until the early Edo period (1603-1867), when the so-called barrier system was installed nationwide to make domestic travel safe. Commercialization started in large cities and gradually spread to other areas. Many nostrums popular until recently appeared in the Genroku period (1688-1703) or later. Many such nostrums were cure-alls, often consisting of active ingredients such as Saussureae radix, Agalloch, or Gambir. Even in the Edo period, many people living in agricultural or fishing villages, as well as those in the lower social tiers, were still poor. Much of the medication available to those people was therefore made of various plant- or animal-derived substances that were traditionally used as folk medicines.

  18. A Third-Generation Adaptive Statistical Iterative Reconstruction Technique: Phantom Study of Image Noise, Spatial Resolution, Lesion Detectability, and Dose Reduction Potential.

    Science.gov (United States)

    Euler, André; Solomon, Justin; Marin, Daniele; Nelson, Rendon C; Samei, Ehsan

    2018-06-01

    The purpose of this study was to assess image noise, spatial resolution, lesion detectability, and the dose reduction potential of a proprietary third-generation adaptive statistical iterative reconstruction (ASIR-V) technique. A phantom representing five different body sizes (12-37 cm) and a contrast-detail phantom containing lesions of five low-contrast levels (5-20 HU) and three sizes (2-6 mm) were deployed. Both phantoms were scanned on a 256-MDCT scanner at six different radiation doses (1.25-10 mGy). Images were reconstructed with filtered back projection (FBP), ASIR-V with 50% blending with FBP (ASIR-V 50%), and ASIR-V without blending (ASIR-V 100%). In the first phantom, noise properties were assessed by noise power spectrum analysis. Spatial resolution properties were measured by use of task transfer functions for objects of different contrasts. Noise magnitude, noise texture, and resolution were compared between the three groups. In the second phantom, low-contrast detectability was assessed by nine human readers independently for each condition. The dose reduction potential of ASIR-V was estimated on the basis of a generalized linear statistical regression model. On average, image noise was reduced 37.3% with ASIR-V 50% and 71.5% with ASIR-V 100% compared with FBP. ASIR-V shifted the noise power spectrum toward lower frequencies compared with FBP. The spatial resolution of ASIR-V was equivalent or slightly superior to that of FBP, except for the low-contrast object, which had lower resolution. Lesion detection significantly increased with both ASIR-V levels (p = 0.001), with an estimated radiation dose reduction potential of 15% ± 5% (SD) for ASIR-V 50% and 31% ± 9% for ASIR-V 100%. ASIR-V reduced image noise and improved lesion detection compared with FBP and had potential for radiation dose reduction while preserving low-contrast detectability.
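
    As a rough illustration of the noise power spectrum analysis mentioned above, the following Python sketch estimates a 2-D NPS from a stack of uniform-phantom ROIs. The array layout and pixel size are assumptions for illustration, not values taken from the study.

      import numpy as np

      def nps_2d(rois: np.ndarray, pixel_size_mm: float) -> np.ndarray:
          # rois: stack of shape (n_rois, ny, nx) from a uniform phantom region
          n, ny, nx = rois.shape
          # Subtract each ROI's mean so only the noise fluctuations remain
          detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
          ft = np.fft.fft2(detrended)
          # Ensemble estimate: NPS = (dx * dy / (Nx * Ny)) * <|DFT|^2>
          return (pixel_size_mm ** 2 / (nx * ny)) * np.mean(np.abs(ft) ** 2,
                                                            axis=0)

    A shift of this spectrum toward lower spatial frequencies, as reported for ASIR-V versus FBP, corresponds to a coarser, "blotchier" noise texture.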

  19. Industrial statistics with Minitab

    CERN Document Server

    Cintas, Pere Grima; Llabres, Xavier Tort-Martorell

    2012-01-01

    Industrial Statistics with MINITAB demonstrates the use of MINITAB as a tool for performing statistical analysis in an industrial context. This book covers introductory industrial statistics, exploring the most commonly used techniques alongside those that serve to give an overview of more complex issues. A plethora of examples in MINITAB are featured along with case studies for each of the statistical techniques presented. Industrial Statistics with MINITAB: provides comprehensive coverage of user-friendly practical guidance to the essential statistical methods applied in industry; explores…

  20. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  1. UNA TECNICA ESTADISTICA PARA MEDIR LA CONFLICTIVIDAD SOCIAL A TRAVES DEL REGISTRO ARQUEOLOGICO (A Statistical Technique to Measure Social Conflict through the Archaeological Record)

    Directory of Open Access Journals (Sweden)

    Pascual Izquierdo-Egea

    2015-03-01

    Full Text Available A statistical technique to measure social conflict through the mortuary record is presented here. It was born under the contextual valuation method used in the analysis of grave goods since 1993. This is a fundamental tool for the development of the archaeology of social phenomena, whose relevant empirical results support its theoretical significance. After conveying its conceptualization in terms of social inequality and relative wealth, the two classes of social conflict defined are explained: structural or static and conjunctural or dynamic. Finally, its connections with the Malthusian demographic law through its two parameters, population and resources, are included. The synthesis of these theoretical frameworks is illustrated with applications to ancient civilizations, including Iberian protohistory, prehispanic Mesoamerica, and early imperial Rome.

  2. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters covering the basic concepts and meaning of statistical thermodynamics, Maxwell-Boltzmann statistics, ensembles, thermodynamic functions and fluctuations, statistical dynamics of independent-particle systems, ideal molecular systems, chemical equilibrium and chemical reaction rates in ideal gas mixtures, classical statistical thermodynamics, the ideal lattice model, lattice statistics and nonideal lattice models, imperfect-gas theory applied to liquids, the theory of solutions, the statistical thermodynamics of interfaces, the statistical thermodynamics of macromolecular systems, and quantum statistics.

  3. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  4. Method for estimating potential wetland extent by utilizing streamflow statistics and flood-inundation mapping techniques: Pilot study for land along the Wabash River near Terre Haute, Indiana

    Science.gov (United States)

    Kim, Moon H.; Ritz, Christian T.; Arvin, Donald V.

    2012-01-01

    Potential wetland extents were estimated for a 14-mile reach of the Wabash River near Terre Haute, Indiana. This pilot study was completed by the U.S. Geological Survey in cooperation with the U.S. Department of Agriculture, Natural Resources Conservation Service (NRCS). The study showed that potential wetland extents can be estimated by analyzing streamflow statistics with the available streamgage data, calculating the approximate water-surface elevation along the river, and generating maps by use of flood-inundation mapping techniques. Planning successful restorations for Wetland Reserve Program (WRP) easements requires a determination of areas that show evidence of being in a zone prone to sustained or frequent flooding. Zone determinations of this type are used by WRP planners to define the actively inundated area and make decisions on restoration-practice installation. According to WRP planning guidelines, a site needs to show evidence of being in an "inundation zone" that is prone to sustained or frequent flooding for a period of 7 consecutive days at least once every 2 years on average in order to meet the planning criteria for determining a wetland for a restoration in agricultural land. By calculating the annual highest 7-consecutive-day mean discharge with a 2-year recurrence interval (7MQ2) at a streamgage on the basis of available streamflow data, one can determine the water-surface elevation corresponding to the calculated flow that defines the estimated inundation zone along the river. By using the estimated water-surface elevation ("inundation elevation") along the river, an approximate extent of potential wetland for a restoration in agricultural land can be mapped. As part of the pilot study, a set of maps representing the estimated potential wetland extents was generated in a geographic information system (GIS) application by combining (1) a digital water-surface plane representing the surface of inundation elevation that sloped in the downstream
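
    The 7MQ2 statistic described above can be sketched in a few lines. This is a minimal illustration assuming a pandas Series of daily mean discharge indexed by date; taking the 2-year recurrence value as the median of the annual 7-day maxima (the Weibull plotting-position reading of T = 2 years) is an assumption about the fitting method, not a detail given in the report.

      import numpy as np
      import pandas as pd

      def seven_mq2(daily: pd.Series) -> float:
          # 7-consecutive-day mean discharge for every day with a full window
          rolling7 = daily.rolling(window=7).mean()
          # Highest 7-day mean in each calendar year (annual maximum series)
          annual_max = rolling7.groupby(rolling7.index.year).max().dropna()
          # With Weibull plotting positions T = (n + 1) / m, the 2-year
          # recurrence flow is the value exceeded in about half the years,
          # i.e. roughly the median of the annual maxima.
          return float(np.median(annual_max.values))

    The water-surface elevation corresponding to this discharge then defines the estimated inundation zone to be mapped.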

  5. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first-semester statistics cou…

  6. Cancer Statistics

    Science.gov (United States)

    Cancer has a major impact on society in … success of efforts to control and manage cancer. Statistics at a Glance: The Burden of Cancer in …

  7. Peer-harassment prevalence in self-reports by primary and lower secondary school students. Statistical comparisons of samples from years 2000 and 2013, investigating traditional and cyber-harassment.

    OpenAIRE

    Hjelmen, Kari Jeanette Langseth

    2015-01-01

    A comparative investigation of traditional peer-harassment and cyber-harassment prevalence, examining the first-year baseline sample of a longitudinal project in a North-Norwegian setting. The thesis contributes to a main study, “Trivsel i Tromsø” (“Well-being in Tromsø”), which aims to examine psychosocial and psychiatric risk-factor associations with bullying and cyberbullying, using a combination of survey tools. The thesis explores one of the three survey tools. Investigation of sample administere…

  8. Statistical Pattern Recognition

    CERN Document Server

    Webb, Andrew R

    2011-01-01

    Statistical pattern recognition relates to the use of statistical techniques for analysing data measurements in order to extract information and make justified decisions.  It is a very active area of study and research, which has seen many advances in recent years. Applications such as data mining, web searching, multimedia data retrieval, face recognition, and cursive handwriting recognition, all require robust and efficient pattern recognition techniques. This third edition provides an introduction to statistical pattern theory and techniques, with material drawn from a wide range of fields,

  9. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    Science.gov (United States)

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…

  10. Teaching Statistics Online Using "Excel"

    Science.gov (United States)

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  11. Statistics in a nutshell

    CERN Document Server

    Boslaugh, Sarah

    2013-01-01

    Need to learn statistics for your job? Want help passing a statistics course? Statistics in a Nutshell is a clear and concise introduction and reference for anyone new to the subject. Thoroughly revised and expanded, this edition helps you gain a solid understanding of statistics without the numbing complexity of many college texts. Each chapter presents easy-to-follow descriptions, along with graphics, formulas, solved examples, and hands-on exercises. If you want to perform common statistical analyses and learn a wide range of techniques without getting in over your head, this is your book.

  12. Business statistics for dummies

    CERN Document Server

    Anderson, Alan

    2013-01-01

    Score higher in your business statistics course? Easy. Business statistics is a common course for business majors and MBA candidates. It examines common data sets and the proper way to use such information when conducting research and producing informational reports such as profit and loss statements, customer satisfaction surveys, and peer comparisons. Business Statistics For Dummies tracks to a typical business statistics course offered at the undergraduate and graduate levels and provides clear, practical explanations of business statistical ideas, techniques, formulas, and calculations, w

  13. On quantum statistical inference

    NARCIS (Netherlands)

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  14. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
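
    The counting at the heart of such a sorting game is easy to reproduce. The small Python sketch below tallies the number of microstates for N particles in g single-particle states under each of the three statistics; the function is illustrative, not taken from the article.

      from math import comb

      def microstates(N: int, g: int) -> dict:
          # Number of ways to place N particles in g single-particle states
          return {
              "Maxwell-Boltzmann (distinguishable)": g ** N,
              "Bose-Einstein (indistinguishable)": comb(N + g - 1, N),
              "Fermi-Dirac (exclusion principle)": comb(g, N),
          }

      print(microstates(N=3, g=5))   # 125, 35 and 10 microstates respectively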

  15. Reduction of Complications of Local Anaesthesia in Dental Healthcare Setups by Application of the Six Sigma Methodology: A Statistical Quality Improvement Technique.

    Science.gov (United States)

    Akifuddin, Syed; Khatoon, Farheen

    2015-12-01

    Health care faces challenges due to complications, inefficiencies and other concerns that threaten the safety of patients. The purpose of this study was to identify causes of complications encountered after administration of local anaesthesia for dental and oral surgical procedures and to reduce their incidence by introducing the Six Sigma methodology. The DMAIC (Define, Measure, Analyse, Improve and Control) process of Six Sigma, with failure mode and effect analysis, was used to reduce the incidence of complications encountered after administration of local anaesthesia injections for dental and oral surgical procedures. Pareto analysis was used to identify the most frequently recurring complications. A paired z-sample test (in Minitab) and Fisher's exact test were used to analyse the data statistically. The p-values indicated a statistically significant reduction in complications, suggesting that the Six Sigma improvement methodology in healthcare tends to deliver consistently better results to patients as well as hospitals, and results in better patient compliance as well as satisfaction.
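
    A toy version of the Pareto-analysis step is sketched below: categories are ranked by frequency and the cumulative share identifies the "vital few" causes. The complication names and counts are hypothetical, chosen only to show the mechanics.

      # Hypothetical complication counts for the Pareto ("vital few") ranking
      complications = {
          "pain at injection site": 42, "inadequate anaesthesia": 31,
          "haematoma": 12, "trismus": 8, "syncope": 5, "allergic reaction": 2,
      }
      total = sum(complications.values())
      cumulative = 0.0
      for name, count in sorted(complications.items(), key=lambda kv: -kv[1]):
          cumulative += 100.0 * count / total
          print(f"{name:24s} {count:3d}   cumulative {cumulative:5.1f}%")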

  16. Usage Statistics

    Science.gov (United States)

    MedlinePlus usage statistics (https://medlineplus.gov/usestatistics.html) are reported as quarterly user statistics: page views and unique visitors per quarter, beginning with Oct-Dec 1998 …

  17. Frog Statistics

    Science.gov (United States)

    Whole Frog Project and Virtual Frog Dissection statistics: wwwstats output for January 1 through …, excluding duplicate or extraneous accesses. Note that this under-represents the bytes requested.

  18. Lectures on algebraic statistics

    CERN Document Server

    Drton, Mathias; Sullivant, Seth

    2009-01-01

    How does an algebraic geometer studying secant varieties further the understanding of hypothesis tests in statistics? Why would a statistician working on factor analysis raise open problems about determinantal varieties? Connections of this type are at the heart of the new field of "algebraic statistics". In this field, mathematicians and statisticians come together to solve statistical inference problems using concepts from algebraic geometry as well as related computational and combinatorial techniques. The goal of these lectures is to introduce newcomers from the different camps to algebraic statistics. The introduction will be centered around the following three observations: many important statistical models correspond to algebraic or semi-algebraic sets of parameters; the geometry of these parameter spaces determines the behaviour of widely used statistical inference procedures; computational algebraic geometry can be used to study parameter spaces and other features of statistical models.

  19. Experimental Mathematics and Computational Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  20. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: an intra-individual comparison.

    Science.gov (United States)

    Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung

    2012-09-01

    To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  1. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hyun, E-mail: circle1128@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, Myung-Joon, E-mail: mjkim@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Yoon, Choon-Sik, E-mail: yooncs58@yuhs.ac [Department of Radiology, Gangnam Severance Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Lee, Mi-Jung, E-mail: mjl1213@yuhs.ac [Department of Radiology and Research Institute of Radiological Science, Severance Children' s Hospital, Yonsei University, College of Medicine, Seoul (Korea, Republic of)

    2012-09-15

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality.

  2. Radiation dose reduction with the adaptive statistical iterative reconstruction (ASIR) technique for chest CT in children: An intra-individual comparison

    International Nuclear Information System (INIS)

    Lee, Seung Hyun; Kim, Myung-Joon; Yoon, Choon-Sik; Lee, Mi-Jung

    2012-01-01

    Objective: To retrospectively compare radiation dose and image quality of pediatric chest CT using a routine dose protocol reconstructed with filtered back projection (FBP) (the Routine study) and a low-dose protocol with 50% adaptive statistical iterative reconstruction (ASIR) (the ASIR study). Materials and methods: We retrospectively reviewed chest CT performed in pediatric patients who underwent both the Routine study and the ASIR study on different days between January 2010 and August 2011. Volume CT dose indices (CTDIvol), dose length products (DLP), and effective doses were obtained to estimate radiation dose. The image quality was evaluated objectively as noise measured in the descending aorta and paraspinal muscle, and subjectively by three radiologists for noise, sharpness, artifacts, and diagnostic acceptability using a four-point scale. The paired Student's t-test and the Wilcoxon signed-rank test were used for statistical analysis. Results: Twenty-six patients (M:F = 13:13, mean age 11.7) were enrolled. The ASIR studies showed 60.3%, 56.2%, and 55.2% reductions in CTDIvol (from 18.73 to 7.43 mGy, P < 0.001), DLP (from 307.42 to 134.51 mGy × cm, P < 0.001), and effective dose (from 4.12 to 1.84 mSv, P < 0.001), respectively, compared with the Routine studies. The objective noise was higher in the paraspinal muscle of the ASIR studies (20.81 vs. 16.67, P = 0.004), but was not different in the aorta (18.23 vs. 18.72, P = 0.726). The subjective image quality demonstrated no difference between the two studies. Conclusion: A low-dose protocol with 50% ASIR allows radiation dose reduction in pediatric chest CT by more than 55% while maintaining image quality

  3. Statistical physics

    CERN Document Server

    Sadovskii, Michael V

    2012-01-01

    This volume provides a compact presentation of modern statistical physics at an advanced level. Beginning with questions on the foundations of statistical mechanics all important aspects of statistical physics are included, such as applications to ideal gases, the theory of quantum liquids and superconductivity and the modern theory of critical phenomena. Beyond that attention is given to new approaches, such as quantum field theory methods and non-equilibrium problems.

  4. Statistical optics

    CERN Document Server

    Goodman, Joseph W

    2015-01-01

    This book discusses statistical methods that are useful for treating problems in modern optics, and the application of these methods to solving a variety of such problems This book covers a variety of statistical problems in optics, including both theory and applications.  The text covers the necessary background in statistics, statistical properties of light waves of various types, the theory of partial coherence and its applications, imaging with partially coherent light, atmospheric degradations of images, and noise limitations in the detection of light. New topics have been introduced i

  5. Harmonic statistics

    Energy Technology Data Exchange (ETDEWEB)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

    2017-05-15

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
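
    A one-line derivation clarifies the scale invariance listed in the highlights, under the assumption that the harmonic intensity takes the form λ(t) = c/t on the positive half-line:

      \mathbb{E}\!\left[N(a,b]\right]
        = \int_{a}^{b} \frac{c}{t}\, dt
        = c \,\ln\!\left(\frac{b}{a}\right), \qquad 0 < a < b .

    The expected count depends only on the ratio b/a, so it is unchanged under the rescaling (a, b) → (sa, sb).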

  6. Harmonic statistics

    International Nuclear Information System (INIS)

    Eliazar, Iddo

    2017-01-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.

  7. Histoplasmosis Statistics

    Science.gov (United States)


  8. Understanding traditional African healing

    OpenAIRE

    MOKGOBI, M.G.

    2014-01-01

    Traditional African healing has been in existence for many centuries yet many people still seem not to understand how it relates to God and religion/spirituality. Some people seem to believe that traditional healers worship the ancestors and not God. It is therefore the aim of this paper to clarify this relationship by discussing a chain of communication between the worshipers and the Almighty God. Other aspects of traditional healing namely types of traditional healers, training of tradition...

  9. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…

  10. Statistical Diversions

    Science.gov (United States)

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  11. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  12. Practical Statistics

    CERN Document Server

    Lyons, L.

    2016-01-01

    Accelerators and detectors are expensive, both in terms of money and human effort. It is thus important to invest effort in performing a good statistical analysis of the data, in order to extract the best information from it. This series of five lectures deals with practical aspects of statistical issues that arise in typical High Energy Physics analyses.

  13. Discovery and characterisation of dietary patterns in two Nordic countries. Using non-supervised and supervised multivariate statistical techniques to analyse dietary survey data

    DEFF Research Database (Denmark)

    Edberg, Anna; Freyhult, Eva; Sand, Salomon

    …intra- and inter-national data excerpts. For example, major PCA loadings helped decipher both shared and disparate features, relating to food groups, across Danish and Swedish preschool consumers. Data interrogation, reliant on the above-mentioned composite techniques, disclosed one outlier dietary prototype … a prototype with the latter property was identified also in the Danish data material, but without low consumption of Vegetables or Fruit & berries. The second MDA-type of data interrogation involved Supervised Learning, also known as Predictive Modelling. These exercises involved the Random Forest (RF) … although not elaborated on in depth, output from several analyses suggests a preference for energy-based consumption data for Cluster Analysis and Predictive Modelling, over those appearing as weight.
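
    A minimal sketch of the Random Forest (RF) predictive-modelling exercise referred to above, assuming a food-group consumption matrix X and, for example, country labels y; the scikit-learn setup and parameter values are illustrative, not those of the study.

      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      def classify_consumers(X, y):
          # X: consumers x food-group consumption matrix; y: class labels
          rf = RandomForestClassifier(n_estimators=500, random_state=0)
          accuracy = cross_val_score(rf, X, y, cv=5).mean()
          return rf.fit(X, y), accuracy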

  14. Effects of statistical quality, sampling rate and temporal filtering techniques on the extraction of functional parameters from the left ventricular time-activity curves

    Energy Technology Data Exchange (ETDEWEB)

    Guignard, P.A.; Chan, W. (Royal Melbourne Hospital, Parkville (Australia). Dept. of Nuclear Medicine)

    1984-09-01

    Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated: three- and five-point time smoothing, Fourier filtering preserving one to four harmonics (H), truncated-curve Fourier filtering, and third-degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third-degree polynomial curve fitting and truncated Fourier filters exhibited very high sensitivity to noise. Three- and five-point time smoothing had moderate sensitivity to noise, but was highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise, with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned.
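
    The harmonic-preserving Fourier filters compared here can be written compactly. The sketch below assumes a curve covering one cardiac cycle sampled at equal intervals, and keeps the DC term plus the first n harmonics (the study's 2H or 3H filters correspond to n = 2 or 3).

      import numpy as np

      def fourier_filter(curve: np.ndarray, n_harmonics: int = 3) -> np.ndarray:
          # Keep the DC term plus the first n harmonics; zero the rest
          spectrum = np.fft.rfft(curve)
          spectrum[n_harmonics + 1:] = 0.0
          return np.fft.irfft(spectrum, n=len(curve))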

  15. Effects of statistical quality, sampling rate and temporal filtering techniques on the extraction of functional parameters from the left ventricular time-activity curves

    International Nuclear Information System (INIS)

    Guignard, P.A.; Chan, W.

    1984-01-01

    Several techniques for the processing of a series of curves derived from two left ventricular time-activity curves acquired at rest and during exercise with a nuclear stethoscope were evaluated: three- and five-point time smoothing, Fourier filtering preserving one to four harmonics (H), truncated-curve Fourier filtering, and third-degree polynomial curve fitting. Each filter's ability to recover, with fidelity, systolic and diastolic function parameters was evaluated under increasingly 'noisy' conditions and at several sampling rates. Third-degree polynomial curve fitting and truncated Fourier filters exhibited very high sensitivity to noise. Three- and five-point time smoothing had moderate sensitivity to noise, but was highly affected by sampling rate. Fourier filtering preserving 2H or 3H produced the best compromise, with high resilience to noise and independence of sampling rate as far as the recovery of these functional parameters is concerned. (author)

  16. Modeling and Analysis of Mechanical Properties of Aluminium Alloy (A413 Processed through Squeeze Casting Route Using Artificial Neural Network Model and Statistical Technique

    Directory of Open Access Journals (Sweden)

    R. Soundararajan

    2015-01-01

    Full Text Available Artificial Neural Network (ANN approach was used for predicting and analyzing the mechanical properties of A413 aluminum alloy produced by squeeze casting route. The experiments are carried out with different controlled input variables such as squeeze pressure, die preheating temperature, and melt temperature as per Full Factorial Design (FFD. The accounted absolute process variables produce a casting with pore-free and ideal fine grain dendritic structure resulting in good mechanical properties such as hardness, ultimate tensile strength, and yield strength. As a primary objective, a feed forward back propagation ANN model has been developed with different architectures for ensuring the definiteness of the values. The developed model along with its predicted data was in good agreement with the experimental data, inferring the valuable performance of the optimal model. From the work it was ascertained that, for castings produced by squeeze casting route, the ANN is an alternative method for predicting the mechanical properties and appropriate results can be estimated rather than measured, thereby reducing the testing time and cost. As a secondary objective, quantitative and statistical analysis was performed in order to evaluate the effect of process parameters on the mechanical properties of the castings.

  17. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    Science.gov (United States)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-09-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 × 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.
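
    The six parameters listed above correspond to classical histogram-based texture descriptors. The sketch below computes them for a region of analysis under one common normalization convention; this is an assumption for illustration, not the paper's exact code.

      import numpy as np

      def texture_features(roa: np.ndarray, levels: int = 256) -> dict:
          hist, _ = np.histogram(roa, bins=levels, range=(0, levels))
          p = hist / hist.sum()                  # gray-level probabilities
          z = np.arange(levels, dtype=float)
          mean = np.sum(z * p)                   # average gray level
          var = np.sum((z - mean) ** 2 * p)
          sigma = np.sqrt(var)                   # average contrast
          nz = p > 0
          return {
              "average_gray_level": mean,
              "average_contrast": sigma,
              "relative_smoothness": 1.0 - 1.0 / (1.0 + var),
              "skewness": np.sum((z - mean) ** 3 * p) / (sigma ** 3 + 1e-12),
              "uniformity": np.sum(p ** 2),
              "entropy": -np.sum(p[nz] * np.log2(p[nz])),
          }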

  18. Descriptive statistics.

    Science.gov (United States)

    Nick, Todd G

    2007-01-01

    Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.

  19. 105-116 Effect of Winged Subsoiler and Traditional Tillage ...

    African Journals Online (AJOL)

    3) compared to traditional tillage (Qs = 34 mm-season-. 1, T = 49 ... Maresha plow that cuts soil deeper than achieved with the traditional .... Data Processing and Analysis. Statistical ... soil compaction and shallow depth could be addressed.

  20. Understanding traditional African healing.

    Science.gov (United States)

    Mokgobi, M G

    2014-09-01

    Traditional African healing has been in existence for many centuries yet many people still seem not to understand how it relates to God and religion/spirituality. Some people seem to believe that traditional healers worship the ancestors and not God. It is therefore the aim of this paper to clarify this relationship by discussing a chain of communication between the worshipers and the Almighty God. Other aspects of traditional healing namely types of traditional healers, training of traditional healers as well as the role of traditional healers in their communities are discussed. In conclusion, the services of traditional healers go far beyond the uses of herbs for physical illnesses. Traditional healers serve many roles which include but not limited to custodians of the traditional African religion and customs, educators about culture, counselors, social workers and psychologists.

  1. Statistical-techniques-based computer-aided diagnosis (CAD) using texture feature analysis: application in computed tomography (CT) imaging to fatty liver disease

    International Nuclear Information System (INIS)

    Chung, Woon-Kwan; Park, Hyong-Hu; Im, In-Chul; Lee, Jae-Seung; Goo, Eun-Hoe; Dong, Kyung-Rae

    2012-01-01

    This paper proposes a computer-aided diagnosis (CAD) system based on texture feature analysis and statistical wavelet transformation technology to diagnose fatty liver disease with computed tomography (CT) imaging. In the target image, a wavelet transformation was performed for each lesion area to set the region of analysis (ROA, window size: 50 x 50 pixels) and define the texture feature of a pixel. Based on the extracted texture feature values, six parameters (average gray level, average contrast, relative smoothness, skewness, uniformity, and entropy) were determined to calculate the recognition rate for a fatty liver. In addition, a multivariate analysis of the variance (MANOVA) method was used to perform a discriminant analysis to verify the significance of the extracted texture feature values and the recognition rate for a fatty liver. According to the results, each texture feature value was significant for a comparison of the recognition rate for a fatty liver (p < 0.05). Furthermore, the F-value, which was used as a scale for the difference in recognition rates, was highest in the average gray level, relatively high in the skewness and the entropy, and relatively low in the uniformity, the relative smoothness and the average contrast. The recognition rate for a fatty liver had the same scale as that for the F-value, showing 100% (average gray level) at the maximum and 80% (average contrast) at the minimum. Therefore, the recognition rate is believed to be a useful clinical value for the automatic detection and computer-aided diagnosis (CAD) using the texture feature value. Nevertheless, further study on various diseases and singular diseases will be needed in the future.

  2. Semiconductor statistics

    CERN Document Server

    Blakemore, J S

    1962-01-01

    Semiconductor Statistics presents statistics aimed at complementing existing books on the relationships between carrier densities and transport effects. The book is divided into two parts. Part I provides introductory material on the electron theory of solids, and then discusses carrier statistics for semiconductors in thermal equilibrium. Of course a solid cannot be in true thermodynamic equilibrium if any electrical current is passed; but when currents are reasonably small the distribution function is but little perturbed, and the carrier distribution for such a "quasi-equilibrium" co…

  3. Statistics Clinic

    Science.gov (United States)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  4. Cooperative Learning in Virtual Environments: The Jigsaw Method in Statistical Courses

    Science.gov (United States)

    Vargas-Vargas, Manuel; Mondejar-Jimenez, Jose; Santamaria, Maria-Letica Meseguer; Alfaro-Navarro, Jose-Luis; Fernandez-Aviles, Gema

    2011-01-01

    This document sets out a novel teaching methodology as used in subjects with statistical content, traditionally regarded by students as "difficult". In a virtual learning environment, instructional techniques little used in mathematical courses were employed, such as the Jigsaw cooperative learning method, which had to be adapted to the…

  5. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
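
    As a concrete, if simplified, instance of retrieval with statistical language models, the sketch below scores a document by query likelihood with linear-interpolation (Jelinek-Mercer) smoothing. It assumes every query term occurs somewhere in the collection, and it is not the article's exact formulation.

      import math
      from collections import Counter

      def query_likelihood(query, doc_tokens, coll_tf: Counter, coll_len: int,
                           lam: float = 0.85):
          # log P(q | d): the document model P(t | d) is mixed with the
          # collection model P(t | C) to avoid zero probabilities
          tf = Counter(doc_tokens)
          score = 0.0
          for term in query:
              p_doc = tf[term] / len(doc_tokens) if doc_tokens else 0.0
              p_coll = coll_tf[term] / coll_len  # assumed > 0 for query terms
              score += math.log(lam * p_doc + (1.0 - lam) * p_coll)
          return score

    Documents are then ranked by this log-likelihood, highest first.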

  6. Using Artificial Neural Networks in Educational Research: Some Comparisons with Linear Statistical Models.

    Science.gov (United States)

    Everson, Howard T.; And Others

    This paper explores the feasibility of neural computing methods such as artificial neural networks (ANNs) and abductory induction mechanisms (AIM) for use in educational measurement. ANNs and AIMS methods are contrasted with more traditional statistical techniques, such as multiple regression and discriminant function analyses, for making…

  7. Total coliforms, arsenic and cadmium exposure through drinking water in the Western Region of Ghana: application of multivariate statistical technique to groundwater quality.

    Science.gov (United States)

    Affum, Andrews Obeng; Osae, Shiloh Dede; Nyarko, Benjamin Jabez Botwe; Afful, Samuel; Fianko, Joseph Richmond; Akiti, Tetteh Thomas; Adomako, Dickson; Acquaah, Samuel Osafo; Dorleku, Micheal; Antoh, Emmanuel; Barnes, Felix; Affum, Enoch Acheampong

    2015-02-01

    In recent times, surface water resource in the Western Region of Ghana has been found to be inadequate in supply and polluted by various anthropogenic activities. As a result of these problems, the demand for groundwater by the human populations in the peri-urban communities for domestic, municipal and irrigation purposes has increased without prior knowledge of its water quality. Water samples were collected from 14 public hand-dug wells during the rainy season in 2013 and investigated for total coliforms, Escherichia coli, mercury (Hg), arsenic (As), cadmium (Cd) and physicochemical parameters. Multivariate statistical analysis of the dataset and a linear stoichiometric plot of major ions were applied to group the water samples and to identify the main factors and sources of contamination. Hierarchal cluster analysis revealed four clusters from the hydrochemical variables (R-mode) and three clusters in the case of water samples (Q-mode) after z score standardization. Principal component analysis after a varimax rotation of the dataset indicated that the four factors extracted explained 93.3 % of the total variance, which highlighted salinity, toxic elements and hardness pollution as the dominant factors affecting groundwater quality. Cation exchange, mineral dissolution and silicate weathering influenced groundwater quality. The ranking order of major ions was Na(+) > Ca(2+) > K(+) > Mg(2+) and Cl(-) > SO4 (2-) > HCO3 (-). Based on piper plot and the hydrogeology of the study area, sodium chloride (86 %), sodium hydrogen carbonate and sodium carbonate (14 %) water types were identified. Although E. coli were absent in the water samples, 36 % of the wells contained total coliforms (Enterobacter species) which exceeded the WHO guidelines limit of zero colony-forming unit (CFU)/100 mL of drinking water. With the exception of Hg, the concentration of As and Cd in 79 and 43 % of the water samples exceeded the WHO guideline limits of 10 and 3
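
    A condensed sketch of the multivariate workflow described above: z-score standardization, Q-mode hierarchical clustering of the water samples, and a varimax-rotated factor model standing in for the rotated PCA. Function names, the three-cluster cut, and the four-factor count are assumptions for illustration.

      import pandas as pd
      from scipy.cluster.hierarchy import fcluster, linkage
      from sklearn.decomposition import FactorAnalysis

      def water_quality_mva(data: pd.DataFrame, n_factors: int = 4):
          # z-score standardization of the hydrochemical variables
          z = (data - data.mean()) / data.std(ddof=0)
          # Q-mode: Ward clustering groups the water samples (three clusters);
          # R-mode would cluster z.T, i.e. the variables, instead
          q_clusters = fcluster(linkage(z, method="ward"), t=3,
                                criterion="maxclust")
          # Varimax-rotated factor model, standing in for rotated PCA
          fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(z)
          loadings = pd.DataFrame(fa.components_.T, index=data.columns)
          return q_clusters, loadings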

  8. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  9. Image Statistics

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-08

    In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.

  10. Accident Statistics

    Data.gov (United States)

    Department of Homeland Security — Accident statistics available on the Coast Guard’s website by state, year, and one variable to obtain tables and/or graphs. Data from reports has been loaded for...

  11. CMS Statistics

    Data.gov (United States)

    U.S. Department of Health & Human Services — The CMS Center for Strategic Planning produces an annual CMS Statistics reference booklet that provides a quick reference for summary information about health...

  12. WPRDC Statistics

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Data about the usage of the WPRDC site and its various datasets, obtained by combining Google Analytics statistics with information from the WPRDC's data portal.

  13. Gonorrhea Statistics

    Science.gov (United States)

    ... Search Form Controls Cancel Submit Search the CDC Gonorrhea Note: Javascript is disabled or is not supported ... Twitter STD on Facebook Sexually Transmitted Diseases (STDs) Gonorrhea Statistics Recommend on Facebook Tweet Share Compartir Gonorrhea ...

  14. Reversible Statistics

    DEFF Research Database (Denmark)

    Tryggestad, Kjell

    2004-01-01

    The study aims to describe how the inclusion and exclusion of materials and calculative devices construct the boundaries and distinctions between statistical facts and artifacts in economics. My methodological approach is inspired by John Graunt's (1667) Political arithmetic and more recent work within constructivism and the field of Science and Technology Studies (STS). The result of this approach is here termed reversible statistics, reconstructing the findings of a statistical study within economics in three different ways. It is argued that all three accounts are quite normal, albeit in different ways. The presence and absence of diverse materials, both natural and political, is what distinguishes them from each other. Arguments are presented for a more symmetric relation between the scientific statistical text and the reader. I will argue that a more symmetric relation can be achieved

  15. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  16. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  17. Statistical methods with applications to demography and life insurance

    CERN Document Server

    Khmaladze, Estáte V

    2013-01-01

    Suitable for statisticians, mathematicians, actuaries, and students interested in the problems of insurance and analysis of lifetimes, Statistical Methods with Applications to Demography and Life Insurance presents contemporary statistical techniques for analyzing life distributions and life insurance problems. It not only contains traditional material but also incorporates new problems and techniques not discussed in existing actuarial literature. The book mainly focuses on the analysis of an individual life and describes statistical methods based on empirical and related processes. Coverage ranges from analyzing the tails of distributions of lifetimes to modeling population dynamics with migrations. To help readers understand the technical points, the text covers topics such as the Stieltjes, Wiener, and Itô integrals. It also introduces other themes of interest in demography, including mixtures of distributions, analysis of longevity and extreme value theory, and the age structure of a population. In addi...

  18. Counting statistics in radioactivity measurements

    International Nuclear Information System (INIS)

    Martin, J.

    1975-01-01

    The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, Poisson's distribution law and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; statistical techniques in gamma spectrometry [fr]
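
    The practical consequence of Poisson counting statistics is easy to demonstrate: for a count N the standard deviation is √N, so the relative uncertainty shrinks as 1/√N.

      import math

      # Standard deviation of a Poisson count N is sqrt(N); the relative
      # uncertainty therefore falls off as 1 / sqrt(N).
      for n in (100, 10_000, 1_000_000):
          rel = 100.0 / math.sqrt(n)
          print(f"N = {n:>9,d}   sigma = {math.sqrt(n):8.1f}   rel = {rel:.2f}%")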

  19. Multilayer Statistical Intrusion Detection in Wireless Networks

    Science.gov (United States)

    Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine

    2008-12-01

    The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of the protection mechanisms, an important research focus is intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies allows the efficiency of the IDS to be enhanced. A radio-signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm permits control of the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.
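
    A rough sketch of a wavelet-based signal fingerprint in the spirit of the framework above. PyWavelets' stationary wavelet transform (swt) is used here as a stand-in for the MODWT (the two agree up to normalization), and the wavelet choice and per-level energy features are assumptions for illustration.

      import numpy as np
      import pywt

      def wavelet_fingerprint(signal, wavelet: str = "db4", level: int = 4):
          # swt requires the length to be divisible by 2**level; trim if needed
          usable = (len(signal) // 2 ** level) * 2 ** level
          x = np.asarray(signal[:usable], dtype=float)
          coeffs = pywt.swt(x, wavelet, level=level)   # [(cA, cD), ...] pairs
          # Per-level detail-coefficient energies form a compact fingerprint
          return np.array([np.sum(cD ** 2) for _, cD in coeffs])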

  20. Mangghuer Embroidery: A Vanishing Tradition

    OpenAIRE

    Aila Pullinen

    2015-01-01

    Aila Pullinen. 2015. Mangghuer Embroidery: A Vanishing Tradition IN Gerald Roche and CK Stuart (eds) Asian Highlands Perspectives 36: Mapping the Monguor, 178-188, 301-332. Visits were undertaken in the years 2001 and 2002 to Minhe Hui and Mangghuer (Tu) Autonomous County, Haidong Municipality, Qinghai Province, China to research and document Mangghuer embroidery. This research is summarized in terms of the history of Mangghuer embroidery, tools and materials, embroidery techniques, embr...

  1. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    Science.gov (United States)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing impacts of potential future climate change scenarios in precipitation and temperature is essential to design adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods to generate future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed in the frame of the CORDEX EU project. The initial information employed to define these downscaling approaches is the historical climatic data (taken from the Spain02 project for the period 1971-2000, with a spatial resolution of 12.5 km) and the future series provided by climatic models in the horizon period 2071-2100. We have used information coming from nine climate model simulations (obtained from five different regional climate models (RCMs) nested within four different global climate models (GCMs)) from the European CORDEX project. In our application we have focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) by the Intergovernmental Panel on Climate Change (IPCC). For each RCM we have generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first moment correction, first and second moment correction, regression functions, quantile mapping using distribution-derived transformation, and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series were proposed to obtain more representative potential future climate scenarios to be employed to study potential impacts. In this work we propose a non-equally weighted combination of the future series, giving more weight to those coming from models (delta change approaches) or combinations of models and techniques that provide a better approximation to the basic and drought statistics...
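
    Of the five transformation techniques listed, quantile mapping with empirical quantiles is the most self-contained to illustrate. A minimal sketch with synthetic data (names and distributions are illustrative, not the authors' code): each future model value is assigned its non-exceedance probability under the historical model series and then mapped to the observed value at that same probability.

```python
import numpy as np

def empirical_quantile_mapping(obs_hist, mod_hist, mod_fut):
    """Bias-correct future model values by empirical quantile mapping."""
    mod_sorted = np.sort(mod_hist)
    probs = np.linspace(0.0, 1.0, len(mod_sorted))
    # non-exceedance probability of each future value in the model climate
    p_fut = np.interp(mod_fut, mod_sorted, probs)
    # observed value at the same empirical quantile
    return np.quantile(obs_hist, p_fut)

rng = np.random.default_rng(42)
obs = rng.gamma(2.0, 2.0, 1000)   # observed historical series
mod = rng.gamma(2.0, 2.5, 1000)   # biased model output, historical period
fut = rng.gamma(2.0, 3.0, 1000)   # model output, future period
corrected = empirical_quantile_mapping(obs, mod, fut)
print(round(fut.mean(), 2), "->", round(corrected.mean(), 2))
```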

  2. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  3. Statistical mechanics

    CERN Document Server

    Schwabl, Franz

    2006-01-01

    The completely revised new edition of the classical book on Statistical Mechanics covers the basic concepts of equilibrium and non-equilibrium statistical physics. In addition to a deductive approach to equilibrium statistics and thermodynamics based on a single hypothesis - the form of the microcanonical density matrix - this book treats the most important elements of non-equilibrium phenomena. Intermediate calculations are presented in complete detail. Problems at the end of each chapter help students to consolidate their understanding of the material. Beyond the fundamentals, this text demonstrates the breadth of the field and its great variety of applications. Modern areas such as renormalization group theory, percolation, stochastic equations of motion and their applications to critical dynamics, kinetic theories, as well as fundamental considerations of irreversibility, are discussed. The text will be useful for advanced students of physics and other natural sciences; a basic knowledge of quantum mechan...

  4. Statistical mechanics

    CERN Document Server

    Jana, Madhusudan

    2015-01-01

    This book on statistical mechanics is self-contained and written in a lucid manner, keeping in mind the examination system of the universities. The need to study this subject and its relation to thermodynamics are discussed in detail. Starting from the Liouville theorem, statistical mechanics is gradually and thoroughly developed. All three types of statistical distribution functions are derived separately, with their range of applications and limitations. Non-interacting ideal Bose and Fermi gases are discussed thoroughly. Properties of liquid He-II and the corresponding models are depicted. White dwarfs and condensed matter physics, transport phenomena (thermal and electrical conductivity, the Hall effect, magnetoresistance, viscosity, diffusion, etc.) are discussed. A basic understanding of the Ising model is given to explain phase transitions. The book ends with detailed coverage of the method of ensembles (namely microcanonical, canonical and grand canonical) and their applications. Various numerical and conceptual problems are...

  5. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  6. Statistical Physics

    CERN Document Server

    Mandl, Franz

    1988-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scient...

  7. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  8. AP statistics

    CERN Document Server

    Levine-Wissing, Robin

    2012-01-01

    All Access for the AP® Statistics Exam Book + Web + Mobile Everything you need to prepare for the Advanced Placement® exam, in a study system built around you! There are many different ways to prepare for an Advanced Placement® exam. What's best for you depends on how much time you have to study and how comfortable you are with the subject matter. To score your highest, you need a system that can be customized to fit you: your schedule, your learning style, and your current level of knowledge. This book, and the online tools that come with it, will help you personalize your AP® Statistics prep

  9. Statistical Model Checking of Rich Models and Properties

    DEFF Research Database (Denmark)

    Poulsen, Danny Bøgsted

    Rich models and properties often result in undecidability issues for traditional model checking approaches. Statistical model checking has proven itself a valuable supplement to model checking, and this thesis is concerned with extending this software validation technique to stochastic hybrid systems. The thesis consists of two parts: the first part motivates why existing model checking technology should be supplemented by new techniques. It also contains a brief introduction to probability theory and concepts covered by the six papers making up the second part. The first two papers are concerned with developing online monitoring techniques for ... systems. The fifth paper shows how stochastic hybrid automata are useful for modelling biological systems, and the final paper is concerned with showing how statistical model checking is efficiently distributed. In parallel with developing the theory contained in the papers, a substantial part of this work...

  10. Multiple Intelligences in Online, Hybrid, and Traditional Business Statistics Courses

    Science.gov (United States)

    Lopez, Salvador; Patron, Hilde

    2012-01-01

    According to Howard Gardner, Professor of Cognition and Education at Harvard University, intelligence of humans cannot be measured with a single factor such as the IQ level. Instead, he and others have suggested that humans have different types of intelligence. This paper examines whether students registered in online or mostly online courses have…

  11. Statistics 101 for Radiologists.

    Science.gov (United States)

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
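
    As a concrete illustration of the diagnostic-test measures reviewed above, the following sketch computes sensitivity, specificity, normal-approximation 95% confidence intervals, and likelihood ratios from a 2x2 table; the counts are invented for the example.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical 2x2 table: rows are disease status, columns are test result.
tp, fn = 85, 15    # diseased patients: test positive / test negative
tn, fp = 180, 20   # healthy patients: test negative / test positive

sens = proportion_ci(tp, tp + fn)
spec = proportion_ci(tn, tn + fp)
print("sensitivity %.2f (95%% CI %.2f-%.2f)" % sens)
print("specificity %.2f (95%% CI %.2f-%.2f)" % spec)

# Likelihood ratios follow directly from the point estimates:
print("LR+ = %.1f, LR- = %.2f" % (sens[0] / (1 - spec[0]), (1 - sens[0]) / spec[0]))
```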

  12. Active Learning versus Traditional Teaching

    Directory of Open Access Journals (Sweden)

    L.A. Azzalis

    2009-05-01

    Full Text Available In traditional teaching, most of the class time is spent with the professor lecturing and the students watching and listening. The students work individually, and cooperation is discouraged. Active learning, on the other hand, changes the focus of activity from the teacher to the learners: students solve problems, answer questions, formulate questions of their own, discuss, explain, and debate during class; moreover, students work in teams on problems and projects under conditions that ensure positive interdependence and individual accountability. Although student-centered methods have repeatedly been shown to be superior to the traditional teacher-centered approach to instruction, the literature regarding the efficacy of various teaching methods is inconclusive. The purpose of this study was to compare student perceptions of course and instructor effectiveness, course difficulty, and amount learned between the active learning and lecture sections in Health Sciences courses, using statistical data from Anhembi Morumbi University. Results indicated a significant difference between active learning and traditional teaching. Our conclusion is that adding strategies promoting active learning to traditional lectures could increase knowledge and understanding.

  13. Statistical Computing

    Indian Academy of Sciences (India)

    Sudhakar Kunte. Elements of statistical computing are discussed in this series, including inference and finite population sampling. A running example is the coin toss by which a captain gets the option to decide whether to field first or bat first; the toss may of course not be fair, in the sense that the team which wins ... The snippet goes on to describe two methods of drawing a random number between 0 and 1.
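
    The snippet survives only in fragments, but its closing topic, drawing a random number between 0 and 1, is easy to illustrate. A linear congruential generator is the classic textbook method; whether it is one of the two methods the article describes is an assumption here.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator yielding uniforms on [0, 1)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m

gen = lcg(seed=12345)
print([round(next(gen), 4) for _ in range(5)])
```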

  14. Statistical thermodynamics

    CERN Document Server

    Schrödinger, Erwin

    1952-01-01

    Nobel Laureate's brilliant attempt to develop a simple, unified standard method of dealing with all cases of statistical thermodynamics: classical, quantum, Bose-Einstein, Fermi-Dirac, and more. The work also includes discussions of the Nernst theorem, Planck's oscillator, fluctuations, the n-particle problem, the problem of radiation, and much more.

  15. Official Statistics and Statistics Education: Bridging the Gap

    Directory of Open Access Journals (Sweden)

    Gal Iddo

    2017-03-01

    Full Text Available This article aims to challenge official statistics providers and statistics educators to ponder how to help non-specialist adult users of statistics develop those aspects of statistical literacy that pertain to official statistics. We first document the gap in the literature in terms of the conceptual basis and educational materials needed for such an undertaking. We then review skills and competencies that may help adults to make sense of statistical information in areas of importance to society. Based on this review, we identify six elements related to official statistics about which non-specialist adult users should possess knowledge in order to be considered literate in official statistics: (1) the system of official statistics and its work principles; (2) the nature of statistics about society; (3) indicators; (4) statistical techniques and big ideas; (5) research methods and data sources; and (6) awareness and skills for citizens' access to statistical reports. Based on this ad hoc typology, we discuss directions that official statistics providers, in cooperation with statistics educators, could take in order to (1) advance the conceptualization of skills needed to understand official statistics, and (2) expand educational activities and services, specifically by developing a collaborative digital textbook and a modular online course, to improve public capacity for understanding official statistics.

  16. Statistics and Informatics in Space Astrophysics

    Science.gov (United States)

    Feigelson, E.

    2017-12-01

    The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, discovery of exoplanets, and classifying multiwavelength surveys is too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5 PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has significantly grown in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet the research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.

  17. Statistics techniques applied to electron probe microanalysis

    International Nuclear Information System (INIS)

    Brizuela, H.; Del Giorgio, M.; Budde, C.; Briozzo, C.; Riveros, J.

    1987-01-01

    A description of Montroll-West's general theory for a three-dimensional random walk of a particle with internal degrees of freedom is given, connecting this problem with the solution of the master equation. The possibility of its application to EPMA is discussed. Numerical solutions are given for thick or collimated beams at several energies interacting with samples of different shape and size. The spatial distribution of particles within the sample (for a stationary state) is analyzed, as well as the electron backscattering coefficient. (Author) [es]

  18. Statistically tuned Gaussian background subtraction technique for ...

    Indian Academy of Sciences (India)

    This work presents a statistically tuned Gaussian background subtraction technique, compared against the temporal median method and the mixture-of-Gaussians model, with performance evaluation on videos captured by an unmanned aerial vehicle (UAV). The output is obtained by simulation using MATLAB 2010 on a standalone PC ...
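
    Although the record is fragmentary, the underlying technique, a per-pixel Gaussian background model, is standard. A minimal numpy sketch (not the authors' MATLAB implementation): each pixel keeps a running mean and variance, pixels far from the model are flagged as foreground, and the model is updated only where the scene still looks like background.

```python
import numpy as np

class GaussianBackground:
    """Single-Gaussian-per-pixel background subtractor (illustrative)."""

    def __init__(self, first_frame, alpha=0.05, k=2.5):
        self.mean = first_frame.astype(float)
        self.var = np.full_like(self.mean, 25.0)  # initial variance guess
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        frame = frame.astype(float)
        d2 = (frame - self.mean) ** 2
        foreground = d2 > (self.k ** 2) * self.var
        bg = ~foreground  # update the model only on background pixels
        self.mean[bg] += self.alpha * (frame[bg] - self.mean[bg])
        self.var[bg] += self.alpha * (d2[bg] - self.var[bg])
        return foreground

rng = np.random.default_rng(0)
frames = rng.normal(100.0, 5.0, size=(10, 48, 64))  # synthetic grayscale video
frames[5:, 20:30, 30:40] += 60.0                     # a bright "moving object"
subtractor = GaussianBackground(frames[0])
masks = [subtractor.apply(f) for f in frames[1:]]
print("foreground pixels in the last frame:", int(masks[-1].sum()))
```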

  19. Traditional Chinese food technology and cuisine.

    Science.gov (United States)

    Li, Jian-rong; Hsieh, Yun-Hwa P

    2004-01-01

    From ancient wisdom to modern science and technology, Chinese cuisine has been established over the country's long history and has gained a global reputation for its sophistication. Traditional Chinese foods and cuisine that exhibit Chinese culture, art and reality play an essential role in Chinese people's everyday lives. Recently, traditional Chinese foods have drawn a great degree of attention from food scientists and technologists, the food industry, and health promotion institutions worldwide, due to the extensive values they offer beyond being merely another ethnic food. These traditional foods comprise a wide variety of products, such as pickled vegetables, salted fish and jellyfish, tofu and tofu-derived products, rice and rice snack foods, fermented sauces, fish balls and thousand-year-old eggs. An overview of selected popular traditional Chinese foods and their processing techniques is included in this paper. Further development of the traditional techniques for the formulation and production of these foods is expected to produce economic, social and health benefits.

  20. Acceleration techniques in the univariate Lipschitz global optimization

    Science.gov (United States)

    Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.; De Franco, Angela

    2016-10-01

    Univariate box-constrained Lipschitz global optimization problems are considered in this contribution. Geometric and information statistical approaches are presented. Novel, powerful local tuning and local improvement techniques are described, together with the traditional ways to estimate the Lipschitz constant. The advantages of the presented local tuning and local improvement techniques are demonstrated using the operational characteristics approach for comparing deterministic global optimization algorithms on a class of 100 widely used test functions.
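
    The geometric approach in its simplest form is the Piyavskii-Shubert method, which needs only an overestimate of the Lipschitz constant. A compact sketch (the local tuning and local improvement refinements of the paper are omitted): each interval between evaluated points carries a sawtooth lower bound, and the point minimizing that bound is evaluated next.

```python
import numpy as np

def piyavskii_shubert(f, a, b, lipschitz, n_evals=60):
    """Univariate Lipschitz global minimization (basic sawtooth method)."""
    xs, ys = [a, b], [f(a), f(b)]
    for _ in range(n_evals - 2):
        best = None
        for i in range(len(xs) - 1):
            x1, x2, y1, y2 = xs[i], xs[i + 1], ys[i], ys[i + 1]
            # minimizer and value of the lower envelope on [x1, x2]
            x_new = 0.5 * (x1 + x2) + (y1 - y2) / (2.0 * lipschitz)
            bound = 0.5 * (y1 + y2) - 0.5 * lipschitz * (x2 - x1)
            if best is None or bound < best[0]:
                best = (bound, x_new, i)
        _, x_new, i = best
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    j = int(np.argmin(ys))
    return xs[j], ys[j]

# A classic multiextremal test function on [2.7, 7.5]:
f = lambda x: np.sin(x) + np.sin(10.0 * x / 3.0)
print(piyavskii_shubert(f, 2.7, 7.5, lipschitz=4.5))  # global min near x = 5.15
```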

  1. Energy Statistics

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    For the years 1992 and 1993, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period. The tables and figures shown in this publication are: Changes in the volume of GNP and energy consumption; Coal consumption; Natural gas consumption; Peat consumption; Domestic oil deliveries; Import prices of oil; Price development of principal oil products; Fuel prices for power production; Total energy consumption by source; Electricity supply; Energy imports by country of origin in 1993; Energy exports by recipient country in 1993; Consumer prices of liquid fuels; Consumer prices of hard coal and natural gas, prices of indigenous fuels; Average electricity price by type of consumer; Price of district heating by type of consumer and Excise taxes and turnover taxes included in consumer prices of some energy sources

  2. Statistical mechanics of superconductivity

    CERN Document Server

    Kita, Takafumi

    2015-01-01

    This book provides a theoretical, step-by-step comprehensive explanation of superconductivity for undergraduate and graduate students who have completed elementary courses on thermodynamics and quantum mechanics. To this end, it adopts the unique approach of starting with the statistical mechanics of quantum ideal gases and successively adding and clarifying elements and techniques indispensable for understanding it. They include the spin-statistics theorem, second quantization, density matrices, the Bloch–De Dominicis theorem, the variational principle in statistical mechanics, attractive interaction, and bound states. Ample examples of their usage are also provided in terms of topics from advanced statistical mechanics such as two-particle correlations of quantum ideal gases, derivation of the Hartree–Fock equations, and Landau's Fermi-liquid theory, among others. With these preliminaries, the fundamental mean-field equations of superconductivity are derived with maximum mathematical clarity based on ...

  3. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research...

  4. Statistical utilitarianism

    OpenAIRE

    Pivato, Marcus

    2013-01-01

    We show that, in a sufficiently large population satisfying certain statistical regularities, it is often possible to accurately estimate the utilitarian social welfare function, even if we only have very noisy data about individual utility functions and interpersonal utility comparisons. In particular, we show that it is often possible to identify an optimal or close-to-optimal utilitarian social choice using voting rules such as the Borda rule, approval voting, relative utilitarianism, or a...

  5. Comparative analysis of profitability of honey production using traditional and box hives.

    Science.gov (United States)

    Al-Ghamdi, Ahmed A; Adgaba, Nuru; Herab, Ahmed H; Ansari, Mohammad J

    2017-07-01

    Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, comparative analysis of the profitability and productivity of box and traditional hives is not adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and employing a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum, respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our study results clearly showed the importance of adoption of box hives for better productivity of the beekeeping subsector.
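
    The Cobb-Douglas production function is linear in logarithms, so its input elasticities can be estimated by ordinary least squares on log-transformed data. A hedged sketch with simulated inputs (the survey data are not reproduced here; all numbers are invented):

```python
import numpy as np

# Cobb-Douglas: honey = A * feed**b1 * labor**b2 * meds**b3
# Taking logs gives a linear model: ln(honey) = ln A + b1*ln(feed) + ...
rng = np.random.default_rng(7)
n = 182  # number of beekeepers surveyed in the study
feed, labor, meds = (rng.lognormal(0.0, 0.4, n) for _ in range(3))
honey = 2.0 * feed**0.3 * labor**0.5 * meds**0.1 * rng.lognormal(0.0, 0.1, n)

X = np.column_stack([np.ones(n), np.log(feed), np.log(labor), np.log(meds)])
beta, *_ = np.linalg.lstsq(X, np.log(honey), rcond=None)
print("ln(A) and elasticities:", np.round(beta, 3))
```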

  6. Comparative analysis of profitability of honey production using traditional and box hives

    Directory of Open Access Journals (Sweden)

    Ahmed A. Al-Ghamdi

    2017-07-01

    Full Text Available Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, comparative analysis of the profitability and productivity of box and traditional hives is not adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and employing a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum, respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our study results clearly showed the importance of adoption of box hives for better productivity of the beekeeping subsector.

  7. Conformity and statistical tolerancing

    Science.gov (United States)

    Leblond, Laurent; Pillet, Maurice

    2018-02-01

    Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty for designers to anticipate the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition: an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the final characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacture to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing in the (decentring, dispersion) space.
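
    The contrast between arithmetic and statistical tolerancing is easiest to see on a linear stack of dimensions, where the quadratic rule replaces the sum of tolerances by the root of the sum of squares. A small illustrative calculation, assuming independent, centered contributions:

```python
import math

# +/- tolerances of five components in a linear stack-up, in mm (invented).
tols = [0.10, 0.05, 0.08, 0.12, 0.06]

worst_case = sum(tols)                           # arithmetic tolerancing
quadratic = math.sqrt(sum(t**2 for t in tols))   # statistical (quadratic) rule

print(f"worst case: +/- {worst_case:.3f} mm")
print(f"quadratic : +/- {quadratic:.3f} mm")
# The statistical stack is markedly tighter, which is why designers can widen
# individual tolerances -- at the price of a small, non-zero probability that
# an assembly of individually conforming parts exceeds the arithmetic interval.
```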

  8. Current application of chemometrics in traditional Chinese herbal medicine research.

    Science.gov (United States)

    Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke

    2016-07-15

    Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometric methods are useful tools in the quality control of TCHMs, harnessing mathematics, statistics and other methods to extract the maximum information from the data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. In this work, the application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced, and recent advances in chemometrics applied to the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. TRADITIONAL CHINESE HERBAL MEDICINE

    NARCIS (Netherlands)

    ZHU, YP; WOERDENBAG, HJ

    1995-01-01

    Herbal medicine, acupuncture and moxibustion, and massage are the three major constituent parts of traditional Chinese medicine. Although acupuncture is well known in many Western countries, Chinese herbal medicine, the most important part of traditional Chinese medicine, is less well known in the...

  10. Traditional timber frames

    NARCIS (Netherlands)

    Jorissen, A.J.M.; Hamer, den J.; Leijten, A.J.M.; Salenikovich, A.

    2014-01-01

    Due to new possibilities, traditional timber framing has become increasingly popular since the beginning of the 21st century. Although traditional timber framing has been used for centuries, the expected mechanical behaviour is not dealt with in great detail in building codes, guidelines or texts...

  11. Comparative study between two tonsillectomy techniques: Ultracision harmonic scalpel and traditional dissection with cold scalpel

    Directory of Open Access Journals (Sweden)

    Fernando A. Ramos

    2004-06-01

    ...tonsillectomies with good outcomes. AIM: To compare operative time, bleeding and the need for intraoperative hemostasis, postoperative pain, the healing aspect of the tonsillar fossa, and complications in patients submitted to tonsillectomy with cold and ultrasonic scalpels. STUDY DESIGN: Cross-sectional cohort. MATERIAL AND METHOD: Twenty-six patients underwent tonsillectomy: 13 using the traditional technique with cold instruments and 13 using the ultrasonic one. They were evaluated with a standard protocol. Postoperative pain was graded on a horizontal visual analogue scale. RESULTS: Surgical time was shorter with the ultrasonic technique than with the traditional cold instruments. The number of stitches in the tonsillar fossa was smaller than with the traditional technique. There was no statistically significant postoperative difference in pain or in the appearance of the tonsillar fossa. CONCLUSION: The ultrasonic scalpel is an excellent choice in surgeries where surgical time and intraoperative bleeding are important.

  12. Energy statistics

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    World data from the United Nation's latest Energy Statistics Yearbook, first published in our last issue, are completed here. The 1984-86 data were revised and 1987 data added for world commercial energy production and consumption, world natural gas plant liquids production, world LP-gas production, imports, exports, and consumption, world residual fuel oil production, imports, exports, and consumption, world lignite production, imports, exports, and consumption, world peat production and consumption, world electricity production, imports, exports, and consumption (Table 80), and world nuclear electric power production

  13. A computerized diagnostic system for nuclear plant control rooms based on statistical quality control

    International Nuclear Information System (INIS)

    Heising, C.D.; Grenzebach, W.S.

    1990-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps of the St. Lucie Unit 2 nuclear power plant located in Florida. A 30-day history of the four pumps prior to a plant shutdown caused by pump failure and a related fire within the containment was analyzed. Statistical quality control charts of recorded variables were constructed for each pump, which were shown to go out of statistical control many days before the plant trip. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators.
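
    A minimal sketch of the Shewhart X-bar limits behind such a diagnostic system, using the standard table constant A2 = 0.577 for subgroups of size five; the pump data are simulated, with a drift injected in the final days to mimic a developing fault.

```python
import numpy as np

A2 = 0.577  # X-bar chart constant for subgroup size n = 5

rng = np.random.default_rng(3)
# 30 days x 5 readings per day of a monitored pump variable (simulated).
data = rng.normal(loc=2.0, scale=0.1, size=(30, 5))
data[25:] += 0.25  # drift in the last days, as before a failure

# Control limits from an in-control baseline (the first 20 days).
base = data[:20]
center = base.mean()
r_bar = (base.max(axis=1) - base.min(axis=1)).mean()
ucl, lcl = center + A2 * r_bar, center - A2 * r_bar

xbar = data.mean(axis=1)
signals = np.where((xbar > ucl) | (xbar < lcl))[0]
print("days signalling on the X-bar chart:", signals)
```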

  14. Traditional medicine and genomics

    Directory of Open Access Journals (Sweden)

    Kalpana Joshi

    2010-01-01

    Full Text Available 'Omics' developments in the form of genomics, proteomics and metabolomics have increased the impetus of traditional medicine research. Studies exploring the genomic, proteomic and metabolomic basis of human constitutional types based on Ayurveda and other systems of oriental medicine are becoming popular. Such studies remain important to developing a better understanding of human variations and individual differences. Countries like India, Korea, China and Japan are investing in research on evidence-based traditional medicines and scientific validation of fundamental principles. This review provides an account of studies addressing relationships between traditional medicine and genomics.

  15. Traditional medicine and genomics.

    Science.gov (United States)

    Joshi, Kalpana; Ghodke, Yogita; Shintre, Pooja

    2010-01-01

    'Omics' developments in the form of genomics, proteomics and metabolomics have increased the impetus of traditional medicine research. Studies exploring the genomic, proteomic and metabolomic basis of human constitutional types based on Ayurveda and other systems of oriental medicine are becoming popular. Such studies remain important to developing better understanding of human variations and individual differences. Countries like India, Korea, China and Japan are investing in research on evidence-based traditional medicines and scientific validation of fundamental principles. This review provides an account of studies addressing relationships between traditional medicine and genomics.

  16. Traditional, complementary, and alternative medicine: Focusing on research into traditional Tibetan medicine in China.

    Science.gov (United States)

    Song, Peipei; Xia, Jufeng; Rezeng, Caidan; Tong, Li; Tang, Wei

    2016-07-19

    As a form of traditional, complementary, and alternative medicine (TCAM), traditional Tibetan medicine has developed into a mainstay of medical care in Tibet and has spread from there to China and then to the rest of the world. Thus far, research on traditional Tibetan medicine has focused on the study of the plant and animal sources of traditional medicines, study of the histology of those plants and animals, chemical analysis of traditional medicines, pharmacological study of those medicines, and evaluation of the clinical efficacy of those medicines. A number of papers on traditional Tibetan medicines have been published, providing some evidence of the efficacy of traditional Tibetan medicine. However, many traditional Tibetan medicines have unknown active ingredients, hampering the establishment of drug quality standards, the development of new medicines, commercial production of medicines, and market availability of those medicines. Traditional Tibetan medicine must take several steps to modernize and spread to the rest of the world: the pharmacodynamics of traditional Tibetan medicines need to be determined, the clinical efficacy of those medicines needs to be verified, criteria to evaluate the efficacy of those medicines need to be established in order to guide their clinical use, and efficacious medicines need to be acknowledged by the pharmaceutical market. The components of traditional Tibetan medicine should be studied, traditional Tibetan medicines should be screened for their active ingredients, and techniques should be devised to prepare and manufacture those medicines.

  17. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence-based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence-based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi-square, co...

  18. Initiating statistical maintenance optimization

    International Nuclear Information System (INIS)

    Doyle, E. Kevin; Tuomi, Vesa; Rowley, Ian

    2007-01-01

    Since the 1980s, maintenance optimization has been centered around various formulations of Reliability Centered Maintenance (RCM). Several such optimization techniques have been implemented at the Bruce Nuclear Station. Further cost refinement of the station's preventive maintenance strategy includes evaluation of statistical optimization techniques. A review of successful pilot efforts in this direction is provided, as well as initial work with graphical analysis. The present situation regarding data sourcing, the principal impediment to the use of stochastic methods in previous years, is discussed. The use of Crow/AMSAA (Army Materiel Systems Analysis Activity) plots is demonstrated from the point of view of justifying expenditures in optimization efforts. (author)
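
    Crow/AMSAA plots rest on the power-law process N(t) = lambda * t**beta for expected cumulative failures, so cumulative failures against cumulative time should be linear on log-log axes, with slope beta < 1 indicating reliability growth. A hedged sketch with invented failure times (the regression shown here is the graphical plotting method; maximum likelihood estimators exist as well):

```python
import numpy as np

# Cumulative operating times at which failures occurred, in hours (invented).
fail_times = np.array([80.0, 260.0, 510.0, 900.0, 1400.0, 2100.0, 3000.0, 4100.0])
n_cum = np.arange(1, len(fail_times) + 1)

# Crow/AMSAA: N(t) = lam * t**beta  =>  log N = log lam + beta * log t
beta, log_lam = np.polyfit(np.log(fail_times), np.log(n_cum), 1)
print(f"beta = {beta:.2f} (< 1 suggests reliability growth), "
      f"lambda = {np.exp(log_lam):.4f}")
```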

  19. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  20. Inverting an Introductory Statistics Classroom

    Science.gov (United States)

    Kraut, Gertrud L.

    2015-01-01

    The inverted classroom allows more in-class time for inquiry-based learning and for working through more advanced problem-solving activities than does the traditional lecture class. The skills acquired in this learning environment offer benefits far beyond the statistics classroom. This paper discusses four ways that can make the inverted…

  1. Statistics for Finance

    DEFF Research Database (Denmark)

    Lindström, Erik; Madsen, Henrik; Nielsen, Jan Nygaard

    Statistics for Finance develops students' professional skills in statistics with applications in finance. Developed from the authors' courses at the Technical University of Denmark and Lund University, the text bridges the gap between classical, rigorous treatments of financial mathematics that rarely connect concepts to data and books on econometrics and time series analysis that do not cover specific problems related to option valuation. The book discusses applications of financial derivatives pertaining to risk assessment and elimination. The authors cover various statistical and mathematical techniques, including linear and nonlinear time series analysis, stochastic calculus models, stochastic differential equations, Itō's formula, the Black–Scholes model, the generalized method-of-moments, and the Kalman filter. They explain how these tools are used to price financial derivatives...
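
    Of the tools listed, the Black-Scholes formula is the most self-contained to illustrate. A minimal sketch of the standard closed-form European call price (the generic formula, not the book's code):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """European call price under the Black-Scholes model."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

# Spot 100, strike 100, one year, 5% rate, 20% volatility:
print(f"{black_scholes_call(100.0, 100.0, 1.0, 0.05, 0.2):.4f}")  # about 10.45
```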

  2. National Statistical Commission and Indian Official Statistics*

    Indian Academy of Sciences (India)

    IAS Admin

    ... a good collection of official statistics of that time. With more ... statistical agencies and institutions to provide details of statistical activities ... several training programmes ... completion of Indian Statistical Service examinations ...

  3. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author covers tests of the assumptions of randomness and normality, and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal...

  4. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
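
    The matrix formulation of linear least squares mentioned here, together with the parameter variance-covariance matrix that drives both error propagation and experiment design, fits in a few lines. A generic straight-line fit with invented numbers:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])       # design matrix [1, x]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # solves the normal equations

# Residual variance and the parameter variance-covariance matrix:
dof = len(y) - X.shape[1]
s2 = np.sum((y - X @ beta) ** 2) / dof
cov = s2 * np.linalg.inv(X.T @ X)

print("intercept, slope :", np.round(beta, 3))
print("standard errors  :", np.round(np.sqrt(np.diag(cov)), 3))
```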

  5. Statistical Thermodynamics and Microscale Thermophysics

    Science.gov (United States)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  6. KASTAMONU TRADITIONAL WOMEN CLOTHES

    Directory of Open Access Journals (Sweden)

    E.Elhan ÖZUS

    2015-08-01

    Full Text Available Clothing is a unique dressing style of a community, a period, or a profession. Clothing follows principles of social status and differentiation rather than fashion. In this context, each society has created a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other, and indicating their cultural and social classes, is clothing style. As is known, traditional Turkish clothes reflecting the characteristics of Turkish society are our most beautiful heritage from past to present. From this heritage, several examples of women's clothes have been carried to the present. When these examples are examined, it is possible to see the taste, the understanding of art, the joy and the lifestyle of their time. These garments are also documents outlining the taste and grace of the Turkish people. The present study investigates traditional Kastamonu women's clothing, which has an important place among the traditional cultural clothes of Anatolia. The method of the present research is primarily the examination of written sources. The study is completed with observations and examinations made in Kastamonu. According to the findings of the study, traditional Kastamonu women's clothing is examined and adapted to today's clothing.

  7. Intuitive introductory statistics

    CERN Document Server

    Wolfe, Douglas A

    2017-01-01

    This textbook is designed to give an engaging introduction to statistics and the art of data analysis. The unique scope includes, but also goes beyond, classical methodology associated with the normal distribution. What if the normal model is not valid for a particular data set? This cutting-edge approach provides the alternatives. It is an introduction to the world and possibilities of statistics that uses exercises, computer analyses, and simulations throughout the core lessons. These elementary statistical methods are intuitive. Counting and ranking features prominently in the text. Nonparametric methods, for instance, are often based on counts and ranks and are very easy to integrate into an introductory course. The ease of computation with advanced calculators and statistical software, both of which factor into this text, allows important techniques to be introduced earlier in the study of statistics. This book's novel scope also includes measuring symmetry with Walsh averages, finding a nonp...

  8. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    Science.gov (United States)

    Hansen, J V; Nelson, R D

    1997-01-01

    Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.

  9. Traditional Chinese Biotechnology

    Science.gov (United States)

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed.

  10. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    Directory of Open Access Journals (Sweden)

    Adrion Christine

    2012-09-01

    Full Text Available Abstract Background: A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods: We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g., the logarithmic score). Results: The instruments under study...

  11. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: an example from a vertigo phase III study with longitudinal count data as primary endpoint.

    Science.gov (United States)

    Adrion, Christine; Mansmann, Ulrich

    2012-09-10

    A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions
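
    INLA itself lives in the R ecosystem, but two of the assessment tools named, the probability integral transform and the logarithmic score, are generic and can be sketched from posterior predictive samples. An illustrative, language-neutral version (not the authors' workflow; for count data a randomized PIT would normally be used):

```python
import numpy as np

def pit_values(pred_samples, y_obs):
    """PIT: predictive CDF evaluated at each observation.

    pred_samples has shape (n_draws, n_obs); for a calibrated model the
    returned values look approximately uniform on [0, 1].
    """
    return (pred_samples <= y_obs).mean(axis=0)

def mean_log_score(pred_samples, y_obs):
    """Mean negative log predictive density, estimated from the draws."""
    h = pred_samples.std(axis=0) * 0.2  # crude per-observation window
    dens = (np.abs(pred_samples - y_obs) < h).mean(axis=0) / (2.0 * h)
    return -np.log(np.clip(dens, 1e-12, None)).mean()

rng = np.random.default_rng(11)
y = rng.poisson(5.0, size=50).astype(float)              # "observed" counts
draws = rng.poisson(5.0, size=(2000, 50)).astype(float)  # predictive draws
print("first PIT values:", np.round(pit_values(draws, y)[:5], 2))
print("mean log score  :", round(mean_log_score(draws, y), 3))
```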

  12. Engaging with the Art & Science of Statistics

    Science.gov (United States)

    Peters, Susan A.

    2010-01-01

    How can statistics clearly be mathematical and yet distinct from mathematics? The answer lies in the reality that statistics is both an art and a science, and both aspects are important for teaching and learning statistics. Statistics is a mathematical science in that it applies mathematical theories and techniques. Mathematics provides the…

  13. Healthier Traditional Food

    OpenAIRE

    Edward F. Millen

    2017-01-01

    The study of traditional food and healthy eating habits has been one of the fast-growing areas. All humans, both men and women, require food for their survival. However, both men and women indulge in food as if it were their sole purpose of existence. Hence, eating disorders are common among men and women. The media has played an effective role not only in establishing faulty standards for traditional healthy food but also in highlighting the importance of healthy eating. It has brought t...

  14. A note on the statistical analysis of point judgment matrices

    Directory of Open Access Journals (Sweden)

    MG Kabera

    2013-06-01

    Full Text Available The Analytic Hierarchy Process is a multicriteria decision making technique developed by Saaty in the 1970s. The core of the approach is the pairwise comparison of objects according to a single criterion using a 9-point ratio scale and the estimation of weights associated with these objects based on the resultant judgment matrix. In the present paper some statistical approaches to extracting the weights of objects from a judgment matrix are reviewed and new ideas which are rooted in the traditional method of paired comparisons are introduced.
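
    To make the weight-extraction step concrete, here is a small Python sketch on a hypothetical 3x3 judgment matrix using Saaty's 1-9 scale; it compares the classical principal-eigenvector weights with row geometric-mean weights (a common statistically motivated alternative) and reports Saaty's consistency index. The matrix entries are invented for illustration.

```python
import numpy as np

# Hypothetical reciprocal judgment matrix on Saaty's 1-9 ratio scale:
# A[i, j] is how strongly object i is preferred to j (A[j, i] = 1/A[i, j]).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
n = A.shape[0]

# Saaty's method: weights are the normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w_eig = np.abs(eigvecs[:, k].real)
w_eig /= w_eig.sum()

# Statistical alternative: normalized row geometric means.
g = A.prod(axis=1) ** (1.0 / n)
w_gm = g / g.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); near 0 is consistent.
ci = (eigvals.real.max() - n) / (n - 1)

print("eigenvector weights   :", np.round(w_eig, 3))
print("geometric-mean weights:", np.round(w_gm, 3))
print("consistency index CI  :", round(ci, 4))
```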

  15. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)

  16. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
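
    For readers unfamiliar with the charts named in both records, the following Python sketch computes Shewhart X-bar and R control limits on hypothetical subgrouped measurements; the data, subgroup size, and parameter name are assumptions, and the constants are the standard Shewhart factors for subgroups of five.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monitoring parameter (e.g. a pump reading) sampled in
# 25 rational subgroups of size 5.
data = rng.normal(loc=10.0, scale=0.2, size=(25, 5))

xbar = data.mean(axis=1)                      # subgroup means
ranges = data.max(axis=1) - data.min(axis=1)  # subgroup ranges
grand_mean, rbar = xbar.mean(), ranges.mean()

# Standard Shewhart chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

ucl_x, lcl_x = grand_mean + A2 * rbar, grand_mean - A2 * rbar
ucl_r, lcl_r = D4 * rbar, D3 * rbar

out_x = np.flatnonzero((xbar > ucl_x) | (xbar < lcl_x))
out_r = np.flatnonzero((ranges > ucl_r) | (ranges < lcl_r))
print(f"X-bar limits [{lcl_x:.3f}, {ucl_x:.3f}]; out-of-control subgroups: {out_x}")
print(f"R limits     [{lcl_r:.3f}, {ucl_r:.3f}]; out-of-control subgroups: {out_r}")
```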

  17. A Comparison of Collaborative and Traditional Instruction in Higher Education

    Science.gov (United States)

    Gubera, Chip; Aruguete, Mara S.

    2013-01-01

    Although collaborative instructional techniques have become popular in college courses, it is unclear whether collaborative techniques can replace more traditional instructional methods. We examined the efficacy of collaborative courses (in-class, collaborative activities with no lectures) compared to traditional lecture courses (in-class,…

  18. Noodles, traditionally and today

    Directory of Open Access Journals (Sweden)

    Na Zhang

    2016-09-01

    Full Text Available Chinese noodles, which have more than 4,000 years of history, originated in the Han dynasty. There are many stories about the origin of noodles. To a certain extent, noodles also reflect the cultural traditions and customs of China, which essentially means “human nature” and “worldly common sense”. There are thousands of varieties of noodles in China, classified according to the shape of the noodles, the seasoning gravy, the cooking craft, and so on. Many noodles have local characteristics. Noodles are accepted by people from all over the world. The industrial revolution and the development of the food industry realized the transition from a traditional handicraft industry to mass production using machinery. In addition, the invention of instant noodles and their mass production also greatly changed the noodle industry. In essence, noodles are a kind of cereal food, which is the main body of the traditional Chinese diet. They are the main source of energy for Chinese people and the most economical energy food. Adhering to the principle of “making cereal food the main food” is to maintain our good Chinese diet tradition, which can avoid the disadvantages of a high energy, high fat, and low carbohydrate diet, and promote health. The importance of noodles in the dietary structure of residents of our country, and their impact on health, should not be ignored.

  19. Traditional Cherokee Food.

    Science.gov (United States)

    Hendrix, Janey B.

    A collection for children and teachers of traditional Cherokee recipes emphasizes the art, rather than the science, of cooking. The hand-printed, illustrated format is designed to communicate the feeling of Cherokee history and culture and to encourage readers to collect and add family recipes. The cookbook could be used as a starting point for…

  20. Modern vs. Traditional.

    Science.gov (United States)

    Zhenhui, Rao

    1999-01-01

    This article discusses traditional methods, such as the grammar-translation, and modern methods, the communicative approach, for teaching English-as-a-foreign-language in China. The relationship between linguistic accuracy and communicative competence, student-centered orientation, and the role of the teacher are highlighted. (Author/VWL)

  1. Non-Traditional Wraps

    Science.gov (United States)

    Owens, Buffy

    2009-01-01

    This article presents a recipe for non-traditional wraps. The author describes how adults and children can help with the recipe and the skills involved. The bigger the role children play in making the item, the more apt they are to try new things and appreciate the texture and taste.

  2. Making Tradition Healthy

    Centers for Disease Control (CDC) Podcasts

    2007-11-01

    In this podcast, a Latina nutrition educator shows how a community worked with local farmers to grow produce traditionally enjoyed by Hispanic/Latinos.  Created: 11/1/2007 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 11/10/2007.

  3. A statistical manual for chemists

    CERN Document Server

    Bauer, Edward

    1971-01-01

    A Statistical Manual for Chemists, Second Edition presents simple and fast statistical tools for data analysis of working chemists. This edition is organized into nine chapters and begins with an overview of the fundamental principles of the statistical techniques used in experimental data analysis. The subsequent chapters deal with the concept of statistical average, experimental design, and analysis of variance. The discussion then shifts to control charts, with particular emphasis on variable charts that are more useful to chemists and chemical engineers. A chapter focuses on the effect

  4. Statistical modeling for degradation data

    CERN Document Server

    Lio, Yuhlong; Ng, Hon; Tsai, Tzong-Ru

    2017-01-01

    This book focuses on the statistical aspects of the analysis of degradation data. In recent years, degradation data analysis has come to play an increasingly important role in different disciplines such as reliability, public health sciences, and finance. For example, information on products’ reliability can be obtained by analyzing degradation data. In addition, statistical modeling and inference techniques have been developed on the basis of different degradation measures. The book brings together experts engaged in statistical modeling and inference, presenting and discussing important recent advances in degradation data analysis and related applications. The topics covered are timely and have considerable potential to impact both statistics and reliability engineering.

  5. Statistical methods for ranking data

    CERN Document Server

    Alvo, Mayer

    2014-01-01

    This book introduces advanced undergraduate, graduate students and practitioners to statistical methods for ranking data. An important aspect of nonparametric statistics is oriented towards the use of ranking data. Rank correlation is defined through the notion of distance functions and the notion of compatibility is introduced to deal with incomplete data. Ranking data are also modeled using a variety of modern tools such as CART, MCMC, EM algorithm and factor analysis. This book deals with statistical methods used for analyzing such data and provides a novel and unifying approach for hypotheses testing. The techniques described in the book are illustrated with examples and the statistical software is provided on the authors’ website.

  6. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces. Introduction: What Is Computational Statistics?; An Overview of the Book. Probability Concepts: Introduction; Probability; Conditional Probability and Independence; Expectation; Common Distributions. Sampling Concepts: Introduction; Sampling Terminology and Concepts; Sampling Distributions; Parameter Estimation; Empirical Distribution Function. Generating Random Variables: Introduction; General Techniques for Generating Random Variables; Generating Continuous Random Variables; Generating Discrete Random Variables. Exploratory Data Analysis: Introduction; Exploring Univariate Data; Exploring Bivariate and Trivariate Data; Exploring Multidimensional Data. Finding Structure: Introduction; Projecting Data; Principal Component Analysis; Projection Pursuit EDA; Independent Component Analysis; Grand Tour; Nonlinear Dimensionality Reduction. Monte Carlo Methods for Inferential Statistics: Introduction; Classical Inferential Statistics; Monte Carlo Methods for Inferential Statist...

  7. Challenging tradition in Nigeria.

    Science.gov (United States)

    Supriya, K E

    1991-01-01

    In Nigeria since 1987, the National Association of Nigeria Nurses and Midwives (NANNM) has used traditional media and traditional health care workers to curtail the practice of female circumcision. Other harmful traditions are being changed also, such as early marriage, taboos of pregnancy and childbirth, and scarification. 30,000 members of NANNM are involved in this effort to halt the harmful practices themselves and to change community opinion. The program involved national and state level workshops on harmful health consequences of traditional practices and instruction on how to conduct focus group discussions to assess women's beliefs and practices. The focus groups were found to be a particularly successful method of opening up discussion of taboo topics and expressing deep emotions. The response to the knowledge that circumcision was not necessary was rage and anger, which was channeled into advocacy roles for change in the practice. The result was the development of books, leaflets and videos. One community group designed a dress with a decorative motif of tattoos and bodily cuts to symbolize circumcision and scarring. Plays and songs were written and performed. Artists provided models of female genitalia both before and after circumcision. The campaign has been successful in bringing this issue to public attention in prominent ways, such as national television, health talk shows, and women's magazines. One of the most important results of the effort has been the demonstration that culture and tradition can be changed from within, rather than from outside imposition of values and beliefs.

  8. Statistics for lawyers

    CERN Document Server

    Finkelstein, Michael O

    2015-01-01

    This classic text, first published in 1990, is designed to introduce law students, law teachers, practitioners, and judges to the basic ideas of mathematical probability and statistics as they have been applied in the law. The third edition includes over twenty new sections, including the addition of timely topics, like New York City police stops, exonerations in death-sentence cases, projecting airline costs, and new material on various statistical techniques such as the randomized response survey technique, rare-events meta-analysis, competing risks, and negative binomial regression. The book consists of sections of exposition followed by real-world cases and case studies in which statistical data have played a role. The reader is asked to apply the theory to the facts, to calculate results (a hand calculator is sufficient), and to explore legal issues raised by quantitative findings. The authors' calculations and comments are given in the back of the book. As with previous editions, the cases and case stu...

  9. Statistics in biomedical research

    Directory of Open Access Journals (Sweden)

    González-Manteiga, Wenceslao

    2007-06-01

    Full Text Available The discipline of biostatistics is nowadays a fundamental scientific component of biomedical, public health and health services research. Traditional and emerging areas of application include clinical trials research, observational studies, physiology, imaging, and genomics. The present article reviews the current situation of biostatistics, considering the statistical methods traditionally used in biomedical research, as well as the ongoing development of new methods in response to the new problems arising in medicine. Clearly, the successful application of statistics in biomedical research requires appropriate training of biostatisticians. This training should aim to give due consideration to emerging new areas of statistics, while at the same time retaining full coverage of the fundamentals of statistical theory and methodology. In addition, it is important that students of biostatistics receive formal training in relevant biomedical disciplines, such as epidemiology, clinical trials, molecular biology, genetics, and neuroscience.

  10. Why Tsallis statistics?

    Science.gov (United States)

    Baranger, Michel

    2002-03-01

    It is a remarkable fact that the traditional teaching of thermodynamics, as reflected in the textbooks and including the long developments about ensembles and thermodynamic functions, is almost entirely about systems in equilibrium. The time variable does not enter. There is one exception, however. The single most important item, the flagship of the thermodynamic navy, the second law, is about the irreversibility of the time evolution of systems out of equilibrium. This is a bizarre situation, to say the least; a glaring case of the drunk man looking for his key under the lamp-post, when he knows that he lost it in the dark part of the street. The moment has come for us to go looking in the dark part, the behavior of systems as a function of time. We have been given a powerful new flashlight, chaos theory. We should use it. There, on the formerly dark pavement, we can find Tsallis statistics.

  11. Identification of heavy metal sources in the Mexico City atmosphere, using the proton-induced X-ray analytical technique and multifactorial statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez M, B [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)

    1997-07-01

    The objectives of this work are: to identify the heavy metals present in the air and their concentrations; to characterize the behavior of these polluting chemical elements over an annual cycle, corresponding to 1990, based on the concentrations obtained through the PIXE technique; to identify suitable statistical methods to apply to the data on metal concentrations in the form of total suspended particles (PST) found in this investigation; and to relate the concentrations to the meteorological parameters considered, in order to suggest possible pollution sources. Based on the results obtained, the work is intended to serve as a basis for the decision making and control measures planned by the various institutions concerned with the problem of atmospheric pollution in the Metropolitan Area of Mexico City (ZMCM). (Author)
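
    The record does not specify which multifactorial techniques were used; principal component analysis is a common choice in such source-identification studies, so the Python sketch below, run on invented standardized concentration data, is offered only as a hedged illustration of the general idea.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical daily metal concentrations in suspended particles
# (rows = days, columns = elements such as Pb, Fe, Zn, Cu, V),
# standardized column-wise before analysis.
X = rng.normal(size=(120, 5))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via the correlation matrix; in receptor-model studies the leading
# components are interpreted as candidate sources (traffic, soil, industry).
C = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]

explained = eigvals[order] / eigvals.sum()
print("variance explained per component:", np.round(explained, 2))
print("loadings of the first component :", np.round(eigvecs[:, order[0]], 2))
```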

  12. Modernism and tradition and the traditions of modernism

    Directory of Open Access Journals (Sweden)

    Kros Džonatan

    2006-01-01

    Full Text Available Conventionally, the story of musical modernism has been told in terms of a catastrophic break with the (tonal) past and the search for entirely new techniques and modes of expression suitable to a new age. The resulting notion of a single, linear, modernist mainstream (predicated on the basis of a Schoenbergian model of musical progress) has served to conceal a more subtle relationship between past and present. Increasingly, it is being recognized that there exist many modernisms, and their various identities are forged from a continual renegotiation between past and present, between tradition(s) and the avant-garde. This is especially relevant when attempting to discuss the reception of modernism outside central Europe, where the adoption of (Germanic) avant-garde attitudes was often interpreted as being "unpatriotic". The case of Great Britain is examined in detail: Harrison Birtwistle’s opera The Mask of Orpheus (1973–83) forms the focus for a wider discussion of modernism within the context of late/post-modern thought.

  13. Statistical core design

    International Nuclear Information System (INIS)

    Oelkers, E.; Heller, A.S.; Farnsworth, D.A.; Kearfott, K.J.

    1978-01-01

    The report describes the statistical analysis of DNBR thermal-hydraulic margin of a 3800 MWt, 205-FA core under design overpower conditions. The analysis used LYNX-generated data at predetermined values of the input variables whose uncertainties were to be statistically combined. LYNX data were used to construct an efficient response surface model in the region of interest; the statistical analysis was accomplished through the evaluation of core reliability, utilizing propagation of the uncertainty distributions of the inputs. The response surface model was implemented in both the analytical error propagation and Monte Carlo techniques. The basic structural units relating to the acceptance criteria are fuel pins. Therefore, the statistical population of pins with minimum DNBR values smaller than specified values is determined. The specified values are designated relative to the most probable and maximum design DNBR values on the power-limiting pin used in present design analysis, so that gains over the present design criteria could be assessed for specified probabilistic acceptance criteria. The results are equivalent to gains ranging from 1.2 to 4.8 percent of rated power, dependent on the acceptance criterion. The corresponding acceptance criteria range from 95 percent confidence that no pin will be in DNB to 99.9 percent of the pins being expected to avoid DNB.
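
    As a hedged illustration of the Monte Carlo side of such an analysis (the response-surface coefficients, input uncertainties, and DNBR limit below are invented, not the report's):

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented quadratic response surface for minimum DNBR in terms of two
# standardized inputs (a stand-in for the LYNX-fitted surface).
def dnbr_surface(z1, z2):
    return 1.30 + 0.08 * z1 - 0.05 * z2 - 0.02 * z1 * z2 - 0.01 * z1**2

# Propagate the input uncertainty distributions through the surface.
n = 100_000
z1 = rng.normal(size=n)   # e.g. standardized inlet-temperature uncertainty
z2 = rng.normal(size=n)   # e.g. standardized flow-rate uncertainty
dnbr = dnbr_surface(z1, z2)

limit = 1.17  # illustrative design limit, not the report's value
print("P(min DNBR > limit) =", np.mean(dnbr > limit))
print("5th-percentile DNBR :", np.quantile(dnbr, 0.05))
```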

  14. Enhancing the Lecture: Revitalizing the Traditional Format.

    Science.gov (United States)

    Bonwell, Charles C.

    1996-01-01

    The traditional lecture format of college courses can be enhanced by including active learning designed to further course goals of learning knowledge, developing skills, or fostering attitudes. Techniques suggested include using pauses, short writing periods, think-pair-share activities, formative quizzes, lecture summaries, and several assessment…

  15. Written mathematical traditions in Ancient Mesopotamia

    DEFF Research Database (Denmark)

    Høyrup, Jens

    2015-01-01

    Writing, as well as various mathematical techniques, were created in proto-literate Uruk in order to serve accounting, and Mesopotamian mathematics as we know it was always expressed in writing. To that extent, mathematics, generically regarded, was always part of the generic written tradition...

  16. Factors influencing awareness and attendance of traditional oral ...

    African Journals Online (AJOL)

    Data were recorded using SPSS version 16 software. ... Conclusion: The study showed moderate awareness of traditional oral care. ... Descriptive and inferential statistics were used as ...

  17. The composite sequential clustering technique for analysis of multispectral scanner data

    Science.gov (United States)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
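
    A rough Python sketch of the two-part idea follows; the stage-1 pass below is a simplified stand-in for the paper's sequential variance analysis, and the data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic stand-in for multispectral pixels: 300 samples x 4 bands.
X = np.vstack([rng.normal(m, 0.3, size=(100, 4)) for m in (0.0, 2.0, 4.0)])

def sequential_pass(X, threshold=2.0):
    """Stage 1: one sequential sweep that keeps running cluster means and
    opens a new cluster when a sample is far from every existing mean."""
    centers, counts = [X[0].copy()], [1]
    for x in X[1:]:
        d = [np.linalg.norm(x - c) for c in centers]
        j = int(np.argmin(d))
        if d[j] < threshold:
            counts[j] += 1
            centers[j] += (x - centers[j]) / counts[j]  # update running mean
        else:
            centers.append(x.copy())
            counts.append(1)
    return np.array(centers)

init_centers = sequential_pass(X)

# Stage 2: generalized K-means refinement seeded with the stage-1 clusters.
km = KMeans(n_clusters=len(init_centers), init=init_centers, n_init=1).fit(X)
print("stage-1 cluster count:", len(init_centers))
print("refined centers:\n", np.round(km.cluster_centers_, 2))
```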

  18. On quantum statistical inference

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Gill, Richard D.; Jupp, Peter E.

    Recent developments in the mathematical foundations of quantum mechanics have brought the theory closer to that of classical probability and statistics. On the other hand, the unique character of quantum physics sets many of the questions addressed apart from those met classically in stochastics. ... Furthermore, concurrent advances in experimental techniques and in the theory of quantum computation have led to a strong interest in questions of quantum information, in particular in the sense of the amount of information about unknown parameters in given observational data or accessible through various...

  19. Application of Statistics in Engineering Technology Programs

    Science.gov (United States)

    Zhan, Wei; Fink, Rainer; Fang, Alex

    2010-01-01

    Statistics is a critical tool for robustness analysis, measurement system error analysis, test data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not extensively used in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry…

  20. Neymar, defender of brazilian tradition

    Directory of Open Access Journals (Sweden)

    Francisca Islandia Cardoso da Silva

    2016-09-01

    Full Text Available The purpose of this article is to analyze how university students of Teresina-PI appropriate the message of a report on the television show Esporte Espetacular. The technique of focus groups and an analytical-descriptive method were used for collecting and analyzing data. The sample consisted of 24 university students, aged between 18 and 24 years. The report features Neymar as responsible for carrying on the "tradition" of Brazilians and for being crowned as the best player in the world. The subjects of the research said that the discourse conveyed by the report can reproduce and create a sometimes dreamlike reality, because it aims to confer on Neymar great importance with regard to national identity.

  1. An introduction to statistical thermodynamics

    CERN Document Server

    Hill, Terrell L

    1987-01-01

    ""A large number of exercises of a broad range of difficulty make this book even more useful…a good addition to the literature on thermodynamics at the undergraduate level."" - Philosophical MagazineAlthough written on an introductory level, this wide-ranging text provides extensive coverage of topics of current interest in equilibrium statistical mechanics. Indeed, certain traditional topics are given somewhat condensed treatment to allow room for a survey of more recent advances.The book is divided into four major sections. Part I deals with the principles of quantum statistical mechanics a

  2. Traditional preventive treatment options

    DEFF Research Database (Denmark)

    Longbottom, C; Ekstrand, K; Zero, D

    2009-01-01

    Preventive treatment options can be divided into primary, secondary and tertiary prevention techniques, which can involve patient- or professionally applied methods. These include: oral hygiene (instruction), pit and fissure sealants ('temporary' or 'permanent'), fluoride applications (patient... options.

  3. Lies, damn lies and statistics

    International Nuclear Information System (INIS)

    Jones, M.D.

    2001-01-01

    Statistics are widely employed within archaeological research. This is becoming increasingly so as user friendly statistical packages make increasingly sophisticated analyses available to non statisticians. However, all statistical techniques are based on underlying assumptions of which the end user may be unaware. If statistical analyses are applied in ignorance of the underlying assumptions there is the potential for highly erroneous inferences to be drawn. This does happen within archaeology and here this is illustrated with the example of 'date pooling', a technique that has been widely misused in archaeological research. This misuse may have given rise to an inevitable and predictable misinterpretation of New Zealand's archaeological record. (author). 10 refs., 6 figs., 1 tab
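
    For concreteness, a standard way to check the assumption behind pooling (often credited to Ward and Wilson) is a chi-squared test on the inverse-variance pooled mean; the Python sketch below uses invented radiocarbon determinations.

```python
import numpy as np
from scipy import stats

# Invented radiocarbon determinations (years BP) with 1-sigma errors.
dates = np.array([920.0, 945.0, 890.0, 1010.0])
sigma = np.array([40.0, 35.0, 50.0, 30.0])

# Inverse-variance weighted pooled mean.
w = 1.0 / sigma**2
pooled = np.sum(w * dates) / np.sum(w)

# If all determinations estimate the same true date, the statistic
# T is approximately chi-squared with n - 1 degrees of freedom.
T = np.sum(((dates - pooled) / sigma) ** 2)
p_value = stats.chi2.sf(T, df=dates.size - 1)

print(f"pooled date {pooled:.1f} BP, T = {T:.2f}, p = {p_value:.3f}")
print("pooling defensible" if p_value > 0.05 else "do NOT pool these dates")
```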

  4. A conceptual guide to statistics using SPSS

    CERN Document Server

    Berkman, Elliot T

    2011-01-01

    Bridging an understanding of Statistics and SPSS. This unique text helps students develop a conceptual understanding of a variety of statistical tests by linking the ideas learned in a statistics class from a traditional statistics textbook with the computational steps and output from SPSS. Each chapter begins with a student-friendly explanation of the concept behind each statistical test and how the test relates to that concept. The authors then walk through the steps to compute the test in SPSS and the output, clearly linking how the SPSS procedure and output connect back to the conceptual u

  5. Childhood Cancer Statistics

    Science.gov (United States)

    Childhood Cancer Statistics – Graphs and Infographics: Number of Diagnoses, Incidence Rates ...

  6. Official statistics and Big Data

    Directory of Open Access Journals (Sweden)

    Peter Struijs

    2014-07-01

    Full Text Available The rise of Big Data changes the context in which organisations producing official statistics operate. Big Data provides opportunities, but in order to make optimal use of Big Data, a number of challenges have to be addressed. This stimulates increased collaboration between National Statistical Institutes, Big Data holders, businesses and universities. In time, this may lead to a shift in the role of statistical institutes in the provision of high-quality and impartial statistical information to society. In this paper, the changes in context, the opportunities, the challenges and the way to collaborate are addressed. The collaboration between the various stakeholders will involve each partner building on and contributing different strengths. For national statistical offices, traditional strengths include, on the one hand, the ability to collect data and combine data sources with statistical products and, on the other hand, their focus on quality, transparency and sound methodology. In the Big Data era of competing and multiplying data sources, they continue to have a unique knowledge of official statistical production methods. And their impartiality and respect for privacy as enshrined in law uniquely position them as a trusted third party. Based on this, they may advise on the quality and validity of information of various sources. By thus positioning themselves, they will be able to play their role as key information providers in a changing society.

  7. Traditional sorghum beer "ikigage"

    OpenAIRE

    Lyumugabe Loshima, François

    2010-01-01

    Samples of traditional sorghum beer Ikigage were collected in the southern province of Rwanda and analyzed for microbiological and physico-chemical contents. Ikigage contained total aerobic mesophilic bacteria (33.55 × 10⁶ cfu/ml), yeast (10.15 × 10⁶ cfu/ml), lactic acid bacteria (35.35 × 10⁴ cfu/ml), moulds (4.12 × 10⁴ cfu/ml), E. coli (21.90 × 10³ cfu/ml), fecal streptococci (22.50 × 10³ cfu/ml), Staphylococcus aureus (16.02 × 10³ cfu/ml), total coliform (32.30 × 10³ cfu/ml), eth...

  8. In the Dirac tradition

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1988-04-15

    It was Paul Dirac who cast quantum mechanics into the form we now use, and many generations of theoreticians openly acknowledge his influence on their thinking. When Dirac died in 1984, St. John's College, Cambridge, his base for most of his lifetime, instituted an annual lecture in his memory at Cambridge. The first lecture, in 1986, attracted two heavyweights - Richard Feynman and Steven Weinberg. Far from using the lectures as a platform for their own work, in the Dirac tradition they presented stimulating material on deep underlying questions.

  9. In the Dirac tradition

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    It was Paul Dirac who cast quantum mechanics into the form we now use, and many generations of theoreticians openly acknowledge his influence on their thinking. When Dirac died in 1984, St. John's College, Cambridge, his base for most of his lifetime, instituted an annual lecture in his memory at Cambridge. The first lecture, in 1986, attracted two heavyweights - Richard Feynman and Steven Weinberg. Far from using the lectures as a platform for their own work, in the Dirac tradition they presented stimulating material on deep underlying questions

  10. Preserving traditional medical knowledge through modes of transmission: A post-positivist enquiry

    Directory of Open Access Journals (Sweden)

    Janet Adekannbi

    2014-11-01

    Objectives: This study investigated the role which the mode of transmission plays in the preservation of traditional medical knowledge. Method: A post-positivist methodology was adopted. A purposive sampling technique was used to select three communities from each of the six states in South-Western Nigeria. The snowball technique was used in selecting 228 traditional medical practitioners, whilst convenience sampling was adopted in selecting 529 apprentices and 120 children who were not learning the profession. A questionnaire with a five-point Likert scale, key-informant interviews and focus-group discussions were used to collect data. The quantitative data was analysed using descriptive statistics whilst qualitative data was analysed thematically. Results: The dominant mode of knowledge transmission was found to be oblique (66.5%) whilst vertical transmission (29.3%) and horizontal transmission (4.2%) occurred much less. Conclusion: Traditional medical knowledge is at risk of being lost in the study area because most of the apprentices were children from other parents, whereas most traditional medical practitioners preferred to transmit knowledge only to their children.

  11. Effect of the diet traditional and non-traditional on the respiration and excretion in larvae of white shrimp Litopenaeus vannamei

    Directory of Open Access Journals (Sweden)

    María Alejandra Medina-Jasso

    2015-11-01

    Full Text Available Objective. The respiration and ammonia excretion of zoea and mysis larvae of the white shrimp Litopenaeus vannamei fed the traditionally used diet (microalgae and Artemia nauplii) and an alternative (non-traditional) diet (microalgae with rotifers) were studied. Materials and methods. After four hours, oxygen consumption and ammonia excretion were estimated in BOD bottles containing 60 larvae (closed respirometers). The concentrations of O2 and NH4+ were measured with a polarographic electrode in the first case and with the indophenol blue technique in the second. Results. In zoea, oxygen consumption increased with development and showed statistical differences (p=0.023). In mysis, oxygen consumption differed significantly under the traditional diet, whereas no differences were found under the alternative diet (p=0.003). In both stages, ammonia excretion increased with developmental stage and statistical differences were detected (p<0.001), although no significant differences were noticed between diets. Conclusions. A higher energy absorption was obtained for zoea (I, II and III) than for mysis (I, II and III) larvae; this is likely an interaction between rates of respiration and excretion caused by variations in the absorption efficiency of the larvae. The weights obtained for both larval stages showed no differences between diets.

  12. A primer of multivariate statistics

    CERN Document Server

    Harris, Richard J

    2014-01-01

    Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why

  13. Isotopic safeguards statistics

    International Nuclear Information System (INIS)

    Timmerman, C.L.; Stewart, K.B.

    1978-06-01

    The methods and results of our statistical analysis of isotopic data using isotopic safeguards techniques are illustrated using example data from the Yankee Rowe reactor. The statistical methods used in this analysis are the paired comparison and the regression analyses. A paired comparison results when a sample from a batch is analyzed by two different laboratories. Paired comparison techniques can be used with regression analysis to detect and identify outlier batches. The second analysis tool, linear regression, involves comparing various regression approaches. These approaches use two basic types of models: the intercept model (y = α + βx) and the initial point model [y - y₀ = β(x - x₀)]. The intercept model fits strictly the exposure or burnup values of isotopic functions, while the initial point model utilizes the exposure values plus the initial or fabricator's data values in the regression analysis. Two fitting methods are applied to each of these models. These methods are: (1) the usual least squares fitting approach, where x is measured without error, and (2) Deming's approach, which uses the variance estimates obtained from the paired comparison results and considers that x and y are both measured with error. The Yankee Rowe data were first measured by Nuclear Fuel Services (NFS) and remeasured by Nuclear Audit and Testing Company (NATCO). The isotopic function illustrated is the ratio Pu/U versus 235D (in which 235D is the amount of depleted 235U expressed in weight percent), using actual numbers. Statistical results using the Yankee Rowe data indicate the attractiveness of Deming's regression model over the usual approach by simple comparison of the given regression variances with the random variance from the paired comparison results.
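
    To illustrate the contrast between the two fitting methods, here is a short Python sketch on invented paired measurements; the error-variance ratio delta = 1 and all data are assumptions, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented paired measurements of the same batches by two laboratories,
# both subject to measurement error (the situation Deming's fit addresses).
truth = rng.uniform(6.0, 10.0, size=30)
x = truth + rng.normal(0.0, 0.15, size=30)               # first laboratory
y = 0.5 + 0.95 * truth + rng.normal(0.0, 0.15, size=30)  # second laboratory

def deming(x, y, delta=1.0):
    """Deming fit; delta is the ratio var(error in y) / var(error in x)."""
    sxx = np.var(x)
    syy = np.var(y)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    beta = (syy - delta * sxx
            + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
            ) / (2 * sxy)
    return y.mean() - beta * x.mean(), beta

slope_ols, intercept_ols = np.polyfit(x, y, 1)   # x assumed error-free
intercept_dem, slope_dem = deming(x, y)          # errors in both x and y
print(f"OLS   : slope {slope_ols:.3f}, intercept {intercept_ols:.3f}")
print(f"Deming: slope {slope_dem:.3f}, intercept {intercept_dem:.3f}")
```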

  14. MQSA National Statistics

    Science.gov (United States)

    MQSA National Statistics. ... but should level off with time. Archived Scorecard Statistics: 2018, 2017, 2016 ...

  15. State Transportation Statistics 2014

    Science.gov (United States)

    2014-12-15

    The Bureau of Transportation Statistics (BTS) presents State Transportation Statistics 2014, a statistical profile of transportation in the 50 states and the District of Columbia. This is the 12th annual edition of State Transportation Statistics, a ...

  16. Effects of traditional Japanese massage therapy on gene expression: preliminary study.

    Science.gov (United States)

    Donoyama, Nozomi; Ohkoshi, Norio

    2011-06-01

    Changes in gene expression after traditional Japanese massage therapy were investigated to clarify the mechanisms of the clinical effects of traditional Japanese massage therapy. This was a pilot experimental study. The study was conducted in a laboratory at Tsukuba University of Technology. The subjects were 2 healthy female volunteers (58-year-old Participant A, 55-year-old Participant B). The intervention consisted of a 40-minute full-body massage using standard traditional Japanese massage techniques through the clothing and a 40-minute rest as a control, in which participants lay on the massage table without being massaged. Before and after each intervention, blood was taken and analyzed by microarray: (1) the number of genes whose expression was more than doubled after the intervention was examined; (2) for those genes, gene ontology analysis identified statistically significant gene ontology terms. Of the 41,000 genes examined, 1256 (Participant A) and 1778 (Participant B) showed more than doubled expression after traditional Japanese massage, compared with 157 and 82, respectively, after the control. The significant gene ontology terms selected for both Participants A and B after massage were "immune response" and "immune system", whereas no gene ontology terms were selected in the control. This implies that traditional Japanese massage therapy may affect immune function. Further studies with more samples are necessary.

  17. Non-traditional inheritance

    International Nuclear Information System (INIS)

    Hall, J.G.

    1992-01-01

    In the last few years, several non-traditional forms of inheritance have been recognized. These include mosaicism, cytoplasmic inheritance, uniparental disomy, imprinting, amplification/anticipation, and somatic recombination. Genomic imprinting (GI) is the dependence of the phenotype on the sex of the transmitting parent. GI in humans seems to involve growth, behaviour, and survival in utero. The detailed mechanism of genomic imprinting is not known, but it seems that some process is involved in turning a gene off; this probably involves two genes, one of which produces a product that turns a gene off, and the gene that is itself turned off. The process of imprinting (turning off) may be associated with methylation. Erasure of imprinting can occur, and seems to be associated with meiosis. 10 refs

  18. Renyi statistics in equilibrium statistical mechanics

    International Nuclear Information System (INIS)

    Parvan, A.S.; Biro, T.S.

    2010-01-01

    The Renyi statistics in the canonical and microcanonical ensembles is examined both in general and in particular for the ideal gas. In the microcanonical ensemble the Renyi statistics is equivalent to the Boltzmann-Gibbs statistics. By the exact analytical results for the ideal gas, it is shown that in the canonical ensemble, taking the thermodynamic limit, the Renyi statistics is also equivalent to the Boltzmann-Gibbs statistics. Furthermore, it satisfies the requirements of equilibrium thermodynamics, i.e. the thermodynamical potential of the statistical ensemble is a homogeneous function of first degree of its extensive variables of state. We conclude that the Renyi statistics arrives at the same thermodynamical relations as those stemming from the Boltzmann-Gibbs statistics in this limit.
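
    For reference (this definition is standard and not quoted from the record), the Renyi entropy and the Boltzmann-Gibbs limit it recovers as q approaches 1 can be written as:

```latex
S_q^{\mathrm{R}} \;=\; \frac{k}{1-q}\,\ln\!\Big(\sum_i p_i^{\,q}\Big),
\qquad q>0,\; q\neq 1,
\qquad
\lim_{q\to 1} S_q^{\mathrm{R}} \;=\; -\,k\sum_i p_i \ln p_i \;=\; S_{\mathrm{BG}}.
```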

  19. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49–58.

  20. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  1. Statistical physics of vaccination

    Science.gov (United States)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.

  2. Intermediate statistics a modern approach

    CERN Document Server

    Stevens, James P

    2007-01-01

    Written for those who use statistical techniques, this text focuses on a conceptual understanding of the material. It uses definitional formulas on small data sets to provide conceptual insight into what is being measured. It emphasizes the assumptions underlying each analysis, and shows how to test the critical assumptions using SPSS or SAS.

  3. Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine

    Science.gov (United States)

    Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit

    2017-01-01

    In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and ongoing TTM research activities related to metabolomics. This review also focuses on three important elements of systems biology analysis of TTM, including analytical techniques, statistical approaches, and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM are also discussed. PMID:28769804

  4. Research on the Application of Traditional Embroidery Technology in Modern Jewelry Creation

    Directory of Open Access Journals (Sweden)

    Min Li

    2017-10-01

    Full Text Available Tradition and modernity are eternal topics in art, especially in the information age. Efficient mechanical production methods have improved the traditional jewelry production process and met the demand for mass production of jewelry. However, with the development of society and the progress of science and technology, many traditional arts and crafts are being lost. Starting from traditional culture, this paper analyzes works of embroidered jewelry by studying traditional Chinese embroidery culture and technique, and summarizes an approach that combines traditional embroidery technology with modern jewelry design techniques in order to guide creative practice. On the basis of studying the theoretical method of combining traditional embroidery technology with modern jewelry, the paper focuses on the application of traditional embroidery techniques in jewelry creation, to unlock the potential of traditional craft, provide a reference for modern jewelry design with rich Chinese characteristics, attract the attention of the Chinese jewelry industry, and help inherit the traditional arts.

  5. COMPARATIVE STUDIES OF TRADITIONAL (NON-ENERGY...

    African Journals Online (AJOL)

    2012-12-19

    Dec 19, 2012 ... more energy and utilities cost than the traditional energy technique. ... Keywords: ... An additional major advantage of the Pinch approach is that ... modification before embarking on actual implementation (Adefila, 1994).

  6. Traditional boat-building and navigational techniques of southern Orissa

    Digital Repository Service at National Institute of Oceanography (India)

    Tripati, S.


  7. Effect of Traditional Processing Techniques on the Nutritional and ...

    African Journals Online (AJOL)

    Michael Horsfall

    Composition of African Bread-Fruit (Treculia africana) Seeds. *IFEOMA I IJEH ... located mainly in the seed coat (Kumar et al., 1979; Singh ... development and control of some metabolic processes ...

  8. The Cassava Processing Industry in Brazil: Traditional Techniques ...

    African Journals Online (AJOL)

    The paper considers the evolution of cassava-based industrial production, processing and marketing in Brazil, in light of the great technological diversification to be found in Brazil. It discusses the private role of the small- and medium-scale food and related processing enterprises in the food industry, as they employ ...

  9. Traditional Medicine in Developing Countries

    DEFF Research Database (Denmark)

    Thorsen, Rikke Stamp

    People use traditional medicine to meet their health care needs in developing countries and medical pluralism persists worldwide despite increased access to allopathic medicine. Traditional medicine includes a variety of treatment opportunities, among others, consultation with a traditional healer or spiritual healer and self-treatment with herbal medicine or medicinal plants. Reliance on traditional medicine varies between countries and rural and urban areas, but is reported to be as high as 80% in some developing countries. Increased realization of the continued importance of traditional medicine has led to the formulation of policies on the integration of traditional medicine into public health care. Local level integration is already taking place as people use multiple treatments when experiencing illness. Research on local level use of traditional medicine for health care, in particular the use...

  10. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  11. State Transportation Statistics 2010

    Science.gov (United States)

    2011-09-14

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2010, a statistical profile of transportation in the 50 states and the District of Col...

  12. State Transportation Statistics 2012

    Science.gov (United States)

    2013-08-15

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2012, a statistical profile of transportation ...

  13. Adrenal Gland Tumors: Statistics

    Science.gov (United States)

    Adrenal Gland Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 03/... A primary adrenal gland tumor is very uncommon. Exact statistics are not available for this type of tumor ...

  14. State transportation statistics 2009

    Science.gov (United States)

    2009-01-01

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2009, a statistical profile of transportation in the 50 states and the District ...

  15. State Transportation Statistics 2011

    Science.gov (United States)

    2012-08-08

    The Bureau of Transportation Statistics (BTS), a part of DOT's Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2011, a statistical profile of transportation in the 50 states and the District of Col...

  16. Neuroendocrine Tumor: Statistics

    Science.gov (United States)

    Neuroendocrine Tumor: Statistics. Approved by the Cancer.Net Editorial Board, 01/... the body. It is important to remember that statistics on the survival rates for people with a ...

  17. State Transportation Statistics 2013

    Science.gov (United States)

    2014-09-19

    The Bureau of Transportation Statistics (BTS), a part of the U.S. Department of Transportation's (USDOT) Research and Innovative Technology Administration (RITA), presents State Transportation Statistics 2013, a statistical profile of transportatio...

  18. BTS statistical standards manual

    Science.gov (United States)

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  19. Spatio-temporal statistical models with applications to atmospheric processes

    International Nuclear Information System (INIS)

    Wikle, C.K.

    1996-01-01

    This doctoral dissertation is presented as three self-contained papers. An introductory chapter considers traditional spatio-temporal statistical methods used in the atmospheric sciences from a statistical perspective. Although this section is primarily a review, many of the statistical issues considered have not been considered in the context of these methods and several open questions are posed. The first paper attempts to determine a means of characterizing the semiannual oscillation (SAO) spatial variation in the northern hemisphere extratropical height field. It was discovered that the midlatitude SAO in 500 hPa geopotential height could be explained almost entirely as a result of spatial and temporal asymmetries in the annual variation of stationary eddies. It was concluded that the mechanism for the SAO in the northern hemisphere is a result of land-sea contrasts. The second paper examines the seasonal variability of mixed Rossby-gravity waves (MRGW) in the lower stratosphere over the equatorial Pacific. Advanced cyclostationary time series techniques were used for analysis. It was found that there are significant twice-yearly peaks in MRGW activity. Analyses also suggested a convergence of horizontal momentum flux associated with these waves. In the third paper, a new spatio-temporal statistical model is proposed that attempts to consider the influence of both temporal and spatial variability. This method is mainly concerned with prediction in space and time, and provides a spatially descriptive and temporally dynamic model

  20. The Hausa Lexicographic Tradition

    Directory of Open Access Journals (Sweden)

    Roxana Ma Newman

    2011-10-01

    Full Text Available

    Abstract: Hausa, a major language of West Africa, is one of the most widely studied languages of Sub-Saharan Africa. It has a rich lexicographic tradition dating back some two centuries. Since the first major vocabulary published in 1843 up to the present time, almost 60 lexicographic works — dictionaries, vocabularies, glossaries — have been published, in a range of metalanguages, from English to Hausa itself. This article traces the historical development of the major studies according to their type and function as general reference works, specialized works, pedagogical works, and terminological works. For each work, there is a general discussion of its size, accuracy of the phonological, lexical, and grammatical information, and the adequacy of its definitions and illustrative material. A complete list of the lexicographic works is included.

    Keywords: ARABIC, BILINGUAL LEXICOGRAPHY, DIALECTAL VARIANTS, DICTIONARIES, ENGLISH, ETYMOLOGIES, FRENCH, GERMAN, GLOSSARIES, GRAMMATICAL CATEGORIES, HAUSA, LANGUAGE LEARNING, LOANWORDS, NEOLOGISMS, NIGER, NIGERIA, ORTHOGRAPHY, PHONETIC TRANSCRIPTION, PHONOLOGY, RUSSIAN, STANDARD DIALECT, STANDARDIZATION, TERMINOLOGY, VOCABULARIES, WEST AFRICA.
