WorldWideScience

Sample records for stage statistical analysis

  1. Staging Liver Fibrosis with Statistical Observers

    Science.gov (United States)

    Brand, Jonathan Frieman

    Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network develops within portal triads. The scale of the collagen lobules is characteristically on the order of 1 mm, which is close to the resolution limit of in vivo Gd-enhanced MRI. In this work the methods to collect training and testing images for a Hotelling observer are covered. An observer based on local texture analysis is trained and tested using wet-tissue phantoms. The technique is used to optimize the MRI sequence based on task performance. The final method developed is a two-stage model observer to classify fibrotic and healthy tissue in both phantoms and in vivo MRI images. The first-stage observer tests for the presence of local texture. Test statistics from the first observer are used to train the second-stage observer to globally sample the local observer results. A decision on the disease class is made for an entire MRI image slice using test statistics collected from the second observer. The techniques are tested on wet-tissue phantoms and in vivo clinical patient data.
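
    The two-stage observer itself is specific to this thesis, but the core Hotelling template it builds on is standard. Below is a minimal sketch of a linear Hotelling observer, assuming Gaussian class statistics and texture feature vectors already extracted from the images; all data and names here are hypothetical placeholders, not the thesis pipeline.

    ```python
    import numpy as np

    def hotelling_observer(train_healthy, train_fibrotic, test_samples):
        """Linear Hotelling template: w = S^-1 (mu1 - mu0), t = w^T x.

        Inputs are (n_samples, n_features) arrays of texture features;
        returns one scalar test statistic per test sample.
        """
        mu0 = train_healthy.mean(axis=0)
        mu1 = train_fibrotic.mean(axis=0)
        # Pooled within-class covariance (equal class weights for simplicity)
        s = 0.5 * (np.cov(train_healthy, rowvar=False) +
                   np.cov(train_fibrotic, rowvar=False))
        w = np.linalg.solve(s, mu1 - mu0)   # Hotelling template
        return test_samples @ w             # scalar test statistics

    rng = np.random.default_rng(0)
    healthy = rng.normal(0.0, 1.0, size=(200, 8))
    fibrotic = rng.normal(0.5, 1.0, size=(200, 8))
    print(hotelling_observer(healthy, fibrotic, fibrotic[:5]))
    ```

    Thresholding the statistic yields a classification decision; in the thesis, statistics like these from the first-stage observer become the training inputs of the second-stage observer.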

  2. Meta-analysis of Gaussian individual patient data: Two-stage or not two-stage?

    Science.gov (United States)

    Morris, Tim P; Fisher, David J; Kenward, Michael G; Carpenter, James R

    2018-04-30

    Quantitative evidence synthesis through meta-analysis is central to evidence-based medicine. For well-documented reasons, the meta-analysis of individual patient data is held in higher regard than aggregate data. With access to individual patient data, the analysis is not restricted to a "two-stage" approach (combining estimates and standard errors) but can estimate parameters of interest by fitting a single model to all of the data, a so-called "one-stage" analysis. There has been debate about the merits of one- and two-stage analysis. Arguments for one-stage analysis have typically noted that a wider range of models can be fitted and overall estimates may be more precise. The two-stage side has emphasised that the models that can be fitted in two stages are sufficient to answer the relevant questions, with less scope for mistakes because there are fewer modelling choices to be made in the two-stage approach. For Gaussian data, we consider the statistical arguments for flexibility and precision in small-sample settings. Regarding flexibility, several of the models that can be fitted only in one stage may not be of serious interest to most meta-analysis practitioners. Regarding precision, we consider fixed- and random-effects meta-analysis and see that, for a model making certain assumptions, the number of stages used to fit this model is irrelevant; the precision will be approximately equal. Meta-analysts should choose modelling assumptions carefully. Sometimes relevant models can only be fitted in one stage. Otherwise, meta-analysts are free to use whichever procedure is most convenient to fit the identified model. © 2018 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
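
    For readers unfamiliar with the "combining estimates and standard errors" step, here is a minimal sketch of stage two of a two-stage fixed-effect meta-analysis using inverse-variance weights; the per-study numbers are hypothetical.

    ```python
    import numpy as np

    def pool_fixed_effect(estimates, std_errors):
        """Inverse-variance pooling of per-study estimates (stage two).

        Stage one would fit a model to each study's IPD to produce
        these estimates and standard errors.
        """
        est = np.asarray(estimates)
        w = 1.0 / np.asarray(std_errors) ** 2   # inverse-variance weights
        pooled = np.sum(w * est) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        return pooled, pooled_se

    pooled, se = pool_fixed_effect([0.42, 0.31, 0.55, 0.38],
                                   [0.10, 0.15, 0.20, 0.12])
    print(f"pooled effect {pooled:.3f} (SE {se:.3f})")
    ```

    As the paper argues, when a one-stage model makes the same assumptions, its estimate will be approximately equal in precision to this pooled two-stage estimate.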

  3. Meta‐analysis using individual participant data: one‐stage and two‐stage approaches, and why they may differ

    Science.gov (United States)

    Ensor, Joie; Riley, Richard D.

    2016-01-01

    Meta‐analysis using individual participant data (IPD) obtains and synthesises the raw, participant‐level data from a set of relevant studies. The IPD approach is becoming an increasingly popular tool as an alternative to traditional aggregate data meta‐analysis, especially as it avoids reliance on published results and provides an opportunity to investigate individual‐level interactions, such as treatment‐effect modifiers. There are two statistical approaches for conducting an IPD meta‐analysis: one‐stage and two‐stage. The one‐stage approach analyses the IPD from all studies simultaneously, for example, in a hierarchical regression model with random effects. The two‐stage approach derives aggregate data (such as effect estimates) in each study separately and then combines these in a traditional meta‐analysis model. There have been numerous comparisons of the one‐stage and two‐stage approaches via theoretical consideration, simulation and empirical examples, yet there remains confusion regarding when each approach should be adopted, and indeed why they may differ. In this tutorial paper, we outline the key statistical methods for one‐stage and two‐stage IPD meta‐analyses, and provide 10 key reasons why they may produce different summary results. We explain that most differences arise because of different modelling assumptions, rather than the choice of one‐stage or two‐stage itself. We illustrate the concepts with recently published IPD meta‐analyses, summarise key statistical software and provide recommendations for future IPD meta‐analyses. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:27747915
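
    To contrast with the two-stage sketch above, a one-stage IPD analysis fits a single hierarchical model to all participant-level rows at once. A minimal sketch with statsmodels follows; the data are simulated and the column names are hypothetical.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated IPD: outcome y, treatment arm, and study id per participant
    rng = np.random.default_rng(1)
    rows = []
    for study in range(6):
        effect = 0.4 + rng.normal(0.0, 0.1)   # study-specific true effect
        for _ in range(80):
            treat = int(rng.integers(0, 2))
            rows.append({"study": study, "treat": treat,
                         "y": 0.5 + effect * treat + rng.normal()})
    ipd = pd.DataFrame(rows)

    # One-stage: linear mixed model with a random treatment effect by study
    fit = smf.mixedlm("y ~ treat", ipd, groups="study",
                      re_formula="~treat").fit()
    print(fit.summary())
    ```

    Whether intercepts are stratified by study or treated as random, and how residual variances are handled, are exactly the modelling assumptions this tutorial identifies as the real source of one-stage versus two-stage differences.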

  4. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    The article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. The technique is defined, the stages of the analysis are identified, and the mathematical-statistical tools are considered.

  5. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    Energy Technology Data Exchange (ETDEWEB)

    Fountain, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brigantic, Robert T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Reid A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  6. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series

    Directory of Open Access Journals (Sweden)

    Charmaine Demanuele

    2015-10-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from fMRI blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that the DLPFC strongly differentiated between task stages associated with different memory loads but not between different visual-spatial aspects, whereas the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel activity.

  7. Statistical Analysis of Zebrafish Locomotor Response.

    Science.gov (United States)

    Liu, Yiwen; Carmer, Robert; Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behaviour upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this gap, this study has analyzed a visually-driven locomotor behaviour named the visual motor response (VMR) by the Hotelling's T-squared test. This test is congruent with comparing locomotor profiles from a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive for detecting differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had negligible effect on larval activity. This finding is consistent with that from the Hotelling's T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure.
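
    A minimal sketch of the two-sample Hotelling's T-squared test used here, including its standard F-distribution conversion; the "locomotor profiles" below are simulated placeholders, not VMR data.

    ```python
    import numpy as np
    from scipy import stats

    def hotelling_t2(x, y):
        """Two-sample Hotelling's T^2 test for a multivariate mean difference."""
        nx, p = x.shape
        ny = y.shape[0]
        diff = x.mean(axis=0) - y.mean(axis=0)
        s = ((nx - 1) * np.cov(x, rowvar=False) +
             (ny - 1) * np.cov(y, rowvar=False)) / (nx + ny - 2)  # pooled cov
        t2 = nx * ny / (nx + ny) * diff @ np.linalg.solve(s, diff)
        f = (nx + ny - p - 1) / (p * (nx + ny - 2)) * t2          # F conversion
        return t2, stats.f.sf(f, p, nx + ny - p - 1)

    rng = np.random.default_rng(2)
    strain_a = rng.normal(0.0, 1.0, size=(24, 5))  # activity in 5 time bins
    strain_b = rng.normal(0.6, 1.0, size=(24, 5))
    t2, pval = hotelling_t2(strain_a, strain_b)
    print(f"T^2 = {t2:.2f}, p = {pval:.4f}")
    ```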

  8. [Methods of statistical analysis in differential diagnostics of the degree of brain glioma anaplasia during preoperative stage].

    Science.gov (United States)

    Glavatskiĭ, A Ia; Guzhovskaia, N V; Lysenko, S N; Kulik, A V

    2005-12-01

    The authors propose an approach to preoperative diagnosis of the degree of anaplasia of supratentorial brain gliomas using statistical analysis methods. It relies on a complex examination of 934 patients with degree I-IV anaplasia who had been treated at the Institute of Neurosurgery from 1990 to 2004. The use of statistical analysis methods for differential diagnosis of the degree of brain glioma anaplasia may optimize the diagnostic algorithm, increase the reliability of the obtained data and, in some cases, avoid unwarranted surgical interventions. Clinically important signs for the use of statistical analysis methods directed at preoperative diagnosis of brain glioma anaplasia have been defined.

  9. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    Science.gov (United States)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers a range of basic statistics topics together with parametric statistical data analysis. The output of the system is parametric statistical data analysis that can be used by students, lecturers, and other users who need statistical results quickly and in an easily understood form. The Android application was developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used was the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and make statistical analysis on mobile devices easier for students to understand.

  10. Data analysis for radiological characterisation: Geostatistical and statistical complementarity

    International Nuclear Information System (INIS)

    Desnoyers, Yvon; Dubot, Didier

    2012-01-01

    Radiological characterisation may cover a large range of evaluation objectives during a decommissioning and dismantling (D&D) project: removal of doubt, delineation of contaminated materials, monitoring of the decontamination work, and the final survey. At each stage, collecting relevant data to be able to draw the conclusions needed is quite a big challenge. In particular, two radiological characterisation stages require an advanced sampling process and data analysis, namely the initial categorisation and optimisation of the materials to be removed, and the final survey to demonstrate compliance with clearance levels. On the one hand, the latter is widely used and well developed in national guides and norms, using random sampling designs and statistical data analysis. On the other hand, a more complex evaluation methodology has to be implemented for the initial radiological characterisation, both for sampling design and for data analysis. The geostatistical framework is an efficient way to satisfy the radiological characterisation requirements, providing a sound decision-making approach for the decommissioning and dismantling of nuclear premises. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. Geostatistics thus provides reliable methods for activity estimation, uncertainty quantification and risk analysis, leading to a sound classification of radiological waste (surfaces and volumes). In this way, the radiological characterisation of contaminated premises can be divided into three steps. First, an exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive or not) surface survey of the contamination is implemented on a regular grid. Finally, in order to assess activity levels and contamination depths, destructive samples are collected at several locations within the premises (based on the surface survey results) and analysed. Combined with …

  11. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  12. Weak instruments and the first stage F-statistic in IV models with a nonscalar error covariance structure

    NARCIS (Netherlands)

    Bun, M.; de Haan, M.

    2010-01-01

    We analyze the usefulness of the first-stage F-statistic for detecting weak instruments in the IV model with a nonscalar error covariance structure. In particular, we question the validity of the rule of thumb that a first-stage F-statistic of 10 or higher signals sufficiently strong instruments in models with correlated errors.
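
    For context, a minimal sketch of how the first-stage F-statistic behind the "F of 10" rule of thumb is computed; the simulated data assume homoskedastic errors, which is precisely the setting the paper departs from.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 500
    z = rng.normal(size=(n, 2))                          # two instruments
    x = z @ np.array([0.15, 0.10]) + rng.normal(size=n)  # endogenous regressor

    # First stage: regress the endogenous regressor on the instruments.
    # With no other exogenous controls, the regression's overall F-statistic
    # is the first-stage F used to screen for weak instruments.
    first_stage = sm.OLS(x, sm.add_constant(z)).fit()
    print("first-stage F:", round(first_stage.fvalue, 2))
    ```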

  13. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    … of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  14. Bayesian Sensitivity Analysis of Statistical Models with Missing Data.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Tang, Niansheng

    2014-04-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable (not missing at random, NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures.

  15. Colorectal cancer stages transcriptome analysis.

    Directory of Open Access Journals (Sweden)

    Tianyao Huo

    Colorectal cancer (CRC) is the third most common cancer and the second leading cause of cancer-related deaths in the United States. The purpose of this study was to evaluate gene expression differences across different stages of CRC. Gene expression data on 433 CRC patient samples were obtained from The Cancer Genome Atlas (TCGA). Gene expression differences were evaluated across CRC stages using linear regression. Genes with p≤0.001 in expression differences were evaluated further in principal component analysis, and genes with p≤0.0001 were evaluated further in gene set enrichment analysis. A total of 377 patients with gene expression data on 20,532 genes were included in the final analysis. The numbers of patients in stages I through IV were 59, 147, 116 and 55, respectively. The NEK4 gene, which encodes NIMA-related kinase 4, was differentially expressed across the four stages of CRC. Stage I patients had the highest expression of NEK4, while stage IV patients had the lowest (p = 9×10⁻⁶). Ten other genes (RNF34, HIST3H2BB, NUDT6, LRCH4, GLB1L, HIST2H4A, TMEM79, AMIGO2, C20orf135 and SPSB3) had p values of 0.0001 in the differential expression analysis. Principal component analysis indicated that the patients from the four clinical stages do not appear to have distinct gene expression patterns. Network-based and pathway-based gene set enrichment analyses showed that these 11 genes map to multiple pathways such as meiotic synapsis and packaging of telomere ends. Ten of these 11 genes were linked to Gene Ontology terms such as nucleosome, DNA packaging complex and protein-DNA interactions. The protein complex-based gene set analysis showed that four genes are involved in the H2AX complex II. This study identified a small number of genes that might be associated with the clinical stages of CRC. Our analysis was not able to find a molecular basis for the current clinical staging of CRC based on gene expression patterns.
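
    A minimal sketch of the per-gene regression screen described above, regressing expression on stage and collecting p-values; the expression matrix here is simulated, not TCGA data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    n_patients, n_genes = 377, 1000               # a small subset for illustration
    stage = rng.integers(1, 5, size=n_patients)   # stages I-IV coded 1-4
    expr = rng.normal(size=(n_patients, n_genes))
    expr[:, 0] -= 0.3 * stage                     # one gene that declines by stage

    pvals = np.array([stats.linregress(stage, expr[:, g]).pvalue
                      for g in range(n_genes)])
    hits = np.flatnonzero(pvals <= 0.001)
    print(f"{hits.size} genes at p <= 0.001; smallest p at gene {pvals.argmin()}")
    ```

    In practice a multiple-testing correction (or a strict threshold, as the authors use) is essential when screening 20,532 genes.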

  16. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  17. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  18. Sensitivity Analysis in Two-Stage DEA

    Directory of Open Access Journals (Sweden)

    Athena Forghani

    2015-07-01

    Data envelopment analysis (DEA) is a method for measuring the efficiency of peer decision-making units (DMUs), which use a set of inputs to produce a set of outputs. In some cases, DMUs have a two-stage structure, in which the first stage utilizes inputs to produce outputs that are used as the inputs of the second stage to produce final outputs. One important issue in two-stage DEA is the sensitivity of the results of an analysis to perturbations in the data. The current paper looks into a combined model for two-stage DEA and applies sensitivity analysis to DMUs on the entire frontier. In fact, necessary and sufficient conditions for preserving a DMU's efficiency classification are developed when various data changes are applied to all DMUs.

  19. Two-Stage Regularized Linear Discriminant Analysis for 2-D Data.

    Science.gov (United States)

    Zhao, Jianhua; Shi, Lei; Zhu, Ji

    2015-08-01

    Fisher linear discriminant analysis (LDA) involves within-class and between-class covariance matrices. For 2-D data such as images, regularized LDA (RLDA) can improve LDA due to the regularized eigenvalues of the estimated within-class matrix. However, it fails to consider the eigenvectors and the estimated between-class matrix. To improve these two matrices simultaneously, we propose in this paper a new two-stage method for 2-D data, namely a bidirectional LDA (BLDA) in the first stage and the RLDA in the second stage, where both BLDA and RLDA are based on the Fisher criterion that tackles correlation. BLDA performs the LDA under special separable covariance constraints that incorporate the row and column correlations inherent in 2-D data. The main novelty is that we propose a simple but effective statistical test to determine the subspace dimensionality in the first stage. As a result, the first stage reduces the dimensionality substantially while keeping the significant discriminant information in the data. This enables the second stage to perform RLDA in a much lower dimensional subspace, and thus improves the two estimated matrices simultaneously. Experiments on a number of 2-D synthetic and real-world data sets show that BLDA+RLDA outperforms several closely related competitors.

  1. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a textbook on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  2. Surface electromyography based muscle fatigue analysis for stroke patients at different Brunnstrom stages.

    Science.gov (United States)

    Yinjun Tu; Zhe Zhang; Xudong Gu; Qiang Fang

    2016-08-01

    Muscle fatigue analysis has been an important topic in sport and rehabilitation medicine due to its role in muscle performance evaluation and pathology investigation. This paper proposes a surface electromyography (sEMG) based muscle fatigue analysis approach specifically designed for stroke rehabilitation applications. Fourteen stroke patients from five different Brunnstrom recovery stage groups were involved in the experiment, and features including median frequency and mean power frequency were extracted from the collected sEMG samples for investigation. After signal decomposition, the decline in motor unit firing rate of patients from the different groups was also studied. A statistically significant presence of fatigue was observed in the deltoideus medius and extensor digitorum communis of patients at early recovery stages (P < 0.01). It was also found that during repetitive movements the motor unit firing frequency declines by an amount positively correlated with the recovery stage. Based on the experimental results, it can be verified that as the recovery stage increases, the central nervous system's control ability strengthens and the patient's motion becomes more stable and resistant to fatigue.
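
    A minimal sketch of the two spectral fatigue features named above, median frequency (MDF) and mean power frequency (MPF), computed from a Welch power spectrum; the signal, sampling rate and window length are placeholder choices.

    ```python
    import numpy as np
    from scipy.signal import welch

    def fatigue_features(semg, fs=1000.0):
        """Median frequency and mean power frequency of an sEMG segment."""
        freqs, psd = welch(semg, fs=fs, nperseg=256)
        cumulative = np.cumsum(psd)
        mdf = freqs[np.searchsorted(cumulative, 0.5 * cumulative[-1])]
        mpf = np.sum(freqs * psd) / np.sum(psd)
        return mdf, mpf

    rng = np.random.default_rng(5)
    semg = rng.normal(size=4000)    # 4 s of (simulated) surface EMG at 1 kHz
    print("MDF %.1f Hz, MPF %.1f Hz" % fatigue_features(semg))
    ```

    A downward drift of MDF/MPF across repeated contractions is the usual sEMG marker of developing muscle fatigue.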

  3. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  4. Two-stage meta-analysis of survival data from individual participants using percentile ratios

    Science.gov (United States)

    Barrett, Jessica K; Farewell, Vern T; Siannis, Fotios; Tierney, Jayne; Higgins, Julian P T

    2012-01-01

    Methods for individual participant data meta-analysis of survival outcomes commonly focus on the hazard ratio as a measure of treatment effect. Recently, Siannis et al. (2010, Statistics in Medicine 29:3030–3045) proposed the use of percentile ratios as an alternative to hazard ratios. We describe a novel two-stage method for the meta-analysis of percentile ratios that avoids distributional assumptions at the study level. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22825835

  5. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    Science.gov (United States)

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes, where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data-generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. the number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. the interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies the intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis has […] meta-analysis was appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
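
    Steps (i)-(iv) translate directly into a simulation loop. Here is a minimal sketch for a treatment-covariate interaction in a two-stage IPD meta-analysis; every parameter value is hypothetical and not taken from the pregnancy example.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def ipd_power(n_trials=14, n_per=85, interaction=0.1, n_sims=500):
        """Power of a two-stage IPD meta-analysis for an interaction effect."""
        hits = 0
        for _ in range(n_sims):
            est, var = [], []
            for _ in range(n_trials):            # stage 1: per-trial model
                treat = rng.integers(0, 2, size=n_per)
                bmi = rng.normal(size=n_per)     # centred covariate
                y = 0.5 * treat + interaction * treat * bmi + rng.normal(size=n_per)
                X = np.column_stack([np.ones(n_per), treat, bmi, treat * bmi])
                beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
                sigma2 = res[0] / (n_per - X.shape[1])
                cov = sigma2 * np.linalg.inv(X.T @ X)
                est.append(beta[3])
                var.append(cov[3, 3])
            w = 1.0 / np.asarray(var)            # stage 2: inverse-variance pool
            z = np.sum(w * est) / np.sum(w) / np.sqrt(1.0 / np.sum(w))
            hits += abs(z) > 1.96
        return hits / n_sims

    print("estimated power:", ipd_power())
    ```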

  6. EUS for the staging of gastric cancer: a meta-analysis.

    Science.gov (United States)

    Mocellin, Simone; Marchet, Alberto; Nitti, Donato

    2011-06-01

    The role of EUS in the locoregional staging of gastric carcinoma is undefined. We aimed to comprehensively review and quantitatively summarize the available evidence on the staging performance of EUS. We systematically searched the MEDLINE, Cochrane, CANCERLIT, and EMBASE databases for relevant studies published until July 2010. Formal meta-analysis of diagnostic accuracy parameters was performed by using a bivariate random-effects model. Fifty-four studies enrolling 5601 patients with gastric cancer undergoing disease staging with EUS were eligible for the meta-analysis. EUS staging accuracy across eligible studies was measured by computing overall sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), and diagnostic odds ratio (DOR). EUS can differentiate T1-2 from T3-4 gastric cancer with high accuracy, with overall sensitivity, specificity, PLR, NLR, and DOR of 0.86 (95% CI, 0.81-0.90), 0.91 (95% CI, 0.89-0.93), 9.8 (95% CI, 7.5-12.8), 0.15 (95% CI, 0.11-0.21), and 65 (95% CI, 41-105), respectively. In contrast, the diagnostic performance of EUS for lymph node status is less reliable, with overall sensitivity, specificity, PLR, NLR, and DOR of 0.69 (95% CI, 0.63-0.74), 0.84 (95% CI, 0.81-0.88), 4.4 (95% CI, 3.6-5.4), 0.37 (95% CI, 0.32-0.44), and 12 (95% CI, 9-16), respectively. Results regarding single T categories (including T1 substages) and Bayesian nomograms to calculate posttest probabilities for any target condition prevalence are also provided. Statistical heterogeneity was generally high; unfortunately, subgroup analysis did not identify a consistent source of the heterogeneity. Our results support the use of EUS for the locoregional staging of gastric cancer, which can affect the therapeutic management of these patients. However, clinicians must be aware of the performance limits of this staging tool. Copyright © 2011 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.
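
    The summary measures reported here come from pooled 2x2 counts. A minimal sketch of the per-study computations follows (the bivariate random-effects pooling itself requires specialist routines, e.g. a generalized linear mixed model, and is not shown); the counts are hypothetical.

    ```python
    def diagnostic_measures(tp, fp, fn, tn):
        """Sensitivity, specificity, likelihood ratios and DOR from a 2x2 table."""
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        plr = sens / (1 - spec)      # positive likelihood ratio
        nlr = (1 - sens) / spec      # negative likelihood ratio
        dor = plr / nlr              # diagnostic odds ratio
        return sens, spec, plr, nlr, dor

    # Hypothetical study: EUS calls of T3-4 versus the pathological truth
    sens, spec, plr, nlr, dor = diagnostic_measures(tp=86, fp=9, fn=14, tn=91)
    print(f"sens {sens:.2f}, spec {spec:.2f}, PLR {plr:.1f}, "
          f"NLR {nlr:.2f}, DOR {dor:.0f}")
    ```

    Post-test probabilities, as in the Bayesian nomograms the authors provide, follow by converting a pre-test probability to odds, multiplying by the PLR or NLR, and converting back.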

  7. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Rweb is a freely accessible statistical analysis environment delivered through the World Wide Web (WWW). It is based on R, a well-known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW-accessible data sets, so you may run Rweb on your own data. Rweb can provide a four-window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point-and-click modules under development for use in introductory statistics courses.

  8. Cerebral blood flow imaging in staging of dementia of Alzheimer type. A study using statistical parametric mapping (SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Kawahata, Nobuya; Daitoh, Nobuyuki; Gotoh, Chiharu; Yokoyama, Sakura [Narita Memorial Hospital, Toyohashi, Aichi (Japan)

    2001-10-01

    One hundred twenty-two patients with dementia of Alzheimer type (DAT) were selected from consecutive referrals to the Memory Clinic at Narita Memorial Hospital. All fulfilled the NINCDS-ADRDA diagnostic criteria for probable Alzheimer's disease. Patients with DAT were subdivided, on the basis of their score on the Japanese version of the Alzheimer's Disease Assessment Scale (ADAS-J cog.), into mild (score ≤ 15, n=47), moderate (score 16 to 35, n=63), and severe (score ≥ 36, n=12) groups. Seventy age-matched controls who had no signs or symptoms of dementia were recruited from the Memory Clinic. The reconstructed ¹²³I-IMP SPECT data were analyzed using a statistical parametric mapping technique. In the mild stage of DAT, SPM analysis demonstrated that CBF in the bilateral posterior parietal lobes, superior occipital lobes, and posterior cingulate gyri was significantly decreased compared with the normal controls. In the moderate stage of DAT, extension of the hypoperfusion to the frontal lobes was detected compared with the mild stage. In the severe stage of DAT, the SPM image showed diffuse hypoperfusion in both hemispheres. The frequency of hypoperfusion in the DAT group was as follows: 15.0% in the mild stage, 34.9% in the moderate stage, and 66.7% in the severe stage. Our results indicate that the frequency and progression of hypoperfusion in the temporoparietal regions and/or other regions in DAT is related to the severity of the dementia. (author)

  9. Cerebral blood flow imaging in staging of dementia of Alzheimer type. A study using statistical parametric mapping (SPM)

    International Nuclear Information System (INIS)

    Kawahata, Nobuya; Daitoh, Nobuyuki; Gotoh, Chiharu; Yokoyama, Sakura

    2001-01-01

    One hundred twenty-two patients with dementia of Alzheimer type (DAT) were selected from consecutive referrals to the Memory Clinic at Narita Memorial Hospital. All fulfilled the NINCDS-ADRDA diagnostic criteria for probable Alzheimer's disease. Patients with DAT were subdivided, on the basis of their score on the Japanese version of the Alzheimer's Disease Assessment Scale (ADAS-J cog.), into mild (score ≤ 15, n=47), moderate (score 16 to 35, n=63), and severe (score ≥ 36, n=12) groups. Seventy age-matched controls who had no signs or symptoms of dementia were recruited from the Memory Clinic. The reconstructed ¹²³I-IMP SPECT data were analyzed using a statistical parametric mapping technique. In the mild stage of DAT, SPM analysis demonstrated that CBF in the bilateral posterior parietal lobes, superior occipital lobes, and posterior cingulate gyri was significantly decreased compared with the normal controls. In the moderate stage of DAT, extension of the hypoperfusion to the frontal lobes was detected compared with the mild stage. In the severe stage of DAT, the SPM image showed diffuse hypoperfusion in both hemispheres. The frequency of hypoperfusion in the DAT group was as follows: 15.0% in the mild stage, 34.9% in the moderate stage, and 66.7% in the severe stage. Our results indicate that the frequency and progression of hypoperfusion in the temporoparietal regions and/or other regions in DAT is related to the severity of the dementia. (author)

  10. Regularized Statistical Analysis of Anatomy

    DEFF Research Database (Denmark)

    Sjöstrand, Karl

    2007-01-01

    This thesis presents the application and development of regularized methods for the statistical analysis of anatomical structures. Focus is on structure-function relationships in the human brain, such as the connection between early onset of Alzheimer’s disease and shape changes of the corpus callosum … and mind. Statistics represents a quintessential part of such investigations as they are preluded by a clinical hypothesis that must be verified based on observed data. The massive amounts of image data produced in each examination pose an important and interesting statistical challenge … efficient algorithms which make the analysis of large data sets feasible, and gives examples of applications…

  11. Stages of tuberculous meningitis: a clinicoradiologic analysis

    International Nuclear Information System (INIS)

    Sher, K.; Firdaus, A.; Bullo, N.; Kumar, S.; Abbasi, A.

    2013-01-01

    Objective: To determine the frequencies and percentages of various clinicoradiologic variables of tuberculous meningitis (TBM) with reference to the British Medical Research Council (BMRC) staging of the disease. Study Design: A case series. Place and Duration of Study: Department of Neurology, Jinnah Postgraduate Medical Centre, Karachi, from October 2010 to September 2011. Methodology: The study included 93 adult patients with a diagnosis of tuberculous meningitis (TBM) at the study place. Patients were divided into three groups according to the British Medical Research Council (BMRC) staging of TBM. Different clinical and radiological findings were analyzed at different stages of the disease. Data were analyzed using SPSS (Statistical Package for the Social Sciences) version 11.0. Results: A majority of patients were found to be in stage-II disease at the time of admission. History of illness at the time of admission was more than 2 weeks in 50% of stage-I patients but around 80% of stage-II and stage-III patients. Neck stiffness was the most commonly reported finding in all stages. Cranial nerve palsies were more frequent in stage-III (75%) than in stage-II (43%) and stage-I (24%) patients. Hydrocephalus and basal enhancement were the most frequently reported radiographic abnormalities. Conclusion: Duration of illness and cranial nerve palsies are important variables in the diagnosis of TBM stages, and if TBM is suspected, empiric treatment should be started immediately without bacteriologic proof to prevent morbidity and mortality. (author)

  12. Bladder cancer staging in CT urography: effect of stage labels on statistical modeling of a decision support system

    Science.gov (United States)

    Gandikota, Dhanuj; Hadjiiski, Lubomir; Cha, Kenny H.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.; Weizer, Alon; Alva, Ajjai; Paramagul, Chintana; Wei, Jun; Zhou, Chuan

    2018-02-01

    In bladder cancer, stage T2 is an important threshold in the decision of administering neoadjuvant chemotherapy. Our long-term goal is to develop a quantitative computerized decision support system (CDSS-S) to aid clinicians in accurate staging. In this study, we examined the effect of the stage labels of the training samples on modeling such a system. We used a data set of 84 bladder cancers imaged with CT Urography (CTU). At clinical staging prior to treatment, 43 lesions were staged as below stage T2 and 41 were stage T2 or above. After cystectomy and pathological staging, which is considered the gold standard, 10 of the lesions were upstaged to stage T2 or above. After correcting the stage labels, 33 lesions were below stage T2, and 51 were stage T2 or above. For the CDSS-S, the lesions were segmented using our AI-CALS method and radiomic features were extracted. We trained a linear discriminant analysis (LDA) classifier with leave-one-case-out cross validation to distinguish between bladder lesions of stage T2 or above and those below stage T2. The CDSS-S was trained and tested with the corrected post-cystectomy labels, and as a comparison, the CDSS-S was also trained with the understaged pre-treatment labels and tested on lesions with corrected labels. The test AUC for the CDSS-S trained with corrected labels was 0.89 ± 0.04. For the CDSS-S trained with understaged pre-treatment labels and tested on the lesions with corrected labels, the test AUC was 0.86 ± 0.04. The likelihood of stage T2 or above for 9 out of the 10 understaged lesions was correctly increased for the CDSS-S trained with corrected labels. The CDSS-S is sensitive to the accuracy of stage labeling. The CDSS-S trained with correct labels shows promise in the prediction of bladder cancer stage.
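
    A minimal sketch of the training/testing loop described: an LDA classifier on radiomic features with leave-one-case-out cross-validation (scikit-learn); the features are simulated stand-ins, not AI-CALS output.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import LeaveOneOut

    rng = np.random.default_rng(7)
    X = rng.normal(size=(84, 10))           # radiomic features per lesion
    y = (rng.random(84) < 0.6).astype(int)  # 1 = stage T2 or above (labels)
    X[y == 1] += 0.8                        # inject a separable signal

    scores = np.empty(len(y))
    for train, test in LeaveOneOut().split(X):
        clf = LinearDiscriminantAnalysis().fit(X[train], y[train])
        scores[test] = clf.decision_function(X[test])
    print("leave-one-out AUC:", round(roc_auc_score(y, scores), 3))
    ```

    Retraining with corrected versus understaged labels, as in the study, amounts to rerunning this loop with a different y.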

  13. A retrospective analysis of preoperative staging modalities for oral squamous cell carcinoma.

    Science.gov (United States)

    Kähling, Ch; Langguth, T; Roller, F; Kroll, T; Krombach, G; Knitschke, M; Streckbein, Ph; Howaldt, H P; Wilbrand, J-F

    2016-12-01

    An accurate preoperative assessment of cervical lymph node status is a prerequisite for individually tailored cancer therapies in patients with oral squamous cell carcinoma. The detection of malignant spread and its treatment crucially influence the prognosis. The aim of the present study was to analyze the different staging modalities used among patients with a diagnosis of primary oral squamous cell carcinoma between 2008 and 2015. An analysis of preoperative staging findings, collected by clinical palpation, ultrasound, and computed tomography (CT), was performed. The results obtained were compared with the results of the final histopathological findings of the neck dissection specimens. A statistical analysis using McNemar's test was performed. The sensitivity of CT for the detection of malignant cervical tumor spread was 74.5%. The ultrasound obtained a sensitivity of 60.8%. Both CT and ultrasound demonstrated significantly enhanced sensitivity compared to the clinical palpation with a sensitivity of 37.1%. No significant difference was observed between CT and ultrasound. A combination of different staging modalities increased the sensitivity significantly compared with ultrasound staging alone. No significant difference in sensitivity was found between the combined use of different staging modalities and CT staging alone. The highest sensitivity, of 80.0%, was obtained by a combination of all three staging modalities: clinical palpation, ultrasound and CT. The present study indicates that CT has an essential role in the preoperative staging of patients with oral squamous cell carcinoma. Its use not only significantly increases the sensitivity of cervical lymph node metastasis detection but also offers a preoperative assessment of local tumor spread and resection borders. An additional non-invasive cervical lymph node examination increases the sensitivity of the tumor staging process and reduces the risk of occult metastasis. Copyright © 2016 European
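
    A minimal sketch of the paired comparison underlying McNemar's test, here comparing two staging modalities on the same set of truly positive necks (statsmodels; the paired counts are hypothetical).

    ```python
    import numpy as np
    from statsmodels.stats.contingency_tables import mcnemar

    # Paired detections on the same necks:
    # rows = CT detected yes/no, columns = palpation detected yes/no
    table = np.array([[30, 22],
                      [ 4, 14]])
    result = mcnemar(table, exact=True)   # exact binomial version
    print("McNemar p-value:", result.pvalue)
    ```

    Only the discordant cells (22 and 4 here) drive the test, which is why it suits comparing sensitivities measured on the same patients.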

  14. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  15. A STATISTICAL ANALYSIS OF LARYNGEAL MALIGNANCIES AT OUR INSTITUTION

    Directory of Open Access Journals (Sweden)

    Bharathi Mohan Mathan

    2017-03-01

    BACKGROUND: Malignancies of the larynx are an increasing global burden, accounting for approximately 2-5% of all malignancies, with an incidence of 3.6/100,000 for men and 1.3/100,000 for women and a male-to-female ratio of 4:1. Smoking and alcohol are major established risk factors. More than 90-95% of all laryngeal malignancies are of squamous cell type. The three main subsites of laryngeal malignancy are the glottis, supraglottis and subglottis. Improved surgical techniques and advanced chemoradiotherapy have increased the overall 5-year survival rate. This study is a statistical analysis of laryngeal malignancies at our institution over a period of one year, examining the pattern of distribution, aetiology, sites and subsites, and causes of recurrence. MATERIALS AND METHODS: Based on the statistical data available at the institution for the one-year period from January 2016 to December 2016, all laryngeal malignancies were analysed with respect to demographic pattern, age, gender, site, subsite, aetiology, staging, treatment received and probable cause of treatment failure. Patients were followed up for 12 months during the study. RESULTS: The total number of cases studied was 27: 23 male and 4 female, a male-to-female ratio of 5.7:1. The most common age was above 60 years, the most common site was the supraglottis, the most common type was moderately differentiated squamous cell carcinoma, and the most common causes of relapse or recurrence were advanced stage of disease and poor differentiation. CONCLUSION: The commonest age of occurrence was above 60 years and the male-to-female ratio was 5.7:1, which is slightly above international figures. The most common site was the supraglottis, not the glottis. Relapses and recurrences were higher than international figures.

  16. Operational statistical analysis of the results of computer-based testing of students

    Directory of Open Access Journals (Sweden)

    Виктор Иванович Нардюжев

    2018-12-01

    The article is devoted to the statistical analysis of the results of computer-based testing used to evaluate students' educational achievements. The topic is relevant because computer-based testing has become an important method for evaluating educational achievements and the quality of the modern educational process in Russian universities. The use of modern methods and programs for the statistical analysis of computer-based testing results and for assessing the quality of the developed tests is a practical problem for every university teacher. The article shows how the authors solve this problem using their own program, “StatInfo”. For several years the program has been successfully applied in a credit system of education at such technological stages as loading computer-based testing protocols into a database, forming queries, and generating reports, lists, and matrices of answers for the statistical analysis of test-item quality. The methodology, experience and some results of its use by university teachers are described in the article. Related topics, including test development, models, algorithms, technologies, and software for large-scale computer-based testing, have been discussed by the authors in previous publications listed in the reference list.

  17. Chronic infections in hip arthroplasties: comparing risk of reinfection following one-stage and two-stage revision: a systematic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Lange J

    2012-03-01

    Jeppe Lange,1,2 Anders Troelsen,3 Reimar W Thomsen,4 Kjeld Søballe1,5 (1Lundbeck Foundation Centre for Fast-Track Hip and Knee Surgery, Aarhus C; 2Center for Planned Surgery, Silkeborg Regional Hospital, Silkeborg; 3Department of Orthopaedics, Hvidovre Hospital, Hvidovre; 4Department of Clinical Epidemiology, Aarhus University Hospital, Aalborg; 5Department of Orthopaedics, Aarhus University Hospital, Aarhus C, Denmark). Background: Two-stage revision is regarded by many as the best treatment of chronic infection in hip arthroplasties. Some international reports, however, have advocated one-stage revision. No systematic review or meta-analysis has ever compared the risk of reinfection following one-stage and two-stage revisions for chronic infection in hip arthroplasties. Methods: The review was performed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. Relevant studies were identified using PubMed and Embase. We assessed studies that included patients with a chronic infection of a hip arthroplasty treated with either one-stage or two-stage revision and with available data on the occurrence of reinfection. We performed a meta-analysis estimating the absolute risk of reinfection using a random-effects model. Results: We identified 36 studies eligible for inclusion. None were randomized controlled trials or comparative studies. The patients in these studies had received either one-stage revision (n = 375) or two-stage revision (n = 929). Reinfection occurred with an estimated absolute risk of 13.1% (95% confidence interval: 10.0%-17.1%) in the one-stage cohort and 10.4% (95% confidence interval: 8.5%-12.7%) in the two-stage cohort. The methodological quality of most included studies was considered low, with insufficient data to evaluate confounding factors. Conclusions: Our results may indicate three additional reinfections per 100 reimplanted patients when performing a one-stage versus a two-stage revision. However, the …

  18. Computerized analysis of fetal heart rate variability signal during the stages of labor.

    Science.gov (United States)

    Annunziata, Maria Laura; Tagliaferri, Salvatore; Esposito, Francesca Giovanna; Giuliano, Natascia; Mereghini, Flavia; Di Lieto, Andrea; Campanile, Marta

    2016-03-01

    To analyze computerized cardiotocographic (cCTG) parameters (baseline fetal heart rate, baseline FHR; short-term variability, STV; approximate entropy, ApEn; low frequency, LF; movement frequency, MF; high frequency, HF) in physiological pregnancy, in order to correlate them with the stages of labor. This could provide more information for understanding the mechanisms of nervous system control of FHR during labor progression. A total of 534 pregnant women were monitored with cCTG from the 37th week, before the onset of spontaneous labor, and during the first and second stages of labor. Statistical analysis was performed using the Kruskal-Wallis test and the Wilcoxon rank-sum test with a Bonferroni-adjusted α […] before labor, and the first and second stages of labor. Differences between some of the stages were found for ApEn, LF and LF/(HF + MF), where the first and the third were reduced and the second was increased. cCTG modifications during labor may reflect the physiological increased activation of the autonomic nervous system. Using computerized fetal heart rate analysis during labor, it may be possible to obtain more information from the fetal cardiac signal than with the traditional tracing. © 2016 Japan Society of Obstetrics and Gynecology.

  19. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source, object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to lesser-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper and Tiku. Thanks to the component-based design and the use of standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated into experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms and the computational features of the Toolkit, and describe the code validation.
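
    Several of the listed tests have readily available counterparts in scipy, which can serve as a reference point for such a toolkit; a minimal sketch follows (this is scipy, not the Toolkit described in the paper).

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    sample = rng.normal(size=500)

    # Kolmogorov-Smirnov against a fully specified standard normal
    print(stats.kstest(sample, "norm"))

    # Anderson-Darling for the normal family (returns critical values)
    print(stats.anderson(sample, dist="norm"))

    # Chi-squared on binned counts versus expected normal bin probabilities
    edges = np.linspace(-3.0, 3.0, 13)
    observed, _ = np.histogram(sample, bins=edges)
    probs = np.diff(stats.norm.cdf(edges))
    expected = probs / probs.sum() * observed.sum()   # match totals
    print(stats.chisquare(observed, expected))
    ```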

  20. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

    The authors have investigated the statistical methods applied in the safety analysis of nuclear reactors and arrived at alarming conclusions. A series of calculations with the generally appreciated safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, the authors came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another formerly unnoticed conclusion of the previous ATHLET calculations is that certain innocent-looking parameters (like the wall roughness factor, the number of bubbles per unit volume, or the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned with the statistical foundation of present-day safety analysis practices and can only hope that their own misjudgment will be dispelled. Until then, the authors suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code, a series of computations is carried out. The input data are not precisely determined, because measured data have errors and calculated data are often obtained from more or less accurate models. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability is known with which the …

  1. Statistical shape analysis with applications in R

    CERN Document Server

    Dryden, Ian L

    2016-01-01

    A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly-regarded `Statistical Shape Analysis’ by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while reta...

  2. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  3. Statistical methods in personality assessment research.

    Science.gov (United States)

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  4. Statistical inference for extended or shortened phase II studies based on Simon's two-stage designs.

    Science.gov (United States)

    Zhao, Junjun; Yu, Menggang; Feng, Xi-Ping

    2015-06-07

    Simon's two-stage designs are popular choices for conducting phase II clinical trials, especially in oncology, to reduce the number of patients placed on ineffective experimental therapies. Recently Koyama and Chen (2008) discussed how to conduct proper inference for such studies, having found that inference procedures used with Simon's designs almost always ignore the actual sampling plan used. In particular, they proposed an inference method for studies in which the actual second-stage sample sizes differ from the planned ones. We consider an alternative inference method based on the likelihood ratio. In particular, we order permissible sample paths under Simon's two-stage designs using their corresponding conditional likelihoods. In this way, we can calculate p-values using the common definition: the probability of obtaining a test statistic value at least as extreme as that observed under the null hypothesis. In addition to providing inference for a couple of scenarios where Koyama and Chen's method can be difficult to apply, the estimate based on our method appears to have certain advantages in terms of inference properties in many numerical simulations. It generally led to smaller biases and narrower confidence intervals while maintaining similar coverage. We also illustrate the two methods in a real data setting. Inference procedures used with Simon's designs almost always ignore the actual sampling plan: reported p-values, point estimates and confidence intervals for the response rate are not usually adjusted for the design's adaptiveness. Proper statistical inference procedures should be used.
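
    The p-value definition quoted in the abstract — the null probability of a sample path at least as extreme as the one observed, under some ordering of the permissible paths — can be sketched directly. The sketch below orders paths by the overall response rate, a deliberate simplification of the conditional-likelihood ordering the authors propose; the design parameters are hypothetical:

```python
from scipy.stats import binom

def exact_p_value(x1_obs, x2_obs, n1, r1, n2, p0):
    """Exact one-sided p-value under a Simon two-stage design.

    A path is (x1, None) if the trial stopped for futility (x1 <= r1)
    or (x1, x2) if it continued to stage two.  Paths are ordered here
    by overall response rate; the paper orders them by conditional
    likelihood instead.
    """
    paths = [(x1, None) for x1 in range(r1 + 1)]
    paths += [(x1, x2) for x1 in range(r1 + 1, n1 + 1)
                       for x2 in range(n2 + 1)]

    def rate(path):
        x1, x2 = path
        return x1 / n1 if x2 is None else (x1 + x2) / (n1 + n2)

    def prob(path):
        x1, x2 = path
        p = binom.pmf(x1, n1, p0)
        return p if x2 is None else p * binom.pmf(x2, n2, p0)

    observed = rate((x1_obs, x2_obs))
    return sum(prob(path) for path in paths if rate(path) >= observed)

# Hypothetical design: n1 = 10, stop if <= 2 responses, else enroll 19 more.
print(exact_p_value(x1_obs=4, x2_obs=8, n1=10, r1=2, n2=19, p0=0.2))
```

    Because the stopped and continued paths together partition the sample space, the path probabilities sum to one and the tail sum above is a valid exact p-value for whichever ordering is chosen.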

  5. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variation in mass phenomena. In fact, statistics comprise a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistics and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  6. Statistical Analysis of Research Data | Center for Cancer Research

    Science.gov (United States)

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data.  The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.

  7. Sensitivity of the Hydrogen Epoch of Reionization Array and its build-out stages to one-point statistics from redshifted 21 cm observations

    Science.gov (United States)

    Kittiwisit, Piyanat; Bowman, Judd D.; Jacobs, Daniel C.; Beardsley, Adam P.; Thyagarajan, Nithyanandan

    2018-03-01

    We present a baseline sensitivity analysis of the Hydrogen Epoch of Reionization Array (HERA) and its build-out stages to one-point statistics (variance, skewness, and kurtosis) of redshifted 21 cm intensity fluctuation from the Epoch of Reionization (EoR) based on realistic mock observations. By developing a full-sky 21 cm light-cone model, taking into account the proper field of view and frequency bandwidth, utilizing a realistic measurement scheme, and assuming perfect foreground removal, we show that HERA will be able to recover statistics of the sky model with high sensitivity by averaging over measurements from multiple fields. All build-out stages will be able to detect variance, while skewness and kurtosis should be detectable for HERA128 and larger. We identify sample variance as the limiting constraint of the measurements at the end of reionization. The sensitivity can also be further improved by performing frequency windowing. In addition, we find that strong sample variance fluctuation in the kurtosis measured from an individual field of observation indicates the presence of outlying cold or hot regions in the underlying fluctuations, a feature that can potentially be used as an EoR bubble indicator.
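
    The one-point statistics in question are simply low-order moments taken pixel by pixel over the image. A toy illustration (a Gaussian mock field with an injected cold region, not the HERA sky model) shows why kurtosis reacts to outlying cold or hot regions:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(1)
# Mock brightness-temperature map: Gaussian fluctuations plus one
# "cold region" standing in for an ionized bubble.
field = rng.normal(0.0, 5.0, size=(256, 256))
field[100:130, 100:130] -= 15.0

pixels = field.ravel()
print(f"variance: {np.var(pixels):.2f}")
print(f"skewness: {skew(pixels):.3f}")      # asymmetry from the cold patch
print(f"kurtosis: {kurtosis(pixels):.3f}")  # excess kurtosis; 0 if Gaussian
```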

  8. Statistical analysis with Excel for dummies

    CERN Document Server

    Schmuller, Joseph

    2013-01-01

    Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro

  9. Analysis and Evaluation of Statistical Models for Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    Sáenz-Noval J.J.

    2011-10-01

    Statistical models for integrated circuits (ICs) allow us to estimate the percentage of acceptable devices in a batch before fabrication. Currently, Pelgrom's is the statistical model most widely accepted in industry; however, it was derived from a micrometer technology, which does not guarantee its reliability for nanometric manufacturing processes. This work considers three of the most relevant statistical models in the industry and evaluates their limitations and advantages in analog design, so that the designer has a better criterion for making a choice. Moreover, it shows how several statistical models can be used for each of the stages and design purposes.
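
    Pelgrom's model referenced above states that the mismatch standard deviation of a parameter difference between two identically drawn devices scales inversely with the square root of gate area, sigma(dP) = A_P / sqrt(W*L). A small sketch with an invented process constant:

```python
import math

def pelgrom_sigma(a_p, w_um, l_um):
    """Mismatch sigma per Pelgrom: sigma(dP) = A_P / sqrt(W * L)."""
    return a_p / math.sqrt(w_um * l_um)

# Hypothetical threshold-voltage coefficient: A_Vt = 3.5 mV*um.
for w, l in [(1.0, 1.0), (2.0, 2.0), (10.0, 10.0)]:
    print(f"W = {w:4.1f} um, L = {l:4.1f} um -> "
          f"sigma(dVt) = {pelgrom_sigma(3.5, w, l):.2f} mV")
```

    Quadrupling the gate area halves the mismatch sigma, which is exactly the trade-off an analog designer weighs against silicon cost.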

  10. Statistical analysis of dynamic parameters of the core

    International Nuclear Information System (INIS)

    Ionov, V.S.

    2007-01-01

    Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and at NPPs. The dynamic parameters of the neutron transients were explored by statistical analysis tools. The records have sufficient duration and comprise a few channels for chamber currents and reactivity, as well as some channels for technological parameters. From these values the inverse period, reactivity, neutron lifetime, reactivity coefficients and some reactivity effects are determined, and the values of the measured dynamic parameters were reconstructed as a result of the analysis. The mathematical means of statistical analysis used were: approximation (A), filtration (F), rejection (R), estimation of descriptive statistic parameters (DSP), correlation characteristics (kk), regression analysis (KP), prognosis (P), and statistical criteria (SC). The calculation procedures were implemented in MATLAB. The sources of methodical and statistical errors are presented: inadequacy of the model, the precision of neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the technique for processing the registered data, etc. Examples of the results of the statistical analysis are given. Problems of the validity of the methods used for the definition and certification of the values of statistical parameters and dynamic characteristics are considered (Authors)

  11. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Parameter estimation with confidence intervals and statistical hypothesis testing are used in statistical analysis to draw conclusions about a population from a sample extracted from it. The case study presented in the paper aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained from confidence intervals and hypothesis tests. While statistical hypothesis testing only gives a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and from findings that are "marginally significant" or "almost significant" (p very close to 0.05).
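
    The point about small samples is easy to reproduce: a test can return a borderline p-value while the confidence interval reveals how imprecise the estimate still is. A quick sketch (the population parameters are invented for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
true_mean = 10.2          # population mean; the test asks whether mu = 10

for n in (10, 100, 1000):
    sample = rng.normal(true_mean, 2.0, size=n)
    _, p = stats.ttest_1samp(sample, popmean=10.0)
    lo, hi = stats.t.interval(0.95, df=n - 1,
                              loc=sample.mean(), scale=stats.sem(sample))
    print(f"n = {n:4d}:  p = {p:.3f},  95% CI = ({lo:.2f}, {hi:.2f})")
```

    At small n the interval is wide enough to make the "significant"/"non-significant" dichotomy look fragile, which is the paper's point.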

  12. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis

  13. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2013-01-01

    Statistics and Analysis of Scientific Data covers the foundations of probability theory and statistics, and a number of numerical and analytical methods that are essential for the present-day analyst of scientific data. Topics covered include probability theory, distribution functions of statistics, fits to two-dimensional datasheets and parameter estimation, Monte Carlo methods and Markov chains. Equal attention is paid to the theory and its practical application, and results from classic experiments in various fields are used to illustrate the importance of statistics in the analysis of scientific data. The main pedagogical method is a theory-then-application approach, where emphasis is placed first on a sound understanding of the underlying theory of a topic, which becomes the basis for an efficient and proactive use of the material for practical applications. The level is appropriate for undergraduates and beginning graduate students, and as a reference for the experienced researcher. Basic calculus is us...

  14. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques-- now updated and revised In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of inte

  15. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  16. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    Science.gov (United States)

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer, primarily affecting women. Research continues into detecting breast cancer at an early stage, as the possibility of cure in the early stages is high. This study has two main objectives: first, to establish statistics for breast cancer, and second, to identify methodologies that can be helpful in early-stage detection of breast cancer, based on previous studies. Breast cancer statistics for incidence and mortality in the UK, US, India and Egypt were considered for this study. The findings show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms, providing a strong bridge towards improving the classification and detection accuracy of breast cancer data.

  17. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  18. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  19. Statistical hot spot analysis of reactor cores

    International Nuclear Information System (INIS)

    Schaefer, H.

    1974-05-01

    This report is an introduction to statistical hot spot analysis. After a definition of the term "hot spot", a statistical analysis is outlined. The mathematical method is presented; in particular, the formula for the probability of no hot spots in a reactor core is evaluated. A discussion of the boundary conditions of a statistical hot spot analysis is given (technological limits, nominal situation, uncertainties). The application of hot spot analysis to the linear power of pellets and the temperature rise in cooling channels is demonstrated for the test zone of KNK II. Basic quantities, such as the probability of no hot spots, the hot spot potential, the expected hot spot diagram and the cumulative distribution function of hot spots, are discussed. It is shown that the risk of hot channels can be spread equally over all subassemblies by an adequate choice of the nominal temperature distribution in the core
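
    The core formula of such an analysis — the probability of no hot spots — takes a simple form under independence: if each of N channels exceeds its limit with probability p, then P(no hot spot) = (1 - p)^N. A sketch with invented numbers:

```python
from scipy.stats import norm

n_channels = 20_000                 # hypothetical number of cooling channels
nominal, limit, sigma = 560.0, 620.0, 15.0  # temperatures in deg C (made up)

# Probability that a single channel exceeds its limit, assuming a normally
# distributed uncertainty around the nominal channel temperature.
p_exceed = norm.sf((limit - nominal) / sigma)

# Independent channels: probability that no channel anywhere is a hot spot.
p_no_hot_spot = (1.0 - p_exceed) ** n_channels
print(f"P(single channel exceeds limit) = {p_exceed:.2e}")
print(f"P(no hot spot in the core)      = {p_no_hot_spot:.3f}")
```

    Even a per-channel exceedance probability of a few in 100,000 leaves only about a 50% chance of a hot-spot-free core once 20,000 channels are considered, which is why the analysis works with the full core-wide probability rather than channel-by-channel margins.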

  20. The statistical analysis of anisotropies

    International Nuclear Information System (INIS)

    Webster, A.

    1977-01-01

    One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes, and since they are not all equally good this contribution is presented as a brief guide to what seem to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)

  1. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables, and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  2. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses that differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or LaTeX. The main part of this paper is an example showing how to combine statistics programs in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  3. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals', and clarify the importance of including both values in a paper.

  4. The IASLC Lung Cancer Staging Project

    DEFF Research Database (Denmark)

    Chansky, Kari; Detterbeck, Frank C; Nicholson, Andrew G

    2017-01-01

    INTRODUCTION: Revisions to the TNM stage classifications for lung cancer, informed by the international database (N = 94,708) of the International Association for the Study of Lung Cancer (IASLC) Staging and Prognostic Factors Committee, need external validation. The objective was to externally validate the revisions by using the National Cancer Data Base (NCDB) of the American College of Surgeons. METHODS: Cases presenting from 2000 through 2012 were drawn from the NCDB and reclassified according to the eighth edition stage classification. Clinically and pathologically staged subsets of NSCLC demonstrated consistent ability to discriminate TNM categories and stage groups for clinical and pathologic stage. CONCLUSIONS: The IASLC revisions made for the eighth edition of lung cancer staging are validated by this analysis of the NCDB database by the ordering, statistical differences, and homogeneity...

  5. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2017-01-01

    The revised second edition of this textbook provides the reader with a solid foundation in probability theory and statistics as applied to the physical sciences, engineering and related fields. It covers a broad range of numerical and analytical methods that are essential for the correct analysis of scientific data, including probability theory, distribution functions of statistics, fits to two-dimensional data and parameter estimation, Monte Carlo methods and Markov chains. Features new to this edition include: • a discussion of statistical techniques employed in business science, such as multiple regression analysis of multivariate datasets. • a new chapter on the various measures of the mean including logarithmic averages. • new chapters on systematic errors and intrinsic scatter, and on the fitting of data with bivariate errors. • a new case study and additional worked examples. • mathematical derivations and theoretical background material have been appropriately marked, to improve the readabili...

  6. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...

  7. Bayesian Inference in Statistical Analysis

    CERN Document Server

    Box, George E P

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Rob

  8. Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals

    Directory of Open Access Journals (Sweden)

    Syafiq Hazwan

    2016-01-01

    Cooling systems are a key element in the hot forming process of Ultra High Strength Steels (UHSS). Normally, cooling channels are made using a deep drilling technique. Although deep twist drilling offers higher productivity than other drilling techniques, its main problem is premature tool breakage, which affects production quality. In this paper, an analysis of deep twist drill process parameters such as cutting speed, feed rate and depth of cut, using statistical analysis to identify the tool condition, is presented. A comparison between two different tool geometries is also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation and skewness. It was found that the kurtosis and skewness values are the most appropriate parameters to represent deep twist drill tool condition behaviour from the vibration and force data. The condition of the deep twist drill process was classified as good, blunt or fractured. It was also found that the tool geometry parameters affect the performance of the drill. We believe the results of this study are useful in determining a suitable analysis method for developing an online tool condition monitoring system that identifies the tertiary tool life stage and helps to avoid premature tool fracture during the drilling process.
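
    The statistical features listed above are one-line computations on the raw sensor signal. A sketch on synthetic vibration traces (not the paper's dynamometer data) shows how sparse one-sided impacts from a worn edge inflate kurtosis and skewness while barely moving the mean:

```python
import numpy as np
from scipy.stats import kurtosis, skew

def drill_features(signal):
    """Summary statistics used to characterise tool condition."""
    return {
        "rms": float(np.sqrt(np.mean(signal**2))),
        "mean": float(np.mean(signal)),
        "std": float(np.std(signal)),
        "skewness": float(skew(signal)),
        "kurtosis": float(kurtosis(signal)),  # impacts inflate kurtosis
    }

rng = np.random.default_rng(3)
good = rng.normal(0.0, 1.0, 50_000)             # smooth cutting vibration
blunt = rng.normal(0.0, 1.0, 50_000)
mask = rng.random(50_000) < 0.01                # sparse one-sided impacts
blunt[mask] += np.abs(rng.normal(0.0, 8.0, mask.sum()))

for name, sig in (("good", good), ("blunt", blunt)):
    print(name, drill_features(sig))
```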

  9. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  10. Data analysis in high energy physics a practical guide to statistical methods

    CERN Document Server

    Behnke, Olaf; Kröninger, Kevin; Schott, Grégory; Schörner-Sadenius, Thomas

    2013-01-01

    This practical guide covers the most essential statistics-related tasks and problems encountered in high-energy physics data analyses. It addresses both advanced students entering the field of particle physics as well as researchers looking for a reliable source on optimal separation of signal and background, determining signals or estimating upper limits, correcting the data for detector effects and evaluating systematic uncertainties. Each chapter is dedicated to a single topic and supplemented by a substantial number of both paper and computer exercises related to real experiments, with the solutions provided at the end of the book along with references. A special feature of the book is the analysis walk-throughs used to illustrate the application of the methods discussed beforehand. The authors give examples of data analysis, referring to real problems in HEP, and display the different stages of data analysis in a descriptive manner. The accompanying website provides more algorithms as well as up-to-date...

  11. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    Science.gov (United States)

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.

  12. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  13. Statistical analysis of correlated experimental data and neutron cross section evaluation

    International Nuclear Information System (INIS)

    Badikov, S.A.

    1998-01-01

    A technique for the evaluation of neutron cross sections on the basis of statistical analysis of correlated experimental data is presented. The most important stages of the evaluation, from the compilation of the correlation matrix of measurement uncertainties to the representation of the analysis results in the ENDF-6 format, are described in detail. Special attention is paid to the restriction (positive definiteness), derived from physical reasons, on the covariance matrix of the approximated parameters' uncertainties generated by the least-squares fit. The requirements on the source experimental data that ensure this restriction is satisfied are formulated; in particular, the correlation matrices of the measurement uncertainties should themselves be positive definite. Ways of modelling positive-definite correlation matrices of measurement uncertainties, for situations where their direct calculation from the experimental information is impossible, are discussed. The technique described is used to create a new generation of evaluated dosimetric reaction cross sections for the first version of the Russian dosimetric file (including nontrivial covariance information)
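
    The positive-definiteness requirement discussed above can be checked numerically: a Cholesky factorization succeeds exactly when a symmetric matrix is positive definite. A sketch with a deliberately inconsistent "correlation matrix":

```python
import numpy as np

def is_positive_definite(corr):
    """Cholesky succeeds iff the matrix is symmetric positive definite."""
    try:
        np.linalg.cholesky(corr)
        return True
    except np.linalg.LinAlgError:
        return False

# Symmetric with unit diagonal, but NOT positive definite -- exactly the
# kind of inconsistent correlation matrix the evaluation must reject.
bad = np.array([[ 1.0, 0.9, -0.9],
                [ 0.9, 1.0,  0.9],
                [-0.9, 0.9,  1.0]])
print(is_positive_definite(bad))   # False
print(np.linalg.eigvalsh(bad))     # exposes the negative eigenvalue
```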

  14. Methodology of comparative statistical analysis of Russian industry based on cluster analysis

    Directory of Open Access Journals (Sweden)

    Sergey S. Shishulin

    2017-01-01

    The article is devoted to researching the possibilities of applying multidimensional statistical analysis in the study of industrial production, on the basis of comparing its growth rates and structure with those of other developed and developing countries of the world. The purpose of this article is to determine the optimal set of statistical methods, and the results of their application to industrial production data, that give the best access to the analysis of the results. The data include such indicators as output, gross value added, the number of employed, and other indicators from the system of national accounts and operational business statistics. The objects of observation are the industries of the countries of the Customs Union, the United States, Japan and Europe in 2005-2015. The research tools range from the simplest methods of transformation and graphical and tabular visualization of data to methods of statistical analysis. In particular, based on a specialized software package (SPSS), the principal components method, discriminant analysis, hierarchical methods of cluster analysis, Ward's method, and k-means were applied. The application of the principal components method to the initial data makes it possible to substantially and effectively reduce the initial space of industrial production data. For example, in analyzing the structure of industrial production, the reduction was from fifteen industries to three basic, well-interpreted factors: relatively extractive industries (with a low degree of processing), high-tech industries, and consumer goods (medium-technology sectors). At the same time, a comparison of the results of applying cluster analysis to the initial data and to the data obtained with the principal components method established that clustering industrial production data on the basis of the new factors significantly improves the clustering results. As a result of analyzing the parameters of...
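
    The pipeline described — reduce many industry indicators to a few principal components, then cluster the objects in factor space — is straightforward to reproduce. A sketch using scikit-learn on invented data (the paper itself used SPSS):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Toy stand-in for the paper's data: 30 "countries" x 15 "industry shares".
shares = rng.dirichlet(alpha=np.ones(15), size=30)

# Stage 1: reduce fifteen industry shares to a few principal components.
x = StandardScaler().fit_transform(shares)
pca = PCA(n_components=3).fit(x)
scores = pca.transform(x)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))

# Stage 2: cluster the countries in the reduced factor space.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))
```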

  15. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. A rich variety of advanced and recent statistical modelling tools is available mostly in open source software (one of them being R). However, these advanced statistical modelling tools are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical modelling tools are readily available, accessible and applicable on the web. We previously made interfaces in the form of e-tutorials for several modern and advanced statistical modelling tools in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare in order to find the most appropriate model for the data.

  16. Application of Ontology Technology in Health Statistic Data Analysis.

    Science.gov (United States)

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistics data. Proposed methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of the health statistics data. Six classes of top-level ontology concepts and their subclasses were extracted, and the object properties and data properties were defined to establish the construction of these classes. By ontology instantiation, we can integrate multi-source heterogeneous data and enable administrators to have an overall understanding and analysis of the health statistics data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source and heterogeneous health system management data and the enhancement of management efficiency.

  17. Combining evidence from multiple electronic health care databases: performances of one-stage and two-stage meta-analysis in matched case-control studies.

    Science.gov (United States)

    La Gamba, Fabiola; Corrao, Giovanni; Romio, Silvana; Sturkenboom, Miriam; Trifirò, Gianluca; Schink, Tania; de Ridder, Maria

    2017-10-01

    Clustering of patients in databases is usually ignored in one-stage meta-analysis of multi-database studies using matched case-control data. The aim of this study was to compare bias and efficiency of such a one-stage meta-analysis with a two-stage meta-analysis. First, we compared the approaches by generating matched case-control data under 5 simulated scenarios, built by varying: (1) the exposure-outcome association; (2) its variability among databases; (3) the confounding strength of one covariate on this association; (4) its variability; and (5) the (heterogeneous) confounding strength of two covariates. Second, we made the same comparison using empirical data from the ARITMO project, a multiple database study investigating the risk of ventricular arrhythmia following the use of medications with arrhythmogenic potential. In our study, we specifically investigated the effect of current use of promethazine. Bias increased for one-stage meta-analysis with increasing (1) between-database variance of exposure effect and (2) heterogeneous confounding generated by two covariates. The efficiency of one-stage meta-analysis was slightly lower than that of two-stage meta-analysis for the majority of investigated scenarios. Based on ARITMO data, there were no evident differences between one-stage (OR = 1.50, CI = [1.08; 2.08]) and two-stage (OR = 1.55, CI = [1.12; 2.16]) approaches. When the effect of interest is heterogeneous, a one-stage meta-analysis ignoring clustering gives biased estimates. Two-stage meta-analysis generates estimates at least as accurate and precise as one-stage meta-analysis. However, in a study using small databases and rare exposures and/or outcomes, a correct one-stage meta-analysis becomes essential. Copyright © 2017 John Wiley & Sons, Ltd.
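
    The second stage of the two-stage approach compared above is just inverse-variance pooling of the per-database estimates. A fixed-effect sketch with made-up log odds ratios (in a study like this one they would come from, e.g., conditional logistic regressions fitted separately in each database):

```python
import numpy as np

# Hypothetical stage-one output: a log odds ratio and its standard error
# estimated within each of three databases.
log_or = np.array([0.45, 0.30, 0.52])
se = np.array([0.20, 0.15, 0.30])

# Stage two: fixed-effect inverse-variance pooling.
w = 1.0 / se**2
pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = pooled + np.array([-1.96, 1.96]) * pooled_se
print(f"pooled OR = {np.exp(pooled):.2f}, "
      f"95% CI = ({np.exp(ci[0]):.2f}, {np.exp(ci[1]):.2f})")
```

    A random-effects variant would widen the weights by an estimated between-database variance, which matters precisely in the heterogeneous-effect scenarios where the abstract reports one-stage bias.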

  18. Explorations in Statistics: The Analysis of Change

    Science.gov (United States)

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  19. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper PMID:25878958

  20. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  1. Acceptance Probability (P a) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that the underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of the future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products; quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with Cpk > 1.33 as performing well within statistical control; a Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
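
    The 63-per-million figure follows from the normality assumption: a centered process with Cpk = 4/3 has specification limits 4 sigma from the mean, so the two-sided defect rate is 2*Phi(-4). A quick check:

```python
from scipy.stats import norm

def defect_ppm(cpk):
    """Two-sided defect rate (ppm) for a centered normal process."""
    return 2.0 * norm.cdf(-3.0 * cpk) * 1e6

for cpk in (1.00, 4.0 / 3.0, 1.67):
    ppm = defect_ppm(cpk)
    print(f"Cpk = {cpk:.2f}: {ppm:9.1f} ppm defective, "
          f"acceptance probability = {1 - ppm / 1e6:.6f}")
```

    At Cpk = 4/3 this prints roughly 63 ppm, i.e., an acceptance probability above 99.99%, matching the figures quoted above.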

  2. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S

  3. A novel statistic for genome-wide interaction analysis.

    Directory of Open Access Journals (Sweden)

    Xuesen Wu

    2010-09-01

    Although great progress has been made in genome-wide association studies (GWAS), the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of the heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interaction at FDR < 0.001 and at a less stringent FDR threshold, respectively. Genome-wide interaction analysis is a valuable tool for finding the remaining missing heritability unexplained by current GWAS, and the developed novel statistic is able to search for significant interactions between SNPs across the genome. Real data analysis showed that the results of genome-wide interaction analysis can be replicated in two independent studies.

  4. Analysis of neutron flux measurement systems using statistical functions

    International Nuclear Information System (INIS)

    Pontes, Eduardo Winston

    1997-01-01

    This work develops an integrated analysis for neutron flux measurement systems using the concepts of cumulants and spectra. Its major contribution is the generalization of Campbell's theorem in the form of spectra in the frequency domain, and its application to the analysis of neutron flux measurement systems. Campbell's theorem, in its generalized form, constitutes an important tool not only for finding the nth-order frequency spectra of the radiation detector, but also for the system analysis. The radiation detector, an ionization chamber for neutrons, is modeled for cylindrical, plane and spherical geometries. The detector current pulses are characterized by a vector of random parameters, and the associated charges, statistical moments and frequency spectra of the resulting current are calculated. A computer program is developed to apply the proposed methodology. So that the analysis can cover the associated electronics, the signal processor is studied, considering both analog and digital configurations. The analysis is unified by developing the concept of equivalent systems, which can be used to describe the cumulants and spectra in analog or digital systems. The noise in the signal processor input stage is analysed in terms of its second-order spectrum. Mathematical expressions are presented for cumulants and spectra up to fourth order, for important cases of filter positioning relative to the detector spectra. Unbiased conventional estimators for the cumulants are used and, to evaluate system precision and response time, expressions are developed for their variances. Finally, some possibilities for obtaining the neutron flux as a function of the cumulants are discussed. In summary, this work proposes analysis tools which make possible important decisions in the design of better neutron flux measurement systems. (author)
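
    In its simplest (first- and second-order) form, Campbell's theorem says that a Poisson train of pulses of shape f(t), charge q and rate r produces a current with mean r*q*integral(f) and variance r*q^2*integral(f^2); the generalization described above extends this to higher-order cumulants and frequency spectra. A discrete-time simulation check with toy parameters:

```python
import numpy as np

rng = np.random.default_rng(11)
rate, q, tau = 5.0e4, 1.0, 1.0e-3  # pulse rate (1/s), pulse charge, decay time
dt, n = 1.0e-5, 50_000             # sampling step (s) and number of samples

# Poisson pulse train convolved with an exponential detector pulse shape f(t).
counts = rng.poisson(rate * dt, size=n)
f = np.exp(-np.arange(0.0, 10 * tau, dt) / tau)
current = q * np.convolve(counts, f)[:n]
steady = current[len(f):]          # drop the start-up transient

# Campbell's theorem: mean = r*q*int(f), variance = r*q^2*int(f^2).
print(f"mean:     simulated {steady.mean():.2f}  "
      f"theory {rate * q * f.sum() * dt:.2f}")
print(f"variance: simulated {steady.var():.2f}  "
      f"theory {rate * q**2 * (f**2).sum() * dt:.2f}")
```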

  5. A statistical approach to plasma profile analysis

    International Nuclear Information System (INIS)

    Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.

    1990-05-01

    A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent, classical as well as robust, methods of estimation are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous approach to the empirical testing of plasma profile invariance. (orig.)

  6. Study designs, use of statistical tests, and statistical analysis software choice in 2015: Results from two Pakistani monthly Medline indexed journals.

    Science.gov (United States)

    Shaikh, Masood Ali

    2017-09-01

    Assessment of research articles in terms of the study designs used, the statistical tests applied and the use of statistical analysis programmes helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of the study designs used, the statistical tests applied, and the statistical analysis programmes used. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. The results of this study indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software, respectively. These results echo the previously published assessment of these two journals for the year 2014.

  7. Statistical analysis of brake squeal noise

    Science.gov (United States)

    Oberst, S.; Lai, J. C. S.

    2011-06-01

    Despite substantial research efforts applied to the prediction of brake squeal noise since the early 20th century, the mechanisms behind its generation are still not fully understood. Squealing brakes are of significant concern to the automobile industry, mainly because of the costs associated with warranty claims. In order to remedy the problems inherent in designing quieter brakes and, therefore, to understand the mechanisms, a design of experiments study, using a noise dynamometer, was performed by a brake system manufacturer to determine the influence of geometrical parameters (namely, the number and location of slots) of brake pads on brake squeal noise. The experimental results were evaluated with a noise index and ranked for warm and cold brake stops. These data are analysed here using statistical descriptors based on population distributions, and a correlation analysis, to gain greater insight into the functional dependency between the time-averaged friction coefficient as the input and the peak sound pressure level data as the output quantity. The correlation analysis between the time-averaged friction coefficient and peak sound pressure data is performed by applying a semblance analysis and a joint recurrence quantification analysis. Linear measures are compared with complexity measures (nonlinear) based on statistics from the underlying joint recurrence plots. Results show that linear measures cannot be used to rank the noise performance of the four test pad configurations. On the other hand, the ranking of the noise performance of the test pad configurations based on the noise index agrees with that based on nonlinear measures: the higher the nonlinearity between the time-averaged friction coefficient and peak sound pressure, the worse the squeal. These results highlight the nonlinear character of brake squeal and indicate the potential of using nonlinear statistical analysis tools to analyse disc brake squeal.

  8. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  9. Analysis of room transfer function and reverberant signal statistics

    DEFF Research Database (Denmark)

    Georganti, Eleftheria; Mourjopoulos, John; Jacobsen, Finn

    2008-01-01

    For some time now, statistical analysis has been a valuable tool in analyzing room transfer functions (RTFs). This work examines existing statistical time-frequency models and techniques for RTF analysis (e.g., Schroeder's stochastic model and the standard deviation over frequency bands for the RTF magnitude and phase). RTF fractional-octave smoothing, as with 1/3-octave analysis, may lead to RTF simplifications that can be useful for several audio applications, such as room compensation, room modeling and auralisation. The aim of this work is to identify the relationship of optimal response... and the corresponding ratio of the direct and reverberant signal. In addition, this work examines the statistical quantities for speech and audio signals prior to their reproduction within rooms and when recorded in rooms. Histograms and other statistical distributions are used to compare RTF minima of typical...

  10. Transit safety & security statistics & analysis 2002 annual report (formerly SAMIS)

    Science.gov (United States)

    2004-12-01

    The Transit Safety & Security Statistics & Analysis 2002 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  11. Transit safety & security statistics & analysis 2003 annual report (formerly SAMIS)

    Science.gov (United States)

    2005-12-01

    The Transit Safety & Security Statistics & Analysis 2003 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  12. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  13. Geographical, temporal and racial disparities in late-stage prostate cancer incidence across Florida: A multiscale joinpoint regression analysis

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2011-12-01

    Background: Although prostate cancer-related incidence and mortality have declined recently, striking racial/ethnic differences persist in the United States. Visualizing and modelling temporal trends of prostate cancer late-stage incidence, and how they vary according to geographic location and race, should help explain such disparities. Joinpoint regression is increasingly used to identify the timing and extent of changes in time series of health outcomes. Yet most analyses of temporal trends are aspatial and conducted at the national level or for a single cancer registry. Methods: Time series (1981-2007) of annual proportions of prostate cancer late-stage cases were analyzed for non-Hispanic Whites and non-Hispanic Blacks in each county of Florida. Noise in the data was first filtered by binomial kriging and the results were modelled using joinpoint regression. A similar analysis was also conducted at the state level and for groups of metropolitan and non-metropolitan counties. Significant racial differences were detected using tests of parallelism and coincidence of time trends. A new disparity statistic was introduced to measure spatial and temporal changes in the frequency of racial disparities. Results: The state-level percentage of late-stage diagnosis decreased 50% since 1981, a decline that accelerated in the 90's when Prostate Specific Antigen (PSA) screening was introduced. Analysis at the metropolitan and non-metropolitan levels revealed that the frequency of late-stage diagnosis increased recently in urban areas, and this trend was significant for white males. The annual rate of decrease in late-stage diagnosis and the onset years for significant declines varied greatly among counties and racial groups. Most counties with a non-significant average annual percent change (AAPC) were located in the Florida Panhandle for white males, whereas they clustered in South-eastern Florida for black males. The new disparity statistic indicated...
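
    The AAPC reported in studies like this one is a length-weighted average of the joinpoint segment slopes on the log scale, back-transformed to a percent change. A sketch with hypothetical segment estimates (not the Florida values):

```python
import numpy as np

def aapc(slopes, segment_years):
    """Average annual percent change from joinpoint regression output.

    slopes: regression slopes of log(rate) vs. year within each segment;
    segment_years: number of years covered by each segment.
    AAPC = 100 * (exp(sum(w_i * b_i) / sum(w_i)) - 1).
    """
    w = np.asarray(segment_years, dtype=float)
    b = np.asarray(slopes, dtype=float)
    return 100.0 * (np.exp(np.sum(w * b) / np.sum(w)) - 1.0)

# Hypothetical joinpoint fit to late-stage proportions: a nearly flat
# trend for 10 years, then a steeper decline over the next 17 years.
print(f"AAPC = {aapc([-0.005, -0.035], [10, 17]):.2f}% per year")
```

    Each segment's own annual percent change is 100*(exp(b)-1); the AAPC simply averages the slopes before back-transforming, so a long steep segment dominates a short flat one.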

  14. Geographical, temporal and racial disparities in late-stage prostate cancer incidence across Florida: a multiscale joinpoint regression analysis.

    Science.gov (United States)

    Goovaerts, Pierre; Xiao, Hong

    2011-12-05

    Although prostate cancer-related incidence and mortality have declined recently, striking racial/ethnic differences persist in the United States. Visualizing and modelling temporal trends of prostate cancer late-stage incidence, and how they vary according to geographic location and race, should help explain such disparities. Joinpoint regression is increasingly used to identify the timing and extent of changes in time series of health outcomes. Yet, most analyses of temporal trends are aspatial and conducted at the national level or for a single cancer registry. Time series (1981-2007) of annual proportions of prostate cancer late-stage cases were analyzed for non-Hispanic Whites and non-Hispanic Blacks in each county of Florida. Noise in the data was first filtered by binomial kriging and results were modelled using joinpoint regression. A similar analysis was also conducted at the state level and for groups of metropolitan and non-metropolitan counties. Significant racial differences were detected using tests of parallelism and coincidence of time trends. A new disparity statistic was introduced to measure spatial and temporal changes in the frequency of racial disparities. State-level percentage of late-stage diagnosis decreased 50% since 1981; a decline that accelerated in the '90s when Prostate Specific Antigen (PSA) screening was introduced. Analysis at the metropolitan and non-metropolitan levels revealed that the frequency of late-stage diagnosis increased recently in urban areas, and this trend was significant for white males. The annual rate of decrease in late-stage diagnosis and the onset years for significant declines varied greatly among counties and racial groups. Most counties with non-significant average annual percent change (AAPC) were located in the Florida Panhandle for white males, whereas they clustered in South-eastern Florida for black males. The new disparity statistic indicated that the spatial extent of racial disparities reached a...
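
    As a rough, hedged illustration of the joinpoint idea referenced above - piecewise-linear trends with an estimated change point - the sketch below grid-searches a single joinpoint on synthetic late-stage proportions. It is only a minimal stand-in: the published analysis applied the full joinpoint methodology (permutation tests for the number of joinpoints) to kriging-filtered county data.

      # Minimal single-joinpoint (piecewise-linear) fit by grid search.
      # Illustrative only: real joinpoint software selects the number of
      # joinpoints with permutation tests; the data here are synthetic.
      import numpy as np

      def fit_single_joinpoint(years, rates):
          """Best continuous two-segment linear fit over candidate breaks."""
          best = None
          for jp in years[2:-2]:                    # keep >= 2 points per segment
              hinge = np.clip(years - jp, 0, None)  # (x - jp)+ basis term
              X = np.column_stack([np.ones_like(years), years, hinge])
              beta, *_ = np.linalg.lstsq(X, rates, rcond=None)
              resid = rates - X @ beta
              err = float(resid @ resid)
              if best is None or err < best[1]:
                  best = (jp, err, (beta[1], beta[1] + beta[2]))
          return best

      rng = np.random.default_rng(0)
      years = np.arange(1981, 2008, dtype=float)
      # Synthetic decline in late-stage percentage that steepens after 1990
      rates = np.where(years < 1990, 60 - 0.8 * (years - 1981),
                       52.8 - 2.0 * (years - 1990)) + rng.normal(0, 1, years.size)
      jp, sse, (s1, s2) = fit_single_joinpoint(years, rates)
      print(f"joinpoint ~{jp:.0f}: slope {s1:.2f} -> {s2:.2f} per year")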

  15. Statistical analysis of long term spatial and temporal trends of ...

    Indian Academy of Sciences (India)

    Statistical analysis of long term spatial and temporal trends of temperature ... CGCM3; HadCM3; modified Mann–Kendall test; statistical analysis; Sutlej basin. ... Water Resources Systems Division, National Institute of Hydrology, Roorkee 247 ...

  16. Probability analysis of MCO over-pressurization during staging

    International Nuclear Information System (INIS)

    Pajunen, A.L.

    1997-01-01

    The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience
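
    The headline figures follow from simple binomial arithmetic: with a per-MCO exceedance probability of 0.08, 400 staged MCOs give an expected 32 relief-valve actuations. A minimal sketch using only the probabilities quoted above (independence across MCOs is an assumption implicit in the 32-of-400 figure):

      # Expected counts and a tail probability under a binomial model.
      from scipy.stats import binom

      n_mcos = 400
      p_relief = 0.08    # P(water > 1.6 kg): relief valve actuates
      p_rupture = 0.01   # P(water > 2.5 kg): rupture disk limit reached

      print("expected relief-valve actuations:", n_mcos * p_relief)    # 32.0
      print("expected rupture-disk exceedances:", n_mcos * p_rupture)  # 4.0
      # Chance that more than 40 of the 400 MCOs actuate the relief valve:
      print("P(count > 40):", binom.sf(40, n_mcos, p_relief))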

  17. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    Science.gov (United States)

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools make it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  18. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution, depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommendations...

  19. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  20. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
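
    A minimal sketch of the detection-performance quantification described above - probability of detection versus probability of false alarm, traced by sweeping a threshold over a scalar detection statistic. The scores below are synthetic stand-ins, not the paper's vibration features:

      # Empirical Pd-versus-Pfa (ROC) curve from a scalar detection statistic.
      import numpy as np

      rng = np.random.default_rng(0)
      healthy = rng.normal(0.0, 1.0, 200)  # statistic under "no fault"
      faulty = rng.normal(1.5, 1.0, 200)   # statistic under "fault present"

      thresholds = np.sort(np.concatenate([healthy, faulty]))[::-1]
      pfa = [(healthy > t).mean() for t in thresholds]  # false-alarm probability
      pd = [(faulty > t).mean() for t in thresholds]    # detection probability

      auc = np.trapz(pd, pfa)  # area under the curve as a single summary
      print(f"AUC = {auc:.3f}")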

  1. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  2. Statistical analysis and data mining of digital reconstructions of dendritic morphologies

    Directory of Open Access Journals (Sweden)

    Sridevi ePolavaram

    2014-12-01

    Neuronal morphology is diverse among animal species, developmental stages, brain regions, and cell types. The geometry of individual neurons also varies substantially even within the same cell class. Moreover, specific histological, imaging, and reconstruction methodologies can differentially affect morphometric measures. The quantitative characterization of neuronal arbors is necessary for in-depth understanding of the structure-function relationship in nervous systems. The large collection of community-contributed digitally reconstructed neurons available at NeuroMorpho.Org constitutes a big data research opportunity for neuroscience discovery beyond the approaches typically pursued in single laboratories. To illustrate this potential and the related challenges, we present a database-wide statistical analysis of dendritic arbors enabling the quantification of major morphological similarities and differences across broadly adopted metadata categories. Furthermore, we adopt a complementary unsupervised approach based on clustering and dimensionality reduction to identify the main morphological parameters leading to the most statistically informative structural classification. We find that specific combinations of measures related to branching density, overall size, tortuosity, bifurcation angles, arbor flatness, and topological asymmetry can capture anatomically and functionally relevant features of dendritic trees. The reported results only represent a small fraction of the relationships available for data exploration and hypothesis testing enabled by digital sharing of morphological reconstructions.

  3. Statistical analysis and data mining of digital reconstructions of dendritic morphologies.

    Science.gov (United States)

    Polavaram, Sridevi; Gillette, Todd A; Parekh, Ruchi; Ascoli, Giorgio A

    2014-01-01

    Neuronal morphology is diverse among animal species, developmental stages, brain regions, and cell types. The geometry of individual neurons also varies substantially even within the same cell class. Moreover, specific histological, imaging, and reconstruction methodologies can differentially affect morphometric measures. The quantitative characterization of neuronal arbors is necessary for in-depth understanding of the structure-function relationship in nervous systems. The large collection of community-contributed digitally reconstructed neurons available at NeuroMorpho.Org constitutes a "big data" research opportunity for neuroscience discovery beyond the approaches typically pursued in single laboratories. To illustrate this potential and the related challenges, we present a database-wide statistical analysis of dendritic arbors enabling the quantification of major morphological similarities and differences across broadly adopted metadata categories. Furthermore, we adopt a complementary unsupervised approach based on clustering and dimensionality reduction to identify the main morphological parameters leading to the most statistically informative structural classification. We find that specific combinations of measures related to branching density, overall size, tortuosity, bifurcation angles, arbor flatness, and topological asymmetry can capture anatomically and functionally relevant features of dendritic trees. The reported results only represent a small fraction of the relationships available for data exploration and hypothesis testing enabled by sharing of digital morphological reconstructions.
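
    A minimal sketch of the unsupervised pipeline the abstract describes - standardization, dimensionality reduction, then clustering - on synthetic feature vectors; the feature count and distribution are assumptions, not NeuroMorpho.Org's actual morphometrics:

      # Standardize morphometric measures, reduce with PCA, then cluster.
      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      # rows = neurons; columns = e.g. branching density, size, tortuosity, ...
      X = rng.normal(size=(500, 6))

      Z = StandardScaler().fit_transform(X)       # unit-variance features
      pcs = PCA(n_components=2).fit_transform(Z)  # main axes of variation
      labels = KMeans(n_clusters=3, n_init=10).fit_predict(pcs)
      print(np.bincount(labels))                  # cluster sizes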

  4. Statistical analysis on extreme wave height

    Digital Repository Service at National Institute of Oceanography (India)

    Teena, N.V.; SanilKumar, V.; Sudheesh, K.; Sajeev, R.

    [Abstract unavailable; the extracted fragment cites the WAFO (2000) MATLAB toolbox for analysis of random waves and loads (Lund University, Sweden, http://www.maths.lth.se/matstat/wafo/) and a table of statistical results for the data and fitted cumulative distributions.]

  5. Statistical analysis of corn yields responding to climate variability at various spatio-temporal resolutions

    Science.gov (United States)

    Jiang, H.; Lin, T.

    2017-12-01

    Rain-fed corn production systems are subject to sub-seasonal variations of precipitation and temperature during the growing season. Because each growth phase has its own inherent physiological processes, plants require different optimal environmental conditions during each phase. However, this temporal heterogeneity in the response to climate variability over the crop lifecycle is often simplified to constant responses in large-scale statistical modeling analyses. To capture the time-varying growing requirements in large-scale statistical analysis, we develop and compare statistical models at various spatial and temporal resolutions to quantify the relationship between corn yield and weather factors for 12 corn belt states from 1981 to 2016. The study compares three spatial resolutions (county, agricultural district, and state scale) and three temporal resolutions (crop growth phase, monthly, and growing season) to characterize the effects of spatial and temporal variability. Our results show that the agricultural district model together with growth-phase resolution can explain 52% of the variation in corn yield caused by temperature and precipitation variability. It provides a practical model structure that balances the overfitting problem of the county-specific model against the weak explanatory power of the state-specific model. In the US corn belt, precipitation has a positive impact on corn yield throughout the growing season except for the vegetative stage, while sensitivity to extreme heat is highest from the silking to dough phases. The results show the northern counties in the corn belt area are less affected by extreme heat but are more vulnerable to water deficiency.

  6. MORTICIA, a statistical analysis software package for determining optical surveillance system effectiveness.

    Science.gov (United States)

    Ramkilowan, A.; Griffith, D. J.

    2017-10-01

    Surveillance modelling in terms of the standard Detect, Recognise and Identify (DRI) thresholds remains a key requirement for determining the effectiveness of surveillance sensors. With readily available computational resources it has become feasible to perform statistically representative evaluations of the effectiveness of these sensors. A new capability for performing this Monte-Carlo type analysis is demonstrated in the MORTICIA (Monte-Carlo Optical Rendering for Theatre Investigations of Capability under the Influence of the Atmosphere) software package developed at the Council for Scientific and Industrial Research (CSIR). This first-generation, Python-based open-source integrated software package, currently in the alpha stage of development, aims to provide all the functionality required to perform statistical investigations of the effectiveness of optical surveillance systems in specific or generic deployment theatres. This includes modelling of the mathematical and physical processes that govern, among other components of a surveillance system, a sensor's detector and optical components, a target and its background, as well as the intervening atmospheric influences. In this paper we discuss integral aspects of the bespoke framework that are critical to the longevity of all subsequent modelling efforts. Additionally, some preliminary results are presented.

  7. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann-Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
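
    A minimal sketch of the running-window idea, assuming equal leading and trailing windows and using the standard large-sample normal approximation for U (the Monte Carlo normalization mentioned above is not reproduced):

      # Running Mann-Whitney Z over a series: compare trailing vs leading
      # windows and convert U to Z via its large-sample mean and variance
      # (no tie correction).
      import numpy as np
      from scipy.stats import mannwhitneyu

      def running_mw_z(series, w):
          z = []
          for i in range(w, len(series) - w + 1):
              a, b = series[i - w:i], series[i:i + w]
              u = mannwhitneyu(a, b, alternative="two-sided").statistic
              mu = w * w / 2.0
              sigma = np.sqrt(w * w * (2 * w + 1) / 12.0)
              z.append((u - mu) / sigma)
          return np.array(z)

      rng = np.random.default_rng(2)
      x = np.concatenate([rng.normal(0, 1, 60), rng.normal(1, 1, 60)])
      print(running_mw_z(x, 20).round(2))  # |Z| peaks near the shift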

  8. Sensitivity analysis of ranked data: from order statistics to quantiles

    NARCIS (Netherlands)

    Heidergott, B.F.; Volk-Makarewicz, W.

    2015-01-01

    In this paper we provide the mathematical theory for sensitivity analysis of order statistics of continuous random variables, where the sensitivity is with respect to a distributional parameter. Sensitivity analysis of order statistics over a finite number of observations is discussed before

  9. The significance of OLGA and OLGIM staging systems in the risk assessment of gastric cancer: a systematic review and meta-analysis.

    Science.gov (United States)

    Yue, Hu; Shan, Liu; Bin, Lv

    2018-02-19

    Despite extensive research on the criteria for the assessment of gastric cancer risk using the Operative Link on Gastritis Assessment (OLGA) and Operative Link on Gastritis/Intestinal-Metaplasia Assessment (OLGIM) systems, no comprehensive overview or systematic summary of their use is currently available. To perform a systematic review and meta-analysis to assess the efficacy of the OLGA and OLGIM staging systems in evaluating gastric cancer risk. We searched various databases, including PubMed, EMBASE, Medline, and the Cochrane Library, for articles published before March 2017 on the association between OLGA/OLGIM stages and risk of gastric cancer. Statistical analysis was performed using RevMan 5.30 and Stata 14.0, with the odds ratio, risk ratio, and 95% confidence interval as the effect measures. A meta-analysis of six case-control studies and two cohort studies, comprising 2700 subjects, was performed. The meta-analysis of prospective case-control studies demonstrated a significant association between OLGA/OLGIM stages III/IV and gastric cancer. The Newcastle-Ottawa Scale (NOS) score reflected heterogeneity in the case-control studies on OLGA. Subgroup analysis of high-quality (NOS score ≥ 5) studies showed an association between OLGA stage III/IV and increased risk of gastric cancer; the association was also high in the remaining study with a low NOS score. The association between higher stages of gastritis defined by OLGA and risk of gastric cancer was significant. This correlation implies that close and frequent monitoring of such high-risk patients is necessary to facilitate timely diagnosis of gastric cancer.
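
    For readers unfamiliar with the pooling step behind such meta-analyses, the sketch below computes a fixed-effect inverse-variance pooled odds ratio from invented 2x2 tables; the review itself used RevMan and Stata, and also considered random-effects models:

      # Fixed-effect inverse-variance pooling of log odds ratios.
      import numpy as np

      # (cases_exposed, cases_unexposed, controls_exposed, controls_unexposed)
      studies = [(30, 10, 70, 90), (22, 8, 78, 92), (15, 5, 85, 95)]

      log_ors, weights = [], []
      for a, b, c, d in studies:
          log_ors.append(np.log((a * d) / (b * c)))
          weights.append(1.0 / (1/a + 1/b + 1/c + 1/d))  # inverse Woolf variance

      pooled = np.average(log_ors, weights=weights)
      se = 1.0 / np.sqrt(sum(weights))
      lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
      print(f"pooled OR {np.exp(pooled):.2f} "
            f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")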

  10. The price of electricity from private power producers: Stage 2, Expansion of sample and preliminary statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Comnes, G.A.; Belden, T.N.; Kahn, E.P.

    1995-02-01

    The market for long-term bulk power is becoming increasingly competitive and mature. Given that many privately developed power projects have been or are being developed in the US, it is possible to begin to evaluate the performance of the market by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.

  11. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as cumulative distribution functions (CDFs), histograms, or time series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion...

  12. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  13. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application

  14. Early stage design and analysis of biorefinery networks

    DEFF Research Database (Denmark)

    Sin, Gürkan

    2013-01-01

    Recent work regarding biorefineries resulted in many competing concepts and technologies for conversion of renewable bio-based feedstock into many promising products including fuels, chemicals, materials, etc. The design of a biorefinery process requires, at its earlier stages, the selection of the process configuration which exhibits the best performance for a given set of economic, technical and environmental criteria. To this end, we formulate a computer-aided framework as an enabling technology for early stage design and analysis of biorefineries. The tool represents different raw materials, different products and different available technologies and proposes a conceptual (early stage) biorefinery network. This network can then be the basis for further detailed and rigorous model-based studies. In this talk, we demonstrate the application of the tool for generating an early stage optimal...

  15. The fuzzy approach to statistical analysis

    NARCIS (Netherlands)

    Coppi, Renato; Gil, Maria A.; Kiers, Henk A. L.

    2006-01-01

    For the last decades, research studies have been developed in which a coalition of Fuzzy Sets Theory and Statistics has been established with different purposes. These namely are: (i) to introduce new data analysis problems in which the objective involves either fuzzy relationships or fuzzy terms;

  16. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA mean-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
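
    A minimal sketch of the named tests applied to synthetic survey scores with SciPy (groups and values are illustrative assumptions):

      # Kolmogorov-Smirnov, t-test and one-way ANOVA on survey-style scores.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      g1 = rng.normal(3.8, 0.6, 40)  # e.g. scores from group A
      g2 = rng.normal(3.5, 0.6, 35)  # group B
      g3 = rng.normal(3.9, 0.6, 30)  # group C

      # Normality check: KS test of standardized scores against N(0, 1)
      print(stats.kstest((g1 - g1.mean()) / g1.std(ddof=1), "norm"))
      # Two-group mean comparison (Student's t-test)
      print(stats.ttest_ind(g1, g2))
      # Three-group mean comparison (one-way ANOVA); post-hoc tests follow
      print(stats.f_oneway(g1, g2, g3))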

  17. Findings of the analysis of antineoplastic therapy in patients with thyroid cancer at early stages of special treatment

    International Nuclear Information System (INIS)

    Vasil'jev, L. Ya.; Kulyinyich, G.V.; Radzyishevs'ka, Je.B.; Savchenko, A.S.

    2017-01-01

    The object of this research is to estimate the risks of development of early somatic and neurological complications depending on the scheme of treatment of the thyroid, based on the analysis of 120 case histories of patients. Research methods: nonparametric statistics, multivariate statistical analysis, and hidden-knowledge search technology. According to the accumulated data, the frequency of unfavorable early somatic and neurological consequences was ascertained: anemia - 15.8%, sialoadenitis - 27.5%, gastritis - 2.5%, heart rhythm disorder - 72.5%, polyneuropathy - 83.0%. The dependence of their occurrence on concomitant pathology (hypoparathyrosis, hypocalcemia, peptic ulcer, diabetes mellitus, hypertension, cardiac failure, connective tissue diseases, varicose veins of the lower extremities) and on medical history data such as age, sex, disease stage, number of surgeries and radionuclide treatment courses, and ECOG-scale condition was clarified for the first time. The factors which affect the occurrence of immediate complications of radioactive iodine therapy at a statistically significant level were ascertained. These factors include concomitant somatic diseases, in particular those involving the cardiovascular, endocrine and nervous systems. Further analysis of the accumulated experience of treatment of thyroid cancer will make it possible to elaborate approaches to optimizing the choice of appropriate treatment schemes and post-treatment monitoring of patients.

  18. Foundation of statistical energy analysis in vibroacoustics

    CERN Document Server

    Le Bot, A

    2015-01-01

    This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of diffuse field, the central concept of the theory.

  19. Classification of Preictal and Interictal Stages via Integrating Interchannel and Time-Domain Analysis of EEG Features.

    Science.gov (United States)

    Lin, Lung-Chang; Chen, Sharon Chia-Ju; Chiang, Ching-Tai; Wu, Hui-Chuan; Yang, Rei-Cheng; Ouyang, Chen-Sen

    2017-03-01

    The life quality of patients with refractory epilepsy is extremely affected by abrupt and unpredictable seizures. A reliable method for predicting seizures is important in the management of refractory epilepsy. A critical factor in seizure prediction involves the classification of the preictal and interictal stages. This study aimed to develop an efficient, automatic, quantitative, and individualized approach for preictal/interictal stage identification. Five epileptic children, who had experienced at least 2 episodes of seizures during a 24-hour video EEG recording, were included. Artifact-free preictal and interictal EEG epochs were acquired, respectively, and characterized with 216 global feature descriptors. The best subset of 5 discriminative descriptors was identified. The best subsets showed differences among the patients. Statistical analysis revealed most of the 5 descriptors in each subset were significantly different between the preictal and interictal stages for each patient. The proposed approach yielded weighted averages of 97.50% correctness, 96.92% sensitivity, 97.78% specificity, and 95.45% precision on classifying test epochs. Although the case number was limited, this study successfully integrated a new EEG analytical method to classify preictal and interictal EEG segments and might be used further in predicting the occurrence of seizures.

  20. Statistical study of clone survival curves after irradiation in one or two stages. Comparison and generalization of different models

    International Nuclear Information System (INIS)

    Lachet, Bernard.

    1975-01-01

    A statistical study was carried out on 208 survival curves for chlorella subjected to γ or particle radiation. The computing programmes used were written in Fortran. The different experimental causes contributing to the variance of a survival rate are analyzed, so that the experiments can be planned accordingly. Each curve was fitted to four models by the weighted least squares method applied to non-linear functions. The validity of the fits obtained can be checked by the F test. It was possible to define the confidence and prediction zones around an adjusted curve by weighting of the residual variance, in spite of error on the doses delivered; the confidence limits can then be fixed for a dose estimated from an exact or measured survival. The four models adopted were compared for the precision of their fit (by a non-parametric simultaneous comparison test) and the scattering of their adjusted parameters: Wideroe's model gives a very good fit to the experimental points in return for a scattering of its parameters, which robs them of their presumed meaning. Principal component analysis showed the statistical equivalence of the 1-hit and 2-hit target models. Division of the irradiation into two doses, the first fixed by the investigator, leads to families of curves whose equation was established from that of any basic model expressing the dose-survival relationship in one-stage irradiation.
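
    A minimal sketch of the weighted least squares step on one classic model - the multi-target single-hit survival curve S(D) = 1 - (1 - exp(-D/D0))^n - with synthetic data; the paper's Fortran programmes and its exact set of four models are not reproduced:

      # Weighted non-linear least squares via curve_fit (sigma = per-point SD).
      import numpy as np
      from scipy.optimize import curve_fit

      def survival(dose, d0, n):
          return 1.0 - (1.0 - np.exp(-dose / d0)) ** n

      dose = np.array([0.5, 1, 2, 3, 4, 5, 6])  # Gy, synthetic
      surv = np.array([0.95, 0.85, 0.55, 0.30, 0.15, 0.07, 0.03])
      sd = np.array([0.03, 0.04, 0.05, 0.04, 0.03, 0.02, 0.01])

      popt, pcov = curve_fit(survival, dose, surv, p0=(1.5, 2.0),
                             sigma=sd, absolute_sigma=True)
      err = np.sqrt(np.diag(pcov))
      print(f"D0 = {popt[0]:.2f} +/- {err[0]:.2f}, "
            f"n = {popt[1]:.2f} +/- {err[1]:.2f}")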

  1. Statistical Analysis of the labor Market in Ukraine Using Multidimensional Classification Methods: the Regional Aspect

    Directory of Open Access Journals (Sweden)

    Korepanov Oleksiy S.

    2017-12-01

    The aim of the article is to study the labor market in Ukraine in the regional context using cluster analysis methods. The current state of the labor market in the regions of Ukraine is analyzed, and a system of statistical indicators that influence the state and development of this market is formed. The expediency of using cluster analysis for grouping regions according to the level of development of the labor market is substantiated. The essence of cluster analysis is revealed; its main goal and the key tasks that can be solved by means of such analysis are presented, and the basic stages of the analysis are considered. The main clustering methods are described and, based on the results of the simulation, the advantages and disadvantages of each method are assessed. In the work, the clustering of regions of Ukraine by the level of labor market development is carried out using different methods of cluster analysis, conclusions on the results of the calculations are presented, and the main directions for further research are outlined.

  2. Statistical process control for residential treated wood

    Science.gov (United States)

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  3. Statistical Analysis of Big Data on Pharmacogenomics

    Science.gov (United States)

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
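
    One concrete instance of the large-covariance problem surveyed above is shrinkage estimation; a minimal sketch with scikit-learn's Ledoit-Wolf estimator on synthetic data where the feature count far exceeds the sample size (the paper reviews a much broader family of methods):

      # Shrinkage estimation of a large covariance matrix (Ledoit-Wolf).
      import numpy as np
      from sklearn.covariance import LedoitWolf

      rng = np.random.default_rng(4)
      X = rng.normal(size=(50, 200))  # n = 50 samples, p = 200 features

      lw = LedoitWolf().fit(X)
      print("shrinkage coefficient:", round(lw.shrinkage_, 3))
      print("estimated covariance shape:", lw.covariance_.shape)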

  4. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  5. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  6. Robust statistics and geochemical data analysis

    International Nuclear Information System (INIS)

    Di, Z.

    1987-01-01

    The advantages of robust procedures over ordinary least-squares procedures in geochemical data analysis are demonstrated using NURE data from the Hot Springs Quadrangle, South Dakota, USA. Robust principal components analysis with 5% multivariate trimming successfully guarded the analysis against perturbations by outliers and increased the number of interpretable factors. Regression with SINE estimates significantly increased the goodness-of-fit of the regression and improved the correspondence of delineated anomalies with known uranium prospects. Because of the ubiquitous existence of outliers in geochemical data, robust statistical procedures are suggested as routine procedures to replace ordinary least-squares procedures.

  7. Principles of applied statistics

    National Research Council Canada - National Science Library

    Cox, D. R; Donnelly, Christl A

    2011-01-01

    David Cox and Christl Donnelly distil decades of scientific experience into usable principles for the successful application of statistics, showing how good statistical strategy shapes every stage of an investigation...

  8. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and "expert" data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common-sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) development of a groundwater quality baseline prior to remediation startup, (2) targeting of constituents for removal from the RCRA GWPS, (3) targeting of constituents for removal from the UIC permit, (4) targeting of constituents for reduced ..., (5) targeting of monitoring wells not producing representative samples, (6) reduction in statistical evaluation, and (7) identification of contamination from other facilities

  9. Statistical image analysis of cerebral blood flow in moyamoya disease

    International Nuclear Information System (INIS)

    Yamada, Masaru; Yuzawa, Izumi; Suzuki, Sachio; Kurata, Akira; Fujii, Kiyotaka; Asano, Yuji

    2007-01-01

    To investigate the pathophysiology of moyamoya disease, we analyzed brain single photon emission computed tomography (SPECT) images of patients with this disease using interface software for a 3-dimensional (3D) data extraction format. Presenting symptoms were transient ischemic attack (TIA) in 21 patients and hemorrhage in 6 patients. All the patients underwent brain SPECT with 123I-iofetamine (IMP) at rest and after acetazolamide challenge (17 mg/kg iv, 2-day method). Cerebral blood flow (CBF) was quantitatively measured using arterial blood sampling and an autoradiography model. The group of patients who presented with TIAs showed decreased CBF in the frontal lobe at rest compared to that of patients with hemorrhage, but the Z-score ((mean - patient data)/standard deviation (SD)) did not reach statistical significance. A significant CBF decrease after acetazolamide challenge was observed in a wider cerebral cortical area in the TIA group than in the hemorrhagic group. The brain region of hemodynamic ischemia (stage II) correlated well with the cortical area responsible for the clinical symptoms of TIA. A hemodynamic ischemia stage image clearly represented recovery of reserve capacity after bypass surgery. Statistical evaluation of SPECT may be useful to understand and clarify the pathophysiology of this disease. (author)
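
    A minimal sketch of the Z-score definition quoted above, applied voxel-wise against a synthetic normal database (a real SPECT pipeline would first spatially normalize and smooth the images):

      # Voxel-wise Z-score map: Z = (control mean - patient) / control SD.
      import numpy as np

      rng = np.random.default_rng(5)
      controls = rng.normal(50, 5, size=(20, 64, 64))  # 20 normal CBF maps
      patient = rng.normal(50, 5, size=(64, 64))
      patient[20:40, 20:40] -= 12                      # simulated hypoperfusion

      z = (controls.mean(axis=0) - patient) / controls.std(axis=0, ddof=1)
      print("voxels with Z > 2:", int((z > 2).sum()))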

  10. Conjunction analysis and propositional logic in fMRI data analysis using Bayesian statistics.

    Science.gov (United States)

    Rudert, Thomas; Lohmann, Gabriele

    2008-12-01

    To evaluate logical expressions over different effects in data analyses using the general linear model (GLM) and to evaluate logical expressions over different posterior probability maps (PPMs). In functional magnetic resonance imaging (fMRI) data analysis, the GLM was applied to estimate unknown regression parameters. Based on the GLM, Bayesian statistics can be used to determine the probability of conjunction, disjunction, implication, or any other arbitrary logical expression over different effects or contrasts. For second-level inferences, PPMs from individual sessions or subjects are utilized. These PPMs can be combined into a logical expression and its probability can be computed. The methods proposed in this article are applied to data from a STROOP experiment and the methods are compared to conjunction analysis approaches for test statistics. The combination of Bayesian statistics with propositional logic provides a new approach for data analyses in fMRI. Two different methods are introduced for propositional logic: the first for analyses using the GLM and the second for common inferences about different probability maps. The methods introduced extend the idea of conjunction analysis to a full propositional logic and adapt it from test statistics to Bayesian statistics. The new approaches allow inferences that are not possible with known standard methods in fMRI. (c) 2008 Wiley-Liss, Inc.
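
    A minimal sketch of pointwise logical combination of two posterior probability maps under a simplifying independence assumption; note the paper derives these probabilities within the GLM rather than assuming independence:

      # Conjunction, disjunction and implication over probability maps,
      # assuming the two effects are independent at each voxel.
      import numpy as np

      rng = np.random.default_rng(6)
      p_a = rng.uniform(size=(64, 64))  # P(effect A) per voxel
      p_b = rng.uniform(size=(64, 64))  # P(effect B) per voxel

      p_and = p_a * p_b                # P(A and B)
      p_or = p_a + p_b - p_a * p_b     # P(A or B)
      p_imp = 1 - p_a + p_a * p_b      # P(A -> B) = P(not A) + P(A and B)
      print(p_and.mean(), p_or.mean(), p_imp.mean())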

  11. Sensitivity analysis and optimization of system dynamics models : Regression analysis and statistical design of experiments

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This tutorial discusses what-if analysis and optimization of System Dynamics models. These problems are solved, using the statistical techniques of regression analysis and design of experiments (DOE). These issues are illustrated by applying the statistical techniques to a System Dynamics model for

  12. Multivariate Statistical Methods as a Tool of Financial Analysis of Farm Business

    Czech Academy of Sciences Publication Activity Database

    Novák, J.; Sůvová, H.; Vondráček, Jiří

    2002-01-01

    Roč. 48, č. 1 (2002), s. 9-12 ISSN 0139-570X Institutional research plan: AV0Z1030915 Keywords : financial analysis * financial ratios * multivariate statistical methods * correlation analysis * discriminant analysis * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research

  13. A two stage data envelopment analysis model with undesirable output

    Science.gov (United States)

    Shariff Adli Aminuddin, Adam; Izzati Jaini, Nur; Mat Kasim, Maznah; Nawawi, Mohd Kamal Mohd

    2017-09-01

    The dependency relationship among decision making units (DMUs) is usually assumed to be non-existent in the development of Data Envelopment Analysis (DEA) models. The dependency can be represented by the multi-stage DEA model, where the outputs from the preceding stage become the inputs for the subsequent stage. The multi-stage DEA model evaluates both the efficiency score for each stage and the overall efficiency of the whole process. Existing multi-stage DEA models do not address the integration of undesirable outputs, for which higher input generates lower output, unlike normal desirable outputs. This research attempts to address the inclusion of such undesirable outputs and investigates the theoretical implications and potential applications towards the development of the multi-stage DEA model.
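
    For reference, a minimal single-stage, input-oriented CCR DEA solved by linear programming, with desirable outputs only; the two-stage, undesirable-output extension discussed above would chain such models and adjust the output constraints:

      # Input-oriented CCR DEA: minimize theta s.t. X @ lam <= theta * x_o,
      # Y @ lam >= y_o, lam >= 0 (toy data; one input, one output).
      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[2.0, 3.0, 4.0, 5.0]])  # inputs: rows = inputs, cols = DMUs
      Y = np.array([[1.0, 2.0, 2.5, 2.0]])  # outputs: rows = outputs

      def ccr_efficiency(o):
          m, n = X.shape
          s = Y.shape[0]
          c = np.zeros(1 + n)
          c[0] = 1.0                                 # minimize theta
          A_in = np.hstack([-X[:, [o]], X])          # X @ lam - theta * x_o <= 0
          A_out = np.hstack([np.zeros((s, 1)), -Y])  # -(Y @ lam) <= -y_o
          res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                        b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                        bounds=[(0, None)] * (1 + n))
          return res.fun

      for o in range(X.shape[1]):
          print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")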

  14. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  15. Statistical analysis of environmental data

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Bowman, K.O.; Miller, F.L. Jr.

    1975-10-01

    This report summarizes the analyses of data obtained by the Radiological Hygiene Branch of the Tennessee Valley Authority from samples taken around the Browns Ferry Nuclear Plant located in Northern Alabama. The data collection was begun in 1968 and a wide variety of types of samples have been gathered on a regular basis. The statistical analysis of environmental data involving very low-levels of radioactivity is discussed. Applications of computer calculations for data processing are described

  16. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  17. Statistical Power Analysis with Missing Data A Structural Equation Modeling Approach

    CERN Document Server

    Davey, Adam

    2009-01-01

    Statistical power analysis has revolutionized the ways in which we conduct and evaluate research. Similar developments in the statistical analysis of incomplete (missing) data are gaining more widespread applications. This volume brings statistical power and incomplete data together under a common framework, in a way that is readily accessible to those with only an introductory familiarity with structural equation modeling. It answers many practical questions, such as: how missing data affects the statistical power in a study; how much power is likely with different amounts and types...

  18. Do stages of dentistry training affect anxiety provoking situations ...

    African Journals Online (AJOL)

    Getting a diagnosis wrong, helping in a fainting episode, not developing a radiograph properly, and coping with children were the anxiety-provoking situations that showed a statistically significant difference across the 3 studied training stages of dentistry. Bonferroni post-hoc analysis showed the significant difference was between the preclinical and clinical ...

  19. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2003-01-01

    Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analysis and fits have been investigated for the following distribution types: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull ... fits to the data available, especially if tail fits are used, whereas the Lognormal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications for the reliability level of typical structural elements and for partial safety factors for timber are investigated.
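
    A minimal sketch of fitting the candidate distribution types to a synthetic strength sample with SciPy, scored here by the Kolmogorov-Smirnov distance (the study's tail-fit procedures and reliability calculations are not reproduced):

      # Fit Normal, Lognormal and 2-/3-parameter Weibull models and compare.
      import numpy as np
      from scipy import stats

      strengths = stats.weibull_min.rvs(c=4.0, scale=40.0, size=300,
                                        random_state=0)  # synthetic MPa values

      fits = {
          "Normal": (stats.norm, stats.norm.fit(strengths)),
          "Lognormal": (stats.lognorm, stats.lognorm.fit(strengths, floc=0)),
          "2p-Weibull": (stats.weibull_min,
                         stats.weibull_min.fit(strengths, floc=0)),
          "3p-Weibull": (stats.weibull_min, stats.weibull_min.fit(strengths)),
      }
      for name, (dist, params) in fits.items():
          ks = stats.kstest(strengths, dist.cdf, args=params)
          print(f"{name:10s} KS D = {ks.statistic:.3f}")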

  20. Histogram analysis of apparent diffusion coefficient maps for assessing thymic epithelial tumours: correlation with world health organization classification and clinical staging.

    Science.gov (United States)

    Kong, Ling-Yan; Zhang, Wei; Zhou, Yue; Xu, Hai; Shi, Hai-Bin; Feng, Qing; Xu, Xiao-Quan; Yu, Tong-Fu

    2018-04-01

    To investigate the value of apparent diffusion coefficient (ADC) histogram analysis for assessing the World Health Organization (WHO) pathological classification and Masaoka clinical stages of thymic epithelial tumours. 37 patients with histologically confirmed thymic epithelial tumours were enrolled. ADC measurements were performed using a hot-spot ROI (ADC_HS-ROI) and a histogram-based approach. ADC histogram parameters included mean ADC (ADC_mean), median ADC (ADC_median), 10th and 90th percentiles of ADC (ADC_10 and ADC_90), kurtosis and skewness. One-way ANOVA, the independent-sample t-test, and receiver operating characteristic analysis were used for statistical analyses. There were significant differences in ADC_mean, ADC_median, ADC_10, ADC_90 and ADC_HS-ROI among the low-risk thymoma (type A, AB, B1; n = 14), high-risk thymoma (type B2, B3; n = 9) and thymic carcinoma (type C, n = 14) groups (all p-values ...). ADC histogram analysis may assist in assessing the WHO pathological classification and Masaoka clinical stages of thymic epithelial tumours. Advances in knowledge: 1. ADC histogram analysis could help to assess the WHO pathological classification of thymic epithelial tumours. 2. ADC histogram analysis could help to evaluate the Masaoka clinical stages of thymic epithelial tumours. 3. ADC_10 might be a promising imaging biomarker for assessing and characterizing thymic epithelial tumours.
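
    A minimal sketch of the histogram parameters listed above, computed from a synthetic sample of ADC voxel values:

      # Mean, median, 10th/90th percentiles, skewness and kurtosis of an
      # ADC sample (synthetic values, nominally in 10^-3 mm^2/s).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      adc = rng.gamma(shape=9.0, scale=0.15, size=5000)  # synthetic tumour ROI

      summary = {
          "ADC_mean": adc.mean(),
          "ADC_median": np.median(adc),
          "ADC_10": np.percentile(adc, 10),
          "ADC_90": np.percentile(adc, 90),
          "skewness": stats.skew(adc),
          "kurtosis": stats.kurtosis(adc),
      }
      for name, value in summary.items():
          print(f"{name:10s} {value:.3f}")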

  1. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  2. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
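
    A minimal sketch of a divergence statistic in the abstract's sense - quantifying the discrepancy between an observed empirical distribution and a theoretical "ideal" one - using the KS distance and a discretized Kullback-Leibler divergence as generic stand-ins (the VTK engine's exact measures are not reproduced):

      # Distance-like discrepancies between a sample and a reference model.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      observed = rng.normal(0.2, 1.1, 10_000)  # empirical sample
      ideal = stats.norm(loc=0.0, scale=1.0)   # theoretical "ideal" model

      # Kolmogorov-Smirnov distance: sup |empirical CDF - theoretical CDF|
      print("KS D:", stats.kstest(observed, ideal.cdf).statistic)

      # Discretized KL divergence over shared bins
      edges = np.linspace(-5, 5, 51)
      p, _ = np.histogram(observed, bins=edges, density=True)
      q = np.diff(ideal.cdf(edges)) / np.diff(edges)  # model density per bin
      ok = (p > 0) & (q > 0)
      width = np.diff(edges)[ok]
      print("KL(p||q):", float(np.sum(width * p[ok] * np.log(p[ok] / q[ok]))))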

  3. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview...

  4. On the Statistical Validation of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Rosane Riera Freire

    2007-06-01

    Technical analysis, or charting, aims to visually identify geometrical patterns in price charts in order to anticipate price "trends". In this paper we revisit the issue of technical analysis validation, which has been tackled in the literature without accounting for (i) the presence of heterogeneity and (ii) statistical dependence in the analyzed data - various agglutinated return time series from distinct financial securities. The main purpose here is to address the first cited problem by suggesting a validation methodology that also "homogenizes" the securities according to the finite-dimensional probability distribution of their return series. The general steps go through the identification of the stochastic processes for the securities' returns, the clustering of similar securities and, finally, the identification of the presence, or absence, of informational content obtained from those price patterns. We illustrate the proposed methodology with a real-data exercise including several securities of the global market. Our investigation shows that there is a statistically significant informational content in two out of three common patterns usually found through technical analysis, namely: triangle, rectangle and head and shoulders.

  5. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Wendelberger, J.R.; McVittie, T.I.

    1995-01-01

    Data management and statistical analysis for environmental assessment are important issues at the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system is described that provides efficient data storage as well as visualization tools which may be integrated into the data analysis process. FIMAD is a living database and GIS system: it has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities.

  6. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)
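    The abstract does not spell out the EVS machinery, but one of its basic building blocks is fitting an extreme-value distribution to block maxima. A hedged sketch with SciPy, on synthetic data:

    # Sketch: fit a generalized extreme value (GEV) distribution to block
    # maxima and read off a high quantile. `maxima` is a synthetic stand-in
    # for, e.g., per-cycle peak channel powers; not the actual NOP procedure.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(2)
    maxima = rng.gumbel(loc=100.0, scale=5.0, size=200)  # synthetic block maxima

    shape, loc, scale = genextreme.fit(maxima)
    p99 = genextreme.ppf(0.99, shape, loc=loc, scale=scale)
    print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}")
    print(f"99th-percentile estimate: {p99:.2f}")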

  7. Diagnosis checking of statistical analysis in RCTs indexed in PubMed.

    Science.gov (United States)

    Lee, Paul H; Tse, Andy C Y

    2017-11-01

    Statistical analysis is essential for reporting of the results of randomized controlled trials (RCTs), as well as evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. To review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for generalized linear model, Cox proportional hazard model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. The diagnosis of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of a diagnosis of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.

  8. A κ-generalized statistical mechanics approach to income analysis

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
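    A small sketch of the κ-exponential building block and the resulting survival function, following the construction described above; parameter values are illustrative, not fitted to income data:

    # Sketch: kappa-exponential and the kappa-generalized survival function
    # P(X > x) = exp_k(-beta * x**alpha). For large x the tail behaves like
    # a Pareto power law x**(-alpha/kappa); for small x it is close to a
    # stretched exponential.
    import numpy as np

    def exp_kappa(u, kappa):
        """kappa-exponential; reduces to np.exp(u) as kappa -> 0."""
        if kappa == 0:
            return np.exp(u)
        return (np.sqrt(1.0 + kappa**2 * u**2) + kappa * u) ** (1.0 / kappa)

    def survival(x, alpha, beta, kappa):
        return exp_kappa(-beta * x**alpha, kappa)

    x = np.logspace(-1, 2, 7)
    print(survival(x, alpha=2.0, beta=0.5, kappa=0.7))  # illustrative parameters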

  9. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful

  10. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians

    Science.gov (United States)

    Ghasemi, Asghar; Zahediasl, Saleh

    2012-01-01

    Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
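    For readers working in Python rather than SPSS, a minimal version of the normality check the commentary discusses (Shapiro-Wilk on a synthetic sample):

    # Sketch: Shapiro-Wilk normality test, typically preferred for small
    # samples; a Q-Q plot is a useful visual companion (omitted here).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    sample = rng.normal(loc=50, scale=10, size=40)

    w, p = stats.shapiro(sample)
    print(f"Shapiro-Wilk: W={w:.3f}, p={p:.3f}")
    if p < 0.05:
        print("Normality rejected at the 5% level; consider a non-parametric test.")
    else:
        print("No evidence against normality at the 5% level.")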

  11. Systematic mediastinal lymphadenectomy or mediastinal lymph node sampling in patients with pathological stage I NSCLC: a meta-analysis.

    Science.gov (United States)

    Dong, Siyuan; Du, Jiang; Li, Wenya; Zhang, Shuguang; Zhong, Xinwen; Zhang, Lin

    2015-02-01

    To evaluate the evidence comparing systematic mediastinal lymphadenectomy (SML) and mediastinal lymph node sampling (MLS) in the treatment of pathological stage I NSCLC using meta-analytical techniques. A literature search was undertaken until January 2014 to identify comparative studies evaluating 1-, 3-, and 5-year survival rates. The pooled odds ratios (OR) and the 95 % confidence intervals (95 % CI) were calculated with either fixed or random effect models. One RCT and four retrospective studies were included in our meta-analysis. These studies included a total of 711 patients: 317 treated with SML, and 394 treated with MLS. SML and MLS did not demonstrate a significant difference in the 1-year survival rate. There were statistically significant differences in the 3-year (P = 0.03) and 5-year survival rates (P = 0.004), which favored SML. This meta-analysis suggests that in pathological stage I NSCLC, MLS can achieve an outcome similar to SML in terms of the 1-year survival rate. However, SML is superior to MLS in terms of 3- and 5-year survival rates.
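    A minimal sketch of the fixed-effect pooling machinery behind such results, using inverse-variance weighting of log odds ratios; the 2x2 counts below are invented placeholders, not data from the included studies:

    # Sketch: fixed-effect (inverse-variance) pooling of log odds ratios.
    import numpy as np

    # Each row: events_SML, total_SML, events_MLS, total_MLS (hypothetical)
    studies = np.array([
        [40, 60, 45, 80],
        [55, 70, 60, 90],
        [30, 50, 38, 75],
    ])

    a = studies[:, 0]; b = studies[:, 1] - a
    c = studies[:, 2]; d = studies[:, 3] - c

    log_or = np.log((a * d) / (b * c))
    var = 1/a + 1/b + 1/c + 1/d            # variance of each log OR
    w = 1 / var                            # inverse-variance weights

    pooled = np.sum(w * log_or) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    print(f"pooled OR = {np.exp(pooled):.2f}, "
          f"95% CI ({np.exp(pooled - 1.96*se):.2f}, {np.exp(pooled + 1.96*se):.2f})")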

  12. Optic Nerve Head and Retinal Nerve Fiber Layer Analysis in Ocular Hypertension and Early-Stage Glaucoma Using Spectral-Domain Optical Coherence Tomography Copernicus

    Directory of Open Access Journals (Sweden)

    Nilgün Solmaz

    2014-01-01

    Objectives: Evaluation of structural alterations of the optic nerve head (ONH) and the retinal nerve fiber layer (RNFL) in patients with ocular hypertension (OHT) and early-stage glaucoma, and assessment of the discriminatory diagnostic performance of spectral-domain optical coherence tomography (SD-OCT) Copernicus (Optopol Technology S.A.). Materials and Methods: This study included 59 eyes of 59 patients, 29 of whom were diagnosed with OHT (Group 1) and 30 with early-stage glaucoma (Group 2). The differentiation of early-stage glaucoma and OHT was carried out on the basis of standard achromatic visual field test results. Analysis of the ONH and RNFL thickness of all cases was made using SD-OCT. Group 1 and Group 2 were compared with respect to the ONH parameters and RNFL thickness. The diagnostic sensitivity of the OCT parameters was evaluated by the area under the receiver operating characteristic curve (AUC). Results: The average, superior, inferior, and nasal RNFL thicknesses in early-stage glaucoma cases were approximately 10% (12-14 µm) less than in the OHT eyes, with the differences being highly significant (p≤0.001). However, there was no statistically significant difference in the temporal RNFL thicknesses. The most sensitive parameter in the early diagnosis of glaucoma was average RNFL thickness (AUC: 0.852), followed by superior and inferior RNFL thickness (AUC: 0.816 and AUC: 0.773, respectively). In localized RNFL defects, the highest sensitivity corresponded to the superior and superonasal quadrants (AUC: 0.805 and AUC: 0.781, respectively). There were no statistically significant differences between the ONH morphological parameters of the two groups. Conclusion: RNFL analysis obtained using SD-OCT Copernicus is able to discriminate early-stage glaucoma eyes from those with OHT. However, ONH morphological parameters do not have the same diagnostic sensitivity. Turk J Ophthalmol 2014; 44: 35-41

  13. Statistical analysis of metallicity in spiral galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Galeotti, P [Consiglio Nazionale delle Ricerche, Turin (Italy). Lab. di Cosmo-Geofisica; Turin Univ. (Italy). Ist. di Fisica Generale]

    1981-04-01

    A principal component analysis of metallicity and other integral properties of 33 spiral galaxies is presented; the parameters involved are: morphological type, diameter, luminosity and metallicity. From the statistical analysis it is concluded that the sample has only two significant dimensions, and additional tests, involving different parameters, show similar results. Thus it seems that only type and luminosity are independent variables, the other integral properties of spiral galaxies being correlated with them.
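    A sketch of the same kind of PCA dimensionality check on a standardized galaxies-by-parameters matrix; the 33x4 data matrix here is a random placeholder:

    # Sketch: PCA via SVD; the explained-variance fractions indicate how
    # many significant dimensions the sample has.
    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.normal(size=(33, 4))                 # rows: galaxies; cols: parameters

    Xc = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each parameter
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("fraction of variance per component:", np.round(explained, 3))
    # Two dominant components would support the paper's conclusion that the
    # sample has only two significant dimensions.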

  14. Minimally Invasive Surgical Staging in Early-stage Ovarian Carcinoma: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Bogani, Giorgio; Borghi, Chiara; Leone Roberti Maggiore, Umberto; Ditto, Antonino; Signorelli, Mauro; Martinelli, Fabio; Chiappa, Valentina; Lopez, Carlos; Sabatucci, Ilaria; Scaffa, Cono; Indini, Alice; Ferrero, Simone; Lorusso, Domenica; Raspagliesi, Francesco

    Few studies investigated the efficacy and safety of minimally invasive surgery for the treatment of early-stage epithelial ovarian cancer (eEOC). In this context, we aimed to review the current evidence comparing laparoscopy and the laparotomic approach for staging procedures in eEOC. This systematic review was registered in the International Prospective Register of Systematic Reviews. Overall, 3065 patients were included: 1450 undergoing laparoscopy and 1615 undergoing laparotomic staging. Patients undergoing laparoscopy experienced a longer (but not statistically significant) operative time (weighted mean difference [WMD] = 28.3 minutes; 95% confidence interval [CI], -2.59 to 59.2), a lower estimated blood loss (WMD = -156.5 mL; 95% CI, -216.4 to -96.5), a shorter length of hospital stay (WMD = -3.7 days; 95% CI, -5.2 to -2.1), and a lower postoperative complication rate (odds ratio [OR] = 0.48; 95% CI, 0.29-0.81) than patients undergoing laparotomy. The upstaging (OR = 0.81; 95% CI, 0.55-1.20) and cyst rupture (OR = 1.32; 95% CI, 0.52-3.38) rates were similar between groups. Laparoscopic staging is associated with a shorter time to chemotherapy than laparotomic procedures (WMD = -5.16 days; 95% CI, -8.68 to -1.64). Survival outcomes were not influenced by the route of surgery. Pooled data suggested that the minimally invasive surgical approach is equivalent to laparotomy for the treatment of eEOC and may be superior in terms of perioperative outcomes. However, because of the low level of evidence of the included studies, further randomized trials are warranted. Copyright © 2017 AAGL. Published by Elsevier Inc. All rights reserved.

  15. Statistical Analysis of Protein Ensembles

    Science.gov (United States)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data piles up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially as the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.

  16. State analysis of BOP using statistical and heuristic methods

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Chang, Soon Heung

    2003-01-01

    Under the deregulation environment, the performance enhancement of the balance of plant (BOP) in nuclear power plants is being highlighted. To analyze the performance level of BOP, we use the performance test procedures provided by an authorized institution such as ASME. However, plant investigation proved that the requirements of the performance test procedures regarding the reliability and quantity of sensors were difficult to satisfy. As a solution, a state analysis method, an expanded concept of signal validation, was proposed on the basis of statistical and heuristic approaches. The authors recommended a statistical linear regression model, obtained by analyzing correlations among BOP parameters, as a reference state analysis method. Its advantages are that its derivation is not heuristic, model uncertainty can be calculated, and it is easy to apply to an actual plant. The error of the statistical linear regression model is below 3% under normal as well as abnormal system states. Additionally, a neural network model was recommended, since the statistical model cannot be applied to the validation of all of the sensors and is sensitive to outliers, i.e., signals lying outside a statistical distribution. Because there are many sensors to be validated in BOP, wavelet analysis (WA) was applied as a pre-processor to reduce the input dimension and to enhance training accuracy. The outlier localization capability of WA enhanced the robustness of the neural network. The trained neural network restored the degraded signals to values within ±3% of the true signals.
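    A minimal sketch of the regression-based state analysis idea: predict one sensor signal from correlated BOP parameters and flag readings outside a tolerance band. All names, data, and the 3% band are illustrative:

    # Sketch: linear-regression signal validation on synthetic data.
    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(500, 3))                          # correlated plant parameters
    beta_true = np.array([2.0, -1.0, 0.5])
    y = X @ beta_true + rng.normal(scale=0.05, size=500)   # target sensor signal

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)           # fit the reference model
    residual = y - X @ beta
    suspect = np.abs(residual) > 0.03 * np.abs(y).mean()   # 3% band (illustrative)
    print(f"flagged {suspect.sum()} of {len(y)} readings as suspect")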

  17. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is considered an important topic, in which reasonable and accurate analyses of images are generated. Recently, image analysis has become a vital field because of the huge number of images transferred via transmission media in our daily life. These image-crowded media have made image analysis a highlighted research area. In this paper, the implemented system proceeds through several steps to compute the statistical measures of standard deviation and mean values of both colour and grey images. The last step of the proposed method compares the obtained results across the different cases of the test phase. In this paper, the statistical parameters are implemented to characterize the content of an image and its texture. Standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value via the implementation of the system. The major issue addressed in the work concentrates on brightness distribution via statistical measures under different types of lighting.
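    A minimal sketch of the brightness statistics used above (mean and standard deviation for grey and colour images), with a random placeholder image standing in for data loaded from file:

    # Sketch: per-channel and grey-level brightness statistics.
    import numpy as np

    rng = np.random.default_rng(6)
    img = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)  # RGB image

    grey = img.mean(axis=2)                      # simple grey conversion
    print(f"grey  mean={grey.mean():.1f}  std={grey.std():.1f}")
    for i, ch in enumerate("RGB"):
        plane = img[:, :, i].astype(float)
        print(f"{ch}     mean={plane.mean():.1f}  std={plane.std():.1f}")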

  18. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and the associated p-value, giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; however, in the hilus, significant differences were found, demonstrating the value of Fisher statistics for the statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
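    A sketch of the basic Fisher-statistics quantities (mean direction, resultant length, and a standard large-kappa concentration estimate) for synthetic unit direction vectors; this illustrates the descriptive part only, not Watson's F-test:

    # Sketch: Fisher mean direction and concentration for unit vectors,
    # e.g., principal diffusion directions in an ROI (data synthetic).
    import numpy as np

    rng = np.random.default_rng(7)
    v = rng.normal(loc=[0.8, 0.1, 0.1], scale=0.1, size=(50, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)     # unit direction vectors

    R = np.linalg.norm(v.sum(axis=0))                 # resultant length
    mean_dir = v.sum(axis=0) / R
    kappa = (len(v) - 1) / (len(v) - R)               # approximate concentration
    print("mean direction:", np.round(mean_dir, 3))
    print(f"R/N = {R/len(v):.3f}, concentration kappa ~ {kappa:.1f}")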

  19. Unified risk analysis of fatigue failure in ductile alloy components during all three stages of fatigue crack evolution process.

    Science.gov (United States)

    Patankar, Ravindra

    2003-10-01

    Statistical fatigue life of a ductile alloy specimen is traditionally divided into three stages, namely, crack nucleation, small crack growth, and large crack growth. Crack nucleation and small crack growth show a wide variation and hence a big spread on cycles versus crack length graphs. Large crack growth shows relatively less variation. Therefore, different models are fitted to the different stages of the fatigue evolution process, thus treating different stages as different phenomena. With these independent models, it is impossible to predict one phenomenon based on the information available about the other phenomenon. Experimentally, it is easier to carry out crack length measurements of large cracks compared to nucleating cracks and small cracks. Thus, it is easier to collect statistical data for large crack growth compared to the painstaking effort it would take to collect statistical data for crack nucleation and small crack growth. This article presents a fracture mechanics-based stochastic model of fatigue crack growth in ductile alloys that are commonly encountered in mechanical structures and machine components. The model was validated by Ray (1998) for crack propagation against various statistical fatigue data. Based on the model, this article proposes a technique to predict statistical information of fatigue crack nucleation and small crack growth properties that uses the statistical properties of large crack growth under constant amplitude stress excitation. The statistical properties of large crack growth under constant amplitude stress excitation can be obtained via experiments.

  20. Alternatives generation and analysis for phase I intermediate waste feed staging system design requirements

    Energy Technology Data Exchange (ETDEWEB)

    Britton, M.D.

    1996-10-02

    This document provides: a decision analysis summary; a problem statement; constraints, requirements, and assumptions; decision criteria; intermediate waste feed staging system options and alternatives generation and screening; intermediate waste feed staging system design concepts; intermediate waste feed staging system alternative evaluation and analysis; and open issues and actions.

  1. Statistical analysis of RHIC beam position monitors performance

    Science.gov (United States)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
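    As one concrete reading of the SVD idea, the sketch below builds a synthetic turn-by-turn matrix (rows: turns, columns: BPMs), injects one noisy monitor, and flags singular modes dominated by a single BPM; the threshold and data are illustrative, not the paper's actual procedure:

    # Sketch: SVD-based screening for malfunctioning BPMs. A faulty BPM
    # tends to own a spatially localized singular vector.
    import numpy as np

    rng = np.random.default_rng(8)
    turns, n_bpm = 1000, 160
    amp = rng.normal(size=n_bpm)                          # betatron amplitude per BPM
    signal = np.sin(0.22 * 2 * np.pi * np.arange(turns))[:, None] * amp
    data = signal + 0.01 * rng.normal(size=(turns, n_bpm))
    data[:, 42] += rng.normal(scale=1.0, size=turns)      # inject one faulty BPM

    u, s, vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
    localized = np.max(vt**2, axis=1) > 0.5               # modes owned by one BPM
    for m in np.where(localized)[0]:
        print(f"mode {m} (singular value {s[m]:.1f}): suspect BPM {np.argmax(vt[m]**2)}")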

  2. Statistical analysis of RHIC beam position monitors performance

    Directory of Open Access Journals (Sweden)

    R. Calaga

    2004-04-01

    A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.

  3. Gaining power and precision by using model-based weights in the analysis of late stage cancer trials with substantial treatment switching.

    Science.gov (United States)

    Bowden, Jack; Seaman, Shaun; Huang, Xin; White, Ian R

    2016-04-30

    In randomised controlled trials of treatments for late-stage cancer, it is common for control arm patients to receive the experimental treatment around the point of disease progression. This treatment switching can dilute the estimated treatment effect on overall survival and affect the assessment of a treatment's benefit in health economic evaluations. The rank-preserving structural failure time model of Robins and Tsiatis (Comm. Stat., 20:2609-2631) offers a potential solution to this problem and is typically implemented using the logrank test. However, in the presence of substantial switching, this test can have low power because the hazard ratio is not constant over time. Schoenfeld (Biometrika, 68:316-319) showed that when the hazard ratio is not constant, weighted versions of the logrank test become optimal. We present a weighted logrank test statistic for the late-stage cancer trial context given the treatment switching pattern and working assumptions about the underlying hazard function in the population. Simulations suggest that the weighted approach can lead to large efficiency gains in either an intention-to-treat or a causal rank-preserving structural failure time model analysis compared with the unweighted approach. Furthermore, violation of the working assumptions used in the derivation of the weights only affects the efficiency of the estimates and does not induce bias or inflate the type I error rate. The weighted logrank test statistic should therefore be considered for use as part of a careful secondary, exploratory analysis of trial data affected by substantial treatment switching. ©2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
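    A sketch of a two-group weighted logrank statistic, Z = sum_j w_j(O_j - E_j) / sqrt(sum_j w_j^2 V_j), with hypergeometric moments at each event time. The Fleming-Harrington-style weight w = S(t-) in the example is one common way of emphasising early differences, not necessarily the paper's switching-based weights; all data are synthetic:

    # Sketch: two-group weighted logrank test on synthetic survival data.
    import numpy as np
    from scipy import stats

    def weighted_logrank(time, event, group, weight_fn=lambda t, s: 1.0):
        """time: event/censoring times; event: 1 if observed; group: 0/1.
        weight_fn(t, s_hat) gets the pooled KM estimate just before t."""
        times = np.unique(time[event == 1])          # sorted event times
        num, var, s_hat = 0.0, 0.0, 1.0
        for t in times:
            at_risk = time >= t
            n_tot = at_risk.sum()
            n1 = (at_risk & (group == 1)).sum()
            d_tot = ((time == t) & (event == 1)).sum()
            d1 = ((time == t) & (event == 1) & (group == 1)).sum()
            w = weight_fn(t, s_hat)
            e1 = d_tot * n1 / n_tot                  # expected events in group 1
            v = (d_tot * (n1 / n_tot) * (1 - n1 / n_tot)
                 * (n_tot - d_tot) / max(n_tot - 1, 1))
            num += w * (d1 - e1)
            var += w**2 * v
            s_hat *= 1.0 - d_tot / n_tot             # update pooled Kaplan-Meier
        z = num / np.sqrt(var)
        return z, 2 * stats.norm.sf(abs(z))

    rng = np.random.default_rng(9)
    time = np.concatenate([rng.exponential(12, 100), rng.exponential(18, 100)])
    group = np.repeat([0, 1], 100)
    event = (rng.uniform(size=200) < 0.8).astype(int)
    print(weighted_logrank(time, event, group, lambda t, s: s))  # w = S(t-)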

  4. Statistics Education Research in Malaysia and the Philippines: A Comparative Analysis

    Science.gov (United States)

    Reston, Enriqueta; Krishnan, Saras; Idris, Noraini

    2014-01-01

    This paper presents a comparative analysis of statistics education research in Malaysia and the Philippines by modes of dissemination, research areas, and trends. An electronic search for published research papers in the area of statistics education from 2000-2012 yielded 20 for Malaysia and 19 for the Philippines. Analysis of these papers showed…

  5. Statistical analysis of next generation sequencing data

    CERN Document Server

    Nettleton, Dan

    2014-01-01

    Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...

  6. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  7. Bias due to two-stage residual-outcome regression analysis in genetic association studies.

    Science.gov (United States)

    Demissie, Serkalem; Cupples, L Adrienne

    2011-11-01

    Association studies of risk factors and complex diseases require careful assessment of potential confounding factors. Two-stage regression analysis, sometimes referred to as residual- or adjusted-outcome analysis, has been increasingly used in association studies of single nucleotide polymorphisms (SNPs) and quantitative traits. In this analysis, first, a residual-outcome is calculated from a regression of the outcome variable on covariates, and then the relationship between the adjusted-outcome and the SNP is evaluated by a simple linear regression of the adjusted-outcome on the SNP. In this article, we examine the performance of this two-stage analysis as compared with multiple linear regression (MLR) analysis. Our findings show that when a SNP and a covariate are correlated, the two-stage approach results in a biased genotypic effect and loss of power. Bias is always toward the null and increases with the squared correlation between the SNP and the covariate (r²). For example, for r² = 0, 0.1, and 0.5, two-stage analysis results in, respectively, 0, 10, and 50% attenuation in the SNP effect. As expected, MLR was always unbiased. Since individual SNPs often show little or no correlation with covariates, a two-stage analysis is expected to perform as well as MLR in many genetic studies; however, it produces considerably different results from MLR and may lead to incorrect conclusions when independent variables are highly correlated. While a useful alternative to MLR when the SNP and covariates are uncorrelated, the two-stage approach has serious limitations. Its use as a simple substitute for MLR should be avoided. © 2011 Wiley Periodicals, Inc.
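    A simulation sketch of the attenuation described above: with squared SNP-covariate correlation r², the residual-outcome estimate shrinks by a factor of roughly (1 - r²), while MLR stays unbiased (all data synthetic):

    # Sketch: two-stage residual-outcome regression vs. one-stage MLR.
    import numpy as np

    rng = np.random.default_rng(10)
    n, r2 = 100_000, 0.5
    r = np.sqrt(r2)
    snp = rng.normal(size=n)
    cov = r * snp + np.sqrt(1 - r2) * rng.normal(size=n)   # corr(snp, cov) = r
    y = snp + cov + rng.normal(size=n)                     # true SNP effect = 1

    def slope(x, t):  # OLS slope of t on x (with intercept)
        X = np.column_stack([np.ones_like(x), x])
        return np.linalg.lstsq(X, t, rcond=None)[0][1]

    resid = y - slope(cov, y) * cov            # stage 1: adjust outcome
    two_stage = slope(snp, resid)              # stage 2: residual on SNP

    X = np.column_stack([np.ones(n), snp, cov])  # one-stage MLR
    mlr = np.linalg.lstsq(X, y, rcond=None)[0][1]
    print(f"two-stage: {two_stage:.3f} (expect ~{1 - r2:.1f}); MLR: {mlr:.3f} (expect ~1)")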

  8. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone in the globalization era, because everyone has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of a mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If the minimum value for standard achievement of course competence is 65, the students' mean values are lower than the standard. The results of the misconception study indicate which subtopics should be considered. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, 3) determining the concept to be used in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  9. Thermodynamic analysis of single-stage and multi-stage adsorption refrigeration cycles with activated carbon–ammonia working pair

    International Nuclear Information System (INIS)

    Xu, S.Z.; Wang, L.W.; Wang, R.Z.

    2016-01-01

    Highlights: • Activated carbon–ammonia multi-stage adsorption refrigerator was analyzed. • COP, exergetic efficiency and entropy production of cycles were calculated. • Single-stage cycle usually has the advantages of simple structure and high COP. • Multi-stage cycles adapt to critical conditions better than single-stage cycle. • Boundary conditions for choosing optimal cycle were summarized as tables. - Abstract: An activated carbon–ammonia multi-stage adsorption refrigeration cycle is analyzed in this article, which realizes deep-freezing at evaporating temperatures below −18 °C with heating source temperatures much lower than 100 °C. Cycle mathematical models for single-, two- and three-stage cycles were established on the basis of thorough thermodynamic analysis. According to simulation results for thermodynamic evaluation indicators such as COP (coefficient of performance), exergetic efficiency and cycle entropy production, multi-stage cycles adapt well to high condensing temperatures, low evaporating temperatures and low heating source temperatures. The proposed cycle with the selected working pair can theoretically work under very severe conditions, such as −25 °C evaporating temperature, 40 °C condensing temperature, and 70 °C heating source temperature, but under these working conditions it has the drawback of low cycle adsorption quantity. It was found that both COP and exergetic efficiency are of great reference value in the choice of cycle, whereas entropy production is not so useful for cycle stage selection. Finally, the application boundary conditions of single-stage, two-stage, and three-stage cycles were summarized as tables according to the simulation results, providing a reference for choosing the optimal cycle under different conditions.

  10. Comparative analysis of positive and negative attitudes toward statistics

    Science.gov (United States)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested in knowing their students' attitudes toward statistics during a statistics course. In a statistics course, a positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students who have negative attitudes toward statistics feel depressed, especially in group assignments, are at risk of failure, are often highly emotional, and cannot move forward. Therefore, this study investigates students' attitudes towards learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Toward Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP). The respondents were students taking the applied statistics course from different faculties. From the analysis, the questionnaire was found to be acceptable, and the relationships among the constructs were proposed and investigated. Students showed full effort to master the statistics course, found the course enjoyable, had confidence in their intellectual capacity, and held more positive than negative attitudes towards learning statistics. In conclusion, positive attitudes were mostly exhibited in terms of the affect, cognitive competence, value, interest and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  11. Vapor Pressure Data Analysis and Statistics

    Science.gov (United States)

    2016-12-01

    Only fragments of the abstract survive: the A (or a) value is directly related to vapor pressure and will be greater for high vapor pressure materials; in the regression statistics, n is the number of data points, Yi is the natural logarithm of the ith experimental vapor pressure value, and Xi is the corresponding temperature variable. (Vapor Pressure Data Analysis and Statistics, ECBC-TR-1422, Ann Brozena, Research and Technology Directorate.)

  12. Statistical analysis of planktic foraminifera of the surface Continental ...

    African Journals Online (AJOL)

    Planktic foraminiferal assemblages recorded from selected samples of shallow continental shelf sediments off southwestern Nigeria were subjected to statistical analysis. Principal Component Analysis (PCA) was used to determine variants of planktic parameters. Values obtained for these parameters were ...

  13. Imaging mass spectrometry statistical analysis.

    Science.gov (United States)

    Jones, Emrys A; Deininger, Sören-Oliver; Hogendoorn, Pancras C W; Deelder, André M; McDonnell, Liam A

    2012-08-30

    Imaging mass spectrometry is increasingly used to identify new candidate biomarkers. This clinical application of imaging mass spectrometry is highly multidisciplinary: expertise in mass spectrometry is necessary to acquire high quality data, histology is required to accurately label the origin of each pixel's mass spectrum, disease biology is necessary to understand the potential meaning of the imaging mass spectrometry results, and statistics to assess the confidence of any findings. Imaging mass spectrometry data analysis is further complicated because of the unique nature of the data (within the mass spectrometry field); several of the assumptions implicit in the analysis of LC-MS/profiling datasets are not applicable to imaging. The very large size of imaging datasets and the reporting of many data analysis routines, combined with inadequate training and accessible reviews, have exacerbated this problem. In this paper we provide an accessible review of the nature of imaging data and the different strategies by which the data may be analyzed. Particular attention is paid to the assumptions of the data analysis routines to ensure that the reader is apprised of their correct usage in imaging mass spectrometry research. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Information, Privacy and Stability in Adaptive Data Analysis

    OpenAIRE

    Smith, Adam

    2017-01-01

    Traditional statistical theory assumes that the analysis to be performed on a given data set is selected independently of the data themselves. This assumption breaks down when data are re-used across analyses and the analysis to be performed at a given stage depends on the results of earlier stages. Such dependency can arise when the same data are used by several scientific studies, or when a single analysis consists of multiple stages. How can we draw statistically valid conclusions when da...

  15. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  16. Second order statistical analysis of US image texture

    International Nuclear Information System (INIS)

    Tanzi, F.; Novario, R.

    1999-01-01

    The study reports the sonographic image texture of the neonatal heart at different stages of development by calculating numerical parameters extracted from the gray-level co-occurrence matrix. To show pixel-value differences and enhance texture structure, images were equalized and then the gray-level range was reduced to 16 to allow a sufficiently high occupancy frequency of the co-occurrence matrix. The differences are so slightly significant that they may be due to different factors affecting image texture and to the variability introduced by manual ROI positioning; therefore no definitive conclusions can be drawn as to whether this kind of analysis is capable of discriminating different stages of myocardial development
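    A sketch of second-order texture statistics along these lines, reducing a placeholder image to 16 grey levels and computing co-occurrence features with scikit-image (older releases spell the functions greycomatrix/greycoprops):

    # Sketch: grey-level co-occurrence matrix (GLCM) texture features.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(11)
    img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # placeholder US image

    img16 = (img // 16).astype(np.uint8)                       # 16 grey levels
    glcm = graycomatrix(img16, distances=[1], angles=[0, np.pi / 2],
                        levels=16, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, np.round(graycoprops(glcm, prop).ravel(), 4))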

  17. Statistical analysis of JET disruptions

    International Nuclear Information System (INIS)

    Tanga, A.; Johnson, M.F.

    1991-07-01

    In the operation of JET, as of any tokamak, many discharges are terminated by a major disruption. The disruptive termination of a discharge is usually an unwanted event which may cause damage to the structure of the vessel. In a reactor, disruptions are potentially a very serious problem, hence the importance of studying them and devising methods to avoid disruptions. Statistical information has been collected about the disruptions which have occurred at JET over a long span of operations. The analysis is focused on the operational aspects of the disruptions rather than on the underlying physics. (Author)

  18. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  19. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)
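    A minimal sketch of Shewhart X-bar and R limits for subgroups of size 5, using the standard control-chart constants for that subgroup size; the readings are synthetic stand-ins for RCP parameters:

    # Sketch: X-bar and R control limits on synthetic subgrouped data.
    import numpy as np

    rng = np.random.default_rng(12)
    samples = rng.normal(50.0, 2.0, size=(30, 5))   # 30 subgroups of 5 readings

    xbar = samples.mean(axis=1)
    ranges = samples.max(axis=1) - samples.min(axis=1)
    xbar_bar, r_bar = xbar.mean(), ranges.mean()

    A2, D3, D4 = 0.577, 0.0, 2.114                  # standard constants for n = 5
    ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
    ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

    out = np.where((xbar > ucl_x) | (xbar < lcl_x))[0]
    print(f"X-bar limits: ({lcl_x:.2f}, {ucl_x:.2f}); R limits: ({lcl_r:.2f}, {ucl_r:.2f})")
    print("subgroups out of control:", out)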

  20. Seal Analysis for the Ares-I Upper Stage Fuel Tank Manhole Cover

    Science.gov (United States)

    Phillips, Dawn R.; Wingate, Robert J.

    2010-01-01

    Techniques for studying the performance of Naflex pressure-assisted seals in the Ares-I Upper Stage liquid hydrogen tank manhole cover seal joint are explored. To assess the feasibility of using the identical seal design for the Upper Stage as was used for the Space Shuttle External Tank manhole covers, a preliminary seal deflection analysis using the ABAQUS commercial finite element software is employed. The ABAQUS analyses are performed using three-dimensional symmetric wedge finite element models. This analysis technique is validated by first modeling a heritage External Tank liquid hydrogen tank manhole cover joint and correlating the results to heritage test data. Once the technique is validated, the Upper Stage configuration is modeled. The Upper Stage analyses are performed at 1.4 times the expected pressure to comply with the Constellation Program factor of safety requirement on joint separation. Results from the analyses performed with the External Tank and Upper Stage models demonstrate the effects of several modeling assumptions on the seal deflection. The analyses for Upper Stage show that the integrity of the seal is successfully maintained.

  1. Engineering analysis of the two-stage trifluoride precipitation process

    International Nuclear Information System (INIS)

    Luerkens, D.W.W.

    1984-06-01

    An engineering analysis of two-stage trifluoride precipitation processes is developed. Precipitation kinetics are modeled using consecutive reactions to represent fluoride complexation. Material balances across the precipitators are used to model the time dependent concentration profiles of the main chemical species. The results of the engineering analysis are correlated with previous experimental work on plutonium trifluoride and cerium trifluoride

  2. Modified Statistical Dynamical Diffraction Theory: A Novel Metrological Analysis Method for Partially Relaxed and Defective Carbon-doped Silicon and Silicon Germanium Heterostructures

    Science.gov (United States)

    Shreeman, Paul K.

    The statistical dynamical diffraction theory, initially developed by the late Kato, remained in obscurity for many years owing to its intense and difficult mathematical treatment, which proved quite challenging to implement and apply. With the assistance of many authors in the past (including Bushuev, Pavlov, Punegov, among others), it became possible to implement this unique x-ray diffraction theory, which combines the kinematical (ideally imperfect) and dynamical (characteristically perfect) diffraction regimes into a single system of equations controlled by two factors determined by the long-range order and the correlation function within the structure. The first stage was completed by the publication (Shreeman and Matyi, J. Appl. Cryst., 43, 550 (2010)) demonstrating the functionality of this theory with new modifications, hence called the modified statistical dynamical diffraction theory (mSDDT). The foundation of the theory is also incorporated into this dissertation, and the next stage of testing the model against several ion-implanted SiGe materials has been published (Shreeman and Matyi, physica status solidi (a) 208(11), 2533-2538, 2011). The dissertation, with all the previous results summarized, dives into a comprehensive HRXRD analysis complete with several different types of reflections (symmetric, asymmetric and skewed geometries). The dynamical results (with almost no defects) are compared with well-known commercial software. The defective materials, for which commercially available modeling software falls short, are then characterized and discussed in depth. The results exemplify the power of the novel approach in the modified statistical dynamical diffraction theory: the ability to detect and measure defective structures qualitatively and quantitatively. The analysis is compared alongside TEM data analysis for verification and confirmation. The application of this theory will accelerate the ability to quickly characterize the relaxed

  3. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S-structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  4. Statistical trend analysis methods for temporal phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Lehtinen, E.; Pulkkinen, U. [VTT Automation, (Finland); Poern, K. [Poern Consulting, Nykoeping (Sweden)

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods. 14 refs, 10 figs.
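    A minimal sketch of one of the non-parametric tests named above, the Cox-Stuart trend test: pair each first-half observation with its second-half counterpart and test the sign imbalance with a binomial test (data synthetic):

    # Sketch: Cox-Stuart trend test on a synthetic series with weak drift.
    import numpy as np
    from scipy.stats import binomtest

    def cox_stuart(x):
        x = np.asarray(x, dtype=float)
        m = len(x) // 2
        diffs = x[-m:] - x[:m]           # second half minus first half
        diffs = diffs[diffs != 0]        # drop ties
        pos = int(np.sum(diffs > 0))
        return binomtest(pos, n=len(diffs), p=0.5).pvalue

    rng = np.random.default_rng(13)
    series = np.cumsum(rng.normal(0.05, 1.0, size=60))   # weak upward drift
    print(f"Cox-Stuart p-value: {cox_stuart(series):.3f}")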

  5. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    Lehtinen, E.; Pulkkinen, U.; Poern, K.

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods

  6. StOCNET : Software for the statistical analysis of social networks

    NARCIS (Netherlands)

    Huisman, M.; van Duijn, M.A.J.

    2003-01-01

    StOCNET is an open software system in a Windows environment for the advanced statistical analysis of social networks. It provides a platform to make a number of recently developed, and therefore not (yet) standard, statistical methods available to a wider audience. A flexible user interface utilizing

  7. AutoBayes: A System for Generating Data Analysis Programs from Statistical Models

    OpenAIRE

    Fischer, Bernd; Schumann, Johann

    2003-01-01

    Data analysis is an important scientific task which is required whenever information needs to be extracted from raw data. Statistical approaches to data analysis, which use methods from probability theory and numerical analysis, are well-founded but difficult to implement: the development of a statistical data analysis program for any given application is time-consuming and requires substantial knowledge and experience in several areas. In this paper, we describe AutoBayes, a program synthesis...

  8. Vaginal cancer, an analysis of stages 1 and 2

    International Nuclear Information System (INIS)

    Dickie, G.J.; Tripcony, L.; Otten, G.; Nicklin, J.

    2003-01-01

    A retrospective analysis was performed of 70 patients with stages 1 and 2 vaginal cancer seen between 1982 and 1998 at the Royal Brisbane and Royal Women's Hospitals, Queensland. Forty-three patients had previously had a hysterectomy. Stage, histology and grade were the most important prognostic factors. The 5-year survival rate for stage 1 was 71%, compared with 48% for stage 2. The majority (61 patients) had squamous cell carcinoma, with a 68% survival rate compared with 22% for adenocarcinoma. Those with histological grade 1 or 2 disease had a 69% survival rate compared with 40% for grade 3 disease. Age, previous hysterectomy, tumour site and tumour size were not significant prognostic factors. The majority of patients were treated with radiotherapy alone; however, those who had surgery alone or surgery combined with radiotherapy had significantly improved survival compared with the radiotherapy-alone group. The majority of tumours recurred in the loco-regional area and the median time to recurrence was 12 months.

  9. Using DEWIS and R for Multi-Staged Statistics e-Assessments

    Science.gov (United States)

    Gwynllyw, D. Rhys; Weir, Iain S.; Henderson, Karen L.

    2016-01-01

    We demonstrate how the DEWIS e-Assessment system may use embedded R code to facilitate the assessment of students' ability to perform involved statistical analyses. The R code has been written to emulate SPSS output and thus the statistical results for each bespoke data set can be generated efficiently and accurately using standard R routines.…

  10. An analysis of the Athetis lepigone transcriptome from four developmental stages.

    Directory of Open Access Journals (Sweden)

    Li-Tao Li

    Athetis lepigone Möschler (Lepidoptera: Noctuidae) has recently become an important insect pest of maize (Zea mays) crops in China. In order to understand the characteristics of the different developmental stages of this pest, we used Illumina short-read sequences to perform de novo transcriptome assembly and gene expression analysis for the egg, larva, pupa and adult developmental stages. We obtained 10.08 Gb of raw data from Illumina sequencing and recovered 81,356 unigenes longer than 100 bp through de novo assembly. The total sequence length reached 49.75 Mb with an N50 of 858 bp and an average unigene length of 612 bp. Annotation analysis of predicted proteins indicates that 33,736 unigenes (41.47% of total unigenes) match genes in the GenBank Nr database. The unigene sequences were subjected to GO, COG and KEGG functional classification. A large number of differentially expressed genes were recovered by pairwise comparison of the four developmental stages. The most dramatic differences in gene expression were found in the transitions from one stage to another. Some of these differentially expressed genes are related to cuticle and wing formation as well as growth and development. We identified more than 2,500 microsatellite markers that may be used for population studies of A. lepigone. This study lays the foundation for further research on population genetics and gene function analysis in A. lepigone.

  11. Network similarity and statistical analysis of earthquake seismic data

    OpenAIRE

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban

    2016-01-01

    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of underlying undirected earthquake networks is shown using Jensen-Shannon divergence in graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We cal...
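
    The spectral comparison described here can be reproduced in outline by histogramming Laplacian eigenvalues and measuring the Jensen-Shannon divergence between the resulting distributions; the sketch below is a generic illustration on toy adjacency matrices, not the authors' code:

```python
import numpy as np

def laplacian_spectrum_density(adj, bins):
    """Histogram (as a probability vector) of graph Laplacian eigenvalues."""
    lap = np.diag(adj.sum(axis=1)) - adj
    eigvals = np.linalg.eigvalsh(lap)
    hist, _ = np.histogram(eigvals, bins=bins)
    return hist / hist.sum()

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions (bits)."""
    p, q = p + eps, q + eps
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy undirected networks: a 3-node star and a 3-node triangle
a = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
b = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
bins = np.linspace(0, 4, 9)   # shared bin edges for both spectra
print(js_divergence(laplacian_spectrum_density(a, bins),
                    laplacian_spectrum_density(b, bins)))
```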

  12. Distributed activation energy model for kinetic analysis of multi-stage hydropyrolysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.; Li, W.; Wang, N.; Li, B. [Chinese Academy of Sciences, Taiyuan (China). Inst. of Coal Chemistry

    2003-07-01

    Based on a new analysis of the distributed activation energy model, a bicentral distribution model was introduced for the analysis of multi-stage hydropyrolysis of coal. Hydropyrolysis under linear temperature programming, with and without a holding stage, was described mathematically and the corresponding kinetic expressions were derived. Based on these kinetics, the hydropyrolysis (HyPy) and multi-stage hydropyrolysis (MHyPy) of Xundian brown coal were simulated. The results show that both the Mo catalyst and 2-stage holding can lower the apparent activation energy of hydropyrolysis and narrow the activation energy distribution. In addition, there exists an optimum Mo loading of 0.2% for HyPy of Xundian lignite. 10 refs.
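
    For orientation, the standard DAEM expression for the remaining volatile fraction under a linear heating ramp is shown below in generic textbook notation (not copied from the paper); V* is the ultimate volatile yield, k0 the frequency factor, β the heating rate, R the gas constant, and f(E) the activation energy distribution, which the bicentral variant writes as a weighted pair of Gaussian densities φ:

```latex
\frac{V^{*}-V}{V^{*}}
  = \int_{0}^{\infty}
    \exp\!\left(-\frac{k_{0}}{\beta}\int_{T_{0}}^{T} e^{-E/RT'}\,\mathrm{d}T'\right)
    f(E)\,\mathrm{d}E,
\qquad
f(E) = w\,\phi(E;\mu_{1},\sigma_{1}) + (1-w)\,\phi(E;\mu_{2},\sigma_{2})
```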

  13. Statistical analysis and interpolation of compositional data in materials science.

    Science.gov (United States)

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
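
    The standard entry point to such specialized tools is Aitchison's log-ratio machinery. A minimal sketch of the centered log-ratio (clr) transform, which maps a composition from the simplex into ordinary Euclidean space where the usual statistics apply (generic illustration, not the authors' interpolation code):

```python
import numpy as np

def clr(composition):
    """Centered log-ratio transform of a composition (parts sum to a constant)."""
    x = np.asarray(composition, dtype=float)
    g = np.exp(np.mean(np.log(x)))   # geometric mean of the parts
    return np.log(x / g)

# Atomic concentrations of a ternary composition (sum to 1)
comp = np.array([0.70, 0.20, 0.10])
z = clr(comp)
print(z, z.sum())   # clr coordinates sum to zero by construction
```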

  14. The statistical analysis of energy release in small-scale coronal structures

    Science.gov (United States)

    Ulyanov, Artyom; Kuzin, Sergey; Bogachev, Sergey

    We present the results of a statistical analysis of impulsive flare-like brightenings, which occur in large numbers in the quiet regions of the solar corona. For our study, we utilized high-cadence observations performed with two EUV telescopes, TESIS/Coronas-Photon and AIA/SDO. In total, we processed 6 sequences of images, registered throughout the period between 2009 and 2013, covering the rising phase of the 24th solar cycle. Based on a high-speed DEM estimation method, we developed a new technique to evaluate the main parameters of detected events (geometrical sizes, duration, temperature and thermal energy). We then obtained the statistical distributions of these parameters and examined their variations depending on the level of solar activity. The results imply that near the minimum of the solar cycle the energy release in the quiet corona is mainly provided by small-scale events (nanoflares), whereas larger events (microflares) prevail at the peak of activity. Furthermore, we investigated the coronal conditions that governed the formation and triggering of the registered flares. By means of photospheric magnetograms obtained with the MDI/SoHO and HMI/SDO instruments, we examined the topology of local magnetic fields at different stages: the pre-flare phase, the peak of intensity and the ending phase. To do so, we introduced a number of topological parameters, including the total magnetic flux, the distance between magnetic sources and their mutual arrangement. The correlation found between the change of these parameters and the formation of flares may offer an important tool for flare forecasting.

  15. An Application of Multivariate Statistical Analysis for Query-Driven Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Gosink, Luke J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Garth, Christoph [Univ. of California, Davis, CA (United States); Anderson, John C. [Univ. of California, Davis, CA (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Joy, Kenneth I. [Univ. of California, Davis, CA (United States)

    2011-03-01

    Driven by the ability to generate ever-larger, increasingly complex data, there is an urgent need in the scientific community for scalable analysis methods that can rapidly identify salient trends in scientific data. Query-Driven Visualization (QDV) strategies are among the small subset of techniques that can address both large and highly complex datasets. This paper extends the utility of QDV strategies with a statistics-based framework that integrates non-parametric distribution estimation techniques with a new segmentation strategy to visually identify statistically significant trends and features within the solution space of a query. In this framework, query distribution estimates help users to interactively explore their query's solution and visually identify the regions where the combined behavior of constrained variables is most important, statistically, to their inquiry. Our new segmentation strategy extends the distribution estimation analysis by visually conveying the individual importance of each variable to these regions of high statistical significance. We demonstrate the analysis benefits these two strategies provide and show how they may be used to facilitate the refinement of constraints over variables expressed in a user's query. We apply our method to datasets from two different scientific domains to demonstrate its broad applicability.

  16. A Two-Stage DEA to Analyze the Effect of Entrance Deregulation on Iranian Insurers: A Robust Approach

    Directory of Open Access Journals (Sweden)

    Seyed Gholamreza Jalali Naini

    2012-01-01

    Full Text Available We use a two-stage data envelopment analysis (DEA) model to analyze the effects of entrance deregulation on efficiency in the Iranian insurance market. In the first stage, we propose a robust optimization approach in order to overcome the sensitivity of DEA results to uncertainty in the output parameters. The efficiency of each ongoing insurer is estimated using our proposed robust DEA model, and the insurers are then ranked by their relative efficiency scores over an eight-year period from 2003 to 2010. In the second stage, a comprehensive statistical analysis using generalized estimating equations (GEE) is conducted to analyze other factors that could affect the efficiency scores. The first-stage DEA results indicate a decline in efficiency over the entrance deregulation period, while further statistical analysis confirms that solvency ignorance, a widespread paradigm among state-owned companies, is one of the main drivers of efficiency in the Iranian insurance market.
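
    The first-stage building block is a DEA efficiency score; the sketch below solves the classical input-oriented CCR model in envelopment form with scipy, as a deterministic stand-in — the robust uncertainty sets and the GEE second stage of the paper are not reproduced, and the toy data are invented:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (envelopment form).
    X: (m, n) inputs, Y: (s, n) outputs; columns index the n DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    A_in = np.hstack([-X[:, [j0]], X])          # X @ lam - theta * x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # Y @ lam >= y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambda >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Toy data: 2 inputs, 1 output, 4 insurers (columns)
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```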

  17. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  18. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    Science.gov (United States)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high frequency vibroacoustic analysis software founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). Energy Finite Element Analysis (EFEA) was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. Statistical Energy Analysis (SEA) predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  19. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  20. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) in failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations of the statistical analysis to probabilistic risk assessment (PRA) are discussed in terms of categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)

  1. Reports on Cancer - Cancer Statistics

    Science.gov (United States)

    Interactive tools for access to statistics for a cancer site by gender, race, ethnicity, calendar year, age, state, county, stage, and histology. Statistics include incidence, mortality, prevalence, cost, risk factors, behaviors, tobacco use, and policies and are presented as graphs, tables, or maps.

  2. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
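
    As a concrete illustration of the kriging predictor the record refers to, the sketch below solves the ordinary kriging system at a single target point under an assumed exponential covariance model (generic illustration with invented counts, not the Vermont analysis):

```python
import numpy as np

def ordinary_krige(coords, values, target, sill=1.0, rng=10.0):
    """Ordinary kriging at one target point with an exponential covariance
    C(h) = sill * exp(-h / rng). Minimal sketch: no nugget, isotropic."""
    d = lambda a, b: np.linalg.norm(a - b)
    n = len(coords)
    K = np.empty((n + 1, n + 1))
    for i in range(n):
        for j in range(n):
            K[i, j] = sill * np.exp(-d(coords[i], coords[j]) / rng)
    K[:n, n] = 1.0           # Lagrange row/column enforces weights summing to 1
    K[n, :n] = 1.0
    K[n, n] = 0.0
    k = np.r_[[sill * np.exp(-d(c, target) / rng) for c in coords], 1.0]
    w = np.linalg.solve(K, k)[:n]
    return w @ values

coords = np.array([[0.0, 0.0], [0.0, 5.0], [5.0, 0.0]])   # sample locations
counts = np.array([12.0, 8.0, 15.0])                      # e.g. thrips counts
print(ordinary_krige(coords, counts, np.array([2.0, 2.0])))
```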

  3. Statistical wind analysis for near-space applications

    Science.gov (United States)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
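
    The 50%, 95% and 99% design winds mentioned above follow directly from the fitted Weibull's inverse CDF; a minimal sketch on synthetic data (illustrative only, not the Akron or White Sands records):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
winds = rng.weibull(2.0, size=1000) * 12.0   # synthetic wind speeds (m/s)

# Fit a two-parameter Weibull (location pinned at zero), then read off the
# design quantiles via the inverse CDF.
shape, loc, scale = stats.weibull_min.fit(winds, floc=0)
for p in (0.50, 0.95, 0.99):
    print(f"{p:.0%} wind: {stats.weibull_min.ppf(p, shape, loc, scale):.1f} m/s")
```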

  4. Analysis of photon statistics with Silicon Photomultiplier

    International Nuclear Information System (INIS)

    D'Ascenzo, N.; Saveliev, V.; Wang, L.; Xie, Q.

    2015-01-01

    The Silicon Photomultiplier (SiPM) is a novel silicon-based photodetector, which represents the modern perspective of low photon flux detection. The aim of this paper is to provide an introduction to the statistical analysis methods needed to understand and estimate, in a quantitative way, the correct features and description of the response of the SiPM to a coherent source of light.

  5. Development of statistical analysis code for meteorological data (W-View)

    International Nuclear Information System (INIS)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactors' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code gives statistical meteorological data to assess the public dose in cases of normal operation and severe accident, as required for the licensing of nuclear reactor operation. This code was revised from the original version, which ran on a large office computer, to enable a personal computer user to analyze the meteorological data simply and conveniently and to produce the statistical tables and figures of meteorology. (author)

  6. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)
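
    For readers unfamiliar with the Shewhart machinery used here, the X-bar and R chart limits are simple functions of the subgroup means and ranges; a generic sketch with the tabulated constants for subgroups of size 5 and synthetic data (not the Ft. Calhoun measurements):

```python
import numpy as np

# Shewhart X-bar and R chart limits for subgroups of size n = 5;
# A2, D3, D4 are the standard tabulated control-chart constants for n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

gen = np.random.default_rng(1)
subgroups = gen.normal(100.0, 2.0, size=(25, 5))   # 25 subgroups of a monitored parameter

xbar = subgroups.mean(axis=1)
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
xbarbar, rbar = xbar.mean(), ranges.mean()

lcl, ucl = xbarbar - A2 * rbar, xbarbar + A2 * rbar
print("X-bar chart limits:", round(lcl, 2), round(ucl, 2))
print("R chart limits:    ", round(D3 * rbar, 2), round(D4 * rbar, 2))
print("subgroups out of control:", np.where((xbar < lcl) | (xbar > ucl))[0])
```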

  7. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…
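
    The matching step that distinguishes PSA from ANOVA/ANCOVA can be sketched as: estimate each subject's propensity score with a logistic model, then greedily pair each treated subject with the nearest unused control. A minimal illustration on synthetic data (not the paper's heuristic example):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def psa_match(X, treated):
    """1:1 greedy nearest-neighbour matching on the estimated propensity score.
    Minimal sketch of the idea, not a production PSA workflow."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated == 1)[0]
    c_idx = list(np.where(treated == 0)[0])
    pairs = []
    for i in t_idx:
        if not c_idx:          # ran out of controls to match
            break
        j = min(c_idx, key=lambda k: abs(ps[k] - ps[i]))  # closest unused control
        pairs.append((i, j))
        c_idx.remove(j)
    return pairs, ps

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))                               # observed covariates
treated = (X[:, 0] + rng.normal(size=200) > 0).astype(int)  # non-random assignment
pairs, ps = psa_match(X, treated)
print(len(pairs), "matched pairs")
```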

  8. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...
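
    The contrast drawn in this record — a designed experiment plus regression instead of one-factor-at-a-time — can be made concrete with a 2^3 full factorial: eight runs estimate all three main effects and their interactions simultaneously. A small sketch on a simulated response (illustrative, not from the article):

```python
import numpy as np
from itertools import product

# 2^3 full factorial in coded units (-1, +1): 8 runs cover three factors at once
design = np.array(list(product([-1, 1], repeat=3)), dtype=float)

gen = np.random.default_rng(3)
y = (5 + 2 * design[:, 0] - design[:, 1]
     + 0.5 * design[:, 0] * design[:, 1]
     + gen.normal(0, 0.1, size=8))          # simulated simulation response

# Regression with main effects and one two-factor interaction
Xmat = np.column_stack([np.ones(8), design, design[:, 0] * design[:, 1]])
beta, *_ = np.linalg.lstsq(Xmat, y, rcond=None)
print(np.round(beta, 2))   # ~ [5, 2, -1, 0, 0.5]
```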

  9. Statistical analysis of thermal conductivity of nanofluid containing ...

    Indian Academy of Sciences (India)

    Thermal conductivity measurements of nanofluids were analysed via a two-factor completely randomized design, and comparison of data means was carried out with Duncan's multiple-range test. Statistical analysis of the experimental data shows that temperature and weight fraction have a reasonable impact on the thermal ...

  10. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level tools, such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage performs pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display-and-fit stage displays the statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool are analyzed.

  11. Interpretation of electrocardiographic, echocardiographic and biochemical findings during different stages of Canine Visceral Leishmaniasis

    Directory of Open Access Journals (Sweden)

    Ural K

    2017-09-01

    Full Text Available Objective. The purpose of the present study was to test the hypothesis that cardiac alterations occur during the different stages of CVL. Materials and methods. Twenty-eight dogs diagnosed with CVL were classified (based on clinical signs, rapid ELISA/IFAT, hematological and serum biochemical tests, urinary protein/creatinine ratio, ECG and ECHO) as follows: group I (mild disease), group II (moderate disease), group III (severe disease), group IV (very severe disease); group V comprised healthy controls. Results. IgG antibody titres against leishmaniasis, as tested by IFAT, ranged from 1/64 to 1/16000 among the infected groups. Statistically significant differences in mean values were found for WBC [between the healthy control group (V) and the other groups (p=0.049)], RBC [between stages III-IV and the other groups (p=0.001)], Hb [between stage I and stages III-IV (p=0.008)], HCT [between stage I and the other groups (p=0.001)], MCHC [between stage I and stages II and IV (p=0.046)], serum creatinine [stage IV versus stages I-II and group V (p=0.008)], serum protein [stage IV versus stages I-III and the healthy control group (p=0.002)] and serum albumin [stage IV versus stages I-II (p=0.004)]. There was no alteration in cTnI concentrations among groups. UPC analysis revealed a statistically significant difference between the control group and stage II to IV dogs (p=0.000). Moderate or severe ECG abnormalities were detected in 6/28 of the diseased dogs. On ECHO examination, the LA/Ao value differed significantly (p=0.003) between stage IV and the other groups. Conclusions. It may be suggested that, when applying the LeishVet Working Group staging to dogs classified into stages I to IV, ECG alterations [left atrial/ventricular enlargement, myocardial hypoxia] and ECHO alterations [left atrial dilation, decreased/increased LVIDs, decreased/increased LVIDd, shortened FS and EF (systolic dysfunction)] must be taken into account along with the hematological and serum biochemical analysis.

  12. Longitudinal data analysis a handbook of modern statistical methods

    CERN Document Server

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  13. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  14. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering.  Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerful...

  15. Quantitative analysis and IBM SPSS statistics a guide for business and finance

    CERN Document Server

    Aljandali, Abdulkader

    2016-01-01

    This guide is for practicing statisticians and data scientists who use IBM SPSS for statistical analysis of big data in business and finance. This is the first of a two-part guide to SPSS for Windows, introducing data entry into SPSS, along with elementary statistical and graphical methods for summarizing and presenting data. Part I also covers the rudiments of hypothesis testing and business forecasting while Part II will present multivariate statistical methods, more advanced forecasting methods, and multivariate methods. IBM SPSS Statistics offers a powerful set of statistical and information analysis systems that run on a wide variety of personal computers. The software is built around routines that have been developed, tested, and widely used for more than 20 years. As such, IBM SPSS Statistics is extensively used in industry, commerce, banking, local and national governments, and education. Just a small subset of users of the package includes the major clearing banks, the BBC, British Gas, British Airway...

  16. What type of statistical model to choose for the analysis of radioimmunoassays

    International Nuclear Information System (INIS)

    Huet, S.

    1984-01-01

    The current techniques used for statistical analysis of radioimmunoassays are not very satisfactory for either the statistician or the biologist. They are based on an attempt to make the response curve linear to avoid complicated computations. The present article shows that this practice has considerable effects (often neglected) on the statistical assumptions which must be formulated. A stricter analysis is proposed, applying the four-parameter logistic model. The advantages of this method are: the statistical assumptions formulated are based on observed data, and the model can be applied to almost all radioimmunoassays. [fr]
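
    The four-parameter logistic model advocated here has the standard form y = d + (a − d)/(1 + (x/c)^b). A minimal fitting sketch on synthetic dose-response data (generic least squares; the article's treatment of the error assumptions is not reproduced):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = upper asymptote, b = slope,
    c = inflection dose (EC50), d = lower asymptote."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic dose-response data standing in for assay counts
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
resp = four_pl(dose, 100.0, 1.2, 5.0, 2.0) + \
       np.random.default_rng(4).normal(0, 1.0, dose.size)

params, _ = curve_fit(four_pl, dose, resp, p0=[100, 1, 5, 2])
print(np.round(params, 2))   # recovered (a, b, c, d)
```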

  17. Design and Analysis of an Abbé Free Coplanar Stage

    Directory of Open Access Journals (Sweden)

    Chung Tien-Tung

    2017-01-01

    Full Text Available The design and analysis of a new Abbé-free coplanar xy stage are presented in this paper. The xy stage is formed like conventional xy stages, combining two stacked linear guides: the x-guide is on the bottom and the y-guide is on top of the x-guide. The travel range of this xy stage is 300 mm × 300 mm, which fits the dimensions of 12-inch wafers. A special mechanism is designed such that the z-surface of the y-guide has the same height as the z-surface of the x-guide, so that the Abbé error and cumulative error of this coplanar stage are reduced. A symmetric structure design is also adopted to eliminate the structural deformation caused by the driving forces of the two guides. For this long-travel-range precision stage, the finite element method (FEM) is applied to analyze the structural deformation and the natural vibration frequencies. The structural deformation due to self-weight load is required to be below 1.5 μm, and the first natural frequency above 100 Hz.

  18. Computerized statistical analysis with bootstrap method in nuclear medicine

    International Nuclear Information System (INIS)

    Zoccarato, O.; Sardina, M.; Zatta, G.; De Agostini, A.; Barbesti, S.; Mana, O.; Tarolo, G.L.

    1988-01-01

    Statistical analysis of data samples involves some hypotheses about the features of the data themselves. The accuracy of these hypotheses can influence the results of statistical inference. Among the new methods of computer-aided statistical analysis, the bootstrap method appears to be one of the most powerful, thanks to its ability to reproduce many artificial samples starting from a single original sample and because it works without hypotheses about the data distribution. The authors applied the bootstrap method to two typical situations in a Nuclear Medicine Department. The determination of the normal range of serum ferritin, as assessed by radioimmunoassay and defined by the mean value ±2 standard deviations, starting from an experimental sample of small size, shows an unacceptable lower limit (plasma ferritin levels below zero). On the contrary, the results obtained by elaborating 5000 bootstrap samples give an interval of values (10.95 ng/ml - 72.87 ng/ml) corresponding to the normal ranges commonly reported. Moreover, the authors applied the bootstrap method in evaluating the possible error associated with the correlation coefficient determined between left ventricular ejection fraction (LVEF) values obtained by first-pass radionuclide angiocardiography with 99mTc and 195mAu. The results obtained indicate a high degree of statistical correlation and give the range of r² values to be considered acceptable for this type of study.
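
    The resampling idea is compact enough to sketch: draw many samples with replacement from the original sample, recompute the statistic each time, and read confidence limits from the percentiles of the replicates. A generic percentile-bootstrap sketch on invented ferritin-like values (not the original data):

```python
import numpy as np

def bootstrap_ci(sample, stat, n_boot=5000, alpha=0.05, seed=5):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    rng = np.random.default_rng(seed)
    reps = np.array([stat(rng.choice(sample, size=sample.size, replace=True))
                     for _ in range(n_boot)])
    return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Invented small-sample ferritin-like values (ng/ml)
ferritin = np.array([12.0, 18.0, 25.0, 31.0, 40.0, 22.0, 55.0, 15.0, 60.0, 28.0])
print(bootstrap_ci(ferritin, np.mean))   # CI for the mean, no normality assumed
```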

  19. Software for statistical data analysis used in Higgs searches

    International Nuclear Information System (INIS)

    Gumpert, Christian; Moneta, Lorenzo; Cranmer, Kyle; Kreiss, Sven; Verkerke, Wouter

    2014-01-01

    The analysis and interpretation of data collected by the Large Hadron Collider (LHC) requires advanced statistical tools in order to quantify the agreement between observation and theoretical models. RooStats is a project providing a statistical framework for data analysis with the focus on discoveries, confidence intervals and combination of different measurements in both Bayesian and frequentist approaches. It employs the RooFit data modelling language where mathematical concepts such as variables, (probability density) functions and integrals are represented as C++ objects. RooStats and RooFit rely on the persistency technology of the ROOT framework. The usage of a common data format enables the concept of digital publishing of complicated likelihood functions. The statistical tools have been developed in close collaboration with the LHC experiments to ensure their applicability to real-life use cases. Numerous physics results have been produced using the RooStats tools, with the discovery of the Higgs boson by the ATLAS and CMS experiments being certainly the most popular among them. We will discuss tools currently used by LHC experiments to set exclusion limits, to derive confidence intervals and to estimate discovery significances based on frequentist statistics and the asymptotic behaviour of likelihood functions. Furthermore, new developments in RooStats and performance optimisation necessary to cope with complex models depending on more than 1000 variables will be reviewed

  20. PRECISE - pregabalin in addition to usual care: Statistical analysis plan

    NARCIS (Netherlands)

    S. Mathieson (Stephanie); L. Billot (Laurent); C. Maher (Chris); A.J. McLachlan (Andrew J.); J. Latimer (Jane); B.W. Koes (Bart); M.J. Hancock (Mark J.); I. Harris (Ian); R.O. Day (Richard O.); J. Pik (Justin); S. Jan (Stephen); C.-W.C. Lin (Chung-Wei Christine)

    2016-01-01

    textabstractBackground: Sciatica is a severe, disabling condition that lacks high quality evidence for effective treatment strategies. This a priori statistical analysis plan describes the methodology of analysis for the PRECISE study. Methods/design: PRECISE is a prospectively registered, double

  1. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis, thereby including transient interactions and trip uncertainties in the MDNBR probability density.
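
    For reference, the 2nd-order response surface referred to here has the standard quadratic form below, with ŷ standing for MDNBR and the x_i for the uncertain inputs (generic notation; the LOFT-specific factors are not reproduced):

```latex
\hat{y} \;=\; \beta_{0} + \sum_{i=1}^{k}\beta_{i}x_{i}
        + \sum_{i=1}^{k}\beta_{ii}x_{i}^{2}
        + \sum_{i<j}\beta_{ij}x_{i}x_{j}
```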

  2. Multivariate statistical analysis of atom probe tomography data

    International Nuclear Information System (INIS)

    Parish, Chad M.; Miller, Michael K.

    2010-01-01

    The application of spectrum imaging multivariate statistical analysis methods, specifically principal component analysis (PCA), to atom probe tomography (APT) data has been investigated. The mathematical method of analysis is described and the results for two example datasets are analyzed and presented. The first dataset is from the analysis of a PM 2000 Fe-Cr-Al-Ti steel containing two different ultrafine precipitate populations. PCA properly describes the matrix and precipitate phases in a simple and intuitive manner. A second APT example is from the analysis of an irradiated reactor pressure vessel steel. Fine, nm-scale Cu-enriched precipitates having a core-shell structure were identified and qualitatively described by PCA. Advantages, disadvantages, and future prospects for implementing these data analysis methodologies for APT datasets, particularly with regard to quantitative analysis, are also discussed.
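
    The PCA step described here amounts to an SVD of the mean-centred data matrix; a minimal sketch on a synthetic 'spectrum image' (generic illustration, not the APT processing chain):

```python
import numpy as np

def pca(data, n_components=2):
    """PCA via SVD of the mean-centred data matrix (rows = voxels/spectra,
    columns = channels). Returns scores and component loadings."""
    centred = data - data.mean(axis=0)
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    return U[:, :n_components] * S[:n_components], Vt[:n_components]

rng = np.random.default_rng(6)
# Synthetic 'spectrum image': 500 observations of 10 channels, 2 latent factors
latent = rng.normal(size=(500, 2))
data = latent @ rng.normal(size=(2, 10)) + 0.05 * rng.normal(size=(500, 10))
scores, components = pca(data)
print(scores.shape, components.shape)   # (500, 2) (2, 10)
```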

  3. Development of statistical analysis code for meteorological data (W-View)

    Energy Technology Data Exchange (ETDEWEB)

    Tachibana, Haruo; Sekita, Tsutomu; Yamaguchi, Takenori [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-03-01

    A computer code (W-View: Weather View) was developed to analyze the meteorological data statistically based on 'the guideline of meteorological statistics for the safety analysis of nuclear power reactor' (Nuclear Safety Commission on January 28, 1982; revised on March 29, 2001). The code gives statistical meteorological data to assess the public dose in case of normal operation and severe accident to get the license of nuclear reactor operation. This code was revised from the original code used in a large office computer code to enable a personal computer user to analyze the meteorological data simply and conveniently and to make the statistical data tables and figures of meteorology. (author)

  4. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    Science.gov (United States)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology-especially to those aspects with great impact on public policy-statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles with an additional six in draft form along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  5. Recent advances in statistical energy analysis

    Science.gov (United States)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  6. Statistical analysis of tourism destination competitiveness

    Directory of Open Access Journals (Sweden)

    Attilio Gardini

    2013-05-01

    Full Text Available The growing relevance of the tourism industry for modern advanced economies has increased the interest among researchers and policy makers in the statistical analysis of destination competitiveness. In this paper we outline a new model of destination competitiveness based on sound theoretical grounds, and we develop a statistical test of the model on sample data on Italian tourist destination decisions and choices. Our model focuses on the tourism decision process, which starts from the demand schedule for holidays and ends with the choice of a specific holiday destination. The demand schedule is a function of individual preferences and of destination positioning, while the final decision is a function of the initial demand schedule and the information concerning services for accommodation and recreation in the selected destinations. Moreover, we extend previous studies that focused on image or attributes (such as climate and scenery) by paying more attention to the services for accommodation and recreation in the holiday destinations. We test the proposed model using empirical data collected from a sample of 1,200 Italian tourists interviewed in 2007 (October-December). Data analysis shows that the selection probability for a destination included in the consideration set is not proportional to its share of inclusion, because the share of inclusion is determined by the brand image, while the selection of the actual holiday destination is influenced by the real supply conditions. The analysis of Italian tourists' preferences underlines the existence of a latent demand for foreign holidays, which points to a risk of market-share reduction for the Italian tourism system in the global market. We also find a snowball effect which helps the most popular destinations, mainly in the northern Italian regions.

  7. Visual and statistical analysis of 18F-FDG PET in primary progressive aphasia

    International Nuclear Information System (INIS)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge; Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis

    2015-01-01

    Diagnosing progressive primary aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. In cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  8. TFM classification and staging of oral submucous fibrosis: A new proposal.

    Science.gov (United States)

    Arakeri, Gururaj; Thomas, Deepak; Aljabab, Abdulsalam S; Hunasgi, Santosh; Rai, Kirthi Kumar; Hale, Beverley; Fonseca, Felipe Paiva; Gomez, Ricardo Santiago; Rahimi, Siavash; Merkx, Matthias A W; Brennan, Peter A

    2018-04-01

    We have evaluated the rationale of existing grading and staging schemes of oral submucous fibrosis (OSMF) based on how they are categorized. A novel classification and staging scheme is proposed. A total of 300 OSMF patients were evaluated for agreement between functional, clinical, and histopathological staging. Bilateral biopsies were assessed in 25 patients to evaluate for any differences in histopathological staging of OSMF in the same mouth. Extent of clinician agreement for categorized staging data was evaluated using Cohen's weighted kappa analysis. Cross-tabulation was performed on categorical grading data to understand the intercorrelation, and the unweighted kappa analysis was used to assess the bilateral grade agreement. Probabilities of less than 0.05 were considered significant. Data were analyzed using SPSS Statistics (version 25.0, IBM, USA). A low agreement was found between all the stages depicting the independent nature of trismus, clinical features, and histopathological components (K = 0.312, 0.167, 0.152) in OSMF. Following analysis, a three-component classification scheme (TFM classification) was developed that describes the severity of each independently, grouping them using a novel three-tier staging scheme as a guide to the treatment plan. The proposed classification and staging could be useful for effective communication, categorization, and for recording data and prognosis, and for guiding treatment plans. Furthermore, the classification considers OSMF malignant transformation in detail. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Is axillary sonographic staging less accurate in invasive lobular breast cancer than in ductal breast cancer?

    Science.gov (United States)

    Sankaye, Prashant; Chhatani, Sharmila; Porter, Gareth; Steel, Jim; Doyle, Sarah

    2014-10-01

    The purpose of this study was to determine whether axillary sonography is less accurate in invasive lobular breast cancer than in ductal breast cancer. Patients with invasive breast cancer were retrospectively identified from histologic records from 2010 to 2012. Staging axillary sonograms from 96 patients with primary breast cancer in each of 2 subgroups, invasive lobular carcinoma (ILC) and invasive ductal carcinoma (IDC), were reviewed. Preoperative sonographically guided 14-gauge core biopsy was performed on morphologically abnormal lymph nodes. Thirty-one of 96 patients (32%) in each subgroup were node positive on final postoperative histopathologic analysis. Axillary staging sensitivity was 17 of 31 patients (54%) in the IDC subgroup and 15 of 31(48%) in the ILC subgroup. Further analysis of the data showed no statistically significant differences between these subgroups. We found that there was no statistically significant difference in the accuracy of axillary sonographic staging between ILC and IDC. © 2014 by the American Institute of Ultrasound in Medicine.

  10. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into consideration aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team as specified in the study protocol, and detailed in the study case report form were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events are described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the

  11. Asymmetry of price returns-Analysis and perspectives from a non-extensive statistical physics point of view.

    Directory of Open Access Journals (Sweden)

    Łukasz Bil

    Full Text Available We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of the stock and money market development. A particular attention is given to asymmetric behavior of fat tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE in Poland and-for comparison-data from the most mature money market (Forex. It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main strength is put on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. The particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies-owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco-but traded on various European stock markets of diversified economical maturity (respectively in Warsaw, Lisbon, Paris and London. The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within European Union (EU may be a target of diversified level of speculations involved in trading independently on the true economic situation of the company. Our work thus gives

  12. Asymmetry of price returns—Analysis and perspectives from a non-extensive statistical physics point of view

    Science.gov (United States)

    Bil, Łukasz; Zienowicz, Magdalena

    2017-01-01

    We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of the stock and money market development. A particular attention is given to asymmetric behavior of fat tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and—for comparison—data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main strength is put on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. The particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies—owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco)—but traded on various European stock markets of diversified economical maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within European Union (EU) may be a target of diversified level of speculations involved in trading independently on the true economic situation of the company. Our work thus gives

  13. Asymmetry of price returns-Analysis and perspectives from a non-extensive statistical physics point of view.

    Science.gov (United States)

    Bil, Łukasz; Grech, Dariusz; Zienowicz, Magdalena

    2017-01-01

    We study how the approach grounded on non-extensive statistical physics can be applied to describe and distinguish different stages of the stock and money market development. A particular attention is given to asymmetric behavior of fat tailed distributions of positive and negative returns. A new method to measure this asymmetry is proposed. It is based on the value of the non-extensive Tsallis parameter q. The new quantifier of the relative asymmetry level between tails in terms of the Tsallis parameters q± is provided to analyze the effect of memory in data caused by nonlinear autocorrelations. The presented analysis takes into account data of separate stocks from the main developing stock market in Europe, i.e., the Warsaw Stock Exchange (WSE) in Poland and-for comparison-data from the most mature money market (Forex). It is argued that the proposed new quantifier is able to describe the stage of market development and its robustness to speculation. The main strength is put on a description and interpretation of the asymmetry between statistical properties of positive and negative returns for various stocks and for diversified time-lags Δt of data counting. The particular caution in this context is addressed to the difference between intraday and interday returns. Our search is extended to study memory effects and their dependence on the quotation frequency for similar large companies-owners of food-industrial retail supermarkets acting on both Polish and European markets (Eurocash, Jeronimo-Martins, Carrefour, Tesco)-but traded on various European stock markets of diversified economical maturity (respectively in Warsaw, Lisbon, Paris and London). The latter analysis seems to indicate quantitatively that stocks from the same economic sector traded on different markets within European Union (EU) may be a target of diversified level of speculations involved in trading independently on the true economic situation of the company. Our work thus gives indications
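
    For orientation on the three records above: the non-extensive fits are usually written as a q-Gaussian, which reduces to the ordinary Gaussian as q → 1, and the asymmetry quantifier contrasts the parameters q+ and q- fitted separately to the positive- and negative-return tails (generic notation, not the papers' exact parameterization):

```latex
p(x) \;\propto\; \left[\,1-(1-q)\,\beta x^{2}\right]^{\frac{1}{1-q}},
\qquad
\lim_{q\to 1} p(x) \;\propto\; e^{-\beta x^{2}}
```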

  14. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    Full Text Available The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model - the Harmonized Commodity Description and Coding System (HS). The practical implementation of the proposed three-stage model of index analysis and estimation of terms of trade is realized for Ukraine's commodity groups for the period 2011-2012.
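
    The underlying quantity is the net barter terms of trade: the ratio of an export price index to an import price index. One common choice is shown below with Laspeyres-type indices over HS commodity groups i (illustrative notation; the paper's exact three-stage index formulas are not reproduced):

```latex
\mathrm{ToT}_{t} \;=\; \frac{P^{X}_{t}}{P^{M}_{t}}\times 100,
\qquad
P^{X}_{t}=\frac{\sum_{i} p^{x}_{i,t}\,q^{x}_{i,0}}{\sum_{i} p^{x}_{i,0}\,q^{x}_{i,0}},
\quad
P^{M}_{t}=\frac{\sum_{i} p^{m}_{i,t}\,q^{m}_{i,0}}{\sum_{i} p^{m}_{i,0}\,q^{m}_{i,0}}
```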

  15. Measuring the Success of an Academic Development Programme: A Statistical Analysis

    Science.gov (United States)

    Smith, L. C.

    2009-01-01

    This study uses statistical analysis to estimate the impact of first-year academic development courses in microeconomics, statistics, accountancy, and information systems, offered by the University of Cape Town's Commerce Academic Development Programme, on students' graduation performance relative to that achieved by mainstream students. The data…

  16. Edmonton obesity staging system among pediatric patients: a validation and obesogenic risk factor analysis.

    Science.gov (United States)

    Grammatikopoulou, M G; Chourdakis, M; Gkiouras, K; Roumeli, P; Poulimeneas, D; Apostolidou, E; Chountalas, I; Tirodimos, I; Filippou, O; Papadakou-Lagogianni, S; Dardavessis, T

    2018-01-08

    The Edmonton Obesity Staging System for Pediatrics (EOSS-P) is a useful tool, delineating different obesity severity tiers associated with distinct treatment barriers. The aim of the study was to apply the EOSS-P to a Greek pediatric cohort and assess risk factors associated with each stage, compared to normal-weight controls. A total of 361 children (2-14 years old), outpatients of an Athenian hospital, participated in this case-control study by forming two groups: the obese (n = 203) and the normoweight controls (n = 158). Anthropometry, blood pressure, blood and biochemical markers, comorbidities and obesogenic lifestyle parameters were recorded and the EOSS-P was applied. Validation of EOSS-P stages was conducted by juxtaposing them with IOTF-defined weight status. Obesogenic risk factors' analysis was conducted by constructing gender-and-age-adjusted (GA) and multivariate logistic models. The majority of obese children were stratified at stage 1 (46.0%), 17.0% were on stage 0, and 37.0% on stage 2. The validation analysis revealed that EOSS-P stages greater than 0 were associated with diastolic blood pressure and levels of glucose, cholesterol, LDL and ALT. Reduced obesity odds were observed among children playing outdoors and increased odds for every screen time hour, both in the GA and in the multivariate analyses (all P < 0.05). Although physical activity > 2 times/week was associated with reduced obesity odds in the GA analysis (OR = 0.57, 95% CI = 0.33-0.98, P-linear = 0.047), it lost its significance in the multivariate analysis (P-linear = 0.145). Analogous results were recorded in the analyses of the abovementioned physical activity risk factors for the EOSS-P stages. Linear relationships were observed between fast-food consumption and both IOTF-defined obesity and EOSS-P stages above 0. Parental obesity status was associated with all EOSS-P stages and IOTF-defined obesity status. Few outpatients were healthy obese (stage 0), while the majority exhibited several comorbidities

  17. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
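
    As a toy illustration of the idea (not one of the book's algorithms), a one-way ANOVA can flag an image row whose pixel mean departs from the background, which is the essence of ANOVA-based line detection:

    ```python
    # Sketch: detect a horizontal line in a noisy image by testing each row's
    # pixels against the remaining pixels with a one-way ANOVA F-test.
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(2)
    img = rng.normal(0.0, 1.0, (64, 64))
    img[40, :] += 1.5                      # faint bright line on row 40

    for row in range(img.shape[0]):
        rest = np.delete(img, row, axis=0).ravel()
        F, p = f_oneway(img[row], rest)
        if p < 0.01 / img.shape[0]:        # Bonferroni-corrected threshold
            print(f"row {row}: F = {F:.1f}, p = {p:.2e}  <- candidate line")
    ```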

  18. Study of relationship between MUF correlation and detection sensitivity of statistical analysis

    International Nuclear Information System (INIS)

    Tamura, Toshiaki; Ihara, Hitoshi; Yamamoto, Yoichi; Ikawa, Koji

    1989-11-01

    Various kinds of statistical analysis have been proposed for NRTA (Near Real Time Materials Accountancy), which was devised to satisfy the timeliness goal among the IAEA detection goals. Different statistical analysis results can be expected between the case of rigorous error propagation (with MUF correlation) and the case of simplified error propagation (without MUF correlation). Therefore, measurement simulation and decision analysis were performed using a flow simulation of an 800 MTHM/y model reprocessing plant, and the relationship between MUF correlation and the detection sensitivity and false-alarm rate of the statistical analysis was studied. The specific character of materials accountancy for the 800 MTHM/y model reprocessing plant was captured by this simulation. It also became clear that MUF correlation decreases not only the false-alarm rate but also the detection probability for protracted loss when the CUMUF test and Page's test are applied to NRTA. (author)
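
    A minimal sketch of the two decision tests named above, applied to a simulated uncorrelated MUF sequence; the threshold, reference value, and loss scenario are illustrative assumptions:

    ```python
    # Sketch: CUMUF and Page's (one-sided CUSUM) tests on a simulated MUF
    # sequence, as used in NRTA decision analysis.
    import numpy as np

    rng = np.random.default_rng(3)
    sigma = 1.0
    muf = rng.normal(0.0, sigma, 52)       # weekly MUF, one year, no correlation
    muf[20:] += 0.3 * sigma                # protracted loss starting at week 20

    cumuf = np.cumsum(muf)                 # CUMUF test statistic

    # Page's test: S_t = max(0, S_{t-1} + x_t - k), alarm when S_t > h
    k, h = 0.5 * sigma, 5.0 * sigma
    s = 0.0
    for t, x in enumerate(muf):
        s = max(0.0, s + x - k)
        if s > h:
            print(f"Page's test alarm at period {t}, S = {s:.2f}")
            break
    print(f"final CUMUF = {cumuf[-1]:.2f} (compare against its own threshold)")
    ```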

  19. Stage-discharge relationship in tidal channels

    Science.gov (United States)

    Kearney, W. S.; Mariotti, G.; Deegan, L.; Fagherazzi, S.

    2016-12-01

    Long-term records of the flow of water through tidal channels are essential to constrain the budgets of sediments and biogeochemical compounds in salt marshes. Statistical models which relate discharge to water level allow the estimation of such records from more easily obtained records of water stage in the channel. While there is clearly structure in the stage-discharge relationship, nonlinearity and nonstationarity of the relationship complicates the construction of statistical stage-discharge models with adequate performance for discharge estimation and uncertainty quantification. Here we compare four different types of stage-discharge models, each of which is designed to capture different characteristics of the stage-discharge relationship. We estimate and validate each of these models on a two-month long time series of stage and discharge obtained with an Acoustic Doppler Current Profiler in a salt marsh channel. We find that the best performance is obtained by models which account for the nonlinear and time-varying nature of the stage-discharge relationship. Good performance can also be obtained from a simplified version of these models which approximates the fully nonlinear and time-varying models with a piecewise linear formulation.
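
    A minimal sketch of the simplified piecewise-linear formulation mentioned in the closing sentence, with a single breakpoint chosen by a least-squares grid search over synthetic rating data (the time-varying models are omitted):

    ```python
    # Sketch: piecewise-linear stage-discharge model with one breakpoint
    # selected by grid search (synthetic data).
    import numpy as np

    rng = np.random.default_rng(4)
    stage = rng.uniform(0.2, 2.0, 500)
    discharge = np.where(stage < 1.0, 2.0 * stage, 2.0 + 6.0 * (stage - 1.0))
    discharge += rng.normal(0.0, 0.2, stage.size)

    def fit_piecewise(x, y, bp):
        # design matrix: intercept, x, and hinge term max(0, x - bp)
        X = np.column_stack([np.ones_like(x), x, np.maximum(0.0, x - bp)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta, ((X @ beta - y) ** 2).sum()

    breakpoints = np.linspace(0.4, 1.8, 57)
    sse = [fit_piecewise(stage, discharge, bp)[1] for bp in breakpoints]
    best = breakpoints[int(np.argmin(sse))]
    print(f"estimated breakpoint: {best:.2f} (true value 1.0)")
    ```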

  20. One-stage individual participant data meta-analysis models: estimation of treatment-covariate interactions must avoid ecological bias by separating out within-trial and across-trial information.

    Science.gov (United States)

    Hua, Hairui; Burke, Danielle L; Crowther, Michael J; Ensor, Joie; Tudur Smith, Catrin; Riley, Richard D

    2017-02-28

    Stratified medicine utilizes individual-level covariates that are associated with a differential treatment effect, also known as treatment-covariate interactions. When multiple trials are available, meta-analysis is used to help detect true treatment-covariate interactions by combining their data. Meta-regression of trial-level information is prone to low power and ecological bias, and therefore, individual participant data (IPD) meta-analyses are preferable to examine interactions utilizing individual-level information. However, one-stage IPD models are often wrongly specified, such that interactions are based on amalgamating within- and across-trial information. We compare, through simulations and an applied example, fixed-effect and random-effects models for a one-stage IPD meta-analysis of time-to-event data where the goal is to estimate a treatment-covariate interaction. We show that it is crucial to centre patient-level covariates by their mean value in each trial, in order to separate out within-trial and across-trial information. Otherwise, bias and coverage of interaction estimates may be adversely affected, leading to potentially erroneous conclusions driven by ecological bias. We revisit an IPD meta-analysis of five epilepsy trials and examine age as a treatment effect modifier. The interaction is -0.011 (95% CI: -0.019 to -0.003; p = 0.004), and thus highly significant, when amalgamating within-trial and across-trial information. However, when separating within-trial from across-trial information, the interaction is -0.007 (95% CI: -0.019 to 0.005; p = 0.22), and thus its magnitude and statistical significance are greatly reduced. We recommend that meta-analysts should only use within-trial information to examine individual predictors of treatment effect and that one-stage IPD models should separate within-trial from across-trial information to avoid ecological bias. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd
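
    The key recommendation, centring the patient-level covariate by its trial mean and entering the trial mean as a separate term, can be sketched as follows. A Gaussian outcome and a linear mixed model stand in for the paper's time-to-event setting, so this illustrates the centring step only; all data are synthetic.

    ```python
    # Sketch: separate within-trial from across-trial information by centring
    # the covariate within each trial before estimating the interaction.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    rows = []
    for trial in range(5):
        n, age_mean = 200, rng.uniform(30, 60)
        age = rng.normal(age_mean, 8, n)
        treat = rng.integers(0, 2, n)
        y = 1.0 * treat - 0.007 * treat * age + rng.normal(0, 1, n)
        rows.append(pd.DataFrame({"trial": trial, "age": age, "treat": treat, "y": y}))
    ipd = pd.concat(rows, ignore_index=True)

    # centre age within each trial, and carry the trial mean as its own term
    ipd["age_c"] = ipd["age"] - ipd.groupby("trial")["age"].transform("mean")
    ipd["age_m"] = ipd.groupby("trial")["age"].transform("mean")

    m = smf.mixedlm("y ~ treat + age_c + age_m + treat:age_c + treat:age_m",
                    ipd, groups="trial").fit()
    # within-trial vs across-trial interaction estimates:
    print(m.params[["treat:age_c", "treat:age_m"]])
    ```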

  1. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Science.gov (United States)

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the... on Documenting Statistical Analysis Programs and Data Files; Availability'' giving interested persons...

  2. A Retrospective Survival Analysis of Anatomic and Prognostic Stage Group Based on the American Joint Committee on Cancer 8th Edition Cancer Staging Manual in Luminal B Human Epidermal Growth Factor Receptor 2-negative Breast Cancer.

    Science.gov (United States)

    Xu, Ling; Li, Jiang-Hong; Ye, Jing-Ming; Duan, Xue-Ning; Cheng, Yuan-Jia; Xin, Ling; Liu, Qian; Zhou, Bin; Liu, Yin-Hua

    2017-08-20

    Current understanding of tumor biology suggests that breast cancer is a group of diseases with different intrinsic molecular subtypes. An anatomic staging system alone is insufficient to provide future outcome information. The American Joint Committee on Cancer (AJCC) expert panel updated the 8th edition of the staging manual with prognostic stage groups by incorporating biomarkers into the anatomic stage groups. In this study, we retrospectively analyzed data from our center in China using the anatomic and prognostic staging system based on the AJCC 8th edition staging manual. We reviewed the data from January 2008 to December 2014 for cases of Luminal B Human Epidermal Growth Factor Receptor 2 (HER2)-negative breast cancer in our center. All cases were restaged using the AJCC 8th edition anatomic and prognostic staging system. The Kaplan-Meier method and log-rank test were used to compare the survival differences between different subgroups. SPSS software version 19.0 (IBM Corp., Armonk, NY, USA) was used for the statistical analyses. This study consisted of 796 patients with Luminal B HER2-negative breast cancer. The 5-year disease-free survival (DFS) of the 769 Stage I-III patients was 89.7%, and the 5-year overall survival (OS) of all 796 patients was 91.7%. Both 5-year DFS and 5-year OS were significantly different across the anatomic and prognostic stage groups. There were 372 cases (46.7%) assigned to a different group. The prognostic Stage II and III patients restaged from anatomic Stage III had significant differences in 5-year DFS (χ2 = 11.319, P = 0.001) and 5-year OS (χ2 = 5.225, P = 0.022). In addition, cases restaged as prognostic Stage I, II, or III from the anatomic Stage II group had statistically significant differences in 5-year DFS (χ2 = 6.510, P = 0.039) but no significant differences in 5-year OS (χ2 = 5.087, P = 0.079). However, the restaged prognostic Stage I and II cases from anatomic Stage I had no statistically significant differences…
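
    A minimal sketch of the survival comparison described (Kaplan-Meier estimates plus a log-rank test), using the lifelines package on synthetic follow-up times rather than SPSS and the study's data:

    ```python
    # Sketch: Kaplan-Meier estimate and log-rank test between two (re)staged
    # groups (synthetic follow-up times in months).
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    rng = np.random.default_rng(6)
    t_a = rng.exponential(90, 300)          # e.g. prognostic Stage II
    t_b = rng.exponential(60, 300)          # e.g. prognostic Stage III
    obs_a = t_a < 60                        # administrative censoring at 5 years
    obs_b = t_b < 60
    t_a, t_b = np.minimum(t_a, 60), np.minimum(t_b, 60)

    km = KaplanMeierFitter()
    km.fit(t_a, event_observed=obs_a, label="Stage II")
    print("5-year survival:", float(km.survival_function_at_times(60).iloc[0]))

    res = logrank_test(t_a, t_b, event_observed_A=obs_a, event_observed_B=obs_b)
    print(f"log-rank chi2 = {res.test_statistic:.3f}, p = {res.p_value:.4f}")
    ```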

  3. A statistical approach to the life cycle analysis of cumulus clouds selected in a virtual reality environment

    Science.gov (United States)

    Heus, Thijs; Jonker, Harm J. J.; van den Akker, Harry E. A.; Griffith, Eric J.; Koutek, Michal; Post, Frits H.

    2009-03-01

    In this study, a new method is developed to investigate the entire life cycle of shallow cumuli in large eddy simulations. Although trained observers have no problem in distinguishing the different life stages of a cloud, this process proves difficult to automate, because cloud-splitting and cloud-merging events complicate the distinction between a single system divided in several cloudy parts and two independent systems that collided. Because the human perception is well equipped to capture and to make sense of these time-dependent three-dimensional features, a combination of automated constraints and human inspection in a three-dimensional virtual reality environment is used to select clouds that are exemplary in their behavior throughout their entire life span. Three specific cases (ARM, BOMEX, and BOMEX without large-scale forcings) are analyzed in this way, and the considerable number of selected clouds warrants reliable statistics of cloud properties conditioned on the phase in their life cycle. The most dominant feature in this statistical life cycle analysis is the pulsating growth that is present throughout the entire lifetime of the cloud, independent of the case and of the large-scale forcings. The pulses are a self-sustained phenomenon, driven by a balance between buoyancy and horizontal convergence of dry air. The convective inhibition just above the cloud base plays a crucial role as a barrier for the cloud to overcome in its infancy stage, and as a buffer region later on, ensuring a steady supply of buoyancy into the cloud.

  4. Multicriteria analysis of product operational effectiveness at design stages

    Science.gov (United States)

    Irzaev, G. Kh

    2018-03-01

    A multicriteria rapid-assessment method for the techno-economic parameters of new products is developed. It avoids expensive engineering changes during the operational stage through analysis, at an early stage of design, of the external and internal factors that affect the maintainability and manufacturability of the product. Expert selection of an initial set of indicators from five enlarged criteria groups, followed by their pairwise comparison, makes it possible to derive complex criteria of the product design's compliance with the average and optimum values of operational effectiveness. Comparing these values provides a basis for deciding whether to continue the design process and the preparation of product manufacture.

  5. Point defect characterization in HAADF-STEM images using multivariate statistical analysis

    International Nuclear Information System (INIS)

    Sarahan, Michael C.; Chi, Miaofang; Masiel, Daniel J.; Browning, Nigel D.

    2011-01-01

    Quantitative analysis of point defects is demonstrated through the use of multivariate statistical analysis. This analysis consists of principal component analysis for dimensional estimation and reduction, followed by independent component analysis to obtain physically meaningful, statistically independent factor images. Results from these analyses are presented in the form of factor images and scores. Factor images show characteristic intensity variations corresponding to physical structure changes, while scores relate how much those variations are present in the original data. The application of this technique is demonstrated on a set of experimental images of dislocation cores along a low-angle tilt grain boundary in strontium titanate. A relationship between chemical composition and lattice strain is highlighted in the analysis results, with picometer-scale shifts in several columns measurable from compositional changes in a separate column. Research highlights: → Multivariate analysis of HAADF-STEM images. → Distinct structural variations among SrTiO3 dislocation cores. → Picometer atomic column shifts correlated with atomic column population changes.
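
    A minimal sketch of the two-step pipeline, PCA for dimensionality estimation and reduction followed by ICA for independent factor images, on a synthetic stack of image patches; scikit-learn stands in for the authors' implementation:

    ```python
    # Sketch: PCA to estimate dimensionality, then ICA to extract factor
    # images and scores from a stack of images (synthetic data).
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(7)
    n_images, h, w = 200, 16, 16
    # two latent "structure" patterns mixed with random weights plus noise
    s1 = np.outer(np.hanning(h), np.hanning(w)).ravel()
    s2 = np.zeros((h, w))
    s2[:, w // 2] = 1.0
    s2 = s2.ravel()
    X = (rng.random((n_images, 1)) * s1 + rng.random((n_images, 1)) * s2
         + rng.normal(0, 0.05, (n_images, h * w)))

    pca = PCA().fit(X)
    n_comp = int(np.searchsorted(np.cumsum(pca.explained_variance_ratio_), 0.95)) + 1
    print("estimated dimensionality:", n_comp)

    ica = FastICA(n_components=n_comp, random_state=0)
    scores = ica.fit_transform(X)              # how much each factor is present
    factor_images = ica.components_.reshape(n_comp, h, w)
    print("factor image stack shape:", factor_images.shape)
    ```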

  6. Assessing efficiency and effectiveness of Malaysian Islamic banks: A two stage DEA analysis

    Science.gov (United States)

    Kamarudin, Norbaizura; Ismail, Wan Rosmanira; Mohd, Muhammad Azri

    2014-06-01

    Islamic banks in Malaysia are indispensable players in the financial industry, given the growing need for syariah-compliant systems. In the banking industry, most recent studies have been concerned only with operational efficiency, and rarely with operational effectiveness. Since the production process of the banking industry can be described as a two-stage process, two-stage Data Envelopment Analysis (DEA) can be applied to measure bank performance. This study was designed to measure the overall performance, in terms of efficiency and effectiveness, of Islamic banks in Malaysia using a two-stage DEA approach. This paper presents the analysis of a DEA model which splits efficiency and effectiveness in order to evaluate the performance of ten selected Islamic banks in Malaysia for the financial year ended 2011. The analysis shows that the average efficiency score is higher than the average effectiveness score, so Malaysian Islamic banks can be said to be more efficient than effective. Furthermore, none of the banks exhibits best practice in both stages; a bank with better efficiency does not always have better effectiveness at the same time.
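
    A minimal sketch of how such stage-wise scores can be computed: an input-oriented CCR efficiency model solved as a linear program, run once with inputs and intermediate outputs (efficiency) and once with intermediates and final outputs (effectiveness). The three-bank data are made up and the paper's exact DEA specification is not reproduced.

    ```python
    # Sketch: input-oriented CCR DEA efficiency scores via linear programming,
    # applied stage by stage (illustrative data).
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y):
        # X: (m inputs x n DMUs), Y: (s outputs x n DMUs); returns theta per DMU
        m, n = X.shape
        s = Y.shape[0]
        scores = []
        for j in range(n):
            # variables: [theta, lambda_1..lambda_n]; minimise theta subject to
            # sum(lambda x) <= theta * x_j  and  sum(lambda y) >= y_j
            c = np.r_[1.0, np.zeros(n)]
            A_ub = np.block([[-X[:, [j]], X],
                             [np.zeros((s, 1)), -Y]])
            b_ub = np.r_[np.zeros(m), -Y[:, j]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] + [(0, None)] * n)
            scores.append(res.x[0])
        return np.array(scores)

    X1 = np.array([[120., 90., 150.], [30., 25., 40.]])   # stage-1 inputs (3 banks)
    Z  = np.array([[80., 70., 95.]])                      # intermediate outputs
    Y2 = np.array([[50., 55., 60.]])                      # stage-2 final outputs

    print("efficiency   :", ccr_efficiency(X1, Z).round(3))
    print("effectiveness:", ccr_efficiency(Z, Y2).round(3))
    ```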

  7. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required.

  8. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    Science.gov (United States)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structure, including faults, at the design and construction stages is critically important. Unlike tunnels on land, it is very difficult to obtain data on geological structure because of the limits of geological surveys at sea. This study addresses these difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits. By contrast, echo sounding data are easily obtainable, and their reliability is high compared with the above approaches. This study aims to develop an algorithm that identifies large-scale geological structure of the seabed using a geostatistical approach, based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using the echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data within the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on a map, and (6) visualize the geological structure on the map. The important elements of this study include the optimal size of the moving window, the choice of optimal spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were run. Finally, a user program based on R was developed using the optimal analysis algorithm. The program was designed to show the variation of various spatial statistics, enabling easy analysis of geological structure depending on the variation of spatial statistics.
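
    A minimal sketch of steps (2)-(5) of the algorithm, using a moving-window standard deviation and a percentile threshold on a synthetic bathymetry grid; the authors' program is in R, and the window size and percentile here are assumptions:

    ```python
    # Sketch: moving-window spatial statistic over a gridded seabed surface,
    # flagging cells beyond a percentile standard as possible structure.
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(8)
    depth = rng.normal(-50.0, 0.3, (200, 200))   # gridded bathymetry (m)
    depth[:, 120:] -= 4.0                        # fault-like offset in the seabed

    win = 9
    local_mean = uniform_filter(depth, size=win)
    local_sq = uniform_filter(depth**2, size=win)
    local_std = np.sqrt(np.maximum(local_sq - local_mean**2, 0.0))

    threshold = np.percentile(local_std, 99)     # the "percentile standard"
    candidates = local_std > threshold
    print("flagged cells:", int(candidates.sum()))
    print("columns flagged most:", np.argsort(candidates.sum(axis=0))[-3:])
    ```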

  9. FADTTS: functional analysis of diffusion tensor tract statistics.

    Science.gov (United States)

    Zhu, Hongtu; Kong, Linglong; Li, Runze; Styner, Martin; Gerig, Guido; Lin, Weili; Gilmore, John H

    2011-06-01

    The aim of this paper is to present a functional analysis of a diffusion tensor tract statistics (FADTTS) pipeline for delineating the association between multiple diffusion properties along major white matter fiber bundles with a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these white matter tract properties in various diffusion tensor imaging studies. The FADTTS integrates five statistical tools: (i) a multivariate varying coefficient model for allowing the varying coefficient functions in terms of arc length to characterize the varying associations between fiber bundle diffusion properties and a set of covariates, (ii) a weighted least squares estimation of the varying coefficient functions, (iii) a functional principal component analysis to delineate the structure of the variability in fiber bundle diffusion properties, (iv) a global test statistic to test hypotheses of interest, and (v) a simultaneous confidence band to quantify the uncertainty in the estimated coefficient functions. Simulated data are used to evaluate the finite sample performance of FADTTS. We apply FADTTS to investigate the development of white matter diffusivities along the splenium of the corpus callosum tract and the right internal capsule tract in a clinical study of neurodevelopment. FADTTS can be used to facilitate the understanding of normal brain development, the neural bases of neuropsychiatric disorders, and the joint effects of environmental and genetic factors on white matter fiber bundles. The advantages of FADTTS compared with the other existing approaches are that they are capable of modeling the structured inter-subject variability, testing the joint effects, and constructing their simultaneous confidence bands. However, FADTTS is not crucial for estimation and reduces to the functional analysis method for the single measure. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Orthopantomographic evaluation of canine and first premolar using Demirjian's stages in central India: new approach to forensic age estimation.

    Science.gov (United States)

    Pathak, Hrishikesh V; Dixit, Pradeep; Shrigiriwar, Manish; Bardale, Rajesh

    2012-07-01

    Teeth development is widely used for age estimation in forensic science. The aims of this study were as follows: first, to establish Indian data on canine and first premolar development for age estimation and second, to investigate population differences in teeth development. Orthopantomograms of 340 Indian children aged between 5 and 14 years were analyzed. Demirjian's stages were recorded for the developmental evaluation of canine and first premolar and for further descriptive statistical analysis. A two-way ANOVA was performed to test the significance of difference in teeth development by sex and stage. A one-way ANOVA was performed to investigate population differences in teeth development. Results showed statistically significant differences in teeth development by sex and stage. Accordingly, teeth development was earlier in girls. No statistically significant differences were observed in timings of Demirjian's stages among different populations. In conclusion, the findings of this study could be used for age estimation of Indian children. © 2012 American Academy of Forensic Sciences.

  11. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
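
    A minimal sketch of the p-chart construction behind this kind of analysis, with synthetic monthly counts; the 3-sigma limits are the standard choice, not necessarily the authors' exact settings:

    ```python
    # Sketch: p-chart control limits for a monthly adverse-event rate, i.e.
    # the statistically defined "predictable normal variation".
    import numpy as np

    rng = np.random.default_rng(9)
    n = rng.integers(900, 1200, 24)                 # anesthetics per month
    events = rng.binomial(n, 0.183)                 # adverse events per month

    p = events / n
    p_bar = events.sum() / n.sum()                  # centre line
    ucl = p_bar + 3 * np.sqrt(p_bar * (1 - p_bar) / n)
    lcl = np.maximum(p_bar - 3 * np.sqrt(p_bar * (1 - p_bar) / n), 0)

    for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl)):
        flag = "OUT" if (pi > hi or pi < lo) else ""
        print(f"month {month:2d}: p = {pi:.3f}  limits = [{lo:.3f}, {hi:.3f}] {flag}")
    ```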

  12. Effect of the absolute statistic on gene-sampling gene-set analysis methods.

    Science.gov (United States)

    Nam, Dougu

    2017-06-01

    Gene-set enrichment analysis and its modified versions have commonly been used for identifying altered functions or pathways in disease from microarray data. In particular, the simple gene-sampling gene-set analysis methods have been heavily used for datasets with only a few sample replicates. The biggest problem with this approach is the highly inflated false-positive rate. In this paper, the effect of absolute gene statistic on gene-sampling gene-set analysis methods is systematically investigated. Thus far, the absolute gene statistic has merely been regarded as a supplementary method for capturing the bidirectional changes in each gene set. Here, it is shown that incorporating the absolute gene statistic in gene-sampling gene-set analysis substantially reduces the false-positive rate and improves the overall discriminatory ability. Its effect was investigated by power, false-positive rate, and receiver operating curve for a number of simulated and real datasets. The performances of gene-set analysis methods in one-tailed (genome-wide association study) and two-tailed (gene expression data) tests were also compared and discussed.

  13. An improved method for statistical analysis of raw accelerator mass spectrometry data

    International Nuclear Information System (INIS)

    Gutjahr, A.; Phillips, F.; Kubik, P.W.; Elmore, D.

    1987-01-01

    Hierarchical statistical analysis is an appropriate method for statistical treatment of raw accelerator mass spectrometry (AMS) data. Using Monte Carlo simulations we show that this method yields more accurate estimates of isotope ratios and analytical uncertainty than the generally used propagation of errors approach. The hierarchical analysis is also useful in design of experiments because it can be used to identify sources of variability. 8 refs., 2 figs

  14. Do stages of menopause affect the outcomes of pelvic floor muscle training?

    Science.gov (United States)

    Tosun, Özge Çeliker; Mutlu, Ebru Kaya; Tosun, Gökhan; Ergenoğlu, Ahmet Mete; Yeniel, Ahmet Özgur; Malkoç, Mehtap; Aşkar, Niyazi; İtil, İsmail Mete

    2015-02-01

    The purpose of our study was to determine whether there is a difference in pelvic floor muscle strength attributable to pelvic floor muscle training conducted during different stages of menopause. One hundred twenty-two women with stress urinary incontinence and mixed urinary incontinence were included in this prospective controlled study. The participants were separated into three groups according to the Stages of Reproductive Aging Workshop staging system as follows: group 1 (n = 41): stages -3 and -2; group 2 (n = 32): stages +1 and -1; and group 3 (n = 30): stage +2. All three groups were provided an individual home exercise program throughout the 12-week study. Pelvic floor muscle strength before and after the 12-week treatment was measured in all participants (using the PERFECT [power, endurance, number of repetitions, and number of fast (1-s) contractions; every contraction is timed] scheme, perineometry, transabdominal ultrasound, the Brink scale, the pad test, and the stop test). Data were analyzed using analysis of variance. There were no statistically significant differences in pre-exercise training pelvic floor muscle strength parameters among the three groups. After 12 weeks, there were statistically significant increases in PERFECT scheme, Brink scale, perineometry, and ultrasound values. In contrast, there were significant decreases in stop test and 1-hour pad test values in the three groups (P = 0.001, dependent t test). In comparison with the other groups, group 1 demonstrated statistically significant improvements in the following postexercise training parameters: power, repetition, speed, Brink vertical displacement, and stop test. The lowest increase was observed in group 2 (P < 0.05). Pelvic floor muscle strength can be increased at different stages of menopause with pelvic floor muscle training, but the rates of increase vary according to the menopausal stage of the participants. Women in the late menopausal transition and early menopause are least responsive to pelvic floor muscle strength training.

  15. Statistical Image Analysis of Tomograms with Application to Fibre Geometry Characterisation

    DEFF Research Database (Denmark)

    Emerson, Monica Jane

    The goal of this thesis is to develop statistical image analysis tools to characterise the micro-structure of complex materials used in energy technologies, with a strong focus on fibre composites. These quantification tools are based on extracting geometrical parameters defining structures from 2D… …with high resolution both in space and time to observe fast micro-structural changes. This thesis demonstrates that statistical image analysis combined with X-ray CT opens up numerous possibilities for understanding the behaviour of fibre composites under real life conditions. Besides enabling…

  16. Developing a spatial-statistical model and map of historical malaria prevalence in Botswana using a staged variable selection procedure

    Directory of Open Access Journals (Sweden)

    Mabaso Musawenkosi LH

    2007-09-01

    …produced a highly plausible and parsimonious model of historical malaria risk for Botswana from point-referenced data from a 1961/2 prevalence survey of malaria infection in 1–14 year old children. After starting with a list of 50 potential variables, we ended with three highly plausible predictors by applying a systematic and repeatable staged variable selection procedure that included a spatial analysis, which has application for other environmentally determined infectious diseases. All this was accomplished using general-purpose statistical software.

  17. The art of data analysis how to answer almost any question using basic statistics

    CERN Document Server

    Jarman, Kristin H

    2013-01-01

    A friendly and accessible approach to applying statistics in the real world. With an emphasis on critical thinking, The Art of Data Analysis: How to Answer Almost Any Question Using Basic Statistics presents fun and unique examples, guides readers through the entire data collection and analysis process, and introduces basic statistical concepts along the way. Leaving proofs and complicated mathematics behind, the author portrays the more engaging side of statistics and emphasizes its role as a problem-solving tool. In addition, light-hearted case studies…

  18. Posterior tibial tendon insufficiency results at different stages.

    Science.gov (United States)

    Deland, Jonathan T; Page, Alexandra; Sung, Il-Hoon; O'Malley, Martin J; Inda, David; Choung, Steven

    2006-09-01

    The results of surgical treatment of posterior tibial tendon insufficiency (PTTI) may be different at different stages of the disease. No single study has compared the results at different stages. This comparison can be helpful to the patient and physician if the patient asks "What if I wait and the disease progresses, how will my results be different?" A preliminary study comparing results for stage IIa, stage IIb (advanced stage II), and stage III was performed, followed by a larger study comparing stages IIa and IIb, with 26 and 22 patients, respectively. American Orthopaedic Foot and Ankle Society (AOFAS) outcome scores as well as radiographs and functional questions were used. Nearly all patients, regardless of stage, felt they were helped by surgical treatment. However, the lowest AOFAS score was in stage III, the most advanced stage investigated in this study. In comparing stage IIa and IIb patients, stage IIb patients had a statistically higher incidence of lateral discomfort. Although statistically significant differences were not found in all comparisons, this study suggests that the results of surgical treatment for PTTI decline with increasing stage or severity of disease.

  19. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels as well as methods for data preprocessing are covered.

  20. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) determining natural groups or clusters of control strategies with a similar behaviour, ii) finding and interpreting hidden, complex and causal relation features in the data set and iii) identifying important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation…

  1. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  2. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  3. Statistical Challenges of Big Data Analysis in Medicine

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2015-01-01

    Roč. 3, č. 1 (2015), s. 24-27 ISSN 1805-8698 R&D Projects: GA ČR GA13-23940S Grant - others:CESNET Development Fund(CZ) 494/2013 Institutional support: RVO:67985807 Keywords : big data * variable selection * classification * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research http://www.ijbh.org/ijbh2015-1.pdf

  4. Statistical Analysis of Hypercalcaemia Data related to Transferability

    DEFF Research Database (Denmark)

    Frølich, Anne; Nielsen, Bo Friis

    2005-01-01

    In this report we describe statistical analysis related to a study of hypercalcaemia carried out in the Copenhagen area in the ten-year period from 1984 to 1994. Results from the study have previously been published in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analyses and provide some new results, primarily from simultaneous studies of several databases.

  5. Statistical analysis of questionnaires a unified approach based on R and Stata

    CERN Document Server

    Bartolucci, Francesco; Gnaldi, Michela

    2015-01-01

    Statistical Analysis of Questionnaires: A Unified Approach Based on R and Stata presents special statistical methods for analyzing data collected by questionnaires. The book takes an applied approach to testing and measurement tasks, mirroring the growing use of statistical methods and software in education, psychology, sociology, and other fields. It is suitable for graduate students in applied statistics and psychometrics and practitioners in education, health, and marketing. The book covers the foundations of classical test theory (CTT), test reliability, validity…

  6. Reducing bias in the analysis of counting statistics data

    International Nuclear Information System (INIS)

    Hammersley, A.P.; Antoniadis, A.

    1997-01-01

    In the analysis of counting statistics data it is common practice to estimate the variance of the measured data points as the data points themselves. This practice introduces a bias into the results of further analysis which may be significant, and under certain circumstances lead to false conclusions. In the case of normal weighted least squares fitting this bias is quantified and methods to avoid it are proposed. (orig.)
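
    A small Monte Carlo makes the described bias visible: weighting a constant fit by 1/y, with the variance of each point estimated by the point itself, pulls the estimate below the true rate. The authors' proposed corrections are not reproduced here.

    ```python
    # Sketch: bias from using the data points themselves as variance estimates
    # in a weighted least-squares fit of a constant Poisson rate. The weighted
    # mean with weights 1/y is the harmonic mean, which sits below the true rate.
    import numpy as np

    rng = np.random.default_rng(10)
    true_rate, n_points, n_trials = 10.0, 20, 20_000

    biased, unweighted = [], []
    for _ in range(n_trials):
        y = rng.poisson(true_rate, n_points).astype(float)
        w = 1.0 / np.maximum(y, 1.0)          # common practice: var(y_i) ~= y_i
        biased.append(np.sum(w * y) / np.sum(w))
        unweighted.append(y.mean())

    print(f"true rate          : {true_rate}")
    print(f"weighted estimate  : {np.mean(biased):.3f}  (biased low)")
    print(f"unweighted estimate: {np.mean(unweighted):.3f}")
    ```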

  7. Effectiveness of surgery and individualized high-dose hyperfractionated accelerated radiotherapy on survival in clinical stage I non-small cell lung cancer. A propensity score matched analysis

    International Nuclear Information System (INIS)

    Jimenez, Marcelo F.; Baardwijk, Angela van; Aerts, Hugo J.W.L.; De Ruysscher, Dirk; Novoa, Nuria M.; Varela, Gonzalo; Lambin, Philippe

    2010-01-01

    Background and purpose: Surgery is considered the treatment of choice for early-stage non-small cell lung cancer (NSCLC). Patients with poor pulmonary function or other comorbidities are treated with radiotherapy. The objective of this investigation is to compare the 3-year survival of two early-stage NSCLC populations treated in two different hospitals, either by surgical resection (lobectomy) or by individualized high-dose accelerated radiotherapy, after matching patients by propensity score analysis. Methods: A retrospective comparative study was performed on two series of consecutive patients with a cytohistological diagnosis of NSCLC, clinically staged IA by means of PET scan (radiotherapy group) or pathologically staged IA (surgery group). Results: A total of 157 cases were initially selected for the analysis (110 operated and 47 treated by radiotherapy). Patients in the radiotherapy group were older, with higher comorbidity and lower FEV1%, and the 3-year probability of survival was higher for operated patients than for patients treated by radiotherapy. After matching by propensity scoring (using age and FEV1%), the differences disappeared and the 3-year probabilities of survival showed no statistically significant differences. Conclusions: Although this is a non-randomized retrospective analysis, we found no 3-year survival differences after matching cases between surgery and radiotherapy. Nevertheless, the data presented here support continued investigation of non-surgical alternatives in this disease.
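
    A minimal sketch of the matching step on age and FEV1%: a logistic propensity model followed by 1:1 nearest-neighbour matching on the logit of the score. The data are synthetic and this common recipe is not necessarily the authors' exact procedure.

    ```python
    # Sketch: propensity-score matching on age and FEV1% (synthetic data).
    import numpy as np
    import statsmodels.api as sm
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(11)
    n_surg, n_rt = 110, 47
    age = np.r_[rng.normal(65, 8, n_surg), rng.normal(74, 7, n_rt)]
    fev1 = np.r_[rng.normal(85, 12, n_surg), rng.normal(60, 15, n_rt)]
    treated = np.r_[np.zeros(n_surg), np.ones(n_rt)]   # 1 = radiotherapy

    X = sm.add_constant(np.column_stack([age, fev1]))
    ps = sm.Logit(treated, X).fit(disp=0).predict(X)   # propensity scores
    logit_ps = np.log(ps / (1 - ps)).reshape(-1, 1)

    # match each radiotherapy patient to the nearest surgical control
    nn = NearestNeighbors(n_neighbors=1).fit(logit_ps[treated == 0])
    _, idx = nn.kneighbors(logit_ps[treated == 1])
    matched_controls = np.flatnonzero(treated == 0)[idx.ravel()]
    print("matched surgical controls:", matched_controls[:10], "...")
    ```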

  8. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

    This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation-(a) benchmark value, (b) benchmark estimate, and (c) benchmark effect-are described and illustrated with examples. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  10. Bayesian statistics applied to neutron activation data for reactor flux spectrum analysis

    International Nuclear Information System (INIS)

    Chiesa, Davide; Previtali, Ezio; Sisti, Monica

    2014-01-01

    Highlights: • Bayesian statistics to analyze the neutron flux spectrum from activation data. • Rigorous statistical approach for accurate evaluation of the neutron flux groups. • Cross section and activation data uncertainties included for the problem solution. • Flexible methodology applied to analyze different nuclear reactor flux spectra. • The results are in good agreement with the MCNP simulations of neutron fluxes. - Abstract: In this paper, we present a statistical method, based on Bayesian statistics, to analyze the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation experiment performed at the TRIGA Mark II reactor of Pavia University (Italy) in four irradiation positions characterized by different neutron spectra. In order to evaluate the neutron flux spectrum, subdivided into energy groups, a system of linear equations, containing the group effective cross sections and the activation rate data, has to be solved. However, since the system's coefficients are experimental data affected by uncertainties, a rigorous statistical approach is fundamental for an accurate evaluation of the neutron flux groups. For this purpose, we applied Bayesian statistical analysis, which allows the inclusion of the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, was used to define the problem's statistical model and solve it. The first analysis involved the determination of the thermal, resonance-intermediate and fast flux components, and the dependence of the results on the choice of prior distribution was investigated to confirm the reliability of the Bayesian analysis. After that, the main resonances of the activation cross sections were analyzed to implement multi-group models with finer energy subdivisions that would allow determination of the…

  11. Reactor noise analysis by statistical pattern recognition methods

    International Nuclear Information System (INIS)

    Howington, L.C.; Gonzalez, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis is presented. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, updating, and data compacting capabilities. System design emphasizes control of the false-alarm rate. Its abilities to learn normal patterns, to recognize deviations from these patterns, and to reduce the dimensionality of data with minimum error were evaluated by experiments at the Oak Ridge National Laboratory (ORNL) High-Flux Isotope Reactor. Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the pattern recognition system

  12. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  13. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analysis for the Belgian nuclear power plants and for reload compatibility studies, Tractebel Energy Engineering (TEE) has developed a statistical thermal design method to define a 95/95 DNBR criterion, based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In this methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal-hydraulic analysis. The uncertainties of the correlation are represented by statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value, adjusted with penalties and deterministic factors. Finding a realistic value for the SDL is the objective of statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion, or SDL (Statistical Design Limit), in strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
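
    One standard ingredient of such full statistical approaches is the non-parametric (Wilks) sample size for a 95/95 one-sided bound, sketched below; this is generic background, not the whole STDP described above.

    ```python
    # Sketch: first-order non-parametric (Wilks) sample size for a 95/95
    # one-sided tolerance bound, and its use on simulated DNBR runs.
    import numpy as np

    def wilks_n(coverage=0.95, confidence=0.95):
        # smallest n such that the sample extreme bounds the `coverage`
        # quantile with the given confidence: 1 - coverage**n >= confidence
        n = 1
        while 1.0 - coverage**n < confidence:
            n += 1
        return n

    n = wilks_n()
    print(f"first-order Wilks sample size: {n}")   # classic result: 59

    # usage: run n code cases with randomly sampled uncertain inputs,
    # then take the worst (here: minimum) DNBR as the 95/95 bound
    rng = np.random.default_rng(12)
    dnbr_runs = rng.normal(2.0, 0.15, n)           # stand-in code results
    print(f"95/95 DNBR bound from {n} runs: {dnbr_runs.min():.3f}")
    ```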

  14. Analytical and statistical analysis of elemental composition of lichens

    International Nuclear Information System (INIS)

    Calvelo, S.; Baccala, N.; Bubach, D.; Arribere, M.A.; Riberio Guevara, S.

    1997-01-01

    The elemental composition of lichens from remote regions of southern South America has been studied with analytical and statistical techniques to determine whether the values obtained reflect species, growth forms or habitat characteristics. Enrichment factors are calculated separately by species and collection site and compared with data available in the literature. The elemental concentrations are standardized and compared for the different species. The information was statistically processed; a cluster analysis was performed using the first three principal axes of the PCA, and the three groups formed are presented. Their relationship with the species, collection sites and lichen growth forms is interpreted. (author)

  15. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located off the shore of Sanriku (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spreading of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence in the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of both the candlestick chart and the Bollinger Band analysis. These results show a high correlation between patterns in earthquake occurrence and trend analysis by these two statistical methods, and support the appropriateness of applying these financial analysis methods to the analysis of earthquake occurrence.
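
    A minimal sketch of the Bollinger Band construction applied to a monthly earthquake statistic, flagging band convergence; the synthetic catalogue and the usual 20-period/2-sigma settings are assumptions rather than the study's choices:

    ```python
    # Sketch: Bollinger Bands over a monthly event-count series, with band
    # convergence measured by the relative bandwidth.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(13)
    counts = pd.Series(rng.poisson(8, 480))        # monthly event counts, 40 yr

    mid = counts.rolling(20).mean()
    sd = counts.rolling(20).std()
    upper, lower = mid + 2 * sd, mid - 2 * sd

    bandwidth = (upper - lower) / mid              # narrow band = convergence
    converging = bandwidth < bandwidth.quantile(0.05)
    print("months with strong band convergence:",
          list(converging[converging].index[:10]))
    ```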

  16. Parametric analysis of the statistical model of the stick-slip process

    Science.gov (United States)

    Lima, Roberta; Sampaio, Rubens

    2017-06-01

    In this paper, a parametric analysis of the statistical model of the response of a dry-friction oscillator is performed. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-friction force modeled as Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion. The base velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces stochastic stick-slip oscillations in the system. The system response is composed of a random sequence alternating between stick and slip modes. From realizations of the system, a statistical model is constructed for this sequence. In this statistical model, the variables of interest of the sequence are modeled as random variables: for example, the number of time intervals in which stick or slip occurs, the instants at which they begin, and their durations. Samples of the system response are computed by integration of the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated from the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to make a parametric analysis of the statistical model of the stick-slip process.

  17. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background knowledge, the reader will coast through the practical introduction and move on to signal analysis techniques commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical…

  18. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. Ten raters analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8% and 89.9% for visual analysis and 96.9% and 90.9% for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9% and specificity 100%. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  19. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  20. Statistical analysis of extreme values from insurance, finance, hydrology and other fields

    CERN Document Server

    Reiss, Rolf-Dieter

    1997-01-01

    The statistical analysis of extreme data is important for various disciplines, including hydrology, insurance, finance, engineering and environmental sciences. This book provides a self-contained introduction to the parametric modeling, exploratory analysis and statistical inference for extreme values. The entire text of this third edition has been thoroughly updated and rearranged to meet the new requirements. Additional sections and chapters, elaborated on more than 100 pages, are particularly concerned with topics like dependencies, the conditional analysis and the multivariate modeling of extreme data. Parts I–III about the basic extreme value methodology remain largely unchanged, yet notable are, e.g., the new sections about "An Overview of Reduced-Bias Estimation" (co-authored by M.I. Gomes), "The Spectral Decomposition Methodology", and "About Tail Independence" (co-authored by M. Frick), and the new chapter about "Extreme Value Statistics of Dependent Random Variables" (co-authored ...
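
    As a minimal illustration of the block-maxima side of this methodology, the following sketch fits a generalized extreme value (GEV) distribution to simulated annual maxima with scipy and reads off a 100-year return level; the data and parameter values are invented for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Pretend 50 years of 365 daily observations each; keep only the annual maxima.
daily = rng.gumbel(loc=10.0, scale=2.0, size=(50, 365))
annual_maxima = daily.max(axis=1)

# Fit the GEV by maximum likelihood. Note scipy's shape parameter c
# corresponds to -xi in the usual extreme value parameterization.
c, loc, scale = stats.genextreme.fit(annual_maxima)

# The 100-year return level is the 0.99 quantile of the annual maximum.
rl_100 = stats.genextreme.ppf(0.99, c, loc=loc, scale=scale)
print(f"shape={c:.3f}, loc={loc:.2f}, scale={scale:.2f}, 100-yr level={rl_100:.2f}")
```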

  1. Effective Packet Number for 5G IM WeChat Application at Early Stage Traffic Classification

    Directory of Open Access Journals (Sweden)

    Muhammad Shafiq

    2017-01-01

    Accurate network traffic classification at an early stage is very important for 5G network applications. During the last few years, researchers have worked hard to propose effective machine learning models for classifying Internet traffic applications at an early stage with few packets. Nevertheless, this essential problem still needs to be studied profoundly to find an effective packet number as well as an effective machine learning (ML) model. In this paper, we address this problem. For this purpose, five Internet traffic datasets are utilized. Initially, we extract the sizes of the first 20 packets of each flow, and then mutual information analysis is carried out to find the mutual information of each packet with the flow type. Thereafter, we execute 10 well-known machine learning algorithms using a crossover classification method. Two statistical tests, the Friedman and Wilcoxon pairwise tests, are applied to the experimental results. Moreover, we also apply the statistical tests to the classifiers to find the most effective ML classifier. Our experimental results show that 13–19 packets are the effective packet numbers for early stage network traffic classification of the 5G IM WeChat application. We also find that the Random Forest classifier is the most effective classifier for early stage Internet traffic classification.
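
    The pipeline the abstract describes (per-packet mutual information scores followed by a classifier comparison) can be sketched on synthetic data as follows; the flow labels, the packet-size model, and the random forest settings are illustrative assumptions, not the paper's datasets or exact protocol.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_flows, n_packets = 1000, 20
y = rng.integers(0, 2, size=n_flows)                  # 0 = other, 1 = WeChat-like
X = rng.normal(500, 150, size=(n_flows, n_packets))   # packet sizes (bytes)
X[y == 1, 12:19] += 200                               # class signal in packets 13-19

# Score each packet position by its mutual information with the flow label.
mi = mutual_info_classif(X, y, random_state=0)
print("most informative packets:", np.argsort(mi)[::-1][:5] + 1)

# Compare classification accuracy under different packet budgets.
for k in (5, 13, 19):
    acc = cross_val_score(RandomForestClassifier(random_state=0),
                          X[:, :k], y, cv=5).mean()
    print(f"first {k:2d} packets -> CV accuracy {acc:.3f}")
```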

  2. Power flow as a complement to statistical energy analysis and finite element analysis

    Science.gov (United States)

    Cuschieri, J. M.

    1987-01-01

    Present methods of analysis of the structural response and the structure-borne transmission of vibrational energy use either finite element (FE) techniques or statistical energy analysis (SEA) methods. The FE methods are a very useful tool at low frequencies, where the number of resonances involved in the analysis is rather small. On the other hand, SEA methods can predict with acceptable accuracy the response and energy transmission between coupled structures at relatively high frequencies, where the structural modal density is high and a statistical approach is the appropriate solution. In the mid-frequency range, a relatively large number of resonances exist, which makes the finite element method too costly, while SEA methods can only predict an average response level. In this mid-frequency range a possible alternative is to use power flow techniques, where the input and flow of vibrational energy to excited and coupled structural components can be expressed in terms of input and transfer mobilities. The power flow technique can be extended from low to high frequencies and can be integrated with established FE models at low frequencies and SEA models at high frequencies to provide a verification of the method. This method of structural analysis using power flow and mobility methods, and its integration with SEA and FE analysis, is applied to the case of two thin beams joined together at right angles.

  3. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    Science.gov (United States)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  4. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  5. The Bayesian New Statistics: Hypothesis testing, estimation, meta-analysis, and power analysis from a Bayesian perspective.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    In the practice of data analysis, there is a conceptual distinction between hypothesis testing, on the one hand, and estimation with quantified uncertainty on the other. Among frequentists in psychology, a shift of emphasis from hypothesis testing to estimation has been dubbed "the New Statistics" (Cumming 2014). A second conceptual distinction is between frequentist methods and Bayesian methods. Our main goal in this article is to explain how Bayesian methods achieve the goals of the New Statistics better than frequentist methods. The article reviews frequentist and Bayesian approaches to hypothesis testing and to estimation with confidence or credible intervals. The article also describes Bayesian approaches to meta-analysis, randomized controlled trials, and power analysis.
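
    A small worked example of the contrast the authors draw: a frequentist 95% confidence interval versus a Bayesian 95% credible interval for a proportion, the latter under a uniform Beta(1, 1) prior. The counts are invented; this is a sketch of the general idea, not the article's analysis.

```python
import numpy as np
from scipy import stats

successes, n = 27, 60
p_hat = successes / n

# Frequentist: normal-approximation (Wald) confidence interval.
se = np.sqrt(p_hat * (1 - p_hat) / n)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: Beta posterior and its central 95% credible interval.
posterior = stats.beta(1 + successes, 1 + n - successes)
cred = posterior.ppf([0.025, 0.975])

print(f"95% confidence interval: ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"95% credible interval:   ({cred[0]:.3f}, {cred[1]:.3f})")
print(f"posterior P(p > 0.5) = {1 - posterior.cdf(0.5):.3f}")
```

    The posterior also answers direct probability questions (last line), which is the kind of estimation-with-uncertainty output the article argues for.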

  6. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm² s sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. It is outlined that 231 of these proton events are flare related and only 22 of them are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with west flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  7. STATISTICAL ANALYSIS OF THE HEAVY NEUTRAL ATOMS MEASURED BY IBEX

    International Nuclear Information System (INIS)

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-01-01

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 eV and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.

  8. A look at the links between drainage density and flood statistics

    Directory of Open Access Journals (Sweden)

    A. Montanari

    2009-07-01

    We investigate the links between the drainage density of a river basin and selected flood statistics, namely, the mean, standard deviation, coefficient of variation and coefficient of skewness of annual maximum series of peak flows. The investigation is carried out through a three-stage analysis. First, a numerical simulation is performed using a spatially distributed hydrological model in order to highlight how flood statistics change with varying drainage density. Second, a conceptual hydrological model is used in order to analytically derive the dependence of flood statistics on drainage density. Third, real-world data from 44 watersheds located in northern Italy are analysed. The three-stage analysis seems to suggest that a critical value of the drainage density exists for which a minimum is attained in both the coefficient of variation and the absolute value of the skewness coefficient. Such minima in the flood statistics correspond to a minimum of the flood quantile for a given exceedance probability (i.e., recurrence interval). Therefore, the results of this study may provide useful indications for flood risk assessment in ungauged basins.

  9. Finite Element Analysis and Optimization for the Multi-stage Deep Drawing of Molybdenum Sheet

    International Nuclear Information System (INIS)

    Kim, Heung-Kyu; Hong, Seok Kwan; Kang, Jeong Jin; Heo, Young-moo; Lee, Jong-Kil; Jeon, Byung-Hee

    2005-01-01

    Molybdenum, a bcc refractory metal with a melting point of about 2600 °C, has high heat and electrical conductivity. In addition, it remains mechanically strong at high temperatures as well as at low temperatures, and it is therefore a technologically very important material for applications operating at high temperatures. However, because of its low drawability, a multi-stage process is required to make a deep-drawn part from molybdenum sheet. In this study, a multi-stage deep drawing process for a molybdenum circular cup was designed by combining drawing with ironing, which is effective for low-drawability materials. A parametric study by FE analysis of the multi-stage deep drawing was conducted to evaluate the effect of the design variables. Based on the FE analysis results, the multi-stage deep drawing process was parameterized by the design variables, and an optimum process design was obtained by process optimization based on the FE simulation at each stage.

  10. Explorations in statistics: the analysis of ratios and normalized data.

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of Explorations in Statistics explores the analysis of ratios and normalized (or standardized) data. As researchers, we compute a ratio (a numerator divided by a denominator) to obtain a proportion for some biological response or to derive some standardized variable. In each situation, we want to control for differences in the denominator when the thing we really care about is the numerator. But there is peril lurking in a ratio: only if the relationship between numerator and denominator is a straight line through the origin will the ratio be meaningful. If not, the ratio will misrepresent the true relationship between numerator and denominator. In contrast, regression techniques (these include analysis of covariance) are versatile: they can accommodate an analysis of the relationship between numerator and denominator when a ratio is useless.
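
    The peril described above can be made concrete with a short simulation: when the numerator-denominator relationship is a line that does not pass through the origin, the group mean ratios differ even though both groups follow the same relationship, while a regression that controls for the denominator shows no group effect. The data-generating numbers below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
d_a = rng.uniform(1, 3, 200)          # group A has small denominators
d_b = rng.uniform(4, 6, 200)          # group B has large denominators
num_a = 5 + 2 * d_a + rng.normal(0, 0.2, 200)   # same line in both groups,
num_b = 5 + 2 * d_b + rng.normal(0, 0.2, 200)   # intercept 5 (not through origin)

print("mean ratio A:", (num_a / d_a).mean())    # ~ 4.7
print("mean ratio B:", (num_b / d_b).mean())    # ~ 3.0: a spurious 'difference'

# ANCOVA-style regression: numerator ~ intercept + denominator + group indicator.
d = np.concatenate([d_a, d_b])
num = np.concatenate([num_a, num_b])
grp = np.r_[np.zeros(200), np.ones(200)]
X = np.column_stack([np.ones_like(d), d, grp])
beta, *_ = np.linalg.lstsq(X, num, rcond=None)
print("group coefficient:", beta[2])            # ~ 0: no real group effect
```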

  11. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion, and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.
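
    As a minimal sketch of one methodology the book covers, the likelihood ratio criterion, the following code scans for a single change in the mean of an independent Gaussian sequence with known variance; the sequence and the change location are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.r_[rng.normal(0.0, 1.0, 120), rng.normal(1.2, 1.0, 80)]  # shift at k=120
n = len(x)

# With sigma^2 = 1 known, 2*log LR at split k equals the reduction in squared
# error when each segment is given its own mean.
sse_all = np.sum((x - x.mean()) ** 2)
stats_k = np.empty(n - 1)
for k in range(1, n):
    sse_split = np.sum((x[:k] - x[:k].mean()) ** 2) + \
                np.sum((x[k:] - x[k:].mean()) ** 2)
    stats_k[k - 1] = sse_all - sse_split

k_hat = int(np.argmax(stats_k)) + 1
print(f"estimated change point: {k_hat} (true: 120), 2logLR = {stats_k.max():.1f}")
```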

  12. Perceptual and statistical analysis of cardiac phase and amplitude images

    International Nuclear Information System (INIS)

    Houston, A.; Craig, A.

    1991-01-01

    A perceptual experiment was conducted using cardiac phase and amplitude images. Estimates of statistical parameters were derived from the images and the diagnostic potential of human and statistical decisions compared. Five methods were used to generate the images from 75 gated cardiac studies, 39 of which were classified as pathological. The images were presented to 12 observers experienced in nuclear medicine. The observers rated the images using a five-category scale based on their confidence that an abnormality was present. Circular and linear statistics were used to analyse the phase and amplitude image data, respectively. Estimates of the mean, standard deviation (SD), skewness, kurtosis and the first term of the spatial correlation function were evaluated in the region of the left ventricle. A receiver operating characteristic analysis was performed on both sets of data and the human and statistical decisions were compared. For phase images, circular SD was shown to discriminate better between normal and abnormal than experienced observers, but no single statistic discriminated as well as the human observer for amplitude images. (orig.)

  13. Surgical staging and prognosis in serous borderline ovarian tumours (BOT): a subanalysis of the AGO ROBOT study.

    Science.gov (United States)

    Trillsch, F; Mahner, S; Vettorazzi, E; Woelber, L; Reuss, A; Baumann, K; Keyver-Paik, M-D; Canzler, U; Wollschlaeger, K; Forner, D; Pfisterer, J; Schroeder, W; Muenstedt, K; Richter, B; Fotopoulou, C; Schmalfeldt, B; Burges, A; Ewald-Riegler, N; de Gregorio, N; Hilpert, F; Fehm, T; Meier, W; Hillemanns, P; Hanker, L; Hasenburg, A; Strauss, H-G; Hellriegel, M; Wimberger, P; Kommoss, S; Kommoss, F; Hauptmann, S; du Bois, A

    2015-02-17

    Incomplete surgical staging is a negative prognostic factor for patients with borderline ovarian tumours (BOT). However, little is known about the prognostic impact of each individual staging procedure. Clinical parameters of 950 patients with BOT (confirmed by central reference pathology) treated between 1998 and 2008 at 24 German AGO centres were analysed. In 559 patients with serous BOT and adequate ovarian surgery, further recommended staging procedures (omentectomy, peritoneal biopsies, cytology) were evaluated applying Cox regression models with respect to progression-free survival (PFS). For patients with one missing staging procedure, the hazard ratio (HR) for recurrence was 1.25 (95% CI 0.66-2.39; P=0.497). This risk increased with each additional procedure skipped, reaching statistical significance in the case of two (HR 1.95; 95% CI 1.06-3.58; P=0.031) and three missing steps (HR 2.37; 95% CI 1.22-4.64; P=0.011). The most crucial procedure was omentectomy, which retained a statistically significant impact on PFS in multivariable analysis (HR 1.91; 95% CI 1.15-3.19; P=0.013) adjusting for previously established prognostic factors such as FIGO stage, tumour residuals, and fertility preservation. Individual surgical staging procedures contribute to the prognosis of patients with serous BOT. In this analysis, recurrence risk increased with each skipped surgical step. This should be considered when re-staging procedures following incomplete primary surgery are discussed.

  14. Statistical analysis of the count and profitability of air conditioners.

    Science.gov (United States)

    Rady, El Houssainy A; Mohamed, Salah M; Abd Elmegaly, Alaa A

    2018-08-01

    This article presents a statistical analysis of the number and profitability of air conditioners in an Egyptian company. The Kruskal-Wallis test was used to check whether the distribution is the same across the categories of each categorical variable.

  15. Statistical analysis of subjective preferences for video enhancement

    Science.gov (United States)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
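
    A sketch of the kind of analysis the authors advocate: pairwise preferences encoded as signed indicator rows and fitted by binary logistic regression, yielding scale values relative to a reference item. The item count, the true scale values, and the use of statsmodels are illustrative assumptions, not the paper's data or software.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
true_scale = np.array([0.0, 0.8, 1.5, 2.1])   # 4 hypothetical enhancement levels
n_items, n_trials = 4, 2000

# Random pairs; drop self-comparisons.
i = rng.integers(0, n_items, n_trials)
j = rng.integers(0, n_items, n_trials)
keep = i != j
i, j = i[keep], j[keep]

# Preference for the first item follows a logistic law on the scale difference.
p_first = 1 / (1 + np.exp(-(true_scale[i] - true_scale[j])))
y = rng.random(len(i)) < p_first

# Design matrix: +1 for the first item, -1 for the second; drop item 0's
# column so its scale value is anchored at zero (identifiability).
X = np.zeros((len(i), n_items))
X[np.arange(len(i)), i] += 1.0
X[np.arange(len(i)), j] -= 1.0
X = X[:, 1:]

fit = sm.Logit(y.astype(float), X).fit(disp=0)
print("estimated scale values (item 0 = 0):", fit.params.round(2))
```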

  16. Transcriptomic analysis of the late stages of grapevine (Vitis vinifera cv. Cabernet Sauvignon) berry ripening reveals significant induction of ethylene signaling and flavor pathways in the skin.

    Science.gov (United States)

    Cramer, Grant R; Ghan, Ryan; Schlauch, Karen A; Tillett, Richard L; Heymann, Hildegarde; Ferrarini, Alberto; Delledonne, Massimo; Zenoni, Sara; Fasoli, Marianna; Pezzotti, Mario

    2014-12-19

    Grapevine berry, a nonclimacteric fruit, has three developmental stages; the last one is when berry color and sugar increase. Flavors derived from terpenoid and fatty acid metabolism develop at the very end of this ripening stage. The transcriptomic response of the pulp and skin of Cabernet Sauvignon berries in the late stages of ripening, between 22 and 37 °Brix, was assessed using whole-genome microarrays. The transcript abundance of approximately 18,000 genes changed with °Brix and tissue type. There was a large number of changes in many gene ontology (GO) categories involving metabolism, signaling and abiotic stress. GO categories reflecting tissue differences were overrepresented in photosynthesis, isoprenoid metabolism and pigment biosynthesis. Detailed analysis of the interaction of the skin and pulp with °Brix revealed statistically significantly higher abundances of transcripts changing with °Brix in the skin that were involved in ethylene signaling, isoprenoid and fatty acid metabolism. Many transcripts peaked around known optimal fruit stages for flavor production. The transcript abundance of approximately two-thirds of the AP2/ERF superfamily of transcription factors changed during these developmental stages. The transcript abundance of a unique clade of ERF6-type transcription factors showed the largest changes in the skin and clustered with genes involved in ethylene, senescence, and fruit flavor production, including ACC oxidase, terpene synthases, and lipoxygenases. The transcript abundance of important transcription factors involved in fruit ripening was also higher in the skin. A detailed analysis of the transcriptome dynamics during late stages of ripening of grapevine berries revealed that these berries went through massive transcriptional changes in gene ontology categories involving chemical signaling and metabolism in both the pulp and skin, particularly in the skin. Changes in the transcript abundance of genes involved in

  17. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    Science.gov (United States)

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

    A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts and restricted availability of individual-level genotype-phenotype data across the cohorts limit the conduct of multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single study or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  18. Statistical Analysis of the Exchange Rate of Bitcoin.

    Science.gov (United States)

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.
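
    The fitting-and-ranking step described here can be sketched as follows: fit candidate distributions to log-returns by maximum likelihood and compare them by AIC. Because a generalized hyperbolic fit is not reliably available in older scipy releases, the sketch uses a few common heavy-tailed stand-ins; the returns are simulated, not Bitcoin data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
log_returns = stats.t.rvs(df=3, scale=0.03, size=2000, random_state=rng)

candidates = {"normal": stats.norm, "student t": stats.t,
              "laplace": stats.laplace, "logistic": stats.logistic}
for name, dist in candidates.items():
    params = dist.fit(log_returns)                    # maximum likelihood fit
    loglik = dist.logpdf(log_returns, *params).sum()
    aic = 2 * len(params) - 2 * loglik                # penalize parameter count
    print(f"{name:10s}  AIC = {aic:.1f}")
```

    On data like these, the heavier-tailed candidates dominate the normal by a wide AIC margin, which mirrors the kind of comparison the abstract reports.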

  19. Statistical analysis and Monte Carlo simulation of growing self-avoiding walks on percolation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Yuxia [Department of Physics, Wuhan University, Wuhan 430072 (China); Sang Jianping [Department of Physics, Wuhan University, Wuhan 430072 (China); Department of Physics, Jianghan University, Wuhan 430056 (China); Zou Xianwu [Department of Physics, Wuhan University, Wuhan 430072 (China)]. E-mail: xwzou@whu.edu.cn; Jin Zhunzhi [Department of Physics, Wuhan University, Wuhan 430072 (China)

    2005-09-26

    The two-dimensional growing self-avoiding walk on percolation was investigated by statistical analysis and Monte Carlo simulation. We obtained expressions for the mean square displacement and effective exponent as functions of time and percolation probability by statistical analysis, and made a comparison with simulations. We introduced a reduced time to scale the motion of walkers in growing self-avoiding walks on regular and percolation lattices.

  20. General specifications for the development of a USL NASA PC R and D statistical analysis support package

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Bassari, Jinous; Triantafyllopoulos, Spiros

    1984-01-01

    The University of Southwestern Louisiana (USL) NASA PC R and D statistical analysis support package is designed to be a three-level package to allow statistical analysis for a variety of applications within the USL Data Base Management System (DBMS) contract work. The design addresses usage of the statistical facilities as a library package, as an interactive statistical analysis system, and as a batch processing package.

  1. A method for statistical steady state thermal analysis of reactor cores

    International Nuclear Information System (INIS)

    Whetton, P.A.

    1981-01-01

    In a previous publication the author presented a method for undertaking statistical steady state thermal analyses of reactor cores. The present paper extends the technique to an assessment of confidence limits for the resulting probability functions, which define the probability that a given thermal response value will be exceeded in a reactor core. Establishing such confidence limits is considered an integral part of any statistical thermal analysis and essential if such analyses are to be considered in any regulatory process. In certain applications the use of a best estimate probability function may be justifiable, but it is recognised that a demonstrably conservative probability function is required for any regulatory considerations. (orig.)

  2. A statistical test for outlier identification in data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Morteza Khodabin

    2010-09-01

    In the use of peer group data to assess individual, typical or best practice performance, the effective detection of outliers is critical for achieving useful results. In these "deterministic" frontier models, statistical theory is now mostly available. This paper deals with the statistical pared-sample method and its capability of detecting outliers in data envelopment analysis. In the presented method, each observation is deleted from the sample once and the resulting linear program is solved, leading to a distribution of efficiency estimates. Based on the achieved distribution, a pared test is designed to identify the potential outlier(s). We illustrate the method through a real data set. The method could be used in a first step, as an exploratory data analysis, before using any frontier estimation.

  3. Assessing the environmental sustainability of early stage design for bioprocesses under uncertainties: An analysis of glycerol bioconversion

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Cheali, Peam; Posada, John A.

    2016-01-01

    The development of a bio-based economy is seen as a key strategy towards a sustainable society in a world facing climate change, energy security and social distress. However, since substantial uncertainty is involved in early-stage design analyses, the ranking and identification of potential ...; and lastly, (v) rank the alternatives within the design space. Finally, the methodology's applicability is highlighted by screening early-stage glycerol bioconversion routes to value-added chemicals for future biorefinery concepts. Through the proposed methodology, it was demonstrated that the statistical

  4. A two-stage meta-analysis identifies several new loci for Parkinson's disease.

    NARCIS (Netherlands)

    Plagnol, V.; Nalls, M.A.; Bras, J.M.; Hernandez, D.; Sharma, M.; Sheerin, U.M.; Saad, M.; Simon-Sanchez, J.; Schulte, C.; Lesage, S.; Sveinbjornsdottir, S.; Amouyel, P.; Arepalli, S.; Band, G.; Barker, R.A.; Bellinguez, C.; Ben-Shlomo, Y.; Berendse, H.W.; Berg, D; Bhatia, K.P.; Bie, R.M. de; Biffi, A.; Bloem, B.R.; Bochdanovits, Z.; Bonin, M.; Brockmann, K.; Brooks, J.; Burn, D.J.; Charlesworth, G.; Chen, H.; Chinnery, P.F.; Chong, S.; Clarke, C.E.; Cookson, M.R.; Cooper, J.M.; Corvol, J.C.; Counsell, J.; Damier, P.; Dartigues, J.F.; Deloukas, P.; Deuschl, G.; Dexter, D.T.; Dijk, K.D. van; Dillman, A.; Durif, F.; Durr, A.; Edkins, S.; Evans, J.R.; Foltynie, T.; Freeman, C.; Gao, J.; Gardner, M.; Gibbs, J.R.; Goate, A.; Gray, E.; Guerreiro, R.; Gustafsson, O.; Harris, C.; Hellenthal, G.; Hilten, J.J. van; Hofman, A.; Hollenbeck, A.; Holton, J.L.; Hu, M.; Huang, X.; Huber, H; Hudson, G.; Hunt, S.E.; Huttenlocher, J.; Illig, T.; Jonsson, P.V.; Langford, C.; Lees, A.J.; Lichtner, P.; Limousin, P.; Lopez, G.; McNeill, A.; Moorby, C.; Moore, M.; Morris, H.A.; Morrison, K.E.; Mudanohwo, E.; O'Sullivan, S.S; Pearson, J.; Pearson, R.; Perlmutter, J.; Petursson, H.; Pirinen, M.; Polnak, P.; Post, B.; Potter, S.C.; Ravina, B.; Revesz, T.; Riess, O.; Rivadeneira, F.; Rizzu, P.; Ryten, M.; Sawcer, S.J.; Schapira, A.; Scheffer, H.; Shaw, K.; Shoulson, I.; Sidransky, E.; Silva, R. de; Smith, C.; Spencer, C.C.; Stefansson, H.; Steinberg, S.; Stockton, J.D.; Strange, A.; Su, Z.; Talbot, K.; Tanner, C.M.; Tashakkori-Ghanbaria, A.; Tison, F.; Trabzuni, D.; Traynor, B.J.; Uitterlinden, A.G.; Vandrovcova, J.; Velseboer, D.; Vidailhet, M.; Vukcevic, D.; Walker, R.; Warrenburg, B.P.C. van de; Weale, M.E.; Wickremaratchi, M.; Williams, N.; Williams-Gray, C.H.; Winder-Rhodes, S.; Stefansson, K.; Martinez, M.; Donnelly, P.; Singleton, A.B.; Hardy, J.; Heutink, P.; Brice, A.; Gasser, T.; Wood, N.W.

    2011-01-01

    A previous genome-wide association (GWA) meta-analysis of 12,386 PD cases and 21,026 controls conducted by the International Parkinson's Disease Genomics Consortium (IPDGC) discovered or confirmed 11 Parkinson's disease (PD) loci. This first analysis of the two-stage IPDGC study

  5. Accuracy of the One-Stage and Two-Stage Impression Techniques: A Comparative Analysis.

    Science.gov (United States)

    Jamshidy, Ladan; Mozaffari, Hamid Reza; Faraji, Payam; Sharifi, Roohollah

    2016-01-01

    Introduction. One of the main steps of impression making is the selection and preparation of an appropriate tray. Hence, the present study aimed to analyze and compare the accuracy of the one- and two-stage impression techniques. Materials and Methods. A resin laboratory-made model of the first molar was prepared by a standard method for full crowns, with a processed preparation finish line of 1 mm depth and a convergence angle of 3-4°. Impressions were made 20 times with the one-stage technique and 20 times with the two-stage technique using an appropriate tray. To measure the marginal gap, the distance between the restoration margin and the preparation finish line of the plaster dies was determined vertically at the mid mesial, distal, buccal, and lingual (MDBL) regions by a stereomicroscope using a standard method. Results. The results of the independent t-test showed that the mean marginal gap obtained by the one-stage impression technique was higher than that of the two-stage technique. Further, there was no significant difference between the one- and two-stage impression techniques in the mid buccal region, but a significant difference was found between the two techniques in the MDL regions and overall. Conclusion. The findings of the present study indicated higher accuracy for the two-stage impression technique than for the one-stage technique.

  6. Radar Derived Spatial Statistics of Summer Rain. Volume 2; Data Reduction and Analysis

    Science.gov (United States)

    Konrad, T. G.; Kropfli, R. A.

    1975-01-01

    Data reduction and analysis procedures are discussed along with the physical and statistical descriptors used. The statistical modeling techniques are outlined and examples of the derived statistical characterization of rain cells in terms of the several physical descriptors are presented. Recommendations concerning analyses which can be pursued using the data base collected during the experiment are included.

  7. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Science.gov (United States)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-06-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  8. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    International Nuclear Information System (INIS)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-01-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  9. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Energy Technology Data Exchange (ETDEWEB)

    Glascock, M. D.; Neff, H. [University of Missouri, Research Reactor Center (United States); Vaughn, K. J. [Pacific Lutheran University, Department of Anthropology (United States)

    2004-06-15

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  10. Public and patient involvement in quantitative health research: A statistical perspective.

    Science.gov (United States)

    Hannigan, Ailish

    2018-06-19

    The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. To explore the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.

  11. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found.

  12. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said to be in statistical control. Significant deviations between analytical results from different laboratories reveal the presence of systematic errors, and agreement between different laboratories indicates the absence of systematic errors. This statistical approach, referred to as the analysis of precision, was applied

  13. Analysis of factors responsible for the image in early stage emphysema and research concerning the diagnosis

    International Nuclear Information System (INIS)

    Nakanishi, Hirotaka

    1998-01-01

    To clarify the utility of CT images for the clinical diagnosis of early stage emphysema, the relationships of the CT value to the level of lung destruction, changes in lung density, and pulmonary function were examined. An experimental pulmonary emphysema model was produced in dogs by inhalation of aerosolized papain solution. In this model, the relationship between the destruction of lung tissue and the analysis of CT images was investigated. Changes in the alveolar surface area per unit lung volume reflected well those in the mean CT value of the lung parenchyma. It was also clarified that the degree of lung destruction in this model corresponded to that in patients with early stage emphysema. The mean CT value in the area that formed the lowest 5th percentile of the CT value histogram (mCT (5%ile)) was developed to analyze CT images in emphysema. To develop this study, changes of mCT (5%ile) at respiratory levels from 5% to 95% of inspiratory vital capacity (mCT (5%ile (5-95%VC))) were examined. In the experimental studies, there was a statistically significant difference between the control and the emphysema model. In a clinical study of 14 patients with emphysema, mCT (5%ile (5-95%VC)) reflected well the values of pulmonary function tests which indicate air flow limitation, such as %pred. FEV 1.0 and MMF. The present studies demonstrated that it may be useful to detect the pathological and functional impairment of early stage emphysema by using mCT (5%ile (5-95%VC)). (author)

  14. Maintenance based Bevacizumab versus complete stop or continuous therapy after induction therapy in first line treatment of stage IV colorectal cancer: A meta-analysis of randomized clinical trials.

    Science.gov (United States)

    Tamburini, Emiliano; Rudnas, Britt; Santelmo, Carlotta; Drudi, Fabrizio; Gianni, Lorenzo; Nicoletti, Stefania V L; Ridolfi, Claudio; Tassinari, Davide

    2016-08-01

    In stage IV colorectal cancer, bevacizumab-based maintenance therapy, complete stop of therapy, and continuous therapy are all considered possible approaches after first-line induction chemotherapy. However, there are no clear data about which approach is preferable. All randomized phase III trials comparing bevacizumab-based maintenance therapy (MB) with complete stop of therapy (ST) or with continuous therapy (CT) were considered eligible and included in the analysis. The primary endpoint was time to failure of strategy (TFS). Secondary endpoints were overall survival (OS) and progression-free survival (PFS). The meta-analysis was performed in line with the PRISMA statement. In total, 1892 patients from five trials were included in the analysis. A significant improvement in TFS (HR 0.79; 95% CI 0.7-0.9, p=0.0005) and PFS (HR 0.56; 95% CI 0.44-0.71, p<0.00001) was observed in favour of MB versus ST. A trend, although not statistically significant, in favour of MB versus ST was also observed for OS (HR 0.88; 95% CI 0.77-1.01, p=0.08). Comparing maintenance therapy versus continuous therapy, no statistically significant differences were observed in the outcomes evaluated (OS 12 months OR 1.1, p=0.62; OS 24 months OR 1, p=1; OS 36 months OR 0.54, p=0.3; TFS 12 months OR 0.76, p=0.65). Our meta-analysis suggests that the MB approach increases TFS and PFS compared to ST. Although no statistically significant advantage was observed for OS, MB versus ST showed a trend in favour of MB. We observed no difference between MB and CT. MB should be considered the standard regimen in patients with stage IV colorectal cancer after first-line induction therapy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
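
    The pooled hazard ratios quoted above come from standard inverse-variance meta-analysis; a minimal fixed-effect sketch of that computation is given below. The per-trial values are invented placeholders, not the trials analysed in this paper.

```python
import numpy as np

hr = np.array([0.75, 0.85, 0.78, 0.82, 0.76])         # per-trial hazard ratios
ci_upper = np.array([0.95, 1.05, 0.99, 1.08, 0.98])   # per-trial upper 95% CI

log_hr = np.log(hr)
se = (np.log(ci_upper) - log_hr) / 1.96               # back out SE from the CI
w = 1 / se**2                                         # inverse-variance weights

pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
print(f"pooled HR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```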

  15. Statistical analysis of the BOIL program in RSYST-III

    International Nuclear Information System (INIS)

    Beck, W.; Hausch, H.J.

    1978-11-01

    The paper describes a statistical analysis in the RSYST-III program system. Using the example of the BOIL program, it is shown how the effects of inaccurate input data on the output data can be discovered. The existing possibilities of data generation, data handling, and data evaluation are outlined. (orig.) [de

  16. Multivariate statistical analysis of precipitation chemistry in Northwestern Spain

    International Nuclear Information System (INIS)

    Prada-Sanchez, J.M.; Garcia-Jurado, I.; Gonzalez-Manteiga, W.; Fiestras-Janeiro, M.G.; Espada-Rios, M.I.; Lucas-Dominguez, T.

    1993-01-01

    149 samples of rainwater were collected in the proximity of a power station in northwestern Spain at three rainwater monitoring stations. The resulting data are analyzed using multivariate statistical techniques. Firstly, the Principal Component Analysis shows that there are three main sources of pollution in the area (a marine source, a rural source and an acid source). The impact from pollution from these sources on the immediate environment of the stations is studied using Factorial Discriminant Analysis. 8 refs., 7 figs., 11 tabs
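
    The first step reported, a principal component analysis of the rainwater chemistry, can be sketched as follows; the ion list, the three synthetic "sources", and the reuse of the study's sample size of 149 are assumptions mimicking the setting, not the study's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
ions = ["Na", "Cl", "Ca", "NH4", "SO4", "NO3"]
# Two latent sources with distinct ion signatures (marine and acid).
marine = rng.lognormal(0, 1, (149, 1)) * [2.0, 2.3, 0.1, 0.0, 0.2, 0.0]
acid = rng.lognormal(0, 1, (149, 1)) * [0.0, 0.0, 0.1, 1.0, 2.0, 1.5]
X = marine + acid + rng.normal(0, 0.1, (149, 6))

pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
for k, comp in enumerate(pca.components_):
    top = [ions[t] for t in np.argsort(np.abs(comp))[::-1][:3]]
    print(f"PC{k + 1} loads mainly on: {top}")
```

    Inspecting which ions load on each component is what lets the components be read as pollution sources, as the abstract describes.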

  17. Multivariate statistical analysis of precipitation chemistry in Northwestern Spain

    Energy Technology Data Exchange (ETDEWEB)

    Prada-Sanchez, J.M.; Garcia-Jurado, I.; Gonzalez-Manteiga, W.; Fiestras-Janeiro, M.G.; Espada-Rios, M.I.; Lucas-Dominguez, T. (University of Santiago, Santiago (Spain). Faculty of Mathematics, Dept. of Statistics and Operations Research)

    1993-07-01

    149 samples of rainwater were collected in the proximity of a power station in northwestern Spain at three rainwater monitoring stations. The resulting data are analyzed using multivariate statistical techniques. Firstly, the Principal Component Analysis shows that there are three main sources of pollution in the area (a marine source, a rural source and an acid source). The impact from pollution from these sources on the immediate environment of the stations is studied using Factorial Discriminant Analysis. 8 refs., 7 figs., 11 tabs.

  18. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    Science.gov (United States)

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
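
    One of the statistics named above, the 7Q10, can be sketched directly: take the annual minima of the 7-day moving-average flow and estimate the 10-year (0.1 annual probability) quantile from a fitted distribution. The log-normal choice and the synthetic flow record are assumptions for illustration; they are not SWToolbox's internal method.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(11)
days = pd.date_range("1990-01-01", "2019-12-31", freq="D")
flow = pd.Series(np.exp(rng.normal(3.0, 0.6, len(days))), index=days)

seven_day = flow.rolling(7).mean()                     # 7-day moving average
annual_min = seven_day.groupby(seven_day.index.year).min().dropna()

# 10-year recurrence low flow: the 0.1 quantile of the annual-minimum law.
shape, loc, scale = stats.lognorm.fit(annual_min, floc=0)
q7_10 = stats.lognorm.ppf(0.1, shape, loc=loc, scale=scale)
print(f"7Q10 estimate: {q7_10:.1f} (units of the input flows)")
```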

  19. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    Science.gov (United States)

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  20. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    Science.gov (United States)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  1. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    Directory of Open Access Journals (Sweden)

    Sergis Antonis

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large number of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practice with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  2. Accuracy of the One-Stage and Two-Stage Impression Techniques: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Ladan Jamshidy

    2016-01-01

    Introduction. One of the main steps of impression making is the selection and preparation of an appropriate tray. Hence, the present study aimed to analyze and compare the accuracy of the one- and two-stage impression techniques. Materials and Methods. A resin laboratory-made model of the first molar was prepared by a standard method for full crowns, with a processed preparation finish line of 1 mm depth and a convergence angle of 3-4°. Impressions were made 20 times with the one-stage technique and 20 times with the two-stage technique using an appropriate tray. To measure the marginal gap, the distance between the restoration margin and the preparation finish line of the plaster dies was determined vertically at the mid mesial, distal, buccal, and lingual (MDBL) regions by a stereomicroscope using a standard method. Results. The results of the independent t-test showed that the mean marginal gap obtained by the one-stage impression technique was higher than that of the two-stage technique. Further, there was no significant difference between the one- and two-stage impression techniques in the mid buccal region, but a significant difference was found between the two techniques in the MDL regions and overall. Conclusion. The findings of the present study indicated higher accuracy for the two-stage impression technique than for the one-stage technique.

  3. Modeling of asphalt-rubber rotational viscosity by statistical analysis and neural networks

    Directory of Open Access Journals (Sweden)

    Luciano Pivoto Specht

    2007-03-01

    It is of great importance to know a binder's viscosity in order to perform handling, mixing, and application processes, and asphalt mix compaction, in highway surfacing. This paper presents the results of viscosity measurements on asphalt-rubber binders prepared in the laboratory. The binders were prepared varying the rubber content, rubber particle size, and the duration and temperature of mixing, all following a statistical design plan. Statistical analysis and artificial neural networks were used to create mathematical models for predicting the binders' viscosity. The comparison between the experimental data and the results simulated with the generated models showed better performance of the neural network analysis than of the statistical models. The results indicated that the rubber content and duration of mixing have a major influence on the observed viscosity within the considered interval of parameter variation.

  4. Common pitfalls in statistical analysis: Odds versus risk

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh; Pramesh, C. S.

    2015-01-01

    In biomedical research, we are often interested in quantifying the relationship between an exposure and an outcome. “Odds” and “Risk” are the most common terms used as measures of association between variables. In this article, the fourth in the series on common pitfalls in statistical analysis, we explain the meaning of risk and odds and the difference between the two. PMID:26623395
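
    A worked example of the distinction: from a 2x2 table of exposure by outcome, risk and odds (and hence the risk ratio and odds ratio) are computed directly. The counts are invented for illustration.

```python
# 2x2 table: 30/100 exposed and 10/100 unexposed subjects had the event.
exposed_event, exposed_no = 30, 70
unexposed_event, unexposed_no = 10, 90

risk_exp = exposed_event / (exposed_event + exposed_no)          # 0.30
risk_unexp = unexposed_event / (unexposed_event + unexposed_no)  # 0.10
odds_exp = exposed_event / exposed_no                            # 0.429
odds_unexp = unexposed_event / unexposed_no                      # 0.111

print(f"relative risk = {risk_exp / risk_unexp:.2f}")   # 3.00
print(f"odds ratio    = {odds_exp / odds_unexp:.2f}")   # 3.86, not the same
```

    The two measures diverge as the outcome becomes common, which is exactly the pitfall the article warns about.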

  5. Statistical Analysis of the Exchange Rate of Bitcoin.

    Directory of Open Access Journals (Sweden)

    Jeffrey Chu

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.

  6. Statistical Analysis of the Exchange Rate of Bitcoin

    Science.gov (United States)

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702

  7. Analysis of Variance with Summary Statistics in Microsoft® Excel®

    Science.gov (United States)

    Larson, David A.; Hsu, Ko-Cheng

    2010-01-01

    Students regularly are asked to solve Single Factor Analysis of Variance problems given only the sample summary statistics (number of observations per category, category means, and corresponding category standard deviations). Most undergraduate students today use Excel for data analysis of this type. However, Excel, like all other statistical…
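
    The computation the article teaches, single-factor ANOVA from summary statistics alone, fits in a few lines; the three groups below are invented.

        import numpy as np
        from scipy import stats

        n    = np.array([12, 15, 10])       # observations per category
        mean = np.array([5.1, 6.3, 4.8])    # category means
        sd   = np.array([1.2, 1.5, 1.1])    # category standard deviations

        N, k = n.sum(), len(n)
        grand = np.sum(n * mean) / N
        ss_between = np.sum(n * (mean - grand) ** 2)  # between-group SS
        ss_within  = np.sum((n - 1) * sd ** 2)        # within-group SS
        F = (ss_between / (k - 1)) / (ss_within / (N - k))
        p = stats.f.sf(F, k - 1, N - k)
        print(f"F({k - 1}, {N - k}) = {F:.3f}, p = {p:.4f}")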

  8. The Australasian Resuscitation in Sepsis Evaluation (ARISE) trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony P; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-09-01

    The Australasian Resuscitation in Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the emergency department with severe sepsis. In keeping with current practice, and considering aspects of trial design and reporting specific to non-pharmacological interventions, our plan outlines the principles and methods for analysing and reporting the trial results. The document is prepared before completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and before completion of the two related international studies. Our statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. We reviewed the data collected by the research team as specified in the study protocol and detailed in the study case report form. We describe information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation, other related therapies and other relevant data with appropriate comparisons between groups. We define the primary, secondary and tertiary outcomes for the study, with description of the planned statistical analyses. We have developed a statistical analysis plan with a trial profile, mock-up tables and figures. We describe a plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies and adverse events. We describe the primary, secondary and tertiary outcomes with identification of subgroups to be analysed. We have developed a statistical analysis plan for the ARISE study, available in the public domain, before the completion of recruitment into the study. This will minimise analytical bias and

  9. Statistical Analysis Of Tank 19F Floor Sample Results

    International Nuclear Information System (INIS)

    Harris, S.

    2010-01-01

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
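
    A hedged sketch of the UCL95% calculation described here, an upper one-sided confidence limit on a mean concentration built from the sample count, average and standard deviation, is shown below with invented values in place of the Tank 19F results.

        import numpy as np
        from scipy import stats

        results = np.array([1.8, 2.1, 1.9, 2.4, 2.0, 2.2])  # six sample values
        n = len(results)
        mean, sd = results.mean(), results.std(ddof=1)
        t95 = stats.t.ppf(0.95, df=n - 1)       # one-sided 95% t quantile
        ucl95 = mean + t95 * sd / np.sqrt(n)
        print(f"mean = {mean:.3f}, UCL95% = {ucl95:.3f}")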

  10. Vector-field statistics for the analysis of time varying clinical gait data.

    Science.gov (United States)

    Donnelly, C J; Alexander, C; Pataky, T C; Stannage, K; Reid, S; Robinson, M A

    2017-01-01

    In clinical settings, the time varying analysis of gait data relies heavily on the experience of the individual(s) assessing these biological signals. Though three dimensional kinematics are recognised as time varying waveforms (1D), exploratory statistical analysis of these data is commonly carried out with multiple discrete or 0D dependent variables. In the absence of an a priori 0D hypothesis, clinicians are at risk of making type I and II errors in their analysis of time varying gait signatures when statistics are used in concert with preferred subjective clinical assessment methods. The aim of this communication was to determine whether vector field waveform statistics are capable of providing quantitative corroboration of practically significant differences in time varying gait signatures as determined by two clinically trained gait experts. The case study was a left hemiplegic Cerebral Palsy (GMFCS I) gait patient following a botulinum toxin (BoNT-A) injection to their left gastrocnemius muscle. When subjective clinical gait assessments were compared between the two testers, they were in agreement with each other for 61% of the joint degrees of freedom and phases of motion analysed. Testers 1 and 2 were in agreement with the vector-field analysis for 78% and 53% of the kinematic variables analysed, respectively. When the subjective analyses of tester 1 and tester 2 were pooled together and then compared to the vector-field analysis, they were in agreement for 83% of the time varying kinematic variables analysed. These outcomes demonstrate that, in principle, vector-field statistics corroborate what a team of clinical gait experts would classify as practically meaningful pre- versus post-treatment time varying kinematic differences. The potential for vector-field statistics to be used as a useful clinical tool for the objective analysis of time varying clinical gait data is established. Future research is recommended to assess the usefulness of vector-field analyses.

  11. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    Science.gov (United States)

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, so that data analysis results can be obtained in real time. Descriptive statistical methods, time-series analysis and multivariate regression analysis were implemented online with SQL and visual tools on top of the database software. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each data section online, with interactive connection to the database; and generates export sheets that can be read directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring implements statistical analysis online and can provide real-time analysis results to its users.

  12. Introduction to statistics and data analysis with exercises, solutions and applications in R

    CERN Document Server

    Heumann, Christian; Shalabh

    2016-01-01

    This introductory statistics textbook conveys the essential concepts and tools needed to develop and nurture statistical thinking. It presents descriptive, inductive and explorative statistical methods and guides the reader through the process of quantitative data analysis. In the experimental sciences and interdisciplinary research, data analysis has become an integral part of any scientific study. Issues such as judging the credibility of data, analyzing the data, evaluating the reliability of the obtained results and finally drawing the correct and appropriate conclusions from the results are vital. The text is primarily intended for undergraduate students in disciplines like business administration, the social sciences, medicine, politics, macroeconomics, etc. It features a wealth of examples, exercises and solutions with computer code in the statistical programming language R as well as supplementary material that will enable the reader to quickly adapt all methods to their own applications.

  13. A Two-Stage Meta-Analysis Identifies Several New Loci for Parkinson's Disease

    NARCIS (Netherlands)

    Plagnol, Vincent; Nalls, Michael A.; Bras, Jose M.; Hernandez, Dena G.; Sharma, Manu; Sheerin, Una-Marie; Saad, Mohamad; Simon-Sanchez, Javier; Schulte, Claudia; Lesage, Suzanne; Sveinbjornsdottir, Sigurlaug; Amouyel, Philippe; Arepalli, Sampath; Band, Gavin; Barker, Roger A.; Bellinguez, Celine; Ben-Shlomo, Yoav; Berendse, Henk W.; Berg, Daniela; Bhatia, Kailash; de Bie, Rob M. A.; Biffi, Alessandro; Bloem, Bas; Bochdanovits, Zoltan; Bonin, Michael; Brockmann, Kathrin; Brooks, Janet; Burn, David J.; Charlesworth, Gavin; Chen, Honglei; Chinnery, Patrick F.; Chong, Sean; Clarke, Carl E.; Cookson, Mark R.; Cooper, J. Mark; Corvol, Jean Christophe; Counsell, Carl; Damier, Philippe; Dartigues, Jean-Francois; Deloukas, Panos; Deuschl, Guenther; Dexter, David T.; van Dijk, Karin D.; Dillman, Allissa; Durif, Frank; Duerr, Alexandra; Edkins, Sarah; Evans, Jonathan R.; Foltynie, Thomas; Freeman, Colin; Gao, Jianjun; Gardner, Michelle; Gibbs, J. Raphael; Goate, Alison; Gray, Emma; Guerreiro, Rita; Gustafsson, Omar; Harris, Clare; Hellenthal, Garrett; van Hilten, Jacobus J.; Hofman, Albert; Hollenbeck, Albert; Holton, Janice; Hu, Michele; Huang, Xuemei; Huber, Heiko; Hudson, Gavin; Hunt, Sarah E.; Huttenlocher, Johanna; Illig, Thomas; Jonsson, Palmi V.; Langford, Cordelia; Lees, Andrew; Lichtner, Peter; Limousin, Patricia; Lopez, Grisel; Lorenz, Delia; McNeill, Alisdair; Moorby, Catriona; Moore, Matthew; Morris, Huw; Morrison, Karen E.; Mudanohwo, Ese; O'Sullivan, Sean S.; Pearson, Justin; Pearson, Richard; Perlmutter, Joel S.; Petursson, Hjoervar; Pirinen, Matti; Pollak, Pierre; Post, Bart; Potter, Simon; Ravina, Bernard; Revesz, Tamas; Riess, Olaf; Rivadeneira, Fernando; Rizzu, Patrizia; Ryten, Mina; Sawcer, Stephen; Schapira, Anthony; Scheffer, Hans; Shaw, Karen; Shoulson, Ira; Sidransky, Ellen; de Silva, Rohan; Smith, Colin; Spencer, Chris C. A.; Stefansson, Hreinn; Steinberg, Stacy; Stockton, Joanna D.; Strange, Amy; Su, Zhan; Talbot, Kevin; Tanner, Carlie M.; Tashakkori-Ghanbaria, Avazeh; Tison, Francois; Trabzuni, Daniah; Traynor, Bryan J.; Uitterlinden, Andre G.; Vandrovcova, Jana; Velseboer, Daan; Vidailhet, Marie; Vukcevic, Damjan; Walker, Robert; van de Warrenburg, Bart; Weale, Michael E.; Wickremaratchi, Mirdhu; Williams, Nigel; Williams-Gray, Caroline H.; Winder-Rhodes, Sophie; Stefansson, Kari; Martinez, Maria; Donnelly, Peter; Singleton, Andrew B.; Hardy, John; Heutink, Peter; Brice, Alexis; Gasser, Thomas; Wood, Nicholas W.

    2011-01-01

    A previous genome-wide association (GWA) meta-analysis of 12,386 PD cases and 21,026 controls conducted by the International Parkinson's Disease Genomics Consortium (IPDGC) discovered or confirmed 11 Parkinson's disease (PD) loci. This first analysis of the two-stage IPDGC study focused on the set

  14. Aerodynamic Analysis and Three-Dimensional Redesign of a Multi-Stage Axial Flow Compressor

    Directory of Open Access Journals (Sweden)

    Tao Ning

    2016-04-01

    Full Text Available This paper describes the introduction of three-dimensional (3-D) blade designs into a 5-stage axial compressor using multi-stage computational fluid dynamics (CFD) methods. Prior to the redesign, a validation study is conducted for the overall performance and flow details based on full-scale test data, showing that the multi-stage CFD applied is a relatively reliable tool for the analysis of the subsequent redesign. Furthermore, at the near-stall point, the aerodynamic analysis demonstrates that significant separation exists in the last stator, motivating an aerodynamic redesign focused on the last stator. Multi-stage CFD methods are applied throughout the three-dimensional redesign process for the last stator to explore their potential for aerodynamic improvement. An unconventional asymmetric bow configuration, incorporated with leading edge re-camber and re-solidity, is employed to reduce the high-loss region dominated by the mainstream. The final redesigned version produces a 13% increase in the stall margin while maintaining the efficiency at the design point.

  15. Process analysis and mechanism of multi-stage hydropyrolysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Li, W.; Wang, N.; Li, B.Q. [Chinese Academy of Science, Taiyuan (China). Inst. of Coal Chemistry, State Key Laboratory of Coal Conversion

    2002-07-01

    The mechanism of multi-stage hydropyrolysis of coal was probed through detailed analysis of the products of hydropyrolysis with different holding methods. The results showed that the holding method significantly affects the product distributions, thus making an apparent difference in hydrogen utilization efficiency. The holding temperature should be about 350-500°C, the range in which more free radicals are rapidly produced. Pore-rich structures are formed during the holding stage at 350°C due to the evolution of large amounts of volatiles, which is favorable to the subsequent hydrogenation reaction. Holding at a low temperature favors the reaction of hydrogen with oxygen-containing groups, leading to the formation of phenol and avoiding the formation of water at high temperature. The cleavage of chemical bonds in the char depends mainly on the pyrolysis temperature. The effect of the holding stage is to change the distribution and components of the products by stabilizing the free radicals and hydrogenating the heavier products.

  16. Performance of an iterative two-stage bayesian technique for population pharmacokinetic analysis of rich data sets

    NARCIS (Netherlands)

    Proost, Johannes H.; Eleveld, Douglas J.

    2006-01-01

    Purpose. To test the suitability of an Iterative Two-Stage Bayesian (ITSB) technique for population pharmacokinetic analysis of rich data sets, and to compare ITSB with Standard Two-Stage (STS) analysis and nonlinear Mixed Effect Modeling (MEM). Materials and Methods. Data from a clinical study with

  17. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system

  18. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1975-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system. 19 references

  19. RESEARCH OF THE DATA BANK OF STATISTICAL ANALYSIS OF THE ADVERTISING MARKET

    Directory of Open Access Journals (Sweden)

    Ekaterina F. Devochkina

    2014-01-01

    Full Text Available The article describes the process of statistical accounting for the Russian advertising market. The author reviews the forms of state statistical accounting used in different years, noting their distinctive features and shortcomings. The article also contains an analysis of alternative sources of numerical information on the Russian advertising market.

  20. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  1. Halo statistics analysis within medium volume cosmological N-body simulation

    Directory of Open Access Journals (Sweden)

    Martinović N.

    2015-01-01

    Full Text Available In this paper we present a halo statistics analysis of a ΛCDM N-body cosmological simulation (from first halo formation until z = 0). We study the mean major merger rate as a function of time, considering both per-redshift and per-Gyr dependence. For the latter we find that it scales as the well known power law (1 + z)^n, for which we obtain n = 2.4. The halo mass function and halo growth function are derived and compared with both analytical and empirical fits. We analyse halo growth throughout the entire simulation, making it possible to continuously monitor the evolution of the halo number density within given mass ranges. The halo formation redshift is studied, exploring the possibility of a new simple preliminary analysis during the simulation run. A visualization of the simulation is portrayed as well. At redshifts z = 0-7 the halos from the simulation have good statistics for further analysis, especially in the mass range of 10^11 - 10^14 M⊙/h. [Project 176021: 'Visible and invisible matter in nearby galaxies: theory and observations']
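
    Recovering the exponent of a (1 + z)^n merger-rate law, as reported in this record, reduces to a straight-line fit in log-log space; the sketch below uses synthetic points generated with n = 2.4 rather than the simulation's data.

        import numpy as np

        rng = np.random.default_rng(2)
        z = np.linspace(0.1, 6.0, 25)
        rate = 0.05 * (1 + z) ** 2.4 * np.exp(rng.normal(0, 0.05, z.size))

        # log(rate) = n * log(1 + z) + log(amplitude): a linear fit
        n_fit, log_amp = np.polyfit(np.log1p(z), np.log(rate), 1)
        print(f"fitted exponent n = {n_fit:.2f}")   # recovers ~2.4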

  2. Practical Statistics for Environmental and Biological Scientists

    CERN Document Server

    Townend, John

    2012-01-01

    All students and researchers in environmental and biological sciences require statistical methods at some stage of their work. Many have a preconception that statistics are difficult and unpleasant and find that the textbooks available are difficult to understand. Practical Statistics for Environmental and Biological Scientists provides a concise, user-friendly, non-technical introduction to statistics. The book covers planning and designing an experiment, how to analyse and present data, and the limitations and assumptions of each statistical method. The text does not refer to a specific comp

  3. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most recent updates.

  4. Birth tourism: socio-demographic and statistical aspects

    Directory of Open Access Journals (Sweden)

    Anatoly V. Korotkov

    2016-01-01

    Full Text Available The purpose of the study is to research the issue of birth tourism. The article presents the socio-demographic and statistical aspects of research problems concerning inbound birth tourism in the Russian Federation. The literature analysis shows that the degree to which birth tourism has been studied lags behind its actual scale. Currently, the media have accumulated a significant amount of information on birth tourism in Russia that requires processing, systematization and understanding, and that can and should become an independent area of study for sociologists and demographers in order to develop recommendations for the management of socio-demographic processes in birth tourism in our country. It is necessary to identify the problems that will inevitably arise. At present, this process is almost unregulated. These problems are complex and require the joint efforts of sociologists and demographers. However, it is impossible to obtain reliable results and to develop management decisions without attention to the statistical aspect of this problem. It is necessary to create methodological support for collecting and processing information and for developing models of birth tourism. At the initial stage it is necessary to identify the direction and objectives of the analysis, to determine the factors in the development of this process, to develop a hierarchical system of statistical indicators, and to obtain the information needed for calculating specific indicators. Comprehensive research into birth tourism issues should be based on the methodology of sociology, demography and statistics, including statistical observation, interviews with residents, analysis of the structure and concentration of birth tourism in the country, analysis of the dynamics, classification of factors and reasons, the grouping of regions by the development of the studied processes and, of course, the development of economic-statistical indicators. The article reveals the problem of the significant influence of the

  5. Dental and Chronological Ages as Determinants of Peak Growth Period and Its Relationship with Dental Calcification Stages.

    Science.gov (United States)

    Litsas, George; Lucchese, Alessandra

    2016-01-01

    To investigate the relationship between dental age, chronological age, and cervical vertebral maturation during the peak growth period, and to study the association between dental calcification phases and skeletal maturity stages during the same period. Subjects were selected from an orthodontic pre-treatment cohort of 420 subjects, of whom 255 (145 girls and 110 boys) were identified and enrolled in the study. Lateral cephalometric and panoramic radiographs were examined from the archives of the Department of Orthodontics, Aristotle University of Thessaloniki, Greece. Dental age was assessed according to the method of Demirjian, and skeletal maturation according to the Cervical Vertebral Maturation (CVM) Method. Statistical elaboration included the Spearman Brown formula, descriptive statistics, Pearson's correlation coefficient and regression analysis, paired samples t-test, and Spearman's rho correlation coefficient. Chronological and dental age showed a high correlation for both genders (r = 0.741 for boys, r = 0.770 for girls). The correlation was highest at CVM Stage IV for both males (r = 0.554) and females (r = 0.68), and lowest at CVM Stage III in males (r = 0.433) and at Stage II in females (r = 0.393). The t-test revealed statistically significant differences between these variables. The association between dental calcification stages and CVM stages was also determined: the second molars showed the highest correlation with CVM stages (CVMS) (r = 0.65 for boys, r = 0.72 for girls). Dental age was more advanced than chronological age for both boys and girls at all CVMS, and during the peak period these differences were more pronounced. Moreover, all correlations between skeletal and dental stages were statistically significant; the second molars showed the highest correlation, whereas the canines showed the lowest, for both genders.

  6. A coupling model for the two-stage core calculation method with subchannel analysis for boiling water reactors

    International Nuclear Information System (INIS)

    Mitsuyasu, Takeshi; Aoyama, Motoo; Yamamoto, Akio

    2017-01-01

    Highlights: • A coupling model of the two-stage core calculation with subchannel analysis. • BWR fuel assembly parameters are assumed and verified. • The model was evaluated for heterogeneous problems. - Abstract: The two-stage core analysis method is widely used for BWR core analysis. The purpose of this study is to develop a core analysis model coupled with subchannel analysis within the two-stage calculation scheme, which uses an assembly-based thermal-hydraulics calculation in the core analysis. The model modifies the 2D lattice physics scheme and couples it with a 3D subchannel analysis that evaluates the thermal-hydraulic characteristics of the coolant flow area divided into subchannel regions. In order to couple these two analyses, some BWR fuel assembly parameters are assumed and verified. The developed model is evaluated on a heterogeneous problem with and without a control rod, and is especially effective for the control-rod-inserted condition. The present model can incorporate the subchannel effect into the current two-stage core calculation method.

  7. Automatic staging of bladder cancer on CT urography

    Science.gov (United States)

    Garapati, Sankeerth S.; Hadjiiski, Lubomir M.; Cha, Kenny H.; Chan, Heang-Ping; Caoili, Elaine M.; Cohan, Richard H.; Weizer, Alon; Alva, Ajjai; Paramagul, Chintana; Wei, Jun; Zhou, Chuan

    2016-03-01

    Correct staging of bladder cancer is crucial for the decision of neoadjuvant chemotherapy treatment and for minimizing the risk of under- or over-treatment. Subjectivity and variability of clinicians in utilizing available diagnostic information may lead to inaccuracy in staging bladder cancer. An objective decision support system that merges the information in a predictive model based on statistical outcomes of previous cases and machine learning may assist clinicians in making more accurate and consistent staging assessments. In this study, we developed a preliminary method to stage bladder cancer. With IRB approval, 42 bladder cancer cases with CTU scans were collected from patient files. The cases were classified into two classes based on pathological stage T2, which is the clinical decision threshold for neoadjuvant chemotherapy treatment (i.e., chemotherapy is indicated for stage >= T2). There were 21 cancers below stage T2 and 21 cancers at stage T2 or above. All 42 lesions were automatically segmented using our auto-initialized cascaded level sets (AI-CALS) method. Morphological features were extracted, then selected and merged by a linear discriminant analysis (LDA) classifier. A leave-one-case-out resampling scheme was used to train and test the classifier using the 42 lesions. The classification accuracy was quantified using the area under the ROC curve (Az). The average training Az was 0.97 and the test Az was 0.85. The classifier consistently selected the lesion volume, a gray level feature and a contrast feature. This predictive model shows promise for assisting in assessing the bladder cancer stage.

  8. Information and Analysis System Stages of Family Welfare in District Balong

    Directory of Open Access Journals (Sweden)

    Eka Arynda Ayu

    2017-05-01

    Full Text Available According to the Badan Kependudukan dan Keluarga Berencana Nasional (BKKBN), a prosperous family is one formed through legal marriage that is able to meet decent spiritual and material needs, is devoted to God Almighty, and maintains harmonious and balanced relationships among its members and between the family, the community and the environment. Every year the government collects data on family welfare stages for use in development and poverty alleviation programs. The data collection process in Balong District is still carried out manually, so errors in determining a family's welfare stage can occur. The web-based information and analysis system for family welfare stages is designed so that officers can input data and determine a family's welfare stage based on the indicators selected from the R/1/KS sheet. Sample data are taken from Bulukidul and Ngraket villages in Balong District for 2014. The system outputs reports on the staging results, which can also be viewed in graphical form. The comparison chart shows that the poor welfare stages account for the highest percentage, while the well-off or rich family stages account for the lowest.

  9. Classification of Grassland Successional Stages Using Airborne Hyperspectral Imagery

    Directory of Open Access Journals (Sweden)

    Thomas Möckel

    2014-08-01

    Full Text Available Plant communities differ in their species composition, and, thus, also in their functional trait composition, at different stages in the succession from arable fields to grazed grassland. We examine whether aerial hyperspectral (414-2501 nm) remote sensing can be used to discriminate between grazed vegetation belonging to different grassland successional stages. Vascular plant species were recorded in 104 1-m2 plots on the island of Öland (Sweden), and the functional properties of the plant species recorded in the plots were characterized in terms of the ground-cover of grasses, specific leaf area and Ellenberg indicator values. Plots were assigned to three different grassland age-classes, representing 5-15, 16-50 and >50 years of grazing management. Partial least squares discriminant analysis models were used to compare classifications based on aerial hyperspectral data with the age-class classification. The remote sensing data successfully classified the plots into age-classes: the overall classification accuracy was higher for a model based on a pre-selected set of wavebands (85%, Kappa statistic value = 0.77) than for one using the full set of wavebands (77%, Kappa statistic value = 0.65). Our results show that nutrient availability and grass cover differences between grassland age-classes are detectable by spectral imaging. These techniques may potentially be used for mapping the spatial distribution of grassland habitats at different successional stages.
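
    The classification step reported above can be sketched as follows, assuming random stand-in spectra and a simple one-hot PLS discriminant scheme in scikit-learn; the study's waveband pre-selection and exact model settings are not reproduced.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score, cohen_kappa_score
        from sklearn.preprocessing import LabelBinarizer

        rng = np.random.default_rng(3)
        X = rng.normal(size=(104, 50))          # plots x wavebands (synthetic)
        y = rng.integers(0, 3, size=104)        # three grassland age-classes
        X += y[:, None] * 0.5                   # inject separable structure

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y,
                                                  random_state=0)
        lb = LabelBinarizer().fit(y_tr)
        pls = PLSRegression(n_components=5).fit(X_tr, lb.transform(y_tr))
        y_hat = lb.classes_[np.argmax(pls.predict(X_te), axis=1)]
        print("accuracy:", accuracy_score(y_te, y_hat))
        print("kappa:   ", cohen_kappa_score(y_te, y_hat))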

  10. Single-stage laparoscopic common bile duct exploration and cholecystectomy versus two-stage endoscopic stone extraction followed by laparoscopic cholecystectomy for patients with gallbladder stones with common bile duct stones: systematic review and meta-analysis of randomized trials with trial sequential analysis.

    Science.gov (United States)

    Singh, Anand Narayan; Kilambi, Ragini

    2018-03-30

    The ideal management of common bile duct (CBD) stones associated with gall stones is a matter of debate. We planned a meta-analysis of randomized trials comparing single-stage laparoscopic CBD exploration and cholecystectomy (LCBDE) with two-stage preoperative endoscopic stone extraction followed by cholecystectomy (ERCP + LC). We searched the Pubmed/Medline, Web of Science, Science Citation Index, Google Scholar and Cochrane Central Register of Controlled Trials electronic databases up to June 2017 for all English-language randomized trials comparing the two approaches. Statistical analysis was performed using Review Manager (RevMan) Version 5.3 (The Nordic Cochrane Centre, The Cochrane Collaboration, Copenhagen, 2014), and results were expressed as odds ratios for dichotomous variables and mean differences for continuous ones. A p value ≤ 0.05 was considered significant. Trial sequential analysis (TSA) was performed using TSA version 0.9.5.5 (The Copenhagen Trial Unit, Centre for Clinical Intervention Research, Copenhagen, 2016). The PROSPERO trial registration number is CRD42017074673. A total of 11 trials were included in the analysis, with a total of 1513 patients (751 LCBDE; 762 ERCP + LC). LCBDE was found to have significantly lower rates of technical failure [OR 0.59, 95% CI (0.38, 0.93), p = 0.02] and shorter hospital stay [MD -1.63, 95% CI (-3.23, -0.03), p = 0.05]. There was no significant difference in mortality [OR 0.37, 95% CI (0.09, 1.51), p = 0.17], morbidity [OR 0.97, 95% CI (0.70, 1.33), p = 0.84], cost [MD -379.13, 95% CI (-784.80, 111.2), p = 0.13] or recurrent/retained stones [OR 1.01, 95% CI (0.38, 2.73), p = 0.98]. TSA showed that although the Z-curve crossed the boundaries of conventional significance, the estimated information size is yet to be achieved. Single-stage LCBDE is superior to ERCP + LC in terms of technical success and shorter hospital stay in good-risk patients with
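
    The pooled odds ratios quoted above come from standard inverse-variance weighting of log odds ratios; the following sketch reimplements that machinery on three invented trials (fixed-effect, for brevity), not the review's actual data.

        import numpy as np
        from scipy import stats

        # Each row: events and total in LCBDE arm, events and total in ERCP+LC arm
        trials = np.array([[5, 70, 12, 72],
                           [3, 55,  7, 60],
                           [8, 90, 11, 88]])
        a, n1, c, n2 = trials.T
        b, d = n1 - a, n2 - c
        log_or = np.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d      # Woolf variance of log OR
        w = 1 / var
        pooled = np.sum(w * log_or) / np.sum(w)
        se = 1 / np.sqrt(np.sum(w))
        lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
        p = 2 * stats.norm.sf(abs(pooled / se))
        print(f"pooled OR = {np.exp(pooled):.2f} "
              f"[{np.exp(lo):.2f}, {np.exp(hi):.2f}], p = {p:.3f}")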

  11. Short-run and Current Analysis Model in Statistics

    Directory of Open Access Journals (Sweden)

    Constantin Anghelache

    2006-01-01

    Full Text Available Using the short-run statistical indicators is a compulsory requirement implied in current analysis. Therefore, a system of EUROSTAT short-run indicators has been set up in this respect and is recommended for use by the member countries. On the basis of these indicators, regular, usually monthly, analyses are carried out in respect of: the determination of production dynamics; the evaluation of the short-run investment volume; the development of the turnover; the wage evolution; the employment; the price indexes and the consumer price index (inflation); the volume of exports and imports, the extent to which imports are covered by exports, and the trade balance. The EUROSTAT system of indicators of conjuncture is conceived as an open system, so that it can be extended or restricted at any moment, allowing indicators to be amended or even removed, depending on the domestic users' requirements as well as on the specific requirements of harmonization and integration. For short-run analysis there is also the World Bank system of indicators of conjuncture, which relies on the data sources offered by the World Bank, the World Institute for Resources and the statistics of other international organizations. The system comprises indicators of social and economic development and focuses on indicators in the following three fields: human resources, environment and economic performances. At the end of the paper, there is a case study on the situation of Romania, for which all these indicators were used.

  12. Short-run and Current Analysis Model in Statistics

    Directory of Open Access Journals (Sweden)

    Constantin Mitrut

    2006-03-01

    Full Text Available Using the short-run statistical indicators is a compulsory requirement implied in current analysis. Therefore, a system of EUROSTAT short-run indicators has been set up in this respect and is recommended for use by the member countries. On the basis of these indicators, regular, usually monthly, analyses are carried out in respect of: the determination of production dynamics; the evaluation of the short-run investment volume; the development of the turnover; the wage evolution; the employment; the price indexes and the consumer price index (inflation); the volume of exports and imports, the extent to which imports are covered by exports, and the trade balance. The EUROSTAT system of indicators of conjuncture is conceived as an open system, so that it can be extended or restricted at any moment, allowing indicators to be amended or even removed, depending on the domestic users' requirements as well as on the specific requirements of harmonization and integration. For short-run analysis there is also the World Bank system of indicators of conjuncture, which relies on the data sources offered by the World Bank, the World Institute for Resources and the statistics of other international organizations. The system comprises indicators of social and economic development and focuses on indicators in the following three fields: human resources, environment and economic performances. At the end of the paper, there is a case study on the situation of Romania, for which all these indicators were used.

  13. Statistical analysis of proteomics, metabolomics, and lipidomics data using mass spectrometry

    CERN Document Server

    Mertens, Bart

    2017-01-01

    This book presents an overview of computational and statistical design and analysis of mass spectrometry-based proteomics, metabolomics, and lipidomics data. This contributed volume provides an introduction to the special aspects of statistical design and analysis with mass spectrometry data for the new omic sciences. The text discusses common aspects of design and analysis between and across all (or most) forms of mass spectrometry, while also providing special examples of application with the most common forms of mass spectrometry. Also covered are applications of computational mass spectrometry not only in clinical study but also in the interpretation of omics data in plant biology studies. Omics research fields are expected to revolutionize biomolecular research by the ability to simultaneously profile many compounds within either patient blood, urine, tissue, or other biological samples. Mass spectrometry is one of the key analytical techniques used in these new omic sciences. Liquid chromatography mass ...

  14. Market-stage analysis enhances strategic planning.

    Science.gov (United States)

    McDonald, R B

    1998-07-01

    Changing market conditions are challenging healthcare organizations to determine how to allocate resources and make operational planning decisions to prepare for future changes. A vital part of meeting these challenges is understanding the impact of market stages, and using that knowledge to build effective business strategies. Financial modeling that includes market-stage information provides insight into market opportunities and presents a clearer picture of the organizational changes that will need to be implemented at each stage. Effective strategic action should take into account critical success factors in market responsiveness, organizational responsiveness, operational effectiveness, and financial strength.

  15. Three-Dimensional Assembly Tolerance Analysis Based on the Jacobian-Torsor Statistical Model

    Directory of Open Access Journals (Sweden)

    Peng Heping

    2017-01-01

    Full Text Available The unified Jacobian-Torsor model has been developed for deterministic (worst case) tolerance analysis. This paper presents a comprehensive model for performing statistical tolerance analysis by integrating the unified Jacobian-Torsor model and Monte Carlo simulation. In this model, an assembly is sub-divided into surfaces, and the Small Displacements Torsor (SDT) parameters are used to express the relative position between any two surfaces of the assembly. Then, a 3D dimension-chain is created by using a surface graph of the assembly, and the unified Jacobian-Torsor model is developed based on the effect of each functional element on the whole functional requirements of the product. Finally, Monte Carlo simulation is implemented for the statistical tolerance analysis. A numerical example is given to demonstrate the capability of the proposed method in handling three-dimensional assembly tolerance analysis.
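
    A minimal Monte Carlo stack-up in the spirit of the statistical stage described above, using a plain linear gap chain rather than the Jacobian-Torsor formulation itself; all nominals, tolerances and spec limits below are invented.

        import numpy as np

        rng = np.random.default_rng(4)
        N = 100_000
        # Dimensions ~ Normal(nominal, tol/3), an assumed +/-3-sigma convention
        housing = rng.normal(50.00, 0.05 / 3, N)
        part_a  = rng.normal(20.00, 0.04 / 3, N)
        part_b  = rng.normal(29.85, 0.04 / 3, N)

        gap = housing - part_a - part_b          # functional requirement
        lo, hi = 0.05, 0.25                      # assumed spec limits on the gap
        print(f"gap mean = {gap.mean():.4f}, std = {gap.std():.4f}")
        print(f"P(out of spec) = {np.mean((gap < lo) | (gap > hi)):.4%}")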

  16. SAS and R data management, statistical analysis, and graphics

    CERN Document Server

    Kleinman, Ken

    2009-01-01

    An All-in-One Resource for Using SAS and R to Carry out Common Tasks. Provides a path between languages that is easier than reading complete documentation. SAS and R: Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in both SAS and R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation. The book covers many common tasks, such as data management, descriptive summaries, inferential procedures, regression analysis, and the creation of graphics, along with more complex applicat

  17. Statistical methods for data analysis in particle physics

    CERN Document Server

    AUTHOR|(CDS)2070643

    2015-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools needed to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder of advanced undergraduate studies, but also with a view to clearly distinguishing the Frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits, as many applications in HEP concern hypothesis testing, where often the main goal is to provide better and better limits so as to eventually be able to distinguish between competing hypotheses or to rule out some of them altogether. Many worked examples will help newcomers to the field and graduate students to understand the pitfalls in applying theoretical concepts to actual data

  18. Statistical physics

    CERN Document Server

    Guénault, Tony

    2007-01-01

    In this revised and enlarged second edition of an established text, Tony Guénault provides a clear and refreshingly readable introduction to statistical physics, an essential component of any first degree in physics. The treatment itself is self-contained and concentrates on an understanding of the physical ideas, without requiring a high level of mathematical sophistication. A straightforward quantum approach to statistical averaging is adopted from the outset (easier, the author believes, than the classical approach). The initial part of the book is geared towards explaining the equilibrium properties of a simple isolated assembly of particles. Thus, several important topics, for example an ideal spin-½ solid, can be discussed at an early stage. The treatment of gases gives full coverage to Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics. Towards the end of the book the student is introduced to a wider viewpoint and new chapters are included on chemical thermodynamics, interactions in, for exam...

  19. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    Science.gov (United States)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. Daily rainfall data for a period of 30 years are used to understand the normal, deficit, excess and seasonal rainfall of the selected circle headquarters. Further, the various plotting position formulae available are used to evaluate the return periods of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The analysis also paved the way to determine the proper onset and withdrawal of the monsoon, information which is used for land preparation and sowing.
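
    The return-period evaluation mentioned above can be illustrated with the Weibull plotting position T = (n + 1)/m, one of the commonly compared formulae; the annual totals below are invented.

        import numpy as np

        annual = np.array([780, 910, 650, 1020, 870, 730, 990, 840, 700, 950])
        n = annual.size
        order = np.argsort(annual)[::-1]       # descending: rank 1 = largest
        m = np.empty(n, dtype=int)
        m[order] = np.arange(1, n + 1)
        T = (n + 1) / m                        # return period in years
        for value, t in sorted(zip(annual, T), reverse=True):
            print(f"{value:5d} mm  ->  T = {t:4.1f} yr")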

  20. 14 CFR 298.61 - Reporting of traffic statistics.

    Science.gov (United States)

    2010-01-01

    Title 14 Aeronautics and Space, § 298.61 Reporting of traffic statistics. (a) Each commuter air carrier and small certificated air... statistics shall be compiled in terms of each flight stage as actually performed. The detail T-100 data shall...

  1. Statistical strategies to reveal potential vibrational markers for in vivo analysis by confocal Raman spectroscopy

    Science.gov (United States)

    Oliveira Mendes, Thiago de; Pinto, Liliane Pereira; Santos, Laurita dos; Tippavajhala, Vamshi Krishna; Téllez Soto, Claudio Alberto; Martin, Airton Abrahão

    2016-07-01

    The analysis of biological systems by spectroscopic techniques involves the evaluation of hundreds to thousands of variables. Hence, different statistical approaches are used to elucidate regions that discriminate classes of samples and to propose new vibrational markers for explaining various phenomena, such as disease monitoring, mechanisms of action of drugs, food properties, and so on. However, the underlying statistical techniques are not always discussed in detail in the applied sciences. In this context, this work presents a detailed discussion of the various steps necessary for proper statistical analysis. It includes univariate parametric and nonparametric tests, as well as multivariate unsupervised and supervised approaches. The main objective of this study is to promote proper understanding of the application of various statistical tools in the spectroscopic methods used for the analysis of biological samples. The discussion of these methods is based on a set of in vivo confocal Raman spectra of human skin, analysed with the aim of identifying skin aging markers. In the Appendix, a complete data analysis routine is executed in free software that can be used by the scientific community involved in these studies.

  2. A method for statistical steady state thermal analysis of reactor cores

    International Nuclear Information System (INIS)

    Whetton, P.A.

    1980-01-01

    This paper presents a method for performing a statistical steady state thermal analysis of a reactor core. The technique is only outlined here since detailed thermal equations are dependent on the core geometry. The method has been applied to a pressurised water reactor core and the results are presented for illustration purposes. Random hypothetical cores are generated using the Monte-Carlo method. The technique shows that by splitting the parameters into two types, denoted core-wise and in-core, the Monte Carlo method may be used inexpensively. The idea of using extremal statistics to characterise the low probability events (i.e. the tails of a distribution) is introduced together with a method of forming the final probability distribution. After establishing an acceptable probability of exceeding a thermal design criterion, the final probability distribution may be used to determine the corresponding thermal response value. If statistical and deterministic (i.e. conservative) thermal response values are compared, information on the degree of pessimism in the deterministic method of analysis may be inferred and the restrictive performance limitations imposed by this method relieved. (orig.)

  3. Statistical analysis of first period of operation of FTU Tokamak

    International Nuclear Information System (INIS)

    Crisanti, F.; Apruzzese, G.; Frigione, D.; Kroegler, H.; Lovisetto, L.; Mazzitelli, G.; Podda, S.

    1996-09-01

    Plasma physics operations on the FTU Tokamak started on 20/4/90. The first plasma had a plasma current Ip = 0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shut-down began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operation, plasma experiments were successfully carried out, e.g. experiments with single and multiple pellet injections; full current drive up to Ip = 300 kA, obtained by using waves at the Lower Hybrid frequency; and analysis of ohmic plasma parameters with different materials (from low-Z silicon to high-Z tungsten) as the plasma facing element. In this work a statistical analysis of the full period of operation is presented. Moreover, a comparison with the statistical data from other Tokamaks is attempted.

  4. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  5. An Analysis of Kohlberg's "Stage 4 1/2" within an Enhanced Framework of Moral Stages.

    Science.gov (United States)

    Minnameier, Gerhard

    This paper discusses a well-known problem of stage categorization within Kohlberg's theory of moral stages (L. Kohlberg, 1973), that of "Stage 4 1/2." Some subjects previously scored at stage 4 in Kohlberg's framework took on some characteristics of stage 2 reasoning, which suggested the possibility of regression. To reconcile this…

  6. Using R and RStudio for data management, statistical analysis and graphics

    CERN Document Server

    Horton, Nicholas J

    2015-01-01

    This is the second edition of the popular book on using R for statistical analysis and graphics. The authors, who run a popular blog supplementing their books, have focused on adding many new examples to this new edition. These examples are presented primarily in new chapters based on the following themes: simulation, probability, statistics, mathematics/computing, and graphics. The authors have also added many other updates, including a discussion of RStudio, a very popular development environment for R.

  7. Statistical analysis of absorptive laser damage in dielectric thin films

    International Nuclear Information System (INIS)

    Budgor, A.B.; Luria-Budgor, K.F.

    1978-01-01

    The Weibull distribution arises as an example of the theory of extreme events. It is commonly used to fit statistical data arising in the failure analysis of electrical components and in the DC breakdown of materials. This distribution is employed to analyze time-to-damage and intensity-to-damage statistics obtained when irradiating thin-film coated samples of SiO2, ZrO2, and Al2O3 with tightly focused laser beams. The data used were furnished by Milam. The fit to the data is excellent, and least-squares correlation coefficients greater than 0.9 are often obtained.
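
    A hedged sketch of the Weibull fitting this record reports: maximum likelihood on synthetic time-to-damage data rather than Milam's measurements, with a goodness-of-fit check.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        times = stats.weibull_min.rvs(c=1.8, scale=120.0, size=60,
                                      random_state=rng)

        # Two-parameter fit: location fixed at zero
        shape, loc, scale = stats.weibull_min.fit(times, floc=0)
        print(f"shape k = {shape:.2f}, scale = {scale:.1f}")
        ks = stats.kstest(times, "weibull_min", args=(shape, loc, scale))
        print(f"KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")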

  8. Multivariate statistical analysis of a multi-step industrial processes

    DEFF Research Database (Denmark)

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages, and some process or control parameters are set at each stage. However, the obtained data might not be utilized efficiently, even though this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analysing multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...

  9. Statistical analysis of failure time in stress corrosion cracking of fuel tube in light water reactor

    International Nuclear Information System (INIS)

    Hirao, Keiichi; Yamane, Toshimi; Minamino, Yoritoshi

    1991-01-01

    This report shows how the stress corrosion cracking life of fuel cladding tubes is evaluated by applying statistical techniques to lives examined by a few testing methods. The statistical distribution of the limiting values of constant-load stress corrosion cracking life, the statistical analysis based on a probabilistic interpretation of constant-load stress corrosion cracking life, and the statistical analysis of stress corrosion cracking life obtained by the slow strain rate test (SSRT) method are described. (K.I.)

  10. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    Directory of Open Access Journals (Sweden)

    Marco Aldinucci

    2014-01-01

    Full Text Available The paper's arguments concern enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analysed by statistical and data mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems' behavior, on a multicore platform and on representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed.

  11. On designing multicore-aware simulators for systems biology endowed with OnLine statistics.

    Science.gov (United States)

    Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool that performs the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, turning into big data that should be analyzed by statistical and data-mining tools. In the considered approach the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and effectiveness of the online analysis in capturing biological systems' behavior on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems exhibiting multistable and oscillatory behavior are used as a testbed.
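
    The online-statistics stage that both of these records describe can be illustrated with a classic streaming estimator. The sketch below is a minimal, hypothetical example (not FastFlow code) of Welford's running mean/variance, updated as trajectory values arrive rather than after all simulations complete.

```python
# Minimal online statistic of the kind such a pipeline can maintain:
# Welford's streaming mean/variance, numerically stable and single-pass.
class RunningStats:
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")

rs = RunningStats()
for sample in (0.9, 1.1, 1.0, 1.3):   # stand-in for streamed trajectory values
    rs.update(sample)
print(rs.mean, rs.variance)
```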

  12. Implementation and statistical analysis of Metropolis algorithm for SU(3)

    International Nuclear Information System (INIS)

    Katznelson, E.; Nobile, A.

    1984-12-01

    In this paper we study the statistical properties of an implementation of the Metropolis algorithm for SU(3) gauge theory. It is shown that the results have a normal distribution. We demonstrate that in this case error analysis can be carried out in a simple way, and we show that applying it to both the measurement strategy and the output data analysis has an important influence on the performance and reliability of the simulation. (author)
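
    As an illustration of the algorithm under study, the sketch below implements a Metropolis sampler for a simple one-dimensional Gaussian target (a toy stand-in for SU(3) gauge configurations) and applies a normality check to binned measurements, in the spirit of the error analysis described above.

```python
# Toy Metropolis sampler plus a normality check on binned measurements.
# This is an illustrative stand-in, not lattice SU(3) code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x, chain = 0.0, []
for _ in range(20000):
    proposal = x + rng.uniform(-1.0, 1.0)
    # Accept with probability min(1, pi(proposal)/pi(x)) for pi = N(0, 1)
    accept_prob = np.exp(min(0.0, 0.5 * (x**2 - proposal**2)))
    if rng.random() < accept_prob:
        x = proposal
    chain.append(x)

# Bin the chain to reduce autocorrelation, then test binned means for normality
bins = np.array(chain[2000:]).reshape(-1, 100).mean(axis=1)
print(stats.shapiro(bins))
```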

  13. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    El-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. This paper deals with the analysis of several statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC), whereas the Exponential distribution is found to be the best fit for modeling the failure rate.
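
    Relex is proprietary, but the ranking exercise itself is straightforward to reproduce with open tools. The sketch below fits several candidate distributions to hypothetical failure data and ranks them by the Kolmogorov-Smirnov statistic (fitting and testing on the same data makes the result only indicative, which is sufficient for a best-fit screening of this kind).

```python
# Hedged sketch: rank candidate reliability distributions on hypothetical
# failure data by the Kolmogorov-Smirnov statistic (smaller is better).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = stats.weibull_min.rvs(c=1.5, scale=1000.0, size=80, random_state=rng)

candidates = {
    "weibull": stats.weibull_min,
    "exponential": stats.expon,
    "lognormal": stats.lognorm,
    "normal": stats.norm,
}
for name, dist in candidates.items():
    params = dist.fit(data)                      # shape(s), loc, scale
    ks = stats.kstest(data, dist.cdf, args=params)
    print(f"{name:12s} KS statistic = {ks.statistic:.3f}")
```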

  14. Statistical mechanical analysis of LMFBR fuel cladding tubes

    International Nuclear Information System (INIS)

    Poncelet, J.-P.; Pay, A.

    1977-01-01

    The most important design requirement on fuel pin cladding for LMFBRs is its mechanical integrity. Disruptive factors include internal pressure from mixed-oxide fuel fission gas release, thermal stresses and high-temperature creep, neutron-induced differential void swelling as a source of stress in the cladding, irradiation creep of the stainless steel material, and corrosion by fission products. Under irradiation these load-restraining mechanisms are accentuated by stainless steel embrittlement and strength alterations. To account for the numerous uncertainties involved in the analysis by theoretical models and computer codes, statistical tools such as Monte Carlo simulation methods are indispensable. Thanks to these techniques, uncertainties in nominal characteristics, material properties and environmental conditions can be linked up in a correct way and used for a more accurate conceptual design. First, a thermal creep damage index is set up through a sufficiently sophisticated physical analysis of the cladding, including arbitrary time dependence of power and neutron flux as well as effects of sodium temperature, burnup and steel mechanical behavior. Although this strain-limit approach implies a more general but time-consuming model, in return the net output is improved and, e.g., clad temperature, stress and strain maxima may be easily assessed. A full spectrum of variables is statistically treated to account for their probability distributions. A creep damage probability may be obtained and can contribute to a quantitative fuel failure probability estimation.
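
    The Monte Carlo idea in this record is compact enough to sketch. The example below propagates hypothetical input distributions through a toy creep-damage index (not the paper's actual model) and reads off an exceedance probability against an assumed design limit.

```python
# Hedged sketch of Monte Carlo uncertainty propagation: all distributions,
# the damage index, and the limit of 1.3 are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
stress = rng.normal(120.0, 10.0, n)        # MPa, hypothetical distribution
temperature = rng.normal(870.0, 15.0, n)   # K
time = rng.uniform(8_000, 12_000, n)       # hours

# Toy damage index normalized to 1.0 at the nominal operating point
damage = (stress / 120.0) * np.exp(5000.0 / 870.0 - 5000.0 / temperature) \
         * (time / 10_000.0)
p_failure = np.mean(damage > 1.3)
print(f"P(damage index > 1.3) = {p_failure:.4f}")
```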

  15. A robust statistical method for association-based eQTL analysis.

    Directory of Open Access Journals (Sweden)

    Ning Jiang

    Full Text Available It has been well established that the theoretical kernel of the recently surging genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between a tested genetic marker and a putative locus affecting a disease trait. However, LD analysis is vulnerable to several confounding factors, of which population stratification is the most prominent. Whilst many methods have been proposed to correct for this influence, either by predicting the structure parameters or by correcting inflation in the test statistic due to the stratification, these may not be feasible or may impose further statistical problems in practical implementation. We propose here a novel statistical method to control spurious LD in GWAS arising from population structure by incorporating a control marker into testing for significance of genetic association of a polymorphic marker with phenotypic variation of a complex trait. The method avoids the need for structure prediction, which may be infeasible or inadequate in practice, and accounts properly for a varying effect of population stratification on different regions of the genome under study. The utility and statistical properties of the new method were tested through an intensive computer simulation study and an association-based genome-wide mapping of expression quantitative trait loci in genetically divergent human populations. The analyses show that the new method confers an improved statistical power for detecting genuine genetic association in subpopulations and an effective control of spurious associations stemming from population structure when compared with two other popularly implemented methods in the GWAS literature.

  16. Estimated GFR (eGFR) by prediction equation in staging of chronic kidney disease compared to gamma camera GFR

    Directory of Open Access Journals (Sweden)

    Mohammad Masum Alam

    2016-07-01

    Full Text Available Background: Glomerular filtration rate (GFR) is an effective tool for the diagnosis and staging of chronic kidney disease (CKD). The assessment of renal insufficiency by different methods of this tool among patients with CKD is controversial. Objective: The objective of this study was to evaluate the performance of eGFR in staging of CKD compared to gamma-camera-based GFR. Methods: This cross-sectional analytical study was conducted in the Department of Biochemistry, Bangabandhu Sheikh Mujib Medical University (BSMMU), in collaboration with the National Institute of Nuclear Medicine and Allied Sciences, BSMMU, during the period of January 2011 to December 2012. Gamma-camera-based GFR was estimated from DTPA renograms, and eGFR was estimated by three prediction equations. Comparison was done by the Bland-Altman agreement test to assess the agreement between the three equation-based eGFR methods and the gamma-camera-based GFR method. Staging comparison was done by kappa analysis to assess the agreement between the stages identified by these different methods. Results: Bland-Altman analysis of GFR measured by gamma camera against the CG equation, the CG equation corrected by BSA, and the MDRD equation showed statistically significant agreement. CKD stages determined by CG GFR, CG GFR corrected by BSA, and MDRD GFR were compared with gamma-camera-based GFR by kappa statistical analysis; the kappa values were 0.66, 0.77 and 0.79, respectively. Conclusions: These findings suggest that GFR estimation by the MDRD equation in CKD patients shows good agreement with gamma-camera-based GFR, and for staging of CKD patients, eGFR by the MDRD formula may be used as a very effective tool in the Bangladeshi population.
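
    The staging agreement reported above is quantified with Cohen's kappa, which is simple to reproduce. The sketch below uses made-up stage labels (not the study's data) to show the calculation.

```python
# Hedged sketch: Cohen's kappa between CKD stages (1-5) assigned by two GFR
# methods, on hypothetical labels.
from sklearn.metrics import cohen_kappa_score

camera_stage = [1, 2, 2, 3, 3, 3, 4, 4, 5, 2, 3, 1]   # gamma-camera GFR staging
mdrd_stage   = [1, 2, 3, 3, 3, 3, 4, 5, 5, 2, 3, 2]   # MDRD-equation staging

kappa = cohen_kappa_score(camera_stage, mdrd_stage)
print(f"kappa = {kappa:.2f}")   # 0.6-0.8 is conventionally "substantial" agreement
```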

  17. Statistical and Machine-Learning Data Mining Techniques for Better Predictive Modeling and Analysis of Big Data

    CERN Document Server

    Ratner, Bruce

    2011-01-01

    The second edition of a bestseller, Statistical and Machine-Learning Data Mining: Techniques for Better Predictive Modeling and Analysis of Big Data is still the only book, to date, to distinguish between statistical data mining and machine-learning data mining. The first edition, titled Statistical Modeling and Analysis for Database Marketing: Effective Techniques for Mining Big Data, contained 17 chapters of innovative and practical statistical data mining techniques. In this second edition, renamed to reflect the increased coverage of machine-learning data mining techniques, the author has

  18. Cross-View Neuroimage Pattern Analysis for Alzheimer's Disease Staging

    Directory of Open Access Journals (Sweden)

    Sidong Liu

    2016-02-01

    Full Text Available The research on staging of the pre-symptomatic and prodromal phases of neurological disorders, e.g., Alzheimer's disease (AD), is essential for the prevention of dementia. New strategies for AD staging, with a focus on early detection, are needed to optimize the potential efficacy of disease-modifying therapies that can halt or slow disease progression. Recently, neuroimaging is increasingly used as an additional research-based marker to detect AD onset and predict conversion of MCI and normal control (NC) subjects to AD. Researchers have proposed a variety of neuroimaging biomarkers to characterize the patterns of the pathology of AD and MCI, and have suggested that multi-view neuroimaging biomarkers could lead to better performance than single-view biomarkers in AD staging. However, it is still unclear what leads to such synergy and how to preserve or maximize it. In an attempt to answer these questions, we proposed a cross-view pattern analysis framework for investigating the synergy between different neuroimaging biomarkers. We quantitatively analyzed 9 types of biomarkers derived from FDG-PET and T1-MRI, and evaluated their performance in a task of classifying AD, MCI and NC subjects obtained from the ADNI baseline cohort. The experimental results showed that these biomarkers could depict the pathology of AD from different perspectives and output distinct patterns that are significantly associated with disease progression. Most importantly, we found that these features could be separated into clusters, each depicting a particular aspect, and that the inter-cluster features could always achieve better performance than the intra-cluster features in AD staging.

  19. Transcriptional analysis of late ripening stages of grapevine berry

    Science.gov (United States)

    2011-01-01

    Background The composition of grapevine berry at harvest is a major determinant of wine quality. Optimal oenological maturity of berries is characterized by a high sugar/acidity ratio, high anthocyanin content in the skin, and low astringency. However, harvest time is still mostly determined empirically, based on crude biochemical composition and berry tasting. In this context, it is interesting to identify genes that are expressed/repressed specifically at the late stages of ripening and which may be used as indicators of maturity. Results Whole bunches and berries sorted by density were collected in vineyard on Chardonnay (white cultivar) grapevines for two consecutive years at three stages of ripening (7-days before harvest (TH-7), harvest (TH), and 10-days after harvest (TH+10)). Microvinification and sensory analysis indicate that the quality of the wines made from the whole bunches collected at TH-7, TH and TH+10 differed, TH providing the highest quality wines. In parallel, gene expression was studied with Qiagen/Operon microarrays using two types of samples, i.e. whole bunches and berries sorted by density. Only 12 genes were consistently up- or down-regulated in whole bunches and density sorted berries for the two years studied in Chardonnay. 52 genes were differentially expressed between the TH-7 and TH samples. In order to determine whether these genes followed a similar pattern of expression during the late stages of berry ripening in a red cultivar, nine genes were selected for RT-PCR analysis with Cabernet Sauvignon grown under two different temperature regimes affecting the precocity of ripening. The expression profiles and their relationship to ripening were confirmed in Cabernet Sauvignon for seven genes, encoding a carotenoid cleavage dioxygenase, a galactinol synthase, a late embryogenesis abundant protein, a dirigent-like protein, a histidine kinase receptor, a valencene synthase and a putative S-adenosyl-L-methionine:salicylic acid carboxyl

  20. Transcriptional analysis of late ripening stages of grapevine berry

    Directory of Open Access Journals (Sweden)

    Guillaumie Sabine

    2011-11-01

    Full Text Available Abstract Background The composition of grapevine berry at harvest is a major determinant of wine quality. Optimal oenological maturity of berries is characterized by a high sugar/acidity ratio, high anthocyanin content in the skin, and low astringency. However, harvest time is still mostly determined empirically, based on crude biochemical composition and berry tasting. In this context, it is interesting to identify genes that are expressed/repressed specifically at the late stages of ripening and which may be used as indicators of maturity. Results Whole bunches and berries sorted by density were collected in vineyard on Chardonnay (white cultivar) grapevines for two consecutive years at three stages of ripening (7 days before harvest (TH-7), harvest (TH), and 10 days after harvest (TH+10)). Microvinification and sensory analysis indicate that the quality of the wines made from the whole bunches collected at TH-7, TH and TH+10 differed, TH providing the highest quality wines. In parallel, gene expression was studied with Qiagen/Operon microarrays using two types of samples, i.e. whole bunches and berries sorted by density. Only 12 genes were consistently up- or down-regulated in whole bunches and density-sorted berries for the two years studied in Chardonnay. 52 genes were differentially expressed between the TH-7 and TH samples. In order to determine whether these genes followed a similar pattern of expression during the late stages of berry ripening in a red cultivar, nine genes were selected for RT-PCR analysis with Cabernet Sauvignon grown under two different temperature regimes affecting the precocity of ripening. The expression profiles and their relationship to ripening were confirmed in Cabernet Sauvignon for seven genes, encoding a carotenoid cleavage dioxygenase, a galactinol synthase, a late embryogenesis abundant protein, a dirigent-like protein, a histidine kinase receptor, a valencene synthase and a putative S

  1. Constitution of an incident database suited to statistical analysis and examples

    International Nuclear Information System (INIS)

    Verpeaux, J.L.

    1990-01-01

    The Nuclear Protection and Safety Institute (IPSN) has set up and is developing an incident database, which is used for the management and analysis of incidents encountered in French PWR plants. IPSN has already carried out several statistical analyses of incidents and safety-significant events, and is improving its database on the basis of the experience gained from these various studies. A description of the analysis method and of the developed database is presented.

  2. A new statistic for the analysis of circular data in gamma-ray astronomy

    Science.gov (United States)

    Protheroe, R. J.

    1985-01-01

    A new statistic is proposed for the analysis of circular data. The statistic is designed specifically for situations where a test of uniformity is required which is powerful against alternatives in which a small fraction of the observations is grouped in a small range of directions, or phases.
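
    For context, the classical baseline that such a statistic is compared against is the Rayleigh test of circular uniformity. The sketch below implements that baseline on hypothetical phase data; it is not Protheroe's proposed statistic, which is designed to be more powerful against narrowly grouped phases.

```python
# Classical Rayleigh test of circular uniformity (baseline, not Protheroe's
# statistic), applied to hypothetical photon arrival phases.
import numpy as np

def rayleigh_test(phases: np.ndarray) -> tuple[float, float]:
    """Return (Z statistic, approximate p-value) for phases in [0, 2*pi)."""
    n = len(phases)
    r = np.hypot(np.cos(phases).sum(), np.sin(phases).sum()) / n
    z = n * r**2
    # Standard first-order approximation to the p-value
    p = np.exp(-z) * (1 + (2 * z - z**2) / (4 * n))
    return z, float(np.clip(p, 0.0, 1.0))

rng = np.random.default_rng(5)
phases = rng.uniform(0, 2 * np.pi, 200)
print(rayleigh_test(phases))
```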

  3. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  4. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  5. Immediate compared with delayed pushing in the second stage of labor: a systematic review and meta-analysis.

    Science.gov (United States)

    Tuuli, Methodius G; Frey, Heather A; Odibo, Anthony O; Macones, George A; Cahill, Alison G

    2012-09-01

    To estimate whether immediate or delayed pushing in the second stage of labor optimizes spontaneous vaginal delivery and other perinatal outcomes. We searched the electronic databases MEDLINE and CINAHL through August 2011 without restrictions. The search terms used were MeSH headings, text words, and word variations of the words or phrases labor, laboring down, passive descent, passive second stage, physiologic second stage, spontaneous pushing, pushing, or bearing down. We searched for randomized controlled trials comparing immediate with delayed pushing in the second stage of labor. The primary outcome was spontaneous vaginal delivery. Secondary outcomes were instrumental delivery, cesarean delivery, duration of the second stage, duration of active pushing, and other maternal and neonatal outcomes. Heterogeneity was assessed using the Q test and I². Pooled relative risks (RRs) and weighted mean differences were calculated using random-effects models. Twelve randomized controlled trials (1,584 immediate and 1,531 delayed pushing) met inclusion criteria. Overall, delayed pushing was associated with an increased rate of spontaneous vaginal delivery compared with immediate pushing (61.5% compared with 56.9%, pooled RR 1.09, 95% confidence interval [CI] 1.03-1.15). This increase was smaller and not statistically significant among high-quality studies (59.0% compared with 54.9%, pooled RR 1.07, 95% CI 0.98-1.26) but larger and statistically significant in lower-quality studies (81.0% compared with 71.0%, pooled RR 1.13, 95% CI 1.02-1.24). Operative vaginal delivery rates were high in most studies and not significantly different between the two groups (33.7% compared with 37.4%, pooled RR 0.89, 95% CI 0.76-1.06). Delayed pushing was associated with prolongation of the second stage (weighted mean difference 56.92 minutes, 95% CI 42.19-71.64) and a shortened duration of active pushing (weighted mean difference -21.98 minutes, 95% CI -31.29 to -12.68). Studies to date suggest
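
    The pooled relative risks quoted above come from random-effects models; a standard way to compute them is the DerSimonian-Laird estimator. The sketch below applies it to made-up per-study log relative risks (the review's trial-level data are not reproduced here).

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of hypothetical
# per-study log relative risks and standard errors.
import numpy as np

log_rr = np.array([0.10, 0.05, 0.18, 0.02, 0.12])   # hypothetical log RRs
se = np.array([0.06, 0.09, 0.07, 0.10, 0.08])       # hypothetical SEs

w_fixed = 1 / se**2
mean_fixed = np.sum(w_fixed * log_rr) / w_fixed.sum()
q = np.sum(w_fixed * (log_rr - mean_fixed)**2)      # Cochran's Q
df = len(log_rr) - 1
tau2 = max(0.0, (q - df) /
           (w_fixed.sum() - np.sum(w_fixed**2) / w_fixed.sum()))

w = 1 / (se**2 + tau2)                              # random-effects weights
pooled = np.sum(w * log_rr) / w.sum()
se_pooled = np.sqrt(1 / w.sum())
ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```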

  6. A critical discussion of null hypothesis significance testing and statistical power analysis within psychological research

    DEFF Research Database (Denmark)

    Jones, Allan; Sommerlund, Bo

    2007-01-01

    The uses of null hypothesis significance testing (NHST) and statistical power analysis within psychological research are critically discussed. The article looks at the problems of relying solely on NHST when dealing with small and large sample sizes. The use of power-analysis in estimating...... the potential error introduced by small and large samples is advocated. Power analysis is not recommended as a replacement to NHST but as an additional source of information about the phenomena under investigation. Moreover, the importance of conceptual analysis in relation to statistical analysis of hypothesis...
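
    The power-analysis practice this record advocates is easy to demonstrate with standard tools. The sketch below computes the power of a two-sample t-test for a medium effect size across sample sizes, and the sample size needed for 80% power, using statsmodels.

```python
# Illustrative power analysis for a two-sample t-test (effect size and alpha
# are conventional example values, not the article's).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n in (10, 50, 200, 1000):
    power = analysis.power(effect_size=0.5, nobs1=n, alpha=0.05)
    print(f"n per group = {n:5d} -> power = {power:.3f}")

# Conversely, the per-group n needed for 80% power at the same effect size
print(analysis.solve_power(effect_size=0.5, power=0.8, alpha=0.05))
```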

  7. Structural analysis at aircraft conceptual design stage

    Science.gov (United States)

    Mansouri, Reza

    Considering the strengths and limitations of both methodologies, the question to be answered in this thesis is: how valuable and compatible are the classical analytical methods in today's conceptual design environment, and can these methods complement each other? To answer these questions, this thesis investigates the pros and cons of classical analytical structural analysis methods during the conceptual design stage through the following objectives: illustrate the structural design methodology of these methods within the framework of the Aerospace Vehicle Design (AVD) lab's design lifecycle, and demonstrate the effectiveness of the moment distribution method through four case studies, considering and evaluating the strengths and limitations of these methods. In order to objectively quantify the limitations and capabilities of the analytical method at the conceptual design stage, each case study becomes more complex than the one before.

  8. Dataset on statistical analysis of editorial board composition of Hindawi journals indexed in Emerging sources citation index

    Directory of Open Access Journals (Sweden)

    Hilary I. Okagbue

    2018-04-01

    Full Text Available This data article contains the statistical analysis of the total, percentage and distribution of the editorial board composition of 111 Hindawi journals indexed in the Emerging Sources Citation Index (ESCI) across the continents. The reliability of the data was shown using correlation, goodness-of-fit tests, analysis of variance and statistical variability tests. Keywords: Hindawi, Bibliometrics, Data analysis, ESCI, Random, Smart campus, Web of science, Ranking analytics, Statistics

  9. Trading stages

    DEFF Research Database (Denmark)

    Steiner, Uli; Tuljapurkar, Shripad; Coulson, Tim

    2012-01-01

    Interest in stage- and age-structured models has recently increased because they can describe quantitative traits such as size that are left out of age-only demography. Available methods for the analysis of effects of vital rates on lifespan in stage-structured models have not been widely applied ...... examples. Much of our approach relies on trading time and mortality risk in one stage for time and risk in others. Our approach contributes to the new framework of the study of age- and stage-structured biodemography.

  10. Statistical analysis of the determinations of the Sun's Galactocentric distance

    Science.gov (United States)

    Malkin, Zinovy

    2013-02-01

    Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.

  11. Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco

    Science.gov (United States)

    Bounoua, Z.; Mechaqrane, A.

    2018-05-01

    An accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements, eliminating erroneous values, which are generally due to measurement or cosine-effect errors. The statistical analysis shows that the annual mean daily GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.

  12. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  13. Analysis of spectral data with rare events statistics

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    We consider the case of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis of persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs

  14. Diagnostic staging laparoscopy in gastric cancer treatment: A cost-effectiveness analysis.

    Science.gov (United States)

    Li, Kevin; Cannon, John G D; Jiang, Sam Y; Sambare, Tanmaya D; Owens, Douglas K; Bendavid, Eran; Poultsides, George A

    2018-05-01

    Accurate preoperative staging helps avert morbidity, mortality, and cost associated with non-therapeutic laparotomy in gastric cancer (GC) patients. Diagnostic staging laparoscopy (DSL) can detect metastases with high sensitivity, but its cost-effectiveness has not been previously studied. We developed a decision analysis model to assess the cost-effectiveness of preoperative DSL in GC workup. Analysis was based on a hypothetical cohort of GC patients in the U.S. for whom initial imaging shows no metastases. The cost-effectiveness of DSL was measured as cost per quality-adjusted life-year (QALY) gained. Drivers of cost-effectiveness were assessed in sensitivity analysis. Preoperative DSL required an investment of $107,012 per QALY. In sensitivity analysis, DSL became cost-effective at a threshold of $100,000/QALY when the probability of occult metastases exceeded 31.5% or when test sensitivity for metastases exceeded 86.3%. The likelihood of cost-effectiveness increased from 46% to 93% when both parameters were set at maximum reported values. The cost-effectiveness of DSL for GC patients is highly dependent on patient and test characteristics, and is more likely when DSL is used selectively where procedure yield is high, such as for locally advanced disease or in detecting peritoneal and superficial versus deep liver lesions. © 2017 Wiley Periodicals, Inc.

  15. Retrospective analysis of 56 edentulous dental arches restored with 344 single-stage implants using an immediate loading fixed provisional protocol: statistical predictors of implant failure.

    Science.gov (United States)

    Kinsel, Richard P; Liss, Mindy

    2007-01-01

    The purpose of this retrospective study was to evaluate the effects of implant dimensions, surface treatment, location in the dental arch, number of supporting implant abutments, surgical technique, and generally recognized risk factors on the survival of a series of single-stage Straumann dental implants placed into edentulous arches using an immediate loading protocol. Each patient received between 4 and 18 implants in one or both dental arches. Periapical radiographs were obtained over a 2- to 10-year follow-up period to evaluate crestal bone loss following insertion of the definitive metal-ceramic fixed prostheses. Univariate tests for failure rates as a function of age (<60 or ≥60 years), gender, smoking, bone grafting, dental arch, surface type, anterior versus posterior position, number of implants per arch, and surgical technique were made using Fisher exact tests. The Cochran-Armitage test for trend was used to evaluate the presence of a linear trend in failure rates with regard to implant length and implant diameter. Logistic regression modeling was used to determine which, if any, of the aforementioned factors would predict patient and implant failure. A significance criterion of P = .05 was utilized. Data were collected for 344 single-stage implants placed into 56 edentulous arches (39 maxillae and 17 mandibles) of 43 patients and immediately loaded with a 1-piece provisional fixed prosthesis. A total of 16 implants failed to integrate successfully, for a survival rate of 95.3%. Increased rates of failure were associated with reduced implant length, placement in the posterior region of the jaw, increased implant diameter, and surface treatment. Implant length emerged as the sole significant predictor of implant failure. In this retrospective analysis of 56 consecutively treated edentulous arches with multiple single-stage dental implants loaded immediately, reduced implant length was the sole significant predictor of failure.

  16. Ten-year clinico-statistical study of oral squamous cell carcinoma

    International Nuclear Information System (INIS)

    Aoki, Shinjiro; Kawabe, Ryoichi; Chikumaru, Hiroshi; Saito, Tomokatsu; Hirota, Makoto; Miyake, Tetsumi; Omura, Susumu; Fujita, Kiyohide

    2003-01-01

    This clinico-statistical study includes 232 cases of oral squamous cell carcinoma that underwent radical treatment in the Department of Oral and Maxillofacial Surgery, Yokohama City University Hospital, during the decade from 1991 to 2000. Surgery was principally adopted as the first-line treatment in 199 cases, and radiotherapy in 33 cases. The 5-year overall survival rate was 73.4%. The results according to stage were as follows: Stage I, 87.5%; Stage II, 77.9%; Stage III, 63.5%; and Stage IVA, 44.7%. The 5-year survival rates by primary site were as follows: upper gingiva, 85.2%; tongue, 73.7%; floor of mouth, 68.9%; lower gingiva, 66.3%; buccal mucosa, 63.9%; and hard palate, 50%. For tongue cancer, the 5-year overall survival rates by stage were: Stage I, 90.8%; Stage II, 82.1%; Stage III, 40.3%; and Stage IVA, 45.7%. Statistical significance was seen between cases of Stages I and II and those of Stages III and IVA. For lower gingival cancer, the 5-year overall survival rates by stage were: Stage I, 90.8%; Stage II, 82.1%; Stage III, 40.3%; and Stage IVA, 45.7%. Even in Stage I, lower gingival cancers had unfavorable clinical outcomes. Preventive neck dissections were performed on 52 N0-neck patients; clinically negative nodes nevertheless showed metastasis in 14 patients (26.9%). (author)

  17. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor, and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
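
    The UCL95% described above is a standard one-sided t-based limit on the mean. The sketch below computes it for six hypothetical analyte concentrations (not the Tank 18F results).

```python
# One-sided upper 95% confidence limit on a mean concentration from a small
# sample; the six values are hypothetical.
import numpy as np
from scipy import stats

conc = np.array([12.1, 13.4, 11.8, 12.9, 13.1, 12.5])
n = len(conc)
ucl95 = conc.mean() + stats.t.ppf(0.95, df=n - 1) * conc.std(ddof=1) / np.sqrt(n)
print(f"UCL95 = {ucl95:.2f}")
```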

  18. Limitations of Using Microsoft Excel Version 2016 (MS Excel 2016) for Statistical Analysis for Medical Research.

    Science.gov (United States)

    Tanavalee, Chotetawan; Luksanapruksa, Panya; Singhatanadgige, Weerasak

    2016-06-01

    Microsoft Excel (MS Excel) is a commonly used program for data collection and statistical analysis in biomedical research. However, this program has many limitations, including fewer functions that can be used for analysis and a limited number of total cells compared with dedicated statistical programs. MS Excel cannot complete analyses with blank cells, and cells must be selected manually for analysis. In addition, it requires multiple steps of data transformation and formulas to plot survival analysis graphs, among others. The MegaStat add-in program, which will be supported by MS Excel 2016 soon, would eliminate some limitations of using statistical formulas within MS Excel.

  19. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness-of-fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square, but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra-Bentler's mean-scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds a new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra-Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic, in order to improve its robustness under small samples. A simple simulation study shows that this third-moment-adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  20. Statistical power analysis a simple and general model for traditional and modern hypothesis tests

    CERN Document Server

    Murphy, Kevin R; Wolach, Allen

    2014-01-01

    Noted for its accessible approach, this text applies the latest approaches of power analysis to both null hypothesis and minimum-effect testing using the same basic unified model. Through the use of a few simple procedures and examples, the authors show readers with little expertise in statistical analysis how to obtain the values needed to carry out the power analysis for their research. Illustrations of how these analyses work and how they can be used to choose the appropriate criterion for defining statistically significant outcomes are sprinkled throughout. The book presents a simple and g

  1. Stages of change and health‐related quality of life among employees of an institution

    Science.gov (United States)

    Liau, Siow Yen; Shafie, Asrul A; Ibrahim, Mohamed Izham Mohamed; Hassali, Mohamed Azmi; Othman, Ahmad Tajuddin; Mohamed, Mohamad Haniki Nik; Hamdi, Menal A

    2011-01-01

    Abstract Background: The Transtheoretical Model of change has been used successfully in promoting behaviour change. Objective: To examine the relationships between health-related quality of life (HRQoL) scores and the stages of change of adequate physical activity and fruit and vegetable intake. Design: This was a cross-sectional study conducted among employees of the main campus and Engineering campus of Universiti Sains Malaysia (USM) between October 2009 and March 2010. Main variables studied: Data on physical activity and fruit and vegetable intake were collected using the WHO STEPS instrument for chronic disease risk factor surveillance. The Short Form-12 health survey (SF-12) was used to gather information on participants' HRQoL. The current stages of change were measured using the measures developed by the Pro-Change Behaviour Systems Incorporation. Statistical analysis: One-way ANOVA and its non-parametric equivalent, the Kruskal-Wallis test, were used to compare the differences between SF-12 scores and the stages of change. Results: A total of 144 employees were included in this analysis. A large proportion of the participants reported inadequate fruit and vegetable intake (92.3%) and physical activity (84.6%). Mean physical and mental component scores of the SF-12 were 50.39 (SD = 7.69) and 49.73 (SD = 8.64), respectively. Overall, there was no statistically significant difference in the SF-12 domain scores with regard to the stages of change for either risk factor. Conclusions: There was some evidence of a positive relationship between stages of change of physical activity and fruit and vegetable intake and SF-12 scores. Further studies need to be conducted to confirm this association. PMID:21645189

  2. Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2010-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.

  3. Spatially variable stage-driven groundwater-surface water interaction inferred from time-frequency analysis of distributed temperature sensing data

    Science.gov (United States)

    Mwakanyamale, Kisa; Slater, Lee; Day-Lewis, Frederick D.; Elwaseif, Mehrez; Johnson, Carole D.

    2012-01-01

    Characterization of groundwater-surface water exchange is essential for improving understanding of contaminant transport between aquifers and rivers. Fiber-optic distributed temperature sensing (FODTS) provides rich spatiotemporal datasets for quantitative and qualitative analysis of groundwater-surface water exchange. We demonstrate how time-frequency analysis of FODTS and synchronous river stage time series from the Columbia River adjacent to the Hanford 300-Area, Richland, Washington, provides spatial information on the strength of stage-driven exchange of uranium contaminated groundwater in response to subsurface heterogeneity. Although used in previous studies, the stage-temperature correlation coefficient proved an unreliable indicator of the stage-driven forcing on groundwater discharge in the presence of other factors influencing river water temperature. In contrast, S-transform analysis of the stage and FODTS data definitively identifies the spatial distribution of discharge zones and provided information on the dominant forcing periods (≥2 d) of the complex dam operations driving stage fluctuations and hence groundwater-surface water exchange at the 300-Area.

  4. Error analysis of terrestrial laser scanning data by means of spherical statistics and 3D graphs.

    Science.gov (United States)

    Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G; Arias, Pedro

    2010-01-01

    This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one modulus and two angles) that were analyzed by spherical statistics. A case study is presented and discussed in detail. Errors were calculated using 53 check points (CPs) whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed by both the conventional method (modular error analysis) and the proposed method (angular error analysis) using 3D graphics and numerical spherical statistics. Two packages in the R programming language were developed to produce the graphics automatically. The results indicate that the proposed method is advantageous, as it offers a more complete analysis of positional accuracy, covering the angular error component, the uniformity of the vector distribution and error isotropy, in addition to the modular error component covered by linear statistics.

  5. CFAssay: statistical analysis of the colony formation assay

    International Nuclear Information System (INIS)

    Braselmann, Herbert; Michna, Agata; Heß, Julia; Unger, Kristian

    2015-01-01

    Colony formation assay is the gold standard to determine cell reproductive death after treatment with ionizing radiation, applied for different cell lines or in combination with other treatment modalities. The associated linear-quadratic cell survival curves can be calculated with different methods. For easy code exchange and methodological standardisation among collaborating laboratories, a software package CFAssay for R (R Core Team, R: A Language and Environment for Statistical Computing, 2014) was established to perform thorough statistical analysis of linear-quadratic cell survival curves after treatment with ionizing radiation and of two-way designs of experiments with chemical treatments only. CFAssay offers maximum likelihood and related methods by default, and the least squares or weighted least squares method can be optionally chosen. A test for comparison of cell survival curves and an ANOVA test for experimental two-way designs are provided. For the two presented examples the estimated parameters do not differ much between maximum likelihood and least squares. However, the dispersion parameter of the quasi-likelihood method is much more sensitive to statistical variation in the data than the multiple R² coefficient of determination from the least squares method. The dispersion parameter for goodness of fit and the different plot functions in CFAssay help to evaluate experimental data quality. As open source software, CFAssay facilitates interlaboratory code sharing between users.
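
    CFAssay itself is an R package, but the least-squares variant of the curve fit it performs is easy to sketch. The example below fits the linear-quadratic model ln S(D) = -(alpha*D + beta*D^2) to hypothetical surviving fractions in Python.

```python
# Hedged sketch of a least-squares linear-quadratic survival-curve fit on
# hypothetical colony-formation data (not CFAssay code).
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])          # Gy
surv = np.array([1.0, 0.75, 0.50, 0.18, 0.05, 0.012])    # surviving fractions

def log_lq(d, alpha, beta):
    return -(alpha * d + beta * d**2)

(alpha, beta), _ = curve_fit(log_lq, dose, np.log(surv), p0=(0.2, 0.02))
print(f"alpha = {alpha:.3f} /Gy, beta = {beta:.4f} /Gy^2")
```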

  6. Procedure for statistical analysis of one-parameter discrepant experimental data

    International Nuclear Information System (INIS)

    Badikov, Sergey A.; Chechev, Valery P.

    2012-01-01

    A new, Mandel–Paule-type procedure for statistical processing of one-parameter discrepant experimental data is described. The procedure enables one to estimate the contribution of unrecognized experimental errors to the total experimental uncertainty, as well as to include it in the analysis. A definition of discrepant experimental data for an arbitrary number of measurements is introduced as an accompanying result. In the case of negligible unrecognized experimental errors, the procedure simply reduces to the calculation of the weighted average and its internal uncertainty. The procedure was applied to the statistical analysis of half-life experimental data; mean half-lives for 20 actinides were calculated and the results were compared to the ENSDF and DDEP evaluations. On the whole, the calculated half-lives are consistent with the ENSDF and DDEP evaluations. However, the uncertainties calculated in this work essentially exceed the ENSDF and DDEP evaluations for discrepant experimental data. This effect can be explained by adequately taking into account unrecognized experimental errors. - Highlights: ► A new statistical procedure for processing one-parameter discrepant experimental data has been presented. ► The procedure estimates the contribution of unrecognized errors to the total experimental uncertainty. ► The procedure was applied to processing discrepant half-life experimental data. ► Results of the calculations are compared to the ENSDF and DDEP evaluations.
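
    The core of a Mandel–Paule-type procedure is solving for the common "unrecognized error" variance that makes the weighted-mean chi-square match its expectation. A minimal sketch, with hypothetical half-life results, is shown below.

```python
# Mandel-Paule-style estimation: find the extra variance s2 such that the
# reduced chi-square of the weighted mean equals n - 1 (hypothetical data).
import numpy as np
from scipy.optimize import brentq

x = np.array([10.2, 10.9, 9.8, 11.5, 10.1])   # hypothetical half-life results
u = np.array([0.2, 0.3, 0.2, 0.3, 0.25])      # reported (recognized) uncertainties

def chi2_minus_df(s2):
    w = 1.0 / (u**2 + s2)
    mean = np.sum(w * x) / w.sum()
    return np.sum(w * (x - mean)**2) - (len(x) - 1)

# If the data are already consistent, no inflation is needed (s2 = 0)
s2 = brentq(chi2_minus_df, 0.0, 100.0) if chi2_minus_df(0.0) > 0 else 0.0
w = 1.0 / (u**2 + s2)
mean = np.sum(w * x) / w.sum()
print(f"mean = {mean:.3f} +/- {np.sqrt(1 / w.sum()):.3f}, s2 = {s2:.3f}")
```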

  7. Noise removing in encrypted color images by statistical analysis

    Science.gov (United States)

    Islam, N.; Puech, W.

    2012-03-01

    Cryptographic techniques are used to secure confidential data from unauthorized access but these techniques are very sensitive to noise. A single bit change in encrypted data can have catastrophic impact over the decrypted data. This paper addresses the problem of removing bit error in visual data which are encrypted using AES algorithm in the CBC mode. In order to remove the noise, a method is proposed which is based on the statistical analysis of each block during the decryption. The proposed method exploits local statistics of the visual data and confusion/diffusion properties of the encryption algorithm to remove the errors. Experimental results show that the proposed method can be used at the receiving end for the possible solution for noise removing in visual data in encrypted domain.

  8. Statistical Analysis of Radio Propagation Channel in Ruins Environment

    Directory of Open Access Journals (Sweden)

    Jiao He

    2015-01-01

    Full Text Available The cellphone based localization system for search and rescue in complex high density ruins has attracted a great interest in recent years, where the radio channel characteristics are critical for design and development of such a system. This paper presents a spatial smoothing estimation via rotational invariance technique (SS-ESPRIT for radio channel characterization of high density ruins. The radio propagations at three typical mobile communication bands (0.9, 1.8, and 2 GHz are investigated in two different scenarios. Channel parameters, such as arrival time, delays, and complex amplitudes, are statistically analyzed. Furthermore, a channel simulator is built based on these statistics. By comparison analysis of average excess delay and delay spread, the validation results show a good agreement between the measurements and channel modeling results.
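
    Two of the channel statistics mentioned above, mean excess delay and RMS delay spread, follow directly from the power delay profile. The sketch below computes both for a hypothetical profile (not the paper's measurements).

```python
# Mean excess delay and RMS delay spread from a hypothetical power delay
# profile (power-weighted first and second moments of the path delays).
import numpy as np

delays = np.array([0, 50, 110, 200, 350])      # ns, hypothetical path delays
power = np.array([1.0, 0.4, 0.2, 0.1, 0.05])   # linear-scale path powers

mean_delay = np.sum(power * delays) / power.sum()
rms_spread = np.sqrt(np.sum(power * (delays - mean_delay)**2) / power.sum())
print(f"mean excess delay = {mean_delay:.1f} ns, "
      f"RMS delay spread = {rms_spread:.1f} ns")
```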

  9. Statistical Analysis of Sport Movement Observations: the Case of Orienteering

    Science.gov (United States)

    Amouzandeh, K.; Karimipour, F.

    2017-09-01

    Study of movement observations is becoming more popular in several applications. Particularly, analyzing sport movement time series has been considered as a demanding area. However, most of the attempts made on analyzing movement sport data have focused on spatial aspects of movement to extract some movement characteristics, such as spatial patterns and similarities. This paper proposes statistical analysis of sport movement observations, which refers to analyzing changes in the spatial movement attributes (e.g. distance, altitude and slope) and non-spatial movement attributes (e.g. speed and heart rate) of athletes. As the case study, an example dataset of movement observations acquired during the "orienteering" sport is presented and statistically analyzed.

  10. SeDA: A software package for the statistical analysis of the instrument drift

    International Nuclear Information System (INIS)

    Lee, H. J.; Jang, S. C.; Lim, T. J.

    2006-01-01

    The setpoints for safety-related equipment are affected by many sources of uncertainty. ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2] suggested statistical approaches for ensuring that safety-related instrument setpoints are established and maintained within the technical specification limits [3]. However, Jang et al. [4] indicated that the preceding methodologies for setpoint drift analysis might be insufficient to manage the setpoint drift of an instrumentation device, and proposed new statistical analysis procedures for the management of setpoint drift based on plant-specific as-found/as-left data. Although IHPA (Instrument History Performance Analysis) is a widely known commercial software package for analyzing instrument setpoint drift, several steps in the new procedure cannot be performed with it, because it is based on the statistical approaches suggested in ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2]. In this paper we present a software package (SeDA: Setpoint Drift Analysis) that implements the new methodologies and is easy to use, as it is accompanied by powerful graphical tools. (authors)

  11. Brain-Wide Analysis of Functional Connectivity in First-Episode and Chronic Stages of Schizophrenia.

    Science.gov (United States)

    Li, Tao; Wang, Qiang; Zhang, Jie; Rolls, Edmund T; Yang, Wei; Palaniyappan, Lena; Zhang, Lu; Cheng, Wei; Yao, Ye; Liu, Zhaowen; Gong, Xiaohong; Luo, Qiang; Tang, Yanqing; Crow, Timothy J; Broome, Matthew R; Xu, Ke; Li, Chunbo; Wang, Jijun; Liu, Zhening; Lu, Guangming; Wang, Fei; Feng, Jianfeng

    2017-03-01

    Published reports of functional abnormalities in schizophrenia remain divergent due to the lack of a staging point of view and of whole-brain analysis. To identify key functional-connectivity differences of first-episode (FE) and chronic patients from controls using resting-state functional MRI, and to determine changes that are specifically associated with disease onset, a clinical staging model is adopted. We analyze functional-connectivity differences in prodromal, FE (mostly drug-naïve), and chronic patients from their matched controls from 6 independent datasets involving a total of 789 participants (343 patients). Brain-wide functional-connectivity analysis was performed in the different datasets and the results from the datasets of the same stage were then integrated by meta-analysis, with Bonferroni correction for multiple comparisons. Prodromal patients differed from controls in their pattern of functional connectivity involving the inferior frontal gyri (Broca's area). In FE patients, 90% of the functional-connectivity changes involved the frontal lobes, mostly the inferior frontal gyrus including Broca's area, and these changes were correlated with delusions/blunted affect. For chronic patients, functional-connectivity differences extended to wider areas of the brain, including reduced thalamo-frontal connectivity, and increased thalamo-temporal and thalamo-sensorimotor connectivity that were correlated with the positive, negative, and general symptoms, respectively. Thalamic changes became prominent at the chronic stage. These results provide evidence for distinct patterns of functional dysconnectivity across the FE and chronic stages of schizophrenia. Importantly, abnormalities in the frontal language networks appear early, at the time of disease onset. The identification of stage-specific pathological processes may help to understand the disease course of schizophrenia and identify neurobiological markers crucial for early diagnosis.

  12. Multivariate meta-analysis: a robust approach based on the theory of U-statistic.

    Science.gov (United States)

    Ma, Yan; Mazumdar, Madhu

    2011-10-30

    Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between them. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analyses with a small number of component studies. The use of REML also requires iterative estimation between parameters, needing moderately high computation time, especially when the dimension of the outcomes is large. A multivariate method of moments (MMM) is available and is shown to perform equally well to REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of U-statistics, and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML due to a non-normal data distribution is marginal, and that the estimates from MMM and the U-statistic-based approach are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on U-statistics for testing the significance of between-study heterogeneity and for extending the work to the meta-regression setting. Copyright © 2011 John Wiley & Sons, Ltd.

  13. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase of Municipal Solid Waste (MSW) generated over the last years forces waste managers pursuing more effective collection schemes, technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role for improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core-set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in mixed collection system of Oporto Municipality, Portugal, during one year, a week per month. This analysis provides collection circuits' operational assessment and supports effective short-term municipality collection strategies at the level of, e.g., collection frequency and timetables, and type of containers. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Implementation of statistical analysis methods for medical physics data

    International Nuclear Information System (INIS)

    Teixeira, Marilia S.; Pinto, Nivia G.P.; Barroso, Regina C.; Oliveira, Luis F.

    2009-01-01

    The objective of biomedical research with different radiation natures is to contribute to the understanding of the basic physics and biochemistry of biological systems, disease diagnostics and the development of therapeutic techniques. The main benefits are: the cure of tumors through therapy, the early detection of diseases through diagnostics, the use as a prophylactic measure in blood transfusion, etc. A better understanding of the biological interactions occurring after exposure to radiation is therefore necessary for the optimization of therapeutic procedures and of strategies for the reduction of radioinduced effects. The applied physics group of the Physics Institute of UERJ has been working on the characterization of biological samples (human tissues, teeth, saliva, soil, plants, sediments, air, water, organic matrixes, ceramics, fossil material, among others) using X-ray diffraction and X-ray fluorescence. The application of these techniques to the measurement, analysis and interpretation of biological tissue characteristics is attracting considerable interest in Medical and Environmental Physics. All quantitative data analysis must begin with the calculation of descriptive statistics (means and standard deviations) in order to obtain a preliminary notion of what the analysis will reveal. It is well known that the high standard deviations found in experimental measurements of biological samples can be attributed to biological factors, due to the specific characteristics of each individual (age, gender, environment, alimentary habits, etc). The main objective of this work is the development of a program applying specific statistical methods for the optimization of experimental data analysis. Since the specialized programs for this analysis are proprietary, another objective of this work is the implementation of a code that is free and can be shared with other research groups. As the program developed since the

  15. Statistical analysis of AFM topographic images of self-assembled quantum dots

    Energy Technology Data Exchange (ETDEWEB)

    Sevriuk, V. A.; Brunkov, P. N., E-mail: brunkov@mail.ioffe.ru; Shalnev, I. V.; Gutkin, A. A.; Klimko, G. V.; Gronin, S. V.; Sorokin, S. V.; Konnikov, S. G. [Russian Academy of Sciences, Ioffe Physical-Technical Institute (Russian Federation)

    2013-07-15

    To obtain statistical data on quantum-dot sizes, AFM topographic images of the substrate on which the dots under study are grown are analyzed. Because the substrate is not ideal, containing height differences on the order of the nanoparticle size at distances of 1-10 μm, and because closely spaced dots are insufficiently resolved owing to the finite curvature radius of the AFM probe, automating the statistical analysis of a large dot array requires special techniques for processing the topographic images so that no fraction of the particles is lost, as happens in conventional processing. As such a technique, convolution of the initial matrix of the AFM image with a specially selected matrix is used. This makes it possible to determine the position of each nanoparticle and, using the initial matrix, to measure their geometrical parameters. The results of statistical analysis by this method of self-assembled InAs quantum dots formed on the surface of an AlGaAs epitaxial layer are presented. It is shown that their concentration, average size, and the half-width of the height distribution depend strongly on the In flow and the total amount of deposited InAs, which are varied within narrow limits.
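
    A minimal sketch of the convolution idea described above: correlate the height map with a zero-mean Gaussian template and take sufficiently strong local maxima as particle positions. The template shape, its size and the detection threshold are assumptions for illustration, not the authors' specially selected matrix.

```python
import numpy as np
from scipy import ndimage

def detect_dots(height_map, dot_sigma_px=3, threshold=0.5):
    """Locate dot-like features by correlating with a Gaussian template."""
    r = 3 * dot_sigma_px
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    template = np.exp(-(x**2 + y**2) / (2.0 * dot_sigma_px**2))
    template -= template.mean()          # zero mean suppresses slow background slopes
    response = ndimage.correlate(height_map, template, mode="reflect")
    response /= np.abs(response).max()
    # Keep pixels that are the maximum within their own neighbourhood.
    is_peak = response == ndimage.maximum_filter(response, size=2 * r + 1)
    return np.argwhere(is_peak & (response > threshold))

# Synthetic test: a rough background plus three Gaussian "dots".
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.05, (128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx in [(30, 40), (64, 90), (100, 20)]:
    img += np.exp(-((yy - cy)**2 + (xx - cx)**2) / (2.0 * 3**2))
print(detect_dots(img))                  # ~three rows: (row, col) of each dot
```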

  16. Statistical Analysis of Environmental Tritium around Wolsong Site

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ju Youl [FNC Technology Co., Yongin (Korea, Republic of)

    2010-04-15

    To find the relationship among airborne tritium, tritium in rainwater, TFWT (Tissue Free Water Tritium) and TBT (Tissue Bound Tritium), statistical analysis is conducted on tritium data measured at KHNP employees' houses around the Wolsong nuclear power plants during the 10 years from 1999 to 2008. The results show that tritium in these media exhibits a strong seasonal and annual periodicity. Tritium concentration in rainwater is observed to be highly correlated with TFWT and is transmitted directly to TFWT without delay. The response of environmental tritium radioactivity around the Wolsong site is analyzed using time-series techniques and non-parametric trend analysis. Tritium in the atmosphere and rainwater is strongly auto-correlated through seasonal and annual periodicity. TFWT concentration in pine needles proves to be more sensitive to rainfall than to other weather variables. Non-parametric trend analysis of the TFWT concentration in pine needles shows an increasing slope at the 95% confidence level. This study demonstrates the usefulness of time-series and trend analysis for interpreting the relationship of environmental radioactivity with various environmental media.
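
    The non-parametric trend analysis mentioned above is commonly performed with the Mann-Kendall test; a sketch follows, using a synthetic series rather than the Wolsong tritium data.

```python
# Mann-Kendall trend test (no-ties variance formula), applied to synthetic data.
import numpy as np
from scipy import stats

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    return z, p

rng = np.random.default_rng(1)
series = 0.05 * np.arange(120) + rng.normal(0, 1, 120)  # weak upward trend
z, p = mann_kendall(series)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")            # small p indicates a trend
```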

  17. Statistical Analysis of the Polarimetric Cloud Analysis and Seeding Test (POLCAST) Field Projects

    Science.gov (United States)

    Ekness, Jamie Lynn

    The North Dakota farming industry brings in more than $4.1 billion annually in cash receipts. Unfortunately, agriculture sales vary significantly from year to year, which is due in large part to weather events such as hail storms and droughts. One method to mitigate drought is to use hygroscopic seeding to increase the precipitation efficiency of clouds. The North Dakota Atmospheric Research Board (NDARB) sponsored the Polarimetric Cloud Analysis and Seeding Test (POLCAST) research project to determine the effectiveness of hygroscopic seeding in North Dakota. The POLCAST field projects obtained airborne and radar observations, while conducting randomized cloud seeding. The Thunderstorm Identification Tracking and Nowcasting (TITAN) program is used to analyze radar data (33 usable cases) in determining differences in the duration of the storm, rain rate and total rain amount between seeded and non-seeded clouds. The single ratio of seeded to non-seeded cases is 1.56 (0.28 mm/0.18 mm) or 56% increase for the average hourly rainfall during the first 60 minutes after target selection. A seeding effect is indicated with the lifetime of the storms increasing by 41 % between seeded and non-seeded clouds for the first 60 minutes past seeding decision. A double ratio statistic, a comparison of radar derived rain amount of the last 40 minutes of a case (seed/non-seed), compared to the first 20 minutes (seed/non-seed), is used to account for the natural variability of the cloud system and gives a double ratio of 1.85. The Mann-Whitney test on the double ratio of seeded to non-seeded cases (33 cases) gives a significance (p-value) of 0.063. Bootstrapping analysis of the POLCAST set indicates that 50 cases would provide statistically significant results based on the Mann-Whitney test of the double ratio. All the statistical analysis conducted on the POLCAST data set show that hygroscopic seeding in North Dakota does increase precipitation. While an additional POLCAST field
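
    A sketch of the single-ratio, double-ratio and Mann-Whitney steps described above, with invented gamma-distributed rain amounts standing in for the POLCAST cases:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
seed_first20, seed_last40 = rng.gamma(2, 0.10, 17), rng.gamma(2, 0.16, 17)
ctrl_first20, ctrl_last40 = rng.gamma(2, 0.10, 16), rng.gamma(2, 0.11, 16)

single_ratio = seed_last40.mean() / ctrl_last40.mean()
# Double ratio: late-period seed/no-seed normalised by the early period,
# which absorbs natural case-to-case variability.
double_ratio = (seed_last40.mean() / ctrl_last40.mean()) / \
               (seed_first20.mean() / ctrl_first20.mean())
# Per-case growth ratios feed the Mann-Whitney comparison.
u, p = stats.mannwhitneyu(seed_last40 / seed_first20,
                          ctrl_last40 / ctrl_first20, alternative="greater")
print(f"single = {single_ratio:.2f}, double = {double_ratio:.2f}, p = {p:.3f}")
```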

  18. Whole-lesion apparent diffusion coefficient histogram analysis: significance in T and N staging of gastric cancers.

    Science.gov (United States)

    Liu, Song; Zhang, Yujuan; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang

    2017-10-02

    Whole-lesion apparent diffusion coefficient (ADC) histogram analysis has been introduced and proved effective in the assessment of multiple tumors. However, the application of whole-volume ADC histogram analysis to gastrointestinal tumors has only just started and has never been reported for T and N staging of gastric cancers. Eighty patients with pathologically confirmed gastric carcinomas prospectively underwent diffusion-weighted (DW) magnetic resonance imaging before surgery. Whole-lesion ADC histogram analysis was performed by two radiologists independently. The differences in ADC histogram parameters among different T and N stages were compared with the independent-samples Kruskal-Wallis test. Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of ADC histogram parameters in differentiating particular T or N stages of gastric cancers. There were significant differences in all the ADC histogram parameters for gastric cancers at different T (except ADCmin and ADCmax) and N (except ADCmax) stages. Most ADC histogram parameters differed significantly between T1 vs T3, T1 vs T4, T2 vs T4, N0 vs N1, and N0 vs N3, and some parameters (ADC5%, ADC10%, ADCmin) differed significantly between N0 vs N2 and N2 vs N3 (all P < 0.05). Whole-lesion ADC histogram parameters hold great potential for differentiating the T and N stages of gastric cancers preoperatively.
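
    A sketch of the two ingredients named above, whole-lesion histogram parameters and the independent-samples Kruskal-Wallis comparison, using simulated ADC values rather than patient data:

```python
import numpy as np
from scipy import stats

def adc_histogram_features(adc_voxels):
    """Whole-lesion histogram parameters from a segmented lesion's ADC values."""
    q = np.percentile(adc_voxels, [5, 10, 50, 90])
    return {"ADCmin": adc_voxels.min(), "ADC5%": q[0], "ADC10%": q[1],
            "ADCmedian": q[2], "ADC90%": q[3], "ADCmax": adc_voxels.max()}

rng = np.random.default_rng(3)
print(adc_histogram_features(rng.normal(1.2, 0.25, 5000)))

# One ADC10% value per patient, grouped by T stage (units: 1e-3 mm^2/s).
adc10_t2 = rng.normal(1.30, 0.15, 25)
adc10_t3 = rng.normal(1.15, 0.15, 30)
adc10_t4 = rng.normal(1.00, 0.15, 25)
h, p = stats.kruskal(adc10_t2, adc10_t3, adc10_t4)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.4g}")
```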

  19. Screening and staging for non-small cell lung cancer by serum laser Raman spectroscopy.

    Science.gov (United States)

    Wang, Hong; Zhang, Shaohong; Wan, Limei; Sun, Hong; Tan, Jie; Su, Qiucheng

    2018-08-05

    Lung cancer is the leading cause of cancer-related death worldwide. Current clinical screening methods to detect lung cancer are expensive and associated with many complications. Raman spectroscopy is a spectroscopic technique that offers a convenient way to gain molecular information about biological samples. In this study, we measured the serum Raman spectral intensity of healthy volunteers and patients with different stages of non-small cell lung cancer (NSCLC). The purpose of this study was to evaluate serum laser Raman spectroscopy as a low-cost alternative method for the screening and staging of NSCLC. The Raman spectra of sera from peripheral venous blood were measured with a LabRAM HR 800 confocal micro Raman spectrometer for individuals from five groups: 14 healthy volunteers (control group), 23 patients with stage I NSCLC (stage I group), 24 patients with stage II NSCLC (stage II group), 19 patients with stage III NSCLC (stage III group), and 11 patients with stage IV NSCLC (stage IV group). Each serum sample was measured 3 times at different spots and the average spectrum represented the Raman signal in each case. The Raman spectral data of the five groups were statistically analyzed by analysis of variance (ANOVA), principal component analysis (PCA), linear discriminant analysis (LDA), and cross-validation. Raman spectral intensity decreased sequentially in serum samples from the control group, stage I group, stage II group and stage III/IV group. The strongest peak intensity was observed in the control group, and the weakest in the stage III/IV group, at the bands of 848 cm⁻¹, 999 cm⁻¹, 1152 cm⁻¹, 1446 cm⁻¹ and 1658 cm⁻¹ (P < 0.05). Serum laser Raman spectroscopy can effectively identify patients with stage I, stage II or stage III/IV non-small cell lung cancer using patient serum samples. Copyright © 2018 Elsevier B.V. All rights reserved.
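
    The PCA/LDA/cross-validation chain maps naturally onto a small scikit-learn pipeline. The sketch below uses a random matrix in place of real spectra (so the accuracy will be near chance), with the group sizes taken from the abstract:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(91, 1024))                # 91 averaged spectra x 1024 wavenumbers
y = np.repeat([0, 1, 2, 3], [14, 23, 24, 30])  # control, I, II, III/IV
pipe = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(pipe, X, y, cv=5)        # 5-fold cross-validated accuracy
print(f"mean accuracy = {acc.mean():.2f}")
```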

  20. Statistical Power in Plant Pathology Research.

    Science.gov (United States)

    Gent, David H; Esker, Paul D; Kriss, Alissa B

    2018-01-01

    In null hypothesis testing, failure to reject a null hypothesis may have two potential interpretations. One interpretation is that the treatments being evaluated do not have a significant effect, and a correct conclusion was reached in the analysis. Alternatively, a treatment effect may have existed but the conclusion of the study was that there was none. This is termed a Type II error, which is most likely to occur when studies lack sufficient statistical power to detect a treatment effect. In basic terms, the power of a study is the ability to identify a true effect through a statistical test. The power of a statistical test is 1 - (the probability of Type II errors), and depends on the size of treatment effect (termed the effect size), variance, sample size, and significance criterion (the probability of a Type I error, α). Low statistical power is prevalent in scientific literature in general, including plant pathology. However, power is rarely reported, creating uncertainty in the interpretation of nonsignificant results and potentially underestimating small, yet biologically significant relationships. The appropriate level of power for a study depends on the impact of Type I versus Type II errors and no single level of power is acceptable for all purposes. Nonetheless, by convention 0.8 is often considered an acceptable threshold and studies with power less than 0.5 generally should not be conducted if the results are to be conclusive. The emphasis on power analysis should be in the planning stages of an experiment. Commonly employed strategies to increase power include increasing sample sizes, selecting a less stringent threshold probability for Type I errors, increasing the hypothesized or detectable effect size, including as few treatment groups as possible, reducing measurement variability, and including relevant covariates in analyses. Power analysis will lead to more efficient use of resources and more precisely structured hypotheses, and may even
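
    The quantities defined above (effect size, α, sample size, and the conventional 0.8 threshold) can be explored with a standard power solver; below is a sketch using the two-sample t-test calculator in statsmodels, with a medium effect size assumed purely for illustration.

```python
from statsmodels.stats.power import TTestIndPower

solver = TTestIndPower()
# Power achieved with n = 20 per group for a medium effect (d = 0.5, alpha = 0.05):
power = solver.power(effect_size=0.5, nobs1=20, alpha=0.05, ratio=1.0)
# Sample size per group needed to reach the conventional 0.8 threshold:
n_needed = solver.solve_power(effect_size=0.5, power=0.8, alpha=0.05, ratio=1.0)
print(f"power at n=20: {power:.2f}; n per group for 0.8 power: {n_needed:.0f}")
```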

  1. Comparative transcriptome analysis of two races of Heterodera glycines at different developmental stages.

    Directory of Open Access Journals (Sweden)

    Gaofeng Wang

    Full Text Available The soybean cyst nematode, Heterodera glycines, is an important pest of soybeans. Although resistance to this nematode is available, selection for virulent races can occur, allowing the nematode to overcome the resistance of cultivars. Field populations are abundant, yet little is known about their genetic diversity. In order to elucidate the differences between races, we investigated the transcriptional diversity within race 3 and race 4 inbred lines during their compatible interactions with the soybean host Zhonghuang 13. Six different race-enriched cDNA libraries were constructed with limited nematode samples collected from the three sedentary stages: parasitic J2, J3 and J4 female, respectively. Among 689 putative race-enriched genes isolated from the six libraries with functional annotations, 92 were validated by quantitative RT-PCR (qRT-PCR), including eight putative effector-encoding genes. Further race-enriched genes were validated within race 3 and race 4 during development in soybean roots. Gene Ontology (GO) analysis of all the race-enriched genes at the J3 and J4 female stages showed that most of them function in metabolic processes. Relative transcript level analysis of 13 selected race-enriched genes at four developmental stages showed that differences in their expression abundance occurred at one or more developmental stages. This is the first investigation of the transcript diversity of H. glycines races throughout their sedentary stages, increasing the understanding of the genetic diversity of H. glycines.

  2. Peculiarities of Teaching Medical Informatics and Statistics

    Directory of Open Access Journals (Sweden)

    Sergey Glushkov

    2017-05-01

    Full Text Available The article reviews features of teaching Medical Informatics and Statistics. The course belongs to the disciplines of the Mathematical and Natural Sciences and is provided in all the faculties of I. M. Sechenov First Moscow State Medical University. For students of the Preventive Medicine Department, the time frame allotted to the course is significantly larger than for similar courses at other faculties. To improve the teaching methodology of the discipline, an analysis of the curriculum has been carried out, and attendance and student performance statistics have been summarized. As a result, the main goals and objectives have been identified. Besides, general educational functions and the contribution to the solution of problems of education, students' upbringing and development have been revealed, and two stages of teaching have been presented. Recommendations concerning the newest methodological developments aimed at improving the quality of teaching the discipline are provided. The ways of improving the methods and organizational forms of education are outlined.

  3. Statistical Analysis of Designed Experiments Theory and Applications

    CERN Document Server

    Tamhane, Ajit C

    2012-01-01

    An indispensable guide to understanding and designing modern experiments. The tools and techniques of Design of Experiments (DOE) allow researchers to successfully collect, analyze, and interpret data across a wide array of disciplines. Statistical Analysis of Designed Experiments provides a modern and balanced treatment of DOE methodology with thorough coverage of the underlying theory and standard designs of experiments, guiding the reader through applications to research in various fields such as engineering, medicine, business, and the social sciences. The book supplies a foundation for the

  4. Multivariate statistical analysis of major and trace element data for ...

    African Journals Online (AJOL)

    Multivariate statistical analysis of major and trace element data for niobium exploration in the peralkaline granites of the anorogenic ring-complex province of Nigeria. PO Ogunleye, EC Ike, I Garba. Abstract: No abstract available. Journal of Mining and Geology Vol. 40(2) 2004: 107-117.

  5. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets of brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. For the Love of Statistics: Appreciating and Learning to Apply Experimental Analysis and Statistics through Computer Programming Activities

    Science.gov (United States)

    Mascaró, Maite; Sacristán, Ana Isabel; Rufino, Marta M.

    2016-01-01

    For the past 4 years, we have been involved in a project that aims to enhance the teaching and learning of experimental analysis and statistics for environmental and biological sciences students through computational programming activities (using R code). In this project, through an iterative design, we have developed sequences of R-code-based…

  7. Prognostic and survival analysis of 837 Chinese colorectal cancer patients.

    Science.gov (United States)

    Yuan, Ying; Li, Mo-Dan; Hu, Han-Guang; Dong, Cai-Xia; Chen, Jia-Qi; Li, Xiao-Fen; Li, Jing-Jing; Shen, Hong

    2013-05-07

    To develop a prognostic model to predict survival of patients with colorectal cancer (CRC), survival data of 837 CRC patients undergoing surgery between 1996 and 2006 were collected and analyzed by univariate analysis and the Cox proportional hazard regression model to reveal the prognostic factors for CRC. All data were recorded using a standard data form and analyzed using SPSS version 18.0 (SPSS, Chicago, IL, United States). Survival curves were calculated by the Kaplan-Meier method. The log rank test was used to assess differences in survival. Univariate hazard ratios and significant, independent predictors of disease-specific survival were identified by Cox proportional hazard analysis. The stepwise procedure was set to a threshold of 0.05. Statistical significance was defined as P < 0.05. Univariate analysis suggested that age, preoperative obstruction, serum carcinoembryonic antigen level at diagnosis, status of resection, tumor size, histological grade, pathological type, lymphovascular invasion, invasion of adjacent organs, and tumor node metastasis (TNM) staging were positive prognostic factors (P < 0.05). Analysis by lymph node ratio (LNR) showed a significant statistical difference in 3-year survival among these groups: LNR1, 73%; LNR2, 55%; and LNR3, 42% (P < 0.05). Multivariate analysis showed that histological grade, depth of bowel wall invasion, and number of metastatic lymph nodes were the most important prognostic factors for CRC if we did not consider the interaction of the TNM staging system (P < 0.05). When TNM staging was taken into account, histological grade lost its statistical significance, while the specific TNM staging system showed a statistically significant difference (P < 0.0001). The overall survival of CRC patients improved between 1996 and 2006. LNR is a powerful factor for estimating the survival of stage III CRC patients.
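
    A sketch of the same survival workflow (Kaplan-Meier curves, log-rank test, Cox proportional hazards) using the lifelines library in place of SPSS; the ten-patient data frame and its LNR grouping are invented:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":   [12, 30, 44, 8, 60, 25, 50, 15, 36, 48],
    "died":     [1, 1, 0, 0, 0, 1, 1, 1, 1, 0],
    "lnr_high": [1, 0, 0, 1, 1, 1, 0, 1, 0, 0],   # hypothetical lymph node ratio group
})
kmf = KaplanMeierFitter().fit(df["months"], df["died"])      # overall survival curve
hi, lo = df[df.lnr_high == 1], df[df.lnr_high == 0]
res = logrank_test(hi["months"], lo["months"], hi["died"], lo["died"])
cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
print(f"log-rank p = {res.p_value:.3f}")
print(cph.hazard_ratios_)                                    # HR for lnr_high
```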

  8. Statistical analysis of angular correlation measurements

    International Nuclear Information System (INIS)

    Oliveira, R.A.A.M. de.

    1986-01-01

    Obtaining the multipole mixing ratio, δ, of γ transitions in angular correlation measurements is a statistical problem characterized by the small number of angles at which the observation is made and by the limited counting statistics, α. The nonexistence of a sufficient statistic for the estimator of δ is shown. Three different estimators for δ were constructed and their properties of consistency, bias and efficiency were tested. Tests were also performed on experimental results obtained in γ-γ directional correlation measurements. (Author) [pt

  9. Mathematical and statistical analysis of the effect of boron on yield parameters of wheat

    Energy Technology Data Exchange (ETDEWEB)

    Rawashdeh, Hamzeh [Water Management and Environment Research Department, National Center for Agricultural Research and Extension, P.O. Box 639, Baqa 19381 (Jordan); Sala, Florin [Soil Science and Plant Nutrition, Faculty of Agriculture, Banat University of Agricultural Sciences and Veterinary Medicine “Regele Mihai I al României” from Timişoara, Timişoara, 300645 (Romania); Boldea, Marius [Mathematics and Statistics, Faculty of Agriculture, Banat University of Agricultural Sciences and Veterinary Medicine “Regele Mihai I al României” from Timisoara, Timişoara, 300645 (Romania)

    2015-03-10

    The main objective of this research is to investigate the effect of foliar applications of boron at different growth stages on the yield and yield parameters of wheat. The contribution of boron to the yield parameters is described by second-degree polynomial equations with high statistical confidence (p < 0.01; F theoretical < F calculated, according to the ANOVA test, for alpha = 0.05). Regression analysis, based on the R² values obtained, made it possible to evaluate the particular contribution of boron to the realization of each yield parameter. This was lower for spike length (R² = 0.812) and thousand-seed weight (R² = 0.850) and higher for the number of spikelets (R² = 0.936) and the number of seeds per spike (R² = 0.960). These results confirm that boron plays an important part in determining the number of seeds per spike in wheat, as the contribution of this element to the process of flower fertilization is well known. Regarding productivity elements, the contribution of macroelements to yield quantity is clear, the contribution of B alone being R² = 0.868.
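
    Fitting a second-degree polynomial and computing R² for one yield parameter can be sketched as follows; the boron doses and responses are invented, not the study's measurements:

```python
import numpy as np

boron = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])      # hypothetical doses
seeds_per_spike = np.array([30.1, 33.8, 36.5, 38.0, 38.4, 37.9])
coeffs = np.polyfit(boron, seeds_per_spike, deg=2)     # a*x^2 + b*x + c
fitted = np.polyval(coeffs, boron)
ss_res = np.sum((seeds_per_spike - fitted) ** 2)
ss_tot = np.sum((seeds_per_spike - seeds_per_spike.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                               # coefficient of determination
print(f"coefficients = {coeffs}, R^2 = {r2:.3f}")
```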

  10. Statistical mechanical analysis of the linear vector channel in digital communication

    International Nuclear Information System (INIS)

    Takeda, Koujin; Hatabu, Atsushi; Kabashima, Yoshiyuki

    2007-01-01

    A statistical mechanical framework to analyze linear vector channel models in digital wireless communication is proposed for a large system. The framework is a generalization of that proposed for code-division multiple-access systems in Takeda et al (2006 Europhys. Lett. 76 1193) and enables the analysis of the system in which the elements of the channel transfer matrix are statistically correlated with each other. The significance of the proposed scheme is demonstrated by assessing the performance of an existing model of multi-input multi-output communication systems

  11. Monte Carlo based statistical power analysis for mediation models: methods and software.

    Science.gov (United States)

    Zhang, Zhiyong

    2014-12-01

    The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
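
    A hedged Python analogue of the proposed procedure (the authors' bmem package is in R): estimate power as the proportion of simulated datasets in which a percentile-bootstrap confidence interval for the indirect effect a·b excludes zero. The path coefficients, sample size and replication counts are illustrative; real applications would use far more replications.

```python
import numpy as np

rng = np.random.default_rng(5)

def indirect_effect(x, m, y):
    """Product-of-paths a*b for the simple mediation model X -> M -> Y."""
    a = np.polyfit(x, m, 1)[0]                        # slope of M on X
    design = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(design, y, rcond=None)[0][1]  # slope of Y on M, given X
    return a * b

def mediation_power(n=100, a=0.3, b=0.3, n_rep=100, n_boot=200, alpha=0.05):
    hits = 0
    for _ in range(n_rep):                            # Monte Carlo over datasets
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        boots = np.empty(n_boot)
        for i in range(n_boot):                       # percentile bootstrap CI
            idx = rng.integers(0, n, n)
            boots[i] = indirect_effect(x[idx], m[idx], y[idx])
        lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        hits += (lo > 0) or (hi < 0)                  # CI excluding zero = detection
    return hits / n_rep

print(mediation_power())   # proportion of replications detecting the effect
```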

  12. A statistical analysis of the impact of advertising signs on road safety.

    Science.gov (United States)

    Yannis, George; Papadimitriou, Eleonora; Papantoniou, Panagiotis; Voulgari, Chrisoula

    2013-01-01

    This research aims to investigate the impact of advertising signs on road safety. An exhaustive review of the international literature was carried out on the effect of advertising signs on driver behaviour and safety. Moreover, a before-and-after statistical analysis with control groups was applied to several road sites with different characteristics in the Athens metropolitan area, in Greece, in order to investigate the correlation between the placement or removal of advertising signs and the related occurrence of road accidents. Road accident data for the 'before' and 'after' periods at the test sites and the control sites were extracted from the database of the Hellenic Statistical Authority, and the selected 'before' and 'after' periods vary from 2.5 to 6 years. The statistical analysis shows no statistical correlation between road accidents and advertising signs in any of the nine sites examined, as the confidence intervals of the estimated safety effects are non-significant at the 95% confidence level. This can be explained by the fact that, at the examined road sites, drivers are overloaded with information (traffic signs, direction signs, shop signs, pedestrians and other vehicles, etc.) so that the additional information load from advertising signs may not further distract them.
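
    A sketch of a before-and-after estimate with a control group: the safety effect is the after/before accident ratio at treated sites normalised by the same ratio at control sites, with an approximate confidence interval assuming Poisson counts. The counts are invented and the estimator is a textbook form, not necessarily the authors' exact procedure.

```python
import numpy as np

treated_before, treated_after = 34, 29
control_before, control_after = 51, 47
theta = (treated_after / treated_before) / (control_after / control_before)
# Approximate 95% CI via the log-scale normal approximation for Poisson counts.
se_log = np.sqrt(1/treated_before + 1/treated_after +
                 1/control_before + 1/control_after)
lo, hi = theta * np.exp(-1.96 * se_log), theta * np.exp(1.96 * se_log)
print(f"effect = {theta:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # CI spanning 1 = non-significant
```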

  13. Comparative Proteomic Analysis of Hymenolepis diminuta Cysticercoid and Adult Stages

    Directory of Open Access Journals (Sweden)

    Anna Sulima

    2018-01-01

    Full Text Available Cestodiases are common parasitic diseases of animals and humans. As cestodes have complex lifecycles, hexacanth larvae, metacestodes (including cysticercoids), and adults produce proteins allowing them to establish invasion and to survive in the hostile environment of the host. Hymenolepis diminuta is the most commonly used model cestode in experimental parasitology. The aims of the present study were to perform a comparative proteomic analysis of two consecutive developmental stages of H. diminuta (cysticercoid and adult) and to distinguish proteins which might be characteristic for each of the stages from those shared by both stages. Somatic proteins of H. diminuta were isolated from 6-week-old cysticercoids and adult tapeworms. Cysticercoids were obtained from experimentally infected beetles, Tenebrio molitor, whereas adult worms were collected from experimentally infected rats. Proteins were separated by GeLC-MS/MS (one-dimensional gel electrophoresis coupled with liquid chromatography and tandem mass spectrometry). Additionally, protein samples were digested in-liquid and identified by LC-MS/MS. The identified proteins were classified according to molecular function, cellular component and biological process. Our study showed a number of differences and similarities in the protein profiles of cysticercoids and adults; 233 cysticercoid and 182 adult proteins were identified. Of these proteins, 131 were present only in the cysticercoid and 80 only in the adult stage samples. Both developmental stages shared 102 proteins, among which six represented immunomodulators and one is a potential drug target. In-liquid digestion and LC-MS/MS complemented and confirmed some of the GeLC-MS/MS identifications. Possible roles and functions of the proteins identified with both proteomic approaches are discussed.

  14. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    Science.gov (United States)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology, especially to those aspects with great impact on public policy, statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined, a governing structure was organized, and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting, with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  15. Species association in tropical montane rain forest at two successional stages in Diaoluo Mountain, Hainan

    Institute of Scientific and Technical Information of China (English)

    Fude LIU; Wenjin WANG; Ming ZHANG; Jianwei ZHENG; Zhongsheng WANG; Shiting ZHANG; Wenjie YANG; Shuqing AN

    2008-01-01

    Species association is one of the basic concepts in community succession. There are different viewpoints on how species interactions change as succession progresses. In order to assess these relationships, we examined species associations in the tropical montane rain forest at early and late successional stages in Diaoluo Mountain, Hainan Island. Based on data from 2 × 2 contingency tables of species presence or absence, statistical methods including analysis of species association and χ² tests were applied. The results show that: 1) an overall positive association was present among tree species in the communities during the two successional stages and was statistically significant at the late stage. The number of species pairs with positive and negative associations decreased throughout the process of succession, while the number with null associations greatly increased. The same trend existed among the dominant and companion species. The results indicate that the communities are developing towards a stable stage where the woody species coexist in harmony. 2) Between the early-established and later-invading species, none of the positive associations were significant. Compared with positive and null associations, fewer negative associations were found. This implies that these species are inclined to coexist independently through partitioning of resources. 3) Among the later-invading species, positive associations were significant and no negative associations were found, which suggests that these species have similar adaptive ability in the habitat and occupy overlapping niches in the community.
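
    The 2 × 2 presence/absence test described above can be sketched directly: build the contingency table for one species pair across sample quadrats and apply a χ² test. The presence vectors below are random stand-ins, not field data.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(6)
sp_a = rng.integers(0, 2, 100)     # presence (1) / absence (0) in 100 quadrats
sp_b = rng.integers(0, 2, 100)
table = np.array([
    [np.sum((sp_a == 1) & (sp_b == 1)), np.sum((sp_a == 1) & (sp_b == 0))],
    [np.sum((sp_a == 0) & (sp_b == 1)), np.sum((sp_a == 0) & (sp_b == 0))],
])
chi2, p, dof, expected = chi2_contingency(table)
# Sign of the association: positive if joint presence exceeds expectation.
positive = table[0, 0] > expected[0, 0]
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, positive association: {positive}")
```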

  16. NPP Siting in Western Part of Java Island Indonesia: Regional Analysis Stage

    International Nuclear Information System (INIS)

    Sastratenaya, A.S.; Yuliastuti

    2011-01-01

    Full text of publication follows: Considering that Banten and West Java Provinces are densely industrialized regions, they require a large amount of electricity. A nuclear power plant is one option to be considered in anticipating future electricity demand. To support the program, potential locations need to be identified through NPP siting. The siting should meet safety requirements with respect to natural external events, human-induced external events, and public and environmental safety. Site selection is performed in several stages, each with specific assessment criteria. Siting commences with a pre-survey activity to obtain several areas of interest; this activity covers a wide area but uses very limited data and applies only general criteria. The activities following the pre-survey are the site survey, consisting of (1) regional analysis, (2) site screening, and (3) comparison and ranking stages. The objective of regional analysis is to obtain potential sites in the study area within a 150 km radius of each area of interest by using both general and specific criteria. The potential sites are then screened to obtain selected candidate sites by using more detailed secondary data as well as survey activities such as geophysical investigation, limited drilling, etc., within a radius of 50 km of each potential site. All the selected candidate sites are then compared and ranked to obtain the preferred candidate site. Site evaluation is the next step, in which all site-specific parameters are evaluated to obtain design basis parameters and to provide the basis for preparing the site permit document. This paper presents the methodology and results of the regional analysis stage. The objective of the activity is to obtain potential sites on the north coast of the West Java and Banten Provinces by considering fourteen study aspects which can be categorized into safety-related aspects, non-safety-related aspects and public education. However, this paper only considers the safety

  17. Thermal properties Forsmark. Modelling stage 2.3 Complementary analysis and verification of the thermal bedrock model, stage 2.

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Wrafter, John; Laendell, Maerta (Geo Innova AB (Sweden)); Back, Paer-Erik; Rosen, Lars (Sweco AB (Sweden))

    2008-11-15

    This report presents the results of thermal modelling work for the Forsmark area carried out during modelling stage 2.3. The work complements the main modelling efforts carried out during modelling stage 2.2. A revised spatial statistical description of the rock mass thermal conductivity for rock domain RFM045 is the main result of this work. Thermal modelling of domain RFM045 in Forsmark model stage 2.2 gave lower-tail percentiles of thermal conductivity that were considered to be conservatively low due to the way amphibolite, the rock type with the lowest thermal conductivity, was modelled. New and previously available borehole data are used as the basis for revised stochastic geological simulations of domain RFM045. By defining two distinct thermal subdomains, these simulations have succeeded in capturing more of the lithological heterogeneity present. The resulting thermal model for rock domain RFM045 is therefore considered to be more realistic and reliable than that presented in model stage 2.2. The main conclusions of the modelling efforts in model stage 2.3 are: - Thermal modelling indicates a mean thermal conductivity for domain RFM045 at the 5 m scale of 3.56 W/(mK). This is slightly higher than the value of 3.49 W/(mK) derived in model stage 2.2. - The variance decreases and the lower-tail percentiles increase as the scale of observation increases from 1 to 5 m. Best estimates of the 0.1 percentile of thermal conductivity for domain RFM045 are 2.24 W/(mK) for the 1 m scale and 2.36 W/(mK) for the 5 m scale. This can be compared with corresponding values for domain RFM029 of 2.30 W/(mK) for the 1 m scale and 2.87 W/(mK) for the 5 m scale. - The reason for the pronounced lower tail in the thermal conductivity distribution for domain RFM045 is the presence of large bodies of the low-conductivity amphibolite. - The modelling results for domain RFM029 presented in model stage 2.2 are still applicable. - As temperature increases, the thermal conductivity decreases

  18. Meta- and statistical analysis of single-case intervention research data: quantitative gifts and a wish list.

    Science.gov (United States)

    Kratochwill, Thomas R; Levin, Joel R

    2014-04-01

    In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  19. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

    This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures, tracing the history of the Universe from the Planck time, t_p = 10⁻⁴³ s, and temperatures corresponding to 10¹⁹ GeV, to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters with different statistical tools, such as correlations, percolation, the void probability function and counts in cells; the same scaling-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters too belong to the fundamental plane of elliptical galaxies, and gives a discussion of its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme 'A galaxy redshift survey in the south galactic pole region', in which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr

  20. Statistical analysis of seismicity and hazard estimation for Italy (mixed approach). Statistical parameters of main shocks and aftershocks in the Italian region

    International Nuclear Information System (INIS)

    Molchan, G.M.; Kronrod, T.L.; Dmitrieva, O.E.

    1995-03-01

    The catalog of earthquakes of Italy (1900-1993) is analyzed in the present work. The following problems have been considered: 1) the choice of the operating magnitude, 2) an analysis of data completeness, and 3) grouping (in time and in space). The catalog has been separated into main shocks and aftershocks. Statistical estimates of the seismicity parameters (a, b) are obtained for the seismogenic zones defined by GNDT. The non-standard elements of the analysis performed are: (a) statistical estimation and comparison of seismicity parameters under the condition of arbitrary data grouping in magnitude, time and space; (b) use of a non-conventional statistical method for aftershock identification, based on the idea of optimizing two kinds of errors in the aftershock identification process; (c) use of the aftershock zones to reveal seismically interrelated seismogenic zones. This procedure contributes to the stability of the estimation of the 'b-value'. Refs, 25 figs, tabs
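
    Estimation of the Gutenberg-Richter b-value, one of the (a, b) seismicity parameters above, is commonly done with Aki's maximum-likelihood formula; the sketch below uses synthetic magnitudes, and the choice of estimator is an assumption, as the paper's exact procedure is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
m_min = 2.5                                   # completeness magnitude
# Magnitude excesses are exponential with rate b*ln(10); here b = 1 by construction.
mags = m_min + rng.exponential(1.0 / (1.0 * np.log(10)), 500)
b = np.log10(np.e) / (mags.mean() - m_min)    # Aki (1965) maximum-likelihood estimator
se = b / np.sqrt(len(mags))                   # approximate standard error
print(f"b = {b:.2f} +/- {se:.2f}")
```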

  1. Mediastinal lymph node dissection versus mediastinal lymph node sampling for early stage non-small cell lung cancer: a systematic review and meta-analysis.

    Science.gov (United States)

    Huang, Xiongfeng; Wang, Jianmin; Chen, Qiao; Jiang, Jielin

    2014-01-01

    This systematic review and meta-analysis aimed to evaluate the overall survival, local recurrence, distant metastasis, and complications of mediastinal lymph node dissection (MLND) versus mediastinal lymph node sampling (MLNS) in stage I-IIIA non-small cell lung cancer (NSCLC) patients. A systematic search of published literature was conducted using the main databases (MEDLINE, PubMed, EMBASE, and Cochrane databases) to identify relevant randomized controlled trials that compared MLND vs. MLNS in NSCLC patients. The methodological quality of the included randomized controlled trials was assessed according to the criteria of the Cochrane Handbook for Systematic Reviews of Interventions (Version 5.1.0). Meta-analysis was performed using The Cochrane Collaboration's Review Manager 5.3. The results of the meta-analysis were expressed as hazard ratios (HR) or risk ratios (RR), with their corresponding 95% confidence intervals (CI). We included results reported from six randomized controlled trials, with a total of 1,791 patients included in the primary meta-analysis. Compared to MLNS in NSCLC patients, there was no statistically significant difference for MLND in overall survival (HR = 0.77, 95% CI 0.55 to 1.08; P = 0.13). In addition, the results indicated that the local recurrence rate (RR = 0.93, 95% CI 0.68 to 1.28; P = 0.67), distant metastasis rate (RR = 0.88, 95% CI 0.74 to 1.04; P = 0.15), and total complications rate (RR = 1.10, 95% CI 0.67 to 1.79; P = 0.72) were similar, with no significant difference found between the two groups. Results for overall survival, local recurrence rate, and distant metastasis rate were similar between MLND and MLNS in early stage NSCLC patients. There was no evidence that MLND increased complications compared with MLNS. Whether or not MLND is superior to MLNS for stage II-IIIA remains to be determined.

  2. Life cycle analysis in preliminary design stages

    OpenAIRE

    Agudelo , Lina-Maria; Mejía-Gutiérrez , Ricardo; Nadeau , Jean-Pierre; PAILHES , Jérôme

    2014-01-01

    In a design process the product is decomposed into systems along disciplinary lines. Each stage has its own goals and constraints that must be satisfied and has control over a subset of the design variables that describe the overall system. When using different tools to initiate a product life cycle, including the environment and impacts, it is noticeable that there is a gap in tools linking the stages of preliminary design and the stages of materialization. Differen...

  3. Analysis and classification of ECG-waves and rhythms using circular statistics and vector strength

    Directory of Open Access Journals (Sweden)

    Janßen Jan-Dirk

    2017-09-01

    Full Text Available The most common way to analyse heart rhythm is to calculate the RR interval and the heart rate variability. For further evaluation, descriptive statistics are often used. Here we introduce a new and more natural heart rhythm analysis tool that is based on circular statistics and vector strength. Vector strength is a measure of the periodicity or lack of periodicity of a signal. We divide the signal into non-overlapping window segments and project the detected R-waves around the unit circle using the complex exponential function and the median RR interval. In addition, we calculate the vector strength and apply circular statistics as well as an angular histogram to the R-wave vectors. This approach enables an intuitive visualization and analysis of rhythmicity. Our results show that ECG waves and rhythms can be easily visualized, analysed and classified by circular statistics and vector strength.
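
    The projection-and-averaging recipe in the abstract translates almost line for line into code: map each R-wave time to a unit phasor with period equal to the median RR interval, then average. The single-window sketch below uses synthetic beat times; the paper additionally divides the signal into windows.

```python
import numpy as np

rng = np.random.default_rng(8)
r_times = np.cumsum(0.8 + 0.02 * rng.normal(size=200))   # nearly periodic beats (s)
period = np.median(np.diff(r_times))                     # median RR interval
phasors = np.exp(2j * np.pi * r_times / period)          # points on the unit circle
vector_strength = np.abs(phasors.mean())  # 1 = perfectly periodic, ~0 = no rhythmicity
mean_phase = np.angle(phasors.mean())     # circular mean of the R-wave phases
print(f"vector strength = {vector_strength:.3f}, mean phase = {mean_phase:.2f} rad")
```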

  4. Consolidity analysis for fully fuzzy functions, matrices, probability and statistics

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2015-03-01

    Full Text Available The paper presents a comprehensive review of the know-how for developing the systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory included its extension to handling new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions and partial fractions of fuzzy polynomials. On the other hand, the handling of fuzzy matrices covered determinants of fuzzy matrices, the eigenvalues of fuzzy matrices, and solving least-squares fuzzy linear equations. The approach was demonstrated to be applicable in a systematic way to new fuzzy probabilistic and statistical problems. This included extending conventional probabilistic and statistical analysis to handle fuzzy random data. Applications also covered the consolidity of fuzzy optimization problems. The various numerical examples solved demonstrate that the new consolidity concept is highly effective in solving in a compact form the propagation of fuzziness in linear, nonlinear, multivariable and dynamic problems with different types of complexities. Finally, it is demonstrated that the suggested fuzzy mathematics can easily be embedded within normal mathematics by building a special fuzzy function library inside the computational Matlab Toolbox or using other similar software languages.

  5. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data

    Directory of Open Access Journals (Sweden)

    Maria Vinaixa

    2012-10-01

    Full Text Available Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to those of the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper reports a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel to all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of honouring or violating the mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as the determination of sample size, analytical variation, the assumptions of normality and homoscedasticity, and correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples.
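
    Feature-wise univariate testing with multiple-testing correction, as in the workflow above, can be sketched as follows. The intensity matrix is simulated, and Mann-Whitney with Benjamini-Hochberg FDR control is one reasonable choice among the tests and corrections the paper discusses, not necessarily the authors' exact pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
ctrl = rng.lognormal(0, 1, size=(20, 800))    # 20 samples x 800 features
case = rng.lognormal(0, 1, size=(20, 800))
case[:, :40] *= 2.0                           # 40 truly altered features
p = np.array([stats.mannwhitneyu(ctrl[:, j], case[:, j]).pvalue
              for j in range(800)])
# Benjamini-Hochberg FDR control at q = 0.05.
order = np.argsort(p)
thresh = 0.05 * np.arange(1, 801) / 800
passed = p[order] <= thresh
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant = order[:k]                       # indices of retained features
print(f"{len(significant)} features pass FDR control")
```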

  6. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, an analysis of the statistical variations in subthreshold MOSFET high-frequency characteristics, defined in terms of gate capacitance and transition frequency, is presented, and comprehensive analytical models of such variations in terms of their variances are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET manufacturing process, have been taken into account in the proposed analysis and modeling. The most up-to-date comprehensive analytical model of statistical variation in MOSFET parameters has been used as the basis of the analysis and modeling. The resulting models are both analytic and comprehensive, as they are precise mathematical expressions in terms of the physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level by using 65 nm level BSIM4-based benchmarks and have been found to be very accurate, with average errors smaller than 5%. Hence, the analysis yields models that are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET-based VHF circuits, systems and applications.

  7. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  8. STATISTICAL ANALYSIS OF SPORT MOVEMENT OBSERVATIONS: THE CASE OF ORIENTEERING

    Directory of Open Access Journals (Sweden)

    K. Amouzandeh

    2017-09-01

    Full Text Available The study of movement observations is becoming more popular in several applications. In particular, analyzing sport movement time series has been considered a demanding area. However, most of the attempts made at analyzing sport movement data have focused on spatial aspects of movement to extract movement characteristics, such as spatial patterns and similarities. This paper proposes statistical analysis of sport movement observations, which refers to analyzing changes in the spatial movement attributes (e.g. distance, altitude and slope) and the non-spatial movement attributes (e.g. speed and heart rate) of athletes. As a case study, an example dataset of movement observations acquired during the sport of orienteering is presented and statistically analyzed.

  9. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of the strength of ceramic and glass materials, a statistical approach is necessary. The strength of ceramics and glass depends on the size distribution of flaws in these materials. The distribution of strength for a ductile material is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramics and glass follows a Weibull distribution. The Weibull distribution is an indicator of the variability of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative failure probability of material strength, the cumulative probability of failure versus fracture stress, and the cumulative reliability of the material were calculated. Statistical criteria supporting the strength analysis of silicon nitride were calculated using MATLAB. (author)
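
    A sketch of the Weibull treatment described above: fit a two-parameter Weibull to fracture-stress data (location fixed at zero) and evaluate the failure and reliability probabilities at an applied stress. The strengths are simulated, not the silicon nitride data, and scipy stands in for MATLAB.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
# Simulated fracture stresses (MPa) with Weibull modulus 8 and scale 300 MPa.
strengths = stats.weibull_min.rvs(c=8.0, scale=300.0, size=60, random_state=rng)
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)   # two-parameter fit
sigma = 250.0                                                  # applied stress
p_fail = stats.weibull_min.cdf(sigma, shape, 0, scale)         # cumulative failure prob.
reliability = 1.0 - p_fail
print(f"m = {shape:.1f}, sigma0 = {scale:.0f} MPa, "
      f"P_fail({sigma:.0f} MPa) = {p_fail:.3f}, R = {reliability:.3f}")
```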

  10. Pediatric differentiated thyroid carcinoma in stage I: risk factor analysis for disease free survival

    International Nuclear Information System (INIS)

    Wada, Nobuyuki; Rino, Yasushi; Masuda, Munetaka; Ito, Koichi; Sugino, Kiminori; Mimura, Takashi; Nagahama, Mitsuji; Kitagawa, Wataru; Shibuya, Hiroshi; Ohkuwa, Keiko; Nakayama, Hirotaka; Hirakawa, Shohei

    2009-01-01

    To examine the outcomes and risk factors in pediatric differentiated thyroid carcinoma (DTC) patients defined as TNM stage I, because some of these patients develop disease recurrence and the treatment strategy for stage I pediatric patients is still controversial, we reviewed 57 consecutive TNM stage I patients (15 years or less) with DTC (46 papillary and 11 follicular) who underwent initial treatment at Ito Hospital between 1962 and 2004 (7 males and 50 females; mean age: 13.1 years; mean follow-up: 17.4 years). Clinicopathological results were evaluated in all patients. Multivariate analysis was performed to reveal the risk factors for disease-free survival (DFS) in these 57 patients. Extrathyroid extension and clinical lymphadenopathy at diagnosis were found in 7 and 12 patients, respectively. Subtotal/total thyroidectomy was performed in 23 patients, modified neck dissection in 38, and radioactive iodine therapy in 10. Pathological node metastasis was confirmed in 37 patients (64.9%). Fifteen patients (26.3%) exhibited local recurrence and 3 of them also developed metachronous lung metastasis. Ten of these 15 became disease-free after further treatment, and no patient died of the disease. In multivariate analysis, male gender (p = 0.017), advanced tumor (T3, 4a) stage (p = 0.029), and clinical lymphadenopathy (p = 0.006) were risk factors for DFS in stage I pediatric patients. Male gender, tumor stage, and lymphadenopathy are risk factors for DFS in stage I pediatric DTC patients. Aggressive treatment (total thyroidectomy, node dissection, and RI therapy) is considered appropriate for patients with risk factors, whereas a conservative or stepwise approach may be acceptable for other patients

  11. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  12. Prediction of noise in ships by the application of “statistical energy analysis.”

    DEFF Research Database (Denmark)

    Jensen, John Ødegaard

    1979-01-01

    If the noise level in the accommodation on board ships is to be reduced effectively by introducing appropriate noise abatement measures already at an early design stage, it is quite essential that sufficiently accurate prediction methods are available to the naval architects...... or for a special noise abatement measure, e.g., increased structural damping. The paper discusses whether it might be possible to derive an alternative calculation model based on the “statistical energy analysis” (SEA) approach. By considering the hull of a ship to be constructed from plate elements connected...

  13. The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach

    Science.gov (United States)

    Sari, S. Y.; Afrizon, R.

    2018-04-01

    Lectures in statistical physics show that: 1) the performance of lecturers, the social climate, students’ competence and the soft skills needed at work are in the ‘sufficient’ category; 2) students find statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need further support in the form of repetition, practice questions and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials in accordance with the Indonesian National Qualification Framework, or Kerangka Kualifikasi Nasional Indonesia (KKNI), combined with an appropriate learning approach, are needed to help lecturers and students. The authors have designed statistical physics handouts that meet the ‘very valid’ criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts must also be considered so that they are easy to use, interesting and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on KKNI and a constructivist approach. This research is part of a research and development effort following the 4-D model developed by Thiagarajan, and has reached the development testing phase of the Develop stage. Data were collected using a questionnaire distributed to lecturers and students, and analyzed descriptively in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout meets the ‘very practical’ criterion. The conclusion of this study is that statistical physics handouts based on KKNI and a constructivist approach are practical for use in lectures.

  14. PROSA: A computer program for statistical analysis of near-real-time-accountancy (NRTA) data

    International Nuclear Information System (INIS)

    Beedgen, R.; Bicking, U.

    1987-04-01

    The computer program PROSA (Program for Statistical Analysis of NRTA Data) is a tool to decide, on the basis of statistical considerations, whether a loss of material might have occurred in a given sequence of materials balance periods. The evaluation of the material balance data is based on statistical test procedures. In PROSA, three truncated sequential tests are applied to a sequence of material balances. The manual describes the statistical background of PROSA and how to use the computer program on an IBM-PC with DOS 3.1. (orig.) [de]
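
    The abstract does not give the exact form of PROSA's three truncated sequential tests, so the sketch below is only a generic illustration of sequential testing on a materials-balance sequence: a one-sided CUSUM (Page's test) run over simulated material-unaccounted-for (MUF) values, not PROSA's actual procedure.

        import numpy as np

        def cusum_alarm(muf, sigma, k=0.5, h=5.0):
            """One-sided CUSUM (Page's test) on standardized MUF values.

            k is the reference value and h the decision threshold, both in
            units of sigma. Returns the first alarm index, or None.
            """
            s = 0.0
            for i, x in enumerate(muf):
                s = max(0.0, s + x / sigma - k)
                if s > h:
                    return i     # possible loss of material flagged here
            return None

        rng = np.random.default_rng(1)
        sigma = 1.0
        muf = rng.normal(0.0, sigma, 20)    # in-control balance periods
        muf[12:] += 1.5 * sigma             # simulated protracted loss from period 12
        print("alarm at period:", cusum_alarm(muf, sigma))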

  15. Bladder cancer: Evaluation of staging accuracy using dynamic MRI

    International Nuclear Information System (INIS)

    Rajesh, A.; Sokhi, H.K.; Fung, R.; Mulcahy, K.A.; Bankart, M.J.G.

    2011-01-01

    Aim: To assess the accuracy of magnetic resonance imaging (MRI) in staging bladder cancer and to assess whether dynamic gadolinium-enhanced sequences have any added benefit in staging. Materials and methods: Over a 22 month period, the MRI findings of 100 consecutive patients with histologically proven transitional cell carcinoma (TCC) of the bladder were reviewed. The T stage was assessed independently on T2-weighted imaging alone and in combination with gadolinium-enhanced MRI. The final histological diagnosis was considered the reference standard. Statistical analysis was performed to ascertain stage-by-stage accuracy. The accuracy of MRI in differentiating superficial (≤T1) from invasive (≥T2) disease and organ-confined (≤T2) from non-organ-confined (≥T3) disease was assessed. Results: On a stage-by-stage basis, tumours were correctly staged using MRI in 63% of patients (observed agreement = 0.63, weighted kappa = 0.57). The sensitivity and specificity of MRI in differentiating superficial (≤T1) from invasive (≥T2) disease were 78.2% and 93.3%, respectively; the observed agreement for this group was 85% (kappa = 70%; p < 0.0001). The sensitivity and specificity of MRI in differentiating organ-confined (≤T2) from non-organ-confined (≥T3) disease were 90.5% and 60%, respectively; the observed agreement for this group was 89% (kappa = 30%; p < 0.01). Gadolinium-enhanced images improved staging in only three patients. Conclusion: In the present study MRI was found to be a moderately accurate tool for assessing the T stage. Agreement on a stage-by-stage basis was good, agreement for differentiating non-invasive from muscle-invasive disease was good, and agreement for organ-confined versus non-organ-confined disease was fair. Routine use of gadolinium-enhanced images is not required.
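
    The agreement statistics quoted above (observed agreement, weighted kappa, sensitivity, specificity) can all be derived from a contingency table of MRI stage against histological stage. A minimal sketch follows; the label arrays are invented, not the study data.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score, confusion_matrix

        # Hypothetical per-patient labels: 1 = invasive (>=T2), 0 = superficial (<=T1).
        histology = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])
        mri       = np.array([0, 0, 1, 0, 1, 0, 1, 1, 1, 1])

        tn, fp, fn, tp = confusion_matrix(histology, mri).ravel()
        print("sensitivity:", tp / (tp + fn))
        print("specificity:", tn / (tn + fp))
        print("observed agreement:", (tp + tn) / len(mri))
        # A weighted kappa, as used for the ordinal stage-by-stage comparison:
        print("kappa:", cohen_kappa_score(histology, mri, weights="quadratic"))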

  16. Bladder cancer: Evaluation of staging accuracy using dynamic MRI

    Energy Technology Data Exchange (ETDEWEB)

    Rajesh, A., E-mail: arajesh27@hotmail.com [Department of Radiology, University Hospitals of Leicester NHS Trust, Leicester General Hospital (United Kingdom); Sokhi, H.K.; Fung, R.; Mulcahy, K.A. [Department of Radiology, University Hospitals of Leicester NHS Trust, Leicester General Hospital (United Kingdom); Bankart, M.J.G. [Department of Health Sciences, University of Leicester, Leicester (United Kingdom)

    2011-12-15

    Aim: To assess the accuracy of magnetic resonance imaging (MRI) in staging bladder cancer and to assess whether dynamic gadolinium-enhanced sequences have any added benefit in staging. Materials and methods: Over a 22 month period, the MRI findings of 100 consecutive patients with histologically proven transitional cell carcinoma (TCC) of the bladder were reviewed. The T stage was assessed independently on T2-weighted imaging alone and in combination with gadolinium-enhanced MRI. The final histological diagnosis was considered the reference standard. Statistical analysis was performed to ascertain stage-by-stage accuracy. The accuracy of MRI in differentiating superficial (≤T1) from invasive (≥T2) disease and organ-confined (≤T2) from non-organ-confined (≥T3) disease was assessed. Results: On a stage-by-stage basis, tumours were correctly staged using MRI in 63% of patients (observed agreement = 0.63, weighted kappa = 0.57). The sensitivity and specificity of MRI in differentiating superficial (≤T1) from invasive (≥T2) disease were 78.2% and 93.3%, respectively; the observed agreement for this group was 85% (kappa = 70%; p < 0.0001). The sensitivity and specificity of MRI in differentiating organ-confined (≤T2) from non-organ-confined (≥T3) disease were 90.5% and 60%, respectively; the observed agreement for this group was 89% (kappa = 30%; p < 0.01). Gadolinium-enhanced images improved staging in only three patients. Conclusion: In the present study MRI was found to be a moderately accurate tool for assessing the T stage. Agreement on a stage-by-stage basis was good, agreement for differentiating non-invasive from muscle-invasive disease was good, and agreement for organ-confined versus non-organ-confined disease was fair. Routine use of gadolinium-enhanced images is not required.

  17. Sleep staging with movement-related signals.

    Science.gov (United States)

    Jansen, B H; Shankar, K

    1993-05-01

    Body movement related signals (i.e., activity due to postural changes and the ballistocardiac effect) were recorded from six normal volunteers using the static-charge-sensitive bed (SCSB). Visual sleep staging was performed on the basis of simultaneously recorded EEG, EMG and EOG signals. A statistical classification technique was used to determine whether reliable sleep staging could be performed using only the SCSB signal. Classification rates between 52% and 75% were obtained for staging into the five conventional sleep stages and the awake state. The rates rose to between 78% and 89% for classification into awake, REM and non-REM sleep, and to between 86% and 98% for awake-versus-asleep classification.

  18. Stage of readiness of patients with behavioral dysphonia in pre and post-group voice therapy assessments.

    Science.gov (United States)

    Costa, Bianca Oliveira Ismael da; Silva, Priscila Oliveira Costa; Pinheiro, Renata Serrano de Andrade; Silva, Hêmmylly Farias da; Almeida, Anna Alice Figueirêdo de

    2017-08-10

    To verify the efficacy of group voice therapy on the stage of readiness and to identify which items of the URICA-Voice scale are most sensitive to post-therapy change in patients with behavioral dysphonia. An intervention study was conducted on 49 patients with behavioral dysphonia. An eclectic approach to group therapy was implemented over eight sessions, the first and last sessions consisting of assessments. The URICA-Voice scale was used to evaluate the stage of readiness at the pre- and post-therapy assessments. A descriptive and inferential statistical analysis was performed on the results. Most participants were female, did not make professional use of their voice, and had membranous vocal fold lesions. Most of them were in the Contemplation stage at both time points, pre- and post-therapy. There was no significant change in the comparison of pre- and post-therapy scores: the majority of patients showed a reduction in the stage of readiness, while some advanced to a higher stage. In the comparison of URICA-Voice scale items, seven questions had equal or lower responses in the post-therapy assessment. There was no statistical difference when comparing the pre- and post-therapy total average scores of the URICA-Voice scale, and no significant changes in the stage of readiness of patients between the pre- and post-group voice therapy assessments.

  19. Statistical testing and power analysis for brain-wide association study.

    Science.gov (United States)

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and the false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons and can thus efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study: our approach can identify altered functional connectivities in a major depressive disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.
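
    The comparison drawn with Bonferroni and FDR corrections is easy to reproduce in miniature. The sketch below applies both corrections to simulated p-values with statsmodels; it demonstrates the two reference methods only, not the paper's Gaussian-random-field framework.

        import numpy as np
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(2)
        # 10,000 null p-values plus 50 genuine signals with very small p-values.
        pvals = np.concatenate([rng.uniform(size=10_000),
                                rng.uniform(0, 1e-5, size=50)])

        for method in ("bonferroni", "fdr_bh"):
            reject, _, _, _ = multipletests(pvals, alpha=0.05, method=method)
            print(f"{method}: {reject.sum()} connexels declared significant")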

  20. The heat transfer analysis of the first stage blade

    International Nuclear Information System (INIS)

    Hong, Yong Ju; Choi, Bum Seog; Park, Byung Gyu; Yoon, Eui Soo

    2001-01-01

    To achieve higher efficiency in a gas turbine, the designer should aim for a higher turbine inlet temperature (TIT). Modern gas turbines with sophisticated cooling schemes have TITs above 1,700 °C. In Korea, many gas turbines with TITs above 1,300 °C have been imported and are being operated, but gas at such temperatures damages the combustor liner, the turbine blades, and other components, so parts exposed to high temperature require frequent maintenance. In this study, a heat transfer analysis of the cooling air in the internal cooling channel (network analysis) and a temperature analysis of the blade (finite element analysis) of the first stage rotor were conducted to develop an optimal cooling passage design procedure. The results of the network and FEM analyses show that high-temperature spots occur at the leading edge, at the trailing edge near the tip, and at the platform. To obtain more reliable gas turbine performance, more efficient cooling should therefore be applied at the leading edge and tip section, and the thermal barrier coating on the blade surface plays an important role in cooling the blade

  1. Statistical analysis of magnetically soft particles in magnetorheological elastomers

    Science.gov (United States)

    Gundermann, T.; Cremer, P.; Löwen, H.; Menzel, A. M.; Odenbach, S.

    2017-04-01

    The physical properties of magnetorheological elastomers (MRE) are a complex issue and can be influenced and controlled in many ways, e.g. by applying a magnetic field, by external mechanical stimuli, or by an electric potential. In general, the response of MRE materials to these stimuli depends crucially on the distribution of the magnetic particles inside the elastomer. Specific knowledge of the interactions between particles or particle clusters is of high relevance for understanding the macroscopic rheological properties and provides an important input for theoretical calculations. In order to gain better insight into the correlation between macroscopic effects and microstructure, and to generate a database for theoretical analysis, x-ray micro-computed tomography (X-μCT) investigations were carried out as the basis for a statistical analysis of the particle configurations. Different MREs with quantities of 2-15 wt% (0.27-2.3 vol%) of iron powder and different allocations of the particles inside the matrix were prepared. The X-μCT results were processed with image analysis software to extract the geometrical properties of the particles, with and without the influence of an external magnetic field. Pair correlation functions for the positions of the particles inside the elastomer were calculated to statistically characterize the distributions of the particles in the samples.
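
    A pair correlation function of the kind computed here compares, for each separation r, the number of observed particle pairs against the count expected for a uniform distribution. A minimal sketch for particle centres in a box is given below, with synthetic coordinates standing in for the X-μCT data; edge effects are ignored for brevity.

        import numpy as np
        from scipy.spatial.distance import pdist

        def pair_correlation(points, box, bins=50, r_max=None):
            """Radial distribution function g(r) for points in a rectangular box.

            g > 1 at a given r indicates clustering, g < 1 depletion; edge
            effects are neglected, which is adequate for a quick look.
            """
            n = len(points)
            density = n / np.prod(box)
            r_max = r_max or min(box) / 4
            d = pdist(points)
            hist, edges = np.histogram(d[d < r_max], bins=bins, range=(0, r_max))
            r = 0.5 * (edges[1:] + edges[:-1])
            shell = 4 * np.pi * r**2 * np.diff(edges)   # shell volumes
            ideal = density * shell * n / 2             # expected pair counts
            return r, hist / ideal

        rng = np.random.default_rng(3)
        pts = rng.uniform(0, 1, size=(2000, 3))         # synthetic particle centres
        r, g = pair_correlation(pts, box=(1.0, 1.0, 1.0))
        print(g.round(2))                               # close to 1 for uniform points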

  2. Validation of Wendelstein 7-X fabrication and assembly stages by magnetic field calculations

    International Nuclear Information System (INIS)

    Andreeva, T.; Kislinger, J.

    2005-01-01

    The Wendelstein 7-X stellarator, which is currently under construction in Greifswald, is a 5-period machine, and many of the planned operational plasma scenarios are characterized by a rotational transform ι/2π = 1 at the plasma boundary. Such magnetic configurations are very sensitive to symmetry-breaking perturbations caused by fabrication and assembly errors, which can occur at different stages of the device construction. As a consequence, new islands of any periodicity can be produced, existing islands can be modified, stochastic regions can be enhanced, and the power load onto the divertor plates can be increased. High precision in the machine construction is therefore a very important issue, and evaluation of the magnetic field is necessary for the continuous validation of the fabrication and assembly stages with respect to their impact on the magnetic field perturbation. Analysis of the first fabricated winding packs (WPs) has shown that the fabrication errors can be divided into systematic and statistical parts [1]. The systematic deviations add only negligible field components and do not perturb the 5-fold symmetry of the machine, whilst the statistical deviations disturb the machine periodicity. For this estimation of the magnetic field perturbation, a numerical procedure has been developed [2] which describes statistically the randomly distributed errors, taken within the given tolerances, or uses the available actual measurements as input. Since the construction of the magnet system of W7-X is subdivided into two main phases, fabrication of components by industrial contractors and assembly of these components into the magnet system at the Greifswald site, the analysis of the magnetic field perturbation starts from consideration of the impact of the WP geometry deviations during the manufacturing stage. (Author)

  3. Pattern recognition in menstrual bleeding diaries by statistical cluster analysis

    Directory of Open Access Journals (Sweden)

    Wessel Jens

    2009-07-01

    Full Text Available Abstract Background The aim of this paper is to empirically identify a treatment-independent statistical method to describe clinically relevant bleeding patterns, using bleeding diaries from clinical studies on various sex-hormone-containing drugs. Methods We used four cluster analysis methods, single, average and complete linkage as well as the method of Ward, for pattern recognition in menstrual bleeding diaries. The optimal number of clusters was determined using the semi-partial R², the cubic clustering criterion, and the pseudo-F and pseudo-t² statistics. Finally, the interpretability of the results from a gynecological point of view was assessed. Results The method of Ward yielded distinct clusters of the bleeding diaries; the other methods successively chained the observations into one cluster. The optimal number of distinctive bleeding patterns was six: two desirable and four undesirable bleeding patterns. Cyclic and non-cyclic bleeding patterns were well separated. Conclusion Using this cluster analysis with the method of Ward, medications and devices having an impact on bleeding can easily be compared and categorized.
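
    Ward's method, which the authors found to be the only method yielding distinct clusters, is available in scipy. The sketch below clusters synthetic 90-day bleeding diaries (one row per diary, one column per day) into six groups, mirroring the optimal cluster count reported; the diary entries are random stand-ins.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(4)
        # Synthetic diaries: 120 subjects x 90 days, entries 0 (none) to 3 (heavy).
        diaries = rng.integers(0, 4, size=(120, 90)).astype(float)

        Z = linkage(diaries, method="ward")              # Ward's minimum-variance criterion
        labels = fcluster(Z, t=6, criterion="maxclust")  # cut the tree into six patterns
        print(np.bincount(labels)[1:])                   # subjects per bleeding pattern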

  4. Examining a Stage-Based Intervention and Predictors of Exercise Behavior among Iranian Sedentary Adolescents

    Directory of Open Access Journals (Sweden)

    Zeinab Ghiami

    2015-01-01

    Full Text Available This study evaluated the effect of an intervention based on the Transtheoretical Model (TTM) on exercise behavior and examined TTM constructs as predictors of stages of change among Iranian adolescents. Fifty-six sedentary adolescents completed an assessment at baseline, 2 months, and 4 months. Repeated measures ANOVA and logistic regression were used to analyze the data. The analysis showed a statistically significant difference in the mean scores on stages of change for the experimental group: students in the experimental group significantly improved their stages compared to baseline. Furthermore, stages of change were found to correlate with TTM constructs, and self-efficacy was shown to be a strong predictor of stages of change. This study indicated that a stage-based intervention using TTM constructs can effectively improve adolescents’ intention to engage in physical activity. Moreover, the level of physical activity in adolescents can be improved by increasing their self-efficacy to exercise. Keywords: Physical Activity; Stage of Change; Processes of Change; Decisional Balance; Self-efficacy; Transtheoretical Model

  5. Profile fitting and the two-stage method in neutron powder diffractometry for structure and texture analysis

    International Nuclear Information System (INIS)

    Jansen, E.; Schaefer, W.; Will, G.; Kernforschungsanlage Juelich G.m.b.H.

    1988-01-01

    An outline and an application of the two-stage method in neutron powder diffractometry are presented. Stage (1): Individual reflection data, such as position, half-width and integrated intensity, are analysed by profile fitting. The profile analysis is based on an experimentally determined instrument function and can be applied without prior knowledge of a structural model. A mathematical procedure is described which results in a variance-covariance matrix containing the standard deviations and correlations of the refined reflection parameters. Stage (2): The individual reflection data derived from the profile fitting procedure can be used either in structure determination or in texture and strain or stress analysis. The integrated intensities are used in the non-diagonal weighted least-squares routine POWLS for structure refinement, with the weight matrix given by the inverted variance-covariance matrix of stage (1). This procedure is the basis for reliable Bragg R values and for a realistic estimation of the standard deviations of structural parameters. In the case of texture analysis, the integrated intensities are compiled into pole figures representing the intensity distribution over all sample orientations for individual hkl. Various examples of the wide application of the two-stage method in structure and texture analysis are given: structure refinement of a standard quartz specimen, magnetic ordering in the system TbₓY₁₋ₓAg, preferred orientation effects in deformed marble, and texture investigations of a triclinic plagioclase. (orig.)
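
    The stage (2) refinement described here is a generalized (non-diagonal) weighted least-squares fit whose weight matrix is the inverse of the stage (1) variance-covariance matrix. A minimal numpy sketch of that estimator follows, with a toy linear model standing in for the structure-factor model.

        import numpy as np

        def gls_fit(A, y, cov):
            """Generalized least squares: minimize (y - A b)^T W (y - A b), W = cov^-1.

            Returns the parameter estimates and their covariance matrix, so the
            standard deviations of the fitted parameters are realistically scaled.
            """
            W = np.linalg.inv(cov)
            cov_b = np.linalg.inv(A.T @ W @ A)
            b = cov_b @ A.T @ W @ y
            return b, cov_b

        rng = np.random.default_rng(5)
        A = np.column_stack([np.ones(6), np.arange(6.0)])    # toy design matrix
        cov = 0.04 * (np.eye(6) + 0.3 * np.eye(6, k=1) + 0.3 * np.eye(6, k=-1))
        y = A @ np.array([1.0, 2.0]) + rng.multivariate_normal(np.zeros(6), cov)

        b, cov_b = gls_fit(A, y, cov)
        print("estimates:", b, " std devs:", np.sqrt(np.diag(cov_b)))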

  6. Positron emission tomography/computed tomography (PET/CT) and CT for N staging of non-small cell lung cancer.

    Science.gov (United States)

    Vegar Zubović, Sandra; Kristić, Spomenka; Hadžihasanović, Besima

    2017-08-01

    Aim The aim of this study is to investigate the possibilities of non-invasive diagnostic imaging methods, positron emission tomography/computed tomography (PET/CT) and CT, in the clinical N staging of non-small cell lung cancer (NSCLC). Methods This retrospective clinical study included 50 patients with diagnosed NSCLC who had undergone PET/CT for the purpose of disease staging. The International Association for the Study of Lung Cancer (IASLC) nodal mapping system was used for the analysis of nodal disease. Data regarding CT N staging and PET/CT N staging were recorded. The two methods were compared using the χ² test and Spearman's rank correlation coefficient. Results Statistical analysis showed that although there were some differences in the determined N stage between CT and PET/CT, the methods were significantly correlated. CT and PET/CT findings established the same N stage in 74% of the patients. In five patients the staging was changed from operable to inoperable based on PET/CT findings, while in four patients staging was changed from inoperable to operable. Conclusion PET/CT and CT are non-invasive methods that can be reliably used for N staging of NSCLC. Copyright© by the Medical Association of Zenica-Doboj Canton.

  7. The role of the MR-fluoroscopy in the diagnosis and staging of the pelvic organ prolapse

    International Nuclear Information System (INIS)

    Etlik, Oemer; Arslan, Halil; Odabasi, Oner; Odabasi, Hulya; Harman, Mustafa; Celebi, Hacer; Sakarya, M. Emin

    2005-01-01

    Introduction: The aim of the study is to investigate the efficacy of magnetic resonance fluoroscopy in the diagnosis and staging of pelvic prolapse. Materials and methods: The study consisted of 46 patients who were known from their vaginal examination to have pelvic prolapse. Thirty women who underwent vaginal examination and were shown not to have pelvic prolapse were selected as a control group. First, pelvic sagittal FSE T2-weighted images of all the women were acquired on 0.3 T open MR equipment; then sagittal MR-fluoroscopic images using spoiled gradient echo sequences were obtained during pelvic strain. Physical examination and MR-fluoroscopic findings were compared, and the relationship between the stages of prolapse established by the two methods was evaluated statistically with Pearson's correlation analysis. Results: Physical examination and MR findings were highly concordant in the diagnosis of pelvic prolapse, and statistical correlations in the stages of prolapse were established between the two methods (P<0.01 for the anterior and middle compartments, P<0.05 for the posterior compartment). Conclusion: We conclude that MR-fluoroscopy is a non-invasive, easily applied, dynamic and useful method, requiring no contrast agent, for the diagnosis and staging of pelvic organ prolapse.

  8. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  9. Variability analysis of AGN: a review of results using new statistical criteria

    Science.gov (United States)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Romero, G. E.; Combi, J. A.

    We present here a re-analysis of the variability results of a sample of active galactic nuclei (AGN), which have been observed in several sessions with the 2.15 m "Jorge Sahade" telescope (CASLEO), San Juan, Argentina, and whose results are published (Romero et al. 1999, 2000, 2002; Cellone et al. 2000). The motivation for this new analysis is the implementation, during the last years, of improvements in the statistical criteria applied, taking quantitatively into account the incidence of the photometric errors (Cellone et al. 2007). This work is framed as a first step in an integral study on the statistical estimators of AGN variability. This study is motivated by the great diversity of statistical tests that have been proposed to analyze the variability of these objects. Since we note that, in some cases, the results of the object variability depend on the test used, we attempt to make a comparative study of the various tests and analyze, under the given conditions, which of them is the most efficient and reliable.

  10. A Statistical Primer: Understanding Descriptive and Inferential Statistics

    OpenAIRE

    Gillian Byrne

    2007-01-01

    As libraries and librarians move more towards evidence‐based decision making, the data being generated in libraries is growing. Understanding the basics of statistical analysis is crucial for evidence‐based practice (EBP), in order to correctly design and analyze research as well as to evaluate the research of others. This article covers the fundamentals of descriptive and inferential statistics, from hypothesis construction to sampling to common statistical techniques including chi‐square, co...

  11. Statistical Compilation of the ICT Sector and Policy Analysis

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia.

  12. Seafloor Topographic Analysis in Staged Ocean Resource Exploration

    Science.gov (United States)

    Ikeda, M.; Okawa, M.; Osawa, K.; Kadoshima, K.; Asakawa, E.; Sumi, T.

    2017-12-01

    J-MARES (Research and Development Partnership for Next Generation Technology of Marine Resources Survey, JAPAN) has been designing a low-expense, high-efficiency exploration system for seafloor hydrothermal massive sulfide deposits under the "Cross-ministerial Strategic Innovation Promotion Program (SIP)" granted by the Cabinet Office, Government of Japan, since 2014. We designed a method to narrow down prospective mineral deposit areas in multiple stages (regional survey, semi-detailed survey and detailed survey) using topographic features of well-known seafloor massive sulfide deposits extracted by analysis of seafloor topographic data acquired in bathymetric surveys. We applied this procedure to an area of interest of more than 100 km x 100 km over the Okinawa Trough, including some known seafloor massive sulfide deposits. In addition, we created a three-dimensional model of seafloor topography by the SfM (Structure from Motion) technique, using multiple images of chimneys distributed around a well-known seafloor massive sulfide deposit, taken with a Hi-Vision camera mounted on an ROV during the detailed survey. Topographic features of the chimneys were extracted by measuring the three-dimensional model. As a result, it was possible to estimate the shape of seafloor sulfide bodies to be mined, such as chimneys, from the three-dimensional model created from the camera images taken by the ROV. In this presentation, we discuss narrowing down prospective mineral deposit areas in multiple stages by topographic analysis of bathymetric data within an exploration system for seafloor massive sulfide deposits, and we also discuss the three-dimensional modelling of seafloor topography from image data taken with an ROV.

  13. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    Science.gov (United States)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the fourth Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with those of the previous workshops.

  14. Neutron activation and statistical analysis of pottery from Thera, Greece

    International Nuclear Information System (INIS)

    Kilikoglou, V.; Grimanis, A.P.; Karayannis, M.I.

    1990-01-01

    Neutron activation analysis, in combination with multivariate analysis of the generated data, was used for the chemical characterization of prehistoric pottery from the Greek islands of Thera, Melos (islands with similar geology) and Crete. The statistical procedure which proved that Theran pottery could be distinguished from Melian is described. This discrimination, attained for the first time, was mainly based on the concentrations of the trace elements Sm, Yb, Lu and Cr. Also, Cretan imports to both Thera and Melos were clearly separable from local products. (author) 22 refs.; 1 fig.; 4 tabs

  15. Comparisons of single-stage and two-stage approaches to genomic selection.

    Science.gov (United States)

    Schulz-Streeck, Torben; Ogutu, Joseph O; Piepho, Hans-Peter

    2013-01-01

    Genomic selection (GS) is a method for predicting breeding values of plants or animals using many molecular markers that is commonly implemented in two stages. In plant breeding the first stage usually involves computation of adjusted means for genotypes which are then used to predict genomic breeding values in the second stage. We compared two classical stage-wise approaches, which either ignore or approximate correlations among the means by a diagonal matrix, and a new method, to a single-stage analysis for GS using ridge regression best linear unbiased prediction (RR-BLUP). The new stage-wise method rotates (orthogonalizes) the adjusted means from the first stage before submitting them to the second stage. This makes the errors approximately independently and identically normally distributed, which is a prerequisite for many procedures that are potentially useful for GS such as machine learning methods (e.g. boosting) and regularized regression methods (e.g. lasso). This is illustrated in this paper using componentwise boosting. The componentwise boosting method minimizes squared error loss using least squares and iteratively and automatically selects markers that are most predictive of genomic breeding values. Results are compared with those of RR-BLUP using fivefold cross-validation. The new stage-wise approach with rotated means was slightly more similar to the single-stage analysis than the classical two-stage approaches based on non-rotated means for two unbalanced datasets. This suggests that rotation is a worthwhile pre-processing step in GS for the two-stage approaches for unbalanced datasets. Moreover, the predictive accuracy of stage-wise RR-BLUP was higher (5.0-6.1%) than that of componentwise boosting.
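
    The rotation step described above can be implemented by premultiplying the stage-one adjusted means and the marker matrix by the inverse Cholesky factor of the means' variance-covariance matrix, after which the errors are approximately independently and identically distributed. A small numpy sketch follows, with simulated data in place of real genotype means.

        import numpy as np

        rng = np.random.default_rng(6)
        n_geno, n_marker = 40, 100
        Z = rng.integers(0, 3, size=(n_geno, n_marker)).astype(float)  # marker scores
        u = rng.normal(0, 0.1, n_marker)                               # true marker effects

        # Stage 1 output: adjusted genotype means with correlated errors (cov V).
        V = 0.5 * np.eye(n_geno) + 0.1
        y = Z @ u + rng.multivariate_normal(np.zeros(n_geno), V)

        # Rotation: solving L x = y, with L the Cholesky factor of V, whitens the
        # errors, a prerequisite for boosting or lasso-type stage-2 methods.
        L = np.linalg.cholesky(V)
        y_rot = np.linalg.solve(L, y)
        Z_rot = np.linalg.solve(L, Z)

        # Stage 2 here: RR-BLUP-style ridge regression on the rotated data.
        lam = 1.0
        u_hat = np.linalg.solve(Z_rot.T @ Z_rot + lam * np.eye(n_marker),
                                Z_rot.T @ y_rot)
        print("correlation of estimated vs true effects:",
              np.corrcoef(u_hat, u)[0, 1].round(2))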

  16. Stage- and gender-specific proteomic analysis of Brugia malayi excretory-secretory products.

    Directory of Open Access Journals (Sweden)

    Yovany Moreno

    Full Text Available INTRODUCTION: While we lack a complete understanding of the molecular mechanisms by which parasites establish infection and achieve protection from host immune responses, it is accepted that many of these processes are mediated by products, primarily proteins, released from the parasite. Parasitic nematodes occur in different life stages and anatomical compartments within the host. Little is known about the composition and variability of the products released at different developmental stages and their contribution to parasite survival and progression of the infection. METHODOLOGY/PRINCIPAL FINDINGS: To gain a deeper understanding of these aspects, we collected and analyzed through 1D-SDS PAGE and LC-MS/MS the Excretory-Secretory Products (ESP) of adult female, adult male and microfilariae of the filarial nematode Brugia malayi, one of the etiological agents of human lymphatic filariasis. This proteomic analysis led to the identification of 228 proteins. The list includes 76 proteins with unknown function, as well as proteins with potential immunoregulatory properties, such as protease inhibitors, cytokine homologues and carbohydrate-binding proteins. Larval and adult ESP differed in composition; only 32 proteins were shared between all three stages/genders. Consistent with this observation, different gene ontology profiles were associated with the different ESP. CONCLUSIONS/SIGNIFICANCE: A comparative analysis of the proteins released in vitro by different forms of a parasitic nematode dwelling in the same host is presented. The catalog of secreted proteins reflects different stage- and gender-specific processes and different strategies of immune evasion, providing valuable insights into the contribution of each form of the parasite to establishing the host-parasite interaction.

  17. Retrospective Analysis of the Survival Benefit of Induction Chemotherapy in Stage IVa-b Nasopharyngeal Carcinoma.

    Science.gov (United States)

    Lan, Xiao-Wen; Zou, Xue-Bin; Xiao, Yao; Tang, Jie; OuYang, Pu-Yun; Su, Zhen; Xie, Fang-Yun

    2016-01-01

    The value of adding induction chemotherapy to chemoradiotherapy in locoregionally advanced nasopharyngeal carcinoma (LA-NPC) remains controversial, yet high-risk patients with LA-NPC have poor outcomes after chemoradiotherapy. We aimed to assess the survival benefits of induction chemotherapy in stage IVa-b NPC. A total of 602 patients with stage IVa-b NPC treated with intensity-modulated radiation therapy (IMRT) and concurrent chemotherapy with or without induction chemotherapy were retrospectively analyzed. Overall survival (OS), locoregional relapse-free survival (LRFS), distant metastasis-free survival (DMFS) and progression-free survival (PFS) were evaluated using the Kaplan-Meier method, log-rank test and Cox regression analysis. In univariate analysis, 5-year OS was 83.2% for induction chemotherapy plus concurrent chemotherapy and 74.8% for concurrent chemotherapy alone, corresponding to an absolute risk reduction of 8.4% (P = 0.022). Compared to concurrent chemotherapy alone, addition of induction chemotherapy improved 5-year DMFS (83.2% vs. 74.4%, P = 0.018) but not 5-year LRFS (83.7% vs. 83.0%, P = 0.848) or PFS (71.9% vs. 66.0%, P = 0.12). Age, T category, N category, chemotherapy strategy and clinical stage were associated with 5-year OS (P = 0.017, P = 0.031, P = 0.007, P = 0.022, P = 0.001, respectively). In multivariate analysis, induction chemotherapy plus concurrent chemotherapy was an independent favorable prognostic factor for OS (HR, 0.62; 95% CI, 0.43-0.90, P = 0.012) and DMFS (HR, 0.57; 95% CI, 0.38-0.83, P = 0.004). In subgroup analysis, induction chemotherapy significantly improved 5-year DMFS in stage IVa (86.8% vs. 77.3%, P = 0.008), but provided no significant benefit in stage IVb. In patients with stage IVa-b NPC treated with IMRT, addition of induction chemotherapy to concurrent chemotherapy significantly improved 5-year OS and 5-year DMFS. This study provides a basis for selection of high risk patients in future clinical therapeutic

  18. The statistical analysis of the mobility and the labor force use

    Directory of Open Access Journals (Sweden)

    Daniela-Emanuela Dãnãcicã

    2006-05-01

    Full Text Available The paper approaches some of the classical methods used in statistics for theanalysis of labor force and proposes new ways of current analysis required foradopting optimal economic patterns and strategies. The proposed methods, thelinear mean deviation used in the analysis of the external mobility of the laborforce, the coefficient of variation used in the analysis of the external mobility of thelabor force and two-dimensional table used the coefficient of internal mobilitycalculation, are illustrated by the premises, the calculus methodology, practicalapplications and guidance for their use in adopting and applying optimal economicpolicy.

  19. Deep convolutional neural networks for interpretable analysis of EEG sleep stage scoring

    DEFF Research Database (Denmark)

    Vilamala, Albert; Madsen, Kristoffer Hougaard; Hansen, Lars K.

    2017-01-01

    to pursue automatic stage scoring based on machine learning techniques have been carried out over the last years. In this work, we resort to multitaper spectral analysis to create visually interpretable images of sleep patterns from EEG signals as inputs to a deep convolutional network trained...... to solve visual recognition tasks. As a working example of transfer learning, a system able to accurately classify sleep stages in new unseen patients is presented. Evaluations in a widely-used publicly available dataset compare favourably to state-of-the-art results, while providing a framework for visual...

  20. Statistical analysis of the hydrodynamic pressure in the near field of compressible jets

    International Nuclear Information System (INIS)

    Camussi, R.; Di Marco, A.; Castelain, T.

    2017-01-01

    Highlights: • Statistical properties of pressure fluctuations retrieved through wavelet analysis. • Time delay PDFs approximated by a log-normal distribution. • Amplitude PDFs approximated by a Gamma distribution. • Random variable PDFs weakly dependent upon position and Mach number. • A general stochastic model achieved for the distance dependency. - Abstract: This paper is devoted to the statistical characterization of the pressure fluctuations measured in the near field of a compressible jet at two subsonic Mach numbers, 0.6 and 0.9. The analysis is focused on the hydrodynamic pressure measured at different distances from the jet exit and analyzed at the typical frequency associated with the Kelvin–Helmholtz instability. Statistical properties are retrieved by applying the wavelet transform to the experimental data and computing the wavelet scalogram around that frequency. This procedure highlights traces of events that appear intermittently in time and have variable strength. A wavelet-based event-tracking procedure has been applied, providing a statistical characterization of the time delay between successive events and of their energy level. On this basis, two stochastic models are proposed and validated against the experimental data in the different flow conditions
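
    The wavelet-based event tracking described here amounts to computing a scalogram at the frequency of interest and flagging intermittent bursts of energy. The sketch below does this generically with PyWavelets on a synthetic pressure signal; the Morlet wavelet, the 500 Hz target frequency and the threshold are all illustrative assumptions, not the authors' exact procedure.

        import numpy as np
        import pywt

        fs = 10_000.0                          # sampling rate, Hz (illustrative)
        t = np.arange(0, 1.0, 1 / fs)
        rng = np.random.default_rng(7)
        # Synthetic near-field pressure: noise plus intermittent 500 Hz bursts.
        signal = 0.1 * rng.standard_normal(t.size)
        for start in (0.2, 0.55, 0.8):
            burst = (t > start) & (t < start + 0.03)
            signal[burst] += np.sin(2 * np.pi * 500 * t[burst])

        fc = pywt.central_frequency("morl")    # wavelet centre frequency
        scale = fc * fs / 500.0                # scale matching 500 Hz
        coeffs, _ = pywt.cwt(signal, [scale], "morl", sampling_period=1 / fs)
        power = np.abs(coeffs[0]) ** 2

        events = power > 5 * power.mean()      # crude intermittent-event detector
        onsets = t[1:][np.diff(events.astype(int)) == 1]
        print("event onsets (s):", onsets.round(3))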

  1. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
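
    Coding categorical property descriptors so that they can enter a regression is the core of the method summarized above. The paper does not spell out its coding scheme, so the sketch below shows the common one-hot (dummy) approach with pandas on invented real-estate features.

        import pandas as pd
        from sklearn.linear_model import LinearRegression

        # Invented listings: categorical descriptors, a numeric feature and the price.
        df = pd.DataFrame({
            "district":  ["center", "north", "center", "south", "north", "south"],
            "condition": ["new", "renovated", "old", "new", "old", "renovated"],
            "area_m2":   [54, 73, 41, 88, 60, 47],
            "price":     [120_000, 98_000, 70_000, 150_000, 69_000, 77_000],
        })

        # One-hot coding turns each category into its own 0/1 column.
        X = pd.get_dummies(df[["district", "condition", "area_m2"]], drop_first=True)
        model = LinearRegression().fit(X, df["price"])
        print(dict(zip(X.columns, model.coef_.round(0))))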

  2. Statistical Compilation of the ICT Sector and Policy Analysis

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.

  3. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    Science.gov (United States)

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, the Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data, the Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data, aimed at exploring correlations with climate and weather data, is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and the UCBN-GRYN network protocols presents a statistical challenge that has not yet been resolved. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in the future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design, the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing environment. We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially

  4. Specific features of the flow structure in a reactive type turbine stage

    Science.gov (United States)

    Chernikov, V. A.; Semakina, E. Yu.

    2017-04-01

    The results of experimental studies of the gas dynamics for a reactive type turbine stage are presented. The objective of the studies is the measurement of the 3D flow fields in reference cross sections, experimental determination of the stage characteristics, and analysis of the flow structure for detecting the sources of kinetic energy losses. The integral characteristics of the studied stage are obtained by averaging the results of traversing the 3D flow over the area of the reference cross sections before and behind the stage. The averaging is performed using the conservation equations for mass, total energy flux, angular momentum with respect to the axis z of the turbine, entropy flow, and the radial projection of the momentum flux equation. The flow parameter distributions along the channel height behind the stage are obtained in the same way. More thorough analysis of the flow structure is performed after interpolation of the experimentally measured point parameter values and 3D flow velocities behind the stage. The obtained continuous velocity distributions in the absolute and relative coordinate systems are presented in the form of vector fields. The coordinates of the centers and the vectors of secondary vortices are determined using the results of point measurements of velocity vectors in the cross section behind the turbine stage and their subsequent interpolation. The approach to analysis of experimental data on aerodynamics of the turbine stage applied in this study allows one to find the detailed space structure of the working medium flow, including secondary coherent vortices at the root and peripheral regions of the air-gas part of the stage. The measured 3D flow parameter fields and their interpolation, on the one hand, point to possible sources of increased power losses, and, on the other hand, may serve as the basis for detailed testing of CFD models of the flow using both integral and local characteristics. The comparison of the numerical and

  5. Efficiency assessment of wind farms in China using two-stage data envelopment analysis

    International Nuclear Information System (INIS)

    Wu, Yunna; Hu, Yong; Xiao, Xinli; Mao, Chunyu

    2016-01-01

    Highlights: • The efficiency of China’s wind farms is assessed by data envelopment analysis. • The Tobit model is used to analyze the impact of uncontrollable factors on efficiency. • Sensitivity analysis is conducted to verify the stability of the evaluation results. • Efficiency levels of Chinese wind farms are relatively high in general. • Age and wind curtailment rate negatively affect productive efficiency. - Abstract: China has been the world’s leader in wind power capacity due to the promotion of favorable policies. Given the scarcity of research on the efficiency of China’s wind farms, this study analyzes the productive efficiency of 42 large-scale wind farms in China using a two-stage analysis. In the first stage, efficiency scores of wind farms are determined with data envelopment analysis, and a sensitivity analysis is conducted to verify the robustness of the efficiency calculation results. In the second stage, Tobit regression is employed to explore the relationship between the efficiency scores and the environment variables that are beyond the control of wind farms. According to the results, all wind farms studied operate at an acceptable level. However, 50% of them overinvest in installed capacity and about 48% have electricity-saving potential. The most important factors affecting the efficiency of wind farms are the installed capacity and the wind power density. In addition, the age of the wind farm and the wind curtailment rate have a negative effect on productive efficiency, whereas the ownership of the wind farm has no significant effect. Findings from this study may help stakeholders in the wind industry to select wind power projects, optimize operational strategies and make related policies.
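
    The first stage of such a study, data envelopment analysis, solves one small linear program per wind farm. Below is a minimal input-oriented CCR model with scipy on made-up data (installed capacity and staff as inputs, electricity generated as the output); it is a textbook DEA formulation, not necessarily the exact model specification of this paper.

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_input(X, Y):
            """Input-oriented CCR efficiency for each decision-making unit (DMU).

            X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). For DMU o, solve
            min theta  s.t.  X^T lam <= theta * x_o,  Y^T lam >= y_o,  lam >= 0.
            """
            n, m = X.shape
            s = Y.shape[1]
            scores = []
            for o in range(n):
                c = np.r_[1.0, np.zeros(n)]          # decision variables: [theta, lam]
                A_in = np.c_[-X[o], X.T]             # X^T lam - theta * x_o <= 0
                A_out = np.c_[np.zeros(s), -Y.T]     # -Y^T lam <= -y_o
                res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                              b_ub=np.r_[np.zeros(m), -Y[o]],
                              bounds=[(None, None)] + [(0, None)] * n,
                              method="highs")
                scores.append(res.x[0])
            return np.array(scores)

        # Made-up data: inputs = (capacity in MW, staff), output = (GWh generated).
        X = np.array([[50, 20], [100, 35], [80, 30], [60, 40]], dtype=float)
        Y = np.array([[120], [260], [150], [110]], dtype=float)
        print(dea_ccr_input(X, Y).round(3))          # 1.0 marks the efficient frontier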

  6. Statistical analysis plan for the EuroHYP-1 trial

    DEFF Research Database (Denmark)

    Winkel, Per; Bath, Philip M; Gluud, Christian

    2017-01-01

    Score; (4) brain infarct size at 48 +/- 24 hours; (5) EQ-5D-5L score; and (6) WHODAS 2.0 score. Other outcomes are: the primary safety outcome, serious adverse events; and the incremental cost-effectiveness and cost-utility ratios. The analysis sets include (1) the intention-to-treat population and (2) the per protocol population. The sample size is estimated to 800 patients (5% type 1 and 20% type 2 errors). All analyses are adjusted for the protocol-specified stratification variables (nationality of centre) and the minimisation variables. In the analysis, we use ordinal regression (the primary outcome), logistic regression (binary outcomes), the general linear model (continuous outcomes), and the Poisson or negative binomial model (rate outcomes). DISCUSSION: Major adjustments compared with the original statistical analysis plan encompass: (1) adjustment of analyses by nationality; (2) power......

  7. Closing the loop: modelling of heart failure progression from health to end-stage using a meta-analysis of left ventricular pressure-volume loops.

    Science.gov (United States)

    Warriner, David R; Brown, Alistair G; Varma, Susheel; Sheridan, Paul J; Lawford, Patricia; Hose, David R; Al-Mohammad, Abdallah; Shi, Yubing

    2014-01-01

    The American Heart Association (AHA)/American College of Cardiology (ACC) guidelines for the classification of heart failure (HF) are descriptive but lack precise and objective measures which would assist in categorising such patients. Our aim was twofold: firstly, to demonstrate quantitatively the progression of HF through each stage using a meta-analysis of existing left ventricular (LV) pressure-volume (PV) loop data and, secondly, to use the LV PV loop data to create stage-specific HF models. A literature search yielded 31 papers with PV data, representing over 200 patients in different stages of HF. The raw pressure and volume data were extracted from the papers using a digitising software package and the means were calculated. The data demonstrated that, as HF progressed, stroke volume (SV) and ejection fraction (EF%) decreased while LV volumes increased. A 2-element lumped parameter model was employed to model the mean loops, and the error calculated between the modelled and mean loops demonstrated a close fit. The only parameter that was consistently and statistically different across all the stages was the elastance (Emax). For the first time, the authors have created a visual and quantitative representation of the AHA/ACC stages of LVSD-HF, from normal to end-stage. The study demonstrates that robust, load-independent and reproducible parameters, such as elastance, can be used to categorise and model HF, complementing the existing classification. The modelled PV loops establish previously unknown physiological parameters for each AHA/ACC stage of LVSD-HF, such as LV elastance, and highlight that it is this parameter alone, in lumped parameter models, that determines the severity of HF. Such information will enable cardiovascular modellers with an interest in HF to create more accurate models of the heart as it fails.

  8. Closing the loop: modelling of heart failure progression from health to end-stage using a meta-analysis of left ventricular pressure-volume loops.

    Directory of Open Access Journals (Sweden)

    David R Warriner

    Full Text Available INTRODUCTION: The American Heart Association (AHA)/American College of Cardiology (ACC) guidelines for the classification of heart failure (HF) are descriptive but lack precise and objective measures which would assist in categorising such patients. Our aim was twofold: firstly, to demonstrate quantitatively the progression of HF through each stage using a meta-analysis of existing left ventricular (LV) pressure-volume (PV) loop data and, secondly, to use the LV PV loop data to create stage-specific HF models. METHODS AND RESULTS: A literature search yielded 31 papers with PV data, representing over 200 patients in different stages of HF. The raw pressure and volume data were extracted from the papers using a digitising software package and the means were calculated. The data demonstrated that, as HF progressed, stroke volume (SV) and ejection fraction (EF%) decreased while LV volumes increased. A 2-element lumped parameter model was employed to model the mean loops, and the error calculated between the modelled and mean loops demonstrated a close fit. The only parameter that was consistently and statistically different across all the stages was the elastance (Emax). CONCLUSIONS: For the first time, the authors have created a visual and quantitative representation of the AHA/ACC stages of LVSD-HF, from normal to end-stage. The study demonstrates that robust, load-independent and reproducible parameters, such as elastance, can be used to categorise and model HF, complementing the existing classification. The modelled PV loops establish previously unknown physiological parameters for each AHA/ACC stage of LVSD-HF, such as LV elastance, and highlight that it is this parameter alone, in lumped parameter models, that determines the severity of HF. Such information will enable cardiovascular modellers with an interest in HF to create more accurate models of the heart as it fails.

  9. NEW PARADIGM OF ANALYSIS OF STATISTICAL AND EXPERT DATA IN PROBLEMS OF ECONOMICS AND MANAGEMENT

    OpenAIRE

    Orlov A. I.

    2014-01-01

    The article is devoted to the methods of analysis of statistical and expert data in problems of economics and management that are discussed in the framework of scientific specialization "Mathematical methods of economy", including organizational-economic and economic-mathematical modeling, econometrics and statistics, as well as economic aspects of decision theory, systems analysis, cybernetics, operations research. The main provisions of the new paradigm of this scientific and practical fiel...

  10. Integrating Expert Knowledge with Statistical Analysis for Landslide Susceptibility Assessment at Regional Scale

    Directory of Open Access Journals (Sweden)

    Christos Chalkias

    2016-03-01

    Full Text Available In this paper, an integrated landslide susceptibility model combining expert-based and bivariate statistical analysis (Landslide Susceptibility Index, LSI) approaches is presented. Factors related to the occurrence of landslides, such as elevation, slope angle, slope aspect, lithology, land cover, Mean Annual Precipitation (MAP) and Peak Ground Acceleration (PGA), were analyzed within a GIS environment. This integrated model produced a landslide susceptibility map which categorized the study area according to the probability level of landslide occurrence. The accuracy of the final map was evaluated by Receiver Operating Characteristics (ROC) analysis using an independent validation dataset of landslide events. The prediction ability was found to be 76%, revealing that the integration of statistical analysis with human expertise can provide an acceptable landslide susceptibility assessment at regional scale.
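
    ROC validation of a susceptibility map, as used above, reduces to scoring each terrain unit and comparing the scores against observed landslide occurrences; the 76% prediction ability corresponds to an area under the ROC curve of 0.76. A minimal sketch with scikit-learn on synthetic susceptibility scores:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(8)
        n = 500
        landslide = rng.integers(0, 2, n)    # observed events in the validation set
        # Synthetic LSI scores: higher on average where landslides occurred.
        lsi = rng.normal(0.4 + 0.25 * landslide, 0.2)

        print(f"prediction ability (AUC): {roc_auc_score(landslide, lsi):.2f}")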

  11. Performance of real-time elastography for the staging of hepatic fibrosis: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Huisuo Hong

    Full Text Available BACKGROUND: With the rapid development of real-time elastography (RTE), a variety of measuring methods have been developed for the assessment of hepatic fibrosis. We evaluated the overall performance of four methods based on RTE by performing a meta-analysis of the published literature. METHODS: Online journal databases and a manual search from April 2000 to April 2014 were used. Studies from different databases that met the inclusion criteria were enrolled. The statistical analysis was performed using a random-effects model and a fixed-effects model for the overall effectiveness of RTE. The area under the receiver operating characteristic curve (AUROC) was calculated for the various methods. Fagan plot analysis was used to estimate the clinical utility of RTE, and the heterogeneity of the studies was explored with meta-regression analysis. RESULTS: Thirteen published studies were enrolled and analyzed. The combined AUROC of the liver fibrosis index (LFI) for the evaluation of significant fibrosis (F≥2), advanced fibrosis (F≥3), and cirrhosis (F = 4) were 0.79, 0.94, and 0.85, respectively. The AUROC of the elasticity index (EI) ranged from 0.75 to 0.92 for F≥2 and from 0.66 to 0.85 for F = 4. The overall AUROC of the elastic ratio of the liver relative to the intrahepatic venous vessels were 0.94, 0.93, and 0.96, respectively. The AUROC of the elastic ratio of the liver relative to the intercostal muscle in diagnosing advanced fibrosis and cirrhosis were 0.96 and 0.92, respectively. There was significant heterogeneity in the diagnostic odds ratio (DOR) for F≥2 of the LFI, mainly due to etiology (p<0.01). CONCLUSION: The elastic ratio of the liver relative to the intrahepatic vein has excellent precision in differentiating each stage of hepatic fibrosis and is recommended for clinical application.

  12. Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements

    Science.gov (United States)

    Papa, A. R.; Akel, A. F.

    2009-05-01

    Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduce a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to traditional methods (Fourier, for example), while at the same time allowing an almost continuous tracking of both the amplitude and the frequency of signals as time goes by. This advantage opens some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is here that we see our main goal. Some possible directions for future work are advanced.
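
    As a rough illustration of the time-frequency tracking described, the sketch below applies a continuous wavelet transform to a synthetic record using the PyWavelets package (an assumption; the abstract does not name its software), reading off amplitude and dominant frequency at each time step.

```python
# Minimal continuous wavelet transform sketch (PyWavelets), tracking how
# the amplitude and dominant frequency of a signal evolve over time.
# The synthetic record below only stands in for real observatory data.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(0, 7 * 24 * 3600, 60)  # one week, one sample per minute (assumption), seconds
# A slowly drifting oscillation plus noise, loosely mimicking a magnetogram trace.
sig = np.sin(2 * np.pi * t / (6 * 3600 + t / 50)) + 0.3 * rng.standard_normal(t.size)

scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=60.0)

# Amplitude and dominant frequency at each time step, read off the scalogram.
power = np.abs(coefs) ** 2
dominant_freq = freqs[np.argmax(power, axis=0)]   # Hz
amplitude = np.abs(coefs).max(axis=0)
print(dominant_freq[:3], amplitude[:3])
```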

  13. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab; Meseguer, José

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems, such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability
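
    At its core, statistical model checking estimates the probability that a property holds by sampling many independent simulation runs, which is what makes it embarrassingly parallel. The toy sketch below shows only that idea; the random-walk model and the property checked are illustrative assumptions, not PVeStA's actual interface.

```python
# Toy sketch of the idea behind statistical model checking: estimate the
# probability that a property holds by simulating a probabilistic model
# many times and attaching a confidence interval. The random-walk model
# and property here are illustrative assumptions, not PVeStA's API.
import math
import random

def simulate_once() -> bool:
    """One run of a toy probabilistic system: does a biased random walk
    starting at 0 reach +5 before it reaches -5?"""
    x = 0
    while -5 < x < 5:
        x += 1 if random.random() < 0.55 else -1
    return x == 5

n = 20_000  # independent runs; these could be farmed out to many servers
hits = sum(simulate_once() for _ in range(n))
p_hat = hits / n
half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)  # normal approx.
print(f"P(property) = {p_hat:.3f} +/- {half_width:.3f}")
```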

  14. Statistical analysis of the W Cyg light curve

    International Nuclear Information System (INIS)

    Klyus, I.A.

    1983-01-01

    A statistical analysis of the light curve of W Cygni has been carried out. The process of brightness variations of the star is shown to be a stationary stochastic one. The hypothesis of stationarity was checked at the significance level α = 0.05. Oscillations of the brightness with average durations of 131 and 250 days have been found. It is proved that the oscillations are narrow-band noise, i.e. cycles. Peaks on the power spectrum corresponding to these cycles exceed the 99% confidence level. It is also found that the oscillations are independent.
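
    Cycle detection of this kind can be illustrated with a simple periodogram. The sketch below uses a synthetic light curve with periods near 131 and 250 days; it is not the W Cyg data, and the daily sampling is an assumption.

```python
# Minimal sketch of detecting brightness cycles with a periodogram.
# The synthetic light curve (cycles near 131 and 250 days plus noise)
# only mimics the structure described; it is not the W Cyg record.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
days = np.arange(4000)                 # daily sampling (assumption)
mag = (0.3 * np.sin(2 * np.pi * days / 131)
       + 0.2 * np.sin(2 * np.pi * days / 250)
       + 0.1 * rng.standard_normal(days.size))

freqs, power = periodogram(mag, fs=1.0)   # cycles per day
freqs, power = freqs[1:], power[1:]       # drop the zero-frequency bin
top = freqs[np.argsort(power)[-2:]]       # two strongest peaks
print("dominant periods (days):", np.sort(1.0 / top))
```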

  15. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions, by which test solutions can be discriminated. After determining the variety of a test solution, a Spearman correlation test and principal component analysis are used to filter the eight characteristic values and reduce their dimensionality, producing a new representative parameter. A cubic spline interpolation function is built between this parameter and concentration, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. Using the methods described above, all eight test solutions are correctly identified and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of quantitative concentration analysis.
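
    A simplified sketch of this pipeline is given below: discriminant analysis for solution identification, PCA to compress the eight characteristic values into one parameter, and a cubic spline to map that parameter to concentration. The data are synthetic stand-ins, and the Spearman filtering step is omitted for brevity.

```python
# Sketch of the identification-then-quantification pipeline described
# above, on synthetic stand-in data: LDA to identify the solution type,
# PCA to compress eight COT characteristic values into one parameter,
# and a cubic spline to map that parameter to concentration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.decomposition import PCA
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(1)

# Training library: 10 concentrations x 8 COT characteristic values per
# solution type (here two hypothetical types).
conc = np.linspace(5, 50, 10)                       # percent, assumption
lib_a = np.outer(conc, rng.random(8)) + rng.normal(0, 0.1, (10, 8))
lib_b = np.outer(conc, rng.random(8)) + rng.normal(0, 0.1, (10, 8))

# Stage 1: discriminate solution type from the characteristic values.
X = np.vstack([lib_a, lib_b])
y = np.array([0] * 10 + [1] * 10)
lda = LinearDiscriminantAnalysis().fit(X, y)

# Stage 2: reduce the 8 values of the identified library to 1 parameter
# and interpolate parameter -> concentration with a cubic spline.
pca = PCA(n_components=1).fit(lib_a)
param = pca.transform(lib_a).ravel()
order = np.argsort(param)
spline = CubicSpline(param[order], conc[order])

test = lib_a[3] + rng.normal(0, 0.05, 8)            # unseen sample
print("type:", lda.predict([test])[0])
print("estimated concentration:", float(spline(pca.transform([test])[0, 0])))
```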

  16. Analysis of hip perfusion at early and reversible stages of aseptic hip necrosis

    International Nuclear Information System (INIS)

    Predic, P.; Dodig, D.; Karner, I.

    2002-01-01

    Aim: A proper early diagnosis of aseptic hip necrosis is very important for further therapy. Since there has always been a question of the amount of perfusion in hips at different stages of aseptic hip necrosis, we tried to objectively examine hip perfusion at the early and reversible stages of the disease. Material and Methods: The study included 143 patients with aseptic hip necrosis. 550-740 MBq of Tc-99m-DPD were injected as a bolus. All patients were subjected to 3-phase scintigraphy of the hips and quantitative calculation of relative perfusion in the arterial phase (3T), first at the early stage and again at the reversible stages of aseptic hip necrosis. Results: At the early stage of aseptic hip necrosis the obtained 3T was decreased, ranging from 0.94 to 0.69 (mean 3T = 0.80). Scintigrams showed a moderately increased or diffuse accumulation. At the reversible stages we obtained 3T decreased from 0.92 to 0.71 (mean 3T = 0.79), thus evidencing hypoperfusion. Scintigrams showed a diffusely increased accumulation. Conclusion: In aseptic hip necrosis, quantitative analysis of perfusion in the arterial phase (3T) indicates that perfusion is decreased at all stages of the process, with a significantly falling trend as the disease progresses.

  17. Statistical Analysis of a Method to Predict Drug-Polymer Miscibility

    DEFF Research Database (Denmark)

    Knopp, Matthias Manne; Olesen, Niels Erik; Huang, Yanbin

    2016-01-01

    In this study, a method proposed to predict drug-polymer miscibility from differential scanning calorimetry measurements was subjected to statistical analysis. The method is relatively fast and inexpensive and has gained popularity as a result of the increasing interest in the formulation of drug...... as provided in this study. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association J Pharm Sci....

  18. A framework for 2-stage global sensitivity analysis of GastroPlus™ compartmental models.

    Science.gov (United States)

    Scherholz, Megerle L; Forder, James; Androulakis, Ioannis P

    2018-04-01

    Parameter sensitivity and uncertainty analysis for physiologically based pharmacokinetic (PBPK) models are becoming an important consideration for regulatory submissions, requiring further evaluation to establish the need for global sensitivity analysis. To demonstrate the benefits of an extensive analysis, global sensitivity was implemented for the GastroPlus™ model, a well-known commercially available platform, using four example drugs: acetaminophen, risperidone, atenolol, and furosemide. The capabilities of GastroPlus were expanded by developing an integrated framework to automate the GastroPlus graphical user interface with AutoIt and to execute the sensitivity analysis in MATLAB®. Global sensitivity analysis was performed in two stages: the Morris method was used to screen over 50 parameters for significant factors, followed by quantitative assessment of variability using Sobol's sensitivity analysis. The two-stage approach significantly reduced the computational cost for the larger model without sacrificing interpretation of model behavior, showing that the sensitivity results were well aligned with the biopharmaceutical classification system. Both methods detected nonlinearities and parameter interactions that would have been missed by local approaches. Future work includes further exploration of how the input domain influences the calculated global sensitivity measures, as well as extending the framework to consider a whole-body PBPK model.
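
    The screening-then-quantification workflow can be sketched with the SALib package (an assumption; the authors drove GastroPlus from MATLAB), with a toy function standing in for the PBPK model and hypothetical parameter names:

```python
# Sketch of the two-stage global sensitivity workflow: Morris screening
# to rank influential parameters, then Sobol indices for quantification.
# SALib and the toy model stand in for the GastroPlus/MATLAB setup.
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze.morris import analyze as morris_analyze
from SALib.sample.saltelli import sample as sobol_sample
from SALib.analyze.sobol import analyze as sobol_analyze

problem = {  # hypothetical PBPK-like inputs, not GastroPlus parameters
    "num_vars": 3,
    "names": ["clearance", "permeability", "dose_volume"],
    "bounds": [[0.1, 1.0], [0.01, 0.5], [50, 500]],
}

def model(x):                       # toy stand-in for a PBPK simulation
    return x[:, 0] * np.exp(-x[:, 1]) + 0.001 * x[:, 2]

# Stage 1: Morris screening (cheap; ranks factors by mu*).
X1 = morris_sample(problem, N=100, num_levels=4)
res1 = morris_analyze(problem, X1, model(X1), num_levels=4)
print(dict(zip(problem["names"], np.round(res1["mu_star"], 3))))

# Stage 2: Sobol indices for the quantitative assessment (more expensive).
X2 = sobol_sample(problem, 1024)
res2 = sobol_analyze(problem, model(X2))
print("S1:", np.round(res2["S1"], 3), "ST:", np.round(res2["ST"], 3))
```

    The staging pays off because Morris trajectories cost N(D+1) model evaluations for D parameters, whereas the Saltelli design behind Sobol costs N(2D+2), so screening out unimportant factors first keeps the expensive stage small.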

  19. A Bifactor Approach to Model Multifaceted Constructs in Statistical Mediation Analysis

    Science.gov (United States)

    Gonzalez, Oscar; MacKinnon, David P.

    2018-01-01

    Statistical mediation analysis allows researchers to identify the most important mediating constructs in the causal process studied. Identifying specific mediators is especially relevant when the hypothesized mediating construct consists of multiple related facets. The general definition of the construct and its facets might relate differently to…
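
    The basic machinery referred to here, the product-of-coefficients estimate of an indirect effect, can be sketched as below; a bifactor specification itself requires a full SEM package, and the data in this sketch are synthetic.

```python
# Minimal product-of-coefficients mediation sketch with synthetic data.
# The bifactor measurement model discussed above needs a full SEM tool;
# this only shows the basic indirect-effect (a*b) computation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)                      # treatment / predictor
m = 0.5 * x + rng.normal(size=n)            # mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # outcome

a = sm.OLS(m, sm.add_constant(x)).fit().params[1]   # path X -> M
b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]  # M -> Y given X
print(f"indirect effect a*b = {a * b:.3f}")
```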

  20. Statistical Compilation of the ICT Sector and Policy Analysis

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continue to widen and deepen, so too does their impact on economic development. However, much work needs to be done before the linkages between economic development ...