WorldWideScience

Sample records for non-parametric statistical tests

  1. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required.

  2. Effect of non-normality on test statistics for one-way independent groups designs.

    Science.gov (United States)

    Cribbie, Robert A; Fiksenbaum, Lisa; Keselman, H J; Wilcox, Rand R

    2012-02-01

    The data obtained from one-way independent groups designs is typically non-normal in form and rarely equally variable across treatment populations (i.e., population variances are heterogeneous). Consequently, the classical test statistic that is used to assess statistical significance (i.e., the analysis of variance F test) typically provides invalid results (e.g., too many Type I errors, reduced power). For this reason, there has been considerable interest in finding a test statistic that is appropriate under conditions of non-normality and variance heterogeneity. Previously recommended procedures for analysing such data include the James test, the Welch test applied either to the usual least squares estimators of central tendency and variability, or the Welch test with robust estimators (i.e., trimmed means and Winsorized variances). A new statistic proposed by Krishnamoorthy, Lu, and Mathew, intended to deal with heterogeneous variances, though not non-normality, uses a parametric bootstrap procedure. In their investigation of the parametric bootstrap test, the authors examined its operating characteristics under limited conditions and did not compare it to the Welch test based on robust estimators. Thus, we investigated how the parametric bootstrap procedure and a modified parametric bootstrap procedure based on trimmed means perform relative to previously recommended procedures when data are non-normal and heterogeneous. The results indicated that the tests based on trimmed means offer the best Type I error control and power when variances are unequal and at least some of the distribution shapes are non-normal. © 2011 The British Psychological Society.
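    A minimal sketch of the robust-estimator approach described above, assuming SciPy >= 1.7, whose `ttest_ind` exposes a `trim` argument implementing Yuen's trimmed version of Welch's test; the simulated data are illustrative, not from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two skewed, heteroscedastic samples (lognormal, different scales)
g1 = rng.lognormal(mean=0.0, sigma=0.5, size=40)
g2 = rng.lognormal(mean=0.3, sigma=1.0, size=25)

# Welch's test on the usual least-squares estimators (unequal variances)
t_welch, p_welch = stats.ttest_ind(g1, g2, equal_var=False)

# Yuen's variant: Welch's test on 20% trimmed means (robust estimators)
t_yuen, p_yuen = stats.ttest_ind(g1, g2, equal_var=False, trim=0.2)

print(p_welch, p_yuen)
```

    With heavy-tailed or skewed groups, the trimmed variant typically retains Type I error control where the untrimmed Welch test does not.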

  3. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
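    The non-parametric 1D hypothesis test discussed above can be sketched as a max-statistic permutation test over whole trajectories. This is a generic illustration on simulated data, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two groups of smooth 1D trajectories (subjects x time nodes)
t = np.linspace(0, 1, 101)
a = np.sin(2 * np.pi * t) + rng.normal(0, 0.5, (12, t.size))
b = np.sin(2 * np.pi * t) + 0.4 + rng.normal(0, 0.5, (12, t.size))

def tstat(x, y):
    # pointwise two-sample t statistic along the trajectory
    se = np.sqrt(x.var(0, ddof=1) / len(x) + y.var(0, ddof=1) / len(y))
    return (x.mean(0) - y.mean(0)) / se

t_obs = tstat(a, b)
pooled = np.vstack([a, b])
n_a = len(a)
maxima = []
for _ in range(500):
    perm = rng.permutation(len(pooled))
    maxima.append(np.abs(tstat(pooled[perm[:n_a]], pooled[perm[n_a:]])).max())

# Critical threshold controlling family-wise error over the whole trajectory,
# the non-parametric analogue of an RFT correction
thresh = np.quantile(maxima, 0.95)
print(thresh, np.abs(t_obs).max())
```

    Any time node where the observed |t| curve exceeds the threshold is declared significant at the trajectory level.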

  4. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    This paper applies non-parametric and parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as outputs. The data cover 307 (out of 553) ...

  5. Non-parametric Tuning of PID Controllers: A Modified Relay-Feedback-Test Approach

    CERN Document Server

    Boiko, Igor

    2013-01-01

    The relay feedback test (RFT) has become a popular and efficient tool used in process identification and automatic controller tuning. Non-parametric Tuning of PID Controllers couples new modifications of the classical RFT with application-specific optimal tuning rules to form a non-parametric method of test-and-tuning. Test and tuning are coordinated through a set of common parameters so that a PID controller can obtain the desired gain or phase margins in a system exactly, even with unknown process dynamics. The concept of process-specific optimal tuning rules in the non-parametric setup, with corresponding tuning rules for flow, level, pressure, and temperature control loops, is presented in the text. Common problems of tuning accuracy based on parametric and non-parametric approaches are addressed. In addition, the text treats the parametric approach to tuning based on the modified RFT approach and the exact model of oscillations in the system under test using the locus of a perturbed relay system (LPRS) meth...

  6. Parametric Level Statistics in Random Matrix Theory: Exact Solution

    International Nuclear Information System (INIS)

    Kanzieper, E.

    1999-01-01

    During the last several years, the theory of non-Gaussian random matrix ensembles has experienced sound progress motivated by new ideas in quantum chromodynamics (QCD) and mesoscopic physics. Invariant non-Gaussian random matrix models appear to describe universal features of the low-energy part of the spectrum of the Dirac operator in QCD, and electron level statistics in normal-conducting-superconducting hybrid structures. They also serve as a basis for constructing toy models of the universal spectral statistics expected at the edge of the metal-insulator transition. While conventional spectral statistics has received a detailed study in the context of RMT, much less is known about parametric level statistics in non-Gaussian random matrix models. In this communication we report an exact solution to the problem of parametric level statistics in unitary invariant, U(N), non-Gaussian ensembles of N x N Hermitian random matrices with either soft or strong level confinement. The solution is formulated within the framework of the orthogonal polynomial technique and is shown to depend on both the unfolded two-point scalar kernel and the level confinement through a double integral transformation which, in turn, provides a constructive tool for the description of parametric level correlations in non-Gaussian RMT. In the case of soft level confinement, the formalism developed is potentially applicable to a study of parametric level statistics in an important class of random matrix models with finite level compressibility, expected to describe a disorder-induced metal-insulator transition. In random matrix ensembles with strong level confinement, the solution presented takes a particularly simple form in the thermodynamic limit: in this case, a new intriguing connection relation between the parametric level statistics and the scalar two-point kernel of an unperturbed ensemble is demonstrated to emerge. Extension of the results obtained to higher-order parametric level statistics is ...

  7. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications.

    Directory of Open Access Journals (Sweden)

    Elias Chaibub Neto

    Full Text Available In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson's sample correlation coefficient, and compare its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and number of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably/considerably faster for small/moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling.
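    The multinomial-weighting formulation is compact in matrix form. A sketch in Python/NumPy (the paper's implementations are in R; the variable names here are ours):

```python
import numpy as np

rng = np.random.default_rng(2)
n, B = 50, 2000
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(size=n)

# Multinomial formulation: bootstrap weights instead of resampled data.
# W[b, i] / n = relative frequency of observation i in replication b.
W = rng.multinomial(n, np.full(n, 1 / n), size=B) / n   # rows sum to 1

# Weighted sample moments for every replication via matrix products
mx, my = W @ x, W @ y
mxy, mxx, myy = W @ (x * y), W @ (x * x), W @ (y * y)
r_boot = (mxy - mx * my) / np.sqrt((mxx - mx**2) * (myy - my**2))

ci = np.quantile(r_boot, [0.025, 0.975])   # percentile bootstrap CI
print(ci)
```

    Each row of `r_boot` equals the Pearson correlation of one resampled dataset, but the whole vector is produced by a handful of matrix multiplications rather than a loop over resamples.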

  8. kruX: matrix-based non-parametric eQTL discovery.

    Science.gov (United States)

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. kruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
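    The matrix idea behind kruX can be illustrated for a single trait: the Kruskal-Wallis statistic needs only the rank sums per genotype group, which one matrix product supplies. A simplified single-trait sketch (not kruX itself, which batches millions of marker-trait pairs):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 102                               # samples, as in the abstract
expr = rng.normal(size=n)             # one expression trait
geno = rng.integers(0, 3, size=n)     # genotypes coded 0/1/2

# kruX-style computation: the KW statistic depends on the data only
# through per-genotype rank sums, obtained here with a matrix product
r = stats.rankdata(expr)
G = np.eye(3)[geno]                   # n x 3 genotype indicator matrix
Rsum = G.T @ r                        # rank sum per genotype class
cnt = G.sum(0)
H = 12 / (n * (n + 1)) * np.sum(Rsum**2 / cnt) - 3 * (n + 1)

# matches scipy's per-test implementation (no ties in continuous data)
H_ref = stats.kruskal(*(expr[geno == g] for g in range(3))).statistic
print(H, H_ref)
```

    Replacing the rank vector `r` with a matrix of rank-transformed traits turns the same product into thousands of simultaneous tests.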

  9. Performances of non-parametric statistics in sensitivity analysis and parameter ranking

    International Nuclear Information System (INIS)

    Saltelli, A.

    1987-01-01

    Twelve parametric and non-parametric sensitivity analysis techniques are compared in the case of non-linear model responses. The test models used are taken from the long-term risk analysis for the disposal of high level radioactive waste in a geological formation. They describe the transport of radionuclides through a set of engineered and natural barriers from the repository to the biosphere and to man. The output data from these models are the dose rates affecting the maximum exposed individual of a critical group at a given point in time. All the techniques are applied to the output from the same Monte Carlo simulations, where a modified version of the Latin Hypercube method is used for the sample selection. Hypothesis testing is systematically applied to quantify the degree of confidence in the results given by the various sensitivity estimators. The estimators are ranked according to their robustness and stability, on the basis of two test cases. The conclusions are that no estimator can be considered the best from all points of view, and that more than one estimator should be used in sensitivity analysis.
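    As a toy illustration of parametric versus non-parametric sensitivity estimators of the kind compared above (the model and estimators here are ours, not the paper's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
# Monte Carlo sample: 3 input parameters, one non-linear model response
n = 200
X = rng.uniform(0, 1, size=(n, 3))
y = X[:, 0]**3 + 0.2 * X[:, 1] + 0.01 * rng.normal(size=n)  # X[:, 2] is inert

# Compare a parametric estimator (Pearson correlation) with a
# non-parametric one (Spearman rank correlation), which is better
# suited to non-linear but monotone responses
pearson = [stats.pearsonr(X[:, i], y)[0] for i in range(3)]
spearman = [stats.spearmanr(X[:, i], y)[0] for i in range(3)]
for i in range(3):
    print(f"parameter {i}: pearson={pearson[i]:+.2f}, "
          f"spearman={spearman[i]:+.2f}")
```

    Hypothesis tests on each coefficient (both functions also return p-values) then quantify the confidence in the resulting parameter ranking, as done systematically in the study.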

  10. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans ...

  11. Normality Tests for Statistical Analysis: A Guide for Non-Statisticians

    Science.gov (United States)

    Ghasemi, Asghar; Zahediasl, Saleh

    2012-01-01

    Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
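    The commentary works in SPSS; an equivalent normality check in Python with the Shapiro-Wilk test might look like this (simulated data, illustrative only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
normal_data = rng.normal(loc=5, scale=2, size=80)
skewed_data = rng.exponential(scale=2, size=80)

# Shapiro-Wilk: H0 = the sample comes from a normal distribution
w_n, p_n = stats.shapiro(normal_data)
w_s, p_s = stats.shapiro(skewed_data)
print(f"normal sample: W={w_n:.3f}, p={p_n:.4f}")
print(f"skewed sample: W={w_s:.3f}, p={p_s:.4f}")
```

    A p-value below the chosen alpha (commonly 0.05) rejects normality, signalling that a parametric test's validity is in doubt for that sample.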

  12. Technical Topic 3.2.2.d Bayesian and Non-Parametric Statistics: Integration of Neural Networks with Bayesian Networks for Data Fusion and Predictive Modeling

    Science.gov (United States)

    2016-05-31

    Final Report: Technical Topic 3.2.2.d Bayesian and Non-Parametric Statistics: Integration of Neural Networks with Bayesian Networks for Data Fusion and Predictive Modeling. Reporting period: 15-Apr-2014 to 14-Jan-2015; report date: 31-05-2016; distribution unlimited.

  13. A method of statistical analysis in the field of sports science when assumptions of parametric tests are not violated

    Directory of Open Access Journals (Sweden)

    Elżbieta Sandurska

    2016-12-01

    Full Text Available Introduction: Application of statistical software typically does not require extensive statistical knowledge, allowing even complex analyses to be performed easily. Consequently, test selection criteria and important assumptions may be easily overlooked or given insufficient consideration. In such cases, the results may well lead to wrong conclusions. Aim: To discuss issues related to assumption violations in the case of Student's t-test and one-way ANOVA, two parametric tests frequently used in the field of sports science, and to recommend solutions. Description of the state of knowledge: Student's t-test and ANOVA are parametric tests, and therefore some of the assumptions that need to be satisfied include normal distribution of the data and homogeneity of variances in groups. If the assumptions are violated, the original design of the test is impaired, and the test may then be compromised, giving spurious results. A simple method to normalize the data and to stabilize the variance is to use transformations. If such an approach fails, a good alternative to consider is a non-parametric test, such as the Mann-Whitney, Kruskal-Wallis, or Wilcoxon signed-rank test. Summary: Thorough verification of the assumptions of parametric tests allows for correct selection of statistical tools, which is the basis of well-grounded statistical analysis. With a few simple rules, testing patterns in data characteristic of sports science studies comes down to a straightforward procedure.
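    The recommended decision path (transform first, then fall back to a non-parametric test if normality still fails) can be sketched as follows; the data and the 0.05 threshold are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Positively skewed outcomes (e.g. sprint-test times), two training groups
g1 = rng.lognormal(mean=2.0, sigma=0.6, size=30)
g2 = rng.lognormal(mean=2.4, sigma=0.6, size=30)

# Step 1: a log transformation often normalizes right-skewed data
# and stabilizes the variance
p_norm = min(stats.shapiro(np.log(g1)).pvalue,
             stats.shapiro(np.log(g2)).pvalue)

if p_norm > 0.05:
    stat, p = stats.ttest_ind(np.log(g1), np.log(g2))
    used = "t-test on log-transformed data"
else:
    # Step 2: fall back to a non-parametric alternative
    stat, p = stats.mannwhitneyu(g1, g2)
    used = "Mann-Whitney U"
print(used, p)
```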

  14. A simple non-parametric goodness-of-fit test for elliptical copulas

    Directory of Open Access Journals (Sweden)

    Jaser Miriam

    2017-12-01

    Full Text Available In this paper, we propose a simple non-parametric goodness-of-fit test for elliptical copulas of any dimension. It is based on the equality of Kendall’s tau and Blomqvist’s beta for all bivariate margins. Nominal level and power of the proposed test are investigated in a Monte Carlo study. An empirical application illustrates our goodness-of-fit test at work.
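    The identity the test exploits can be checked numerically: for elliptical copulas, both Kendall's tau and Blomqvist's beta of a bivariate margin equal (2/pi)*arcsin(rho). A sketch on a Gaussian copula sample (this only verifies the population identity; it is not the proposed test statistic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Bivariate normal sample -> Gaussian (elliptical) copula, rho = 0.5
cov = [[1.0, 0.5], [0.5, 1.0]]
x, y = rng.multivariate_normal([0, 0], cov, size=2000).T

tau, _ = stats.kendalltau(x, y)
# Blomqvist's beta (medial correlation): concordance about the medians
beta = np.mean(np.sign((x - np.median(x)) * (y - np.median(y))))

# For elliptical copulas both equal (2/pi)*arcsin(rho), so tau ~ beta
print(tau, beta, 2 / np.pi * np.arcsin(0.5))
```

    The goodness-of-fit test then aggregates the (suitably standardized) differences between the two estimates over all bivariate margins.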

  15. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography.

    Science.gov (United States)

    Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-06-01

    Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues while still maintaining the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that random field theory is a valid approach for EIT images of neural activity.

  16. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among...

  17. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.

  18. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as the 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights on the ASTRUM statistical approach, with a more in-depth analysis of the pros and cons of order statistics and of the Westinghouse approach in the implementation of this statistical methodology. (authors)
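    The 124-run figure follows from the standard order-statistics (Wilks-type) coverage formula for jointly bounding p = 3 outputs at the 95%/95% level; a short sketch reproduces both the classical single-output count and the ASTRUM count:

```python
from math import comb

def confidence(n, p, gamma=0.95):
    # Probability that, out of n Monte Carlo runs, the p highest-ranked
    # outputs jointly bound the gamma-quantiles (Wilks-type formula):
    # P(Binomial(n, gamma) <= n - p)
    return sum(comb(n, j) * gamma**j * (1 - gamma)**(n - j)
               for j in range(n - p + 1))

# Classical single-output 95/95 count (PCT only)
n1 = next(n for n in range(1, 1000) if confidence(n, 1) >= 0.95)
# Three outputs (PCT, LMO, CWO), as in ASTRUM
n3 = next(n for n in range(1, 1000) if confidence(n, 3) >= 0.95)
print(n1, n3)   # -> 59 124
```

    For a single output the formula reduces to 1 - 0.95^n >= 0.95, giving the familiar 59 runs; requiring all three criteria to be bounded simultaneously pushes the count to 124.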

  19. The application of non-parametric statistical method for an ALARA implementation

    International Nuclear Information System (INIS)

    Cho, Young Ho; Herr, Young Hoi

    2003-01-01

    The cost-effective reduction of Occupational Radiation Dose (ORD) at a nuclear power plant could not be achieved without going through an extensive analysis of the accumulated ORD data of existing plants. Through the data analysis, it is required to identify the jobs of repetitive high ORD at the nuclear power plant. In this study, the Percentile Rank Sum Method (PRSM), based on non-parametric statistical theory, is proposed to identify repetitive high-ORD jobs. As a case study, the method is applied to ORD data of maintenance and repair jobs at Kori units 3 and 4, pressurized water reactors with 950 MWe capacity operated in Korea since 1986 and 1987, respectively. The results were verified and validated, and PRSM has been demonstrated to be an efficient method of analyzing the data.

  20. Spurious Seasonality Detection: A Non-Parametric Test Proposal

    Directory of Open Access Journals (Sweden)

    Aurelio F. Bariviera

    2018-01-01

    Full Text Available This paper offers a general and comprehensive definition of the day-of-the-week effect. Using symbolic dynamics, we develop a unique test based on ordinal patterns in order to detect it. This test uncovers the fact that the so-called “day-of-the-week” effect is partly an artifact of the hidden correlation structure of the data. We present simulations based on artificial time series as well. While time series generated with long memory are prone to exhibit daily seasonality, pure white noise signals exhibit no pattern preference. Since ours is a non-parametric test, it requires no assumptions about the distribution of returns, so that it could be a practical alternative to conventional econometric tests. We also made an exhaustive application of the here-proposed technique to 83 stock indexes around the world. Finally, the paper highlights the relevance of symbolic analysis in economic time series studies.
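    The ordinal-pattern machinery underlying the test can be sketched for pattern length 3: under white noise all 3! = 6 patterns are equally likely, so a uniformity check shows no pattern preference. This generic sketch is not the authors' test statistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.normal(size=5000)             # i.i.d. "returns": pure white noise

# Ordinal patterns: each block of 3 consecutive values is reduced to the
# permutation that sorts it (3! = 6 possible patterns)
windows = x[:4998].reshape(-1, 3)     # non-overlapping blocks
patterns = windows.argsort(axis=1)
_, counts = np.unique(patterns, axis=0, return_counts=True)

# Under white noise all 6 patterns are equally likely; a seasonal or
# long-memory series would show a pattern preference
chi2, p = stats.chisquare(counts)
print(counts, p)
```

    Replacing `x` with a long-memory series (e.g. fractional Gaussian noise) skews the pattern frequencies, which is the day-of-the-week artifact the paper isolates.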

  1. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    Science.gov (United States)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

    This study is aimed to investigate the use of parametric and non-parametric approaches for the sensory RATA (Rate-All-That-Apply) method. Ledre, a unique local food product of Bojonegoro, was used as the point of interest, with 319 panelists involved in the study. The results showed that ledre is characterized by an easily crushed texture, sticky in the mouth, a stingy sensation, and easy to swallow. It also has a strong banana flavour and brown colour. Compared to eggroll and semprong, ledre shows more variation in taste as well as in roll length. As the RATA questionnaire is designed to collect categorical data, the non-parametric approach is the common statistical procedure. However, similar results were obtained with the parametric approach, regardless of the fact that the data are not normally distributed. This suggests that the parametric approach can be applicable to consumer studies with large numbers of respondents, even though the data may not satisfy the assumptions of ANOVA (Analysis of Variance).

  2. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  3. On Parametric (and Non-Parametric) Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles-and-Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  4. Non-classical Signature of Parametric Fluorescence and its Application in Metrology

    Directory of Open Access Journals (Sweden)

    Hamar M.

    2014-08-01

    Full Text Available The article provides a short theoretical background of what non-classical light means. We applied the criterion for the existence of non-classical effects derived by C.T. Lee to parametric fluorescence. The criterion was originally derived for the study of two light beams with one mode per beam. We checked through numerical simulations whether the criterion still holds for two multimode beams of parametric down-conversion. The theoretical results were tested by measurement of the photon-number statistics of twin beams emitted by a nonlinear BBO crystal pumped by an intense femtosecond UV pulse. We used an ICCD camera as the detector of photons in both beams. It appears that the criterion can be used for the measurement of the quantum efficiencies of ICCD cameras.

  5. A method of statistical analysis in the field of sports science when assumptions of parametric tests are not violated

    OpenAIRE

    Sandurska, Elżbieta; Szulc, Aleksandra

    2016-01-01

    Sandurska Elżbieta, Szulc Aleksandra. A method of statistical analysis in the field of sports science when assumptions of parametric tests are not violated. Journal of Education Health and Sport. 2016;6(13):275-287. eISSN 2391-8306. DOI http://dx.doi.org/10.5281/zenodo.293762 http://ojs.ukw.edu.pl/index.php/johs/article/view/4278

  6. A Non-Parametric Surrogate-based Test of Significance for T-Wave Alternans Detection

    Science.gov (United States)

    Nemati, Shamim; Abdala, Omar; Bazán, Violeta; Yim-Yeh, Susie; Malhotra, Atul; Clifford, Gari

    2010-01-01

We present a non-parametric adaptive surrogate test that allows for the differentiation of statistically significant T-Wave Alternans (TWA) from alternating patterns that can be solely explained by the statistics of noise. The proposed test is based on estimating the distribution of noise-induced alternating patterns in a beat sequence from a set of surrogate data derived from repeated reshuffling of the original beat sequence. Thus, in assessing the significance of the observed alternating patterns in the data, no assumptions are made about the underlying noise distribution. In addition, since the distribution of noise-induced alternans magnitudes is calculated separately for each sequence of beats within the analysis window, the method is robust to data non-stationarities in both noise and TWA. The proposed surrogate method for rejecting noise was compared to the standard noise rejection methods used with the Spectral Method (SM) and the Modified Moving Average (MMA) techniques. Using a previously described realistic multi-lead model of TWA, and real physiological noise, we demonstrate that the proposed approach reduces false TWA detections while maintaining a lower missed-TWA detection rate than all the other methods tested. A simple averaging-based TWA estimation algorithm was coupled with the surrogate significance testing and was evaluated on three public databases: the Normal Sinus Rhythm Database (NRSDB), the Chronic Heart Failure Database (CHFDB) and the Sudden Cardiac Death Database (SCDDB). Differences in TWA amplitudes between the databases were evaluated at matched heart rate (HR) intervals from 40 to 120 beats per minute (BPM). Using the two-sample Kolmogorov-Smirnov test, we found that significant differences in TWA levels exist between each patient group at all decades of heart rates. The most marked difference was generally found at higher heart rates, and the new technique resulted in a larger margin of separability between patient populations than
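The reshuffling scheme described above can be sketched in a few lines. The function names and the simple even/odd alternans measure below are illustrative assumptions, not the authors' implementation:

```python
import random
import statistics

def alternans_magnitude(beats):
    # Illustrative alternans measure: absolute difference between the
    # mean of even-indexed and the mean of odd-indexed beat amplitudes.
    return abs(statistics.mean(beats[0::2]) - statistics.mean(beats[1::2]))

def surrogate_p_value(beats, n_surrogates=1000, seed=0):
    # Estimate the distribution of noise-induced alternans by repeatedly
    # reshuffling the beat sequence; no noise distribution is assumed.
    rng = random.Random(seed)
    observed = alternans_magnitude(beats)
    exceed = sum(
        alternans_magnitude(rng.sample(beats, len(beats))) >= observed
        for _ in range(n_surrogates)
    )
    return (exceed + 1) / (n_surrogates + 1)
```

A strongly alternating beat sequence yields a small p-value, while a constant sequence cannot be distinguished from its own shuffles.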

  7. [The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].

    Science.gov (United States)

    Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel

    2017-01-01

Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. To make an inference is to draw conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.

  8. The research protocol VI: How to choose the appropriate statistical test. Inferential statistics

    Directory of Open Access Journals (Sweden)

    Eric Flores-Ruiz

    2017-10-01

Full Text Available Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. To make an inference is to draw conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used in order to establish the probability that a conclusion obtained from a sample is applicable to the population from which it was obtained. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the statistical test it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two sets, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
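When normality is doubtful, a rank-based test is the usual fallback. A minimal sketch of the two-sample Mann-Whitney (Wilcoxon rank-sum) test using the normal approximation, assuming no tied observations, might look like this (function names are illustrative):

```python
import math

def mann_whitney_u(x, y):
    # Rank-sum test via the normal approximation; assumes no ties.
    nx, ny = len(x), len(y)
    ranks = {v: r for r, v in enumerate(sorted(x + y), start=1)}
    u = sum(ranks[v] for v in x) - nx * (nx + 1) / 2
    mu = nx * ny / 2
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return u, p
```

Two well-separated samples give a tiny p-value, while interleaved samples do not.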

  9. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function of which the Cobb... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used... by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true...

  10. Study on Semi-Parametric Statistical Model of Safety Monitoring of Cracks in Concrete Dams

    Directory of Open Access Journals (Sweden)

    Chongshi Gu

    2013-01-01

Full Text Available Cracks are one of the hidden dangers in concrete dams, and safety monitoring models for concrete dam cracks have always been difficult to build. Starting from the parametric statistical model of safety monitoring of cracks in concrete dams, drawing on semi-parametric statistical theory, and considering the abnormal behaviors of these cracks, the semi-parametric statistical model of safety monitoring of concrete dam cracks is established to overcome the limitation of the parametric model in expressing the objective model. Previous projects show that the semi-parametric statistical model fits the data better and explains cracks in concrete dams better than the parametric statistical model. When used for forecasting, however, its forecast capability is equivalent to that of the parametric statistical model. The semi-parametric statistical model is simple to build, rests on a reasonable principle and is highly practical, with good application prospects in actual projects.

  11. Rank-based permutation approaches for non-parametric factorial designs.

    Science.gov (United States)

    Umlauft, Maria; Konietschke, Frank; Pauly, Markus

    2017-11-01

    Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this set-up Wald-type statistics and ANOVA-type statistics are the current state of the art. The first method is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample size, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies foster these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
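The permutation idea (recomputing a rank statistic under random relabelling of groups) can be sketched as follows. This is a plain Kruskal-Wallis permutation test assuming no ties, an illustrative simplification rather than the authors' Wald-type or ANOVA-type machinery:

```python
import random

def kruskal_wallis_h(groups):
    # H statistic computed from ranks; assumes no tied values.
    labelled = sorted((v, g) for g, grp in enumerate(groups) for v in grp)
    n = len(labelled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, g) in enumerate(labelled, start=1):
        rank_sums[g] += rank
    return 12 / (n * (n + 1)) * sum(
        rs * rs / len(grp) for rs, grp in zip(rank_sums, groups)
    ) - 3 * (n + 1)

def permutation_p_value(groups, n_perm=2000, seed=1):
    # Permute group labels and count how often H is at least as large
    # as the observed value; exact under exchangeability.
    rng = random.Random(seed)
    values = [v for grp in groups for v in grp]
    sizes = [len(grp) for grp in groups]
    h_obs = kruskal_wallis_h(groups)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(values)
        perm, i = [], 0
        for s in sizes:
            perm.append(values[i:i + s])
            i += s
        if kruskal_wallis_h(perm) >= h_obs:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)
```

Three clearly separated groups give the maximal H for their sizes and a correspondingly small permutation p-value.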

  12. Non-parametric order statistics method applied to uncertainty propagation in fuel rod calculations

    International Nuclear Information System (INIS)

    Arimescu, V.E.; Heins, L.

    2001-01-01

    Advances in modeling fuel rod behavior and accumulations of adequate experimental data have made possible the introduction of quantitative methods to estimate the uncertainty of predictions made with best-estimate fuel rod codes. The uncertainty range of the input variables is characterized by a truncated distribution which is typically a normal, lognormal, or uniform distribution. While the distribution for fabrication parameters is defined to cover the design or fabrication tolerances, the distribution of modeling parameters is inferred from the experimental database consisting of separate effects tests and global tests. The final step of the methodology uses a Monte Carlo type of random sampling of all relevant input variables and performs best-estimate code calculations to propagate these uncertainties in order to evaluate the uncertainty range of outputs of interest for design analysis, such as internal rod pressure and fuel centerline temperature. The statistical method underlying this Monte Carlo sampling is non-parametric order statistics, which is perfectly suited to evaluate quantiles of populations with unknown distribution. The application of this method is straightforward in the case of one single fuel rod, when a 95/95 statement is applicable: 'with a probability of 95% and confidence level of 95% the values of output of interest are below a certain value'. Therefore, the 0.95-quantile is estimated for the distribution of all possible values of one fuel rod with a statistical confidence of 95%. On the other hand, a more elaborate procedure is required if all the fuel rods in the core are being analyzed. In this case, the aim is to evaluate the following global statement: with 95% confidence level, the expected number of fuel rods which are not exceeding a certain value is all the fuel rods in the core except only a few fuel rods. In both cases, the thresholds determined by the analysis should be below the safety acceptable design limit. An indirect
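The 95/95 statement for a single rod rests on Wilks' formula for one-sided non-parametric tolerance bounds: with n random code runs, the sample maximum bounds the 0.95-quantile with confidence 1 - 0.95^n. A small helper (illustrative, first-order and one-sided only) finds the required n:

```python
def wilks_sample_size(prob=0.95, conf=0.95):
    # Smallest n with 1 - prob**n >= conf, so that the largest of n
    # runs bounds the prob-quantile with the requested confidence.
    n = 1
    while 1 - prob ** n < conf:
        n += 1
    return n
```

With the defaults this returns the familiar 59 runs for a 95/95 statement.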

  13. Level-statistics in Disordered Systems: A single parametric scaling and Connection to Brownian Ensembles

    OpenAIRE

    Shukla, Pragya

    2004-01-01

We find that the statistics of levels undergoing a metal-insulator transition in systems with multi-parametric Gaussian disorders and non-interacting electrons behaves in a way similar to that of the single parametric Brownian ensembles [dy]. The latter appear during a Poisson → Wigner-Dyson transition, driven by a random perturbation. The analogy provides the analytical evidence for the single parameter scaling of the level-correlations in disordered systems as well as a tool to obtai...

  14. Bayesian non parametric modelling of Higgs pair production

    Directory of Open Access Journals (Sweden)

    Scarpa Bruno

    2017-01-01

Full Text Available Statistical classification models are commonly used to separate a signal from a background. In this talk we face the problem of isolating the signal of Higgs pair production using the decay channel in which each boson decays into a pair of b-quarks. Typically in this context non-parametric methods are used, such as Random Forests or different types of boosting tools. We remain in the same non-parametric framework, but we propose to face the problem following a Bayesian approach. A Dirichlet process is used as prior for the random effects in a logit model which is fitted by leveraging the Polya-Gamma data augmentation. Refinements of the model include the insertion of P-splines into the simple model to relate explanatory variables to the response, and the use of Bayesian trees (BART) to describe the atoms in the Dirichlet process.

  15. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors

  16. Evaluation of model-based versus non-parametric monaural noise-reduction approaches for hearing aids.

    Science.gov (United States)

    Harlander, Niklas; Rosenkranz, Tobias; Hohmann, Volker

    2012-08-01

Single-channel noise reduction has been well investigated and seems to have reached its limits in terms of speech intelligibility improvement; however, the quality of such schemes can still be advanced. This study tests to what extent novel model-based processing schemes might improve performance, in particular for non-stationary noise conditions. Two prototype model-based algorithms, a speech-model-based and an auditory-model-based algorithm, were compared to a state-of-the-art non-parametric minimum statistics algorithm. A speech intelligibility test, preference rating, and listening effort scaling were performed. Additionally, three objective quality measures for the signal, background, and overall distortions were applied. For a better comparison of all algorithms, particular attention was given to the use of a similar Wiener-based gain rule. The perceptual investigation was performed with fourteen hearing-impaired subjects. The results revealed that the non-parametric algorithm and the auditory-model-based algorithm did not affect speech intelligibility, whereas the speech-model-based algorithm slightly decreased intelligibility. In terms of subjective quality, both model-based algorithms perform better than the unprocessed condition and the reference, in particular for highly non-stationary noise environments. The data support the hypothesis that model-based algorithms are promising for improving performance in non-stationary noise conditions.
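The Wiener-based gain rule shared by these algorithms attenuates each spectral component according to its estimated signal-to-noise ratio. A minimal per-bin version (an illustrative sketch with power-subtraction SNR estimation floored at zero, not the paper's exact rule) is:

```python
def wiener_gain(signal_power, noise_power):
    # Wiener filter gain G = SNR / (1 + SNR), with the SNR estimated
    # by power subtraction and floored at zero.
    snr = max(signal_power - noise_power, 0.0) / noise_power
    return snr / (1.0 + snr)
```

Bins dominated by signal are passed almost unchanged; bins at or below the noise floor are fully suppressed.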

  17. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt

    2013-01-01

    in the observations are corrected. These are errors such as: tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...

  18. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods: A Comparison with Clinical Assessment

    Science.gov (United States)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration

  19. Parametric Statistics of Individual Energy Levels in Random Hamiltonians

    OpenAIRE

    Smolyarenko, I. E.; Simons, B. D.

    2002-01-01

    We establish a general framework to explore parametric statistics of individual energy levels in disordered and chaotic quantum systems of unitary symmetry. The method is applied to the calculation of the universal intra-level parametric velocity correlation function and the distribution of level shifts under the influence of an arbitrary external perturbation.

  20. Parametric statistical inference basic theory and modern approaches

    CERN Document Server

    Zacks, Shelemyahu; Tsokos, C P

    1981-01-01

    Parametric Statistical Inference: Basic Theory and Modern Approaches presents the developments and modern trends in statistical inference to students who do not have advanced mathematical and statistical preparation. The topics discussed in the book are basic and common to many fields of statistical inference and thus serve as a jumping board for in-depth study. The book is organized into eight chapters. Chapter 1 provides an overview of how the theory of statistical inference is presented in subsequent chapters. Chapter 2 briefly discusses statistical distributions and their properties. Chapt

  1. Statistical prediction of parametric roll using FORM

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher; Choi, Ju-hyuck; Nielsen, Ulrik Dam

    2017-01-01

    Previous research has shown that the First Order Reliability Method (FORM) can be an efficient method for estimation of outcrossing rates and extreme value statistics for stationary stochastic processes. This is so also for bifurcation type of processes like parametric roll of ships. The present...

  2. Assessing pupil and school performance by non-parametric and parametric techniques

    NARCIS (Netherlands)

    de Witte, K.; Thanassoulis, E.; Simpson, G.; Battisti, G.; Charlesworth-May, A.

    2010-01-01

    This paper discusses the use of the non-parametric free disposal hull (FDH) and the parametric multi-level model (MLM) as alternative methods for measuring pupil and school attainment where hierarchical structured data are available. Using robust FDH estimates, we show how to decompose the overall

  3. Sensitivity of Technical Efficiency Estimates to Estimation Methods: An Empirical Comparison of Parametric and Non-Parametric Approaches

    OpenAIRE

    de-Graft Acquah, Henry

    2014-01-01

This paper highlights the sensitivity of technical efficiency estimates to estimation approaches using empirical data. Firm-specific technical efficiency and mean technical efficiency are estimated using the non-parametric Data Envelopment Analysis (DEA) and the parametric Corrected Ordinary Least Squares (COLS) and Stochastic Frontier Analysis (SFA) approaches. Mean technical efficiency is found to be sensitive to the choice of estimation technique. Analysis of variance and Tukey’s test sugge...

  4. Mathematical statistics

    CERN Document Server

    Pestman, Wiebe R

    2009-01-01

    This textbook provides a broad and solid introduction to mathematical statistics, including the classical subjects hypothesis testing, normal regression analysis, and normal analysis of variance. In addition, non-parametric statistics and vectorial statistics are considered, as well as applications of stochastic analysis in modern statistics, e.g., Kolmogorov-Smirnov testing, smoothing techniques, robustness and density estimation. For students with some elementary mathematical background. With many exercises. Prerequisites from measure theory and linear algebra are presented.

  5. The geometry of distributional preferences and a non-parametric identification approach: The Equality Equivalence Test.

    Science.gov (United States)

    Kerschbamer, Rudolf

    2015-05-01

    This paper proposes a geometric delineation of distributional preference types and a non-parametric approach for their identification in a two-person context. It starts with a small set of assumptions on preferences and shows that this set (i) naturally results in a taxonomy of distributional archetypes that nests all empirically relevant types considered in previous work; and (ii) gives rise to a clean experimental identification procedure - the Equality Equivalence Test - that discriminates between archetypes according to core features of preferences rather than properties of specific modeling variants. As a by-product the test yields a two-dimensional index of preference intensity.

  6. Image sequence analysis in nuclear medicine: (1) Parametric imaging using statistical modelling

    International Nuclear Information System (INIS)

    Liehn, J.C.; Hannequin, P.; Valeyre, J.

    1989-01-01

This is a review of parametric imaging methods in Nuclear Medicine. A Parametric Image is an image in which each pixel value is a function of the values of the same pixel across an image sequence. The Local Model Method fits each pixel's time-activity curve with a model whose parameter values form the Parametric Images. The Global Model Method models the changes between two images and is applied to image comparison. For both methods, the different models, the identification criterion, the optimization methods and the statistical properties of the images are discussed. The analysis of one or more Parametric Images is performed using 1D or 2D histograms. The statistically significant Parametric Images (Images of significant Variances, Amplitudes and Differences) are also proposed [fr]

  7. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.

  8. A non-parametric consistency test of the ΛCDM model with Planck CMB data

    Energy Technology Data Exchange (ETDEWEB)

    Aghamousa, Amir; Shafieloo, Arman [Korea Astronomy and Space Science Institute, Daejeon 305-348 (Korea, Republic of); Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr [School of Physics, The University of New South Wales, Sydney NSW 2052 (Australia)

    2017-09-01

    Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP-reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.
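The idea of looking for structure in the residuals of a best-fit can be illustrated, far more crudely than with a Gaussian process, by a Wald-Wolfowitz runs test on the residual signs. This is a hypothetical stand-in for the paper's method, shown only to make the notion of "structure" concrete:

```python
import math

def runs_test_z(residuals):
    # z-score for the number of sign runs; structure in the residuals
    # shows up as too few runs (clumping) or too many (alternation).
    signs = [r > 0 for r in residuals if r != 0]
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (mu - 1) * (mu - 2) / (n1 + n2 - 1)
    return (runs - mu) / math.sqrt(var)
```

A large positive z flags alternating residuals, a large negative z flags clumped ones; |z| near zero is consistent with white residuals.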

  9. Statistical trend analysis methods for temporal phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Lehtinen, E.; Pulkkinen, U. [VTT Automation, (Finland); Poern, K. [Poern Consulting, Nykoeping (Sweden)

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods. 14 refs, 10 figs.
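Of the four non-parametric tests listed, the Cox-Stuart test is the simplest to sketch: pair each observation in the first half of the series with its counterpart in the second half and apply an exact sign (binomial) test to the differences. The helper below is an illustrative implementation, not the report's code:

```python
import math

def cox_stuart_p(x):
    # Two-sided Cox-Stuart trend test (exact binomial); the middle
    # value is dropped for odd-length series and tied pairs discarded.
    c = len(x) // 2
    offset = c + len(x) % 2
    diffs = [x[i + offset] - x[i] for i in range(c) if x[i + offset] != x[i]]
    n, k = len(diffs), sum(d > 0 for d in diffs)
    tail = min(k, n - k)
    p = 2 * sum(math.comb(n, i) for i in range(tail + 1)) * 0.5 ** n
    return min(1.0, p)
```

A monotone series gives a small p-value; a purely periodic series with no drift gives p = 1.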

  10. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    Lehtinen, E.; Pulkkinen, U.; Poern, K.

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods
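The Mann (Mann-Kendall) test mentioned alongside the others counts concordant versus discordant pairs over time. A compact sketch using the normal approximation, without the tie correction and with illustrative naming, is:

```python
import math

def mann_kendall_z(x):
    # S counts pairs that increase minus pairs that decrease over time;
    # z uses the no-ties variance n(n-1)(2n+5)/18 and a continuity
    # correction of +/-1 on S.
    n = len(x)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        return (s - 1) / math.sqrt(var)
    if s < 0:
        return (s + 1) / math.sqrt(var)
    return 0.0
```

Compare |z| with a standard normal quantile (e.g. 1.96 for a 5% two-sided test) to decide on a monotone trend.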

  11. Speaker Linking and Applications using Non-Parametric Hashing Methods

    Science.gov (United States)

    2016-09-08

Speaker Linking and Applications using Non-Parametric Hashing Methods, Douglas Sturim and William M. Campbell, MIT Lincoln Laboratory, Lexington, MA. ... with many approaches [1, 2]. For this paper, we focus on using i-vectors [2], but the methods apply to any embedding. For the task of speaker QBE and

  12. A Range-Based Test for the Parametric Form of the Volatility in Diffusion Models

    DEFF Research Database (Denmark)

    Podolskij, Mark; Ziggel, Daniel

    statistic. Under rather weak assumptions on the drift and volatility we prove weak convergence of the test statistic to a centered mixed Gaussian distribution. As a consequence we obtain a test, which is consistent for any fixed alternative. Moreover, we present a parametric bootstrap procedure which...

  13. Nonparametric predictive inference for combining diagnostic tests with parametric copula

    Science.gov (United States)

    Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.

    2017-09-01

Measuring the accuracy of diagnostic tests is crucial in many application areas including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests. The area under the ROC curve (AUC) is often used as a measure of the overall performance of the diagnostic test. In this paper, we are interested in developing strategies for combining test results in order to increase the diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while modelling the dependence structure with a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations. NPI uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula, in turn, is a well-known statistical concept for modelling the dependence of random variables: a joint distribution function whose marginals are all uniformly distributed, which can be used to model the dependence separately from the marginal distributions. In this research, we estimate the copula density using a parametric method, the maximum likelihood estimator (MLE). We investigate the performance of the proposed method on data sets from the literature and discuss the results to show how our method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.
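The AUC that this work builds on is equivalent to the Mann-Whitney probability that a diseased subject's score exceeds a healthy subject's score. An empirical version (with illustrative function and argument names) is:

```python
def empirical_auc(healthy, diseased):
    # Probability that a diseased score exceeds a healthy one,
    # counting ties as one half (Mann-Whitney form of the AUC).
    wins = sum(
        (d > h) + 0.5 * (d == h) for d in diseased for h in healthy
    )
    return wins / (len(healthy) * len(diseased))
```

Perfect separation gives an AUC of 1.0, a useless test gives 0.5, and a test whose scores are reversed gives 0.0.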

  14. Testing Parametric versus Semiparametric Modelling in Generalized Linear Models

    NARCIS (Netherlands)

    Härdle, W.K.; Mammen, E.; Müller, M.D.

    1996-01-01

We consider a generalized partially linear model E(Y|X,T) = G{X'b + m(T)} where G is a known function, b is an unknown parameter vector, and m is an unknown function. The paper introduces a test statistic which allows one to decide between a parametric and a semiparametric model: (i) m is linear, i.e.

  15. Two-Sample Statistics for Testing the Equality of Survival Functions Against Improper Semi-parametric Accelerated Failure Time Alternatives: An Application to the Analysis of a Breast Cancer Clinical Trial

    Science.gov (United States)

    BROËT, PHILIPPE; TSODIKOV, ALEXANDER; DE RYCKE, YANN; MOREAU, THIERRY

    2010-01-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627

  17. Comparison of parametric and bootstrap method in bioequivalence test.

    Science.gov (United States)

    Ahn, Byung-Jin; Yim, Dong-Seok

    2009-10-01

    The estimation of 90% parametric confidence intervals (CIs) of mean AUC and Cmax ratios in bioequivalence (BE) tests is based upon the assumption that formulation effects in log-transformed data are normally distributed. To compare the parametric CIs with those obtained from nonparametric methods we performed repeated estimation of bootstrap-resampled datasets. The AUC and Cmax values from 3 archived datasets were used. BE tests on 1,000 resampled datasets from each archived dataset were performed using SAS (Enterprise Guide Ver.3). Bootstrap nonparametric 90% CIs of formulation effects were then compared with the parametric 90% CIs of the original datasets. The 90% CIs of formulation effects estimated from the 3 archived datasets were slightly different from the nonparametric 90% CIs obtained from BE tests on resampled datasets. Histograms and density curves of formulation effects obtained from resampled datasets were similar to those of a normal distribution. However, in 2 of 3 resampled log(AUC) datasets, the estimates of formulation effects did not follow the Gaussian distribution. Bias-corrected and accelerated (BCa) CIs, one of the nonparametric CIs of formulation effects, shifted outside the parametric 90% CIs of the archived datasets in these 2 non-normally distributed resampled log(AUC) datasets. Currently, the 80-125% rule based upon the parametric 90% CIs is widely accepted under the assumption of normally distributed formulation effects in log-transformed data. However, nonparametric CIs may be a better choice when data do not follow this assumption.
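The parametric-versus-bootstrap comparison described above can be sketched in a few lines. The data, sample size, resample count and t-quantile below are hypothetical stand-ins (the study used three archived datasets and SAS); this is a minimal illustration of a percentile-bootstrap 90% CI against a t-based 90% CI for a formulation effect, not the authors' implementation.

```python
import math
import random

# Hypothetical paired log(AUC) differences (test - reference), 24 subjects;
# synthetic data standing in for an archived BE dataset.
random.seed(1)
diffs = [random.gauss(0.05, 0.2) for _ in range(24)]
n = len(diffs)

mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
se = sd / math.sqrt(n)
t90 = 1.714  # approximate one-sided 0.95 t quantile, df = 23

# Parametric 90% CI of the formulation effect, back-transformed to a ratio
param_ci = (math.exp(mean - t90 * se), math.exp(mean + t90 * se))

# Non-parametric bootstrap percentile 90% CI (1,000 resamples)
boot_means = []
for _ in range(1000):
    resample = [random.choice(diffs) for _ in range(n)]
    boot_means.append(sum(resample) / n)
boot_means.sort()
boot_ci = (math.exp(boot_means[49]), math.exp(boot_means[949]))

# Bioequivalence by the 80-125% rule: both CI limits inside [0.80, 1.25]
print(param_ci, boot_ci)
```

When the log-differences really are Gaussian, the two intervals agree closely; skewed data is where the percentile (or BCa) interval starts to drift away from the t-based one.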

  18. Parametric and Non-Parametric Vibration-Based Structural Identification Under Earthquake Excitation

    Science.gov (United States)

    Pentaris, Fragkiskos P.; Fouskitakis, George N.

    2014-05-01

    The problem of modal identification in civil structures is of crucial importance, and thus has been receiving increasing attention in recent years. Vibration-based methods are quite promising as they are capable of identifying the structure's global characteristics, they are relatively easy to implement and they tend to be time-effective and less expensive than most alternatives [1]. This paper focuses on the off-line structural/modal identification of civil (concrete) structures subjected to low-level earthquake excitations, under which they remain within their linear operating regime. Earthquakes and their details are recorded and provided by the seismological network of Crete [2], which 'monitors' the broad region of the south Hellenic arc, an active seismic region which functions as a natural laboratory for earthquake engineering of this kind. A sufficient number of seismic events are analyzed in order to reveal the modal characteristics of the structures under study, which consist of the two concrete buildings of the School of Applied Sciences, Technological Education Institute of Crete, located in Chania, Crete, Hellas. Both buildings are equipped with high-sensitivity and high-accuracy seismographs - providing acceleration measurements - established at the basement (structure's foundation), presently considered as the ground's acceleration (excitation), and at all levels (ground floor, 1st floor, 2nd floor and terrace). Further details regarding the instrumentation setup and data acquisition may be found in [3]. The present study invokes stochastic methods, both non-parametric (frequency-based) and parametric, for structural/modal identification (natural frequencies and/or damping ratios). Non-parametric methods include Welch-based spectrum and Frequency Response Function (FRF) estimation, while parametric methods include AutoRegressive (AR), AutoRegressive with eXogenous input (ARX) and AutoRegressive Moving-Average with eXogenous input (ARMAX) models [4, 5

  19. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
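The abstract's central point, that parameter rankings change once correlations are honoured, can be sketched with a toy local-sensitivity calculation. The two-species Langmuir isotherm below is a textbook stand-in for the benchmark model; the function, parameter values and the assumed linear correlation kB = c*kA are all hypothetical, and the paper's actual methodology (gradient-based indices with estimated correlations) is not reproduced here.

```python
def coverage_A(kA, kB, pA=1.0, pB=1.0):
    """Langmuir competitive adsorption: fractional coverage of species A,
    theta_A = kA*pA / (1 + kA*pA + kB*pB). Hypothetical stand-in model."""
    return kA * pA / (1.0 + kA * pA + kB * pB)

def local_sensitivity(f, params, i, h=1e-6):
    """Central finite-difference estimate of d f / d params[i]."""
    up = list(params); up[i] += h
    dn = list(params); dn[i] -= h
    return (f(*up) - f(*dn)) / (2 * h)

kA, kB = 2.0, 1.0
s_kA = local_sensitivity(coverage_A, [kA, kB], 0)  # treat kB as fixed
s_kB = local_sensitivity(coverage_A, [kA, kB], 1)  # treat kA as fixed

# If the parameters are correlated, say kB = c * kA, perturbing kA drags
# kB along, so the correlation-aware sensitivity carries an extra term:
c = kB / kA
s_total = s_kA + c * s_kB
print(s_kA, s_kB, s_total)
```

Here the naive and correlation-aware sensitivities to kA differ by a factor of two, which is exactly the kind of ranking shift the abstract warns about.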

  1. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Background: With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with a normal functioning allograft. Results: The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist-diagnosed class labels. Conclusion: We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been

  2. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved, and real applications are illustrated using examples. Theory and exercises are provided. The incorrect use of many tests when applying most statistical software is highlighted and discussed.

  3. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  4. Non-parametric estimation of the individual's utility map

    OpenAIRE

    Noguchi, Takao; Sanborn, Adam N.; Stewart, Neil

    2013-01-01

    Models of risky choice have attracted much attention in behavioural economics. Previous research has repeatedly demonstrated that individuals' choices are not well explained by expected utility theory, and a number of alternative models have been examined using carefully selected sets of choice alternatives. The model performance, however, can depend on which choice alternatives are being tested. Here we develop a non-parametric method for estimating the utility map over the wide range of choi...

  5. Experimental Sentinel-2 LAI estimation using parametric, non-parametric and physical retrieval methods - A comparison

    NARCIS (Netherlands)

    Verrelst, Jochem; Rivera, Juan Pablo; Veroustraete, Frank; Muñoz-Marí, Jordi; Clevers, J.G.P.W.; Camps-Valls, Gustau; Moreno, José

    2015-01-01

    Given the forthcoming availability of Sentinel-2 (S2) images, this paper provides a systematic comparison of retrieval accuracy and processing speed of a multitude of parametric, non-parametric and physically-based retrieval methods using simulated S2 data. An experimental field dataset (SPARC),

  6. Non-statistical behavior of coupled optical systems

    International Nuclear Information System (INIS)

    Perez, G.; Pando Lambruschini, C.; Sinha, S.; Cerdeira, H.A.

    1991-10-01

    We study globally coupled chaotic maps modeling an optical system, and find clear evidence of non-statistical behavior: the mean square deviation (MSD) of the mean field saturates with respect to increase in the number of elements coupled, after a critical value, and its distribution is clearly non-Gaussian. We also find that the power spectrum of the mean field displays well defined peaks, indicating a subtle coherence among different elements, even in the ''turbulent'' phase. This system is a physically realistic model that may be experimentally realizable. It is also a higher dimensional example (as each individual element is given by a complex map). Its study confirms that the phenomena observed in a wide class of coupled one-dimensional maps are present here as well. This gives more evidence to believe that such non-statistical behavior is probably generic in globally coupled systems. We also investigate the influence of parametric fluctuations on the MSD. (author). 10 refs, 7 figs, 1 tab
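The saturation effect described in this abstract can be reproduced with a toy model. The sketch below uses real-valued globally coupled logistic maps rather than the complex maps of the paper, and the parameter values are illustrative only. If the mean field obeyed naive statistics, its mean square deviation would keep falling like 1/N as more elements are coupled.

```python
import random

def mean_field_msd(n_elements, n_steps=2000, a=1.9, eps=0.1):
    """MSD of the mean field h(t) of globally coupled logistic maps:
    x_i(t+1) = (1-eps)*f(x_i(t)) + eps*h(t), with f(x) = 1 - a*x^2
    and h(t) the average of f over all elements. Toy stand-in model."""
    random.seed(0)
    x = [random.uniform(-1, 1) for _ in range(n_elements)]
    fields = []
    for t in range(n_steps):
        h = sum(1 - a * xi * xi for xi in x) / n_elements
        x = [(1 - eps) * (1 - a * xi * xi) + eps * h for xi in x]
        if t > 500:          # discard the transient
            fields.append(h)
    m = sum(fields) / len(fields)
    return sum((f - m) ** 2 for f in fields) / len(fields)

# Naive statistics predict MSD ~ 1/N; a plateau above some N is the
# kind of non-statistical behaviour the abstract describes.
for n in (10, 100, 1000):
    print(n, mean_field_msd(n))
```

Plotting MSD against N on log-log axes makes the departure from the 1/N law easy to see.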

  7. Voxel-based analysis of cerebral glucose metabolism in AD and non-AD degenerative dementia using statistical parametric mapping

    International Nuclear Information System (INIS)

    Li Zugui; Gao Shuo; Zhang Benshu; Ma Aijun; Cai Li; Li Dacheng; Li Yansheng; Liu Lei

    2008-01-01

    Objective: It is known that Alzheimer's disease (AD) and non-AD degenerative dementia have some clinical features in common. The aim of this study was to investigate the specific patterns of regional cerebral glucose metabolism of AD and non-AD degenerative dementia patients, using a voxel-based 18F-fluorodeoxyglucose (FDG) PET study. Methods: Twenty-three AD patients and 24 non-AD degenerative dementia patients, including 9 Parkinson's disease with dementia (PDD), 7 frontal-temporal dementia (FTD) and 8 dementia with Lewy bodies (DLB) patients, and 40 normal controls (NC) were included in the study. To evaluate the relative cerebral metabolic rate of glucose (rCMRglc), 18F-FDG PET imaging was performed in all subjects. Subsequently, statistical comparison of PET data with NC was performed using statistical parametric mapping (SPM). Results: The AD-associated FDG imaging pattern typically presented as focal cortical hypometabolism in bilateral parietotemporal association cortices and (or) the frontal lobe and the posterior cingulate gyrus. As compared with the NC group, the FTD group demonstrated significant regional reductions in rCMRglc in bilateral frontal and parietal lobes, the cingulate gyri, insulae, left precuneus, and the subcortical structures (including right putamen, right medial dorsal nucleus and ventral anterior nucleus). The PDD group showed regional reductions in rCMRglc in bilateral frontal cortices, parietotemporal association cortices, and the subcortical structures (including left caudate, right putamen, the dorsomedial thalamus, lateral posterior nucleus, and pulvinar). By the voxel-by-voxel comparison between the DLB group and the NC group, regional reductions in rCMRglc included bilateral occipital cortices, precuneuses, frontal and parietal lobes, left anterior cingulate gyrus, right superior temporal cortex, and the subcortical structures including putamen, caudate, lateral posterior nucleus, and pulvinar. Conclusions: The rCMRglc was found to be different

  8. Cliff's Delta Calculator: A non-parametric effect size program for two groups of observations

    Directory of Open Access Journals (Sweden)

    Guillermo Macbeth

    2011-05-01

    The Cliff's Delta statistic is an effect size measure that quantifies the amount of difference between two non-parametric variables beyond p-value interpretation. This measure can be understood as a useful complementary analysis for the corresponding hypothesis testing. During the last two decades the use of effect size measures has been strongly encouraged by methodologists and leading institutions of behavioral sciences. The aim of this contribution is to introduce the Cliff's Delta Calculator software that performs such analysis and offers some interpretation tips. Differences and similarities with the parametric case are analysed and illustrated. The implementation of this free program is fully described and compared with other calculators. Alternative algorithmic approaches are mathematically analysed and a basic linear algebra proof of their equivalence is formally presented. Two worked examples in cognitive psychology are commented. A visual interpretation of Cliff's Delta is suggested. Availability, installation and applications of the program are presented and discussed.
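Cliff's delta itself is straightforward to compute from pairwise dominance counts; a minimal sketch (not the calculator program described in the article, which adds interpretation aids and a GUI):

```python
def cliffs_delta(xs, ys):
    """Cliff's delta: P(X > Y) - P(X < Y), estimated over all pairs.
    Ranges from -1 (every x below every y) to +1 (every x above every y);
    0 means the two groups overlap completely."""
    gt = sum(1 for x in xs for y in ys if x > y)
    lt = sum(1 for x in xs for y in ys if x < y)
    return (gt - lt) / (len(xs) * len(ys))

# Worked examples: complete separation and complete overlap
print(cliffs_delta([4, 5, 6], [1, 2, 3]))   # -> 1.0
print(cliffs_delta([1, 2, 3], [1, 2, 3]))   # -> 0.0
```

Because only order matters, the statistic is insensitive to monotone transformations of the data, which is what makes it a natural non-parametric companion to rank tests.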

  9. Synchronization of chaos in non-identical parametrically excited systems

    International Nuclear Information System (INIS)

    Idowu, B.A.; Vincent, U.E.; Njah, A.N.

    2009-01-01

    In this paper, we investigate the synchronization of chaotic systems consisting of non-identical parametrically excited oscillators. The active control technique is employed to design control functions based on Lyapunov stability theory and Routh-Hurwitz criteria so as to achieve global chaos synchronization between a parametrically excited gyroscope and each of the parametrically excited pendulum and Duffing oscillator. Numerical simulations are implemented to verify the results.

  10. Crossing statistic: Bayesian interpretation, model selection and resolving dark energy parametrization problem

    International Nuclear Information System (INIS)

    Shafieloo, Arman

    2012-01-01

    By introducing Crossing functions and hyper-parameters I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or assume any particular form of parametrization for the cosmological quantities like luminosity distance, Hubble parameter or equation of state of dark energy. Instead, hyper-parameters of Crossing functions perform as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters, hence the issue of dark energy parametrization is resolved. It will also be shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method in testing cosmological models dealing with data with high uncertainties.

  11. Estimating technical efficiency in the hospital sector with panel data: a comparison of parametric and non-parametric techniques.

    Science.gov (United States)

    Siciliani, Luigi

    2006-01-01

    Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. Highest correlations are found in the efficiency scores between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.

  12. Parametric Sensitivity Tests- European PEM Fuel Cell Stack Test Procedures

    DEFF Research Database (Denmark)

    Araya, Samuel Simon; Andreasen, Søren Juhl; Kær, Søren Knudsen

    2014-01-01

    As fuel cells are increasingly commercialized for various applications, harmonized and industry-relevant test procedures are necessary to benchmark tests and to ensure comparability of stack performance results from different parties. This paper reports the results of parametric sensitivity tests performed based on test procedures proposed by a European project, Stack-Test. The sensitivity of a Nafion-based low temperature PEMFC stack's performance to parametric changes was the main objective of the tests. Four crucial parameters for fuel cell operation were chosen: relative humidity, temperature, pressure, and stoichiometry at varying current density. Furthermore, procedures for polarization curve recording were also tested both in ascending and descending current directions.

  13. Appraisal of within- and between-laboratory reproducibility of non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: comparison of OECD TG429 performance standard and statistical evaluation.

    Science.gov (United States)

    Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin

    2015-05-05

    The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for skin sensitization testing, but the use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained must fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with the stimulation index (SI), the raw data for ECt calculation, produced from 3 laboratories. Descriptive statistics along with graphical representation of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of SI of a concurrent positive control, and the robustness of results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For those two labs that satisfied the within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are point...

  15. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
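As an illustration of the parametric/non-parametric distinction this article summarizes, the sketch below implements a Wilcoxon rank-sum (Mann-Whitney) test with the large-sample normal approximation. It is a minimal teaching version: no tie correction, no exact small-sample tables, and the data are made up.

```python
import math

def rank_sum_test(xs, ys):
    """Wilcoxon rank-sum test via the normal approximation.
    Minimal sketch: assumes no tied values and moderate sample sizes."""
    combined = sorted((v, 0 if i < len(xs) else 1)
                      for i, v in enumerate(list(xs) + list(ys)))
    # ranks run 1..n1+n2; sum the ranks belonging to the first sample
    w = sum(rank for rank, (v, grp) in enumerate(combined, start=1)
            if grp == 0)
    n1, n2 = len(xs), len(ys)
    mean_w = n1 * (n1 + n2 + 1) / 2
    sd_w = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mean_w) / sd_w
    # two-sided p-value from the standard normal distribution
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p

z, p = rank_sum_test([1.1, 2.3, 2.9, 3.6], [4.2, 5.0, 6.1, 7.5])
print(round(z, 2), round(p, 3))
```

Unlike the t-test listed alongside it, this test uses only the ordering of the observations, so it stays valid when the normality assumption behind the parametric tests fails.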

  16. SOCR Analyses: Implementation and Demonstration of a New Graphical Statistics Educational Toolkit

    Directory of Open Access Journals (Sweden)

    Annie Chu

    2009-04-01

    The web-based, Java-written SOCR (Statistical Online Computational Resource) tools have been utilized in many undergraduate and graduate level statistics courses for seven years now (Dinov 2006; Dinov et al. 2008b). It has been proven that these resources can successfully improve students' learning (Dinov et al. 2008b). Being first published online in 2005, SOCR Analyses is a somewhat new component and it concentrates on data modeling for both parametric and non-parametric data analyses with graphical model diagnostics. One of the main purposes of SOCR Analyses is to facilitate statistical learning for high school and undergraduate students. As we have already implemented SOCR Distributions and Experiments, SOCR Analyses and Charts fulfill the rest of a standard statistics curriculum. Currently, there are four core components of SOCR Analyses. Linear models included in SOCR Analyses are simple linear regression, multiple linear regression, one-way and two-way ANOVA. Tests for sample comparisons include the t-test in the parametric category. Some examples in SOCR Analyses' non-parametric category are the Wilcoxon rank sum test, Kruskal-Wallis test, Friedman's test, Kolmogorov-Smirnov test and Fligner-Killeen test. Hypothesis testing models include the contingency table, Friedman's test and Fisher's exact test. The last component of Analyses is a utility for computing sample sizes for the normal distribution. In this article, we present the design framework, computational implementation and the utilization of SOCR Analyses.

  17. Comparative Study of Parametric and Non-parametric Approaches in Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Katebi, S.D.; Blanke, M.; Katebi, M.R.

    This report describes a comparative study between two approaches to fault detection and isolation in dynamic systems. The first approach uses a parametric model of the system. The main components of such techniques are residual and signature generation for processing and analyzing. The second approach is non-parametric in the sense that the signature analysis is only dependent on the frequency or time domain information extracted directly from the input-output signals. Based on these approaches, two different fault monitoring schemes are developed where the feature extraction and fault decision

  18. Testing isotropy in the local Universe

    Energy Technology Data Exchange (ETDEWEB)

    Appleby, Stephen; Shafieloo, Arman, E-mail: stephen.appleby@apctp.org, E-mail: arman@apctp.org [Asia Pacific Center for Theoretical Physics, Pohang, Gyeongbuk 790-784 (Korea, Republic of)

    2014-10-01

    We test the isotropy of the local distribution of galaxies using the 2MASS extended source catalogue. By decomposing the full sky survey into distinct patches and using a combination of photometric and spectroscopic redshift data, we use both parametric and non-parametric methods to obtain the shape of the luminosity function in each patch. We use the shape of the luminosity function to test the statistical isotropy of the underlying galaxy distribution. The parametric estimator shows some evidence of a hemispherical asymmetry in the north/south Galactic plane. However the non-parametric estimator exhibits no significant anisotropy, with the galaxy distribution being consistent with the assumption of isotropy in all regions considered. The parametric asymmetry is attributed to the relatively poor fit of the functional form to the underlying data. When using the non-parametric estimator, we do find a dipole in the shape of the luminosity function, with maximal deviation from isotropy at galactic coordinate (b,l) = (30°, 315°). However we can ascribe no strong statistical significance to this observation.

  19. A Monte Carlo-adjusted goodness-of-fit test for parametric models describing spatial point patterns

    KAUST Repository

    Dao, Ngocanh; Genton, Marc G.

    2014-01-01

    Assessing the goodness-of-fit (GOF) for intricate parametric spatial point process models is important for many application fields. When the probability density of the statistic of the GOF test is intractable, a commonly used procedure is the Monte

  20. SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.

    Science.gov (United States)

    Chu, Annie; Cui, Jenny; Dinov, Ivo D

    2009-03-01

    The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include commonly used models in undergraduate statistics courses like linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category; and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), hoping to contribute to the efforts of the statistical computing community. The code includes functionality for each specific analysis model and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented in statistical summary, least square solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of the SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is on-going and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for most

  1. Parametric Roll Resonance Detection using Phase Correlation and Log-likelihood Testing Techniques

    DEFF Research Database (Denmark)

    Galeazzi, Roberto; Blanke, Mogens; Poulsen, Niels Kjølstad

    2009-01-01

    Real-time detection of parametric roll is still an open issue that is gathering increasing attention. First generation warning systems, based on guidelines and polar diagrams, showed their potential to face issues like long-term prediction and risk assessment. This paper presents a second generation warning system, the purpose of which is to provide the master with an onboard system able to trigger an alarm when parametric roll is likely to happen within the immediate future. A detection scheme is introduced which is able to issue a warning within five roll periods after a resonant motion has started. After having determined statistical properties of the signals at hand, a detector based on the generalised log-likelihood ratio test (GLRT) is designed to look for variation in signal power. The ability of the detector to trigger alarms when parametric roll is going to onset is evaluated on two
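The power-change GLRT idea can be sketched as follows. For a zero-mean Gaussian window, maximizing the likelihood over the unknown variance under H1 and comparing against a nominal variance under H0 gives a closed-form statistic. The window lengths, variances and synthetic data below are illustrative only; the paper's roll-signal model, preprocessing and alarm thresholds are not reproduced.

```python
import math
import random

def glrt_variance(window, sigma0_sq):
    """Generalised log-likelihood ratio for H1: variance != sigma0_sq
    vs H0: variance == sigma0_sq, zero-mean Gaussian samples.
    Statistic = (n/2) * (r - 1 - ln r), r = sigma1_hat^2 / sigma0^2,
    where sigma1_hat^2 is the ML variance estimate under H1."""
    n = len(window)
    sigma1_sq = sum(v * v for v in window) / n
    r = sigma1_sq / sigma0_sq
    return 0.5 * n * (r - 1 - math.log(r))

random.seed(0)
quiet = [random.gauss(0, 1.0) for _ in range(200)]  # nominal roll motion
loud = [random.gauss(0, 3.0) for _ in range(200)]   # resonance: power grows
print(glrt_variance(quiet, 1.0))  # small: stay silent
print(glrt_variance(loud, 1.0))   # large: raise the alarm
```

In a detector this statistic would be computed on a sliding window and compared against a threshold chosen for a desired false-alarm rate.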

  2. Parametric analysis of the statistical model of the stick-slip process

    Science.gov (United States)

    Lima, Roberta; Sampaio, Rubens

    2017-06-01

    This paper performs a parametric analysis of the statistical model of the response of a dry-friction oscillator. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-friction force modeled as Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion, whose velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces stochastic stick-slip oscillations in the system, so the response is a random sequence alternating between stick- and slip-modes. From realizations of the system, a statistical model is constructed for this sequence, in which the variables of interest are modeled as random variables: for example, the number of time intervals in which stick or slip occurs, the instants at which they begin, and their durations. Samples of the system response are computed by integrating the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated from the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to perform a parametric analysis of the statistical model of the stick-slip process.

  3. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  4. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA mean-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
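A minimal sketch of such an analysis pipeline, using SciPy, might look as follows. The survey scores, group labels and the 0.05 threshold are invented for illustration; the abstract's actual survey instrument and data are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical survey scores from two groups of respondents
group_a = rng.normal(3.5, 0.8, 40)
group_b = rng.normal(3.0, 0.8, 40)

# Kolmogorov-Smirnov test against a fitted normal decides between the
# parametric t-test and a non-parametric alternative
_, ks_p = stats.kstest(group_a, 'norm',
                       args=(group_a.mean(), group_a.std(ddof=1)))
if ks_p > 0.05:
    _, p = stats.ttest_ind(group_a, group_b, equal_var=False)  # Welch's t-test
else:
    _, p = stats.mannwhitneyu(group_a, group_b)
print(f"group comparison p-value: {p:.4f}")
```

In a full survey analysis, a one-way ANOVA with an LSD post-hoc test would replace the two-sample comparison when more than two groups are compared.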

  5. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

    This technical report revisits the analysis of family-wise error rates in statistical parametric mapping-using random field theory-reported in (Eklund et al. []: arXiv 1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions-and random field theory-in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of (Eklund et al. []: arXiv 1511.01863) for parametric procedures. Hum Brain Mapp, 2017. © 2017 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  6. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns elicited by time-course experiments. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring complex relationships in omics data when studying their association with disease and health.

  7. Revisiting non-degenerate parametric down-conversion

    Indian Academy of Sciences (India)

    conversion process is studied by recasting the time evolution equations for the basic operators in an equivalent ... We consider a model of non-degenerate parametric down-conversion process composed of two coupled ..... e^(-iω_a t) and e^(iω_b t) have been left out in writing down the final results in ref. [4], even though these ...

  8. Bayesian non- and semi-parametric methods and applications

    CERN Document Server

    Rossi, Peter

    2014-01-01

    This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number

  9. Trend Analysis of Pahang River Using Non-Parametric Analysis: Mann-Kendall Trend Test

    International Nuclear Information System (INIS)

    Nur Hishaam Sulaiman; Mohd Khairul Amri Kamarudin; Mohd Khairul Amri Kamarudin; Ahmad Dasuki Mustafa; Muhammad Azizi Amran; Fazureen Azaman; Ismail Zainal Abidin; Norsyuhada Hairoma

    2015-01-01

    Flooding is common in Pahang, especially during the northeast monsoon season from November to February. Three river cross-stations, Lubuk Paku, Sg. Yap and Temerloh, were selected as the study area. The stream flow and water level data were gathered from DID records. The data set for this study was analysed using a non-parametric method, the Mann-Kendall Trend Test. The results obtained from the stream flow and water level analysis indicate positively significant trends for Lubuk Paku (0.001) and Sg. Yap (<0.0001) from 1972-2011, with p-values < 0.05. Temerloh (0.178) data from 1963-2011 recorded no trend for the stream flow parameter but a negative trend for the water level parameter. Hydrological patterns and trends are strongly affected by external factors such as the northeast monsoon season, which develops over the South China Sea and affects Pahang from November to March. Other factors, such as development and management of the areas, can also be considered as affecting the data and results. The hydrological pattern is important for indicating river trends such as stream flow and water level, and it can be used for flood mitigation by local authorities. (author)
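For readers who want to reproduce this kind of analysis, a minimal Mann-Kendall trend test can be written directly. This is a sketch with a simulated stream-flow series (the study's DID data are not reproduced); real analyses should add the tie correction and, often, an adjustment for serial correlation.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test (normal approximation, no tie correction):
    returns the S statistic, the Z score and a two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S = sum of sign(x_j - x_i) over all pairs i < j
    s = sum(np.sign(x[i + 1:] - x[i]).sum() for i in range(n - 1))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s)  # continuity correction
    p = 2.0 * stats.norm.sf(abs(z))
    return s, z, p

rng = np.random.default_rng(2)
years = np.arange(1972, 2012)
flow = 0.4 * (years - 1972) + rng.normal(0.0, 2.0, years.size)  # rising flow
s, z, p = mann_kendall(flow)
```

A positive S with p < 0.05 corresponds to the "positively significant trend" reported for Lubuk Paku and Sg. Yap.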

  10. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable...... of the energy at mean-level crossings, which yields the damping relative to white noise intensity. Finally, an estimate of the noise intensity is extracted by estimating the absolute damping from the autocovariance functions of a set of modified phase plane variables at different energy levels. The method...

  11. A nonparametric spatial scan statistic for continuous data.

    Science.gov (United States)

    Jung, Inkyung; Cho, Ho Jin

    2015-10-20

    Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compare the performance of the method with parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
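The idea can be illustrated with a naive brute-force version of such a scan. This is a sketch, not the authors' implementation, and the reported p-value is only nominal: real scan statistics calibrate the maximum statistic over all windows by Monte Carlo replication.

```python
import numpy as np
from scipy import stats

def rank_sum_scan(coords, values, radius):
    """Naive nonparametric scan: take each point as a candidate centre,
    compare values inside the circular window with those outside using the
    Wilcoxon rank-sum test, and keep the most significant window."""
    best = None
    for c in coords:
        inside = np.linalg.norm(coords - c, axis=1) <= radius
        if 0 < inside.sum() < values.size:
            _, p = stats.ranksums(values[inside], values[~inside])
            if best is None or p < best[1]:
                best = (c, p)
    return best  # (centre of the most significant window, its nominal p-value)

rng = np.random.default_rng(3)
coords = rng.uniform(0.0, 1.0, size=(200, 2))
values = rng.normal(0.0, 1.0, 200)
cluster = np.linalg.norm(coords - [0.3, 0.3], axis=1) < 0.15
values[cluster] += 3.0  # planted cluster of elevated values
centre, p = rank_sum_scan(coords, values, radius=0.15)
```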

  12. rSeqNP: a non-parametric approach for detecting differential expression and splicing from RNA-Seq data.

    Science.gov (United States)

    Shi, Yang; Chinnaiyan, Arul M; Jiang, Hui

    2015-07-01

    High-throughput sequencing of transcriptomes (RNA-Seq) has become a powerful tool to study gene expression. Here we present an R package, rSeqNP, which implements a non-parametric approach to test for differential expression and splicing from RNA-Seq data. rSeqNP uses permutation tests to assess statistical significance and can be applied to a variety of experimental designs. By combining information across isoforms, rSeqNP is able to detect more differentially expressed or spliced genes from RNA-Seq data. The R package, with its source code and documentation, is freely available at http://www-personal.umich.edu/∼jianghui/rseqnp/. jianghui@umich.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
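The permutation-testing idea at the core of rSeqNP can be sketched for a single gene as follows. This is an illustrative two-sample test on the mean difference, not the package's own statistic: rSeqNP itself combines information across isoforms and supports more experimental designs.

```python
import numpy as np

def perm_test(a, b, n_perm=5000, seed=0):
    """Two-sample permutation test on the absolute difference of means;
    returns a p-value with the usual +1 correction."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random relabelling of the samples
        hits += abs(pooled[:a.size].mean() - pooled[a.size:].mean()) >= observed
    return (hits + 1) / (n_perm + 1)

tumour = [8.1, 7.9, 8.4, 8.0, 8.3]   # hypothetical expression values
normal = [5.0, 5.3, 4.8, 5.1, 4.9]
p = perm_test(tumour, normal)
```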

  13. Non-parametric smoothing of experimental data

    International Nuclear Information System (INIS)

    Kuketayev, A.T.; Pen'kov, F.M.

    2007-01-01

    Full text: Rapid processing of experimental data samples in nuclear physics often requires differentiation in order to find extrema. Therefore, even at the preliminary stage of data analysis, a range of noise reduction methods is used to smooth experimental data. There are many non-parametric smoothing techniques: interval averages, moving averages, exponential smoothing, etc. Nevertheless, it is more common to use a priori information about the behavior of the experimental curve to construct smoothing schemes based on least-squares techniques. The latter methodology's advantage is that the area under the curve can be preserved, which is equivalent to conservation of the total counting rate. The disadvantages of this approach appear when a priori information is lacking: very often, for example, the sums of peaks not resolved by the detector are replaced with one peak during data processing, introducing uncontrolled errors in the determination of the physical quantities. The problem is solvable only by experienced personnel whose skills far exceed the challenge. We propose a set of non-parametric techniques which allows the use of any additional information on the nature of the experimental dependence. The method is based on the construction of a functional which includes both experimental data and a priori information; the minimum of this functional is reached on a non-parametric smoothed curve. Euler (Lagrange) differential equations are constructed for these curves, and their solutions are obtained analytically or numerically. The proposed approach allows for automated processing of nuclear physics data, eliminating the need for highly skilled laboratory personnel. It also makes it possible to obtain smoothing curves within a given confidence interval, e.g. according to the χ² distribution. This approach is applicable when constructing smooth solutions of ill-posed problems, in particular when solving
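Two of the elementary smoothers listed above, the moving average and exponential smoothing, can be sketched in a few lines; the window size and smoothing factor are illustrative choices.

```python
import numpy as np

def moving_average(y, window=5):
    """Centred moving average; the window shrinks near the edges."""
    y = np.asarray(y, dtype=float)
    half = window // 2
    return np.array([y[max(0, i - half):i + half + 1].mean()
                     for i in range(y.size)])

def exponential_smoothing(y, alpha=0.3):
    """First-order exponential smoothing with smoothing factor alpha."""
    y = np.asarray(y, dtype=float)
    out = np.empty_like(y)
    out[0] = y[0]
    for i in range(1, y.size):
        out[i] = alpha * y[i] + (1.0 - alpha) * out[i - 1]
    return out
```

Note that neither smoother uses a priori information about the curve; that is exactly the gap the functional-minimization approach in the abstract is meant to fill.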

  14. A Monte Carlo-adjusted goodness-of-fit test for parametric models describing spatial point patterns

    KAUST Repository

    Dao, Ngocanh

    2014-04-03

    Assessing the goodness-of-fit (GOF) for intricate parametric spatial point process models is important for many application fields. When the probability density of the statistic of the GOF test is intractable, a commonly used procedure is the Monte Carlo GOF test. Additionally, if the data comprise a single dataset, a popular version of the test plugs a parameter estimate in the hypothesized parametric model to generate data for the Monte Carlo GOF test. In this case, the test is invalid because the resulting empirical level does not reach the nominal level. In this article, we propose a method consisting of nested Monte Carlo simulations which has the following advantages: the bias of the resulting empirical level of the test is eliminated, hence the empirical levels can always reach the nominal level, and information about inhomogeneity of the data can be provided. We theoretically justify our testing procedure using Taylor expansions and demonstrate that it is correctly sized through various simulation studies. In our first data application, we discover, in agreement with Illian et al., that Phlebocarya filifolia plants near Perth, Australia, can follow a homogeneous Poisson clustered process that provides insight into the propagation mechanism of these plants. In our second data application, we find, in contrast to Diggle, that a pairwise interaction model provides a good fit to the micro-anatomy data of amacrine cells designed for analyzing the developmental growth of immature retina cells in rabbits. This article has supplementary material online. © 2013 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
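The plug-in Monte Carlo GOF test that the article critiques is easy to state in code. Below is a generic sketch with an exponential-model example (all names and the KS-type statistic are illustrative); the article's contribution, omitted here, is a nested Monte Carlo loop that removes the bias of this plug-in level.

```python
import numpy as np

def mc_gof_p(data, fit, simulate, statistic, n_sim=199, seed=0):
    """Plug-in Monte Carlo GOF p-value: fit the model to the data, simulate
    replicates from the fitted model (re-fitting on each replicate), and
    rank the observed statistic among the simulated ones."""
    rng = np.random.default_rng(seed)
    theta = fit(data)
    t_obs = statistic(data, theta)
    t_sim = []
    for _ in range(n_sim):
        rep = simulate(theta, data.size, rng)
        t_sim.append(statistic(rep, fit(rep)))
    return (1 + sum(t >= t_obs for t in t_sim)) / (n_sim + 1)

def ks_exp(x, rate):
    """KS-type distance between the empirical CDF and a fitted Exp(rate)."""
    x = np.sort(x)
    cdf = 1.0 - np.exp(-rate * x)
    return np.max(np.abs(cdf - np.arange(1, x.size + 1) / x.size))

fit_exp = lambda x: 1.0 / np.mean(x)                     # MLE of the rate
sim_exp = lambda rate, n, rng: rng.exponential(1.0 / rate, n)

# Data concentrated near 5 are clearly non-exponential, so the test rejects
bad_fit = np.abs(np.random.default_rng(6).normal(5.0, 0.5, 200))
p = mc_gof_p(bad_fit, fit_exp, sim_exp, ks_exp)
```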

  15. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation.

    Science.gov (United States)

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (~15-20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve a superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) compared with OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion study
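The Patlak OLS step described above amounts to a straight-line fit in transformed coordinates: the slope estimates the uptake rate Ki and the intercept the distribution volume V. A sketch with synthetic (non-clinical) curves, where the function names and the input function are illustrative assumptions:

```python
import numpy as np

def patlak_ols(t, cp, ct):
    """Patlak graphical analysis by ordinary least squares: regress
    y = Ct/Cp on x = (integral of Cp from 0 to t) / Cp.
    Returns (Ki, V) as slope and intercept."""
    # trapezoidal cumulative integral of the plasma input function
    cum_cp = np.concatenate(([0.0],
                             np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))))
    x = cum_cp / cp
    y = ct / cp
    ki, v = np.polyfit(x, y, 1)
    return ki, v

# Synthetic check with known parameters
t = np.linspace(0.0, 60.0, 121)
cp = np.exp(-0.1 * t) + 0.2                                 # plasma input
cum = np.concatenate(([0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t))))
ct = 0.05 * cum + 0.3 * cp                                  # Ki=0.05, V=0.3
ki, v = patlak_ols(t, cp, ct)
```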

  16. Dynamic whole-body PET parametric imaging: II. Task-oriented statistical estimation

    International Nuclear Information System (INIS)

    Karakatsanis, Nicolas A; Lodge, Martin A; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-01-01

    In the context of oncology, dynamic PET imaging coupled with standard graphical linear analysis has been previously employed to enable quantitative estimation of tracer kinetic parameters of physiological interest at the voxel level, thus, enabling quantitative PET parametric imaging. However, dynamic PET acquisition protocols have been confined to the limited axial field-of-view (∼15–20 cm) of a single-bed position and have not been translated to the whole-body clinical imaging domain. On the contrary, standardized uptake value (SUV) PET imaging, considered as the routine approach in clinical oncology, commonly involves multi-bed acquisitions, but is performed statically, thus not allowing for dynamic tracking of the tracer distribution. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. In a companion study, we presented a novel clinically feasible dynamic (4D) multi-bed PET acquisition protocol as well as the concept of whole-body PET parametric imaging employing Patlak ordinary least squares (OLS) regression to estimate the quantitative parameters of tracer uptake rate Ki and total blood distribution volume V. In the present study, we propose an advanced hybrid linear regression framework, driven by Patlak kinetic voxel correlations, to achieve a superior trade-off between contrast-to-noise ratio (CNR) and mean squared error (MSE) compared with OLS for the final Ki parametric images, enabling task-based performance optimization. Overall, whether the observer's task is to detect a tumor or quantitatively assess treatment response, the proposed statistical estimation framework can be adapted to satisfy the specific task performance criteria, by adjusting the Patlak correlation-coefficient (WR) reference value. The multi-bed dynamic acquisition protocol, as optimized in the preceding companion

  17. Variance in parametric images: direct estimation from parametric projections

    International Nuclear Information System (INIS)

    Maguire, R.P.; Leenders, K.L.; Spyrou, N.M.

    2000-01-01

    Recent work has shown that it is possible to apply linear kinetic models to dynamic projection data in PET in order to calculate parameter projections. These can subsequently be back-projected to form parametric images - maps of parameters of physiological interest. Critical to the application of these maps, to test for significant changes between normal and pathophysiology, is an assessment of the statistical uncertainty. In this context, parametric images also include simple integral images from, e.g., [O-15]-water used to calculate statistical parametric maps (SPMs). This paper revisits the concept of parameter projections and presents a more general formulation of the parameter projection derivation as well as a method to estimate parameter variance in projection space, showing which analysis methods (models) can be used. Using simulated pharmacokinetic image data we show that a method based on an analysis in projection space inherently calculates the mathematically rigorous pixel variance. This results in an estimation which is as accurate as either estimating variance in image space during model fitting, or estimation by comparison across sets of parametric images - as might be done between individuals in a group pharmacokinetic PET study. The method based on projections has, however, a higher computational efficiency, and is also shown to be more precise, as reflected in smooth variance distribution images when compared to the other methods. (author)

  18. Change of diffusion anisotropy in patients with acute cerebral infarction using statistical parametric analysis

    International Nuclear Information System (INIS)

    Morita, Naomi; Harada, Masafumi; Uno, Masaaki; Furutani, Kaori; Nishitani, Hiromu

    2006-01-01

    We conducted statistical parametric comparison of fractional anisotropy (FA) images and quantified FA values to determine whether significant change occurs in the ischemic region. The subjects were 20 patients seen within 24 h after onset of ischemia. For statistical comparison of FA images, a sample FA image was coordinated by the Talairach template, and each FA map was normalized. Statistical comparison was conducted using statistical parametric mapping (SPM) 99. Regions of interest were set in the same region on apparent diffusion coefficient (ADC) and FA maps, the region being consistent with the hyperintense region on diffusion-weighted images (DWIs). The contralateral region was also measured to obtain asymmetry ratios of ADC and FA. Regions with areas of statistical significance on FA images were found only in the white matter of three patients, although the regions were smaller than hyperintense regions on DWIs. The mean ADC and FA ratios were 0.64±0.16 and 0.93±0.09, respectively, and the degree of FA change was less than that of the ADC change. Significant change in diffusion anisotropy was limited to the severely infarcted core of the white matter. We believe statistical comparison of FA maps to be useful for detecting different regions of diffusion anisotropy. (author)

  19. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. - Highlights: ► The paper discusses the validation of creep rupture models derived from statistical analysis. ► It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. ► The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. ► The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  20. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    Full Text Available This paper investigates the performance of two-class classification credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...
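A two-class Parzen (Gaussian kernel) classifier combined through Bayes' rule can be sketched as follows. The score distributions, class sizes and function names are invented to mimic a low-default setting; this is not the paper's extended classifier.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)
# hypothetical credit scores: many non-defaulters, few defaulters
good = rng.normal(600.0, 50.0, 500)   # non-default class
bad = rng.normal(450.0, 60.0, 40)     # low-default class: few observations

kde_good, kde_bad = gaussian_kde(good), gaussian_kde(bad)
prior_bad = bad.size / (good.size + bad.size)

def p_default(score):
    """Posterior P(default | score) via Bayes' rule with Parzen densities."""
    f_b = kde_bad(score) * prior_bad
    f_g = kde_good(score) * (1.0 - prior_bad)
    return f_b / (f_b + f_g)
```

With so few defaulters, the bandwidth of the minority-class density estimate dominates performance, which is precisely the difficulty the paper addresses.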

  1. Functional brain mapping using H₂¹⁵O positron emission tomography (I): statistical parametric mapping method

    International Nuclear Information System (INIS)

    Lee, Dong Soo; Lee, Jae Sung; Kim, Kyeong Min; Chung, June Key; Lee, Myung Chul

    1998-01-01

    We investigated the statistical methods to compose the functional brain map of human working memory and the principal factors that have an effect on the methods for localization. Repeated PET scans with four successive tasks, which consist of one control and three different activation tasks, were performed on six right-handed normal volunteers for 2 minutes after bolus injections of 925 MBq H₂¹⁵O at intervals of 30 minutes. Image data were analyzed using SPM96 (Statistical Parametric Mapping) implemented with Matlab (Mathworks Inc., U.S.A.). Images from the same subject were spatially registered and were normalized using linear and nonlinear transformation methods. Significant difference between control and each activation state was estimated at every voxel based on the general linear model. Differences of global counts were removed using analysis of covariance (ANCOVA) with global activity as covariate. Using the mean and variance for each condition as adjusted by ANCOVA, a t-statistic was computed at every voxel. To interpret the results more easily, t-values were transformed to the standard Gaussian distribution (Z-score). All the subjects carried out the activation and control tests successfully. The average rate of correct answers was 95%. The numbers of activated blobs were 4 for verbal memory I, 9 for verbal memory II, 9 for visual memory, and 6 for conjunctive activation of these three tasks. The verbal working memory activates predominantly left-sided structures, and the visual memory activates the right hemisphere. We conclude that rCBF PET imaging and the statistical parametric mapping method were useful in the localization of the brain regions for verbal and visual working memory
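The final t-to-Z conversion mentioned above is a simple tail-probability match. A sketch (not the SPM96 code itself):

```python
from scipy import stats

def t_to_z(t_values, df):
    """Convert t-statistics to standard-normal Z-scores by matching
    upper-tail probabilities, as done for SPM displays."""
    p = stats.t.sf(t_values, df)   # upper-tail probability of the t-statistic
    return stats.norm.isf(p)       # Z with the same upper-tail probability
```

For large degrees of freedom the mapping is nearly the identity, since the t distribution approaches the normal.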

  2. Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly

    International Nuclear Information System (INIS)

    Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A.

    1998-01-01

    Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors, or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory, Twinsburg, OH) using high resolution, fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age

  3. Statistical parametric mapping of Tc-99m HMPAO SPECT cerebral perfusion in the normal elderly

    Energy Technology Data Exchange (ETDEWEB)

    Turlakow, A.; Scott, A.M.; Berlangieri, S.U.; Sonkila, C.; Wardill, T.D.; Crowley, K.; Abbott, D.; Egan, G.F.; McKay, W.J.; Hughes, A. [Austin and Repatriation Medical Centre, Heidelberg, VIC (Australia). Departments of Nuclear Medicine and Centre for PET Neurology and Clinical Neuropsychology

    1998-06-01

    Full text: The clinical value of Tc-99m HMPAO SPECT cerebral blood flow studies in cognitive and neuropsychiatric disorders has been well described. Currently, interpretation of these studies relies on qualitative or semi-quantitative techniques. The aim of our study is to generate statistical measures of regional cerebral perfusion in the normal elderly using statistical parametric mapping (Friston et al, Wellcome Department of Cognitive Neurology, London, UK) in order to facilitate the objective analysis of cerebral blood flow studies in patient groups. A cohort of 20 healthy, elderly volunteers, aged 68 to 81 years, was prospectively selected on the basis of normal physical examination and neuropsychological testing. Subjects with risk factors, or a history of cognitive impairment were excluded from our study group. All volunteers underwent SPECT cerebral blood flow imaging, 30 minutes following the administration of 370 MBq Tc-99m HMPAO, on a Trionix Triad XLT triple-headed scanner (Trionix Research Laboratory, Twinsburg, OH) using high resolution, fan-beam collimators resulting in a system resolution of 10 mm full width at half-maximum (FWHM). The SPECT cerebral blood flow studies were analysed using statistical parametric mapping (SPM) software specifically developed for the routine statistical analysis of functional neuroimaging data. The SPECT images were coregistered with each individual's T1-weighted MR volume brain scan and spatially normalized to standardised Talairach space. Using SPM, these data were analyzed for differences in interhemispheric regional cerebral blood flow. Significant asymmetry of cerebral perfusion was detected in the pre-central gyrus at the 95th percentile. In conclusion, the interpretation of cerebral blood flow studies in the elderly should take into account the statistically significant asymmetry in interhemispheric pre-central cortical blood flow. In the future, clinical studies will be compared to statistical data sets in age

  4. Detection of Parametric Roll on Ships

    DEFF Research Database (Denmark)

    Galeazzi, Roberto; Blanke, Mogens; Poulsen, Niels Kjølstad

    2012-01-01

    phenomenon could make the navigator change ship’s speed and heading, and these remedial actions could make the vessel escape the bifurcation. This chapter proposes non-parametric methods to detect the onset of parametric roll resonance. Theoretical conditions for parametric resonance are re...... on experimental data from towing tank tests and data from a container ship passing an Atlantic storm....

  5. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently...... developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated...... in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be referred with caution, given the complexity of human longevity....

  6. Non-asymptotic fractional order differentiators via an algebraic parametric method

    KAUST Repository

    Liu, Dayan; Gibaru, O.; Perruquetti, Wilfrid

    2012-01-01

    Recently, Mboup, Join and Fliess [27], [28] introduced non-asymptotic integer order differentiators by using an algebraic parametric estimation method [7], [8]. In this paper, in order to obtain non-asymptotic fractional order differentiators we apply this algebraic parametric method to truncated expansions of fractional Taylor series based on the Jumarie's modified Riemann-Liouville derivative [14]. Exact and simple formulae for these differentiators are given where a sliding integration window of a noisy signal involving Jacobi polynomials is used without complex mathematical deduction. The efficiency and the stability with respect to corrupting noises of the proposed fractional order differentiators are shown in numerical simulations. © 2012 IEEE.

  7. Non-asymptotic fractional order differentiators via an algebraic parametric method

    KAUST Repository

    Liu, Dayan

    2012-08-01

    Recently, Mboup, Join and Fliess [27], [28] introduced non-asymptotic integer order differentiators by using an algebraic parametric estimation method [7], [8]. In this paper, in order to obtain non-asymptotic fractional order differentiators we apply this algebraic parametric method to truncated expansions of fractional Taylor series based on the Jumarie's modified Riemann-Liouville derivative [14]. Exact and simple formulae for these differentiators are given where a sliding integration window of a noisy signal involving Jacobi polynomials is used without complex mathematical deduction. The efficiency and the stability with respect to corrupting noises of the proposed fractional order differentiators are shown in numerical simulations. © 2012 IEEE.

  8. Practical statistics in pain research.

    Science.gov (United States)

    Kim, Tae Kyun

    2017-10-01

    Pain is subjective, while statistics related to pain research are objective. This review was written to help researchers involved in pain research make statistical decisions. The main issues are related to the levels of scales that are often used in pain research, the choice between parametric and non-parametric statistical methods, and problems which arise from repeated measurements. In the field of pain research, parametric statistics used to be applied in an erroneous way. This is closely related to the scales of data and repeated measurements. The levels of scales include nominal, ordinal, interval, and ratio scales. The level of scale affects the choice between parametric and non-parametric methods. In the field of pain research, the most frequently used pain assessment scale is the ordinal scale, which would include the visual analogue scale (VAS). There used to be another view, however, which considered the VAS to be an interval or ratio scale, so that the usage of parametric statistics would be accepted practically in some cases. Repeated measurements of the same subjects always complicate statistics. It means that measurements inevitably have correlations with each other, which would preclude the application of one-way ANOVA, in which independence between the measurements is necessary. Repeated-measures ANOVA (RM-ANOVA), however, would permit the comparison between the correlated measurements as long as the sphericity assumption is satisfied. In conclusion, parametric statistical methods should be used only when the assumptions of parametric statistics, such as normality and sphericity, are established.
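
The decision path this review describes, checking the parametric assumptions first and only then choosing the test, can be sketched for a paired before/after pain measurement. The scores below are invented for illustration, and the 0.05 normality threshold is a common convention, not the paper's prescription:

```python
# Sketch of "check assumptions, then choose parametric vs non-parametric"
# for repeated (paired) pain scores. All data here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = rng.normal(70, 10, size=30)             # e.g. VAS-like scores before treatment
followup = baseline - rng.normal(15, 8, size=30)   # same subjects, after treatment

diffs = followup - baseline
normal = stats.shapiro(diffs).pvalue > 0.05        # normality of the paired differences

if normal:
    stat, p = stats.ttest_rel(baseline, followup)  # parametric: paired t-test
else:
    stat, p = stats.wilcoxon(baseline, followup)   # non-parametric alternative

print(f"{'paired t' if normal else 'Wilcoxon'} test, p = {p:.4g}")
```

With non-normal differences the same structure falls through to the Wilcoxon signed-rank branch, mirroring the review's advice to reserve parametric methods for data that meet the assumptions.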

  9. A Robust Semi-Parametric Test for Detecting Trait-Dependent Diversification.

    Science.gov (United States)

    Rabosky, Daniel L; Huang, Huateng

    2016-03-01

    Rates of species diversification vary widely across the tree of life and there is considerable interest in identifying organismal traits that correlate with rates of speciation and extinction. However, it has been challenging to develop methodological frameworks for testing hypotheses about trait-dependent diversification that are robust to phylogenetic pseudoreplication and to directionally biased rates of character change. We describe a semi-parametric test for trait-dependent diversification that explicitly requires replicated associations between character states and diversification rates to detect effects. To use the method, diversification rates are reconstructed across a phylogenetic tree with no consideration of character states. A test statistic is then computed to measure the association between species-level traits and the corresponding diversification rate estimates at the tips of the tree. The empirical value of the test statistic is compared to a null distribution that is generated by structured permutations of evolutionary rates across the phylogeny. The test is applicable to binary discrete characters as well as continuous-valued traits and can accommodate extremely sparse sampling of character states at the tips of the tree. We apply the test to several empirical data sets and demonstrate that the method has acceptable Type I error rates. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
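
The core permutation logic the abstract outlines, measuring a trait/rate association at the tips and comparing it to a permutation null, can be reduced to a minimal sketch. Note that the published test uses *structured* permutations that respect phylogenetic topology; the plain shuffling and simulated data below are simplifying assumptions for illustration only:

```python
# Unstructured permutation-test sketch for a tip-level trait/rate association.
import numpy as np

rng = np.random.default_rng(1)
trait = rng.integers(0, 2, size=200)                    # binary character states at tips
rates = 0.1 + 0.05 * trait + rng.normal(0, 0.02, 200)   # simulated tip diversification rates

obs = np.corrcoef(trait, rates)[0, 1]                   # observed test statistic
null = np.array([np.corrcoef(trait, rng.permutation(rates))[0, 1]
                 for _ in range(999)])                  # null from shuffled rates
p = (1 + np.sum(np.abs(null) >= abs(obs))) / 1000.0     # two-sided permutation p-value
print(f"r = {obs:.3f}, permutation p = {p:.3f}")
```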

  10. Evaluating statistical tests on OLAP cubes to compare degree of disease.

    Science.gov (United States)

    Ordonez, Carlos; Chen, Zhibo

    2009-09-01

    Statistical tests represent an important technique used to formulate and validate hypotheses on a dataset. They are particularly useful in the medical domain, where hypotheses link disease with medical measurements, risk factors, and treatment. In this paper, we propose to compute parametric statistical tests treating patient records as elements in a multidimensional cube. We introduce a technique that combines dimension lattice traversal and statistical tests to discover significant differences in the degree of disease within pairs of patient groups. In order to understand a cause-effect relationship, we focus on patient group pairs differing in one dimension. We introduce several optimizations to prune the search space, to discover significant group pairs, and to summarize results. We present experiments showing important medical findings and evaluating scalability with medical datasets.
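
The paper's idea of comparing patient group pairs that differ in exactly one dimension can be illustrated on a toy cube with two dimensions; the records, effect sizes, and the Welch t-test shown here are assumptions for the sketch, not the authors' dataset or full lattice-traversal algorithm:

```python
# One cell-pair comparison from a toy patient "cube": groups defined by
# (sex, smoker), compared on a simulated disease-degree measure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
records = [(sex, smoker, rng.normal(50 + 15 * smoker + 3 * sex, 8))
           for sex in (0, 1) for smoker in (0, 1) for _ in range(30)]

# the two groups differ only in the "smoker" dimension (sex held fixed)
a = np.array([v for s, sm, v in records if s == 0 and sm == 0])
b = np.array([v for s, sm, v in records if s == 0 and sm == 1])
t, p = stats.ttest_ind(a, b, equal_var=False)   # Welch t-test on the pair
print(f"non-smokers vs smokers (sex=0): t = {t:.2f}, p = {p:.4g}")
```

The full method repeats such comparisons while traversing the dimension lattice and pruning non-promising group pairs.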

  11. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example, the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  12. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example, the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  13. Parametric estimation in the wave buoy analogy - an elaborated approach based on energy considerations

    DEFF Research Database (Denmark)

    Montazeri, Najmeh; Nielsen, Ulrik Dam

    2014-01-01

    the ship’s wave-induced responses based on different statistical inferences including parametric and non-parametric approaches. This paper considers a concept to improve the estimate obtained by the parametric method for sea state estimation. The idea is illustrated by an analysis made on full-scale...

  14. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor.

    Science.gov (United States)

    Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio

    2010-01-01

    In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network-planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air-monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics has been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most important, they suggest a possible procedure to optimize network design.
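
The Kruskal-Wallis comparison at the heart of this design can be sketched on simulated station data; the lognormal concentrations below stand in for the deliberately non-Gaussian samples the study describes, and the station count and values are assumptions:

```python
# Kruskal-Wallis test across simulated monitoring stations with
# non-Gaussian (lognormal) benzene concentrations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
stations = [rng.lognormal(mean=np.log(m), sigma=0.4, size=50)
            for m in (2.0, 2.0, 4.0)]          # third station runs higher

h, p = stats.kruskal(*stations)                # KW test across the samples
print(f"H = {h:.2f}, p = {p:.4g}")
# a small p means at least one station's distribution differs, so a single
# pooled value could not represent the whole monitored territory
```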

  15. Short-term monitoring of benzene air concentration in an urban area: a preliminary study of application of Kruskal-Wallis non-parametric test to assess pollutant impact on global environment and indoor

    Directory of Open Access Journals (Sweden)

    Maria Chiara Mura

    2010-12-01

    Full Text Available In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network-planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, as well as b) node triplets, to statistically associate data from air-monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, Kruskal-Wallis (KW) non-parametric statistics has been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of monitoring data to the entire selected territory, except for a single "forced" case (70%); most important, they suggest a possible procedure to optimize network design.

  16. An introduction to inferential statistics: A review and practical guide

    International Nuclear Information System (INIS)

    Marshall, Gill; Jonker, Leon

    2011-01-01

    Building on the first part of this series regarding descriptive statistics, this paper demonstrates why it is advantageous for radiographers to understand the role of inferential statistics in deducing conclusions from a sample and their application to a wider population. This is necessary so radiographers can understand the work of others, can undertake their own research and evidence base their practice. This article explains p values and confidence intervals. It introduces the common statistical tests that comprise inferential statistics, and explains the use of parametric and non-parametric statistics. To do this, the paper reviews relevant literature, and provides a checklist of points to consider before and after applying statistical tests to a data set. The paper provides a glossary of relevant terms and the reader is advised to refer to this when any unfamiliar terms are used in the text. Together with the information provided on descriptive statistics in an earlier article, it can be used as a starting point for applying statistics in radiography practice and research.

  17. An introduction to inferential statistics: A review and practical guide

    Energy Technology Data Exchange (ETDEWEB)

    Marshall, Gill, E-mail: gill.marshall@cumbria.ac.u [Faculty of Health, Medical Sciences and Social Care, University of Cumbria, Lancaster LA1 3JD (United Kingdom); Jonker, Leon [Faculty of Health, Medical Sciences and Social Care, University of Cumbria, Lancaster LA1 3JD (United Kingdom)

    2011-02-15

    Building on the first part of this series regarding descriptive statistics, this paper demonstrates why it is advantageous for radiographers to understand the role of inferential statistics in deducing conclusions from a sample and their application to a wider population. This is necessary so radiographers can understand the work of others, can undertake their own research and evidence base their practice. This article explains p values and confidence intervals. It introduces the common statistical tests that comprise inferential statistics, and explains the use of parametric and non-parametric statistics. To do this, the paper reviews relevant literature, and provides a checklist of points to consider before and after applying statistical tests to a data set. The paper provides a glossary of relevant terms and the reader is advised to refer to this when any unfamiliar terms are used in the text. Together with the information provided on descriptive statistics in an earlier article, it can be used as a starting point for applying statistics in radiography practice and research.

  18. Statistical significance of trends in monthly heavy precipitation over the US

    KAUST Repository

    Mahajan, Salil

    2011-05-11

    Trends in monthly heavy precipitation, defined by a return period of one year, are assessed for statistical significance in observations and Global Climate Model (GCM) simulations over the contiguous United States using Monte Carlo non-parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data-sets are employed to test for trends in monthly heavy precipitation and are found to exhibit consistent results. Both data-sets demonstrate upward trends, one of which is found to be statistically significant at the 95% confidence level. Upward trends similar to observations are observed in some climate model simulations of the twentieth century, but their statistical significance is marginal. For projections of the twenty-first century, a statistically significant upward trend is observed in most of the climate models analyzed. The change in the simulated precipitation variance appears to be more important in the twenty-first century projections than changes in the mean precipitation. Stochastic fluctuations of the climate system are found to dominate monthly heavy precipitation, as some GCM simulations show a downward trend even in the twenty-first century projections when the greenhouse gas forcings are strong. © 2011 Springer-Verlag.
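
A Monte Carlo significance test for a trend, in the spirit of the non-parametric approach above, can be sketched with Kendall's τ as the statistic and shuffled years as the null; the synthetic precipitation index and the 999-resample count are assumptions for illustration:

```python
# Monte Carlo test of a monotonic trend: compare observed Kendall's tau
# against a null distribution from shuffled (trend-free) series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
years = np.arange(50)
series = 10 + 0.08 * years + rng.normal(0, 1.5, size=50)  # synthetic index, upward trend

tau_obs, _ = stats.kendalltau(years, series)              # observed statistic
null = np.array([stats.kendalltau(years, rng.permutation(series))[0]
                 for _ in range(999)])                    # shuffling destroys the trend
p = (1 + np.sum(np.abs(null) >= abs(tau_obs))) / 1000.0
print(f"tau = {tau_obs:.3f}, Monte Carlo p = {p:.3f}")
```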

  19. Statistical testing and power analysis for brain-wide association study.

    Science.gov (United States)

    Gong, Weikang; Wan, Lin; Lu, Wenlian; Ma, Liang; Cheng, Fan; Cheng, Wei; Grünewald, Stefan; Feng, Jianfeng

    2018-04-05

    The identification of connexel-wise associations, which involves examining functional connectivities between pairwise voxels across the whole brain, is both statistically and computationally challenging. Although such a connexel-wise methodology has recently been adopted by brain-wide association studies (BWAS) to identify connectivity changes in several mental disorders, such as schizophrenia, autism and depression, multiple-comparison correction and power analysis methods designed specifically for connexel-wise analysis are still lacking. Therefore, we herein report the development of a rigorous statistical framework for connexel-wise significance testing based on the Gaussian random field theory. It includes controlling the family-wise error rate (FWER) of multiple hypothesis tests using topological inference methods, and calculating power and sample size for a connexel-wise study. Our theoretical framework can control the false-positive rate accurately, as validated empirically using two resting-state fMRI datasets. Compared with Bonferroni correction and false discovery rate (FDR), it can reduce the false-positive rate and increase statistical power by appropriately utilizing the spatial information of fMRI data. Importantly, our method bypasses the need for non-parametric permutation to correct for multiple comparisons, and can thus efficiently tackle large datasets with high-resolution fMRI images. The utility of our method is shown in a case-control study. Our approach can identify altered functional connectivities in a major depression disorder dataset, whereas existing methods fail. A software package is available at https://github.com/weikanggong/BWAS. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Non-Parametric Analysis of Rating Transition and Default Data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move b...

  1. Non-parametric analysis of production efficiency of poultry egg ...

    African Journals Online (AJOL)

    Non-parametric analysis of production efficiency of poultry egg farmers in Delta ... analysis of factors affecting the output of poultry farmers showed that stock ... should be put in place for farmers to learn the best farm practices carried out on the ...

  2. Non-parametric model selection for subject-specific topological organization of resting-state functional connectivity.

    Science.gov (United States)

    Ferrarini, Luca; Veer, Ilya M; van Lew, Baldur; Oei, Nicole Y L; van Buchem, Mark A; Reiber, Johan H C; Rombouts, Serge A R B; Milles, J

    2011-06-01

    In recent years, graph theory has been successfully applied to study functional and anatomical connectivity networks in the human brain. Most of these networks have shown small-world topological characteristics: high efficiency in long distance communication between nodes, combined with highly interconnected local clusters of nodes. Moreover, functional studies performed at high resolutions have presented convincing evidence that resting-state functional connectivity networks exhibit (exponentially truncated) scale-free behavior. Such evidence, however, was mostly presented qualitatively, in terms of linear regressions of the degree distributions on log-log plots. Even when quantitative measures were given, these were usually limited to the r² correlation coefficient. However, the r² statistic is not an optimal estimator of explained variance when dealing with (truncated) power-law models. Recent developments in statistics have introduced new non-parametric approaches, based on the Kolmogorov-Smirnov test, for the problem of model selection. In this work, we have built on this idea to statistically tackle the issue of model selection for the degree distribution of functional connectivity at rest. The analysis, performed at voxel level and in a subject-specific fashion, confirmed the superiority of a truncated power-law model, showing high consistency across subjects. Moreover, the most highly connected voxels were found to be consistently part of the default mode network. Our results provide statistically sound support to the evidence previously presented in literature for a truncated power-law model of resting-state functional connectivity. Copyright © 2010 Elsevier Inc. All rights reserved.
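
The KS-based model-selection idea can be sketched by fitting two candidate models to a heavy-tailed sample and comparing their Kolmogorov-Smirnov distances; the Pareto sample stands in for a degree distribution, and the paper's actual procedure (per-subject, truncated power law) is richer than this two-model comparison:

```python
# Compare a power-law (classical Pareto, x_min = 1) fit against an
# exponential fit via the KS distance to the empirical distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
degrees = rng.pareto(2.5, size=2000) + 1.0              # heavy-tailed "degree" sample

b_mle = len(degrees) / np.sum(np.log(degrees))          # Pareto shape MLE (x_min = 1)
ks_pareto = stats.kstest(degrees, stats.pareto(b_mle).cdf).statistic
ks_expon = stats.kstest(degrees, stats.expon(scale=degrees.mean()).cdf).statistic

# the model with the smaller KS distance fits the tail better
print(f"KS(power law) = {ks_pareto:.3f}, KS(exponential) = {ks_expon:.3f}")
```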

  3. Assessing T cell clonal size distribution: a non-parametric approach.

    Science.gov (United States)

    Bolkhovskaya, Olesya V; Zorin, Daniil Yu; Ivanchenko, Mikhail V

    2014-01-01

    Clonal structure of the human peripheral T-cell repertoire is shaped by a number of homeostatic mechanisms, including antigen presentation, cytokine and cell regulation. Its accurate tuning leads to a remarkable ability to combat pathogens in all their variety, while systemic failures may lead to severe consequences like autoimmune diseases. Here we develop and make use of a non-parametric statistical approach to assess T cell clonal size distributions from recent next generation sequencing data. For 41 healthy individuals and a patient with ankylosing spondylitis who had undergone treatment, we invariably find power-law scaling over several decades and for the first time calculate quantitatively meaningful values of the decay exponent. It has proved to be much the same among healthy donors, significantly different for an autoimmune patient before the therapy, and converging towards a typical value afterwards. We discuss implications of the findings for theoretical understanding and mathematical modeling of adaptive immunity.

  4. Predictive capacity of a non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: Comparison of a cutoff approach and inferential statistics.

    Science.gov (United States)

    Kim, Da-Eun; Yang, Hyeri; Jang, Won-Hee; Jung, Kyoung-Mi; Park, Miyoung; Choi, Jin Kyu; Jung, Mi-Sook; Jeon, Eun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Park, Jung Eun; Sohn, Soo Jung; Kim, Tae Sung; Ahn, Il Young; Jeong, Tae-Cheon; Lim, Kyung-Min; Bae, SeungJin

    2016-01-01

    In order for a novel test method to be applied for regulatory purposes, its reliability and relevance, i.e., reproducibility and predictive capacity, must be demonstrated. Here, we examine the predictive capacity of a novel non-radioisotopic local lymph node assay, LLNA:BrdU-FCM (5-bromo-2'-deoxyuridine-flow cytometry), with a cutoff approach and inferential statistics as a prediction model. 22 reference substances in OECD TG429 were tested with a concurrent positive control, hexylcinnamaldehyde 25% (PC), and the stimulation index (SI) representing the fold increase in lymph node cells over the vehicle control was obtained. The optimal cutoff SI (2.7 ≤ cutoff < 3.5), with respect to predictive capacity, was obtained by a receiver operating characteristic curve, which produced 90.9% accuracy for the 22 substances. To address the inter-test variability in responsiveness, SI values standardized with the PC were employed to obtain the optimal percentage cutoff (42.6 ≤ cutoff < 57.3% of PC), which produced 86.4% accuracy. A test substance may be diagnosed as a sensitizer if a statistically significant increase in SI is elicited. The parametric one-sided t-test and non-parametric Wilcoxon rank-sum test produced 77.3% accuracy. Similarly, a test substance could be defined as a sensitizer if the SI means of the vehicle control and of the low, middle, and high concentrations were statistically significantly different, which was tested using ANOVA or Kruskal-Wallis with post hoc analysis, Dunnett or DSCF (Dwass-Steel-Critchlow-Fligner), respectively, depending on the equal variance test, producing 81.8% accuracy. The absolute SI-based cutoff approach produced the best predictive capacity; however, the discordant decisions between prediction models need to be examined further. Copyright © 2015 Elsevier Inc. All rights reserved.
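
The cutoff-optimization step, scanning candidate SI thresholds and keeping the one with the best classification accuracy, can be sketched on simulated data; the SI values, group sizes, and accuracy criterion below are illustrative assumptions, not the assay's measurements (the paper optimizes over a full ROC curve):

```python
# Scan candidate SI cutoffs and keep the one maximizing accuracy of the
# rule "SI >= cutoff => sensitizer", on simulated stimulation indices.
import numpy as np

rng = np.random.default_rng(5)
si_neg = rng.lognormal(mean=0.0, sigma=0.3, size=40)   # simulated non-sensitizers, SI near 1
si_pos = rng.lognormal(mean=1.5, sigma=0.4, size=40)   # simulated sensitizers, higher SI
si = np.concatenate([si_neg, si_pos])
label = np.concatenate([np.zeros(40), np.ones(40)])    # 1 = true sensitizer

best_cut, best_acc = None, -1.0
for cut in np.sort(si):                                # every observed SI is a candidate
    acc = np.mean((si >= cut) == label)                # accuracy of this cutoff
    if acc > best_acc:
        best_cut, best_acc = cut, acc
print(f"optimal cutoff SI = {best_cut:.2f}, accuracy = {best_acc:.1%}")
```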

  5. Further Empirical Results on Parametric Versus Non-Parametric IRT Modeling of Likert-Type Personality Data

    Science.gov (United States)

    Maydeu-Olivares, Albert

    2005-01-01

    Chernyshenko, Stark, Chan, Drasgow, and Williams (2001) investigated the fit of Samejima's logistic graded model and Levine's non-parametric MFS model to the scales of two personality questionnaires and found that the graded model did not fit well. We attribute the poor fit of the graded model to small amounts of multidimensionality present in…

  6. Use of statistical parametric mapping of 18F-FDG-PET in frontal lobe epilepsy

    International Nuclear Information System (INIS)

    Plotkin, M.; Amthauer, H.; Luedemann, L.; Hartkop, E.; Ruf, J.; Gutberlet, M.; Bertram, H.; Felix, R.; Venz, St.; Merschhemke, M.; Meencke, H.-J.

    2003-01-01

    Aim: Evaluation of the use of statistical parametric mapping (SPM) of FDG-PET for seizure lateralization in frontal lobe epilepsy. Patients: 38 patients with suspected frontal lobe epilepsy supported by clinical findings and video-EEG monitoring. Method: Statistical parametric maps were generated by subtraction of individual scans from a control group, formed by 16 patients with negative neurological/psychiatric history and no abnormalities in the MR scan. The scans were also analyzed visually as well as semiquantitatively by manually drawn ROIs. Results: SPM showed better accordance with the results of surface EEG monitoring compared with visual scan analysis and ROI quantification. In comparison with intracranial EEG recordings, the best performance was achieved by combining the ROI-based quantification with SPM analysis. Conclusion: These findings suggest that SPM analysis of FDG-PET data could be useful as a complementary tool in the evaluation of seizure focus lateralization in patients with suspected frontal lobe epilepsy. (orig.)

  7. Detuned mechanical parametric amplification as a quantum non-demolition measurement

    International Nuclear Information System (INIS)

    Szorkovszky, A; Bowen, W P; Clerk, A A; Doherty, A C

    2014-01-01

    Recently it has been demonstrated that the combination of continuous position detection with detuned parametric driving can lead to significant steady-state mechanical squeezing, far beyond the 3 dB limit normally associated with parametric driving. In this work, we show the close connection between this detuned scheme and quantum non-demolition (QND) measurement of a single mechanical quadrature. In particular, we show that applying an experimentally realistic detuned parametric drive to a cavity optomechanical system allows one to effectively realize a QND measurement despite being in the bad-cavity limit. In the limit of strong squeezing, we show that this scheme offers significant advantages over standard backaction evasion, not only by allowing operation in the weak measurement and low efficiency regimes, but also in terms of the purity of the mechanical state

  8. Statistical analysis of non-homogeneous Poisson processes. Statistical processing of a particle multidetector

    International Nuclear Information System (INIS)

    Lacombe, J.P.

    1985-12-01

    Statistical study of non-homogeneous and spatial Poisson processes forms the first part of this thesis. A Neyman-Pearson type test is defined for the intensity measurement of these processes. Conditions are given under which consistency of the test is assured, and others ensuring the asymptotic normality of the test statistic. Then some techniques for the statistical processing of Poisson fields, and their application to a particle multidetector study, are given. Quality tests of the device are proposed together with signal extraction methods [fr

  9. A question of separation: disentangling tracer bias and gravitational non-linearity with counts-in-cells statistics

    Science.gov (United States)

    Uhlemann, C.; Feix, M.; Codis, S.; Pichon, C.; Bernardeau, F.; L'Huillier, B.; Kim, J.; Hong, S. E.; Laigle, C.; Park, C.; Shin, J.; Pogosyan, D.

    2018-02-01

    Starting from a very accurate model for density-in-cells statistics of dark matter based on large deviation theory, a bias model for the tracer density in spheres is formulated. It adopts a mean bias relation based on a quadratic bias model to relate the log-densities of dark matter to those of mass-weighted dark haloes in real and redshift space. The validity of the parametrized bias model is established using a parametrization-independent extraction of the bias function. This average bias model is then combined with the dark matter PDF, neglecting any scatter around it: it nevertheless yields an excellent model for densities-in-cells statistics of mass tracers that is parametrized in terms of the underlying dark matter variance and three bias parameters. The procedure is validated on measurements of both the one- and two-point statistics of subhalo densities in the state-of-the-art Horizon Run 4 simulation showing excellent agreement for measured dark matter variance and bias parameters. Finally, it is demonstrated that this formalism allows for a joint estimation of the non-linear dark matter variance and the bias parameters using solely the statistics of subhaloes. Having verified that galaxy counts in hydrodynamical simulations sampled on a scale of 10 Mpc h-1 closely resemble those of subhaloes, this work provides important steps towards making theoretical predictions for density-in-cells statistics applicable to upcoming galaxy surveys like Euclid or WFIRST.

  10. Angular spectrum characters of high gain non-critical phase match optical parametric oscillators

    International Nuclear Information System (INIS)

    Liu Jian-Hui; Liu Qiang; Gong Ma-Li

    2011-01-01

    The angular spectrum gain characters and the power magnification characters of high gain non-walk-off collinear optical parametric oscillators have been studied using the non-collinear phase match method for the first time. The experimental results of the KTiOAsO4 and the KTiOPO4 crystals are discussed in detail. At the high energy single resonant condition, a low reflective ratio of the output mirror for the signal and a long non-linear crystal are beneficial for small divergence angles. This method can also be used for other high gain non-walk-off phase match optical parametric processes. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  11. Application of statistical parametric mapping in PET and SPECT brain functional imaging

    International Nuclear Information System (INIS)

    Guo Wanhua

    2002-01-01

    Region of interest (ROI) analysis is the method regularly used to analyze brain functional imaging, but its obvious shortcomings, such as subjectivity and poor reproducibility, seriously limit precise analysis of brain function. Statistical parametric mapping (SPM) was therefore developed as automatic analysis software that works on a voxel or pixel basis to resolve this problem. Using a range of mathematical models, it can statistically assess every voxel of the whole brain. The present review introduces its main principles, modular composition and practical applications. It can be concluded that, with the development of neuroscience, the SPM software will be used more widely in related fields such as neurobiology, cognition and neuropharmacology

  12. Analysis of brain SPECT with the statistical parametric mapping package SPM99

    International Nuclear Information System (INIS)

    Barnden, L.R.; Rowe, C.C.

    2000-01-01

    Full text: The Statistical Parametric Mapping (SPM) package of the Wellcome Department of Cognitive Neurology permits detection of regional differences in uptake in the brain of an individual subject or a population of subjects compared to a normal population. SPM does not require a priori specification of regions of interest. Recently SPM has been upgraded from SPM96 to SPM99. Our aim was to vary brain SPECT processing options in the application of SPM to optimise the final statistical map in three clinical trials. The sensitivity of SPM depends on the fidelity of the preliminary spatial normalisation of each scan to the standard anatomical space defined by a template scan provided with SPM. We generated our own SPECT template and compared spatial normalisation to it and to SPM's internal PET template. We also investigated the effects of scatter subtraction, stripping of scalp activity, reconstruction algorithm, non-linear deformation and derivation of spatial normalisation parameters using co-registered MR. Use of our SPECT template yielded better results than with SPM's PET template. Accuracy of SPECT to MR co-registration was 2.5 mm with SPM96 and 1.2 mm with SPM99. Stripping of scalp activity improved results with SPM96 but was unnecessary with SPM99. Scatter subtraction increased the sensitivity of SPM. Non-linear deformation additional to linear (affine) transformation only marginally improved the final result. Use of the SPECT template yielded more significant results than those obtained when co-registered MR was used to derive the transformation parameters. SPM99 is more robust than SPM96, and optimum SPECT analysis requires a SPECT template. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  13. Assessing T cell clonal size distribution: a non-parametric approach.

    Directory of Open Access Journals (Sweden)

    Olesya V Bolkhovskaya

    Full Text Available Clonal structure of the human peripheral T-cell repertoire is shaped by a number of homeostatic mechanisms, including antigen presentation, cytokine and cell regulation. Its accurate tuning leads to a remarkable ability to combat pathogens in all their variety, while systemic failures may lead to severe consequences like autoimmune diseases. Here we develop and make use of a non-parametric statistical approach to assess T cell clonal size distributions from recent next generation sequencing data. For 41 healthy individuals and a patient with ankylosing spondylitis, who underwent treatment, we invariably find power law scaling over several decades and for the first time calculate quantitatively meaningful values of the decay exponent. It has proved to be much the same among healthy donors, significantly different for an autoimmune patient before the therapy, and converging towards a typical value afterwards. We discuss implications of the findings for theoretical understanding and mathematical modeling of adaptive immunity.
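
    As an aside, a decay exponent of the kind described above is commonly estimated with the continuous power-law maximum-likelihood (Hill-type) estimator. The sketch below runs on synthetic clone sizes, not the paper's sequencing data:

```python
import numpy as np

def plaw_mle_alpha(x, xmin):
    """Continuous power-law MLE: alpha_hat = 1 + n / sum(ln(x_i / xmin)),
    using only observations with x_i >= xmin."""
    x = np.asarray(x, dtype=float)
    x = x[x >= xmin]
    return 1.0 + x.size / np.log(x / xmin).sum()

# Synthetic sample drawn from P(X > x) = (x / xmin)^-(alpha - 1)
rng = np.random.default_rng(0)
alpha_true, xmin = 2.5, 1.0
u = rng.random(50_000)
x = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))  # inverse-CDF sampling

print(round(plaw_mle_alpha(x, xmin), 2))  # close to alpha_true
```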

  14. Dependence between fusion temperatures and chemical components of a certain type of coal using classical, non-parametric and bootstrap techniques

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Manteiga, W.; Prada-Sanchez, J.M.; Fiestras-Janeiro, M.G.; Garcia-Jurado, I. (Universidad de Santiago de Compostela, Santiago de Compostela (Spain). Dept. de Estadistica e Investigacion Operativa)

    1990-11-01

    A statistical study of the dependence between various critical fusion temperatures of a certain kind of coal and its chemical components is carried out. As well as using classical dependence techniques (multiple, stepwise and PLS regression, principal components, canonical correlation, etc.) together with the corresponding inference on the parameters of interest, non-parametric regression and bootstrap inference are also performed. 11 refs., 3 figs., 8 tabs.
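
    Bootstrap inference on a regression parameter, as mentioned in this record, can be sketched with case resampling; the data below are synthetic stand-ins for the coal measurements, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: fusion temperature vs. one chemical component
x = rng.uniform(0, 10, 80)
y = 1200.0 + 15.0 * x + rng.normal(0, 20, 80)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

# Case-resampling bootstrap of the regression slope
boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, x.size, x.size)
    boot[b] = slope(x[idx], y[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"slope = {slope(x, y):.1f}, 95% bootstrap CI = ({lo:.1f}, {hi:.1f})")
```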

  15. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature in the generalized correlation analysis could be a useful and efficient tool for analyzing microarray...... time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health....

  16. Optomechanical entanglement via non-degenerate parametric interactions

    Science.gov (United States)

    Ahmed, Rizwan; Qamar, Shahid

    2017-10-01

    We present a scheme for the optomechanical entanglement between a micro-mechanical mirror and the field inside a bimodal cavity system using a non-degenerate optical parametric amplifier (NOPA). Our results show that the introduction of the NOPA makes the entanglement stronger and more robust against the mean thermal phonon number and cavity decay. Interestingly, the macroscopic entanglement depends upon the choice of the phase associated with the classical field driving the NOPA. We also consider the effects of input laser power on optomechanical entanglement.

  17. A statistical approach to bioclimatic trend detection in the airborne pollen records of Catalonia (NE Spain)

    Science.gov (United States)

    Fernández-Llamazares, Álvaro; Belmonte, Jordina; Delgado, Rosario; De Linares, Concepción

    2014-04-01

    Airborne pollen records are a suitable indicator for the study of climate change. The present work focuses on the role of annual pollen indices for the detection of bioclimatic trends through the analysis of the aerobiological spectra of 11 taxa of great biogeographical relevance in Catalonia over an 18-year period (1994-2011), by means of different parametric and non-parametric statistical methods. Among others, two non-parametric rank-based statistical tests were performed for detecting monotonic trends in time series data of the selected airborne pollen types, and we have observed that they have similar power in detecting trends. Except for those cases in which the pollen data can be well-modeled by a normal distribution, it is better to apply non-parametric statistical methods to aerobiological studies. Our results provide a reliable representation of the pollen trends in the region and suggest that greater pollen quantities have been released into the atmosphere in recent years, especially by Mediterranean taxa such as Pinus, Total Quercus and Evergreen Quercus, although the trends may differ geographically. Longer aerobiological monitoring periods are required to corroborate these results and survey the increasing levels of certain pollen types that could have an impact on public health.
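
    A standard rank-based monotonic-trend test of the kind alluded to above is the Mann-Kendall test; a minimal implementation (assuming no tied values, which the record does not specify) might look like:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall test for a monotonic trend (no ties assumed).
    Returns the S statistic and a two-sided normal-approximation p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    diff = np.sign(x[None, :] - x[:, None])
    s = np.triu(diff, k=1).sum()                  # sum of pairwise signs
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # no-ties variance
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * norm.sf(abs(z))

# Illustrative short series with an upward drift (hypothetical, not pollen data)
data = np.array([3.0, 1.0, 4.0, 2.0, 7.0, 6.0, 8.0, 9.0, 5.0, 10.0])
print(mann_kendall(data))
```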

  18. Machine learning and statistical techniques : an application to the prediction of insolvency in Spanish non-life insurance companies

    OpenAIRE

    Díaz, Zuleyka; Segovia, María Jesús; Fernández, José

    2005-01-01

    Prediction of insurance companies' insolvency has arisen as an important problem in the field of financial research. Most methods applied in the past to tackle this issue are traditional statistical techniques which use financial ratios as explanatory variables. However, these variables often do not satisfy statistical assumptions, which complicates the application of the mentioned methods. In this paper, a comparative study of the performance of two non-parametric machine learning techniques ...

  19. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei

    2014-11-07

    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh functions. Then the deviations of the autocovariances based on these systematic samples from the corresponding autocovariances of the whole time series are calculated and the uniform asymptotic joint normality of these deviations over different systematic samples is obtained. With a double-order selection scheme, our test statistic is constructed by combining the deviations at different lags in the systematic samples. The null asymptotic distribution of the statistic proposed is derived and the consistency of the test is shown under fixed and local alternatives. Simulation studies demonstrate well-behaved finite sample properties of the method proposed. Comparisons with some existing tests in terms of power are given both analytically and empirically. In addition, the method proposed is applied to check the stationarity assumption of a chemical process viscosity readings data set.

  20. Influence of image reconstruction methods on statistical parametric mapping of brain PET images

    International Nuclear Information System (INIS)

    Yin Dayi; Chen Yingmao; Yao Shulin; Shao Mingzhe; Yin Ling; Tian Jiahe; Cui Hongyan

    2007-01-01

    Objective: Statistical parametric mapping (SPM) is widely recognized as a useful tool in brain function studies. The aim of this study was to investigate whether the image reconstruction algorithm used for PET could influence SPM of the brain. Methods: PET imaging of the whole brain was performed in six normal volunteers. Each volunteer had two scans, with true and sham acupuncture. The PET scans were reconstructed using ordered subsets expectation maximization (OSEM) and filtered back projection (FBP), each with 3 varied parameters. The images were realigned, normalized and smoothed using the SPM program. The difference between true and sham acupuncture scans was tested using a matched pair t test at every voxel. Results: With SPM uncorrected multiple comparisons (P uncorrected <0.001), the SPMs derived from images with different reconstruction methods were different. The largest difference, in the number and position of the activated voxels, was noticed between the FBP and OSEM reconstruction algorithms. Conclusions: The method of PET image reconstruction could influence the results of SPM uncorrected multiple comparisons. Attention should be paid when conclusions are drawn using SPM uncorrected multiple comparisons. (authors)

  1. Classification rates: non‐parametric versus parametric models using ...

    African Journals Online (AJOL)

    This research sought to establish whether non-parametric modeling achieves a higher correct classification ratio than a parametric model. The local likelihood technique was used to fit models to the data sets. The same sets of data were modeled using the parametric logit, and the abilities of the two models to correctly predict the binary ...

  2. Optomechanical entanglement via non-degenerate parametric interactions

    International Nuclear Information System (INIS)

    Ahmed, Rizwan; Qamar, Shahid

    2017-01-01

    We present a scheme for the optomechanical entanglement between a micro-mechanical mirror and the field inside a bimodal cavity system using a non-degenerate optical parametric amplifier (NOPA). Our results show that the introduction of the NOPA makes the entanglement stronger and more robust against the mean thermal phonon number and cavity decay. Interestingly, the macroscopic entanglement depends upon the choice of the phase associated with the classical field driving the NOPA. We also consider the effects of input laser power on optomechanical entanglement. (paper)

  3. Cosmological Non-Gaussian Signature Detection: Comparing Performance of Different Statistical Tests

    Directory of Open Access Journals (Sweden)

    O. Forni

    2005-09-01

    Full Text Available Currently, it appears that the best method for non-Gaussianity detection in the cosmic microwave background (CMB) consists in calculating the kurtosis of the wavelet coefficients. We know that wavelet-kurtosis outperforms other methods such as the bispectrum, the genus, ridgelet-kurtosis, and curvelet-kurtosis on an empirical basis, but relatively few studies have compared other transform-based statistics, such as extreme values, or more recent tools such as higher criticism (HC), or proposed “best possible” choices for such statistics. In this paper, we consider two models for transform-domain coefficients: (a) a power-law model, which seems suited to the wavelet coefficients of simulated cosmic strings, and (b) a sparse mixture model, which seems suitable for the curvelet coefficients of filamentary structure. For model (a), if power-law behavior holds with finite 8th moment, excess kurtosis is an asymptotically optimal detector, but if the 8th moment is not finite, a test based on extreme values is asymptotically optimal. For model (b), if the transform coefficients are very sparse, a recent test, higher criticism, is an optimal detector, but if they are dense, kurtosis is an optimal detector. Empirical wavelet coefficients of simulated cosmic strings have power-law character with infinite 8th moment, while curvelet coefficients of the simulated cosmic strings are not very sparse. In all cases, excess kurtosis seems to be an effective test in moderate-resolution imagery.
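
    In its simplest form, the kurtosis-based detector discussed here reduces to comparing the excess kurtosis of transform coefficients against the Gaussian value of zero; an illustrative sketch on synthetic coefficients (not actual wavelet transforms of CMB maps):

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)

gaussian = rng.normal(size=100_000)          # Gaussian-field-like coefficients
heavy = rng.standard_t(df=5, size=100_000)   # heavy-tailed (non-Gaussian) case

# scipy's `kurtosis` returns *excess* kurtosis (0 for a Gaussian) by default
print(kurtosis(gaussian))  # ≈ 0
print(kurtosis(heavy))     # clearly positive
```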

  4. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture, despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the
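
    An input-oriented, constant-returns (CCR-type) DEA efficiency score of the kind underlying such measures can be computed with a small linear program; the two hypothetical production units below are purely illustrative, not the paper's farm data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k under constant returns.
    X: (n_units, n_inputs), Y: (n_units, n_outputs). Solves
    min theta s.t. sum_j lam_j x_j <= theta x_k, sum_j lam_j y_j >= y_k."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                     # minimise theta
    # input rows:  sum_j lam_j X[j,i] - theta * X[k,i] <= 0
    A_in = np.hstack([-X[[k]].T, X.T])
    # output rows: -sum_j lam_j Y[j,r] <= -Y[k,r]
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

X = np.array([[2.0], [4.0]])   # inputs (e.g. pesticide use)
Y = np.array([[2.0], [2.0]])   # outputs (e.g. crop revenue)
print(dea_efficiency(X, Y, 0), dea_efficiency(X, Y, 1))
```

Unit 0 produces the same output with half the input, so it is efficient (score 1) while unit 1 scores 0.5.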

  5. Theoretical remarks on the statistics of three discriminants in Piety's automated signature analysis of PSD [Power Spectral Density] data

    International Nuclear Information System (INIS)

    Behringer, K.; Spiekerman, G.

    1984-01-01

    Piety (1977) proposed an automated signature analysis of power spectral density data. Eight statistical decision discriminants are introduced. For nearly all the discriminants, improved confidence statements can be made. The statistical characteristics of the last three discriminants, which are applications of non-parametric tests, are considered. (author)

  6. SPM analysis of parametric (R)-[11C]PK11195 binding images: plasma input versus reference tissue parametric methods.

    Science.gov (United States)

    Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald

    2007-05-01

    (R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas, and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).
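
    The linearised plasma-input (Logan) analysis mentioned above estimates the volume of distribution as the late-time slope of a transformed plot; a sketch on a simulated one-tissue-compartment tracer with hypothetical kinetic constants (not (R)-[11C]PK11195 values):

```python
import numpy as np

# Simulate a 1-tissue-compartment model with known K1, k2
dt = 0.05                               # minutes
t = np.arange(0, 90, dt)
cp = t * np.exp(-t / 10.0)              # hypothetical plasma input function
K1, k2 = 0.1, 0.05                      # 1/min; true Vd = K1/k2 = 2.0
ct = K1 * dt * np.convolve(cp, np.exp(-k2 * t))[:t.size]

# Logan plot: int(ct)/ct vs. int(cp)/ct becomes linear with slope Vd
int_ct = np.cumsum(ct) * dt
int_cp = np.cumsum(cp) * dt
mask = t > 30                           # late, linear portion only
slope = np.polyfit(int_cp[mask] / ct[mask], int_ct[mask] / ct[mask], 1)[0]
print(round(slope, 2))
```

For a one-tissue compartment the Logan relation is exactly linear, so the fitted slope recovers Vd up to discretisation error.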

  7. Functional brain mapping using H{sub 2}{sup 15}O positron emission tomography (I): statistical parametric mapping method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Soo; Lee, Jae Sung; Kim, Kyeong Min; Chung, June Key; Lee, Myung Chul [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of)

    1998-08-01

    We investigated the statistical methods used to compose functional brain maps of human working memory, and the principal factors that affect localization. Repeated PET scans with four successive tasks, consisting of one control and three different activation tasks, were performed on six right-handed normal volunteers for 2 minutes after bolus injections of 925 MBq H{sub 2}{sup 15}O at intervals of 30 minutes. Image data were analyzed using SPM96 (Statistical Parametric Mapping) implemented with Matlab (Mathworks Inc., U.S.A.). Images from the same subject were spatially registered and normalized using linear and nonlinear transformation methods. Significant differences between the control and each activation state were estimated at every voxel based on the general linear model. Differences in global counts were removed using analysis of covariance (ANCOVA) with global activity as the covariate. Using the mean and variance for each condition, adjusted by ANCOVA, t-statistics were computed at every voxel. To interpret the results more easily, t-values were transformed to the standard Gaussian distribution (Z-score). All the subjects carried out the activation and control tasks successfully. The average rate of correct answers was 95%. The numbers of activated blobs were 4 for verbal memory I, 9 for verbal memory II, 9 for visual memory, and 6 for the conjunctive activation of these three tasks. The verbal working memory tasks activate predominantly left-sided structures, while the visual memory task activates the right hemisphere. We conclude that rCBF PET imaging and the statistical parametric mapping method are useful for localizing the brain regions involved in verbal and visual working memory.
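
    The final t-to-Z transformation described above is a probability-integral transform; a minimal sketch (the degrees of freedom are chosen arbitrarily for illustration):

```python
import numpy as np
from scipy.stats import norm, t

def t_to_z(t_vals, df):
    """Map t statistics to standard Gaussian Z-scores with the same
    upper-tail probability, as done for SPM display."""
    return norm.isf(t.sf(t_vals, df))

print(np.round(t_to_z(np.array([0.0, 2.0, 4.0]), df=5), 3))
```

Because the t distribution has heavier tails, the resulting Z-scores are smaller in magnitude than the original t-values.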

  8. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency

  9. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Full Text Available Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF. Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon-counting. The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise canceling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a quality similar to parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated in the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.
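
    The forward model stated in this abstract (measured = ideal ∗ PSF + noise) can be illustrated, in the simpler non-blind case where the PSF is known, by a Tikhonov/Wiener-style Fourier inversion; this is only a 1-D sketch, not the authors' wavelet-regularised blind scheme:

```python
import numpy as np

rng = np.random.default_rng(7)

# Forward model: measured = ideal * PSF + noise (1-D for brevity)
n = 256
i = np.arange(n)
ideal = np.exp(-0.5 * ((i - 128) / 8.0) ** 2)      # smooth synthetic source
x = i - n // 2
psf = np.exp(-0.5 * (x / 3.0) ** 2)
psf /= psf.sum()
psf = np.roll(psf, n // 2)                         # centre PSF at index 0
measured = np.real(np.fft.ifft(np.fft.fft(ideal) * np.fft.fft(psf)))
measured += rng.normal(0.0, 1e-3, n)

# Non-blind, Tikhonov-regularised inversion with the known PSF
H = np.fft.fft(psf)
eps = 1e-3                                         # regularisation weight
restored = np.real(np.fft.ifft(
    np.fft.fft(measured) * np.conj(H) / (np.abs(H) ** 2 + eps)))

print(float(np.abs(restored - ideal).max()))       # small residual error
```

The regularisation term eps prevents noise amplification at frequencies where the PSF response is near zero, which is what makes the unregularised inverse ill-posed.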

  10. A comparative study of non-parametric models for identification of ...

    African Journals Online (AJOL)

    However, the frequency response method using random binary signals was good for unpredicted white noise characteristics and considered the best method for non-parametric system identification. The autoregressive external input (ARX) model was very useful for system identification, but on application, few input ...

  11. Impulse response identification with deterministic inputs using non-parametric methods

    International Nuclear Information System (INIS)

    Bhargava, U.K.; Kashyap, R.L.; Goodman, D.M.

    1985-01-01

    This paper addresses the problem of impulse response identification using non-parametric methods. Although the techniques developed herein apply to the truncated, untruncated, and the circulant models, we focus on the truncated model which is useful in certain applications. Two methods of impulse response identification will be presented. The first is based on the minimization of the C/sub L/ Statistic, which is an estimate of the mean-square prediction error; the second is a Bayesian approach. For both of these methods, we consider the effects of using both the identity matrix and the Laplacian matrix as weights on the energy in the impulse response. In addition, we present a method for estimating the effective length of the impulse response. Estimating the length is particularly important in the truncated case. Finally, we develop a method for estimating the noise variance at the output. Often, prior information on the noise variance is not available, and a good estimate is crucial to the success of estimating the impulse response with a nonparametric technique
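
    Setting aside the C/sub L/ order-selection step, the core least-squares estimate of a truncated impulse response can be sketched as follows (the input and response below are synthetic, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

# Unknown truncated impulse response to recover
h_true = np.array([0.0, 0.5, 1.0, 0.7, 0.3, 0.1])

# Known deterministic input and noisy measured output
x = rng.normal(size=400)
y = np.convolve(x, h_true)[:x.size] + rng.normal(0, 0.05, x.size)

def fir_lstsq(x, y, L):
    """Least-squares estimate of an L-tap truncated impulse response:
    regress y on the 0..L-1 sample delays of x."""
    X = np.column_stack([np.concatenate([np.zeros(k), x[:x.size - k]])
                         for k in range(L)])
    return np.linalg.lstsq(X, y, rcond=None)[0]

h_hat = fir_lstsq(x, y, L=6)
print(np.round(h_hat, 2))
```

In the paper's setting, the effective length L itself would be chosen by minimising an estimate of the mean-square prediction error rather than fixed in advance.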

  12. Parametric Sensitivity Tests—European Polymer Electrolyte Membrane Fuel Cell Stack Test Procedures

    DEFF Research Database (Denmark)

    Araya, Samuel Simon; Andreasen, Søren Juhl; Kær, Søren Knudsen

    2014-01-01

    performed based on test procedures proposed by a European project, Stack-Test. The sensitivity of a Nafion-based low temperature PEMFC stack’s performance to parametric changes was the main objective of the tests. Four crucial parameters for fuel cell operation were chosen; relative humidity, temperature......As fuel cells are increasingly commercialized for various applications, harmonized and industry-relevant test procedures are necessary to benchmark tests and to ensure comparability of stack performance results from different parties. This paper reports the results of parametric sensitivity tests......, pressure, and stoichiometry at varying current density. Furthermore, procedures for polarization curve recording were also tested both in ascending and descending current directions....

  13. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
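
    A common conventional bandwidth estimator of the kind benchmarked in such studies is Silverman's rule of thumb; a 1-D sketch on synthetic data (note that scipy's gaussian_kde takes a bandwidth *factor* relative to the sample standard deviation, so the rule-of-thumb bandwidth must be rescaled):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
x = rng.normal(0.0, 2.0, 500)

# Silverman's rule of thumb for a 1-D Gaussian kernel
sigma = min(x.std(ddof=1),
            (np.percentile(x, 75) - np.percentile(x, 25)) / 1.34)
h_silverman = 0.9 * sigma * x.size ** (-1 / 5)

# gaussian_kde's covariance is factor**2 * sample variance, so divide by std
kde = gaussian_kde(x, bw_method=h_silverman / x.std(ddof=1))
density_at_0 = kde(np.array([0.0]))[0]
print(round(density_at_0, 3))  # near the true N(0, 2) density at 0 (~0.199)
```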

  14. Non-parametric early seizure detection in an animal model of temporal lobe epilepsy

    Science.gov (United States)

    Talathi, Sachin S.; Hwang, Dong-Uk; Spano, Mark L.; Simonotto, Jennifer; Furman, Michael D.; Myers, Stephen M.; Winters, Jason T.; Ditto, William L.; Carney, Paul R.

    2008-03-01

    The performance of five non-parametric, univariate seizure detection schemes (embedding delay, Hurst scale, wavelet scale, nonlinear autocorrelation and variance energy) was evaluated as a function of the sampling rate of EEG recordings, the electrode types used for EEG acquisition, and the spatial location of the EEG electrodes in order to determine the applicability of the measures in real-time closed-loop seizure intervention. The criteria chosen for evaluating the performance were high statistical robustness (as determined through the sensitivity and the specificity of a given measure in detecting a seizure) and the lag in seizure detection with respect to the seizure onset time (as determined by visual inspection of the EEG signal by a trained epileptologist). An optimality index was designed to evaluate the overall performance of each measure. For the EEG data recorded with a microwire electrode array at a sampling rate of 12 kHz, the wavelet scale measure exhibited better overall performance in terms of its ability to detect a seizure with a high optimality index value and high statistics in terms of sensitivity and specificity.
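
    The simplest of the five measures listed, variance energy, amounts in essence to a sliding-window variance compared against a threshold calibrated on baseline data; an illustrative sketch on a synthetic signal (not the animal-model recordings):

```python
import numpy as np

def moving_variance(x, win):
    """Sliding-window variance (a 'variance energy'-style measure)."""
    x = np.asarray(x, dtype=float)
    c1 = np.convolve(x, np.ones(win) / win, mode="valid")
    c2 = np.convolve(x ** 2, np.ones(win) / win, mode="valid")
    return c2 - c1 ** 2

rng = np.random.default_rng(5)
baseline = rng.normal(0, 1.0, 2000)       # interictal-like background
event = rng.normal(0, 5.0, 500)           # large-amplitude 'seizure' segment
eeg = np.concatenate([baseline, event])

v = moving_variance(eeg, win=200)
threshold = 3 * np.median(v[:1500])       # calibrated on the baseline only
detect_idx = int(np.argmax(v > threshold))
print(detect_idx)                          # window index near the transition
```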

  15. Parametric and nonparametric Granger causality testing: Linkages between international stock markets

    Science.gov (United States)

    De Gooijer, Jan G.; Sivarajasingham, Selliah

    2008-04-01

    This study investigates long-term linear and nonlinear causal linkages among eleven stock markets, six industrialized markets and five emerging markets of South-East Asia. We cover the period 1987-2006, taking into account the onset of the Asian financial crisis of 1997. We first apply a test for the presence of general nonlinearity in vector time series. Substantial differences exist between the pre- and post-crisis period in terms of the total number of significant nonlinear relationships. We then examine both periods, using a new nonparametric test for Granger noncausality and the conventional parametric Granger noncausality test. One major finding is that the Asian stock markets have become more internationally integrated after the Asian financial crisis. An exception is the Sri Lankan market, with almost no significant long-term linear and nonlinear causal linkages with other markets. To ensure that any causality is strictly nonlinear in nature, we also examine the nonlinear causal relationships of VAR filtered residuals and VAR filtered squared residuals for the post-crisis sample. We find quite a few remaining significant bi- and uni-directional causal nonlinear relationships in these series. Finally, after filtering the VAR-residuals with GARCH-BEKK models, we show that the nonparametric test statistics are substantially smaller in both magnitude and statistical significance than those before filtering. This indicates that nonlinear causality can, to a large extent, be explained by simple volatility effects.
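
    The conventional parametric Granger noncausality test used in such studies is an F-test of whether lags of one series improve an autoregression of the other; a minimal numpy sketch on synthetic series (not the stock-market data):

```python
import numpy as np
from scipy.stats import f as f_dist

def granger_f_test(x, y, p=2):
    """F-test of 'x Granger-causes y': do p lags of x improve an AR(p)
    regression of y? Returns the F statistic and its p-value."""
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    X_r = np.hstack([ones, lags_y])             # restricted: own lags only
    X_u = np.hstack([ones, lags_y, lags_x])     # unrestricted: + lags of x
    rss_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - X_u @ np.linalg.lstsq(X_u, Y, rcond=None)[0]) ** 2)
    df2 = (n - p) - X_u.shape[1]
    F = ((rss_r - rss_u) / p) / (rss_u / df2)
    return F, f_dist.sf(F, p, df2)

# Synthetic pair in which x drives y with a one-period lag
rng = np.random.default_rng(11)
x = rng.normal(size=1000)
y = np.zeros(1000)
for t in range(1, 1000):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

F_xy, p_xy = granger_f_test(x, y)
F_yx, p_yx = granger_f_test(y, x)
print(f"x->y: F={F_xy:.1f} (p={p_xy:.2g});  y->x: F={F_yx:.1f} (p={p_yx:.2g})")
```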

  16. Continuous/discrete non parametric Bayesian belief nets with UNICORN and UNINET

    NARCIS (Netherlands)

    Cooke, R.M.; Kurowicka, D.; Hanea, A.M.; Morales Napoles, O.; Ababei, D.A.; Ale, B.J.M.; Roelen, A.

    2007-01-01

    Hanea et al. (2006) presented a method for quantifying and computing continuous/discrete non parametric Bayesian Belief Nets (BBN). Influences are represented as conditional rank correlations, and the joint normal copula enables rapid sampling and conditionalization. Further mathematical background

  17. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei; Wang, Suojin; Wang, Haiyan

    2014-01-01

    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh

  18. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull...
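
    A 2-parameter Weibull fit of the kind investigated can be reproduced with scipy by fixing the location parameter at zero; the strength sample below is synthetic, not the 6700-specimen dataset:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(2)

# Synthetic 'bending strength' sample (MPa) from a 2-parameter Weibull
true_shape, true_scale = 4.0, 40.0
strengths = weibull_min.rvs(true_shape, scale=true_scale,
                            size=2000, random_state=rng)

# Fixing floc=0 reduces the 3-parameter fit to the 2-parameter case
shape, loc, scale = weibull_min.fit(strengths, floc=0)
print(round(shape, 2), round(scale, 2))  # near the true parameters
```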

  19. Functional summary statistics for the Johnson-Mehl model

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    The Johnson-Mehl germination-growth model is a spatio-temporal point process model which, among other things, has been used for the description of neurotransmitter datasets. However, for such datasets parametric Johnson-Mehl models fitted by maximum likelihood have not yet been evaluated by means...... of functional summary statistics. This paper therefore introduces four functional summary statistics adapted to the Johnson-Mehl model, with two of them based on the second-order properties and the other two on the nuclei-boundary distances for the associated Johnson-Mehl tessellation. The functional summary...... statistics' theoretical properties are investigated, non-parametric estimators are suggested, and their usefulness for model checking is examined in a simulation study. The functional summary statistics are also used for checking fitted parametric Johnson-Mehl models for a neurotransmitter dataset....

  20. Numerical investigations of non-collinear optical parametric chirped pulse amplification for Laguerre-Gaussian vortex beam

    Science.gov (United States)

    Xu, Lu; Yu, Lianghong; Liang, Xiaoyan

    2016-04-01

    We present for the first time a scheme to amplify a Laguerre-Gaussian vortex beam based on non-collinear optical parametric chirped pulse amplification (OPCPA). In addition, a three-dimensional numerical model of non-collinear optical parametric amplification was derived in the frequency domain, taking into account the effects of the non-collinear configuration, temporal and spatial walk-off, group-velocity dispersion and diffraction, to trace the dynamics of the Laguerre-Gaussian vortex beam and investigate its critical parameters in the non-collinear OPCPA process. Based on the numerical simulation results, the scheme shows promise for implementation in a relativistic twisted laser pulse system, which will diversify the light-matter interaction field.

  1. Robustness to non-normality of common tests for the many-sample location problem

    Directory of Open Access Journals (Sweden)

    Azmeri Khan

    2003-01-01

    Full Text Available This paper studies the effect of deviating from the normal distribution assumption when considering the power of two many-sample location test procedures: ANOVA (parametric) and Kruskal-Wallis (non-parametric). Power functions for these tests under various conditions are estimated using simulation, where the simulated data are generated from MacGillivray and Cannon's [10] recently suggested g-and-k distribution. This distribution can provide data with selected amounts of skewness and kurtosis by varying two nearly independent parameters.
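
    The g-and-k distribution mentioned above is straightforward to simulate from, since it is defined through its quantile function applied to standard normal deviates. A minimal standard-library Python sketch (the parameter values are chosen for illustration and are not taken from the paper):

```python
import math
import random

def gk_quantile(z, a=0.0, b=1.0, g=0.5, k=0.2, c=0.8):
    """g-and-k quantile transform of a standard normal deviate z.
    g controls skewness and k controls kurtosis; g = k = 0 gives N(a, b^2)."""
    return a + b * (1 + c * math.tanh(g * z / 2)) * z * (1 + z * z) ** k

rng = random.Random(42)
data = [gk_quantile(rng.gauss(0, 1)) for _ in range(20000)]

# Sample skewness: a positive g should induce right skew
m = sum(data) / len(data)
m2 = sum((x - m) ** 2 for x in data) / len(data)
m3 = sum((x - m) ** 3 for x in data) / len(data)
skew = m3 / m2 ** 1.5
print(f"sample skewness: {skew:.2f}")  # positive for g > 0
```

    Varying g and k (nearly) independently is what lets such a simulation study dial in skewness and kurtosis separately when probing test robustness.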

  2. Monitoring coastal marshes biomass with CASI: a comparison of parametric and non-parametric models

    Science.gov (United States)

    Mo, Y.; Kearney, M.

    2017-12-01

    Coastal marshes are important carbon sinks that face multiple natural and anthropogenic stresses. Optical remote sensing is a powerful tool for closely monitoring the biomass of coastal marshes. However, application of hyperspectral sensors to assessing the biomass of diverse coastal marsh ecosystems is limited. This study samples spectral and biophysical data from coastal freshwater, intermediate, brackish, and saline marshes in Louisiana, and develops parametric and non-parametric models for using the Compact Airborne Spectrographic Imager (CASI) to retrieve the marshes' biomass. Linear models and random forest models are developed from simulated CASI data (48 bands, 380-1050 nm, bandwidth 14 nm). Linear models are also developed using narrowband vegetation indices computed from all possible band combinations from the blue, red, and near infrared wavelengths. It is found that the linear models derived from the optimal narrowband vegetation indices provide strong predictions for the marshes' Leaf Area Index (LAI; R2 > 0.74 for ARVI), but not for their Aboveground Green Biomass (AGB; R2 > 0.25). The linear models derived from the simulated CASI data strongly predict the marshes' LAI (R2 = 0.93) and AGB (R2 = 0.71) and have 27 and 30 bands/variables in the final models through stepwise regression, respectively. The random forest models derived from the simulated CASI data also strongly predict the marshes' LAI and AGB (R2 = 0.91 and 0.84, respectively), where the most important variables for predicting LAI are near infrared bands at 784 and 756 nm and for predicting AGB are red bands at 684 and 670 nm. In sum, the random forest model is preferable for assessing coastal marsh biomass using CASI data, as it offers high R2 for both LAI and AGB. The superior performance of the random forest model is likely due to the fact that it fully utilizes the full-spectrum data and makes no assumption of approximate normality of the sampling population. This study offers solutions

  3. Parametric tests of a traction drive retrofitted to an automotive gas turbine

    Science.gov (United States)

    Rohn, D. A.; Lowenthal, S. H.; Anderson, N. E.

    1980-01-01

    The results of a test program to retrofit a high performance fixed ratio Nasvytis Multiroller Traction Drive in place of a helical gear set to a gas turbine engine are presented. Parametric tests up to a maximum engine power turbine speed of 45,500 rpm and to a power level of 11 kW were conducted. Comparisons were made to similar drives that were parametrically tested on a back-to-back test stand. The drive showed good compatibility with the gas turbine engine. Specific fuel consumption of the engine with the traction drive speed reducer installed was comparable to the original helical gearset equipped engine.

  4. A non-parametric hierarchical model to discover behavior dynamics from tracks

    NARCIS (Netherlands)

    Kooij, J.F.P.; Englebienne, G.; Gavrila, D.M.

    2012-01-01

    We present a novel non-parametric Bayesian model to jointly discover the dynamics of low-level actions and high-level behaviors of tracked people in open environments. Our model represents behaviors as Markov chains of actions which capture high-level temporal dynamics. Actions may be shared by

  5. How characterization and clearance process is planned to be optimized by combining MARSSIM methods with parametric statistics in decommissioning of Karolinska University Hospital in Stockholm

    International Nuclear Information System (INIS)

    Jiselmark, J.

    2017-01-01

    There are different standards for the characterization and clearance process used globally in the radiological industry, each with its own advantages and disadvantages. This paper describes a decommissioning project that combines two methods in order to exploit the advantages of both while minimizing the disadvantages. In Sweden, a method based on parametric Bayesian statistics has been the standard for the characterization and clearance process for several years. This method has great advantages close to the clearance limits: few measurements per m² are required, extra measurements can be added if needed, and area units can be reshaped without restarting the clearance process. Since the method is based on units with a normal or log-normal distribution of the contamination, there can be several units far from the clearance limits. The American MARSSIM method uses non-parametric statistics instead of parametric. In comparison to the Bayesian methods, this results in the disadvantage of less accuracy close to the clearance limits, but also in the great advantage of few units far from the clearance limits. In the characterization and clearance process of old radiological facilities at the Karolinska University Hospital in Stockholm, the MARSSIM method is combined with the Bayesian statistics method to minimize the number of measurements and thereby the cost of clearance. By using Bayesian statistics close to the clearance limits, more areas will be approved for clearance and the risk of having to redo the survey is minimized. By using MARSSIM methods in areas with an assumed contamination below 25% of the clearance limits, the areas do not need to be divided into units with normally or log-normally distributed activity. Bigger areas can be handled as single units, which results in fewer measurements and a faster process. (authors)

  6. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test...

  7. Parametric effects on glass reaction in the unsaturated test method

    International Nuclear Information System (INIS)

    Woodland, A.B.; Bates, J.K.; Gerding, T.J.

    1991-12-01

    The Unsaturated Test Method has been applied to study glass reaction under conditions that may be present at the potential Yucca Mountain site, currently under evaluation for storage of reprocessed high-level nuclear waste. The results from five separate sets of parametric experiments are presented, wherein test parameters ranging from water contact volume to sensitization of the metal in contact with the glass were examined. The most significant effect was observed when the volume of water, as controlled by the water injection volume and interval period, was such as to allow exfoliation of reacted glass to occur. The extent of reaction was also influenced, to a lesser degree, by the degree of sensitization of the 304L stainless steel. For each experiment, the release of cations from the glass and alteration of the glass were examined. The major alteration product is a smectite clay that forms both by precipitation from solution and by in-situ alteration of the glass itself. It is this clay that undergoes exfoliation as water drips from the glass. A comparison is made between the results of the parametric experiments and those of static leach tests. In the static tests the rates of release become progressively reduced through 39 weeks while, in contrast, they remain relatively constant in the parametric experiments for at least 300 weeks. This differing behavior may be attributable to the dripping water environment, where fresh water is periodically added and where evaporation can occur.

  8. Comparison of parametric instabilities for different test mass materials in advanced gravitational wave interferometers

    International Nuclear Information System (INIS)

    Ju, L.; Zhao, C.; Gras, S.; Degallaix, J.; Blair, D.G.; Munch, J.; Reitze, D.H.

    2006-01-01

    Following the recognition that parametric instabilities can significantly compromise the performance of advanced laser interferometer gravitational wave detectors, we compare the performance of three different test mass configurations: all fused silica test masses, all sapphire test masses and fused silica inboard test masses with sapphire end test masses. We show that the configuration with sapphire end test masses offers the opportunity for thermal tuning on a time scale comparable to the ring up time of oscillatory instabilities. This approach may enable significant reduction of parametric gain

  9. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    Science.gov (United States)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.

  10. Hadron Energy Reconstruction for ATLAS Barrel Combined Calorimeter Using Non-Parametrical Method

    CERN Document Server

    Kulchitskii, Yu A

    2000-01-01

    Hadron energy reconstruction for the ATLAS barrel prototype combined calorimeter in the framework of the non-parametrical method is discussed. The non-parametrical method utilizes only the known e/h ratios and the electron calibration constants and does not require the determination of any parameters by a minimization technique. Thus, this technique lends itself to fast energy reconstruction in a first-level trigger. The reconstructed mean values of the hadron energies are within ±1% of the true values, and the fractional energy resolution is [(58±3)%·√GeV/√E + (2.5±0.3)%] ⊕ (1.7±0.2) GeV/E. The value of the e/h ratio obtained for the electromagnetic compartment of the combined calorimeter is 1.74±0.04. Results of a study of the longitudinal hadronic shower development are also presented.

  11. Tips and Tricks for Successful Application of Statistical Methods to Biological Data.

    Science.gov (United States)

    Schlenker, Evelyn

    2016-01-01

    This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
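
    The chapter's point about log transforms can be illustrated with a short sketch using only the Python standard library: a log transform of right-skewed (log-normal) synthetic data removes the skew, after which parametric tests become defensible.

```python
import math
import random
import statistics

rng = random.Random(1)
# Synthetic right-skewed data: exponentiating normal deviates gives a log-normal sample
raw = [math.exp(rng.gauss(0, 1)) for _ in range(5000)]
logged = [math.log(x) for x in raw]

def skewness(xs):
    """Sample skewness (third standardized moment)."""
    m, sd = statistics.fmean(xs), statistics.pstdev(xs)
    return sum(((x - m) / sd) ** 3 for x in xs) / len(xs)

print(f"skewness before log transform: {skewness(raw):.2f}")    # strongly positive
print(f"skewness after log transform:  {skewness(logged):.2f}")  # near zero
```

    On real data one would confirm the transform worked (e.g. with a normality test) before switching to t tests or ANOVA.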

  12. Single photon emission computed tomography and statistical parametric mapping analysis in cirrhotic patients with and without minimal hepatic encephalopathy

    International Nuclear Information System (INIS)

    Nakagawa, Yuri; Matsumura, Kaname; Iwasa, Motoh; Kaito, Masahiko; Adachi, Yukihiko; Takeda, Kan

    2004-01-01

    The early diagnosis and treatment of cognitive impairment in cirrhotic patients is needed to improve the patients' daily living. In this study, alterations of regional cerebral blood flow (rCBF) were evaluated in cirrhotic patients using statistical parametric mapping (SPM). The relationships between rCBF and neuropsychological test, severity of disease and biochemical data were also assessed. 99m Tc-ethyl cysteinate dimer single photon emission computed tomography was performed in 20 patients with non-alcoholic liver cirrhosis without overt hepatic encephalopathy (HE) and in 20 age-matched healthy subjects. Neuropsychological tests were performed in 16 patients; of these 7 had minimal HE. Regional CBF images were also analyzed in these groups using SPM. On SPM analysis, cirrhotic patients showed regions of significant hypoperfusion in the superior and middle frontal gyri, and inferior parietal lobules compared with the control group. These areas included parts of the premotor and parietal associated areas of the cortex. Among the cirrhotic patients, those with minimal HE had regions of significant hypoperfusion in the cingulate gyri bilaterally as compared with those without minimal HE. Abnormal function in the above regions may account for the relatively selective neuropsychological deficits in the cognitive status of patients with cirrhosis. These findings may be important in the identification and management of cirrhotic patients with minimal HE. (author)

  13. Non-Intrusive Solution of Stochastic and Parametric Equations

    KAUST Repository

    Matthies, Hermann

    2015-01-07

    Many problems depend on parameters, which may be a finite set of numerical values, or mathematically more complicated objects like for example processes or fields. We address the situation where we have an equation which depends on parameters; stochastic equations are a special case of such parametric problems where the parameters are elements from a probability space. One common way to represent this dependence on parameters is by evaluating the state (or solution) of the system under investigation for different values of the parameters. But often one wants to evaluate the solution quickly for a new set of parameters where it has not been sampled. In this situation it may be advantageous to express the parameter-dependent solution with an approximation which allows for rapid evaluation of the solution. Such approximations are also called proxy or surrogate models, response functions, or emulators. All these methods may be seen as functional approximations: representations of the solution by an “easily computable” function of the parameters, as opposed to pure samples. The most obvious methods of approximation are based on interpolation, in this context often labelled as collocation. In the frequent situation where one has a “solver” for the equation for a given parameter value, i.e. a software component or a program, it is evident that this can be used to independently (if desired, in parallel) solve for all the parameter values, which subsequently may be used either for the interpolation or in the quadrature for the projection. Such methods are therefore uncoupled for each parameter value, and they additionally often carry the label “non-intrusive”. Without much argument, all other methods, which produce a coupled system of equations, are almost always labelled as “intrusive”, meaning that one cannot use the original solver. We want to show here that this is not necessarily the case. Another approach is to choose some other projection onto
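
    The non-intrusive idea, running an existing solver independently (possibly in parallel) at a few parameter values and then building a cheap functional approximation, can be sketched as follows. The "solver" here is a hypothetical stand-in, and Lagrange interpolation is just one of the collocation-type approximations the text mentions.

```python
import math

def solver(p):
    """Stand-in for an expensive parametric solver (hypothetical model problem)."""
    return math.exp(-p)

# Collocation: evaluate the solver independently at a few parameter values ...
nodes = [0.0, 0.5, 1.0, 1.5, 2.0]
values = [solver(p) for p in nodes]

# ... then build a surrogate by Lagrange interpolation through the samples
def surrogate(p):
    total = 0.0
    for i, (pi, vi) in enumerate(zip(nodes, values)):
        basis = 1.0
        for j, pj in enumerate(nodes):
            if j != i:
                basis *= (p - pj) / (pi - pj)
        total += vi * basis
    return total

# Rapid evaluation at a new parameter value where the solver was never run
err = abs(surrogate(0.75) - solver(0.75))
print(f"surrogate error at p = 0.75: {err:.1e}")
```

    The surrogate reproduces the solver exactly at the nodes and approximates it in between; nothing about the solver's internals was touched, which is what makes the approach non-intrusive.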

  15. Performance of non-parametric algorithms for spatial mapping of tropical forest structure

    Directory of Open Access Journals (Sweden)

    Liang Xu

    2016-08-01

    Full Text Available Abstract Background Mapping tropical forest structure is a critical requirement for accurate estimation of emissions and removals from land use activities. With the availability of a wide range of remote sensing imagery of vegetation characteristics from space, development of finer resolution and more accurate maps has advanced in recent years. However, the mapping accuracy relies heavily on the quality of input layers, the algorithm chosen, and the size and quality of inventory samples for calibration and validation. Results By using airborne lidar data as the “truth” and focusing on the mean canopy height (MCH) as a key structural parameter, we test two commonly-used non-parametric techniques, maximum entropy (ME) and random forest (RF), for developing maps over a study site in Central Gabon. Results of mapping show that both approaches have improved accuracy with more input layers in mapping canopy height at 100 m (1-ha) pixels. The bias-corrected spatial models further improve estimates for small and large trees across the tails of height distributions, with a trade-off of increased overall mean squared error that can be readily compensated by increasing the sample size. Conclusions A significant improvement in tropical forest mapping can be achieved by weighting the number of inventory samples against the choice of image layers and the non-parametric algorithms. Without future satellite observations with better sensitivity to forest biomass, the maps based on existing data will remain slightly biased towards the mean of the distribution, under-estimating the upper and over-estimating the lower tails of the distribution.

  16. 100 statistical tests

    CERN Document Server

    Kanji, Gopal K

    2006-01-01

    This expanded and updated Third Edition of Gopal K. Kanji's best-selling resource on statistical tests covers all the most commonly used tests with information on how to calculate and interpret results with simple datasets. Each entry begins with a short summary statement about the test's purpose, and contains details of the test objective, the limitations (or assumptions) involved, a brief outline of the method, a worked example, and the numerical calculation. 100 Statistical Tests, Third Edition is the one indispensable guide for users of statistical materials and consumers of statistical information at all levels and across all disciplines.

  17. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    Full Text Available This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area’s hydrological information regarding rainfall was scarce and existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted: it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, along with the magnitude of uncertainty affecting hydrological parameter estimation. This paper’s conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and may be an obligatory procedure in the future. Their potential lies in treating scarce information, and they represent a robust modelling strategy for non-seasonal stochastic modelling conditions.
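
    As a far simpler stand-in for the article's multinomial/p-box machinery, the idea of reporting an interval rather than a single probability for threshold exceedance can be sketched with a Wilson score interval; the exceedance counts below are hypothetical, not the Manizales record.

```python
import math
from statistics import NormalDist

def wilson_interval(k, n, conf=0.95):
    """Wilson score confidence interval for a binomial proportion k/n."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    p_hat = k / n
    denom = 1 + z * z / n
    centre = (p_hat + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# Hypothetical: 12 days above a 70 mm daily-rainfall threshold in 3650 observed days
lo, hi = wilson_interval(12, 3650)
print(f"daily exceedance probability in [{lo:.5f}, {hi:.5f}]")
```

    A p-box generalizes this by bounding the whole distribution function rather than one exceedance probability, but the width of such intervals conveys the same message: with scarce data, a detailed point estimate of probability is not warranted.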

  18. A non-parametric Bayesian approach to decompounding from high frequency data

    NARCIS (Netherlands)

    Gugushvili, Shota; van der Meulen, F.H.; Spreij, Peter

    2016-01-01

    Given a sample from a discretely observed compound Poisson process, we consider non-parametric estimation of the density f0 of its jump sizes, as well as of its intensity λ0. We take a Bayesian approach to the problem and specify the prior on f0 as the Dirichlet location mixture of normal densities.

  19. Parametric methods for spatial point processes

    DEFF Research Database (Denmark)

    Møller, Jesper

    (This text is submitted for the volume ‘A Handbook of Spatial Statistics' edited by A.E. Gelfand, P. Diggle, M. Fuentes, and P. Guttorp, to be published by Chapman and Hall/CRC Press, and planned to appear as Chapter 4.4 with the title ‘Parametric methods'.) 1 Introduction This chapter considers inference procedures for parametric spatial point process models. The widespread use of sensible but ad hoc methods based on summary statistics of the kind studied in Chapter 4.3 has through the last two decades been supplemented by likelihood-based methods for parametric spatial point process models... is studied in Section 4, and Bayesian inference in Section 5. On one hand, as the development in computer technology and computational statistics continues, computationally-intensive simulation-based methods for likelihood inference probably will play an increasing role in statistical analysis of spatial...

  20. Confronting Passive and Active Sensors with Non-Gaussian Statistics

    Directory of Open Access Journals (Sweden)

    Pablo Rodríguez-Gonzálvez

    2014-07-01

    Full Text Available This paper has two motivations: firstly, to compare the Digital Surface Models (DSM) derived by passive (digital camera) and by active (terrestrial laser scanner) remote sensing systems when applied to specific architectural objects, and secondly, to test how well classic Gaussian statistics, with its Least Squares principle, adapts to data sets where asymmetrical gross errors may appear, and whether this approach should be changed for a non-parametric one. The field of geomatic technology automation is immersed in a highly demanding competition in which any innovation by one of the contenders immediately challenges the opponents to propose a better improvement. Nowadays, we seem to be witnessing an improvement of terrestrial photogrammetry and its integration with computer vision to overcome the performance limitations of laser scanning methods. Through this contribution some of the issues of this “technological race” are examined from the point of view of photogrammetry. New software is introduced and an experimental test is designed, performed and assessed to try to cast some light on this thrilling match. For the case considered in this study, the results show good agreement between both sensors, despite considerable asymmetry. This asymmetry suggests that the standard Normal parameters are not adequate to assess this type of data, especially when accuracy is of importance. In this case, standard deviation fails to provide a good estimation of the results, whereas the Median Absolute Deviation and the Biweight Midvariance are more appropriate measures.
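
    The robust scale estimates favoured in the conclusion are easy to compute. A standard-library sketch of the Median Absolute Deviation (MAD) on synthetic data containing one asymmetric gross error; the Biweight Midvariance is similar in spirit but down-weights outliers smoothly rather than through medians.

```python
import statistics

def mad(xs):
    """Median absolute deviation: a robust estimate of scale."""
    med = statistics.median(xs)
    return statistics.median(abs(x - med) for x in xs)

clean = [10.1, 9.9, 10.0, 10.2, 9.8]
with_outlier = clean + [45.0]  # one asymmetric gross error, e.g. a spurious range reading

print(f"stdev: {statistics.stdev(with_outlier):.2f}")  # blown up by the single outlier
print(f"MAD:   {mad(with_outlier):.2f}")               # barely moves
```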

  1. Inferential, non-parametric statistics to assess the quality of probabilistic forecast systems

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.; Lennox, S.; Stone, R.C.

    2007-01-01

    Many statistical forecast systems are available to interested users. To be useful for decision making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must

  2. Statistical reliability analyses of two wood plastic composite extrusion processes

    International Nuclear Information System (INIS)

    Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.

    2011-01-01

    Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC metrics of reliability for the two extrusion lines that may be helpful for use by the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving product reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
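
    The Kaplan-Meier estimator underlying those survival curves can be hand-rolled in a few lines. A sketch on hypothetical right-censored failure data (not the WPC measurements):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: observed times; events: 1 = failure observed, 0 = right-censored.
    Returns (time, survival probability) pairs at each failure time."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    censored = Counter(t for t, e in zip(times, events) if not e)
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths[t]
        if d:
            surv *= 1 - d / at_risk   # conditional survival past time t
            curve.append((t, surv))
        at_risk -= d + censored[t]    # ties: failures counted before censorings
    return curve

# Hypothetical failure times with one right-censored unit at t = 5
curve = kaplan_meier([2, 3, 3, 5, 7, 8], [1, 1, 1, 0, 1, 1])
for t, s in curve:
    print(t, round(s, 3))
```

    Comparing two processes then amounts to computing one curve per extrusion line and examining where they diverge, as the paper does for MOE and MOR.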

  3. Testing the Weak Form Efficiency of Karachi Stock Exchange

    Directory of Open Access Journals (Sweden)

    Muhammad Arshad Haroon

    2012-12-01

    Full Text Available In an efficient market, share prices reflect all available information. The study of the efficient market hypothesis helps investors take the right decisions related to investments. In this research, the weak form efficiency of the Karachi Stock Exchange (KSE) has been tested over the period from 2nd November 1991 to 2nd November 2011. Descriptive statistics indicated the absence of weak form efficiency, and the results of the non-parametric tests were consistent with this finding. The non-parametric tests employed were the KS goodness-of-fit test, the run test, and the autocorrelation test, used to assess the serial independence of the data. The results show that the KSE is not weak-form efficient. This happens because the KSE is an emerging market, where information has been observed to take time to be processed. Thus it can be said that technical analysis may be applied to gain abnormal returns.
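
    Of the non-parametric tests employed, the run test is the simplest to reproduce. A standard-library sketch of the Wald-Wolfowitz runs test about the median, applied here to an obviously non-random (alternating) series:

```python
import math
from statistics import NormalDist, median

def runs_test(series):
    """Wald-Wolfowitz runs test for randomness about the median.
    Returns (z statistic, two-sided p value) under a normal approximation."""
    med = median(series)
    signs = [x > med for x in series if x != med]
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n1 - n2) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    z = (runs - mu) / math.sqrt(var)
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

z, p = runs_test([1, 9, 2, 8] * 4)  # strictly alternating: far too many runs
print(f"z = {z:.2f}, p = {p:.4f}")
```

    For a series of real daily returns, failing to reject randomness here would be consistent with weak-form efficiency; the paper reports the opposite for the KSE.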

  4. Statistical Analysis for High-Dimensional Data : The Abel Symposium 2014

    CERN Document Server

    Bühlmann, Peter; Glad, Ingrid; Langaas, Mette; Richardson, Sylvia; Vannucci, Marina

    2016-01-01

    This book features research contributions from The Abel Symposium on Statistical Analysis for High Dimensional Data, held in Nyvågar, Lofoten, Norway, in May 2014. The focus of the symposium was on statistical and machine learning methodologies specifically developed for inference in “big data” situations, with particular reference to genomic applications. The contributors, who are among the most prominent researchers on the theory of statistics for high dimensional inference, present new theories and methods, as well as challenging applications and computational solutions. Specific themes include, among others, variable selection and screening, penalised regression, sparsity, thresholding, low dimensional structures, computational challenges, non-convex situations, learning graphical models, sparse covariance and precision matrices, semi- and non-parametric formulations, multiple testing, classification, factor models, clustering, and preselection. Highlighting cutting-edge research and casting light on...

  5. Parametric methods outperformed non-parametric methods in comparisons of discrete numerical variables

    Directory of Open Access Journals (Sweden)

    Sandvik Leiv

    2011-04-01

    Full Text Available Abstract Background The number of events per individual is a widely reported variable in medical research papers. Such variables are the most common representation of the general variable type called discrete numerical. There is currently no consensus on how to compare and present such variables, and recommendations are lacking. The objective of this paper is to present recommendations for analysis and presentation of results for discrete numerical variables. Methods Two simulation studies were used to investigate the performance of hypothesis tests and confidence interval methods for variables with outcomes {0, 1, 2}, {0, 1, 2, 3}, {0, 1, 2, 3, 4}, and {0, 1, 2, 3, 4, 5}, using the difference between the means as an effect measure. Results The Welch U test (the T test with adjustment for unequal variances) and its associated confidence interval performed well for almost all situations considered. The Brunner-Munzel test also performed well, except for small sample sizes (10 in each group). The ordinary T test, the Wilcoxon-Mann-Whitney test, the percentile bootstrap interval, and the bootstrap-t interval did not perform satisfactorily. Conclusions The difference between the means is an appropriate effect measure for comparing two independent discrete numerical variables that have both lower and upper bounds. To analyze this problem, we encourage more frequent use of parametric hypothesis tests and confidence intervals.
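
    The recommended Welch test (the T test with adjustment for unequal variances) is available in SciPy via `equal_var=False`. The sketch below applies it to hypothetical event counts bounded on {0, 1, 2, 3}, as in the simulation study; the counts are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical event counts per patient in two independent groups,
    # each bounded on {0, 1, 2, 3} as in the simulation study.
    group_a = np.array([0, 1, 1, 2, 0, 3, 1, 2, 2, 1, 0, 1, 2, 1, 3, 0, 1, 2, 1, 1])
    group_b = np.array([1, 2, 2, 3, 1, 3, 2, 2, 3, 1, 2, 2, 3, 2, 1, 3, 2, 2, 1, 2])

    # Welch's t-test: compares the means without assuming equal variances.
    t, p = stats.ttest_ind(group_a, group_b, equal_var=False)
    diff = group_a.mean() - group_b.mean()
    print(f"mean difference={diff:.2f}, t={t:.2f}, p={p:.4f}")
    ```

    The difference between the means is the effect measure the paper recommends reporting alongside the test result.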

  6. Some statistical issues important to future developments in human radiation research

    International Nuclear Information System (INIS)

    Vaeth, Michael

    1991-01-01

    Using his two years experience at the Radiation Effects Research Foundation at Hiroshima, the author tries to outline some of the areas of statistics where methodologies relevant to the future developments in human radiation research are likely to be found. Problems related to statistical analysis of existing data are discussed, together with methodological developments in non-parametric and semi-parametric regression modelling, and interpretation and presentation of results. (Author)

  7. Energy consumption and economic growth: Parametric and non-parametric causality testing for the case of Greece

    International Nuclear Information System (INIS)

    Dergiades, Theologos; Martinopoulos, Georgios; Tsoulfidis, Lefteris

    2013-01-01

    The objective of this paper is to contribute towards the understanding of the linear and non-linear causal linkages between energy consumption and economic activity, making use of annual time series data of Greece for the period 1960–2008. Our study has two salient features: first, the total energy consumption has been adjusted for qualitative differences among its constituent components through the thermodynamics of energy conversion. In doing so, we rule out the possibility of a misleading inference with respect to causality due to aggregation bias. Second, the investigation of the causal linkage between economic growth and the quality-adjusted total energy consumption is conducted within a non-linear context. Our empirical results reveal significant unidirectional linear and non-linear causal linkages running from total useful energy to economic growth. These findings may provide valuable information for the formulation of more effective energy policies with respect to both the consumption of energy and environmental protection. - Highlights: ► The energy consumption and economic growth nexus is investigated for Greece. ► A quality-adjusted energy series is used in our analysis. ► The causality testing procedure is conducted within a non-linear context. ► A causality running from energy consumption to economic growth is verified.

  8. Mathematical statistics and stochastic processes

    CERN Document Server

    Bosq, Denis

    2013-01-01

    Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book however, both this case AND the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners.Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of theory of probability, estimation, confidence intervals, non-parametric statistics and rob

  9. An appraisal of statistical procedures used in derivation of reference intervals.

    Science.gov (United States)

    Ichihara, Kiyoshi; Boyd, James C

    2010-11-01

    When conducting studies to derive reference intervals (RIs), various statistical procedures are commonly applied at each step, from the planning stages to the final computation of RIs. Determination of the necessary sample size is an important consideration, and evaluation of at least 400 individuals in each subgroup has been recommended to establish reliable common RIs in multicenter studies. Multiple regression analysis allows identification of the most important factors contributing to variation in test results, while accounting for possible confounding relationships among these factors. Of the various approaches proposed for judging the necessity of partitioning reference values, nested analysis of variance (ANOVA) is the likely method of choice owing to its ability to handle multiple groups and to adjust for multiple factors. The Box-Cox power transformation often has been used to transform data to a Gaussian distribution for parametric computation of RIs. However, this transformation occasionally fails. Therefore, the non-parametric method, based on determination of the 2.5th and 97.5th percentiles after sorting the data, has been recommended for general use. The performance of the Box-Cox transformation can be improved by introducing an additional parameter representing the origin of transformation. In simulations, the confidence intervals (CIs) of reference limits (RLs) calculated by the parametric method were narrower than those calculated by the non-parametric approach. However, the margin of difference was rather small owing to additional variability in parametrically-determined RLs introduced by estimation of parameters for the Box-Cox transformation. The parametric calculation method may have an advantage over the non-parametric method in allowing identification and exclusion of extreme values during RI computation.
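
    The non-parametric method recommended above reduces to taking the 2.5th and 97.5th percentiles of the sorted reference data. A minimal sketch, using simulated (not real) analyte values for the recommended minimum of 400 reference individuals:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Hypothetical analyte results from 400 reference individuals,
    # the minimum subgroup size recommended for multicenter RI studies.
    values = rng.lognormal(mean=1.0, sigma=0.25, size=400)

    # Non-parametric reference interval: the 2.5th and 97.5th percentiles
    # of the sorted data serve as the lower and upper reference limits.
    lower, upper = np.percentile(values, [2.5, 97.5])
    print(f"reference interval: {lower:.2f} - {upper:.2f}")
    ```

    By construction, roughly 2.5% of the reference values fall outside each limit; the parametric route would instead Box-Cox-transform the data and compute limits from the fitted Gaussian.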

  10. MR diffusion tensor analysis of schizophrenic brain using statistical parametric mapping

    International Nuclear Information System (INIS)

    Yamada, Haruyasu; Abe, Osamu; Kasai, Kiyoto

    2005-01-01

    The purpose of this study is to investigate diffusion anisotropy in the schizophrenic brain by voxel-based analysis of diffusion tensor imaging (DTI), using statistical parametric mapping (SPM). We studied 33 patients with schizophrenia diagnosed by Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV criteria and 42 matched controls. The data were obtained with a 1.5 T MRI system. We used single-shot spin-echo planar sequences (repetition time/echo time (TR/TE)=5000/102 ms, 5 mm slice thickness and 1.5 mm gap, field of view (FOV)=21 x 21 cm², number of excitations (NEX)=4, 128 x 128 pixel matrix) for diffusion tensor acquisition. Diffusion gradients (b-value of 500 or 1000 s/mm²) were applied on two axes simultaneously. Diffusion properties were measured along 6 non-collinear directions. The structural distortion induced by the large diffusion gradients was corrected based on each T2-weighted echo-planar image (b=0 s/mm²). The fractional anisotropy (FA) maps were generated on a voxel-by-voxel basis. T2-weighted echo-planar images were then segmented into gray matter, white matter, and cerebrospinal fluid, using SPM (Wellcome Department of Imaging, University College London, UK). All apparent diffusion coefficient (ADC) and FA maps in native space were transformed to stereotactic space by registering each of the images to the same template image. The normalized data were smoothed and analyzed using SPM. A significant FA decrease in the patient group was found in the uncinate fasciculus, parahippocampal white matter, anterior cingulum and other areas (corrected p<0.05). No significantly increased region was noted. Our results may reflect reduced diffusion anisotropy of the white matter pathways of the limbic system, as shown by the decreased FA. Manual region-of-interest analysis is usually more sensitive than voxel-based analysis, but it is subjective and difficult to perform with anatomical reproducibility. Voxel-based analysis of the diffusion tensor...

  11. Parametric Portfolio Selection: Evaluating and Comparing to Markowitz Portfolios

    Directory of Open Access Journals (Sweden)

    Marcelo C. Medeiros

    2014-10-01

    Full Text Available In this paper we explore parametric portfolio optimization in the Brazilian market. Our data consist of monthly returns of 306 Brazilian stocks in the period between 2001 and 2013. We tested the model both in and out of sample and compared the results with value-weighted (VW), equal-weighted (EW) and Markowitz-based portfolios. We performed statistical inference on the parametric optimization using bootstrap techniques in order to build the parameters' empirical distributions. Our results showed that parametric optimization is a very efficient technique out of sample. It consistently showed results superior to the VW, EW and Markowitz portfolios, even when transaction costs were included. Finally, we consider the parametric approach to be very flexible with respect to the inclusion of constraints on weights, transaction costs, and the listing and delisting of stocks.

  12. A multi-instrument non-parametric reconstruction of the electron pressure profile in the galaxy cluster CLJ1226.9+3332

    Science.gov (United States)

    Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-04-01

    Context. In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions will provide insights into the thermodynamic state of the intracluster medium. Aim. We seek to recover the non-parametric pressure profiles of the high redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions (0.05 R500) to the cluster outskirts, and will serve as a useful reference for the next generation of SZ instruments such as NIKA2 and MUSTANG2.

  13. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    Full Text Available The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration ranges frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using non-parametric statistical tests: the one-sample rank test and the Wilcoxon signed-rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models, because only slight differences between the errors (presented through the relative residuals, RR) were obtained. Estimation of the significance of the differences in the RR was achieved using the Wilcoxon signed-rank test, where the linear and quadratic regression models were treated as two independent groups. The application of this simple non-parametric statistical test provides statistical confirmation of the choice of an adequate regression model.
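
    The kind of comparison described above can be sketched with SciPy's Wilcoxon signed-rank test. The residual values below are invented for illustration (not the study's data), and here the two models' absolute relative residuals are paired by calibration standard rather than treated as independent groups:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical absolute relative residuals (%) from fitting the same
    # calibration standards with a weighted linear and a quadratic model.
    rr_linear = np.array([5.1, 3.8, 6.2, 4.9, 7.0, 3.5, 5.8, 6.5, 4.2, 5.5])
    rr_quadratic = np.array([3.9, 3.1, 4.8, 4.1, 5.2, 3.0, 4.6, 5.1, 3.7, 4.4])

    # Wilcoxon signed-rank test: are the residuals systematically
    # smaller under one of the two regression models?
    stat, p = stats.wilcoxon(rr_linear, rr_quadratic)
    print(f"W={stat}, p={p:.4f}")
    ```

    A small p-value would justify preferring the model with the smaller residuals; in this made-up example every quadratic residual is smaller, so the test statistic is at its minimum.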

  14. Parametric statistical techniques for the comparative analysis of censored reliability data: a review

    International Nuclear Information System (INIS)

    Bohoris, George A.

    1995-01-01

    This paper summarizes part of the work carried out to date on seeking analytical solutions to the two-sample problem with censored data in the context of reliability and maintenance optimization applications. For this purpose, parametric two-sample tests for failure and censored reliability data are introduced and their applicability/effectiveness in common engineering problems is reviewed

  15. EPA/NMED/LANL 1998 water quality results: Statistical analysis and comparison to regulatory standards

    International Nuclear Information System (INIS)

    Gallaher, B.; Mercier, T.; Black, P.; Mullen, K.

    2000-01-01

    Four governmental agencies conducted a round of groundwater, surface water, and spring water sampling at the Los Alamos National Laboratory during 1998. Samples were split among the four parties and sent to independent analytical laboratories. Results from three of the agencies were available for this study. Comparisons of analytical results that were paired by location and date were made between the various analytical laboratories. The results for over 50 split samples analyzed for inorganic chemicals, metals, and radionuclides were compared. Statistical analyses included non-parametric (sign test and signed-ranks test) and parametric (paired t-test and linear regression) methods. The data pairs were tested for statistically significant differences, defined by an observed significance level, or p-value, less than 0.05. The main conclusion is that the laboratories' performances are similar across most of the analytes that were measured. In some 95% of the laboratory measurements there was agreement on whether contaminant levels exceeded regulatory limits. The most significant differences in performance were noted for the radioactive suite, particularly for gross alpha particle activity and Sr-90
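
    The paired non-parametric and parametric checks used in this study can be sketched as follows. The split-sample concentrations below are hypothetical, and the sign test is computed directly from the binomial distribution rather than with a library call:

    ```python
    import numpy as np
    from math import comb
    from scipy import stats

    # Hypothetical split-sample results (same analyte, two laboratories),
    # paired by sampling location and date.
    lab_a = np.array([1.02, 0.98, 1.10, 1.05, 0.97, 1.08, 1.01, 0.99, 1.04, 1.03])
    lab_b = np.array([1.00, 0.99, 1.07, 1.06, 0.96, 1.09, 1.00, 0.98, 1.05, 1.02])

    # Parametric check: paired t-test on the within-pair differences.
    t, p_t = stats.ttest_rel(lab_a, lab_b)

    # Non-parametric check: sign test, a two-sided binomial test on how
    # often lab A exceeds lab B among the non-tied pairs.
    diffs = lab_a - lab_b
    n_pos = int(np.sum(diffs > 0))
    n = int(np.sum(diffs != 0))
    k = max(n_pos, n - n_pos)
    p_sign = min(1.0, 2 * sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n)
    print(f"paired t: p={p_t:.3f}; sign test: p={p_sign:.3f}")
    ```

    With both p-values above 0.05, neither test would flag a statistically significant difference between the laboratories, matching the study's convention of an observed significance level below 0.05.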

  16. Statistical concepts a second course

    CERN Document Server

    Lomax, Richard G

    2012-01-01

    Statistical Concepts consists of the last 9 chapters of An Introduction to Statistical Concepts, 3rd ed. Designed for the second course in statistics, it is one of the few texts that focuses just on intermediate statistics. The book highlights how statistics work and what they mean to better prepare students to analyze their own data and interpret SPSS and research results. As such it offers more coverage of non-parametric procedures used when standard assumptions are violated since these methods are more frequently encountered when working with real data. Determining appropriate sample sizes

  17. Diagnosis of regional cerebral blood flow abnormalities using SPECT: agreement between individualized statistical parametric maps and visual inspection by nuclear medicine physicians with different levels of expertise in nuclear neurology

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, Euclides Timoteo da, E-mail: euclidestimoteo@uol.com.b [Fundacao Pio XII, Barretos, SP (Brazil). Hospital de Cancer. Dept. de Medicina Nuclear; Buchpiguel, Carlos Alberto [Hospital do Coracao, Sao Paulo, SP (Brazil). Dept. de Medicina Nuclear; Nitrini, Ricardo [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina. Dept. de Neurologia; Tazima, Sergio [Hospital Alemao Oswaldo Cruz (HAOC), Sao Paulo, SP (Brazil). Dept. de Medicina Nuclear; Peres, Stela Verzinhase [Fundacao Pio XII, Barretos, SP (Brazil). Hospital de Cancer; Busatto Filho, Geraldo [Universidade de Sao Paulo (USP), SP (Brazil). Faculdade de Medicina. Div. de Medicina Nuclear

    2009-07-01

    Introduction: visual analysis is widely used to interpret regional cerebral blood flow (rCBF) SPECT images in clinical practice despite its limitations. Automated methods are employed to investigate between-group rCBF differences in research studies but have rarely been explored in individual analyses. Objectives: to compare visual inspection by nuclear physicians with the automated statistical parametric mapping program using a SPECT dataset of patients with neurological disorders and normal control images. Methods: using statistical parametric mapping, 14 SPECT images from patients with various neurological disorders were compared individually with a databank of 32 normal images using a statistical threshold of p<0.05 (corrected for multiple comparisons at the level of individual voxels or clusters). Statistical parametric mapping results were compared with visual analyses by a nuclear physician highly experienced in neurology (A) as well as a nuclear physician with a general background of experience (B), who independently classified images as normal or altered and determined the location and severity of any changes. Results: Of the 32 images of the normal databank, 4 generated maps showing rCBF abnormalities (p<0.05, corrected). Among the 14 images from patients with neurological disorders, 13 showed rCBF alterations. Statistical parametric mapping and physician A completely agreed on 84.37% and 64.28% of cases from the normal databank and the neurological disorders group, respectively. The agreement between statistical parametric mapping and the ratings of physician B was lower (71.18% and 35.71%, respectively). Conclusion: statistical parametric mapping replicated the findings described by the more experienced nuclear physician. This finding suggests that automated methods for individually analyzing rCBF SPECT images may be a valuable resource to complement visual inspection in clinical practice. (author)

  18. Soliton Coupling Driven by Phase Fluctuations in Auto-Parametric Resonance

    CERN Document Server

    Binder, B

    2002-01-01

    In this paper the interaction of sine-Gordon solitons and mediating linear waves is modelled by a special case of auto-parametric resonance, the Rayleigh-type self-excited non-linear autonomous system driven by a statistical phase gradient related to the soliton energy. Spherical symmetry can stimulate "whispering gallery modes" (WGM) with integral coupling number M=137.

  19. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes on the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non-parametric and non-supervised approach, based on the Fisher-Jenks optimal classification algorithm, is used to identify multi-scale meteorological droughts on the basis of empirical cumulative distributions of 1, 3, 6, and 12-monthly precipitation totals. As input data for the classifier, we use the gridded GPCC ... for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009...

  20. Discrete non-parametric kernel estimation for global sensitivity analysis

    International Nuclear Information System (INIS)

    Senga Kiessé, Tristan; Ventura, Anne

    2016-01-01

    This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently only the continuous kernel approach has been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. Now the discrete kernel estimation is known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of sensitivity indices is also presented with its asymptotic convergence rate. Some simulations on a test function analysis and a real case study from agriculture have shown that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderate or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.
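
    The quantity being estimated here, the contribution of a discrete input's variance to the output variance, is the first-order sensitivity index S1 = Var(E[Y|X]) / Var(Y). The sketch below estimates it by direct grouping on the levels of a discrete input for a toy model (not the paper's kernel estimator, which smooths the conditional expectation instead):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Toy model with one discrete input: Y = X1 + noise, X1 in {0, 1, 2}.
    x1 = rng.integers(0, 3, 20000)
    y = x1.astype(float) + rng.normal(0.0, 0.5, 20000)

    # First-order sensitivity index of X1: Var(E[Y | X1]) / Var(Y),
    # estimated by averaging Y within each level of the discrete input.
    levels = (0, 1, 2)
    cond_means = np.array([y[x1 == lv].mean() for lv in levels])
    probs = np.array([(x1 == lv).mean() for lv in levels])
    var_cond = np.sum(probs * (cond_means - y.mean()) ** 2)
    s1 = var_cond / y.var()
    print(f"estimated S1 = {s1:.3f}")
    ```

    For this toy model the true value is Var(X1)/(Var(X1) + 0.25) = (2/3)/(11/12) ≈ 0.727, so the grouped estimate should land close to that.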

  1. Convergence in energy consumption per capita across the US states, 1970–2013: An exploration through selected parametric and non-parametric methods

    International Nuclear Information System (INIS)

    Mohammadi, Hassan; Ram, Rati

    2017-01-01

    Noting the paucity of studies of convergence in energy consumption across the US states, and the usefulness of a study that shares the spirit of the enormous research on convergence in energy-related variables in cross-country contexts, this paper explores convergence in per-capita energy consumption across the US states over the 44-year period 1970–2013. Several well-known parametric and non-parametric approaches are explored partly to shed light on the substantive question and partly to provide a comparative methodological perspective on these approaches. Several statements summarize the outcome of our explorations. First, the widely-used Barro-type regressions do not indicate beta-convergence during the entire period or any of several sub-periods. Second, lack of sigma-convergence is also noted in terms of standard deviation of logarithms and coefficient of variation which do not show a decline between 1970 and 2013, but show slight upward trends. Third, kernel density function plots indicate some flattening of the distribution which is consistent with the results from sigma-convergence scenario. Fourth, intra-distribution mobility (“gamma convergence”) in terms of an index of rank concordance suggests a slow decline in the index. Fifth, the general impression from several types of panel and time-series unit-root tests is that of non-stationarity of the series and thus the lack of stochastic convergence during the period. Sixth, therefore, the overall impression seems to be that of the lack of convergence across states in per-capita energy consumption. The present interstate inequality in per-capita energy consumption may, therefore, reflect variations in structural factors and might not be expected to diminish.

  2. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

    Full Text Available Abstract Background To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing the data from a study conducted by the German Research Foundation aiming to set a threshold limit value for exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results In the paper we demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant risk assumption when the true dose-response relationship is weak. Conclusion The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that in the threshold value estimation context there is no gold standard method, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
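
    The isotonic step-function fit described above can be computed with the pool-adjacent-violators algorithm (PAVA). The sketch below fits a non-decreasing step function to hypothetical dose-response data (the values are illustrative, not from the German Research Foundation study) and reads off the candidate threshold locations as the doses where the fitted function jumps:

    ```python
    def pava(y, w):
        """Pool-adjacent-violators: weighted, non-decreasing isotonic fit."""
        blocks = []  # each block: [fitted value, total weight, number of points]
        for yi, wi in zip(y, w):
            blocks.append([float(yi), float(wi), 1])
            # merge adjacent blocks while the monotonicity constraint is violated
            while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
                v2, w2, c2 = blocks.pop()
                v1, w1, c1 = blocks.pop()
                blocks.append([(v1 * w1 + v2 * w2) / (w1 + w2), w1 + w2, c1 + c2])
        fit = []
        for v, _, c in blocks:
            fit.extend([v] * c)
        return fit

    # Hypothetical dose groups: exposure level vs. observed risk of chronic
    # bronchitis, 50 workers per group.
    doses = [0, 1, 2, 3, 4, 5, 6]
    risk = [0.10, 0.08, 0.11, 0.09, 0.18, 0.24, 0.30]
    fit = pava(risk, [50] * len(risk))

    # Candidate threshold locations: doses where the fitted step function jumps.
    jumps = [doses[i] for i in range(1, len(fit)) if fit[i] > fit[i - 1]]
    print("isotonic fit:", [round(v, 3) for v in fit])
    print("candidate thresholds:", jumps)
    ```

    The paper's two selection algorithms (reduced isotonic regression, closed family of hypotheses) then pick the threshold from this candidate set; here the large jump at dose 4 would be the natural elbow.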

  3. Examples in parametric inference with R

    CERN Document Server

    Dixit, Ulhas Jayram

    2016-01-01

    This book discusses examples in parametric inference with R. Combining basic theory with modern approaches, it presents the latest developments and trends in statistical inference for students who do not have an advanced mathematical and statistical background. The topics discussed in the book are fundamental and common to many fields of statistical inference and thus serve as a point of departure for in-depth study. The book is divided into eight chapters: Chapter 1 provides an overview of topics on sufficiency and completeness, while Chapter 2 briefly discusses unbiased estimation. Chapter 3 focuses on the study of moments and maximum likelihood estimators, and Chapter 4 presents bounds for the variance. In Chapter 5, topics on consistent estimator are discussed. Chapter 6 discusses Bayes, while Chapter 7 studies some more powerful tests. Lastly, Chapter 8 examines unbiased and other tests. Senior undergraduate and graduate students in statistics and mathematics, and those who have taken an introductory cou...

  4. Robustness of S1 statistic with Hodges-Lehmann for skewed distributions

    Science.gov (United States)

    Ahad, Nor Aishah; Yahaya, Sharipah Soaad Syed; Yin, Lee Ping

    2016-10-01

    Analysis of variance (ANOVA) is a commonly used parametric method to test differences in means across more than two groups when the populations are normally distributed. ANOVA is highly inefficient under non-normal and heteroscedastic settings. When the assumptions are violated, researchers look for alternatives such as the non-parametric Kruskal-Wallis test or robust methods. This study focused on a flexible method, the S1 statistic, for comparing groups using the median as the location estimator. The S1 statistic was modified by substituting the median with the Hodges-Lehmann estimator, and the default scale estimator with the variance of Hodges-Lehmann and MADn, to produce two different test statistics for comparing groups. The bootstrap method was used for testing the hypotheses, since the sampling distributions of these modified S1 statistics are unknown. The performance of the proposed statistics in terms of Type I error was measured and compared against the original S1 statistic, ANOVA and Kruskal-Wallis. The proposed procedures show improvement over the original statistic, especially under extremely skewed distributions.
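
    The Type I error evaluation described above can be sketched by simulation: draw all groups from the same skewed population, so every rejection is a false alarm, and check that the empirical rejection rate stays near the nominal level. The example below does this for the Kruskal-Wallis test (the S1 variants are not implemented here); the lognormal population and sample sizes are illustrative choices.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    alpha, reps, rejections = 0.05, 1000, 0

    # Empirical Type I error of the Kruskal-Wallis test: three groups are
    # drawn from the SAME skewed population, so any rejection is an error.
    for _ in range(reps):
        groups = [rng.lognormal(0.0, 1.0, 20) for _ in range(3)]
        if stats.kruskal(*groups).pvalue < alpha:
            rejections += 1

    type1 = rejections / reps
    print(f"empirical Type I error: {type1:.3f} (nominal {alpha})")
    ```

    A well-behaved test keeps this rate close to 0.05; repeating the same loop with ANOVA (`stats.f_oneway`) under heteroscedastic variants is how the inflation the abstract mentions would show up.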

  5. Latest astronomical constraints on some non-linear parametric dark energy models

    Science.gov (United States)

    Yang, Weiqiang; Pan, Supriya; Paliathanasis, Andronikos

    2018-04-01

    We consider non-linear redshift-dependent equation of state parameters as dark energy models in a spatially flat Friedmann-Lemaître-Robertson-Walker universe. To depict the expansion history of the universe in such cosmological scenarios, we take into account the large-scale behaviour of such parametric models and fit them using a set of the latest observational data of distinct origins, including cosmic microwave background radiation, Type Ia Supernovae, baryon acoustic oscillations, redshift space distortion, weak gravitational lensing, Hubble parameter measurements from cosmic chronometers, and finally the local Hubble constant from the Hubble Space Telescope. The fitting procedure uses the publicly available code Cosmological Monte Carlo (COSMOMC) to extract the cosmological information from these parametric dark energy models. From our analysis, it follows that those models could describe the late time accelerating phase of the universe, while they are distinguished from the Λ-cosmology.

  6. Testing for co-integration in vector autoregressions with non-stationary volatility

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Rahbek, Anders Christian; Taylor, Robert M.

    2010-01-01

    ... cases. We show that the conventional rank statistics computed as in (Johansen, 1988) and (Johansen, 1991) are potentially unreliable. In particular, their large sample distributions depend on the integrated covariation of the underlying multivariate volatility process, which impacts both the size and power of the associated co-integration tests, as we demonstrate numerically. A solution to the identified inference problem is provided by considering wild bootstrap-based implementations of the rank tests. These do not require the practitioner to specify a parametric model for volatility, or to assume...

  7. Applied statistical designs for the researcher

    CERN Document Server

    Paulson, Daryl S

    2003-01-01

    Contents: Research and Statistics; Basic Review of Parametric Statistics; Exploratory Data Analysis; Two Sample Tests; Completely Randomized One-Factor Analysis of Variance; One and Two Restrictions on Randomization; Completely Randomized Two-Factor Factorial Designs; Two-Factor Factorial Completely Randomized Blocked Designs; Useful Small Scale Pilot Designs; Nested Statistical Designs; Linear Regression; Nonparametric Statistics; Introduction to Research Synthesis and "Meta-Analysis" and Conclusory Remarks; References; Index.

  8. Towards a test of non-locality without 'supplementary assumptions'

    International Nuclear Information System (INIS)

    Barbieri, M.; De Martini, F.; Di Nepi, G.; Mataloni, P.

    2005-01-01

    We have experimentally tested the non-local properties of the two-photon states generated by a high brilliance source of entanglement which virtually allows the direct measurement of the full set of photon pairs created by the basic QED process implied by the parametric quantum scattering. Standard Bell measurements and Bell's inequality violation test have been realized over the entire cone of emission of the degenerate pairs. By the same source we have verified Hardy's ladder theory up to the 20th step and the contradiction between the standard quantum theory and the local realism has been tested for 41% of entangled pairs

  9. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  10. Statistical parametric mapping for effects of verapamil on olfactory connections of rat brain in vivo using manganese-enhanced MR imaging

    International Nuclear Information System (INIS)

    Soma, Tsutomu; Kurakawa, Masami; Koto, Daichi

    2011-01-01

    We investigated the effect of verapamil on the transport of manganese in the olfactory connections of rat brains in vivo using statistical parametric mapping and manganese-enhanced magnetic resonance (MR) imaging. We divided twelve 7-week-old male Sprague-Dawley rats into two groups of six and injected 10 μL of saline into the right nasal cavities of the first group and 10 μL of verapamil (2.5 mg/mL) into the other group. Twenty minutes after the initial injection, we injected 10 μL of MnCl2 (1 mol/L) into the right nasal cavities of both groups. We obtained serial T1-weighted MR images before administering the verapamil or saline and at 0.5, 1, 24, 48, and 72 hours and 7 days after administering the MnCl2, spatially normalized the MR images on the rat brain atlas, and analyzed the data using voxel-based statistical comparison. Statistical parametric maps demonstrated the transport of manganese. Manganese ions created significant enhancement (t-score=36.6) 24 hours after MnCl2 administration in the saline group but not at the same time point in the verapamil group. The extent of significantly enhanced regions peaked at 72 hours in both groups and both sides of the brain. The peak extent in the right side of the brain was 70.2 mm³ in the saline group and 92.4 mm³ in the verapamil group; in the left side, it was 64.0 mm³ in the saline group and 53.2 mm³ in the verapamil group. We applied statistical parametric mapping using manganese-enhanced MR imaging to demonstrate in vivo the transport of manganese in the olfactory connections of rat brains with and without verapamil and found that verapamil did affect this transport. (author)

  11. Statistical dynamics of parametrically perturbed sine-square map

    Indian Academy of Sciences (India)

    Abstract. We discuss the emergence and destruction of complex, critical and completely chaotic attractors in a nonlinear system when subjected to a small parametric perturbation in trigonometric, hyperbolic or noise function forms. For this purpose, a hybrid optical bistable system, which is a nonlinear physical system, has ...

  12. Measuring energy performance with sectoral heterogeneity: A non-parametric frontier approach

    International Nuclear Information System (INIS)

    Wang, H.; Ang, B.W.; Wang, Q.W.; Zhou, P.

    2017-01-01

    Evaluating economy-wide energy performance is an integral part of assessing the effectiveness of a country's energy efficiency policy. The non-parametric frontier approach has been widely used by researchers for such a purpose. This paper proposes an extended non-parametric frontier approach to studying economy-wide energy efficiency and productivity performance by accounting for sectoral heterogeneity. Relevant techniques in index number theory are incorporated to quantify the driving forces behind changes in the economy-wide energy productivity index. The proposed approach facilitates flexible modelling of different sectors' production processes, and helps to examine sectors' impact on the aggregate energy performance. A case study of China's economy-wide energy efficiency and productivity performance in its 11th five-year plan period (2006–2010) is presented. It is found that sectoral heterogeneities in terms of energy performance are significant in China. Meanwhile, China's economy-wide energy productivity increased slightly during the study period, mainly driven by the technical efficiency improvement. A number of other findings have also been reported. - Highlights: • We model economy-wide energy performance by considering sectoral heterogeneity. • The proposed approach can identify sectors' impact on the aggregate energy performance. • Obvious sectoral heterogeneities are identified in evaluating China's energy performance.
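As a rough illustration of the non-parametric frontier idea (not the paper's extended model), a minimal input-oriented CCR efficiency score with one energy input and one output can be computed as a linear program; the sector data below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical sectors: one energy input and one output (value added)
energy = np.array([10.0, 20.0, 30.0, 50.0])
output = np.array([10.0, 30.0, 30.0, 40.0])
n = len(energy)

def ccr_efficiency(o):
    # variables: [theta, lam_1..lam_n]; minimise theta subject to
    #   sum_j lam_j * energy_j <= theta * energy_o   (input constraint)
    #   sum_j lam_j * output_j >= output_o           (output constraint)
    c = np.r_[1.0, np.zeros(n)]
    A_ub = np.vstack([np.r_[-energy[o], energy],
                      np.r_[0.0, -output]])
    b_ub = np.array([0.0, -output[o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

eff = [round(ccr_efficiency(o), 3) for o in range(n)]
print(eff)  # the second unit (30 output from 20 energy) defines the frontier
```

Units on the constant-returns frontier score 1.0; the rest score output-to-energy ratio relative to the best observed ratio.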

  13. A non-parametric peak calling algorithm for DamID-Seq.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    Full Text Available Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality check and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.

  14. A non-parametric peak calling algorithm for DamID-Seq.

    Science.gov (United States)

    Li, Renhua; Hempel, Leonie U; Jiang, Tingbo

    2015-01-01

    Protein-DNA interactions play a significant role in gene regulation and expression. In order to identify transcription factor binding sites (TFBS) of doublesex (DSX), an important transcription factor in sex determination, we applied the DNA adenine methylation identification (DamID) technology to the fat body tissue of Drosophila, followed by deep sequencing (DamID-Seq). One feature of DamID-Seq data is that induced adenine methylation signals are not assured to be symmetrically distributed at TFBS, which renders the existing peak calling algorithms for ChIP-Seq, including SPP and MACS, inappropriate for DamID-Seq data. This challenged us to develop a new algorithm for peak calling. A challenge in peak calling based on sequence data is estimating the averaged behavior of background signals. We applied a bootstrap resampling method to short sequence reads in the control (Dam only). After data quality check and mapping reads to a reference genome, the peak calling procedure comprises the following steps: 1) reads resampling; 2) reads scaling (normalization) and computing signal-to-noise fold changes; 3) filtering; 4) calling peaks based on a statistically significant threshold. This is a non-parametric method for peak calling (NPPC). We also used irreproducible discovery rate (IDR) analysis, as well as ChIP-Seq data, to compare the peaks called by the NPPC. We identified approximately 6,000 peaks for DSX, which point to 1,225 genes related to the fat body tissue difference between female and male Drosophila. Statistical evidence from IDR analysis indicated that these peaks are reproducible across biological replicates. In addition, these peaks are comparable to those identified by use of ChIP-Seq on S2 cells, in terms of peak number, location, and peak width.
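The four steps listed in the abstract can be sketched on synthetic counts; the bin counts, spike location, and quantile threshold below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-bin read counts: treatment (Dam-fusion) and control (Dam only)
treatment = rng.poisson(5, 1000).astype(float)
treatment[100:110] += 30          # a synthetic "peak" region
control = rng.poisson(5, 1000).astype(float)

# 1) resample control reads (bootstrap) to estimate the background level
boot_means = np.array([rng.choice(control, size=control.size, replace=True).mean()
                       for _ in range(200)])
background = boot_means.mean()

# 2) scale and compute signal-to-noise fold changes (pseudocount avoids /0)
fold = (treatment + 1.0) / (background + 1.0)

# 3)-4) filter and call bins exceeding a significance-style threshold
threshold = np.quantile(fold, 0.99)
peaks = np.flatnonzero(fold > threshold)
print(len(peaks), peaks.min(), peaks.max())
```

On this toy input the ten spiked bins stand far above the bootstrap background, so they are the only bins called.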

  15. Conjugate pair of non-extensive statistics in quantum scattering

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.D.

    1999-01-01

    In this paper, by defining the Fourier transform of the scattering amplitudes as a bounded linear mapping from the space L^{2p} to the space L^{2q}, where 1/(2p) + 1/(2q) = 1, we introduce a new concept in quantum physics in terms of the Tsallis-like entropies S_J(p) and S_θ(q), namely, that of a conjugate pair of non-extensive statistics. This new concept is experimentally illustrated by using 88 + 49 sets of pion-nucleon and pion-nucleus phase shifts. From the experimental determination of the (p, q) non-extensivity indices, by choosing the pairs for which the [χ²_L(p) + χ²_θ(q_min)] optimal test function is minimum, we get the conjugate pair of [(p_min, J), (q_min, θ)] non-extensive statistics with 0.50 ≤ p_min ≤ 0.60. This new non-extensive statistical effect is experimentally evidenced with a high degree of accuracy (CL ≥ 99%). Moreover, it is worth mentioning that the modification of the statistics has been more efficient than the modification of the PMD-SQS-optimum principle in obtaining the best overall fit to the experimental data. (authors)
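A minimal sketch of the quantities involved, assuming the standard Tsallis entropy definition and solving the stated conjugacy relation 1/(2p) + 1/(2q) = 1 for q; the probability vector and the index p = 0.55 are illustrative.

```python
import numpy as np

def tsallis_entropy(p, q):
    # S_q = (1 - sum_i p_i^q) / (q - 1); recovers Shannon entropy as q -> 1
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def conjugate_index(p):
    # solve 1/(2p) + 1/(2q) = 1 for q
    return p / (2.0 * p - 1.0)

probs = [0.5, 0.3, 0.2]
p_idx = 0.55                      # within the reported 0.50 <= p_min <= 0.60
q_idx = conjugate_index(p_idx)    # approximately 5.5
print(round(tsallis_entropy(probs, p_idx), 3), round(q_idx, 3))
```

Note how sharply the conjugate index grows as p approaches 0.5, which is why indices just above 0.5 pair with large q.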

  16. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for assessment of agricultural efficiency and employ them to Lithuanian family farms. In particular, we focus on three particular objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend … to the Multi-Directional Efficiency Analysis approach when the proposed models were employed to analyse empirical data of Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input … relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques …

  17. Debt and growth: A non-parametric approach

    Science.gov (United States)

    Brida, Juan Gabriel; Gómez, David Matesanz; Seijas, Maria Nela

    2017-11-01

    In this study, we explore the dynamic relationship between public debt and economic growth by using a non-parametric approach based on data symbolization and clustering methods. The study uses annual data of general government consolidated gross debt-to-GDP ratio and gross domestic product for sixteen countries between 1977 and 2015. Using symbolic sequences, we introduce a notion of distance between the dynamical paths of different countries. Then, a Minimal Spanning Tree and a Hierarchical Tree are constructed from the time series to help detect the existence of groups of countries sharing similar economic performance. The main finding of the study appears for the period 2008-2016, when several countries surpassed the 90% debt-to-GDP threshold. During this period, three groups (clubs) of countries are obtained: high, mid and low indebted countries, suggesting that the employed debt-to-GDP threshold drives economic dynamics for the selected countries.
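The symbolization-and-clustering pipeline described above can be sketched as follows; the country paths, the 30%/90% symbol cut-points, and the cluster count are illustrative assumptions, and a hierarchical tree alone stands in for the paper's combined MST/hierarchical-tree analysis.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# Hypothetical debt-to-GDP paths (rows = countries, columns = years)
paths = np.vstack([
    100 + rng.normal(0, 1, 10), 105 + rng.normal(0, 1, 10),  # high debt
    60 + rng.normal(0, 1, 10),  55 + rng.normal(0, 1, 10),   # mid debt
    20 + rng.normal(0, 1, 10),  25 + rng.normal(0, 1, 10),   # low debt
])

# Symbolize each year: 0 = below 30%, 1 = 30-90%, 2 = above the 90% threshold
symbols = np.digitize(paths, [30.0, 90.0])

# Distance between two countries = fraction of years with differing symbols
dist = pdist(symbols, metric="hamming")

# Hierarchical tree over the symbolic dynamics; cut into three "clubs"
clubs = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
print(clubs)
```

Countries whose symbolic sequences agree year by year end up at zero distance, so the cut recovers the three debt clubs exactly on this toy data.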

  18. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    International Nuclear Information System (INIS)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M.; Som, Seu S.; Haindl, Walter; Bye, Ann M.E.

    2004-01-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the potential

  19. Application of statistical parametric mapping to SPET in the assessment of intractable childhood epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Bruggemann, Jason M.; Lawson, John A.; Cunningham, Anne M. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Som, Seu S.; Haindl, Walter [Department of Nuclear Medicine, Prince of Wales Hospital, Randwick, New South Wales (Australia); Bye, Ann M.E. [Department of Neurology, Sydney Children's Hospital and School of Women's and Children's Health, Faculty of Medicine, University of New South Wales, Randwick, New South Wales (Australia); Department of Neurology, Sydney Children's Hospital, High Street, 2031, Randwick, NSW (Australia)

    2004-03-01

    Statistical parametric mapping (SPM) quantification and analysis has been successfully applied to functional imaging studies of partial epilepsy syndromes in adults. The present study evaluated whether localisation of the epileptogenic zone (determined by SPM) improves upon visually examined single-photon emission tomography (SPET) imaging in presurgical assessment of children with temporal lobe epilepsy (TLE) and frontal lobe epilepsy (FLE). The patient sample consisted of 24 children (15 males) aged 2.1-17.8 years (9.8±4.3 years; mean±SD) with intractable TLE or FLE. SPET imaging was acquired routinely in presurgical evaluation. All patient images were transformed into the standard stereotactic space of the adult SPM SPET template prior to SPM statistical analysis. Individual patient images were contrasted with an adult control group of 22 healthy adult females. Resultant statistical parametric maps were rendered over the SPM canonical magnetic resonance imaging (MRI). Two corresponding sets of ictal and interictal SPM and SPET images were then generated for each patient. Experienced clinicians independently reviewed the image sets, blinded to clinical details. Concordance of the reports between SPM and SPET images, syndrome classification and MRI abnormality was studied. A fair level of inter-rater reliability (kappa=0.73) was evident for SPM localisation. SPM was concordant with SPET in 71% of all patients, the majority of the discordance being from the FLE group. SPM and SPET localisation were concordant with epilepsy syndrome in 80% of the TLE cases. Concordant localisation to syndrome was worse for both SPM (33%) and SPET (44%) in the FLE group. Data from a small sample of patients with varied focal structural pathologies suggested that SPM performed poorly relative to SPET in these cases. Concordance of SPM and SPET with syndrome was lower in patients younger than 6 years than in those aged 6 years and above. SPM is effective in localising the
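The inter-rater reliability reported in these two records (kappa=0.73) is Cohen's kappa, agreement between two raters corrected for chance; a minimal sketch of the computation on hypothetical rater labels:

```python
import numpy as np

def cohens_kappa(a, b):
    # observed agreement beyond the agreement expected by chance
    a, b = np.asarray(a), np.asarray(b)
    labels = np.unique(np.r_[a, b])
    po = (a == b).mean()                                    # observed agreement
    pe = sum((a == l).mean() * (b == l).mean() for l in labels)  # chance agreement
    return (po - pe) / (1.0 - pe)

# Hypothetical binary localisation calls by two independent reviewers
rater1 = [1, 1, 0, 1, 0, 1, 1, 0]
rater2 = [1, 1, 0, 0, 0, 1, 1, 1]
print(round(cohens_kappa(rater1, rater2), 2))
```

Here raw agreement is 0.75, but correcting for chance agreement brings kappa down to roughly 0.47, which is why kappa is preferred over percent agreement for rating studies like this one.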

  20. Non-parametric transformation for data correlation and integration: From theory to practice

    Energy Technology Data Exchange (ETDEWEB)

    Datta-Gupta, A.; Xue, Guoping; Lee, Sang Heon [Texas A&M Univ., College Station, TX (United States)]

    1997-08-01

    The purpose of this paper is two-fold. First, we introduce the use of non-parametric transformations for correlating petrophysical data during reservoir characterization. Such transformations are completely data driven and do not require an a priori functional relationship between response and predictor variables, which is the case with traditional multiple regression. The transformations are very general, computationally efficient and can easily handle mixed data types, for example continuous variables such as porosity and permeability, and categorical variables such as rock type and lithofacies. The power of the non-parametric transformation techniques for data correlation has been illustrated through synthetic and field examples. Second, we utilize these transformations to propose a two-stage approach for data integration during heterogeneity characterization. The principal advantages of our approach over traditional cokriging or cosimulation methods are: (1) it does not require a linear relationship between primary and secondary data, (2) it exploits the secondary information to its fullest potential by maximizing the correlation between the primary and secondary data, (3) it can be easily applied to cases where several types of secondary or soft data are involved, and (4) it significantly reduces variance function calculations and thus greatly facilitates non-Gaussian cosimulation. We demonstrate the data integration procedure using synthetic and field examples. The field example involves estimation of pore-footage distribution using well data and multiple seismic attributes.
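A simple stand-in for such data-driven transformations is to replace each variable by its normal scores before correlating; the porosity-permeability relationship below is synthetic, and this rank-based transform is an illustrative substitute for the authors' optimal transformations, not their algorithm.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical: permeability depends on porosity through a strong
# nonlinearity, so the raw linear correlation understates the relationship.
porosity = rng.uniform(0.05, 0.35, 300)
perm = np.exp(20.0 * porosity) * rng.lognormal(0.0, 0.3, 300)

r_raw = stats.pearsonr(porosity, perm)[0]

# Data-driven transform: replace each variable by its normal scores
def normal_scores(v):
    return stats.norm.ppf((stats.rankdata(v) - 0.5) / len(v))

r_trans = stats.pearsonr(normal_scores(porosity), normal_scores(perm))[0]
print(round(r_raw, 2), round(r_trans, 2))
```

The transformed correlation is substantially higher because the transform linearizes the monotone but exponential porosity-permeability link.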

  1. Location tests for biomarker studies: a comparison using simulations for the two-sample case.

    Science.gov (United States)

    Scheinhardt, M O; Ziegler, A

    2013-01-01

    Gene, protein, or metabolite expression levels are often non-normally distributed, heavy tailed and contain outliers. Standard statistical approaches may fail as location tests in this situation. In three Monte-Carlo simulation studies, we aimed at comparing the type I error levels and empirical power of standard location tests and three adaptive tests [O'Gorman, Can J Stat 1997; 25: 269-279; Keselman et al., Brit J Math Stat Psychol 2007; 60: 267-293; Szymczak et al., Stat Med 2013; 32: 524-537] for a wide range of distributions. We simulated two-sample scenarios using the g-and-k-distribution family to systematically vary tail length and skewness with identical and varying variability between groups. All tests kept the type I error level when groups did not vary in their variability. The standard non-parametric U-test performed well in all simulated scenarios. It was outperformed by the two non-parametric adaptive methods in case of heavy tails or large skewness. Most tests did not keep the type I error level for skewed data in the case of heterogeneous variances. The standard U-test was a powerful and robust location test for most of the simulated scenarios except for very heavy-tailed or heavily skewed data, and it is thus to be recommended except in these cases. The non-parametric adaptive tests were powerful for both normal and non-normal distributions under sample variance homogeneity. But when sample variances differed, they did not keep the type I error level. The parametric adaptive test lacks power for skewed and heavy-tailed distributions.
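The Monte Carlo power comparison described above can be sketched for one scenario; the lognormal shift alternative, sample size, and replication count below are illustrative assumptions rather than the paper's g-and-k designs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, reps, alpha, shift = 25, 1000, 0.05, 0.6

# Monte Carlo power under a location shift of a heavy-tailed, skewed
# (lognormal) distribution: the U-test should beat the t-test here
power_t = power_u = 0
for _ in range(reps):
    x = rng.lognormal(0.0, 1.0, n)
    y = rng.lognormal(0.0, 1.0, n) + shift
    power_t += stats.ttest_ind(x, y).pvalue < alpha
    power_u += stats.mannwhitneyu(x, y).pvalue < alpha

print(power_t / reps, power_u / reps)
```

The t-test's power suffers because the lognormal's heavy right tail inflates the pooled variance, while the rank-based U-test is unaffected by the extreme values.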

  2. An empirical analysis of one, two, and three parametric logistic ...

    African Journals Online (AJOL)

    The purpose of this study was to determine the three parametric logistic IRT methods in dichotomous and ordinal test items due to differential item functioning using statistical DIF detection methods of SIBTEST, GMH, and LDFA. The study adopted instrumentation research design. The sample consisted of an intact class of ...

  3. MEASURING DARK MATTER PROFILES NON-PARAMETRICALLY IN DWARF SPHEROIDALS: AN APPLICATION TO DRACO

    International Nuclear Information System (INIS)

    Jardel, John R.; Gebhardt, Karl; Fabricius, Maximilian H.; Williams, Michael J.; Drory, Niv

    2013-01-01

    We introduce a novel implementation of orbit-based (or Schwarzschild) modeling that allows dark matter density profiles to be calculated non-parametrically in nearby galaxies. Our models require no assumptions to be made about velocity anisotropy or the dark matter profile. The technique can be applied to any dispersion-supported stellar system, and we demonstrate its use by studying the Local Group dwarf spheroidal galaxy (dSph) Draco. We use existing kinematic data at larger radii and also present 12 new radial velocities within the central 13 pc obtained with the VIRUS-W integral field spectrograph on the 2.7 m telescope at McDonald Observatory. Our non-parametric Schwarzschild models find strong evidence that the dark matter profile in Draco is cuspy for 20 ≤ r ≤ 700 pc. The profile for r ≥ 20 pc is well fit by a power law with slope α = –1.0 ± 0.2, consistent with predictions from cold dark matter simulations. Our models confirm that, despite its low baryon content relative to other dSphs, Draco lives in a massive halo.

  4. Statistical significance of trends in monthly heavy precipitation over the US

    KAUST Repository

    Mahajan, Salil; North, Gerald R.; Saravanan, R.; Genton, Marc G.

    2011-01-01

    -parametric and parametric bootstrapping techniques. The results from the two Monte Carlo approaches are found to be similar to each other, and also to the traditional non-parametric Kendall's τ test, implying the robustness of the approach. Two different observational data
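A minimal sketch of a non-parametric trend test of this kind with a Monte Carlo reference distribution; the monthly series is synthetic, and a permutation scheme stands in for the bootstrap variants discussed in the record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical monthly heavy-precipitation index with an upward trend
t = np.arange(120)
x = 0.02 * t + rng.normal(0.0, 1.0, 120)

# Traditional non-parametric trend test: Kendall's tau against time
tau, p_tau = stats.kendalltau(t, x)

# Monte Carlo reference: permute the series (trend-free null) and compare
perm = np.array([stats.kendalltau(t, rng.permutation(x))[0] for _ in range(500)])
p_mc = (np.abs(perm) >= abs(tau)).mean()
print(p_tau < 0.05, p_mc < 0.05)
```

Agreement between the analytic tau p-value and the Monte Carlo p-value is the kind of robustness check the record describes.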

  5. SPECT image analysis using statistical parametric mapping in patients with temporal lobe epilepsy associated with hippocampal sclerosis

    International Nuclear Information System (INIS)

    Shiraki, Junko

    2004-01-01

    The author examined interictal 123I-IMP SPECT images using statistical parametric mapping (SPM) in 19 temporal lobe epilepsy patients in whom MRI revealed hippocampal sclerosis. Decreased regional cerebral blood flow (rCBF) was shown in the medial temporal lobe for eight patients, in the lateral temporal lobe for six patients, and in both the medial and lateral temporal lobe for five patients. These patients were classified into two types: a medial type, with decreased rCBF only in the medial temporal area, and a lateral type, with decreased rCBF extending to the other temporal area. Regarding correlations of rCBF with clinical parameters, age at seizure onset in the lateral type was significantly older (p=0.0098, t-test) than in the medial type. SPM analysis of interictal SPECT in temporal lobe epilepsy clarified the location of decreased rCBF and revealed correlations with clinical characteristics. In addition, SPM analysis of SPECT was useful for understanding the pathophysiology of the epilepsy. (author)

  6. Technical issues relating to the statistical parametric mapping of brain SPECT studies

    International Nuclear Information System (INIS)

    Hatton, R.L.; Cordato, N.; Hutton, B.F.; Lau, Y.H.; Evans, S.G.

    2000-01-01

    Full text: Statistical Parametric Mapping (SPM) is a software tool designed for the statistical analysis of functional neuroimages, specifically positron emission tomography and functional magnetic resonance imaging, and more recently SPECT. This review examines some problems associated with the analysis of SPECT. A comparison of a patient group with normal studies revealed factors that could influence results, some that commonly occur, others that require further exploration. To optimise the differences between two groups of subjects, both spatial variability and differences in global activity must be minimised. The choice and effectiveness of the co-registration method and the approach to normalisation of activity concentration can affect the optimisation. A small number of subject scans were identified as possessing truncated data, resulting in edge effects that could adversely influence the analysis. Other problems included unusual areas of significance, possibly related to reconstruction methods and the geometry associated with non-parallel collimators. Areas of extracerebral significance are a point of concern and may result from scatter effects or mis-registration. Difficulties in patient positioning, due to postural limitations, can lead to resolution differences. SPM has been used to assess areas of statistical significance arising from these technical factors, as opposed to areas of true clinical significance, when comparing subject groups. This contributes to a better understanding of the effects of technical factors so that these may be eliminated, minimised, or incorporated in the study design. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  7. A Rank Test on Equality of Population Medians

    OpenAIRE

    Pooi Ah Hin

    2012-01-01

    The Kruskal-Wallis test is a non-parametric test for the equality of K population medians. The test statistic involved is a measure of the overall closeness of the K average ranks in the individual samples to the average rank in the combined sample. The resulting acceptance region of the test however may not be the smallest region with the required acceptance probability under the null hypothesis. Presently an alternative acceptance region is constructed such that it has the smallest size, ap...
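For reference, the Kruskal-Wallis statistic discussed above is a one-liner in common statistics libraries; the three samples below are illustrative, with the third shifted upward in location.

```python
from scipy import stats

# Three samples; only the third differs in location
a = [1.1, 2.3, 1.9, 2.8, 2.2]
b = [1.8, 2.0, 2.5, 1.4, 2.6]
c = [4.9, 5.7, 5.2, 6.1, 4.4]

# H measures how far the groups' average ranks sit from the overall
# average rank in the combined sample
h, p = stats.kruskal(a, b, c)
print(round(h, 2), p < 0.05)
```

Because c occupies the top five ranks of the pooled sample, its average rank is far from the grand average and the null of equal medians is rejected.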

  8. Markovian Dynamics of Josephson Parametric Amplification

    Directory of Open Access Journals (Sweden)

    W. Kaiser

    2017-09-01

    Full Text Available In this work, we derive the dynamics of the lossy DC pumped non-degenerate Josephson parametric amplifier (DCPJPA). The main element in a DCPJPA is the superconducting Josephson junction. The DC bias generates the AC Josephson current varying the nonlinear inductance of the junction. In this way, the Josephson junction acts as the pump oscillator as well as the time-varying reactance of the parametric amplifier. In quantum-limited amplification, losses and noise have an increased impact on the characteristics of an amplifier. We outline the classical model of the lossy DCPJPA and derive the available noise power spectral densities. A classical treatment is not capable of including properties like spontaneous emission, which is mandatory in the case of amplification at the quantum limit. Thus, we derive a quantum mechanical model of the lossy DCPJPA. Thermal losses are modeled by the quantum Langevin approach, by coupling the quantized system to a photon heat bath in thermodynamic equilibrium. The mode occupation in the bath follows Bose-Einstein statistics. Based on the second quantization formalism, we derive the Heisenberg equations of motion of both resonator modes. We assume the dynamics of the system to follow the Markovian approximation, i.e. the system only depends on its current state and is memory-free. We explicitly compute the time evolution of the contributions to the signal mode energy and give numeric examples based on different damping and coupling constants. Our analytic results show that this model is capable of including thermal noise in the description of the DC pumped non-degenerate Josephson parametric amplifier.

  9. Markovian Dynamics of Josephson Parametric Amplification

    Science.gov (United States)

    Kaiser, Waldemar; Haider, Michael; Russer, Johannes A.; Russer, Peter; Jirauschek, Christian

    2017-09-01

    In this work, we derive the dynamics of the lossy DC pumped non-degenerate Josephson parametric amplifier (DCPJPA). The main element in a DCPJPA is the superconducting Josephson junction. The DC bias generates the AC Josephson current varying the nonlinear inductance of the junction. In this way, the Josephson junction acts as the pump oscillator as well as the time-varying reactance of the parametric amplifier. In quantum-limited amplification, losses and noise have an increased impact on the characteristics of an amplifier. We outline the classical model of the lossy DCPJPA and derive the available noise power spectral densities. A classical treatment is not capable of including properties like spontaneous emission, which is mandatory in the case of amplification at the quantum limit. Thus, we derive a quantum mechanical model of the lossy DCPJPA. Thermal losses are modeled by the quantum Langevin approach, by coupling the quantized system to a photon heat bath in thermodynamic equilibrium. The mode occupation in the bath follows Bose-Einstein statistics. Based on the second quantization formalism, we derive the Heisenberg equations of motion of both resonator modes. We assume the dynamics of the system to follow the Markovian approximation, i.e. the system only depends on its current state and is memory-free. We explicitly compute the time evolution of the contributions to the signal mode energy and give numeric examples based on different damping and coupling constants. Our analytic results show that this model is capable of including thermal noise in the description of the DC pumped non-degenerate Josephson parametric amplifier.

  10. Statistical studies of powerful extragalactic radio sources

    Energy Technology Data Exchange (ETDEWEB)

    Macklin, J T

    1981-01-01

    This dissertation is mainly about the use of efficient statistical tests to study the properties of powerful extragalactic radio sources. Most of the analysis is based on subsets of a sample of 166 bright (3CR) sources selected at 178 MHz. The first chapter is introductory and it is followed by three on the misalignment and symmetry of double radio sources. The properties of nuclear components in extragalactic sources are discussed in the next chapter, using statistical tests which make efficient use of upper limits, often the only available information on the flux density from the nuclear component. Multifrequency observations of four 3CR sources are presented in the next chapter. The penultimate chapter is about the analysis of correlations involving more than two variables. The Spearman partial rank correlation coefficient is shown to be the most powerful available test based on non-parametric statistics. It is therefore used to study the dependences of the properties of sources on their size at constant redshift, and the results are interpreted in terms of source evolution. Correlations of source properties with luminosity and redshift are then examined.
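The Spearman partial rank correlation used above can be computed from pairwise Spearman coefficients via the standard partial-correlation formula; the redshift-driven toy data below are illustrative, not the 3CR sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical source properties: redshift z drives both luminosity L and
# size D, inducing a spurious L-D correlation that partialling removes.
z = rng.uniform(0.1, 2.0, 200)
L = z + rng.normal(0.0, 0.3, 200)
D = z + rng.normal(0.0, 0.3, 200)

r_LD = stats.spearmanr(L, D)[0]
r_Lz = stats.spearmanr(L, z)[0]
r_Dz = stats.spearmanr(D, z)[0]

# Spearman partial rank correlation of L and D controlling for z
partial = (r_LD - r_Lz * r_Dz) / np.sqrt((1 - r_Lz**2) * (1 - r_Dz**2))
print(round(r_LD, 2), round(partial, 2))
```

The raw L-D rank correlation is strong, but controlling for redshift drives it toward zero, which is exactly the confound the dissertation's constant-redshift analysis is designed to avoid.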

  11. Testing statistical hypotheses

    CERN Document Server

    Lehmann, E L

    2005-01-01

    The third edition of Testing Statistical Hypotheses updates and expands upon the classic graduate text, emphasizing optimality theory for hypothesis testing and confidence sets. The principal additions include a rigorous treatment of large sample optimality, together with the requisite tools. In addition, an introduction to the theory of resampling methods such as the bootstrap is developed. The sections on multiple testing and goodness of fit testing are expanded. The text is suitable for Ph.D. students in statistics and includes over 300 new problems out of a total of more than 760. E.L. Lehmann is Professor of Statistics Emeritus at the University of California, Berkeley. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and the recipient of honorary degrees from the University of Leiden, The Netherlands and the University of Chicago. He is the author of Elements of Large-Sample Theory and (with George Casella) he is also the author of Theory of Point Estimat...

  12. Statistics for non-statisticians

    CERN Document Server

    Madsen, Birger Stjernholm

    2016-01-01

    This book was written for those who need to know how to collect, analyze and present data. It is meant to be a first course for practitioners, a book for private study or brush-up on statistics, and supplementary reading for general statistics classes. The book is untraditional, both with respect to the choice of topics and the presentation: Topics were determined by what is most useful for practical statistical work, and the presentation is as non-mathematical as possible. The book contains many examples using statistical functions in spreadsheets. In this second edition, new topics have been included, e.g. within the area of statistical quality control, in order to make the book even more useful for practitioners working in industry.

  13. Integration of association statistics over genomic regions using Bayesian adaptive regression splines

    Directory of Open Access Journals (Sweden)

    Zhang Xiaohua

    2003-11-01

    Full Text Available Abstract In the search for genetic determinants of complex disease, two approaches to association analysis are most often employed, testing single loci or testing a small group of loci jointly via haplotypes for their relationship to disease status. It is still debatable which of these approaches is more favourable, and under what conditions. The former has the advantage of simplicity but suffers severely when alleles at the tested loci are not in linkage disequilibrium (LD) with liability alleles; the latter should capture more of the signal encoded in LD, but is far from simple. The complexity of haplotype analysis could be especially troublesome for association scans over large genomic regions, which, in fact, is becoming the standard design. For these reasons, the authors have been evaluating statistical methods that bridge the gap between single-locus and haplotype-based tests. In this article, they present one such method, which uses non-parametric regression techniques embodied by Bayesian adaptive regression splines (BARS). For a set of markers falling within a common genomic region and a corresponding set of single-locus association statistics, the BARS procedure integrates these results into a single test by examining the class of smooth curves consistent with the data. The non-parametric BARS procedure generally finds no signal when no liability allele exists in the tested region (i.e., it achieves the specified size of the test) and it is sensitive enough to pick up signals when a liability allele is present. The BARS procedure provides a robust and potentially powerful alternative to classical tests of association, diminishes the multiple testing problem inherent in those tests and can be applied to a wide range of data types, including genotype frequencies estimated from pooled samples.

  14. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
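    The key trick for applying K-M to left-censored geoscience data is flipping: subtract every value from a constant larger than the data maximum, so "<DL" observations become right-censored, then run the standard product-limit estimator. The pure-Python sketch below is illustrative only (it is not the authors' S-language software; the flip constant and concentrations are invented):

```python
def kaplan_meier(times, events):
    # Product-limit estimator; events[i] == 1 for an observed value,
    # 0 for a right-censored one. Returns [(t, S(t))] at event times.
    data = sorted(zip(times, events))
    at_risk = len(data)
    out, s, i = [], 1.0, 0
    while i < len(data):
        t, d, n = data[i][0], 0, 0
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            n += 1
            i += 1
        if d:
            s *= 1 - d / at_risk
            out.append((t, s))
        at_risk -= n
    return out

# Flip left-censored concentrations about a constant exceeding the maximum,
# run standard K-M on the flipped (now right-censored) values.
FLIP = 100.0                              # assumed constant > max(conc)
conc = [2.0, 5.0, 5.0, 9.0, 12.0]         # hypothetical concentrations
detected = [0, 1, 0, 1, 1]                # 0 = below its detection limit
km = kaplan_meier([FLIP - c for c in conc], detected)
```

    Survival probabilities on the flipped scale correspond to exceedance probabilities on the original scale; positions flip back as FLIP minus the flipped value.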

  15. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yan-Rong; Wang, Jian-Min [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn [Yunnan Observatories, Chinese Academy of Sciences, Kunming 650011 (China)

    2016-11-10

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  16. A NON-PARAMETRIC APPROACH TO CONSTRAIN THE TRANSFER FUNCTION IN REVERBERATION MAPPING

    International Nuclear Information System (INIS)

    Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming

    2016-01-01

    Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.

  17. Non-parametric causality detection: An application to social media and financial data

    Science.gov (United States)

    Tsapeli, Fani; Musolesi, Mirco; Tino, Peter

    2017-10-01

    According to behavioral finance, stock market returns are influenced by emotional, social and psychological factors. Several recent works support this theory by providing evidence of correlation between stock market prices and collective sentiment indexes measured using social media data. However, a pure correlation analysis is not sufficient to prove that stock market returns are influenced by such emotional factors since both stock market prices and collective sentiment may be driven by a third unmeasured factor. Controlling for factors that could influence the study by applying multivariate regression models is challenging given the complexity of stock market data. False assumptions about the linearity or non-linearity of the model and inaccuracies on model specification may result in misleading conclusions. In this work, we propose a novel framework for causal inference that does not require any assumption about a particular parametric form of the model expressing statistical relationships among the variables of the study and can effectively control a large number of observed factors. We apply our method in order to estimate the causal impact that information posted in social media may have on stock market returns of four big companies. Our results indicate that social media data not only correlate with stock market returns but also influence them.

  18. Statistical analysis of the electric energy production from photovoltaic conversion using mobile and fixed constructions

    Science.gov (United States)

    Bugała, Artur; Bednarek, Karol; Kasprzyk, Leszek; Tomczewski, Andrzej

    2017-10-01

    The paper presents the most representative characteristics, drawn from a three-year measurement period, of daily and monthly electricity production from photovoltaic conversion using modules installed in fixed and 2-axis tracking constructions. Results are presented for selected summer, autumn, spring and winter days. The analyzed measuring stand is located on the roof of the Faculty of Electrical Engineering building at Poznan University of Technology. Basic parameters of statistical analysis such as mean value, standard deviation, skewness, kurtosis, median, range and coefficient of variation were used. It was found that the asymmetry factor can be useful in the analysis of daily electricity production from photovoltaic conversion. To determine the repeatability of monthly electricity production between the summer months, and between the summer and winter months, the non-parametric Mann-Whitney U test was used. To analyze the repeatability of daily peak hours, i.e. the hours with the largest hourly electricity production, the non-parametric Kruskal-Wallis test was applied as an extension of the Mann-Whitney U test. Based on the analysis of the electric energy distributions from the monitoring system, it was found that traditional methods of forecasting electricity production from photovoltaic conversion, such as multiple regression models, should not be the preferred methods of analysis.
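    The Mann-Whitney U comparison of two months' production can be sketched in a few lines. This is an illustrative pure-Python version with invented daily-yield data; it uses the normal approximation for the p-value and assumes no tied observations (a real analysis should use exact tables or a tie correction):

```python
from math import erf, sqrt

def mann_whitney(x, y):
    # U statistic for sample x, two-sided normal-approximation p-value
    # (assumes no ties; no continuity correction).
    nx, ny = len(x), len(y)
    rank = {v: i + 1 for i, v in enumerate(sorted(list(x) + list(y)))}
    u = sum(rank[v] for v in x) - nx * (nx + 1) / 2
    mu = nx * ny / 2
    sd = sqrt(nx * ny * (nx + ny + 1) / 12)
    z = (u - mu) / sd
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return u, p

# Hypothetical daily energy yields (kWh) for two months:
june = [12.1, 14.3, 13.8, 15.0, 12.9]
december = [3.2, 4.1, 2.8, 3.9, 4.5]
u, p = mann_whitney(june, december)
```

    A small p-value indicates the two months' daily yields do not come from the same distribution, i.e. monthly production is not repeatable between them.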

  19. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb...... results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used...

  20. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbine. The new information, coming from external and condition monitoring can be used to direct updating of the stochastic variables through a non-parametric Bayesian u...

  1. Statistical inference a short course

    CERN Document Server

    Panik, Michael J

    2012-01-01

    A concise, easily accessible introduction to descriptive and inferential techniques. Statistical Inference: A Short Course offers a concise presentation of the essentials of basic statistics for readers seeking to acquire a working knowledge of statistical concepts, measures, and procedures. The author conducts tests of the assumptions of randomness and normality and provides nonparametric methods for when parametric approaches might not work. The book also explores how to determine a confidence interval for a population median while also providing coverage of ratio estimation, randomness, and causal
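    One topic mentioned above, a confidence interval for a population median, has a classical distribution-free construction from order statistics: the interval between the k-th smallest and k-th largest observations covers the median with a coverage probability given by the Binomial(n, 1/2) distribution. A small illustrative sketch (not taken from the book):

```python
from math import comb

def median_ci(sample, conf=0.95):
    # Distribution-free CI for the population median from order statistics:
    # (X_(k), X_(n-k+1)) covers the median with probability
    # 1 - 2*P(B <= k-1), where B ~ Binomial(n, 1/2).
    xs = sorted(sample)
    n = len(xs)
    best = (xs[0], xs[-1])  # fall back to the full range for tiny n
    for k in range(1, n // 2 + 1):
        coverage = 1 - 2 * sum(comb(n, i) for i in range(k)) / 2 ** n
        if coverage >= conf:
            # narrower interval still meeting the requested level
            best = (xs[k - 1], xs[n - k])
    return best
```

    For n = 10 at the 95% level this gives the interval between the 2nd and 9th order statistics, since trimming one more observation from each end drops the coverage below 0.95.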

  2. Parametrized tests of post-Newtonian theory using Advanced LIGO and Einstein Telescope

    International Nuclear Information System (INIS)

    Mishra, Chandra Kant; Arun, K. G.; Iyer, Bala R.; Sathyaprakash, B. S.

    2010-01-01

    General relativity has very specific predictions for the gravitational waveforms from inspiralling compact binaries obtained using the post-Newtonian (PN) approximation. We investigate the extent to which the measurement of the PN coefficients, possible with the second generation gravitational-wave detectors such as the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) and the third generation gravitational-wave detectors such as the Einstein Telescope (ET), could be used to test post-Newtonian theory and to put bounds on a subclass of parametrized-post-Einstein theories which differ from general relativity in a parametrized sense. We demonstrate this possibility by employing the best inspiralling waveform model for nonspinning compact binaries which is 3.5PN accurate in phase and 3PN in amplitude. Within the class of theories considered, Advanced LIGO can test the theory at 1.5PN and thus the leading tail term. Future observations of stellar mass black hole binaries by ET can test the consistency between the various PN coefficients in the gravitational-wave phasing over the mass range of 11-44 M⊙. The choice of the lower frequency cutoff is important for testing post-Newtonian theory using the ET. The bias in the test arising from the assumption of nonspinning binaries is indicated.

  3. On the Optimal Location of Sensors for Parametric Identification of Linear Structural Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Brincker, Rune

    A survey of the field of optimal location of sensors for parametric identification of linear structural systems is presented. The survey shows that few papers are devoted to the case of optimal location sensors in which the measurements are modelled by a random field with non-trivial covariance...... function. Most often it is assumed that the results of the measurements are statistically independent variables. In an example the importance of considering the measurements as statistically dependent random variables is shown. The example is concerned with optimal location of sensors for parametric...... identification of modal parameters for a vibrating beam under random loading. The covariance of the modal parameters expected to be obtained is investigated to variations of number and location of sensors. Further, the influence of the noise on the optimal location of the sensors is investigated....

  4. Privacy-preserving Kruskal-Wallis test.

    Science.gov (United States)

    Guo, Suxin; Zhong, Sheng; Zhang, Aidong

    2013-10-01

    Statistical tests are powerful tools for data analysis. Kruskal-Wallis test is a non-parametric statistical test that evaluates whether two or more samples are drawn from the same distribution. It is commonly used in various areas. But sometimes, the use of the method is impeded by privacy issues raised in fields such as biomedical research and clinical data analysis because of the confidential information contained in the data. In this work, we give a privacy-preserving solution for the Kruskal-Wallis test which enables two or more parties to coordinately perform the test on the union of their data without compromising their data privacy. To the best of our knowledge, this is the first work that solves the privacy issues in the use of the Kruskal-Wallis test on distributed data. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
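    For reference, the plain (non-privacy-preserving) Kruskal-Wallis H statistic that the protocol above computes over distributed data can be written in a few lines. This sketch uses midranks and the standard tie correction; the cryptographic protocol itself is not reproduced here:

```python
def kruskal_wallis_h(*groups):
    # Kruskal-Wallis H with midranks and the standard tie correction.
    pooled = sorted(v for g in groups for v in g)
    n = len(pooled)
    rank, i = {}, 0
    while i < n:
        j = i
        while j + 1 < n and pooled[j + 1] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + j) / 2 + 1      # midrank, 1-based
        i = j + 1
    h = 12 / (n * (n + 1)) * sum(
        sum(rank[v] for v in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
    ties = sum(t ** 3 - t for t in (pooled.count(v) for v in set(pooled)))
    return h / (1 - ties / (n ** 3 - n)) if ties else h
```

    Under the null hypothesis that all samples come from the same distribution, H is approximately chi-squared with (number of groups - 1) degrees of freedom.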

  5. Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios

    OpenAIRE

    Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M

    2004-01-01

    The fortune and the risk of a business venture depends on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behaviors rather than debatable assumptions on models and parameters. Simultaneous dep...
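    The core resampling idea is straightforward to sketch. The toy fragment below resamples whole historical observation vectors with replacement, which preserves simultaneous dependencies between markets within each step; the published method is more elaborate (the data, variable names and horizon here are invented):

```python
import random

def bootstrap_scenarios(history, horizon, n_paths, seed=42):
    # Resample past cross-sectional observations with replacement
    # to generate hypothetical future paths.
    rng = random.Random(seed)
    return [[rng.choice(history) for _ in range(horizon)]
            for _ in range(n_paths)]

# history rows: hypothetical (equity return, bond return) pairs
history = [(0.01, 0.002), (-0.02, 0.004), (0.005, -0.001), (0.012, 0.0)]
paths = bootstrap_scenarios(history, horizon=12, n_paths=1000)
```

    Because entire rows are drawn at once, each simulated step keeps the observed co-movement between the two return series, which is the main advantage of resampling over independently parameterized marginals.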

  6. Statistical Parametric Mapping to Identify Differences between Consensus-Based Joint Patterns during Gait in Children with Cerebral Palsy.

    Science.gov (United States)

    Nieuwenhuys, Angela; Papageorgiou, Eirini; Desloovere, Kaat; Molenaers, Guy; De Laet, Tinne

    2017-01-01

    Experts recently identified 49 joint motion patterns in children with cerebral palsy during a Delphi consensus study. Pattern definitions were therefore the result of subjective expert opinion. The present study aims to provide objective, quantitative data supporting the identification of these consensus-based patterns. To do so, statistical parametric mapping was used to compare the mean kinematic waveforms of 154 trials of typically developing children (n = 56) to the mean kinematic waveforms of 1719 trials of children with cerebral palsy (n = 356), which were classified following the classification rules of the Delphi study. Three hypotheses stated that: (a) joint motion patterns with 'no or minor gait deviations' (n = 11 patterns) do not differ significantly from the gait pattern of typically developing children; (b) all other pathological joint motion patterns (n = 38 patterns) differ from typically developing gait and the locations of difference within the gait cycle, highlighted by statistical parametric mapping, concur with the consensus-based classification rules; (c) all joint motion patterns at the level of each joint (n = 49 patterns) differ from each other during at least one phase of the gait cycle. Results showed that: (a) ten patterns with 'no or minor gait deviations' differed somewhat unexpectedly from typically developing gait, but these differences were generally small (≤3°); (b) all other joint motion patterns (n = 38) differed from typically developing gait and the significant locations within the gait cycle that were indicated by the statistical analyses, coincided well with the classification rules; (c) joint motion patterns at the level of each joint significantly differed from each other, apart from two sagittal plane pelvic patterns. In addition to these results, for several joints, statistical analyses indicated other significant areas during the gait cycle that were not included in the pattern definitions of the consensus study

  7. Strategies for the control of parametric instability in advanced gravitational wave detectors

    International Nuclear Information System (INIS)

    Ju, L; Blair, D G; Zhao, C; Gras, S; Zhang, Z; Barriga, P; Miao, H; Fan, Y; Merrill, L

    2009-01-01

    Parametric instabilities have been predicted to occur in all advanced high optical power gravitational wave detectors. In this paper we review the problem of parametric instabilities, summarize the latest findings and assess various schemes proposed for their control. We show that non-resonant passive damping of test masses reduces parametric instability but has a noise penalty, and fails to suppress the Q-factor of many modes. Resonant passive damping is shown to have significant advantages but requires detailed modeling. An optical feedback mode suppression interferometer is proposed which is capable of suppressing all instabilities but requires experimental development.

  8. Strategies for the control of parametric instability in advanced gravitational wave detectors

    Energy Technology Data Exchange (ETDEWEB)

    Ju, L; Blair, D G; Zhao, C; Gras, S; Zhang, Z; Barriga, P; Miao, H; Fan, Y; Merrill, L, E-mail: juli@physics.uwa.edu.a [School of Physics, University of Western Australia, 35 Stirling Highway, Crawley, Perth, WA 6009 (Australia)

    2009-01-07

    Parametric instabilities have been predicted to occur in all advanced high optical power gravitational wave detectors. In this paper we review the problem of parametric instabilities, summarize the latest findings and assess various schemes proposed for their control. We show that non-resonant passive damping of test masses reduces parametric instability but has a noise penalty, and fails to suppress the Q-factor of many modes. Resonant passive damping is shown to have significant advantages but requires detailed modeling. An optical feedback mode suppression interferometer is proposed which is capable of suppressing all instabilities but requires experimental development.

  9. Bootstrap Power of Time Series Goodness of fit tests

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2013-10-01

    Full Text Available In this article, we look at the power of various versions of the Box-Pierce statistic and the Cramer-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms have been provided for the power calculations, and a comparison has also been made between the semi-parametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against non-linear models, while the situation reverses for the Cramer-von Mises test. Moreover, we found that the dynamic bootstrap method is better than the fixed design bootstrap method.
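    The Box-Pierce statistic at the heart of this comparison is simple to compute: it sums squared sample autocorrelations up to a chosen lag. An illustrative pure-Python version (the Ljung-Box small-sample refinement and the bootstrap calibration discussed in the article are omitted; the example series is invented):

```python
def box_pierce_q(x, max_lag):
    # Q = n * sum_{k=1..m} r_k^2, with r_k the lag-k sample autocorrelation.
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x)
    q = 0.0
    for k in range(1, max_lag + 1):
        ck = sum((x[i] - m) * (x[i + k] - m) for i in range(n - k))
        q += (ck / c0) ** 2
    return n * q
```

    Large Q relative to a chi-squared distribution with max_lag degrees of freedom (adjusted for fitted parameters) signals remaining serial correlation, i.e. a poor model fit.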

  10. A local non-parametric model for trade sign inference

    Science.gov (United States)

    Blazejewski, Adam; Coggins, Richard

    2005-03-01

    We investigate a regularity in market order submission strategies for 12 stocks with large market capitalization on the Australian Stock Exchange. The regularity is evidenced by a predictable relationship between the trade sign (trade initiator), size of the trade, and the contents of the limit order book before the trade. We demonstrate this predictability by developing an empirical inference model to classify trades into buyer-initiated and seller-initiated. The model employs a local non-parametric method, k-nearest neighbor, which in the past was used successfully for chaotic time series prediction. The k-nearest neighbor with three predictor variables achieves an average out-of-sample classification accuracy of 71.40%, compared to 63.32% for the linear logistic regression with seven predictor variables. The result suggests that a non-linear approach may produce a more parsimonious trade sign inference model with a higher out-of-sample classification accuracy. Furthermore, for most of our stocks the observed regularity in market order submissions seems to have a memory of at least 30 trading days.
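    The k-nearest-neighbor rule used for trade sign inference reduces to a majority vote among the closest training trades in feature space. The features and data below are invented for illustration; the study's actual predictor variables and preprocessing are described in the paper:

```python
from collections import Counter

def knn_sign(train, signs, query, k=3):
    # Majority vote of trade signs among the k nearest neighbours
    # (squared Euclidean distance in feature space).
    nearest = sorted(range(len(train)),
                     key=lambda i: sum((a - b) ** 2
                                       for a, b in zip(train[i], query)))[:k]
    return Counter(signs[i] for i in nearest).most_common(1)[0][0]

# Hypothetical features: (normalised trade size, limit-order-book imbalance)
train = [(0.1, -0.8), (0.2, -0.6), (0.15, -0.7),
         (0.9, 0.7), (0.8, 0.9), (0.85, 0.6)]
signs = [-1, -1, -1, +1, +1, +1]          # -1 seller-, +1 buyer-initiated
```

    Being local and non-parametric, the classifier makes no assumption about the functional form linking book state to trade sign, which is the property the authors exploit.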

  11. On the Optimal Location of Sensors for Parametric Identification of Linear Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Brincker, Rune

    1994-01-01

    An outline of the field of optimal location of sensors for parametric identification of linear structural systems is presented. There are few papers devoted to the case of optimal location of sensors in which the measurements are modeled by a random field with non-trivial covariance function...... It is assumed most often that the results of the measurements are statistically independent random variables. In an example the importance of considering the measurements as statistically dependent random variables is shown. The covariance of the model parameters expected to be obtained is investigated......

  12. Hyperbolic and semi-parametric models in finance

    Science.gov (United States)

    Bingham, N. H.; Kiesel, Rüdiger

    2001-02-01

    The benchmark Black-Scholes-Merton model of mathematical finance is parametric, based on the normal/Gaussian distribution. Its principal parametric competitor, the hyperbolic model of Barndorff-Nielsen, Eberlein and others, is briefly discussed. Our main theme is the use of semi-parametric models, incorporating the mean vector and covariance matrix as in the Markowitz approach, plus a non-parametric part, a scalar function incorporating features such as tail-decay. Implementation is also briefly discussed.

  13. Two non-parametric methods for derivation of constraints from radiotherapy dose–histogram data

    International Nuclear Information System (INIS)

    Ebert, M A; Kennedy, A; Joseph, D J; Gulliford, S L; Buettner, F; Foo, K; Haworth, A; Denham, J W

    2014-01-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose–histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization. (note)
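    A minimal version of ROC-style cut-point selection can illustrate the idea: scan candidate thresholds of the dose-histogram parameter and keep the one maximizing Youden's J = sensitivity + specificity - 1. The sketch below omits the multiple-testing (free step-down resampling) correction the paper applies, and the data are invented:

```python
def youden_cutpoint(values, complication):
    # values: dose-histogram parameter per patient (e.g. % volume >= dose).
    # complication[i] == 1 if that patient had the complication.
    pos = [v for v, c in zip(values, complication) if c]
    neg = [v for v, c in zip(values, complication) if not c]
    best_c, best_j = None, -1.0
    for c in sorted(set(values)):
        sens = sum(v >= c for v in pos) / len(pos)
        spec = sum(v < c for v in neg) / len(neg)
        if sens + spec - 1 > best_j:
            best_c, best_j = c, sens + spec - 1
    return best_c, best_j
```

    Because many candidate thresholds are tested and neighbouring dose-histogram points are correlated, the uncorrected J-maximizing cut-point is optimistically biased, which is exactly why the paper's resampling correction matters.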

  14. Long-range memory and non-Markov statistical effects in human sensorimotor coordination

    Science.gov (United States)

    M. Yulmetyev, Renat; Emelyanova, Natalya; Hänggi, Peter; Gafarov, Fail; Prokhorov, Alexander

    2002-12-01

    In this paper, the non-Markov statistical processes and long-range memory effects in human sensorimotor coordination are investigated. The theoretical basis of this study is the statistical theory of non-stationary discrete non-Markov processes in complex systems (Phys. Rev. E 62, 6178 (2000)). Human sensorimotor coordination was experimentally studied by means of a standard dynamical tapping test on a group of 32 young people with tap numbers up to 400. This test was carried out separately for the right and the left hand according to the degree of domination of each brain hemisphere. The numerical analysis of the experimental results was made with the help of power spectra of the initial time correlation function, the memory functions of low orders and the first three points of the statistical spectrum of the non-Markovity parameter. Our observations demonstrate that, with regard to the results of the standard dynamic tapping test, it is possible to divide all examinees into five different dynamic types. We have introduced a conflict coefficient to quantitatively estimate the order-disorder effects underlying living systems; it reflects the existence of an imbalance between nervous and motor human coordination. The suggested classification of neurophysiological activity represents a dynamic generalization of the well-known neuropsychological types and provides a new approach in modern neuropsychology.

  15. Statistical Survey of Non-Formal Education

    Directory of Open Access Journals (Sweden)

    Ondřej Nývlt

    2012-12-01

    Full Text Available focused on a programme within a regular education system. Labour market flexibility and new requirements on employees create a new domain of education called non-formal education. Is there a reliable statistical source with a good methodological definition for the Czech Republic? The Labour Force Survey (LFS) has been the basic statistical source for time comparison of non-formal education for the last ten years. Furthermore, a special Adult Education Survey (AES) in 2011 was focused on individual components of non-formal education in a detailed way. In general, the goal of the EU is to use data from both internationally comparable surveys for analyses of particular fields of lifelong learning, in such a way that annual LFS data can be enlarged by detailed information from the AES at five-year intervals. This article describes the reliability of statistical data about non-formal education; such analysis is usually concerned with sampling and non-sampling errors.

  16. Proposing a framework for airline service quality evaluation using Type-2 Fuzzy TOPSIS and non-parametric analysis

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper focuses on evaluating airline service quality from the perspective of passengers' views. Much research on airline service quality evaluation has been performed worldwide, but little has yet been conducted in Iran. In this study, a framework for measuring airline service quality in Iran is proposed. After reviewing airline service quality criteria, the SSQAI model was selected because of its comprehensiveness in covering airline service quality dimensions. The SSQAI questionnaire items were redesigned to fit the requirements of Iranian airlines and the environmental circumstances of Iran's economic and cultural context. This study includes fuzzy decision-making theory, considering the possible fuzzy subjective judgment of the evaluators during airline service quality evaluation. Fuzzy TOPSIS has been applied for ranking airlines' service quality performances. Three major Iranian airlines with the largest passenger transfer volumes in domestic and foreign flights were chosen for evaluation in this research. Results demonstrated that Mahan airline achieved the best service quality performance rank in gaining passengers' satisfaction with the delivery of high-quality services to its passengers, among the three major Iranian airlines. IranAir and Aseman airlines placed second and third, respectively, according to passengers' evaluation. Statistical analysis has been used in analyzing passenger responses. Due to the non-normality of the data, non-parametric tests were applied. To demonstrate airline ranks in every criterion separately, the Friedman test was performed. Variance analysis and the Tukey test were applied to study the influence of increases in age and educational level of passengers on the degree of their satisfaction with airlines' service quality. Results showed that age has no significant relation to passenger satisfaction with airlines; however, increasing educational level demonstrated a negative impact on
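    The Friedman test used above to compare airline ranks across criteria reduces to ranking within blocks (here, respondents) and summing the ranks per treatment (airline). An illustrative pure-Python computation of the Friedman statistic, assuming no ties within a block and with invented scores:

```python
def friedman_q(blocks):
    # blocks: one tuple of scores per respondent, one score per airline.
    # Rank within each block (no ties assumed), then
    # Q = 12/(n*k*(k+1)) * sum_j R_j^2 - 3*n*(k+1), approx. chi^2(k-1).
    n, k = len(blocks), len(blocks[0])
    totals = [0.0] * k
    for row in blocks:
        for r, col in enumerate(sorted(range(k), key=lambda c: row[c]),
                                start=1):
            totals[col] += r
    return 12 / (n * k * (k + 1)) * sum(t * t for t in totals) - 3 * n * (k + 1)

# Three hypothetical respondents scoring three airlines:
scores = [(7, 5, 9), (6, 4, 8), (8, 5, 9)]
q = friedman_q(scores)
```

    Because it uses only within-respondent ranks, the test needs no normality assumption, which is why it suits the non-normal questionnaire data described above.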

  17. Observational Signatures of Parametric Instability at 1AU

    Science.gov (United States)

    Bowen, T. A.; Bale, S. D.; Badman, S.

    2017-12-01

    Observations and simulations of inertial compressive turbulence in the solar wind are characterized by density structures anti-correlated with magnetic fluctuations parallel to the mean field. This signature has been interpreted as observational evidence for non-propagating pressure-balanced structures (PBS), kinetic ion acoustic waves, as well as the MHD slow mode. Recent work, specifically Verscharen et al. (2017), has highlighted the unexpectedly fluid-like nature of the solar wind. Given the high damping rates of parallel-propagating compressive fluctuations, their ubiquity in satellite observations is surprising and suggests the presence of a driving process. One possible candidate for the generation of compressive fluctuations in the solar wind is the parametric instability, in which large-amplitude Alfvenic fluctuations decay into parallel-propagating compressive waves. This work employs 10 years of WIND observations to test the parametric decay process as a source of compressive waves in the solar wind by comparing collisionless damping rates of compressive fluctuations with growth rates of the parametric instability. Preliminary results suggest that generation of compressive waves through parametric decay is overdamped at 1 AU. However, the higher parametric decay rates expected in the inner heliosphere likely allow for growth of the slow mode, the remnants of which could explain density fluctuations observed at 1 AU.

  18. Non-linear statistical downscaling of present and LGM precipitation and temperatures over Europe

    Directory of Open Access Journals (Sweden)

    M. Vrac

    2007-12-01

    Full Text Available Local-scale climate information is increasingly needed for the study of past, present and future climate changes. In this study we develop a non-linear statistical downscaling method to generate local temperature and precipitation values from large-scale variables of an Earth System Model of Intermediate Complexity (here CLIMBER). Our statistical downscaling scheme is based on the concept of Generalized Additive Models (GAMs), capturing non-linearities via non-parametric techniques. Our GAMs are calibrated on the present Western European climate. For this region, annual GAMs (i.e. models based on 12 monthly values per location) are fitted by combining two types of large-scale explanatory variables: geographical (e.g. topographical) information and physical variables (i.e. entirely simulated by the CLIMBER model).

    To evaluate the adequacy of the non-linear transfer functions fitted on the present Western European climate, they are applied to different spatial and temporal large-scale conditions. Local projections for present North America and Northern Europe climates are obtained and compared to local observations. This partially addresses the issue of spatial robustness of our transfer functions by answering the question "does our statistical model remain valid when applied to large-scale climate conditions from a region different from the one used for calibration?". To assess their temporal performance, local projections for the Last Glacial Maximum period are derived and compared to local reconstructions and General Circulation Model outputs.

    Our downscaling methodology performs adequately for the Western European climate. Concerning the spatial and temporal evaluations, it does not behave as well for the North American and Northern European climates, because the calibration domain may be too different from the targeted regions. The physical explanatory variables alone are not capable of downscaling realistic values. However, the inclusion of
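
    An additive model in the spirit of this record can be sketched with a simple backfitting loop; the two predictors, the smoothing level, and the synthetic data below are invented stand-ins, not the CLIMBER variables the authors used, and a real study would use a dedicated GAM library (e.g. pyGAM or R's mgcv):

```python
# Minimal GAM-like additive fit by backfitting with spline smoothers.
# Everything here is a hypothetical illustration of the technique.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)
n = 400
x1 = rng.uniform(0, 10, n)          # stand-in "geographical" predictor
x2 = rng.uniform(-5, 5, n)          # stand-in "physical" predictor
y = np.sin(x1) + 0.1 * x2 ** 2 + rng.normal(0, 0.2, n)

f1 = np.zeros(n)
f2 = np.zeros(n)
for _ in range(20):                 # backfitting iterations
    # Smooth the partial residuals of each predictor in turn
    order1 = np.argsort(x1)
    r1 = y - y.mean() - f2
    f1 = UnivariateSpline(x1[order1], r1[order1], s=n * 0.05)(x1)
    f1 -= f1.mean()
    order2 = np.argsort(x2)
    r2 = y - y.mean() - f1
    f2 = UnivariateSpline(x2[order2], r2[order2], s=n * 0.05)(x2)
    f2 -= f2.mean()

pred = y.mean() + f1 + f2
mse = ((y - pred) ** 2).mean()
print(f"residual mean squared error: {mse:.3f} (noise variance is 0.04)")
```

    The additive structure is what lets each fitted smooth be inspected separately, which is the property the authors exploit when mixing geographical and physical explanatory variables.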

  19. The application of statistical and/or non-statistical sampling techniques by internal audit functions in the South African banking industry

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2015-03-01

    Full Text Available This article explores the use by internal audit functions of audit sampling techniques in order to test the effectiveness of controls in the banking sector. The article focuses specifically on the use of statistical and/or non-statistical sampling techniques by internal auditors. The focus of the research for this article was internal audit functions in the banking sector of South Africa. The results discussed in the article indicate that audit sampling is still used frequently as an audit evidence-gathering technique. Non-statistical sampling techniques are used more frequently than statistical sampling techniques for the evaluation of the sample. In addition, both techniques are regarded as important for the determination of the sample size and the selection of the sample items.

  20. Parametric Methods for Order Tracking Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm

    2017-01-01

    Order tracking analysis is often used to find the critical speeds at which structural resonances are excited by a rotating machine. Typically, order tracking analysis is performed via non-parametric methods. In this report, however, we demonstrate some of the advantages of using a parametric method...

  1. Evaluating a stochastic parametrization for a fast–slow system using the Wasserstein distance

    Directory of Open Access Journals (Sweden)

    G. Vissio

    2018-06-01

    Full Text Available Constructing accurate, flexible, and efficient parametrizations is one of the great challenges in the numerical modeling of geophysical fluids. We consider here the simple yet paradigmatic case of a Lorenz 84 model forced by a Lorenz 63 model and derive a parametrization using a recently developed statistical mechanical methodology based on the Ruelle response theory. We derive an expression for the deterministic and the stochastic components of the parametrization and show that the approach allows for dealing seamlessly with the case of the Lorenz 63 model being a fast as well as a slow forcing compared to the characteristic timescales of the Lorenz 84 model. We test our results using both standard metrics based on the moments of the variables of interest and the Wasserstein distance between the projected measure of the original system on the Lorenz 84 model variables and the measure of the parametrized one. By testing our methods on reduced phase spaces obtained by projection, we find support for the idea that comparisons based on the Wasserstein distance might be of relevance in many applications despite the curse of dimensionality.
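
    The Wasserstein-based comparison the record mentions can be sketched for one projected variable with SciPy; the two Gaussian samples below are stand-ins for model output, not runs of the Lorenz systems:

```python
# Hedged sketch (not the authors' code): comparing a "full" and a
# "parametrized" model by the 1-D Wasserstein distance between
# samples of one projected variable.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(42)
# Synthetic stand-ins for a projected variable of the full coupled
# system and of the parametrized system.
full_model = rng.normal(loc=0.0, scale=1.0, size=5000)
parametrized = rng.normal(loc=0.1, scale=1.1, size=5000)

d = wasserstein_distance(full_model, parametrized)
print(f"1-D Wasserstein distance between projected samples: {d:.3f}")
```

    Unlike moment-based metrics, this distance compares whole distributions, which is why the authors find it informative on projected (reduced) phase spaces.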

  2. Radiation parametric generation in non-linear crystals

    International Nuclear Information System (INIS)

    Pacheco, M.T.; Pereira, M.A.C.Q.

    1983-01-01

    A short historical review of the development of optical parametric oscillators is presented. An analysis of the behaviour of singly resonant oscillators (SRO), doubly resonant oscillators (DRO) and ring resonant oscillators (RRO) in the plane-wave pumping approximation is shown. A comparison between the three oscillator types is given. (Author) [pt

  3. Design and development of a parametrically excited nonlinear energy harvester

    International Nuclear Information System (INIS)

    Yildirim, Tanju; Ghayesh, Mergen H.; Li, Weihua; Alici, Gursel

    2016-01-01

    Highlights: • A parametrically broadband energy harvester was fabricated. • Strong softening-type nonlinear behaviour was observed. • Experiments were conducted showing the large bandwidth of the device. - Abstract: An energy harvester has been designed, fabricated and tested based on the nonlinear dynamical response of a parametrically excited clamped-clamped beam with a central point-mass; magnets have been used as the central point-mass, which pass through a coil when parametrically excited. Experiments have been conducted for the energy harvester when the system is excited (i) harmonically near the primary resonance; (ii) harmonically near the principal parametric resonance; (iii) by means of a non-smooth periodic excitation. An electrodynamic shaker was used to parametrically excite the system, and the corresponding displacement of the magnet and output voltages of the coil were measured. It has been shown that the system displays linear behaviour at the primary resonance; however, at the principal parametric resonance, the motion characteristics of the magnet changed substantially, displaying a strong softening-type nonlinearity. Theoretical simulations have also been conducted in order to verify the experimental results; theory and experiment were in very good agreement with each other. The energy harvester developed in this paper is capable of harvesting energy close to the primary resonance as well as the principal parametric resonance; the frequency band has been broadened significantly, mainly due to the nonlinear effects as well as the parametric excitation.

  4. On Rigorous Drought Assessment Using Daily Time Scale: Non-Stationary Frequency Analyses, Revisited Concepts, and a New Method to Yield Non-Parametric Indices

    Directory of Open Access Journals (Sweden)

    Charles Onyutha

    2017-10-01

    Full Text Available Some of the problems in drought assessments are that: analyses tend to focus on coarse temporal scales, many of the methods yield skewed indices, a few terminologies are used ambiguously, and analyses comprise an implicit assumption that the observations come from a stationary process. To solve these problems, this paper introduces non-stationary frequency analyses of quantiles. How to use non-parametric rescaling to obtain robust indices that are not (or are only minimally) skewed is also introduced. To avoid ambiguity, some concepts on, e.g., incidence, extremity, etc., were revisited through a shift from the monthly to the daily time scale. The introduced methods were demonstrated using daily flow and precipitation insufficiency (precipitation minus potential evapotranspiration) from the Blue Nile basin in Africa. Results show that, when a significant trend exists in extreme events, stationarity-based quantiles can be far different from those when non-stationarity is considered. The introduced non-parametric indices were found to agree closely with the well-known standardized precipitation evapotranspiration indices in many respects except skewness. Apart from revisiting some concepts, the advantages of using fine instead of coarse time scales in drought assessment were given. Links for obtaining freely downloadable tools implementing the introduced methods were provided.
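
    Rank-based rescaling of the kind the record introduces can be sketched as follows; the Weibull plotting position and the normal back-transform are common choices rather than necessarily the authors' exact formulation, and the deficit series is synthetic:

```python
# Hedged illustration: non-parametric rescaling of a skewed drought
# variable into a standardized, approximately N(0, 1) index.
import numpy as np
from scipy.stats import rankdata, norm

rng = np.random.default_rng(1)
# Synthetic daily precipitation-minus-PET values (skewed by design)
deficit = rng.gamma(shape=2.0, scale=10.0, size=365) - 25.0

n = deficit.size
p = rankdata(deficit) / (n + 1.0)   # empirical non-exceedance probability
index = norm.ppf(p)                 # rank-preserving standardized index

print(f"index mean = {index.mean():.3f}, std = {index.std():.3f}")
```

    Because only the ranks of the data enter, the resulting index is not skewed by the shape of the original distribution, which is the robustness property the record emphasizes.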

  5. Using an Autonomous Scale Ship Model for Resistance and Parametric Roll Tests

    Directory of Open Access Journals (Sweden)

    Fernando LOPEZ PEÑA

    2015-04-01

    Full Text Available This work presents the development of a self-propelled scale ship model for performing resistance and parametric roll tests in towing tanks. The main characteristic of the proposed system is that it does not have any material link to a towing device during the tests. The ship model has been fully instrumented in order to acquire all the significant raw data, process them onboard and communicate with an inshore station. This work presents a description of the proposed model as well as some results obtained from its use during a towing tank testing campaign.

  6. Planar Parametrization in Isogeometric Analysis

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Nguyen, Dang-Manh

    2012-01-01

    Before isogeometric analysis can be applied to solving a partial differential equation posed over some physical domain, one needs to construct a valid parametrization of the geometry. The accuracy of the analysis is affected by the quality of the parametrization. The challenge of computing...... and maintaining a valid geometry parametrization is particularly relevant in applications of isogemetric analysis to shape optimization, where the geometry varies from one optimization iteration to another. We propose a general framework for handling the geometry parametrization in isogeometric analysis and shape...... are suitable for our framework. The non-linear methods we consider are based on solving a constrained optimization problem numerically, and are divided into two classes, geometry-oriented methods and analysis-oriented methods. Their performance is illustrated through a few numerical examples....

  7. A Third Moment Adjusted Test Statistic for Small Sample Factor Analysis.

    Science.gov (United States)

    Lin, Johnny; Bentler, Peter M

    2012-01-01

    Goodness of fit testing in factor analysis is based on the assumption that the test statistic is asymptotically chi-square; but this property may not hold in small samples even when the factors and errors are normally distributed in the population. Robust methods such as Browne's asymptotically distribution-free method and Satorra Bentler's mean scaling statistic were developed under the presumption of non-normality in the factors and errors. This paper finds new application to the case where factors and errors are normally distributed in the population but the skewness of the obtained test statistic is still high due to sampling error in the observed indicators. An extension of Satorra Bentler's statistic is proposed that not only scales the mean but also adjusts the degrees of freedom based on the skewness of the obtained test statistic in order to improve its robustness under small samples. A simple simulation study shows that this third moment adjusted statistic asymptotically performs on par with previously proposed methods, and at a very small sample size offers superior Type I error rates under a properly specified model. Data from Mardia, Kent and Bibby's study of students tested for their ability in five content areas that were either open or closed book were used to illustrate the real-world performance of this statistic.

  8. Quantitative analysis of diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) for brain disorders

    Science.gov (United States)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon

    2013-07-01

    This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was obtained by performing 3.0-T magnetic resonance imaging for patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus, whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail, whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion, by using DTI and SPM analysis, we were able not only to determine the structural state of the regions affected by brain disorders but also to quantitatively analyze and assess brain function.
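
    The voxelwise two-sample comparison underlying such an SPM analysis can be sketched in a mass-univariate way; the arrays below are synthetic FA-like values, and real SPM additionally involves spatial normalization, smoothing, a GLM, and multiple-comparison correction:

```python
# Hedged sketch: a mass-univariate two-sample t-test across voxels,
# the kind of comparison SPM performs on FA/ADC maps (synthetic data).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
n_patients, n_controls, n_voxels = 12, 12, 1000
controls = rng.normal(0.45, 0.05, size=(n_controls, n_voxels))
patients = rng.normal(0.45, 0.05, size=(n_patients, n_voxels))
patients[:, :50] -= 0.10   # simulate lowered FA in the first 50 voxels

t, p = ttest_ind(patients, controls, axis=0)
significant = p < 0.001    # uncorrected threshold, for illustration only
print(f"{significant.sum()} voxels below p < 0.001 (uncorrected)")
```

    Running one t-test per voxel is why SPM-style analyses must control for multiple comparisons, e.g. with random field theory or FDR, before reporting clusters.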

  9. Data driven smooth tests for composite hypotheses

    NARCIS (Netherlands)

    Inglot, Tadeusz; Kallenberg, Wilbert C.M.; Ledwina, Teresa

    1997-01-01

    The classical problem of testing goodness-of-fit of a parametric family is reconsidered. A new test for this problem is proposed and investigated. The new test statistic is a combination of the smooth test statistic and Schwarz's selection rule. More precisely, as the sample size increases, an

  10. Parametric Instability in Advanced Laser Interferometer Gravitational Wave Detectors

    International Nuclear Information System (INIS)

    Ju, L; Grass, S; Zhao, C; Degallaix, J; Blair, D G

    2006-01-01

    High frequency parametric instabilities in optical cavities are radiation-pressure-induced interactions between test mass mechanical modes and cavity optical modes. The parametric gain depends on the cavity power and the quality factor of the test mass internal modes (usually in the ultrasonic frequency range), as well as the overlap integral for the mechanical and optical modes. In advanced laser interferometers, which require high optical power and very low acoustic loss test masses, parametric instabilities could prevent interferometer operation if not suppressed. Here we review the problem of parametric instabilities in advanced detector configurations for different combinations of sapphire and fused silica test masses, and compare three methods for the control or suppression of parametric instabilities: thermal tuning, surface damping and active feedback.

  11. Parametric form of QCD travelling waves

    OpenAIRE

    Peschanski, R.

    2005-01-01

    We derive parametric travelling-wave solutions of non-linear QCD equations. They describe the evolution towards saturation in the geometric scaling region. The method, based on an expansion in the inverse of the wave velocity, leads to a solvable hierarchy of differential equations. A universal parametric form of travelling waves emerges from the first two orders of the expansion.

  12. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping.

    Science.gov (United States)

    McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W

    2008-02-29

    Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. The focal effects of moderate TBI on CBF by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). A significant area of hypoperfusion was identified after TBI. Statistical mapping of the reference microsphere CBF data confirms the focal decrease found with SPECT and SPM. The suitability of SPM for application to the experimental model, and its ability to provide insight into CBF changes in response to traumatic injury, was validated by the SPECT SPM result of a decrease in CBP at the left parietal injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical image processing techniques.

  13. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Full Text Available Parameters estimated with confidence intervals and tests of statistical hypotheses are used in statistical analysis to draw conclusions about a population from a sample extracted from it. The case study presented aims to highlight the importance of the sample size used in a study and how it is reflected in the results obtained when using confidence intervals and hypothesis testing. Whereas statistical hypothesis testing only gives a "yes" or "no" answer to certain questions, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and from findings that are "marginally significant" or "almost significant" (p very close to 0.05).
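
    The record's point can be made concrete with a small invented sample: the t-test alone reports "not significant", while the confidence interval shows how wide the remaining uncertainty actually is:

```python
# Hedged numeric illustration (invented data): a small-sample t-test
# versus the corresponding 95% confidence interval for the mean.
import numpy as np
from scipy import stats

sample = np.array([4.8, 5.6, 5.1, 6.0, 4.5, 5.9, 5.3, 5.7])
mu0 = 5.0                                   # hypothesized population mean

t, p = stats.ttest_1samp(sample, mu0)
mean = sample.mean()
sem = stats.sem(sample)
ci = stats.t.interval(0.95, sample.size - 1, loc=mean, scale=sem)

print(f"t = {t:.3f}, p = {p:.4f}")          # not significant at 0.05
print(f"95% CI for the mean: ({ci[0]:.2f}, {ci[1]:.2f})")
```

    The interval contains the hypothesized value, which is consistent with the non-significant test, but it also makes visible how imprecise an estimate based on eight observations is.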

  14. The insignificance of statistical significance testing

    Science.gov (United States)

    Johnson, Douglas H.

    1999-01-01

    Despite their use in scientific journals such as The Journal of Wildlife Management, statistical hypothesis tests add very little value to the products of research. Indeed, they frequently confuse the interpretation of data. This paper describes how statistical hypothesis tests are often viewed, and then contrasts that interpretation with the correct one. I discuss the arbitrariness of P-values, conclusions that the null hypothesis is true, power analysis, and distinctions between statistical and biological significance. Statistical hypothesis testing, in which the null hypothesis about the properties of a population is almost always known a priori to be false, is contrasted with scientific hypothesis testing, which examines a credible null hypothesis about phenomena in nature. More meaningful alternatives are briefly outlined, including estimation and confidence intervals for determining the importance of factors, decision theory for guiding actions in the face of uncertainty, and Bayesian approaches to hypothesis testing and other statistical practices.

  15. Parametric fitting of data obtained from detectors with finite resolution and limited acceptance

    International Nuclear Information System (INIS)

    Gagunashvili, N.D.

    2011-01-01

    A goodness-of-fit test for fitting of a parametric model to data obtained from a detector with finite resolution and limited acceptance is proposed. The parameters of the model are found by minimization of a statistic that is used for comparing experimental data and simulated reconstructed data. Numerical examples are presented to illustrate and validate the fitting procedure.

  16. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    Science.gov (United States)

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effects on non-target organisms are compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows adding a set of reference varieties to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
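
    One ingredient of such a simulation model, count data with excess zeros, can be sketched as a zero-inflated Poisson generator; the parameter values are illustrative, and the actual model also supports blocking, repeated measures, and further distributions:

```python
# Hedged sketch: simulating non-target-organism counts with excess
# zeros (zero-inflated Poisson), one building block of a prospective
# power study. All parameter values are invented.
import numpy as np

rng = np.random.default_rng(123)

def sim_zip_counts(n, mean, p_zero, rng):
    """Zero-inflated Poisson: structural zeros with probability p_zero."""
    counts = rng.poisson(mean, size=n)
    structural_zero = rng.random(n) < p_zero
    counts[structural_zero] = 0
    return counts

gm_plot = sim_zip_counts(n=50, mean=4.0, p_zero=0.3, rng=rng)
comparator = sim_zip_counts(n=50, mean=4.0, p_zero=0.3, rng=rng)
print("GM plots:   mean =", gm_plot.mean())
print("comparator: mean =", comparator.mean())
print("zero fraction in GM plots =", (gm_plot == 0).mean())
```

    Repeating such simulations under a postulated effect size and counting how often the difference or equivalence test succeeds gives the prospective power the record recommends computing before the trial.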

  17. The problem of low variance voxels in statistical parametric mapping; a new hat avoids a 'haircut'.

    Science.gov (United States)

    Ridgway, Gerard R; Litvak, Vladimir; Flandin, Guillaume; Friston, Karl J; Penny, Will D

    2012-02-01

    Statistical parametric mapping (SPM) locates significant clusters based on a ratio of signal to noise (a 'contrast' of the parameters divided by its standard error), meaning that very low noise regions, for example outside the brain, can attain artefactually high statistical values. Similarly, the commonly applied preprocessing step of Gaussian spatial smoothing can shift the peak statistical significance away from the peak of the contrast and towards regions of lower variance. These problems have previously been identified in positron emission tomography (PET) (Reimold et al., 2006) and voxel-based morphometry (VBM) (Acosta-Cabronero et al., 2008), but can also appear in functional magnetic resonance imaging (fMRI) studies. Additionally, for source-reconstructed magneto- and electro-encephalography (M/EEG), the problems are particularly severe because sparsity-favouring priors constrain meaningfully large signal and variance to a small set of compactly supported regions within the brain. Acosta-Cabronero et al. (2008) suggested adding noise to background voxels (the 'haircut'), effectively increasing their noise variance, but at the cost of contaminating neighbouring regions with the added noise once smoothed. Following theory and simulations, we propose to modify, directly and solely, the noise variance estimate, and investigate this solution on real imaging data from a range of modalities. Copyright © 2011 Elsevier Inc. All rights reserved.
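
    A toy version of modifying the noise variance estimate directly (rather than adding noise, as in the 'haircut') can be sketched as a variance floor; the data and the choice of floor are illustrative only, not the authors' implementation:

```python
# Hedged toy illustration: a floor on voxelwise noise-variance
# estimates tames artefactually large statistics in near-zero-
# variance (e.g. background) voxels.
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 1000
effect = rng.normal(0.0, 1.0, n_voxels)          # contrast estimates
var_hat = rng.uniform(0.0001, 1.0, n_voxels)     # some near-zero variances

naive_t = effect / np.sqrt(var_hat)              # explodes where var ~ 0

floor = np.percentile(var_hat, 10)               # illustrative floor choice
stat = effect / np.sqrt(np.maximum(var_hat, floor))

print("max |naive statistic|  :", np.abs(naive_t).max())
print("max |floored statistic|:", np.abs(stat).max())
```

    Because the floor only raises denominators, every statistic shrinks or stays the same, so signal-rich voxels with ordinary variance are untouched while background voxels lose their artefactual peaks.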

  18. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2003-01-01

    Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analysis and fits have been investigated for the following distribution types: Normal, Lognormal, 2 parameter Weibull and 3-parameter Weibull...... fits to the data available, especially if tail fits are used whereas the Log Normal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications on the reliability level of typical structural elements and on partial safety factors...... for timber are investigated....
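
    Fits of the distribution types named in the record can be sketched with SciPy's maximum-likelihood fitting on synthetic strength data (the study itself also considered a 3-parameter Weibull and tail fits, which are omitted here):

```python
# Hedged sketch: Normal, Lognormal and 2-parameter Weibull fits to
# synthetic 'bending strength' data, compared via the K-S statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
strength = rng.weibull(5.0, size=200) * 40.0     # synthetic MPa values

fits = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "weibull_2p": stats.weibull_min,
}
for name, dist in fits.items():
    if name == "weibull_2p":
        params = dist.fit(strength, floc=0)      # fix location at zero
    else:
        params = dist.fit(strength)
    ks = stats.kstest(strength, dist.cdf, args=params).statistic
    print(f"{name:10s} K-S statistic = {ks:.3f}")
```

    Comparing the fitted distribution family matters for reliability work because the lower tail, which drives partial safety factors, differs strongly between the Lognormal and Weibull families even when the bulk fits look similar.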

  19. Testing statistical hypotheses of equivalence

    CERN Document Server

    Wellek, Stefan

    2010-01-01

    Equivalence testing has grown significantly in importance over the last two decades, especially as its relevance to a variety of applications has become understood. Yet published work on the general methodology remains scattered in specialists' journals, and for the most part, it focuses on the relatively narrow topic of bioequivalence assessment. With a far broader perspective, Testing Statistical Hypotheses of Equivalence provides the first comprehensive treatment of statistical equivalence testing. The author addresses a spectrum of specific, two-sided equivalence testing problems, from the

  20. The SU(1, 1) Perelomov number coherent states and the non-degenerate parametric amplifier

    Energy Technology Data Exchange (ETDEWEB)

    Ojeda-Guillén, D., E-mail: dojedag@ipn.mx; Granados, V. D. [Escuela Superior de Física y Matemáticas, Instituto Politécnico Nacional, Ed. 9, Unidad Profesional Adolfo López Mateos, C.P. 07738 México D. F. (Mexico); Mota, R. D. [Escuela Superior de Ingeniería Mecánica y Eléctrica, Unidad Culhuacán, Instituto Politécnico Nacional, Av. Santa Ana No. 1000, Col. San Francisco Culhuacán, Delegación Coyoacán, C.P. 04430, México D. F. (Mexico)

    2014-04-15

    We construct the Perelomov number coherent states for an arbitrary su(1, 1) group operation and study some of their properties. We introduce three operators which act on Perelomov number coherent states and close the su(1, 1) Lie algebra. By using the tilting transformation we apply our results to obtain the energy spectrum and eigenfunctions of the non-degenerate parametric amplifier. We show that these eigenfunctions are the Perelomov number coherent states of the two-dimensional harmonic oscillator.

  1. Statistical hypothesis testing with SAS and R

    CERN Document Server

    Taeger, Dirk

    2014-01-01

    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the

  2. Hardware-Software Complex for Functional and Parametric Tests of ARM Microcontrollers STM32F1XX

    Directory of Open Access Journals (Sweden)

    Egorov Aleksey

    2016-01-01

    Full Text Available The article presents a hardware-software complex for functional and parametric tests of ARM microcontrollers STM32F1XX. The complex is based on PXI devices by National Instruments and the LabVIEW software environment. The data exchange procedure between a microcontroller under test and the complex hardware is described. Some test results are also presented.

  3. Statistical modelling approach to derive quantitative nanowastes classification index; estimation of nanomaterials exposure

    CSIR Research Space (South Africa)

    Ntaka, L

    2013-08-01

    Full Text Available In this work, statistical inference approaches, specifically non-parametric bootstrapping and a linear model, were applied. Data used to develop the model were sourced from the literature. 104 data points with information on aggregation, natural organic matter...
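
    The non-parametric bootstrap mentioned in the record can be sketched as follows; the 104 synthetic data points stand in for the literature-derived values, which are not reproduced here:

```python
# Hedged sketch: percentile bootstrap confidence interval for a mean
# exposure index, resampling a small dataset with replacement.
import numpy as np

rng = np.random.default_rng(11)
# 104 synthetic positive values, matching the record's sample size only
data = rng.lognormal(mean=1.0, sigma=0.6, size=104)

n_boot = 2000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(n_boot)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {data.mean():.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```

    The appeal of the bootstrap in this setting is that it makes no distributional assumption about the literature data, which is useful when the sample is small and skewed.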

  4. Parametric analysis of ATM solar array.

    Science.gov (United States)

    Singh, B. K.; Adkisson, W. B.

    1973-01-01

    The paper discusses the methods used for the calculation of ATM solar array performance characteristics and provides the parametric analysis of solar panels used in SKYLAB. To predict the solar array performance under conditions other than test conditions, a mathematical model has been developed. Four computer programs have been used to convert the solar simulator test data to the parametric curves. The first performs module summations, the second determines average solar cell characteristics which will cause a mathematical model to generate a curve matching the test data, the third is a polynomial fit program which determines the polynomial equations for the solar cell characteristics versus temperature, and the fourth program uses the polynomial coefficients generated by the polynomial curve fit program to generate the parametric data.
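
    The polynomial fitting step performed by the third program can be sketched with numpy.polyfit; the temperature points and voltage values below are invented, not the SKYLAB solar simulator data:

```python
# Hedged sketch: fitting a polynomial to a solar-cell characteristic
# versus temperature and evaluating it at an off-grid temperature.
import numpy as np

temp_c = np.array([-20.0, 0.0, 20.0, 40.0, 60.0, 80.0])
# Hypothetical open-circuit voltage falling roughly linearly with T
voc = np.array([0.66, 0.62, 0.58, 0.545, 0.51, 0.47])

coeffs = np.polyfit(temp_c, voc, deg=2)    # quadratic in temperature
voc_at_25 = np.polyval(coeffs, 25.0)
print("polynomial coefficients:", coeffs)
print(f"predicted Voc at 25 C: {voc_at_25:.3f} V")
```

    Storing such coefficients, as the fourth program in the record does, lets the parametric model predict array performance at temperatures that were never measured directly.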

  5. Brain SPECT analysis using statistical parametric mapping in patients with posttraumatic stress disorder

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Euy Neyng; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Yang, Dong Won [College of Medicine, The Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2001-07-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with posttraumatic stress disorder (PTSD) using statistical parametric mapping (SPM99). Noninvasive rCBF measurements using {sup 99m}Tc-ethyl cysteinate dimer (ECD) SPECT were performed on 23 patients with PTSD and 21 age-matched normal controls without re-exposure to accident-related stimuli. The relative rCBF maps in patients with PTSD and controls were compared. In patients with PTSD, significantly increased rCBF was found along the limbic system in the brain. There were a few foci of decreased rCBF in the superior frontal gyrus and the parietal and temporal regions. PTSD is associated with increased rCBF in limbic areas compared with age-matched normal controls. These findings implicate regions of the limbic brain, which may mediate the response to aversive stimuli in healthy individuals, as playing an important role in patients suffering from PTSD, and suggest ongoing hyperfunction of an 'overlearned survival response', or flashback response, in these regions after painful, life-threatening, or horrifying events, even without re-exposure to the same traumatic stimulus.

  6. Brain SPECT analysis using statistical parametric mapping in patients with posttraumatic stress disorder

    International Nuclear Information System (INIS)

    Kim, Euy Neyng; Sohn, Hyung Sun; Kim, Sung Hoon; Chung, Soo Kyo; Yang, Dong Won

    2001-01-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with posttraumatic stress disorder (PTSD) using statistical parametric mapping (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 23 patients with PTSD and 21 age-matched normal controls without re-exposure to accident-related stimuli. The relative rCBF maps in patients with PTSD and controls were compared. In patients with PTSD, significantly increased rCBF was found along the limbic system in the brain. There were a few foci of decreased rCBF in the superior frontal gyrus and the parietal and temporal regions. PTSD is associated with increased rCBF in limbic areas compared with age-matched normal controls. These findings implicate regions of the limbic brain, which may mediate the response to aversive stimuli in healthy individuals, as playing an important role in patients suffering from PTSD, and suggest ongoing hyperfunction of an 'overlearned survival response', or flashback response, in these regions after painful, life-threatening, or horrifying events, even without re-exposure to the same traumatic stimulus.

  7. Parametric and non-parametric models for lifespan modeling of insulation systems in electrical machines

    OpenAIRE

    Salameh , Farah; Picot , Antoine; Chabert , Marie; Maussion , Pascal

    2017-01-01

    International audience; This paper describes an original statistical approach for the lifespan modeling of electric machine insulation materials. The presented models aim to study the effect of three main stress factors (voltage, frequency and temperature) and their interactions on the insulation lifespan. The proposed methodology is applied to two different insulation materials tested in partial discharge regime. Accelerated ageing tests are organized according to experimental optimization m...

  8. Motivations of parametric studies

    International Nuclear Information System (INIS)

    Birac, C.

    1988-01-01

    The paper concerns the motivations of parametric studies in connection with the Programme for the Inspection of Steel Components PISC II. The objective of the PISC II exercise is to evaluate the effectiveness of current and advanced NDT techniques for inspection of reactor pressure vessel components. The parametric studies were initiated to determine the influence of some parameters on defect detection and dimensioning, and to increase the technical bases of the Round Robin Tests. A description is given of the content of the parametric studies including:- the effect of the defects' characteristics, the effect of equipment characteristics, the effect of cladding, and possible use of electromagnetic techniques. (U.K.)

  9. L2 Writing in Test and Non-test Situations: Process and Product

    Directory of Open Access Journals (Sweden)

    Baraa Khuder

    2015-02-01

    Full Text Available Test writers sometimes complain they cannot perform to their true abilities because of time constraints. We therefore examined differences in terms of process and product between texts produced under test and non-test conditions. Ten L2 postgraduates wrote two argumentative essays, one under test conditions, with only forty minutes being allowed and without recourse to resources, and one under non-test conditions, with unlimited time as well as access to the Internet. Keystroke logging, screen capture software, and stimulated recall protocols were used, participants explaining and commenting on their writing processes. Sixteen writing process types were identified. Higher proportions of the processes of translation and surface revision were recorded in the test situation, while meaningful revision and evaluation were both higher in the non-test situation. There was a statistically significant difference between time allocation for different processes at different stages. Experienced teachers awarded the non-test texts a mean score of almost one point (0.8) higher. A correlational analysis examining the relationship between writing process and product quality showed that while the distribution of writing processes can have an impact on text quality in the test situation, it had no effect on the product in the non-test situation.

  10. Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.

    Science.gov (United States)

    Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J

    2017-10-20

    This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to differentiate benign versus malignant breast lesions non-invasively. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on the radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of the mean, contrast, correlation, energy and homogeneity features of the parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences between benign and malignant lesions. A feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in the clinic.
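The leave-one-patient-out cross-validation used in this record can be sketched with a deliberately simple nearest-centroid classifier. Everything below is hypothetical stand-in data (two synthetic texture features, balanced classes), not the QUS biomarkers from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in features: two texture biomarkers per lesion.
X_benign = rng.normal([0.0, 0.0], 1.0, size=(39, 2))
X_malig = rng.normal([2.5, 2.5], 1.0, size=(39, 2))
X = np.vstack([X_benign, X_malig])
y = np.array([0] * 39 + [1] * 39)

def loo_nearest_centroid(X, y):
    """Leave-one-out CV: refit class centroids with sample i held out."""
    preds = np.empty(len(y), dtype=int)
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X[mask & (y == 0)].mean(axis=0)
        c1 = X[mask & (y == 1)].mean(axis=0)
        preds[i] = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
    return preds

preds = loo_nearest_centroid(X, y)
sens = (preds[y == 1] == 1).mean()   # fraction of malignant found
spec = (preds[y == 0] == 0).mean()   # fraction of benign found
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Holding out the evaluated patient before refitting is what keeps the reported sensitivity and specificity honest; scoring on training data would inflate both.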

  11. TOWARD HIGH-PRECISION SEISMIC STUDIES OF WHITE DWARF STARS: PARAMETRIZATION OF THE CORE AND TESTS OF ACCURACY

    Energy Technology Data Exchange (ETDEWEB)

    Giammichele, N.; Fontaine, G.; Brassard, P. [Département de Physique, Université de Montréal, Montréal, QC H3C 3J7 (Canada); Charpinet, S. [Université de Toulouse, UPS-OMP, IRAP, Toulouse F-31400 (France)

    2017-01-10

    We present a prescription for parametrizing the chemical profile in the core of white dwarfs in light of the recent discovery that pulsation modes may sometimes be deeply confined in some cool pulsating white dwarfs. Such modes may be used as unique probes of the complicated chemical stratification that results from several processes that occurred in previous evolutionary phases of intermediate-mass stars. This effort is part of our ongoing quest for more credible and realistic seismic models of white dwarfs using static, parametrized equilibrium structures. Inspired by successful techniques developed in design optimization fields (such as aerodynamics), we exploit Akima splines for the tracing of the chemical profile of oxygen (carbon) in the core of a white dwarf model. A series of tests is then presented to better assess the precision and significance of the results that can be obtained in an asteroseismological context. We also show that the new parametrization passes an essential basic test, as it successfully reproduces the chemical stratification of a full evolutionary model.

  12. TOWARD HIGH-PRECISION SEISMIC STUDIES OF WHITE DWARF STARS: PARAMETRIZATION OF THE CORE AND TESTS OF ACCURACY

    International Nuclear Information System (INIS)

    Giammichele, N.; Fontaine, G.; Brassard, P.; Charpinet, S.

    2017-01-01

    We present a prescription for parametrizing the chemical profile in the core of white dwarfs in light of the recent discovery that pulsation modes may sometimes be deeply confined in some cool pulsating white dwarfs. Such modes may be used as unique probes of the complicated chemical stratification that results from several processes that occurred in previous evolutionary phases of intermediate-mass stars. This effort is part of our ongoing quest for more credible and realistic seismic models of white dwarfs using static, parametrized equilibrium structures. Inspired by successful techniques developed in design optimization fields (such as aerodynamics), we exploit Akima splines for the tracing of the chemical profile of oxygen (carbon) in the core of a white dwarf model. A series of tests is then presented to better assess the precision and significance of the results that can be obtained in an asteroseismological context. We also show that the new parametrization passes an essential basic test, as it successfully reproduces the chemical stratification of a full evolutionary model.

  13. Introductory statistics for the behavioral sciences

    CERN Document Server

    Welkowitz, Joan; Cohen, Jacob

    1971-01-01

    Introductory Statistics for the Behavioral Sciences provides an introduction to statistical concepts and principles. This book emphasizes the robustness of parametric procedures wherein such significant tests as t and F yield accurate results even if such assumptions as equal population variances and normal population distributions are not well met. Organized into three parts encompassing 16 chapters, this book begins with an overview of the rationale upon which much of behavioral science research is based, namely, drawing inferences about a population based on data obtained from a samp
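The robustness claim summarized here can be checked by simulation: draw both samples from a skewed (exponential) population with equal means and count how often a t test rejects at the nominal 5% level. This is a sketch, not from the book; the critical value 2.00 approximates the two-sided 5% point of a t distribution with roughly 58 degrees of freedom:

```python
import numpy as np

rng = np.random.default_rng(1)

def welch_t(a, b):
    """Welch's t statistic (unequal-variances form)."""
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

# Empirical Type I error of the t test under a skewed null distribution.
n, reps, crit = 30, 4000, 2.00   # crit ~ t_{.975} at ~58 df
rejections = 0
for _ in range(reps):
    a = rng.exponential(1.0, n)
    b = rng.exponential(1.0, n)
    if abs(welch_t(a, b)) > crit:
        rejections += 1
rate = rejections / reps
print(f"empirical Type I error: {rate:.3f}")
```

With equal sample sizes and identical populations, the empirical rejection rate stays close to the nominal 0.05 despite the non-normality, which is the sense of "robustness" the book argues for.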

  14. Applications of quantum entropy to statistics

    International Nuclear Information System (INIS)

    Silver, R.N.; Martz, H.F.

    1994-01-01

    This paper develops two generalizations of the maximum entropy (ME) principle. First, Shannon classical entropy is replaced by von Neumann quantum entropy to yield a broader class of information divergences (or penalty functions) for statistics applications. Negative relative quantum entropy enforces convexity, positivity, non-local extensivity and prior correlations such as smoothness. This enables the extension of ME methods from their traditional domain of ill-posed inverse problems to new applications such as non-parametric density estimation. Second, given a choice of information divergence, a combination of ME and Bayes rule is used to assign both prior and posterior probabilities. Hyperparameters are interpreted as Lagrange multipliers enforcing constraints. Conservation principles are proposed to set statistical regularization and other hyperparameters, such as conservation of information and smoothness. ME provides an alternative to hierarchical Bayes methods.
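The classical ME principle that this paper generalizes can be illustrated with Jaynes' loaded-die problem: among all distributions on {1..6} with a prescribed mean of 4.5, the entropy maximizer has the Gibbs form p_i ∝ exp(-λi). A sketch solving for λ by bisection (the die example is a standard illustration, not from this paper):

```python
import numpy as np

vals = np.arange(1, 7)

def mean_for(lam):
    """Mean of the Gibbs-form distribution p_i ∝ exp(-lam * i)."""
    w = np.exp(-lam * vals)
    p = w / w.sum()
    return (p * vals).sum()

# mean_for is decreasing in lam; bracket the root and bisect.
target = 4.5
lo, hi = -5.0, 5.0       # mean_for(-5) ~ 6 > target > mean_for(5) ~ 1
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > target:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = np.exp(-lam * vals)
p /= p.sum()
print(np.round(p, 4))
```

Since the target mean exceeds the uniform mean of 3.5, λ comes out negative and the solution tilts probability toward the high faces while staying as "uninformative" as the constraint allows.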

  15. Parametric model measurement: reframing traditional measurement ideas in neuropsychological practice and research.

    Science.gov (United States)

    Brown, Gregory G; Thomas, Michael L; Patt, Virginie

    Neuropsychology is an applied measurement field with its psychometric work primarily built upon classical test theory (CTT). We describe a series of psychometric models to supplement the use of CTT in neuropsychological research and test development. We introduce increasingly complex psychometric models as measurement algebras, which include model parameters that represent abilities and item properties. Within this framework of parametric model measurement (PMM), neuropsychological assessment involves the estimation of model parameters with ability parameter values assuming the role of test 'scores'. Moreover, the traditional notion of measurement error is replaced by the notion of parameter estimation error, and the definition of reliability becomes linked to notions of item and test information. The more complex PMM approaches incorporate into the assessment of neuropsychological performance formal parametric models of behavior validated in the experimental psychology literature, along with item parameters. These PMM approaches endorse the use of experimental manipulations of model parameters to assess a test's construct representation. Strengths and weaknesses of these models are evaluated by their implications for measurement error conditional upon ability level, sensitivity to sample characteristics, computational challenges to parameter estimation, and construct validity. A family of parametric psychometric models can be used to assess latent processes of interest to neuropsychologists. By modeling latent abilities at the item level, psychometric studies in neuropsychology can investigate construct validity and measurement precision within a single framework and contribute to a unification of statistical methods within the framework of generalized latent variable modeling.
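A minimal instance of the parametric model measurement idea is the one-parameter logistic (Rasch) model, in which the ability parameter θ plays the role of the test "score" and is estimated by maximum likelihood. The item difficulties and response pattern below are hypothetical, and the grid search stands in for a proper optimizer:

```python
import numpy as np

# Hypothetical item difficulties and one examinee's responses (1 = correct).
b = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])
resp = np.array([1, 1, 1, 0, 0])

def log_lik(theta, b, resp):
    """Rasch log-likelihood: P(correct) = logistic(theta - b)."""
    p = 1.0 / (1.0 + np.exp(-(theta - b)))
    return np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))

# The ability estimate is the model parameter, not a raw sum score.
grid = np.linspace(-4, 4, 801)
theta_hat = grid[np.argmax([log_lik(t, b, resp) for t in grid])]
print(f"ML ability estimate: {theta_hat:.2f}")
```

Measurement error here is parameter estimation error: the curvature of the log-likelihood around θ̂ (the test information) determines the precision of the estimate, which is the reliability notion the abstract refers to.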

  16. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping

    International Nuclear Information System (INIS)

    McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W

    2008-01-01

    Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. The suitability of SPM for application to the experimental model and ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques
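The voxel-level comparison behind SPM can be sketched as a two-sample t statistic computed at every voxel, followed by thresholding. The image sizes, group sizes, and the injected focal hypoperfusion below are hypothetical, and this omits SPM's smoothing and multiple-comparison correction:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical perfusion maps: 10 sham vs. 10 injured animals.
shape = (16, 16)
sham = rng.normal(100, 5, size=(10, *shape))
tbi = rng.normal(100, 5, size=(10, *shape))
tbi[:, 4:8, 4:8] -= 25           # simulated focal hypoperfusion

def voxelwise_t(g1, g2):
    """Welch-type two-sample t statistic at every voxel (g2 minus g1)."""
    m1, m2 = g1.mean(0), g2.mean(0)
    v1, v2 = g1.var(0, ddof=1), g2.var(0, ddof=1)
    return (m2 - m1) / np.sqrt(v1 / len(g1) + v2 / len(g2))

tmap = voxelwise_t(sham, tbi)
hypo = tmap < -3.0               # crude per-voxel threshold
print("flagged voxels:", hypo.sum())
```

The flagged cluster coincides with the simulated lesion, which is the same logic by which SPM localized the area of hypoperfusion in the injured group.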

  17. Behavioral investment strategy matters: a statistical arbitrage approach

    OpenAIRE

    Sun, David; Tsai, Shih-Chuan; Wang, Wei

    2011-01-01

    In this study, we employ a statistical arbitrage approach to demonstrate that momentum investment strategies tend to work better in periods longer than six months, a result that differs from findings in the past literature. Compared with standard parametric tests, the statistical arbitrage method shows more clearly that momentum strategies work only over longer formation and holding periods. They also yield positive significant returns in an up market, but negative yet insignificant returns in a down...

  18. Statistical Analysis of Data with Non-Detectable Values

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L.

    2004-08-26

    Environmental exposure measurements are, in general, positive and may be subject to left censoring, i.e. the measured value is less than a "limit of detection". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. A basic problem of interest in environmental risk assessment is to determine if the mean concentration of an analyte is less than a prescribed action level. Parametric methods, used to determine acceptable levels of exposure, are often based on a two parameter lognormal distribution. The mean exposure level and/or an upper percentile (e.g. the 95th percentile) are used to characterize exposure levels, and upper confidence limits are needed to describe the uncertainty in these estimates. In certain situations it is of interest to estimate the probability of observing a future (or "missed") value of a lognormal variable. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left censored lognormal data are described and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on the 95th percentile (i.e. the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known but computational complexity has limited their use in routine data analysis with left censored data. The recent development of the R environment for statistical
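The censored-lognormal likelihood described here can be sketched directly: a detected value contributes the lognormal density, while a non-detect contributes the probability of falling below the detection limit. The parameters, detection limit, and the coarse grid search (standing in for a proper optimizer) are all illustrative:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical exposure data with a detection limit of 0.5.
mu_true, sig_true, limit = 0.0, 1.0, 0.5
x = rng.lognormal(mu_true, sig_true, 200)
detected = x >= limit

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def neg_loglik(mu, sig):
    ll = 0.0
    for xi, det in zip(x, detected):
        if det:   # observed value: lognormal log-density
            z = (math.log(xi) - mu) / sig
            ll += -math.log(sig * xi) - 0.5 * z * z - 0.5 * math.log(2 * math.pi)
        else:     # non-detect: log P(X < limit)
            ll += math.log(norm_cdf((math.log(limit) - mu) / sig))
    return -ll

# Coarse grid search stands in for a proper optimiser in this sketch.
grid_mu = np.linspace(-1.0, 1.0, 41)
grid_sig = np.linspace(0.5, 2.0, 31)
mu_hat, sig_hat = min(
    ((m, s) for m in grid_mu for s in grid_sig),
    key=lambda p: neg_loglik(*p),
)
print(f"mu_hat={mu_hat:.2f}, sig_hat={sig_hat:.2f}")
```

Discarding or substituting the non-detects (e.g. limit/2) biases the estimates; keeping them in the likelihood as censored observations is the point of the ML approach the report describes.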

  19. Evaluating Two Models of Collaborative Tests in an Online Introductory Statistics Course

    Science.gov (United States)

    Björnsdóttir, Auðbjörg; Garfield, Joan; Everson, Michelle

    2015-01-01

    This study explored the use of two different types of collaborative tests in an online introductory statistics course. A study was designed and carried out to investigate three research questions: (1) What is the difference in students' learning between using consensus and non-consensus collaborative tests in the online environment?, (2) What is…

  20. Polarimetric Segmentation Using Wishart Test Statistic

    DEFF Research Database (Denmark)

    Skriver, Henning; Schou, Jesper; Nielsen, Allan Aasbjerg

    2002-01-01

    A newly developed test statistic for equality of two complex covariance matrices following the complex Wishart distribution and an associated asymptotic probability for the test statistic has been used in a segmentation algorithm. The segmentation algorithm is based on the MUM (merge using moments) approach, which is a merging algorithm for single channel SAR images. The polarimetric version described in this paper uses the above-mentioned test statistic for merging. The segmentation algorithm has been applied to polarimetric SAR data from the Danish dual-frequency, airborne polarimetric SAR, EMISAR...
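A likelihood-ratio statistic of this type can be sketched as follows for two Wishart-distributed matrix sums X ~ W(p, n) and Y ~ W(p, m). The form below follows the standard equality-of-covariance likelihood ratio (shown here for real matrices; the complex-Wishart case and the small-sample correction factor used in the paper are omitted):

```python
import numpy as np

def ln_q(X, Y, n, m):
    """ln of the likelihood ratio Q for equality of the two underlying
    covariance matrices; Q <= 1, with Q = 1 when the evidence is identical."""
    p = X.shape[0]
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    _, ld_s = np.linalg.slogdet(X + Y)
    return (p * (n + m) * np.log(n + m) - p * n * np.log(n)
            - p * m * np.log(m) + n * ld_x + m * ld_y - (n + m) * ld_s)

# Identical matrix sums with equal look counts give Q = 1, i.e. lnQ = 0.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
print(ln_q(A, A, 10, 10))   # ~0.0
```

In a merge test, -2·lnQ is compared against an asymptotic chi-square threshold: small values (Q near 1) mean the two segments' covariances are statistically indistinguishable and the segments may be merged.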

  1. Parametric Method Performance for Dynamic 3'-Deoxy-3'-18F-Fluorothymidine PET/CT in Epidermal Growth Factor Receptor-Mutated Non-Small Cell Lung Carcinoma Patients Before and During Therapy.

    Science.gov (United States)

    Kramer, Gerbrand Maria; Frings, Virginie; Heijtel, Dennis; Smit, E F; Hoekstra, Otto S; Boellaard, Ronald

    2017-06-01

    The objective of this study was to validate several parametric methods for quantification of 3'-deoxy-3'-18F-fluorothymidine (18F-FLT) PET in advanced-stage non-small cell lung carcinoma (NSCLC) patients with an activating epidermal growth factor receptor mutation who were treated with gefitinib or erlotinib. Furthermore, we evaluated the impact of noise on the accuracy and precision of the parametric analyses of dynamic 18F-FLT PET/CT to assess the robustness of these methods. Methods: Ten NSCLC patients underwent dynamic 18F-FLT PET/CT at baseline and 7 and 28 d after the start of treatment. Parametric images were generated using plasma input Logan graphic analysis and 2 basis functions-based methods: a 2-tissue-compartment basis function model (BFM) and spectral analysis (SA). Whole-tumor-averaged parametric pharmacokinetic parameters were compared with those obtained by nonlinear regression of the tumor time-activity curve using a reversible 2-tissue-compartment model with blood volume fraction. In addition, 2 statistically equivalent datasets were generated by countwise splitting the original list-mode data, each containing 50% of the total counts. Both new datasets were reconstructed, and parametric pharmacokinetic parameters were compared between the 2 replicates and the original data. Results: After the settings of each parametric method were optimized, distribution volumes (VT) obtained with Logan graphic analysis, BFM, and SA all correlated well with those derived using nonlinear regression at baseline and during therapy (R2 ≥ 0.94; intraclass correlation coefficient > 0.97). SA-based VT images were most robust to increased noise at the voxel level (repeatability coefficient, 16% vs. >26%). Yet BFM generated the most accurate K1 values (R2 = 0.94; intraclass correlation coefficient, 0.96). Parametric K1 data showed larger variability in general; however, no differences were found in robustness between methods (repeatability coefficient, 80
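The Logan graphical analysis named above can be sketched on a simulated reversible one-tissue model, for which the Logan plot is linear with slope equal to the distribution volume VT = K1/k2. The input function and rate constants are hypothetical, and the one-tissue model is a simplification of the two-tissue model used in the study:

```python
import numpy as np

# Reversible one-tissue model: Ct'(t) = K1*Cp(t) - k2*Ct(t), Ct(0) = 0.
K1, k2, lam = 0.3, 0.15, 0.05
t = np.linspace(0.01, 120.0, 4000)
Cp = np.exp(-lam * t)                                     # plasma input
Ct = K1 / (k2 - lam) * (np.exp(-lam * t) - np.exp(-k2 * t))  # closed form

def cumtrapz(y, t):
    """Cumulative trapezoidal integral starting at zero."""
    dt = np.diff(t)
    return np.concatenate([[0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)])

# Logan transform: int(Ct)/Ct = VT * int(Cp)/Ct - 1/k2.
x = cumtrapz(Cp, t) / Ct
yv = cumtrapz(Ct, t) / Ct
late = t > 60.0                    # fit the late, linear portion
slope = np.polyfit(x[late], yv[late], 1)[0]
print(f"Logan VT estimate: {slope:.2f} (true {K1 / k2:.2f})")
```

The slope recovers VT without fitting individual rate constants, which is why Logan analysis is attractive for parametric (voxelwise) imaging, at the cost of the noise-induced bias the study evaluates.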

  2. An Evaluation of Parametric and Nonparametric Models of Fish Population Response.

    Energy Technology Data Exchange (ETDEWEB)

    Haas, Timothy C.; Peterson, James T.; Lee, Danny C.

    1999-11-01

    Predicting the distribution or status of animal populations at large scales often requires the use of broad-scale information describing landforms, climate, vegetation, etc. These data, however, often consist of mixtures of continuous and categorical covariates and nonmultiplicative interactions among covariates, complicating statistical analyses. Using data from the interior Columbia River Basin, USA, we compared four methods for predicting the distribution of seven salmonid taxa using landscape information. Subwatersheds (mean size, 7800 ha) were characterized using a set of 12 covariates describing physiography, vegetation, and current land-use. The techniques included generalized logit modeling, classification trees, a nearest neighbor technique, and a modular neural network. We evaluated model performance using out-of-sample prediction accuracy via leave-one-out cross-validation and introduce a computer-intensive Monte Carlo hypothesis testing approach for examining the statistical significance of landscape covariates with the non-parametric methods. We found the modular neural network and the nearest-neighbor techniques to be the most accurate, but they were difficult to summarize in ways that provided ecological insight. The modular neural network also required the most extensive computer resources for model fitting and hypothesis testing. The generalized logit models were readily interpretable, but were the least accurate, possibly due to nonlinear relationships and nonmultiplicative interactions among covariates. Substantial overlap among the statistically significant (P<0.05) covariates for each method suggested that each is capable of detecting similar relationships between responses and covariates. Consequently, we believe that employing one or more methods may provide greater biological insight without sacrificing prediction accuracy.
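The Monte Carlo significance idea for non-parametric predictors can be sketched as a permutation test: measure leave-one-out accuracy of a 1-nearest-neighbour classifier, then permute a covariate to build a null distribution of accuracies. The data, classifier, and permutation count below are hypothetical stand-ins for the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical data: x0 is informative for y, x1 is pure noise.
n = 60
x0 = rng.normal(0, 1, n)
x1 = rng.normal(0, 1, n)
y = (x0 > 0).astype(int)
X = np.column_stack([x0, x1])

def loo_1nn_accuracy(X, y):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)      # exclude each point from its own fit
    return (y[d.argmin(axis=1)] == y).mean()

obs = loo_1nn_accuracy(X, y)
null = []
for _ in range(200):                 # permute the covariate under test
    Xp = X.copy()
    Xp[:, 0] = rng.permutation(X[:, 0])
    null.append(loo_1nn_accuracy(Xp, y))
p_val = (np.sum(np.array(null) >= obs) + 1) / (200 + 1)
print(f"LOO accuracy={obs:.2f}, permutation p={p_val:.3f}")
```

Because the null distribution is built by refitting under permutation, the test needs no parametric assumptions about the classifier, which is what makes it usable with neural networks and nearest-neighbour methods.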

  3. Post traumatic brain perfusion SPECT analysis using reconstructed ROI maps of radioactive microsphere derived cerebral blood flow and statistical parametric mapping

    Directory of Open Access Journals (Sweden)

    Gonzalez-Brito Manuel

    2008-02-01

    Full Text Available Abstract Background Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. Methods The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). Results A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Conclusion The suitability of SPM for application to the experimental model and its ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques.

  4. Statistical downscaling of rainfall: a non-stationary and multi-resolution approach

    Science.gov (United States)

    Rashid, Md. Mamunur; Beecham, Simon; Chowdhury, Rezaul Kabir

    2016-05-01

    A novel downscaling technique is proposed in this study whereby the original rainfall and reanalysis variables are first decomposed by wavelet transforms and rainfall is modelled using the semi-parametric additive model formulation of Generalized Additive Model in Location, Scale and Shape (GAMLSS). The flexibility of the GAMLSS model makes it feasible as a framework for non-stationary modelling. Decomposition of a rainfall series into different components is useful to separate the scale-dependent properties of the rainfall as this varies both temporally and spatially. The study was conducted at the Onkaparinga river catchment in South Australia. The model was calibrated over the period 1960 to 1990 and validated over the period 1991 to 2010. The model reproduced the monthly variability and statistics of the observed rainfall well, with Nash-Sutcliffe efficiency (NSE) values of 0.66 and 0.65 for the calibration and validation periods, respectively. It also reproduced the seasonal rainfall well over the calibration (NSE = 0.37) and validation (NSE = 0.69) periods for all seasons. The proposed model was better than the traditional modelling approach (application of GAMLSS to the original rainfall series without decomposition) at reproducing the time-frequency properties of the observed rainfall, and yet it still preserved the statistics produced by the traditional modelling approach. When downscaling models were developed with general circulation model (GCM) historical output datasets, the proposed wavelet-based downscaling model outperformed the traditional downscaling model in terms of reproducing monthly rainfall for both the calibration and validation periods.
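The Nash-Sutcliffe efficiency used to score the models above is NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))²: 1 for a perfect model, 0 for a model no better than the observed mean. A minimal sketch with made-up rainfall values:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency of simulated vs. observed values."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([10.0, 30.0, 25.0, 60.0, 45.0])   # hypothetical monthly rainfall
print(nse(obs, obs))                              # perfect model -> 1.0
print(round(nse(obs, [20.0, 20.0, 20.0, 55.0, 40.0]), 3))
```

Negative NSE values are possible and indicate the simulation is worse than simply predicting the observed mean, which is why NSE = 0.37 for seasonal rainfall is a weak but still informative score.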

  5. Parametric Thinking in Urban Design

    DEFF Research Database (Denmark)

    Steinø, Nicolai

    2010-01-01

    The paper states that most applications of parametric modelling to architecture and urban design fall into one of two strands of either form for form's sake, or the negotiation of environmental concerns, while approaches which allow scenarios to be easily tested and modified without the appli... of the paper. The pros and cons of this simple approach are discussed, and the paper concludes that, while it does not represent a suitable solution in all cases, it fills a gap among the existing approaches to parametric urban design.

  6. Parametric roll resonance monitoring using signal-based detection

    DEFF Research Database (Denmark)

    Galeazzi, Roberto; Blanke, Mogens; Falkenberg, Thomas

    2015-01-01

    Extreme roll motion of ships can be caused by several phenomena, one of which is parametric roll resonance. Several incidents occurred unexpectedly around the millennium and caused vast fiscal losses on large container vessels. The phenomenon is now well understood and some consider parametric roll... algorithms in real conditions, and to evaluate the frequency of parametric roll events on the selected vessels. Detection performance is scrutinised through the validation of the detected events using owners' standard methods, and supported by available wave radar data. Further, a bivariate statistical... analysis of the outcome of the signal-based detectors is performed to assess the real life false alarm probability. It is shown that detection robustness and very low false warning rates are obtained. The study concludes that small parametric roll events are occurring, and that the proposed signal...

  7. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
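The traditional statistic referred to here is the G statistic, G = 2·Σ O·ln(O/E), which is asymptotically chi-square with k − 1 degrees of freedom for a k-cell multinomial. A minimal sketch with made-up counts (this shows the traditional form, not the paper's simplification):

```python
import math

def g_statistic(observed, expected):
    """Likelihood ratio goodness-of-fit statistic G = 2 * sum(O * ln(O/E))."""
    return 2.0 * sum(o * math.log(o / e)
                     for o, e in zip(observed, expected) if o > 0)

# Hypothetical counts tested against a uniform null for n = 100.
observed = [25, 30, 45]
expected = [100 / 3] * 3
print(round(g_statistic(observed, expected), 3))
```

The guard `if o > 0` uses the convention 0·ln(0) = 0, so empty cells contribute nothing; the resulting value is compared against a chi-square critical value with 2 degrees of freedom here.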

  8. Optimal parametric modelling of measured short waves

    Digital Repository Service at National Institute of Oceanography (India)

    Mandal, S.

    the importance of selecting a suitable sampling interval for better estimates of parametric modelling and also for better statistical representation. Implementation of the above algorithms in a structural monitoring system has the potential advantage of storing...

  9. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    Science.gov (United States)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android Statistics Data Analysis application that can be accessed through mobile devices, making it easier for users to access statistical tools. The application covers various basic statistics topics along with parametric statistical data analysis. The output of the system is a parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language; the server side uses PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology is the Waterfall model, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lectures and to help students understand statistical analysis on mobile devices.

  10. Brain SPECT analysis using statistical parametric mapping in patients with transient global amnesia

    Energy Technology Data Exchange (ETDEWEB)

    Kim, E. N.; Sohn, H. S.; Kim, S. H; Chung, S. K.; Yang, D. W. [College of Medicine, The Catholic Univ. of Korea, Seoul (Korea, Republic of)

    2001-07-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with transient global amnesia (TGA) using statistical parametric mapping 99 (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 8 patients with TGA and 17 age-matched controls. The relative rCBF maps of patients with TGA and controls were compared. In patients with TGA, significantly decreased rCBF was found along the left superior temporal region extending to the left parietal region and left thalamus. Areas of increased rCBF were found in the right temporal region, right frontal region, and right thalamus. We could thus demonstrate decreased perfusion in the left cerebral hemisphere and increased perfusion in the right cerebral hemisphere in patients with TGA using SPM99. This reciprocal change of rCBF between the right and left cerebral hemispheres suggests that imbalanced neuronal activity between the bilateral hemispheres may play an important role in the pathogenesis of TGA. For quantitative SPECT analysis in TGA patients, we recommend SPM99 rather than the ROI method because of its definite advantages.

  11. Brain SPECT analysis using statistical parametric mapping in patients with transient global amnesia

    International Nuclear Information System (INIS)

    Kim, E. N.; Sohn, H. S.; Kim, S. H; Chung, S. K.; Yang, D. W.

    2001-01-01

    This study investigated alterations in regional cerebral blood flow (rCBF) in patients with transient global amnesia (TGA) using statistical parametric mapping 99 (SPM99). Noninvasive rCBF measurements using 99mTc-ethyl cysteinate dimer (ECD) SPECT were performed on 8 patients with TGA and 17 age-matched controls. The relative rCBF maps of patients with TGA and controls were compared. In patients with TGA, significantly decreased rCBF was found along the left superior temporal region extending to the left parietal region and left thalamus. Areas of increased rCBF were found in the right temporal region, right frontal region, and right thalamus. We could thus demonstrate decreased perfusion in the left cerebral hemisphere and increased perfusion in the right cerebral hemisphere in patients with TGA using SPM99. This reciprocal change of rCBF between the right and left cerebral hemispheres suggests that imbalanced neuronal activity between the bilateral hemispheres may play an important role in the pathogenesis of TGA. For quantitative SPECT analysis in TGA patients, we recommend SPM99 rather than the ROI method because of its definite advantages.

  12. Is there much variation in variation? Revisiting statistics of small area variation in health services research

    Directory of Open Access Journals (Sweden)

    Ibáñez Berta

    2009-04-01

    Abstract Background The importance of Small Area Variation Analysis for policy-making contrasts with the scarcity of work on the validity of the statistics used in these studies. Our study aims at (1) determining whether variation in utilization rates between health areas is higher than would be expected by chance, (2) estimating the statistical power of the variation statistics, and (3) evaluating the ability of different statistics to compare the variability among different procedures regardless of their rates. Methods Parametric bootstrap techniques were used to derive the empirical distribution of each statistic under the hypothesis of homogeneity across areas. Non-parametric procedures were used to analyze the empirical distribution of the observed statistics and to compare the results across six situations (low/medium/high utilization rates and low/high variability). A small-scale simulation study was conducted to assess the capacity of each statistic to discriminate between scenarios with different degrees of variation. Results Bootstrap techniques proved to be good at quantifying the difference between the null hypothesis and the variation observed in each situation, and at constructing reliable tests and confidence intervals for each of the variation statistics analyzed. Despite the good performance of the Systematic Component of Variation (SCV), the Empirical Bayes (EB) statistic shows better behaviour under the null hypothesis; it is able to detect variability if present, is not influenced by the procedure rate, and is best able to discriminate between different degrees of heterogeneity. Conclusion The EB statistic seems to be a good alternative to more conventional statistics used in small-area variation analysis in health services research because of its robustness.
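The parametric bootstrap idea in this record can be sketched for a single variation statistic. Everything below is an illustrative assumption rather than the authors' exact procedure: the chi-square-like statistic, the Poisson null model for area counts, and all function names and data are invented for the sketch:

```python
import math
import random

def sample_poisson(rng, lam):
    # Knuth's method; adequate for the moderate rates used in this example.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def chi2_stat(counts, pops, pooled_rate):
    """Chi-square-like variation statistic: sum of (O - E)^2 / E over areas,
    with E = pooled_rate * population."""
    return sum((o - pooled_rate * n) ** 2 / (pooled_rate * n)
               for o, n in zip(counts, pops))

def bootstrap_pvalue(counts, pops, n_boot=2000, seed=7):
    """Parametric bootstrap test of 'no area variation': simulate each area's
    count as Poisson(pooled_rate * population) and compare the observed
    statistic with its simulated null distribution."""
    rng = random.Random(seed)
    pooled = sum(counts) / sum(pops)
    observed = chi2_stat(counts, pops, pooled)
    hits = sum(
        chi2_stat([sample_poisson(rng, pooled * n) for n in pops], pops, pooled) >= observed
        for _ in range(n_boot)
    )
    return (hits + 1) / (n_boot + 1)  # add-one correction keeps p > 0

pops = [1000, 1500, 2000, 2500]
homogeneous = [52, 75, 98, 126]    # all areas near a common rate of ~0.05
heterogeneous = [30, 75, 98, 190]  # clear area variation
print(bootstrap_pvalue(homogeneous, pops), bootstrap_pvalue(heterogeneous, pops))
```

The same scaffold works for SCV or EB statistics: only `chi2_stat` changes, which is exactly why the paper can compare several statistics under one bootstrap null.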

  13. Efficient statistically accurate algorithms for the Fokker-Planck equation in large dimensions

    Science.gov (United States)

    Chen, Nan; Majda, Andrew J.

    2018-02-01

    Solving the Fokker-Planck equation for high-dimensional complex turbulent dynamical systems is an important and practical issue. However, most traditional methods suffer from the curse of dimensionality and have difficulties in capturing the fat-tailed, highly intermittent probability density functions (PDFs) of complex systems in turbulence, neuroscience and excitable media. In this article, efficient statistically accurate algorithms are developed for solving both the transient and the equilibrium solutions of Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures. The algorithms involve a hybrid strategy that requires only a small number of ensembles. Here, a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious non-parametric Gaussian kernel density estimation in the remaining low-dimensional subspace. Particularly, the parametric method provides closed analytical formulae for determining the conditional Gaussian distributions in the high-dimensional subspace and is therefore computationally efficient and accurate. The full non-Gaussian PDF of the system is then given by a Gaussian mixture. Different from traditional particle methods, each conditional Gaussian distribution here covers a significant portion of the high-dimensional PDF. Therefore a small number of ensembles is sufficient to recover the full PDF, which overcomes the curse of dimensionality. Notably, the mixture distribution has significant skill in capturing the transient behavior with fat tails of the high-dimensional non-Gaussian PDFs, and this facilitates the algorithms in accurately describing the intermittency and extreme events in complex turbulent systems. It is shown in a stringent set of test problems that the method only requires on the order of O(100) ensembles to successfully recover the highly non-Gaussian transient PDFs in up to 6…

  14. International Conference on Robust Statistics 2015

    CERN Document Server

    Basu, Ayanendranath; Filzmoser, Peter; Mukherjee, Diganta

    2016-01-01

    This book offers a collection of recent contributions and emerging ideas in the areas of robust statistics presented at the International Conference on Robust Statistics 2015 (ICORS 2015) held in Kolkata during 12–16 January, 2015. The book explores the applicability of robust methods in other non-traditional areas which includes the use of new techniques such as skew and mixture of skew distributions, scaled Bregman divergences, and multilevel functional data methods; application areas being circular data models and prediction of mortality and life expectancy. The contributions are of both theoretical as well as applied in nature. Robust statistics is a relatively young branch of statistical sciences that is rapidly emerging as the bedrock of statistical analysis in the 21st century due to its flexible nature and wide scope. Robust statistics supports the application of parametric and other inference techniques over a broader domain than the strictly interpreted model scenarios employed in classical statis...

  15. Parametric resonance in the early Universe—a fitting analysis

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa, Daniel G. [Theoretical Physics Department, CERN, Geneva (Switzerland); Torrentí, Francisco, E-mail: daniel.figueroa@cern.ch, E-mail: f.torrenti@csic.es [Instituto de Física Teórica IFT-UAM/CSIC, Universidad Autónoma de Madrid, Cantoblanco 28049, Madrid (Spain)

    2017-02-01

    Particle production via parametric resonance in the early Universe is a non-perturbative, non-linear and out-of-equilibrium phenomenon. Although it is a well-studied topic, whenever a new scenario exhibits parametric resonance, a full re-analysis is normally required. To avoid this tedious task, many works often present only a simplified linear treatment of the problem. In order to surpass this circumstance in the future, we provide a fitting analysis of parametric resonance through all its relevant stages: initial linear growth, non-linear evolution, and relaxation towards equilibrium. Using lattice simulations in an expanding grid in 3+1 dimensions, we parametrize the outcome of the dynamics, scanning over the relevant ingredients: role of the oscillatory field, particle coupling strength, initial conditions, and background expansion rate. We emphasize the inaccuracy of the linear calculation of the decay time of the oscillatory field, and propose a more appropriate definition of this scale based on the subsequent non-linear dynamics. We provide simple fits to the relevant time scales and particle energy fractions at each stage. Our fits can be applied to post-inflationary preheating scenarios, where the oscillatory field is the inflaton, or to spectator-field scenarios, where the oscillatory field can be e.g. a curvaton, or the Standard Model Higgs.

  16. Parametric resonance in the early Universe—a fitting analysis

    International Nuclear Information System (INIS)

    Figueroa, Daniel G.; Torrentí, Francisco

    2017-01-01

    Particle production via parametric resonance in the early Universe is a non-perturbative, non-linear and out-of-equilibrium phenomenon. Although it is a well-studied topic, whenever a new scenario exhibits parametric resonance, a full re-analysis is normally required. To avoid this tedious task, many works often present only a simplified linear treatment of the problem. In order to surpass this circumstance in the future, we provide a fitting analysis of parametric resonance through all its relevant stages: initial linear growth, non-linear evolution, and relaxation towards equilibrium. Using lattice simulations in an expanding grid in 3+1 dimensions, we parametrize the outcome of the dynamics, scanning over the relevant ingredients: role of the oscillatory field, particle coupling strength, initial conditions, and background expansion rate. We emphasize the inaccuracy of the linear calculation of the decay time of the oscillatory field, and propose a more appropriate definition of this scale based on the subsequent non-linear dynamics. We provide simple fits to the relevant time scales and particle energy fractions at each stage. Our fits can be applied to post-inflationary preheating scenarios, where the oscillatory field is the inflaton, or to spectator-field scenarios, where the oscillatory field can be e.g. a curvaton, or the Standard Model Higgs.

  17. Finding differentially expressed genes in high dimensional data: Rank based test statistic via a distance measure.

    Science.gov (United States)

    Mathur, Sunil; Sadana, Ajit

    2015-12-01

    We present a rank-based test statistic for the identification of differentially expressed genes using a distance measure. The proposed test statistic is highly robust against extreme values and does not assume a particular distribution for the parent population. Simulation studies show that the proposed test is more powerful than some of the commonly used methods, such as the paired t-test, the Wilcoxon signed-rank test, and significance analysis of microarrays (SAM), under certain non-normal distributions. The asymptotic distribution of the test statistic and the p-value function are discussed. The application of the proposed method is shown using a real-life data set. © The Author(s) 2011.
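The authors' distance-based statistic is not reproduced in the abstract, but the Wilcoxon signed-rank statistic they benchmark against is standard and can be sketched directly. This is a minimal illustration with made-up expression values; it is a comparison baseline, not the paper's proposed statistic:

```python
def signed_rank_statistic(x, y):
    """Wilcoxon signed-rank statistic W+ for paired samples:
    rank |x_i - y_i| (average ranks for ties), then sum the ranks of
    the positive differences. Zero differences are dropped."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied absolute differences
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied 1-based rank positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return sum(r for r, d in zip(ranks, diffs) if d > 0)

# paired expression values for one gene under two conditions (invented data)
treated = [5.1, 4.8, 6.0, 5.5, 4.9]
control = [4.0, 4.9, 5.0, 4.2, 4.1]
print(signed_rank_statistic(treated, control))
```

A permutation or the usual normal approximation then turns W+ into a p-value; the record's point is that a distance-based rank statistic can beat this baseline under non-normal alternatives.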

  18. Density Fluctuations in the Solar Wind Driven by Alfvén Wave Parametric Decay

    Science.gov (United States)

    Bowen, Trevor A.; Badman, Samuel; Hellinger, Petr; Bale, Stuart D.

    2018-02-01

    Measurements and simulations of inertial compressive turbulence in the solar wind are characterized by anti-correlated magnetic fluctuations parallel to the mean field and density structures. This signature has been interpreted as observational evidence for non-propagating pressure balanced structures, kinetic ion-acoustic waves, as well as the MHD slow-mode. Given the high damping rates of parallel propagating compressive fluctuations, their ubiquity in satellite observations is surprising and suggestive of a local driving process. One possible candidate for the generation of compressive fluctuations in the solar wind is the Alfvén wave parametric instability. Here, we test the parametric decay process as a source of compressive waves in the solar wind by comparing the collisionless damping rates of compressive fluctuations with growth rates of the parametric decay instability daughter waves. Our results suggest that generation of compressive waves through parametric decay is overdamped at 1 au, but that the presence of slow-mode-like density fluctuations is correlated with the parametric decay of Alfvén waves.

  19. Explorations in Statistics: Hypothesis Tests and P Values

    Science.gov (United States)

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…

  20. Autonomic Differentiation Map: A Novel Statistical Tool for Interpretation of Heart Rate Variability

    Directory of Open Access Journals (Sweden)

    Daniela Lucini

    2018-04-01

    In spite of the large body of evidence suggesting Heart Rate Variability (HRV), alone or combined with blood pressure variability (providing an estimate of baroreflex gain), as a useful technique to assess the autonomic regulation of the cardiovascular system, there is still an ongoing debate about methodology, interpretation, and clinical applications. In the present investigation, we hypothesize that non-parametric and multivariate exploratory statistical manipulation of HRV data could provide a novel informational tool useful to differentiate normal controls from clinical groups, such as athletes, or subjects affected by obesity, hypertension, or stress. With a data-driven protocol in 1,352 ambulant subjects, we compute HRV and baroreflex indices from short-term data series as proxies of autonomic (ANS) regulation. We apply a three-step statistical procedure, first removing age and gender effects. Subsequently, by factor analysis, we extract four ANS latent domains that retain the large majority of the information (86.94%), subdivided into oscillatory (40.84%), amplitude (18.04%), pressure (16.48%), and pulse (11.58%) domains. Finally, we test the overall capacity to differentiate clinical groups vs. controls. To give more practical value and improve readability, statistical results concerning individual discriminant ANS proxies and ANS differentiation profiles are displayed through dedicated graphical tools, i.e., the significance diagram and the ANS differentiation map, respectively. This approach, which simultaneously uses all available information about the system, shows which domains make up the difference in ANS discrimination: e.g., athletes differ from controls in all domains, but with a graded strength, maximal in the (normalized) oscillatory and pulse domains, slightly less in the pressure domain, and minimal in the amplitude domain. The application of multiple (non-parametric and exploratory) statistical and graphical tools to ANS proxies defines…

  1. Nuclear modification factor using Tsallis non-extensive statistics

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, Sushanta; Garg, Prakhar; Kumar, Prateek; Sahoo, Raghunath [Indian Institute of Technology Indore, Discipline of Physics, School of Basic Sciences, Simrol (India); Bhattacharyya, Trambak; Cleymans, Jean [University of Cape Town, UCT-CERN Research Centre and Department of Physics, Rondebosch (South Africa)

    2016-09-15

    The nuclear modification factor is derived using Tsallis non-extensive statistics in relaxation time approximation. The variation of the nuclear modification factor with transverse momentum for different values of the non-extensive parameter, q, is also observed. The experimental data from RHIC and LHC are analysed in the framework of Tsallis non-extensive statistics in a relaxation time approximation. It is shown that the proposed approach explains the R{sub AA} of all particles over a wide range of transverse momentum but does not seem to describe the rise in R{sub AA} at very high transverse momenta. (orig.)
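For reference, the two standard ingredients the abstract combines can be written out. These are the textbook definitions, not formulas taken from the paper itself: the Tsallis non-extensive transverse-momentum spectrum and the nuclear modification factor R_AA.

```latex
% Tsallis (non-extensive) transverse-momentum spectrum; the limit q -> 1
% recovers the Boltzmann exponential exp(-m_T / T):
\frac{d^2 N}{dp_T\, dy} \;\propto\; p_T\, m_T
  \left[ 1 + (q-1)\,\frac{m_T}{T} \right]^{-\frac{q}{q-1}},
\qquad m_T = \sqrt{p_T^2 + m^2}

% Nuclear modification factor: heavy-ion yield over the binary-collision
% scaled proton-proton yield; R_AA = 1 means no medium modification:
R_{AA}(p_T) \;=\; \frac{d^2 N^{AA}/dp_T\, dy}
                       {\langle N_{\mathrm{coll}}\rangle\, d^2 N^{pp}/dp_T\, dy}
```

The record's observation is that a relaxation-time evolution of the Tsallis spectrum reproduces measured R_AA at low and intermediate p_T but not the rise at very high p_T.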

  2. Robust inference from multiple test statistics via permutations: a better alternative to the single test statistic approach for randomized trials.

    Science.gov (United States)

    Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie

    2013-01-01

    Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than on a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
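The minimum p-value construction can be sketched end-to-end for a two-group comparison. This is a hedged toy version: the candidate statistics (mean and median differences rather than survival statistics), the data, and all function names are illustrative, not from the paper. The key point it demonstrates is that the minimum p-value is itself recalibrated against its own permutation distribution, which is what keeps the type I error rate at the nominal level:

```python
import random

def mean_diff(x, y):
    return abs(sum(x) / len(x) - sum(y) / len(y))

def median(v):
    s, n = sorted(v), len(v)
    return (s[n // 2 - 1] + s[n // 2]) / 2 if n % 2 == 0 else s[n // 2]

def median_diff(x, y):
    return abs(median(x) - median(y))

def min_p_test(x, y, stats=(mean_diff, median_diff), n_perm=500, seed=3):
    """Minimum p-value permutation test: each candidate statistic gets a
    permutation p-value, and the minimum over candidates is then compared
    with the permutation distribution of such minima."""
    rng = random.Random(seed)
    pooled, nx = x + y, len(x)
    # statistic values for the observed split and each random relabelling;
    # tables[j][0] is the observed value of statistic j
    tables = [[s(x, y)] for s in stats]
    for _ in range(n_perm):
        rng.shuffle(pooled)
        for j, s in enumerate(stats):
            tables[j].append(s(pooled[:nx], pooled[nx:]))
    def p_of(j, b):  # permutation p-value of entry b for statistic j
        return sum(t >= tables[j][b] for t in tables[j]) / (n_perm + 1)
    min_ps = [min(p_of(j, b) for j in range(len(stats))) for b in range(n_perm + 1)]
    # final p-value: how often a relabelling achieves a min-p this small
    return sum(mp <= min_ps[0] for mp in min_ps) / (n_perm + 1)

treatment = [12.1, 13.4, 11.8, 14.0, 12.9, 13.7, 12.5, 13.1]
control = [9.9, 10.4, 9.1, 10.8, 10.2, 9.6, 10.1, 10.5]
print(min_p_test(treatment, control))  # well-separated groups -> small p
```

Adding a third candidate statistic only requires extending the `stats` tuple; as the record notes, a weak candidate costs little power because the calibration step absorbs it.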

  3. Distinguish Dynamic Basic Blocks by Structural Statistical Testing

    DEFF Research Database (Denmark)

    Petit, Matthieu; Gotlieb, Arnaud

    Statistical testing aims at generating random test data that respect selected probabilistic properties. A probability distribution is associated with the program input space in order to achieve the statistical test purpose: to test the most frequent usage of the software or to maximize the probability of… (control flow path) during the test data selection. We implemented this algorithm in a statistical test data generator for Java programs. A first experimental validation is presented…

  4. A generalized parametric response mapping method for analysis of multi-parametric imaging: A feasibility study with application to glioblastoma.

    Science.gov (United States)

    Lausch, Anthony; Yeung, Timothy Pok-Chi; Chen, Jeff; Law, Elton; Wang, Yong; Urbini, Benedetta; Donelli, Filippo; Manco, Luigi; Fainardi, Enrico; Lee, Ting-Yim; Wong, Eugene

    2017-11-01

    -enhancing lesion (CEL) and a 1 cm shell of surrounding peri-tumoral tissue were performed. Prediction using tumor volume metrics was also investigated. Leave-one-out cross validation (LOOCV) was used in combination with permutation testing to assess preliminary predictive efficacy and estimate statistically robust P-values. The predictive endpoint was overall survival (OS) greater than or equal to the median OS of 18.2 months. Single-parameter PRM and multi-parametric response maps (MPRMs) were generated for each patient and used to predict OS via the LOOCV. Tumor volume metrics (P ≥ 0.071 ± 0.01) and single-parameter PRM analyses (P ≥ 0.170 ± 0.01) were not found to be predictive of OS within this study. MPRM analysis of the peri-tumoral region but not the CEL was found to be predictive of OS with a classification sensitivity, specificity and accuracy of 80%, 100%, and 89%, respectively (P = 0.001 ± 0.01). The feasibility of a generalized MPRM analysis framework was demonstrated with improved prediction of overall survival compared to the original single-parameter method when applied to a glioblastoma dataset. The proposed algorithm takes the spatial heterogeneity in multi-parametric response into consideration and enables visualization. MPRM analysis of peri-tumoral regions was shown to have predictive potential supporting further investigation of a larger glioblastoma dataset. © 2017 American Association of Physicists in Medicine.

  5. The Galker test of speech reception in noise

    DEFF Research Database (Denmark)

    Lauritsen, Maj-Britt Glenn; Söderström, Margareta; Kreiner, Svend

    2016-01-01

    PURPOSE: We tested "the Galker test", a speech reception in noise test developed for primary care for Danish preschool children, to explore whether the children's ability to hear and understand speech was associated with gender, age, middle ear status, and the level of background noise. METHODS: The Galker test is a 35-item audio-visual, computerized word discrimination test in background noise. Included were 370 normally developed children attending day care centers. The children were examined with the Galker test, tympanometry, audiometry, and the Reynell test of verbal comprehension. Parents and daycare teachers completed questionnaires on the children's ability to hear and understand speech. As most of the variables were not assessed using interval scales, non-parametric statistics (Goodman-Kruskal's gamma) were used for analyzing associations with the Galker test score. For comparisons…

  6. Coherent states for oscillators of non-conventional statistics

    International Nuclear Information System (INIS)

    Dao Vong Duc; Nguyen Ba An

    1998-12-01

    In this work we systematically consider the concept of coherent states for oscillators of non-conventional statistics: the parabose oscillator, the infinite-statistics oscillator, and the generalised q-deformed oscillator. The expressions for the quadrature variances and particle number distribution are derived and displayed graphically. The obtained results show drastic changes when going from one statistics to another. (author)

  7. Non-extensive statistical effects in nuclear many-body problems

    International Nuclear Information System (INIS)

    Lavagno, A.; Quarati, P.

    2007-01-01

    Density and temperature conditions in many stellar cores and in the first stage of relativistic heavy-ion collisions imply the presence of non-ideal plasma effects with memory and long-range interactions between particles. Recent progress in statistical mechanics indicates that Tsallis non-extensive thermostatistics could be the natural generalization of the standard classical and quantum statistics when memory effects and long-range forces are not negligible. In this framework, we show that in weakly non-ideal plasmas non-extensive effects should be taken into account to derive the equilibrium distribution functions, the quantum fluctuations, and the correlations between the particles. The strong influence of these effects is discussed in the context of solar plasma physics and high-energy nucleus-nucleus collision experiments. Although the deviation from Boltzmann-Gibbs statistics is very small in both cases, the stellar plasma and the hadronic gas are strongly influenced by the non-extensive feature, and the discrepancies between experimental data and theoretical predictions are appreciably reduced. (authors)

  8. Limiting processes in non-equilibrium classical statistical mechanics

    International Nuclear Information System (INIS)

    Jancel, R.

    1983-01-01

    After a recall of the basic principles of statistical mechanics, the results of ergodic theory, the passage to the thermodynamic limit, and its link with the transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated, and the kinetic methods are stated. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, BBGKY hierarchy, limit theorems [fr

  9. A parametric FE modeling of brake for non-linear analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed,Ibrahim; Fatouh, Yasser [Automotive and Tractors Technology Department, Faculty of Industrial Education, Helwan University, Cairo (Egypt); Aly, Wael [Refrigeration and Air-Conditioning Technology Department, Faculty of Industrial Education, Helwan University, Cairo (Egypt)

    2013-07-01

    A parametric model of a drum brake based on 3-D Finite Element Methods (FEM) for non-linear contact analysis is presented. Many parameters are examined during this study, such as the effect of drum-lining interface stiffness, coefficient of friction, and line pressure on the interface contact. Firstly, a modal analysis of the drum brake is carried out to obtain the natural frequencies and instability of the drum and to facilitate transforming the modal elements into contact elements. It is shown that the unsymmetric solver of the modal analysis is efficient enough to solve this linear problem after transforming the non-linear contact behaviour between the drum and the lining into a linear behaviour. SOLID45, a linear element, is used in the modal analysis and then transferred to the non-linear elements Targe170 and Conta173, which represent the drum and lining in the contact analysis study. Contact problems are highly non-linear and require significant computer resources to solve; moreover, they present two significant difficulties. Firstly, the region of contact is not known in advance, as it depends on the boundary conditions such as line pressure and the drum and friction material specifications. Secondly, these contact problems need to take friction into consideration. Finally, the analysis showed a good distribution of the nodal reaction forces on the slotted lining contact surface, and the presence of the slot in the middle of the lining can help in wear removal due to the friction between the lining and the drum. Accurate contact stiffness can give a good representation of the pressure distribution between the lining and the drum. However, full contact of the front part of the slotted lining could occur at 20, 40, 60 and 80 bar of piston pressure, and partial contact between the drum and lining can occur in the rear part of the slotted lining.

  10. On the problem of neutron spectroscopy of parametrically non-equilibrium quasiparticles in solids

    International Nuclear Information System (INIS)

    Vo Khong An'.

    1981-01-01

    A formula suitable for numerical estimation of coherent inelastic neutron scattering cross sections on the plasmon-phonon mixed modes of electron-phonon systems under parametric resonance conditions is obtained from the analytical one presented in previous work, using some relations of the general theory of parametric excitation. The cross sections of neutron scattering on the high-frequency plasmon-like and the low-frequency longitudinal optical phonon-like modes in InSb crystals are calculated as functions of the driving laser field intensity; they show an increase in value by about two orders of magnitude as the field intensity approaches the parametric excitation threshold.

  11. Likert scales, levels of measurement and the "laws" of statistics.

    Science.gov (United States)

    Norman, Geoff

    2010-12-01

    Reviewers of research reports frequently criticize the choice of statistical methods. While some of these criticisms are well-founded, the use of various parametric methods such as analysis of variance, regression, and correlation is frequently faulted because: (a) the sample size is too small, (b) the data may not be normally distributed, or (c) the data are from Likert scales, which are ordinal, so parametric statistics cannot be used. In this paper, I dissect these arguments and show that many studies, dating back to the 1930s, consistently show that parametric statistics are robust with respect to violations of these assumptions. Hence, challenges like those above are unfounded, and parametric methods can be utilized without concern for "getting the wrong answer".
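Norman's robustness claim is easy to check by simulation: draw two samples from the same skewed 5-point Likert distribution and count how often a parametric two-sample test falsely rejects. The sketch below uses a normal (z) approximation to the Welch t-test, which is an assumption of this illustration (justified by n = 50 per group), and an invented Likert weighting; the empirical type I error rate should sit near the nominal 5% despite the ordinal, skewed data:

```python
import math
import random

def two_sample_z(x, y):
    """Welch-style statistic; the p-value uses a normal approximation,
    which is adequate at n = 50 per group."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    vx = sum((v - mx) ** 2 for v in x) / (len(x) - 1)
    vy = sum((v - my) ** 2 for v in y) / (len(y) - 1)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

def type_i_error_rate(n_rep=2000, n=50, seed=11):
    """Both groups drawn from the SAME skewed Likert distribution, so every
    rejection is a false positive; return the empirical rejection rate."""
    rng = random.Random(seed)
    levels, weights = [1, 2, 3, 4, 5], [0.40, 0.30, 0.15, 0.10, 0.05]
    def sample():
        return rng.choices(levels, weights=weights, k=n)
    rejections = sum(abs(two_sample_z(sample(), sample())) > 1.96
                     for _ in range(n_rep))
    return rejections / n_rep

print(type_i_error_rate())  # close to the nominal 0.05 despite ordinal, skewed data
```

Replaying the same experiment with heavier skew or smaller n is a one-line change, which is essentially how the studies Norman cites mapped out the limits of this robustness.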

  12. Design Automation Using Script Languages. High-Level CAD Templates in Non-Parametric Programs

    Science.gov (United States)

    Moreno, R.; Bazán, A. M.

    2017-10-01

    The main purpose of this work is to study the advantages offered by applying traditional technical drawing techniques to design automation processes in non-parametric CAD programs equipped with scripting languages. Given that an example drawing can be solved with traditional step-by-step detailed procedures, it is possible to do the same with CAD applications and to generalize the solution later, incorporating references. Today's CAD applications show striking absences of solutions for building engineering: oblique projections (military and cavalier), 3D modelling of complex stairs, roofs, furniture, and so on. The use of geometric references (via variables in script languages) and their incorporation into high-level CAD templates allows the automation of processes. Instead of repeatedly creating similar designs or modifying their data, users should be able to use these templates to generate future variations of the same design. This paper presents the automation process for several complex drawing examples based on CAD script files aided by parametric geometry calculation tools. The proposed method allows us to solve complex geometric designs not currently supported by CAD applications and to subsequently create new derivatives without user intervention. Automation in the generation of complex designs not only saves time but also increases the quality of the presentations and reduces the possibility of human error.

  13. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    Science.gov (United States)

    Kumar, Sricharan; Srivastava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
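
    The record does not spell out the authors' procedure, so the following is a hedged sketch of one common variant: a residual bootstrap around a k-nearest-neighbour smoother, with no distributional assumption on the noise. The data, the smoother, and the anomaly threshold are all illustrative choices:

```python
import math, random

random.seed(2)

# toy data: y = sin(x) + noise
xs = [i / 50 * 2 * math.pi for i in range(200)]
ys = [math.sin(x) + random.gauss(0, 0.2) for x in xs]

def knn_fit(x0, k=15):
    # nonparametric k-nearest-neighbour mean smoother
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))[:k]
    return sum(ys[i] for i in nearest) / k

resid = [y - knn_fit(x) for x, y in zip(xs, ys)]

def prediction_interval(x0, b=500, level=0.95):
    # bootstrap the residuals: no parametric assumption on the noise
    fit = knn_fit(x0)
    sims = sorted(fit + random.choice(resid) for _ in range(b))
    lo = sims[int((1 - level) / 2 * b)]
    hi = sims[int((1 + level) / 2 * b) - 1]
    return lo, hi

lo, hi = prediction_interval(math.pi / 2)
print(lo, hi)                  # interval around sin(pi/2) = 1
print(not (lo <= 5.0 <= hi))   # a gross outlier falls outside the interval
```

A point far outside the interval, conditioned on its input (here y = 5 at x = pi/2), would be flagged as anomalous.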

  14. Parametric Resonance in the Early Universe - A Fitting Analysis

    CERN Document Server

    Figueroa, Daniel G.

    2017-02-01

    Particle production via parametric resonance in the early Universe is a non-perturbative, non-linear and out-of-equilibrium phenomenon. Although it is a well-studied topic, whenever a new scenario exhibits parametric resonance, a full re-analysis is normally required. To avoid this tedious task, many works often present only a simplified linear treatment of the problem. In order to surpass this circumstance in the future, we provide a fitting analysis of parametric resonance through all its relevant stages: initial linear growth, non-linear evolution, and relaxation towards equilibrium. Using lattice simulations in an expanding grid in $3+1$ dimensions, we parametrise the outcome of the dynamics, scanning over the relevant ingredients: role of the oscillatory field, particle coupling strength, initial conditions, and background expansion rate. We emphasise the inaccuracy of the linear calculation of the decay time of the oscillatory field, and propose a more appropriate definition of this scale based on the subsequ...

  15. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    The practice of statistical data processing for radiation monitoring is exemplified, and some results are presented. Experience with the practical application of mathematical statistics to radiation monitoring data allowed us to develop a concrete statistical processing algorithm, implemented on an M-6000 minicomputer. The algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The algorithm was used to process data observed over a radioactive waste disposal control region; results from processing the surface-water monitoring data are presented

  16. Statistical parametric mapping in the detection of rCBF changes in mild Alzheimer's disease

    International Nuclear Information System (INIS)

    Rowe, C.; Barnden, L.; Boundy, K.; McKinnon, J.; Liptak, M.

    1998-01-01

    Full text: Reduction in temporoparietal regional cerebral blood flow (rCBF) is proportional to the degree of cognitive deficit in patients with Alzheimer's Disease (AD). The characteristic pattern is readily apparent in advanced disease but is often subtle in early stage AD, reducing the clinical value of SPECT in the management of this condition. We have previously reported that Statistical Parametric Mapping (SPM95) revealed significant temporoparietal hypoperfusion when 10 patients with mild AD (classified by the Clinical Dementia Rating Scale) were compared to 10 age matched normals. We have now begun to evaluate the sensitivity and specificity of SPM95 in individuals with mild AD by comparison to our bank of 39 normals (30 female, 9 male, age range 26 to 74, mean age 52). Preliminary results reveal low sensitivity (<40%) when the standard reference region for normalization (i.e. global brain counts) is used. Better results are expected from normalizing to the cerebellum or basal ganglia and this is under investigation. An objective method to improve the accuracy of rCBF imaging for the diagnosis of early AD would be very useful in clinical practice. This study will demonstrate whether SPM can fulfill this role

  17. PHAZE, Parametric Hazard Function Estimation

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: PHAZE performs statistical inference calculations on a hazard function (also called a failure rate or intensity function) based on reported failure times of components that are repaired and restored to service. Three parametric models are allowed: the exponential, linear, and Weibull hazard models. The inference includes estimation (maximum likelihood estimators and confidence regions) of the parameters and of the hazard function itself, testing of hypotheses such as increasing failure rate, and checking of the model assumptions. 2 - Methods: PHAZE assumes that the failures of a component follow a time-dependent (or non-homogeneous) Poisson process and that the failure counts in non-overlapping time intervals are independent. Implicit in the independence property is the assumption that the component is restored to service immediately after any failure, with negligible repair time. The failures of one component are assumed to be independent of those of another component; a proportional hazards model is used. Data for a component are called time censored if the component is observed for a fixed time period, or plant records covering a fixed time period are examined, and the failure times are recorded. The number of these failures is random. Data are called failure censored if the component is kept in service until a predetermined number of failures has occurred, at which time the component is removed from service. In this case, the number of failures is fixed, but the end of the observation period equals the final failure time and is random. A typical PHAZE session consists of reading failure data from a file prepared previously, selecting one of the three models, and performing data analysis (i.e., performing the usual statistical inference about the parameters of the model, with special emphasis on the parameter(s) that determine whether the hazard function is increasing). The final goals of the inference are a point estimate
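
    For the Weibull hazard model with time-censored data, the maximum likelihood estimators have a well-known closed form (the power-law process): b_hat = n / sum(ln(T/t_i)) and a_hat = n / T**b_hat, with b_hat > 1 indicating an increasing failure rate. The sketch below simulates such a process and recovers the parameters; the simulation setup is illustrative, not PHAZE itself:

```python
import math, random

random.seed(3)

# simulate failure times from a power-law (Weibull-hazard) Poisson process
# lambda(t) = a*b*t**(b-1), observed on (0, T]  (time-censored data)
a_true, b_true, T = 2.0, 1.5, 10.0
# inverse transform: cumulative intensity is a*t**b, so event k arrives at
# t = (E_k / a) ** (1/b), where E_k is a unit-rate Poisson arrival time
times, e = [], 0.0
while True:
    e += random.expovariate(1.0)
    t = (e / a_true) ** (1 / b_true)
    if t > T:
        break
    times.append(t)

# closed-form MLEs for the time-censored power-law process
n = len(times)
b_hat = n / sum(math.log(T / t) for t in times)
a_hat = n / T ** b_hat
print(n, round(b_hat, 2), round(a_hat, 2))  # b_hat near 1.5: increasing hazard
```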

  18. Theory of fluctuations and parametric noise in a point nuclear reactor model

    International Nuclear Information System (INIS)

    Rodriguez, M.A.; San Miguel, M.; Sancho, J.M.

    1984-01-01

    We present a joint description of internal fluctuations and parametric noise in a point nuclear reactor model in which delayed neutrons and a detector are considered. We obtain kinetic equations for the first moments and define effective kinetic parameters which take into account the effect of parametric Gaussian white noise. We comment on the validity of Langevin approximations for this problem. We propose a general method to deal with weak but otherwise arbitrary non-white parametric noise. Exact kinetic equations are derived for Gaussian non-white noise. (author)

  19. Parametric Linear Dynamic Logic

    Directory of Open Access Journals (Sweden)

    Peter Faymonville

    2014-08-01

    Full Text Available We introduce Parametric Linear Dynamic Logic (PLDL, which extends Linear Dynamic Logic (LDL by temporal operators equipped with parameters that bound their scope. LDL was proposed as an extension of Linear Temporal Logic (LTL that is able to express all ω-regular specifications while still maintaining many of LTL's desirable properties like an intuitive syntax and a translation into non-deterministic Büchi automata of exponential size. But LDL lacks capabilities to express timing constraints. By adding parameterized operators to LDL, we obtain a logic that is able to express all ω-regular properties and that subsumes parameterized extensions of LTL like Parametric LTL and PROMPT-LTL. Our main technical contribution is a translation of PLDL formulas into non-deterministic Büchi word automata of exponential size via alternating automata. This yields a PSPACE model checking algorithm and a realizability algorithm with doubly-exponential running time. Furthermore, we give tight upper and lower bounds on optimal parameter values for both problems. These results show that PLDL model checking and realizability are not harder than LTL model checking and realizability.

  20. Field-scale sensitivity of vegetation discrimination to hyperspectral reflectance and coupled statistics

    DEFF Research Database (Denmark)

    Manevski, Kiril; Jabloun, Mohamed; Gupta, Manika

    2016-01-01

    Remote sensing of land covers utilizes an increasing number of methods for spectral reflectance processing and its accompanying statistics to discriminate between the covers' spectral signatures at various scales. To this end, the present chapter deals with the field-scale sensitivity of the vegetation spectral discrimination to the most common types of reflectance (unaltered and continuum-removed) and statistical tests (parametric and nonparametric analysis of variance). It is divided into two distinct parts. The first part summarizes the current knowledge in relation to vegetation ... a more powerful input to a nonparametric analysis for discrimination at the field scale, when compared with unaltered reflectance and parametric analysis. However, the discrimination outputs interact and are very sensitive to the number of observations - an important implication for the design ...
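
    A minimal sketch of the nonparametric test mentioned above (Kruskal-Wallis, the rank-based one-way analysis of variance), applied to invented reflectance samples for three vegetation covers; the data are purely illustrative:

```python
from itertools import chain

# Kruskal-Wallis H: rank-based one-way test, no normality assumption
def kruskal_wallis(groups):
    data = sorted(chain.from_iterable(groups))
    # midranks handle ties in the pooled sample
    rank, i = {}, 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        for v in data[i:j]:
            rank[v] = (i + j + 1) / 2
        i = j
    n = len(data)
    h = 12 / (n * (n + 1)) * sum(sum(rank[v] for v in g) ** 2 / len(g)
                                 for g in groups)
    return h - 3 * (n + 1)

# toy "reflectance" samples for three vegetation covers at one band
covers = [[0.31, 0.33, 0.35, 0.30, 0.32],
          [0.41, 0.44, 0.40, 0.43, 0.42],
          [0.52, 0.50, 0.55, 0.51, 0.53]]
h = kruskal_wallis(covers)
print(round(h, 2))   # compare with the chi-square cutoff 5.99 (2 df, alpha=.05)
```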

  1. Nonparametric tests for equality of psychometric functions.

    Science.gov (United States)

    García-Pérez, Miguel A; Núñez-Antón, Vicente

    2017-12-07

    Many empirical studies measure psychometric functions (curves describing how observers' performance varies with stimulus magnitude) because these functions capture the effects of experimental conditions. To assess these effects, parametric curves are often fitted to the data and comparisons are carried out by testing for equality of mean parameter estimates across conditions. This approach is parametric and, thus, vulnerable to violations of the implied assumptions. Furthermore, testing for equality of means of parameters may be misleading: Psychometric functions may vary meaningfully across conditions on an observer-by-observer basis with no effect on the mean values of the estimated parameters. Alternative approaches to assess equality of psychometric functions per se are thus needed. This paper compares three nonparametric tests that are applicable in all situations of interest: The existing generalized Mantel-Haenszel test, a generalization of the Berry-Mielke test that was developed here, and a split variant of the generalized Mantel-Haenszel test also developed here. Their statistical properties (accuracy and power) are studied via simulation and the results show that all tests are indistinguishable as to accuracy but they differ non-uniformly as to power. Empirical use of the tests is illustrated via analyses of published data sets and practical recommendations are given. The computer code in MATLAB and R to conduct these tests is available as Electronic Supplemental Material.
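
    The generalized Mantel-Haenszel test builds on the classical Mantel-Haenszel chi-square computed over K strata, here stimulus levels, with each stratum a 2x2 table of condition by correct/incorrect counts. The sketch below implements only that classical core (without continuity correction, and not the authors' full generalization) on invented counts:

```python
# Mantel-Haenszel statistic over K stimulus levels (strata); each stratum
# is a 2x2 table: condition (A/B) x response (correct/incorrect)
def mantel_haenszel(correct_a, correct_b, n_a, n_b):
    num = var = 0.0
    for a, b, na, nb in zip(correct_a, correct_b, n_a, n_b):
        n, m1 = na + nb, a + b
        num += a - na * m1 / n                          # observed minus expected
        var += na * nb * m1 * (n - m1) / (n ** 2 * (n - 1))
    return num ** 2 / var if var else 0.0

# two conditions, 40 trials per stimulus level; "shifted" mimics a lateral
# displacement of the psychometric function in condition B
n = [40] * 5
same    = mantel_haenszel([8, 16, 24, 32, 36], [8, 16, 24, 32, 36], n, n)
shifted = mantel_haenszel([8, 16, 24, 32, 36], [4,  8, 16, 24, 32], n, n)
print(same, round(shifted, 1))  # 0.0 for equal curves; > 3.84 (chi2_1, .05) for shifted
```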

  2. Non-process instrumentation surveillance and test reduction

    International Nuclear Information System (INIS)

    Ferrell, R.; LeDonne, V.; Donat, T.; Thomson, I.; Sarlitto, M.

    1993-12-01

    Analysis of operating experience, instrument failure modes, and degraded instrument performance has led to a reduction in Technical Specification surveillance and test requirements for nuclear power plant process instrumentation. These changes have resulted in lower plant operations and maintenance (O&M) labor costs. This report explores the possibility of realizing similar savings by reducing requirements for non-process instrumentation. The project team reviewed generic Technical Specifications for the four major US nuclear steam supply system (NSSS) vendors (Westinghouse, General Electric, Combustion Engineering, and Babcock & Wilcox) to identify non-process instrumentation for which surveillance/test requirements could be reduced. The team surveyed 10 utilities to identify specific non-process instrumentation at their plants for which requirements could be reduced. The team evaluated utility analytic approaches used to justify changes in surveillance/test requirements for process equipment to determine their applicability to non-process instrumentation. The report presents a prioritized list of non-process instrumentation systems suitable for surveillance/test requirements reduction. The top three systems in the list are vibration monitors, leak detection monitors, and chemistry monitors. In general, most non-process instrumentation governed by Technical Specification requirements are candidates for requirements reduction. If statistical requirements are somewhat relaxed, the analytic approaches previously used to reduce requirements for process instrumentation can be applied to non-process instrumentation. The report identifies as viable the technical approaches developed and successfully used by Southern California Edison, Arizona Public Service, and Boston Edison

  3. Effect of an external alternating electric field non-monochromaticity on parametric excitation of surface ion cyclotron X-modes

    International Nuclear Information System (INIS)

    Girka, V O; Puzyrkov, S Yu; Shpagina, V O; Shpagina, L O

    2012-01-01

    The application of an external alternating electric field in the range of ion cyclotron frequencies is a well-known method for the excitation of surface electromagnetic waves. The present paper is devoted to the development of a kinetic theory of parametric excitation of these eigenwaves propagating across an external steady magnetic field along the plasma boundary at the second harmonic of the ion cyclotron frequency. Unlike previous papers on this subject, parametric excitation of surface ion cyclotron X-modes is studied here under the condition of non-monochromaticity of an external alternating electric field. Non-monochromaticity of the external alternating electric field is modeled by the superposition of two uniform and monochromatic electric fields with different amplitudes and frequencies. The nonlinear boundary condition is formulated for a tangential magnetic field of the studied surface waves. An infinite set of equations for the harmonics of a tangential electric field is solved using the approximation of the wave packet consisting of the main harmonic and two nearest satellite harmonics. Two different regimes of instability have been considered. If one of the applied generators has an operation frequency that is close to the ion cyclotron frequency, then changing the amplitude of the second generator allows one to enhance the growth rate of the parametric instability or to diminish it. But if the operation frequencies of both generators are not close to the ion cyclotron frequency, then changing the amplitudes of their fields allows one to decrease the growth rate of the instability and even to suppress its development. The problem is studied both analytically and numerically.

  4. EFFECTS OF PARAMETRIC VARIATIONS ON SEISMIC ANALYSIS METHODS FOR NON-CLASSICALLY DAMPED COUPLED SYSTEMS

    International Nuclear Information System (INIS)

    XU, J.; DEGRASSI, G.

    2000-01-01

    A comprehensive benchmark program was developed by Brookhaven National Laboratory (BNL) to perform an evaluation of state-of-the-art methods and computer programs for performing seismic analyses of coupled systems with non-classical damping. The program, which was sponsored by the US Nuclear Regulatory Commission (NRC), was designed to address various aspects of application and limitations of these state-of-the-art analysis methods to typical coupled nuclear power plant (NPP) structures with non-classical damping, and was carried out through analyses of a set of representative benchmark problems. One objective was to examine the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled systems. The examination was performed using parametric variations for three simple benchmark models. This paper presents the comparisons and evaluation of the program participants' results to the BNL exact solutions for the applicable ranges of modeling dynamic characteristic parameters

  5. Fast, Sequence Adaptive Parcellation of Brain MR Using Parametric Models

    DEFF Research Database (Denmark)

    Puonti, Oula; Iglesias, Juan Eugenio; Van Leemput, Koen

    2013-01-01

    In this paper we propose a method for whole brain parcellation using the type of generative parametric models typically used in tissue classification. Compared to the non-parametric, multi-atlas segmentation techniques that have become popular in recent years, our method obtains state-of-the-art ...

  6. Modeling critical episodes of air pollution by PM10 in Santiago, Chile: Comparison of the predictive efficiency of parametric and non-parametric statistical models

    Directory of Open Access Journals (Sweden)

    Sergio A. Alvarado

    2010-12-01

    Full Text Available Objective: To evaluate the predictive efficiency of two statistical models (one parametric and the other non-parametric) in predicting next-day critical episodes of air pollution by particulate matter (PM10) that exceed the daily air quality standard in Santiago, Chile. Accurate prediction of such episodes would allow the health authorities to apply restrictive measures that reduce their severity and protect the community's health. Methods: We used the PM10 concentrations registered by a station of the MACAM-2 air quality monitoring network (152 daily observations of 14 variables) and meteorological information gathered from 2001 to 2004. We fitted a parametric Gamma model using STATA v11 software and a non-parametric MARS model using a demo version of MARS v2.0 distributed by Salford-Systems. Results: Both modeling approaches show a high correlation between observed and predicted values. The Gamma models outperform MARS for PM10 concentrations with values ...

  7. PHYSICS OF NON-GAUSSIAN FIELDS AND THE COSMOLOGICAL GENUS STATISTIC

    International Nuclear Information System (INIS)

    James, J. Berian

    2012-01-01

    We report a technique to calculate the impact of distinct physical processes inducing non-Gaussianity on the cosmological density field. A natural decomposition of the cosmic genus statistic into an orthogonal polynomial sequence allows complete expression of the scale-dependent evolution of the topology of large-scale structure, in which effects including galaxy bias, nonlinear gravitational evolution, and primordial non-Gaussianity may be delineated. The relationship of this decomposition to previous methods for analyzing the genus statistic is briefly considered and the following applications are made: (1) the expression of certain systematics affecting topological measurements, (2) the quantification of broad deformations from Gaussianity that appear in the genus statistic as measured in the Horizon Run simulation, and (3) the study of the evolution of the genus curve for simulations with primordial non-Gaussianity. These advances improve the treatment of flux-limited galaxy catalogs for use with this measurement and further the use of the genus statistic as a tool for exploring non-Gaussianity.

  8. Phase locking and quantum statistics in a parametrically driven nonlinear resonator

    OpenAIRE

    Hovsepyan, G. H.; Shahinyan, A. R.; Chew, Lock Yue; Kryuchkyan, G. Yu.

    2016-01-01

    We discuss phase-locking phenomena at low levels of quanta for a parametrically driven nonlinear Kerr resonator (PDNR) in the strong quantum regime. The oscillatory mode of the PDNR is created in the process of degenerate down-conversion of photons under interaction with a train of external Gaussian pulses. We calculate the Wigner functions of the cavity mode, which show two-fold symmetry in phase space, and analyse the formation of phase-locked states in the regular as well as the quantum chaotic regime.

  9. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    International Nuclear Information System (INIS)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David; Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Buchhave, Lars A.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
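
    The significance of correlated TTVs can be quantified nonparametrically with a permutation test: shuffle one TTV series to destroy any physical association and compare the shuffled correlations with the observed one. This toy sketch (not the Kepler pipeline) uses synthetic, deliberately anti-correlated timing residuals:

```python
import random

random.seed(5)

def pearson(a, b):
    # plain Pearson correlation coefficient
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    ca = [x - ma for x in a]
    cb = [y - mb for y in b]
    num = sum(x * y for x, y in zip(ca, cb))
    den = (sum(x * x for x in ca) * sum(y * y for y in cb)) ** 0.5
    return num / den

# toy TTV residuals constructed to be anti-correlated (signal s vs -s) + noise
signal = (1, -1, 1, -1, 1, -1, 1, -1, 1, -1)
ttv1 = [random.gauss(0, 0.2) + s for s in signal]
ttv2 = [random.gauss(0, 0.2) - s for s in signal]

obs = pearson(ttv1, ttv2)
# permutation null: shuffling one series breaks any physical association
perms = 2000
extreme = sum(abs(pearson(ttv1, random.sample(ttv2, len(ttv2)))) >= abs(obs)
              for _ in range(perms))
print(obs, (extreme + 1) / (perms + 1))  # strong anti-correlation, small p-value
```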

  10. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C. [Astronomy Department, University of Florida, 211 Bryant Space Sciences Center, Gainesville, FL 32611 (United States); Fabrycky, Daniel C. [UCO/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States); Steffen, Jason H. [Fermilab Center for Particle Astrophysics, P.O. Box 500, MS 127, Batavia, IL 60510 (United States); Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Welsh, William F. [Astronomy Department, San Diego State University, San Diego, CA 92182-1221 (United States); Allen, Christopher [Orbital Sciences Corporation/NASA Ames Research Center, Moffett Field, CA 94035 (United States); Batalha, Natalie M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192 (United States); Buchhave, Lars A., E-mail: eford@astro.ufl.edu [Niels Bohr Institute, Copenhagen University, DK-2100 Copenhagen (Denmark); Collaboration: Kepler Science Team; and others

    2012-05-10

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  11. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  12. Parametric analyses of summative scores may lead to conflicting inferences when comparing groups: A simulation study.

    Science.gov (United States)

    Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S

    2015-04-01

    To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
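
    The scaling artefact at issue can be seen without any simulation machinery: under a Rasch model, identical latent (interval-scale) changes map to different raw summative-score changes depending on where a group starts on the scale. The item difficulties below are illustrative:

```python
import math

# expected summative score on ten dichotomous Rasch items whose
# difficulties are spread evenly from -2 to +2 logits
difficulties = [i * 4 / 9 - 2 for i in range(10)]

def expected_raw(theta):
    # sum of Rasch item characteristic curves at ability theta
    return sum(1 / (1 + math.exp(-(theta - b))) for b in difficulties)

# both groups improve by exactly 0.5 logits on the latent interval metric
mid_change     = expected_raw(0.5) - expected_raw(0.0)  # group starting mid-scale
ceiling_change = expected_raw(2.5) - expected_raw(2.0)  # group starting near ceiling
print(round(mid_change, 2), round(ceiling_change, 2))
# equal latent change, unequal raw-score change: the summative scale distorts
```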

  13. Parametric statistical inference for discretely observed diffusion processes

    DEFF Research Database (Denmark)

    Pedersen, Asger Roer

    Part 1: Theoretical results. Part 2: Statistical applications of Gaussian diffusion processes in freshwater ecology

  14. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

    This paper presents the simplified version of the Freeman-Tukey test statistic for testing hypothesis about multinomial probabilities in one, two and multidimensional contingency tables that does not require calculating the expected cell frequencies before test of significance. The simplified method established new criteria of ...
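
    The Freeman-Tukey statistic for a one-dimensional multinomial table is T = sum((sqrt(O_i) + sqrt(O_i + 1) - sqrt(4*E_i + 1))^2), referred to a chi-square distribution with k-1 degrees of freedom; the expected counts reduce to E_i = n*p_i. A sketch on invented die-roll counts:

```python
import math

# Freeman-Tukey statistic for a one-dimensional multinomial table:
# T = sum( sqrt(O) + sqrt(O+1) - sqrt(4E+1) )^2, approx chi-square with k-1 df
def freeman_tukey(observed, probs):
    n = sum(observed)
    return sum((math.sqrt(o) + math.sqrt(o + 1) - math.sqrt(4 * n * p + 1)) ** 2
               for o, p in zip(observed, probs))

# fair six-sided die, 120 rolls: expected count 20 per face
t = freeman_tukey([18, 24, 17, 21, 25, 15], [1 / 6] * 6)
print(round(t, 2))  # well below the chi-square cutoff 11.07 (5 df, alpha=.05)
```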

  15. Analysis of Preference Data Using Intermediate Test Statistic Abstract

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    Jun 1, 2013 ... West African Journal of Industrial and Academic Research Vol.7 No. 1 June ... Keywords:-Preference data, Friedman statistic, multinomial test statistic, intermediate test statistic. ... new method and consequently a new statistic ...

  16. New Graphical Methods and Test Statistics for Testing Composite Normality

    Directory of Open Access Journals (Sweden)

    Marc S. Paolella

    2015-07-01

    Full Text Available Several graphical methods for testing univariate composite normality from an i.i.d. sample are presented. They are endowed with correct simultaneous error bounds and yield size-correct tests. As all are based on the empirical CDF, they are also consistent for all alternatives. For one test, called the modified stabilized probability test, or MSP, a highly simplified computational method is derived, which delivers the test statistic and also a highly accurate p-value approximation, essentially instantaneously. The MSP test is demonstrated to have higher power against asymmetric alternatives than the well-known and powerful Jarque-Bera test. A further size-correct test, based on combining two test statistics, is shown to have yet higher power. The methodology employed is fully general and can be applied to any i.i.d. univariate continuous distribution setting.
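    As a point of reference for the power comparison above, the Jarque-Bera statistic combines sample skewness S and excess kurtosis K as JB = n/6 * (S^2 + K^2/4). A minimal pure-Python sketch with simulated data (the samples are illustrative, not from the paper):

```python
import math
import random
import statistics

def jarque_bera(x):
    """JB = n/6 * (S**2 + K**2 / 4), with S skewness and K excess kurtosis."""
    n = len(x)
    m = statistics.fmean(x)
    m2 = sum((v - m) ** 2 for v in x) / n
    m3 = sum((v - m) ** 3 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    skew = m3 / m2 ** 1.5
    ex_kurt = m4 / m2 ** 2 - 3.0
    return n / 6.0 * (skew ** 2 + ex_kurt ** 2 / 4.0)

random.seed(1)
# Normal data should give a small JB; strongly skewed data a large one.
jb_normal = jarque_bera([random.gauss(0, 1) for _ in range(1000)])
jb_skewed = jarque_bera([random.expovariate(1.0) for _ in range(1000)])
```

    The asymmetric (exponential) sample yields a far larger JB value than the normal one, which is the behaviour the paper's power comparison against the MSP test builds on.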

  17. Cerebral Blood Flow Changes after Shunt in Hydrocephalus after Aneurysmal Subarachnoid Hemorrhage: Analysis by Statistical Parametric Mapping

    International Nuclear Information System (INIS)

    Hyun, I. Y.; Choi, W. S.; Pak, H. S.

    2003-01-01

    The purpose of this study was to evaluate the changes of regional cerebral blood flow (rCBF) after shunt operation in patients with hydrocephalus after aneurysmal subarachnoid hemorrhage by statistical parametric mapping (SPM). Seven patients (4 male, mean age 54 years) with hydrocephalus after aneurysmal subarachnoid hemorrhage underwent a shunt operation. Tc-99m HMPAO SPECT was performed within 1 week before, and 2 weeks after, the shunt operation. All of the SPECT images were spatially transformed to standard space, smoothed, and globally normalized. After spatial and count normalization, rCBF on pre- and post-shunting Tc-99m HMPAO SPECT was estimated at every voxel using t statistics. Voxels with a P value of less than 0.001 were considered significantly different. The shunt operation was effective in all patients. Pre-shunting Tc-99m HMPAO SPECT showed hypoperfusion, predominantly in the periventricular area. After the shunt operation, the periventricular low perfusion disappeared. The results of this study show that periventricular CBF is impaired in hydrocephalus after aneurysmal subarachnoid hemorrhage. The significant increase of periventricular CBF after shunt operation suggests that evaluation of periventricular CBF by SPM might be of value for predicting shunt effectiveness in hydrocephalus.

  18. Cerebral Blood Flow Changes after Shunt in Hydrocephalus after Aneurysmal Subarachnoid Hemorrhage: Analysis by Statistical Parametric Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, I. Y.; Choi, W. S.; Pak, H. S. [College of Medicine, Univ. of Inhwa, Incheon (Korea, Republic of)

    2003-07-01

    The purpose of this study was to evaluate the changes of regional cerebral blood flow (rCBF) after shunt operation in patients with hydrocephalus after aneurysmal subarachnoid hemorrhage by statistical parametric mapping (SPM). Seven patients (4 male, mean age 54 years) with hydrocephalus after aneurysmal subarachnoid hemorrhage underwent a shunt operation. Tc-99m HMPAO SPECT was performed within 1 week before, and 2 weeks after, the shunt operation. All of the SPECT images were spatially transformed to standard space, smoothed, and globally normalized. After spatial and count normalization, rCBF on pre- and post-shunting Tc-99m HMPAO SPECT was estimated at every voxel using t statistics. Voxels with a P value of less than 0.001 were considered significantly different. The shunt operation was effective in all patients. Pre-shunting Tc-99m HMPAO SPECT showed hypoperfusion, predominantly in the periventricular area. After the shunt operation, the periventricular low perfusion disappeared. The results of this study show that periventricular CBF is impaired in hydrocephalus after aneurysmal subarachnoid hemorrhage. The significant increase of periventricular CBF after shunt operation suggests that evaluation of periventricular CBF by SPM might be of value for predicting shunt effectiveness in hydrocephalus.
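    The voxel-wise comparison underlying such SPM analyses can be sketched as a paired t statistic computed per voxel and thresholded. The voxel values, effect sizes, and the toy 100-voxel "image" below are simulated assumptions, not SPECT data; for df = 6 (seven subjects), |t| > 5.96 corresponds roughly to two-sided p < 0.001.

```python
import math
import random
import statistics

def paired_t(pre, post):
    """Paired t statistic across subjects for one voxel."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    return statistics.fmean(d) / (statistics.stdev(d) / math.sqrt(n))

random.seed(2)
n_subjects = 7                            # as in the study's sample size
significant = []
for voxel in range(100):
    effect = 3.0 if voxel < 10 else 0.0   # the first 10 voxels truly change
    pre = [random.gauss(10.0, 0.5) for _ in range(n_subjects)]
    post = [v + effect + random.gauss(0.0, 0.2) for v in pre]
    if abs(paired_t(pre, post)) > 5.96:   # ~ two-sided p < 0.001 at df = 6
        significant.append(voxel)
```

    Real SPM additionally handles spatial normalization, smoothing, and multiple-comparison issues; the point here is only the per-voxel t test and thresholding step.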

  19. Statistical Analysis of the Polarimetric Cloud Analysis and Seeding Test (POLCAST) Field Projects

    Science.gov (United States)

    Ekness, Jamie Lynn

    The North Dakota farming industry brings in more than $4.1 billion annually in cash receipts. Unfortunately, agricultural sales vary significantly from year to year, due in large part to weather events such as hail storms and droughts. One method to mitigate drought is to use hygroscopic seeding to increase the precipitation efficiency of clouds. The North Dakota Atmospheric Research Board (NDARB) sponsored the Polarimetric Cloud Analysis and Seeding Test (POLCAST) research project to determine the effectiveness of hygroscopic seeding in North Dakota. The POLCAST field projects obtained airborne and radar observations while conducting randomized cloud seeding. The Thunderstorm Identification Tracking and Nowcasting (TITAN) program is used to analyze radar data (33 usable cases) to determine differences in storm duration, rain rate and total rain amount between seeded and non-seeded clouds. The single ratio of seeded to non-seeded cases is 1.56 (0.28 mm/0.18 mm), a 56% increase in average hourly rainfall during the first 60 minutes after target selection. A seeding effect is indicated, with storm lifetime increasing by 41% for seeded versus non-seeded clouds during the first 60 minutes after the seeding decision. A double ratio statistic, comparing the radar-derived rain amount of the last 40 minutes of a case (seed/non-seed) to that of the first 20 minutes (seed/non-seed), is used to account for the natural variability of the cloud system and gives a double ratio of 1.85. The Mann-Whitney test on the double ratio of seeded to non-seeded cases (33 cases) gives a significance (p-value) of 0.063. Bootstrapping analysis of the POLCAST data set indicates that 50 cases would provide statistically significant results based on the Mann-Whitney test of the double ratio. All the statistical analyses conducted on the POLCAST data set show that hygroscopic seeding in North Dakota does increase precipitation. While an additional POLCAST field
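    The Mann-Whitney comparison of seeded versus non-seeded cases can be illustrated with a small rank-based sketch. The double-ratio values below are invented for illustration and are not POLCAST data, and the normal approximation to the U statistic is used without a tie correction.

```python
import math

def mann_whitney_u(x, y):
    """U statistic for sample x and a two-sided normal-approximation p-value."""
    nx, ny = len(x), len(y)
    pooled = sorted(x + y)

    def rank(v):
        lo = pooled.index(v)            # first position of v (0-based)
        hi = lo + pooled.count(v)       # one past the last position
        return (lo + 1 + hi) / 2.0      # average rank, handles ties

    r_x = sum(rank(v) for v in x)
    u = r_x - nx * (nx + 1) / 2.0
    mu = nx * ny / 2.0
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12.0)  # no tie correction
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))             # two-sided
    return u, p

# Invented double ratios for 6 seeded and 6 non-seeded cases.
seeded = [1.9, 2.3, 1.7, 2.8, 1.5, 2.1]
non_seeded = [1.0, 0.8, 1.2, 1.1, 0.9, 1.3]
u_stat, p_value = mann_whitney_u(seeded, non_seeded)
```

    With 33 real cases the exact-permutation or corrected normal approximation would be preferred; this sketch only shows the mechanics of ranking and the U statistic.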

  20. Statistical Analysis of Environmental Tritium around Wolsong Site

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ju Youl [FNC Technology Co., Yongin (Korea, Republic of)

    2010-04-15

    To find the relationship among airborne tritium, tritium in rainwater, TFWT (Tissue Free Water Tritium) and TBT (Tissue Bound Tritium), statistical analysis was conducted based on tritium data measured at KHNP employees' houses around the Wolsong nuclear power plants during the 10 years from 1999 to 2008. The results show that tritium in such media exhibits a strong seasonal and annual periodicity. Tritium concentration in rainwater is observed to be highly correlated with TFWT and directly transmitted to TFWT without delay. The response of environmental radioactivity of tritium around the Wolsong site is analyzed using time-series techniques and non-parametric trend analysis. Tritium in the atmosphere and rainwater is strongly auto-correlated with seasonal and annual periodicity. TFWT concentration in pine needles is shown to be more sensitive to rainfall than to other weather variables. Non-parametric trend analysis of TFWT concentration within pine needles shows an increasing trend at the 95% confidence level. This study demonstrates the usefulness of time-series and trend analysis for interpreting the relationship of environmental radioactivity with various environmental media.
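    A non-parametric trend analysis of the kind applied to the TFWT series is commonly done with the Mann-Kendall test, which counts concordant minus discordant pairs. The annual concentrations below are hypothetical illustrative values, not the Wolsong measurements.

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic, z score, and two-sided p (no tie correction)."""
    n = len(series)
    # S = number of increasing pairs minus number of decreasing pairs.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))
    return s, z, p

# Ten hypothetical annual TFWT concentrations with an upward drift (Bq/L).
tfwt = [1.1, 1.3, 1.2, 1.5, 1.6, 1.4, 1.8, 1.9, 2.1, 2.0]
s_stat, z_score, p_value = mann_kendall(tfwt)
```

    A positive S with a small p-value indicates a monotonic increasing trend; for autocorrelated series like these, a variance correction for serial dependence would normally be added.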

  1. Statistical methods for determination of background levels for naturally occurring radionuclides in soil at a RCRA facility

    International Nuclear Information System (INIS)

    Guha, S.; Taylor, J.H.

    1996-01-01

    It is critical that summary statistics on background data, or background levels, be computed based on standardized and defensible statistical methods, because background levels are frequently used in subsequent analyses and comparisons performed by separate analysts over time. The final background for naturally occurring radionuclide concentrations in soil at a RCRA facility, and the associated statistical methods used to estimate these concentrations, are presented. The primary objective is to describe, via a case study, the statistical methods used to estimate 95% upper tolerance limits (UTLs) on radionuclide background soil data sets. A 95% UTL on background samples can be used as a screening level concentration in the absence of definitive soil cleanup criteria for naturally occurring radionuclides. The statistical methods are based exclusively on EPA guidance. This paper includes an introduction, a discussion of the analytical results for the radionuclides, and a detailed description of the statistical analyses leading to the determination of 95% UTLs. Soil concentrations reported are based on validated data. Data sets are categorized as surficial soil (samples collected at depths from zero to one-half foot) and deep soil (samples collected from 3 to 5 feet). These data sets were tested for statistical outliers, and underlying distributions were determined using the chi-squared goodness-of-fit test. UTLs for the data sets were then computed based on the percentage of non-detects and the appropriate best-fit distribution (lognormal, normal, or non-parametric). For data sets containing greater than approximately 50% non-detects, non-parametric UTLs were computed.
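    For normally distributed data, a 95%/95% UTL of the kind described is mean + k*s, where the one-sided tolerance factor k can be approximated in closed form (a Howe/Natrella-style approximation is sketched here; exact k values come from noncentral-t tables). The soil concentrations below are invented illustrative values, not the facility's data.

```python
import math
import statistics

def one_sided_k(n, z=1.6449):
    """Approximate one-sided 95%/95% normal tolerance factor (z = 95th percentile)."""
    a = 1.0 - z ** 2 / (2.0 * (n - 1))
    b = z ** 2 - z ** 2 / n
    return (z + math.sqrt(z ** 2 - a * b)) / a

# Invented background concentrations (pCi/g) for 20 surficial soil samples.
data = [0.8, 1.1, 0.9, 1.4, 1.0, 1.2, 0.7, 1.3, 1.1, 0.9,
        1.0, 1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.2, 1.1]
k = one_sided_k(len(data))
utl95 = statistics.fmean(data) + k * statistics.stdev(data)
```

    For lognormal data the same computation is applied to the log-transformed values and back-transformed; for heavily censored data sets (many non-detects), a non-parametric UTL based on an upper order statistic is used instead, as the abstract notes.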

  2. Using Spline Regression in Semi-Parametric Stochastic Frontier Analysis: An Application to Polish Dairy Farms

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    of specifying an unsuitable functional form and thus, model misspecification and biased parameter estimates. Given these problems of the DEA and the SFA, Fan, Li and Weersink (1996) proposed a semi-parametric stochastic frontier model that estimates the production function (frontier) by non......), Kumbhakar et al. (2007), and Henningsen and Kumbhakar (2009). The aim of this paper and its main contribution to the existing literature is the estimation semi-parametric stochastic frontier models using a different non-parametric estimation technique: spline regression (Ma et al. 2011). We apply...... efficiency of Polish dairy farms contributes to the insight into this dynamic process. Furthermore, we compare and evaluate the results of this spline-based semi-parametric stochastic frontier model with results of other semi-parametric stochastic frontier models and of traditional parametric stochastic...

  3. Acceleration of the direct reconstruction of linear parametric images using nested algorithms

    International Nuclear Information System (INIS)

    Wang Guobao; Qi Jinyi

    2010-01-01

    Parametric imaging using dynamic positron emission tomography (PET) provides important information for biological research and clinical diagnosis. Indirect and direct methods have been developed for reconstructing linear parametric images from dynamic PET data. Indirect methods are relatively simple and easy to implement because the image reconstruction and kinetic modeling are performed in two separate steps. Direct methods estimate parametric images directly from raw PET data and are statistically more efficient. However, the convergence rate of direct algorithms can be slow due to the coupling between the reconstruction and kinetic modeling. Here we present two fast gradient-type algorithms for direct reconstruction of linear parametric images. The new algorithms decouple the reconstruction and linear parametric modeling at each iteration by employing the principle of optimization transfer. Convergence speed is accelerated by running more sub-iterations of linear parametric estimation because the computation cost of the linear parametric modeling is much less than that of the image reconstruction. Computer simulation studies demonstrated that the new algorithms converge much faster than the traditional expectation maximization (EM) and the preconditioned conjugate gradient algorithms for dynamic PET.

  4. Stellar parametrization from Gaia RVS spectra

    Science.gov (United States)

    Recio-Blanco, A.; de Laverny, P.; Allende Prieto, C.; Fustes, D.; Manteiga, M.; Arcay, B.; Bijaoui, A.; Dafonte, C.; Ordenovic, C.; Ordoñez Blanco, D.

    2016-01-01

    Context. Among the myriad of data collected by the ESA Gaia satellite, about 150 million spectra will be delivered by the Radial Velocity Spectrometer (RVS) for stars as faint as GRVS ~ 16. A specific stellar parametrization will be performed on most of these RVS spectra, i.e. those with a high enough signal-to-noise ratio (S/N), which should correspond to single stars with a magnitude in the RVS band brighter than ~14.5. Some individual chemical abundances will also be estimated for the brightest targets. Aims: We describe the different parametrization codes that have been specifically developed or adapted for RVS spectra within the GSP-Spec working group of the analysis consortium. The tested codes are based on optimisation (FERRE and GAUGUIN), projection (MATISSE), or pattern-recognition methods (Artificial Neural Networks). We present and discuss their expected performances in the recovered stellar atmospheric parameters (effective temperature, surface gravity, overall metallicity) for B- to K-type stars. The performance in determining [α/Fe] ratios is also presented for cool stars. Methods: Each code was homogeneously tested with a large grid of simulated synthetic RVS spectra of BAFGK spectral types (dwarfs and giants), with metallicities varying from 10^-2.5 to 10^+0.5 times the solar metallicity, and taking variations of ±0.4 dex in the composition of the α-elements into consideration. The tests were performed for S/N ranging from ten to 350. Results: For all the stellar types we considered, stars brighter than GRVS ~ 12.5 are very efficiently parametrized by the GSP-Spec pipeline, including reliable estimations of [α/Fe]. Typical internal errors for FGK metal-rich and metal-intermediate stars are around 40 K in Teff, 0.10 dex in log(g), 0.04 dex in [M/H], and 0.03 dex in [α/Fe] at GRVS = 10.3. They degrade to 155 K in Teff, 0.15 dex in log(g), 0.10 dex in [M/H], and 0.1 dex in [α/Fe] at GRVS ~ 12. Similar accuracies in Teff and [M/H] are

  5. 2015 International Symposium in Statistics

    CERN Document Server

    2016-01-01

    This proceedings volume contains eight selected papers that were presented in the International Symposium in Statistics (ISS) 2015 On Advances in Parametric and Semi-parametric Analysis of Multivariate, Time Series, Spatial-temporal, and Familial-longitudinal Data, held in St. John’s, Canada from July 6 to 8, 2015. The main objective of the ISS-2015 was the discussion on advances and challenges in parametric and semi-parametric analysis for correlated data in both continuous and discrete setups. Thus, as a reflection of the theme of the symposium, the eight papers of this proceedings volume are presented in four parts. Part I is comprised of papers examining Elliptical t Distribution Theory. In Part II, the papers cover spatial and temporal data analysis. Part III is focused on longitudinal multinomial models in parametric and semi-parametric setups. Finally Part IV concludes with a paper on the inferences for longitudinal data subject to a challenge of important covariates selection from a set of large num...

  6. Different uptake of 99mTc-ECD and 99mTc-HMPAO in the same brains: analysis by statistical parametric mapping.

    Science.gov (United States)

    Hyun, Y; Lee, J S; Rha, J H; Lee, I K; Ha, C K; Lee, D S

    2001-02-01

    The purpose of this study was to investigate the differences between technetium-99m ethyl cysteinate dimer (99mTc-ECD) and technetium-99m hexamethylpropylene amine oxime (99mTc-HMPAO) uptake in the same brains by means of statistical parametric mapping (SPM) analysis. We examined 20 patients (9 male, 11 female, mean age 62+/-12 years) using 99mTc-ECD and 99mTc-HMPAO single-photon emission tomography (SPET) and magnetic resonance imaging (MRI) of the brain less than 7 days after onset of stroke. MRI showed no cortical infarctions. Infarctions in the pons (6 patients) and medulla (1), ischaemic periventricular white matter lesions (13) and lacunar infarction (7) were found on MRI. Split-dose and sequential SPET techniques were used for 99mTc-ECD and 99mTc-HMPAO brain SPET, without repositioning of the patient. All of the SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the 99mTc-ECD and 99mTc-HMPAO SPET images were statistically analysed using statistical parametric mapping (SPM) 96 software. The difference between two groups was considered significant at a threshold of uncorrected P values less than 0.01. Visual analysis showed no hypoperfused areas on either 99mTc-ECD or 99mTc-HMPAO SPET images. SPM analysis revealed significantly different uptake of 99mTc-ECD and 99mTc-HMPAO in the same brains. On the 99mTc-ECD SPET images, relatively higher uptake was observed in the frontal, parietal and occipital lobes, in the left superior temporal lobe and in the superior region of the cerebellum. On the 99mTc-HMPAO SPET images, relatively higher uptake was observed in the medial temporal lobes, thalami, periventricular white matter and brain stem. These differences in uptake of the two tracers in the same brains on SPM analysis suggest that interpretation of cerebral perfusion is possible using SPET with 99mTc-ECD and 99mTc-HMPAO.

  7. Optimized statistical parametric mapping procedure for NIRS data contaminated by motion artifacts : Neurometric analysis of body schema extension.

    Science.gov (United States)

    Suzuki, Satoshi

    2017-09-01

    This study investigated the spatial distribution of brain activity on body schema (BS) modification induced by natural body motion using two versions of a hand-tracing task. In Task 1, participants traced Japanese Hiragana characters using the right forefinger, requiring no BS expansion. In Task 2, participants performed the tracing task with a long stick, requiring BS expansion. Spatial distribution was analyzed using general linear model (GLM)-based statistical parametric mapping of near-infrared spectroscopy data contaminated with motion artifacts caused by the hand-tracing task. Three methods were utilized in series to counter the artifacts, and optimal conditions and modifications were investigated: a model-free method (Step 1), a convolution matrix method (Step 2), and a boxcar-function-based Gaussian convolution method (Step 3). The results revealed four methodological findings: (1) Deoxyhemoglobin was suitable for the GLM because both Akaike information criterion and the variance against the averaged hemodynamic response function were smaller than for other signals, (2) a high-pass filter with a cutoff frequency of .014 Hz was effective, (3) the hemodynamic response function computed from a Gaussian kernel function and its first- and second-derivative terms should be included in the GLM model, and (4) correction of non-autocorrelation and use of effective degrees of freedom were critical. Investigating z-maps computed according to these guidelines revealed that contiguous areas of BA7-BA40-BA21 in the right hemisphere became significantly activated ([Formula: see text], [Formula: see text], and [Formula: see text], respectively) during BS modification while performing the hand-tracing task.

  8. Modified Distribution-Free Goodness-of-Fit Test Statistic.

    Science.gov (United States)

    Chun, So Yeon; Browne, Michael W; Shapiro, Alexander

    2018-03-01

    Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.

  9. A NEW TEST OF THE STATISTICAL NATURE OF THE BRIGHTEST CLUSTER GALAXIES

    International Nuclear Information System (INIS)

    Lin, Yen-Ting; Ostriker, Jeremiah P.; Miller, Christopher J.

    2010-01-01

    A novel statistic is proposed to examine the hypothesis that all cluster galaxies are drawn from the same luminosity distribution (LD). In such a 'statistical model' of galaxy LD, the brightest cluster galaxies (BCGs) are simply the statistical extreme of the galaxy population. Using a large sample of nearby clusters, we show that BCGs in high luminosity clusters (e.g., L_tot ≳ 4 x 10^11 h_70^-2 L_sun) are unlikely (probability ≤ 3 x 10^-4) to be drawn from the LD defined by all red cluster galaxies more luminous than M_r = -20. On the other hand, BCGs in less luminous clusters are consistent with being the statistical extreme. Applying our method to the second brightest galaxies, we show that they are consistent with being the statistical extreme, which implies that the BCGs are also distinct from non-BCG luminous, red, cluster galaxies. We point out some issues with the interpretation of the classical tests proposed by Tremaine and Richstone (TR) that are designed to examine the statistical nature of BCGs, investigate the robustness of both our statistical test and those of TR against difficulties in photometry of galaxies of large angular size, and discuss the implication of our findings on surveys that use the luminous red galaxies to measure the baryon acoustic oscillation features in the galaxy power spectrum.

  10. Statistical tests for the Gaussian nature of primordial fluctuations through CBR experiments

    International Nuclear Information System (INIS)

    Luo, X.

    1994-01-01

    Information about the physical processes that generate the primordial fluctuations in the early Universe can be gained by testing the Gaussian nature of the fluctuations through cosmic microwave background radiation (CBR) temperature anisotropy experiments. One of the crucial aspects of density perturbations that are produced by the standard inflation scenario is that they are Gaussian, whereas seeds produced by topological defects left over from an early cosmic phase transition tend to be non-Gaussian. To carry out this test, sophisticated statistical tools are required. In this paper, we will discuss several such statistical tools, including multivariate skewness and kurtosis, Euler-Poincare characteristics, the three-point temperature correlation function, and Hotelling's T^2 statistic defined through bispectral estimates of a one-dimensional data set. The effect of noise present in the current data is discussed in detail and the COBE 53 GHz data set is analyzed. Our analysis shows that, on the large angular scales to which COBE is sensitive, the statistics are probably Gaussian. On small angular scales, the importance of Hotelling's T^2 statistic is stressed, and the minimum sample size required to test Gaussianity is estimated. Although the current data set available from various experiments at half-degree scales is still too small, improvement of the data set by roughly a factor of 2 will be enough to test Gaussianity statistically. On the arc min scale, we analyze the recent RING data through bispectral analysis, and the result indicates possible deviation from Gaussianity. Effects of point sources are also discussed. It is pointed out that the Gaussianity problem can be resolved in the near future by ground-based or balloon-borne experiments.

  11. Investigation of olfactory function in normal volunteers and patients with anosmia : analysis of brain perfusion SPECTs using statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Y. A.; Kim, S. H.; Sohn, H. S.; Chung, S. K. [Catholic University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    The purpose of this study was to investigate olfactory function with Tc-99m ECD brain perfusion SPECT using statistical parametric mapping (SPM) analysis in normal volunteers and patients with anosmia. The study population comprised 8 matched healthy volunteers and 16 patients with anosmia. We obtained baseline and post-stimulation (3% butanol) brain perfusion SPECTs in a silent dark room. All SPECTs were analyzed using SPM. The differences between the two sets of brain perfusion SPECTs were compared with a t-test. Voxels with a p-value of less than 0.01 were considered significantly different. We demonstrated increased perfusion in both cingulate gyri, the right middle temporal gyrus, right superior and inferior frontal gyri, right lingual gyrus and right fusiform gyrus on post-stimulation brain SPECT in normal volunteers, and decreased perfusion in both cingulate gyri, the right middle temporal gyrus, right rectal gyrus and both superior and inferior frontal gyri in 10 of the patients with anosmia. No significant hypoperfusion area was observed in the other 6 patients with anosmia. Baseline and post-stimulation brain perfusion SPECTs can be helpful in the evaluation of olfactory function and useful in the diagnosis of anosmia.

  12. Investigation of olfactory function in normal volunteers and patients with anosmia : analysis of brain perfusion SPECTs using statistical parametric mapping

    International Nuclear Information System (INIS)

    Chung, Y. A.; Kim, S. H.; Sohn, H. S.; Chung, S. K.

    2002-01-01

    The purpose of this study was to investigate olfactory function with Tc-99m ECD brain perfusion SPECT using statistical parametric mapping (SPM) analysis in normal volunteers and patients with anosmia. The study population comprised 8 matched healthy volunteers and 16 patients with anosmia. We obtained baseline and post-stimulation (3% butanol) brain perfusion SPECTs in a silent dark room. All SPECTs were analyzed using SPM. The differences between the two sets of brain perfusion SPECTs were compared with a t-test. Voxels with a p-value of less than 0.01 were considered significantly different. We demonstrated increased perfusion in both cingulate gyri, the right middle temporal gyrus, right superior and inferior frontal gyri, right lingual gyrus and right fusiform gyrus on post-stimulation brain SPECT in normal volunteers, and decreased perfusion in both cingulate gyri, the right middle temporal gyrus, right rectal gyrus and both superior and inferior frontal gyri in 10 of the patients with anosmia. No significant hypoperfusion area was observed in the other 6 patients with anosmia. Baseline and post-stimulation brain perfusion SPECTs can be helpful in the evaluation of olfactory function and useful in the diagnosis of anosmia.

  13. Controlling flexible rotor vibrations using parametric excitation

    Energy Technology Data Exchange (ETDEWEB)

    Atepor, L, E-mail: katepor@yahoo.co [Department of Mechanical Engineering, University of Glasgow, G12 8QQ (United Kingdom)

    2009-08-01

    This paper presents both theoretical and experimental studies of an active vibration controller for vibration in a flexible rotor system. The paper shows that the vibration amplitude can be modified by introducing an axial parametric excitation. The perturbation method of multiple scales is used to solve the equations of motion. The steady-state responses, with and without the parametric excitation terms, are investigated. An experimental test machine uses a piezoelectric exciter mounted on the end of the shaft. The results show a reduction in the rotor response amplitude under principal parametric resonance, and some good correlation between theory and experiment.

  14. MRI volumetric measurement of hippocampal formation based on statistical parametric mapping

    International Nuclear Information System (INIS)

    Hua Jianming; Jiang Biao; Zhou Jiong; Zhang Weimin

    2010-01-01

    Objective: To study MRI volumetric measurement of the hippocampal formation using statistical parametric mapping (SPM) software, and to discuss the value of the method as applied to Alzheimer's disease (AD). Methods: The SPM software was used to segment the three-dimensional MRI brain image into gray matter, white matter and CSF. The bilateral hippocampal formations in both the AD group and the normal control group were delineated and the volumes were measured. The SPM method was compared with the conventional method based on regions of interest (ROI), which is the gold standard of volume measurement. The time used in measuring the volume by each of the two methods was recorded and compared by a two-independent-samples t test. Moreover, 7 physicians measured the left hippocampal formation of the same control subject with both methods. The frequency distribution and dispersion of the data acquired with the two methods were evaluated using the coefficient of variation. Results: (1) The volume of the bilateral hippocampal formations measured by the SPM method was (1.88 ± 0.07) cm³ and (1.93 ± 0.08) cm³ respectively in the AD group, and (2.99 ± 0.07) cm³ and (3.02 ± 0.06) cm³ in the control group. The volume of the bilateral hippocampal formations measured by the ROI method was (1.87 ± 0.06) cm³ and (1.91 ± 0.09) cm³ in the AD group, and (2.97 ± 0.08) cm³ and (3.00 ± 0.05) cm³ in the control group. There was no significant difference between the SPM method and the conventional ROI method in the AD group or the control group (t=1.500, 1.617, 1.095, 1.889, P>0.05). However, the time used for delineation and volume measurement was significantly different: (38.1 ± 2.0) min for the SPM measurement versus (55.4 ± 2.4) min for the ROI measurement (t=-25.918, P<0.05). The frequency distribution of hippocampal formation volume measured by the SPM method and the ROI method was different: the CV of the SPM method was 7% and that of the ROI method was 19%. Conclusions: The borders of

  15. [Do we always correctly interpret the results of statistical nonparametric tests].

    Science.gov (United States)

    Moczko, Jerzy A

    2014-01-01

    Mann-Whitney, Wilcoxon, Kruskal-Wallis and Friedman tests form a group of tests commonly used to analyze clinical and laboratory data. These tests are considered extremely flexible, and their asymptotic relative efficiency exceeds 95 percent. Compared with the corresponding parametric tests, they do not require checking the fulfillment of conditions such as normality of the data distribution, homogeneity of variance, the absence of correlation between means and standard deviations, etc. They can be used with both interval and ordinal scales. Using the Mann-Whitney test as an example, the article shows that treating these four nonparametric tests as a kind of gold standard does not in every case lead to correct inference.

  16. Statistical time series methods for damage diagnosis in a scale aircraft skeleton structure: loosened bolts damage scenarios

    International Nuclear Information System (INIS)

    Kopsaftopoulos, Fotis P; Fassois, Spilios D

    2011-01-01

    A comparative assessment of several vibration based statistical time series methods for Structural Health Monitoring (SHM) is presented via their application to a scale aircraft skeleton laboratory structure. A brief overview of the methods, which are either scalar or vector type, non-parametric or parametric, and pertain to either the response-only or excitation-response cases, is provided. Damage diagnosis, including both the detection and identification subproblems, is tackled via scalar or vector vibration signals. The methods' effectiveness is assessed via repeated experiments under various damage scenarios, with each scenario corresponding to the loosening of one or more selected bolts. The results of the study confirm the 'global' damage detection capability and effectiveness of statistical time series methods for SHM.

  17. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping(SPM)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo [Ajou University School of Medicine, Suwon (Korea, Republic of); Lee, Jae Sung [Seoul national University College of Medicine, Seoul (Korea, Republic of)

    2002-07-01

    This study compared rCBF pattern in normal adult and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution pattern not seen visual analysis in both groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 year old) and 6 normal control children (M:F=4:2, 10.5{+-}3.1y) who visited psychiatry clinic to evaluate ADHD. Their brain SPECT revealed normal rCBF pattern in visual analysis and they were diagnosed clinically normal. Using SPM method, we compared normal adult group's SPECT images with those of 6 normal children subjects and measured the extent of the area with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angnlar gyrus, both postcentral gyrus, both superior frontal gyrus, and both superior parietal lobe showed significant hyperperfusion in normal adult group compared with normal children group. The areas of left amygdala gyrus, brain stem, both cerebellum, left globus pallidus, both hippocampal formations, both parahippocampal gyrus, both thalamus, both uncus, both lateral and medial occipitotemporal gyrus revealed significantly hyperperfusion in the children. These results demonstrated that SPM can say more precise anatomical area difference not seen visual analysis.

  18. Comparison of normal adult and children brain SPECT imaging using statistical parametric mapping(SPM)

    International Nuclear Information System (INIS)

    Lee, Myoung Hoon; Yoon, Seok Nam; Joh, Chul Woo; Lee, Dong Soo; Lee, Jae Sung

    2002-01-01

    This study compared rCBF pattern in normal adult and normal children using statistical parametric mapping (SPM). The purpose of this study was to determine distribution pattern not seen visual analysis in both groups. Tc-99m ECD brain SPECT was performed in 12 normal adults (M:F=11:1, average age 35 year old) and 6 normal control children (M:F=4:2, 10.5±3.1y) who visited psychiatry clinic to evaluate ADHD. Their brain SPECT revealed normal rCBF pattern in visual analysis and they were diagnosed clinically normal. Using SPM method, we compared normal adult group's SPECT images with those of 6 normal children subjects and measured the extent of the area with significant hypoperfusion and hyperperfusion (p<0.001, extent threshold=16). The areas of both angnlar gyrus, both postcentral gyrus, both superior frontal gyrus, and both superior parietal lobe showed significant hyperperfusion in normal adult group compared with normal children group. The areas of left amygdala gyrus, brain stem, both cerebellum, left globus pallidus, both hippocampal formations, both parahippocampal gyrus, both thalamus, both uncus, both lateral and medial occipitotemporal gyrus revealed significantly hyperperfusion in the children. These results demonstrated that SPM can say more precise anatomical area difference not seen visual analysis

  19. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    Science.gov (United States)

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain, making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on the assumptions and limitations of the methods reviewed. Several methods are available to analyse FNI data, indicating that none is optimal for all purposes. In order to make optimal use of the available methods it is important to know their limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview of some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  20. Critical statistics for non-Hermitian matrices

    International Nuclear Information System (INIS)

    Garcia-Garcia, A.M.; Verbaarschot, J.J.M.; Nishigaki, S.M.

    2002-01-01

    We introduce a generalized ensemble of non-Hermitian matrices interpolating between the Gaussian Unitary Ensemble, the Ginibre ensemble, and the Poisson ensemble. The joint eigenvalue distribution of this model is obtained by means of an extension of the Itzykson-Zuber formula to general complex matrices. Its correlation functions are studied both in the case of weak non-Hermiticity and in the case of strong non-Hermiticity. In the weak non-Hermiticity limit we show that the spectral correlations in the bulk of the spectrum display critical statistics: the asymptotic linear behavior of the number variance is already approached for energy differences of the order of the eigenvalue spacing. To lowest order, its slope does not depend on the degree of non-Hermiticity. Close to the edge, the spectral correlations are similar to the Hermitian case. In the strong non-Hermiticity limit the crossover behavior from the Ginibre ensemble to the Poisson ensemble first appears close to the surface of the spectrum. Our model may be relevant for the description of the spectral correlations of an open disordered system close to an Anderson transition.
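
    The Ginibre ensemble referred to in this record is easy to sample numerically; a small numpy sketch (matrix size and seed chosen arbitrarily) draws one realization and checks that the eigenvalues approximately fill the unit disk, as the circular law predicts.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
# Complex Ginibre ensemble: i.i.d. complex Gaussian entries with variance 1/n
G = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)

eigvals = np.linalg.eigvals(G)
# Circular law: for large n the eigenvalues fill the unit disk almost uniformly
max_radius = np.abs(eigvals).max()
print(f"{len(eigvals)} eigenvalues, max |z| = {max_radius:.3f}")
```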

  1. Similar tests and the standardized log likelihood ratio statistic

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    1986-01-01

    When testing an affine hypothesis in an exponential family the 'ideal' procedure is to calculate the exact similar test, or an approximation to this, based on the conditional distribution given the minimal sufficient statistic under the null hypothesis. By contrast, there is a 'primitive' approach in which the marginal distribution of a test statistic is considered and any nuisance parameter appearing in the test statistic is replaced by an estimate. We show here that when using standardized likelihood ratio statistics the 'primitive' procedure is in fact an 'ideal' procedure to order O(n -3...

  2. Two independent pivotal statistics that test location and misspecification and add-up to the Anderson-Rubin statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2002-01-01

    We extend the novel pivotal statistics for testing the parameters in the instrumental variables regression model. We show that these statistics result from a decomposition of the Anderson-Rubin statistic into two independent pivotal statistics. The first statistic is a score statistic that tests

  3. Statistical Multiplicity in Systematic Reviews of Anaesthesia Interventions: A Quantification and Comparison between Cochrane and Non-Cochrane Reviews

    DEFF Research Database (Denmark)

    Imberger, Georgina; Vejlby, Alexandra Hedvig Damgaard; Hansen, Sara Bohnstedt

    2011-01-01

    Systematic reviews with meta-analyses often contain many statistical tests. This multiplicity may increase the risk of type I error. Few attempts have been made to address the problem of statistical multiplicity in systematic reviews. Before the implications are properly considered, the size of the issue deserves clarification. Because of the emphasis on bias evaluation and because of the editorial processes involved, Cochrane reviews may contain more multiplicity than their non-Cochrane counterparts. This study measured the quantity of statistical multiplicity present in a population of systematic reviews and aimed to assess whether this quantity is different in Cochrane and non-Cochrane reviews.

  4. A Statistical Analysis of Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Stephen Chan

    2017-05-01

    Full Text Available We analyze statistical properties of the largest cryptocurrencies (determined by market capitalization, of which Bitcoin is the most prominent example. We characterize their exchange rates versus the U.S. Dollar by fitting parametric distributions to them. It is shown that returns are clearly non-normal, however, no single distribution fits well jointly to all the cryptocurrencies analysed. We find that for the most popular currencies, such as Bitcoin and Litecoin, the generalized hyperbolic distribution gives the best fit, while for the smaller cryptocurrencies the normal inverse Gaussian distribution, generalized t distribution, and Laplace distribution give good fits. The results are important for investment and risk management purposes.
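
    The distribution-fitting comparison described in this record can be sketched with scipy: fit candidate parametric families by maximum likelihood and compare their log-likelihoods. The heavy-tailed "returns" below are synthetic stand-ins (a Student t sample), not real cryptocurrency data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Illustrative heavy-tailed "returns" (Student t with 3 degrees of freedom)
returns = stats.t.rvs(df=3, size=2000, random_state=rng)

# Fit each candidate family by maximum likelihood, record its log-likelihood
fits = {}
for name, dist in [("normal", stats.norm), ("laplace", stats.laplace)]:
    params = dist.fit(returns)
    fits[name] = dist.logpdf(returns, *params).sum()

best = max(fits, key=fits.get)
print({k: round(v, 1) for k, v in fits.items()}, "-> best:", best)
```

    For heavy-tailed data the Gaussian is penalized quadratically by outliers, so the Laplace fit wins here, mirroring the paper's finding that returns are clearly non-normal.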

  5. Altered glucose metabolism in juvenile myoclonic epilepsy: a PET study with statistical parametric mapping

    International Nuclear Information System (INIS)

    Lim, G. C.; Kim, J. H.; Kang, J. G.; Kim, J. S.; Yeo, J. S.; Lee, S. A.; Moon, D. H

    2004-01-01

    Juvenile myoclonic epilepsy (JME) is a hereditary, age-dependent epilepsy syndrome, characterized by myoclonic jerks on awakening and generalized tonic-clonic seizures. Although there have been considerable studies on the mechanism to elucidate the pathogenesis of JME, the accurate pathogenesis of JME remains obscure. The aim of this study was to investigate alterations of cerebral glucose metabolism in patients with JME. We studied 16 JME patients (mean age: 22 yrs, M/F: 9/7) with brain FDG-PET and simultaneous EEG recording. On the basis of the number of generalized spike-and-wave (GSW) discharges on the 30 min EEG recording after the injection of FDG (370 MBq), we classified patients into two groups (patients in group A had 10 or more GSW; patients in group B, 9 or fewer). We applied the automated and objective technique of statistical parametric mapping (SPM) to the analysis of FDG-PET to determine the significant hyper- and hypometabolic regions compared with those of 19 age-matched normal control subjects. We found significant hypermetabolic regions in the bilateral thalamus and the central portion of the upper brainstem in 16 patients with JME at a statistical threshold of uncorrected P < 0.05. These changes were also shown in group A (n=8), but not in group B (n=8). Additionally, we found significant hypometabolism in bilateral, widespread cortical regions in 16 patients with JME at a threshold of uncorrected P < 0.01. Similar hypometabolic patterns were also observed in both group A and group B, being more prominent in group A. This study provides evidence for the key role of the thalamus and brainstem reticular activating system in generating spontaneous GSW discharges, which is considered a fundamental pathogenesis underlying JME. This study also suggests that patients with JME might suffer from subtle abnormalities of cognitive and executive cortical functions.

  6. Altered glucose metabolism in juvenile myoclonic epilepsy: a PET study with statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lim, G. C.; Kim, J. H.; Kang, J. G.; Kim, J. S.; Yeo, J. S.; Lee, S. A.; Moon, D. H [Asan Medical Center, Seoul (Korea, Republic of)

    2004-07-01

    Juvenile myoclonic epilepsy (JME) is a hereditary, age-dependent epilepsy syndrome, characterized by myoclonic jerks on awakening and generalized tonic-clonic seizures. Although there have been considerable studies on the mechanism to elucidate the pathogenesis of JME, the accurate pathogenesis of JME remains obscure. The aim of this study was to investigate alterations of cerebral glucose metabolism in patients with JME. We studied 16 JME patients (mean age: 22 yrs, M/F: 9/7) with brain FDG-PET and simultaneous EEG recording. On the basis of the number of generalized spike-and-wave (GSW) discharges on the 30 min EEG recording after the injection of FDG (370 MBq), we classified patients into two groups (patients in group A had 10 or more GSW; patients in group B, 9 or fewer). We applied the automated and objective technique of statistical parametric mapping (SPM) to the analysis of FDG-PET to determine the significant hyper- and hypometabolic regions compared with those of 19 age-matched normal control subjects. We found significant hypermetabolic regions in the bilateral thalamus and the central portion of the upper brainstem in 16 patients with JME at a statistical threshold of uncorrected P < 0.05. These changes were also shown in group A (n=8), but not in group B (n=8). Additionally, we found significant hypometabolism in bilateral, widespread cortical regions in 16 patients with JME at a threshold of uncorrected P < 0.01. Similar hypometabolic patterns were also observed in both group A and group B, being more prominent in group A. This study provides evidence for the key role of the thalamus and brainstem reticular activating system in generating spontaneous GSW discharges, which is considered a fundamental pathogenesis underlying JME. This study also suggests that patients with JME might suffer from subtle abnormalities of cognitive and executive cortical functions.

  7. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    Science.gov (United States)

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control for false positive findings in this multiple hypothesis testing. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters, and therefore, by construction rejected the large clusters as false positives at the nominal
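
    The suprathreshold-cluster behavior of smooth random fields described in this record can be explored with a small simulation; this is an illustrative 2D sketch (field size, smoothing width and threshold are arbitrary choices), not the authors' pipeline.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
# Smooth 2D Gaussian random field: white noise convolved with a Gaussian kernel
field = ndimage.gaussian_filter(rng.normal(size=(256, 256)), sigma=3.0)
field /= field.std()  # rescale to unit variance

# Threshold the field and measure cluster sizes, as in cluster-level inference
labels, n_clusters = ndimage.label(field > 2.0)
sizes = np.bincount(labels.ravel())[1:]  # drop the background label 0
print(f"{n_clusters} suprathreshold clusters; largest = {sizes.max()} pixels")
```

    Repeating this over many realizations shows a heavy right tail in the cluster-size distribution, i.e. the occasional very large cluster that drives the parametric FWER inflation discussed above.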

  8. Multi parametrical indicator test for urban wastewater influence

    Science.gov (United States)

    Humer, Franko; Weiss, Stefan; Reinnicke, Sandra; Clara, Manfred; Grath, Johannes; Windhofer, Georg

    2013-04-01

    Austria's drinking water is abstracted from groundwater. While 50 % of the Austrian population are supplied with spring water, the other 50 % get their drinking water from groundwater supplies, in part from enormous quaternary valley and basin deposits, subjected to intensive use by population, industry, agriculture and traffic/transport. Due to protected areas around drinking water wells and springs, there is no treatment necessary in most cases. Water bodies, however, can be affected by different pathways from natural, industrial and urban sources. Identification of anthropogenic sources is paramount for taking appropriate measures to safeguard the quality of drinking water supply. Common parameters like boron are widely used as tracers indicating anthropogenic impacts (e.g. wastewater contamination of groundwater systems). Unfortunately application of these conventional indicators is often limited due to high dilution. Another application where common parameters have their limits is the identification and quantification of the diffuse nitrogen input to water by the stable isotopes of nitrogen and oxygen in nitrate. Without any additional tracers the source distinction of nitrate from manure or waste water is still difficult. Even the application of boron isotopes can in some cases not avoid ambiguous interpretation. Therefore the Umweltbundesamt (Environment Agency Austria) developed a multi parametrical indicator test which shall allow for identification and quantification of anthropogenic pollutions. The test aims at analysing eight target substances which are well known to occur in wastewater: Acesulfame and sucralose (two artificial, calorie-free sweeteners), benzotriazole and tolyltriazole (two industrial chemicals/corrosion inhibitors), metoprolol, sotalol, carbamazepine and the metabolite 10,11-Dihydro-10,11-dihydroxycarbamazepin (pharmaceuticals). These substances are polar and degradation in the aquatic system by microbiological processes is not

  9. Non-parametric characterization of long-term rainfall time series

    Science.gov (United States)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying, and characterizing long-term rainfall time series could aid in improving hydrological systems forecasting. In the present study, eventual statistics was applied for the long-term (1851-2006) rainfall time series under seven meteorological regions of India. Linear trend analysis was carried out using Mann-Kendall test for the observed rainfall series. The observed trend using the above-mentioned approach has been ascertained using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. Sequential Mann-Kendall test has also been carried out to examine nonlinear trends of the series. The partial sum of cumulative deviation test is also found to be suitable to detect the nonlinear trend. Innovative trend analysis, sequential Mann-Kendall test and partial cumulative deviation test have potential to detect the general as well as nonlinear trend for the rainfall time series. Annual rainfall analysis suggests that the maximum changes in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous Indian region. The innovative trend analysis method is also capable of finding the number of change point available in the time series. Additionally, we have performed von Neumann ratio test and cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. Monsoon season (JS) of North Mountainous India and West Peninsular India zones has higher departure from homogeneity and singular spectrum analysis shows the results to be in coherence with the same.
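
    The Mann-Kendall trend test used throughout this record is simple to implement directly; a sketch follows using the standard S statistic with a normal approximation (no tie correction), applied to a synthetic rainfall series with an invented upward trend, not the Indian data analysed in the paper.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns S, Z, two-sided p."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()  # concordant minus discordant pairs
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    p = 2 * stats.norm.sf(abs(z))
    return s, z, p

rng = np.random.default_rng(3)
years = np.arange(50)
rainfall = 800 + 2.0 * years + rng.normal(scale=20, size=50)  # synthetic upward trend
s, z, p = mann_kendall(rainfall)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.3g}")
```

    A positive S with a small p indicates a monotonic increasing trend; the sequential variant in the paper applies this statistic to progressive sub-series to locate where the trend changes.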

  10. Caveats for using statistical significance tests in research assessments

    DEFF Research Database (Denmark)

    Schneider, Jesper Wiborg

    2013-01-01

    This article raises concerns about the advantages of using statistical significance tests in research assessments, as has recently been suggested in the debate about proper normalization procedures for citation indicators by Opthof and Leydesdorff (2010). Statistical significance tests are highly controversial and numerous criticisms have been leveled against their use. Based on examples from articles by proponents of the use of statistical significance tests in research assessments, we address some of the numerous problems with such tests. The issues specifically discussed are the ritual practice... We argue that applying statistical significance tests and mechanically adhering to their results is highly problematic and detrimental to critical thinking. We claim that the use of such tests does not provide any advantages in relation to deciding whether differences between citation indicators...

  11. A Neural Parametric Singing Synthesizer Modeling Timbre and Expression from Natural Songs

    Directory of Open Access Journals (Sweden)

    Merlijn Blaauw

    2017-12-01

    Full Text Available We recently presented a new model for singing synthesis based on a modified version of the WaveNet architecture. Instead of modeling raw waveform, we model features produced by a parametric vocoder that separates the influence of pitch and timbre. This allows conveniently modifying pitch to match any target melody, facilitates training on more modest dataset sizes, and significantly reduces training and generation times. Nonetheless, compared to modeling waveform directly, ways of effectively handling higher-dimensional outputs, multiple feature streams and regularization become more important with our approach. In this work, we extend our proposed system to include additional components for predicting F0 and phonetic timings from a musical score with lyrics. These expression-related features are learned together with timbrical features from a single set of natural songs. We compare our method to existing statistical parametric, concatenative, and neural network-based approaches using quantitative metrics as well as listening tests.

  12. Teaching Statistics in Language Testing Courses

    Science.gov (United States)

    Brown, James Dean

    2013-01-01

    The purpose of this article is to examine the literature on teaching statistics for useful ideas that teachers of language testing courses can draw on and incorporate into their teaching toolkits as they see fit. To those ends, the article addresses eight questions: What is known generally about teaching statistics? Why are students so anxious…

  13. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  14. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion, and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.
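
    The likelihood ratio criterion mentioned in this record reduces, for a single mean shift in a normal sequence with common variance, to scanning split points and minimizing the pooled sum of squared errors. The sketch below illustrates this on synthetic data with an invented change at index 60; it is a minimal single-change-point version, not the book's full treatment.

```python
import numpy as np

def detect_change_point(x):
    """Single mean-shift change point via the likelihood-ratio criterion.
    With a known common variance this is equivalent to minimizing the
    pooled SSE of the two segments over all split points."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_k, best_sse = None, np.inf
    for k in range(2, n - 1):  # require at least 2 points per segment
        sse = ((x[:k] - x[:k].mean()) ** 2).sum() \
            + ((x[k:] - x[k:].mean()) ** 2).sum()
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

rng = np.random.default_rng(5)
series = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.5, 1.0, 40)])
k_hat = detect_change_point(series)
print("estimated change point:", k_hat)
```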

  15. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
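
    The record's FDR control step has a well-known frequentist analogue, the Benjamini-Hochberg step-up procedure; the sketch below implements that analogue (not the authors' Bayesian FDR algorithm) on a synthetic mix of null and signal p-values.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure: returns a boolean rejection mask."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m       # rank-dependent thresholds
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])           # largest rank meeting its threshold
        reject[order[: k + 1]] = True              # reject all hypotheses up to rank k
    return reject

# 90 null p-values (uniform) mixed with 10 strong signals
rng = np.random.default_rng(2)
pvals = np.concatenate([rng.uniform(size=90), rng.uniform(0, 1e-4, size=10)])
rejected = benjamini_hochberg(pvals)
print("rejections:", int(rejected.sum()))
```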

  16. The choice of statistical methods for comparisons of dosimetric data in radiotherapy.

    Science.gov (United States)

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-09-18

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the security and the clinical outcome of treatments. These changes could raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and tests whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method computes the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the new methods calculate the dose with tissue density correction for 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively; then non-parametric statistical tests were performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman's test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman's and Kendall's rank tests. Friedman's test showed a significant effect of the calculation method on the delivered dose for lung cancer patients (p Wilcoxon signed-rank test of paired comparisons indicated that the delivered dose was significantly reduced using density-corrected methods as compared to the reference method. Spearman's and Kendall's rank tests indicated a positive correlation between the doses calculated with the different methods
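
    The test sequence described in this record (assumption checks, non-parametric omnibus test, paired post-hoc test, rank correlation) maps directly onto scipy calls. The sketch below uses simulated dose values with invented means and spreads standing in for the PBC, MB and ETAR calculations; only the 62-field sample size is taken from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 62  # one value per treatment field, matching the study design
ref = rng.lognormal(mean=5.0, sigma=0.1, size=n)  # stand-in for PBC doses
mb = ref * rng.normal(0.97, 0.01, size=n)         # MB: ~3% lower on average
etar = ref * rng.normal(0.96, 0.01, size=n)       # ETAR: ~4% lower on average

# 1. Assumption checks: normality (Shapiro-Wilk) and variance homogeneity (Levene)
_, p_sw = stats.shapiro(ref)
_, p_lev = stats.levene(ref, mb, etar)

# 2. Non-parametric omnibus comparison across the three paired methods
_, p_fr = stats.friedmanchisquare(ref, mb, etar)

# 3. Paired post-hoc test and rank correlation between two of the methods
_, p_wx = stats.wilcoxon(ref, mb)
rho, _ = stats.spearmanr(ref, mb)

print(f"Shapiro p={p_sw:.3f}, Levene p={p_lev:.3f}, "
      f"Friedman p={p_fr:.2g}, Wilcoxon p={p_wx:.2g}, rho={rho:.3f}")
```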

  17. A Novel Parametric Model For The Human Respiratory System

    Directory of Open Access Journals (Sweden)

    Clara Mihaela IONESCU

    2003-12-01

    Full Text Available The purpose of this work is to present some recent results of an ongoing research project between Ghent University and the Chess Medical Technology Company, Belgium. The overall aim of the project is to provide a fast method for identification of the human respiratory system in order to allow an instantaneous diagnosis of the patient by the medical staff. A novel parametric model of the human respiratory system as well as the experimental results obtained are presented in this paper. A prototype apparatus developed by the company, based on the forced oscillation technique, is used to record experimental data from 4 patients. Signal processing is based on spectral analysis and is followed by the parametric identification of a non-linear mechanistic model. The parametric model is equivalent to the structure of a simple electrical RLC circuit containing a non-linear capacitor. These parameters have a useful and easy-to-interpret physical meaning for the medical staff members.
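
    The linearized version of the RLC analogue described in this record has impedance Z(ω) = R + jωL + 1/(jωC). The sketch below evaluates this over a forcing-frequency sweep; the parameter values are arbitrary illustrative numbers, not the paper's identified patient parameters, and the non-linear capacitor is not modeled.

```python
import numpy as np

# Illustrative parameter values (not from the paper): resistance R,
# inertance L, and compliance C of a linearized RLC analogue
R, L, C = 2.0, 0.01, 0.05  # arbitrary but dimensionally consistent units

freqs = np.linspace(0.5, 30.0, 100)  # forcing frequencies, Hz
omega = 2 * np.pi * freqs
Z = R + 1j * omega * L + 1.0 / (1j * omega * C)  # complex impedance

# Resonance: the reactance crosses zero at omega0 = 1/sqrt(L*C),
# where |Z| drops to its minimum, R
f_res = 1.0 / (2 * np.pi * np.sqrt(L * C))
print(f"resonance ~ {f_res:.2f} Hz; min |Z| over sweep = {abs(Z).min():.3f}")
```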

  18. The applications of Complexity Theory and Tsallis Non-extensive Statistics at Solar Plasma Dynamics

    Science.gov (United States)

    Pavlos, George

    2015-04-01

    As the solar plasma lives far from equilibrium it is an excellent laboratory for testing complexity theory and non-equilibrium statistical mechanics. In this study, we present the highlights of complexity theory and Tsallis non extensive statistical mechanics as concerns their applications at solar plasma dynamics, especially at sunspot, solar flare and solar wind phenomena. Generally, when a physical system is driven far from equilibrium states some novel characteristics can be observed related to the nonlinear character of dynamics. Generally, the nonlinearity in space plasma dynamics can generate intermittent turbulence with the typical characteristics of the anomalous diffusion process and strange topologies of stochastic space plasma fields (velocity and magnetic fields) caused by the strange dynamics and strange kinetics (Zaslavsky, 2002). In addition, according to Zelenyi and Milovanov (2004) the complex character of the space plasma system includes the existence of non-equilibrium (quasi)-stationary states (NESS) having the topology of a percolating fractal set. The stabilization of a system near the NESS is perceived as a transition into a turbulent state determined by self-organization processes. The long-range correlation effects manifest themselves as a strange non-Gaussian behavior of kinetic processes near the NESS plasma state. The complex character of space plasma can also be described by the non-extensive statistical thermodynamics pioneered by Tsallis, which offers a consistent and effective theoretical framework, based on a generalization of Boltzmann - Gibbs (BG) entropy, to describe far from equilibrium nonlinear complex dynamics (Tsallis, 2009). In a series of recent papers, the hypothesis of Tsallis non-extensive statistics in magnetosphere, sunspot dynamics, solar flares, solar wind and space plasma in general, was tested and verified (Karakatsanis et al., 2013; Pavlos et al., 2014; 2015). 
Our study includes the analysis of solar plasma time

  19. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code physical models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
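The Kruskal-Wallis clustering step described above can be sketched with a toy example. All region labels, error values, and sample sizes below are invented for illustration and are not taken from the RETRAN-3D study:

```python
# Hypothetical sketch: use the Kruskal-Wallis test to decide whether model
# "error" samples from different regions of the physical space share a
# common distribution, and hence can be pooled into one uncertainty pdf.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated model error samples from three (invented) experimental regions
errors_region_a = rng.normal(0.00, 0.05, size=40)
errors_region_b = rng.normal(0.00, 0.05, size=40)   # same distribution as A
errors_region_c = rng.normal(0.10, 0.05, size=40)   # systematically biased region

h_stat, p_value = stats.kruskal(errors_region_a, errors_region_b, errors_region_c)
# A small p-value indicates the regions should not all be pooled; regions
# whose error samples are indistinguishable can be clustered together.
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```

Here a significant result would drive the clustering to keep region C separate while merging A and B.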

  20. A Non-Parametric Delphi Approach to Foster Innovation Policy Debate in Spain

    Directory of Open Access Journals (Sweden)

    Juan Carlos Salazar-Elena

    2016-05-01

    The aim of this paper is to identify changes needed in Spain’s innovation policy to fill the gap between its innovation results and those of other European countries, in pursuit of sustainable leadership. To do this we apply the Delphi methodology to experts from academia, business, and government. To overcome the shortcomings of traditional descriptive methods, we develop an inferential analysis following a non-parametric bootstrap method, which enables us to identify important changes that should be implemented. Particularly interesting is the support found for improving the interconnections among the relevant agents of the innovation system (instead of focusing exclusively on the provision of knowledge and technological inputs through R&D activities), and the support found for “soft” policy instruments aimed at providing a homogeneous framework for assessing the innovation capabilities of firms (e.g., for funding purposes). Attention to potential innovators among small and medium enterprises (SMEs) and traditional industries is particularly encouraged by experts.
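The non-parametric bootstrap idea behind the inferential analysis can be illustrated with a minimal sketch: resample expert ratings with replacement to build a confidence interval for the panel mean, instead of relying on descriptive summaries alone. The Likert ratings below are invented:

```python
# Illustrative non-parametric bootstrap for a Delphi panel's mean rating.
import numpy as np

rng = np.random.default_rng(42)
ratings = np.array([4, 5, 3, 4, 5, 4, 2, 5, 4, 3, 4, 5])  # hypothetical 1-5 scores

boot_means = np.array([
    rng.choice(ratings, size=ratings.size, replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
# If the whole interval lies above the scale midpoint (3), panel support
# for the proposed policy change can be treated as statistically meaningful.
print(f"95% bootstrap CI for the mean rating: [{lo:.2f}, {hi:.2f}]")
```

The percentile interval requires no distributional assumption about the ratings, which is the point of using the bootstrap over a t-based interval here.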

  1. Non-parametric adaptive importance sampling for the probability estimation of a launcher impact position

    International Nuclear Information System (INIS)

    Morio, Jerome

    2011-01-01

    Importance sampling (IS) is a useful simulation technique for estimating critical probabilities with better accuracy than crude Monte Carlo methods. It consists in generating random weighted samples from an auxiliary distribution rather than the distribution of interest. The crucial part of this algorithm is the choice of an efficient auxiliary PDF, which must be able to generate the rare events of interest more frequently. In practice, the optimisation of this auxiliary distribution is often very difficult. In this article, we propose to approximate the optimal IS auxiliary density with non-parametric adaptive importance sampling (NAIS). We apply this technique to the probability estimation of a spatial launcher impact position, an increasingly important issue in the field of aeronautics.
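A minimal (non-adaptive) illustration of the IS idea: to estimate the rare-event probability P(X > 4) for X ~ N(0, 1), sample from a Gaussian auxiliary density shifted onto the rare region and reweight by the likelihood ratio. The threshold and auxiliary density are chosen purely for the sketch, not drawn from the paper:

```python
# Importance sampling estimate of a small Gaussian tail probability.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
threshold = 4.0
n = 100_000

# Auxiliary distribution centred on the rare region
x = rng.normal(loc=threshold, scale=1.0, size=n)
weights = stats.norm.pdf(x) / stats.norm.pdf(x, loc=threshold)  # likelihood ratios
p_is = np.mean((x > threshold) * weights)

p_true = stats.norm.sf(threshold)   # exact tail probability for comparison
print(p_is, p_true)
```

Crude Monte Carlo with the same budget would see only a handful of exceedances of the threshold; the shifted auxiliary density makes nearly half the samples land in the rare region, which is why the weighted estimate is far more accurate.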

  2. Investigation of olfactory function in normal volunteers by Tc-99m ECD Brain SPECT: Analysis using statistical parametric mapping

    International Nuclear Information System (INIS)

    Chung, Y.A.; Kim, S.H.; Park, Y.H.; Lee, S.Y.; Sohn, H.S.; Chung, S.K.

    2002-01-01

    The purpose of this study was to investigate olfactory function according to the Tc-99m ECD uptake pattern in brain perfusion SPET of normal volunteers by means of statistical parametric mapping (SPM) analysis. The study population comprised 8 healthy volunteers (M:F = 6:2, age range: 22-54 years, mean 34 years). We performed baseline brain perfusion SPET using 555 MBq of Tc-99m ECD in a silent dark room. Two hours later, we obtained brain perfusion SPET using 1110 MBq of Tc-99m ECD after odor stimulation with a 3% butanol solution under the same conditions. All SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the baseline and odor-identification SPET images were statistically analyzed using SPM-99 software. The difference between the two sets of brain perfusion SPET was considered significant at a threshold of uncorrected p values less than 0.01. SPM analysis revealed significant hyper-perfusion in both cingulate gyri, the right middle temporal gyrus, right superior and inferior frontal gyri, right lingual gyrus and right fusiform gyrus on odor-identification SPET. This study shows that brain perfusion SPET can securely support other diagnostic techniques in the evaluation of olfactory function.

  3. A non-parametric test for partial monotonicity in multiple regression

    NARCIS (Netherlands)

    van Beek, M.; Daniëls, H.A.M.

    Partial positive (negative) monotonicity in a dataset is the property that an increase in an independent variable, ceteris paribus, generates an increase (decrease) in the dependent variable. A test for partial monotonicity in datasets could (1) increase model performance if monotonicity may be

  4. The conformity of BPP and vibroacoustic stimulation results in fetal non reactive non stress test

    Directory of Open Access Journals (Sweden)

    M. Modarres

    2006-08-01

    Background: The most frequently used test for evaluation of fetal health is the Non-Stress Test (NST). Unfortunately, it has a high incidence of false positive results. The combination of vibroacoustic stimulation (VAS) with the NST has been shown to reduce non-reactive results. Methods: A test-assessment design with simple randomized sampling was used. 40 pregnant women with a non-reactive NST in the first 20 minutes, who received VAS in one of Tehran University's hospitals, were compared against BPP scores. Vibroacoustic stimulation was applied for 3 seconds on the maternal abdomen and the tracing was followed for 10 minutes. Data collection tools were the NST and sonography instruments, NST result papers, a toothbrush, a watch, a demographic questionnaire and a checklist. Data were analysed with descriptive statistics and Fisher's exact test (level of significance p<0.05). All statistical analyses were performed using SPSS/Win. Results: After VAS, 70% of non-reactive tracings became reactive. All cases with a fetal reactivity response after VAS had a subsequent BPP score of 8 (negative predictive value of 100%). The false-positive rate of VAS was lower than that of the NST. Conclusion: VAS offers benefits by decreasing the incidence of non-reactive tests and reducing test time. VAS lowers the rate of false-positive NSTs. VAS is safe and allows more efficient use of prenatal services. This test could be used as a rapid antepartum test to predict fetal well-being.

  5. PARAMETRIC DISTRIBUTION FAMILIES USEFUL FOR MODELLING NON - LIFE INSURANCE PAYMENTS DATA. TAIL BEHAVIOUR

    Directory of Open Access Journals (Sweden)

    Sandra Teodorescu

    2013-11-01

    The present paper describes a series of parametric distributions used for modeling non-life insurance payments data. Of those listed, special attention is paid to the transformed Beta distribution family. This distribution, as well as those obtained from it (special cases of the four-parameter transformed Beta distribution), is used in the modeling of high or even extreme costs. The literature follows the tail behaviour of these distributions as a function of their parameters, because insurance payments data are typically highly positively skewed and distributed with large upper tails. The paper describes the tail behavior of the distribution on the left and right sides respectively, and deduces from it a general case. Graphs of the probability density function of one of the transformed Beta family members are also given, reinforcing the comments made.

  6. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
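The permutation approach the paper critiques can be illustrated with a toy simulation, in which the largest |t| statistic across many correlated tests is compared against its permutation null distribution. All data, sizes, and the shared-factor correlation structure below are invented:

```python
# Toy global null test via the permutation distribution of the max |t|.
# A shared latent factor induces correlation across the "genes", which is
# exactly the situation in which naive tail counts become unstable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_per_group, n_tests = 20, 50
shared = rng.normal(size=(2 * n_per_group, 1))            # common factor
data = 0.8 * shared + 0.6 * rng.normal(size=(2 * n_per_group, n_tests))
labels = np.array([0] * n_per_group + [1] * n_per_group)  # null: labels arbitrary

def max_abs_t(y, lab):
    a, b = y[lab == 0], y[lab == 1]
    t = stats.ttest_ind(a, b, axis=0).statistic           # one t per test
    return np.max(np.abs(t))

observed = max_abs_t(data, labels)
perm = np.array([max_abs_t(data, rng.permutation(labels)) for _ in range(500)])
p_global = np.mean(perm >= observed)
print(f"max |t| = {observed:.2f}, permutation p = {p_global:.3f}")
```

The paper's point is that this unconditional p-value can mislead under correlation; conditioning on a spread measure of the observed histogram is the proposed remedy.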

  7. Weak Form Efficiency of the Chittagong Stock Exchange: An Empirical Analysis (2006-2016

    Directory of Open Access Journals (Sweden)

    Shahadat Hussain

    2017-01-01

    We study the random walk behavior of the Chittagong Stock Exchange (CSE) using daily returns of three indices for the period 2006 to 2016, employing both a non-parametric test (runs test) and parametric tests (autocorrelation coefficient test and the Ljung-Box (LB) statistic). The skewness and kurtosis properties of the daily return series are non-normal, with positively skewed and leptokurtic distributions. The results of the runs test, autocorrelation and Ljung-Box statistics provide evidence against random walk behavior in the Chittagong Stock Exchange. Overall, our results suggest that the Chittagong Stock Exchange does not exhibit weak-form efficiency. Hence, there is an opportunity for active investors to generate superior returns.

  8. Vector field statistical analysis of kinematic and force trajectories.

    Science.gov (United States)

    Pataky, Todd C; Robinson, Mark A; Vanrenterghem, Jos

    2013-09-27

    When investigating the dynamics of three-dimensional multi-body biomechanical systems it is often difficult to derive spatiotemporally directed predictions regarding experimentally induced effects. A paradigm of 'non-directed' hypothesis testing has emerged in the literature as a result. Non-directed analyses typically consist of ad hoc scalar extraction, an approach which substantially simplifies the original, highly multivariate datasets (many time points, many vector components). This paper describes a commensurately multivariate method as an alternative to scalar extraction. The method, called 'statistical parametric mapping' (SPM), uses random field theory to objectively identify field regions which co-vary significantly with the experimental design. We compared SPM to scalar extraction by re-analyzing three publicly available datasets: 3D knee kinematics, a ten-muscle force system, and 3D ground reaction forces. Scalar extraction was found to bias the analyses of all three datasets by failing to consider sufficient portions of the dataset, and/or by failing to consider covariance amongst vector components. SPM overcame both problems by conducting hypothesis testing at the (massively multivariate) vector trajectory level, with random field corrections simultaneously accounting for temporal correlation and vector covariance. While SPM has been widely demonstrated to be effective for analyzing 3D scalar fields, the current results are the first to demonstrate its effectiveness for 1D vector field analysis. It was concluded that SPM offers a generalized, statistically comprehensive solution to scalar extraction's over-simplification of vector trajectories, thereby making it useful for objectively guiding analyses of complex biomechanical systems. © 2013 Published by Elsevier Ltd. All rights reserved.

  9. Improving the Crossing-SIBTEST Statistic for Detecting Non-uniform DIF.

    Science.gov (United States)

    Chalmers, R Philip

    2018-06-01

    This paper demonstrates that, after applying a simple modification to Li and Stout's (Psychometrika 61(4):647-677, 1996) CSIBTEST statistic, an improved variant of the statistic could be realized. It is shown that this modified version of CSIBTEST has a more direct association with the SIBTEST statistic presented by Shealy and Stout (Psychometrika 58(2):159-194, 1993). In particular, the asymptotic sampling distributions and general interpretation of the effect size estimates are the same for SIBTEST and the new CSIBTEST. Given the more natural connection to SIBTEST, it is shown that Li and Stout's hypothesis testing approach is insufficient for CSIBTEST; thus, an improved hypothesis testing procedure is required. Based on the presented arguments, a new chi-squared-based hypothesis testing approach is proposed for the modified CSIBTEST statistic. Positive results from a modest Monte Carlo simulation study strongly suggest the original CSIBTEST procedure and randomization hypothesis testing approach should be replaced by the modified statistic and hypothesis testing method.

  10. Confidence intervals permit, but don't guarantee, better inference than statistical significance testing

    Directory of Open Access Journals (Sweden)

    Melissa Coulson

    2010-07-01

    A statistically significant result and a non-significant result may differ little, although significance status may tempt an interpretation of difference. Two studies are reported that compared interpretation of such results presented using null hypothesis significance testing (NHST) or confidence intervals (CIs). Authors of articles published in psychology, behavioural neuroscience, and medical journals were asked, via email, to interpret two fictitious studies that found similar results, one statistically significant and the other non-significant. Responses from 330 authors varied greatly, but interpretation was generally poor, whether results were presented as CIs or using NHST. However, when interpreting CIs, respondents who mentioned NHST were 60% likely to conclude, unjustifiably, that the two results conflicted, whereas those who interpreted CIs without reference to NHST were 95% likely to conclude, justifiably, that the two results were consistent. Findings were generally similar for all three disciplines. An email survey of academic psychologists confirmed that CIs elicit better interpretations if NHST is not invoked. Improved statistical inference can result from encouragement of meta-analytic thinking and use of CIs but, for full benefit, such highly desirable statistical reform also requires that researchers interpret CIs without recourse to NHST.
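The central point can be made numerically with two invented studies: nearly identical effects can fall on opposite sides of p = .05, while their confidence intervals make the consistency obvious. All summary numbers below are fabricated for illustration:

```python
# Two fictitious studies: similar effects, different significance status.
from scipy import stats

def summary(mean_diff, se, n1, n2):
    df = n1 + n2 - 2
    t = mean_diff / se
    p = 2 * stats.t.sf(abs(t), df)                      # two-sided p-value
    half_width = stats.t.ppf(0.975, df) * se
    return p, (mean_diff - half_width, mean_diff + half_width)

p1, ci1 = summary(mean_diff=2.0, se=0.9, n1=30, n2=30)  # "significant"
p2, ci2 = summary(mean_diff=1.9, se=1.0, n1=25, n2=25)  # "non-significant"
print(f"Study 1: p = {p1:.3f}, 95% CI = ({ci1[0]:.2f}, {ci1[1]:.2f})")
print(f"Study 2: p = {p2:.3f}, 95% CI = ({ci2[0]:.2f}, {ci2[1]:.2f})")
# The intervals overlap almost completely: the two results are consistent
# despite their different significance status.
```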

  11. Assessment of hydrogen levels in Zircaloy-2 by non-destructive testing

    International Nuclear Information System (INIS)

    De, P.K.; John, J.T.; Banerjee, S.; Jayakumar, T.; Thavasimuthu, M.; Raj, B.

    1998-01-01

    A non-destructive assessment of Zircaloy-2 samples charged with hydrogen in the range of 50 to 1150 mg/kg has been made using ultrasonic and eddy current testing. It has been found that the ratio of the longitudinal to the shear wave velocity is a parameter which can be directly correlated with the hydrogen content up to a level of 100 to 200 mg/kg. This parameter together with the values of longitudinal and shear wave velocities can be utilized in a multi-parametric correlation approach for estimation of higher levels of the hydrogen content (up to 1150 mg/kg). The sensitivity at different ranges has been found to be acceptable. Ultrasonic attenuation measurements at higher frequencies and eddy current test parameter are also effective for estimation of hydrogen levels above 250 mg/kg in zirconium alloys. Microstructural characterization including TEM studies have been carried out for studying the influence of the type and the morphology of hydride precipitates on ultrasonic parameters. (orig.)

  12. The method of varying amplitudes for solving (non)linear problems involving strong parametric excitation

    DEFF Research Database (Denmark)

    Sorokin, Vladislav; Thomsen, Jon Juel

    2015-01-01

    Parametrically excited systems appear in many fields of science and technology, intrinsically or imposed purposefully; e.g. spatially periodic structures represent an important class of such systems [4]. When the parametric excitation can be considered weak, classical asymptotic methods like...... the method of averaging [2] or multiple scales [6] can be applied. However, with many practically important applications this simplification is inadequate, e.g. with spatially periodic structures it restricts the possibility to affect their effective dynamic properties by a structural parameter modulation...... of considerable magnitude. Approximate methods based on Floquet theory [4] for analyzing problems involving parametric excitation, e.g. the classical Hill’s method of infinite determinants [3,4], can be employed also in cases of strong excitation; however, with Floquet theory being applicable only for linear...

  13. SPSS for applied sciences basic statistical testing

    CERN Document Server

    Davis, Cole

    2013-01-01

    This book offers a quick and basic guide to using SPSS and provides a general approach to solving problems using statistical tests. It is both comprehensive in terms of the tests covered and the applied settings it refers to, and yet is short and easy to understand. Whether you are a beginner or an intermediate level test user, this book will help you to analyse different types of data in applied settings. It will also give you the confidence to use other statistical software and to extend your expertise to more specific scientific settings as required.The author does not use mathematical form

  14. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.

  15. A comparison of confidence/credible interval methods for the area under the ROC curve for continuous diagnostic tests with small sample size.

    Science.gov (United States)

    Feng, Dai; Cortese, Giuliana; Baumgartner, Richard

    2017-12-01

    The receiver operating characteristic (ROC) curve is frequently used as a measure of accuracy of continuous markers in diagnostic tests. The area under the ROC curve (AUC) is arguably the most widely used summary index for the ROC curve. Although the small sample size scenario is common in medical tests, a comprehensive study of small sample size properties of various methods for the construction of the confidence/credible interval (CI) for the AUC has been by and large missing in the literature. In this paper, we describe and compare 29 non-parametric and parametric methods for the construction of the CI for the AUC when the number of available observations is small. The methods considered include not only those that have been widely adopted, but also those that have been less frequently mentioned or, to our knowledge, never applied to the AUC context. To compare different methods, we carried out a simulation study with data generated from binormal models with equal and unequal variances and from exponential models with various parameters and with equal and unequal small sample sizes. We found that the larger the true AUC value and the smaller the sample size, the larger the discrepancy among the results of different approaches. When the model is correctly specified, the parametric approaches tend to outperform the non-parametric ones. Moreover, in the non-parametric domain, we found that a method based on the Mann-Whitney statistic is in general superior to the others. We further elucidate potential issues and provide possible solutions, along with general guidance on CI construction for the AUC when the sample size is small. Finally, we illustrate the utility of different methods through real-life examples.
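The Mann-Whitney connection mentioned above is that the empirical AUC equals U divided by the product of the group sizes, where U is the Mann-Whitney statistic comparing diseased and healthy marker values. A small-sample sketch with simulated marker data (all values invented):

```python
# Empirical AUC of a continuous marker via the Mann-Whitney U statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
healthy = rng.normal(0.0, 1.0, size=15)    # small samples, as in the paper
diseased = rng.normal(1.5, 1.0, size=15)

u, _ = stats.mannwhitneyu(diseased, healthy, alternative="two-sided")
auc = u / (len(diseased) * len(healthy))   # AUC = P(diseased marker > healthy marker)
print(f"empirical AUC = {auc:.3f}")
```

Confidence intervals for this quantity are what the 29 compared methods construct in different ways.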

  16. Evaluation of seizure propagation on ictal brain SPECT using statistical parametric mapping in temporal lobe epilepsy

    International Nuclear Information System (INIS)

    Jeon, Tae Joo; Lee, Jong Doo; Kim, Hee Joung; Lee, Byung In; Kim, Ok Joon; Kim, Min Jung; Jeon, Jeong Dong

    1999-01-01

    Ictal brain SPECT has a high diagnostic sensitivity, exceeding 90% in the localization of the seizure focus; however, it often shows increased uptake within extratemporal areas due to early propagation of the seizure discharge. This study aimed to evaluate seizure propagation on ictal brain SPECT in patients with temporal lobe epilepsy (TLE) by statistical parametric mapping (SPM). Twenty-one patients (age 27.14 ± 5.79 y) with temporal lobe epilepsy (right in 8, left in 13) who had a successful seizure outcome after surgery and nine normal controls were included. The ictal and interictal brain SPECT data of the patients and the baseline SPECT of the normal control group were analyzed using automatic image registration and SPM96 software. The statistical analysis compared the mean SPECT image of the normal group with each individual ictal SPECT, and each mean image of the interictal groups of the right or left TLE with the individual ictal scans. The t statistic SPM [t] was transformed to SPM [Z] with a threshold of 1.64. The statistical results were displayed and rendered on reference 3-dimensional MRI images with a P value of 0.05 and an uncorrected extent threshold p value of 0.5 for SPM [Z]. SPM data demonstrated increased uptake within the epileptic lesion in 19 patients (90.4%); among them, localized increased uptake confined to the epileptogenic lesion was seen in only 4 (19%), but 15 patients (71.4%) showed hyperperfusion within propagation sites. Bi-temporal hyperperfusion was observed in 11 out of 19 patients (57.9%, 5 in the right and 6 in the left): higher uptake within the lesion than the contralateral side in 9, similar activity in 1, and higher uptake within the contralateral lobe in one. Extra-temporal hyperperfusion was observed in 8 (2 in the right, 3 in the left, 3 bilateral): unilateral hyperperfusion within the epileptogenic temporal lobe and an extra-temporal area in 4, bi-temporal with extra-temporal hyperperfusion in the remaining 4. Ictal brain SPECT is highly

  17. A parametric LTR solution for discrete-time systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1989-01-01

    A parametric LTR (loop transfer recovery) solution for discrete-time compensators incorporating filtering observers which achieve exact recovery is presented for both minimum- and non-minimum-phase systems. First the recovery error, which defines the difference between the target loop transfer...... and the full loop transfer function, is manipulated into a general form involving the target loop transfer matrix and the fundamental recovery matrix. A parametric LTR solution based on the recovery matrix is developed. It is shown that the LQR/LTR (linear quadratic Gaussian/loop transfer recovery) solution...

  18. A comparison of test statistics for the recovery of rapid growth-based enumeration tests

    NARCIS (Netherlands)

    van den Heuvel, Edwin R.; IJzerman-Boon, Pieta C.

    This paper considers five test statistics for comparing the recovery of a rapid growth-based enumeration test with respect to the compendial microbiological method using a specific nonserial dilution experiment. The finite sample distributions of these test statistics are unknown, because they are

  19. Bifurcation characteristics in parametric systems with combined damping

    Czech Academy of Sciences Publication Activity Database

    Hortel, Milan; Škuderová, Alena

    Vol.16, č. 3 (2009), s. 1-9 ISSN 1802-1484 R&D Projects: GA ČR(CZ) GA101/07/0884 Institutional research plan: CEZ:AV0Z20760514 Keywords : non-linear dynamics * parametric systems * impact effects * linear and non-linear damping Subject RIV: BI - Acoustics http://www.im.fme.vutbr.cz/pdf/16_3_221.full.pdf

  20. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...

  1. Nonconventional concrete hollow blocks evaluation by destructive and non-destructive testing

    Directory of Open Access Journals (Sweden)

    M.S. Rodrigues

    The aim of this study was to evaluate the properties of cementitious matrices with partial replacement of Portland cement by silica fume (SF) or by rice husk ash (RHA), and their application in non-bearing hollow blocks, tested by destructive and non-destructive methods. The following mixtures were produced: a reference (100% Portland cement) and mixtures with 10% by mass of the Portland cement replaced by SF or RHA. The non-destructive testing showed that the highest values of ultrasonic pulse velocity (UPV) were obtained for SF-based and RHA-based blocks. The destructive test showed better results for SF-based blocks, but there was no statistical difference between the RHA-based and control blocks.

  2. A non-parametric estimator for the doubly-periodic Poisson intensity function

    NARCIS (Netherlands)

    R. Helmers (Roelof); I.W. Mangku (Wayan); R. Zitikis

    2007-01-01

    In a series of papers, J. Garrido and Y. Lu have proposed and investigated a doubly-periodic Poisson model, and then applied it to analyze hurricane data. The authors have suggested several parametric models for the underlying intensity function. In the present paper we construct and

  3. Non-classical signature of parametric fluorescence and its application in metrology

    Czech Academy of Sciences Publication Activity Database

    Hamar, Martin; Michálek, Václav; Pathak, A.

    2014-01-01

    Roč. 14, č. 4 (2014), s. 227-236 ISSN 1335-8871 R&D Projects: GA ČR GAP205/12/0382 Institutional support: RVO:68378271 Keywords : parametric fluorescence * photon number squeezed light * quantum efficiency Subject RIV: BH - Optics, Masers, Lasers Impact factor: 0.989, year: 2014

  4. A Non-Parametric Item Response Theory Evaluation of the CAGE Instrument Among Older Adults.

    Science.gov (United States)

    Abdin, Edimansyah; Sagayadevan, Vathsala; Vaingankar, Janhavi Ajit; Picco, Louisa; Chong, Siow Ann; Subramaniam, Mythily

    2018-02-23

    The validity of the CAGE using item response theory (IRT) has not yet been examined in an older adult population. This study aims to investigate the psychometric properties of the CAGE using both non-parametric and parametric IRT models, assess whether there is any differential item functioning (DIF) by age, gender and ethnicity, and examine the measurement precision at the cut-off scores. We used data from the Well-being of the Singapore Elderly study to conduct Mokken scaling analysis (MSA) and to fit dichotomous Rasch and 2-parameter logistic IRT models. The measurement precision at the cut-off scores was evaluated using classification accuracy (CA) and classification consistency (CC). The MSA showed an overall scalability H index of 0.459, indicating a medium performing instrument. All items were found to be homogeneous, measuring the same construct and able to discriminate well between respondents with high levels of the construct and those with lower levels. The item discrimination ranged from 1.07 to 6.73, while the item difficulty ranged from 0.33 to 2.80. Significant DIF was found for two items across ethnic groups. More than 90% of the respondents (CC and CA ranged from 92.5% to 94.3%) were consistently and accurately classified by the CAGE cut-off scores of 2 and 3. The current study provides new evidence on the validity of the CAGE from the IRT perspective. It also provides valuable information on each item in the assessment of the overall severity of alcohol problems, and on the precision of the cut-off scores in the older adult population.

  5. Approximations to the distribution of a test statistic in covariance structure analysis: A comprehensive study.

    Science.gov (United States)

    Wu, Hao

    2018-05-01

    In structural equation modelling (SEM), a robust adjustment to the test statistic or to its reference distribution is needed when its null distribution deviates from a χ² distribution, which usually arises when data do not follow a multivariate normal distribution. Unfortunately, existing studies on this issue typically focus on only a few methods and neglect the majority of alternative methods in statistics. Existing simulation studies typically consider only non-normal distributions of data that either satisfy asymptotic robustness or lead to an asymptotic scaled χ² distribution. In this work we conduct a comprehensive study that involves both typical methods in SEM and less well-known methods from the statistics literature. We also propose the use of several novel non-normal data distributions that are qualitatively different from the non-normal distributions widely used in existing studies. We found that several under-studied methods give the best performance under specific conditions, but the Satorra-Bentler method remains the most viable method for most situations. © 2017 The British Psychological Society.

  6. The relationship between multilevel models and non-parametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity.

    Science.gov (United States)

    Rights, Jason D; Sterba, Sonya K

    2016-11-01

    Multilevel data structures are common in the social sciences. Often, such nested data are analysed with multilevel models (MLMs) in which heterogeneity between clusters is modelled by continuously distributed random intercepts and/or slopes. Alternatively, the non-parametric multilevel regression mixture model (NPMM) can accommodate the same nested data structures through discrete latent class variation. The purpose of this article is to delineate analytic relationships between NPMM and MLM parameters that are useful for understanding the indirect interpretation of the NPMM as a non-parametric approximation of the MLM, with relaxed distributional assumptions. We define how seven standard and non-standard MLM specifications can be indirectly approximated by particular NPMM specifications. We provide formulas showing how the NPMM can serve as an approximation of the MLM in terms of intraclass correlation, random coefficient means and (co)variances, heteroscedasticity of residuals at level 1, and heteroscedasticity of residuals at level 2. Further, we discuss how these relationships can be useful in practice. The specific relationships are illustrated with simulated graphical demonstrations, and direct and indirect interpretations of NPMM classes are contrasted. We provide an R function to aid in implementing and visualizing an indirect interpretation of NPMM classes. An empirical example is presented and future directions are discussed. © 2016 The British Psychological Society.

  7. Using photons for non-destructive testing of thick materials: a simulation study

    International Nuclear Information System (INIS)

    Oishi, Ryutaro; Nagai, Hideki

    2004-01-01

    Positron annihilation spectroscopy using positron annihilation lifetimes has been successfully applied to non-destructive material testing. A positron inspection probe annihilates with an electron near the front surface of the material, so the application of the positron lifetime method is restricted to thin materials. A photon with energy exceeding 1.02 MeV, however, can reach depth within the material and produce a positron through γ-conversion; such a photon-produced positron serves as a probe for thick materials. Because the probability of γ-conversion is low, the photon-produced positron annihilation lifetime method is limited by statistics. We estimated the expected number of events and the statistical uncertainties of the lifetime measurements for a non-destructive test, such as SUS316 fatigue monitoring, in order to construct a fatigue-monitoring system.

  8. New Parametric Imaging Algorithm for Quantification of Binding Parameter in non-reversible compartment model: MLAIR

    International Nuclear Information System (INIS)

    Kim, Su Jin; Lee, Jae Sung; Kim, Yu Kyeong; Lee, Dong Soo

    2007-01-01

    Parametric imaging allows analysis of the entire brain or body image. Graphical approaches are commonly employed to generate parametric images through linear or multilinear regression. However, such linear regression methods have limited accuracy due to bias at high noise levels. Several methods have been proposed to reduce bias in linear regression estimation, especially for reversible models. In this study, we focus on generating parametric images of the net accumulation rate Ki, which is related to the binding parameter in brain receptor studies, in an irreversible compartment model using multiple linear analysis. The reliability of the newly developed multiple linear analysis method (MLAIR) was assessed through Monte Carlo simulation, and we applied it to [¹¹C]MeNTI PET for opioid receptor imaging.

  9. Non-parametric Bayesian networks: Improving theory and reviewing applications

    International Nuclear Information System (INIS)

    Hanea, Anca; Morales Napoles, Oswaldo; Ababei, Dan

    2015-01-01

    Applications in various domains often lead to high dimensional dependence modelling. A Bayesian network (BN) is a probabilistic graphical model that provides an elegant way of expressing the joint distribution of a large number of interrelated variables. BNs have been successfully used to represent uncertain knowledge in a variety of fields. The majority of applications use discrete BNs, i.e. BNs whose nodes represent discrete variables. Integrating continuous variables in BNs is an area fraught with difficulty. Several methods that handle discrete-continuous BNs have been proposed in the literature. This paper concentrates on one such method, called non-parametric BNs (NPBNs). NPBNs were introduced in 2004 and they have been, or are currently being, used in at least twelve professional applications. This paper provides a short introduction to NPBNs, a couple of theoretical advances, and an overview of applications. The aim of the paper is twofold: to present the latest improvements of the theory underlying NPBNs, and to complement the existing overviews of BN applications with the NPBN applications. The latter opens the opportunity to discuss some difficulties that applications pose to the theoretical framework, and in this way offers some NPBN modelling guidance to practitioners. - Highlights: • The paper gives an overview of the current NPBN methodology. • We extend the NPBN methodology by relaxing the conditions of one of its fundamental theorems. • We propose improvements of the data mining algorithm for the NPBNs. • We review the professional applications of the NPBNs.

  10. Non-covalent interactions across organic and biological subsets of chemical space: Physics-based potentials parametrized from machine learning

    Science.gov (United States)

    Bereau, Tristan; DiStasio, Robert A.; Tkatchenko, Alexandre; von Lilienfeld, O. Anatole

    2018-06-01

    Classical intermolecular potentials typically require an extensive parametrization procedure for any new compound considered. To do away with prior parametrization, we propose a combination of physics-based potentials with machine learning (ML), coined IPML, which is transferable across small neutral organic and biologically relevant molecules. ML models provide on-the-fly predictions for environment-dependent local atomic properties: electrostatic multipole coefficients (significant error reduction compared to previously reported), the population and decay rate of valence atomic densities, and polarizabilities across conformations and chemical compositions of H, C, N, and O atoms. These parameters enable accurate calculations of intermolecular contributions—electrostatics, charge penetration, repulsion, induction/polarization, and many-body dispersion. Unlike other potentials, this model is transferable in its ability to handle new molecules and conformations without explicit prior parametrization: All local atomic properties are predicted from ML, leaving only eight global parameters—optimized once and for all across compounds. We validate IPML on various gas-phase dimers at and away from equilibrium separation, where we obtain mean absolute errors between 0.4 and 0.7 kcal/mol for several chemically and conformationally diverse datasets representative of non-covalent interactions in biologically relevant molecules. We further focus on hydrogen-bonded complexes—essential but challenging due to their directional nature—where datasets of DNA base pairs and amino acids yield an extremely encouraging 1.4 kcal/mol error. Finally, and as a first look, we consider IPML for denser systems: water clusters, supramolecular host-guest complexes, and the benzene crystal.

  11. Crossing statistic: reconstructing the expansion history of the universe

    International Nuclear Information System (INIS)

    Shafieloo, Arman

    2012-01-01

    We show that by combining the Crossing Statistic [1,2] and the Smoothing method [3-5] one can reconstruct the expansion history of the universe with very high precision without assuming any prior on cosmological quantities such as the equation of state of dark energy. The presented method performs very well in reconstructing the expansion history of the universe independent of the underlying model, and it works well even for non-trivial dark energy models with fast or slow changes in the equation of state of dark energy. The accuracy of the reconstructed quantities, along with the method's independence from any prior or assumption, gives it advantages over other non-parametric methods previously proposed in the literature. Applying the method to the Union 2.1 supernovae combined with WiggleZ BAO data, we present the reconstructed results and test the consistency of the two data sets in a model-independent manner. The results show that the latest available supernovae and BAO data are in good agreement with each other, and that the spatially flat ΛCDM model is in concordance with the current data.

  12. A new formalism for non-extensive physical systems: Tsallis thermostatistics

    International Nuclear Information System (INIS)

    Tirnakli, U.; Bueyuekkilic, F.; Demirhan, D.

    1999-01-01

    Although Boltzmann-Gibbs (BG) statistics provides a suitable tool that enables us to handle a large number of physical systems satisfactorily, it has some basic restrictions. Recently a non-extensive thermostatistics was proposed by C. Tsallis to handle non-extensive physical systems, and up to now, besides generalizing some of the conventional concepts, the formalism has been successful in a number of physical applications. In this study, our aim is to introduce Tsallis thermostatistics in some detail and to emphasize its achievements on physical systems, noting the recent developments along this line.

  13. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    Science.gov (United States)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to measure the effect of a fixed covariate on right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analyses of the parametric regression survival model. Biases, mean biases and coverage probabilities were used to assess the results. Different sample sizes (50, 100, 150 and 200) were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to develop the simulation code for right-censored data. In addition, the final simulated right-censored model was compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model is a suitable fit for the survival data of lung cancer patients in Malaysia.
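A minimal sketch of this kind of simulation study, assuming a Weibull survival model with random right censoring; the parameter values, censoring scheme and replication count below are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
K_TRUE, LAM_TRUE = 1.5, 10.0           # illustrative shape and scale

def neg_log_lik(log_params, t, event):
    # Right-censored Weibull log-likelihood: observed events contribute
    # log f(t), censored observations contribute log S(t).
    k, lam = np.exp(log_params)        # log-parametrization keeps both positive
    z = (t / lam) ** k
    log_f = np.log(k / lam) + (k - 1.0) * np.log(t / lam) - z
    log_s = -z
    return -np.sum(event * log_f + (1.0 - event) * log_s)

def simulate_once(n, cens_scale=15.0):
    t = LAM_TRUE * rng.weibull(K_TRUE, n)   # latent survival times
    c = rng.exponential(cens_scale, n)      # random right-censoring times
    obs, event = np.minimum(t, c), (t <= c).astype(float)
    fit = minimize(neg_log_lik, x0=np.log([1.0, 5.0]), args=(obs, event))
    return np.exp(fit.x)                    # (k_hat, lam_hat)

for n in (50, 100, 150, 200):
    est = np.array([simulate_once(n) for _ in range(200)])
    print(n, (est.mean(axis=0) - np.array([K_TRUE, LAM_TRUE])).round(3))
```

The printed mean biases shrink as the sample size grows, which is the pattern such a study tracks across its design cells.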

  14. Parametric Thermal Models of the Transient Reactor Test Facility (TREAT)

    Energy Technology Data Exchange (ETDEWEB)

    Bradley K. Heath

    2014-03-01

    This work supports the restart of transient testing in the United States using the Department of Energy’s Transient Reactor Test Facility at the Idaho National Laboratory. It also supports the Global Threat Reduction Initiative by reducing the proliferation risk of high enriched uranium fuel. The work involves the creation of a nuclear fuel assembly model using the fuel performance code known as BISON. The model simulates the thermal behavior of a nuclear fuel assembly during steady state and transient operational modes. Additional models of the same geometry but with differing material properties were created to perform parametric studies. The results show that fuel and cladding thermal conductivity have the greatest effect on fuel temperature in the steady state operational mode, while fuel density and fuel specific heat have the greatest effect in the transient operational mode. When considering a new fuel type it is recommended to use materials that decrease the specific heat of the fuel and the thermal conductivity of the fuel’s cladding in order to deal with the higher-density fuels that accompany the LEU conversion process. Data on the latest operating conditions of TREAT need to be obtained in order to validate BISON’s results. BISON’s models for TREAT (material models, boundary convection models) are modest and need additional work to ensure accuracy and confidence in the results.

  15. Non-linearity consideration when analyzing reactor noise statistical characteristics. [BWR

    Energy Technology Data Exchange (ETDEWEB)

    Kebadze, B V; Adamovski, L A

    1975-06-01

    The statistical characteristics of boiling water reactor noise in the vicinity of the stability threshold are studied. The reactor is considered as a non-linear system affected by random perturbations. To solve the non-linear problem, the principle of statistical linearization is used. It is shown that the half-width of the resonance peak in the neutron power noise spectral density, as well as the reciprocal of the noise dispersion, both of which are used in predicting the stable operation threshold, differ from zero both within and beyond the stability boundary, whose determination was based on linear criteria.

  16. An artificial neural network architecture for non-parametric visual odometry in wireless capsule endoscopy

    Science.gov (United States)

    Dimas, George; Iakovidis, Dimitris K.; Karargyris, Alexandros; Ciuti, Gastone; Koulaouzidis, Anastasios

    2017-09-01

    Wireless capsule endoscopy is a non-invasive screening procedure of the gastrointestinal (GI) tract performed with an ingestible capsule endoscope (CE) of the size of a large vitamin pill. Such endoscopes are equipped with a usually low-frame-rate color camera which enables the visualization of the GI lumen and the detection of pathologies. The localization of the commercially available CEs is performed in the 3D abdominal space using radio-frequency (RF) triangulation from external sensor arrays, in combination with transit time estimation. State-of-the-art approaches, such as magnetic localization, which have been experimentally proved more accurate than the RF approach, are still at an early stage. Recently, we have demonstrated that CE localization is feasible using solely visual cues and geometric models. However, such approaches depend on camera parameters, many of which are unknown. In this paper the authors propose a novel non-parametric visual odometry (VO) approach to CE localization based on a feed-forward neural network architecture. The effectiveness of this approach in comparison to state-of-the-art geometric VO approaches is validated using a robotic-assisted in vitro experimental setup.

  18. Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.

    Science.gov (United States)

    Festing, M F

    2001-01-01

    In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
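The recommended decision flow (check normality and equal variances, then fall back to a non-parametric test when the assumptions fail) can be sketched with hypothetical data; the group names and values below are purely illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(10.0, 2.0, 12)     # e.g. viability scores, control wells
treated = rng.normal(12.0, 2.0, 12)     # treated wells

# Test the parametric assumptions: normality of residuals, equal variances.
_, p_norm = stats.shapiro(np.concatenate([control - control.mean(),
                                          treated - treated.mean()]))
_, p_var = stats.levene(control, treated)

if p_norm > 0.05 and p_var > 0.05:
    stat, p = stats.ttest_ind(control, treated)      # parametric, more powerful
else:
    stat, p = stats.mannwhitneyu(control, treated)   # non-parametric fallback
print(f"p = {p:.4f}")
```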

  19. MIDAS: Regionally linear multivariate discriminative statistical mapping.

    Science.gov (United States)

    Varol, Erdem; Sotiras, Aristeidis; Davatzikos, Christos

    2018-07-01

    Statistical parametric maps formed via voxel-wise mass-univariate tests, such as the general linear model, are commonly used to test hypotheses about regionally specific effects in neuroimaging cross-sectional studies where each subject is represented by a single image. Despite being informative, these techniques remain limited as they ignore multivariate relationships in the data. Most importantly, the commonly employed local Gaussian smoothing, which is important for accounting for registration errors and making the data follow Gaussian distributions, is usually chosen in an ad hoc fashion. Thus, it is often suboptimal for the task of detecting group differences and correlations with non-imaging variables. Information mapping techniques, such as searchlight, which use pattern classifiers to exploit multivariate information and obtain more powerful statistical maps, have become increasingly popular in recent years. However, existing methods may lead to important interpretation errors in practice (i.e., misidentifying a cluster as informative, or failing to detect truly informative voxels), while often being computationally expensive. To address these issues, we introduce a novel efficient multivariate statistical framework for cross-sectional studies, termed MIDAS, seeking highly sensitive and specific voxel-wise brain maps, while leveraging the power of regional discriminant analysis. In MIDAS, locally linear discriminative learning is applied to estimate the pattern that best discriminates between two groups, or predicts a variable of interest. This pattern is equivalent to local filtering by an optimal kernel whose coefficients are the weights of the linear discriminant. By composing information from all neighborhoods that contain a given voxel, MIDAS produces a statistic that collectively reflects the contribution of the voxel to the regional classifiers as well as the discriminative power of the classifiers. Critically, MIDAS efficiently assesses the

  20. A parametric reconstruction of the deceleration parameter

    Energy Technology Data Exchange (ETDEWEB)

    Al Mamon, Abdulla [Manipal University, Manipal Centre for Natural Sciences, Manipal (India); Visva-Bharati, Department of Physics, Santiniketan (India); Das, Sudipta [Visva-Bharati, Department of Physics, Santiniketan (India)

    2017-07-15

    The present work is based on a parametric reconstruction of the deceleration parameter q(z) in a model for the spatially flat FRW universe filled with dark energy and non-relativistic matter. In cosmology, the parametric reconstruction technique deals with an attempt to build up a model by choosing some specific evolution scenario for a cosmological parameter and then estimating the values of the parameters with the help of different observational datasets. In this paper, we have proposed a logarithmic parametrization of q(z) to probe the evolution history of the universe. Using the type Ia supernova, baryon acoustic oscillation and cosmic microwave background datasets, the constraints on the arbitrary model parameters q₀ and q₁ are obtained (within 1σ and 2σ confidence limits) by the χ²-minimization technique. We have then reconstructed the deceleration parameter, the total EoS parameter ω_tot and the jerk parameter, and have compared the reconstructed results of q(z) with other well-known parametrizations of q(z). We have also shown that two model selection criteria (namely, the Akaike information criterion and the Bayesian information criterion) provide a clear indication that our reconstructed model is well consistent with other popular models. (orig.)
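The χ²-minimization step for a two-parameter parametrization can be sketched as follows; the logarithmic functional form, the mock data and the error bars are illustrative assumptions standing in for the paper's actual likelihood built from SNIa, BAO and CMB data:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical logarithmic parametrization of the deceleration parameter;
# the paper's exact functional form may differ.
def q_of_z(z, q0, q1):
    return q0 + q1 * np.log(1.0 + z)

# Mock "observed" q(z) values with Gaussian errors, for demonstration only.
rng = np.random.default_rng(7)
z = np.linspace(0.05, 1.5, 20)
sigma = 0.1
q_obs = q_of_z(z, -0.6, 0.8) + rng.normal(0.0, sigma, z.size)

def chi2(params):
    # Standard chi-square between data and the parametrized model
    return np.sum(((q_obs - q_of_z(z, *params)) / sigma) ** 2)

best = minimize(chi2, x0=[0.0, 0.0])
q0_fit, q1_fit = best.x
print(q0_fit.round(3), q1_fit.round(3))
```

Confidence limits then follow from the Δχ² contours around the minimum.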

  1. Output Only Modal Testing of a Car Body Subject to Engine Excitation

    DEFF Research Database (Denmark)

    Brincker, Rune; Andersen, Palle; Møller, Nis

    2000-01-01

    In this paper an output only modal testing and identification of a car body subject to engine excitation is presented. The response data were analyzed using two different techniques: a non-parametric technique based on Frequency Domain Decomposition (FDD), and a parametric technique working...

  2. Towards an Industrial Application of Statistical Uncertainty Analysis Methods to Multi-physical Modelling and Safety Analyses

    International Nuclear Information System (INIS)

    Zhang, Jinzhao; Segurado, Jacobo; Schneidesch, Christophe

    2013-01-01

    Since the 1980s, Tractebel Engineering (TE) has been developing and applying a multi-physical modelling and safety analysis capability, based on a code package consisting of the best-estimate 3D neutronics (PANTHER), system thermal-hydraulics (RELAP5), core sub-channel thermal-hydraulics (COBRA-3C), and fuel thermo-mechanics (FRAPCON/FRAPTRAN) codes. A series of methodologies have been developed to perform and license reactor safety analyses and core reload designs, based on the deterministic bounding approach. Following recent trends in research and development as well as in industrial applications, TE has been working since 2010 towards applying statistical sensitivity and uncertainty analysis methods to multi-physical modelling and licensing safety analyses. In this paper, the TE multi-physical modelling and safety analysis capability is first described, followed by the proposed TE best estimate plus statistical uncertainty analysis method (BESUAM). The chosen statistical sensitivity and uncertainty analysis methods (non-parametric order statistic method or bootstrap) and tool (DAKOTA) are then presented, followed by some preliminary results of their application to FRAPCON/FRAPTRAN simulation of the OECD RIA fuel rod codes benchmark and RELAP5/MOD3.3 simulation of THTF tests. (authors)
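The non-parametric order-statistic method mentioned here is commonly based on Wilks' formula, which gives the minimum number of code runs needed for a tolerance bound. A first-order, one-sided sketch:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95):
    """Smallest number of code runs n such that the sample maximum bounds
    the `coverage` quantile with probability `confidence`, i.e. the
    first-order, one-sided Wilks criterion 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_n())            # -> 59, the classic 95/95 criterion
print(wilks_n(0.95, 0.99))  # -> 90, 95% coverage at 99% confidence
```

Higher-order variants (using the k-th largest run instead of the maximum) trade more runs for a less conservative bound.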

  3. Clinical Laboratory Tests in Some Acute Exogenous Poisonings.

    Science.gov (United States)

    Tufkova, Stoilka G; Yankov, Ivan V; Paskaleva, Diana A

    2017-09-01

    There is no specific toxicological screening panel of clinical laboratory parameters in clinical toxicology when it comes to acute exogenous poisoning. The aim was to determine routine clinical laboratory parameters and indicators for the assessment of vital functions in patients with acute intoxication. One hundred and fifty-three patients were included in the present study. They were hospitalized in the Department of Clinical Toxicology at St. George University Hospital, Plovdiv, for poisoning with cerebral-toxicity-inducing medication (n = 45), alcohol (n = 40), or heroin abuse (n = 33). There were 35 controls. The laboratory tests were conducted in compliance with clinical laboratory standards. We used the following statistical analyses: analysis of variance (the u-criterion for normal distribution, Student's t-test, dispersion analysis based on ANOVA) and non-parametric analysis. Among the routine hematological parameters, those with statistically significant changes in the three groups of poisoning were: red blood cells, hematocrit, hemoglobin (except in alcohol intoxication) and leukocytes. We found statistically significant changes in serum total protein, sodium and bilirubin. Of highest statistical significance was the increased activity of AST and ALT. We present a model for the selection of clinical laboratory tests for severe acute poisoning with modern equipment under standardized conditions. The results of the study suggest that the clinical laboratory constellation we used can serve as a mandatory element in the diagnosis of moderate and severe intoxication with the mentioned toxic substances.

  4. The effect of non-weight bearing group-exercising on females with non-specific chronic low back pain: a randomized single blind controlled pilot study.

    Science.gov (United States)

    Masharawi, Youssef; Nadaf, Nedal

    2013-01-01

    The aim of this study was to investigate the effect of active non-weight-bearing (NWB) group exercising on women with non-specific chronic low back pain (NSCLBP). Forty females with NSCLBP were assigned in a randomized, controlled, longitudinal, single-blinded pilot study: 20 were assigned to a bi-weekly NWB group exercise class, and 20 served as the control group. The exercises involved the entire lumbo-pelvic spine and were aimed at improving lumbar mobility/flexibility and stability. Pain intensity (VAS), back-specific disability (Roland-Morris questionnaire, RMQ), and lumbar flexion and extension range-of-motion measurements were taken prior to intervention (t(0)), immediately following 4 weeks of intervention (t(1)) and 8 weeks later (t(fu)). Reliability trials were conducted on 10 females. Non-parametric tests were used for statistical significance (p < 0.05). NWB group exercising improved functional and pain status, as well as lumbar flexion and extension ranges of motion, in women suffering from NSCLBP.

  5. Generalized ensemble theory with non-extensive statistics

    Science.gov (United States)

    Shen, Ke-Ming; Zhang, Ben-Wei; Wang, En-Ke

    2017-12-01

    The non-extensive canonical ensemble theory is reconsidered with the method of Lagrange multipliers by maximizing the Tsallis entropy, with the constraint that the normalization term of Tsallis' q-average of physical quantities, the sum ∑_j p_j^q, is independent of the probability p_i for Tsallis parameter q. The self-referential problem in the deduced probability and thermal quantities in non-extensive statistics is thus avoided, and thermodynamical relationships are obtained in a consistent and natural way. We also extend the study to the non-extensive grand canonical ensemble theory and obtain the q-deformed Bose-Einstein distribution as well as the q-deformed Fermi-Dirac distribution. The theory is further applied to the generalized Planck law to demonstrate the distinct behaviors of the various generalized q-distribution functions discussed in the literature.
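The q-deformed distributions mentioned here are built on the Tsallis q-exponential. A schematic sketch, assuming the common convention e_q(x) = [1 + (1-q)x]^(1/(1-q)) (exact definitions vary across the non-extensive literature):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    # Tsallis cutoff: the function is defined as 0 where the base is negative.
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

# Schematic q-deformed occupations, with x = (E - mu)/kT; the ordinary
# Bose-Einstein and Fermi-Dirac distributions are recovered at q = 1.
def q_bose(x, q):
    return 1.0 / (q_exp(x, q) - 1.0)

def q_fermi(x, q):
    return 1.0 / (q_exp(x, q) + 1.0)

x = np.linspace(0.5, 5.0, 10)
for q in (0.9, 1.0, 1.1):
    print(q, q_bose(x, q)[0].round(4), q_fermi(x, q)[0].round(4))
```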

  6. Evaluating three proposals for testing independence in non linear spatial processes

    OpenAIRE

    López Hernández, Fernando Antonio; Mate Sánchez-Val, María Luz; Artal Tur, Andrés

    2013-01-01

    This paper evaluates the behaviour of different families of tests when checking for spatial independence in the presence of nonlinearities. To reach this goal, we select three representative proposals: the usual parametric I-Moran test, the non-parametric proposal of Brett and Pinkse (1997), and the semiparametric Scan test. In order to study how they perform, we simulate different nonlinear spatial structures by Monte Carlo methods, hence conducting empirical tests on ...

  7. The choice of statistical methods for comparisons of dosimetric data in radiotherapy

    International Nuclear Information System (INIS)

    Chaikh, Abdulhamid; Giraud, Jean-Yves; Perrin, Emmanuel; Bresciani, Jean-Pierre; Balosso, Jacques

    2014-01-01

    Novel irradiation techniques are continuously introduced in radiotherapy to optimize the accuracy, the safety and the clinical outcome of treatments. These changes raise the question of discontinuity in dosimetric presentation and the subsequent need for practice adjustments in case of significant modifications. This study proposes a comprehensive approach to compare different techniques and to test whether their respective dose calculation algorithms give rise to statistically significant differences in the treatment doses for the patient. Statistical investigation principles are presented in the framework of a clinical example based on 62 fields of radiotherapy for lung cancer. The delivered doses in monitor units were calculated using three different dose calculation methods: the reference method computes the dose without tissue density corrections using the Pencil Beam Convolution (PBC) algorithm, whereas the other methods calculate the dose with tissue density correction in 1D and 3D using the Modified Batho (MB) method and the Equivalent Tissue-Air Ratio (ETAR) method, respectively. The normality of the data and the homogeneity of variance between groups were tested using the Shapiro-Wilk and Levene tests, respectively; non-parametric statistical tests were then performed. Specifically, the dose means estimated by the different calculation methods were compared using Friedman’s test and the Wilcoxon signed-rank test. In addition, the correlation between the doses calculated by the three methods was assessed using Spearman’s rank and Kendall’s rank tests. Friedman’s test showed a significant effect of the calculation method on the delivered dose for lung cancer patients (p < 0.001). The density correction methods yielded lower doses compared to PBC, by on average (−5 ± 4.4 SD) for MB and (−4.7 ± 5 SD) for ETAR. Post-hoc Wilcoxon signed-rank tests of paired comparisons indicated that the delivered dose was significantly reduced using density
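The same sequence of tests can be reproduced with scipy.stats; the monitor-unit values below are simulated stand-ins for the paper's 62 fields, with means and spreads loosely mimicking the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated monitor units for 62 fields under three calculation methods
# (hypothetical stand-ins for PBC, MB and ETAR).
pbc = rng.normal(200.0, 20.0, 62)
mb = pbc - rng.normal(5.0, 4.4, 62)    # density correction lowers the dose
etar = pbc - rng.normal(4.7, 5.0, 62)

print("Shapiro-Wilk p:", stats.shapiro(pbc - mb)[1])        # normality check
print("Levene p:", stats.levene(pbc, mb, etar)[1])          # equal variances
print("Friedman p:", stats.friedmanchisquare(pbc, mb, etar)[1])  # omnibus, paired
print("Wilcoxon PBC-MB p:", stats.wilcoxon(pbc, mb)[1])     # post-hoc paired
rho, _ = stats.spearmanr(pbc, mb)
print("Spearman rho:", round(rho, 3))
```

Friedman is the paired (repeated-measures) analogue of a one-way ANOVA on ranks, which is why it fits three dose calculations applied to the same fields.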

  8. Model-independent test for scale-dependent non-Gaussianities in the cosmic microwave background.

    Science.gov (United States)

    Räth, C; Morfill, G E; Rossmanith, G; Banday, A J; Górski, K M

    2009-04-03

    We present a model-independent method to test for scale-dependent non-Gaussianities in combination with scaling indices as test statistics. To this end, surrogate data sets are generated in which the power spectrum of the original data is preserved, while the higher-order correlations are partly randomized by applying a scale-dependent shuffling procedure to the Fourier phases. We apply this method to the Wilkinson Microwave Anisotropy Probe data of the cosmic microwave background and find signatures of non-Gaussianities on large scales. Further tests are required to elucidate the origin of the detected anomalies.
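
    A minimal sketch of the surrogate idea on a hypothetical 1-D stand-in signal: the paper applies a scale-dependent shuffling to the Fourier phases, while this simplified version randomizes all phases at once, which already preserves the power spectrum exactly while scrambling higher-order correlations:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(256)          # stand-in for a pixel-series of the map

# Build a surrogate: keep the power spectrum, randomize the Fourier phases
X = np.fft.rfft(x)
phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
phases[0] = np.angle(X[0])            # keep the zero-frequency (mean) term
phases[-1] = np.angle(X[-1])          # keep the Nyquist term real (even length)
surrogate = np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size)

# Second-order statistics agree, higher-order correlations are destroyed
same_power = np.allclose(np.abs(np.fft.rfft(surrogate)), np.abs(X))
```

    Any test statistic (here, scaling indices) evaluated on the original and on an ensemble of such surrogates then probes non-Gaussianity beyond the power spectrum.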

  9. Ensuring Positiveness of the Scaled Difference Chi-square Test Statistic.

    Science.gov (United States)

    Satorra, Albert; Bentler, Peter M

    2010-06-01

    A scaled difference test statistic [Formula: see text] that can be computed from standard software of structural equation models (SEM) by hand calculations was proposed in Satorra and Bentler (2001). The statistic [Formula: see text] is asymptotically equivalent to the scaled difference test statistic T̄(d) introduced in Satorra (2000), which requires more involved computations beyond standard output of SEM software. The test statistic [Formula: see text] has been widely used in practice, but in some applications it is negative due to negativity of its associated scaling correction. Using the implicit function theorem, this note develops an improved scaling correction leading to a new scaled difference statistic T̄(d) that avoids negative chi-square values.

  10. Circulation and Directional Amplification in the Josephson Parametric Converter

    Science.gov (United States)

    Hatridge, Michael

    Nonreciprocal transport and directional amplification of weak microwave signals are fundamental ingredients in performing efficient measurements of quantum states of flying microwave light. This challenge has been partly met, as quantum-limited amplification is now regularly achieved with parametrically-driven, Josephson-junction based superconducting circuits. However, these devices are typically non-directional, requiring external circulators to separate incoming and outgoing signals. Recently this limitation has been overcome by several proposals and experimental realizations of both directional amplifiers and circulators based on interference between several parametric processes in a single device. This new class of multi-parametrically driven devices holds the promise of achieving a variety of desirable characteristics simultaneously: directionality, reduced gain-bandwidth constraints, and quantum-limited added noise. Such devices are good candidates for on-chip integration with other superconducting circuits such as qubits.

  11. Condensation of an ideal gas obeying non-Abelian statistics.

    Science.gov (United States)

    Mirza, Behrouz; Mohammadzadeh, Hosein

    2011-09-01

    We consider the thermodynamic geometry of an ideal non-Abelian gas. We show that, for a certain value of the fractional parameter and at the relevant maximum value of fugacity, the thermodynamic curvature has a singular point. This indicates a condensation such as Bose-Einstein condensation for non-Abelian statistics and we work out the phase transition temperature in various dimensions.

  12. Quantile regression for the statistical analysis of immunological data with many non-detects.

    Science.gov (United States)

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
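
    Quantile regression can be sketched without specialized libraries by minimizing the check (pinball) loss directly; its minimizer is the conditional quantile, which is unaffected by non-detects as long as the fitted quantile sits above the detection limit. The data, detection limit, and median quantile below are hypothetical, not the trial data analysed in the paper:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(0.0, 10.0, n)
y = 2.0 + 0.5 * x + rng.standard_normal(n)

# Values below the detection limit are reported at the limit (non-detects)
LOD = 1.5
y_obs = np.maximum(y, LOD)

def pinball_loss(beta, q=0.5):
    """Check-function loss whose minimizer is the q-th conditional quantile."""
    r = y_obs - (beta[0] + beta[1] * x)
    return np.sum(np.maximum(q * r, (q - 1.0) * r))

res = minimize(pinball_loss, x0=np.zeros(2), method="Nelder-Mead")
b0, b1 = res.x   # recovers roughly (2.0, 0.5) despite the censoring
```

    Fitting a higher quantile (e.g. q = 0.75) tolerates even larger fractions of non-detects, which is the property the authors exploit.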

  13. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei

    2011-07-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work was originally motivated by a unique testing problem in genetic epidemiology (Chatterjee, et al., 2006) that involved a typical generalized linear model but with an additional term reminiscent of the Tukey one-degree-of-freedom formulation, and their interest was in testing for main effects of the genetic variables, while gaining statistical power by allowing for a possible interaction between genes and the environment. Later work (Maity, et al., 2009) involved the possibility of modeling the environmental variable nonparametrically, but they focused on whether there was a parametric main effect for the genetic variables. In this paper, we consider the complementary problem, where the interest is in testing for the main effect of the nonparametrically modeled environmental variable. We derive a generalized likelihood ratio test for this hypothesis, show how to implement it, and provide evidence that our method can improve statistical power when compared to standard partially linear models with main effects only. We use the method for the primary purpose of analyzing data from a case-control study of colorectal adenoma.

  14. Comparação de duas metodologias de amostragem atmosférica com ferramenta estatística não paramétrica Comparison of two atmospheric sampling methodologies with non-parametric statistical tools

    Directory of Open Access Journals (Sweden)

    Maria João Nunes

    2005-03-01

    Full Text Available In atmospheric aerosol sampling, it is inevitable that the air that carries particles is in motion, as a result of both externally driven wind and the sucking action of the sampler itself. High or low air flow sampling speeds may lead to significant particle size bias. The objective of this work is the validation of measurements enabling the comparison of species concentrations from the two air flow sampling techniques. The presence of several outliers and an increase of the residuals with concentration become obvious, calling for non-parametric methods, which are recommended for handling data that may not be normally distributed. In this way, conversion factors are obtained for each of the species under study using Kendall regression.
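
    The "Kendall regression" used here for conversion factors is commonly implemented as the Theil-Sen estimator (the median of all pairwise slopes), which is robust to the outliers the abstract mentions. A sketch with hypothetical paired sampler concentrations and deliberate outliers:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical paired species concentrations from the two sampling set-ups
high_flow = rng.lognormal(1.0, 0.8, 40)
low_flow = 1.25 * high_flow * rng.lognormal(0.0, 0.1, 40)  # proportional + noise
low_flow[::10] *= 3.0                                      # a few gross outliers

# Kendall's tau measures the monotonic association between the two samplers
tau, p_tau = stats.kendalltau(high_flow, low_flow)

# Theil-Sen robust line fit: the slope is the conversion factor
slope, intercept, lo_slope, hi_slope = stats.theilslopes(low_flow, high_flow)
```

    Because the estimator takes the median of pairwise slopes, the injected outliers barely move the recovered conversion factor, unlike ordinary least squares.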

  15. Non-identical anyons and new statistics for spinons and holons

    International Nuclear Information System (INIS)

    Mor, T.

    1993-01-01

    We discuss the various existing proposals for the statistics of holons and spinons (the proposed fundamental excitations of the tJ model for HTSC), and we present new anyonic alternatives. We generalize Wilczek's realization of anyons to include non-identical anyons with mutual statistics and concentrate on particular cases which preserve time-reversal symmetry. Under the restriction of time-reversal symmetry, we find two additional possibilities for the statistics of holons and spinons - either both holons and spinons are bosons or both are fermions. In addition they obey an antisymmetric mutual statistics. We discuss the pairing mechanism of holons in this model, and the microscopic origins for this statistical behavior. (orig.)

  16. The parametrized simulation of electromagnetic showers

    International Nuclear Information System (INIS)

    Peters, S.

    1992-09-01

    The simulation of electromagnetic showers in calorimeters by detailed tracking of all secondary particles is extremely computer time consuming. Without considerable loss of precision, the use of parametrizations for global shower properties may reduce the computing time by factors of 10¹ to 10⁴, depending on the energy, the degree of parametrization, the complexity of the material description and the cut-off energies in the detailed simulation. To arrive at a high degree of universality, parametrizations of individual electromagnetic showers in homogeneous media are developed, taking the dependence of the shower development on the material into account. In sampling calorimeters, the inhomogeneous material distribution leads to additional effects which can be taken into account by geometry-dependent terms in the parametrization of the longitudinal and radial energy density distributions. Comparisons with detailed simulations of homogeneous and sampling calorimeters show very good agreement in the fluctuations, correlations, and signal averages of spatial energy distributions. Verifications of the algorithms for the simulation of the H1 detector are performed using calorimeter test data for different modules of the H1 liquid argon calorimeter. Special attention has been paid to electron-pion separation, which is of great importance for physics analysis. (orig.) [de

  17. [Detection of quadratic phase coupling between EEG signal components by nonparametric and parametric methods of bispectral analysis].

    Science.gov (United States)

    Schmidt, K; Witte, H

    1999-11-01

    Recently the assumption of the independence of individual frequency components in a signal has been rejected, for example, for the EEG during defined physiological states such as sleep or sedation [9, 10]. Thus, the use of higher-order spectral analysis, capable of detecting interrelations between individual signal components, has proved useful. The aim of the present study was to investigate the quality of various non-parametric and parametric estimation algorithms using simulated as well as true physiological data. We employed standard algorithms available for MATLAB. The results clearly show that parametric bispectral estimation is superior to non-parametric estimation in terms of the quality of peak localisation and the discrimination from other peaks.

  18. Phase statistics in non-Gaussian scattering

    International Nuclear Information System (INIS)

    Watson, Stephen M; Jakeman, Eric; Ridley, Kevin D

    2006-01-01

    Amplitude weighting can improve the accuracy of frequency measurements in signals corrupted by multiplicative speckle noise. When the speckle field constitutes a circular complex Gaussian process, the optimal function of amplitude weighting is provided by the field intensity, corresponding to the intensity-weighted phase derivative statistic. In this paper, we investigate the phase derivative and intensity-weighted phase derivative returned from a two-dimensional random walk, which constitutes a generic scattering model capable of producing both Gaussian and non-Gaussian fluctuations. Analytical results are developed for the correlation properties of the intensity-weighted phase derivative, as well as limiting probability densities of the scattered field. Numerical simulation is used to generate further probability densities and determine optimal weighting criteria from non-Gaussian fields. The results are relevant to frequency retrieval in radiation scattered from random media

  19. A Bayesian non-inferiority test for two independent binomial proportions.

    Science.gov (United States)

    Kawasaki, Yohei; Miyaoka, Etsuo

    2013-01-01

    In drug development, non-inferiority tests are often employed to determine the difference between two independent binomial proportions. Many test statistics for non-inferiority are based on the frequentist framework. However, research on non-inferiority in the Bayesian framework is limited. In this paper, we suggest a new Bayesian index τ = P(π₁ > π₂ − Δ₀ | X₁, X₂), where X₁ and X₂ denote binomial random variables for trials of sizes n₁ and n₂, with parameters π₁ and π₂, respectively, and the non-inferiority margin is Δ₀ > 0. We show two calculation methods for τ: an approximate method that uses a normal approximation and an exact method that uses the exact posterior PDF. We compare the approximate probability with the exact probability for τ. Finally, we present the results of actual clinical trials to show the utility of the index τ. Copyright © 2013 John Wiley & Sons, Ltd.
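
    The index τ is straightforward to approximate by Monte Carlo, assuming uniform Beta(1, 1) priors so the posteriors are Beta distributions; the trial counts and margin below are hypothetical, and the normal-approximation variant from the abstract is included for comparison:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)
x1, n1 = 78, 100          # responders / size, test arm (hypothetical)
x2, n2 = 80, 100          # responders / size, control arm (hypothetical)
delta0 = 0.10             # non-inferiority margin

# Uniform Beta(1, 1) priors give Beta(x + 1, n - x + 1) posteriors
m = 200_000
pi1 = rng.beta(x1 + 1, n1 - x1 + 1, m)
pi2 = rng.beta(x2 + 1, n2 - x2 + 1, m)
tau = np.mean(pi1 > pi2 - delta0)      # Monte Carlo estimate of the index

# Normal-approximation version for comparison
p1, p2 = x1 / n1, x2 / n2
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
tau_approx = 0.5 * (1.0 + erf((p1 - p2 + delta0) / (se * sqrt(2.0))))
```

    Non-inferiority would be claimed when τ exceeds a pre-specified threshold (e.g. 0.95); in this hypothetical example τ ≈ 0.92 falls just short.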

  20. Statistical tests for person misfit in computerized adaptive testing

    NARCIS (Netherlands)

    Glas, Cornelis A.W.; Meijer, R.R.; van Krimpen-Stoop, Edith

    1998-01-01

    Recently, several person-fit statistics have been proposed to detect nonfitting response patterns. This study is designed to generalize an approach followed by Klauer (1995) to an adaptive testing system using the two-parameter logistic model (2PL) as a null model. The approach developed by Klauer

  1. Statistical analysis of natural radiation levels inside the UNICAMP campus through the use of Geiger-Muller counter

    International Nuclear Information System (INIS)

    Fontolan, Juliana A.; Biral, Antonio Renato P.

    2013-01-01

    It is known that the distribution of time intervals between random, unrelated events leads to the Poisson distribution. This work aims to study the distribution of time intervals between events resulting from the radioactive decay of atoms in UNICAMP environments where activities involving the use of ionizing radiation are performed. The proposal is that surveys of the distribution of these events at different locations of the university be carried out using a Geiger-Muller tube. In a next step, the distributions obtained are evaluated using non-parametric statistics (chi-square and Kolmogorov-Smirnov tests). For analyses involving correlations, we intend to use the ANOVA (Analysis of Variance) statistical tool. Measurements were performed at six different places within the Campinas campus, using a Geiger-Muller counter in count mode with a time window of 20 seconds. Through the chi-square and Kolmogorov-Smirnov statistical tests, using the EXCEL program, it was observed that the distributions indeed correspond to a Poisson distribution. Finally, the next step is to perform analyses involving correlations using the ANOVA statistical tool
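
    A chi-square goodness-of-fit check of count data against a Poisson model can be sketched as follows; the count rate, sample size, and tail-pooling bounds are hypothetical, and one degree of freedom is removed for the estimated rate:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical 20-second Geiger-Muller counts at one campus location
counts = rng.poisson(7.3, 500)
lam = counts.mean()                             # estimated counts per window

# Pool the tails so every expected cell count stays comfortably large
binned = np.clip(counts, 3, 12)
values, observed = np.unique(binned, return_counts=True)
pmf = stats.poisson.pmf(values, lam)
pmf[values == 3] = stats.poisson.cdf(3, lam)    # pooled lower tail, X <= 3
pmf[values == 12] = stats.poisson.sf(11, lam)   # pooled upper tail, X >= 12
expected = pmf * counts.size
expected *= observed.sum() / expected.sum()     # match totals for chisquare

# One df lost for the estimated rate parameter
chi2, p = stats.chisquare(observed, expected, ddof=1)
```

    A large p-value is consistent with the Poisson hypothesis, which is the conclusion the abstract reports for the measured locations.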

  2. Quantile regression for the statistical analysis of immunological data with many non-detects

    NARCIS (Netherlands)

    Eilers, P.H.C.; Roder, E.; Savelkoul, H.F.J.; Wijk, van R.G.

    2012-01-01

    Background Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical

  3. [Clinical research IV. Relevancy of the statistical test chosen].

    Science.gov (United States)

    Talavera, Juan O; Rivas-Ruiz, Rodolfo

    2011-01-01

    When we look at the difference between two therapies or the association of a risk factor or prognostic indicator with its outcome, we need to evaluate the accuracy of the result. This assessment is based on a judgment that uses information about the study design and the statistical management of the information. This paper specifically addresses the relevance of the statistical test selected. Statistical tests are chosen mainly according to two characteristics: the objective of the study and the type of variables. The objective can be divided into three test groups: a) those in which you want to show differences between groups, or within a group before and after a maneuver; b) those that seek to show the relationship (correlation) between variables; and c) those that aim to predict an outcome. The types of variables are divided into two: quantitative (continuous and discontinuous) and qualitative (ordinal and dichotomous). For example, if we seek to demonstrate differences in age (a quantitative variable) among patients with systemic lupus erythematosus (SLE) with and without neurological disease (two groups), the appropriate test is the Student t test for independent samples. But if the comparison concerns the frequency of females (a binomial variable), then the appropriate statistical test is the χ² test.
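
    The two examples in the abstract map directly onto standard SciPy calls; the ages and the 2×2 sex-by-group table below are hypothetical illustrations, not data from the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical ages for SLE patients with and without neurological disease
age_neuro = rng.normal(48.0, 12.0, 60)
age_no_neuro = rng.normal(36.0, 12.0, 60)

# Quantitative outcome, two independent groups -> Student's t test
t_stat, p_age = stats.ttest_ind(age_neuro, age_no_neuro)

# Dichotomous outcome (sex) -> chi-square test on the 2x2 contingency table
table = np.array([[30, 30],     # with neuro disease: female, male
                  [41, 19]])    # without:            female, male
chi2, p_sex, dof, expected = stats.chi2_contingency(table)
```

    The test is thus dictated by the variable type: a continuous outcome gets the t test, a dichotomous one the χ² test on counts.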

  4. Statistical physics of non-thermal phase transitions from foundations to applications

    CERN Document Server

    Abaimov, Sergey G

    2015-01-01

    Statistical physics can be used to better understand non-thermal complex systems—phenomena such as stock-market crashes, revolutions in society and in science, fractures in engineered materials and in the Earth’s crust, catastrophes, traffic jams, petroleum clusters, polymerization, self-organized criticality and many others exhibit behaviors resembling those of thermodynamic systems. In particular, many of these systems possess phase transitions identical to critical or spinodal phenomena in statistical physics. The application of the well-developed formalism of statistical physics to non-thermal complex systems may help to predict and prevent such catastrophes as earthquakes, snow-avalanches and landslides, failure of engineering structures, or economical crises. This book addresses the issue step-by-step, from phenomenological analogies between complex systems and statistical physics to more complex aspects, such as correlations, fluctuation-dissipation theorem, susceptibility, the concept of free ener...

  5. Statistical searches for microlensing events in large, non-uniformly sampled time-domain surveys: A test using palomar transient factory data

    Energy Technology Data Exchange (ETDEWEB)

    Price-Whelan, Adrian M.; Agüeros, Marcel A. [Department of Astronomy, Columbia University, 550 W 120th Street, New York, NY 10027 (United States); Fournier, Amanda P. [Department of Physics, Broida Hall, University of California, Santa Barbara, CA 93106 (United States); Street, Rachel [Las Cumbres Observatory Global Telescope Network, Inc., 6740 Cortona Drive, Suite 102, Santa Barbara, CA 93117 (United States); Ofek, Eran O. [Benoziyo Center for Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel); Covey, Kevin R. [Lowell Observatory, 1400 West Mars Hill Road, Flagstaff, AZ 86001 (United States); Levitan, David; Sesar, Branimir [Division of Physics, Mathematics, and Astronomy, California Institute of Technology, Pasadena, CA 91125 (United States); Laher, Russ R.; Surace, Jason, E-mail: adrn@astro.columbia.edu [Spitzer Science Center, California Institute of Technology, Mail Stop 314-6, Pasadena, CA 91125 (United States)

    2014-01-20

    Many photometric time-domain surveys are driven by specific goals, such as searches for supernovae or transiting exoplanets, which set the cadence with which fields are re-imaged. In the case of the Palomar Transient Factory (PTF), several sub-surveys are conducted in parallel, leading to non-uniform sampling over its ∼20,000 deg² footprint. While the median 7.26 deg² PTF field has been imaged ∼40 times in the R band, ∼2300 deg² have been observed >100 times. We use PTF data to study the trade-off between searching for microlensing events in a survey whose footprint is much larger than that of typical microlensing searches, but with far-from-optimal time sampling. To examine the probability that microlensing events can be recovered in these data, we test statistics used on uniformly sampled data to identify variables and transients. We find that the von Neumann ratio performs best for identifying simulated microlensing events in our data. We develop a selection method using this statistic and apply it to data from fields with >10 R-band observations, 1.1 × 10⁹ light curves, uncovering three candidate microlensing events. We lack simultaneous, multi-color photometry to confirm these as microlensing events. However, their number is consistent with predictions for the event rate in the PTF footprint over the survey's three years of operations, as estimated from near-field microlensing models. This work can help constrain all-sky event rate predictions and tests microlensing signal recovery in large data sets, which will be useful to future time-domain surveys, such as that planned with the Large Synoptic Survey Telescope.
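
    The von Neumann ratio is the mean squared successive difference divided by the variance: near 2 for uncorrelated noise and well below 2 for smooth, correlated variability. A sketch on a hypothetical flat light curve versus one carrying a point-lens (Paczynski) bump:

```python
import numpy as np

rng = np.random.default_rng(7)

def von_neumann_ratio(mag):
    """Mean squared successive difference over the variance (eta ~ 2 for noise)."""
    return np.mean(np.diff(mag) ** 2) / np.var(mag)

# Flat light curve: pure photometric noise
flat = rng.normal(18.0, 0.05, 300)

# Same noise plus a smooth point-lens magnification bump
t = np.linspace(-3.0, 3.0, 300)
u = np.hypot(0.3, t)                          # lens-source separation, u0 = 0.3
A = (u**2 + 2) / (u * np.sqrt(u**2 + 4))      # point-lens magnification
bump = flat - 2.5 * np.log10(A)               # brighter means smaller magnitude

eta_flat = von_neumann_ratio(flat)
eta_bump = von_neumann_ratio(bump)
# Smooth, correlated variability pulls eta far below the white-noise value of 2
```

    Ranking light curves by this ratio flags candidates with coherent variability even under irregular sampling, which is why it performed best in the authors' tests.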

  6. Unit Root Testing and Estimation in Nonlinear ESTAR Models with Normal and Non-Normal Errors.

    Directory of Open Access Journals (Sweden)

    Umair Khalil

    Full Text Available Exponential Smooth Transition Autoregressive (ESTAR) models can capture non-linear adjustment of deviations from equilibrium conditions, which may explain the economic behavior of many variables that appear non-stationary from a linear viewpoint. Many researchers employ the Kapetanios test, which has a unit root as the null and a stationary nonlinear model as the alternative. However, this test statistic is based on the assumption of normally distributed errors in the DGP. Cook analyzed the size of this nonlinear unit root test in the presence of a heavy-tailed innovation process and obtained critical values for both the finite-variance and infinite-variance cases. However, the test statistics of Cook are oversized. Researchers have found that using conventional tests is dangerous, though the best performance among these is achieved by an HCCME. The oversizing of LM tests can be reduced by employing fixed-design wild bootstrap remedies, which provide a valuable alternative to the conventional tests. In this paper, the size of the Kapetanios test statistic employing heteroscedasticity-consistent covariance matrices is derived, and results are reported for various sample sizes in which the size distortion is reduced. The properties of estimates of ESTAR models have been investigated when errors are assumed non-normal. We compare the results obtained by fitting nonlinear least squares with those of quantile regression fitting in the presence of outliers, with the error distribution taken to be a t-distribution, for various sample sizes.
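
    The Kapetanios (KSS) statistic is the t-ratio on δ in the auxiliary regression Δy_t = δ·y_{t-1}³ + ε_t, rejecting the unit-root null for large negative values. A sketch comparing a random walk with a hypothetical stationary ESTAR-type series; the DGP and sample size are illustrative only, and the critical value quoted is the approximate asymptotic 5% value for the case without deterministic terms:

```python
import numpy as np

rng = np.random.default_rng(8)

def kss_stat(y):
    """t-ratio on delta in  dy_t = delta * y_{t-1}**3 + e_t  (KSS test)."""
    dy = np.diff(y)
    z = y[:-1] ** 3
    delta = (z @ dy) / (z @ z)
    resid = dy - delta * z
    s2 = (resid @ resid) / (dy.size - 1)
    return delta / np.sqrt(s2 / (z @ z))

# Null: a pure random walk (unit root)
rw = np.cumsum(rng.standard_normal(500))

# Alternative: a stationary ESTAR-type series (strong reversion for large |y|)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = y[t - 1] * np.exp(-0.5 * y[t - 1] ** 2) + rng.standard_normal()

t_rw, t_estar = kss_stat(rw), kss_stat(y)
# Compare with the approximate 5% critical value of about -2.22 (raw case)
```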

  7. Parametric Resonance in Dynamical Systems

    CERN Document Server

    Nijmeijer, Henk

    2012-01-01

    Parametric Resonance in Dynamical Systems discusses the phenomenon of parametric resonance and its occurrence in mechanical systems,vehicles, motorcycles, aircraft and marine craft, and micro-electro-mechanical systems. The contributors provide an introduction to the root causes of this phenomenon and its mathematical equivalent, the Mathieu-Hill equation. Also included is a discussion of how parametric resonance occurs on ships and offshore systems and its frequency in mechanical and electrical systems. This book also: Presents the theory and principles behind parametric resonance Provides a unique collection of the different fields where parametric resonance appears including ships and offshore structures, automotive vehicles and mechanical systems Discusses ways to combat, cope with and prevent parametric resonance including passive design measures and active control methods Parametric Resonance in Dynamical Systems is ideal for researchers and mechanical engineers working in application fields such as MEM...

  8. Quantum tomography enhanced through parametric amplification

    Science.gov (United States)

    Knyazev, E.; Spasibko, K. Yu; Chekhova, M. V.; Khalili, F. Ya

    2018-01-01

    Quantum tomography is the standard method of reconstructing the Wigner function of quantum states of light by means of balanced homodyne detection. The reconstruction quality strongly depends on the photodetectors' quantum efficiency and other losses in the measurement setup. In this article we analyze in detail a protocol of enhanced quantum tomography, proposed by Leonhardt and Paul [1], which allows one to reduce the degrading effect of detection losses. It is based on phase-sensitive parametric amplification, with the phase of the amplified quadrature being scanned synchronously with the local oscillator phase. Although with sufficiently strong amplification the protocol enables overcoming any detection inefficiency, it has so far not been implemented experimentally, probably due to losses in the amplifier. Here we discuss a possible proof-of-principle experiment with a traveling-wave parametric amplifier. We show that with state-of-the-art optical elements, the protocol enables high-fidelity tomographic reconstruction of bright non-classical states of light. We consider two examples: bright squeezed vacuum and the squeezed single-photon state, the latter being a non-Gaussian state and both being strongly affected by losses.

  9. Epicyclic helical channels for parametric resonance ionization cooling

    Energy Technology Data Exchange (ETDEWEB)

    Johson, Rolland Paul [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States); Derbenev, Yaroslav [Muons, Inc., Batavia, IL (United States)

    2015-08-23

    Proposed next-generation muon colliders will require major technical advances to achieve rapid muon beam cooling requirements. Parametric-resonance Ionization Cooling (PIC) is proposed as the final 6D cooling stage of a high-luminosity muon collider. In PIC, a half-integer parametric resonance causes strong focusing of a muon beam at appropriately placed energy absorbers while ionization cooling limits the beam’s angular spread. Combining muon ionization cooling with parametric resonant dynamics in this way should then allow much smaller final transverse muon beam sizes than conventional ionization cooling alone. One of the PIC challenges is compensation of beam aberrations over a sufficiently wide parameter range while maintaining the dynamical stability with correlated behavior of the horizontal and vertical betatron motion and dispersion. We explore use of a coupling resonance to reduce the dimensionality of the problem and to shift the dynamics away from non-linear resonances. PIC simulations are presented.

  10. Statistical analysis and planning of multihundred-watt impact tests

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waterman, M.S.

    1977-10-01

    Modular multihundred-watt (MHW) radioisotope thermoelectric generators (RTG's) are used as a power source for spacecraft. Due to possible environmental contamination by radioactive materials, numerous tests are required to determine and verify the safety of the RTG. There are results available from 27 fueled MHW impact tests regarding hoop failure, fingerprint failure, and fuel failure. Data from the 27 tests are statistically analyzed for relationships that exist between the test design variables and the failure types. Next, these relationships are used to develop a statistical procedure for planning and conducting either future MHW impact tests or similar tests on other RTG fuel sources. Finally, some conclusions are given

  11. Fuzzy interval Finite Element/Statistical Energy Analysis for mid-frequency analysis of built-up systems with mixed fuzzy and interval parameters

    Science.gov (United States)

    Yin, Hui; Yu, Dejie; Yin, Shengwen; Xia, Baizhan

    2016-10-01

    This paper introduces mixed fuzzy and interval parametric uncertainties into the FE components of the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model for mid-frequency analysis of built-up systems, so that an uncertain ensemble combining non-parametric uncertainty with mixed fuzzy and interval parametric uncertainties arises. A fuzzy interval Finite Element/Statistical Energy Analysis (FIFE/SEA) framework is proposed to obtain the uncertain responses of built-up systems, which are described as intervals with fuzzy bounds, termed fuzzy-bounded intervals (FBIs) in this paper. Based on the level-cut technique, a first-order fuzzy interval perturbation FE/SEA (FFIPFE/SEA) and a second-order fuzzy interval perturbation FE/SEA method (SFIPFE/SEA) are developed to handle the mixed parametric uncertainties efficiently. FFIPFE/SEA approximates the response functions by a first-order Taylor series, while SFIPFE/SEA improves the accuracy by considering the second-order terms of the Taylor series, with all the mixed second-order terms neglected. To further improve the accuracy, a Chebyshev fuzzy interval method (CFIM) is proposed, in which Chebyshev polynomials are used to approximate the response functions. The FBIs are eventually reconstructed by assembling the extrema solutions at all cut levels. Numerical results on two built-up systems verify the effectiveness of the proposed methods.

  12. Statistical inference for the lifetime performance index based on generalised order statistics from exponential distribution

    Science.gov (United States)

    Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar

    2015-04-01

    In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called the conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index. Consequently, statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contain several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained, and optimal critical regions for hypothesis testing problems concerning CL are proposed. Finally, two real data sets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
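
    For the exponential case the connection between the two quantities is explicit: with μ = σ = θ, Montgomery's index CL = (μ − L)/σ reduces to 1 − L/θ, and the conforming rate is the monotone transform ηL = exp(CL − 1). A sketch with a hypothetical complete (uncensored) sample, a special case of the ordered-data settings treated in the paper:

```python
import numpy as np

rng = np.random.default_rng(9)
theta, L = 10.0, 2.0                  # true mean lifetime and lower limit
x = rng.exponential(theta, 400)       # hypothetical uncensored lifetimes

# For the exponential distribution mu = sigma = theta, so
# C_L = (mu - L) / sigma = 1 - L / theta; plug in the MLE of theta
theta_hat = x.mean()
c_l_hat = 1.0 - L / theta_hat

# One-to-one link to the conforming rate: eta_L = exp(C_L - 1)
eta_hat = np.exp(c_l_hat - 1.0)
empirical_rate = np.mean(x >= L)      # should agree with eta_hat
```

    Because the map CL ↦ ηL is one-to-one, a test or confidence interval for one quantity immediately yields one for the other, which is the equivalence the abstract exploits.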

  13. Different uptake of 99mTc-ECD and 99mTc-HMPAO in the same brains: analysis by statistical parametric mapping

    Energy Technology Data Exchange (ETDEWEB)

    Hyun, I.Y. [Dept. of Nuclear Medicine, Inha University College of Medicine, Incheon (Korea); Lee, J.S.; Lee, D.S. [Dept. of Nuclear Medicine, Seoul National University College of Medicine, Seoul (Korea); Rha, J.H.; Lee, I.K.; Ha, C.K. [Dept. of Neurology, Inha University College of Medicine, Incheon (Korea)

    2001-02-01

    The purpose of this study was to investigate the differences between technetium-99m ethyl cysteinate dimer (99mTc-ECD) and technetium-99m hexamethylpropylene amine oxime (99mTc-HMPAO) uptake in the same brains by means of statistical parametric mapping (SPM) analysis. We examined 20 patients (9 male, 11 female, mean age 62 ± 12 years) using 99mTc-ECD and 99mTc-HMPAO single-photon emission tomography (SPET) and magnetic resonance imaging (MRI) of the brain less than 7 days after onset of stroke. MRI showed no cortical infarctions. Infarctions in the pons (6 patients) and medulla (1), ischaemic periventricular white matter lesions (13) and lacunar infarction (7) were found on MRI. Split-dose and sequential SPET techniques were used for 99mTc-ECD and 99mTc-HMPAO brain SPET, without repositioning of the patient. All of the SPET images were spatially transformed to standard space, smoothed and globally normalized. The differences between the 99mTc-ECD and 99mTc-HMPAO SPET images were statistically analysed using statistical parametric mapping (SPM) 96 software. The difference between the two groups was considered significant at a threshold of uncorrected P values less than 0.01. Visual analysis showed no hypoperfused areas on either 99mTc-ECD or 99mTc-HMPAO SPET images. SPM analysis revealed significantly different uptake of 99mTc-ECD and 99mTc-HMPAO in the same brains. On the 99mTc-ECD SPET images, relatively higher uptake was observed in the frontal, parietal and occipital lobes, in the left superior temporal lobe and in the superior region of the cerebellum. On the 99mTc-HMPAO SPET images, relatively higher uptake was observed in the medial temporal lobes, thalami, periventricular white matter and brain stem. These differences in uptake of the two tracers in the same brains on SPM analysis suggest that interpretation of cerebral perfusion is possible using SPET with 99mTc-ECD and

  14. Multi-level approach for parametric roll analysis

    Science.gov (United States)

    Kim, Taeyoung; Kim, Yonghwan

    2011-03-01

    The present study considers a multi-level approach for the analysis of parametric roll phenomena. Three computation methods are applied within the multi-level approach: GM variation, the impulse response function (IRF), and a Rankine panel method. The IRF and Rankine panel methods are based on a weakly nonlinear formulation which includes nonlinear Froude-Krylov and restoring forces. In a parametric roll occurrence test in regular waves, the IRF and Rankine panel methods show similar tendencies. Although the GM variation approach predicts the occurrence of parametric roll at twice the roll natural frequency, its frequency criterion differs slightly. Nonlinear roll motion in bichromatic waves is also considered in this study. To confirm the unstable roll motion in bichromatic waves, theoretical and numerical approaches are applied. The occurrence of parametric roll is theoretically examined by introducing the quasi-periodic Mathieu equation. The instability criteria are well predicted by the stability analysis in the theoretical approach. Fourier analysis verifies that difference-frequency effects create the unstable roll motion. The occurrence of unstable roll motion in bichromatic waves is also observed in the experiment.
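
    The Mathieu-type instability mentioned above can be illustrated in its simplest monochromatic form. The sketch below (all frequencies, the modulation depth and the time span are illustrative assumptions, not values from the study) integrates an undamped Mathieu roll equation and shows that an encounter frequency near twice the roll natural frequency drives parametric growth, while an off-resonant one does not:

```python
import numpy as np
from scipy.integrate import solve_ivp

W0 = 1.0  # roll natural frequency in rad/s (assumed for illustration)

def mathieu(t, y, h, we):
    """Undamped Mathieu roll equation: phi'' + w0^2 * (1 + h*cos(we*t)) * phi = 0."""
    phi, dphi = y
    return [dphi, -W0**2 * (1.0 + h * np.cos(we * t)) * phi]

def max_roll(h, we, t_end=200.0):
    """Integrate from a small initial heel and return the peak roll angle."""
    sol = solve_ivp(mathieu, (0.0, t_end), [0.01, 0.0], args=(h, we),
                    max_step=0.05, rtol=1e-8)
    return float(np.max(np.abs(sol.y[0])))

# Principal instability region: encounter frequency at twice the natural frequency
unstable = max_roll(h=0.3, we=2.0 * W0)
# Off-resonant excitation: the roll amplitude stays bounded
stable = max_roll(h=0.3, we=3.1 * W0)
```

    With these assumed values the resonant case grows by orders of magnitude while the off-resonant case stays near the initial 0.01 rad heel, mirroring the frequency criterion discussed above.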

  15. A course in mathematical statistics and large sample theory

    CERN Document Server

    Bhattacharya, Rabi; Patrangenaru, Victor

    2016-01-01

    This graduate-level textbook is primarily aimed at graduate students of statistics, mathematics, science, and engineering who have had an undergraduate course in statistics, an upper-division course in analysis, and some acquaintance with measure-theoretic probability. It provides a rigorous presentation of the core of mathematical statistics. Part I of this book constitutes a one-semester course on basic parametric mathematical statistics. Part II deals with the large sample theory of statistics, parametric and nonparametric, and its contents may be covered in one semester as well. Part III provides brief accounts of a number of topics of current interest for practitioners and other disciplines whose work involves statistical methods. Large sample theory is illustrated with many worked examples, numerical calculations, and simulations. Appendices provide ready access to a number of standard results, with many proofs. Solutions are given to a number of selected exercises from Part I, Part II exercises with ...

  16. Analysis of trends of low flow in river stations in eastern Slovakia

    Directory of Open Access Journals (Sweden)

    Martina Zeleňáková

    2012-01-01

    Full Text Available The applicability of hypothesis-testing techniques for identifying long-term trends in hydrological time series is investigated in this study. The aim is to analyse trends of low flows in streams in eastern Slovakia, namely in the Poprad, Hornád, Bodva and Bodrog river basins. The article presents a methodology for prediction of hydrological drought based on statistical testing of low stream flows by a non-parametric statistical test. The main objective is to identify low flow trends at the 63 selected river stations in eastern Slovakia. Stations with human impacts are also evaluated. The Mann-Kendall non-parametric test has been used to detect trends in the hydrological time series. Statistically significant trends have been determined from the trend lines for the whole territory of eastern Slovakia. The results indicate that the observed changes in Slovakian river basins do not have a clearly defined trend.
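
    As a concrete illustration of the trend-detection step, a minimal Mann-Kendall test can be sketched as follows. This omits the tie and autocorrelation corrections used in practice for real discharge series, and the sample series is invented, not Slovakian flow data:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no correction for ties or autocorrelation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S: sum of signs over all pairs (later value minus earlier value)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance of S under H0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)             # continuity-corrected z score
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2.0 * (1.0 - norm.cdf(abs(z)))           # two-sided p-value
    return s, z, p

# An invented, mostly decreasing low-flow series: significant negative trend
s, z, p = mann_kendall([10, 9, 9.5, 8, 7.5, 7, 6, 5.5, 5, 4])
```

    A significantly negative z indicates a decreasing low-flow trend at that station; the study applies such a test station by station.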

  17. On selection of optimal stochastic model for accelerated life testing

    International Nuclear Information System (INIS)

    Volf, P.; Timková, J.

    2014-01-01

    This paper deals with the problem of proper lifetime model selection in the context of statistical reliability analysis. Namely, we consider regression models describing the dependence of failure intensities on a covariate, for instance a stressor. Testing the model fit is typically based on the so-called martingale residuals, whose analysis has already been studied by many authors. Nevertheless, the Bayes approach to the problem, in spite of its advantages, is still developing. We present the Bayes procedure of estimation in several semi-parametric regression models of failure intensity. Our main concern is then the Bayes construction of residual processes and goodness-of-fit tests based on them. The method is illustrated with both artificial and real-data examples. - Highlights: • Statistical survival and reliability analysis and Bayes approach. • Bayes semi-parametric regression modeling in Cox's and AFT models. • Bayes version of martingale residuals and goodness-of-fit test

  18. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

    Full Text Available Abstract Background Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is indeed not sufficient to decide if this motif is significantly more exceptional in one sequence than in the other. A statistical test is required. Results We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion The exact binomial test is particularly adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
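
    The exact binomial test described above can be sketched as follows: conditional on the total number of occurrences in the two sequences, the count in sequence 1 is binomial under the null hypothesis of equal exceptionality, with a success probability fixed by the expected counts. The counts and expectations below are invented for illustration:

```python
from scipy.stats import binomtest

def compare_motif_counts(n1, n2, e1, e2):
    """Exact binomial test: conditional on the total n1 + n2, the count n1 is
    Binomial(n1 + n2, p0) under H0 of equal exceptionality, where p0 is set
    by the expected counts e1 and e2 in the two sequences."""
    p0 = e1 / (e1 + e2)
    return binomtest(n1, n=n1 + n2, p=p0, alternative="two-sided").pvalue

# Invented counts: a motif seen 30 times in sequence 1 but only 10 times in
# sequence 2, while both sequence compositions predict about 20 occurrences each
pval = compare_motif_counts(30, 10, 20.0, 20.0)
```

    A small p-value here says the motif is significantly more exceptional in one sequence than in the other, which is exactly the question motif-count p-values alone cannot answer.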

  19. Testing the statistical compatibility of independent data sets

    International Nuclear Information System (INIS)

    Maltoni, M.; Schwetz, T.

    2003-01-01

    We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ² minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistic is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed.
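
    A toy numerical sketch of this idea: two data sets each constrain a common parameter, and the test statistic is the joint χ² minimum minus the sum of the individual minima, compared against a χ² distribution whose degrees of freedom are the parameters counted per data set minus those fitted jointly. The Gaussian constraints below are invented, not neutrino data:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import chi2

# Each data set constrains the same parameter theta (values are invented)
chi2_a = lambda th: ((th - 1.0) / 0.2) ** 2   # set A prefers theta = 1.0 +/- 0.2
chi2_b = lambda th: ((th - 2.0) / 0.3) ** 2   # set B prefers theta = 2.0 +/- 0.3

joint = lambda th: chi2_a(th) + chi2_b(th)
th_min = minimize_scalar(joint, bounds=(0.0, 3.0), method="bounded").x

# The individual minima are both zero here, so the compatibility statistic
# is just the joint minimum itself
chi2_pg = joint(th_min) - (0.0 + 0.0)
dof = (1 + 1) - 1                 # one parameter per data set, one fitted jointly
p_compat = chi2.sf(chi2_pg, dof)  # small p-value flags incompatible data sets
```

    The statistic is driven entirely by the tension between the two preferred values, so it is not diluted by however many insensitive data points each set contains.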

  20. A parametric study of strength reduction factors for elasto-plastic ...

    Indian Academy of Sciences (India)

    A parametric study of strength reduction factors for elasto-plastic oscillators ... motion duration, earthquake magnitude, geological site conditions, and epicentral distance in case of (non-degrading) elasto-plastic oscillators.

  1. Non-Parametric Kinetic (NPK) Analysis of Thermal Oxidation of Carbon Aerogels

    Directory of Open Access Journals (Sweden)

    Azadeh Seifi

    2017-05-01

    Full Text Available In recent years, much attention has been paid to aerogel materials (especially carbon aerogels) due to their potential uses in energy-related applications, such as thermal energy storage and thermal protection systems. These open-cell carbon-based porous materials (carbon aerogels) can strongly react with oxygen at relatively low temperatures (~400°C). Therefore, it is necessary to evaluate the thermal performance of carbon aerogels in view of their energy-related applications at high temperatures and under thermal oxidation conditions. The objective of this paper is to study theoretically and experimentally the oxidation reaction kinetics of carbon aerogel using the non-parametric kinetic (NPK) approach as a powerful method. For this purpose, a non-isothermal thermogravimetric analysis, at three different heating rates, was performed on three samples, each with its specific pore structure, density and specific surface area. The most significant feature of this method, in comparison with the model-free isoconversional methods, is its ability to separate the functionality of the reaction rate with the degree of conversion and temperature by the direct use of thermogravimetric data. Using this method, it was observed that the Nomen-Sempere model provided the best fit to the data, while the temperature dependence of the rate constant was best explained by a Vogel-Fulcher relationship, where the reference temperature was the onset temperature of oxidation. Moreover, it was found from the results of this work that the assumption of an Arrhenius relation for the temperature dependence of the rate constant led to over-estimation of the apparent activation energy (up to 160 kJ/mol), considerably different from the values (up to 3.5 kJ/mol) predicted by the Vogel-Fulcher relationship in isoconversional methods.
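
    The reported gap between the Arrhenius and Vogel-Fulcher activation energies can be reproduced qualitatively with a small numerical sketch: generate rate constants from an assumed Vogel-Fulcher law and force an Arrhenius line through them. All parameter values below are illustrative assumptions, not fitted values from the paper:

```python
import numpy as np

# Assumed Vogel-Fulcher law: k(T) = A * exp(-B / (T - T0))
A, B, T0 = 1.0e3, 400.0, 650.0          # T0 plays the role of the onset temperature
T = np.linspace(700.0, 900.0, 30)       # K, hypothetical oxidation window
k = A * np.exp(-B / (T - T0))

# Force-fit an Arrhenius line ln k = ln A' - Ea/(R*T) through the same data
R = 8.314  # J/(mol K)
slope, _ = np.polyfit(1.0 / T, np.log(k), 1)
Ea_apparent = -slope * R  # J/mol

# The Vogel-Fulcher "barrier" R*B is only about 3.3 kJ/mol, but the apparent
# Arrhenius activation energy comes out far larger, echoing the over-estimation
# reported for the carbon aerogel data.
```

    The divergence of 1/(T - T0) near the onset temperature is what inflates the apparent Arrhenius slope.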

  2. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.

  3. Prediction intervals for future BMI values of individual children - a non-parametric approach by quantile boosting

    Directory of Open Access Journals (Sweden)

    Mayr Andreas

    2012-01-01

    Full Text Available Abstract Background The construction of prediction intervals (PIs for future body mass index (BMI values of individual children based on a recent German birth cohort study with n = 2007 children is problematic for standard parametric approaches, as the BMI distribution in childhood is typically skewed depending on age. Methods We avoid distributional assumptions by directly modelling the borders of PIs by additive quantile regression, estimated by boosting. We point out the concept of conditional coverage to prove the accuracy of PIs. As conditional coverage can hardly be evaluated in practical applications, we conduct a simulation study before fitting child- and covariate-specific PIs for future BMI values and BMI patterns for the present data. Results The results of our simulation study suggest that PIs fitted by quantile boosting cover future observations with the predefined coverage probability and outperform the benchmark approach. For the prediction of future BMI values, quantile boosting automatically selects informative covariates and adapts to the age-specific skewness of the BMI distribution. The lengths of the estimated PIs are child-specific and increase, as expected, with the age of the child. Conclusions Quantile boosting is a promising approach to construct PIs with correct conditional coverage in a non-parametric way. It is in particular suitable for the prediction of BMI patterns depending on covariates, since it provides an interpretable predictor structure, inherent variable selection properties and can even account for longitudinal data structures.
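
    The borders of such prediction intervals can be fitted directly as conditional quantiles. The paper uses boosted additive quantile regression; the sketch below substitutes scikit-learn's gradient boosting with the quantile (pinball) loss and a simulated, right-skewed "BMI-like" response, so the data and model settings are stand-in assumptions:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
age = rng.uniform(2.0, 18.0, 600)[:, None]                 # single covariate: age
bmi = 15.0 + 0.4 * age.ravel() + rng.gamma(2.0, 1.0, 600)  # skewed response

# Fit the lower and upper PI borders as two separate quantile regressions
bands = {}
for q in (0.05, 0.95):
    gbr = GradientBoostingRegressor(loss="quantile", alpha=q, random_state=0)
    bands[q] = gbr.fit(age, bmi).predict(age)

# Empirical coverage of the nominal 90% interval on the training data
coverage = float(np.mean((bmi >= bands[0.05]) & (bmi <= bands[0.95])))
```

    Coverage should land near the nominal 0.90; the conditional coverage the authors emphasize additionally requires the interval to be correct at each covariate value, which is what their simulation study checks.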

  4. Parametric instability in GEO 600 interferometer

    International Nuclear Information System (INIS)

    Gurkovsky, A.G.; Vyatchanin, S.P.

    2007-01-01

    We present an analysis of the undesirable effect of parametric instability in the signal recycled GEO 600 interferometer. The basis for this effect is provided by excitation of an additional (Stokes) optical mode, having frequency ω₁, and a mirror elastic mode, having frequency ωₘ, when the optical energy stored in the main FP cavity mode, having frequency ω₀, exceeds a certain threshold and the detuning Δ = ω₀ − ω₁ − ωₘ is small. We discuss the potential of observing parametric instability and its precursors in the GEO 600 interferometer. This approach provides the best option to get familiar with this phenomenon, to develop experimental methods to suppress it and to test the effectiveness of these methods in situ.

  5. Statistical clustering of parametric maps from dynamic contrast enhanced MRI and an associated decision tree model for non-invasive tumour grading of T1b solid clear cell renal cell carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Xi, Yin; Yuan, Qing; Zhang, Yue; Fulkerson, Michael [UT Southwestern Medical Center, Department of Radiology, Dallas, TX (United States); Madhuranthakam, Ananth J. [UT Southwestern Medical Center, Department of Radiology, Dallas, TX (United States); UT Southwestern Medical Center, Advanced Imaging Research Center, Dallas, TX (United States); Margulis, Vitaly; Cadeddu, Jeffrey A. [UT Southwestern Medical Center, Department of Urology, Dallas, TX (United States); UT Southwestern Medical Center, Kidney Cancer Program, Simmons Comprehensive Cancer Center, Dallas, TX (United States); Brugarolas, James [UT Southwestern Medical Center, Kidney Cancer Program, Simmons Comprehensive Cancer Center, Dallas, TX (United States); UT Southwestern Medical Center, Department of Internal Medicine, Dallas, TX (United States); Kapur, Payal [UT Southwestern Medical Center, Department of Urology, Dallas, TX (United States); UT Southwestern Medical Center, Kidney Cancer Program, Simmons Comprehensive Cancer Center, Dallas, TX (United States); UT Southwestern Medical Center, Department of Pathology, Dallas, Texas (United States); Pedrosa, Ivan [UT Southwestern Medical Center, Department of Radiology, Dallas, TX (United States); UT Southwestern Medical Center, Advanced Imaging Research Center, Dallas, TX (United States); UT Southwestern Medical Center, Kidney Cancer Program, Simmons Comprehensive Cancer Center, Dallas, TX (United States)

    2018-01-15

    To apply a statistical clustering algorithm to combine information from dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) into a single tumour map to distinguish high-grade from low-grade T1b clear cell renal cell carcinoma (ccRCC). This prospective, Institutional Review Board-approved, Health Insurance Portability and Accountability Act-compliant study included 18 patients with solid T1b ccRCC who underwent pre-surgical DCE MRI. After statistical clustering of the parametric maps of the transfer constant between the intravascular and extravascular space (K{sup trans}), rate constant (K{sub ep}) and initial area under the concentration curve (iAUC) with a fuzzy c-means (FCM) algorithm, each tumour was segmented into three regions (low/medium/high active areas). Percentages of each region and tumour size were compared to tumour grade at histopathology. A decision-tree model was constructed to select the best parameter(s) to predict high-grade ccRCC. Seven high-grade and 11 low-grade T1b ccRCCs were included. High-grade histology was associated with higher percent high active areas (p = 0.0154) and this was the only feature selected by the decision tree model, which had a diagnostic performance of 78% accuracy, 86% sensitivity, 73% specificity, 67% positive predictive value and 89% negative predictive value. The FCM integrates multiple DCE-derived parameter maps and identifies tumour regions with unique pharmacokinetic characteristics. Using this approach, a decision tree model using criteria beyond size to predict tumour grade in T1b ccRCCs is proposed. (orig.)
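
    The clustering step can be sketched with a plain NumPy fuzzy c-means applied to per-voxel feature vectors (Ktrans, Kep, iAUC). Both the implementation and the simulated three-level feature data below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100):
    """Basic fuzzy c-means: returns an (n, c) membership matrix and c centroids."""
    # Simple deterministic init: spread initial centroids across the data range
    centers = np.quantile(X, np.linspace(0.0, 1.0, c), axis=0)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)        # standard FCM update
        W = U ** m                                       # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]     # weighted centroids
    return U, centers

# Simulated voxels with low / medium / high pharmacokinetic activity
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.1, (50, 3)) for mu in (0.0, 1.0, 2.0)])
U, centers = fuzzy_c_means(X, c=3)
labels = U.argmax(axis=1)  # harden memberships into three tumour regions
```

    Hardening the soft memberships by argmax yields the low/medium/high active regions whose percentages are then compared to histopathological grade.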

  6. Statistical clustering of parametric maps from dynamic contrast enhanced MRI and an associated decision tree model for non-invasive tumour grading of T1b solid clear cell renal cell carcinoma

    International Nuclear Information System (INIS)

    Xi, Yin; Yuan, Qing; Zhang, Yue; Fulkerson, Michael; Madhuranthakam, Ananth J.; Margulis, Vitaly; Cadeddu, Jeffrey A.; Brugarolas, James; Kapur, Payal; Pedrosa, Ivan

    2018-01-01

    To apply a statistical clustering algorithm to combine information from dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) into a single tumour map to distinguish high-grade from low-grade T1b clear cell renal cell carcinoma (ccRCC). This prospective, Institutional Review Board-approved, Health Insurance Portability and Accountability Act-compliant study included 18 patients with solid T1b ccRCC who underwent pre-surgical DCE MRI. After statistical clustering of the parametric maps of the transfer constant between the intravascular and extravascular space (Ktrans), rate constant (Kep) and initial area under the concentration curve (iAUC) with a fuzzy c-means (FCM) algorithm, each tumour was segmented into three regions (low/medium/high active areas). Percentages of each region and tumour size were compared to tumour grade at histopathology. A decision-tree model was constructed to select the best parameter(s) to predict high-grade ccRCC. Seven high-grade and 11 low-grade T1b ccRCCs were included. High-grade histology was associated with higher percent high active areas (p = 0.0154) and this was the only feature selected by the decision tree model, which had a diagnostic performance of 78% accuracy, 86% sensitivity, 73% specificity, 67% positive predictive value and 89% negative predictive value. The FCM integrates multiple DCE-derived parameter maps and identifies tumour regions with unique pharmacokinetic characteristics. Using this approach, a decision tree model using criteria beyond size to predict tumour grade in T1b ccRCCs is proposed. (orig.)

  7. Statistical State Dynamics Based Study of the Role of Nonlinearity in the Maintenance of Turbulence in Couette Flow

    Science.gov (United States)

    Farrell, Brian; Ioannou, Petros; Nikolaidis, Marios-Andreas

    2017-11-01

    While linear non-normality underlies the mechanism of energy transfer from the externally driven flow to the perturbation field, nonlinearity is also known to play an essential role in sustaining turbulence. We report a study based on the statistical state dynamics of Couette flow turbulence with the goal of better understanding the role of nonlinearity in sustaining turbulence. The statistical state dynamics implementations used are ensemble closures at second order in a cumulant expansion of the Navier-Stokes equations in which the averaging operator is the streamwise mean. Two fundamentally non-normal mechanisms potentially contributing to maintaining the second cumulant are identified. These are essentially parametric perturbation growth arising from interaction of the perturbations with the fluctuating mean flow and transient growth of perturbations arising from nonlinear interaction between components of the perturbation field. By the method of selectively including these mechanisms parametric growth is found to maintain the perturbation field in the turbulent state while the more commonly invoked mechanism associated with transient growth of perturbations arising from scattering by nonlinear interaction is found to suppress perturbation variance. Funded by ERC Coturb Madrid Summer Program and NSF AGS-1246929.

  8. Replication of Non-Trivial Directional Motion in Multi-Scales Observed by the Runs Test

    Science.gov (United States)

    Yura, Yoshihiro; Ohnishi, Takaaki; Yamada, Kenta; Takayasu, Hideki; Takayasu, Misako

    Non-trivial autocorrelation in the up-down statistics of financial market price fluctuations is revealed by a multi-scale runs test (Wald-Wolfowitz test). We apply two models, a stochastic price model and a dealer model, to understand this property. In both approaches we successfully reproduce the non-stationary directional price motions consistent with the runs test by tuning parameters in the models. We find that two types of dealers exist in the markets: a short-time-scale trend-follower and an extended-time-scale contrarian, who are active in different time periods.
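
    The Wald-Wolfowitz runs test on up-down statistics can be sketched directly: count the runs of consecutive same-sign price moves and compare against the normal approximation of the run count under randomness. The example sequences are artificial:

```python
import numpy as np
from scipy.stats import norm

def runs_test(signs):
    """Wald-Wolfowitz runs test on a sequence of +1/-1 price moves."""
    signs = np.asarray(signs)
    n_pos = int(np.sum(signs > 0))
    n_neg = int(np.sum(signs < 0))
    runs = 1 + int(np.sum(signs[1:] != signs[:-1]))    # number of same-sign blocks
    mu = 2.0 * n_pos * n_neg / (n_pos + n_neg) + 1.0   # expected runs under randomness
    var = (mu - 1.0) * (mu - 2.0) / (n_pos + n_neg - 1.0)
    z = (runs - mu) / np.sqrt(var)
    return z, 2.0 * (1.0 - norm.cdf(abs(z)))

# A trending series has far fewer runs than expected: strongly negative z
z_trend, p_trend = runs_test([1] * 10 + [-1] * 10)
# An alternating (contrarian-like) series has too many runs: positive z
z_alt, p_alt = runs_test([1, -1] * 10)
```

    Applied at different time scales, the sign of z separates trend-following (too few runs) from contrarian (too many runs) regimes, which is the property the two dealer types above reproduce.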

  9. HOW TO SELECT APPROPRIATE STATISTICAL TEST IN SCIENTIFIC ARTICLES

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-09-01

    Full Text Available Statistics is a mathematical science dealing with the collection, analysis, interpretation, and presentation of masses of numerical data in order to draw relevant conclusions. Statistics is a form of mathematical analysis that uses quantified models, representations and synopses for a given set of experimental data or real-life studies. Students and young researchers in biomedical sciences and in special education and rehabilitation often declare that they chose to enroll in that study program because they lack knowledge of, or interest in, mathematics. This is a sad statement, but there is much truth in it. The aim of this editorial is to help young researchers select statistics or statistical techniques and statistical software appropriate for the purposes and conditions of a particular analysis. The most important statistical tests are reviewed in the article. Knowing how to choose the right statistical test is an important asset and decision in research data processing and in the writing of scientific papers. Young researchers and authors should know how to choose and how to use statistical methods. The competent researcher will need knowledge of statistical procedures. That might include an introductory statistics course, and it most certainly includes using a good statistics textbook. For this purpose, there is a need to reinstate Statistics as a mandatory subject in the curriculum of the Institute of Special Education and Rehabilitation at the Faculty of Philosophy in Skopje. Young researchers need additional courses in statistics and to train themselves to use statistical software in an appropriate way.

  10. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straightforward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials......, it is shown how to set up parametric acceptance criteria for the batch that give high confidence that future samples will, with a probability larger than a specified value, pass the USP three-class criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  11. Parametrization of contrails in a comprehensive climate model

    Energy Technology Data Exchange (ETDEWEB)

    Ponater, M; Brinkop, S; Sausen, R; Schumann, U [Deutsche Forschungs- und Versuchsanstalt fuer Luft- und Raumfahrt e.V., Oberpfaffenhofen (Germany). Inst. fuer Physik der Atmosphaere

    1998-12-31

    A contrail parametrization scheme for a general circulation model (GCM) is presented. Guidelines for its development were that it should be based on the thermodynamic theory of contrail formation and that it should be consistent with the cloud parametrization scheme of the GCM. Results of a six-year test integration indicate reasonable results concerning the spatial and temporal development of both contrail coverage and contrail optical properties. Hence, the scheme forms a promising basis for the quantitative estimation of the contrail climatic impact. (author) 9 refs.

  12. Parametrization of contrails in a comprehensive climate model

    Energy Technology Data Exchange (ETDEWEB)

    Ponater, M.; Brinkop, S.; Sausen, R.; Schumann, U. [Deutsche Forschungs- und Versuchsanstalt fuer Luft- und Raumfahrt e.V., Oberpfaffenhofen (Germany). Inst. fuer Physik der Atmosphaere

    1997-12-31

    A contrail parametrization scheme for a general circulation model (GCM) is presented. Guidelines for its development were that it should be based on the thermodynamic theory of contrail formation and that it should be consistent with the cloud parametrization scheme of the GCM. Results of a six-year test integration indicate reasonable results concerning the spatial and temporal development of both contrail coverage and contrail optical properties. Hence, the scheme forms a promising basis for the quantitative estimation of the contrail climatic impact. (author) 9 refs.

  13. Monte Carlo testing in spatial statistics, with applications to spatial residuals

    DEFF Research Database (Denmark)

    Mrkvička, Tomáš; Soubeyrand, Samuel; Myllymäki, Mari

    2016-01-01

    This paper reviews recent advances made in testing in spatial statistics and discussed at the Spatial Statistics conference in Avignon 2015. The rank and directional quantile envelope tests are discussed and practical rules for their use are provided. These tests are global envelope tests...... with an appropriate type I error probability. Two novel examples are given on their usage. First, in addition to the test based on a classical one-dimensional summary function, the goodness-of-fit of a point process model is evaluated by means of the test based on a higher dimensional functional statistic, namely...

  14. Parametric estimation of P(X > Y) for normal distributions in the context of probabilistic environmental risk assessment.

    Science.gov (United States)

    Jacobs, Rianne; Bekker, Andriëtte A; van der Voet, Hilko; Ter Braak, Cajo J F

    2015-01-01

    Estimating the risk, P(X > Y), in probabilistic environmental risk assessment of nanoparticles is a problem when confronted by potentially small risks and small sample sizes of the exposure concentration X and/or the effect concentration Y. This is illustrated in the motivating case study of aquatic risk assessment of nano-Ag. A non-parametric estimator based on data alone is not sufficient as it is limited by sample size. In this paper, we investigate the maximum gain possible when making strong parametric assumptions as opposed to making no parametric assumptions at all. We compare maximum likelihood and Bayesian estimators with the non-parametric estimator and study the influence of sample size and risk on the (interval) estimators via simulation. We found that the parametric estimators enable us to estimate and bound the risk for smaller sample sizes and small risks. Also, the Bayesian estimator outperforms the maximum likelihood estimators in terms of coverage and interval lengths and is, therefore, preferred in our motivating case study.
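
    The contrast between the two estimators can be sketched as follows: with 20 observations each, the non-parametric pairwise estimator is limited by the 1/400 granularity of the sample pairs and may return exactly zero for small risks, whereas the normal plug-in estimator (a simple maximum-likelihood-style stand-in for the parametric estimators compared in the paper) still returns a small positive estimate. The simulated concentrations are invented:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 20)   # small sample of exposure concentrations
y = rng.normal(3.0, 1.0, 20)   # small sample of effect concentrations

# Non-parametric: proportion of (x_i, y_j) pairs with x_i > y_j
risk_np = float(np.mean(x[:, None] > y[None, :]))

# Parametric (both assumed normal): P(X > Y) = Phi((mu_x - mu_y) / sqrt(s_x^2 + s_y^2))
risk_par = float(norm.cdf((x.mean() - y.mean())
                          / np.hypot(x.std(ddof=1), y.std(ddof=1))))
```

    The parametric estimate remains strictly positive and can be bounded with confidence intervals even when no sampled exposure exceeds any sampled effect concentration.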

  15. Kolmogorov complexity, pseudorandom generators and statistical models testing

    Czech Academy of Sciences Publication Activity Database

    Šindelář, Jan; Boček, Pavel

    2002-01-01

    Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002

  16. statistical tests for frequency distribution of mean gravity anomalies

    African Journals Online (AJOL)

    ES Obe

    1980-03-01

    Mar 1, 1980 ... STATISTICAL TESTS FOR FREQUENCY DISTRIBUTION OF MEAN GRAVITY ANOMALIES. By ... approach. Kaula [1, 2] discussed the method of applying statistical techniques in the ... mathematical foundation of physical ...

  17. The Gap in Noise test in 11 and 12-year-old children.

    Science.gov (United States)

    Perez, Ana Paula; Pereira, Liliane Desgualdo

    2010-01-01

    Gap detection in 11 and 12-year-old children. The aim was to investigate temporal resolution through the Gap in Noise test in children of 11 and 12 years in order to establish criteria of normal development. Participants were 92 children, aged 11 and 12 years, enrolled in elementary school, with no evidence of otologic, neurologic or cognitive disorders, and with no history of learning difficulties or school failure. In addition, participants' hearing thresholds were within normal limits and their verbal recognition in the dichotic digits test was equal or superior to 95% of hits. All were submitted to the Gap in Noise test. The statistical analysis was performed with non-parametric tests at a significance level of 0.05 (5%). The average gap threshold was 5.05 ms, and the average percentage of correct answers was 71.70%. There was no statistically significant difference between responses by age (eleven and twelve years), by ear (right and left) or by gender (male and female). However, when comparing the tests, the 1st test showed a statistically significantly higher percentage of gap identifications than the 2nd test. In 78.27% of the population of this study, gap thresholds were up to 5 ms, a response recommended as the normality reference for the age group investigated.

  18. A parametric level-set approach for topology optimization of flow domains

    DEFF Research Database (Denmark)

    Pingen, Georg; Waidmann, Matthias; Evgrafov, Anton

    2010-01-01

    of the design variables in the traditional approaches is seen as a possible cause for the slow convergence. Non-smooth material distributions are suspected to trigger premature onset of instationary flows which cannot be treated by steady-state flow models. In the present work, we study whether the convergence...... and the versatility of topology optimization methods for fluidic systems can be improved by employing a parametric level-set description. In general, level-set methods allow controlling the smoothness of boundaries, yield a non-local influence of design variables, and decouple the material description from the flow...... field discretization. The parametric level-set method used in this study utilizes a material distribution approach to represent flow boundaries, resulting in a non-trivial mapping between design variables and local material properties. Using a hydrodynamic lattice Boltzmann method, we study...

  19. A physiology-based parametric imaging method for FDG-PET data

    Science.gov (United States)

    Scussolini, Mara; Garbarino, Sara; Sambuceti, Gianmario; Caviglia, Giacomo; Piana, Michele

    2017-12-01

    Parametric imaging is a compartmental approach that processes nuclear imaging data to estimate the spatial distribution of the kinetic parameters governing tracer flow. The present paper proposes a novel and efficient computational method for parametric imaging which is potentially applicable to several compartmental models of diverse complexity and which is effective in the determination of the parametric maps of all kinetic coefficients. We consider applications to [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) data and analyze the two-compartment catenary model describing the standard FDG metabolization by a homogeneous tissue and the three-compartment non-catenary model representing the renal physiology. We show uniqueness theorems for both models. The proposed imaging method starts from the reconstructed FDG-PET images of tracer concentration and preliminarily applies image processing algorithms for noise reduction and image segmentation. The optimization procedure solves pixel-wise the non-linear inverse problem of determining the kinetic parameters from dynamic concentration data through a regularized Gauss-Newton iterative algorithm. The reliability of the method is validated against synthetic data, for the two-compartment system, and experimental real data of murine models, for the renal three-compartment system.
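
    The pixel-wise optimization step can be illustrated for a single pixel with a simplified one-tissue-compartment model (a stand-in for the two- and three-compartment models in the paper); SciPy's trust-region least_squares plays the role of the regularized Gauss-Newton iteration, and all curves and rate constants below are synthetic assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 60.0, 61)           # minutes
cp = 5.0 * t * np.exp(-t / 4.0)          # synthetic plasma input function

def tissue_curve(params, t, cp):
    """One-tissue-compartment model: C(t) = k1 * (cp convolved with exp(-k2*t))."""
    k1, k2 = params
    dt = t[1] - t[0]
    return k1 * np.convolve(cp, np.exp(-k2 * t))[: len(t)] * dt

rng = np.random.default_rng(0)
k_true = (0.10, 0.05)
data = tissue_curve(k_true, t, cp) + rng.normal(0.0, 0.01, len(t))

# One pixel's non-linear inverse problem; in parametric imaging this fit is
# repeated for every pixel to build the k1 and k2 maps
fit = least_squares(lambda p: tissue_curve(p, t, cp) - data,
                    x0=[0.05, 0.10], bounds=([0.0, 0.0], [1.0, 1.0]))
k1_hat, k2_hat = fit.x
```

    Repeating the fit over all pixels of the segmented dynamic image yields the parametric maps; the segmentation and regularization steps in the paper keep this per-pixel problem well posed in the presence of noise.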

  20. Understanding the Sampling Distribution and Its Use in Testing Statistical Significance.

    Science.gov (United States)

    Breunig, Nancy A.

    Despite the increasing criticism of statistical significance testing by researchers, particularly in the publication of the 1994 American Psychological Association's style manual, statistical significance test results are still popular in journal articles. For this reason, it remains important to understand the logic of inferential statistics. A…