WorldWideScience

Sample records for nonparametric spectral analysis

  1. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a…

  2. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  3. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression … and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences … within a nonparametric panel data regression framework. The fourth paper analyses the technical efficiency of dairy farms with environmental output using nonparametric kernel regression in a semiparametric stochastic frontier analysis. The results provided in this PhD thesis show that nonparametric …

  4. Nonparametric analysis of blocked ordered categories data: some examples revisited

    Directory of Open Access Journals (Sweden)

    O. Thas

    2006-08-01

    Nonparametric analysis for general block designs can be given by using Cochran-Mantel-Haenszel (CMH) statistics. We demonstrate this with four examples and note that several well-known nonparametric statistics are special cases of CMH statistics.

  5. Comparative analysis of automotive paints by laser induced breakdown spectroscopy and nonparametric permutation tests

    International Nuclear Information System (INIS)

    McIntee, Erin; Viglino, Emilie; Rinke, Caitlin; Kumor, Stephanie; Ni, Liqiang; Sigman, Michael E.

    2010-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been investigated for the discrimination of automobile paint samples. Paint samples from automobiles of different makes, models, and years were collected and separated into sets based on the color, presence or absence of effect pigments and the number of paint layers. Twelve LIBS spectra were obtained for each paint sample, each an average of five single-shot 'drill down' spectra from consecutive laser ablations at the same spot on the sample. Analyses by a nonparametric permutation test and a parametric Wald test were performed to determine the extent of discrimination within each set of paint samples. The discrimination power and Type I error were assessed for each data analysis method. Conversion of the spectral intensity to a log-scale (base 10) resulted in a higher overall discrimination power while observing the same significance level. Working on the log-scale, the nonparametric permutation tests gave an overall 89.83% discrimination power with a Type I error rate of 4.44% at the nominal significance level of 5%. White paint samples, as a group, were the most difficult to differentiate, with a discrimination power of only 86.56%, followed by 95.83% for black paint samples. Parametric analysis of the data set produced lower discrimination power (85.17%) with a 3.33% Type I error rate and is not recommended on both theoretical and practical grounds. The nonparametric testing method is applicable across many analytical comparisons, with the specific application described here being the pairwise comparison of automotive paint samples.
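
    A minimal sketch of the kind of two-sample permutation test described above, applied to replicate log-scaled spectra. The synthetic data, replicate counts, and test statistic (Euclidean distance between mean spectra) are illustrative assumptions, not the authors' exact procedure.

```python
# Two-sample permutation test on log-scaled spectra (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(spectra_a, spectra_b, n_perm=10000):
    """p-value for the hypothesis that two sets of replicate spectra share a source.

    spectra_a, spectra_b : arrays of shape (n_replicates, n_channels)
    """
    pooled = np.vstack([spectra_a, spectra_b])
    n_a = spectra_a.shape[0]
    observed = np.linalg.norm(spectra_a.mean(axis=0) - spectra_b.mean(axis=0))
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled.shape[0])
        stat = np.linalg.norm(pooled[perm[:n_a]].mean(axis=0)
                              - pooled[perm[n_a:]].mean(axis=0))
        if stat >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

# Hypothetical replicate LIBS spectra (12 replicates x 1024 channels), log10-scaled
sample_1 = np.log10(np.abs(rng.normal(100, 10, size=(12, 1024))) + 1)
sample_2 = np.log10(np.abs(rng.normal(103, 10, size=(12, 1024))) + 1)
print(f"permutation p-value: {permutation_test(sample_1, sample_2):.4f}")
```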

  6. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  7. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb … parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used … by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true" …

  8. Non-parametric analysis of production efficiency of poultry egg ...

    African Journals Online (AJOL)

    Non-parametric analysis of production efficiency of poultry egg farmers in Delta ... analysis of factors affecting the output of poultry farmers showed that stock ... should be put in place for farmers to learn the best farm practices carried out on the ...

  9. Analysis of small sample size studies using nonparametric bootstrap test with pooled resampling method.

    Science.gov (United States)

    Dwivedi, Alok Kumar; Mallawaarachchi, Indika; Alvarado, Luis A

    2017-06-30

    Experimental studies in biomedical research frequently pose analytical problems related to small sample size. In such studies, there are conflicting findings regarding the choice of parametric and nonparametric analysis, especially with non-normal data. In such instances, some methodologists have questioned the validity of parametric tests and suggested nonparametric tests. In contrast, other methodologists found nonparametric tests to be too conservative and less powerful and thus preferred using parametric tests. Some researchers have recommended using a bootstrap test; however, this method also has limitations for small sample sizes. We used a pooled resampling method in a nonparametric bootstrap test that may overcome the problems related to small samples in hypothesis testing. The present study compared the nonparametric bootstrap test with pooled resampling against the corresponding parametric, nonparametric, and permutation tests through extensive simulations under various conditions and using real data examples. The nonparametric pooled bootstrap t-test provided equal or greater power for comparing two means than the unpaired t-test, Welch t-test, Wilcoxon rank sum test, and permutation test, while maintaining the Type I error probability for all conditions except the Cauchy and extreme variable lognormal distributions. In such cases, we suggest using an exact Wilcoxon rank sum test. The nonparametric bootstrap paired t-test also provided better performance than the other alternatives. The nonparametric bootstrap test provided a benefit over the exact Kruskal-Wallis test. We suggest using the nonparametric bootstrap test with pooled resampling for comparing paired or unpaired means and for validating one-way analysis of variance test results for non-normal data in small sample size studies. Copyright © 2017 John Wiley & Sons, Ltd.
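
    A rough sketch of a bootstrap t-test with pooled resampling for two small, non-normal samples. This illustrates the general idea only; the resampling scheme, sample sizes, and data below are assumptions rather than the exact procedure evaluated in the study.

```python
# Bootstrap t-test with pooled resampling for two small samples (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)

def t_stat(x, y):
    # Welch-type t statistic
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    return (x.mean() - y.mean()) / se

def pooled_bootstrap_test(x, y, n_boot=10000):
    observed = t_stat(x, y)
    pooled = np.concatenate([x, y])          # resample under H0: one common distribution
    count = 0
    for _ in range(n_boot):
        bx = rng.choice(pooled, size=len(x), replace=True)
        by = rng.choice(pooled, size=len(y), replace=True)
        if abs(t_stat(bx, by)) >= abs(observed):
            count += 1
    return (count + 1) / (n_boot + 1)        # two-sided p-value

# Hypothetical small, skewed (non-normal) samples
x = rng.lognormal(mean=0.0, sigma=1.0, size=8)
y = rng.lognormal(mean=0.8, sigma=1.0, size=8)
print(f"pooled bootstrap p-value: {pooled_bootstrap_test(x, y):.4f}")
```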

  10. Glaucoma Monitoring in a Clinical Setting: Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  11. Theoretical remarks on the statistics of three discriminants in Piety's automated signature analysis of PSD [Power Spectral Density] data

    International Nuclear Information System (INIS)

    Behringer, K.; Spiekerman, G.

    1984-01-01

    Piety (1977) proposed an automated signature analysis of power spectral density data. Eight statistical decision discriminants are introduced. For nearly all the discriminants, improved confidence statements can be made. The statistical characteristics of the last three discriminants, which are applications of non-parametric tests, are considered. (author)

  12. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-Miller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency …

  13. Weak Disposability in Nonparametric Production Analysis with Undesirable Outputs

    NARCIS (Netherlands)

    Kuosmanen, T.K.

    2005-01-01

    Weak disposability of outputs means that firms can abate harmful emissions by decreasing the activity level. Modeling weak disposability in nonparametric production analysis has caused some confusion.

  14. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulations results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural ...... currencies. Results show the nonparametric model generally dominates the others when evaluating in-sample. However, the semiparametric model is best for out-of-sample analysis....

  15. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the

  16. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods: A Comparison with Clinical Assessment

    Science.gov (United States)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H.; Maurits, Natasha M.

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a better insight into tremor. Typically, routine clinical assessment of accelerometry and electromyography data involves visual inspection by clinicians and occasionally computational analysis to obtain objective characteristics of tremor. However, for some tremor disorders these characteristics may be different during daily activity. This variability in presentation between the clinic and daily life makes a differential diagnosis more difficult. A long-term recording of tremor by accelerometry and/or electromyography in the home environment could help to give a better insight into the tremor disorder. However, an evaluation of such recordings using routine clinical standards would take too much time. We evaluated a range of techniques that automatically detect tremor segments in accelerometer data, as accelerometer data is more easily obtained in the home environment than electromyography data. Time can be saved if clinicians only have to evaluate the tremor characteristics of segments that have been automatically detected in longer daily activity recordings. We tested four non-parametric methods and five parametric methods on clinical accelerometer data from 14 patients with different tremor disorders. The consensus between two clinicians regarding the presence or absence of tremor on 3943 segments of accelerometer data was employed as reference. The nine methods were tested against this reference to identify their optimal parameters. Non-parametric methods generally performed better than parametric methods on our dataset when optimal parameters were used. However, one parametric method, employing the high frequency content of the tremor bandwidth under consideration
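
    The contrast between non-parametric and parametric spectral estimation can be sketched on a single accelerometer segment: a Welch periodogram versus an autoregressive (Yule-Walker) spectrum, with a crude band-power criterion for flagging tremor. The sampling rate, AR order, tremor band, and synthetic signal are assumptions for illustration, not the study's detection pipeline.

```python
# Non-parametric (Welch) vs parametric (AR, Yule-Walker) spectrum of one segment.
import numpy as np
from scipy.signal import welch
from scipy.linalg import toeplitz

fs = 100.0                                    # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
segment = np.sin(2 * np.pi * 5.0 * t) + 0.5 * rng.standard_normal(t.size)  # 5 Hz "tremor"

# Non-parametric estimate: Welch periodogram
f_w, pxx_w = welch(segment, fs=fs, nperseg=256)

# Parametric estimate: AR(p) spectrum via the Yule-Walker equations
def ar_spectrum(x, order, freqs, fs):
    x = x - x.mean()
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    a = np.linalg.solve(toeplitz(r[:order]), r[1:order + 1])   # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])                  # innovation variance
    w = 2 * np.pi * freqs / fs
    denom = np.abs(1 - np.exp(-1j * np.outer(w, np.arange(1, order + 1))) @ a) ** 2
    return sigma2 / (fs * denom)

pxx_ar = ar_spectrum(segment, order=8, freqs=f_w, fs=fs)

# Crude detector: flag tremor if spectral power concentrates in an assumed 4-12 Hz band
band = (f_w >= 4) & (f_w <= 12)
print("Welch band-power fraction:", pxx_w[band].sum() / pxx_w.sum())
print("AR    band-power fraction:", pxx_ar[band].sum() / pxx_ar.sum())
```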

  17. Introduction to nonparametric statistics for the biological sciences using R

    CERN Document Server

    MacFarland, Thomas W

    2016-01-01

    This book contains a rich set of tools for nonparametric analyses, and the purpose of this supplemental text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences: to introduce when nonparametric approaches to data analysis are appropriate; to introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test; and to introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set. The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses a…

  18. Nonparametric statistics with applications to science and engineering

    CERN Document Server

    Kvam, Paul H

    2007-01-01

    A thorough and definitive book that fully addresses traditional and modern-day topics of nonparametric statistics This book presents a practical approach to nonparametric statistical analysis and provides comprehensive coverage of both established and newly developed methods. With the use of MATLAB, the authors present information on theorems and rank tests in an applied fashion, with an emphasis on modern methods in regression and curve fitting, bootstrap confidence intervals, splines, wavelets, empirical likelihood, and goodness-of-fit testing. Nonparametric Statistics with Applications to Science and Engineering begins with succinct coverage of basic results for order statistics, methods of categorical data analysis, nonparametric regression, and curve fitting methods. The authors then focus on nonparametric procedures that are becoming more relevant to engineering researchers and practitioners. The important fundamental materials needed to effectively learn and apply the discussed methods are also provide...

  19. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    Science.gov (United States)

    Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.

  20. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb … results, including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used …

  1. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    "Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference." (Eugenia Stoimenova, Journal of Applied Statistics, June 2012) "… one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. … a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students." (Biometrics, 67, September 2011) This excellently presente…

  2. Bayesian nonparametric estimation of continuous monotone functions with applications to dose-response analysis.

    Science.gov (United States)

    Bornkamp, Björn; Ickstadt, Katja

    2009-03-01

    In this article, we consider monotone nonparametric regression in a Bayesian framework. The monotone function is modeled as a mixture of shifted and scaled parametric probability distribution functions, and a general random probability measure is assumed as the prior for the mixing distribution. We investigate the choice of the underlying parametric distribution function and find that the two-sided power distribution function is well suited both from a computational and mathematical point of view. The model is motivated by traditional nonlinear models for dose-response analysis, and provides possibilities to elicit informative prior distributions on different aspects of the curve. The method is compared with other recent approaches to monotone nonparametric regression in a simulation study and is illustrated on a data set from dose-response analysis.

  3. Driving Style Analysis Using Primitive Driving Patterns With Bayesian Nonparametric Approaches

    OpenAIRE

    Wang, Wenshuo; Xi, Junqiang; Zhao, Ding

    2017-01-01

    Analysis and recognition of driving styles are profoundly important to intelligent transportation and vehicle calibration. This paper presents a novel driving style analysis framework using the primitive driving patterns learned from naturalistic driving data. In order to achieve this, first, a Bayesian nonparametric learning method based on a hidden semi-Markov model (HSMM) is introduced to extract primitive driving patterns from time series driving data without prior knowledge of the number...

  4. Decision support using nonparametric statistics

    CERN Document Server

    Beatty, Warren

    2018-01-01

    This concise volume covers the nonparametric statistics topics that are most likely to be seen and used from a practical decision support perspective. While many degree programs require a course in parametric statistics, these methods are often inadequate for real-world decision making in business environments. Much of the data collected today by business executives (for example, customer satisfaction opinions) requires nonparametric statistics for valid analysis, and this book provides the reader with a set of tools that can be used to validly analyze all data, regardless of type. Through numerous examples and exercises, this book explains why nonparametric statistics will lead to better decisions and how they are used to reach a decision, with a wide array of business applications. Online resources include exercise data, spreadsheets, and solutions.

  5. Nonparametric Collective Spectral Density Estimation and Clustering

    KAUST Repository

    Maadooliat, Mehdi; Sun, Ying; Chen, Tianbo

    2017-01-01

    In this paper, we develop a method for the simultaneous estimation of spectral density functions (SDFs) for a collection of stationary time series that share some common features. Due to the similarities among the SDFs, the log-SDF can be represented using a common set of basis functions. The basis shared by the collection of the log-SDFs is estimated as a low-dimensional manifold of a large space spanned by a pre-specified rich basis. A collective estimation approach pools information and borrows strength across the SDFs to achieve better estimation efficiency. Also, each estimated spectral density has a concise representation using the coefficients of the basis expansion, and these coefficients can be used for visualization, clustering, and classification purposes. The Whittle pseudo-maximum likelihood approach is used to fit the model and an alternating blockwise Newton-type algorithm is developed for the computation. A web-based shiny App found at

  6. Nonparametric Collective Spectral Density Estimation and Clustering

    KAUST Repository

    Maadooliat, Mehdi

    2017-04-12

    In this paper, we develop a method for the simultaneous estimation of spectral density functions (SDFs) for a collection of stationary time series that share some common features. Due to the similarities among the SDFs, the log-SDF can be represented using a common set of basis functions. The basis shared by the collection of the log-SDFs is estimated as a low-dimensional manifold of a large space spanned by a pre-specified rich basis. A collective estimation approach pools information and borrows strength across the SDFs to achieve better estimation efficiency. Also, each estimated spectral density has a concise representation using the coefficients of the basis expansion, and these coefficients can be used for visualization, clustering, and classification purposes. The Whittle pseudo-maximum likelihood approach is used to fit the model and an alternating blockwise Newton-type algorithm is developed for the computation. A web-based shiny App found at
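
    A rough sketch of the Whittle pseudo-likelihood idea for a single series: the log spectral density is expanded in a small fixed basis and the basis coefficients are fitted by minimizing the Whittle criterion. The cosine basis, its size, and the generic optimizer are assumptions; the paper's collective estimation across several series and its blockwise Newton-type algorithm are not reproduced here.

```python
# Whittle pseudo-likelihood fit of a log spectral density in a small cosine basis.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n = 512
x = np.convolve(rng.standard_normal(n + 3), [1.0, 0.6, 0.3, 0.1], mode="valid")  # toy MA(3)

# Periodogram at positive Fourier frequencies (0 and Nyquist excluded)
freqs = np.fft.rfftfreq(n)[1:-1]
periodogram = (np.abs(np.fft.rfft(x - x.mean())) ** 2 / n)[1:-1]

# log f(omega; beta) = sum_j beta_j * cos(2*pi*j*omega), j = 0..K
K = 6
basis = np.cos(2 * np.pi * np.outer(freqs, np.arange(K + 1)))   # (n_freq, K+1)

def neg_whittle(beta):
    # Whittle criterion: sum of log f + I/f over Fourier frequencies
    log_f = basis @ beta
    return np.sum(log_f + periodogram * np.exp(-log_f))

fit = minimize(neg_whittle, np.zeros(K + 1), method="BFGS")
sdf_hat = np.exp(basis @ fit.x)     # estimated spectral density (up to scale convention)
print("fitted basis coefficients:", np.round(fit.x, 3))
```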

  7. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for the assessment of agricultural efficiency and to apply them to Lithuanian family farms. In particular, we focus on three objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend … to the Multi-Directional Efficiency Analysis approach. When the proposed models were employed to analyse empirical data of Lithuanian family farm performance, we saw substantial differences in efficiencies associated with different inputs. In particular, assets appeared to be the least efficiently used input … relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques …

  8. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    Science.gov (United States)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
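
    A simplified sketch of a functional Nadaraya-Watson kernel estimator of the kind used in NPFDA: today's daily curve predicts tomorrow's, with an L2 semi-metric between curves and an asymmetrical quadratic kernel. The synthetic curves, the bandwidth rule, and the semi-metric are illustrative assumptions, not the study's configuration.

```python
# Functional Nadaraya-Watson kernel regression on daily curves (illustrative sketch).
import numpy as np

rng = np.random.default_rng(4)
n_days, n_hours = 200, 24
base = 25 + 5 * np.sin(2 * np.pi * np.arange(n_hours) / 24)
curves = base + rng.normal(0, 1.5, size=(n_days, n_hours))      # hypothetical daily curves

X, Y = curves[:-1], curves[1:]                                   # predict next-day curve

def l2_semimetric(a, b):
    return np.sqrt(np.mean((a - b) ** 2, axis=-1))

def quadratic_kernel(u):
    # asymmetrical quadratic kernel: support on [0, 1] only
    return np.where((u >= 0) & (u <= 1), 1 - u ** 2, 0.0)

def predict(x_new, X, Y, bandwidth):
    w = quadratic_kernel(l2_semimetric(X, x_new) / bandwidth)
    if w.sum() == 0:
        return Y.mean(axis=0)
    return (w[:, None] * Y).sum(axis=0) / w.sum()

h = np.quantile(l2_semimetric(X, curves[-1]), 0.2)               # crude bandwidth choice
forecast = predict(curves[-1], X, Y, bandwidth=h)
print("forecast for hours 0-5:", np.round(forecast[:6], 2))
```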

  9. A Structural Labor Supply Model with Nonparametric Preferences

    NARCIS (Netherlands)

    van Soest, A.H.O.; Das, J.W.M.; Gong, X.

    2000-01-01

    Nonparametric techniques are usually seen as a statistical device for data description and exploration, and not as a tool for estimating models with a richer economic structure, which are often required for policy analysis. This paper presents an example where nonparametric flexibility can be attained

  10. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    … the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray … time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health …

  11. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required

  12. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    Science.gov (United States)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations as well as real COS data, with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and provides analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width half-max, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties, and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user

  13. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...

  14. Data analysis with small samples and non-normal data nonparametrics and other strategies

    CERN Document Server

    Siebert, Carl F

    2017-01-01

    Written in everyday language for non-statisticians, this book provides all the information needed to successfully conduct nonparametric analyses. This ideal reference book provides step-by-step instructions to lead the reader through each analysis, screenshots of the software and output, and case scenarios to illustrate all of the analytic techniques.

  15. Resolving Nonstationary Spectral Information in Wind Speed Time Series Using the Hilbert-Huang Transform

    DEFF Research Database (Denmark)

    Vincent, Claire Louise; Giebel, Gregor; Pinson, Pierre

    2010-01-01

    … a 4-yr time series of 10-min wind speed observations. An adaptive spectral analysis method called the Hilbert–Huang transform is chosen for the analysis, because the nonstationarity of time series of wind speed observations means that they are not well described by a global spectral analysis method … such as the Fourier transform. The Hilbert–Huang transform is a local method based on a nonparametric and empirical decomposition of the data, followed by calculation of instantaneous amplitudes and frequencies using the Hilbert transform. The Hilbert–Huang transformed 4-yr time series is averaged and summarized …
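
    The Hilbert step of the Hilbert-Huang transform can be sketched as follows: given one intrinsic mode function (IMF), the analytic signal yields instantaneous amplitude and frequency. The empirical mode decomposition that would normally produce the IMF is assumed to have been done already; a synthetic diurnal oscillation stands in for an IMF of the 10-min wind speed series.

```python
# Instantaneous amplitude and frequency of one assumed IMF via the Hilbert transform.
import numpy as np
from scipy.signal import hilbert

fs = 1.0 / 600.0                          # 10-min samples expressed in Hz
t = np.arange(0, 6 * 24 * 3600, 600.0)    # six days of 10-min samples (seconds)
imf = np.sin(2 * np.pi * t / 86400.0)     # stand-in IMF with a diurnal oscillation

analytic = hilbert(imf)
amplitude = np.abs(analytic)                              # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs             # instantaneous frequency (Hz)

print("mean instantaneous period (h):", 1 / np.mean(inst_freq) / 3600)
```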

  16. Parametric and Nonparametric EEG Analysis for the Evaluation of EEG Activity in Young Children with Controlled Epilepsy

    Directory of Open Access Journals (Sweden)

    Vangelis Sakkalis

    2008-01-01

    There is important evidence of differences in the EEG frequency spectrum of control subjects as compared to epileptic subjects. In particular, the study of children presents difficulties due to the early stages of brain development and the various forms of epilepsy indications. In this study, we consider children who developed epileptic crises in the past but without any other clinical, psychological, or visible neurophysiological findings. The aim of the paper is to develop reliable techniques for testing whether such controlled epilepsy induces related spectral differences in the EEG. Spectral features extracted using nonparametric signal representation techniques (Fourier and wavelet transforms) and a parametric signal modeling technique (ARMA) are compared, and their effect on the classification of the two groups is analyzed. The subjects performed two different tasks: a control (rest) task and a relatively difficult math task. The results show that spectral features extracted by modeling the EEG signals recorded from individual channels by an ARMA model give a higher discrimination between the two subject groups for the control task, where classification scores of up to 100% were obtained with a linear discriminant classifier.
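
    One possible version of the non-parametric side of this pipeline is sketched below: Welch band powers per recording fed to a linear discriminant classifier with cross-validation. The synthetic EEG, the frequency bands, and the group labels are placeholders, not the study's recordings or protocol.

```python
# Welch band-power features plus a linear discriminant classifier (illustrative sketch).
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
fs, n_subjects, n_samples = 256, 40, 1024
labels = np.repeat([0, 1], n_subjects // 2)     # 0 = control, 1 = controlled epilepsy
eeg = rng.standard_normal((n_subjects, n_samples))
eeg[labels == 1] += 0.4 * np.sin(2 * np.pi * 6 * np.arange(n_samples) / fs)  # extra theta power

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal):
    f, pxx = welch(signal, fs=fs, nperseg=256)
    return [pxx[(f >= lo) & (f < hi)].sum() for lo, hi in bands.values()]

features = np.array([band_powers(s) for s in eeg])
scores = cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```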

  17. Scale-Free Nonparametric Factor Analysis: A User-Friendly Introduction with Concrete Heuristic Examples.

    Science.gov (United States)

    Mittag, Kathleen Cage

    Most researchers using factor analysis extract factors from a matrix of Pearson product-moment correlation coefficients. A method is presented for extracting factors in a non-parametric way, by extracting factors from a matrix of Spearman rho (rank correlation) coefficients. It is possible to factor analyze a matrix of association such that…
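
    A small sketch of the idea: compute a Spearman rank correlation matrix and extract factors from it by eigendecomposition, in the style of a principal-axis solution. The synthetic data, the number of variables, and the eigenvalue-greater-than-one retention rule are assumptions for illustration.

```python
# Factor extraction from a Spearman (rank) correlation matrix.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(6)
n, p = 300, 6
latent = rng.standard_normal(n)
data = np.column_stack([latent + rng.standard_normal(n) for _ in range(p)])  # one common factor

rho, _ = spearmanr(data)                      # p x p Spearman correlation matrix
eigvals, eigvecs = np.linalg.eigh(rho)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int(np.sum(eigvals > 1.0))        # Kaiser-style retention rule (assumed)
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print("retained factors:", n_factors)
print("loadings on factor 1:", np.round(loadings[:, 0], 2))
```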

  18. A nonparametric mixture model for cure rate estimation.

    Science.gov (United States)

    Peng, Y; Dear, K B

    2000-03-01

    Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.

  19. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  20. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  1. Spectral and cross-spectral analysis of uneven time series with the smoothed Lomb-Scargle periodogram and Monte Carlo evaluation of statistical significance

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.

    2012-12-01

    Many spectral analysis techniques have been designed assuming sequences taken with a constant sampling interval. However, there are empirical time series in the geosciences (sediment cores, fossil abundance data, isotope analysis, …) that do not follow regular sampling because of missing data, gapped data, random sampling or incomplete sequences, among other reasons. In general, interpolating an uneven series in order to obtain a succession with a constant sampling interval alters the spectral content of the series. In such cases it is preferable to follow an approach that works with the uneven data directly, avoiding the need for an explicit interpolation step. The Lomb-Scargle periodogram is a popular choice in such circumstances, as there are programs available in the public domain for its computation. One new computer program for spectral analysis improves the standard Lomb-Scargle periodogram approach in two ways: (1) It explicitly adjusts the statistical significance to any bias introduced by variance reduction smoothing, and (2) it uses a permutation test to evaluate confidence levels, which is better suited than parametric methods when neighbouring frequencies are highly correlated. Another novel program for cross-spectral analysis offers the advantage of estimating the Lomb-Scargle cross-periodogram of two uneven time series defined on the same interval, and it evaluates the confidence levels of the estimated cross-spectra by a non-parametric computer-intensive permutation test. Thus, the cross-spectrum, the squared coherence spectrum, the phase spectrum, and the Monte Carlo statistical significance of the cross-spectrum and the squared-coherence spectrum can be obtained. Both of the programs are written in ANSI Fortran 77, in view of its simplicity and compatibility. The program code is in the public domain, provided on the website of the journal (http://www.iamg.org/index.php/publisher/articleview/frmArticleID/112/). Different examples (with simulated and
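
    A bare-bones sketch of the underlying approach: a Lomb-Scargle periodogram evaluated directly on unevenly sampled data, with a simple permutation test for the significance of the dominant peak. The synthetic series, the frequency grid, and the number of permutations are assumptions; the smoothing and bias adjustment of the published Fortran programs are not reproduced.

```python
# Lomb-Scargle periodogram of uneven data plus a permutation test for the main peak.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 100, size=200))               # uneven sampling times
y = np.sin(2 * np.pi * t / 7.0) + rng.normal(0, 0.8, t.size)
y = y - y.mean()

freqs = np.linspace(0.01, 1.0, 500) * 2 * np.pi          # angular frequencies
pgram = lombscargle(t, y, freqs)
peak = pgram.max()

# Permutation test: shuffle the observations over the fixed sampling times
n_perm = 500
exceed = sum(lombscargle(t, rng.permutation(y), freqs).max() >= peak
             for _ in range(n_perm))
p_value = (exceed + 1) / (n_perm + 1)
print(f"peak period: {2 * np.pi / freqs[pgram.argmax()]:.2f}, p-value: {p_value:.3f}")
```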

  2. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs. The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test...

  3. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  4. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) between June 11–16 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  5. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  6. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    Science.gov (United States)

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  7. Hyperspectral image segmentation using a cooperative nonparametric approach

    Science.gov (United States)

    Taher, Akar; Chehdi, Kacem; Cariou, Claude

    2013-10-01

    In this paper a new unsupervised nonparametric cooperative and adaptive hyperspectral image segmentation approach is presented. The hyperspectral images are partitioned band by band in parallel and intermediate classification results are evaluated and fused to get the final segmentation result. Two unsupervised nonparametric segmentation methods are used in parallel cooperation, namely the Fuzzy C-means (FCM) method and the Linde-Buzo-Gray (LBG) algorithm, to segment each band of the image. The originality of the approach lies firstly in its local adaptation to the type of regions in an image (textured, non-textured), and secondly in the introduction of several levels of evaluation and validation of intermediate segmentation results before obtaining the final partitioning of the image. For the management of similar or conflicting results issued from the two classification methods, we gradually introduced various assessment steps that exploit the information of each spectral band and its adjacent bands, and finally the information of all the spectral bands. In our approach, the detected textured and non-textured regions are treated separately from the feature extraction step up to the final classification results. This approach was first evaluated on a large number of monocomponent images constructed from the Brodatz album. Then it was evaluated on two real applications using, respectively, a multispectral image for cedar tree detection in the region of Baabdat (Lebanon) and a hyperspectral image for identification of invasive and non-invasive vegetation in the region of Cieza (Spain). The correct classification rate (CCR) for the first application is over 97% and for the second application the average correct classification rate (ACCR) is over 99%.

  8. NParCov3: A SAS/IML Macro for Nonparametric Randomization-Based Analysis of Covariance

    Directory of Open Access Journals (Sweden)

    Richard C. Zink

    2012-07-01

    Analysis of covariance serves two important purposes in a randomized clinical trial. First, there is a reduction of variance for the treatment effect, which provides more powerful statistical tests and more precise confidence intervals. Second, it provides estimates of the treatment effect which are adjusted for random imbalances of covariates between the treatment groups. The nonparametric analysis of covariance method of Koch, Tangen, Jung, and Amara (1998) defines a very general methodology using weighted least squares to generate covariate-adjusted treatment effects with minimal assumptions. This methodology is general in its applicability to a variety of outcomes, whether continuous, binary, ordinal, incidence density or time-to-event. Further, its use has been illustrated in many clinical trial settings, such as multi-center, dose-response and non-inferiority trials. NParCov3 is a SAS/IML macro written to conduct the nonparametric randomization-based covariance analyses of Koch et al. (1998). The software can analyze a variety of outcomes and can account for stratification. Data from multiple clinical trials will be used for illustration.

  9. Discrete non-parametric kernel estimation for global sensitivity analysis

    International Nuclear Information System (INIS)

    Senga Kiessé, Tristan; Ventura, Anne

    2016-01-01

    This work investigates the discrete kernel approach for evaluating the contribution of the variance of discrete input variables to the variance of model output, via analysis of variance (ANOVA) decomposition. Until recently only the continuous kernel approach has been applied as a metamodeling approach within the sensitivity analysis framework, for both discrete and continuous input variables. Now the discrete kernel estimation is known to be suitable for smoothing discrete functions. We present a discrete non-parametric kernel estimator of the ANOVA decomposition of a given model. An estimator of sensitivity indices is also presented with its asymptotic convergence rate. Some simulations on a test function analysis and a real case study from agriculture have shown that the discrete kernel approach outperforms the continuous kernel one for evaluating the contribution of moderate or most influential discrete parameters to the model output. - Highlights: • We study a discrete kernel estimation for sensitivity analysis of a model. • A discrete kernel estimator of ANOVA decomposition of the model is presented. • Sensitivity indices are calculated for discrete input parameters. • An estimator of sensitivity indices is also presented with its convergence rate. • An application is realized for improving the reliability of environmental models.

  10. Application of nonparametric statistic method for DNBR limit calculation

    International Nuclear Information System (INIS)

    Dong Bo; Kuang Bo; Zhu Xuenong

    2013-01-01

    Background: Nonparametric statistical methods are a class of statistical inference methods that do not depend on a particular distribution; they calculate tolerance limits at a given probability level and confidence through sampling. The DNBR margin is an important parameter of NPP design that indicates the safety level of the NPP. Purpose and Methods: This paper uses a nonparametric statistical method based on the Wilks formula and the VIPER-01 subchannel analysis code to calculate the DNBR design limit (DL) of a 300 MW NPP (Nuclear Power Plant) during a complete loss-of-flow accident, and compares it with the DNBR DL obtained by means of ITDP to quantify the DNBR margin. Results: The results indicate that this method can gain 2.96% more DNBR margin than that obtained by the ITDP methodology. Conclusions: Because of the reduced conservatism in the analysis process, the nonparametric statistical method can provide a greater DNBR margin, and the increased DNBR margin benefits the upgrading of the core refuelling scheme. (authors)
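
    The core of the Wilks-formula approach can be shown in a few lines: choose the smallest number of code runs N such that, with 95% confidence, at least 95% of the DNBR population lies above the smallest sampled value (first-order, one-sided). The DNBR values below are random placeholders, not VIPER-01 output.

```python
# First-order, one-sided Wilks (95/95) tolerance limit for sampled minimum DNBR values.
import math
import numpy as np

def wilks_sample_size(gamma=0.95, beta=0.95):
    # smallest N with 1 - gamma**N >= beta (first order, one-sided)
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

N = wilks_sample_size()            # 59 runs for the 95/95 criterion
rng = np.random.default_rng(8)
dnbr_runs = rng.normal(loc=2.0, scale=0.15, size=N)   # hypothetical sampled minimum DNBRs
limit_9595 = dnbr_runs.min()       # one-sided lower 95/95 tolerance limit
print(f"N = {N}, 95/95 lower DNBR limit = {limit_9595:.3f}")
```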

  11. Non-Parametric Analysis of Rating Transition and Default Data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move b...

  12. Examination of Spectral Transformations on Spectral Mixture Analysis

    Science.gov (United States)

    Deng, Y.; Wu, C.

    2018-04-01

    While many spectral transformation techniques have been applied in spectral mixture analysis (SMA), few studies have examined their necessity and applicability. This paper focused on exploring the difference between spectrally transformed schemes and the untransformed scheme to find out which transformed scheme performed better in SMA. In particular, nine spectrally transformed schemes as well as the untransformed scheme were examined in two study areas. Each transformed scheme was tested 100 times using different endmember classes' spectra under the endmember model of vegetation, high-albedo impervious surface area, low-albedo impervious surface area, and soil (V-ISAh-ISAl-S). Performance of each scheme was assessed based on mean absolute error (MAE). A statistical analysis technique, the paired-samples t-test, was applied to test the significance of the difference in mean MAEs between transformed and untransformed schemes. Results demonstrated that only NSMA could exceed the untransformed scheme in all study areas. Some transformed schemes showed unstable performance since they outperformed the untransformed scheme in one area but weakened the SMA result in another region.
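
    A minimal sketch of linear spectral mixture analysis for a V-ISAh-ISAl-S endmember model: non-negative fractions with a softly enforced sum-to-one constraint, solved by non-negative least squares on an augmented system. The endmember spectra, the band count, and the weighting trick are illustrative assumptions, not the paper's SMA configuration.

```python
# Constrained linear spectral unmixing of one pixel against four endmembers.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(9)
n_bands = 6
# Columns: vegetation, high-albedo ISA, low-albedo ISA, soil (hypothetical reflectances)
E = rng.uniform(0.05, 0.6, size=(n_bands, 4))

true_fractions = np.array([0.4, 0.1, 0.3, 0.2])
pixel = E @ true_fractions + rng.normal(0, 0.005, n_bands)

def unmix(pixel, E, weight=100.0):
    # Append a heavily weighted row of ones to (softly) enforce sum-to-one fractions
    A = np.vstack([E, weight * np.ones(E.shape[1])])
    b = np.append(pixel, weight * 1.0)
    fractions, _ = nnls(A, b)
    return fractions

f_hat = unmix(pixel, E)
mae = np.mean(np.abs(f_hat - true_fractions))
print("estimated fractions:", np.round(f_hat, 3), " MAE:", round(mae, 4))
```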

  13. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray study due to complex gene expression patterns in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  14. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  15. An Extended Spectral-Spatial Classification Approach for Hyperspectral Data

    Science.gov (United States)

    Akbari, D.

    2017-11-01

    In this paper an extended classification approach for hyperspectral imagery based on both spectral and spatial information is proposed. The spatial information is obtained by an enhanced marker-based minimum spanning forest (MSF) algorithm. Three different methods of dimension reduction are first used to obtain the subspace of hyperspectral data: (1) unsupervised feature extraction methods including principal component analysis (PCA), independent component analysis (ICA), and minimum noise fraction (MNF); (2) supervised feature extraction including decision boundary feature extraction (DBFE), discriminant analysis feature extraction (DAFE), and nonparametric weighted feature extraction (NWFE); (3) genetic algorithm (GA). The spectral features obtained are then fed into the enhanced marker-based MSF classification algorithm. In the enhanced MSF algorithm, the markers are extracted from the classification maps obtained by both SVM and the watershed segmentation algorithm. To evaluate the proposed approach, the Pavia University hyperspectral data set is tested. Experimental results show that the proposed approach using GA achieves an overall accuracy approximately 8% higher than the original MSF-based algorithm.

  16. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  17. Nonparametric Change Point Diagnosis Method of Concrete Dam Crack Behavior Abnormality

    Directory of Open Access Journals (Sweden)

    Zhanchao Li

    2013-01-01

    Full Text Available The diagnosis of abnormal concrete dam crack behavior has long been a focus, and a difficulty, in the safety monitoring of hydraulic structures. Based on how such abnormality manifests itself in parametric and nonparametric statistical models, the internal relation between abnormal crack behavior and statistical change point theory is analyzed in depth, from the viewpoint of model structure instability in the parametric case and of changes in the sequence distribution law in the nonparametric case. On this basis, through the reduction of the change point problem, the establishment of a basic nonparametric change point model, and an asymptotic analysis of the test for the basic change point problem, a nonparametric change point diagnosis method for concrete dam crack behavior abnormality is developed that allows for the possibility of multiple abnormality points in practice. The method is applied to an actual project, demonstrating its effectiveness and scientific soundness. The method has a complete theoretical basis and strong practicality, with broad application prospects in actual projects.

  18. SPECTRAL ANALYSIS OF EXCHANGE RATES

    Directory of Open Access Journals (Sweden)

    ALEŠA LOTRIČ DOLINAR

    2013-06-01

    Full Text Available Using spectral analysis is very common in technical areas but rather unusual in economics and finance, where ARIMA and GARCH modeling are much more in use. To show that spectral analysis can be useful in determining hidden periodic components for high-frequency finance data as well, we use the example of foreign exchange rates
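
    For illustration, a minimal Python sketch of the kind of spectral analysis described above, applied to a synthetic daily exchange-rate series; the series, its weak five-day cycle and all parameter values are hypothetical stand-ins rather than data from the record.

      import numpy as np
      from scipy.signal import periodogram

      rng = np.random.default_rng(0)

      # Hypothetical daily exchange-rate series: random walk plus a weak 5-day cycle.
      n_days = 2000
      weekly = 0.0005 * np.sin(2 * np.pi * np.arange(n_days) / 5.0)
      rates = 1.10 * np.exp(np.cumsum(0.002 * rng.standard_normal(n_days) + weekly))
      log_returns = np.diff(np.log(rates))            # work with stationary returns

      # Nonparametric spectral estimate: periodogram with a Hann taper.
      freqs, psd = periodogram(log_returns, fs=1.0, window="hann", detrend="constant")

      # Report the strongest periodic component (frequency in cycles per day).
      peak = np.argmax(psd[1:]) + 1                   # skip the zero frequency
      print(f"dominant period ~ {1.0 / freqs[peak]:.1f} days")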

  19. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved and real applications are illustrated using examples. Theories and exercises are provided. The incorrect use of many tests applying most statistical software is highlighted and discussed.

  20. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption without theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers with a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which, together with results from simulated data, highlights the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  1. Comparing parametric and non-parametric classifiers for remote sensing of tree species across a land use gradient in a Savanna landscape

    CSIR Research Space (South Africa)

    Cho, Moses A

    2012-11-01

    Full Text Available ) and Random Forest (RF)). The spectral data used consisted of 8 WorldView-2 multispectral bands simulated from a 72-band VNIR image acquired over the study areas using the Carnegie Airborne Observatory (CAO) system. With the exception of SAM, the nonparametric...

  2. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models...

  3. Spectral-spatial classification of hyperspectral data with mutual information based segmented stacked autoencoder approach

    Science.gov (United States)

    Paul, Subir; Nagesh Kumar, D.

    2018-04-01

    Hyperspectral (HS) data comprise continuous spectral responses of hundreds of narrow spectral bands with very fine spectral resolution or bandwidth, which offer feature identification and classification with high accuracy. In the present study, a Mutual Information (MI) based Segmented Stacked Autoencoder (S-SAE) approach for spectral-spatial classification of the HS data is proposed to reduce the complexity and computational time compared to Stacked Autoencoder (SAE) based feature extraction. A non-parametric dependency measure (MI) based spectral segmentation is proposed instead of a linear and parametric dependency measure, to take care of both linear and nonlinear inter-band dependency for spectral segmentation of the HS bands. Then morphological profiles are created corresponding to the segmented spectral features to assimilate the spatial information in the spectral-spatial classification approach. Two non-parametric classifiers, Support Vector Machine (SVM) with Gaussian kernel and Random Forest (RF), are used for classification of the three most popularly used HS datasets. Results of the numerical experiments carried out in this study show that SVM with a Gaussian kernel provides better results for the Pavia University and Botswana datasets whereas RF performs better for the Indian Pines dataset.
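
    For illustration, a minimal Python sketch of a non-parametric, histogram-based mutual information measure between bands, used here with the simplest possible rule to place one segment boundary; the synthetic cube and the boundary rule are assumptions and do not reproduce the S-SAE pipeline of the record.

      import numpy as np

      rng = np.random.default_rng(1)

      def mutual_information(x, y, bins=32):
          """Histogram-based mutual information (in nats) between two band images."""
          joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      # Hypothetical 20-band cube: bands 0-9 and 10-19 follow two different spatial patterns.
      shape = (80, 80)
      group_a, group_b = rng.normal(size=shape), rng.normal(size=shape)
      cube = np.stack([g + 0.3 * rng.normal(size=shape)
                       for g in [group_a] * 10 + [group_b] * 10], axis=-1)

      # MI between adjacent bands; a drop marks a boundary between spectral segments.
      mi_adjacent = np.array([mutual_information(cube[..., b], cube[..., b + 1])
                              for b in range(cube.shape[-1] - 1)])
      boundary = np.argmin(mi_adjacent)               # simplest possible segmentation rule
      print(f"lowest inter-band MI between bands {boundary} and {boundary + 1}")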

  4. Genomic outlier profile analysis: mixture models, null hypotheses, and nonparametric estimation.

    Science.gov (United States)

    Ghosh, Debashis; Chinnaiyan, Arul M

    2009-01-01

    In most analyses of large-scale genomic data sets, differential expression analysis is typically assessed by testing for differences in the mean of the distributions between 2 groups. A recent finding by Tomlins and others (2005) is of a different type of pattern of differential expression in which a fraction of samples in one group have overexpression relative to samples in the other group. In this work, we describe a general mixture model framework for the assessment of this type of expression, called outlier profile analysis. We start by considering the single-gene situation and establishing results on identifiability. We propose 2 nonparametric estimation procedures that have natural links to familiar multiple testing procedures. We then develop multivariate extensions of this methodology to handle genome-wide measurements. The proposed methodologies are compared using simulation studies as well as data from a prostate cancer gene expression study.

  5. SPAM- SPECTRAL ANALYSIS MANAGER (UNIX VERSION)

    Science.gov (United States)

    Solomon, J. E.

    1994-01-01

    The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a timesequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different

  6. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  7. Nonparametric Identification and Estimation of Finite Mixture Models of Dynamic Discrete Choices

    OpenAIRE

    Hiroyuki Kasahara; Katsumi Shimotsu

    2006-01-01

    In dynamic discrete choice analysis, controlling for unobserved heterogeneity is an important issue, and finite mixture models provide flexible ways to account for unobserved heterogeneity. This paper studies nonparametric identifiability of type probabilities and type-specific component distributions in finite mixture models of dynamic discrete choices. We derive sufficient conditions for nonparametric identification for various finite mixture models of dynamic discrete choices used in appli...

  8. Theory of nonparametric tests

    CERN Document Server

    Dickhaus, Thorsten

    2018-01-01

    This textbook provides a self-contained presentation of the main concepts and methods of nonparametric statistical testing, with a particular focus on the theoretical foundations of goodness-of-fit tests, rank tests, resampling tests, and projection tests. The substitution principle is employed as a unified approach to the nonparametric test problems discussed. In addition to mathematical theory, it also includes numerous examples and computer implementations. The book is intended for advanced undergraduate, graduate, and postdoc students as well as young researchers. Readers should be familiar with the basic concepts of mathematical statistics typically covered in introductory statistics courses.

  9. Single versus mixture Weibull distributions for nonparametric satellite reliability

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Long recognized as a critical design attribute for space systems, satellite reliability has not yet received the proper attention, as only limited on-orbit failure data and statistical analyses can be found in the technical literature. To fill this gap, we recently conducted a nonparametric analysis of satellite reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we provide an advanced parametric fit, based on a mixture of Weibull distributions, and compare it with the single Weibull distribution model obtained with the Maximum Likelihood Estimation (MLE) method. We demonstrate that both parametric fits are good approximations of the nonparametric satellite reliability, but that the mixture Weibull distribution is significantly more accurate in capturing all the failure trends in the failure data, as evidenced by the analysis of the residuals and their quasi-normal dispersion.
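
    For illustration, a minimal Python sketch contrasting a single-Weibull maximum likelihood fit with the empirical (nonparametric) reliability on a made-up, uncensored sample of failure times; the record's mixture-of-Weibulls fit and its handling of censored satellites are not reproduced.

      import numpy as np
      from scipy.stats import weibull_min

      # Hypothetical satellite failure times in years (complete, uncensored sample).
      t_fail = np.array([0.3, 1.2, 2.5, 3.1, 4.8, 6.0, 7.7, 9.4, 11.2, 14.5])

      # Parametric fit: single Weibull by maximum likelihood, location fixed at 0.
      shape, loc, scale = weibull_min.fit(t_fail, floc=0)
      print(f"Weibull MLE: shape={shape:.3f}, scale={scale:.3f} years")

      # Nonparametric reliability: empirical survival function at the failure times.
      t_sorted = np.sort(t_fail)
      emp_rel = 1.0 - np.arange(1, len(t_sorted) + 1) / len(t_sorted)

      # Compare the two estimates on the observed grid.
      for t, r_np in zip(t_sorted, emp_rel):
          r_w = weibull_min.sf(t, shape, loc=0, scale=scale)
          print(f"t={t:5.1f}  nonparametric R={r_np:.2f}  Weibull R={r_w:.2f}")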

  10. SpectralNET – an application for spectral graph analysis and visualization

    Directory of Open Access Journals (Sweden)

    Schreiber Stuart L

    2005-10-01

    Full Text Available Abstract Background Graph theory provides a computational framework for modeling a variety of datasets including those emerging from genomics, proteomics, and chemical genetics. Networks of genes, proteins, small molecules, or other objects of study can be represented as graphs of nodes (vertices) and interactions (edges) that can carry different weights. SpectralNET is a flexible application for analyzing and visualizing these biological and chemical networks. Results Available both as a standalone .NET executable and as an ASP.NET web application, SpectralNET was designed specifically with the analysis of graph-theoretic metrics in mind, a computational task not easily accessible using currently available applications. Users can choose either to upload a network for analysis using a variety of input formats, or to have SpectralNET generate an idealized random network for comparison to a real-world dataset. Whichever graph-generation method is used, SpectralNET displays detailed information about each connected component of the graph, including graphs of degree distribution, clustering coefficient by degree, and average distance by degree. In addition, extensive information about the selected vertex is shown, including degree, clustering coefficient, various distance metrics, and the corresponding components of the adjacency, Laplacian, and normalized Laplacian eigenvectors. SpectralNET also displays several graph visualizations, including a linear dimensionality reduction for uploaded datasets (Principal Components Analysis) and a non-linear dimensionality reduction that provides an elegant view of global graph structure (Laplacian eigenvectors). Conclusion SpectralNET provides an easily accessible means of analyzing graph-theoretic metrics for data modeling and dimensionality reduction. SpectralNET is publicly available as both a .NET application and an ASP.NET web application from http://chembank.broad.harvard.edu/resources/. Source code is

  11. Nonparametric Transfer Function Models

    Science.gov (United States)

    Liu, Jun M.; Chen, Rong; Yao, Qiwei

    2009-01-01

    In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584

  12. Spectral analysis by correlation

    International Nuclear Information System (INIS)

    Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G.

    1969-01-01

    The spectral density of a signal, which represents its power distribution along the frequency axis, is a function which is of great importance, finding many uses in all fields concerned with the processing of the signal (process identification, vibrational analysis, etc...). Amongst all the possible methods for calculating this function, the correlation method (correlation function calculation + Fourier transformation) is the most promising, mainly because of its simplicity and of the results it yields. The study carried out here will lead to the construction of an apparatus which, coupled with a correlator, will constitute a set of equipment for spectral analysis in real time covering the frequency range 0 to 5 MHz. (author) [fr
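
    For illustration, a minimal Python sketch of the correlation method described above (autocorrelation estimate, lag window, then Fourier transform); the Bartlett window, lag cut-off and scaling are illustrative choices, not those of the apparatus.

      import numpy as np

      def psd_by_correlation(x, fs, max_lag):
          """Blackman-Tukey style estimate: autocorrelation -> lag window -> FFT."""
          x = np.asarray(x, dtype=float) - np.mean(x)
          n = len(x)
          # Biased autocorrelation estimate for lags 0..max_lag.
          full = np.correlate(x, x, mode="full") / n
          acf = full[n - 1:n + max_lag]
          # Bartlett (triangular) lag window to reduce variance.
          window = 1.0 - np.arange(max_lag + 1) / (max_lag + 1)
          acf_w = acf * window
          # Even extension of the windowed ACF; its FFT is the spectral density estimate.
          acf_sym = np.concatenate([acf_w, acf_w[-2:0:-1]])
          psd = np.abs(np.fft.rfft(acf_sym)) / fs
          freqs = np.fft.rfftfreq(len(acf_sym), d=1.0 / fs)
          return freqs, psd

      # Hypothetical example: a 50 Hz component buried in noise, sampled at 1 kHz.
      rng = np.random.default_rng(2)
      fs = 1000.0
      t = np.arange(0, 2.0, 1.0 / fs)
      signal = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)
      freqs, psd = psd_by_correlation(signal, fs, max_lag=256)
      print("spectral peak near", freqs[np.argmax(psd)], "Hz")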

  13. Trend Analysis of Pahang River Using Non-Parametric Analysis: Mann-Kendall's Trend Test

    International Nuclear Information System (INIS)

    Nur Hishaam Sulaiman; Mohd Khairul Amri Kamarudin; Mohd Khairul Amri Kamarudin; Ahmad Dasuki Mustafa; Muhammad Azizi Amran; Fazureen Azaman; Ismail Zainal Abidin; Norsyuhada Hairoma

    2015-01-01

    Floods are common in Pahang, especially during the northeast monsoon season from November to February. Three river cross-section stations, Lubuk Paku, Sg. Yap and Temerloh, were selected as the study area. The stream flow and water level data were gathered from DID records. The data sets for this study were analysed using a non-parametric method, the Mann-Kendall trend test. The results obtained from the stream flow and water level analysis indicate significant positive trends for Lubuk Paku (0.001) and Sg. Yap (<0.0001) from 1972-2011, with p-values < 0.05. The Temerloh (0.178) data from 1963-2011 showed no trend for the stream flow parameter but a negative trend for the water level parameter. Hydrological patterns and trends are strongly affected by external factors such as the northeast monsoon season, which develops over the South China Sea and affects Pahang from November to March. Other factors, such as development and management of the areas, may also have affected the data and the results. Hydrological patterns are important indicators of river trends such as stream flow and water level, and can be used by local authorities for flood mitigation. (author)
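
    For illustration, a minimal Python sketch of the Mann-Kendall trend test on a made-up annual stream-flow series; ties are ignored, so the variance term omits the usual tie correction.

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          """Mann-Kendall trend test: S statistic, Z score and two-sided p-value."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          # S = number of increasing pairs minus number of decreasing pairs.
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0    # variance of S, no tie correction
          if s > 0:
              z = (s - 1) / np.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / np.sqrt(var_s)
          else:
              z = 0.0
          p = 2.0 * (1.0 - norm.cdf(abs(z)))
          return s, z, p

      # Hypothetical annual mean stream-flow record (m^3/s).
      flow = np.array([112, 118, 109, 125, 131, 128, 140, 137, 150, 148, 161, 159])
      s, z, p = mann_kendall(flow)
      print(f"S={s:.0f}, Z={z:.2f}, p={p:.4f}")
      if p < 0.05:
          print("significant", "positive" if z > 0 else "negative", "trend at the 5% level")
      else:
          print("no significant trend at the 5% level")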

  14. Nonparametric functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yang, Jie; Wu, Rongling; Casella, George

    2009-03-01

    Functional mapping is a useful tool for mapping quantitative trait loci (QTL) that control dynamic traits. It incorporates mathematical aspects of biological processes into the mixture model-based likelihood setting for QTL mapping, thus increasing the power of QTL detection and the precision of parameter estimation. However, in many situations there is no obvious functional form and, in such cases, this strategy will not be optimal. Here we propose to use nonparametric function estimation, typically implemented with B-splines, to estimate the underlying functional form of phenotypic trajectories, and then construct a nonparametric test to find evidence of existing QTL. Using the representation of a nonparametric regression as a mixed model, the final test statistic is a likelihood ratio test. We consider two types of genetic maps: dense maps and general maps, and the power of nonparametric functional mapping is investigated through simulation studies and demonstrated by examples.
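
    For illustration, a minimal Python sketch of the nonparametric building block, a least-squares cubic B-spline fit to a simulated phenotype trajectory; the knot positions and the logistic-shaped "true" curve are assumptions, and the mixed-model likelihood-ratio test of the record is not reproduced.

      import numpy as np
      from scipy.interpolate import make_lsq_spline

      rng = np.random.default_rng(3)

      # Hypothetical growth trajectory: measurement times and noisy phenotype values.
      t = np.linspace(0, 20, 15)
      y = 10.0 / (1.0 + np.exp(-(t - 8.0) / 2.0)) + rng.normal(0.0, 0.3, t.size)

      # Cubic B-spline with a few interior knots (knot choice is illustrative).
      k = 3
      interior = np.array([5.0, 10.0, 15.0])
      knots = np.concatenate(([t[0]] * (k + 1), interior, [t[-1]] * (k + 1)))
      spline = make_lsq_spline(t, y, knots, k=k)

      # Fitted curve on a fine grid; residual sum of squares as a rough fit check.
      grid = np.linspace(t[0], t[-1], 200)
      fitted = spline(grid)
      rss = float(np.sum((y - spline(t)) ** 2))
      print(f"residual sum of squares: {rss:.3f}")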

  15. A Bayesian approach to the analysis of quantal bioassay studies using nonparametric mixture models.

    Science.gov (United States)

    Fronczyk, Kassandra; Kottas, Athanasios

    2014-03-01

    We develop a Bayesian nonparametric mixture modeling framework for quantal bioassay settings. The approach is built upon modeling dose-dependent response distributions. We adopt a structured nonparametric prior mixture model, which induces a monotonicity restriction for the dose-response curve. Particular emphasis is placed on the key risk assessment goal of calibration for the dose level that corresponds to a specified response. The proposed methodology yields flexible inference for the dose-response relationship as well as for other inferential objectives, as illustrated with two data sets from the literature. © 2013, The International Biometric Society.

  16. Performances of non-parametric statistics in sensitivity analysis and parameter ranking

    International Nuclear Information System (INIS)

    Saltelli, A.

    1987-01-01

    Twelve parametric and non-parametric sensitivity analysis techniques are compared in the case of non-linear model responses. The test models used are taken from the long-term risk analysis for the disposal of high-level radioactive waste in a geological formation. They describe the transport of radionuclides through a set of engineered and natural barriers from the repository to the biosphere and to man. The output data from these models are the dose rates affecting the maximum exposed individual of a critical group at a given point in time. All the techniques are applied to the output from the same Monte Carlo simulations, where a modified version of the Latin Hypercube method is used for the sample selection. Hypothesis testing is systematically applied to quantify the degree of confidence in the results given by the various sensitivity estimators. The estimators are ranked according to their robustness and stability, on the basis of two test cases. The conclusion is that no estimator can be considered the best from all points of view, and the use of more than one estimator in sensitivity analysis is recommended.
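
    For illustration, a minimal Python sketch of one of the simplest non-parametric estimators of the kind compared above: Spearman rank correlations, with p-values for hypothesis testing, between sampled model inputs and the model output; the toy model, the parameter names and the plain random design (standing in for the modified Latin Hypercube sample) are all assumptions.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(4)

      # Hypothetical Monte Carlo design: 3 uncertain inputs, 500 model runs.
      n = 500
      X = np.column_stack([rng.uniform(0.1, 1.0, n),    # e.g., sorption coefficient
                           rng.uniform(1.0, 10.0, n),   # e.g., barrier thickness
                           rng.uniform(0.0, 1.0, n)])   # e.g., infiltration rate

      # Hypothetical nonlinear response standing in for the dose-rate model.
      y = np.exp(-X[:, 1] * X[:, 0]) * (1.0 + 5.0 * X[:, 2] ** 2)

      # Rank-based sensitivity measure with a significance test per parameter.
      for j, name in enumerate(["sorption", "thickness", "infiltration"]):
          rho, pval = spearmanr(X[:, j], y)
          print(f"{name:12s} Spearman rho = {rho:+.3f}   p = {pval:.3g}")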

  17. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.

  18. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

    Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data that include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet Process Mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences and about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.

  19. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated... in case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be interpreted with caution, given the complexity of human longevity....

  20. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
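
    For illustration, a minimal Python sketch of a non-parametric 1D bootstrap band of the kind discussed above, applied to synthetic force-like trajectories; it produces a pointwise percentile band, whereas the 1D CIs examined in the paper are constructed over the whole trajectory.

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical data: 20 subjects x 101 time nodes (e.g., stance-normalized force).
      n_subj, n_nodes = 20, 101
      base = np.sin(np.linspace(0, np.pi, n_nodes))
      trajectories = (base * rng.uniform(0.8, 1.2, (n_subj, 1))
                      + 0.05 * rng.standard_normal((n_subj, n_nodes)))

      # Bootstrap the subject-mean trajectory by resampling subjects with replacement.
      n_boot = 10000
      boot_means = np.empty((n_boot, n_nodes))
      for b in range(n_boot):
          idx = rng.integers(0, n_subj, size=n_subj)
          boot_means[b] = trajectories[idx].mean(axis=0)

      # Pointwise 95% percentile band around the sample mean trajectory.
      lower = np.percentile(boot_means, 2.5, axis=0)
      upper = np.percentile(boot_means, 97.5, axis=0)
      print("band half-width at mid-trajectory:", (upper - lower)[n_nodes // 2] / 2)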

  1. Parametric vs. Nonparametric Regression Modelling within Clinical Decision Support

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Zvárová, Jana

    2017-01-01

    Roč. 5, č. 1 (2017), s. 21-27 ISSN 1805-8698 R&D Projects: GA ČR GA17-01251S Institutional support: RVO:67985807 Keywords : decision support systems * decision rules * statistical analysis * nonparametric regression Subject RIV: IN - Informatics, Computer Science OBOR OECD: Statistics and probability

  2. On Cooper's Nonparametric Test.

    Science.gov (United States)

    Schmeidler, James

    1978-01-01

    The basic assumption of Cooper's nonparametric test for trend (EJ 125 069) is questioned. It is contended that the proper assumption alters the distribution of the statistic and reduces its usefulness. (JKS)

  3. CATDAT - A program for parametric and nonparametric categorical data analysis user's manual, Version 1.0

    International Nuclear Information System (INIS)

    Peterson, James R.; Haas, Timothy C.; Lee, Danny C.

    2000-01-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical responses data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network

  4. Nonparametric Bayesian density estimation on manifolds with applications to planar shapes.

    Science.gov (United States)

    Bhattacharya, Abhishek; Dunson, David B

    2010-12-01

    Statistical analysis on landmark-based shape spaces has diverse applications in morphometrics, medical diagnostics, machine vision and other areas. These shape spaces are non-Euclidean quotient manifolds. To conduct nonparametric inferences, one may define notions of centre and spread on this manifold and work with their estimates. However, it is useful to consider full likelihood-based methods, which allow nonparametric estimation of the probability density. This article proposes a broad class of mixture models constructed using suitable kernels on a general compact metric space and then on the planar shape space in particular. Following a Bayesian approach with a nonparametric prior on the mixing distribution, conditions are obtained under which the Kullback-Leibler property holds, implying large support and weak posterior consistency. Gibbs sampling methods are developed for posterior computation, and the methods are applied to problems in density estimation and classification with shape-based predictors. Simulation studies show improved estimation performance relative to existing approaches.

  5. Separability Analysis of Sentinel-2A Multi-Spectral Instrument (MSI) Data for Burned Area Discrimination

    Directory of Open Access Journals (Sweden)

    Haiyan Huang

    2016-10-01

    Full Text Available Biomass burning is a global phenomenon and systematic burned area mapping is of increasing importance for science and applications. With high spatial resolution and novelty in band design, the recently launched Sentinel-2A satellite provides a new opportunity for moderate spatial resolution burned area mapping. This study examines the performance of the Sentinel-2A Multi Spectral Instrument (MSI) bands and derived spectral indices to differentiate between unburned and burned areas. For this purpose, five pairs of pre-fire and post-fire top of atmosphere (TOA) reflectance and atmospherically corrected (surface reflectance) images were studied. The pixel values of locations that were unburned in the first image and burned in the second image, as well as the values of locations that were unburned in both images which served as a control, were compared, and the discrimination of individual bands and spectral indices was evaluated using parametric (transformed divergence) and non-parametric (decision tree) approaches. Based on the results, the most suitable MSI bands to detect burned areas are the 20 m near-infrared, short wave infrared and red-edge bands, while the performance of the spectral indices varied with location. The atmospheric correction only significantly influenced the separability of the visible wavelength bands. The results provide insights that are useful for developing Sentinel-2 burned area mapping algorithms.
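
    For illustration, a minimal Python sketch of the index-based separability idea: the normalized burn ratio (NBR) is computed from simulated NIR (band 8A) and SWIR (band 12) reflectances and burned/unburned separability is scored with a univariate transformed divergence; the reflectance values and the 0-2 scaling of the divergence are assumptions.

      import numpy as np

      rng = np.random.default_rng(6)

      def transformed_divergence(a, b):
          """Univariate transformed divergence between two classes of samples (0 to 2)."""
          m1, m2 = np.mean(a), np.mean(b)
          v1, v2 = np.var(a), np.var(b)
          d = (0.5 * (v1 - v2) * (1.0 / v2 - 1.0 / v1)
               + 0.5 * (1.0 / v1 + 1.0 / v2) * (m1 - m2) ** 2)
          return 2.0 * (1.0 - np.exp(-d / 8.0))

      # Hypothetical 20 m surface reflectances: NIR drops and SWIR rises after a burn.
      nir_unburned = rng.normal(0.30, 0.04, 500)
      swir_unburned = rng.normal(0.15, 0.03, 500)
      nir_burned = rng.normal(0.15, 0.03, 500)
      swir_burned = rng.normal(0.25, 0.04, 500)

      nbr_unburned = (nir_unburned - swir_unburned) / (nir_unburned + swir_unburned)
      nbr_burned = (nir_burned - swir_burned) / (nir_burned + swir_burned)

      td = transformed_divergence(nbr_burned, nbr_unburned)
      print(f"NBR transformed divergence (max 2.0): {td:.3f}")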

  6. Basic Functional Analysis Puzzles of Spectral Flow

    DEFF Research Database (Denmark)

    Booss-Bavnbek, Bernhelm

    2011-01-01

    We explain an array of basic functional analysis puzzles on the way to general spectral flow formulae and indicate a direction of future topological research for dealing with these puzzles.

  7. A reconstruction algorithm for three-dimensional object-space data using spatial-spectral multiplexing

    Science.gov (United States)

    Wu, Zhejun; Kudenov, Michael W.

    2017-05-01

    This paper presents a reconstruction algorithm for the Spatial-Spectral Multiplexing (SSM) optical system. The goal of this algorithm is to recover the three-dimensional spatial and spectral information of a scene, given that a one-dimensional spectrometer array is used to sample the pupil of the spatial-spectral modulator. The challenge of the reconstruction is that the non-parametric representation of the three-dimensional spatial and spectral object requires a large number of variables, thus leading to an underdetermined linear system that is hard to uniquely recover. We propose to reparameterize the spectrum using B-spline functions to reduce the number of unknown variables. Our reconstruction algorithm then solves the improved linear system via a least-squares optimization of such B-spline coefficients with additional spatial smoothness regularization. The ground truth object and the optical model for the measurement matrix are simulated with both spatial and spectral assumptions according to a realistic field of view. In order to test the robustness of the algorithm, we add Poisson noise to the measurement and test on both two-dimensional and three-dimensional spatial and spectral scenes. Our analysis shows that the root mean square error of the recovered results is within 5.15%.
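
    For illustration, a minimal Python sketch of the core inversion step, a least-squares solve with a second-difference smoothness penalty on the unknown coefficient vector; the random system matrix stands in for the SSM optical model, the B-spline basis construction is omitted, and the noise level is arbitrary.

      import numpy as np

      def smooth_least_squares(A, b, lam):
          """Solve min ||A c - b||^2 + lam * ||D c||^2, D being a second-difference operator."""
          n = A.shape[1]
          D = np.diff(np.eye(n), n=2, axis=0)         # (n-2, n) second-difference matrix
          lhs = A.T @ A + lam * (D.T @ D)
          rhs = A.T @ b
          return np.linalg.solve(lhs, rhs)

      rng = np.random.default_rng(7)

      # Hypothetical sizes: 200 spectrometer samples, 50 spline coefficients to recover.
      m, n = 200, 50
      A = rng.normal(size=(m, n))                     # stands in for the SSM system matrix
      c_true = np.sin(np.linspace(0, np.pi, n))       # smooth "true" coefficient vector
      b = A @ c_true + rng.poisson(2.0, size=m) - 2.0 # centered Poisson-like noise

      c_hat = smooth_least_squares(A, b, lam=10.0)
      print("relative error:", np.linalg.norm(c_hat - c_true) / np.linalg.norm(c_true))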

  8. Nonparametric e-Mixture Estimation.

    Science.gov (United States)

    Takano, Ken; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2016-12-01

    This study considers the common situation in data analysis when there are few observations of the distribution of interest or the target distribution, while abundant observations are available from auxiliary distributions. In this situation, it is natural to compensate for the lack of data from the target distribution by using data sets from these auxiliary distributions - in other words, approximating the target distribution in a subspace spanned by a set of auxiliary distributions. Mixture modeling is one of the simplest ways to integrate information from the target and auxiliary distributions in order to express the target distribution as accurately as possible. There are two typical mixtures in the context of information geometry: the m- and e-mixtures. The m-mixture is applied in a variety of research fields because of the presence of the well-known expectation-maximization algorithm for parameter estimation, whereas the e-mixture is rarely used because of its difficulty of estimation, particularly for nonparametric models. The e-mixture, however, is a well-tempered distribution that satisfies the principle of maximum entropy. To model a target distribution with scarce observations accurately, this letter proposes a novel framework for a nonparametric modeling of the e-mixture and a geometrically inspired estimation algorithm. As numerical examples of the proposed framework, a transfer learning setup is considered. The experimental results show that this framework works well for three types of synthetic data sets, as well as an EEG real-world data set.

  9. A nonparametric approach to medical survival data: Uncertainty in the context of risk in mortality analysis

    International Nuclear Information System (INIS)

    Janurová, Kateřina; Briš, Radim

    2014-01-01

    Medical survival right-censored data of about 850 patients are evaluated to analyze the uncertainty related to the risk of mortality on one hand and to compare two basic surgery techniques in the context of risk of mortality on the other hand. Colorectal data come from patients who underwent colectomy in the University Hospital of Ostrava. Two basic surgical operating techniques are used for the colectomy: either traditional (open) or minimally invasive (laparoscopic). The basic question arising at the colectomy operation is which type of operation to choose to guarantee a longer overall survival time. Two non-parametric approaches have been used to quantify the probability of mortality with uncertainty. In fact, the complement of this probability, i.e. the survival function with corresponding confidence levels, is calculated and evaluated. The first approach considers standard nonparametric estimators resulting from both the Kaplan–Meier estimator of the survival function in connection with Greenwood's formula and the Nelson–Aalen estimator of the cumulative hazard function, including a confidence interval for the survival function as well. The second, innovative approach, represented by Nonparametric Predictive Inference (NPI), uses lower and upper probabilities for quantifying uncertainty and provides a model of the predictive survival function instead of the population survival function. The traditional log-rank test on one hand and the nonparametric predictive comparison of two groups of lifetime data on the other hand have been compared to evaluate the risk of mortality in the context of the mentioned surgery techniques. The size of the difference between the two groups of lifetime data has been considered and analyzed as well. Both nonparametric approaches led to the same conclusion: the minimally invasive operating technique guarantees the patient a significantly longer survival time in comparison with the traditional operating technique.
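
    For illustration, a minimal Python sketch of the first, standard approach: a Kaplan-Meier survival estimate with Greenwood-formula pointwise confidence intervals for right-censored data; the survival times and event indicators are made up, and the NPI lower/upper probabilities are not implemented.

      import numpy as np

      def kaplan_meier(time, event, z=1.96):
          """Kaplan-Meier estimate with Greenwood 95% pointwise confidence intervals."""
          order = np.argsort(time)
          time, event = np.asarray(time)[order], np.asarray(event)[order]
          out, s, var_term = [], 1.0, 0.0
          for t in np.unique(time[event == 1]):
              at_risk = np.sum(time >= t)
              deaths = np.sum((time == t) & (event == 1))
              s *= 1.0 - deaths / at_risk
              var_term += deaths / (at_risk * (at_risk - deaths))
              se = s * np.sqrt(var_term)              # Greenwood's formula
              out.append((t, s, max(s - z * se, 0.0), min(s + z * se, 1.0)))
          return out

      # Hypothetical follow-up in months and event indicator (1 = death, 0 = censored).
      months = [2, 5, 7, 7, 9, 12, 15, 16, 20, 24]
      died = [1, 1, 0, 1, 1, 0, 1, 0, 1, 0]
      for t, s, lo, hi in kaplan_meier(months, died):
          print(f"t={t:5.1f}  S(t)={s:.3f}  95% CI [{lo:.3f}, {hi:.3f}]")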

  10. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  11. Scalable Bayesian nonparametric measures for exploring pairwise dependence via Dirichlet Process Mixtures.

    Science.gov (United States)

    Filippi, Sarah; Holmes, Chris C; Nieto-Barajas, Luis E

    2016-11-16

    In this article we propose novel Bayesian nonparametric methods using Dirichlet Process Mixture (DPM) models for detecting pairwise dependence between random variables while accounting for uncertainty in the form of the underlying distributions. A key criterion is that the procedures should scale to large data sets. In this regard we find that the formal calculation of the Bayes factor for a dependent-vs.-independent DPM joint probability measure is not feasible computationally. To address this we present Bayesian diagnostic measures for characterising evidence against a "null model" of pairwise independence. In simulation studies, as well as for a real data analysis, we show that our approach provides a useful tool for the exploratory nonparametric Bayesian analysis of large multivariate data sets.

  12. Substitution dynamical systems spectral analysis

    CERN Document Server

    Queffélec, Martine

    2010-01-01

    This volume mainly deals with the dynamics of finitely valued sequences, and more specifically, of sequences generated by substitutions and automata. Those sequences demonstrate fairly simple combinatorial and arithmetical properties and naturally appear in various domains. As the title suggests, the aim of the initial version of this book was the spectral study of the associated dynamical systems: the first chapters consisted in a detailed introduction to the mathematical notions involved, and the description of the spectral invariants followed in the closing chapters. This approach, combined with new material added to the new edition, results in a nearly self-contained book on the subject. New tools - which have also proven helpful in other contexts - had to be developed for this study. Moreover, its findings can be concretely applied, the method providing an algorithm to exhibit the spectral measures and the spectral multiplicity, as is demonstrated in several examples. Beyond this advanced analysis, many...

  13. SPAM- SPECTRAL ANALYSIS MANAGER (DEC VAX/VMS VERSION)

    Science.gov (United States)

    Solomon, J. E.

    1994-01-01

    The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a timesequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different

  14. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    Observations from sums of random variables with a random number of summands, known as random, compound or stopped sums, arise within many areas of engineering and science. Quite often it is desirable to infer properties of the distribution of the terms in the random sum. In the present paper we review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  15. Nonparametric Bayesian inference for multidimensional compound Poisson processes

    NARCIS (Netherlands)

    Gugushvili, S.; van der Meulen, F.; Spreij, P.

    2015-01-01

    Given a sample from a discretely observed multidimensional compound Poisson process, we study the problem of nonparametric estimation of its jump size density r0 and intensity λ0. We take a nonparametric Bayesian approach to the problem and determine posterior contraction rates in this context,

  16. A Bayesian nonparametric estimation of distributions and quantiles

    International Nuclear Information System (INIS)

    Poern, K.

    1988-11-01

    The report describes a Bayesian, nonparametric method for the estimation of a distribution function and its quantiles. The method, presupposing random sampling, is nonparametric, so the user has to specify a prior distribution on a space of distributions (and not on a parameter space). In the current application, where the method is used to estimate the uncertainty of a parametric calculational model, the Dirichlet prior distribution is to a large extent determined by the first batch of Monte Carlo realizations. In this case the result of the estimation technique is very similar to the conventional empirical distribution function. The resulting posterior distribution is also Dirichlet, and thus facilitates the determination of probability (confidence) intervals at any given point in the space of interest. Another advantage is that the posterior distribution of a specified quantile can also be derived and utilized to determine a probability interval for that quantile. The method was devised for use in the PROPER code package for uncertainty and sensitivity analysis. (orig.)
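
    For illustration, a minimal Python sketch of this kind of estimator: under a Dirichlet-process prior the posterior of F(x) at any fixed point is Beta-distributed, which yields probability intervals for the distribution function; the concentration parameter, base distribution and Monte Carlo sample below are assumptions.

      import numpy as np
      from scipy.stats import beta, norm

      rng = np.random.default_rng(8)

      # Hypothetical batch of Monte Carlo realizations of the model output.
      sample = rng.lognormal(mean=0.0, sigma=0.5, size=200)

      # Dirichlet-process prior: concentration and base distribution are assumed.
      alpha = 5.0

      def base_cdf(x):
          return norm.cdf(np.log(x), loc=0.0, scale=1.0)   # lognormal prior guess

      def posterior_cdf_interval(x, level=0.90):
          """Pointwise posterior mean and credible interval for F(x) under the DP prior."""
          n_le = np.sum(sample <= x)
          a = alpha * base_cdf(x) + n_le                   # Beta parameters of F(x) | data
          b = alpha * (1.0 - base_cdf(x)) + (len(sample) - n_le)
          mean = a / (a + b)
          lo, hi = beta.ppf([(1 - level) / 2, (1 + level) / 2], a, b)
          return mean, lo, hi

      for x in [0.5, 1.0, 2.0, 4.0]:
          m, lo, hi = posterior_cdf_interval(x)
          print(f"F({x:3.1f}): posterior mean {m:.3f}, 90% interval [{lo:.3f}, {hi:.3f}]")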

  17. Particulate characterization by PIXE multivariate spectral analysis

    International Nuclear Information System (INIS)

    Antolak, Arlyn J.; Morse, Daniel H.; Grant, Patrick G.; Kotula, Paul G.; Doyle, Barney L.; Richardson, Charles B.

    2007-01-01

    Obtaining particulate compositional maps from scanned PIXE (proton-induced X-ray emission) measurements is extremely difficult due to the complexity of analyzing spectroscopic data collected with low signal-to-noise at each scan point (pixel). Multivariate spectral analysis has the potential to analyze such data sets by reducing the PIXE data to a limited number of physically realizable and easily interpretable components (that include both spectral and image information). We have adapted the AXSIA (automated expert spectral image analysis) program, originally developed by Sandia National Laboratories to quantify electron-excited X-ray spectroscopy data, for this purpose. Samples consisting of particulates with known compositions and sizes were loaded onto Mylar and paper filter substrates and analyzed by scanned micro-PIXE. The data sets were processed by AXSIA and the associated principal component spectral data were quantified by converting the weighting images into concentration maps. The results indicate automated, nonbiased, multivariate statistical analysis is useful for converting very large amounts of data into a smaller, more manageable number of compositional components needed for locating individual particles-of-interest on large area collection media

  18. Functional analysis, spectral theory, and applications

    CERN Document Server

    Einsiedler, Manfred

    2017-01-01

    This textbook provides a careful treatment of functional analysis and some of its applications in analysis, number theory, and ergodic theory. In addition to discussing core material in functional analysis, the authors cover more recent and advanced topics, including Weyl’s law for eigenfunctions of the Laplace operator, amenability and property (T), the measurable functional calculus, spectral theory for unbounded operators, and an account of Tao’s approach to the prime number theorem using Banach algebras. The book further contains numerous examples and exercises, making it suitable for both lecture courses and self-study. Functional Analysis, Spectral Theory, and Applications is aimed at postgraduate and advanced undergraduate students with some background in analysis and algebra, but will also appeal to everyone with an interest in seeing how functional analysis can be applied to other parts of mathematics.

  19. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  20. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.

  1. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Full Text Available Abstract Background With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with a normally functioning allograft. Results The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that were differentially expressed in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist-diagnosed class labels. Conclusion We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been
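
    For illustration, a minimal Python sketch of the pathway-enrichment step mentioned above, testing a single 2x2 table with Fisher's exact test; all counts other than the 309 differentially expressed genes are hypothetical placeholders.

      from scipy.stats import fisher_exact

      # Hypothetical counts for one KEGG pathway.
      # Rows: differentially expressed / not; columns: in pathway / not in pathway.
      de_in_pathway, de_not_in_pathway = 12, 297        # of the 309 DE genes
      bg_in_pathway, bg_not_in_pathway = 45, 9646       # remaining background genes

      table = [[de_in_pathway, de_not_in_pathway],
               [bg_in_pathway, bg_not_in_pathway]]
      odds_ratio, p_value = fisher_exact(table, alternative="greater")
      print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.3g}")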

  2. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  3. NONPARAMETRIC FIXED EFFECT PANEL DATA MODELS: RELATIONSHIP BETWEEN AIR POLLUTION AND INCOME FOR TURKEY

    Directory of Open Access Journals (Sweden)

    Rabia Ece OMAY

    2013-06-01

    Full Text Available In this study, the relationship between gross domestic product (GDP) per capita and sulfur dioxide (SO2) and particulate matter (PM10) per capita is modeled for Turkey. Nonparametric fixed effect panel data analysis is used for the modeling. The panel data cover 12 territories, at the first level of the Nomenclature of Territorial Units for Statistics (NUTS), for the period 1990-2001. In modeling the relationship between GDP and SO2 and PM10 for Turkey, the non-parametric models have given good results.

  4. Spectral theory and nonlinear functional analysis

    CERN Document Server

    Lopez-Gomez, Julian

    2001-01-01

    This Research Note addresses several pivotal problems in spectral theory and nonlinear functional analysis in connection with the analysis of the structure of the set of zeroes of a general class of nonlinear operators. It features the construction of an optimal algebraic/analytic invariant for calculating the Leray-Schauder degree, new methods for solving nonlinear equations in Banach spaces, and general properties of components of solution sets presented with minimal use of topological tools. The author also gives several applications of the abstract theory to reaction diffusion equations and systems. The results presented cover a thirty-year period and include recent, unpublished findings of the author and his coworkers. Appealing to a broad audience, Spectral Theory and Nonlinear Functional Analysis contains many important contributions to linear algebra, linear and nonlinear functional analysis, and topology and opens the door for further advances.

  5. A NONPARAMETRIC HYPOTHESIS TEST VIA THE BOOTSTRAP RESAMPLING

    OpenAIRE

    Temel, Tugrul T.

    2001-01-01

    This paper adapts an existing nonparametric hypothesis test to the bootstrap framework. The test utilizes the nonparametric kernel regression method to estimate a measure of distance between the models stated under the null hypothesis. The bootstrapped version of the test makes it possible to approximate the errors involved in the asymptotic hypothesis test. The paper also develops Mathematica code for the test algorithm.
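
    A minimal sketch of the general idea, not the author's Mathematica code: a Nadaraya-Watson kernel fit is compared with a parametric (here linear) null fit, the squared distance between the two fits is the test statistic, and its null distribution is approximated by a residual bootstrap. The Gaussian kernel, bandwidth, null model and number of replicates are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def nw_fit(x, y, grid, h):
            """Nadaraya-Watson kernel regression with a Gaussian kernel."""
            w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
            return (w @ y) / w.sum(axis=1)

        def distance_stat(x, y, h):
            """Distance between the kernel fit and a linear (null) fit at the data points."""
            beta = np.polyfit(x, y, 1)
            return np.mean((nw_fit(x, y, x, h) - np.polyval(beta, x)) ** 2)

        def bootstrap_pvalue(x, y, h=0.2, B=500):
            t_obs = distance_stat(x, y, h)
            beta = np.polyfit(x, y, 1)
            resid = y - np.polyval(beta, x)
            t_boot = np.empty(B)
            for b in range(B):
                # regenerate data under the null by resampling residuals around the null fit
                y_star = np.polyval(beta, x) + rng.choice(resid, size=resid.size, replace=True)
                t_boot[b] = distance_stat(x, y_star, h)
            return (t_boot >= t_obs).mean()

        x = rng.uniform(0, 1, 200)
        y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 200)   # nonlinear truth, so the null should be rejected
        print("bootstrap p-value:", bootstrap_pvalue(x, y))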

  6. Non-parametric smoothing of experimental data

    International Nuclear Information System (INIS)

    Kuketayev, A.T.; Pen'kov, F.M.

    2007-01-01

    Full text: Rapid processing of experimental data samples in nuclear physics often requires differentiation in order to find extrema. Therefore, even at the preliminary stage of data analysis, a range of noise reduction methods are used to smooth experimental data. There are many non-parametric smoothing techniques: interval averages, moving averages, exponential smoothing, etc. Nevertheless, it is more common to use a priori information about the behavior of the experimental curve in order to construct smoothing schemes based on least squares techniques. The latter methodology's advantage is that the area under the curve can be preserved, which is equivalent to conservation of the total counting rate. The disadvantages of this approach include the lack of a priori information. For example, very often sums of peaks not resolved by the detector are replaced with a single peak during data processing, introducing uncontrolled errors in the determination of the physical quantities. The problem can be handled only by experienced personnel whose skills far exceed the demands of the task. We propose a set of non-parametric techniques, which allows the use of any additional information on the nature of the experimental dependence. The method is based on the construction of a functional that includes both the experimental data and the a priori information; the minimum of this functional is reached on a non-parametric smoothed curve. Euler (Lagrange) differential equations are constructed for these curves; then their solutions are obtained analytically or numerically. The proposed approach allows for automated processing of nuclear physics data, eliminating the need for highly skilled laboratory personnel. The proposed approach also makes it possible to obtain smoothing curves within a given confidence interval, e.g. according to the χ2 distribution. This approach is applicable when constructing smooth solutions of ill-posed problems, in particular when solving
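
    The functional proposed by the authors is not given in the abstract, but a simple analogue of the idea (a functional combining fidelity to the data with a priori smoothness information, minimized to obtain the smoothed curve) is the discrete penalized least-squares smoother sketched below. The second-difference penalty and its weight lam are illustrative assumptions; note that this particular smoother preserves the sum of the data exactly, the discrete analogue of preserving the area under the curve.

        import numpy as np

        def penalized_smooth(y, lam=10.0):
            """Minimize ||y - f||^2 + lam * ||D2 f||^2 over f, where D2 is the
            second-difference operator; the minimizer solves (I + lam*D2'D2) f = y."""
            n = y.size
            D2 = np.diff(np.eye(n), n=2, axis=0)        # (n-2) x n second-difference matrix
            A = np.eye(n) + lam * D2.T @ D2
            return np.linalg.solve(A, y)

        t = np.linspace(0, 1, 200)
        noise = np.random.default_rng(2).normal(0, 0.05, t.size)
        y = np.exp(-0.5 * ((t - 0.5) / 0.05) ** 2) + noise      # noisy peak-like data
        f = penalized_smooth(y, lam=50.0)
        print("sum preserved:", np.isclose(y.sum(), f.sum()))   # area conservation of this smoother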

  7. [Analysis of sensitive spectral bands for burning status detection using hyper-spectral images of Tiangong-01].

    Science.gov (United States)

    Qin, Xian-Lin; Zhu, Xi; Yang, Fei; Zhao, Kai-Rui; Pang, Yong; Li, Zeng-Yuan; Li, Xu-Zhi; Zhang, Jiu-Xing

    2013-07-01

    To identify satellite spectral bands sensitive to 4 kinds of burning status, i.e., flaming, smoldering, smoke, and fire scar, an analysis was conducted using hyper-spectral images of Tiangong-01 (TG-01) and a method combining statistics and spectral analysis. The results show that: in the hyper-spectral images of TG-01, the spectral bands suitable for detecting these 4 kinds of burning status differ markedly; in all hyper-spectral short-wave infrared channels, the reflectance of flaming is higher than that of all other 3 kinds of burning status, and the reflectance of smoke is the lowest; the reflectance of smoke is higher than that of all other 3 kinds of burning status in the channels corresponding to hyper-spectral visible near-infrared and panchromatic sensors. For spectral band selection, the more suitable spectral bands for flaming detection are 1 000.0-1 956.0 and 2 020.0-2 400.0 nm; the suitable spectral bands for identifying smoldering are 930.0-1 000.0 and 1 084.0-2 400.0 nm; the suitable spectral band for smoke detection is 400.0-920.0 nm; for fire scar detection, it is suitable to select bands with central wavelengths of 900.0-930.0 and 1 300.0-2 400.0 nm, and then to combine them to construct a detection model.

  8. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo; Genton, Marc G.

    2013-01-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric

  9. Simple nonparametric checks for model data fit in CAT

    NARCIS (Netherlands)

    Meijer, R.R.

    2005-01-01

    In this paper, the usefulness of several nonparametric checks is discussed in a computerized adaptive testing (CAT) context. Although there is no tradition of nonparametric scalability in CAT, it can be argued that scalability checks can be useful to investigate, for example, the quality of item

  10. Surface Estimation, Variable Selection, and the Nonparametric Oracle Property.

    Science.gov (United States)

    Storlie, Curtis B; Bondell, Howard D; Reich, Brian J; Zhang, Hao Helen

    2011-04-01

    Variable selection for multivariate nonparametric regression is an important, yet challenging, problem due, in part, to the infinite dimensionality of the function space. An ideal selection procedure should be automatic, stable, easy to use, and have desirable asymptotic properties. In particular, we define a selection procedure to be nonparametric oracle (np-oracle) if it consistently selects the correct subset of predictors and at the same time estimates the smooth surface at the optimal nonparametric rate, as the sample size goes to infinity. In this paper, we propose a model selection procedure for nonparametric models, and explore the conditions under which the new method enjoys the aforementioned properties. Developed in the framework of smoothing spline ANOVA, our estimator is obtained via solving a regularization problem with a novel adaptive penalty on the sum of functional component norms. Theoretical properties of the new estimator are established. Additionally, numerous simulated and real examples further demonstrate that the new approach substantially outperforms other existing methods in the finite sample setting.

  11. Does Private Tutoring Work? The Effectiveness of Private Tutoring: A Nonparametric Bounds Analysis

    Science.gov (United States)

    Hof, Stefanie

    2014-01-01

    Private tutoring has become popular throughout the world. However, evidence for the effect of private tutoring on students' academic outcome is inconclusive; therefore, this paper presents an alternative framework: a nonparametric bounds method. The present examination uses, for the first time, a large representative data-set in a European setting…

  12. On Holo-Hilbert spectral analysis: a full informational spectral representation for nonlinear and non-stationary data

    OpenAIRE

    Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Hung; Peng, Chung Kang; Meijer, Johanna H.; Wang, Yung-Hung; Long, Steven R.; Wu, Zhauhua

    2016-01-01

    The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert–Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through c...

  13. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

    The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection o

  14. A nonparametric spatial scan statistic for continuous data.

    Science.gov (United States)

    Jung, Inkyung; Cho, Ho Jin

    2015-10-20

    Spatial scan statistics are widely used for spatial cluster detection, and several parametric models exist. For continuous data, a normal-based scan statistic can be used. However, the performance of the model has not been fully evaluated for non-normal data. We propose a nonparametric spatial scan statistic based on the Wilcoxon rank-sum test statistic and compare the performance of the method with that of parametric models via a simulation study under various scenarios. The nonparametric method outperforms the normal-based scan statistic in terms of power and accuracy in almost all cases under consideration in the simulation study. The proposed nonparametric spatial scan statistic is therefore an excellent alternative to the normal model for continuous data and is especially useful for data following skewed or heavy-tailed distributions.
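
    A minimal sketch of the core idea under stated assumptions: for each candidate circular window, the Wilcoxon rank-sum statistic compares the values inside the window with those outside, and the window with the most extreme statistic is reported. The Monte Carlo inference needed for a proper scan-statistic p-value is omitted, and the fixed window radius and the use of scipy are illustrative choices.

        import numpy as np
        from scipy.stats import ranksums

        def wilcoxon_scan(coords, values, radius):
            """Return the window centre with the most extreme Wilcoxon rank-sum
            statistic (values inside vs. outside a circular window of given radius)."""
            best = (None, 0.0)
            for centre in coords:
                inside = np.linalg.norm(coords - centre, axis=1) <= radius
                if 0 < inside.sum() < len(values):
                    stat, _ = ranksums(values[inside], values[~inside])
                    if abs(stat) > abs(best[1]):
                        best = (centre, stat)
            return best

        rng = np.random.default_rng(3)
        coords = rng.uniform(0, 10, size=(300, 2))
        values = rng.standard_normal(300)
        values[np.linalg.norm(coords - [3, 3], axis=1) < 1.5] += 1.0   # planted cluster near (3, 3)
        print(wilcoxon_scan(coords, values, radius=1.5))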

  15. Nonlinear physical systems spectral analysis, stability and bifurcations

    CERN Document Server

    Kirillov, Oleg N

    2013-01-01

    Bringing together 18 chapters written by leading experts in dynamical systems, operator theory, partial differential equations, and solid and fluid mechanics, this book presents state-of-the-art approaches to a wide spectrum of new and challenging stability problems.Nonlinear Physical Systems: Spectral Analysis, Stability and Bifurcations focuses on problems of spectral analysis, stability and bifurcations arising in the nonlinear partial differential equations of modern physics. Bifurcations and stability of solitary waves, geometrical optics stability analysis in hydro- and magnetohydrodynam

  16. Emissivity compensated spectral pyrometry—algorithm and sensitivity analysis

    International Nuclear Information System (INIS)

    Hagqvist, Petter; Sikström, Fredrik; Christiansson, Anna-Karin; Lennartson, Bengt

    2014-01-01

    In order to solve the problem of non-contact temperature measurements on an object with varying emissivity, a new method is herein described and evaluated. The method uses spectral radiance measurements and converts them to temperature readings. It proves to be resilient towards changes in spectral emissivity and tolerates noisy spectral measurements. It is based on an assumption of smooth changes in emissivity and uses historical values of spectral emissivity and temperature for estimating current spectral emissivity. The algorithm, its constituent steps and accompanying parameters are described and discussed. A thorough sensitivity analysis of the method is carried out through simulations. No rigorous instrument calibration is needed for the presented method and it is therefore industrially tractable. (paper)

  17. A ¤nonparametric dynamic additive regression model for longitudinal data

    DEFF Research Database (Denmark)

    Martinussen, T.; Scheike, T. H.

    2000-01-01

    dynamic linear models, estimating equations, least squares, longitudinal data, nonparametric methods, partly conditional mean models, time-varying-coefficient models

  18. Spectral Envelopes and Additive + Residual Analysis/Synthesis

    Science.gov (United States)

    Rodet, Xavier; Schwarz, Diemo

    The subject of this chapter is the estimation, representation, modification, and use of spectral envelopes in the context of sinusoidal-additive-plus-residual analysis/synthesis. A spectral envelope is an amplitude-vs-frequency function, which may be obtained from the envelope of a short-time spectrum (Rodet et al., 1987; Schwarz, 1998). [Precise definitions of such an envelope and short-time spectrum (STS) are given in Section 2.] The additive-plus-residual analysis/synthesis method is based on a representation of signals in terms of a sum of time-varying sinusoids and of a non-sinusoidal residual signal [e.g., see Serra (1989), Laroche et al. (1993), McAulay and Quatieri (1995), and Ding and Qian (1997)]. Many musical sound signals may be described as a combination of a nearly periodic waveform and colored noise. The nearly periodic part of the signal can be viewed as a sum of sinusoidal components, called partials, with time-varying frequency and amplitude. Such sinusoidal components are easily observed on a spectral analysis display (Fig. 5.1) as obtained, for instance, from a discrete Fourier transform.
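
    As a rough illustration of obtaining a spectral envelope from a short-time spectrum (a simplification, not the estimation methods discussed in the chapter), the sketch below windows one frame, locates the partial peaks in the magnitude spectrum, and interpolates an amplitude-vs-frequency envelope through them. The window, frame length and peak-picking rule are arbitrary.

        import numpy as np

        def spectral_envelope(frame, fs):
            """Crude envelope: interpolate through local maxima of the magnitude short-time spectrum."""
            win = np.hanning(frame.size)
            mag = np.abs(np.fft.rfft(frame * win))
            freqs = np.fft.rfftfreq(frame.size, d=1.0 / fs)
            peaks = np.where((mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]))[0] + 1
            return freqs, np.interp(freqs, freqs[peaks], mag[peaks])

        fs = 8000
        t = np.arange(2048) / fs
        frame = sum(a * np.sin(2 * np.pi * f * t)              # quasi-periodic signal made of three partials
                    for a, f in [(1.0, 220), (0.6, 440), (0.3, 660)])
        freqs, env = spectral_envelope(frame, fs)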

  19. Spatio-spectral analysis of ionization times in high-harmonic generation

    Energy Technology Data Exchange (ETDEWEB)

    Soifer, Hadas, E-mail: hadas.soifer@weizmann.ac.il [Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel); Dagan, Michal; Shafir, Dror; Bruner, Barry D. [Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel); Ivanov, Misha Yu. [Department of Physics, Imperial College London, South Kensington Campus, SW7 2AZ London (United Kingdom); Max-Born Institute for Nonlinear Optics and Short Pulse Spectroscopy, Max-Born-Strasse 2A, D-12489 Berlin (Germany); Serbinenko, Valeria; Barth, Ingo; Smirnova, Olga [Max-Born Institute for Nonlinear Optics and Short Pulse Spectroscopy, Max-Born-Strasse 2A, D-12489 Berlin (Germany); Dudovich, Nirit [Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel)

    2013-03-12

    Graphical abstract: A spatio-spectral analysis of the two-color oscillation phase allows us to accurately separate short and long trajectories and reconstruct their ionization times. Highlights: We perform a complete spatio-spectral analysis of the high harmonic generation process. We analyze the ionization times across the entire spatio-spectral plane of the harmonics. We apply this analysis to reconstruct the ionization times of both short and long trajectories. Abstract: Recollision experiments have been very successful in resolving attosecond scale dynamics. However, such schemes rely on the single atom response, neglecting the macroscopic properties of the interaction and the effects of using multi-cycle laser fields. In this paper we perform a complete spatio-spectral analysis of the high harmonic generation process and resolve the distribution of the subcycle dynamics of the recolliding electron. Specifically, we focus on the measurement of ionization times. Recently, we have demonstrated that the addition of a weak, crossed-polarized second harmonic field allows us to resolve the moment of ionization (Shafir, 2012) [1]. In this paper we extend this measurement and perform a complete spatio-spectral analysis. We apply this analysis to reconstruct the ionization times of both short and long trajectories, showing good agreement with the quantum path analysis.

  20. Estimating technical efficiency in the hospital sector with panel data: a comparison of parametric and non-parametric techniques.

    Science.gov (United States)

    Siciliani, Luigi

    2006-01-01

    Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. Highest correlations are found in the efficiency scores between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.

  1. Nonparametric Mixture Models for Supervised Image Parcellation.

    Science.gov (United States)

    Sabuncu, Mert R; Yeo, B T Thomas; Van Leemput, Koen; Fischl, Bruce; Golland, Polina

    2009-09-01

    We present a nonparametric, probabilistic mixture model for the supervised parcellation of images. The proposed model yields segmentation algorithms conceptually similar to the recently developed label fusion methods, which register a new image with each training image separately. Segmentation is achieved via the fusion of transferred manual labels. We show that in our framework various settings of a model parameter yield algorithms that use image intensity information differently in determining the weight of a training subject during fusion. One particular setting computes a single, global weight per training subject, whereas another setting uses locally varying weights when fusing the training data. The proposed nonparametric parcellation approach capitalizes on recently developed fast and robust pairwise image alignment tools. The use of multiple registrations allows the algorithm to be robust to occasional registration failures. We report experiments on 39 volumetric brain MRI scans with expert manual labels for the white matter, cerebral cortex, ventricles and subcortical structures. The results demonstrate that the proposed nonparametric segmentation framework yields significantly better segmentation than state-of-the-art algorithms.

  2. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.

  3. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
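
    A minimal sketch of the leave-out-one-cycle idea, not the authors' code: for each candidate integer period p, every complete cycle is predicted from the point-wise means of the remaining cycles, and the period with the smallest prediction error is selected. The test series and candidate range are illustrative.

        import numpy as np

        def cv_period(x, max_period):
            """Leave-out-one-cycle cross-validation score for each candidate integer period."""
            scores = {}
            for p in range(2, max_period + 1):
                n_cycles = len(x) // p
                if n_cycles < 2:
                    break
                cycles = np.reshape(x[:n_cycles * p], (n_cycles, p))
                err = 0.0
                for i in range(n_cycles):
                    others = np.delete(cycles, i, axis=0).mean(axis=0)   # predict cycle i from the rest
                    err += np.mean((cycles[i] - others) ** 2)
                scores[p] = err / n_cycles
            return min(scores, key=scores.get), scores

        rng = np.random.default_rng(4)
        x = np.tile([0.0, 2.0, 1.0, -1.0, -2.0, 0.5, 1.5], 30) + rng.normal(0, 0.3, 210)
        best_p, scores = cv_period(x, max_period=20)
        print("estimated period:", best_p)   # multiples of 7 incur a slightly higher CV score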

  4. PIXE-quantified AXSIA: Elemental mapping by multivariate spectral analysis

    International Nuclear Information System (INIS)

    Doyle, B.L.; Provencio, P.P.; Kotula, P.G.; Antolak, A.J.; Ryan, C.G.; Campbell, J.L.; Barrett, K.

    2006-01-01

    Automated, nonbiased, multivariate statistical analysis techniques are useful for converting very large amounts of data into a smaller, more manageable number of chemical components (spectra and images) that are needed to describe the measurement. We report the first use of the multivariate spectral analysis program AXSIA (Automated eXpert Spectral Image Analysis) developed at Sandia National Laboratories to quantitatively analyze micro-PIXE data maps. AXSIA implements a multivariate curve resolution technique that reduces the spectral image data sets into a limited number of physically realizable and easily interpretable components (including both spectra and images). We show that the principal component spectra can be further analyzed using conventional PIXE programs to convert the weighting images into quantitative concentration maps. A common elemental data set has been analyzed using three different PIXE analysis codes and the results compared to the cases when each of these codes is used to separately analyze the associated AXSIA principal component spectral data. We find that these comparisons are in good quantitative agreement with each other

  5. Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification.

    Science.gov (United States)

    Fan, Jianqing; Feng, Yang; Jiang, Jiancheng; Tong, Xin

    We propose a high dimensional classification method that involves nonparametric feature augmentation. Knowing that marginal density ratios are the most powerful univariate classifiers, we use the ratio estimates to transform the original feature measurements. Subsequently, penalized logistic regression is invoked, taking as input the newly transformed or augmented features. This procedure trains models equipped with local complexity and global simplicity, thereby avoiding the curse of dimensionality while creating a flexible nonlinear decision boundary. The resulting method is called Feature Augmentation via Nonparametrics and Selection (FANS). We motivate FANS by generalizing the Naive Bayes model, writing the log ratio of joint densities as a linear combination of those of marginal densities. It is related to generalized additive models, but has better interpretability and computability. Risk bounds are developed for FANS. In numerical analysis, FANS is compared with competing methods, so as to provide a guideline on its best application domain. Real data analysis demonstrates that FANS performs very competitively on benchmark email spam and gene expression data sets. Moreover, FANS is implemented by an extremely fast algorithm through parallel computing.
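
    A minimal sketch of the FANS recipe under stated assumptions: kernel density estimates of the class-conditional marginal densities provide log density-ratio features, which are then fed to an L1-penalized logistic regression. The bandwidths, the penalty level, the use of scipy/scikit-learn, and the omission of the paper's sample-splitting scheme are all simplifications.

        import numpy as np
        from scipy.stats import gaussian_kde
        from sklearn.linear_model import LogisticRegression

        def fans_features(X_train, y_train, X):
            """Transform each feature into an estimated log marginal density ratio."""
            eps = 1e-12
            feats = np.empty_like(X, dtype=float)
            for j in range(X.shape[1]):
                f1 = gaussian_kde(X_train[y_train == 1, j])
                f0 = gaussian_kde(X_train[y_train == 0, j])
                feats[:, j] = np.log(f1(X[:, j]) + eps) - np.log(f0(X[:, j]) + eps)
            return feats

        rng = np.random.default_rng(5)
        n, p = 400, 10
        y = rng.integers(0, 2, n)
        X = rng.standard_normal((n, p))
        X[:, 0] += 1.5 * y                      # only the first feature is informative
        Z = fans_features(X, y, X)
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(Z, y)
        print("selected features:", np.flatnonzero(clf.coef_[0]))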

  6. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
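
    A minimal sketch, not the authors' S-language software, of the flipping trick that lets right-censored Kaplan-Meier machinery handle left-censored concentrations: values are subtracted from a constant larger than the maximum, K-M is computed on the flipped data, and the curve is read back as a distribution function of the original values. The tiny product-limit implementation and the toy data are illustrative.

        import numpy as np

        def kaplan_meier(times, event):
            """Product-limit estimate of the survival function for right-censored data."""
            order = np.argsort(times)
            times, event = times[order], event[order]
            surv, s = [], 1.0
            for t in np.unique(times[event == 1]):
                at_risk = np.sum(times >= t)
                deaths = np.sum((times == t) & (event == 1))
                s *= 1.0 - deaths / at_risk
                surv.append((t, s))
            return np.array(surv)

        # Left-censored concentrations: detected values and values below a detection limit.
        conc = np.array([0.5, 1.2, 0.8, 2.0, 3.1, 0.5, 1.0, 0.7])
        detected = np.array([1, 1, 0, 1, 1, 0, 1, 0])    # 0 = "<detection limit" (left-censored)

        M = conc.max() + 1.0                   # flip constant larger than every value
        km = kaplan_meier(M - conc, detected)  # flipped data are right-censored
        # P(conc < c) is recovered as the flipped survival curve evaluated at M - c.
        cdf = [(M - t, s) for t, s in km]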

  7. Nonparametric Monitoring for Geotechnical Structures Subject to Long-Term Environmental Change

    Directory of Open Access Journals (Sweden)

    Hae-Bum Yun

    2011-01-01

    Full Text Available A nonparametric, data-driven methodology of monitoring for geotechnical structures subject to long-term environmental change is discussed. Avoiding physical assumptions or excessive simplification of the monitored structures, the nonparametric monitoring methodology presented in this paper provides reliable performance-related information particularly when the collection of sensor data is limited. For the validation of the nonparametric methodology, a field case study was performed using a full-scale retaining wall, which had been monitored for three years using three tilt gauges. Using the very limited sensor data, it is demonstrated that important performance-related information, such as drainage performance and sensor damage, could be disentangled from significant daily, seasonal and multiyear environmental variations. Extensive literature review on recent developments of parametric and nonparametric data processing techniques for geotechnical applications is also presented.

  8. Efficiency Analysis of German Electricity Distribution Utilities : Non-Parametric and Parametric Tests

    OpenAIRE

    von Hirschhausen, Christian R.; Cullmann, Astrid

    2005-01-01

    Abstract This paper applies non-parametric and parametric tests to assess the efficiency of electricity distribution companies in Germany. We address traditional issues in electricity sector benchmarking, such as the role of scale effects and optimal utility size, as well as new evidence specific to the situation in Germany. We use labour, capital, and peak load capacity as inputs, and units sold and the number of customers as output. The data cover 307 (out of 553) ...

  9. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database [1] (SigDB) are proposed. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical analysis text mining. This technique analyzes the syntax of the meta-data to provide local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security [2] (text encryption/decryption), biomedical [3], and marketing [4] applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high and low quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
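
    The spectral angle mapper comparison mentioned above reduces to the angle between a test spectrum and the population mean spectrum viewed as vectors; a minimal sketch follows, in which the population, the noise level and the acceptance threshold are illustrative assumptions.

        import numpy as np

        def spectral_angle(test, reference):
            """Spectral angle mapper (radians) between two spectra on a common wavelength grid."""
            cos = np.dot(test, reference) / (np.linalg.norm(test) * np.linalg.norm(reference))
            return np.arccos(np.clip(cos, -1.0, 1.0))

        population = np.random.default_rng(6).uniform(0.2, 0.8, size=(25, 400))   # ideal population set
        mean_spectrum = population.mean(axis=0)
        test_spectrum = mean_spectrum + np.random.default_rng(7).normal(0, 0.02, 400)
        angle = spectral_angle(test_spectrum, mean_spectrum)
        quality_ok = angle < 0.05     # illustrative acceptance threshold in radians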

  10. portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    mahsa ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is deciding on an appropriate stock exchange for investing and selecting an optimal portfolio. This process is done through assessment of risk and expected return. In the portfolio selection problem, if the assets' expected returns are normally distributed, variance and standard deviation are used as risk measures. However, the expected returns on assets are not necessarily normal and sometimes differ dramatically from the normal distribution. This paper introduces conditional value at risk (CVaR) as a measure of risk in a nonparametric framework and, for a given expected return, offers the optimal portfolio; this method is compared with the linear programming method. The data used in this study consist of monthly returns of 15 companies selected from the top 50 companies on the Tehran Stock Exchange (as of the winter of 1392), covering the period from April 1388 to June 1393. The results of this study show the superiority of the nonparametric method over the linear programming method, and the nonparametric method is much faster than the linear programming method.
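
    A minimal sketch of the empirical (nonparametric) CVaR used as the risk measure: given a matrix of historical returns and portfolio weights, CVaR at level alpha is the mean loss over the worst alpha-fraction of scenarios. The confidence level, the random weights and the toy return matrix below are illustrative, and the optimization over weights for a target expected return is not shown.

        import numpy as np

        def empirical_cvar(returns, weights, alpha=0.05):
            """Empirical CVaR of portfolio losses at level alpha (losses = -returns)."""
            losses = -returns @ weights
            var = np.quantile(losses, 1.0 - alpha)          # empirical Value at Risk
            return losses[losses >= var].mean()             # mean of the worst alpha tail

        rng = np.random.default_rng(8)
        returns = rng.normal(0.01, 0.05, size=(60, 15))     # 60 months x 15 assets (toy data)
        w = rng.dirichlet(np.ones(15))                      # random long-only weights
        print("expected return:", returns.mean(axis=0) @ w,
              "CVaR(5%):", empirical_cvar(returns, w))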

  11. Robustifying Bayesian nonparametric mixtures for count data.

    Science.gov (United States)

    Canale, Antonio; Prünster, Igor

    2017-03-01

    Our motivating application stems from surveys of natural populations and is characterized by large spatial heterogeneity in the counts, which makes parametric approaches to modeling local animal abundance too restrictive. We adopt a Bayesian nonparametric approach based on mixture models and innovate with respect to the popular Dirichlet process mixture of Poisson kernels by increasing the model flexibility at the level both of the kernel and the nonparametric mixing measure. This allows us to derive accurate and robust estimates of the distribution of local animal abundance and of the corresponding clusters. The application and a simulation study for different scenarios also yield some general methodological implications. Adding flexibility solely at the level of the mixing measure does not improve inferences, since its impact is severely limited by the rigidity of the Poisson kernel, with considerable consequences in terms of bias. However, once a kernel more flexible than the Poisson is chosen, inferences can be robustified by choosing a prior more general than the Dirichlet process. Therefore, to improve the performance of Bayesian nonparametric mixtures for count data one has to enrich the model simultaneously at both levels, the kernel and the mixing measure. © 2016, The International Biometric Society.

  12. Spectral Analysis of Moderately Charged Rare-Gas Atoms

    Directory of Open Access Journals (Sweden)

    Jorge Reyna Almandos

    2017-03-01

    Full Text Available This article presents a review of the spectral analysis of several ions of neon, argon, krypton and xenon, with impact on laser studies and astrophysics, carried out mainly in our collaborative groups between Argentina and Brazil over many years. The spectra were recorded from the vacuum ultraviolet to infrared regions using pulsed discharges. Semi-empirical approaches with relativistic Hartree–Fock and Dirac-Fock calculations were also included in these investigations. The spectral analysis produced new classified lines and energy levels. Lifetimes and oscillator strengths were also calculated.

  13. Evaluation of Fourier integral. Spectral analysis of seismic events

    International Nuclear Information System (INIS)

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

    Spectral analysis of seismic events represents a method for the prediction of great earthquakes. The seismic signal is not sinusoidal; it is therefore necessary to find a method that best approximates the real signal with sinusoidal components. The 'Quanterra' broadband station allows data access in numerical and/or graphical form. With the numerical form we can easily build a computer program (MS Office Excel) for spectral analysis. (authors)
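
    A minimal sketch of the kind of discrete spectral analysis described, written in Python rather than the Excel program mentioned by the authors: the one-sided amplitude spectrum of a uniformly sampled signal is computed with the FFT and the dominant frequency is read off from its peak. The sampling interval and test signal are illustrative.

        import numpy as np

        def amplitude_spectrum(x, dt):
            """One-sided amplitude spectrum of a uniformly sampled signal."""
            X = np.fft.rfft(x - x.mean())
            freqs = np.fft.rfftfreq(x.size, d=dt)
            return freqs, 2.0 * np.abs(X) / x.size

        dt = 0.01                                           # 100 Hz sampling
        t = np.arange(0, 60, dt)
        x = np.sin(2 * np.pi * 1.5 * t) + 0.4 * np.sin(2 * np.pi * 4.0 * t)
        freqs, amp = amplitude_spectrum(x, dt)
        print("dominant frequency (Hz):", freqs[np.argmax(amp)])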

  14. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric term is added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients...... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how...... networks is included. In this paper, neural networks are used for predicting the electricity production of a wind farm. The results are compared with results obtained using an adaptively estimated ARX-model. Finally, two papers on stochastic differential equations are included. In the first paper, among...

  15. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Takamizawa, Hisashi, E-mail: takamizawa.hisashi@jaea.go.jp; Itoh, Hiroto, E-mail: ito.hiroto@jaea.go.jp; Nishiyama, Yutaka, E-mail: nishiyama.yutaka93@jaea.go.jp

    2016-10-15

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.

  16. Network structure exploration via Bayesian nonparametric models

    International Nuclear Information System (INIS)

    Chen, Y; Wang, X L; Xiang, X; Tang, B Z; Bu, J Z

    2015-01-01

    Complex networks provide a powerful mathematical representation of complex systems in nature and society. To understand complex networks, it is crucial to explore their internal structures, also called structural regularities. The task of network structure exploration is to determine how many groups there are in a complex network and how to group the nodes of the network. Most existing structure exploration methods need to specify either a group number or a certain type of structure when they are applied to a network. In the real world, however, the group number and the type of structure that a network has are usually unknown in advance. To explore structural regularities in complex networks automatically, without any prior knowledge of the group number or the type of structure, we use Bayesian nonparametric theory to extend a probabilistic mixture model that can handle networks with any type of structure but requires the group number to be specified. We also propose a novel Bayesian nonparametric model, called the Bayesian nonparametric mixture (BNPM) model. Experiments conducted on a large number of networks with different structures show that the BNPM model is able to explore structural regularities in networks automatically with a stable, state-of-the-art performance. (paper)

  17. Bioprocess iterative batch-to-batch optimization based on hybrid parametric/nonparametric models.

    Science.gov (United States)

    Teixeira, Ana P; Clemente, João J; Cunha, António E; Carrondo, Manuel J T; Oliveira, Rui

    2006-01-01

    This paper presents a novel method for iterative batch-to-batch dynamic optimization of bioprocesses. The relationship between process performance and control inputs is established by means of hybrid grey-box models combining parametric and nonparametric structures. The bioreactor dynamics are defined by material balance equations, whereas the cell population subsystem is represented by an adjustable mixture of nonparametric and parametric models. Thus optimizations are possible without detailed mechanistic knowledge concerning the biological system. A clustering technique is used to supervise the reliability of the nonparametric subsystem during the optimization. Whenever the nonparametric outputs are unreliable, the objective function is penalized. The technique was evaluated with three simulation case studies. The overall results suggest that the convergence to the optimal process performance may be achieved after a small number of batches. The model unreliability risk constraint along with sampling scheduling are crucial to minimize the experimental effort required to attain a given process performance. In general terms, it may be concluded that the proposed method broadens the application of the hybrid parametric/nonparametric modeling technique to "newer" processes with higher potential for optimization.

  18. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100

  19. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun

    2017-01-01

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13 H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337. doi: 10.1214/aos/1018031100

  20. Nonparametric methods for volatility density estimation

    NARCIS (Netherlands)

    Es, van Bert; Spreij, P.J.C.; Zanten, van J.H.

    2009-01-01

    Stochastic volatility modelling of financial processes has become increasingly popular. The proposed models usually contain a stationary volatility process. We will motivate and review several nonparametric methods for estimation of the density of the volatility process. Both models based on

  1. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    capture the behavior of observed phenomena. Higher-order polynomial and finite-dimensional spline basis models allow for more complicated responses and add flexibility, as these are nonparametric (not constrained to any particular functional form). These should be useful in identifying nonstandard behavior via the deviance Δ = −2 log(L_reduced / L_full), defined in terms of the likelihood function L. For normal error, L_full = 1, and based on Eq. A-2, we have log

  2. Speaker Linking and Applications using Non-Parametric Hashing Methods

    Science.gov (United States)

    2016-09-08

    Speaker Linking and Applications using Non-Parametric Hashing Methods. Douglas Sturim and William M. Campbell, MIT Lincoln Laboratory, Lexington, MA. ... with many approaches [1, 2]. For this paper, we focus on using i-vectors [2], but the methods apply to any embedding. For the task of speaker QBE and

  3. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Science.gov (United States)

    Park, Saerom; Lee, Jaewook; Son, Youngdoo

    2016-01-01

    Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives in reducing transaction costs by considerably improving prediction performance.

  4. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei; Carroll, Raymond J.; Maity, Arnab

    2011-01-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work

  5. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  6. Robust and transferable quantification of NMR spectral quality using IROC analysis

    Science.gov (United States)

    Zambrello, Matthew A.; Maciejewski, Mark W.; Schuyler, Adam D.; Weatherby, Gerard; Hoch, Jeffrey C.

    2017-12-01

    Non-Fourier methods are increasingly utilized in NMR spectroscopy because of their ability to handle nonuniformly-sampled data. However, non-Fourier methods present unique challenges due to their nonlinearity, which can produce nonrandom noise and render conventional metrics for spectral quality such as signal-to-noise ratio unreliable. The lack of robust and transferable metrics (i.e. applicable to methods exhibiting different nonlinearities) has hampered comparison of non-Fourier methods and nonuniform sampling schemes, preventing the identification of best practices. We describe a novel method, in situ receiver operating characteristic analysis (IROC), for characterizing spectral quality based on the Receiver Operating Characteristic curve. IROC utilizes synthetic signals added to empirical data as "ground truth", and provides several robust scalar-valued metrics for spectral quality. This approach avoids problems posed by nonlinear spectral estimates, and provides a versatile quantitative means of characterizing many aspects of spectral quality. We demonstrate applications to parameter optimization in Fourier and non-Fourier spectral estimation, critical comparison of different methods for spectrum analysis, and optimization of nonuniform sampling schemes. The approach will accelerate the discovery of optimal approaches to nonuniform sampling experiment design and non-Fourier spectrum analysis for multidimensional NMR.
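
    A highly simplified sketch of the general idea of injecting synthetic signals as ground truth and scoring their recovery with an ROC curve; this is not the IROC implementation, and the noise model, peak amplitudes, the amplitude-based scoring rule and the use of scikit-learn are all illustrative assumptions.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(9)

        n_bins, n_true = 2048, 20
        truth = np.zeros(n_bins, dtype=int)
        signal = rng.normal(0, 1, n_bins)                    # stand-in for an empirical noise floor
        idx = rng.choice(n_bins, n_true, replace=False)
        truth[idx] = 1
        signal[idx] += rng.uniform(3, 6, n_true)             # injected synthetic peaks ("ground truth")

        # Score every bin by its amplitude; the ROC/AUC summarizes how well a given
        # spectral estimate separates the injected signals from the noise.
        auc = roc_auc_score(truth, signal)
        print("area under the ROC curve:", round(auc, 3))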

  7. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…

  8. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    Science.gov (United States)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as the Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures, which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.
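
    A minimal sketch of the Hilbert-transform step under stated assumptions, not the authors' analysis: a detrended series expressed in the ln(tc-t) variable is converted to an analytic signal, and an approximately linear unwrapped phase in ln(tc-t) indicates a single dominant log-frequency. The value of tc, the synthetic log-periodic series and the noise level are illustrative.

        import numpy as np
        from scipy.signal import hilbert

        tc = 1000.0
        t = np.arange(0.0, 990.0, 1.0)
        tau = np.log(tc - t)                                  # ln(tc - t) variable
        f_lp = 1.02                                           # assumed log-frequency for the synthetic series
        noise = 0.001 * np.random.default_rng(10).standard_normal(t.size)
        price = 0.05 * np.cos(2 * np.pi * f_lp * tau) + noise

        analytic = hilbert(price - price.mean())              # analytic signal of the detrended series
        phase = np.unwrap(np.angle(analytic))
        slope = np.polyfit(tau, phase, 1)[0]                  # slope of phase vs. tau estimates 2*pi*f
        print("estimated log-frequency:", abs(slope) / (2 * np.pi))   # sign depends on the orientation of tau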

  9. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are point...
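
    A minimal sketch of the FFT route to a non-parametric correlation-function estimate (the Wiener-Khinchin approach): the periodogram of the zero-padded, mean-removed signal is inverted and renormalized to an unbiased autocorrelation estimate. The Random Decrement variant is not shown, and the test signal is an illustrative noise-driven decaying oscillation.

        import numpy as np

        def autocorrelation_fft(x, max_lag):
            """Unbiased estimate of the autocorrelation function via the FFT."""
            x = np.asarray(x, float) - np.mean(x)
            n = x.size
            nfft = int(2 ** np.ceil(np.log2(2 * n)))       # zero-pad to avoid circular wrap-around
            X = np.fft.rfft(x, nfft)
            acf = np.fft.irfft(X * np.conj(X))[:max_lag + 1]
            return acf / (n - np.arange(max_lag + 1))      # unbiased normalization per lag

        rng = np.random.default_rng(11)
        # Lightly damped oscillator response driven by white noise (illustrative).
        impulse = np.exp(-0.01 * np.arange(400)) * np.cos(0.2 * np.arange(400))
        x = np.convolve(rng.standard_normal(5000), impulse, mode="full")[:5000]
        R = autocorrelation_fft(x, max_lag=200)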

  10. Predicting Market Impact Costs Using Nonparametric Machine Learning Models.

    Directory of Open Access Journals (Sweden)

    Saerom Park

    Full Text Available Market impact cost is the most significant portion of implicit transaction costs that can reduce the overall transaction cost, although it cannot be measured directly. In this paper, we employed state-of-the-art nonparametric machine learning models: neural networks, Bayesian neural networks, Gaussian processes, and support vector regression, to predict market impact cost accurately and to provide a predictive model that is versatile in the number of variables. We collected a large amount of real single-transaction data of the US stock market from the Bloomberg Terminal and generated three independent input variables. As a result, most nonparametric machine learning models outperformed a state-of-the-art benchmark parametric model, the I-star model, in four error measures. Although these models encounter certain difficulties in separating the permanent and temporary cost directly, nonparametric machine learning models can be good alternatives in reducing transaction costs by considerably improving prediction performance.

  11. MEM spectral analysis for predicting influenza epidemics in Japan.

    Science.gov (United States)

    Sumi, Ayako; Kamo, Ken-ichi

    2012-03-01

    The prediction of influenza epidemics has long been the focus of attention in epidemiology and mathematical biology. In this study, we tested whether time series analysis was useful for predicting the incidence of influenza in Japan. The method of time series analysis we used consists of spectral analysis based on the maximum entropy method (MEM) in the frequency domain and the nonlinear least squares method in the time domain. Using this time series analysis, we analyzed the incidence data of influenza in Japan from January 1948 to December 1998; these data are unique in that they covered the periods of pandemics in Japan in 1957, 1968, and 1977. On the basis of the MEM spectral analysis, we identified the periodic modes explaining the underlying variations of the incidence data. The optimum least squares fitting (LSF) curve calculated with the periodic modes reproduced the underlying variation of the incidence data. An extension of the LSF curve could be used to predict the incidence of influenza quantitatively. Our study suggested that MEM spectral analysis would allow us to model temporal variations of influenza epidemics with multiple periodic modes much more effectively than by using the method of conventional time series analysis, which has been used previously to investigate the behavior of temporal variations in influenza data.
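
    A minimal sketch of MEM spectral estimation realized through Burg's algorithm, a standard way to compute a maximum entropy spectrum; this is not the authors' code, and the AR order, frequency grid and synthetic weekly series with an annual cycle are illustrative assumptions.

        import numpy as np

        def burg_ar(x, order):
            """Fit an AR(order) model by Burg's method (maximum entropy method)."""
            x = np.asarray(x, dtype=float)
            a = np.array([1.0])                 # AR polynomial, a[0] = 1
            E = np.dot(x, x) / x.size           # prediction error power
            f, b = x.copy(), x.copy()           # forward / backward prediction errors
            for _ in range(order):
                fm, bm = f[1:], b[:-1]
                k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))   # reflection coefficient
                a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
                f, b = fm + k * bm, bm + k * fm
                E *= 1.0 - k * k
            return a, E

        def mem_spectrum(a, E, freqs, fs=1.0):
            """Power spectral density of the fitted AR model at the given frequencies."""
            z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(a.size)) / fs)
            return (E / fs) / np.abs(z @ a) ** 2

        rng = np.random.default_rng(12)
        t = np.arange(52 * 10)                               # ten "years" of weekly counts
        x = np.sin(2 * np.pi * t / 52) + 0.5 * rng.standard_normal(t.size)
        a, E = burg_ar(x, order=20)
        freqs = np.linspace(0.005, 0.5, 500)
        psd = mem_spectrum(a, E, freqs)
        print("dominant period (weeks):", 1.0 / freqs[np.argmax(psd)])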

  12. Single molecule force spectroscopy at high data acquisition: A Bayesian nonparametric analysis

    Science.gov (United States)

    Sgouralis, Ioannis; Whitmore, Miles; Lapidus, Lisa; Comstock, Matthew J.; Pressé, Steve

    2018-03-01

    Bayesian nonparametrics (BNPs) are poised to have a deep impact in the analysis of single molecule data as they provide posterior probabilities over entire models consistent with the supplied data, not just model parameters of one preferred model. Thus they provide an elegant and rigorous solution to the difficult problem encountered when selecting an appropriate candidate model. Nevertheless, BNPs' flexibility to learn models and their associated parameters from experimental data is a double-edged sword. Most importantly, BNPs are prone to increasing the complexity of the estimated models due to artifactual features present in time traces. Thus, because of experimental challenges unique to single molecule methods, naive application of available BNP tools is not possible. Here we consider traces with time correlations and, as a specific example, we deal with force spectroscopy traces collected at high acquisition rates. While high acquisition rates are required in order to capture dwells in short-lived molecular states, in this setup, a slow response of the optical trap instrumentation (i.e., trapped beads, ambient fluid, and tethering handles) distorts the molecular signals introducing time correlations into the data that may be misinterpreted as true states by naive BNPs. Our adaptation of BNP tools explicitly takes into consideration these response dynamics, in addition to drift and noise, and makes unsupervised time series analysis of correlated single molecule force spectroscopy measurements possible, even at acquisition rates similar to or below the trap's response times.

  13. Spectral map-analysis: a method to analyze gene expression data

    OpenAIRE

    Bijnens, Luc J.M.; Lewi, Paul J.; Göhlmann, Hinrich W.; Molenberghs, Geert; Wouters, Luc

    2004-01-01

    bioinformatics; biplot; correspondence factor analysis; data mining; data visualization; gene expression data; microarray data; multivariate exploratory data analysis; principal component analysis; Spectral map analysis

  14. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis; Mouhot, Clé ment

    2011-01-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  16. Multi-sample nonparametric treatments comparison in medical ...

    African Journals Online (AJOL)

    Multi-sample nonparametric treatments comparison in medical follow-up study with unequal observation processes through simulation and bladder tumour case study. P. L. Tan, N.A. Ibrahim, M.B. Adam, J. Arasan ...

  17. Nonparametric regression using the concept of minimum energy

    International Nuclear Information System (INIS)

    Williams, Mike

    2011-01-01

    It has recently been shown that an unbinned distance-based statistic, the energy, can be used to construct an extremely powerful nonparametric multivariate two sample goodness-of-fit test. An extension to this method that makes it possible to perform nonparametric regression using multiple multivariate data sets is presented in this paper. The technique, which is based on the concept of minimizing the energy of the system, permits determination of parameters of interest without the need for parametric expressions of the parent distributions of the data sets. The application and performance of this new method is discussed in the context of some simple example analyses.
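
    The record does not give the regression algorithm itself, but the two-sample energy statistic it builds on is standard. A minimal sketch, assuming the two multivariate samples are 2-D NumPy arrays (rows are observations), computes the statistic and calibrates it with a permutation test:

        import numpy as np
        from scipy.spatial.distance import cdist

        def energy_statistic(x, y):
            """Two-sample energy statistic E = 2*E|X-Y| - E|X-X'| - E|Y-Y'|
            (simple V-statistic form; the within-sample means include the zero diagonal)."""
            return 2.0 * cdist(x, y).mean() - cdist(x, x).mean() - cdist(y, y).mean()

        def energy_perm_test(x, y, n_perm=999, seed=0):
            """Permutation p-value for the energy statistic (illustrative calibration)."""
            rng = np.random.default_rng(seed)
            obs = energy_statistic(x, y)
            pooled = np.vstack([x, y])
            n = len(x)
            count = 0
            for _ in range(n_perm):
                idx = rng.permutation(len(pooled))
                if energy_statistic(pooled[idx[:n]], pooled[idx[n:]]) >= obs:
                    count += 1
            return obs, (count + 1) / (n_perm + 1)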

  18. Nonparametric Bayesian inference for mean residual life functions in survival analysis.

    Science.gov (United States)

    Poynor, Valerie; Kottas, Athanasios

    2018-01-19

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life (MRL) function, which provides the expected remaining lifetime given that a subject has survived (i.e. is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the MRL function characterizes the survival distribution. We develop general Bayesian nonparametric inference for MRL functions built from a Dirichlet process mixture model for the associated survival distribution. The resulting model for the MRL function admits a representation as a mixture of the kernel MRL functions with time-dependent mixture weights. This model structure allows for a wide range of shapes for the MRL function. Particular emphasis is placed on the selection of the mixture kernel, taken to be a gamma distribution, to obtain desirable properties for the MRL function arising from the mixture model. The inference method is illustrated with a data set of two experimental groups and a data set involving right censoring. The supplementary material available at Biostatistics online provides further results on empirical performance of the model, using simulated data examples. © The Author 2018. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
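
    For reference, the mean residual life function and the mixture representation mentioned above can be written down directly (these are standard identities, not specific to the Dirichlet process mixture prior): with survival function S(t) = Pr(T > t),

        m(t) = \mathrm{E}[\,T - t \mid T > t\,] = \frac{\int_t^{\infty} S(u)\,du}{S(t)},

    and, when the fitted survival function is a mixture S(t) = \sum_k w_k S_k(t) with component MRL functions m_k(t),

        m(t) = \sum_k \frac{w_k\, S_k(t)}{\sum_j w_j\, S_j(t)}\; m_k(t),

    which is exactly the representation as a mixture of the kernel MRL functions with time-dependent weights noted in the abstract.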

  19. Effect on Prediction when Modeling Covariates in Bayesian Nonparametric Models.

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Rosner, Gary L; Müller, Peter; Stewart, Clinton F

    2013-04-01

    In biomedical research, it is often of interest to characterize biologic processes giving rise to observations and to make predictions of future observations. Bayesian nonparametric methods provide a means for carrying out Bayesian inference while making as few restrictive parametric assumptions as possible. There are several proposals in the literature for extending Bayesian nonparametric models to include dependence on covariates. Limited attention, however, has been directed to two aspects of such models: where the covariates enter and how that choice affects prediction. In this article, we examine the effect on fitting and predictive performance of incorporating covariates in a class of Bayesian nonparametric models in one of two primary ways: either in the weights or in the locations of a discrete random probability measure. We show that different strategies for incorporating continuous covariates in Bayesian nonparametric models can result in big differences when used for prediction, even though they lead to otherwise similar posterior inferences. When one needs the predictive density, as in optimal design, and this density is a mixture, it is better to make the weights depend on the covariates. We demonstrate these points via a simulated data example and in an application in which one wants to determine the optimal dose of an anticancer drug used in pediatric oncology.

  20. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis; namely our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes...... of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both...... instantaneous and convolutive mixing, and the inferred temporal patterns. Spatial maps are seen to capture smooth and localized stimuli-related components, and often identifiable noise components. The implementation is freely available as a GUI/SPM plugin, and we recommend using GPICA as an additional tool when...

  1. Essays on nonparametric econometrics of stochastic volatility

    NARCIS (Netherlands)

    Zu, Y.

    2012-01-01

    Volatility is a concept that describes the variation of financial returns. Measuring and modelling volatility dynamics is an important aspect of financial econometrics. This thesis is concerned with nonparametric approaches to volatility measurement and volatility model validation.

  2. The method of separation for evolutionary spectral density estimation of multi-variate and multi-dimensional non-stationary stochastic processes

    KAUST Repository

    Schillinger, Dominik

    2013-07-01

    The method of separation can be used as a non-parametric estimation technique, especially suitable for evolutionary spectral density functions of uniformly modulated and strongly narrow-band stochastic processes. The paper at hand provides a consistent derivation of method of separation based spectrum estimation for the general multi-variate and multi-dimensional case. The validity of the method is demonstrated by benchmark tests with uniformly modulated spectra, for which convergence to the analytical solution is demonstrated. The key advantage of the method of separation is the minimization of spectral dispersion due to optimum time- or space-frequency localization. This is illustrated by the calibration of multi-dimensional and multi-variate geometric imperfection models from strongly narrow-band measurements in I-beams and cylindrical shells. Finally, the application of the method of separation based estimates for the stochastic buckling analysis of the example structures is briefly discussed. © 2013 Elsevier Ltd.

  3. Bayesian Nonparametric Hidden Markov Models with application to the analysis of copy-number-variation in mammalian genomes.

    Science.gov (United States)

    Yau, C; Papaspiliopoulos, O; Roberts, G O; Holmes, C

    2011-01-01

    We consider the development of Bayesian Nonparametric methods for product partition models such as Hidden Markov Models and change point models. Our approach uses a Mixture of Dirichlet Process (MDP) model for the unknown sampling distribution (likelihood) for the observations arising in each state and a computationally efficient data augmentation scheme to aid inference. The method uses novel MCMC methodology which combines recent retrospective sampling methods with the use of slice sampler variables. The methodology is computationally efficient, both in terms of MCMC mixing properties, and robustness to the length of the time series being investigated. Moreover, the method is easy to implement requiring little or no user-interaction. We apply our methodology to the analysis of genomic copy number variation.

  4. Non-parametric Tuning of PID Controllers A Modified Relay-Feedback-Test Approach

    CERN Document Server

    Boiko, Igor

    2013-01-01

    The relay feedback test (RFT) has become a popular and efficient tool used in process identification and automatic controller tuning. Non-parametric Tuning of PID Controllers couples new modifications of classical RFT with application-specific optimal tuning rules to form a non-parametric method of test-and-tuning. Test and tuning are coordinated through a set of common parameters so that a PID controller can obtain the desired gain or phase margins in a system exactly, even with unknown process dynamics. The concept of process-specific optimal tuning rules in the nonparametric setup, with corresponding tuning rules for flow, level, pressure, and temperature control loops, is presented in the text. Common problems of tuning accuracy based on parametric and non-parametric approaches are addressed. In addition, the text treats the parametric approach to tuning based on the modified RFT approach and the exact model of oscillations in the system under test using the locus of a perturbed relay system (LPRS) meth...
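
    As background for the test-and-tuning idea, the classical (unmodified) relay feedback test estimates the ultimate gain from the observed limit cycle and then applies ultimate-cycle tuning rules. The sketch below shows only that classical recipe; it is not the book's modified RFT, and the Ziegler-Nichols-type constants are illustrative.

        import math

        def relay_pid_tuning(relay_amplitude, osc_amplitude, osc_period):
            """Classical relay-feedback-test tuning (not the book's modified RFT):
            describing-function estimate of the ultimate gain, then Ziegler-Nichols-type
            PID rules with illustrative constants."""
            ku = 4.0 * relay_amplitude / (math.pi * osc_amplitude)  # ultimate gain estimate
            tu = osc_period                                         # ultimate period
            return {"Kp": 0.6 * ku, "Ti": 0.5 * tu, "Td": 0.125 * tu}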

  5. Power spectral analysis of heart rate in hyperthyroidism.

    Science.gov (United States)

    Cacciatori, V; Bellavere, F; Pezzarossa, A; Dellera, A; Gemma, M L; Thomaseth, K; Castello, R; Moghetti, P; Muggeo, M

    1996-08-01

    The aim of the present study was to evaluate the impact of hyperthyroidism on the cardiovascular system by separately analyzing the sympathetic and parasympathetic influences on heart rate. Heart rate variability was evaluated by autoregressive power spectral analysis. This method allows a reliable quantification of the low frequency (LF) and high frequency (HF) components of the heart rate power spectral density; these are considered to be under mainly sympathetic and pure parasympathetic control, respectively. In 10 newly diagnosed untreated hyperthyroid patients with Graves' disease, we analyzed power spectral density of heart rate cyclic variations at rest, while lying, and while standing. In addition, heart rate variations during deep breathing, lying and standing, and Valsalva's maneuver were analyzed. The results were compared to those obtained from 10 age-, sex-, and body mass index-matched control subjects. In 8 hyperthyroid patients, the same evaluation was repeated after the induction of stable euthyroidism by methimazole. Heart rate power spectral analysis showed a sharp reduction of HF components in hyperthyroid subjects compared to controls [lying, 13.3 +/- 4.1 vs. 32.0 +/- 5.6 normalized units (NU)]. Conversely, the LF/HF ratio was higher in hyperthyroid subjects while both lying (11.3 +/- 4.5 vs. 3.5 +/- 1.1) and standing, and a further index of parasympathetic function was lower in hyperthyroid patients than in controls (1.12 +/- 0.03 vs. 1.31 +/- 0.04). These findings indicate reduced parasympathetic activity and, thus, a relative hypersympathetic tone.
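
    The LF and HF band powers referred to above are integrals of the heart rate power spectral density over conventional frequency bands. A minimal sketch follows; it uses Welch's periodogram as a simpler stand-in for the autoregressive estimator of the study, and the band limits (0.04-0.15 Hz and 0.15-0.40 Hz) are the conventional ones.

        import numpy as np
        from scipy.signal import welch

        def lf_hf_ratio(rr_ms, fs=4.0):
            """LF/HF ratio from an RR-interval series in milliseconds.  Welch's method
            stands in here for the autoregressive spectral estimator of the study."""
            t = np.cumsum(rr_ms) / 1000.0                  # beat times in seconds
            t_even = np.arange(t[0], t[-1], 1.0 / fs)       # resample to an even grid
            rr_even = np.interp(t_even, t, rr_ms)
            f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
            lf_band = (f >= 0.04) & (f < 0.15)
            hf_band = (f >= 0.15) & (f < 0.40)
            lf = np.trapz(pxx[lf_band], f[lf_band])
            hf = np.trapz(pxx[hf_band], f[hf_band])
            return lf, hf, lf / hf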

  6. Spectral analysis of full field digital mammography data

    International Nuclear Information System (INIS)

    Heine, John J.; Velthuizen, Robert P.

    2002-01-01

    The spectral content of mammograms acquired using a full field digital mammography (FFDM) system is analyzed. Fourier methods are used to show that the FFDM image power spectra obey an inverse power law; in an average sense, the images may be considered as 1/f fields. Two data representations are analyzed and compared: (1) the raw data, and (2) the logarithm of the raw data. Two methods are employed to analyze the power spectra: (1) a technique based on integrating the Fourier plane with octave ring sectioning developed previously, and (2) an approach based on integrating the Fourier plane using rings of constant width developed for this work. Both methods allow theoretical modeling. Numerical analysis indicates that the effects due to the transformation influence the power spectra measurements in a statistically significant manner in the high frequency range. However, this effect has little influence on the inverse power law estimation for a given image regardless of the data representation or the theoretical analysis approach. The analysis is presented from two points of view: (1) each image is treated independently with the results presented as distributions, and (2) for a given representation, the entire image collection is treated as an ensemble with the results presented as expected values. In general, the constant ring width analysis forms the foundation for a spectral comparison method for finding spectral differences, from an image distribution sense, after applying a nonlinear transformation to the data. The work also shows that power law estimation may be influenced due to the presence of noise in the higher frequency range, which is consistent with the known attributes of the detector efficiency. The spectral modeling and inverse power law determinations obtained here are in agreement with that obtained from the analysis of digitized film-screen images presented previously. The form of the power spectrum for a given image is approximately 1/f^2.
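
    The inverse-power-law estimation described above amounts to radially averaging the two-dimensional power spectrum and fitting a slope on a log-log scale. A simplified sketch of the constant-ring-width idea follows; the ring count and binning are illustrative, and this is not the paper's exact octave-sectioning procedure.

        import numpy as np

        def power_law_exponent(image, n_rings=64):
            """Radially averaged power spectrum of a 2-D image and a log-log slope fit;
            a simplified version of the constant-ring-width analysis described above."""
            img = image - image.mean()
            psd2 = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
            ny, nx = psd2.shape
            y, x = np.indices(psd2.shape)
            r = np.hypot(x - nx // 2, y - ny // 2)
            edges = np.linspace(1, min(nx, ny) // 2, n_rings + 1)
            freqs, power = [], []
            for lo, hi in zip(edges[:-1], edges[1:]):
                mask = (r >= lo) & (r < hi)
                if mask.any():
                    freqs.append(0.5 * (lo + hi))
                    power.append(psd2[mask].mean())
            slope, _ = np.polyfit(np.log(freqs), np.log(power), 1)
            return -slope        # exponent beta in P(f) ~ 1/f**beta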

  7. Nonparametric Bayes Modeling of Multivariate Categorical Data.

    Science.gov (United States)

    Dunson, David B; Xing, Chuanhua

    2012-01-01

    Modeling of multivariate unordered categorical (nominal) data is a challenging problem, particularly in high dimensions and cases in which one wishes to avoid strong assumptions about the dependence structure. Commonly used approaches rely on the incorporation of latent Gaussian random variables or parametric latent class models. The goal of this article is to develop a nonparametric Bayes approach, which defines a prior with full support on the space of distributions for multiple unordered categorical variables. This support condition ensures that we are not restricting the dependence structure a priori. We show this can be accomplished through a Dirichlet process mixture of product multinomial distributions, which is also a convenient form for posterior computation. Methods for nonparametric testing of violations of independence are proposed, and the methods are applied to model positional dependence within transcription factor binding motifs.

  8. Global spectral graph wavelet signature for surface analysis of carpal bones

    Science.gov (United States)

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A.

    2018-02-01

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry-invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly-available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework performs much better than the global point signature embedding approach for comparing shapes of the carpal bones across populations.

  9. Multi-spectral Image Analysis for Astaxanthin Coating Classification

    DEFF Research Database (Denmark)

    Ljungqvist, Martin Georg; Ersbøll, Bjarne Kjær; Nielsen, Michael Engelbrecht

    2011-01-01

    Industrial quality inspection using image analysis on astaxanthin coating in aquaculture feed pellets is of great importance for automatic production control. In this study multi-spectral image analysis of pellets was performed using LDA, QDA, SNV and PCA on pixel level and mean value of pixels...

  10. Multitaper spectral analysis of atmospheric radar signals

    Directory of Open Access Journals (Sweden)

    V. K. Anandan

    2004-11-01

    Full Text Available Multitaper spectral analysis using sinusoidal tapers has been carried out on the backscattered signals received from the troposphere and lower stratosphere by the Gadanki Mesosphere-Stratosphere-Troposphere (MST) radar under various conditions of the signal-to-noise ratio. A comparison is made between sinusoidal tapers of order three and the single Hanning and rectangular tapers, to understand the relative merits of processing under each scheme. Power spectra plots show that echoes are better identified with multitaper estimation, especially in regions of weak signal-to-noise ratio. Further analysis is carried out to obtain the three lower-order moments from the three estimation techniques. The results show that multitaper analysis gives a better signal-to-noise ratio, i.e. higher detectability. The multitaper and single-taper spectral analyses are also examined for consistency of measurement; the results show that the multitaper estimate is more consistent in Doppler measurements than the single-taper estimates. Doppler width measurements with the different approaches were also studied, and the results show that estimation was better with the multitaper technique in terms of temporal resolution and estimation accuracy.
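
    A minimal sketch of multitaper estimation with sinusoidal (sine) tapers is shown below, assuming a real-valued time series in a NumPy array; it simply averages the tapered periodograms and omits all radar-specific processing.

        import numpy as np

        def sine_multitaper_psd(x, n_tapers=3):
            """Schematic multitaper spectrum estimate with sinusoidal (sine) tapers:
            average the periodograms of the series weighted by each unit-energy taper."""
            n = len(x)
            k = np.arange(1, n_tapers + 1)[:, None]
            t = np.arange(1, n + 1)[None, :]
            tapers = np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * k * t / (n + 1))
            x0 = x - x.mean()
            spectra = np.abs(np.fft.rfft(tapers * x0[None, :], axis=1)) ** 2
            return spectra.mean(axis=0)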

  11. Geostatistical radar-raingauge combination with nonparametric correlograms: methodological considerations and application in Switzerland

    Science.gov (United States)

    Schiemann, R.; Erdin, R.; Willi, M.; Frei, C.; Berenguer, M.; Sempere-Torres, D.

    2011-05-01

    Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse realtime network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation
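
    The key step described above - estimating a nonparametric correlogram from a spatially complete field and combining it with the sample variance to obtain a semivariogram - can be sketched as follows. This is a schematic FFT-based (biased) estimator, not the operational implementation used for the Swiss analysis.

        import numpy as np

        def nonparametric_semivariogram(field):
            """Correlogram of a spatially complete field (e.g. a radar composite) via FFT,
            combined with the sample variance to give gamma(h) = s^2 * (1 - rho(h)).
            The FFT autocovariance is the simple biased estimator; zero-padding limits
            periodic wrap-around."""
            f = field - field.mean()
            n0, n1 = f.shape
            F = np.fft.fft2(f, s=(2 * n0, 2 * n1))
            acov = np.real(np.fft.ifft2(F * np.conj(F)))[:n0, :n1] / f.size
            rho = acov / acov[0, 0]            # nonparametric correlogram by lag
            gamma = field.var() * (1.0 - rho)  # semivariogram estimate
            return rho, gamma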

  12. Processing of spectral X-ray data with principal components analysis

    CERN Document Server

    Butler, A P H; Cook, N J; Butzer, J; Schleich, N; Tlustos, L; Scott, N; Grasset, R; de Ruiter, N; Anderson, N G

    2011-01-01

    The goal of the work was to develop a general method for processing spectral x-ray image data. Principal component analysis (PCA) is a well understood technique for multivariate data analysis and so was investigated. To assess this method, spectral (multi-energy) computed tomography (CT) data was obtained using a Medipix2 detector in a MARS-CT (Medipix All Resolution System). PCA was able to separate bone (calcium) from two elements with k-edges in the X-ray spectrum used (iodine and barium) within a mouse. This has potential clinical application in dual-energy CT systems and future Medipix3 based spectral imaging where up to eight energies can be recorded simultaneously with excellent energy resolution. (c) 2010 Elsevier B.V. All rights reserved.
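
    A generic sketch of the PCA step is shown below, assuming the spectral CT data are available as a stack with one image per energy bin; it is not tied to the Medipix/MARS pipeline.

        import numpy as np

        def spectral_pca(stack, n_components=3):
            """Generic PCA of a multi-energy image stack of shape (n_energies, ny, nx):
            each pixel is treated as an observation with one value per energy bin."""
            n_e, ny, nx = stack.shape
            X = stack.reshape(n_e, -1).T.astype(float)      # (pixels, energies)
            X -= X.mean(axis=0)
            _, _, vt = np.linalg.svd(X, full_matrices=False)
            scores = X @ vt[:n_components].T                # component scores per pixel
            return scores.T.reshape(n_components, ny, nx), vt[:n_components]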

  13. Multivariate spectral-analysis of movement-related EEG data

    International Nuclear Information System (INIS)

    Andrew, C. M.

    1997-01-01

    The univariate method of event-related desynchronization (ERD) analysis, which quantifies the temporal evolution of power within specific frequency bands from electroencephalographic (EEG) data recorded during a task or event, is extended to an event-related multivariate spectral analysis method. With this method, time courses of cross-spectra, phase spectra, coherence spectra, band-averaged coherence values (event-related coherence, ERCoh), partial power spectra and partial coherence spectra are estimated from an ensemble of multivariate event-related EEG trials. This provides a means of investigating relationships between EEG signals recorded over different scalp areas during the performance of a task or the occurrence of an event. The multivariate spectral analysis method is applied to EEG data recorded during three different movement-related studies involving discrete right index finger movements. The first study investigates the impact of the EEG derivation type on the temporal evolution of interhemispheric coherence between activity recorded at electrodes overlying the left and right sensorimotor hand areas during cued finger movement. This raises the question of whether changes in coherence necessarily reflect changes in functional coupling of the cortical structures underlying the recording electrodes. In the second study, the method is applied to data recorded during voluntary finger movement, and a hypothesis, based on an existing global/local model of neocortical dynamics, is formulated to explain the coherence results. The third study applies partial spectral analysis to, and investigates phase relationships of, movement-related data recorded from a full head montage, thereby providing further results strengthening the global/local hypothesis. (author)

  14. Stochastic semi-nonparametric frontier estimation of electricity distribution networks: Application of the StoNED method in the Finnish regulatory model

    International Nuclear Information System (INIS)

    Kuosmanen, Timo

    2012-01-01

    Electricity distribution network is a prime example of a natural local monopoly. In many countries, electricity distribution is regulated by the government. Many regulators apply frontier estimation techniques such as data envelopment analysis (DEA) or stochastic frontier analysis (SFA) as an integral part of their regulatory framework. While more advanced methods that combine nonparametric frontier with stochastic error term are known in the literature, in practice, regulators continue to apply simplistic methods. This paper reports the main results of the project commissioned by the Finnish regulator for further development of the cost frontier estimation in their regulatory framework. The key objectives of the project were to integrate a stochastic SFA-style noise term to the nonparametric, axiomatic DEA-style cost frontier, and to take the heterogeneity of firms and their operating environments better into account. To achieve these objectives, a new method called stochastic nonparametric envelopment of data (StoNED) was examined. Based on the insights and experiences gained in the empirical analysis using the real data of the regulated networks, the Finnish regulator adopted the StoNED method in use from 2012 onwards.

  15. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    Science.gov (United States)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results support the fact that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.

  16. Terahertz Josephson spectral analysis and its applications

    Science.gov (United States)

    Snezhko, A. V.; Gundareva, I. I.; Lyatti, M. V.; Volkov, O. Y.; Pavlovskiy, V. V.; Poppe, U.; Divin, Y. Y.

    2017-04-01

    Principles of Hilbert-transform spectral analysis (HTSA) are presented and advantages of the technique in the terahertz (THz) frequency range are discussed. THz HTSA requires Josephson junctions with high values of the characteristic voltage IcRn and dynamics described by a simple resistively shunted junction (RSJ) model. To meet these requirements, [001]- and [100]-tilt YBa2Cu3O7-x bicrystal junctions with deviations from the RSJ model less than 1% have been developed. Demonstrators of Hilbert-transform spectrum analyzers with various cryogenic environments, including integration into Stirling coolers, are described. Spectrum analyzers have been characterized in the spectral range from 50 GHz to 3 THz. Within a power dynamic range of five orders of magnitude, the instrumental function of the analyzers has been found to have a Lorentzian form around a single frequency of 1.48 THz with a spectral resolution as low as 0.9 GHz. Spectra of THz radiation from optically pumped gas lasers and semiconductor frequency multipliers have been studied with these spectrum analyzers and the regimes of these radiation sources were optimized for single-frequency operation. Future applications of HTSA will involve quick and precise spectral characterization of new radiation sources and identification of substances in the THz frequency range.

  17. Antepartum Fetal Monitoring and Spectral Analysis of Preterm Birth Risk

    Science.gov (United States)

    Păsăricără, Alexandru; Nemescu, Dragoş; Arotăriţei, Dragoş; Rotariu, Cristian

    2017-11-01

    The monitoring and analysis of antepartum fetal and maternal recordings is a research area of notable interest due to the relatively high rate of preterm birth. The interest stems from the improvement of devices used for monitoring. The current paper presents the spectral analysis of antepartum heart rate recordings conducted during a study in Romania at the Cuza Voda Obstetrics and Gynecology Clinical Hospital from Iasi between 2010 and 2014. The study focuses on normal and preterm birth risk subjects in order to determine differences between these two types of recordings in terms of spectral analysis.

  18. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  19. CATDAT : A Program for Parametric and Nonparametric Categorical Data Analysis : User's Manual Version 1.0, 1998-1999 Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-12-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical response data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.

  20. Nonparametric instrumental regression with non-convex constraints

    International Nuclear Information System (INIS)

    Grasmair, M; Scherzer, O; Vanhems, A

    2013-01-01

    This paper considers the nonparametric regression model with an additive error that is dependent on the explanatory variables. As is common in empirical studies in epidemiology and economics, it also supposes that valid instrumental variables are observed. A classical example in microeconomics considers the consumer demand function as a function of the price of goods and the income, both variables often considered as endogenous. In this framework, the economic theory also imposes shape restrictions on the demand function, such as integrability conditions. Motivated by this illustration in microeconomics, we study an estimator of a nonparametric constrained regression function using instrumental variables by means of Tikhonov regularization. We derive rates of convergence for the regularized model both in a deterministic and stochastic setting under the assumption that the true regression function satisfies a projected source condition including, because of the non-convexity of the imposed constraints, an additional smallness condition. (paper)

  2. Nonparametric conditional predictive regions for time series

    NARCIS (Netherlands)

    de Gooijer, J.G.; Zerom Godefay, D.

    2000-01-01

    Several nonparametric predictors based on the Nadaraya-Watson kernel regression estimator have been proposed in the literature. They include the conditional mean, the conditional median, and the conditional mode. In this paper, we consider three types of predictive regions for these predictors — the

  3. Berkeley SuperNova Ia Program (BSNIP): Initial Spectral Analysis

    Science.gov (United States)

    Silverman, Jeffrey; Kong, J.; Ganeshalingam, M.; Li, W.; Filippenko, A. V.

    2011-01-01

    The Berkeley SuperNova Ia Program (BSNIP) has been observing nearby (low-redshift) Type Ia supernovae. The initial analysis of this dataset consists of accurately and robustly measuring the strength and position of various spectral features near maximum brightness. We determine the endpoints, pseudo-continuum, expansion velocity, equivalent width, and depth of each major feature observed in our wavelength range. For objects with multiple spectra near maximum brightness we investigate how these values change with time. From these measurements we also calculate velocity gradients and various flux ratios within a given spectrum which will allow us to explore correlations between spectral and photometric observables. Some possible correlations have been studied previously, but our dataset is unique in how self-consistent the data reduction and spectral feature measurements have been, and it is a factor of a few larger than most earlier studies. We will briefly summarize the contents of the full dataset as an introduction to our initial analysis. Some of our measurements of SN Ia spectral features, along with a few initial results from those measurements, will be presented. Finally, we will comment on our current progress and planned future work. We gratefully acknowledge the financial support of NSF grant AST-0908886, the TABASGO Foundation, and the Marc J. Staley Graduate Fellowship in Astronomy.

  4. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability (IS) and intradaily variability (IV), used to describe the rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by varying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) across the time intervals. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using the IVm variable, whereas the variable IV60 did not detect this difference. Rhythmic synchronization of activity and rest was significantly higher in young subjects than in adults with Parkinson's disease when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization.
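
    The two base variables can be computed from an equally spaced activity series with the standard nonparametric formulas; the sketch below gives IS and IV in their usual form for hourly counts, while the ISm/IVm variants discussed above would average these values over several bin lengths.

        import numpy as np

        def interdaily_stability(x, period=24):
            """IS: variance of the average daily profile relative to the total variance
            (standard definition; period=24 assumes hourly activity counts)."""
            x = np.asarray(x, dtype=float)
            n = (len(x) // period) * period
            x = x[:n]
            hourly_mean = x.reshape(-1, period).mean(axis=0)
            return (len(x) * np.sum((hourly_mean - x.mean()) ** 2)) / (
                period * np.sum((x - x.mean()) ** 2))

        def intradaily_variability(x):
            """IV: mean squared first difference relative to the total variance."""
            x = np.asarray(x, dtype=float)
            return (len(x) * np.sum(np.diff(x) ** 2)) / (
                (len(x) - 1) * np.sum((x - x.mean()) ** 2))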

  5. A multi-instrument non-parametric reconstruction of the electron pressure profile in the galaxy cluster CLJ1226.9+3332

    Science.gov (United States)

    Romero, C.; McWilliam, M.; Macías-Pérez, J.-F.; Adam, R.; Ade, P.; André, P.; Aussel, H.; Beelen, A.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; de Petris, M.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Lagache, G.; Leclercq, S.; Lestrade, J.-F.; Mauskopf, P.; Mayet, F.; Monfardini, A.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Revéret, V.; Ritacco, A.; Roussel, H.; Ruppin, F.; Schuster, K.; Sievers, A.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2018-04-01

    Context. In the past decade, sensitive, resolved Sunyaev-Zel'dovich (SZ) studies of galaxy clusters have become common. Whereas many previous SZ studies have parameterized the pressure profiles of galaxy clusters, non-parametric reconstructions will provide insights into the thermodynamic state of the intracluster medium. Aim. We seek to recover the non-parametric pressure profiles of the high redshift (z = 0.89) galaxy cluster CLJ 1226.9+3332 as inferred from SZ data from the MUSTANG, NIKA, Bolocam, and Planck instruments, which all probe different angular scales. Methods: Our non-parametric algorithm makes use of logarithmic interpolation, which under the assumption of ellipsoidal symmetry is analytically integrable. For MUSTANG, NIKA, and Bolocam we derive a non-parametric pressure profile independently and find good agreement among the instruments. In particular, we find that the non-parametric profiles are consistent with a fitted generalized Navarro-Frenk-White (gNFW) profile. Given the ability of Planck to constrain the total signal, we include a prior on the integrated Compton Y parameter as determined by Planck. Results: For a given instrument, constraints on the pressure profile diminish rapidly beyond the field of view. The overlap in spatial scales probed by these four datasets is therefore critical in checking for consistency between instruments. By using multiple instruments, our analysis of CLJ 1226.9+3332 covers a large radial range, from the central regions (roughly 0.05 R500) to the cluster outskirts, and illustrates what can be expected from the next generation of SZ instruments such as NIKA2 and MUSTANG2.

  6. Screen Wars, Star Wars, and Sequels: Nonparametric Reanalysis of Movie Profitability

    OpenAIRE

    W. D. Walls

    2012-01-01

    In this paper we use nonparametric statistical tools to quantify motion-picture profit. We quantify the unconditional distribution of profit, the distribution of profit conditional on stars and sequels, and we also model the conditional expectation of movie profits using a non-parametric data-driven regression model. The flexibility of the non-parametric approach accommodates the full range of possible relationships among the variables without prior specification of a functional form, thereb...

  7. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function for a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere.

  8. Nonparametric estimation in models for unobservable heterogeneity

    OpenAIRE

    Hohmann, Daniel

    2014-01-01

    Nonparametric models which allow for data with unobservable heterogeneity are studied. The first publication introduces new estimators and their asymptotic properties for conditional mixture models. The second publication considers estimation of a function from noisy observations of its Radon transform in a Gaussian white noise model.

  9. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.; Lombard, F.

    2012-01-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal

  10. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  11. Spectral analysis of growing graphs a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...

  12. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance for protecting the environment and reducing economic losses, yet predicting grasshopper outbreaks accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitats is explained, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis are reviewed; the traditional methods are also compared with spectral technology. Remote sensing has been applied to monitor the living, growing and breeding habitats of grasshopper populations, and can be used to develop forecast models in combination with GIS. NDVI values derived from the remote sensing data can be used in grasshopper forecasting. Hyperspectral remote sensing, which can monitor grasshoppers more precisely, has advantages in measuring the degree of damage and classifying damaged areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to relate the characteristic parameters of hyperspectral data to the leaf area index (LAI) and to indicate the intensity of grasshopper damage. Near-infrared reflectance spectroscopy has been employed to identify grasshopper species, examine species occurrences and monitor hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample-based research. It is concluded that spectral analysis techniques can serve as a quick and accurate tool for monitoring and forecasting grasshopper infestations, and will become an important means in this kind of research because of their advantages in determining spatial position and in information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
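
    For reference, the NDVI mentioned above is computed from near-infrared and red reflectance with the standard formula; the sketch below assumes the two bands are available as arrays of reflectance values.

        import numpy as np

        def ndvi(nir, red, eps=1e-9):
            """Normalized Difference Vegetation Index from near-infrared and red reflectance."""
            nir = np.asarray(nir, dtype=float)
            red = np.asarray(red, dtype=float)
            return (nir - red) / (nir + red + eps)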

  13. Spectral analysis of an algebraic collapsing acceleration for the characteristics method

    International Nuclear Information System (INIS)

    Le Tellier, R.; Hebert, A.

    2005-01-01

    A spectral analysis of a diffusion synthetic acceleration called Algebraic Collapsing Acceleration (ACA) was carried out in the context of the characteristics method to solve the neutron transport equation. Two analyses were performed in order to assess the ACA performance. Both a standard Fourier analysis in a periodic and infinite slab-geometry and a direct spectral analysis for a finite slab-geometry were investigated. In order to evaluate its performance, ACA was compared with two competing techniques used to accelerate the convergence of the characteristics method, the Self-Collision Re-balancing technique and the Asymptotic Synthetic Acceleration. In the restricted framework of 1-dimensional slab-geometries, we conclude that ACA offers a good compromise between the reduction of the spectral radius of the iterative matrix and the resources to construct, store and solve the corrective system. A comparison on a monoenergetic 2-dimensional benchmark was performed and tends to confirm these conclusions. (authors)

  14. Cloud Masking for Remotely Sensed Data Using Spectral and Principal Components Analysis

    Directory of Open Access Journals (Sweden)

    A. Ahmad

    2012-06-01

    Full Text Available Two methods of cloud masking tuned to tropical conditions have been developed, based on spectral analysis and Principal Components Analysis (PCA) of Moderate Resolution Imaging Spectroradiometer (MODIS) data. In the spectral approach, thresholds were applied to four reflective bands (1, 2, 3, and 4), three thermal bands (29, 31 and 32), the band 2/band 1 ratio, and the difference between bands 29 and 31 in order to detect clouds. The PCA approach applied a threshold to the first principal component derived from the seven quantities used for spectral analysis. Cloud detections were compared with the standard MODIS cloud mask, and their accuracy was assessed using reference images and geographical information on the study area.

  15. Examples of the Application of Nonparametric Information Geometry to Statistical Physics

    Directory of Open Access Journals (Sweden)

    Giovanni Pistone

    2013-09-01

    Full Text Available We review a nonparametric version of Amari’s information geometry in which the set of positive probability densities on a given sample space is endowed with an atlas of charts to form a differentiable manifold modeled on Orlicz Banach spaces. This nonparametric setting is used to discuss the setting of typical problems in machine learning and statistical physics, such as black-box optimization, Kullback-Leibler divergence, Boltzmann-Gibbs entropy and the Boltzmann equation.

  16. [Estimation of Hunan forest carbon density based on spectral mixture analysis of MODIS data].

    Science.gov (United States)

    Yan, En-ping; Lin, Hui; Wang, Guang-xing; Chen, Zhen-xiong

    2015-11-01

    With the fast development of remote sensing technology, combining forest inventory sample plot data and remotely sensed images has become a widely used method to map forest carbon density. However, the existence of mixed pixels often impedes the improvement of forest carbon density mapping, especially when low spatial resolution images such as MODIS are used. In this study, MODIS images and national forest inventory sample plot data were used to estimate forest carbon density. Linear spectral mixture analysis with and without constraints, and nonlinear spectral mixture analysis, were compared for deriving the fractions of different land use and land cover (LULC) types. A sequential Gaussian co-simulation algorithm, with and without the fraction images from the spectral mixture analyses, was then employed to estimate the forest carbon density of Hunan Province. Results showed that 1) constrained linear spectral mixture analysis, with a mean RMSE of 0.002, estimated the fractions of LULC types more accurately than the unconstrained linear and the nonlinear spectral mixture analyses; 2) integrating the spectral mixture analysis model with the sequential Gaussian co-simulation algorithm increased the estimation accuracy of forest carbon density from 74.1% to 81.5% and decreased the RMSE from 7.26 to 5.18; and 3) the mean forest carbon density for the province was 30.06 t·hm^-2, ranging from 0.00 to 67.35 t·hm^-2. This implies that spectral mixture analysis has great potential to increase the estimation accuracy of forest carbon density at regional and global levels.
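
    Constrained linear spectral mixture analysis solves, for each pixel, a least-squares problem whose class fractions are non-negative and sum to one. A minimal sketch is given below; the endmember matrix is an assumed input, and the sum-to-one constraint is imposed approximately through a heavily weighted extra row, a common simplification rather than the authors' exact implementation.

        import numpy as np
        from scipy.optimize import lsq_linear

        def unmix_pixel(endmembers, pixel, weight=1e3):
            """Fully constrained linear unmixing of one pixel (schematic).

            endmembers : (n_bands, n_classes) matrix of pure-class spectra (assumed known)
            pixel      : (n_bands,) observed spectrum
            Non-negativity comes from the bounds; the sum-to-one constraint is enforced
            approximately by the appended, heavily weighted row of ones."""
            A = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
            b = np.concatenate([pixel, [weight]])
            res = lsq_linear(A, b, bounds=(0.0, 1.0))
            return res.x      # estimated class fractions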

  17. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature which rely on notions of distances or densities, this approach introduces a novel concept based on local quadratic entropy for evaluating the similarity of a data object with its neighbors. This information theoretic quantity is used to regularize the closeness amongst data instances and subsequently benefits the process of mapping data into a usually lower dimensional space. Outliers are then identified by spectral analysis of the eigenspace spanned by the set of leading eigenvectors derived from the mapping procedure. The proposed technique is purely data-driven and imposes no assumptions regarding ...

  18. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  19. Monte-Carlo error analysis in x-ray spectral deconvolution

    International Nuclear Information System (INIS)

    Shirk, D.G.; Hoffman, N.M.

    1985-01-01

    The deconvolution of spectral information from sparse x-ray data is a widely encountered problem in data analysis. An often-neglected aspect of this problem is the propagation of random error in the deconvolution process. We have developed a Monte-Carlo approach that enables us to attach error bars to unfolded x-ray spectra. Our Monte-Carlo error analysis has been incorporated into two specific deconvolution techniques: the first is an iterative convergent weight method; the second is a singular-value-decomposition (SVD) method. These two methods were applied to an x-ray spectral deconvolution problem having m channels of observations with n points in energy space. When m is less than n, this problem has no unique solution. We discuss the systematics of nonunique solutions and energy-dependent error bars for both methods. The Monte-Carlo approach has a particular benefit in relation to the SVD method: It allows us to apply the constraint of spectral nonnegativity after the SVD deconvolution rather than before. Consequently, we can identify inconsistencies between different detector channels
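
    The Monte-Carlo idea described above can be sketched in a few lines: perturb the measured channel data within their uncertainties, unfold each realization (here with a truncated-SVD least-squares solve standing in for the paper's two deconvolution techniques), and take the spread of the unfolded spectra as the error bars. The function names and truncation level are illustrative.

        import numpy as np

        def mc_unfold_errors(R, d, d_err, n_trials=500, rcond=0.05):
            """Illustrative Monte-Carlo error bars for an SVD-unfolded spectrum.

            R      : (m, n) response matrix mapping spectrum -> channel data
            d      : (m,) measured channel data
            d_err  : (m,) 1-sigma uncertainties of the data
            rcond  : singular values below rcond * s_max are truncated in the solve"""
            rng = np.random.default_rng(0)
            unfolds = []
            for _ in range(n_trials):
                d_k = rng.normal(d, d_err)               # perturb data within errors
                x_k = np.linalg.lstsq(R, d_k, rcond=rcond)[0]
                unfolds.append(np.clip(x_k, 0.0, None))  # nonnegativity applied after unfolding
            unfolds = np.array(unfolds)
            return unfolds.mean(axis=0), unfolds.std(axis=0)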

  20. Nonparametric predictive inference in statistical process control

    NARCIS (Netherlands)

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  1. Spectral analysis of the structure of ultradispersed diamonds

    Science.gov (United States)

    Uglov, V. V.; Shimanski, V. I.; Rusalsky, D. P.; Samtsov, M. P.

    2008-07-01

    The structure of ultradispersed diamonds (UDD) is studied by spectral methods. The presence of a diamond crystal phase in the UDD is established by x-ray analysis and Raman spectra. The Raman spectra also show sp2- and sp3-hybridized carbon. Analysis of IR absorption spectra suggests that the composition of functional groups present in the particles changes during the treatment.

  2. Field-scale sensitivity of vegetation discrimination to hyperspectral reflectance and coupled statistics

    DEFF Research Database (Denmark)

    Manevski, Kiril; Jabloun, Mohamed; Gupta, Manika

    2016-01-01

    Remote sensing of land covers utilizes an increasing number of methods for spectral reflectance processing and its accompanying statistics to discriminate between the covers’ spectral signatures at various scales. To this end, the present chapter deals with the field-scale sensitivity of the vegetation spectral discrimination to the most common types of reflectance (unaltered and continuum-removed) and statistical tests (parametric and nonparametric analysis of variance). It is divided into two distinct parts. The first part summarizes the current knowledge in relation to vegetation ... a more powerful input to a nonparametric analysis for discrimination at the field scale, when compared with unaltered reflectance and parametric analysis. However, the discrimination outputs interact and are very sensitive to the number of observations - an important implication for the design ...

  3. Bayesian nonparametric modeling for comparison of single-neuron firing intensities.

    Science.gov (United States)

    Kottas, Athanasios; Behseta, Sam

    2010-03-01

    We propose a fully inferential model-based approach to the problem of comparing the firing patterns of a neuron recorded under two distinct experimental conditions. The methodology is based on nonhomogeneous Poisson process models for the firing times of each condition with flexible nonparametric mixture prior models for the corresponding intensity functions. We demonstrate posterior inferences from a global analysis, which may be used to compare the two conditions over the entire experimental time window, as well as from a pointwise analysis at selected time points to detect local deviations of firing patterns from one condition to another. We apply our method on two neurons recorded from the primary motor cortex area of a monkey's brain while performing a sequence of reaching tasks.

  4. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    Science.gov (United States)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses the reflection of microintonation and spectral properties in male and female acted emotional speech. The microintonation component of speech melody is analyzed with regard to its spectral and statistical parameters. According to psychological research on emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness, according to which high-frequency noise is mixed into voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger) and a neutral state for comparison. Calculated histograms of the spectral flatness distribution are visually compared and modelled by a Gamma probability distribution. Histograms of the cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. The statistical results show good correlation between male and female voices for all emotional states portrayed by several Czech and Slovak professional actors.
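
    The spectral-flatness statistics discussed above can be illustrated with a short, hedged sketch: frame-wise spectral flatness of a synthetic voiced-like signal, a Gamma fit to its distribution, and skewness/kurtosis summaries. The frame length, window and test signal are assumptions for illustration, not the authors' settings.

        import numpy as np
        from scipy import stats

        def spectral_flatness(frame, eps=1e-12):
            """Geometric mean over arithmetic mean of the frame's power spectrum (values in (0, 1])."""
            power = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2 + eps
            return np.exp(np.mean(np.log(power))) / np.mean(power)

        fs = 16000
        t = np.arange(fs) / fs                      # one second of a crude "voiced" signal
        signal = np.sin(2 * np.pi * 150 * t) + 0.3 * np.random.default_rng(1).normal(size=fs)

        frame_len = 512
        flatness = np.array([spectral_flatness(signal[i:i + frame_len])
                             for i in range(0, len(signal) - frame_len, frame_len)])

        # model the flatness distribution with a Gamma law and summarize histogram shape
        shape, loc, scale = stats.gamma.fit(flatness, floc=0.0)
        print("Gamma shape/scale:", shape, scale)
        print("skewness:", stats.skew(flatness), "kurtosis:", stats.kurtosis(flatness))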

  5. Multivariate nonparametric regression and visualization with R and applications to finance

    CERN Document Server

    Klemelä, Jussi

    2014-01-01

    A modern approach to statistical learning and its applications through visualization methods With a unique and innovative presentation, Multivariate Nonparametric Regression and Visualization provides readers with the core statistical concepts to obtain complete and accurate predictions when given a set of data. Focusing on nonparametric methods to adapt to the multiple types of data generating mechanisms, the book begins with an overview of classification and regression. The book then introduces and examines various tested and proven visualization techniques for learning samples and functio

  6. Spectral analysis of Floating Car Data

    OpenAIRE

    Gössel, F.; Michler, E.; Wrase, B.

    2003-01-01

    Floating Car Data (FCD) are one important data source in traffic telematic systems. The original variable in these systems is the vehicle velocity. The paper analyses the measured value "vehicle velocity" by methods of information technology. Consequences for processing, transmission and storage of FCD under condition of limited resources are discussed. Starting point of the investigation is the analysis of spectral characteristics of velocity-time-profiles. The spectra are determined by...

  7. Robust variable selection method for nonparametric differential equation models with application to nonlinear dynamic gene regulatory network analysis.

    Science.gov (United States)

    Lu, Tao

    2016-01-01

    The gene regulation network (GRN) evaluates the interactions between genes and looks for models to describe the gene expression behavior. These models have many applications; for instance, by characterizing the gene expression mechanisms that cause certain disorders, it would be possible to target those genes to block the progress of the disease. Many biological processes are driven by nonlinear dynamic GRNs. In this article, we propose a nonparametric ordinary differential equation (ODE) model for the nonlinear dynamic GRN. Specifically, we address the following questions simultaneously: (i) extract information from noisy time course gene expression data; (ii) model the nonlinear ODE through a nonparametric smoothing function; (iii) identify the important regulatory gene(s) through a group smoothly clipped absolute deviation (SCAD) approach; (iv) test the robustness of the model against possible shortening of experimental duration. We illustrate the usefulness of the model and the associated statistical methods through a simulation example and a real application.

  8. Evaluation of abrasive waterjet produced titan surfaces topography by spectral analysis techniques

    Directory of Open Access Journals (Sweden)

    D. Kozak

    2012-01-01

    Full Text Available An experimental study of a titanium grade 2 surface topography prepared by abrasive waterjet cutting is performed using methods of spectral analysis. Topographic data are acquired by means of the optical profilometer MicroProf® FRT. Estimation of the areal power spectral density of the studied surface is carried out using the periodogram method combined with Welch's method. Attention is paid to the structure of the areal power spectral density, which is characterized by means of the angular power spectral density. This structure of the areal spectral density is linked to the fine texture of the surface studied.
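
    A minimal sketch of the analysis idea above, under stated assumptions: the areal power spectral density of a height map is estimated by averaging windowed 2D periodograms of overlapping sub-blocks (a 2D analogue of Welch's method), and then collapsed onto angle bins to expose directional texture. Block size, window and the toy surface are assumptions, not the paper's settings.

        import numpy as np

        def areal_psd(z, dx, block=64, step=32):
            """Welch-style areal PSD: average windowed 2D periodograms of overlapping blocks."""
            win = np.outer(np.hanning(block), np.hanning(block))
            scale = dx * dx / (win ** 2).sum()          # 2D analogue of the 1D Welch scaling
            psds = []
            for i in range(0, z.shape[0] - block + 1, step):
                for j in range(0, z.shape[1] - block + 1, step):
                    blk = z[i:i + block, j:j + block]
                    blk = blk - blk.mean()              # remove the local mean height
                    psds.append(scale * np.abs(np.fft.fftshift(np.fft.fft2(blk * win))) ** 2)
            return np.mean(psds, axis=0)

        def angular_psd(psd2d, n_angles=36):
            """Collapse the areal PSD onto angle bins to characterize directional surface texture."""
            n = psd2d.shape[0]
            fy, fx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2, indexing="ij")
            theta = np.mod(np.arctan2(fy, fx), np.pi)   # fold opposite directions together
            bins = np.linspace(0.0, np.pi, n_angles + 1)
            idx = np.digitize(theta.ravel(), bins) - 1
            return np.bincount(idx, weights=psd2d.ravel(), minlength=n_angles)[:n_angles]

        rng = np.random.default_rng(2)
        surface = rng.normal(size=(256, 256)).cumsum(axis=1)   # toy anisotropic surface
        psd2d = areal_psd(surface, dx=1.0)
        print(angular_psd(psd2d).round(1))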

  9. [Detection of quadratic phase coupling between EEG signal components by nonparametric and parametric methods of bispectral analysis].

    Science.gov (United States)

    Schmidt, K; Witte, H

    1999-11-01

    Recently the assumption of the independence of individual frequency components in a signal has been rejected, for example, for the EEG during defined physiological states such as sleep or sedation [9, 10]. Thus, the use of higher-order spectral analysis capable of detecting interrelations between individual signal components has proved useful. The aim of the present study was to investigate the quality of various non-parametric and parametric estimation algorithms using simulated as well as true physiological data. We employed standard algorithms available for MATLAB. The results clearly show that parametric bispectral estimation is superior to non-parametric estimation in terms of the quality of peak localisation and the discrimination from other peaks.

  10. Bayesian nonparametric system reliability using sets of priors

    NARCIS (Netherlands)

    Walter, G.M.; Aslett, L.J.M.; Coolen, F.P.A.

    2016-01-01

    An imprecise Bayesian nonparametric approach to system reliability with multiple types of components is developed. This allows modelling partial or imperfect prior knowledge on component failure distributions in a flexible way through bounds on the functioning probability. Given component level test

  11. A multitemporal and non-parametric approach for assessing the impacts of drought on vegetation greenness

    DEFF Research Database (Denmark)

    Carrao, Hugo; Sepulcre, Guadalupe; Horion, Stéphanie Marie Anne F

    2013-01-01

    This study evaluates the relationship between the frequency and duration of meteorological droughts and the subsequent temporal changes on the quantity of actively photosynthesizing biomass (greenness) estimated from satellite imagery on rainfed croplands in Latin America. An innovative non-parametric...... and non-supervised approach, based on the Fisher-Jenks optimal classification algorithm, is used to identify multi-scale meteorological droughts on the basis of empirical cumulative distributions of 1, 3, 6, and 12-monthly precipitation totals. As input data for the classifier, we use the gridded GPCC...... for the period between 1998 and 2010. The time-series analysis of vegetation greenness is performed during the growing season with a non-parametric method, namely the seasonal Relative Greenness (RG) of spatially accumulated fAPAR. The Global Land Cover map of 2000 and the GlobCover maps of 2005/2006 and 2009...

  12. Nonparametric test of consistency between cosmological models and multiband CMB measurements

    Energy Technology Data Exchange (ETDEWEB)

    Aghamousa, Amir [Asia Pacific Center for Theoretical Physics, Pohang, Gyeongbuk 790-784 (Korea, Republic of); Shafieloo, Arman, E-mail: amir@apctp.org, E-mail: shafieloo@kasi.re.kr [Korea Astronomy and Space Science Institute, Daejeon 305-348 (Korea, Republic of)

    2015-06-01

    We present a novel approach to test the consistency of cosmological models with multiband CMB data using a nonparametric approach. In our analysis we calibrate the REACT (Risk Estimation and Adaptation after Coordinate Transformation) confidence levels associated with distances in function space (confidence distances) based on Monte Carlo simulations in order to test the consistency of an assumed cosmological model with observation. To show the applicability of our algorithm, we confront Planck 2013 temperature data with the concordance model of cosmology considering two different Planck spectra combinations. In order to have an accurate quantitative statistical measure to compare between the data and the theoretical expectations, we calibrate REACT confidence distances and perform a bias control using many realizations of the data. Our results in this work using Planck 2013 temperature data put the best fit ΛCDM model at 95% (∼ 2σ) confidence distance from the center of the nonparametric confidence set, while repeating the analysis excluding the Planck 217 × 217 GHz spectrum data, the best fit ΛCDM model shifts to 70% (∼ 1σ) confidence distance. The most prominent features in the data deviating from the best fit ΛCDM model seem to be at low multipoles 18 < ℓ < 26 at greater than 2σ, ℓ ∼ 750 at ∼1 to 2σ and ℓ ∼ 1800 at greater than 2σ level. Excluding the 217×217 GHz spectrum, the feature at ℓ ∼ 1800 becomes substantially less significant, at the ∼1 to 2σ confidence level. The results of our analysis based on the new approach we propose in this work are in agreement with other analyses done using alternative methods.

  13. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  14. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested

  15. Spectral analysis of allogeneic hydroxyapatite powders

    Science.gov (United States)

    Timchenko, P. E.; Timchenko, E. V.; Pisareva, E. V.; Vlasov, M. Yu; Red'kin, N. A.; Frolov, O. O.

    2017-01-01

    In this paper we discuss the application of Raman spectroscopy to the in vitro analysis of hydroxyapatite powder samples produced from different types of animal bone tissue during a demineralization process at various acid concentrations and exposure durations. The derivation of the Raman spectrum of hydroxyapatite is attempted by the analysis of the pure powders of its known constituents. Spectral features of hydroxyapatite were found experimentally, based on analysis of the line amplitude at wave numbers 950-965 cm-1 ((PO4)3- (ν1) vibration) and 1065-1075 cm-1 ((CO3)2- (ν1) B-type replacement). Control of the physicochemical properties of hydroxyapatite was carried out by Raman spectroscopy. The results are compared with infrared Fourier spectroscopy.

  16. Spectral analysis of allogeneic hydroxyapatite powders

    International Nuclear Information System (INIS)

    Timchenko, P E; Timchenko, E V; Pisareva, E V; Vlasov, M Yu; Red’kin, N A; Frolov, O O

    2017-01-01

    In this paper we discuss the application of Raman spectroscopy to the in vitro analysis of hydroxyapatite powder samples produced from different types of animal bone tissue during a demineralization process at various acid concentrations and exposure durations. The derivation of the Raman spectrum of hydroxyapatite is attempted by the analysis of the pure powders of its known constituents. Spectral features of hydroxyapatite were found experimentally, based on analysis of the line amplitude at wave numbers 950-965 cm-1 ((PO4)3- (ν1) vibration) and 1065-1075 cm-1 ((CO3)2- (ν1) B-type replacement). Control of the physicochemical properties of hydroxyapatite was carried out by Raman spectroscopy. The results are compared with infrared Fourier spectroscopy. (paper)

  17. Teaching Nonparametric Statistics Using Student Instrumental Values.

    Science.gov (United States)

    Anderson, Jonathan W.; Diddams, Margaret

    Nonparametric statistics are often difficult to teach in introduction to statistics courses because of the lack of real-world examples. This study demonstrated how teachers can use differences in the rankings and ratings of undergraduate and graduate values to discuss: (1) ipsative and normative scaling; (2) uses of the Mann-Whitney U-test; and…

  18. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei

    2011-07-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work was originally motivated by a unique testing problem in genetic epidemiology (Chatterjee, et al., 2006) that involved a typical generalized linear model but with an additional term reminiscent of the Tukey one-degree-of-freedom formulation, and their interest was in testing for main effects of the genetic variables, while gaining statistical power by allowing for a possible interaction between genes and the environment. Later work (Maity, et al., 2009) involved the possibility of modeling the environmental variable nonparametrically, but they focused on whether there was a parametric main effect for the genetic variables. In this paper, we consider the complementary problem, where the interest is in testing for the main effect of the nonparametrically modeled environmental variable. We derive a generalized likelihood ratio test for this hypothesis, show how to implement it, and provide evidence that our method can improve statistical power when compared to standard partially linear models with main effects only. We use the method for the primary purpose of analyzing data from a case-control study of colorectal adenoma.

  19. The spectral analysis of motion: An "open field" activity test example

    Directory of Open Access Journals (Sweden)

    Obradović Z.

    2013-01-01

    Full Text Available In this work we describe a new mathematical approach, using spectral analysis of the data, to evaluate position and motion in "open field" experiments. The aim of this work is to introduce several new parameters mathematically derived from experimental data by means of spectral analysis, and to quantitatively estimate the quality of the motion. Two original software packages (TRACKER and POSTPROC) were used to transform video data into a log file suitable for further computational analysis, and to perform the analysis from the log file. As an example, results obtained from experiments with Wistar rats in the "open field" test are included. The test group of animals was treated with diazepam. Our results demonstrate that all the calculated parameters, such as movement variability, acceleration and deceleration, were significantly lower in the test group compared to the control group. We believe that the application of parameters obtained by spectral analysis could be of great significance in assessing the locomotion impairment in any kind of motion. [Project of the Ministry of Science of the Republic of Serbia, No. III41007 and No. ON174028]

  20. Smooth semi-nonparametric (SNP) estimation of the cumulative incidence function.

    Science.gov (United States)

    Duc, Anh Nguyen; Wolbers, Marcel

    2017-08-15

    This paper presents a novel approach to estimation of the cumulative incidence function in the presence of competing risks. The underlying statistical model is specified via a mixture factorization of the joint distribution of the event type and the time to the event. The time to event distributions conditional on the event type are modeled using smooth semi-nonparametric densities. One strength of this approach is that it can handle arbitrary censoring and truncation while relying on mild parametric assumptions. A stepwise forward algorithm for model estimation and adaptive selection of smooth semi-nonparametric polynomial degrees is presented, implemented in the statistical software R, evaluated in a sequence of simulation studies, and applied to data from a clinical trial in cryptococcal meningitis. The simulations demonstrate that the proposed method frequently outperforms both parametric and nonparametric alternatives. They also support the use of 'ad hoc' asymptotic inference to derive confidence intervals. An extension to regression modeling is also presented, and its potential and challenges are discussed. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.

  1. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. In the ESReDA 20th seminar, a new nonparametric method was proposed. The major point of that paper is how to use censored data efficiently. Generally there are three kinds of approach to estimating a reliability function in a nonparametric way, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have some limitations. So we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by the process of differentiation. It is well known that the three methods generally used to estimate a reliability function in a nonparametric way have maximum likelihood estimators that exist uniquely. So, the MLE of the new method is derived in this study. The procedure to calculate the MLE is similar to that of the PL-estimator. The difference between the two is that in the new method the mass (or weight) of each observation has an influence on the others, whereas in the PL-estimator it does not
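
    For orientation, the sketch below implements the classical Product-Limit (Kaplan-Meier) estimator mentioned above directly from its textbook definition; the modified weighting of censored observations proposed in the abstract is not reproduced here, and the toy data are invented.

        import numpy as np

        def product_limit(times, events):
            """Return (event times, reliability estimate) for right-censored data (event = 1 means failure)."""
            times, events = np.asarray(times, float), np.asarray(events, int)
            order = np.lexsort((1 - events, times))      # at tied times, failures are processed first
            times, events = times[order], events[order]
            n, surv = len(times), 1.0
            out_t, out_s = [], []
            for i, (t, e) in enumerate(zip(times, events)):
                at_risk = n - i                          # items still under observation just before t
                if e == 1:                               # censorings only shrink the risk set
                    surv *= (at_risk - 1) / at_risk
                    out_t.append(t)
                    out_s.append(surv)
            return np.array(out_t), np.array(out_s)

        t = [2, 3, 3, 5, 6, 8, 11]
        d = [1, 1, 0, 1, 0, 1, 1]                        # 1 = failure, 0 = censored
        print(product_limit(t, d))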

  2. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process.We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.

  3. A new powerful non-parametric two-stage approach for testing multiple phenotypes in family-based association studies

    NARCIS (Netherlands)

    Lange, C; Lyon, H; DeMeo, D; Raby, B; Silverman, EK; Weiss, ST

    2003-01-01

    We introduce a new powerful nonparametric testing strategy for family-based association studies in which multiple quantitative traits are recorded and the phenotype with the strongest genetic component is not known prior to the analysis. In the first stage, using a population-based test based on the

  4. Parametric image reconstruction using spectral analysis of PET projection data

    International Nuclear Information System (INIS)

    Meikle, Steven R.; Matthews, Julian C.; Cunningham, Vincent J.; Bailey, Dale L.; Livieratos, Lefteris; Jones, Terry; Price, Pat

    1998-01-01

    Spectral analysis is a general modelling approach that enables calculation of parametric images from reconstructed tracer kinetic data independent of an assumed compartmental structure. We investigated the validity of applying spectral analysis directly to projection data motivated by the advantages that: (i) the number of reconstructions is reduced by an order of magnitude and (ii) iterative reconstruction becomes practical which may improve signal-to-noise ratio (SNR). A dynamic software phantom with typical 2-[11C]thymidine kinetics was used to compare projection-based and image-based methods and to assess bias-variance trade-offs using iterative expectation maximization (EM) reconstruction. We found that the two approaches are not exactly equivalent due to properties of the non-negative least-squares algorithm. However, the differences are small (for K1 and, to a lesser extent, VD). The optimal number of EM iterations was 15-30 with up to a two-fold improvement in SNR over filtered back projection. We conclude that projection-based spectral analysis with EM reconstruction yields accurate parametric images with high SNR and has potential application to a wide range of positron emission tomography ligands. (author)
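
    A hedged sketch of the basic image-space spectral-analysis step (in the Cunningham-Jones style referred to above): a tissue time-activity curve is fit as a nonnegative sum of the plasma input convolved with exponentials on a fixed grid of rate constants, solved by NNLS. The grid, sampling interval and toy input function are assumptions for illustration; the projection-space variant discussed in the abstract is not shown.

        import numpy as np
        from scipy.optimize import nnls

        dt = 0.1                                     # minutes per frame (assumed)
        t = np.arange(0.0, 60.0, dt)
        plasma = t * np.exp(-t / 2.0)                # toy plasma input function

        # basis: plasma input convolved with exponentials on a fixed grid of rate constants
        betas = np.logspace(-3, 1, 40)               # 1/min, grid chosen arbitrarily
        basis = np.column_stack([np.convolve(plasma, np.exp(-b * t))[:t.size] * dt for b in betas])

        # simulate a tissue curve from two "true" components, then recover them with NNLS
        rng = np.random.default_rng(3)
        true_alpha = np.zeros(betas.size)
        true_alpha[[10, 30]] = [0.05, 0.02]
        tissue = basis @ true_alpha + rng.normal(scale=1e-3, size=t.size)

        alpha, _ = nnls(basis, tissue)               # nonnegative least squares
        print("recovered components near beta =", betas[alpha > 1e-4])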

  5. Joint Spectral Analysis for Early Bright X-ray Flares of γ-Ray Bursts ...

    Indian Academy of Sciences (India)

    Abstract. A joint spectral analysis for early bright X-ray flares that were simultaneously observed with Swift BAT and XRT is presented. Both the BAT and XRT lightcurves of these flares are correlated. Our joint spectral analysis shows that the radiations in the two energy bands are from the same spectral component, which can ...

  6. Spectral theory and nonlinear analysis with applications to spatial ecology

    CERN Document Server

    Cano-Casanova, S; Mora-Corral , C

    2005-01-01

    This volume details some of the latest advances in spectral theory and nonlinear analysis through various cutting-edge theories on algebraic multiplicities, global bifurcation theory, non-linear Schrödinger equations, non-linear boundary value problems, large solutions, metasolutions, dynamical systems, and applications to spatial ecology. The main scope of the book is bringing together a series of topics that have evolved separately during the last decades around the common denominator of spectral theory and nonlinear analysis - from the most abstract developments up to the most concrete applications to population dynamics and socio-biology - in an effort to fill the existing gaps between these fields.

  7. Euler deconvolution and spectral analysis of regional aeromagnetic ...

    African Journals Online (AJOL)

    Existing regional aeromagnetic data from the south-central Zimbabwe craton has been analysed using 3D Euler deconvolution and spectral analysis to obtain quantitative information on the geological units and structures for depth constraints on the geotectonic interpretation of the region. The Euler solution maps confirm ...

  8. The nonparametric bootstrap for the current status model

    NARCIS (Netherlands)

    Groeneboom, P.; Hendrickx, K.

    2017-01-01

    It has been proved that direct bootstrapping of the nonparametric maximum likelihood estimator (MLE) of the distribution function in the current status model leads to inconsistent confidence intervals. We show that bootstrapping of functionals of the MLE can however be used to produce valid

  9. Development of spectral analysis math models and software program and spectral analyzer, digital converter interface equipment design

    Science.gov (United States)

    Hayden, W. L.; Robinson, L. H.

    1972-01-01

    Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on a UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system which will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.

  10. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    Science.gov (United States)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
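
    As a point of comparison with the event-series technique described above, the sketch below applies an ordinary coherence estimate: heartbeat event times are converted to an evenly resampled instantaneous-rate signal and compared with a respiration trace. The toy data, resampling rate and segment length are assumptions; this is the "traditional technique", not the authors' estimator.

        import numpy as np
        from scipy.signal import coherence

        fs = 4.0                                           # resampling rate in Hz (assumed)
        duration = 300.0
        rng = np.random.default_rng(4)

        t = np.arange(0.0, duration, 1.0 / fs)
        resp = np.sin(2 * np.pi * 0.25 * t)                # respiration at about 0.25 Hz

        # heartbeat event times with respiration-modulated inter-beat intervals (toy RSA-like model)
        beat_times, now = [], 0.0
        while now < duration:
            now += 1.0 + 0.1 * np.sin(2 * np.pi * 0.25 * now) + 0.02 * rng.normal()
            beat_times.append(now)
        beat_times = np.array(beat_times[:-1])

        ibi = np.diff(beat_times)                          # inter-beat intervals in seconds
        rate = np.interp(t, beat_times[1:], 60.0 / ibi)    # evenly resampled instantaneous heart rate

        f, coh = coherence(rate, resp, fs=fs, nperseg=256)
        band = f > 0.05                                    # ignore the lowest frequency bins
        print("peak coherence at", f[band][np.argmax(coh[band])], "Hz")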

  11. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    Science.gov (United States)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

    The traditional spectral analysis method, which is based on a linear system assumption, causes errors when calculating the fatigue damage of details in liquid cargo tanks, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for the assessment of the fatigue damage in details of a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method can take the nonlinear relationship into consideration, and the fatigue damage is then calculated based on the PSD of the stress. Taking an independent tank in an LNG carrier as an example, the accuracy of the improved spectral analysis method is shown to be much better than that of the traditional spectral analysis method by comparing the calculated damage results with those calculated by the time domain method. The proposed spectral analysis method is more accurate in calculating the fatigue damage of details in ship liquid cargo tanks.

  12. Nonparametric Regression Estimation for Multivariate Null Recurrent Processes

    Directory of Open Access Journals (Sweden)

    Biqing Cai

    2015-04-01

    Full Text Available This paper discusses nonparametric kernel regression with the regressor being a d-dimensional β-null recurrent process in the presence of conditional heteroscedasticity. We show that the mean function estimator is consistent with convergence rate sqrt(n(T)h^d), where n(T) is the number of regenerations for a β-null recurrent process, and that the limiting distribution (with proper normalization) is normal. Furthermore, we show that the two-step estimator for the volatility function is consistent. The finite sample performance of the estimate is quite reasonable when the leave-one-out cross validation method is used for bandwidth selection. We apply the proposed method to study the relationship of the Federal funds rate with 3-month and 5-year T-bill rates and discover the existence of nonlinearity in the relationship. Furthermore, the in-sample and out-of-sample performance of the nonparametric model is far better than that of the linear model.
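
    A plain illustration of the estimator class discussed above: a Gaussian-kernel Nadaraya-Watson regression with a leave-one-out cross-validated bandwidth. The data are synthetic and the null-recurrent asymptotics of the paper are not reproduced.

        import numpy as np

        def nw_fit(x_train, y_train, x_eval, h):
            """Gaussian-kernel Nadaraya-Watson estimate of E[y | x] at the points x_eval."""
            w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
            return (w @ y_train) / w.sum(axis=1)

        def loo_cv_bandwidth(x, y, grid):
            """Pick the bandwidth that minimizes leave-one-out squared prediction error."""
            best_h, best_err = grid[0], np.inf
            for h in grid:
                err = 0.0
                for i in range(len(x)):
                    mask = np.arange(len(x)) != i
                    err += (y[i] - nw_fit(x[mask], y[mask], x[i:i + 1], h)[0]) ** 2
                if err < best_err:
                    best_h, best_err = h, err
            return best_h

        rng = np.random.default_rng(5)
        x = rng.uniform(-2, 2, 200)
        y = np.sin(2 * x) + 0.3 * rng.normal(size=200)

        h = loo_cv_bandwidth(x, y, np.linspace(0.05, 1.0, 20))
        print("selected bandwidth:", h, "fitted mean at x=0:", nw_fit(x, y, np.array([0.0]), h)[0])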

  13. A nonparametric empirical Bayes framework for large-scale multiple testing.

    Science.gov (United States)

    Martin, Ryan; Tokdar, Surya T

    2012-07-01

    We propose a flexible and identifiable version of the 2-groups model, motivated by hierarchical Bayes considerations, that features an empirical null and a semiparametric mixture model for the nonnull cases. We use a computationally efficient predictive recursion (PR) marginal likelihood procedure to estimate the model parameters, even the nonparametric mixing distribution. This leads to a nonparametric empirical Bayes testing procedure, which we call PRtest, based on thresholding the estimated local false discovery rates. Simulations and real data examples demonstrate that, compared to existing approaches, PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.

  14. Spectral Analysis Methods of Social Networks

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2017-01-01

    Full Text Available Online social networks (such as Facebook, Twitter, VKontakte, etc.), being an important channel for disseminating information, are often used to arrange an impact on the social consciousness for various purposes - from advertising products or services to full-scale information war - thereby making them a very relevant object of research. The paper reviews the analysis methods of social networks (primarily online) based on the spectral theory of graphs. Such methods use the spectrum of the social graph, i.e. the set of eigenvalues of its adjacency matrix, and also the eigenvectors of the adjacency matrix. Measures of centrality are described (in particular, centrality based on the eigenvector, and PageRank), which reflect the degree of impact one or another user of the social network has. The very popular PageRank measure uses, as a measure of centrality of the graph vertices, the final probabilities of the Markov chain whose matrix of transition probabilities is calculated on the basis of the adjacency matrix of the social graph. The vector of final probabilities is an eigenvector of the matrix of transition probabilities. A method of dividing the graph vertices into two groups is presented. It is based on maximizing the network modularity by computing the eigenvector of the modularity matrix. A method for detecting bots is considered, based on the non-randomness measure of a graph computed using the spectral coordinates of vertices - sets of eigenvector components of the adjacency matrix of a social graph. In general, there are a number of algorithms to analyse social networks based on the spectral theory of graphs. These algorithms show very good results, but their disadvantage is the relatively high (albeit polynomial) computational complexity for large graphs. At the same time it is obvious that the practical application capacity of the spectral graph theory methods is still underestimated, and they may be used as a basis to develop new methods. The work
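
    The eigenvector-style centrality described above can be illustrated with a short sketch: PageRank computed by power iteration on the damped transition matrix of a small directed graph. The adjacency matrix and damping factor are arbitrary example values.

        import numpy as np

        def pagerank(adj, damping=0.85, tol=1e-10):
            """PageRank scores from a 0/1 adjacency matrix (edge from row i to column j)."""
            n = adj.shape[0]
            out_deg = adj.sum(axis=1, keepdims=True)
            # rows with no out-links (dangling nodes) are spread uniformly over all vertices
            trans = np.where(out_deg > 0, adj / np.where(out_deg == 0, 1, out_deg), 1.0 / n)
            rank = np.full(n, 1.0 / n)
            while True:
                new = (1 - damping) / n + damping * (trans.T @ rank)
                if np.abs(new - rank).sum() < tol:
                    return new
                rank = new

        A = np.array([[0, 1, 1, 0],
                      [0, 0, 1, 0],
                      [1, 0, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        print(pagerank(A))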

  15. Leak detection in pipelines through spectral analysis of pressure signals

    Directory of Open Access Journals (Sweden)

    Souza A.L.

    2000-01-01

    Full Text Available The development and testing of a technique for leak detection in pipelines is presented. The technique is based on the spectral analysis of pressure signals measured in pipeline sections where the formation of stationary waves is favoured, allowing leakage detection during the start/stop of pumps. Experimental tests were performed in a 1250 m long pipeline for various operational conditions of the pipeline (liquid flow rate and leakage configuration). Pressure transients were obtained by four transducers connected to a PC. The obtained results show that the spectral analysis of pressure transients, together with knowledge of the reflection points, provides a simple and efficient way of identifying leaks during the start/stop of pumps in pipelines.

  16. Nonparametric Estimation of Cumulative Incidence Functions for Competing Risks Data with Missing Cause of Failure

    DEFF Research Database (Denmark)

    Effraimidis, Georgios; Dahl, Christian Møller

    In this paper, we develop a fully nonparametric approach for the estimation of the cumulative incidence function with Missing At Random right-censored competing risks data. We obtain results on the pointwise asymptotic normality as well as the uniform convergence rate of the proposed nonparametric...

  17. Estimation and analysis of spectral solar radiation over Cairo

    International Nuclear Information System (INIS)

    Abdel Wahab, M.M.; Omran, M.

    1994-05-01

    This work presents a methodology to estimate spectral diffuse and global radiation on a horizontal surface. The method is validated by comparison with measured direct and global spectral radiation in four bands. The results show good performance in cloudless conditions. The analysis of the ratio of surface values to extraterrestrial ones revealed an overall depletion in the summer months. Also, there was no evidence of any tendency for conversion of radiational components through different bands. The model presents excellent agreement with the measured values for the (UV/G) ratio. (author). 7 refs, 4 figs, 3 tabs

  18. Spectral analysis of bedform dynamics

    DEFF Research Database (Denmark)

    Winter, Christian; Ernstsen, Verner Brandbyge; Noormets, Riko

    Successive multibeam echo sounder surveys in tidal channels off Esbjerg (Denmark) on the North Sea coast reveal the dynamics of subaquatic compound dunes. Mainly driven by tidal currents, dune structures show complex migration patterns in all temporal and spatial scales. Common methods for the an... ...allows the application of a procedure, which has been a standard for the analysis of water waves for long times: The bathymetric signal of a cross-section of subaquatic compound dunes is approximated by the sum of a set of harmonic functions, derived by Fourier transformation. If the wavelength... The proposed method overcomes the above mentioned problems of common descriptive analysis as it is an objective and straightforward mathematical process. The spectral decomposition of superimposed dunes allows a detailed description and analysis of dune patterns and migration.
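
    A simple sketch of the decomposition idea above, under stated assumptions: a synthetic bathymetric cross-section containing large primary dunes and superimposed smaller dunes is approximated by a sum of harmonics from its Fourier transform, so that the two scales can be separated by wavelength. The profile, spacing and wavelengths are invented for illustration.

        import numpy as np

        dx = 0.5                                              # metres between soundings (assumed)
        x = np.arange(0, 500, dx)
        profile = (2.0 * np.sin(2 * np.pi * x / 120)          # primary dunes, ~120 m wavelength
                   + 0.4 * np.sin(2 * np.pi * x / 15)         # superimposed dunes, ~15 m wavelength
                   + 0.05 * np.random.default_rng(6).normal(size=x.size))

        spec = np.fft.rfft(profile - profile.mean())
        freqs = np.fft.rfftfreq(profile.size, d=dx)           # cycles per metre
        amp = 2 * np.abs(spec) / profile.size

        # report the strongest harmonics as wavelength/amplitude pairs
        for k in np.argsort(amp)[::-1][:3]:
            if freqs[k] > 0:
                print(f"wavelength {1 / freqs[k]:7.1f} m, amplitude {amp[k]:.2f} m")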

  19. Bayesian Non-Parametric Mixtures of GARCH(1,1) Models

    Directory of Open Access Journals (Sweden)

    John W. Lau

    2012-01-01

    Full Text Available Traditional GARCH models describe volatility levels that evolve smoothly over time, generated by a single GARCH regime. However, nonstationary time series data may exhibit abrupt changes in volatility, suggesting changes in the underlying GARCH regimes. Further, the number and times of regime changes are not always obvious. This article outlines a nonparametric mixture of GARCH models that is able to estimate the number and time of volatility regime changes by mixing over the Poisson-Kingman process. The process is a generalisation of the Dirichlet process typically used in nonparametric models; for time-dependent data it provides a richer clustering structure, and its application to time series data is novel. Inference is Bayesian, and a Markov chain Monte Carlo algorithm to explore the posterior distribution is described. The methodology is illustrated on the Standard and Poor's 500 financial index.

  20. Technical Training on High-Order Spectral Analysis and Thermal Anemometry Applications

    Science.gov (United States)

    Maslov, A. A.; Shiplyuk, A. N.; Sidirenko, A. A.; Bountin, D. A.

    2003-01-01

    The topics of thermal anemometry and high-order spectral analyses were the subject of the technical training. Specifically, the objective of the technical training was to study: (i) the recently introduced constant voltage anemometer (CVA) for high-speed boundary layer; and (ii) newly developed high-order spectral analysis techniques (HOSA). Both CVA and HOSA are relevant tools for studies of boundary layer transition and stability.

  1. Non-Parametric Kinetic (NPK) Analysis of Thermal Oxidation of Carbon Aerogels

    Directory of Open Access Journals (Sweden)

    Azadeh Seifi

    2017-05-01

    Full Text Available In recent years, much attention has been paid to aerogel materials (especially carbon aerogels) due to their potential uses in energy-related applications, such as thermal energy storage and thermal protection systems. These open cell carbon-based porous materials (carbon aerogels) can strongly react with oxygen at relatively low temperatures (~ 400°C). Therefore, it is necessary to evaluate the thermal performance of carbon aerogels in view of their energy-related applications at high temperatures and under thermal oxidation conditions. The objective of this paper is to study theoretically and experimentally the oxidation reaction kinetics of carbon aerogel using the non-parametric kinetic (NPK) approach as a powerful method. For this purpose, a non-isothermal thermogravimetric analysis, at three different heating rates, was performed on three samples, each with its specific pore structure, density and specific surface area. The most significant feature of this method, in comparison with the model-free isoconversional methods, is its ability to separate the dependence of the reaction rate on the degree of conversion and on temperature by the direct use of thermogravimetric data. Using this method, it was observed that the Nomen-Sempere model could provide the best fit to the data, while the temperature dependence of the rate constant was best explained by a Vogel-Fulcher relationship, where the reference temperature was the onset temperature of oxidation. Moreover, it was found from the results of this work that the assumption of the Arrhenius relation for the temperature dependence of the rate constant led to over-estimation of the apparent activation energy (up to 160 kJ/mol), which is considerably different from the values (up to 3.5 kJ/mol) predicted by the Vogel-Fulcher relationship in isoconversional methods.
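
    The two temperature dependences contrasted above can be illustrated with a small fitting sketch: an Arrhenius law (linear in 1/T on a log scale) and a Vogel-Fulcher law fitted to a synthetic rate-constant data set. All parameter values and the data are assumptions; the NPK procedure itself is not reproduced.

        import numpy as np
        from scipy.optimize import curve_fit

        R_GAS = 8.314                                   # J/(mol K)

        def vf_log_rate(T, lnA, B, T0):
            """log of a Vogel-Fulcher rate constant, k = A * exp(-B / (T - T0))."""
            return lnA - B / (T - T0)

        # synthetic rate-constant data following a Vogel-Fulcher law (assumed values)
        T = np.linspace(650.0, 900.0, 25)               # K
        rng = np.random.default_rng(7)
        lnk = vf_log_rate(T, np.log(5.0), 400.0, 620.0) + 0.02 * rng.normal(size=T.size)

        # Arrhenius: ln k = ln A - Ea/(R T) is linear in 1/T, so a straight-line fit suffices
        slope, intercept = np.polyfit(1.0 / T, lnk, 1)
        print("apparent Arrhenius Ea (kJ/mol):", -slope * R_GAS / 1e3)

        # Vogel-Fulcher: nonlinear in T0, fitted with bounded nonlinear least squares
        popt, _ = curve_fit(vf_log_rate, T, lnk, p0=(1.0, 300.0, 600.0),
                            bounds=([-np.inf, 0.0, 0.0], [np.inf, np.inf, 640.0]))
        print("fitted Vogel-Fulcher (lnA, B, T0):", popt)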

  2. USING A DEA MANAGEMENT TOOL THROUGH A NONPARAMETRIC APPROACH: AN EXAMINATION OF URBAN-RURAL EFFECTS ON THAI SCHOOL EFFICIENCY

    Directory of Open Access Journals (Sweden)

    SANGCHAN KANTABUTRA

    2009-04-01

    Full Text Available This paper examines urban-rural effects on public upper-secondary school efficiency in northern Thailand. In the study, efficiency was measured by a nonparametric technique, data envelopment analysis (DEA). Urban-rural effects were examined through a Mann-Whitney nonparametric statistical test. Results indicate that urban schools appear to have access to and practice different production technologies than rural schools, and rural institutions appear to operate less efficiently than their urban counterparts. In addition, a sensitivity analysis, conducted to ascertain the robustness of the analytical framework, revealed the stability of urban-rural effects on school efficiency. Policy to improve school efficiency should thus take varying geographical area differences into account, viewing rural and urban schools as different from one another. Moreover, policymakers might consider shifting existing resources from urban schools to rural schools, provided that the increase in overall rural efficiency would be greater than the decrease, if any, in the city. Future research directions are discussed.
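
    The second-stage comparison described above amounts to a Mann-Whitney U test on DEA efficiency scores of the two groups, as in the small sketch below. The scores are made up; producing them would require a DEA solver in the first stage, which is not shown.

        import numpy as np
        from scipy.stats import mannwhitneyu

        # hypothetical DEA efficiency scores (values in (0, 1], 1 = efficient)
        urban = np.array([0.94, 0.88, 1.00, 0.91, 0.97, 0.85, 1.00, 0.90])
        rural = np.array([0.72, 0.81, 0.66, 0.79, 0.88, 0.70, 0.75, 0.83])

        stat, p_value = mannwhitneyu(urban, rural, alternative="two-sided")
        print(f"U = {stat:.1f}, p = {p_value:.4f}")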

  3. Modern nonparametric, robust and multivariate methods festschrift in honour of Hannu Oja

    CERN Document Server

    Taskinen, Sara

    2015-01-01

    Written by leading experts in the field, this edited volume brings together the latest findings in the area of nonparametric, robust and multivariate statistical methods. The individual contributions cover a wide variety of topics ranging from univariate nonparametric methods to robust methods for complex data structures. Some examples from statistical signal processing are also given. The volume is dedicated to Hannu Oja on the occasion of his 65th birthday and is intended for researchers as well as PhD students with a good knowledge of statistics.

  4. Tree species mapping in tropical forests using multi-temporal imaging spectroscopy: Wavelength adaptive spectral mixture analysis

    Science.gov (United States)

    Somers, B.; Asner, G. P.

    2014-09-01

    The use of imaging spectroscopy for floristic mapping of forests is complicated by the spectral similarity among co-existing species. Here we evaluated an alternative spectral unmixing strategy combining a time series of EO-1 Hyperion images and an automated feature selection in Multiple Endmember Spectral Mixture Analysis (MESMA). The temporal analysis provided a way to incorporate species phenology while feature selection indicated the best phenological time and best spectral feature set to optimize the separability between tree species. Instead of using the same set of spectral bands throughout the image, which is the standard approach in MESMA, our modified Wavelength Adaptive Spectral Mixture Analysis (WASMA) approach allowed the spectral subsets to vary on a per pixel basis. As such we were able to optimize the spectral separability between the tree species present in each pixel. The potential of the new approach for floristic mapping of tree species in Hawaiian rainforests was quantitatively assessed using both simulated and actual hyperspectral image time-series. With a Cohen's Kappa coefficient of 0.65, WASMA provided a more accurate tree species map compared to conventional MESMA (Kappa = 0.54; p-value < 0.05). The flexible or adaptive use of band sets in WASMA provides an interesting avenue to address spectral similarities in complex vegetation canopies.
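
    A very reduced sketch of the core per-pixel step behind MESMA/WASMA as described above: a pixel spectrum is modelled as a nonnegative mixture of candidate endmember spectra, and the fit is repeated on a band subset to mimic the wavelength-adaptive idea. The endmembers, pixel and subset are synthetic stand-ins; the full endmember/band selection machinery of the paper is not reproduced.

        import numpy as np
        from scipy.optimize import nnls

        bands = 50
        rng = np.random.default_rng(8)
        endmembers = np.abs(rng.normal(0.3, 0.1, size=(bands, 3)))   # 3 candidate species spectra

        pixel = endmembers @ np.array([0.6, 0.4, 0.0]) + 0.01 * rng.normal(size=bands)

        fractions, _ = nnls(endmembers, pixel)                       # nonnegative cover fractions
        print("estimated fractions:", np.round(fractions / fractions.sum(), 2))

        # wavelength-adaptive flavour: refit using only a selected subset of bands
        subset = np.arange(0, bands, 2)                              # e.g. every second band
        fractions_sub, _ = nnls(endmembers[subset], pixel[subset])
        print("fractions from band subset:", np.round(fractions_sub / fractions_sub.sum(), 2))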

  5. Comparison of modal spectral and non-linear time history analysis of a piping system

    International Nuclear Information System (INIS)

    Gerard, R.; Aelbrecht, D.; Lafaille, J.P.

    1987-01-01

    A typical piping system of the discharge line of the chemical and volumetric control system of an operating power plant, outside the containment, between the penetration and the heat exchanger, was analyzed using four different methods: modal spectral analysis with 2% constant damping, modal spectral analysis using ASME Code Case N411 (PVRC damping), linear time history analysis, and non-linear time history analysis. This paper presents an estimation of the conservatism of the linear methods compared to the non-linear analysis. (orig./HP)

  6. Probit vs. semi-nonparametric estimation: examining the role of disability on institutional entry for older adults.

    Science.gov (United States)

    Sharma, Andy

    2017-06-01

    The purpose of this study was to showcase an advanced methodological approach to model disability and institutional entry. Both of these are important areas to investigate given the ongoing aging of the United States population. By 2020, approximately 15% of the population will be 65 years and older. Many of these older adults will experience disability and require formal care. A probit analysis was employed to determine which disabilities were associated with admission into an institution (i.e. long-term care). Since this framework imposes strong distributional assumptions, misspecification leads to inconsistent estimators. To overcome such a shortcoming, this analysis extended the probit framework by employing an advanced semi-nonparametric maximum likelihood estimation utilizing Hermite polynomial expansions. Specification tests show that semi-nonparametric estimation is preferred over probit. In terms of the estimates, semi-nonparametric ratios equal 42 for cognitive difficulty, 64 for independent living, and 111 for self-care disability, while probit yields much smaller estimates of 19, 30, and 44, respectively. Public health professionals can use these results to better understand why certain interventions have not shown promise. Equally important, healthcare workers can use this research to evaluate which types of treatment plans may delay institutionalization and improve the quality of life for older adults. Implications for rehabilitation With ongoing global aging, understanding the association between disability and institutional entry is important in devising successful rehabilitation interventions. Semi-nonparametric estimation is preferred to probit and shows that ambulatory and cognitive impairments present a high risk for institutional entry (long-term care). Informal caregiving and home-based care require further examination as forms of rehabilitation/therapy for certain types of disabilities.

  7. An introduction to random vibrations, spectral & wavelet analysis

    CERN Document Server

    Newland, D E

    2005-01-01

    One of the first engineering books to cover wavelet analysis, this classic text describes and illustrates basic theory, with a detailed explanation of the workings of discrete wavelet transforms. Computer algorithms are explained and supported by examples and a set of problems, and an appendix lists ten computer programs for calculating and displaying wavelet transforms.Starting with an introduction to probability distributions and averages, the text examines joint probability distributions, ensemble averages, and correlation; Fourier analysis; spectral density and excitation response relation

  8. ANALYSIS OF SPECTRAL CHARACTERISTICS AMONG DIFFERENT SENSORS BY USE OF SIMULATED RS IMAGES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This research, by use of an RS image-simulating method, simulated apparent reflectance images at the sensor level and ground-reflectance images of SPOT-HRV, CBERS-CCD, Landsat-TM and NOAA14-AVHRR's corresponding bands. These images were used to analyze sensor differences caused by spectral sensitivity and atmospheric impacts. The differences were analyzed with respect to the Normalized Difference Vegetation Index (NDVI). The results showed that differences in the sensors' spectral characteristics cause changes in their NDVI and reflectance. When multiple sensors' data are applied to digital analysis, this error should be taken into account. The atmospheric effect makes NDVI smaller, and atmospheric correction has the tendency of increasing NDVI values. The reflectances and NDVIs of different sensors can be used to analyze the differences among sensors' features. The spectral analysis method based on RS simulated images can provide a new way to design the spectral characteristics of new sensors.
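
    A toy illustration of the sensor comparison described above: NDVI computed from band-averaged red and NIR reflectances "seen" by two sensors whose spectral responses differ. The Gaussian response curves and the idealized vegetation reflectance are invented for the example, not the real SPOT/TM/AVHRR response functions.

        import numpy as np

        wavelengths = np.arange(600, 901, 10)                   # nm
        reflectance = np.where(wavelengths < 700, 0.05, 0.45)   # idealized green vegetation

        def band_mean(response):
            """Band-averaged reflectance under a given spectral response curve."""
            return np.sum(response * reflectance) / np.sum(response)

        def ndvi(red, nir):
            return (nir - red) / (nir + red)

        # two hypothetical sensors with different red/NIR band responses
        sensor_a_red = np.exp(-0.5 * ((wavelengths - 650) / 20.0) ** 2)
        sensor_a_nir = np.exp(-0.5 * ((wavelengths - 840) / 30.0) ** 2)
        sensor_b_red = np.exp(-0.5 * ((wavelengths - 670) / 35.0) ** 2)
        sensor_b_nir = np.exp(-0.5 * ((wavelengths - 800) / 60.0) ** 2)

        print("sensor A NDVI:", ndvi(band_mean(sensor_a_red), band_mean(sensor_a_nir)))
        print("sensor B NDVI:", ndvi(band_mean(sensor_b_red), band_mean(sensor_b_nir)))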

  9. Adaptive nonparametric Bayesian inference using location-scale mixture priors

    NARCIS (Netherlands)

    Jonge, de R.; Zanten, van J.H.

    2010-01-01

    We study location-scale mixture priors for nonparametric statistical problems, including multivariate regression, density estimation and classification. We show that a rate-adaptive procedure can be obtained if the prior is properly constructed. In particular, we show that adaptation is achieved if

  10. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    International Nuclear Information System (INIS)

    Barat, Eric; Dautremer, Thomas

    2007-01-01

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution--the normalized emission intensity of the spatial Poisson process--is considered as a spatial probability density, and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions), and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as the base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals like the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM

  11. International Conference on Robust Rank-Based and Nonparametric Methods

    CERN Document Server

    McKean, Joseph

    2016-01-01

    The contributors to this volume include many of the distinguished researchers in this area. Many of these scholars have collaborated with Joseph McKean to develop underlying theory for these methods, obtain small sample corrections, and develop efficient algorithms for their computation. The papers cover the scope of the area, including robust nonparametric rank-based procedures through Bayesian and big data rank-based analyses. Areas of application include biostatistics and spatial areas. Over the last 30 years, robust rank-based and nonparametric methods have developed considerably. These procedures generalize traditional Wilcoxon-type methods for one- and two-sample location problems. Research into these procedures has culminated in complete analyses for many of the models used in practice including linear, generalized linear, mixed, and nonlinear models. Settings are both multivariate and univariate. With the development of R packages in these areas, computation of these procedures is easily shared with r...

  12. Multivariate statistical analysis for x-ray photoelectron spectroscopy spectral imaging: Effect of image acquisition time

    International Nuclear Information System (INIS)

    Peebles, D.E.; Ohlhausen, J.A.; Kotula, P.G.; Hutton, S.; Blomfield, C.

    2004-01-01

    The acquisition of spectral images for x-ray photoelectron spectroscopy (XPS) is a relatively new approach, although it has been used with other analytical spectroscopy tools for some time. This technique provides full spectral information at every pixel of an image, in order to provide a complete chemical mapping of the imaged surface area. Multivariate statistical analysis techniques applied to the spectral image data allow the determination of chemical component species, and their distribution and concentrations, with minimal data acquisition and processing times. Some of these statistical techniques have proven to be very robust and efficient methods for deriving physically realistic chemical components without input by the user other than the spectral matrix itself. The benefits of multivariate analysis of the spectral image data include significantly improved signal to noise, improved image contrast and intensity uniformity, and improved spatial resolution - which are achieved due to the effective statistical aggregation of the large number of often noisy data points in the image. This work demonstrates the improvements in chemical component determination and contrast, signal-to-noise level, and spatial resolution that can be obtained by the application of multivariate statistical analysis to XPS spectral images

  13. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

    We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  14. An experiment with spectral analysis of emotional speech affected by orthodontic appliances

    Science.gov (United States)

    Přibil, Jiří; Přibilová, Anna; Ďuračková, Daniela

    2012-11-01

    The contribution describes the effect of fixed and removable orthodontic appliances on the spectral properties of emotional speech. Spectral changes were analyzed and evaluated using spectrograms and mean Welch periodograms. This alternative to the standard listening test enables an objective comparison based on statistical analysis with ANOVA and hypothesis tests. Results of the analysis, performed on short sentences spoken by a female speaker in four emotional states (joyous, sad, angry, and neutral), show that it is primarily the removable orthodontic appliance that affects the spectrograms of the produced speech.
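
    For illustration only, a short Python sketch of the kind of comparison described, using SciPy's spectrogram and Welch periodogram on synthetic stand-ins for two recordings; the sampling rate and signals are assumptions, not data from the study.

      import numpy as np
      from scipy import signal

      fs = 16000                                    # assumed sampling rate
      rng = np.random.default_rng(1)
      t = np.arange(fs) / fs
      speech_a = np.sin(2 * np.pi * 220 * t) + 0.01 * rng.standard_normal(fs)
      speech_b = np.sin(2 * np.pi * 233 * t) + 0.01 * rng.standard_normal(fs)

      # Mean Welch periodograms of the two utterances ...
      f, pxx_a = signal.welch(speech_a, fs=fs, nperseg=1024)
      _, pxx_b = signal.welch(speech_b, fs=fs, nperseg=1024)

      # ... and a spectrogram of one of them for visual inspection.
      f_spec, t_spec, sxx = signal.spectrogram(speech_a, fs=fs, nperseg=512)

      # A simple per-frequency comparison of the two mean periodograms (dB).
      diff_db = 10 * np.log10(pxx_a / pxx_b)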

  15. VIBRATIONS DETECTION IN INDUSTRIAL PUMPS BASED ON SPECTRAL ANALYSIS TO INCREASE THEIR EFFICIENCY

    Directory of Open Access Journals (Sweden)

    Belhadef RACHID

    2016-01-01

    Full Text Available Spectral analysis is the key tool for the study of vibration signals in rotating machinery. In this work, vibration analysis applied to the condition-based preventive maintenance of such machines is proposed, addressing problems related to vibration detection in the components of these machines. The vibration signal of a centrifugal pump was processed to demonstrate the benefits of the proposed approach. The results present the estimation of the pump vibration signal using the Fourier transform technique, compared with spectral analysis methods based on the Prony approach.

  16. Tomato sorting using independent component analysis on spectral images

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.; Young, I.T.

    2003-01-01

    Independent Component Analysis is one of the most widely used methods for blind source separation. In this paper we use this technique to estimate the most important compounds which play a role in the ripening of tomatoes. Spectral images of tomatoes were analyzed. Two main independent components

  17. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick breaking prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…

  18. Seismic Signal Compression Using Nonparametric Bayesian Dictionary Learning via Clustering

    Directory of Open Access Journals (Sweden)

    Xin Tian

    2017-06-01

    Full Text Available We introduce a seismic signal compression method based on nonparametric Bayesian dictionary learning via clustering. The seismic data are compressed patch by patch, and the dictionary is learned online. Clustering is introduced for dictionary learning: a set of dictionaries is generated, and each dictionary is used for the sparse coding of one cluster. In this way, the signals in one cluster can be well represented by their corresponding dictionary. A nonparametric Bayesian dictionary learning method is used to learn the dictionaries, which naturally infers an appropriate dictionary size for each cluster. A uniform quantizer and an adaptive arithmetic coding algorithm are adopted to code the sparse coefficients. Comparisons with other state-of-the-art approaches validate the effectiveness of the proposed method in the experiments.
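
    A hedged Python sketch of the patch-cluster-dictionary pipeline, with scikit-learn's KMeans and the (non-Bayesian) MiniBatchDictionaryLearning standing in for the beta-process learner; here the dictionary size is fixed rather than inferred, entropy coding is omitted, and all data and settings are illustrative.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import MiniBatchDictionaryLearning

      rng = np.random.default_rng(2)
      trace = rng.standard_normal(4096)             # stand-in for a seismic trace

      # Split the signal into non-overlapping patches.
      patch_len = 64
      patches = trace[: len(trace) // patch_len * patch_len].reshape(-1, patch_len)

      # Cluster the patches, learn one dictionary per cluster, and sparse-code
      # each patch with the dictionary of its own cluster.
      n_clusters = 4
      labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(patches)

      codes = np.empty((len(patches), 32))
      for k in range(n_clusters):
          idx = np.where(labels == k)[0]
          dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0,
                                             transform_algorithm="omp",
                                             transform_n_nonzero_coefs=5,
                                             random_state=0)
          dico.fit(patches[idx])
          codes[idx] = dico.transform(patches[idx])

      # Crude uniform quantization of the sparse coefficients.
      step = 0.05
      quantized = np.round(codes / step).astype(int)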

  19. Correlative Spectral Analysis of Gamma-Ray Bursts using Swift-BAT and GLAST-GBM

    International Nuclear Information System (INIS)

    Stamatikos, Michael; Sakamoto, Taka; Band, David L.

    2008-01-01

    We discuss the preliminary results of spectral analysis simulations involving anticipated correlated multi-wavelength observations of gamma-ray bursts (GRBs) using Swift's Burst Alert Telescope (BAT) and the Gamma-Ray Large Area Space Telescope's (GLAST) Burst Monitor (GLAST-GBM), resulting in joint spectral fits, including characteristic photon energy (E peak ) values, for a conservative annual estimate of ∼30 GRBs. The addition of BAT's spectral response will (i) complement in-orbit calibration efforts of GBM's detector response matrices, (ii) augment GLAST's low energy sensitivity by increasing the ∼20-100 keV effective area, (iii) facilitate ground-based follow-up efforts of GLAST GRBs by increasing GBM's source localization precision, and (iv) help identify a subset of non-triggered GRBs discovered via off-line GBM data analysis. Such multi-wavelength correlative analyses, which have been demonstrated by successful joint-spectral fits of Swift-BAT GRBs with other higher energy detectors such as Konus-WIND and Suzaku-WAM, would enable the study of broad-band spectral and temporal evolution of prompt GRB emission over three energy decades, thus potentially increasing science return without placing additional demands upon mission resources throughout their contemporaneous orbital tenure over the next decade.

  20. Correlative Spectral Analysis of Gamma-Ray Bursts using Swift-BAT and GLAST-GBM

    International Nuclear Information System (INIS)

    Stamatikos, Michael; Sakamoto, Takanori; Band, David L.

    2008-01-01

    We discuss the preliminary results of spectral analysis simulations involving anticipated correlated multi-wavelength observations of gamma-ray bursts (GRBs) using Swift's Burst Alert Telescope (BAT) and the Gamma-Ray Large Area Space Telescope's (GLAST) Burst Monitor (GLAST-GBM), resulting in joint spectral fits, including characteristic photon energy (E peak ) values, for a conservative annual estimate of ∼30 GRBs. The addition of BAT's spectral response will (i) complement in-orbit calibration efforts of GBM's detector response matrices, (ii) augment GLAST's low energy sensitivity by increasing the ∼20-100 keV effective area, (iii) facilitate ground-based follow-up efforts of GLAST GRBs by increasing GBM's source localization precision, and (iv) help identify a subset of non-triggered GRBs discovered via off-line GBM data analysis. Such multi-wavelength correlative analyses, which have been demonstrated by successful joint-spectral fits of Swift-BAT GRBs with other higher energy detectors such as Konus-WIND and Suzaku-WAM, would enable the study of broad-band spectral and temporal evolution of prompt GRB emission over three energy decades, thus potentially increasing science return without placing additional demands upon mission resources throughout their contemporaneous orbital tenure over the next decade.

  1. Nonparametric modeling of dynamic functional connectivity in fmri data

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Madsen, Kristoffer H.; Røge, Rasmus

    2015-01-01

    dynamic changes. The existing approaches modeling dynamic connectivity have primarily been based on time-windowing the data and k-means clustering. We propose a nonparametric generative model for dynamic FC in fMRI that does not rely on specifying window lengths and number of dynamic states. Rooted...

  2. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    Science.gov (United States)

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  3. Uncertainty in decision models analyzing cost-effectiveness : The joint distribution of incremental costs and effectiveness evaluated with a nonparametric bootstrap method

    NARCIS (Netherlands)

    Hunink, Maria; Bult, J.R.; De Vries, J; Weinstein, MC

    1998-01-01

    Purpose. To illustrate the use of a nonparametric bootstrap method in the evaluation of uncertainty in decision models analyzing cost-effectiveness. Methods. The authors reevaluated a previously published cost-effectiveness analysis that used a Markov model comparing initial percutaneous

  4. Fast analysis of spectral data using neural networks

    International Nuclear Information System (INIS)

    Roach, C.M.

    1992-01-01

    Fast analysis techniques are highly desirable in experiments where measurements are recorded at high rates. In fusion experiments the processing required to obtain plasma parameters is usually orders of magnitude slower than the data acquisition. Spectroscopic diagnostics suffer greatly from this problem. The extraction of plasma parameters from a measured spectrum typically corresponds to a nonlinear mapping between distinct multi-dimensional spaces. Where no analytic expression for the mapping exists, conventional analysis methods (e.g. least squares) are usually iterative and therefore slow. With this concern in mind a fast spectral analysis method involving neural networks has been investigated. (author) 6 refs., 3 figs
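
    A minimal sketch of the idea in Python, assuming synthetic Gaussian line profiles and scikit-learn's MLPRegressor as the network; it is not the author's diagnostic code, only an illustration of replacing iterative fitting by a single trained forward pass.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(3)
      x = np.linspace(-1.0, 1.0, 128)

      def spectrum(center, width):
          # Toy spectral line: a Gaussian profile on a fixed wavelength grid.
          return np.exp(-0.5 * ((x - center) / width) ** 2)

      params = np.column_stack([rng.uniform(-0.5, 0.5, 2000),    # line centre
                                rng.uniform(0.05, 0.3, 2000)])   # line width
      spectra = np.array([spectrum(c, w) for c, w in params])
      spectra += 0.01 * rng.standard_normal(spectra.shape)       # noise

      # Train a network to map measured spectra back to the parameters.
      net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
      net.fit(spectra[:1500], params[:1500])

      # Parameter extraction is now a single, fast forward pass (no iteration).
      estimated = net.predict(spectra[1500:])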

  5. Semiconductor detectors in current energy dispersive X-ray spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Betin, J; Zhabin, E; Krampit, I; Smirnov, V

    1980-04-01

    A review is presented of the properties of semiconductor detectors and of the possibilities stemming therefrom of using the detectors in X-ray spectral analysis in industries, in logging, in ecology and environmental control, in medicine, etc.

  6. Spectral Analysis of the Background in Ground-based, Long-slit ...

    Indian Academy of Sciences (India)

    1996-12-08

    Dec 8, 1996 ... Figure 1 plots spectra from the 2-D array, after instrumental calibration and before correction for ... which would merit attention and a better understanding.

  7. Comparative Study of Parametric and Non-parametric Approaches in Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Katebi, S.D.; Blanke, M.; Katebi, M.R.

    This report describes a comparative study between two approaches to fault detection and isolation in dynamic systems. The first approach uses a parametric model of the system. The main components of such techniques are residual and signature generation for processing and analyzing. The second...... approach is non-parametric in the sense that the signature analysis is only dependent on the frequency or time domain information extracted directly from the input-output signals. Based on these approaches, two different fault monitoring schemes are developed where the feature extraction and fault decision...

  8. Spectral Analysis Related to Bare-Metal and Drug-Eluting Coronary Stent Implantation

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Rose Mary Ferreira Lisboa da, E-mail: roselisboa@cardiol.br [Faculdade de Medicina da UFMG, Divinópolis, MG (Brazil); Silva, Carlos Augusto Bueno [Faculdade de Medicina da UFMG, Divinópolis, MG (Brazil); Belo Horizonte, Hospital São João de Deus, Divinópolis, MG (Brazil); Greco, Otaviano José [Belo Horizonte, Hospital São João de Deus, Divinópolis, MG (Brazil); Moreira, Maria da Consolação Vieira [Faculdade de Medicina da UFMG, Divinópolis, MG (Brazil)

    2014-08-15

    The autonomic nervous system plays a central role in cardiovascular regulation; sympathetic activation occurs during myocardial ischemia. The objective was to assess the spectral analysis of heart rate variability during stent implantation, comparing the types of stent. This study assessed 61 patients (mean age, 64.0 years; 35 men) with ischemic heart disease and an indication for stenting. Stent implantation was performed under Holter monitoring to record the spectral analysis of heart rate variability (Fourier transform), measuring the low-frequency (LF) and high-frequency (HF) components and the LF/HF ratio before and during the procedure. A bare-metal stent was implanted in 34 patients, while the others received drug-eluting stents. The right coronary artery was approached in 21 patients, the left anterior descending, in 28, and the circumflex, in 9. As compared with the pre-stenting period, all patients showed an increase in LF and HF during stent implantation (658 versus 185 ms², p = 0.00; 322 versus 121, p = 0.00, respectively), with no change in LF/HF. During stent implantation, LF was 864 ms² in patients with bare-metal stents and 398 ms² in those with drug-eluting stents (p = 0.00). The spectral analysis of heart rate variability showed no association with diabetes mellitus, family history, clinical presentation, beta-blockers, age, or the vessel or its segment. Stent implantation resulted in concomitant sympathetic and vagal activations. Diabetes mellitus, use of beta-blockers, and the vessel approached showed no influence on the spectral analysis of heart rate variability. Sympathetic activation was lower during the implantation of drug-eluting stents.
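
    A short Python sketch of how LF and HF powers can be computed from an RR-interval series, assuming the conventional 0.04-0.15 Hz and 0.15-0.40 Hz bands and a tachogram resampled at 4 Hz; the data and settings are illustrative, not those of the study.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(4)
      rr = 0.8 + 0.05 * rng.standard_normal(300)        # stand-in RR intervals (s)

      # Resample the irregularly sampled tachogram onto a uniform 4 Hz grid.
      t_beats = np.cumsum(rr)
      fs = 4.0
      t_uniform = np.arange(t_beats[0], t_beats[-1], 1.0 / fs)
      rr_uniform = np.interp(t_uniform, t_beats, rr)

      f, psd = signal.welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)

      def band_power(f, psd, lo, hi):
          mask = (f >= lo) & (f < hi)
          return np.trapz(psd[mask], f[mask])

      lf = band_power(f, psd, 0.04, 0.15)
      hf = band_power(f, psd, 0.15, 0.40)
      lf_hf_ratio = lf / hf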

  9. Nonparametric method for failures diagnosis in the actuating subsystem of aircraft control system

    Science.gov (United States)

    Terentev, M. N.; Karpenko, S. S.; Zybin, E. Yu; Kosyanchuk, V. V.

    2018-02-01

    In this paper we design a nonparametric method for failures diagnosis in the aircraft control system that uses the measurements of the control signals and the aircraft states only. It doesn’t require a priori information of the aircraft model parameters, training or statistical calculations, and is based on analytical nonparametric one-step-ahead state prediction approach. This makes it possible to predict the behavior of unidentified and failure dynamic systems, to weaken the requirements to control signals, and to reduce the diagnostic time and problem complexity.

  10. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.
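
    DPpackage itself is written for R; as a rough Python analogue of one of its use cases, Dirichlet process mixture density estimation, the sketch below fits a truncated variational DP Gaussian mixture with scikit-learn on synthetic data (all values are assumptions).

      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      rng = np.random.default_rng(5)
      y = np.concatenate([rng.normal(-2.0, 0.5, 300),
                          rng.normal(1.5, 1.0, 700)]).reshape(-1, 1)

      # Truncated Dirichlet process mixture fitted by variational inference.
      dpm = BayesianGaussianMixture(
          n_components=20,                                   # truncation level
          weight_concentration_prior_type="dirichlet_process",
          weight_concentration_prior=1.0,                    # DP precision
          max_iter=500, random_state=0)
      dpm.fit(y)

      # Components with non-negligible weight give the effective number of clusters.
      effective_k = int(np.sum(dpm.weights_ > 0.01))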

  11. Curie depth and geothermal gradient from spectral analysis of ...

    African Journals Online (AJOL)

    The recent (2009) aeromagnetic data covering the lower part of the Benue and the upper part of the Anambra basins were subjected to one-dimensional spectral analysis with the aim of estimating the Curie depth and subsequently evaluating both the geothermal gradient and the heat flow for the area. The Curie point depth estimates obtained were ...

  12. A spectral analysis of rice grains

    International Nuclear Information System (INIS)

    McIlvaine, M.S.; Cua, F.T.; Navarro, E.F.

    1976-06-01

    With the advent of extensive nuclear testing and the development and use of highly potent pesticides and fertilizers, the hazardous threats of radioactive contamination due to fallout and to the absorption of pesticide residues have been given due consideration. Among the many forms of life exposed to these threats are food crops, and among these is rice. Several rice grain samples - Japanese rice samples "A" and "B" submitted by the National Grains Authority (NGA) for analysis, random samples of rice being sold to the public at local markets, and "black rice" picked from along the shores of a Mindoro town - were subjected to spectral analysis. Results revealed the presence of trace elements normally found in plants, such as K-42, I-124, Cl-38, Na-24, Br-82, and Mn-56. No mercury was detected in the specimens analyzed.

  13. Spectral decomposition in advection-diffusion analysis by finite element methods

    International Nuclear Information System (INIS)

    Nickell, R.E.; Gartling, D.K.; Strang, G.

    1978-01-01

    In a recent study of the convergence properties of finite element methods in nonlinear fluid mechanics, an indirect approach was taken. A two-dimensional example with a known exact solution was chosen as the vehicle for the study, and various mesh refinements were tested in an attempt to extract information on the effect of the local Reynolds number. However, more direct approaches are usually preferred. In this study one such direct approach is followed, based upon the spectral decomposition of the solution operator. Spectral decomposition is widely employed as a solution technique for linear structural dynamics problems and can be applied readily to linear, transient heat transfer analysis; in this case, the extension to nonlinear problems is of interest. It was shown previously that spectral techniques were applicable to stiff systems of rate equations, while recent studies of geometrically and materially nonlinear structural dynamics have demonstrated the increased information content of the numerical results. The use of spectral decomposition in nonlinear problems of heat and mass transfer would be expected to yield equally increased flow of information to the analyst, and this information could include a quantitative comparison of various solution strategies, meshes, and element hierarchies

  14. High-Selectivity Filter Banks for Spectral Analysis of Music Signals

    Directory of Open Access Journals (Sweden)

    Luiz W. P. Biscainho

    2007-01-01

    Full Text Available This paper approaches, under a unified framework, several algorithms for the spectral analysis of musical signals. Such algorithms include the fast Fourier transform (FFT, the fast filter bank (FFB, the constant-Q transform (CQT, and the bounded-Q transform (BQT, previously known from the associated literature. Two new methods are then introduced, namely, the constant-Q fast filter bank (CQFFB and the bounded-Q fast filter bank (BQFFB, combining the positive characteristics of the previously mentioned algorithms. The provided analyses indicate that the proposed BQFFB achieves an excellent compromise between the reduced computational effort of the FFT, the high selectivity of each output channel of the FFB, and the efficient distribution of frequency channels associated to the CQT and BQT methods. Examples are included to illustrate the performances of these methods in the spectral analysis of music signals.

  15. The application of non-parametric statistical method for an ALARA implementation

    International Nuclear Information System (INIS)

    Cho, Young Ho; Herr, Young Hoi

    2003-01-01

    The cost-effective reduction of Occupational Radiation Dose (ORD) at a nuclear power plant could not be achieved without going through an extensive analysis of the accumulated ORD data of existing plants. Through the data analysis, it is required to identify which jobs at the nuclear power plant repeatedly incur high ORD. In this study, the Percentile Rank Sum Method (PRSM) is proposed to identify repetitive high-ORD jobs; it is based on non-parametric statistical theory. As a case study, the method is applied to ORD data of maintenance and repair jobs at Kori units 3 and 4, which are pressurized water reactors with 950 MWe capacity and have been operated since 1986 and 1987, respectively, in Korea. The results were verified and validated, and PRSM has been demonstrated to be an efficient method of analyzing the data.
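
    The exact PRSM algorithm is not reproduced here; the sketch below only illustrates the percentile-rank-sum idea on simulated job-dose data, and every name and number in it is an assumption.

      import numpy as np
      from scipy.stats import rankdata

      # dose[i, j]: collective dose of job type j during outage i (simulated).
      rng = np.random.default_rng(6)
      dose = rng.gamma(shape=2.0, scale=5.0, size=(6, 40))   # 6 outages, 40 jobs

      # Percentile rank of each job within its outage, summed over outages.
      pct_rank = np.vstack([rankdata(row) / row.size for row in dose])
      rank_sum = pct_rank.sum(axis=0)

      # Jobs whose rank sum falls in the top decile are flagged as repetitive
      # high-dose jobs, i.e. prime candidates for ALARA measures.
      flagged_jobs = np.where(rank_sum >= np.quantile(rank_sum, 0.9))[0]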

  16. Spectral Analysis within the Virtual Observatory: The GAVO Service TheoSSA

    Science.gov (United States)

    Ringat, E.

    2012-03-01

    In the last decade, numerous Virtual Observatory organizations were established. One of these is the German Astrophysical Virtual Observatory (GAVO) that e.g. provides access to spectral energy distributions via the service TheoSSA. In a pilot phase, these are based on the Tübingen NLTE Model-Atmosphere Package (TMAP) and suitable for hot, compact stars. We demonstrate the power of TheoSSA in an application to the sdOB primary of AA Doradus by comparison with a “classical” spectral analysis.

  17. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  18. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N.; and others

    1997-08-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  19. Semiconductor detectors in current energy dispersive X-ray spectral analysis

    International Nuclear Information System (INIS)

    Betin, J.; Zhabin, E.; Krampit, I.; Smirnov, V.

    1980-01-01

    A review is presented of the properties of semiconductor detectors and of the possibilities stemming therefrom of using the detectors in X-ray spectral analysis in industries, in logging, in ecology and environmental control, in medicine, etc. (M.S.)

  20. Nonparametric estimation of benchmark doses in environmental risk assessment

    Science.gov (United States)

    Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen

    2013-01-01

    Summary An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
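
    A hedged Python sketch of an isotonic-regression benchmark dose with a simple bootstrap lower limit, using scikit-learn's IsotonicRegression on made-up quantal data; the published estimator and its asymptotic theory involve refinements not shown here.

      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      # Quantal dose-response data: n animals tested and y affected per dose.
      dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
      n = np.array([50, 50, 50, 50, 50])
      y = np.array([2, 4, 9, 17, 30])

      def bmd(dose, y, n, bmr=0.10):
          # Monotone (isotonic) fit of the observed risks versus dose.
          iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
          risk = iso.fit_transform(dose, y / n, sample_weight=n)
          target = risk[0] + bmr * (1.0 - risk[0])     # extra-risk definition
          grid = np.linspace(dose[0], dose[-1], 1001)
          fitted = np.interp(grid, dose, risk)
          above = grid[fitted >= target]
          return above[0] if above.size else np.nan

      rng = np.random.default_rng(7)
      boot = [bmd(dose, rng.binomial(n, y / n), n) for _ in range(500)]
      bmd_hat = bmd(dose, y, n)
      bmdl = np.nanpercentile(boot, 5)                  # 95% lower confidence limit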

  1. Spurious Seasonality Detection: A Non-Parametric Test Proposal

    Directory of Open Access Journals (Sweden)

    Aurelio F. Bariviera

    2018-01-01

    Full Text Available This paper offers a general and comprehensive definition of the day-of-the-week effect. Using symbolic dynamics, we develop a unique test based on ordinal patterns in order to detect it. This test uncovers the fact that the so-called “day-of-the-week” effect is partly an artifact of the hidden correlation structure of the data. We present simulations based on artificial time series as well. While time series generated with long memory are prone to exhibit daily seasonality, pure white noise signals exhibit no pattern preference. Since ours is a non-parametric test, it requires no assumptions about the distribution of returns, so that it could be a practical alternative to conventional econometric tests. We also present an exhaustive application of the proposed technique to 83 stock indexes around the world. Finally, the paper highlights the relevance of symbolic analysis in economic time series studies.
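
    As a rough illustration of the ordinal-pattern idea, not the authors' exact test, the sketch below maps each week of simulated daily returns to its rank pattern and checks the pattern frequencies with a chi-square test.

      import numpy as np
      from itertools import permutations
      from scipy.stats import chisquare

      rng = np.random.default_rng(8)
      returns = rng.standard_normal(5 * 2000)           # stand-in daily returns

      # One ordinal (rank) pattern per non-overlapping trading week.
      weeks = returns.reshape(-1, 5)
      patterns = [tuple(np.argsort(week)) for week in weeks]

      lookup = {p: i for i, p in enumerate(permutations(range(5)))}
      counts = np.bincount([lookup[p] for p in patterns], minlength=len(lookup))

      # Under no day-of-the-week preference, all 5! patterns are equally likely.
      stat, p_value = chisquare(counts)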

  2. Dimensionality Reduction of Hyperspectral Image with Graph-Based Discriminant Analysis Considering Spectral Similarity

    Directory of Open Access Journals (Sweden)

    Fubiao Feng

    2017-03-01

    Full Text Available Recently, graph embedding has drawn great attention for dimensionality reduction in hyperspectral imagery. For example, locality preserving projection (LPP) utilizes typical Euclidean distance in a heat kernel to create an affinity matrix and projects the high-dimensional data into a lower-dimensional space. However, the Euclidean distance is not sufficiently correlated with the intrinsic spectral variation of a material, which may result in an inappropriate graph representation. In this work, a graph-based discriminant analysis with spectral similarity measurement (denoted GDA-SS) is proposed, which fully considers how the spectral curves change across bands. Experimental results based on real hyperspectral images demonstrate that the proposed method is superior to traditional methods, such as supervised LPP, and the state-of-the-art sparse graph-based discriminant analysis (SGDA).

  3. Nonparametric NAR-ARCH Modelling of Stock Prices by the Kernel Methodology

    Directory of Open Access Journals (Sweden)

    Mohamed Chikhi

    2018-02-01

    Full Text Available This paper analyses the cyclical behaviour of the Orange stock price, listed on the French stock exchange, from 01/03/2000 to 02/02/2017 by testing for nonlinearities through a class of conditional heteroscedastic nonparametric models. The linearity and Gaussianity assumptions are rejected for Orange stock returns, and informational shocks have transitory effects on returns and volatility. The forecasting results show that Orange stock prices are predictable in the short term and that the nonparametric NAR-ARCH model outperforms the parametric MA-APARCH model for short horizons. Moreover, the estimates of this model are also better than the predictions of the random walk model. This finding provides evidence for weak-form inefficiency in the Paris stock market with limited rationality, from which arbitrage opportunities emerge.

  4. On the robust nonparametric regression estimation for a functional regressor

    OpenAIRE

    Azzedine, Nadjia; Laksaci, Ali; Ould-Saïd, Elias

    2009-01-01

    Correspondence: Ould-Saïd, Elias. Azzedine, Nadjia: Département de Mathématiques, Univ. Djillali Liabès, BP 89, 22000 Sidi Bel Abbès, Algeria.

  5. Comparison of Parametric and Nonparametric Methods for Analyzing the Bias of a Numerical Model

    Directory of Open Access Journals (Sweden)

    Isaac Mugume

    2016-01-01

    Full Text Available Numerical models are presently applied in many fields for simulation and prediction, operation, or research. The output from these models normally has both systematic and random errors. The study compared January 2015 temperature data for Uganda as simulated using the Weather Research and Forecast model with actual observed station temperature data to analyze the bias using parametric (the root mean square error (RMSE), the mean absolute error (MAE), the mean error (ME), skewness, and the bias easy estimate (BES)) and nonparametric (the sign test, STM) methods. The RMSE normally overestimates the error compared to MAE. The RMSE and MAE are not sensitive to direction of bias. The ME gives both direction and magnitude of bias but can be distorted by extreme values while the BES is insensitive to extreme values. The STM is robust for giving the direction of bias; it is not sensitive to extreme values but it does not give the magnitude of bias. The graphical tools (such as time series and cumulative curves) show the performance of the model with time. It is recommended to integrate parametric and nonparametric methods along with graphical methods for a comprehensive analysis of bias of a numerical model.
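
    A small Python sketch of the compared measures, computing RMSE, MAE, and ME together with a sign test on simulated model-versus-station temperatures; the data and the exact STM formulation used in the paper are assumptions.

      import numpy as np
      from scipy.stats import binomtest

      rng = np.random.default_rng(9)
      observed = 20.0 + 2.0 * rng.standard_normal(100)
      simulated = observed + 0.5 + rng.standard_normal(100)   # warm-biased model

      error = simulated - observed
      rmse = np.sqrt(np.mean(error ** 2))
      mae = np.mean(np.abs(error))
      me = np.mean(error)                                      # signed mean error

      # Sign test: are positive errors more frequent than expected by chance?
      n_pos = int(np.sum(error > 0))
      n_nonzero = int(np.sum(error != 0))
      sign_test = binomtest(n_pos, n_nonzero, p=0.5)
      direction = "warm" if n_pos > n_nonzero / 2 else "cold"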

  6. Bayesian Nonparametric Clustering for Positive Definite Matrices.

    Science.gov (United States)

    Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2016-05-01

    Symmetric Positive Definite (SPD) matrices emerge as data descriptors in several applications of computer vision such as object tracking, texture recognition, and diffusion tensor imaging. Clustering these data matrices forms an integral part of these applications, for which soft-clustering algorithms (K-Means, expectation maximization, etc.) are generally used. As is well-known, these algorithms need the number of clusters to be specified, which is difficult when the dataset scales. To address this issue, we resort to the classical nonparametric Bayesian framework by modeling the data as a mixture model using the Dirichlet process (DP) prior. Since these matrices do not conform to Euclidean geometry, but rather belong to a curved Riemannian manifold, existing DP models cannot be directly applied. Thus, in this paper, we propose a novel DP mixture model framework for SPD matrices. Using the log-determinant divergence as the underlying dissimilarity measure to compare these matrices, and further using the connection between this measure and the Wishart distribution, we derive a novel DPM model based on the Wishart-Inverse-Wishart conjugate pair. We apply this model to several applications in computer vision. Our experiments demonstrate that our model is scalable to the dataset size and at the same time achieves superior accuracy compared to several state-of-the-art parametric and nonparametric clustering algorithms.
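
    A minimal sketch of the log-determinant divergence used as the dissimilarity between SPD matrices, applied here to two random covariance descriptors; the full Dirichlet process mixture machinery is not shown.

      import numpy as np

      def logdet_divergence(X, Y):
          # Jensen-Bregman LogDet divergence: log det((X+Y)/2) - 0.5*log det(XY).
          _, ld_mean = np.linalg.slogdet(0.5 * (X + Y))
          _, ld_x = np.linalg.slogdet(X)
          _, ld_y = np.linalg.slogdet(Y)
          return ld_mean - 0.5 * (ld_x + ld_y)

      rng = np.random.default_rng(10)
      A = rng.standard_normal((5, 5))
      B = rng.standard_normal((5, 5))
      X = A @ A.T + 5 * np.eye(5)      # two synthetic SPD descriptors
      Y = B @ B.T + 5 * np.eye(5)

      d = logdet_divergence(X, Y)      # zero iff X == Y, grows with dissimilarity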

  7. Spectral analysis of major heart tones

    Science.gov (United States)

    Lejkowski, W.; Dobrowolski, A. P.; Majka, K.; Olszewski, R.

    2018-04-01

    The World Health Organization (WHO) figures clearly indicate that cardiovascular disease is the most common cause of death and disability in the world. Early detection of cardiovascular pathologies may contribute to reducing such a high mortality rate. Auscultatory examination is one of the first and most important steps in cardiologic diagnostics. Unfortunately, proper diagnosis is closely related to long-term practice and medical experience. The article presents the author's system for recording phonocardiograms and saving the data, as well as the outline of the analysis algorithm, which will allow a case to be assigned, with high probability, to either a patient with heart failure or a healthy volunteer. The results of a pilot study of phonocardiographic signals are also presented as an introduction to further research aimed at developing an efficient diagnostic algorithm based on spectral analysis of the heart tones.

  8. Nonparametric combinatorial sequence models.

    Science.gov (United States)

    Wauthier, Fabian L; Jordan, Michael I; Jojic, Nebojsa

    2011-11-01

    This work considers biological sequences that exhibit combinatorial structures in their composition: groups of positions of the aligned sequences are "linked" and covary as one unit across sequences. If multiple such groups exist, complex interactions can emerge between them. Sequences of this kind arise frequently in biology but methodologies for analyzing them are still being developed. This article presents a nonparametric prior on sequences which allows combinatorial structures to emerge and which induces a posterior distribution over factorized sequence representations. We carry out experiments on three biological sequence families which indicate that combinatorial structures are indeed present and that combinatorial sequence models can more succinctly describe them than simpler mixture models. We conclude with an application to MHC binding prediction which highlights the utility of the posterior distribution over sequence representations induced by the prior. By integrating out the posterior, our method compares favorably to leading binding predictors.

  9. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    Science.gov (United States)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

    This current study is aimed to investigate the use of parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method. Ledre as Bojonegoro unique local food product was used as point of interest, in which 319 panelists were involved in the study. The result showed that ledre is characterized as easy-crushed texture, sticky in mouth, stingy sensation and easy to swallow. It has also strong banana flavour with brown in colour. Compared to eggroll and semprong, ledre has more variances in terms of taste as well the roll length. As RATA questionnaire is designed to collect categorical data, non-parametric approach is the common statistical procedure. However, similar results were also obtained as parametric approach, regardless the fact of non-normal distributed data. Thus, it suggests that parametric approach can be applicable for consumer study with large number of respondents, even though it may not satisfy the assumption of ANOVA (Analysis of Variances).

  10. Archives of Astronomical Spectral Observations and Atomic/Molecular Databases for their Analysis

    Directory of Open Access Journals (Sweden)

    Ryabchikova T.

    2015-12-01

    Full Text Available We present a review of open-source data for stellar spectroscopy investigations. It includes lists of the main archives of medium-to-high resolution spectroscopic observations, with brief characteristics of the archive data (spectral range, resolving power, flux units). We also review atomic and molecular databases that contain parameters of spectral lines, cross-sections and reaction rates needed for a detailed analysis of high resolution, high signal-to-noise ratio stellar spectra.

  11. The analysis of toxic connections content in water by spectral methods

    Science.gov (United States)

    Plotnikova, I. V.; Chaikovskaya, O. N.; Sokolova, I. V.; Artyushin, V. R.

    2017-08-01

    The current state of the environment requires strict observance of measures for the disposal of household and industrial wastes, which involves very substantial expenditures of money and time. Thanks to spectroscopic instrumentation, spectral methods allow rapid quantitative and qualitative analysis to be carried out in workplace and field conditions. This work demonstrates the application of spectral methods to studying the degradation of toxic organic compounds after preliminary irradiation by various sources. Experimental data on the optical density of water under various treatments are given.

  12. Genomic breeding value estimation using nonparametric additive regression models

    Directory of Open Access Journals (Sweden)

    Solberg Trygve

    2009-01-01

    Full Text Available Abstract Genomic selection refers to the use of genomewide dense markers for breeding value estimation and subsequently for selection. The main challenge of genomic breeding value estimation is the estimation of many effects from a limited number of observations. Bayesian methods have been proposed to successfully cope with these challenges. As an alternative class of models, non- and semiparametric models were recently introduced. The present study investigated the ability of nonparametric additive regression models to predict genomic breeding values. The genotypes were modelled for each marker or pair of flanking markers (i.e. the predictors) separately. The nonparametric functions for the predictors were estimated simultaneously using additive model theory, applying a binomial kernel. The optimal degree of smoothing was determined by bootstrapping. A mutation-drift-balance simulation was carried out. The breeding values of the last generation (genotyped) were predicted using data from the preceding generation (genotyped and phenotyped). The results show moderate to high accuracies of the predicted breeding values. A determination of a predictor-specific degree of smoothing increased the accuracy.

  13. Application of spectral decomposition analysis to in vivo quantification of aluminum by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Comsa, D.C. E-mail: comsadc@mcmaster.ca; Prestwich, W.V.; McNeill, F.E.; Byun, S.H

    2004-12-01

    The toxic effects of aluminum are cumulative and result in painful forms of renal osteodystrophy, most notably adynamic bone disease and osteomalacia, but also other forms of disease. The Trace Element Group at McMaster University has developed an accelerator-based in vivo procedure for detecting aluminum body burden by neutron activation analysis (NAA). Further refining of the method was necessary for increasing its sensitivity. In this context, the present study proposes an improved algorithm for data analysis, based on spectral decomposition. A new minimum detectable limit (MDL) of (0.7{+-}0.1) mg Al was reached for a local dose of (20{+-}1) mSv. The study also addresses the feasibility of a new data acquisition technique, the electronic rejection of the coincident events detected by a NaI(Tl) system. It is expected that the application of this technique, together with spectral decomposition analysis, would provide an acceptable MDL for the method to be valuable in a clinical setting.

  14. Non-parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example, the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  15. Non-Parametric Estimation of Diffusion-Paths Using Wavelet Scaling Methods

    DEFF Research Database (Denmark)

    Høg, Esben

    2003-01-01

    In continuous time, diffusion processes have been used for modelling financial dynamics for a long time. For example, the Ornstein-Uhlenbeck process (the simplest mean-reverting process) has been used to model non-speculative price processes. We discuss non-parametric estimation of these processes...

  16. Hierarchical Bayesian nonparametric mixture models for clustering with variable relevance determination.

    Science.gov (United States)

    Yau, Christopher; Holmes, Chris

    2011-07-01

    We propose a hierarchical Bayesian nonparametric mixture model for clustering when some of the covariates are assumed to be of varying relevance to the clustering problem. This can be thought of as an issue in variable selection for unsupervised learning. We demonstrate that by defining a hierarchical population based nonparametric prior on the cluster locations scaled by the inverse covariance matrices of the likelihood we arrive at a 'sparsity prior' representation which admits a conditionally conjugate prior. This allows us to perform full Gibbs sampling to obtain posterior distributions over parameters of interest including an explicit measure of each covariate's relevance and a distribution over the number of potential clusters present in the data. This also allows for individual cluster specific variable selection. We demonstrate improved inference on a number of canonical problems.

  17. High-speed Vibrational Imaging and Spectral Analysis of Lipid Bodies by Compound Raman Microscopy

    OpenAIRE

    Slipchenko, Mikhail N.; Le, Thuc T.; Chen, Hongtao; Cheng, Ji-Xin

    2009-01-01

    Cells store excess energy in the form of cytoplasmic lipid droplets. At present, it is unclear how different types of fatty acids contribute to the formation of lipid-droplets. We describe a compound Raman microscope capable of both high-speed chemical imaging and quantitative spectral analysis on the same platform. We use a picosecond laser source to perform coherent Raman scattering imaging of a biological sample and confocal Raman spectral analysis at points of interest. The potential of t...

  18. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper, a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. The methodology is based on Fourier spectral analysis using filters such as the classical, inverse, and nonlinear k-law filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.

  19. A Nonparametric Test for Seasonal Unit Roots

    OpenAIRE

    Kunst, Robert M.

    2009-01-01

    Abstract: We consider a nonparametric test for the null of seasonal unit roots in quarterly time series that builds on the RUR (records unit root) test by Aparicio, Escribano, and Sipols. We find that the test concept is more promising than a formalization of visual aids such as plots by quarter. In order to cope with the sensitivity of the original RUR test to autocorrelation under its null of a unit root, we suggest an augmentation step by autoregression. We present some evidence on the siz...

  20. Developing an immigration policy for Germany on the basis of a nonparametric labor market classification

    OpenAIRE

    Froelich, Markus; Puhani, Patrick

    2004-01-01

    Based on a nonparametrically estimated model of labor market classifications, this paper makes suggestions for immigration policy using data from western Germany in the 1990s. It is demonstrated that nonparametric regression is feasible in higher dimensions with only a few thousand observations. In sum, labor markets able to absorb immigrants are characterized by above average age and by professional occupations. On the other hand, labor markets for young workers in service occupations are id...

  1. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    Science.gov (United States)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information, computers, and networks, modern educational technology has entered a new era, which has a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy, spectral analysis, and the basic technical means of testing, and then to let students use the principles and technology of spectroscopy to study the structure and state of materials and the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language. A proprietary language developed by MathWorks, MATLAB allows matrix manipulations and the plotting of functions and data. Based on teaching practice, this paper summarizes the current state of applying Matlab to the teaching of spectroscopy, which is suitable for most present-day multimedia-assisted teaching in schools.

  2. A comparative study of non-parametric models for identification of ...

    African Journals Online (AJOL)

    However, the frequency response method using random binary signals was good for unpredicted white noise characteristics and considered the best method for non-parametric system identification. The autoregressive external input (ARX) model was very useful for system identification, but on application, few input ...

  3. A semi-nonparametric mixture model for selecting functionally consistent proteins.

    Science.gov (United States)

    Yu, Lianbo; Doerge, Rw

    2010-09-28

    High-throughput technologies have led to a new era of proteomics. Although protein microarray experiments are becoming more common place there are a variety of experimental and statistical issues that have yet to be addressed, and that will carry over to new high-throughput technologies unless they are investigated. One of the largest of these challenges is the selection of functionally consistent proteins. We present a novel semi-nonparametric mixture model for classifying proteins as consistent or inconsistent while controlling the false discovery rate and the false non-discovery rate. The performance of the proposed approach is compared to current methods via simulation under a variety of experimental conditions. We provide a statistical method for selecting functionally consistent proteins in the context of protein microarray experiments, but the proposed semi-nonparametric mixture model method can certainly be generalized to solve other mixture data problems. The main advantage of this approach is that it provides the posterior probability of consistency for each protein.

  4. 1st Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Lahiri, S; Politis, Dimitris

    2014-01-01

    This volume is composed of peer-reviewed papers that have developed from the First Conference of the International Society for NonParametric Statistics (ISNPS). This inaugural conference took place in Chalkidiki, Greece, June 15-19, 2012. It was organized with the co-sponsorship of the IMS, the ISI, and other organizations. M.G. Akritas, S.N. Lahiri, and D.N. Politis are the first executive committee members of ISNPS, and the editors of this volume. ISNPS has a distinguished Advisory Committee that includes Professors R.Beran, P.Bickel, R. Carroll, D. Cook, P. Hall, R. Johnson, B. Lindsay, E. Parzen, P. Robinson, M. Rosenblatt, G. Roussas, T. SubbaRao, and G. Wahba. The Charting Committee of ISNPS consists of more than 50 prominent researchers from all over the world.   The chapters in this volume bring forth recent advances and trends in several areas of nonparametric statistics. In this way, the volume facilitates the exchange of research ideas, promotes collaboration among researchers from all over the wo...

  5. On Parametric (and Non-Parametric Variation

    Directory of Open Access Journals (Sweden)

    Neil Smith

    2009-11-01

    Full Text Available This article raises the issue of the correct characterization of ‘Parametric Variation’ in syntax and phonology. After specifying their theoretical commitments, the authors outline the relevant parts of the Principles–and–Parameters framework, and draw a three-way distinction among Universal Principles, Parameters, and Accidents. The core of the contribution then consists of an attempt to provide identity criteria for parametric, as opposed to non-parametric, variation. Parametric choices must be antecedently known, and it is suggested that they must also satisfy seven individually necessary and jointly sufficient criteria. These are that they be cognitively represented, systematic, dependent on the input, deterministic, discrete, mutually exclusive, and irreversible.

  6. Spectral Analysis of the sdO Standard Star Feige 34

    Science.gov (United States)

    Latour, M.; Chayer, P.; Green, E. M.; Fontaine, G.

    2017-03-01

    We present our current work on the spectral analysis of the hot sdO star Feige 34. We combine high S/N optical spectra and fully-blanketed non-LTE model atmospheres to derive its fundamental parameters (Teff, log g) and helium abundance. Our best fits indicate Teff = 63 000 K, log g = 6.0 and log N(He)/N(H) = -1.8. We also use available ultraviolet spectra (IUE and FUSE) to measure metal abundances. We find the star to be enriched in iron and nickel by a factor of ten with respect to the solar values, while lighter elements have subsolar abundances. The FUSE spectrum suggests that the spectral lines could be broadened by rotation.

  7. Arbitrary-order Hilbert Spectral Analysis and Intermittency in Solar Wind Density Fluctuations

    Science.gov (United States)

    Carbone, Francesco; Sorriso-Valvo, Luca; Alberti, Tommaso; Lepreti, Fabio; Chen, Christopher H. K.; Němeček, Zdenek; Šafránková, Jana

    2018-05-01

    The properties of inertial- and kinetic-range solar wind turbulence have been investigated with the arbitrary-order Hilbert spectral analysis method, applied to high-resolution density measurements. Due to the small sample size and to the presence of strong nonstationary behavior and large-scale structures, the classical analysis in terms of structure functions may prove to be unsuccessful in detecting the power-law behavior in the inertial range, and may underestimate the scaling exponents. However, the Hilbert spectral method provides an optimal estimation of the scaling exponents, which have been found to be close to those for velocity fluctuations in fully developed hydrodynamic turbulence. At smaller scales, below the proton gyroscale, the system loses its intermittent multiscaling properties and converges to a monofractal process. The resulting scaling exponents, obtained at small scales, are in good agreement with those of classical fractional Brownian motion, indicating a long-term memory in the process, and the absence of correlations around the spectral-break scale. These results provide important constraints on models of kinetic-range turbulence in the solar wind.

  8. The role of the computer in automated spectral analysis

    International Nuclear Information System (INIS)

    Rasmussen, S.E.

    This report describes how a computer can be an extremely valuable tool for routine analysis of spectra, which is a time consuming process. A number of general-purpose algorithms that are available for the various phases of the analysis can be implemented, if these algorithms are designed to cope with all the variations that may occur. Since this is basically impossible, one must find a compromise between obscure error and program complexity. This is usually possible with human interaction at critical points. In spectral analysis this is possible if the user scans the data on an interactive graphics terminal, makes the necessary changes and then returns control to the computer for completion of the analysis

  9. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. For the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts. The first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the parameters of mutation of NAEP. Two real flue gas datasets are used in the experiments. In order to present the effectiveness of the methods, the partial least squares with full spectrum, the partial least squares combined with genetic algorithm, the uninformative variable elimination method, the backpropagation neural network with full spectrum, the backpropagation neural network combined with genetic algorithm, and the proposed method are performed for building the component prediction model. Experimental results verify that, as a practical spectral analysis tool, the proposed method predicts more accurately and robustly.

  10. Non-parametric estimation of the individual's utility map

    OpenAIRE

    Noguchi, Takao; Sanborn, Adam N.; Stewart, Neil

    2013-01-01

    Models of risky choice have attracted much attention in behavioural economics. Previous research has repeatedly demonstrated that individuals' choices are not well explained by expected utility theory, and a number of alternative models have been examined using carefully selected sets of choice alternatives. The model performance however, can depend on which choice alternatives are being tested. Here we develop a non-parametric method for estimating the utility map over the wide range of choi...

  11. Nonparametric Bayesian models through probit stick-breaking processes.

    Science.gov (United States)

    Rodríguez, Abel; Dunson, David B

    2011-03-01

    We describe a novel class of Bayesian nonparametric priors based on stick-breaking constructions where the weights of the process are constructed as probit transformations of normal random variables. We show that these priors are extremely flexible, allowing us to generate a great variety of models while preserving computational simplicity. Particular emphasis is placed on the construction of rich temporal and spatial processes, which are applied to two problems in finance and ecology.

  12. Kernel bandwidth estimation for non-parametric density estimation: a comparative study

    CSIR Research Space (South Africa)

    Van der Walt, CM

    2013-12-01

    Full Text Available We investigate the performance of conventional bandwidth estimators for non-parametric kernel density estimation on a number of representative pattern-recognition tasks, to gain a better understanding of the behaviour of these estimators in high...
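
    For illustration, a short Python sketch contrasting two conventional bandwidth estimators, Silverman's rule of thumb and likelihood cross-validation with scikit-learn's KernelDensity; the data and search grid are assumptions.

      import numpy as np
      from sklearn.model_selection import GridSearchCV
      from sklearn.neighbors import KernelDensity

      rng = np.random.default_rng(11)
      x = np.concatenate([rng.normal(0, 1, 400), rng.normal(5, 0.5, 100)])

      # Silverman's rule of thumb (simple Gaussian-reference version).
      h_silverman = 1.06 * x.std(ddof=1) * x.size ** (-1 / 5)

      # Bandwidth chosen by 5-fold likelihood cross-validation.
      grid = GridSearchCV(KernelDensity(kernel="gaussian"),
                          {"bandwidth": np.logspace(-1.5, 0.5, 30)}, cv=5)
      grid.fit(x.reshape(-1, 1))
      h_cv = grid.best_params_["bandwidth"]

      # On multimodal data such as this, the cross-validated bandwidth usually
      # adapts better than the rule of thumb, which tends to oversmooth.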

  13. A general approach to posterior contraction in nonparametric inverse problems

    NARCIS (Netherlands)

    Knapik, Bartek; Salomond, Jean Bernard

    In this paper, we propose a general method to derive an upper bound for the contraction rate of the posterior distribution for nonparametric inverse problems. We present a general theorem that allows us to derive contraction rates for the parameter of interest from contraction rates of the related

  14. An exploratory data analysis of electroencephalograms using the functional boxplots approach

    KAUST Repository

    Ngo, Duy

    2015-08-19

    Many model-based methods have been developed over the last several decades for analysis of electroencephalograms (EEGs) in order to understand electrical neural data. In this work, we propose to use the functional boxplot (FBP) to analyze log periodograms of EEG time series data in the spectral domain. The functional boxplot approach produces a median curve, which is not equivalent to connecting medians obtained from frequency-specific boxplots. In addition, this approach identifies a functional median, summarizes variability, and detects potential outliers. By extending the FBP analysis from one-dimensional curves to surfaces, surface boxplots are also used to explore the variation of the spectral power for the alpha (8–12 Hz) and beta (16–32 Hz) frequency bands across the brain cortical surface. By using rank-based nonparametric tests, we also investigate the stationarity of EEG traces across an exam acquired during resting-state by comparing the spectrum during the early vs. late phases of a single resting-state EEG exam.
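
    A hedged Python sketch of the core step, one log periodogram per epoch followed by a functional boxplot (here via statsmodels' fboxplot), applied to simulated EEG-like epochs; the sampling rate, epoch length, and data are assumptions.

      import numpy as np
      from scipy.signal import welch
      from statsmodels.graphics.functional import fboxplot

      rng = np.random.default_rng(12)
      fs = 250.0                                        # assumed sampling rate
      epochs = rng.standard_normal((40, 2 * int(fs)))   # 40 two-second epochs

      # One log periodogram (Welch estimate) per epoch.
      curves = []
      for epoch in epochs:
          f, pxx = welch(epoch, fs=fs, nperseg=256)
          curves.append(np.log(pxx))

      # Functional boxplot: median curve, central envelope, and outlying epochs
      # flagged via band depth.
      fig, depth, ix_depth, ix_outliers = fboxplot(np.array(curves), xdata=f)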

  15. An exploratory data analysis of electroencephalograms using the functional boxplots approach

    Science.gov (United States)

    Ngo, Duy; Sun, Ying; Genton, Marc G.; Wu, Jennifer; Srinivasan, Ramesh; Cramer, Steven C.; Ombao, Hernando

    2015-01-01

    Many model-based methods have been developed over the last several decades for analysis of electroencephalograms (EEGs) in order to understand electrical neural data. In this work, we propose to use the functional boxplot (FBP) to analyze log periodograms of EEG time series data in the spectral domain. The functional boxplot approach produces a median curve, which is not equivalent to connecting the medians obtained from frequency-specific boxplots. In addition, this approach identifies a functional median, summarizes variability, and detects potential outliers. By extending FBP analysis from one-dimensional curves to surfaces, surface boxplots are also used to explore the variation of the spectral power for the alpha (8–12 Hz) and beta (16–32 Hz) frequency bands across the brain cortical surface. By using rank-based nonparametric tests, we also investigate the stationarity of EEG traces across an exam acquired during resting-state by comparing the spectrum during the early vs. late phases of a single resting-state EEG exam. PMID:26347598

  16. An exploratory data analysis of electroencephalograms using the functional boxplots approach

    KAUST Repository

    Ngo, Duy; Sun, Ying; Genton, Marc G.; Wu, Jennifer; Srinivasan, Ramesh; Cramer, Steven C.; Ombao, Hernando

    2015-01-01

    Many model-based methods have been developed over the last several decades for analysis of electroencephalograms (EEGs) in order to understand electrical neural data. In this work, we propose to use the functional boxplot (FBP) to analyze log periodograms of EEG time series data in the spectral domain. The functional boxplot approach produces a median curve, which is not equivalent to connecting the medians obtained from frequency-specific boxplots. In addition, this approach identifies a functional median, summarizes variability, and detects potential outliers. By extending FBP analysis from one-dimensional curves to surfaces, surface boxplots are also used to explore the variation of the spectral power for the alpha (8–12 Hz) and beta (16–32 Hz) frequency bands across the brain cortical surface. By using rank-based nonparametric tests, we also investigate the stationarity of EEG traces across an exam acquired during resting-state by comparing the spectrum during the early vs. late phases of a single resting-state EEG exam.

  17. Real-time spectral analysis of HRV signals: an interactive and user-friendly PC system.

    Science.gov (United States)

    Basano, L; Canepa, F; Ottonello, P

    1998-01-01

    We present a real-time system, built around a PC and a low-cost data acquisition board, for the spectral analysis of the heart rate variability signal. The Windows-like operating environment on which it is based makes the computer program very user-friendly even for non-specialized personnel. The power spectral density is computed with a hybrid method, in which a classical FFT analysis follows an autoregressive finite extension of the data; the stationarity of the sequence is continuously checked. This algorithm makes the spectral estimation highly robust. Moreover, the FFT of every data block is also computed and displayed in real time, both to corroborate the results and to allow the user to interactively choose a proper AR model order.
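
    A hedged sketch of the hybrid idea (autoregressive extension of the data followed by a classical FFT), on a synthetic RR-interval series; the paper's exact extension scheme, stationarity check and acquisition details are not reproduced, and the sampling rate and AR order below are assumptions.

      import numpy as np
      from scipy.signal import periodogram

      rng = np.random.default_rng(4)
      fs, n = 4.0, 256                                   # evenly resampled RR series (assumed)
      t = np.arange(n) / fs
      rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * t) + 0.01 * rng.normal(size=n)
      x = rr - rr.mean()

      p, n_extra = 8, 256                                # AR order and extension length
      A = np.column_stack([x[p - i - 1 : n - i - 1] for i in range(p)])   # lagged regressors
      a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)      # least-squares AR coefficients

      ext = list(x)
      for _ in range(n_extra):                           # noise-free forward AR extension
          ext.append(np.dot(a, ext[-1 : -p - 1 : -1]))
      freqs, psd = periodogram(np.array(ext), fs=fs)     # classical FFT-based PSD of extended data
      print("peak frequency (Hz):", round(freqs[np.argmax(psd)], 3))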

  18. Semiclassical analysis spectral correlations in mesoscopic systems

    International Nuclear Information System (INIS)

    Argaman, N.; Imry, Y.; Smilansky, U.

    1991-07-01

    We consider the recently developed semiclassical analysis of the quantum mechanical spectral form factor, which may be expressed in terms of classically definable properties. When applied to electrons whose classical behaviour is diffusive, the results of earlier quantum mechanical perturbative derivations, which were developed under a different set of assumptions, are reproduced. The comparison between the two derivations shows that the results depend not on their specific details, but to a large extent on the principle of quantum coherent superposition and on the generality of the notion of diffusion. The connection with classical properties facilitates application to many physical situations. (author)

  19. Piecewise spectrally band-pass for compressive coded aperture spectral imaging

    International Nuclear Information System (INIS)

    Qian Lu-Lu; Lü Qun-Bo; Huang Min; Xiang Li-Bin

    2015-01-01

    Coded aperture snapshot spectral imaging (CASSI) has been discussed in recent years. It has the remarkable advantages of high optical throughput, snapshot imaging, etc. The entire spatial-spectral data-cube can be reconstructed with just a single two-dimensional (2D) compressive sensing measurement. On the other hand, for less spectrally sparse scenes, the insufficiency of sparse sampling and aliasing in spatial-spectral images reduce the accuracy of the reconstructed three-dimensional (3D) spectral cube. To solve this problem, this paper extends the improved CASSI. A band-pass filter array is mounted on the coded mask, and then the first image plane is divided into some continuous spectral sub-band areas. The entire 3D spectral cube can be captured through the relative movement between the object and the instrument. The analysis of the principle and an imaging simulation are presented. Comparing the peak signal-to-noise ratio (PSNR) and the information entropy of the reconstructed images for different numbers of spectral sub-band areas shows an observable improvement in the reconstruction fidelity of the 3D spectral cube as the number of sub-bands increases and the number of spectral channels of each sub-band simultaneously decreases. (paper)

  20. Spectral Electroencephalogram Analysis for the Evaluation of Encephalopathy Grade in Children With Acute Liver Failure.

    Science.gov (United States)

    Press, Craig A; Morgan, Lindsey; Mills, Michele; Stack, Cynthia V; Goldstein, Joshua L; Alonso, Estella M; Wainwright, Mark S

    2017-01-01

    Spectral electroencephalogram analysis is a method for automated analysis of electroencephalogram patterns, which can be performed at the bedside. We sought to determine the utility of spectral electroencephalogram for grading hepatic encephalopathy in children with acute liver failure. Retrospective cohort study. Tertiary care pediatric hospital. Patients between 0 and 18 years old who presented with acute liver failure and were admitted to the PICU. None. Electroencephalograms were analyzed by spectral analysis including total power, relative δ, relative θ, relative α, relative β, θ-to-Δ ratio, and α-to-Δ ratio. Normal values and ranges were first derived using normal electroencephalograms from 70 children of 0-18 years old. Age had a significant effect on each variable measured (p liver failure were available for spectral analysis. The median age was 4.3 years, 14 of 33 were male, and the majority had an indeterminate etiology of acute liver failure. Neuroimaging was performed in 26 cases and was normal in 20 cases (77%). The majority (64%) survived, and 82% had a good outcome with a score of 1-3 on the Pediatric Glasgow Outcome Scale-Extended at the time of discharge. Hepatic encephalopathy grade correlated with the qualitative visual electroencephalogram scores assigned by blinded neurophysiologists (rs = 0.493; p encephalopathy was correlated with a total power of less than or equal to 50% of normal for children 0-3 years old, and with a relative θ of less than or equal to 50% normal for children more than 3 years old (p > 0.05). Spectral electroencephalogram classification correlated with outcome (p encephalopathy and correlates with outcome. Spectral electroencephalogram may allow improved quantitative and reproducible assessment of hepatic encephalopathy grade in children with acute liver failure.

  1. A BAYESIAN NONPARAMETRIC MIXTURE MODEL FOR SELECTING GENES AND GENE SUBNETWORKS.

    Science.gov (United States)

    Zhao, Yize; Kang, Jian; Yu, Tianwei

    2014-06-01

    It is very challenging to select informative features from tens of thousands of measured features in high-throughput data analysis. Recently, several parametric/regression models have been developed utilizing the gene network information to select genes or pathways strongly associated with a clinical/biological outcome. Alternatively, in this paper, we propose a nonparametric Bayesian model for gene selection incorporating network information. In addition to identifying genes that have a strong association with a clinical outcome, our model can select genes with particular expressional behavior, in which case the regression models are not directly applicable. We show that our proposed model is equivalent to an infinite mixture model for which we develop a posterior computation algorithm based on Markov chain Monte Carlo (MCMC) methods. We also propose two fast computing algorithms that approximate the posterior simulation with good accuracy but relatively low computational cost. We illustrate our methods on simulation studies and the analysis of Spellman yeast cell cycle microarray data.

  2. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method.

    Science.gov (United States)

    Zhang, Tingting; Kou, S C

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure.

  3. A Bayesian Beta-Mixture Model for Nonparametric IRT (BBM-IRT)

    Science.gov (United States)

    Arenson, Ethan A.; Karabatsos, George

    2017-01-01

    Item response models typically assume that the item characteristic (step) curves follow a logistic or normal cumulative distribution function, which are strictly monotone functions of person test ability. Such assumptions can be overly-restrictive for real item response data. We propose a simple and more flexible Bayesian nonparametric IRT model…

  4. A non-parametric method for correction of global radiation observations

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt

    2013-01-01

    Errors in the observations are corrected; these include tilt in the leveling of the sensor, shadowing from surrounding objects, clipping and saturation in the signal processing, and errors from dirt and wear. The method is based on a statistical non-parametric clear-sky model which is applied to both...

  5. EXOPLANETARY DETECTION BY MULTIFRACTAL SPECTRAL ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Sahil; Wettlaufer, John S. [Program in Applied Mathematics, Yale University, New Haven, CT (United States); Sordo, Fabio Del [Department of Astronomy, Yale University, New Haven, CT (United States)

    2017-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies timescales that characterize planetary orbital motion around the host star and those that arise from stellar features such as spots. Without fitting stellar models to spectral data, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observations of transiting planets, combining this method with simple geometry allows us to relate the timescales obtained to the primary and secondary eclipses of the exoplanets. Making use of data obtained with ground-based and space-based observations, we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via Doppler shift detection. Finally, we have analyzed synthetic spectra obtained using the SOAP 2.0 tool, which simulates a stellar spectrum and the influence of the presence of a planet or a spot on that spectrum over one orbital period. We have demonstrated that, so long as the signal-to-noise ratio is ≥ 75, our approach reconstructs the planetary orbital period, as well as the rotation period of a spot on the stellar surface.

  6. On asymptotic analysis of spectral problems in elasticity

    Directory of Open Access Journals (Sweden)

    S.A. Nazarov

    Full Text Available The three-dimensional spectral elasticity problem is studied in an anisotropic and inhomogeneous solid with small defects, i.e., inclusions, voids, and microcracks. Asymptotics of eigenfrequencies and the corresponding elastic eigenmodes are constructed and justified. New technicalities of the asymptotic analysis are related to variable coefficients of differential operators, vectorial setting of the problem, and usage of intrinsic integral characteristics of defects. The asymptotic formulae are developed in a form convenient for application in shape optimization and inverse problems.

  7. Application of nonparametric statistics to material strength/reliability assessment

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-01-01

    Advanced materials technology requires a database on a wide variety of material behaviors, which must be established experimentally. It may often happen that experiments are practically limited in terms of reproducibility or the range of test parameters. Statistical methods can be applied to quantify such uncertainties in the manner required from the reliability point of view. Statistical assessment involves determination of a most probable value and of the maximum and/or minimum value as a one-sided or two-sided confidence limit. A scatter of test data can be approximated by a theoretical distribution only if the goodness of fit satisfies a test criterion. Alternatively, nonparametric statistics (NPS), or distribution-free statistics, can be applied. Mathematical procedures by NPS are well established for dealing with most reliability problems; they handle only the order statistics of a sample. Mathematical formulas and some applications to engineering assessments are described. They include confidence limits of the median, the population coverage of a sample, the required minimum sample size, and confidence limits of fracture probability. These applications demonstrate that nonparametric statistical estimation is useful for logical decision making when a large uncertainty exists. (author)
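
    Two of the standard order-statistics calculations alluded to above can be sketched directly; the data are synthetic and the code is an illustration rather than the assessments described by the author.

      import numpy as np
      from scipy.stats import binom

      rng = np.random.default_rng(5)
      x = np.sort(rng.weibull(1.5, size=30))          # e.g., strength test results (synthetic)
      n, conf = len(x), 0.95

      # Distribution-free confidence limits for the median: the narrowest symmetric pair of
      # order statistics (x[k], x[n-1-k]) whose exact binomial coverage is still >= conf.
      k = 0
      while binom.cdf(n - k - 1, n, 0.5) - binom.cdf(k, n, 0.5) >= conf:
          k += 1
      k -= 1
      print(f"{conf:.0%} CI for the median: ({x[k]:.3f}, {x[n - 1 - k]:.3f})")

      # Required minimum sample size so that the sample maximum covers a fraction gamma of the
      # population with confidence beta: 1 - gamma**n >= beta.
      gamma = beta = 0.95
      n_min = int(np.ceil(np.log(1.0 - beta) / np.log(gamma)))
      print("minimum sample size for 95/95 one-sided coverage:", n_min)   # 59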

  8. Spectral Depth Analysis of some Segments of the Bida Basin ...

    African Journals Online (AJOL)

    ADOWIE PERE

    2017-12-16

    Dec 16, 2017 ... ABSTRACT: Spectral depth analysis was carried out on ten (10) of the 2009 total magnetic field intensity data sheets covering some segments of the Bida basin, to determine the depth to magnetic basement within the basin. The data was ... groundwater lie concealed beneath the earth surface and the ...

  9. Exact nonparametric inference for detection of nonlinear determinism

    OpenAIRE

    Luo, Xiaodong; Zhang, Jie; Small, Michael; Moroz, Irene

    2005-01-01

    We propose an exact nonparametric inference scheme for the detection of nonlinear determinism. The essential fact utilized in our scheme is that, for a linear stochastic process with jointly symmetric innovations, its ordinary least square (OLS) linear prediction error is symmetric about zero. Based on this viewpoint, a class of linear signed rank statistics, e.g. the Wilcoxon signed rank statistic, can be derived with the known null distributions from the prediction error. Thus one of the ad...
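
    The essential idea can be sketched as follows: fit an ordinary least squares linear (autoregressive) predictor and apply the Wilcoxon signed-rank test to the one-step prediction errors, which should be symmetric about zero under a linear model with jointly symmetric innovations. This is an illustration on a synthetic noisy logistic map, not the authors' exact test construction.

      import numpy as np
      from scipy.stats import wilcoxon

      rng = np.random.default_rng(6)
      n, p = 500, 3
      x = np.empty(n)
      x[0] = 0.4
      for k in range(1, n):                              # noisy logistic map: nonlinear dynamics
          x[k] = 3.8 * x[k - 1] * (1 - x[k - 1]) + 0.01 * rng.normal()
      x = x - x.mean()

      A = np.column_stack([x[p - i - 1 : n - i - 1] for i in range(p)])   # lagged regressors
      coef, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
      errors = x[p:] - A @ coef                          # OLS one-step prediction errors
      stat, pvalue = wilcoxon(errors)                    # H0: errors symmetric about zero
      print(f"Wilcoxon signed-rank p-value: {pvalue:.3g}")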

  10. Thermal infrared spectral analysis of compacted fine-grained mineral mixtures: implications for spectral interpretation of lithified sedimentary materials on Mars

    Science.gov (United States)

    Pan, C.; Rogers, D.

    2012-12-01

    Characterizing the thermal infrared (TIR) spectral mixing behavior of compacted fine-grained mineral assemblages is necessary for facilitating quantitative mineralogy of sedimentary surfaces from spectral measurements. Previous researchers have demonstrated that TIR spectra from igneous and metamorphic rocks as well as coarse-grained (>63 micron) sand mixtures combine in proportion to their volume abundance. However, the spectral mixing behavior of compacted, fine-grained mineral mixtures that would be characteristic of sedimentary depositional environments has received little attention. Here we characterize the spectral properties of pressed pellet samples of minerals that were ground with a mortar and pestle and centrifuged to obtain a less-than-10-micron size fraction. Pure phases and mixtures of two, three and four components were made in varying proportions by volume. All of the samples were pressed into pellets at 15,000 psi to minimize volume scattering. Thermal infrared spectra of the pellets were measured in the Vibrational Spectroscopy Laboratory at Stony Brook University with a Thermo Fisher Nicolet 6700 Fourier transform infrared Michelson interferometer from ~225 to 2000 cm-1. Our preliminary results indicate that some pelletized samples have contributions from volume scattering, which leads to non-linear spectral combinations. It is not clear if the transparency features (which arise from multiple surface reflections of incident photons) are due to minor clinging fines on an otherwise specular pellet surface or to partially transmitted energy through optically thin grains in the compacted mixture. Inclusion of loose powder (analysis of TES and Mini-TES data of lithified sedimentary deposits.

  11. Use of fast Fourier transform in gamma-ray spectral analysis

    International Nuclear Information System (INIS)

    Tominaga, Shoji; Nayatani, Yoshinobu; Nagata, Shojiro; Sasaki, Takashi; Ueda, Isamu.

    1978-01-01

    In order to simplify the mass data processing in a response matrix method for γ-ray spectral analysis, a method using a Fast Fourier Transform has been devised. The validity of the method has been confirmed by computer simulation for spectra of a NaI detector. First, it is shown that spectral data can be represented by Fourier series with a reduced number of terms. Then the estimation of intensities of γ-ray components is performed by a matrix operation using the compressed data of an observation spectrum and standard spectra in Fourier coefficients. The identification of γ-ray energies is also easy. Several features of the method and a general problem to be solved in relation to a response matrix method are described. (author)
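
    A hedged sketch of the data-compression idea: keep only the leading Fourier coefficients of the observed spectrum and of each standard (single-component) spectrum, then estimate the component intensities by least squares in the compressed domain. The detector responses below are synthetic Gaussian peaks, not NaI response functions.

      import numpy as np

      rng = np.random.default_rng(7)
      channels = np.arange(512)

      def peak(center, width=12.0):                       # stand-in for a standard spectrum
          return np.exp(-0.5 * ((channels - center) / width) ** 2)

      standards = np.column_stack([peak(c) for c in (120, 260, 400)])
      true_intensities = np.array([3.0, 1.5, 0.7])
      observed = standards @ true_intensities + 0.05 * rng.normal(size=channels.size)

      K = 40                                              # number of Fourier terms kept
      compress = lambda s: np.fft.rfft(s)[:K]             # low-order Fourier coefficients

      S = np.column_stack([compress(standards[:, j]) for j in range(standards.shape[1])])
      b = compress(observed)
      S_ri = np.vstack([S.real, S.imag])                  # stack real/imaginary parts for lstsq
      b_ri = np.concatenate([b.real, b.imag])
      estimate, *_ = np.linalg.lstsq(S_ri, b_ri, rcond=None)
      print("estimated intensities:", np.round(estimate, 3))   # close to [3.0, 1.5, 0.7]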

  12. Promotion time cure rate model with nonparametric form of covariate effects.

    Science.gov (United States)

    Chen, Tianlei; Du, Pang

    2018-05-10

    Survival data with a cured portion are commonly seen in clinical trials. Motivated from a biological interpretation of cancer metastasis, promotion time cure model is a popular alternative to the mixture cure rate model for analyzing such data. The existing promotion cure models all assume a restrictive parametric form of covariate effects, which can be incorrectly specified especially at the exploratory stage. In this paper, we propose a nonparametric approach to modeling the covariate effects under the framework of promotion time cure model. The covariate effect function is estimated by smoothing splines via the optimization of a penalized profile likelihood. Point-wise interval estimates are also derived from the Bayesian interpretation of the penalized profile likelihood. Asymptotic convergence rates are established for the proposed estimates. Simulations show excellent performance of the proposed nonparametric method, which is then applied to a melanoma study. Copyright © 2018 John Wiley & Sons, Ltd.

  13. Monitoring urban greenness dynamics using multiple endmember spectral mixture analysis.

    Directory of Open Access Journals (Sweden)

    Muye Gan

    Full Text Available Urban greenness is increasingly recognized as an essential constituent of the urban environment and can provide a range of services and enhance residents' quality of life. Understanding the pattern of urban greenness and exploring its spatiotemporal dynamics would contribute valuable information for urban planning. In this paper, we investigated the pattern of urban greenness in Hangzhou, China, over the past two decades using time series Landsat-5 TM data obtained in 1990, 2002, and 2010. Multiple endmember spectral mixture analysis was used to derive vegetation cover fractions at the subpixel level. An RGB-vegetation fraction model, change intensity analysis and the concentric technique were integrated to reveal the detailed, spatial characteristics and the overall pattern of change in the vegetation cover fraction. Our results demonstrated the ability of multiple endmember spectral mixture analysis to accurately model the vegetation cover fraction in pixels despite the complex spectral confusion of different land cover types. The integration of multiple techniques revealed various changing patterns in urban greenness in this region. The overall vegetation cover has exhibited a drastic decrease over the past two decades, while no significant change occurred in the scenic spots that were studied. Meanwhile, a remarkable recovery of greenness was observed in the existing urban area. The increasing coverage of small green patches has played a vital role in the recovery of urban greenness. These changing patterns were more obvious during the period from 2002 to 2010 than from 1990 to 2002, and they revealed the combined effects of rapid urbanization and greening policies. This work demonstrates the usefulness of time series of vegetation cover fractions for conducting accurate and in-depth studies of the long-term trajectories of urban greenness to obtain meaningful information for sustainable urban development.

  14. Notes on the Implementation of Non-Parametric Statistics within the Westinghouse Realistic Large Break LOCA Evaluation Model (ASTRUM)

    International Nuclear Information System (INIS)

    Frepoli, Cesare; Oriani, Luca

    2006-01-01

    In recent years, non-parametric or order statistics methods have been widely used to assess the impact of the uncertainties within Best-Estimate LOCA evaluation models. The bounding of the uncertainties is achieved with a direct Monte Carlo sampling of the uncertainty attributes, with the minimum trial number selected to 'stabilize' the estimation of the critical output values (peak cladding temperature (PCT), local maximum oxidation (LMO), and core-wide oxidation (CWO)). A non-parametric order statistics uncertainty analysis was recently implemented within the Westinghouse Realistic Large Break LOCA evaluation model, also referred to as 'Automated Statistical Treatment of Uncertainty Method' (ASTRUM). The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature. The USNRC-approved Westinghouse method follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a Large Break LOCA analysis. This is a solid approach which guarantees that a bounding value (at 95% probability) of the 95th percentile for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO and CWO) is obtained. The objective of this paper is to provide additional insights into the ASTRUM statistical approach, with a more in-depth analysis of the pros and cons of order statistics and of the Westinghouse approach to implementing this statistical methodology. (authors)
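
    The run count quoted above can be reproduced with the usual order-statistics sizing rule (Wilks for a single output, Guba-Makai for several), shown in the hedged sketch below; whether this is exactly how ASTRUM derives its 124 runs is not claimed here.

      from scipy.stats import binom

      def minimum_runs(p_outputs, gamma=0.95, beta=0.95):
          """Smallest N such that the top p order statistics bound the gamma-quantile of
          p outputs with confidence beta: P(Binomial(N, gamma) <= N - p) >= beta."""
          N = p_outputs
          while binom.cdf(N - p_outputs, N, gamma) < beta:
              N += 1
          return N

      print(minimum_runs(1))   # 59  (classical first-order Wilks, one output)
      print(minimum_runs(3))   # 124 (three outputs, e.g. PCT, LMO and CWO, at 95/95)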

  15. Spectral analysis of the He-enriched sdO-star HD 127493

    Science.gov (United States)

    Dorsch, Matti; Latour, Marilyn; Heber, Ulrich

    2018-02-01

    The bright sdO star HD 127493 is known to be of mixed H/He composition, and excellent archival spectra covering both the optical and ultraviolet ranges are available. UV spectra play a key role as they give access to many chemical species that do not show spectral lines in the optical, such as iron and nickel. This encouraged the quantitative spectral analysis of this prototypical mixed H/He composition sdO star. We determined atmospheric parameters for HD 127493 as well as the abundances of C, N, O, Si, S, Fe, and Ni in the atmosphere using non-LTE model atmospheres calculated with TLUSTY/SYNSPEC. A comparison between the parallax distance measured by Hipparcos and the derived spectroscopic distance indicates that the derived atmospheric parameters are realistic. From our metal abundance analysis, we find a strong CNO signature and enrichment in iron and nickel.

  16. Localized Spectral Analysis of Fluctuating Power Generation from Solar Energy Systems

    Directory of Open Access Journals (Sweden)

    Johan Nijs

    2007-01-01

    Full Text Available Fluctuations in solar irradiance are a serious obstacle for the future large-scale application of photovoltaics. Occurring regularly with the passage of clouds, they can cause unexpected power variations and introduce voltage dips to the power distribution system. This paper proposes the treatment of such fluctuating time series as realizations of a stochastic, locally stationary, wavelet process. Its local spectral density can be estimated from empirical data by means of wavelet periodograms. The wavelet approach allows the analysis of the amplitude of fluctuations per characteristic scale, hence, persistence of the fluctuation. Furthermore, conclusions can be drawn on the frequency of occurrence of fluctuations of different scale. This localized spectral analysis was applied to empirical data of two successive years. The approach is especially useful for network planning and load management of power distribution systems containing a high density of photovoltaic generation units.

  17. Spectral analysis and multigrid preconditioners for two-dimensional space-fractional diffusion equations

    Science.gov (United States)

    Moghaderi, Hamid; Dehghan, Mehdi; Donatelli, Marco; Mazza, Mariarosa

    2017-12-01

    Fractional diffusion equations (FDEs) are a mathematical tool used for describing some special diffusion phenomena arising in many different applications like porous media and computational finance. In this paper, we focus on a two-dimensional space-FDE problem discretized by means of a second order finite difference scheme obtained as combination of the Crank-Nicolson scheme and the so-called weighted and shifted Grünwald formula. By fully exploiting the Toeplitz-like structure of the resulting linear system, we provide a detailed spectral analysis of the coefficient matrix at each time step, both in the case of constant and variable diffusion coefficients. Such a spectral analysis has a very crucial role, since it can be used for designing fast and robust iterative solvers. In particular, we employ the obtained spectral information to define a Galerkin multigrid method based on the classical linear interpolation as grid transfer operator and damped-Jacobi as smoother, and to prove the linear convergence rate of the corresponding two-grid method. The theoretical analysis suggests that the proposed grid transfer operator is strong enough for working also with the V-cycle method and the geometric multigrid. On this basis, we introduce two computationally favourable variants of the proposed multigrid method and we use them as preconditioners for Krylov methods. Several numerical results confirm that the resulting preconditioning strategies still keep a linear convergence rate.

  18. Spectral methods for the detection of network community structure: a comparative analysis

    International Nuclear Information System (INIS)

    Shen, Hua-Wei; Cheng, Xue-Qi

    2010-01-01

    Spectral analysis has been successfully applied to the detection of community structure of networks, respectively being based on the adjacency matrix, the standard Laplacian matrix, the normalized Laplacian matrix, the modularity matrix, the correlation matrix and several other variants of these matrices. However, the comparison between these spectral methods is less reported. More importantly, it is still unclear which matrix is more appropriate for the detection of community structure. This paper answers the question by evaluating the effectiveness of these five matrices against benchmark networks with heterogeneous distributions of node degree and community size. Test results demonstrate that the normalized Laplacian matrix and the correlation matrix significantly outperform the other three matrices at identifying the community structure of networks. This indicates that it is crucial to take into account the heterogeneous distribution of node degree when using spectral analysis for the detection of community structure. In addition, to our surprise, the modularity matrix exhibits very similar performance to the adjacency matrix, which indicates that the modularity matrix does not gain benefits from using the configuration model as a reference network with the consideration of the node degree heterogeneity
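
    A hedged sketch of one of the compared approaches, clustering with the symmetric normalized Laplacian on a small synthetic two-community network; the benchmark networks and the other four matrices studied in the paper are not reproduced.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(8)
      n, k = 60, 2
      labels_true = np.repeat([0, 1], n // 2)
      P = np.where(labels_true[:, None] == labels_true[None, :], 0.3, 0.05)  # planted partition
      A = (rng.random((n, n)) < P).astype(float)
      A = np.triu(A, 1)
      A = A + A.T

      deg = A.sum(axis=1)
      D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
      L_sym = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt       # symmetric normalized Laplacian
      _, eigvecs = np.linalg.eigh(L_sym)
      embedding = eigvecs[:, :k]                            # eigenvectors of the smallest eigenvalues
      embedding /= np.linalg.norm(embedding, axis=1, keepdims=True)

      labels_pred = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(embedding)
      agreement = max(np.mean(labels_pred == labels_true), np.mean(labels_pred != labels_true))
      print("agreement with planted communities:", agreement)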

  19. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    Science.gov (United States)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
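
    A hedged sketch of the general construction (a residual bootstrap around a nonparametric fit), using a Nadaraya-Watson kernel smoother on synthetic data; the paper's exact bootstrap scheme and regression model may differ. An observed output falling outside its interval would be flagged as anomalous.

      import numpy as np

      rng = np.random.default_rng(9)
      n, h, B = 200, 0.15, 500                           # sample size, bandwidth, bootstrap draws
      x = rng.uniform(0, 1, n)
      y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=n)

      def nw_fit(x_train, y_train, x_eval, bandwidth):   # Nadaraya-Watson kernel regression
          w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
          return (w * y_train).sum(axis=1) / w.sum(axis=1)

      x_grid = np.linspace(0.05, 0.95, 10)
      fitted = nw_fit(x, y, x, h)
      residuals = y - fitted
      fit_grid = nw_fit(x, y, x_grid, h)

      boot_preds = np.empty((B, x_grid.size))
      for b in range(B):                                 # residual bootstrap: refit, add new noise
          y_star = fitted + rng.choice(residuals, size=n, replace=True)
          boot_preds[b] = nw_fit(x, y_star, x_grid, h) + rng.choice(residuals, size=x_grid.size)

      lower, upper = np.percentile(boot_preds, [2.5, 97.5], axis=0)
      print(np.column_stack([x_grid, lower, fit_grid, upper]).round(3))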

  20. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2017-06-01

    We propose a Bayesian nonparametric mixture model, based on Markov Chain Monte Carlo methods, for the reconstruction and prediction of discretized stochastic dynamical systems from observed time series data. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of an arbitrary degree and when a Geometric Stick Breaking mixture process prior over the space of densities is applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, and it is flexible and general. Simulations based on synthetic time series are presented.

  1. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems.

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J

    2017-06-01

    We propose a Bayesian nonparametric mixture model, based on Markov Chain Monte Carlo methods, for the reconstruction and prediction of discretized stochastic dynamical systems from observed time series data. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of an arbitrary degree and when a Geometric Stick Breaking mixture process prior over the space of densities is applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, and it is flexible and general. Simulations based on synthetic time series are presented.

  2. Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks

    Science.gov (United States)

    Gray-Davies, Tristan; Holmes, Chris C.; Caron, François

    2018-01-01

    We present a novel Bayesian nonparametric regression model for covariates X and continuous response variable Y ∈ ℝ. The model is parametrized in terms of marginal distributions for Y and X and a regression function which tunes the stochastic ordering of the conditional distributions F (y|x). By adopting an approximate composite likelihood approach, we show that the resulting posterior inference can be decoupled for the separate components of the model. This procedure can scale to very large datasets and allows for the use of standard, existing, software from Bayesian nonparametric density estimation and Plackett-Luce ranking estimation to be applied. As an illustration, we show an application of our approach to a US Census dataset, with over 1,300,000 data points and more than 100 covariates. PMID:29623150

  3. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    Science.gov (United States)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and, if they are rejected, to suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.

  4. Nonparametric statistics a step-by-step approach

    CERN Document Server

    Corder, Gregory W

    2014-01-01

    "…a very useful resource for courses in nonparametric statistics in which the emphasis is on applications rather than on theory.  It also deserves a place in libraries of all institutions where introductory statistics courses are taught."" -CHOICE This Second Edition presents a practical and understandable approach that enhances and expands the statistical toolset for readers. This book includes: New coverage of the sign test and the Kolmogorov-Smirnov two-sample test in an effort to offer a logical and natural progression to statistical powerSPSS® (Version 21) software and updated screen ca

  5. A structural nonparametric reappraisal of the CO2 emissions-income relationship

    NARCIS (Netherlands)

    Azomahou, T.T.; Goedhuys - Degelin, Micheline; Nguyen-Van, P.

    Relying on a structural nonparametric estimation, we show that CO2 emissions clearly increase with income at low income levels. For higher income levels, we observe a decreasing relationship, though not a significant one. We also find that CO2 emissions monotonically increase with energy use at a

  6. Nonparametric estimation of the stationary M/G/1 workload distribution function

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted

    2005-01-01

    In this paper it is demonstrated how a nonparametric estimator of the stationary workload distribution function of the M/G/1-queue can be obtained by systematic sampling the workload process. Weak convergence results and bootstrap methods for empirical distribution functions for stationary associ...

  7. Transformation-invariant and nonparametric monotone smooth estimation of ROC curves.

    Science.gov (United States)

    Du, Pang; Tang, Liansheng

    2009-01-30

    When a new diagnostic test is developed, it is of interest to evaluate its accuracy in distinguishing diseased subjects from non-diseased subjects. The accuracy of the test is often evaluated by receiver operating characteristic (ROC) curves. Smooth ROC estimates are often preferable for continuous test results when the underlying ROC curves are in fact continuous. Nonparametric and parametric methods have been proposed by various authors to obtain smooth ROC curve estimates. However, there are certain drawbacks with the existing methods. Parametric methods need specific model assumptions. Nonparametric methods do not always satisfy the inherent properties of the ROC curves, such as monotonicity and transformation invariance. In this paper we propose a monotone spline approach to obtain smooth monotone ROC curves. Our method ensures important inherent properties of the underlying ROC curves, which include monotonicity, transformation invariance, and boundary constraints. We compare the finite sample performance of the newly proposed ROC method with other ROC smoothing methods in large-scale simulation studies. We illustrate our method through a real life example. Copyright (c) 2008 John Wiley & Sons, Ltd.

  8. Impulse response identification with deterministic inputs using non-parametric methods

    International Nuclear Information System (INIS)

    Bhargava, U.K.; Kashyap, R.L.; Goodman, D.M.

    1985-01-01

    This paper addresses the problem of impulse response identification using non-parametric methods. Although the techniques developed herein apply to the truncated, untruncated, and the circulant models, we focus on the truncated model which is useful in certain applications. Two methods of impulse response identification will be presented. The first is based on the minimization of the C_L statistic, which is an estimate of the mean-square prediction error; the second is a Bayesian approach. For both of these methods, we consider the effects of using both the identity matrix and the Laplacian matrix as weights on the energy in the impulse response. In addition, we present a method for estimating the effective length of the impulse response. Estimating the length is particularly important in the truncated case. Finally, we develop a method for estimating the noise variance at the output. Often, prior information on the noise variance is not available, and a good estimate is crucial to the success of estimating the impulse response with a nonparametric technique.

  9. Assessment of modern spectral analysis methods to improve wavenumber resolution of F-K spectra

    International Nuclear Information System (INIS)

    Shirley, T.E.; Laster, S.J.; Meek, R.A.

    1987-01-01

    The improvement in wavenumber spectra obtained by using high resolution spectral estimators is examined. Three modern spectral estimators were tested, namely the Autoregressive/Maximum Entropy (AR/ME) method, the Extended Prony method, and an eigenstructure method. They were combined with the conventional Fourier method by first transforming each trace with a Fast Fourier Transform (FFT). A high resolution spectral estimator was applied to the resulting complex spatial sequence for each frequency. The collection of wavenumber spectra thus computed comprises a hybrid f-k spectrum with high wavenumber resolution and less spectral ringing. Synthetic and real data records containing 25 traces were analyzed by using the hybrid f-k method. The results show an FFT-AR/ME f-k spectrum has noticeably better wavenumber resolution and more spectral dynamic range than conventional spectra when the number of channels is small. The observed improvement suggests the hybrid technique is potentially valuable in seismic data analysis

  10. On the spectral analysis of iterative solutions of the discretized one-group transport equation

    International Nuclear Information System (INIS)

    Sanchez, Richard

    2004-01-01

    We analyze the Fourier-mode technique used for the spectral analysis of iterative solutions of the one-group discretized transport equation. We introduce a direct spectral analysis for the iterative solution of finite difference approximations for finite slabs composed of identical layers, providing thus a complementary analysis that is more appropriate for reactor applications. Numerical calculations for the method of characteristics and with the diamond difference approximation show the appearance of antisymmetric modes generated by the iteration on boundary data. We have also utilized the discrete Fourier transform to compute the spectrum for a periodic slab containing N identical layers and shown that at the limit N → ∞ one obtains the familiar Fourier-mode solution

  11. Spectral analysis of bacanora (agave-derived liquor) by using FT-Raman spectroscopy

    Science.gov (United States)

    Ortega Clavero, Valentin; Weber, Andreas; Schröder, Werner; Curticapean, Dan

    2016-04-01

    The industry of the agave-derived bacanora, in the northern Mexican state of Sonora, has been growing substantially in recent years. However, this higher demand still lies under the influence of a variety of social, legal, cultural, ecological and economic elements. The governmental institutions of the state have tried to encourage sustainable development and certain levels of standardization in the production of bacanora by applying different economic and legal strategies. However, a large portion of this alcoholic beverage is still produced in a traditional and rudimentary fashion. Beyond the quality of the beverage, the lack of proper control using adequate instrumental methods might represent a health risk, as in several cases traditionally distilled beverages can contain elevated levels of harmful materials. The present article describes the qualitative spectral analysis of samples of the traditionally produced distilled beverage bacanora in the range from 0 cm-1 to 3500 cm-1 by using a Fourier Transform Raman spectrometer. This particular technique has not been previously explored for the analysis of bacanora, unlike other beverages such as tequila. The proposed instrumental arrangement for the spectral analysis has been built by combining conventional hardware parts (Michelson interferometer, photo-diodes, visible laser, etc.) and a set of self-developed evaluation algorithms. The resulting spectral information has been compared to that of pure samples of ethanol and to the spectra of different samples of the alcoholic beverage tequila. The proposed instrumental arrangement can be used for the analysis of bacanora.

  12. Modal spectral analysis of piping: Determination of the significant frequency range

    International Nuclear Information System (INIS)

    Geraets, L.H.

    1981-01-01

    This paper investigates the influence of the number of modes on the response of a piping system in a dynamic modal spectral analysis. It shows how the analysis can be limited to a specific frequency range of the pipe (independent of the frequency range of the response spectrum), allowing cost reduction without loss in accuracy. The 'missing mass' is taken into account through an original technique. (orig./HP)

  13. Nonparametric Estimation of Interval Reliability for Discrete-Time Semi-Markov Systems

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Limnios, Nikolaos

    2016-01-01

    In this article, we consider a repairable discrete-time semi-Markov system with finite state space. The measure of the interval reliability is given as the probability of the system being operational over a given finite-length time interval. A nonparametric estimator is proposed for the interval...

  14. Assessing pupil and school performance by non-parametric and parametric techniques

    NARCIS (Netherlands)

    de Witte, K.; Thanassoulis, E.; Simpson, G.; Battisti, G.; Charlesworth-May, A.

    2010-01-01

    This paper discusses the use of the non-parametric free disposal hull (FDH) and the parametric multi-level model (MLM) as alternative methods for measuring pupil and school attainment where hierarchical structured data are available. Using robust FDH estimates, we show how to decompose the overall

  15. Supremum Norm Posterior Contraction and Credible Sets for Nonparametric Multivariate Regression

    NARCIS (Netherlands)

    Yoo, W.W.; Ghosal, S

    2016-01-01

    In the setting of nonparametric multivariate regression with unknown error variance, we study asymptotic properties of a Bayesian method for estimating a regression function f and its mixed partial derivatives. We use a random series of tensor product of B-splines with normal basis coefficients as a

  16. A non-parametric hierarchical model to discover behavior dynamics from tracks

    NARCIS (Netherlands)

    Kooij, J.F.P.; Englebienne, G.; Gavrila, D.M.

    2012-01-01

    We present a novel non-parametric Bayesian model to jointly discover the dynamics of low-level actions and high-level behaviors of tracked people in open environments. Our model represents behaviors as Markov chains of actions which capture high-level temporal dynamics. Actions may be shared by

  17. On the use of permutation in and the performance of a class of nonparametric methods to detect differential gene expression.

    Science.gov (United States)

    Pan, Wei

    2003-07-22

    Recently a class of nonparametric statistical methods, including the empirical Bayes (EB) method, the significance analysis of microarray (SAM) method and the mixture model method (MMM), have been proposed to detect differential gene expression for replicated microarray experiments conducted under two conditions. All the methods depend on constructing a test statistic Z and a so-called null statistic z. The null statistic z is used to provide some reference distribution for Z such that statistical inference can be accomplished. A common way of constructing z is to apply Z to randomly permuted data. Here we point out that the distribution of z may not approximate the null distribution of Z well, leading to possibly too conservative inference. This observation may apply to other permutation-based nonparametric methods. We propose a new method of constructing a null statistic that aims to estimate the null distribution of a test statistic directly. Using simulated data and real data, we assess and compare the performance of the existing method and our new method when applied in EB, SAM and MMM. Some interesting findings on operating characteristics of EB, SAM and MMM are also reported. Finally, by combining the idea of SAM and MMM, we outline a simple nonparametric method based on the direct use of a test statistic and a null statistic.
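
    The basic Z-versus-z construction under discussion can be sketched on synthetic data: a per-gene two-sample statistic is compared against null statistics obtained by permuting the condition labels. This illustrates the permutation reference distribution only; the specific EB, SAM and MMM statistics and the paper's alternative null construction are not reproduced.

      import numpy as np

      rng = np.random.default_rng(10)
      genes, n_per_group = 1000, 6
      expr = rng.normal(size=(genes, 2 * n_per_group))           # synthetic expression matrix
      expr[:50, n_per_group:] += 1.5                             # 50 truly differential genes
      labels = np.array([0] * n_per_group + [1] * n_per_group)

      def t_like(data, lab):                                     # simple per-gene test statistic
          a, b = data[:, lab == 0], data[:, lab == 1]
          se = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1] + b.var(axis=1, ddof=1) / b.shape[1])
          return (b.mean(axis=1) - a.mean(axis=1)) / (se + 1e-8)

      Z = t_like(expr, labels)                                   # observed statistics
      null_z = np.concatenate([t_like(expr, rng.permutation(labels)) for _ in range(20)])

      threshold = np.quantile(np.abs(null_z), 0.99)              # cut-off from the null statistics
      print("genes called at the 99th null percentile:", int(np.sum(np.abs(Z) > threshold)))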

  18. On Wasserstein Two-Sample Testing and Related Families of Nonparametric Tests

    Directory of Open Access Journals (Sweden)

    Aaditya Ramdas

    2017-01-01

    Full Text Available Nonparametric two-sample or homogeneity testing is a decision theoretic problem that involves identifying differences between two random variables without making parametric assumptions about their underlying distributions. The literature is old and rich, with a wide variety of statistics having been designed and analyzed, both for the unidimensional and the multivariate setting. In this short survey, we focus on test statistics that involve the Wasserstein distance. Using an entropic smoothing of the Wasserstein distance, we connect these to very different tests including multivariate methods involving energy statistics and kernel based maximum mean discrepancy and univariate methods like the Kolmogorov–Smirnov test, probability or quantile (PP/QQ) plots and receiver operating characteristic or ordinal dominance (ROC/ODC) curves. Some observations are implicit in the literature, while others seem to have not been noticed thus far. Given nonparametric two-sample testing’s classical and continued importance, we aim to provide useful connections for theorists and practitioners familiar with one subset of methods but not others.
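
    One simple member of this family can be sketched directly: use the one-dimensional Wasserstein distance between the two empirical distributions as the test statistic and calibrate it by permuting the pooled sample. The data are synthetic, and this is only an illustration of the statistic, not of the entropic-smoothing connections surveyed in the paper.

      import numpy as np
      from scipy.stats import wasserstein_distance

      rng = np.random.default_rng(11)
      x = rng.normal(0.0, 1.0, 80)
      y = rng.normal(0.4, 1.0, 80)

      observed = wasserstein_distance(x, y)
      pooled = np.concatenate([x, y])
      perm_stats = []
      for _ in range(2000):                      # permutation null distribution of the statistic
          rng.shuffle(pooled)
          perm_stats.append(wasserstein_distance(pooled[: x.size], pooled[x.size:]))
      p_value = (1 + np.sum(np.array(perm_stats) >= observed)) / (1 + len(perm_stats))
      print(f"Wasserstein statistic: {observed:.3f}, permutation p-value: {p_value:.4f}")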

  19. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in the cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries, which have higher health care expenditure per capita, tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
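
    A hedged sketch of the DEA building block only: an input-oriented, constant-returns (CCR) efficiency score for each decision-making unit, solved as a linear program. The hospital-like inputs and outputs are synthetic, and neither the paper's two-step DEA nor its SFA specification is reproduced.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(12)
      n_units = 8
      inputs = rng.uniform(1.0, 5.0, size=(2, n_units))        # e.g., beds and staff per country
      outputs = (0.8 * inputs.sum(axis=0, keepdims=True)
                 * rng.uniform(0.6, 1.0, size=(1, n_units)))   # e.g., discharges

      def ccr_efficiency(o):
          m, s = inputs.shape[0], outputs.shape[0]
          c = np.concatenate([[1.0], np.zeros(n_units)])       # minimize theta over (theta, lambda)
          A_in = np.hstack([-inputs[:, [o]], inputs])          # X @ lam <= theta * x_o
          A_out = np.hstack([np.zeros((s, 1)), -outputs])      # Y @ lam >= y_o
          A_ub = np.vstack([A_in, A_out])
          b_ub = np.concatenate([np.zeros(m), -outputs[:, o]])
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n_units), method="highs")
          return res.x[0]

      scores = [ccr_efficiency(o) for o in range(n_units)]
      print("CCR input-oriented efficiencies:", np.round(scores, 3))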

  20. [Analysis of software for identifying spectral line of laser-induced breakdown spectroscopy based on LabVIEW].

    Science.gov (United States)

    Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang

    2012-03-01

    A self-designed spectral-line identification program for LIBS is introduced. Built in LabVIEW, the software can smooth spectral lines and pick peaks, employing second-difference and threshold methods. The characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral-line identification and qualitative analysis of the basic composition of the sample. The software analyzes spectra conveniently and rapidly and will be a useful tool for LIBS.

  1. Two Step Procedure Using a 1-D Slab Spectral Geometry in a Pebble Bed Reactor Core Analysis

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Kim, Kang Seog; Noh, Jae Man; Joo, Hyung Kook

    2005-01-01

    A strong spectral interaction between the core and the reflector has been one of the main concerns in the analysis of pebble bed reactor cores. To resolve this problem, VSOP adopted iteration between the spectrum calculation in a spectral zone and the global core calculation. In VSOP, the whole problem domain is divided into many spectral zones in which the fine group spectrum is calculated using bucklings for fast groups and albedos for thermal groups from the global core calculation. The resulting spectrum in each spectral zone is used to generate broad group cross sections of the spectral zone for the global core calculation. In this paper, we demonstrate a two-step procedure for a pebble bed reactor core analysis. In the first step, we generate equivalent cross sections from a 1-D slab spectral geometry model with the help of the equivalence theory. The equivalent cross sections generated in this way include the effect of the spectral interaction between the core and the reflector. In the second step, we perform a diffusion calculation using the equivalent cross sections generated in the first step. A simple benchmark problem derived from the PBMR-400 reactor was introduced to verify this approach. We compared the two-step solutions with the Monte Carlo (MC) solutions for the problem.

  2. Estimation of Stochastic Volatility Models by Nonparametric Filtering

    DEFF Research Database (Denmark)

    Kanaya, Shin; Kristensen, Dennis

    2016-01-01

    /estimated volatility process replacing the latent process. Our estimation strategy is applicable to both parametric and nonparametric stochastic volatility models, and can handle both jumps and market microstructure noise. The resulting estimators of the stochastic volatility model will carry additional biases...... and variances due to the first-step estimation, but under regularity conditions we show that these vanish asymptotically and our estimators inherit the asymptotic properties of the infeasible estimators based on observations of the volatility process. A simulation study examines the finite-sample properties...

  3. Preliminary Geologic/spectral Analysis of LANDSAT-4 Thematic Mapper Data, Wind River/bighorn Basin Area, Wyoming

    Science.gov (United States)

    Lang, H. R.; Conel, J. E.; Paylor, E. D.

    1984-01-01

    A LIDQA evaluation for geologic applications of a LANDSAT TM scene covering the Wind River/Bighorn Basin area, Wyoming, is examined. This involves a quantitative assessment of data quality including spatial and spectral characteristics. Analysis is concentrated on the 6 visible, near infrared, and short wavelength infrared bands. Preliminary analysis demonstrates that: (1) principal component images derived from the correlation matrix provide the most useful geologic information. To extract surface spectral reflectance, the TM radiance data must be calibrated. Scatterplots demonstrate that TM data can be calibrated and sensor response is essentially linear. Low instrumental offset and gain settings result in spectral data that do not utilize the full dynamic range of the TM system.

  4. Hadron Energy Reconstruction for ATLAS Barrel Combined Calorimeter Using Non-Parametrical Method

    CERN Document Server

    Kulchitskii, Yu A

    2000-01-01

    Hadron energy reconstruction for the ATLAS barrel prototype combined calorimeter in the framework of the non-parametrical method is discussed. The non-parametrical method utilizes only the known e/h ratios and the electron calibration constants and does not require the determination of any parameters by a minimization technique. Thus, this technique lends itself to fast energy reconstruction in a first level trigger. The reconstructed mean values of the hadron energies are within ±1% of the true values and the fractional energy resolution is [(58 ± 3)%·√GeV/√E + (2.5 ± 0.3)%] ⊕ (1.7 ± 0.2) GeV/E. The value of the e/h ratio obtained for the electromagnetic compartment of the combined calorimeter is 1.74 ± 0.04. Results of a study of the longitudinal hadronic shower development are also presented.

  5. High-speed vibrational imaging and spectral analysis of lipid bodies by compound Raman microscopy.

    Science.gov (United States)

    Slipchenko, Mikhail N; Le, Thuc T; Chen, Hongtao; Cheng, Ji-Xin

    2009-05-28

    Cells store excess energy in the form of cytoplasmic lipid droplets. At present, it is unclear how different types of fatty acids contribute to the formation of lipid droplets. We describe a compound Raman microscope capable of both high-speed chemical imaging and quantitative spectral analysis on the same platform. We used a picosecond laser source to perform coherent Raman scattering imaging of a biological sample and confocal Raman spectral analysis at points of interest. The potential of the compound Raman microscope was evaluated on lipid bodies of cultured cells and live animals. Our data indicate that the in vivo fat contains much more unsaturated fatty acids (FAs) than the fat formed via de novo synthesis in 3T3-L1 cells. Furthermore, in vivo analysis of subcutaneous adipocytes and glands revealed a dramatic difference not only in the unsaturation level but also in the thermodynamic state of FAs inside their lipid bodies. Additionally, the compound Raman microscope allows tracking of the cellular uptake of a specific fatty acid and its abundance in nascent cytoplasmic lipid droplets. The high-speed vibrational imaging and spectral analysis capability renders compound Raman microscopy an indispensible analytical tool for the study of lipid-droplet biology.

  6. Low default credit scoring using two-class non-parametric kernel density estimation

    CSIR Research Space (South Africa)

    Rademeyer, E

    2016-12-01

    Full Text Available This paper investigates the performance of two-class classification credit scoring data sets with low default ratios. The standard two-class parametric Gaussian and non-parametric Parzen classifiers are extended, using Bayes’ rule, to include either...

  7. Application of spectral analysis for differentiation between metals using signals from eddy-current transducers

    OpenAIRE

    Abramovych, Anton; Poddubny, Volodymyr

    2017-01-01

    The authors theoretically and experimentally substantiate the use of the spectral method for processing the signal of an eddy-current metal detector for dichotomous differentiation between metals. Results of experimental research demonstrating the possibility of using spectral analysis for differentiation between metals are presented. The eddy-current method for the detection of hidden metal objects is analyzed. It is indicated that the amplitude of the output VCD signal is determined by electric con...

  8. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    International Nuclear Information System (INIS)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David; Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Buchhave, Lars A.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  9. TRANSIT TIMING OBSERVATIONS FROM KEPLER. II. CONFIRMATION OF TWO MULTIPLANET SYSTEMS VIA A NON-PARAMETRIC CORRELATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; Moorhead, Althea V.; Morehead, Robert C. [Astronomy Department, University of Florida, 211 Bryant Space Sciences Center, Gainesville, FL 32611 (United States); Fabrycky, Daniel C. [UCO/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States); Steffen, Jason H. [Fermilab Center for Particle Astrophysics, P.O. Box 500, MS 127, Batavia, IL 60510 (United States); Carter, Joshua A.; Fressin, Francois; Holman, Matthew J.; Ragozzine, Darin; Charbonneau, David [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Lissauer, Jack J.; Rowe, Jason F.; Borucki, William J.; Bryson, Stephen T.; Burke, Christopher J.; Caldwell, Douglas A. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Welsh, William F. [Astronomy Department, San Diego State University, San Diego, CA 92182-1221 (United States); Allen, Christopher [Orbital Sciences Corporation/NASA Ames Research Center, Moffett Field, CA 94035 (United States); Batalha, Natalie M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192 (United States); Buchhave, Lars A., E-mail: eford@astro.ufl.edu [Niels Bohr Institute, Copenhagen University, DK-2100 Copenhagen (Denmark); Collaboration: Kepler Science Team; and others

    2012-05-10

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies is in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the TTVs of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple-planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  10. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.

  11. Spectral Analysis Of Business Cycles In The Visegrad Group Countries

    Directory of Open Access Journals (Sweden)

    Kijek Arkadiusz

    2017-06-01

    Full Text Available This paper examines the business cycle properties of Visegrad group countries. The main objective is to identify business cycles in these countries and to study the relationships between them. The author applies a modification of the Fourier analysis to estimate cycle amplitudes and frequencies. This allows for a more precise estimation of cycle characteristics than the traditional approach. The cross-spectral analysis of GDP cyclical components for the Czech Republic, Hungary, Poland and Slovakia makes it possible to assess the degree of business cycle synchronization between the countries.

  12. Overlapping communities detection based on spectral analysis of line graphs

    Science.gov (United States)

    Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan

    2018-05-01

    Communities in networks are often overlapping, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, such that communities are recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization along with detecting the overlap of the community structure. SAoLG applies spectral analysis to line graphs to unify the overlap and hierarchical structure of the communities. In order to avoid the limitations of absolute distances such as the Euclidean distance, SAoLG employs the angular distance to compute the similarity between vertices. Furthermore, a slightly modified partition density is used to evaluate the quality of the community structure and to obtain more reasonable and sensible community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. The experimental results on one standard network and six real-world networks show that the SAoLG algorithm achieves higher modularity and more reasonable community numbers than Ahn's algorithm, the classical CPM and GN.
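
    The edge-community idea underlying this record can be illustrated with off-the-shelf tools: cluster the line graph of a network spectrally and map each edge community back onto its endpoint vertices, which is what produces the overlap. The sketch below (NetworkX and scikit-learn on a toy graph) is only a schematic of that idea; it does not implement SAoLG's angular distance, hierarchy, or modified partition density, and the number of communities k is chosen by hand.

```python
# Schematic only: overlapping communities from spectral clustering of a line
# graph; SAoLG's angular distance, hierarchy and partition density are omitted.
import networkx as nx
from sklearn.cluster import SpectralClustering

G = nx.karate_club_graph()                    # toy example graph
L = nx.line_graph(G)                          # nodes of L are the edges of G
edges = list(L.nodes())
A = nx.to_numpy_array(L, nodelist=edges)      # adjacency of the line graph

k = 4                                         # number of edge communities (hand-picked)
labels = SpectralClustering(n_clusters=k, affinity="precomputed",
                            assign_labels="discretize",
                            random_state=0).fit_predict(A)

# A vertex belongs to every community that one of its incident edges belongs
# to, which is exactly where the overlap comes from.
communities = {c: set() for c in range(k)}
for (u, v), c in zip(edges, labels):
    communities[c].update((u, v))

for c, members in sorted(communities.items()):
    print(f"community {c}: {sorted(members)}")
```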

  13. An Excel‐based implementation of the spectral method of action potential alternans analysis

    Science.gov (United States)

    Pearman, Charles M.

    2014-01-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
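
    The spectral method summarized above reduces, in its simplest form, to stacking beat-aligned action potentials, taking an FFT along the beat axis, and reading the power at 0.5 cycles/beat. The NumPy sketch below follows that recipe on synthetic beats; the noise-band k-score shown is a common convention from the T-wave alternans literature and is not necessarily the exact statistic implemented in the Excel tool.

```python
import numpy as np

def alternans_spectrum(beats):
    """beats: (n_beats, n_samples) matrix of beat-aligned action potentials.
    Returns frequencies in cycles/beat and the power spectrum averaged over
    the AP time axis."""
    n_beats = beats.shape[0]
    fluct = beats - beats.mean(axis=0)              # remove the mean AP shape
    spec = np.abs(np.fft.rfft(fluct, axis=0)) ** 2 / n_beats
    freqs = np.fft.rfftfreq(n_beats, d=1.0)         # beat-rate frequencies
    return freqs, spec.mean(axis=1)

# Synthetic example: 64 beats with a small 2:1 alternating component plus noise.
rng = np.random.default_rng(0)
n_beats, n_samples = 64, 200
base = np.sin(np.linspace(0, np.pi, n_samples))
beats = base + 0.05 * ((-1.0) ** np.arange(n_beats))[:, None] * base
beats += 0.02 * rng.standard_normal((n_beats, n_samples))

freqs, power = alternans_spectrum(beats)
alt_power = power[np.isclose(freqs, 0.5)][0]        # 0.5 cycles/beat = alternans
noise = power[(freqs > 0.33) & (freqs < 0.48)]      # reference noise band
k_score = (alt_power - noise.mean()) / noise.std()
print(f"alternans power {alt_power:.4g}, k-score {k_score:.1f}")
```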

  14. HYPERSPECTRAL HYPERION IMAGERY ANALYSIS AND ITS APPLICATION USING SPECTRAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    W. Pervez

    2015-03-01

    Full Text Available Rapid advancement in remote sensing opens new avenues to explore hyperspectral Hyperion imagery pre-processing techniques, analysis and application for land use mapping. The hyperspectral data consist of 242 bands, of which 196 calibrated/useful bands are available for hyperspectral applications. Atmospheric correction applied to the hyperspectral calibrated bands makes the data more useful for further processing and application. Principal component (PC) analysis applied to the hyperspectral calibrated bands reduced the dimensionality of the data, and it was found that 99% of the information is held in the first 10 PCs. Feature extraction is another important application, carried out using vegetation delineation and the normalized difference vegetation index. Machine learning classifiers use these features to identify pixels with significant differences in their spectral signatures, which is very useful for classification of an image. A supervised machine learning classifier was used for classification of the hyperspectral image, which resulted in an overall accuracy of 86.67% and a kappa coefficient of 0.7998.
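
    The processing chain described (dimensionality reduction to about 10 principal components followed by a supervised classifier) maps directly onto standard tools. The sketch below uses scikit-learn on a purely synthetic pixel matrix and invented class labels, with a random-forest classifier chosen for illustration; it is not the workflow or classifier used in the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

# Purely synthetic stand-in: rows are pixels, columns are 196 calibrated bands,
# y holds five hypothetical land-use classes.
rng = np.random.default_rng(1)
y = rng.integers(0, 5, size=5000)
class_means = rng.standard_normal((5, 196))
X = class_means[y] + 0.5 * rng.standard_normal((5000, 196))

pca = PCA(n_components=10).fit(X)                 # keep the leading 10 PCs
X_pc = pca.transform(X)
print("variance captured by 10 PCs:", round(pca.explained_variance_ratio_.sum(), 3))

X_tr, X_te, y_tr, y_te = train_test_split(X_pc, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("overall accuracy:", round(accuracy_score(y_te, pred), 4))
print("kappa coefficient:", round(cohen_kappa_score(y_te, pred), 4))
```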

  15. A theoretical-experimental methodology for assessing the sensitivity of biomedical spectral imaging platforms, assays, and analysis methods.

    Science.gov (United States)

    Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C

    2018-01-01

    Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Performance evaluation of spectral deconvolution analysis tool (SDAT) software used for nuclear explosion radionuclide measurements

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.; Biegalski, S.R.; Haas, D.A.

    2008-01-01

    The Spectral Deconvolution Analysis Tool (SDAT) software was developed to improve counting statistics and detection limits for nuclear explosion radionuclide measurements. SDAT utilizes spectral deconvolution spectroscopy techniques and can analyze both β-γ coincidence spectra for radioxenon isotopes and high-resolution HPGe spectra from aerosol monitors. Spectral deconvolution spectroscopy is an analysis method that utilizes the entire signal deposited in a gamma-ray detector rather than the small portion of the signal that is present in one gamma-ray peak. This method shows promise to improve detection limits over classical gamma-ray spectroscopy analytical techniques; however, this hypothesis has not been tested. To address this issue, we performed three tests to compare the detection ability and variance of SDAT results to those of commercial off-the-shelf (COTS) software which utilizes a standard peak search algorithm. (author)

  17. Evaluation of Nonparametric Probabilistic Forecasts of Wind Power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg

    Predictions of wind power production for horizons up to 48-72 hours ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are not only provided with point predictions, which are estimates of the most likely outcome for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set...

  18. Estimation from PET data of transient changes in dopamine concentration induced by alcohol: support for a non-parametric signal estimation method

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, C C; Yoder, K K; Normandin, M D; Morris, E D [Department of Radiology, Indiana University School of Medicine, Indianapolis, IN (United States); Kareken, D A [Department of Neurology, Indiana University School of Medicine, Indianapolis, IN (United States); Bouman, C A [Weldon School of Biomedical Engineering, Purdue University, West Lafayette, IN (United States); O' Connor, S J [Department of Psychiatry, Indiana University School of Medicine, Indianapolis, IN (United States)], E-mail: emorris@iupui.edu

    2008-03-07

    We previously developed a model-independent technique (non-parametric ntPET) for extracting the transient changes in neurotransmitter concentration from paired (rest and activation) PET studies with a receptor ligand. To provide support for our method, we introduced three hypotheses of validation based on work by Endres and Carson (1998 J. Cereb. Blood Flow Metab. 18 1196-210) and Yoder et al (2004 J. Nucl. Med. 45 903-11), and tested them on experimental data. All three hypotheses describe relationships between the estimated free (synaptic) dopamine curves (F^DA(t)) and the change in binding potential (ΔBP). The veracity of the F^DA(t) curves recovered by nonparametric ntPET is supported when the data adhere to the following hypothesized behaviors: (1) ΔBP should decline with increasing DA peak time, (2) ΔBP should increase as the strength of the temporal correlation between F^DA(t) and the free raclopride (F^RAC(t)) curve increases, (3) ΔBP should decline linearly with the effective weighted availability of the receptor sites. We analyzed regional brain data from 8 healthy subjects who received two [11C]raclopride scans: one at rest, and one during which unanticipated IV alcohol was administered to stimulate dopamine release. For several striatal regions, nonparametric ntPET was applied to recover F^DA(t), and binding potential values were determined. Kendall rank-correlation analysis confirmed that the F^DA(t) data followed the expected trends for all three validation hypotheses. Our findings lend credence to our model-independent estimates of F^DA(t). Application of nonparametric ntPET may yield important insights into how alterations in timing of dopaminergic neurotransmission are involved in the pathologies of addiction and other psychiatric disorders.

  19. Nonparametric identification of nonlinear dynamic systems using a synchronisation-based method

    Science.gov (United States)

    Kenderi, Gábor; Fidlin, Alexander

    2014-12-01

    The present study proposes an identification method for highly nonlinear mechanical systems that does not require a priori knowledge of the underlying nonlinearities to reconstruct arbitrary restoring force surfaces between degrees of freedom. This approach is based on the master-slave synchronisation between a dynamic model of the system as the slave and the real system as the master using measurements of the latter. As the model synchronises to the measurements, it becomes an observer of the real system. The optimal observer algorithm in a least-squares sense is given by the Kalman filter. Using the well-known state augmentation technique, the Kalman filter can be turned into a dual state and parameter estimator to identify parameters of a priori characterised nonlinearities. The paper proposes an extension of this technique towards nonparametric identification. A general system model is introduced by describing the restoring forces as bilateral spring-dampers with time-variant coefficients, which are estimated as augmented states. The estimation procedure is followed by an a posteriori statistical analysis to reconstruct noise-free restoring force characteristics using the estimated states and their estimated variances. Observability is provided using only one measured mechanical quantity per degree of freedom, which makes this approach less demanding in the number of necessary measurement signals compared with truly nonparametric solutions, which typically require displacement, velocity and acceleration signals. Additionally, due to the statistical rigour of the procedure, it successfully addresses signals corrupted by significant measurement noise. In the present paper, the method is described in detail, which is followed by numerical examples of one degree of freedom (1DoF) and 2DoF mechanical systems with strong nonlinearities of vibro-impact type to demonstrate the effectiveness of the proposed technique.

  20. Spectral analysis of underwater explosions in the Dead Sea

    Science.gov (United States)

    Gitterman, Y.; Ben-Avraham, Z.; Ginzburg, A.

    1998-08-01

    The present study utilizes the Israel Seismic Network (ISN) as a spatially distributed multichannel system for the discrimination of low-magnitude events. Underwater explosions (UWEs) and 16 earthquakes in the magnitude range ML = 1.6-2.8, within distances of 10-150 km and recorded by the ISN, were selected for the analysis. The analysis is based on a smoothed (0.5 Hz window) Fourier spectrum of the whole signal (defined by the signal-to-noise criterion), without picking separate wave phases. It was found that the classical discriminant of the seismic energy ratio between the relatively low-frequency (1-6 Hz) and high-frequency (6-11 Hz) bands, averaged over an ISN subnetwork, showed an overlap between UWEs and earthquakes and cannot by itself provide reliable identification. We developed and tested a new multistation discriminant based on the low-frequency spectral modulation (LFSM) method. In our case the LFSM is associated with the bubbling effect in underwater explosions. The method demonstrates a distinct azimuth-invariant coherency of spectral shapes in the low-frequency range (1-12 Hz) of short-period seismometer systems. The coherency of the modulated spectra for different ISN stations was measured by semblance statistics commonly used in seismic prospecting for phase correlation in the time domain. The modified statistics provided an almost complete separation between earthquakes and underwater explosions.
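
    The classical discriminant mentioned above, the ratio of seismic energy in the 1-6 Hz band to that in the 6-11 Hz band, is simple to compute for a single trace. The sketch below applies SciPy's Welch estimator to a hypothetical waveform sampled at 50 Hz; it illustrates only this band-ratio discriminant, not the LFSM/semblance statistic developed in the paper.

```python
import numpy as np
from scipy.signal import welch

def band_energy_ratio(trace, fs, low=(1.0, 6.0), high=(6.0, 11.0)):
    """Ratio of PSD energy in the low band to that in the high band."""
    f, pxx = welch(trace, fs=fs, nperseg=min(len(trace), 1024))
    e_low = pxx[(f >= low[0]) & (f < low[1])].sum()
    e_high = pxx[(f >= high[0]) & (f < high[1])].sum()
    return e_low / e_high

# Hypothetical 60 s trace sampled at 50 Hz with most energy near 3 Hz,
# i.e. a low-frequency-dominated, "explosion-like" signal.
fs = 50.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
trace = np.sin(2 * np.pi * 3.0 * t) + 0.3 * rng.standard_normal(t.size)
print("low/high band energy ratio:", round(band_energy_ratio(trace, fs), 2))
```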

  1. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.

    Science.gov (United States)

    García, Constantino A; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working directly in a function-space view and thus the inference takes place directly in this space. To cope with the computational complexity that requires the use of Gaussian processes, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economy and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.
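
    A much simpler relative of the approach described above is the classical kernel (Nadaraya-Watson) estimate of the conditional moments of the increments, which gives the drift and squared diffusion directly. The sketch below applies it to a simulated Ornstein-Uhlenbeck process; it is meant only to illustrate what is being estimated, not the sparse Gaussian-process method of the paper.

```python
import numpy as np

def kernel_drift_diffusion(x, dt, grid, bandwidth):
    """Nadaraya-Watson estimates of the drift a(x) and squared diffusion b^2(x)
    from a densely observed scalar time series x with sampling step dt."""
    dx, xc = np.diff(x), x[:-1]
    drift, diff2 = np.empty_like(grid), np.empty_like(grid)
    for i, g in enumerate(grid):
        w = np.exp(-0.5 * ((xc - g) / bandwidth) ** 2)
        w /= w.sum()
        drift[i] = np.sum(w * dx) / dt           # E[dX | X = g] / dt
        diff2[i] = np.sum(w * dx ** 2) / dt      # E[dX^2 | X = g] / dt
    return drift, diff2

# Simulate an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW.
rng = np.random.default_rng(3)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 100_000
x = np.zeros(n)
for k in range(n - 1):
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

grid = np.linspace(-0.8, 0.8, 17)
drift, diff2 = kernel_drift_diffusion(x, dt, grid, bandwidth=0.1)
i0, i5 = np.argmin(np.abs(grid)), np.argmin(np.abs(grid - 0.5))
print("drift at x=0.5:", round(drift[i5], 3), "(true", -theta * 0.5, ")")
print("sigma^2 at x=0:", round(diff2[i0], 3), "(true", sigma ** 2, ")")
```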

  2. Spectral stratigraphy

    Science.gov (United States)

    Lang, Harold R.

    1991-01-01

    A new approach to stratigraphic analysis is described which uses photogeologic and spectral interpretation of multispectral remote sensing data combined with topographic information to determine the attitude, thickness, and lithology of strata exposed at the surface. The new stratigraphic procedure is illustrated by examples in the literature. The published results demonstrate the potential of spectral stratigraphy for mapping strata, determining dip and strike, measuring and correlating stratigraphic sequences, defining lithofacies, mapping biofacies, and interpreting geological structures.

  3. Spectral analysis in thin tubes with axial heterogeneities

    KAUST Repository

    Ferreira, Rita; Mascarenhas, M. Luísa; Piatnitski, Andrey

    2015-01-01

    In this paper, we present the 3D-1D asymptotic analysis of the Dirichlet spectral problem associated with an elliptic operator with axial periodic heterogeneities. We extend to the 3D-1D case previous 3D-2D results (see [10]) and we analyze the special case where the scale of thickness is much smaller than the scale of the heterogeneities and the planar coefficient has a unique global minimum in the periodic cell. These results are of great relevance in the comprehension of the wave propagation in nanowires showing axial heterogeneities (see [17]).

  4. Comparison of Analysis and Spectral Nudging Techniques for Dynamical Downscaling with the WRF Model over China

    Directory of Open Access Journals (Sweden)

    Yuanyuan Ma

    2016-01-01

    Full Text Available To overcome the problem that the horizontal resolution of global climate models may be too low to resolve features which are important at the regional or local scales, dynamical downscaling has been extensively used. However, dynamical downscaling results generally drift away from large-scale driving fields. The nudging technique can be used to balance the performance of dynamical downscaling at large and small scales, but the performances of the two nudging techniques (analysis nudging and spectral nudging are debated. Moreover, dynamical downscaling is now performed at the convection-permitting scale to reduce the parameterization uncertainty and obtain the finer resolution. To compare the performances of the two nudging techniques in this study, three sensitivity experiments (with no nudging, analysis nudging, and spectral nudging covering a period of two months with a grid spacing of 6 km over continental China are conducted to downscale the 1-degree National Centers for Environmental Prediction (NCEP dataset with the Weather Research and Forecasting (WRF model. Compared with observations, the results show that both of the nudging experiments decrease the bias of conventional meteorological elements near the surface and at different heights during the process of dynamical downscaling. However, spectral nudging outperforms analysis nudging for predicting precipitation, and analysis nudging outperforms spectral nudging for the simulation of air humidity and wind speed.

  5. Multivariate least-squares methods applied to the quantitative spectral analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.; Vopicka, D.A.

    1985-01-01

    In an extension of earlier work, weighted multivariate least-squares methods of quantitative FT-IR analysis have been developed. A linear least-squares approximation to nonlinearities in the Beer-Lambert law is made by allowing the reference spectra to be a set of known mixtures. The incorporation of nonzero intercepts in the relation between absorbance and concentration further improves the approximation of nonlinearities while simultaneously accounting for nonzero spectral baselines. Pathlength variations are also accommodated in the analysis, and under certain conditions, unknown sample pathlengths can be determined. All spectral data are used to improve the precision and accuracy of the estimated concentrations. During the calibration phase of the analysis, pure component spectra are estimated from the standard mixture spectra. These can be compared with the measured pure component spectra to determine which vibrations experience nonlinear behavior. In the predictive phase of the analysis, the calculated spectra are used in our previous least-squares analysis to estimate sample component concentrations. These methods were applied to the analysis of the IR spectra of binary mixtures of esters. Even with severely overlapping spectral bands and nonlinearities in the Beer-Lambert law, the average relative error in the estimated concentrations was <1%
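
    A minimal, unweighted version of the calibration and prediction steps (classical least squares with a nonzero intercept, ignoring pathlength effects) fits in a few lines of NumPy. All variable names and the synthetic spectra below are illustrative, not the authors' implementation.

```python
import numpy as np

# Calibration standards: A_cal holds absorbance spectra (rows) of mixtures with
# known concentrations C_cal; all data here are synthetic.
rng = np.random.default_rng(4)
n_std, n_wl, n_comp = 12, 300, 2
pure_true = np.abs(rng.standard_normal((n_comp, n_wl)))      # "true" pure spectra
C_cal = rng.uniform(0.1, 1.0, size=(n_std, n_comp))
A_cal = C_cal @ pure_true + 0.02 + 0.001 * rng.standard_normal((n_std, n_wl))

# Calibration: estimate pure-component spectra plus a nonzero intercept
# (baseline) at every wavelength from the mixture standards.
X = np.hstack([C_cal, np.ones((n_std, 1))])
K = np.linalg.lstsq(X, A_cal, rcond=None)[0]                 # (n_comp + 1, n_wl)
K_pure, k0 = K[:-1], K[-1]

# Prediction: regress an unknown spectrum onto the estimated pure spectra.
c_true = np.array([0.4, 0.7])
a_unknown = c_true @ pure_true + 0.02
c_hat, *_ = np.linalg.lstsq(K_pure.T, a_unknown - k0, rcond=None)
print("estimated concentrations:", np.round(c_hat, 3), "true:", c_true)
```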

  6. Monitoring PSR B1509–58 with RXTE: Spectral analysis 1996–2010

    Directory of Open Access Journals (Sweden)

    E. Litzinger

    2011-01-01

    Full Text Available We present an analysis of the X-ray spectra of the young, Crab-like pulsar PSR B1509–58 (pulse period P ~ 151ms observed by RXTE over 14 years since the beginning of the mission in 1996. The uniform dataset is especially well suited for studying the stability of the spectral parameters over time as well as for determining pulse phase resolved spectral parameters with high significance. The phase averaged spectra as well as the resolved spectra can be well described by an absorbed power law.

  7. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E

    2009-01-01

    Certain types of biomedical processes such as the heart rate generator can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property generates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set which must either be pre-processed (e.g. by using trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Further we conclude that the presence of respiratory coupling can only be detected under low noise conditions and if a data-adaptive respiratory band is used
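
    For the unevenly sampled case, Lomb's method avoids interpolation altogether. The sketch below applies SciPy's Lomb-Scargle periodogram to a hypothetical series of beat times with a respiratory modulation of the inter-beat intervals, and marks the Nyquist-like limit of half the mean beat rate discussed above; all numbers are invented for illustration.

```python
import numpy as np
from scipy.signal import lombscargle

# Hypothetical beat series: mean RR ~0.4 s (150 bpm) with a respiratory
# modulation of the inter-beat intervals at roughly 0.8 Hz.
rng = np.random.default_rng(5)
n_beats = 600
nominal_t = 0.4 * np.arange(n_beats)
rr = 0.4 + 0.02 * np.sin(2 * np.pi * 0.8 * nominal_t)
rr += 0.005 * rng.standard_normal(n_beats)
beat_times = np.cumsum(rr)                      # unevenly spaced sample times
hr = 60.0 / rr                                  # instantaneous heart rate (bpm)

mean_rate = 1.0 / rr.mean()                     # beats per second
nyquist_like = mean_rate / 2.0                  # limit set by the mean beat rate
freqs_hz = np.linspace(0.05, 2.0, 400)
pgram = lombscargle(beat_times, hr - hr.mean(), 2 * np.pi * freqs_hz)

peak = freqs_hz[np.argmax(pgram)]
print(f"spectral peak at {peak:.2f} Hz (respiratory modulation set to 0.8 Hz); "
      f"effective Nyquist ~{nyquist_like:.2f} Hz")
```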

  8. Nonparametric Bayesian models for a spatial covariance.

    Science.gov (United States)

    Reich, Brian J; Fuentes, Montserrat

    2012-01-01

    A crucial step in the analysis of spatial data is to estimate the spatial correlation function that determines the relationship between a spatial process at two locations. The standard approach to selecting the appropriate correlation function is to use prior knowledge or exploratory analysis, such as a variogram analysis, to select the correct parametric correlation function. Rather than selecting a particular parametric correlation function, we treat the covariance function as an unknown function to be estimated from the data. We propose a flexible prior for the correlation function to provide robustness to the choice of correlation function. We specify the prior for the correlation function using spectral methods and the Dirichlet process prior, which is a common prior for an unknown distribution function. Our model does not require Gaussian data or spatial locations on a regular grid. The approach is demonstrated using a simulation study as well as an analysis of California air pollution data.

  9. Analyzing cost efficient production behavior under economies of scope : A nonparametric methodology

    NARCIS (Netherlands)

    Cherchye, L.J.H.; de Rock, B.; Vermeulen, F.M.P.

    2008-01-01

    In designing a production model for firms that generate multiple outputs, we take as a starting point that such multioutput production refers to economies of scope, which in turn originate from joint input use and input externalities. We provide a nonparametric characterization of cost-efficient

  10. A non-parametric Bayesian approach to decompounding from high frequency data

    NARCIS (Netherlands)

    Gugushvili, Shota; van der Meulen, F.H.; Spreij, Peter

    2016-01-01

    Given a sample from a discretely observed compound Poisson process, we consider non-parametric estimation of the density f0 of its jump sizes, as well as of its intensity λ0. We take a Bayesian approach to the problem and specify the prior on f0 as the Dirichlet location mixture of normal densities.

  11. Quantitative analysis of the dual-energy CT virtual spectral curve for focal liver lesions characterization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qi, E-mail: wq20@hotmail.com; Shi, Gaofeng, E-mail: gaofengs62@sina.com; Qi, Xiaohui, E-mail: qixiaohui1984@163.com; Fan, Xueli, E-mail: 407849960@qq.com; Wang, Lijia, E-mail: 893197597@qq.com

    2014-10-15

    Highlights: • We establish a feasible method using the virtual spectral curves (VSC) to differentiate focal liver lesions using DECT. • Our study shows the slope of the VSC can be used to differentiate between hemangioma, HCC, metastasis and cyst. • Importantly, the diagnostic specificities associated with using the slope to diagnose both hemangioma and cysts were 100%. - Abstract: Objective: To assess the usefulness of the spectral curve slope of dual-energy CT (DECT) for differentiating between hepatocellular carcinoma (HCC), hepatic metastasis, hemangioma (HH) and cysts. Methods: In total, 121 patients were imaged in the portal venous phase using dual-energy mode. Of these patients, 23 patients had HH, 28 patients had HCC, 40 patients had metastases and 30 patients had simple cysts. The spectral curves of the hepatic lesions were derived from the 40–190 keV levels of virtual monochromatic spectral imaging. The spectral curve slopes were calculated from 40 to 110 keV. The slopes were compared using the Kruskal–Wallis test. Receiver operating characteristic curves (ROC) were used to determine the optimal cut-off value of the slope of the spectral curve to differentiate between the lesions. Results: The spectral curves of the four lesion types had different baseline levels. The HH baseline level was the highest followed by HCC, metastases and cysts. The slopes of the spectral curves of HH, HCC, metastases and cysts were 3.81 ± 1.19, 1.49 ± 0.57, 1.06 ± 0.76 and 0.13 ± 0.17, respectively. These values were significantly different (P < 0.008). Based on ROC analysis, the respective diagnostic sensitivity and specificity were 87% and 100% for hemangioma (cut-off value ≥ 2.988), 82.1% and 65.9% for HCC (cut-off value 1.167–2.998), 65.9% and 59% for metastasis (cut-off value 0.133–1.167) and 44.4% and 100% for cysts (cut-off value ≤ 0.133). Conclusion: Quantitative analysis of the DECT spectral curve in the portal venous phase can be used to

  12. Hyperspectral imaging of polymer banknotes for building and analysis of spectral library

    Science.gov (United States)

    Lim, Hoong-Ta; Murukeshan, Vadakke Matham

    2017-11-01

    The use of counterfeit banknotes increases crime rates and cripples the economy. New countermeasures are required to stop counterfeiters who use advancing technologies with criminal intent. Many countries started adopting polymer banknotes to replace paper notes, as polymer notes are more durable and have better quality. The research on authenticating such banknotes is of much interest to the forensic investigators. Hyperspectral imaging can be employed to build a spectral library of polymer notes, which can then be used for classification to authenticate these notes. This is however not widely reported and has become a research interest in forensic identification. This paper focuses on the use of hyperspectral imaging on polymer notes to build spectral libraries, using a pushbroom hyperspectral imager which has been previously reported. As an initial study, a spectral library will be built from three arbitrarily chosen regions of interest of five circulated genuine polymer notes. Principal component analysis is used for dimension reduction and to convert the information in the spectral library to principal components. A 99% confidence ellipse is formed around the cluster of principal component scores of each class and then used as classification criteria. The potential of the adopted methodology is demonstrated by the classification of the imaged regions as training samples.
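
    The classification rule described, a 99% confidence ellipse around each class's principal-component scores, amounts to thresholding the squared Mahalanobis distance at the chi-square 0.99 quantile with two degrees of freedom. The sketch below demonstrates this on synthetic "spectra"; the class names, band count, and data are all hypothetical.

```python
import numpy as np
from scipy.stats import chi2
from sklearn.decomposition import PCA

# Hypothetical training spectra: two regions of interest, 50 bands each.
rng = np.random.default_rng(6)
classes = {
    "region_A": np.linspace(0.0, 1.0, 50) + 0.05 * rng.standard_normal((200, 50)),
    "region_B": np.linspace(1.0, 0.0, 50) + 0.05 * rng.standard_normal((200, 50)),
}

library = np.vstack(list(classes.values()))
pca = PCA(n_components=2).fit(library)              # scores live in a 2-D plane

# A 99% confidence ellipse is a Mahalanobis-distance threshold at the
# chi-square 0.99 quantile with 2 degrees of freedom.
threshold = chi2.ppf(0.99, df=2)
ellipses = {}
for name, spectra in classes.items():
    scores = pca.transform(spectra)
    ellipses[name] = (scores.mean(axis=0), np.linalg.inv(np.cov(scores.T)))

def classify(spectrum):
    """Return every class whose 99% ellipse contains the spectrum's PC scores."""
    s = pca.transform(spectrum.reshape(1, -1))[0]
    hits = [name for name, (mu, prec) in ellipses.items()
            if (s - mu) @ prec @ (s - mu) <= threshold]
    return hits or ["unclassified"]

print(classify(classes["region_A"][17]))            # expect ['region_A']
```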

  13. kruX: matrix-based non-parametric eQTL discovery.

    Science.gov (United States)

    Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom

    2014-01-14

    The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. KruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
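
    The key trick, computing many Kruskal-Wallis statistics at once by multiplying a rank matrix with a genotype indicator matrix, can be reproduced in a few lines of NumPy/SciPy. The sketch below handles one marker against many traits, without tie correction, and checks the first trait against scipy.stats.kruskal; it illustrates the idea behind kruX rather than reproducing its Matlab/Python/R code.

```python
import numpy as np
from scipy.stats import rankdata, chi2, kruskal

def kw_matrix(expr, genotype):
    """Kruskal-Wallis statistics for one marker against many traits.
    expr: (n_samples, n_traits) expression matrix; genotype: (n_samples,)
    integer genotype codes. No tie correction (fine for continuous traits)."""
    n = expr.shape[0]
    ranks = rankdata(expr, axis=0)                            # rank each trait
    groups = np.unique(genotype)
    G = (genotype[:, None] == groups[None, :]).astype(float)  # indicator matrix
    counts = G.sum(axis=0)                                    # samples per genotype
    rank_sums = G.T @ ranks                                   # (n_groups, n_traits)
    H = 12.0 / (n * (n + 1)) * (rank_sums ** 2 / counts[:, None]).sum(axis=0) \
        - 3.0 * (n + 1)
    return H, chi2.sf(H, df=len(groups) - 1)

# Synthetic check: 102 samples, 1000 traits, one marker with three genotype
# groups; the first trait is verified against scipy's implementation.
rng = np.random.default_rng(7)
expr = rng.standard_normal((102, 1000))
geno = rng.integers(0, 3, size=102)
H, p = kw_matrix(expr, geno)
H_scipy = kruskal(*(expr[geno == g, 0] for g in np.unique(geno))).statistic
print("matrix H:", round(H[0], 6), " scipy H:", round(H_scipy, 6))
```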

  14. Spectral and spectral-frequency methods of investigating atmosphereless bodies of the Solar system

    International Nuclear Information System (INIS)

    Busarev, Vladimir V; Prokof'eva-Mikhailovskaya, Valentina V; Bochkov, Valerii V

    2007-01-01

    A method of reflectance spectrophotometry of atmosphereless bodies of the Solar system, its specificity, and the means of eliminating basic spectral noise are considered. As a development, joining the method of reflectance spectrophotometry with the frequency analysis of observational data series is proposed. The combined spectral-frequency method allows identification of formations with distinctive spectral features, and estimation of their sizes and distribution on the surface of atmosphereless celestial bodies. As applied to investigations of asteroids 21 Lutetia and 4 Vesta, the spectral-frequency method has given us the possibility of obtaining fundamentally new information about minor planets. (instruments and methods of investigation)

  15. A nonparametric mean-variance smoothing method to assess Arabidopsis cold stress transcriptional regulator CBF2 overexpression microarray data.

    Science.gov (United States)

    Hu, Pingsha; Maiti, Tapabrata

    2011-01-01

    Microarray is a powerful tool for genome-wide gene expression analysis. In microarray expression data, often mean and variance have certain relationships. We present a non-parametric mean-variance smoothing method (NPMVS) to analyze differentially expressed genes. In this method, a nonlinear smoothing curve is fitted to estimate the relationship between mean and variance. Inference is then made upon shrinkage estimation of posterior means assuming variances are known. Different methods have been applied to simulated datasets, in which a variety of mean and variance relationships were imposed. The simulation study showed that NPMVS outperformed the other two popular shrinkage estimation methods in some mean-variance relationships; and NPMVS was competitive with the two methods in other relationships. A real biological dataset, in which a cold stress transcription factor gene, CBF2, was overexpressed, has also been analyzed with the three methods. Gene ontology and cis-element analysis showed that NPMVS identified more cold and stress responsive genes than the other two methods did. The good performance of NPMVS is mainly due to its shrinkage estimation for both means and variances. In addition, NPMVS exploits a non-parametric regression between mean and variance, instead of assuming a specific parametric relationship between mean and variance. The source code written in R is available from the authors on request.

  16. Site Characterization in the Urban Area of Tijuana, B. C., Mexico by Means of: H/V Spectral Ratios, Spectral Analysis of Surface Waves, and Random Decrement Method

    Science.gov (United States)

    Tapia-Herrera, R.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.

    2009-05-01

    Results of site characterization for an experimental site in the metropolitan area of Tijuana, B. C., Mexico are presented as part of on-going research in which time series of earthquakes, ambient noise, and induced vibrations were processed with three different methods: H/V spectral ratios, Spectral Analysis of Surface Waves (SASW), and the Random Decrement Method (RDM). Forward modeling using the wave propagation stiffness matrix method (Roësset and Kausel, 1981) was used to compute the theoretical SH/P and SV/P spectral ratios, and the experimental H/V spectral ratios were computed following the conventional concepts of Fourier analysis. The modeling and comparison between the theoretical and experimental H/V spectral ratios was carried out. For the SASW method the theoretical dispersion curves were also computed and compared with the experimental ones, and finally the theoretical free vibration decay curve was compared with the experimental one obtained with the RDM. All three methods were tested with ambient noise, induced vibrations, and earthquake signals. The experimental spectral ratios obtained from both ambient noise and earthquake signals agree quite well with the theoretical spectral ratios, particularly at the fundamental vibration frequency of the recording site. Differences between the fundamental vibration frequencies are evident for sites located on alluvial fill (~0.6 Hz) and sites located on conglomerate/sandstone fill (0.75 Hz). Shear wave velocities for the soft soil layers of the 4-layer discrete soil model range from as low as 100 m/s up to 280 m/s. The results of the SASW provided information that allows the identification of low-velocity layers not seen before with traditional seismic methods. The damping estimates obtained with the RDM are within the expected values, and the dominant frequency of the system, also obtained with the RDM, correlates within ±20% with the one obtained by means of the H/V spectral
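
    A bare-bones H/V computation (windowed Fourier amplitude spectra of the two horizontal components, combined geometrically and divided by the vertical spectrum, with simple smoothing) is sketched below on synthetic three-component noise. It illustrates the spectral-ratio step only, not the stiffness-matrix forward modeling, SASW, or RDM processing described above.

```python
import numpy as np

def hv_ratio(north, east, vertical, fs, smooth_bins=11):
    """Simple H/V spectral ratio from three-component ambient-noise records."""
    def amp_spectrum(x):
        win = np.hanning(len(x))
        return np.abs(np.fft.rfft((x - x.mean()) * win))
    def smooth(a):
        kernel = np.ones(smooth_bins) / smooth_bins
        return np.convolve(a, kernel, mode="same")
    n_spec, e_spec, v_spec = (amp_spectrum(c) for c in (north, east, vertical))
    h_spec = np.sqrt(n_spec * e_spec)            # geometric mean of horizontals
    freqs = np.fft.rfftfreq(len(north), d=1.0 / fs)
    return freqs, smooth(h_spec) / smooth(v_spec)

# Synthetic 100 Hz records, 5 minutes long, with a horizontal resonance at 0.75 Hz.
rng = np.random.default_rng(8)
fs = 100.0
t = np.arange(0, 300, 1 / fs)
v = rng.standard_normal(t.size)
n = rng.standard_normal(t.size) + 3 * np.sin(2 * np.pi * 0.75 * t)
e = rng.standard_normal(t.size) + 3 * np.sin(2 * np.pi * 0.75 * t + 1.0)
freqs, hv = hv_ratio(n, e, v, fs)
band = (freqs > 0.2) & (freqs < 20.0)
f0 = freqs[band][np.argmax(hv[band])]
print(f"estimated fundamental site frequency: {f0:.2f} Hz")
```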

  17. A Non-Parametric Surrogate-based Test of Significance for T-Wave Alternans Detection

    Science.gov (United States)

    Nemati, Shamim; Abdala, Omar; Bazán, Violeta; Yim-Yeh, Susie; Malhotra, Atul; Clifford, Gari

    2010-01-01

    We present a non-parametric adaptive surrogate test that allows for the differentiation of statistically significant T-Wave Alternans (TWA) from alternating patterns that can be solely explained by the statistics of noise. The proposed test is based on estimating the distribution of noise induced alternating patterns in a beat sequence from a set of surrogate data derived from repeated reshuffling of the original beat sequence. Thus, in assessing the significance of the observed alternating patterns in the data no assumptions are made about the underlying noise distribution. In addition, since the distribution of noise-induced alternans magnitudes is calculated separately for each sequence of beats within the analysis window, the method is robust to data non-stationarities in both noise and TWA. The proposed surrogate method for rejecting noise was compared to the standard noise rejection methods used with the Spectral Method (SM) and the Modified Moving Average (MMA) techniques. Using a previously described realistic multi-lead model of TWA, and real physiological noise, we demonstrate the proposed approach reduces false TWA detections, while maintaining a lower missed TWA detection compared with all the other methods tested. A simple averaging-based TWA estimation algorithm was coupled with the surrogate significance testing and was evaluated on three public databases; the Normal Sinus Rhythm Database (NRSDB), the Chronic Heart Failure Database (CHFDB) and the Sudden Cardiac Death Database (SCDDB). Differences in TWA amplitudes between each database were evaluated at matched heart rate (HR) intervals from 40 to 120 beats per minute (BPM). Using the two-sample Kolmogorov-Smirnov test, we found that significant differences in TWA levels exist between each patient group at all decades of heart rates. The most marked difference was generally found at higher heart rates, and the new technique resulted in a larger margin of separability between patient populations than
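
    The essential mechanics of the surrogate test, measuring an alternans statistic on the observed beat series and comparing it with the distribution of the same statistic over many reshufflings of the beats, fit in a short script. The sketch below uses a simple even/odd mean difference as the alternans estimate and invented beat-to-beat amplitudes; it does not reproduce the paper's averaging-based estimator or its windowed, nonstationarity-robust implementation.

```python
import numpy as np

def alternans_magnitude(beat_values):
    """Crude alternans estimate: |mean(even beats) - mean(odd beats)| / 2."""
    return abs(beat_values[0::2].mean() - beat_values[1::2].mean()) / 2.0

def surrogate_pvalue(beat_values, n_surrogates=2000, seed=0):
    """One-sided p-value: fraction of beat-reshuffled surrogates whose
    alternans magnitude reaches the observed value."""
    rng = np.random.default_rng(seed)
    observed = alternans_magnitude(beat_values)
    exceed = sum(alternans_magnitude(rng.permutation(beat_values)) >= observed
                 for _ in range(n_surrogates))
    return observed, (exceed + 1) / (n_surrogates + 1)

# Hypothetical T-wave amplitude series: 10 uV alternans buried in 25 uV noise.
rng = np.random.default_rng(9)
beats = 500 + 10 * (-1.0) ** np.arange(128) + 25 * rng.standard_normal(128)
obs, p = surrogate_pvalue(beats)
print(f"observed alternans {obs:.1f} uV, surrogate p-value {p:.3f}")
```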

  18. Ultra-wideband spectral analysis using S2 technology

    International Nuclear Information System (INIS)

    Krishna Mohan, R.; Chang, T.; Tian, M.; Bekker, S.; Olson, A.; Ostrander, C.; Khallaayoun, A.; Dollinger, C.; Babbitt, W.R.; Cole, Z.; Reibel, R.R.; Merkel, K.D.; Sun, Y.; Cone, R.; Schlottau, F.; Wagner, K.H.

    2007-01-01

    This paper outlines the efforts to develop an ultra-wideband spectrum analyzer that takes advantage of the broad spectral response and fine spectral resolution (∼25 kHz) of spatial-spectral (S2) materials. The S2 material can process the full spectrum of broadband microwave transmissions, with adjustable time apertures (down to 100 μs) and fast update rates (up to 1 kHz). A cryogenically cooled Tm:YAG crystal that operates on microwave signals modulated onto a stabilized optical carrier at 793 nm is used as the core for the spectrum analyzer. Efforts to develop novel component technologies that enhance the performance of the system and meet the application requirements are discussed, including an end-to-end device model for parameter optimization. We discuss the characterization of new ultra-wide bandwidth S2 materials. Detection and post-processing module development including the implementation of a novel spectral recovery algorithm using field programmable gate array technology (FPGA) is also discussed

  19. Ultra-wideband spectral analysis using S2 technology

    Energy Technology Data Exchange (ETDEWEB)

    Krishna Mohan, R. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States)]. E-mail: krishna@spectrum.montana.edu; Chang, T. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Tian, M. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Bekker, S. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Olson, A. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Ostrander, C. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Khallaayoun, A. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Dollinger, C. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Babbitt, W.R. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Cole, Z. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); S2 Corporation, Bozeman, MT 59718 (United States); Reibel, R.R. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); S2 Corporation, Bozeman, MT 59718 (United States); Merkel, K.D. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); S2 Corporation, Bozeman, MT 59718 (United States); Sun, Y. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Cone, R. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Schlottau, F. [University of Colorado, Boulder, CO 80309 (United States); Wagner, K.H. [University of Colorado, Boulder, CO 80309 (United States)

    2007-11-15

    This paper outlines the efforts to develop an ultra-wideband spectrum analyzer that takes advantage of the broad spectral response and fine spectral resolution (≈25 kHz) of spatial-spectral (S2) materials. The S2 material can process the full spectrum of broadband microwave transmissions, with adjustable time apertures (down to 100 μs) and fast update rates (up to 1 kHz). A cryogenically cooled Tm:YAG crystal that operates on microwave signals modulated onto a stabilized optical carrier at 793 nm is used as the core for the spectrum analyzer. Efforts to develop novel component technologies that enhance the performance of the system and meet the application requirements are discussed, including an end-to-end device model for parameter optimization. We discuss the characterization of new ultra-wide bandwidth S2 materials. Detection and post-processing module development including the implementation of a novel spectral recovery algorithm using field programmable gate array technology (FPGA) is also discussed.

  20. Spectral analysis of K-shell X-ray emission of magnesium plasma

    Indian Academy of Sciences (India)

    2014-02-06

    Spectral analysis of K-shell X-ray emission of magnesium plasma, produced by laser pulses of 45 fs duration focussed up to an intensity of ∼10^18 W cm^-2, is carried out. The plasma conditions prevalent during the emission of the X-ray spectrum were identified by comparing the experimental spectra with the ...

  1. Exact nonparametric confidence bands for the survivor function.

    Science.gov (United States)

    Matthews, David

    2013-10-12

    A method to produce exact simultaneous confidence bands for the empirical cumulative distribution function that was first described by Owen, and subsequently corrected by Jager and Wellner, is the starting point for deriving exact nonparametric confidence bands for the survivor function of any positive random variable. We invert a nonparametric likelihood test of uniformity, constructed from the Kaplan-Meier estimator of the survivor function, to obtain simultaneous lower and upper bands for the function of interest with specified global confidence level. The method involves calculating a null distribution and associated critical value for each observed sample configuration. However, Noe recursions and the Van Wijngaarden-Decker-Brent root-finding algorithm provide the necessary tools for efficient computation of these exact bounds. Various aspects of the effect of right censoring on these exact bands are investigated, using as illustrations two observational studies of survival experience among non-Hodgkin's lymphoma patients and a much larger group of subjects with advanced lung cancer enrolled in trials within the North Central Cancer Treatment Group. Monte Carlo simulations confirm the merits of the proposed method of deriving simultaneous interval estimates of the survivor function across the entire range of the observed sample. This research was supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada. It was begun while the author was visiting the Department of Statistics, University of Auckland, and completed during a subsequent sojourn at the Medical Research Council Biostatistics Unit in Cambridge. The support of both institutions, in addition to that of NSERC and the University of Waterloo, is greatly appreciated.

  2. Noise analysis role in reactor safety, Spectral analysis (PSD)

    International Nuclear Information System (INIS)

    Jovanovic, S.; Velickovic, Lj.

    1967-11-01

    The spectral power density of a zero-power reactor is frequency dependent and related to the transfer function of the reactor and to the spectral density of the input disturbance. Measurement of the spectral power density of a critical system is used to obtain the ratio β/l, where β is the effective yield of delayed neutrons and l is the effective mean neutron lifetime. When the reactor is subcritical, if the effective yield of delayed neutrons and the effective mean neutron lifetime are known, the shutdown margin can be determined from the relation α = (1 − k(1 − β))/l, where k is the effective multiplication factor. The output neutron spectrum at the RB reactor in Vinca was measured for a few reactor core configurations and for a few heavy-water levels in the subcritical state. The measured values were satisfactory when the reactor was critical, but the reactor noise of the subcritical system was covered by the white noise of the detector and electronic equipment. The Ra-Be source was under the reactor vessel when the measurements of the subcritical system were done. A more efficient detector or an external random stimulus increasing the intensity of the neutron fluctuations would be needed to obtain results for the subcritical system.
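
    The quoted relations can be turned into a small worked example: at critical the prompt decay constant reduces to β/l and sets the break frequency of the measured PSD, while in the subcritical case a measured α together with known β and l gives the multiplication factor through α = (1 − k(1 − β))/l. The numbers below are purely illustrative and are not values from the RB reactor measurements.

```python
import numpy as np

# Illustrative values only (not from the RB reactor measurements).
beta = 0.0065      # effective delayed-neutron fraction
ell = 1.0e-3       # effective mean prompt-neutron lifetime, s

# At critical (k = 1) the prompt decay constant reduces to beta / l, which sets
# the break frequency of the measured power spectral density.
alpha_critical = beta / ell
print(f"alpha at critical = beta/l = {alpha_critical:.1f} 1/s "
      f"(PSD break frequency ~ {alpha_critical / (2 * np.pi):.2f} Hz)")

# Subcritical case: alpha is read off the PSD roll-off and the relation
# alpha = (1 - k*(1 - beta)) / l is inverted for the multiplication factor.
alpha_measured = 12.0                       # 1/s, hypothetical fitted value
k = (1.0 - alpha_measured * ell) / (1.0 - beta)
rho = (k - 1.0) / k                         # reactivity (shutdown margin)
print(f"k_eff = {k:.4f}, reactivity = {100 * rho:.2f}% dk/k = {rho / beta:.2f} $")
```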

  3. Analyzing Cost Efficient Production Behavior Under Economies of Scope : A Nonparametric Methodology

    NARCIS (Netherlands)

    Cherchye, L.J.H.; de Rock, B.; Vermeulen, F.M.P.

    2006-01-01

    In designing a production model for firms that generate multiple outputs, we take as a starting point that such multi-output production refers to economies of scope, which in turn originate from joint input use and input externalities. We provide a nonparametric characterization of cost efficient

  4. An Excel-based implementation of the spectral method of action potential alternans analysis.

    Science.gov (United States)

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.

  5. On the Choice of Difference Sequence in a Unified Framework for Variance Estimation in Nonparametric Regression

    KAUST Repository

    Dai, Wenlin; Tong, Tiejun; Zhu, Lixing

    2017-01-01

    Difference-based methods do not require estimating the mean function in nonparametric regression and are therefore popular in practice. In this paper, we propose a unified framework for variance estimation that systematically combines the linear regression method with the higher-order difference estimators. The unified framework greatly enriches the existing literature on variance estimation and includes most existing estimators as special cases. More importantly, it also provides a practical way to solve the challenging difference-sequence selection problem, which has remained a controversial issue in nonparametric regression for several decades. Using both theory and simulations, we recommend using the ordinary difference sequence in the unified framework, regardless of whether the sample size is small or the signal-to-noise ratio is large. Finally, to meet practical demands, we have developed an R package named VarED that integrates the existing difference-based estimators and the unified estimators for nonparametric regression, and have made it freely available on CRAN (http://cran.r-project.org/web/packages/).

  6. On the Choice of Difference Sequence in a Unified Framework for Variance Estimation in Nonparametric Regression

    KAUST Repository

    Dai, Wenlin

    2017-09-01

    Difference-based methods do not require estimating the mean function in nonparametric regression and are therefore popular in practice. In this paper, we propose a unified framework for variance estimation that systematically combines the linear regression method with the higher-order difference estimators. The unified framework greatly enriches the existing literature on variance estimation and includes most existing estimators as special cases. More importantly, it also provides a practical way to solve the challenging difference-sequence selection problem, which has remained a controversial issue in nonparametric regression for several decades. Using both theory and simulations, we recommend using the ordinary difference sequence in the unified framework, regardless of whether the sample size is small or the signal-to-noise ratio is large. Finally, to meet practical demands, we have developed an R package named VarED that integrates the existing difference-based estimators and the unified estimators for nonparametric regression, and have made it freely available on CRAN (http://cran.r-project.org/web/packages/).
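
    As background for the difference-sequence discussion above, the sketch below implements the classical first-order (Rice-type) difference-based variance estimator, the simplest member of the family; it is not the unified framework or the VarED package of the paper.

    import numpy as np

    def rice_variance(y):
        """First-order difference estimator of the error variance sigma^2 in
        y_i = m(x_i) + e_i: successive differences cancel the smooth mean
        function, leaving roughly 2*sigma^2 per squared difference."""
        d = np.diff(np.asarray(y, float))
        return np.sum(d ** 2) / (2.0 * d.size)

    # Smooth signal plus noise with known sigma = 0.3 on a fine grid.
    rng = np.random.default_rng(2)
    x = np.linspace(0, 1, 500)
    y = np.sin(4 * np.pi * x) + rng.normal(0, 0.3, x.size)
    print("true sigma^2 = 0.09, estimate =", round(rice_variance(y), 4))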

  7. A non-parametric Data Envelopment Analysis approach for improving energy efficiency of grape production

    International Nuclear Information System (INIS)

    Khoshroo, Alireza; Mulwa, Richard; Emrouznejad, Ali; Arabi, Behrouz

    2013-01-01

    Grape is one of the world's largest fruit crops, with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production, which depends heavily on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing the destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive energy use and optimizing energy consumption in grape production is the main focus of this paper. In this study we use a two-stage methodology to relate energy efficiency and performance to farmers' specific characteristics. In the first stage, a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water-for-irrigation energies. In the second stage, farm-specific variables such as the farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence the efficiency of grape farming. The results of the first stage show substantial inefficiency among the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers lay in the use of chemicals, diesel fuel and water for irrigation. Efficient farmers used considerably fewer chemicals, such as insecticides, herbicides and fungicides, than inefficient ones. The results also revealed that more educated farmers are more energy efficient than their less educated counterparts. - Highlights: • The focus of this paper is to identify excessive use of energy and optimize energy consumption in grape production. • We measure the efficiency as a function of labor/machinery/chemicals/farmyard manure/diesel-fuel/electricity/water. • Data were obtained from 41 grape
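
    First-stage efficiency scores in studies of this kind typically come from an input-oriented, constant-returns DEA model. The sketch below solves the standard CCR envelopment linear programme for each decision-making unit with SciPy; the input and output numbers are made up for illustration and are not the 41 grape farms of the study.

    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_input(X, Y):
        """Input-oriented CCR efficiency score for each decision-making unit.

        X : (n, m) array of inputs, Y : (n, s) array of outputs.  For unit o:
            min theta  s.t.  sum_j lam_j * x_j <= theta * x_o,
                             sum_j lam_j * y_j >= y_o,   lam >= 0.
        """
        X, Y = np.asarray(X, float), np.asarray(Y, float)
        n, m = X.shape
        s = Y.shape[1]
        scores = np.empty(n)
        for o in range(n):
            c = np.concatenate([[1.0], np.zeros(n)])          # z = [theta, lam_1..lam_n]
            A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # inputs:  X'lam - theta*x_o <= 0
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # outputs: -Y'lam <= -y_o
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                          bounds=[(None, None)] + [(0, None)] * n,
                          method="highs")
            scores[o] = res.x[0]
        return scores

    # Hypothetical farms: inputs = (labour h, diesel L, chemicals kg), output = yield (t).
    X = np.array([[120, 300, 40], [100, 250, 35], [150, 400, 60], [90, 260, 30]], float)
    Y = np.array([[10.0], [9.5], [10.5], [9.0]])
    print(np.round(dea_ccr_input(X, Y), 3))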

  8. Spectral analysis of IGR J01572-7259 during its 2016 outburst

    Science.gov (United States)

    La Palombara, N.; Esposito, P.; Mereghetti, S.; Pintore, F.; Sidoli, L.; Tiengo, A.

    2018-03-01

    We report on the results of the XMM-Newton observation of IGR J01572-7259 during its most recent outburst in 2016 May, the first since 2008. The source reached a flux f ≈ 10^-10 erg cm^-2 s^-1, which allowed us to perform a detailed analysis of its timing and spectral properties. We obtained a pulse period P_spin = 11.58208(2) s. The pulse profile is double peaked and strongly energy dependent, as the second peak is prominent only at low energies and the pulsed fraction increases with energy. The main spectral component is a power-law model, but at low energies we also detected a soft thermal component, which can be described with either a blackbody or a hot-plasma model. Both the EPIC and RGS spectra show several emission lines, which can be identified with the transition lines of ionized N, O, Ne, and Fe and cannot be described with a thermal emission model. The phase-resolved spectral analysis showed that the fluxes of both the soft excess and the emission lines vary with the pulse phase: the soft excess disappears in the first pulse and becomes significant only in the second, where the Fe line is also stronger. This variability is difficult to explain with emission from a hot plasma, while the reprocessing of the primary X-ray emission at the inner edge of the accretion disc provides a plausible scenario. On the other hand, the narrow emission lines can be due to the presence of photoionized matter around the accreting source.

  9. The quantum spectral analysis of the two-dimensional annular billiard system

    International Nuclear Information System (INIS)

    Yan-Hui, Zhang; Ji-Quan, Zhang; Xue-You, Xu; Sheng-Lu, Lin

    2009-01-01

    Based on the extended closed-orbit theory together with spectral analysis, this paper studies the correspondence between quantum mechanics and its classical counterpart in a two-dimensional annular billiard. The results demonstrate that the Fourier-transformed quantum spectra are in very good accordance with the lengths of the classical ballistic trajectories, whereas the spectral strength is intimately associated with the shapes of the possible open orbits connecting any two points in the annular cavity. This approach facilitates an intuitive understanding of basic quantum features such as quantum interference and the locations of the wavefunctions, and it allows quantitative calculations at high energies, where full quantum calculations may become impractical. This treatment provides a thread for exploring the properties of microjunction transport, and even quantum chaos, in much more general systems. (general)

  10. Nonparametric Efficiency Testing of Asian Stock Markets Using Weekly Data

    OpenAIRE

    CORNELIS A. LOS

    2004-01-01

    The efficiency of speculative markets, as represented by Fama's 1970 fair game model, is tested on weekly price index data of six Asian stock markets - Hong Kong, Indonesia, Malaysia, Singapore, Taiwan and Thailand - using Sherry's (1992) non-parametric methods. These scientific testing methods were originally developed to analyze the information processing efficiency of nervous systems. In particular, the stationarity and independence of the price innovations are tested over ten years, from ...

  11. Research on the strong optical feedback effects based on spectral analysis method

    Science.gov (United States)

    Zeng, Zhaoli; Qu, XueMin; Li, Weina; Zhang, Min; Wang, Hao; Li, Tuo

    2018-01-01

    Strong optical feedback has the advantage of generating high-resolution fringes. However, these feedback fringes usually resemble a noise signal when the feedback level is high, a defect that severely limits practical application. In this paper, the generation mechanism of noise-like fringes under strong optical feedback is studied using a spectral analysis method. The spectral analysis results show that, in most cases, the noise-like fringes arise from strong multiple high-order feedback. However, under certain feedback-cavity conditions only one high-order feedback beam may return to the laser cavity, and the noise-like fringes then change into cosine-like fringes. The resolution of these fringes is dozens of times higher than that obtained with weak optical feedback. This research provides a method to obtain high-resolution cosine-like fringes, rather than a noise signal, in the strong optical feedback regime, which makes it possible to use the technique in nanoscale displacement measurements.

  12. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    OpenAIRE

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for the diagnosing of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis by using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in can...

  13. Bayesian Bandwidth Selection for a Nonparametric Regression Model with Mixed Types of Regressors

    Directory of Open Access Journals (Sweden)

    Xibin Zhang

    2016-04-01

    Full Text Available This paper develops a sampling algorithm for bandwidth estimation in a nonparametric regression model with continuous and discrete regressors under an unknown error density. The error density is approximated by the kernel density estimator of the unobserved errors, while the regression function is estimated using the Nadaraya-Watson estimator admitting continuous and discrete regressors. We derive an approximate likelihood and posterior for the bandwidth parameters, followed by a sampling algorithm. Simulation results show that the proposed approach typically leads to better accuracy of the resulting estimates than cross-validation, particularly for smaller sample sizes. This bandwidth estimation approach is applied to a nonparametric regression model of the Australian All Ordinaries returns and to kernel density estimation of gross domestic product (GDP) growth rates among Organisation for Economic Co-operation and Development (OECD) and non-OECD countries.
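
    For reference, the regression estimator named above is easy to write down directly. The sketch below implements a plain Nadaraya-Watson estimator with a Gaussian kernel and a single continuous regressor; the bandwidth value is an arbitrary illustrative choice, not the sampling-based selection proposed in the paper.

    import numpy as np

    def nadaraya_watson(x_grid, x, y, h):
        """Gaussian-kernel Nadaraya-Watson regression estimate at the grid points."""
        w = np.exp(-0.5 * ((np.asarray(x_grid)[:, None] - x[None, :]) / h) ** 2)
        return (w * y[None, :]).sum(axis=1) / w.sum(axis=1)

    rng = np.random.default_rng(3)
    x = np.sort(rng.uniform(-2, 2, 300))
    y = np.sin(2 * x) + rng.normal(0, 0.3, x.size)
    grid = np.linspace(-2, 2, 9)
    m_hat = nadaraya_watson(grid, x, y, h=0.2)      # h = 0.2 is an arbitrary illustrative choice
    for g, m in zip(grid, m_hat):
        print(f"m_hat({g:+.1f}) = {m:+.3f}   (true {np.sin(2 * g):+.3f})")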

  14. Spacetime Discontinuous Galerkin FEM: Spectral Response

    International Nuclear Information System (INIS)

    Abedi, R; Omidi, O; Clarke, P L

    2014-01-01

    Materials in nature exhibit characteristic spectral shapes in their material properties. Since successful experimental demonstrations in 2000, metamaterials have provided a means to engineer materials with desired spectral shapes for their material properties. Computational tools are employed in two different aspects of metamaterial modeling: (1) microscale unit-cell analysis to derive, and possibly optimize, the material's spectral response; (2) macroscale analysis of the interaction with conventional materials. We compare the Time-Domain (TD) and Frequency-Domain (FD) approaches for metamaterial applications. Finally, we discuss the advantages of the TD spacetime discontinuous Galerkin finite element method (FEM) for spectral analysis of metamaterials.

  15. Estimation of sub-pixel water area on Tibet plateau using multiple endmembers spectral mixture spectral analysis from MODIS data

    Science.gov (United States)

    Cui, Qian; Shi, Jiancheng; Xu, Yuanliu

    2011-12-01

    Water is a basic need for human society and a determining factor in ecosystem stability. There are many lakes on the Tibet Plateau, which can lead to floods and mudslides when the water area expands sharply. At present, water area is usually extracted from TM or SPOT data because of their high spatial resolution; however, their temporal resolution is insufficient. MODIS data have high temporal resolution and broad coverage, so they are a valuable resource for detecting changes in water area. Because of their low spatial resolution, however, mixed pixels are common. In this paper, four spectral libraries are built using the MOD09A1 product, and on that basis water bodies are extracted at the sub-pixel level using Multiple Endmember Spectral Mixture Analysis (MESMA) applied to the MODIS daily reflectance product MOD09GA. The unmixing result is compared with contemporaneous TM data, and the comparison shows that this method has high accuracy.
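
    The inner step of any MESMA run is a constrained linear unmixing of a pixel against a candidate endmember set. The sketch below shows only that inner step, with non-negative fractions and a soft sum-to-one constraint imposed through a weighted augmentation row; the four-band reflectance numbers are invented, not MOD09 spectra, and the model-selection loop over endmember combinations is omitted.

    import numpy as np
    from scipy.optimize import nnls

    def unmix(pixel, endmembers, weight=100.0):
        """Non-negative least-squares unmixing with a soft sum-to-one constraint,
        imposed by appending a heavily weighted row of ones to the system."""
        b, k = endmembers.shape
        A = np.vstack([endmembers, weight * np.ones((1, k))])
        y = np.concatenate([pixel, [weight]])
        fractions, _ = nnls(A, y)
        rmse = np.sqrt(np.mean((endmembers @ fractions - pixel) ** 2))
        return fractions, rmse

    # Hypothetical 4-band reflectances for water, vegetation and soil endmembers (columns).
    E = np.array([[0.02, 0.05, 0.20],
                  [0.03, 0.08, 0.25],
                  [0.01, 0.40, 0.30],
                  [0.01, 0.45, 0.35]])
    mixed = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]   # a 60/30/10 mixture
    fractions, rmse = unmix(mixed, E)
    print(np.round(fractions, 3), round(rmse, 4))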

  16. CRISS power spectral density

    International Nuclear Information System (INIS)

    Vaeth, W.

    1979-04-01

    The correlation of signal components at different frequencies, such as higher harmonics, cannot be detected by a normal power spectral density measurement, since this technique correlates only components at the same frequency. This paper describes a special method for measuring the correlation of two signal components at different frequencies: the CRISS power spectral density. From this new function in frequency analysis, the correlation of two components can be determined quantitatively, whether they stem from one signal or from two different signals. The principle of the method, suitable for the higher harmonics of a signal as well as for any other frequency combination, is shown for the digital frequency analysis technique. Two examples of CRISS power spectral densities demonstrate the operation of the new method. (orig.) [de

  17. Estimation of the limit of detection with a bootstrap-derived standard error by a partly non-parametric approach. Application to HPLC drug assays

    DEFF Research Database (Denmark)

    Linnet, Kristian

    2005-01-01

    Bootstrap, HPLC, limit of blank, limit of detection, non-parametric statistics, type I and II errors

  18. Chebyshev super spectral viscosity method for water hammer analysis

    Directory of Open Access Journals (Sweden)

    Hongyu Chen

    2013-09-01

    Full Text Available In this paper, a new fast and efficient algorithm, the Chebyshev super spectral viscosity (SSV) method, is introduced to solve the water hammer equations. Compared with the standard spectral method, the method's advantage essentially consists in adding a super spectral viscosity to the equations for the high wave numbers of the numerical solution. This stabilizes the numerical oscillation (Gibbs phenomenon) and improves the computational efficiency when discontinuities appear in the solution. Results obtained from the Chebyshev super spectral viscosity method exhibit greater consistency with conventional water hammer calculations. This shows that the new numerical method offers an alternative way to investigate the behavior of the water hammer in propellant pipelines.

  19. Principle component analysis and linear discriminant analysis of multi-spectral autofluorescence imaging data for differentiating basal cell carcinoma and healthy skin

    Science.gov (United States)

    Chernomyrdin, Nikita V.; Zaytsev, Kirill I.; Lesnichaya, Anastasiya D.; Kudrin, Konstantin G.; Cherkasova, Olga P.; Kurlov, Vladimir N.; Shikunova, Irina A.; Perchik, Alexei V.; Yurchenko, Stanislav O.; Reshetov, Igor V.

    2016-09-01

    In the present paper, the ability to differentiate basal cell carcinoma (BCC) from healthy skin by combining multi-spectral autofluorescence imaging, principal component analysis (PCA), and linear discriminant analysis (LDA) has been demonstrated. For this purpose, an experimental setup comprising excitation and detection branches has been assembled. The excitation branch utilizes a mercury arc lamp equipped with a 365-nm narrow-linewidth excitation filter, a beam homogenizer, and a mechanical chopper. The detection branch employs a set of bandpass filters with central wavelengths of spectral transparency of λ = 400, 450, 500, and 550 nm, and a digital camera. The setup has been used to study three samples of freshly excised BCC. PCA and LDA have been implemented to analyze the multi-spectral fluorescence imaging data. The results of this pilot study highlight the advantages of the proposed imaging technique for skin cancer diagnosis.
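
    The PCA-plus-LDA step described above can be reproduced with standard tools. The sketch below assumes a feature matrix whose rows hold per-pixel intensities in the four detection bands (400, 450, 500, 550 nm) and a binary BCC/healthy label; the data are simulated stand-ins, not the clinical measurements of the study.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(4)
    # Simulated 4-band autofluorescence intensities for healthy skin (0) and BCC (1).
    healthy = rng.normal([1.0, 0.8, 0.6, 0.5], 0.10, size=(200, 4))
    bcc = rng.normal([0.8, 0.7, 0.6, 0.6], 0.10, size=(200, 4))
    X = np.vstack([healthy, bcc])
    y = np.concatenate([np.zeros(200), np.ones(200)])

    clf = make_pipeline(PCA(n_components=3), LinearDiscriminantAnalysis())
    acc = cross_val_score(clf, X, y, cv=5)
    print("5-fold cross-validated accuracy:", round(acc.mean(), 3))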

  20. Spectral Analysis of a Quantum System with a Double Line Singular Interaction

    Czech Academy of Sciences Publication Activity Database

    Kondej, S.; Krejčiřík, David

    2013-01-01

    Roč. 49, č. 4 (2013), s. 831-859 ISSN 0034-5318 R&D Projects: GA ČR GAP203/11/0701 Institutional support: RVO:61389005 Keywords : Schrödinger operator * singular perturbation * spectral analysis * Hardy inequality * resonance Subject RIV: BE - Theoretical Physics Impact factor: 0.614, year: 2013

  1. GBTIDL: Reduction and Analysis of GBT Spectral Line Data

    Science.gov (United States)

    Marganian, P.; Garwood, R. W.; Braatz, J. A.; Radziwill, N. M.; Maddalena, R. J.

    2013-03-01

    GBTIDL is an interactive package for reduction and analysis of spectral line data taken with the Robert C. Byrd Green Bank Telescope (GBT). The package, written entirely in IDL, consists of straightforward yet flexible calibration, averaging, and analysis procedures (the "GUIDE layer") modeled after the UniPOPS and CLASS data reduction philosophies, a customized plotter with many built-in visualization features, and Data I/O and toolbox functionality that can be used for more advanced tasks. GBTIDL makes use of data structures which can also be used to store intermediate results. The package consumes and produces data in GBT SDFITS format. GBTIDL can be run online and have access to the most recent data coming off the telescope, or can be run offline on preprocessed SDFITS files.

  2. Assessing Goodness of Fit in Item Response Theory with Nonparametric Models: A Comparison of Posterior Probabilities and Kernel-Smoothing Approaches

    Science.gov (United States)

    Sueiro, Manuel J.; Abad, Francisco J.

    2011-01-01

    The distance between nonparametric and parametric item characteristic curves has been proposed as an index of goodness of fit in item response theory in the form of a root integrated squared error index. This article proposes to use the posterior distribution of the latent trait as the nonparametric model and compares the performance of an index…

  3. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    Full Text Available This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area's hydrological information regarding rainfall was scarce and the existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations of the information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, as was the magnitude of the uncertainty affecting hydrological parameter estimation. This paper's conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and maybe an obligatory procedure in the future. Their potential lies in treating scarce information, and they represent a robust modelling strategy for non-seasonal stochastic modelling conditions.
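
    A probability box of the kind used above can be assembled from interval estimates of cumulative probabilities: lower and upper envelopes that bound the unknown CDF. The sketch below builds a crude, pointwise p-box from normal-approximation confidence intervals on cumulative proportions of a made-up rainfall sample; it illustrates the idea only and is not the authors' algorithm.

    import numpy as np
    from scipy.stats import norm

    def pbox_from_sample(data, edges, conf=0.95):
        """Pointwise lower/upper envelopes for the CDF at the given thresholds,
        using Wald (normal-approximation) intervals for cumulative proportions.
        Pointwise only; not a simultaneous band."""
        data = np.asarray(data, float)
        n = data.size
        z = norm.ppf(0.5 + conf / 2)
        p_hat = np.array([(data <= e).mean() for e in edges])
        half = z * np.sqrt(p_hat * (1 - p_hat) / n)
        lower = np.clip(p_hat - half, 0.0, 1.0)
        upper = np.clip(p_hat + half, 0.0, 1.0)
        lower = np.maximum.accumulate(lower)            # keep envelopes monotone
        upper = np.minimum.accumulate(upper[::-1])[::-1]
        return lower, upper

    # Hypothetical daily-rainfall sample (mm) and thresholds of interest.
    rng = np.random.default_rng(5)
    rain = rng.gamma(shape=0.6, scale=12.0, size=120)
    edges = np.array([10.0, 30.0, 50.0, 70.0, 90.0])
    lo, up = pbox_from_sample(rain, edges)
    for e, l, u in zip(edges, lo, up):
        print(f"P(daily rainfall <= {e:4.0f} mm) in [{l:.2f}, {u:.2f}]")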

  4. On the detection of corrosion pit interactions using two-dimensional spectral analysis

    International Nuclear Information System (INIS)

    Jarrah, Adil; Nianga, Jean-Marie; Iost, Alain; Guillemot, Gildas; Najjar, Denis

    2010-01-01

    A statistical methodology for detecting pit interactions based on two-dimensional spectral analysis is presented. This method can be used as a tool for the exploratory analysis of spatial point patterns and can be advanced as an alternative to classical distance-based methods. One of the major advantages of the spectral analysis approach over classical methods is its ability to reveal more details about the spatial structure, such as the scale at which corrosion pits can be considered independent. Furthermore, directional components of the pattern can be investigated. The method is first validated using numerical simulations of random, regular and aggregated structures. The density of pits used in the numerical simulations corresponds to that assessed from a corroded aluminium sheet. The method is then applied to verify the independence of the corrosion pits observed on the aforementioned aluminium sheet before applying the Gumbel theory to determine the maximum pit depth. Indeed, independence is a prerequisite of the Gumbel theory, which is one of the approaches most frequently used in the field of safety and reliability.

  5. Non-parametric trend analysis of the aridity index for three large arid and semi-arid basins in Iran

    Science.gov (United States)

    Ahani, Hossien; Kherad, Mehrzad; Kousari, Mohammad Reza; van Roosmalen, Lieke; Aryanfar, Ramin; Hosseini, Seyyed Mashaallah

    2013-05-01

    Currently, an important scientific challenge that researchers are facing is to gain a better understanding of climate change at the regional scale, which can be especially challenging in an area with low and highly variable precipitation amounts such as Iran. Trend analysis of the medium-term change using ground station observations of meteorological variables can enhance our knowledge of the dominant processes in an area and contribute to the analysis of future climate projections. Generally, studies focus on the long-term variability of temperature and precipitation and to a lesser extent on other important parameters such as moisture indices. In this study the recent 50-year trends (1955-2005) of precipitation (P), potential evapotranspiration (PET), and aridity index (AI) in monthly time scale were studied over 14 synoptic stations in three large Iran basins using the Mann-Kendall non-parametric test. Additionally, an analysis of the monthly, seasonal and annual trend of each parameter was performed. Results showed no significant trends in the monthly time series. However, PET showed significant, mostly decreasing trends, for the seasonal values, which resulted in a significant negative trend in annual PET at five stations. Significant negative trends in seasonal P values were only found at a number of stations in spring and summer and no station showed significant negative trends in annual P. Due to the varied positive and negative trends in annual P and to a lesser extent PET, almost as many stations with negative as positive trends in annual AI were found, indicating that both drying and wetting trends occurred in Iran. Overall, the northern part of the study area showed an increasing trend in annual AI which meant that the region became wetter, while the south showed decreasing trends in AI.
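
    The Mann-Kendall statistic used throughout the study is simple to compute directly. The sketch below implements the basic version of the test (no tie or autocorrelation correction) on a made-up 50-year series; real seasonal or monthly applications would need the published corrections.

    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Basic Mann-Kendall trend test (no tie or autocorrelation correction).
        Returns the S statistic, the standardised Z and the two-sided p-value."""
        x = np.asarray(x, float)
        n = x.size
        s = 0.0
        for i in range(n - 1):
            s += np.sign(x[i + 1:] - x[i]).sum()
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        p = 2 * (1 - norm.cdf(abs(z)))
        return s, z, p

    # Hypothetical 50-year annual aridity-index series with a weak drying trend.
    rng = np.random.default_rng(6)
    years = np.arange(1955, 2005)
    ai = 0.35 - 0.001 * (years - 1955) + rng.normal(0, 0.03, years.size)
    s, z, p = mann_kendall(ai)
    print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")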

  6. Assessing and monitoring of urban vegetation using multiple endmember spectral mixture analysis

    Science.gov (United States)

    Zoran, M. A.; Savastru, R. S.; Savastru, D. M.

    2013-08-01

    In recent years, urban vegetation, with its significant health, biological and economic value, has experienced dramatic changes due to urbanization and human activities in the metropolitan area of Bucharest, Romania. We investigated the utility of remote sensing approaches based on multiple endmember spectral mixture analysis (MESMA) applied to IKONOS and Landsat TM/ETM satellite data for estimating the fractional cover of urban/periurban forest, parks and agricultural vegetation areas. Because the spectral heterogeneity of the same physical features of urban vegetation increases with image resolution, traditional statistical methods based on spectral information alone may not be adequate for classifying land cover dynamics from high-resolution imagery such as IKONOS. We therefore used a hierarchical tree classification method together with MESMA to assess vegetation land cover dynamics from the available high-resolution IKONOS imagery of Bucharest. This study employs thirty-two endmembers and six hundred and sixty spectral models to identify the surface components (vegetation, water, soil, impervious surfaces) and shade in the Bucharest area. The mean RMS errors for the selected vegetation land cover classes range from 0.0027 to 0.018. The Pearson correlations between the fraction outputs from MESMA and reference data derived from 1-m panchromatic IKONOS imagery range from 0.7048 to 0.8287 for urban/periurban vegetation. The framework in this study can be applied to other urban vegetation areas in Romania.

  7. Spectral analysis software improves confidence in plant and soil water stable isotope analyses performed by isotope ratio infrared spectroscopy (IRIS).

    Science.gov (United States)

    West, A G; Goldsmith, G R; Matimati, I; Dawson, T E

    2011-08-30

    Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromises their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a similar methodology to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ(2)H and δ(18)O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, spectra were analyzed using the manufacturer's spectral analysis software, in order to see if the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data must be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be

  8. Analyzing availability using transfer function models and cross spectral analysis

    International Nuclear Information System (INIS)

    Singpurwalla, N.D.

    1980-01-01

    The paper shows how the methods of multivariate time series analysis can be used in a novel way to investigate the interrelationships between a series of operating (running) times and a series of maintenance (down) times of a complex system. Specifically, the techniques of cross spectral analysis are used to help obtain a Box-Jenkins type transfer function model for the running times and the down times of a nuclear reactor. A knowledge of the interrelationships between the running times and the down times is useful for an evaluation of maintenance policies, for replacement policy decisions, and for evaluating the availability and the readiness of complex systems
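
    The cross-spectral quantities feeding a transfer-function identification of this type can be estimated with standard routines. The sketch below works on a synthetic input/output pair rather than actual running-time and down-time series, and estimates the frequency response H(f) = S_xy(f)/S_xx(f) together with the coherence.

    import numpy as np
    from scipy.signal import csd, welch, coherence, lfilter

    rng = np.random.default_rng(7)
    x = rng.normal(size=4096)                                   # "input" series
    y = lfilter([0.0, 0.6, 0.3], [1.0], x) + 0.2 * rng.normal(size=x.size)   # lagged response

    f, Sxy = csd(x, y, fs=1.0, nperseg=256)
    _, Sxx = welch(x, fs=1.0, nperseg=256)
    H = Sxy / Sxx                                               # estimated frequency response
    _, Cxy = coherence(x, y, fs=1.0, nperseg=256)
    print("low-frequency gain ~", np.round(np.abs(H[1:4]).mean(), 2), "(true value 0.9 at f = 0)")
    print("low-frequency coherence ~", np.round(Cxy[1:4].mean(), 2))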

  9. Analysis of In-Situ Spectral Reflectance of Sago and Other Palms: Implications for Their Detection in Optical Satellite Images

    Science.gov (United States)

    Rendon Santillan, Jojene; Makinano-Santillan, Meriam

    2018-04-01

    We present a characterization, comparison and analysis of the in-situ spectral reflectance of Sago and other palms (coconut, oil palm and nipa) to ascertain in which part of the electromagnetic spectrum these palms are distinguishable from each other. The analysis also aims to reveal information that will assist in selecting which bands to use when mapping Sago palms from the images acquired by these sensors. The datasets used in the analysis consisted of averaged spectral reflectance curves of each palm species measured within the 345-1045 nm wavelength range using an Ocean Optics USB4000-VIS-NIR Miniature Fiber Optic Spectrometer. These in-situ reflectance data were also resampled to match the spectral responses of the 4 bands of ALOS AVNIR-2, 3 bands of ASTER VNIR, 4 bands of Landsat 7 ETM+, 5 bands of Landsat 8, and 8 bands of Worldview-2 (WV2). Examination of the spectral reflectance curves showed that the near-infrared region, specifically at 770, 800 and 875 nm, provides the wavelengths at which Sago palms are best distinguished from other palms. Resampling the in-situ reflectance spectra to match the spectral responses of the optical sensors made it possible to analyse the differences in the reflectance values of Sago and other palms in the different bands of each sensor. Overall, the knowledge gained from the analysis can be useful in the actual analysis of optical satellite images, specifically in determining which bands to include or exclude, or whether to use all bands of a sensor, when discriminating and mapping Sago palms.

  10. Nonparametric estimation of location and scale parameters

    KAUST Repository

    Potgieter, C.J.

    2012-12-01

    Two random variables X and Y belong to the same location-scale family if there are constants μ and σ such that Y and μ+σX have the same distribution. In this paper we consider non-parametric estimation of the parameters μ and σ under minimal assumptions regarding the form of the distribution functions of X and Y. We discuss an approach to the estimation problem that is based on asymptotic likelihood considerations. Our results enable us to provide a methodology that can be implemented easily and which yields estimators that are often near optimal when compared to fully parametric methods. We evaluate the performance of the estimators in a series of Monte Carlo simulations. © 2012 Elsevier B.V. All rights reserved.
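
    A crude baseline for the estimation problem above uses sample quantiles: if Y has the same distribution as μ + σX, medians and interquartile ranges transform accordingly. The sketch below implements that baseline only; it is a simple comparator, not the asymptotic-likelihood estimator of the paper.

    import numpy as np

    def quantile_location_scale(x, y):
        """Estimate (mu, sigma) in Y =d mu + sigma * X from medians and IQRs."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        iqr = lambda a: np.subtract(*np.percentile(a, [75, 25]))
        sigma = iqr(y) / iqr(x)
        mu = np.median(y) - sigma * np.median(x)
        return mu, sigma

    # Two samples sharing an unknown shape (t with 5 df), differing in location/scale.
    rng = np.random.default_rng(8)
    x = rng.standard_t(df=5, size=2000)
    y = 3.0 + 2.0 * rng.standard_t(df=5, size=2000)
    mu, sigma = quantile_location_scale(x, y)
    print(f"mu ~ {mu:.2f} (true 3), sigma ~ {sigma:.2f} (true 2)")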

  11. Non-parametric PSF estimation from celestial transit solar images using blind deconvolution

    Directory of Open Access Journals (Sweden)

    González Adriana

    2016-01-01

    Full Text Available Context: Characterization of instrumental effects in astronomical imaging is important in order to extract accurate physical information from the observations. The measured image in a real optical instrument is usually represented by the convolution of an ideal image with a Point Spread Function (PSF). Additionally, the image acquisition process is also contaminated by other sources of noise (read-out, photon counting). The problem of estimating both the PSF and a denoised image is called blind deconvolution and is ill-posed. Aims: We propose a blind deconvolution scheme that relies on image regularization. Contrary to most methods presented in the literature, our method does not assume a parametric model of the PSF and can thus be applied to any telescope. Methods: Our scheme uses a wavelet analysis prior model on the image and weak assumptions on the PSF. We use observations from a celestial transit, where the occulting body can be assumed to be a black disk. These constraints allow us to retain meaningful solutions for the filter and the image, eliminating trivial, translated, and interchanged solutions. Under an additive Gaussian noise assumption, they also enforce noise cancelling and avoid reconstruction artifacts by promoting the whiteness of the residual between the blurred observations and the cleaned data. Results: Our method is applied to synthetic and experimental data. The PSF is estimated for the SECCHI/EUVI instrument using the 2007 Lunar transit, and for SDO/AIA using the 2012 Venus transit. Results show that the proposed non-parametric blind deconvolution method is able to estimate the core of the PSF with a quality similar to that of parametric methods proposed in the literature. We also show that, if these parametric estimations are incorporated into the acquisition model, the resulting PSF outperforms both the parametric and non-parametric methods.

  12. Spectral analysis of mammographic images using a multitaper method

    International Nuclear Information System (INIS)

    Wu Gang; Mainprize, James G.; Yaffe, Martin J.

    2012-01-01

    Purpose: Power spectral analysis in radiographic images is conventionally performed using a windowed overlapping averaging periodogram. This study describes an alternative approach using a multitaper technique and compares its performance with that of the standard method. This tool will be valuable in power spectrum estimation of images, whose content deviates significantly from uniform white noise. The performance of the multitaper approach will be evaluated in terms of spectral stability, variance reduction, bias, and frequency precision. The ultimate goal is the development of a useful tool for image quality assurance. Methods: A multitaper approach uses successive data windows of increasing order. This mitigates spectral leakage allowing one to calculate a reduced-variance power spectrum. The multitaper approach will be compared with the conventional power spectrum method in several typical situations, including the noise power spectra (NPS) measurements of simulated projection images of a uniform phantom, NPS measurement of real detector images of a uniform phantom for two clinical digital mammography systems, and the estimation of the anatomic noise in mammographic images (simulated images and clinical mammograms). Results: Examination of spectrum variance versus frequency resolution and bias indicates that the multitaper approach is superior to the conventional single taper methods in the prevention of spectrum leakage and variance reduction. More than four times finer frequency precision can be achieved with equivalent or less variance and bias. Conclusions: Without any shortening of the image data length, the bias is smaller and the frequency resolution is higher with the multitaper method, and the need to compromise in the choice of regions of interest size to balance between the reduction of variance and the loss of frequency resolution is largely eliminated.
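
    A bare-bones one-dimensional multitaper estimate of the kind compared above can be written with SciPy's DPSS windows. The sketch below averages K eigenspectra of a synthetic test signal; the NPS normalisation and the two-dimensional extension needed for detector images are deliberately omitted.

    import numpy as np
    from scipy.signal.windows import dpss

    def multitaper_psd(x, fs, NW=4, K=7):
        """Average of K DPSS eigenspectra with time-bandwidth product NW
        (unscaled one-sided estimate, adequate for locating spectral peaks)."""
        x = np.asarray(x, float) - np.mean(x)
        n = x.size
        tapers = dpss(n, NW, Kmax=K)                       # shape (K, n)
        spectra = np.abs(np.fft.rfft(tapers * x[None, :], axis=1)) ** 2
        return np.fft.rfftfreq(n, d=1.0 / fs), spectra.mean(axis=0) / fs

    fs = 1000.0
    t = np.arange(0, 2, 1 / fs)
    rng = np.random.default_rng(9)
    x = np.sin(2 * np.pi * 60 * t) + rng.normal(0, 1.0, t.size)
    f, p = multitaper_psd(x, fs)
    print(f"spectral peak near {f[np.argmax(p)]:.1f} Hz")   # expect ~60 Hz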

  13. Transition redshift: new constraints from parametric and nonparametric methods

    Energy Technology Data Exchange (ETDEWEB)

    Rani, Nisha; Mahajan, Shobhit; Mukherjee, Amitabha [Department of Physics and Astrophysics, University of Delhi, New Delhi 110007 (India); Jain, Deepak [Deen Dayal Upadhyaya College, University of Delhi, New Delhi 110015 (India); Pires, Nilza, E-mail: nrani@physics.du.ac.in, E-mail: djain@ddu.du.ac.in, E-mail: shobhit.mahajan@gmail.com, E-mail: amimukh@gmail.com, E-mail: npires@dfte.ufrn.br [Departamento de Física Teórica e Experimental, UFRN, Campus Universitário, Natal, RN 59072-970 (Brazil)

    2015-12-01

    In this paper, we use the cosmokinematics approach to study the accelerated expansion of the Universe. This is a model independent approach and depends only on the assumption that the Universe is homogeneous and isotropic and is described by the FRW metric. We parametrize the deceleration parameter, q(z), to constrain the transition redshift (z_t) at which the expansion of the Universe goes from a decelerating to an accelerating phase. We use three different parametrizations of q(z), namely q_I(z) = q_1 + q_2 z, q_II(z) = q_3 + q_4 ln(1 + z) and q_III(z) = 1/2 + q_5/(1 + z)^2. A joint analysis of the age of galaxies, strong lensing and supernovae Ia data indicates that the transition redshift is less than unity, i.e. z_t < 1. We also use a nonparametric approach (LOESS+SIMEX) to constrain z_t. This too gives z_t < 1, which is consistent with the value obtained by the parametric approach.

  14. Estimation of spectral kurtosis

    Science.gov (United States)

    Sutawanir

    2017-03-01

    Rolling bearings are the most important elements in rotating machinery. Bearings frequently fall out of service for various reasons: heavy loads, unsuitable lubrication, ineffective sealing. Bearing faults may cause a decrease in performance. Analysis of bearing vibration signals has attracted attention in the field of condition monitoring and fault diagnosis, since these signals carry rich information for the early detection of bearing failures. Spectral kurtosis, SK, is a frequency-domain parameter indicating how the impulsiveness of a signal varies with frequency. Faults in rolling bearings give rise to a series of short impulse responses as the rolling elements strike the faults, making SK potentially useful for determining frequency bands dominated by bearing fault signals. SK can provide a measure of the distance of the analyzed bearing from a healthy one, and it supplies information additional to that given by the power spectral density (PSD). This paper aims to explore the estimation of spectral kurtosis using the short-time Fourier transform, i.e. the spectrogram. The estimation of SK is similar to the estimation of the PSD; it is a model-free, plug-in estimator. Some numerical studies using simulations are discussed to support the methodology. The spectral kurtosis of some stationary signals is obtained analytically and used in the simulation study. Kurtosis in the time domain has been a popular tool for detecting non-normality, and spectral kurtosis is its extension to the frequency domain. The relationship between time-domain and frequency-domain analysis is established through the Fourier-transform pair linking the power spectrum and the autocovariance. The Fourier transform is the main tool for estimation in the frequency domain, and the power spectral density is estimated through the periodogram. In this paper, the short-time Fourier transform estimate of the spectral kurtosis is reviewed, and a bearing fault (inner ring and outer ring) is simulated. The bearing response, power spectrum, and spectral kurtosis are plotted to
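
    A direct spectrogram-based estimate of spectral kurtosis, as discussed above, normalises the fourth moment of the STFT magnitude in each frequency bin. The sketch below applies it to a synthetic signal containing repetitive impulses exciting a 4 kHz resonance, the kind of component SK is meant to highlight; no real bearing geometry or fault frequency is modelled.

    import numpy as np
    from scipy.signal import stft

    def spectral_kurtosis(x, fs, nperseg=256):
        """SK(f) = E|X(f,t)|^4 / (E|X(f,t)|^2)^2 - 2, estimated from an STFT
        (near zero for stationary Gaussian noise, large where impulsive energy sits)."""
        f, _, Z = stft(x, fs=fs, nperseg=nperseg)
        mag2 = np.abs(Z) ** 2
        sk = (mag2 ** 2).mean(axis=1) / mag2.mean(axis=1) ** 2 - 2.0
        return f, sk

    fs = 20000.0
    t = np.arange(0, 1, 1 / fs)
    rng = np.random.default_rng(10)
    x = rng.normal(0, 1.0, t.size)
    # Repetitive impulses (20 per second) exciting a decaying 4 kHz "resonance".
    for t0 in np.arange(0, 1, 0.05):
        idx = (t >= t0) & (t < t0 + 0.002)
        x[idx] += 5 * np.exp(-(t[idx] - t0) / 0.0005) * np.sin(2 * np.pi * 4000 * (t[idx] - t0))

    f, sk = spectral_kurtosis(x, fs)
    print(f"SK is largest near {f[np.argmax(sk)]:.0f} Hz")   # expect around 4 kHz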

  15. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.

  16. WINDOWS: a program for the analysis of spectral data foil activation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Stallmann, F.W.; Eastham, J.F.; Kam, F.B.K.

    1978-12-01

    The computer program WINDOWS, together with its subroutines, is described for the analysis of neutron spectral data from foil activation measurements. In particular, it covers the unfolding of the neutron differential spectrum, estimated windows and detector contributions, upper and lower bounds for an integral response, and group fluxes obtained from neutron transport calculations. 116 references. (JFP)

  17. WINDOWS: a program for the analysis of spectral data foil activation measurements

    International Nuclear Information System (INIS)

    Stallmann, F.W.; Eastham, J.F.; Kam, F.B.K.

    1978-12-01

    The computer program WINDOWS, together with its subroutines, is described for the analysis of neutron spectral data from foil activation measurements. In particular, it covers the unfolding of the neutron differential spectrum, estimated windows and detector contributions, upper and lower bounds for an integral response, and group fluxes obtained from neutron transport calculations. 116 references

  18. Planck 2013 results. IX. HFI spectral response

    CERN Document Server

    Ade, P A R; Armitage-Caplan, C; Arnaud, M; Ashdown, M; Atrio-Barandela, F; Aumont, J; Baccigalupi, C; Banday, A J; Barreiro, R B; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bobin, J; Bock, J J; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Bridges, M; Bucher, M; Burigana, C; Cardoso, J -F; Catalano, A; Challinor, A; Chamballu, A; Chary, R -R; Chen, X; Chiang, L -Y; Chiang, H C; Christensen, P R; Church, S; Clements, D L; Colombi, S; Colombo, L P L; Combet, C; Comis, B; Couchot, F; Coulais, A; Crill, B P; Curto, A; Cuttaia, F; Danese, L; Davies, R D; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Delouis, J -M; Désert, F -X; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dupac, X; Efstathiou, G; Enßlin, T A; Eriksen, H K; Falgarone, E; Finelli, F; Forni, O; Frailis, M; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Giraud-Héraud, Y; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Hansen, F K; Hanson, D; Harrison, D; Henrot-Versillé, S; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hivon, E; Hobson, M; Holmes, W A; Hornstrup, A; Hovest, W; Huffenberger, K M; Hurier, G; Jaffe, T R; Jaffe, A H; Jones, W C; Juvela, M; Keihänen, E; Keskitalo, R; Kisner, T S; Kneissl, R; Knoche, J; Knox, L; Kunz, M; Kurki-Suonio, H; Lagache, G; Lamarre, J -M; Lasenby, A; Laureijs, R J; Lawrence, C R; Leahy, J P; Leonardi, R; Leroy, C; Lesgourgues, J; Liguori, M; Lilje, P B; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Macías-Pérez, J F; Maffei, B; Mandolesi, N; Maris, M; Marshall, D J; Martin, P G; Martínez-González, E; Masi, S; Matarrese, S; Matthai, F; Mazzotta, P; McGehee, P; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschênes, M -A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Netterfield, C B; Nørgaard-Nielsen, H U; North, C; Noviello, F; Novikov, D; Novikov, I; Osborne, S; Oxborrow, C A; Paci, F; Pagano, L; Pajot, F; Paoletti, D; Pasian, F; Patanchon, G; Perdereau, O; Perotto, L; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Poutanen, T; Pratt, G W; Prézeau, G; Prunet, S; Puget, J -L; Rachen, J P; Reinecke, M; Remazeilles, M; Renault, C; Ricciardi, S; Riller, T; Ristorcelli, I; Rocha, G; Rosset, C; Roudier, G; Rusholme, B; Santos, D; Savini, G; Shellard, E P S; Spencer, L D; Starck, J -L; Stolyarov, V; Stompor, R; Sudiwala, R; Sureau, F; Sutton, D; Suur-Uski, A -S; Sygnet, J -F; Tauber, J A; Tavagnacco, D; Terenzi, L; Tomasi, M; Tristram, M; Tucci, M; Umana, G; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Vittorio, N; Wade, L A; Wandelt, B D; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    The Planck High Frequency Instrument (HFI) spectral response was determined through a series of ground-based tests conducted with the HFI focal plane in a cryogenic environment prior to launch. The main goal of the spectral transmission tests was to measure the relative spectral response (including out-of-band signal rejection) of all HFI detectors. This was determined by measuring the output of a continuously scanned Fourier transform spectrometer coupled with all HFI detectors. As there is no on-board spectrometer within HFI, the ground-based spectral response experiments provide the definitive data set for the relative spectral calibration of the HFI. The spectral response of the HFI is used in Planck data analysis and component separation; this includes extraction of CO emission observed within Planck bands, dust emission, Sunyaev-Zeldovich sources, and intensity-to-polarization leakage. The HFI spectral response data have also been used to provide unit conversion and colour correction analysis tools. Ver...

  19. Non-parametric system identification from non-linear stochastic response

    DEFF Research Database (Denmark)

    Rüdinger, Finn; Krenk, Steen

    2001-01-01

    An estimation method is proposed for identification of non-linear stiffness and damping of single-degree-of-freedom systems under stationary white noise excitation. Non-parametric estimates of the stiffness and damping along with an estimate of the white noise intensity are obtained by suitable...... of the energy at mean-level crossings, which yields the damping relative to white noise intensity. Finally, an estimate of the noise intensity is extracted by estimating the absolute damping from the autocovariance functions of a set of modified phase plane variables at different energy levels. The method...

  20. Bayesian nonparametric clustering in phylogenetics: modeling antigenic evolution in influenza.

    Science.gov (United States)

    Cybis, Gabriela B; Sinsheimer, Janet S; Bedford, Trevor; Rambaut, Andrew; Lemey, Philippe; Suchard, Marc A

    2018-01-30

    Influenza is responsible for up to 500,000 deaths every year, and antigenic variability represents much of its epidemiological burden. To visualize antigenic differences across many viral strains, antigenic cartography methods use multidimensional scaling on binding assay data to map influenza antigenicity onto a low-dimensional space. Analysis of such assay data ideally leads to natural clustering of influenza strains of similar antigenicity that correlate with sequence evolution. To understand the dynamics of these antigenic groups, we present a framework that jointly models genetic and antigenic evolution by combining multidimensional scaling of binding assay data, Bayesian phylogenetic machinery and nonparametric clustering methods. We propose a phylogenetic Chinese restaurant process that extends the current process to incorporate the phylogenetic dependency structure between strains in the modeling of antigenic clusters. With this method, we are able to use the genetic information to better understand the evolution of antigenicity throughout epidemics, as shown in applications of this model to H1N1 influenza. Copyright © 2017 John Wiley & Sons, Ltd.
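
    For readers unfamiliar with the clustering machinery mentioned above, the sketch below draws a random partition from a plain Chinese restaurant process with concentration parameter alpha; the phylogenetic extension of the paper, which modifies the seating probabilities using the tree, is not reproduced.

    import numpy as np

    def sample_crp(n, alpha, rng):
        """Draw cluster labels for n items from a standard Chinese restaurant
        process with concentration parameter alpha (non-phylogenetic)."""
        labels = np.empty(n, dtype=int)
        counts = []                          # counts[k] = current size of cluster k
        for i in range(n):
            probs = np.array(counts + [alpha], dtype=float)
            probs /= probs.sum()
            k = rng.choice(len(probs), p=probs)
            if k == len(counts):             # open a new cluster ("table")
                counts.append(1)
            else:
                counts[k] += 1
            labels[i] = k
        return labels

    rng = np.random.default_rng(11)
    labels = sample_crp(100, alpha=1.5, rng=rng)
    print("number of clusters:", labels.max() + 1)
    print("cluster sizes:", np.bincount(labels))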

  1. An experimental application of impedance measurements by spectral analysis to electrochemistry and corrosion

    International Nuclear Information System (INIS)

    Castro, E.B.; Vilche, J.R.; Milocco, R.H.

    1984-01-01

    An impedance measurement system based on the spectral analysis of excitation and response signals was implemented using a pseudo-random binary sequence in the generation of the electrical perturbation signal. The spectral density functions were estimated through finite Fourier transforms of the original time history records by fast computation of Fourier series. Experimental results obtained using the FFT algorithm in the developed impedance measurement system, which covers a wide frequency range, 10 kHz >= f >= 1 mHz, are given both for dummy cells representing conventional electric circuits in electrochemistry and corrosion systems and for Fe/acidic chloride solution interfaces under different polarization conditions. (C.L.B.) [pt

  2. Testing a parametric function against a nonparametric alternative in IV and GMM settings

    DEFF Research Database (Denmark)

    Gørgens, Tue; Wurtz, Allan

    This paper develops a specification test for functional form for models identified by moment restrictions, including IV and GMM settings. The general framework is one where the moment restrictions are specified as functions of data, a finite-dimensional parameter vector, and a nonparametric real ...

  3. Proposing a framework for airline service quality evaluation using Type-2 Fuzzy TOPSIS and non-parametric analysis

    Directory of Open Access Journals (Sweden)

    Navid Haghighat

    2017-12-01

    Full Text Available This paper focuses on evaluating airline service quality from the perspective of passengers' views. A great deal of research on airline service quality evaluation has been performed worldwide, but little has yet been conducted in Iran. In this study, a framework for measuring airline service quality in Iran is proposed. After reviewing airline service quality criteria, the SSQAI model was selected because of its comprehensiveness in covering airline service quality dimensions. The SSQAI questionnaire items were redesigned to fit the requirements of Iranian airlines and the environmental circumstances of Iran's economic and cultural context. This study employs fuzzy decision-making theory to account for the possibly fuzzy subjective judgments of the evaluators during airline service quality evaluation. Fuzzy TOPSIS has been applied to rank the airlines' service quality performance. Three major Iranian airlines, which have the largest passenger volumes on domestic and international flights, were chosen for evaluation in this research. The results demonstrated that, among the three major Iranian airlines, Mahan Air achieved the best service quality rank in satisfying its passengers through the delivery of high-quality services. IranAir and Aseman ranked second and third, respectively, according to the passengers' evaluations. Statistical analysis was used to analyse the passenger responses; because the data were not normally distributed, non-parametric tests were applied. To compare the airlines' ranks on each criterion separately, the Friedman test was performed. Analysis of variance and the Tukey test were applied to study the influence of passengers' age and educational level on their degree of satisfaction with the airlines' service quality. The results showed that age has no significant relation to passenger satisfaction with airlines; however, increasing educational level demonstrated a negative impact on

  4. Spectral analysis methods for vehicle interior vibro-acoustics identification

    Science.gov (United States)

    Hosseini Fouladi, Mohammad; Nor, Mohd. Jailani Mohd.; Ariffin, Ahmad Kamal

    2009-02-01

    Noise has various effects on human comfort, performance and health. Sounds are analysed by the human brain based on their frequencies and amplitudes. In a dynamic system, the transmission of sound and vibration depends on the frequency and direction of the input motion and on the characteristics of the output. Automotive manufacturers therefore invest considerable effort and money in improving and enhancing the vibro-acoustic performance of their products. The enhancement effort may be very difficult and time-consuming if one relies only on 'trial and error' methods without prior knowledge about the sources themselves. Complex noise inside a vehicle cabin originates from various sources and travels through many pathways. The first stage of sound quality refinement is to find the sources; it is vital for automotive engineers to identify the dominant noise sources, such as engine noise, exhaust noise and noise due to vibration transmission inside the vehicle. The purpose of this paper is to find the vibro-acoustic sources of noise in a passenger vehicle compartment. The spectral analysis method is much faster than 'trial and error' methods, in which parts must be separated to measure the transfer functions. In addition, with the spectral analysis method, signals can be recorded under real operational conditions, which leads to more consistent results. A multi-channel analyser is utilised to measure and record the vibro-acoustic signals. Computational algorithms are also employed to identify the contribution of the various sources to the measured interior signal. These achievements can be utilised to detect, control and optimise the interior noise performance of road transport vehicles.

  5. A simple non-parametric goodness-of-fit test for elliptical copulas

    Directory of Open Access Journals (Sweden)

    Jaser Miriam

    2017-12-01

    Full Text Available In this paper, we propose a simple non-parametric goodness-of-fit test for elliptical copulas of any dimension. It is based on the equality of Kendall’s tau and Blomqvist’s beta for all bivariate margins. Nominal level and power of the proposed test are investigated in a Monte Carlo study. An empirical application illustrates our goodness-of-fit test at work.
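
    The test above rests on the fact that, for elliptical copulas, Kendall's tau and Blomqvist's beta coincide on every bivariate margin (both equal (2/π)·arcsin(ρ)). The sketch below simply computes the two coefficients and their difference for each pair of columns of a simulated sample; the formal test statistic and its asymptotic variance are not implemented.

    import numpy as np
    from itertools import combinations
    from scipy.stats import kendalltau

    def blomqvist_beta(u, v):
        """Medial correlation: concordance of signs relative to the sample medians."""
        s = np.sign((u - np.median(u)) * (v - np.median(v)))
        return s[s != 0].mean()

    # Simulated trivariate Gaussian sample (a member of the elliptical family).
    rng = np.random.default_rng(12)
    cov = np.array([[1.0, 0.5, 0.3],
                    [0.5, 1.0, 0.4],
                    [0.3, 0.4, 1.0]])
    X = rng.multivariate_normal(np.zeros(3), cov, size=2000)

    for i, j in combinations(range(X.shape[1]), 2):
        tau, _ = kendalltau(X[:, i], X[:, j])
        beta = blomqvist_beta(X[:, i], X[:, j])
        print(f"margin ({i},{j}): tau = {tau:.3f}, beta = {beta:.3f}, diff = {tau - beta:+.3f}")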

  6. A distributed microcomputer-controlled system for data acquisition and power spectral analysis of EEG.

    Science.gov (United States)

    Vo, T D; Dwyer, G; Szeto, H H

    1986-04-01

    A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved with the use of recently available large-scale integrated circuit technology with enhanced functionality (INTEL 8087 math co-processors), which can perform transcendental functions rapidly. The versatility of the system is achieved with a hardware organization that has distributed data acquisition capability, performed by a microprocessor-based analog-to-digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly language subroutines perform, on-line or off-line, the fast Fourier transform and spectral analysis of the EEG, which is stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.

  7. Spectral analysis of point-vortex dynamics: first application to vortex polygons in a circular domain

    International Nuclear Information System (INIS)

    Speetjens, M F M; Meleshko, V V; Van Heijst, G J F

    2014-01-01

    The present study addresses the classical problem of the dynamics and stability of a cluster of N point vortices of equal strength arranged in a polygonal configuration ('N-vortex polygons'). In unbounded domains, such N-vortex polygons are unconditionally stable for N⩽7. Confinement in a circular domain tightens the stability conditions to N⩽6 and a maximum polygon size relative to the domain radius. This work expands on existing studies on stability and integrability by first giving an exploratory spectral analysis of the dynamics of N-vortex polygons in circular domains. Key to this is that the spectral signature of the time evolution of vortex positions reflects their qualitative behaviour. Expressing vortex motion by a generic evolution operator (the so-called Koopman operator) provides a rigorous framework for such spectral analyses. This paves the way to further differentiation and classification of point-vortex behaviour beyond stability and integrability. The concept of Koopman-based spectral analysis is demonstrated for N-vortex polygons. This reveals that conditional stability can be seen as a local form of integrability and confirms an important generic link between spectrum and dynamics: discrete spectra imply regular (quasi-periodic) motion; continuous (sub-)spectra imply chaotic motion. Moreover, this exposes rich nonlinear dynamics such as intermittency between regular and chaotic motion and quasi-coherent structures formed by chaotic vortices. (ss 1)

  8. Analysis of neutron reflectivity data: maximum entropy, Bayesian spectral analysis and speckle holography

    International Nuclear Information System (INIS)

    Sivia, D.S.; Hamilton, W.A.; Smith, G.S.

    1991-01-01

    The analysis of neutron reflectivity data to obtain nuclear scattering length density profiles is akin to the notorious phaseless Fourier problem, well known in many fields such as crystallography. Current methods of analysis culminate in the refinement of a few parameters of a functional model, and are often preceded by a long and laborious process of trial and error. We start by discussing the use of maximum entropy for obtaining 'free-form' solutions of the density profile, as an alternative to the trial and error phase when a functional model is not available. Next we consider a Bayesian spectral analysis approach, which is appropriate for optimising the parameters of a simple (but adequate) type of model when the number of parameters is not known. Finally, we suggest a novel experimental procedure, the analogue of astronomical speckle holography, designed to alleviate the ambiguity problems inherent in traditional reflectivity measurements. (orig.)

  9. Spectral properties of generalized eigenparameter dependent ...

    African Journals Online (AJOL)

    Jost function, spectrum, the spectral singularities, and the properties of the principal vectors corresponding to the spectral singularities of L, if ∑_{n=1}^{∞} n(|1 − a_n| + |b_n|) < ∞. Mathematics Subject Classification (2010): 34L05, 34L40, 39A70, 47A10, 47A75. Key words: Discrete equations, eigenparameter, spectral analysis, ...

  10. Bayesian Nonparametric Mixture Estimation for Time-Indexed Functional Data in R

    Directory of Open Access Journals (Sweden)

    Terrance D. Savitsky

    2016-08-01

    Full Text Available We present growfunctions for R that offers Bayesian nonparametric estimation models for analysis of dependent, noisy time series data indexed by a collection of domains. This data structure arises from combining periodically published government survey statistics, such as are reported in the Current Population Study (CPS). The CPS publishes monthly, by-state estimates of employment levels, where each state expresses a noisy time series. Published state-level estimates from the CPS are composed from household survey responses in a model-free manner and express high levels of volatility due to insufficient sample sizes. Existing software solutions borrow information over a modeled time-based dependence to extract a de-noised time series for each domain. These solutions, however, ignore the dependence among the domains that may be additionally leveraged to improve estimation efficiency. The growfunctions package offers two fully nonparametric mixture models that simultaneously estimate both a time and domain-indexed dependence structure for a collection of time series: (1) A Gaussian process (GP) construction, which is parameterized through the covariance matrix, estimates a latent function for each domain. The covariance parameters of the latent functions are indexed by domain under a Dirichlet process prior that permits estimation of the dependence among functions across the domains; (2) An intrinsic Gaussian Markov random field prior construction provides an alternative to the GP that expresses different computation and estimation properties. In addition to performing denoised estimation of latent functions from published domain estimates, growfunctions allows estimation of collections of functions for observation units (e.g., households), rather than aggregated domains, by accounting for an informative sampling design under which the probabilities for inclusion of observation units are related to the response variable. growfunctions includes plot

  11. ANALYSIS OF IN-SITU SPECTRAL REFLECTANCE OF SAGO AND OTHER PALMS: IMPLICATIONS FOR THEIR DETECTION IN OPTICAL SATELLITE IMAGES

    Directory of Open Access Journals (Sweden)

    J. R. Santillan

    2018-04-01

    Full Text Available We present a characterization, comparison and analysis of in-situ spectral reflectance of Sago and other palms (coconut, oil palm and nipa) to ascertain on which part of the electromagnetic spectrum these palms are distinguishable from each other. The analysis also aims to reveal information that will assist in selecting which band to use when mapping Sago palms using the images acquired by these sensors. The datasets used in the analysis consisted of averaged spectral reflectance curves of each palm species measured within the 345–1045 nm wavelength range using an Ocean Optics USB4000-VIS-NIR Miniature Fiber Optic Spectrometer. This in-situ reflectance data was also resampled to match the spectral response of the 4 bands of ALOS AVNIR-2, 3 bands of ASTER VNIR, 4 bands of Landsat 7 ETM+, 5 bands of Landsat 8, and 8 bands of Worldview-2 (WV2). Examination of the spectral reflectance curves showed that the near infra-red region, specifically at 770, 800 and 875 nm, provides the best wavelengths where Sago palms can be distinguished from other palms. The resampling of the in-situ reflectance spectra to match the spectral response of optical sensors made possible the analysis of the differences in reflectance values of Sago and other palms in different bands of the sensors. Overall, the knowledge learned from the analysis can be useful in the actual analysis of optical satellite images, specifically in determining which band to include or to exclude, or whether to use all bands of a sensor in discriminating and mapping Sago palms.
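
    A hedged sketch of the band-resampling step described above: an in-situ reflectance spectrum is averaged with a sensor band's spectral response function. The Gaussian responses and band centres below are placeholders, not the actual ALOS AVNIR-2, ASTER, Landsat or WorldView-2 response functions.

    ```python
    import numpy as np

    wavelengths = np.arange(345, 1046, 1.0)    # nm, matching the field-spectrometer range
    # Toy reflectance curve with a red edge, standing in for a measured palm spectrum.
    reflectance = 0.05 + 0.45 / (1.0 + np.exp(-(wavelengths - 710.0) / 15.0))

    def band_average(wl, refl, center, fwhm):
        """Average the spectrum weighted by a Gaussian band response."""
        sigma = fwhm / 2.3548
        response = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
        return np.trapz(refl * response, wl) / np.trapz(response, wl)

    # Illustrative 4-band VNIR sensor: (centre wavelength, FWHM) in nm.
    bands = {"blue": (480, 60), "green": (560, 60), "red": (660, 60), "nir": (830, 100)}
    for name, (center, fwhm) in bands.items():
        print(f"{name:5s}: {band_average(wavelengths, reflectance, center, fwhm):.3f}")
    ```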

  12. Nonparametric inference of network structure and dynamics

    Science.gov (United States)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and how to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high-dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion, that does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among

  13. Numerical Solution of Nonlinear Fredholm Integro-Differential Equations Using Spectral Homotopy Analysis Method

    Directory of Open Access Journals (Sweden)

    Z. Pashazadeh Atabakan

    2013-01-01

    Full Text Available Spectral homotopy analysis method (SHAM), as a modification of the homotopy analysis method (HAM), is applied to obtain solutions of high-order nonlinear Fredholm integro-differential problems. The existence and uniqueness of the solution and the convergence of the proposed method are proved. Some examples are given to demonstrate the efficiency and accuracy of the proposed method. The SHAM results show that the proposed approach is quite reasonable when compared to the homotopy analysis method, Lagrange interpolation solutions, and exact solutions.

  14. Spectral analysis of turbulence propagation mechanisms in solar wind and tokamaks plasmas

    International Nuclear Information System (INIS)

    Dong, Yue

    2014-01-01

    This thesis contributes to the study of spectral transfers in the turbulence of magnetized plasmas. We are interested in turbulence in the solar wind and in tokamaks. Spacecraft measurements, first-principles simulations and simple dynamical systems are used to understand the mechanisms behind spectral anisotropy and spectral transfers in these plasmas. The first part of this manuscript introduces the common context of the solar wind and tokamaks, what is specific to each of them, and some notions needed to understand the work presented here. The second part deals with turbulence in the solar wind. We first present an observational study of the spectral variability of solar wind turbulence. Starting from the study of Grappin et al. (1990, 1991) on Helios mission data, we provide a new analysis taking into account a correct evaluation of the large-scale spectral break, provided by the higher-frequency data of the Wind mission. This considerably modifies the result on the spectral index distribution of the magnetic and kinetic energy. A second observational study is presented on solar wind turbulence anisotropy using autocorrelation functions. Following the work of Matthaeus et al. (1990) and Dasso et al. (2005), we bring new insight into this statistic, in particular the question of the normalisation choices used to build the autocorrelation function and their consequence on the measured anisotropy. This allows us to bring a new element into the debate on the measured anisotropy depending on the choice of reference frame, based either on the local or on the global mean magnetic field. Finally, we study for the first time in 3D the effects of the transverse expansion of the solar wind on its turbulence. This work is based on a theoretical and numerical scheme developed by Grappin et al. (1993) and Grappin and Velli (1996), but never used in 3D. Our main results deal with the evolution of spectral and polarization anisotropy due to the competition between non-linear and linear (Alfven coupling

  15. The browning value changes and spectral analysis on the Maillard reaction product from glucose and methionine model system

    Science.gov (United States)

    Al-Baarri, A. N.; Legowo, A. M.; Widayat

    2018-01-01

    D-glucose is known to have various effects on the reactivity of the Maillard reaction, resulting in changes in the physical appearance of food products. This research was therefore carried out to analyse the physical appearance of a Maillard reaction product made of D-glucose and methionine as a model system. The changes in the browning value and the spectral profile of the model system were determined. The glucose-methionine model system was produced through a heating treatment at 50°C and RH 70% for 24 hours. The data were collected every three hours using a spectrophotometer. As a result, the browning value increased with heating time and was remarkably high compared to D-glucose alone. Furthermore, the spectral analysis showed that methionine changed the pattern of peak appearance. In conclusion, methionine raised the browning value and changed the spectral pattern in the Maillard reaction model system.

  16. Estimation of compound distribution in spectral images of tomatoes using independent component analysis

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.

    2003-01-01

    Independent Component Analysis (ICA) is one of the most widely used methods for blind source separation. In this paper we use this technique to estimate the important compounds which play a role in the ripening of tomatoes. Spectral images of tomatoes were analyzed. Two main independent components

  17. Spectral analysis of noisy nonlinear maps

    International Nuclear Information System (INIS)

    Hirshman, S.P.; Whitson, J.C.

    1982-01-01

    A path integral equation formalism is developed to obtain the frequency spectrum of nonlinear mappings exhibiting chaotic behavior. The one-dimensional map, x_{n+1} = f(x_n), where f is nonlinear and n is a discrete time variable, is analyzed in detail. This map is introduced as a paradigm of systems whose exact behavior is exceedingly complex, and therefore irretrievable, but which nevertheless possess smooth, well-behaved solutions in the presence of small sources of external noise. A Boltzmann integral equation is derived for the probability distribution function p(x,n). This equation is linear and is therefore amenable to spectral analysis. The nonlinear dynamics in f(x) appear as transition probability matrix elements, and the presence of noise appears simply as an overall multiplicative scattering amplitude. This formalism is used to investigate the band structure of the logistic equation and to analyze the effects of external noise on both the invariant measure and the frequency spectrum of x_n for several values of λ ∈ [0,1].
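
    As a hedged illustration (not the paper's path-integral formalism), the sketch below iterates a noisy logistic map x_{n+1} = 4λx_n(1 − x_n) + noise and estimates the frequency spectrum of x_n with a periodogram; parameter values and the noise level are arbitrary.

    ```python
    import numpy as np
    from scipy.signal import periodogram

    def noisy_logistic(lam, n_steps=4096, noise=1e-3, x0=0.3, seed=0):
        """Iterate x_{n+1} = 4*lam*x_n*(1-x_n) + small Gaussian noise, clipped to [0, 1]."""
        rng = np.random.default_rng(seed)
        x = np.empty(n_steps)
        x[0] = x0
        for n in range(n_steps - 1):
            x[n + 1] = np.clip(4.0 * lam * x[n] * (1.0 - x[n]) + noise * rng.standard_normal(), 0.0, 1.0)
        return x

    for lam in (0.80, 0.95):                   # roughly periodic vs chaotic regimes
        x = noisy_logistic(lam)
        f, pxx = periodogram(x - x.mean())     # frequency in cycles per iteration
        print(f"lambda={lam}: dominant frequency = {f[np.argmax(pxx)]:.4f} cycles/iteration")
    ```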

  18. Enzyme activity measurement via spectral evolution profiling and PARAFAC

    DEFF Research Database (Denmark)

    Baum, Andreas; Meyer, Anne S.; Garcia, Javier Lopez

    2013-01-01

    The recent advances in multi-way analysis provide new solutions to traditional enzyme activity assessment. In the present study enzyme activity has been determined by monitoring spectral changes of substrates and products in real time. The method relies on measurement of distinct spectral fingerprints of the reaction mixture at specific time points during the course of the whole enzyme catalyzed reaction and employs multi-way analysis to detect the spectral changes. The methodology is demonstrated by spectral evolution profiling of Fourier Transform Infrared (FTIR) spectral fingerprints using...

  19. Spectral Efficiency Analysis for Multicarrier Based 4G Systems

    DEFF Research Database (Denmark)

    Silva, Nuno; Rahman, Muhammad Imadur; Frederiksen, Flemming Bjerge

    2006-01-01

    In this paper, a spectral efficiency definition is proposed. Spectral efficiency for multicarrier based multiaccess techniques, such as OFDMA, MC-CDMA and OFDMA-CDM, is analyzed. Simulations for different indoor and outdoor scenarios are carried out. Based on the simulations, we have discussed ho...

  20. The Support Reduction Algorithm for Computing Non-Parametric Function Estimates in Mixture Models

    OpenAIRE

    GROENEBOOM, PIET; JONGBLOED, GEURT; WELLNER, JON A.

    2008-01-01

    In this paper, we study an algorithm (which we call the support reduction algorithm) that can be used to compute non-parametric M-estimators in mixture models. The algorithm is compared with natural competitors in the context of convex regression and the ‘Aspect problem’ in quantum physics.

  1. Nonparametric Change Point Diagnosis Method of Concrete Dam Crack Behavior Abnormality

    OpenAIRE

    Li, Zhanchao; Gu, Chongshi; Wu, Zhongru

    2013-01-01

    The diagnosis of concrete dam crack behavior abnormality has always been a hot topic and a difficulty in the safety monitoring of hydraulic structures. Based on the performance of concrete dam crack behavior abnormality in parametric and nonparametric statistical models, the internal relation between concrete dam crack behavior abnormality and statistical change point theory is deeply analyzed from the model structure instability of the parametric statistical model ...

  2. Hyperthyroidism is characterized by both increased sympathetic and decreased vagal modulation of heart rate: evidence from spectral analysis of heart rate variability.

    Science.gov (United States)

    Chen, Jin-Long; Chiu, Hung-Wen; Tseng, Yin-Jiun; Chu, Woei-Chyn

    2006-06-01

    The clinical manifestations of hyperthyroidism resemble those of the hyperadrenergic state. This study was designed to evaluate the impact of hyperthyroidism on the autonomic nervous system (ANS) and to investigate the relationship between serum thyroid hormone concentrations and parameters of spectral heart rate variability (HRV) analysis in hyperthyroidism. Thirty-two hyperthyroid Graves' disease patients (mean age 31 years) and 32 sex-, age-, and body mass index (BMI)-matched normal control subjects were recruited to receive one-channel electrocardiogram (ECG) recording. The cardiac autonomic nervous function was evaluated by the spectral analysis of HRV, which indicates the autonomic modulation of the sinus node. The correlation coefficients between serum thyroid hormone concentrations and parameters of the spectral HRV analysis were also computed. The hyperthyroid patients showed significant differences in the spectral HRV parameters compared with the controls. After treatment of hyperthyroidism in 28 patients, all of the above parameters were restored to levels comparable to those of the controls. In addition, serum thyroid hormone concentrations showed significant correlations with spectral HRV parameters. Hyperthyroidism is a state of sympathovagal imbalance, characterized by both increased sympathetic and decreased vagal modulation of the heart rate. These autonomic dysfunctions can be detected simultaneously by spectral analysis of HRV, and the spectral HRV parameters could reflect the disease severity in hyperthyroid patients.
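
    A hedged sketch of the standard frequency-domain HRV computation referred to above: resample the RR tachogram onto an even time grid, estimate the PSD with Welch's method, and integrate the LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands. The RR series is simulated, and the band definitions are the conventional ones, not necessarily those used in the study.

    ```python
    import numpy as np
    from scipy.interpolate import interp1d
    from scipy.signal import welch

    rng = np.random.default_rng(3)
    rr = 0.8 + 0.05 * rng.standard_normal(300)      # RR intervals in seconds (synthetic)
    t_beats = np.cumsum(rr)                         # beat occurrence times

    fs = 4.0                                        # resampling rate in Hz
    t_even = np.arange(t_beats[0], t_beats[-1], 1 / fs)
    rr_even = interp1d(t_beats, rr, kind="cubic")(t_even)

    f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    lf_mask = (f >= 0.04) & (f < 0.15)
    hf_mask = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(psd[lf_mask], f[lf_mask])
    hf = np.trapz(psd[hf_mask], f[hf_mask])
    print(f"LF = {lf:.2e} s^2, HF = {hf:.2e} s^2, LF/HF = {lf / hf:.2f}")   # LF/HF gauges sympathovagal balance
    ```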

  3. Spectral analysis of the geomagnetic activity index Ap during different IMF conditions (1947-1978)

    International Nuclear Information System (INIS)

    Francia, P.; Villante, U.

    1986-01-01

    The spectral analysis of the geomagnetic activity index Ap (1947-1978) has been conducted for intervals associated respectively with two and four sectors of the interplanetary magnetic field per solar rotation. A recurrent 2-sector structure is typically associated with an emerging spectral peak close to T_s (T_s being the period of solar rotation as seen from Earth), while the T_s/2 modulation becomes more important during intervals corresponding to four sectors per solar rotation. The recurrence tendency of two high-velocity streams per solar rotation seems to reinforce the relative importance of the T_s/2 modulation

  4. A Simple Spectral Observer

    Directory of Open Access Journals (Sweden)

    Lizeth Torres

    2018-05-01

    Full Text Available The principal aim of a spectral observer is twofold: the reconstruction of a time signal via state estimation and the decomposition of such a signal into the frequencies that make it up. A spectral observer can be catalogued as an online algorithm for time-frequency analysis because it is a method that can compute the Fourier transform (FT) of a signal on the fly, without having the entire signal available from the start. In this regard, this paper presents a novel spectral observer with an adjustable constant gain for reconstructing a given signal by means of the recursive identification of the coefficients of a Fourier series. The reconstruction or estimation of a signal in the context of this work means finding the coefficients of a linear combination of sines and cosines that fits the signal such that it can be reproduced. The design procedure of the spectral observer is presented along with the following applications: (1) the reconstruction of a simple periodic signal, (2) the approximation of both a square and a triangular signal, (3) edge detection in signals by using the Fourier coefficients, (4) the fitting of historical Bitcoin market data from 1 December 2014 to 8 January 2018, and (5) the estimation of an input force acting upon a Duffing oscillator. To round out this paper, we present a detailed discussion of the results of the applications as well as a comparative analysis of the proposed spectral observer vis-à-vis the Short Time Fourier Transform (STFT), which is a well-known method for time-frequency analysis.
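
    A hedged sketch in the spirit of the observer described above (not the authors' design): a constant-gain recursive estimator of truncated Fourier-series coefficients that reconstructs the signal and exposes its harmonic amplitudes on the fly. The fundamental frequency, gain and test signal are assumptions.

    ```python
    import numpy as np

    f0 = 2.0                                   # fundamental frequency in Hz (assumed known)
    harmonics = np.arange(1, 6)                # estimate the first 5 harmonics
    fs = 1000.0
    t = np.arange(0, 4, 1 / fs)
    signal = np.sin(2 * np.pi * f0 * t) + 0.4 * np.cos(2 * np.pi * 3 * f0 * t)

    theta = np.zeros(2 * harmonics.size)       # coefficient estimates [a_1..a_K, b_1..b_K]
    gain = 0.05                                # constant adaptation gain

    for ti, y in zip(t, signal):
        # Regressor of cosines and sines evaluated at the current time.
        phi = np.concatenate([np.cos(2 * np.pi * harmonics * f0 * ti),
                              np.sin(2 * np.pi * harmonics * f0 * ti)])
        error = y - phi @ theta                # reconstruction error at this sample
        theta += gain * error * phi / (phi @ phi)   # normalized gradient update

    amplitudes = np.hypot(theta[:harmonics.size], theta[harmonics.size:])
    for k, a in zip(harmonics, amplitudes):
        print(f"harmonic {k} ({k * f0:.0f} Hz): estimated amplitude ~ {a:.2f}")
    ```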

  5. A non-parametric framework for estimating threshold limit values

    Directory of Open Access Journals (Sweden)

    Ulm Kurt

    2005-11-01

    Full Text Available Abstract Background To estimate a threshold limit value for a compound known to have harmful health effects, an 'elbow' threshold model is usually applied. We are interested in flexible non-parametric alternatives. Methods We describe how a step function model fitted by isotonic regression can be used to estimate threshold limit values. This method returns a set of candidate locations, and we discuss two algorithms to select the threshold among them: the reduced isotonic regression and an algorithm considering the closed family of hypotheses. We assess the performance of these two alternative approaches under different scenarios in a simulation study. We illustrate the framework by analysing data from a study conducted by the German Research Foundation aiming to set a threshold limit value for exposure to total dust at the workplace, as a causal agent for developing chronic bronchitis. Results In the paper we demonstrate the use and the properties of the proposed methodology along with the results from an application. The method appears to detect the threshold with satisfactory success. However, its performance can be compromised by the low power to reject the constant risk assumption when the true dose-response relationship is weak. Conclusion The estimation of thresholds based on the isotonic framework is conceptually simple and sufficiently powerful. Given that there is no gold standard method in the threshold value estimation context, the proposed model provides a useful non-parametric alternative to the standard approaches and can corroborate or challenge their findings.
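
    A hedged sketch of the isotonic-regression idea (not the authors' reduced isotonic regression or closed-testing algorithms): fit a monotone step function to binary outcomes and take the locations where the fitted risk rises as candidate thresholds. The dose-response data and variable names are simulated placeholders.

    ```python
    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    rng = np.random.default_rng(4)
    dose = np.sort(rng.uniform(0, 10, 400))                 # e.g. cumulative dust exposure
    true_risk = 0.10 + 0.25 * (dose > 6)                    # unknown threshold at 6
    outcome = rng.binomial(1, true_risk)                    # 0/1 disease indicator

    iso = IsotonicRegression(increasing=True)
    fitted = iso.fit_transform(dose, outcome)               # monotone step-function fit

    jumps = np.where(np.diff(fitted) > 0)[0]                # indices where the fitted risk rises
    candidates = dose[jumps + 1]                            # candidate threshold locations
    print("candidate thresholds:", np.round(candidates, 2))
    if candidates.size:
        print("first rise above baseline at dose ~", round(float(candidates[0]), 2))
    ```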

  6. Spectral analysis of coolant activity from a commercial nuclear generating station

    International Nuclear Information System (INIS)

    Swann, J.D.; Lewis, B.J.; Ip, M.

    2008-01-01

    In support of the development of a real-time on-line fuel failure monitoring system for the CANDU reactor, actual gamma spectroscopy data files from the gaseous fission product (GFP) monitoring system were acquired from almost four years of operation at a commercial Nuclear Generating Station (NGS). Several spectral analysis techniques were used to process the data files. Radioisotopic activity from the plant information (PI) system was compared to an in-house C++ code that was used to determine the photopeak area and to a separate analysis with commercial software from Canberra-Aptec. These various techniques provided for a calculation of the coolant activity concentration of the noble gas and iodine species in the primary heat transport system. These data were then used to benchmark the Visual DETECT code, a user friendly software tool which can be used to characterize the defective fuel state based on a coolant activity analysis. Acceptable agreement was found with the spectral techniques when compared to the known defective bundle history at the commercial reactor. A more generalized method of assessing the fission product release data was also considered with the development of a pre-processor to evaluate the radioisotopic release rate from mass balance considerations. The release rate provided a more efficient means to characterize the occurrence of a defect and was consistent with the actual defect situation at the power plant as determined from in-bay examination of discharged fuel bundles. (author)

  7. COMBINED ANALYSIS OF IMAGES AND SPECTRAL ENERGY DISTRIBUTIONS OF TAURUS PROTOSTARS

    International Nuclear Information System (INIS)

    Gramajo, Luciana V.; Gomez, Mercedes; Whitney, Barbara A.; Robitaille, Thomas P.

    2010-01-01

    We present an analysis of spectral energy distributions (SEDs), near- and mid-infrared images, and Spitzer spectra of eight embedded Class I/II objects in the Taurus-Auriga molecular cloud. The initial model for each source was chosen using the grid of young stellar objects (YSOs) and SED fitting tool of Robitaille et al. Then the models were refined using the radiative transfer code of Whitney et al. to fit both the spectra and the infrared images of these objects. In general, our models agree with previous published analyses. However, our combined models should provide more reliable determinations of the physical and geometrical parameters since they are derived from SEDs, including the Spitzer spectra, covering the complete spectral range; and high-resolution near-infrared and Spitzer IRAC images. The combination of SED and image modeling better constrains the different components (central source, disk, envelope) of the YSOs. Our derived luminosities are higher, on average, than previous estimates because we account for the viewing angles (usually nearly edge-on) of most of the sources. Our analysis suggests that the standard rotating collapsing protostar model with disks and bipolar cavities works well for the analyzed sample of objects in the Taurus molecular cloud.

  8. ANALYSIS OF CAMOUFLAGE COVER SPECTRAL CHARACTERISTICS BY IMAGING SPECTROMETER

    Directory of Open Access Journals (Sweden)

    A. Y. Kouznetsov

    2016-03-01

    Full Text Available Subject of Research. The paper deals with the problems of detection and identification of objects in hyperspectral imagery. The possibility of object type determination by statistical methods is demonstrated. The possibility of applying a spectral image for identification of its data type is considered. Method. The research was done by means of videospectral equipment for object detection on the "Fregat" substrate. The postprocessing of hyperspectral information was done with the use of a mathematical model of a pattern recognition system. The vegetation indices TCHVI (Three-Channel Vegetation Index) and NDVI (Normalized Difference Vegetation Index) were applied for quality control of object recognition. The Neyman-Pearson criterion was offered as a tool for determination of object differences. Main Results. We have carried out an analysis of the spectral characteristics of a summer-type camouflage cover (Germany). We have calculated the density distributions of the vegetation indices. We have obtained the statistical characteristics needed for creation of a mathematical model for a pattern recognition system. We have shown the applicability of vegetation indices for detection of summer camouflage cover against a verdure background. We have presented a mathematical model of object recognition based on the Neyman-Pearson criterion. Practical Relevance. The results may be useful for specialists in the field of hyperspectral data processing for surface state monitoring.

  9. Spectral theory of linear operators and spectral systems in Banach algebras

    CERN Document Server

    Müller, Vladimir

    2003-01-01

    This book is dedicated to the spectral theory of linear operators on Banach spaces and of elements in Banach algebras. It presents a survey of results concerning various types of spectra, both of single and n-tuples of elements. Typical examples are the one-sided spectra, the approximate point, essential, local and Taylor spectrum, and their variants. The theory is presented in a unified, axiomatic and elementary way. Many results appear here for the first time in a monograph. The material is self-contained. Only a basic knowledge of functional analysis, topology, and complex analysis is assumed. The monograph should appeal both to students who would like to learn about spectral theory and to experts in the field. It can also serve as a reference book. The present second edition contains a number of new results, in particular, concerning orbits and their relations to the invariant subspace problem.

  10. [Experimental Methods and Result Analysis of a Variety of Spectral Reflectance Properties of the Thin Oil Film].

    Science.gov (United States)

    Ye, Zhou; Liu, Li; Wei, Chuan-xin; Gu, Qun; An, Ping-ao; Zhao, Yue-jiao; Yin, Da-yi

    2015-06-01

    In order to analyse the oil spill situation based on data obtained during airborne aerial work, it is necessary to obtain, as support, the spectral reflectance characteristics of oil films of different oils and thicknesses and to select the appropriate operating band. An experiment was set up to measure the reflectance spectra from the ultraviolet to the near-infrared for films of five target samples, namely petrol, diesel, lubricating oil, kerosene and crude oil, using a spectral measurement device. The result is compared with the reflectance spectra of water in the same experimental environment, which shows that the spectral reflection characteristics of an oil film are related to the thickness and the type of the film. For the same thickness, the spectral reflectance curves of different types of film are very different, and for the same type of film, the spectral reflectance curve changes with film thickness; therefore, for a single film, different film thicknesses can be distinguished by the reflectance curves. It also shows that, for the same film thickness, the reflectance of diesel, kerosene and lubricants reaches a peak around the 380 nm wavelength, clearly different from the reflectance of water, and that the reflectance of crude oil is far less than that of water at wavelengths above 340 nm; the obtained reflection spectra can be used to distinguish between different types of oil film to some extent. The experiment covers the main types of spilled oil, with data comprehensively covering the commonly used detection spectral bands, and provides a quantitative description of the spectral reflectance properties of the films. It provides comprehensive theoretical and data support for the selection of the airborne oil spill detection working band and for the detection and analysis of water-surface oil spills.

  11. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.

  12. Solid state linear dichroic infrared spectral analysis of benzimidazoles and their N 1-protonated salts

    Science.gov (United States)

    Ivanova, B. B.

    2005-11-01

    A stereo-structural characterization of 2,5,6-trimethylbenzimidazole (MBIZ) and 2-aminobenzimidazole (2-NH2-BI) and their N1-protonated salts was carried out using polarized solid-state linear dichroic infrared spectral (IR-LD) analysis in nematic liquid crystal suspension. All experimentally predicted structures were compared with the theoretical ones obtained by ab initio calculations. The Cs to C2v* symmetry transformation resulting from the protonation processes was described, with a view to its reflection in the infrared spectral characteristics.

  13. Comparative Analysis of Mass Spectral Similarity Measures on Peak Alignment for Comprehensive Two-Dimensional Gas Chromatography Mass Spectrometry

    Science.gov (United States)

    2013-01-01

    Peak alignment is a critical procedure in mass spectrometry-based biomarker discovery in metabolomics. One of the peak alignment approaches for comprehensive two-dimensional gas chromatography mass spectrometry (GC×GC-MS) data is peak matching-based alignment. A key to peak matching-based alignment is the calculation of mass spectral similarity scores. Various mass spectral similarity measures have been developed, mainly for compound identification, but the effect of these spectral similarity measures on the performance of peak matching-based alignment has remained unknown. Therefore, we selected five mass spectral similarity measures, cosine correlation, Pearson's correlation, Spearman's correlation, partial correlation, and part correlation, and examined their effects on peak alignment using two sets of experimental GC×GC-MS data. The results show that the spectral similarity measure does not affect the alignment accuracy significantly in the analysis of data from less complex samples, while the partial correlation performs much better than other spectral similarity measures when analyzing experimental data acquired from complex biological samples. PMID:24151524
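
    A hedged sketch of three of the compared similarity scores (cosine, Pearson and Spearman correlation) between two aligned mass spectra represented as intensity vectors on a common m/z grid; the spectra below are synthetic, and the partial and part correlations are omitted.

    ```python
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    rng = np.random.default_rng(5)
    spectrum_a = rng.gamma(2.0, 1.0, 200)                                         # intensities on a shared m/z grid
    spectrum_b = np.clip(spectrum_a + 0.3 * rng.standard_normal(200), 0, None)    # noisy replicate of the same compound

    def cosine_similarity(x, y):
        return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

    print("cosine  :", round(cosine_similarity(spectrum_a, spectrum_b), 3))
    print("pearson :", round(pearsonr(spectrum_a, spectrum_b)[0], 3))
    print("spearman:", round(spearmanr(spectrum_a, spectrum_b)[0], 3))
    ```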

  14. Spectral analysis of the stick-slip phenomenon in "oral" tribological texture evaluation.

    Science.gov (United States)

    Sanahuja, Solange; Upadhyay, Rutuja; Briesen, Heiko; Chen, Jianshe

    2017-08-01

    "Oral" tribology has become a new paradigm in food texture studies to understand complex texture attributes, such as creaminess, oiliness, and astringency, which could not be successfully characterized by traditional texture analysis nor by rheology. Stick-slip effects resulting from intermittent sliding motion during kinetic friction of oral mucosa could constitute an additional determining factor of sensory perception where traditional friction coefficient values and their Stribeck regimes fail in predicting different lubricant (food bolus and saliva) behaviors. It was hypothesized that the observed jagged behavior of most sliding force curves are due to stick-slip effects and depend on test velocity, normal load, surface roughness as well as lubricant type. Therefore, different measurement set-ups were investigated: sliding velocities from 0.01 to 40 mm/s, loads of 0.5 and 2.5 N as well as a smooth and a textured silicone contact surface. Moreover, dry contact measurements were compared to model food systems, such as water, oil, and oil-in-water emulsions. Spectral analysis permitted to extract the distribution of stick-slip magnitudes for specific wave numbers, characterizing the occurrence of jagged force peaks per unit sliding distance, similar to frequencies per unit time. The spectral features were affected by all the above mentioned tested factors. Stick-slip created vibration frequencies in the range of those detected by oral mechanoreceptors (0.3-400 Hz). The study thus provides a new insight into the use of tribology in food psychophysics. Dynamic spectral analysis has been applied for the first time to the force-displacement curves in "oral" tribology. Analyzing the stick-slip phenomenon in the dynamic friction provides new information that is generally overlooked or confused with machine noise and which may help to understand friction-related sensory attributes. This approach allows us to differentiate samples that have similar friction coefficient

  15. Estimating the Spectral Width of a Narrowband Optical Signal

    DEFF Research Database (Denmark)

    Lading, Lars; Skov Jensen, A.

    1980-01-01

    Methods for estimating the spectral width of a narrowband optical signal are investigated. Spectral analysis and Fourier spectroscopy are compared. Optimum and close-to-optimum estimators are developed under the constraint of having only one photodetector.

  16. Measuring energy performance with sectoral heterogeneity: A non-parametric frontier approach

    International Nuclear Information System (INIS)

    Wang, H.; Ang, B.W.; Wang, Q.W.; Zhou, P.

    2017-01-01

    Evaluating economy-wide energy performance is an integral part of assessing the effectiveness of a country's energy efficiency policy. The non-parametric frontier approach has been widely used by researchers for such a purpose. This paper proposes an extended non-parametric frontier approach to studying economy-wide energy efficiency and productivity performances by accounting for sectoral heterogeneity. Relevant techniques in index number theory are incorporated to quantify the driving forces behind changes in the economy-wide energy productivity index. The proposed approach facilitates flexible modelling of different sectors' production processes, and helps to examine sectors' impact on the aggregate energy performance. A case study of China's economy-wide energy efficiency and productivity performances in its 11th five-year plan period (2006–2010) is presented. It is found that sectoral heterogeneities in terms of energy performance are significant in China. Meanwhile, China's economy-wide energy productivity increased slightly during the study period, mainly driven by the technical efficiency improvement. A number of other findings have also been reported. - Highlights: • We model economy-wide energy performance by considering sectoral heterogeneity. • The proposed approach can identify sectors' impact on the aggregate energy performance. • Obvious sectoral heterogeneities are identified in evaluating China's energy performance.

  17. Spectral analysis of linear relations and degenerate operator semigroups

    International Nuclear Information System (INIS)

    Baskakov, A G; Chernyshov, K I

    2002-01-01

    Several problems of the spectral theory of linear relations in Banach spaces are considered. Linear differential inclusions in a Banach space are studied. The construction of the phase space and solutions is carried out with the help of the spectral theory of linear relations, ergodic theorems, and degenerate operator semigroups

  18. [Review of digital ground object spectral library].

    Science.gov (United States)

    Zhou, Xiao-Hu; Zhou, Ding-Wu

    2009-06-01

    A higher spectral resolution is the main direction of development of remote sensing technology, and it is quite important to set up digital ground object reflectance spectral database libraries, one of the fundamental research fields in remote sensing application. Remote sensing application has been increasingly relying on ground object spectral characteristics, and quantitative analysis has developed to a new stage. The present article summarizes and systematically introduces the research status quo and development trend of digital ground object reflectance spectral libraries at home and abroad in recent years. The spectral libraries that have been established are introduced, including the desertification spectral database library, plants spectral database library, geological spectral database library, soil spectral database library, minerals spectral database library, cloud spectral database library, snow spectral database library, atmosphere spectral database library, rocks spectral database library, water spectral database library, meteorites spectral database library, moon rock spectral database library, man-made materials spectral database library, mixture spectral database library, volatile compounds spectral database library, and liquids spectral database library. In the process of establishing the spectral database libraries, there have been some problems, such as the lack of a uniform national spectral database standard, the lack of uniform standards for ground object features, limited comparability between different databases, and the absence of a working data sharing mechanism. This article also puts forward some suggestions on those problems.

  19. Improved classification accuracy of powdery mildew infection levels of wine grapes by spatial-spectral analysis of hyperspectral images.

    Science.gov (United States)

    Knauer, Uwe; Matros, Andrea; Petrovic, Tijana; Zanker, Timothy; Scott, Eileen S; Seiffert, Udo

    2017-01-01

    Hyperspectral imaging is an emerging means of assessing plant vitality, stress parameters, nutrition status, and diseases. Extraction of target values from the high-dimensional datasets either relies on pixel-wise processing of the full spectral information, appropriate selection of individual bands, or calculation of spectral indices. Limitations of such approaches are reduced classification accuracy, reduced robustness due to spatial variation of the spectral information across the surface of the objects measured as well as a loss of information intrinsic to band selection and use of spectral indices. In this paper we present an improved spatial-spectral segmentation approach for the analysis of hyperspectral imaging data and its application for the prediction of powdery mildew infection levels (disease severity) of intact Chardonnay grape bunches shortly before veraison. Instead of calculating texture features (spatial features) for the huge number of spectral bands independently, dimensionality reduction by means of Linear Discriminant Analysis (LDA) was applied first to derive a few descriptive image bands. Subsequent classification was based on modified Random Forest classifiers and selective extraction of texture parameters from the integral image representation of the image bands generated. Dimensionality reduction, integral images, and the selective feature extraction led to improved classification accuracies of up to [Formula: see text] for detached berries used as a reference sample (training dataset). Our approach was validated by predicting infection levels for a sample of 30 intact bunches. Classification accuracy improved with the number of decision trees of the Random Forest classifier. These results corresponded with qPCR results. An accuracy of 0.87 was achieved in classification of healthy, infected, and severely diseased bunches. However, discrimination between visually healthy and infected bunches proved to be challenging for a few samples
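
    A hedged sketch of the overall pipeline described above, using off-the-shelf stand-ins: LDA-based dimensionality reduction followed by a Random Forest classifier. The random arrays stand in for hyperspectral features, and the integral-image texture extraction step of the paper is omitted.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(6)
    n_samples, n_bands, n_classes = 600, 120, 3         # e.g. healthy / infected / severely diseased
    X = rng.standard_normal((n_samples, n_bands))
    y = rng.integers(0, n_classes, n_samples)
    X += y[:, None] * 0.5                                # inject some class-dependent spectral signal

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = make_pipeline(LinearDiscriminantAnalysis(n_components=n_classes - 1),
                          RandomForestClassifier(n_estimators=200, random_state=0))
    model.fit(X_tr, y_tr)
    print("held-out accuracy:", round(model.score(X_te, y_te), 3))
    ```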

  20. Nonparametric trend estimation in the presence of fractal noise: application to fMRI time-series analysis.

    Science.gov (United States)

    Afshinpour, Babak; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2008-06-30

    Unknown low frequency fluctuations called "trend" are observed in noisy time-series measured for different applications. In some disciplines, they carry primary information while in other fields such as functional magnetic resonance imaging (fMRI) they carry nuisance effects. In all cases, however, it is necessary to estimate them accurately. In this paper, a method for estimating trend in the presence of fractal noise is proposed and applied to fMRI time-series. To this end, a partly linear model (PLM) is fitted to each time-series. The parametric and nonparametric parts of PLM are considered as contributions of hemodynamic response and trend, respectively. Using the whitening property of wavelet transform, the unknown components of the model are estimated in the wavelet domain. The results of the proposed method are compared to those of other parametric trend-removal approaches such as spline and polynomial models. It is shown that the proposed method improves activation detection and decreases variance of the estimated parameters relative to the other methods.
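
    A hedged sketch of wavelet-domain trend estimation in the spirit of the approach above (not the authors' partly linear model): decompose the series with a discrete wavelet transform and rebuild the low-frequency trend from the approximation coefficients alone. It assumes the PyWavelets package; the wavelet, level and simulated series are illustrative.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(11)
    n = 256
    t = np.arange(n)
    trend = 0.02 * t + 2.0 * np.sin(2 * np.pi * t / 200)     # slow drift
    series = trend + rng.standard_normal(n)                  # drift + noise (and, in fMRI, activation)

    coeffs = pywt.wavedec(series, "db4", level=5)            # discrete wavelet decomposition
    coeffs_trend = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]   # keep approximation only
    trend_hat = pywt.waverec(coeffs_trend, "db4")[:n]        # low-frequency trend estimate

    rmse = float(np.sqrt(np.mean((trend_hat - trend) ** 2)))
    print("RMSE of the trend estimate:", round(rmse, 3))
    ```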

  1. Adaptive nonparametric estimation for Lévy processes observed at low frequency

    OpenAIRE

    Kappus, Johanna

    2013-01-01

    This article deals with adaptive nonparametric estimation for Lévy processes observed at low frequency. For general linear functionals of the Lévy measure, we construct kernel estimators, provide upper risk bounds and derive rates of convergence under regularity assumptions. Our focus lies on the adaptive choice of the bandwidth, using model selection techniques. We face here a non-standard problem of model selection with unknown variance. A new approach towards this problem is proposed, ...

  2. Non-parametric reconstruction of an inflaton potential from Einstein–Cartan–Sciama–Kibble gravity with particle production

    Directory of Open Access Journals (Sweden)

    Shantanu Desai

    2016-04-01

    Full Text Available The coupling between spin and torsion in the Einstein–Cartan–Sciama–Kibble theory of gravity generates gravitational repulsion at very high densities, which prevents a singularity in a black hole and may create there a new universe. We show that quantum particle production in such a universe near the last bounce, which represents the Big Bang, gives the dynamics that solves the horizon, flatness, and homogeneity problems in cosmology. For a particular range of the particle production coefficient, we obtain a nearly constant Hubble parameter that gives an exponential expansion of the universe with more than 60 e-folds, which lasts about 10^−42 s. This scenario can thus explain cosmic inflation without requiring a fundamental scalar field and reheating. From the obtained time dependence of the scale factor, we follow the prescription of Ellis and Madsen to reconstruct in a non-parametric way a scalar field potential which gives the same dynamics of the early universe. This potential gives the slow-roll parameters of cosmic inflation, from which we calculate the tensor-to-scalar ratio, the scalar spectral index of density perturbations, and its running as functions of the production coefficient. We find that these quantities do not significantly depend on the scale factor at the Big Bounce. Our predictions for these quantities are consistent with the Planck 2015 observations.

  3. Spectral Analysis of Vector Magnetic Field Profiles

    Science.gov (United States)

    Parker, Robert L.; O'Brien, Michael S.

    1997-01-01

    We investigate the power spectra and cross spectra derived from the three components of the vector magnetic field measured on a straight horizontal path above a statistically stationary source. All of these spectra, which can be estimated from the recorded time series, are related to a single two-dimensional power spectral density via integrals that run in the across-track direction in the wavenumber domain. Thus the measured spectra must obey a number of strong constraints: for example, the sum of the two power spectral densities of the two horizontal field components equals the power spectral density of the vertical component at every wavenumber and the phase spectrum between the vertical and along-track components is always pi/2. These constraints provide powerful checks on the quality of the measured data; if they are violated, measurement or environmental noise should be suspected. The noise due to errors of orientation has a clear characteristic; both the power and phase spectra of the components differ from those of crustal signals, which makes orientation noise easy to detect and to quantify. The spectra of the crustal signals can be inverted to obtain information about the cross-track structure of the field. We illustrate these ideas using a high-altitude Project Magnet profile flown in the southeastern Pacific Ocean.
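
    A hedged sketch of the consistency checks implied above: for along-track (x), cross-track (y) and vertical (z) components, the power spectra should satisfy Pxx + Pyy = Pzz and the z-x cross-spectrum should show a 90-degree phase. The toy profile below is built from a Hilbert-transform pair, mimicking a source with no cross-track variation, so both properties hold by construction; real profiles would satisfy them only statistically.

    ```python
    import numpy as np
    from scipy.signal import csd, hilbert, welch

    rng = np.random.default_rng(7)
    z = np.cumsum(rng.standard_normal(4096))        # toy vertical-component profile
    z -= z.mean()
    x = np.imag(hilbert(z))                          # along-track component (Hilbert pair of z)
    y = np.zeros_like(z)                             # cross-track component (no cross-track variation)

    f, pzz = welch(z, nperseg=512)
    _, pxx = welch(x, nperseg=512)
    _, pyy = welch(y, nperseg=512)
    _, pzx = csd(z, x, nperseg=512)

    ratio = (pxx + pyy)[1:-1] / pzz[1:-1]            # should be close to 1 away from the band edges
    phase = np.abs(np.angle(pzx))[1:-1]              # should be close to pi/2
    print("median (Pxx + Pyy) / Pzz :", round(float(np.median(ratio)), 3))
    print("median |phase(z, x)|     :", round(float(np.median(phase)), 3), "rad; pi/2 =", round(np.pi / 2, 3))
    ```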

  4. A nonparametric statistical method for determination of a confidence interval for the mean of a set of results obtained in a laboratory intercomparison

    International Nuclear Information System (INIS)

    Veglia, A.

    1981-08-01

    In cases where sets of data are obviously not normally distributed, the application of a nonparametric method for the estimation of a confidence interval for the mean seems to be more suitable than some other methods because such a method requires few assumptions about the population of data. A two-step statistical method is proposed which can be applied to any set of analytical results: elimination of outliers by a nonparametric method based on Tchebycheff's inequality, and determination of a confidence interval for the mean by a non-parametric method based on the binomial distribution. The method is appropriate only for samples of size n ≥ 10
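
    A hedged sketch of a two-step procedure of this kind: flag outliers with Chebyshev's (Tchebycheff's) inequality, then form a distribution-free interval from order statistics via the binomial distribution. Strictly, the order-statistic interval brackets the median; the report applies the idea to laboratory means, and the numbers below are invented.

    ```python
    import numpy as np
    from scipy.stats import binom

    def chebyshev_filter(x, k=3.0):
        """Keep values within k sample standard deviations of the mean."""
        x = np.asarray(x, dtype=float)
        return x[np.abs(x - x.mean()) <= k * x.std(ddof=1)]

    def order_statistic_interval(x, conf=0.95):
        """Distribution-free interval from order statistics via the binomial(n, 1/2) law."""
        x = np.sort(x)
        n = x.size
        lo = int(binom.ppf((1 - conf) / 2, n, 0.5))          # lower order-statistic rank
        hi = int(binom.isf((1 - conf) / 2, n, 0.5))          # upper order-statistic rank
        return x[lo], x[min(hi, n - 1)]

    results = np.array([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.4, 9.7, 10.1, 14.9, 10.0, 9.95])
    clean = chebyshev_filter(results)
    print("retained", clean.size, "of", results.size, "laboratory results")
    print("approximate 95% nonparametric interval:", order_statistic_interval(clean))
    ```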

  5. Higher order structure analysis of nano-materials by spectral reflectance of laser-plasma soft x-ray

    International Nuclear Information System (INIS)

    Azuma, Hirozumi; Takeichi, Akihiro; Noda, Shoji

    1995-01-01

    We have proposed a new experimental arrangement to measure spectral reflectance of nano-materials for analyzing higher order structure with laser-plasma soft x-rays. Structure modification of annealed Mo/Si multilayers and a nylon-6/clay hybrid with poor periodicity was investigated. The measurement of the spectral reflectance of soft x-rays from laser-produced plasma was found to be a useful method for the structure analysis of nano-materials, especially those of rather poor periodicity

  6. Spectral Analysis of Large Particle Systems

    DEFF Research Database (Denmark)

    Dahlbæk, Jonas

    2017-01-01

    that Schur complements, Feshbach maps and Grushin problems are three sides of the same coin, it seems to be a new observation that the smooth Feshbach method can also be formulated as a Grushin problem. Based on this, an abstract account of the spectral renormalization group is given....

  7. Nonparametric Integrated Agrometeorological Drought Monitoring: Model Development and Application

    Science.gov (United States)

    Zhang, Qiang; Li, Qin; Singh, Vijay P.; Shi, Peijun; Huang, Qingzhong; Sun, Peng

    2018-01-01

    Drought is a major natural hazard that has massive impacts on society. How to monitor drought is critical for its mitigation and early warning. This study proposed a modified version of the multivariate standardized drought index (MSDI) based on precipitation, evapotranspiration, and soil moisture, i.e., the modified multivariate standardized drought index (MMSDI). This study also used nonparametric joint probability distribution analysis. Comparisons were made between the standardized precipitation evapotranspiration index (SPEI), standardized soil moisture index (SSMI), MSDI, and MMSDI, and real-world observed drought regimes. Results indicated that MMSDI detected droughts that SPEI and/or SSMI failed to detect. Also, MMSDI detected almost all droughts that were identified by SPEI and SSMI. Further, droughts detected by MMSDI were similar to real-world observed droughts in terms of drought intensity and drought-affected area. When compared to MMSDI, MSDI has the potential to overestimate drought intensity and drought-affected area across China, which should be attributed to the exclusion of the evapotranspiration component from the estimation of drought intensity. Therefore, MMSDI is proposed for drought monitoring, as it can detect agrometeorological droughts. The results of this study provide a framework for integrated drought monitoring in other regions of the world and can help to develop drought mitigation.
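
    A hedged sketch of the nonparametric construction behind such multivariate indices: estimate the empirical joint non-exceedance probability of the inputs with a Gringorten plotting position and map it to a standardized index through the inverse normal. The variables are toy monthly series, and the aggregation windows used in the paper are omitted.

    ```python
    import numpy as np
    from scipy.stats import norm

    def multivariate_standardized_index(*variables):
        """Standardized index from the empirical joint non-exceedance probability."""
        X = np.column_stack(variables)
        n = X.shape[0]
        joint_count = np.array([np.sum(np.all(X <= X[i], axis=1)) for i in range(n)])
        prob = (joint_count - 0.44) / (n + 0.12)     # Gringorten plotting position
        return norm.ppf(prob)                        # more negative = drier

    rng = np.random.default_rng(8)
    precip = rng.gamma(2.0, 30.0, 240)                       # toy monthly precipitation
    soil_moisture = 0.6 * precip + rng.normal(0, 10, 240)    # correlated soil moisture proxy
    water_balance = precip - rng.gamma(2.0, 35.0, 240)       # crude precipitation-minus-ET term

    index = multivariate_standardized_index(precip, soil_moisture, water_balance)
    print("five driest months (most negative index):", np.argsort(index)[:5])
    ```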

  8. Spectral analysis to detection of short circuit fault of solar photovoltaic modules in strings

    International Nuclear Information System (INIS)

    Sevilla-Camacho, P.Y.; Robles-Ocampo, J.B.; Zuñiga-Reyes, Marco A.

    2017-01-01

    This research work presents a method to detect the number of short-circuited solar photovoltaic modules in the strings of a photovoltaic system, taking into account speed, safety, and the avoidance of sensors and of specialized and expensive equipment. The method consists of applying spectral analysis and statistical techniques to the alternating current output voltage of a string and detecting the number of failed modules through the changes in the amplitude of the 12 kHz frequency component. For that, the analyzed string is disconnected from the array, and a small pulsed voltage signal with a frequency of 12 kHz is injected into it under dark conditions and controlled temperature. Prior to the analysis, the signal is analogue filtered in order to reduce the direct current component. The spectral analysis technique used is the Fast Fourier Transform. The experimental results were validated through simulation of the alternating current equivalent circuit of a solar cell. In all experimental and simulated tests, the method correctly identified the number of short-circuited photovoltaic modules in the analyzed string. (author)
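
    A hedged sketch of the detection step only: take the FFT of the string's AC output voltage and read the amplitude of the 12 kHz component, whose reduction would indicate short-circuited modules. The voltage traces, sampling rate and amplitude change per failed module are simulated assumptions, not measured data.

    ```python
    import numpy as np

    fs = 200_000                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 0.05, 1 / fs)
    f_probe = 12_000                               # injected probe frequency in Hz

    def amplitude_at(signal, freq, fs):
        """Amplitude of a single frequency component from a Hann-windowed FFT."""
        window = np.hanning(signal.size)
        spectrum = np.fft.rfft(signal * window)
        freqs = np.fft.rfftfreq(signal.size, 1 / fs)
        return 2 * np.abs(spectrum[np.argmin(np.abs(freqs - freq))]) / window.sum()

    rng = np.random.default_rng(9)
    healthy = 1.00 * np.sin(2 * np.pi * f_probe * t) + 0.05 * rng.standard_normal(t.size)
    one_shorted = 0.80 * np.sin(2 * np.pi * f_probe * t) + 0.05 * rng.standard_normal(t.size)

    print("12 kHz amplitude, healthy string    :", round(amplitude_at(healthy, f_probe, fs), 3))
    print("12 kHz amplitude, one module shorted:", round(amplitude_at(one_shorted, f_probe, fs), 3))
    ```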

  9. Using spectral imaging for the analysis of abnormalities for colorectal cancer: When is it helpful?

    Science.gov (United States)

    Awan, Ruqayya; Al-Maadeed, Somaya; Al-Saady, Rafif

    2018-01-01

    The spectral imaging technique has been shown to provide more discriminative information than RGB images and has been proposed for a range of problems. There are many studies demonstrating its potential for the analysis of histopathology images for abnormality detection, but there have been discrepancies among previous studies as well. Many multispectral-based methods have been proposed for histopathology images, but the significance of using the whole multispectral cube versus a subset of bands or a single band is still arguable. We performed a comprehensive analysis using individual bands and different subsets of bands to determine the effectiveness of spectral information for detecting abnormality in colorectal images. Our multispectral colorectal dataset consists of four classes, each represented by infra-red spectrum bands in addition to the visual spectrum bands. We performed our analysis of spectral imaging by stratifying the abnormalities using both spatial and spectral information. For our experiments, we used a combination of texture descriptors with an ensemble classification approach that performed best on our dataset. We applied our method to another dataset and obtained results comparable with those obtained using the state-of-the-art method and a convolutional neural network based method. Moreover, we explored the relationship between the number of bands and the problem complexity and found that a higher number of bands is required for a complex task to achieve improved performance. Our results demonstrate a synergy between the infra-red and visual spectra, improving the classification accuracy (by 6%) on incorporating the infra-red representation. We also highlight the importance of how the dataset should be divided into training and testing sets for evaluating histopathology image-based approaches, which has not been considered in previous studies on multispectral histopathology images.

  10. A Non-Parametric Delphi Approach to Foster Innovation Policy Debate in Spain

    Directory of Open Access Journals (Sweden)

    Juan Carlos Salazar-Elena

    2016-05-01

    Full Text Available The aim of this paper is to identify some changes needed in Spain’s innovation policy to fill the gap between its innovation results and those of other European countries in lieu of sustainable leadership. To do this we apply the Delphi methodology to experts from academia, business, and government. To overcome the shortcomings of traditional descriptive methods, we develop an inferential analysis by following a non-parametric bootstrap method which enables us to identify important changes that should be implemented. Particularly interesting is the support found for improving the interconnections among the relevant agents of the innovation system (instead of focusing exclusively on the provision of knowledge and technological inputs through R and D activities), or the support found for "soft" policy instruments aimed at providing a homogeneous framework to assess the innovation capabilities of firms (e.g., for funding purposes). Attention to potential innovators among small and medium enterprises (SMEs) and traditional industries is particularly encouraged by experts.

  11. Nonparametric method for genomics-based prediction of performance of quantitative traits involving epistasis in plant breeding.

    Directory of Open Access Journals (Sweden)

    Xiaochun Sun

    Full Text Available Genomic selection (GS) procedures have proven useful in estimating breeding value and predicting phenotype with genome-wide molecular marker information. However, issues of high dimensionality, multicollinearity, and the inability to deal effectively with epistasis can jeopardize accuracy and predictive ability. We, therefore, propose a new nonparametric method, pRKHS, which combines the features of supervised principal component analysis (SPCA) and reproducing kernel Hilbert spaces (RKHS) regression, with versions for traits with no/low epistasis, pRKHS-NE, to high epistasis, pRKHS-E. Instead of assigning a specific relationship to represent the underlying epistasis, the method maps genotype to phenotype in a nonparametric way, thus requiring fewer genetic assumptions. SPCA decreases the number of markers needed for prediction by filtering out low-signal markers, with the optimal marker set determined by cross-validation. Principal components are computed from the reduced marker matrix (called supervised principal components, SPC) and included in the smoothing spline ANOVA model as independent variables to fit the data. The new method was evaluated in comparison with current popular methods for practicing GS, specifically RR-BLUP, BayesA, BayesB, as well as a newer method by Crossa et al., RKHS-M, using both simulated and real data. Results demonstrate that pRKHS generally delivers greater predictive ability, particularly when epistasis impacts trait expression. Beyond prediction, the new method also facilitates inferences about the extent to which epistasis influences trait expression.
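    A minimal sketch of the two ingredients, marker screening followed by kernel (RKHS-style) regression on principal components, is given below; it substitutes kernel ridge regression for the smoothing spline ANOVA model and uses synthetic genotypes, so it is illustrative only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def screen_markers(X, y, n_keep=200):
    """Keep the n_keep markers with the largest absolute correlation to the trait."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(corr)[::-1][:n_keep]

# Synthetic genotypes: 300 lines x 2000 biallelic markers coded 0/1/2.
rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(300, 2000)).astype(float)
beta = np.zeros(2000)
beta[:20] = rng.normal(size=20)
y = X @ beta + 0.5 * X[:, 0] * X[:, 1] + rng.normal(size=300)   # additive + one epistatic term

# NOTE: for brevity, screening here uses all phenotypes at once; the paper instead
# chooses the marker set inside cross-validation to avoid selection bias.
keep = screen_markers(X, y)
model = make_pipeline(PCA(n_components=20),
                      KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-3))
print(cross_val_score(model, X[:, keep], y, cv=5, scoring="r2").mean())
```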

  12. Nonparametric method for genomics-based prediction of performance of quantitative traits involving epistasis in plant breeding.

    Science.gov (United States)

    Sun, Xiaochun; Ma, Ping; Mumm, Rita H

    2012-01-01

    Genomic selection (GS) procedures have proven useful in estimating breeding value and predicting phenotype with genome-wide molecular marker information. However, issues of high dimensionality, multicollinearity, and the inability to deal effectively with epistasis can jeopardize accuracy and predictive ability. We, therefore, propose a new nonparametric method, pRKHS, which combines the features of supervised principal component analysis (SPCA) and reproducing kernel Hilbert spaces (RKHS) regression, with versions for traits with no/low epistasis, pRKHS-NE, to high epistasis, pRKHS-E. Instead of assigning a specific relationship to represent the underlying epistasis, the method maps genotype to phenotype in a nonparametric way, thus requiring fewer genetic assumptions. SPCA decreases the number of markers needed for prediction by filtering out low-signal markers with the optimal marker set determined by cross-validation. Principal components are computed from reduced marker matrix (called supervised principal components, SPC) and included in the smoothing spline ANOVA model as independent variables to fit the data. The new method was evaluated in comparison with current popular methods for practicing GS, specifically RR-BLUP, BayesA, BayesB, as well as a newer method by Crossa et al., RKHS-M, using both simulated and real data. Results demonstrate that pRKHS generally delivers greater predictive ability, particularly when epistasis impacts trait expression. Beyond prediction, the new method also facilitates inferences about the extent to which epistasis influences trait expression.

  13. New parameter of the right gastroepiploic arterial graft using the power spectral analysis device named MemCalc soft.

    Science.gov (United States)

    Uehara, Mayuko; Takagi, Nobuyuki; Muraki, Satoshi; Yanase, Yosuke; Tabuchi, Masaki; Tachibana, Kazutoshi; Miyaki, Yasuko; Ito, Toshiro; Higami, Tetsuya

    2015-12-01

    Transit-time flow measurement (TTFM) parameters such as mean graft flow (MGF, ml/min), pulsatility index (PI) and diastolic filling (DF, %) have been extensively researched for internal mammary arterial or saphenous vein grafts. In our experience of using the right gastroepiploic arterial (GEA) graft for right coronary artery (RCA) grafting, we observed unique GEA graft flow waveforms. We analysed the GEA graft flow waveforms by power spectral analysis to assess their usefulness in determining GEA graft patency. Forty-five patients underwent off-pump coronary artery bypass using the GEA as an individual graft to the RCA. The means of intraoperative MGF, PI and DF were compared between grafts found to be patent and non-patent postoperatively. In addition, the GEA flow data were exported and analysed using power spectral analysis. Forty grafts were 'patent' and five were 'non-patent'. There were no significant differences in the mean TTFM parameters between the patent and non-patent grafts (MGF: 22 vs 8 ml/min, respectively, P = 0.068; PI: 3.5 vs 6.5, respectively, P = 0.155; DF: 63 vs 53%, respectively, P = 0.237). The power spectral analysis showed clear differences: the power spectral density (PSD) of patent grafts presented high peaks at frequency levels of 1, 2 and 3 Hz, whereas the PSD of non-patent grafts presented high peaks that were not limited to these frequencies. The PSD had a sensitivity and specificity of 80 and 87.5%, respectively. Power spectral analysis of the GEA graft flow is useful to distinguish between non-patent and patent grafts intraoperatively. It should be used as a fourth parameter along with MGF, PI and DF. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
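    As a hedged sketch of the kind of PSD computation involved (Welch's method here, which may well differ from the MemCalc implementation), the snippet below locates the dominant spectral peaks of a synthetic flow trace with 1, 2 and 3 Hz components; the sampling rate and thresholds are assumptions.

```python
import numpy as np
from scipy.signal import welch, find_peaks

def psd_peak_frequencies(flow, fs, max_peaks=5):
    """Welch power spectral density of a flow trace and its strongest peaks."""
    freqs, psd = welch(flow, fs=fs, nperseg=min(len(flow), 4 * int(fs)))
    peaks, _ = find_peaks(psd, height=psd.max() * 0.05)
    order = np.argsort(psd[peaks])[::-1][:max_peaks]
    return freqs[peaks][order], psd[peaks][order]

# Synthetic "patent graft" waveform: dominant 1 Hz beat plus 2 and 3 Hz harmonics.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
flow = 20 + 8*np.sin(2*np.pi*1*t) + 4*np.sin(2*np.pi*2*t) + 2*np.sin(2*np.pi*3*t)
flow += np.random.default_rng(2).normal(scale=0.5, size=t.size)
print(psd_peak_frequencies(flow, fs))
```

    A waveform whose strongest peaks fall away from the 1, 2 and 3 Hz harmonics would, in this simplified picture, be flagged for further inspection.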

  14. Characterizing CDOM Spectral Variability Across Diverse Regions and Spectral Ranges

    Science.gov (United States)

    Grunert, Brice K.; Mouw, Colleen B.; Ciochetto, Audrey B.

    2018-01-01

    Satellite remote sensing of colored dissolved organic matter (CDOM) has focused on CDOM absorption (aCDOM) at a reference wavelength, as its magnitude provides insight into the underwater light field and large-scale biogeochemical processes. CDOM spectral slope, SCDOM, has been treated as a constant or semiconstant parameter in satellite retrievals of aCDOM despite significant regional and temporal variability. SCDOM and other optical metrics provide insights into CDOM composition, processing, food web dynamics, and carbon cycling. To date, much of this work relies on fluorescence techniques or aCDOM in spectral ranges unavailable to current and planned satellite sensors. Here, we examine global variability in SCDOM and fit deviations in the aCDOM spectra using the recently proposed Gaussian decomposition method. From this, we investigate whether global variability in retrieved SCDOM and Gaussian components is significant and regionally distinct. We iteratively decreased the spectral range considered and analyzed the number, location, and magnitude of fitted Gaussian components to understand whether a reduced spectral range impacts information obtained within a common spectral window. We compared the fitted slope from the Gaussian decomposition method to absorption-based indices that indicate CDOM composition, to determine the ability of satellite-derived slope to inform the analysis and modeling of large-scale biogeochemical processes. Finally, we present implications of the observed variability for remote sensing of CDOM characteristics via SCDOM.
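    For orientation, the snippet below fits the standard single-exponential aCDOM model to a synthetic spectrum to recover SCDOM; the paper's Gaussian decomposition method is more elaborate, so this is only a sketch of the slope-fitting idea, with made-up wavelengths and noise levels.

```python
import numpy as np
from scipy.optimize import curve_fit

WL_REF = 440.0  # reference wavelength in nm (an arbitrary but common choice)

def cdom_model(wl, a_ref, slope):
    """Single-exponential CDOM model: a(wl) = a(WL_REF) * exp(-S * (wl - WL_REF))."""
    return a_ref * np.exp(-slope * (wl - WL_REF))

# Synthetic absorption spectrum over 400-600 nm with S_CDOM = 0.018 nm^-1.
wl = np.arange(400.0, 601.0, 5.0)
obs = cdom_model(wl, 0.8, 0.018) + np.random.default_rng(3).normal(scale=0.005, size=wl.size)

(a_ref_hat, s_hat), _ = curve_fit(cdom_model, wl, obs, p0=(0.5, 0.02))
print(f"fitted a(440) = {a_ref_hat:.3f} m^-1, S_CDOM = {s_hat:.4f} nm^-1")
```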

  15. Spectral analysis of Jupiter kilometric radio emissions during the Ulysses flyby

    Science.gov (United States)

    Echer, M. P. D. S.; Echer, E.; Gonzalez, W.; Magalães, F. P.

    2016-12-01

    In this work we analyze Ulysses URAP kilometric radio data acquired during the Ulysses flyby of Jupiter. The interval selected for analysis runs from October 1991 to February 1992. URAP 10-min averages of auroral (bkom) and torus (nkom) radio data are used. Wavelet and iterative regression spectral analysis techniques are applied to both data sets. The results will enable us to determine the major frequencies present in the auroral and torus data and to compare their common and distinct periodicities.
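    As a stand-in for the wavelet and iterative regression techniques used in the study, the sketch below applies a Lomb-Scargle periodogram to a synthetic 10-min-averaged intensity series to recover a dominant periodicity; all numbers are illustrative, not taken from the URAP data.

```python
import numpy as np
from scipy.signal import lombscargle

# Synthetic 10-min-averaged radio intensity with a roughly 10-hour periodicity.
rng = np.random.default_rng(4)
t_hours = np.arange(0.0, 30 * 24, 1.0 / 6.0)       # 30 days of 10-min samples
flux = np.sin(2 * np.pi * t_hours / 10.0) + rng.normal(scale=0.5, size=t_hours.size)

periods = np.linspace(2.0, 48.0, 2000)             # candidate periods in hours
ang_freqs = 2.0 * np.pi / periods                  # lombscargle expects angular frequencies
power = lombscargle(t_hours, flux - flux.mean(), ang_freqs)
print(f"dominant period ~ {periods[np.argmax(power)]:.2f} hours")
```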

  16. Separating environmental efficiency into production and abatement efficiency. A nonparametric model with application to U.S. power plants

    Energy Technology Data Exchange (ETDEWEB)

    Hampf, Benjamin

    2011-08-15

    In this paper we present a new approach to evaluate the environmental efficiency of decision making units. We propose a model that describes a two-stage process consisting of a production and an end-of-pipe abatement stage with the environmental efficiency being determined by the efficiency of both stages. Taking the dependencies between the two stages into account, we show how nonparametric methods can be used to measure environmental efficiency and to decompose it into production and abatement efficiency. For an empirical illustration we apply our model to an analysis of U.S. power plants.
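    A minimal single-stage sketch of the nonparametric (DEA-style) efficiency measurement underlying such models is shown below, using an input-oriented constant-returns formulation solved as a linear program; the paper's two-stage production/abatement decomposition is not reproduced here, and the plant data are invented.

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, j0):
    """Input-oriented CRS (CCR) efficiency score of decision making unit j0.

    X : (n_units, n_inputs)  inputs (e.g., fuel, labour)
    Y : (n_units, n_outputs) outputs (e.g., electricity generated)
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta over [theta, lambdas]
    # inputs:  sum_j lambda_j * x_ij <= theta * x_i,j0
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]
    b_in = np.zeros(m)
    # outputs: sum_j lambda_j * y_rj >= y_r,j0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Toy data: 5 plants, 2 inputs (fuel, labour), 1 output (electricity).
X = np.array([[10., 4.], [12., 3.], [8., 5.], [15., 6.], [9., 4.]])
Y = np.array([[100.], [110.], [90.], [120.], [105.]])
print([round(dea_input_efficiency(X, Y, j), 3) for j in range(len(X))])
```

    Extending this to the paper's setting would mean linking a production stage (with the bad output as an intermediate) to an abatement stage, which requires additional constraints not shown in this sketch.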

  17. Identification of neuronal network properties from the spectral analysis of calcium imaging signals in neuronal cultures.

    Science.gov (United States)

    Tibau, Elisenda; Valencia, Miguel; Soriano, Jordi

    2013-01-01

    Neuronal networks in vitro are prominent systems for studying the development of connections in living neuronal networks and the interplay between connectivity, activity and function. These cultured networks show rich spontaneous activity that evolves concurrently with the connectivity of the underlying network. In this work we monitor the development of neuronal cultures and record their activity using calcium fluorescence imaging. We use spectral analysis to characterize global dynamical and structural traits of the neuronal cultures. We first observe that the power spectrum can be used as a signature of the state of the network, for instance when inhibition is active or silent, as well as a measure of the network's connectivity strength. Second, the power spectrum identifies prominent developmental changes in the network, such as the GABAA switch. Third, the analysis of the spatial distribution of the spectral density, in experiments with a controlled disintegration of the network through CNQX, an AMPA-glutamate receptor antagonist in excitatory neurons, reveals the existence of communities of strongly connected, highly active neurons that display synchronous oscillations. Our work illustrates the value of spectral analysis for the study of in vitro networks and its potential use as a network-state indicator, for instance to compare healthy and diseased neuronal networks.
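    One simple spectral summary of the kind that could serve as a network-state indicator is the fraction of power in a low-frequency band of each fluorescence trace; the band limits, imaging rate and data below are assumptions for illustration, not taken from the study.

```python
import numpy as np
from scipy.signal import periodogram

def lowfreq_power_fraction(traces, fs, band=(0.05, 0.5)):
    """Fraction of spectral power in a low-frequency band, per fluorescence trace.

    traces : (n_neurons, n_samples) calcium fluorescence signals
    fs     : imaging rate in Hz
    """
    freqs, psd = periodogram(traces, fs=fs, axis=-1)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, in_band].sum(axis=1) / psd.sum(axis=1)

# Synthetic example: 50 neurons imaged at 10 Hz for 5 minutes.
rng = np.random.default_rng(5)
t = np.arange(0, 300, 0.1)
bursts = np.sin(2 * np.pi * 0.1 * t)                    # slow network-wide oscillation
traces = bursts + rng.normal(scale=1.0, size=(50, t.size))
print(lowfreq_power_fraction(traces, fs=10.0).mean())
```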

  18. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, and therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described.
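    PySESA itself should be consulted for its windowed, spatially explicit statistics; as a generic illustration of extracting joint amplitude-wavelength information from gridded data, the sketch below computes a radially averaged 2-D power spectrum of a synthetic surface, with all grid sizes and wavelengths chosen arbitrarily.

```python
import numpy as np

def radial_power_spectrum(surface, dx=1.0):
    """Radially averaged 2-D power spectrum of a regularly gridded surface.

    surface : 2-D array of elevations/amplitudes
    dx      : grid spacing (same units in x and y)
    """
    ny, nx = surface.shape
    win = np.outer(np.hanning(ny), np.hanning(nx))          # taper to limit edge effects
    F = np.fft.fftshift(np.fft.fft2((surface - surface.mean()) * win))
    power = np.abs(F) ** 2
    ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))
    kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))
    KX, KY = np.meshgrid(kx, ky)
    k = np.hypot(KX, KY)
    bins = np.linspace(0, k.max(), 50)
    which = np.digitize(k.ravel(), bins)
    radial = np.array([power.ravel()[which == i].mean() if np.any(which == i) else np.nan
                       for i in range(1, len(bins))])
    return 0.5 * (bins[1:] + bins[:-1]), radial

# Synthetic rough surface with a dominant 20-unit wavelength.
rng = np.random.default_rng(6)
x = np.arange(256)
X, Y = np.meshgrid(x, x)
z = np.sin(2 * np.pi * X / 20.0) + 0.5 * rng.normal(size=(256, 256))
k, P = radial_power_spectrum(z)
print(f"peak wavelength ~ {1.0 / k[np.nanargmax(P)]:.1f} grid units")
```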

  19. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, and therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described.

  20. Spectral gamuts and spectral gamut mapping

    Science.gov (United States)

    Rosen, Mitchell R.; Derhak, Maxim W.

    2006-01-01

    All imaging devices have two gamuts: the stimulus gamut and the response gamut. The response gamut of a print engine is typically described in CIE colorimetry units, a system derived to quantify human color response. More fundamental than colorimetric gamuts are spectral gamuts, based on radiance, reflectance or transmittance units. Spectral gamuts depend on the physics of light and on how materials interact with light, and do not involve human photoreceptor integration or brain processing. Methods for visualizing a spectral gamut raise challenges, as do considerations of how to utilize such a data set for producing superior color reproductions. Recent work has described a transformation that reduces spectra to six dimensions, called LabPQR. LabPQR was designed as a hybrid space with three explicit colorimetric axes and three additional spectral reconstruction axes. In this paper spectral gamuts are discussed making use of LabPQR. Also, spectral gamut mapping is considered in light of the colorimetric-spectral duality of the LabPQR space.
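    The published LabPQR construction is not reproduced here; as a generic illustration of a hybrid colorimetric-plus-spectral descriptor, the sketch below projects synthetic reflectance spectra onto three smooth weighting functions (stand-ins for colorimetric axes, not the CIE colour matching functions) and adds three principal components of the leftover spectral detail.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical setup: reflectance spectra sampled at 31 bands (400-700 nm, 10 nm steps).
rng = np.random.default_rng(7)
wavelengths = np.arange(400, 701, 10)
spectra = np.clip(rng.random((500, wavelengths.size)).cumsum(axis=1) / 31.0, 0, 1)

# Stand-in "colorimetric" weighting functions: three smooth Gaussian bands.
def gauss(mu, sigma):
    return np.exp(-0.5 * ((wavelengths - mu) / sigma) ** 2)

W = np.stack([gauss(600, 40), gauss(550, 40), gauss(450, 40)])     # (3, 31)

tristim = spectra @ W.T                                            # 3 colorimetric-like values
recon = tristim @ np.linalg.pinv(W).T                              # spectra explained by those axes
residual = spectra - recon                                         # "metameric" remainder
pqr = PCA(n_components=3).fit(residual)
six_dim = np.hstack([tristim, pqr.transform(residual)])            # 6-D spectral descriptor
print(six_dim.shape)
```

    In this reduced space, the convex region spanned by a device's reproducible descriptors plays the role of a spectral gamut, which is the kind of object the gamut-mapping discussion in the record refers to.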