WorldWideScience

Sample records for nonparametric spectral analysis

  1. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a...
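
    As a minimal illustration of the non-parametric estimators such a text covers, the sketch below (an illustrative example using numpy/scipy and a synthetic two-tone signal, not material from the book) contrasts a raw periodogram with Welch's averaged periodogram.

```python
import numpy as np
from scipy import signal

# Synthetic signal: two sinusoids plus white noise, sampled at 1 kHz.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 1, t.size)

# Raw periodogram: fine frequency resolution but high variance.
f_per, p_per = signal.periodogram(x, fs=fs)

# Welch estimate: average periodograms of overlapping windowed segments,
# trading resolution for a large variance reduction.
f_welch, p_welch = signal.welch(x, fs=fs, nperseg=1024, noverlap=512, window="hann")

peak = f_welch[np.argmax(p_welch)]
print(f"Dominant frequency (Welch): {peak:.1f} Hz")
```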

  2. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  3. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  4. Nonparametric spectral-based estimation of latent structures

    OpenAIRE

    Bonhomme, Stéphane; Jochmans, Koen; Robin, Jean-Marc

    2014-01-01

    We present a constructive identification proof of p-linear decompositions of q-way arrays. The analysis is based on the joint spectral decomposition of a set of matrices. It has applications in the analysis of a variety of latent-structure models, such as q-variate mixtures of p distributions. As such, our results provide a constructive alternative to Allman, Matias and Rhodes [2009]. The identification argument suggests a joint approximate-diagonalization estimator that is easy to implement ...

  5. Nonparametric Bayes analysis of social science data

    Science.gov (United States)

    Kunihama, Tsuyoshi

    Social science data often contain complex characteristics that standard statistical methods fail to capture. Social surveys assign many questions to respondents, which often consist of mixed-scale variables. Each of the variables can follow a complex distribution outside parametric families, and associations among variables may have more complicated structures than standard linear dependence. Therefore, it is not straightforward to develop a statistical model that approximates these structures well. In addition, many social surveys have collected data over time, so we need to incorporate dynamic dependence into the models. It is also standard to observe a massive number of missing values in social science data. To address these challenging problems, this thesis develops flexible nonparametric Bayesian methods for the analysis of social science data. Chapter 1 briefly explains the background and motivation of the projects in the following chapters. Chapter 2 develops a nonparametric Bayesian model of temporal dependence in large sparse contingency tables, relying on a probabilistic factorization of the joint pmf. Chapter 3 proposes nonparametric Bayes inference on conditional independence, with conditional mutual information used as a measure of the strength of conditional dependence. Chapter 4 proposes a novel Bayesian density estimation method for social surveys with complex designs where there is a gap between sample and population; we correct for the bias by adjusting mixture weights in Bayesian mixture models. Chapter 5 develops a nonparametric model for mixed-scale longitudinal surveys, in which various types of variables can be induced through latent continuous variables and dynamic latent factors lead to flexibly time-varying associations among variables.

  6. Spectral decompositions of multiple time series: a Bayesian non-parametric approach.

    Science.gov (United States)

    Macaro, Christian; Prado, Raquel

    2014-01-01

    We consider spectral decompositions of multiple time series that arise in studies where the interest lies in assessing the influence of two or more factors. We write the spectral density of each time series as a sum of the spectral densities associated to the different levels of the factors. We then use Whittle's approximation to the likelihood function and follow a Bayesian non-parametric approach to obtain posterior inference on the spectral densities based on Bernstein-Dirichlet prior distributions. The prior is strategically important as it carries identifiability conditions for the models and allows us to quantify our degree of confidence in such conditions. A Markov chain Monte Carlo (MCMC) algorithm for posterior inference within this class of frequency-domain models is presented. We illustrate the approach by analyzing simulated and real data via spectral one-way and two-way models. In particular, we present an analysis of functional magnetic resonance imaging (fMRI) brain responses measured in individuals who participated in a designed experiment to study pain perception in humans.
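
    The Whittle approximation mentioned above can be written down compactly. The following sketch (synthetic AR(1) data, numpy only; the Bernstein-Dirichlet prior and MCMC sampler of the paper are not reproduced) evaluates the Whittle log-likelihood of a candidate spectral density from the periodogram.

```python
import numpy as np

def periodogram(x):
    """Periodogram I(omega_k) at Fourier frequencies omega_k = 2*pi*k/n, 0 < omega_k < pi."""
    n = len(x)
    dft = np.fft.fft(x - np.mean(x))
    k = np.arange(1, (n - 1) // 2 + 1)
    I = np.abs(dft[k]) ** 2 / (2 * np.pi * n)
    omega = 2 * np.pi * k / n
    return omega, I

def whittle_loglik(spec_density, x):
    """Whittle approximation: -sum_k [ log f(omega_k) + I(omega_k) / f(omega_k) ]."""
    omega, I = periodogram(x)
    f = spec_density(omega)
    return -np.sum(np.log(f) + I / f)

# Example: Whittle log-likelihood of the true AR(1) spectral density on simulated data.
rng = np.random.default_rng(1)
phi, n = 0.7, 512
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

ar1_spec = lambda w, phi=phi, s2=1.0: s2 / (2 * np.pi * (1 - 2 * phi * np.cos(w) + phi ** 2))
print("Whittle log-likelihood under the true AR(1) model:", whittle_loglik(ar1_spec, x))
```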

  7. Local Component Analysis for Nonparametric Bayes Classifier

    CERN Document Server

    Khademi, Mahmoud; Safayani, Mehran

    2010-01-01

    The decision boundaries of the Bayes classifier are optimal because they lead to the maximum probability of correct decision. This means that if we knew the prior probabilities and the class-conditional densities, we could design a classifier giving the lowest probability of error. However, in classification based on nonparametric density estimation methods such as Parzen windows, the decision regions depend on the choice of parameters such as the window width. Moreover, these methods suffer from the curse of dimensionality of the feature space and the small-sample-size problem, which severely restrict their practical applications. In this paper, we address these problems by introducing a novel dimension reduction and classification method based on local component analysis. In this method, by adopting an iterative cross-validation algorithm, we simultaneously estimate the optimal transformation matrices (for dimension reduction) and classifier parameters based on local information. The proposed method can classify the data with co...
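
    For context, a minimal sketch of the Parzen-window Bayes classifier that the paper takes as its starting point, showing how the decision can change with the window width h. It assumes scikit-learn and synthetic two-class Gaussian data; the proposed local component analysis itself is not reproduced.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
# Two synthetic Gaussian classes in 2-D.
X0 = rng.normal(loc=[0, 0], scale=1.0, size=(200, 2))
X1 = rng.normal(loc=[2, 2], scale=1.0, size=(200, 2))

def parzen_bayes(x_new, h):
    """Assign x_new to the class with the larger log(prior) + log Parzen density."""
    scores = []
    for X in (X0, X1):
        kde = KernelDensity(kernel="gaussian", bandwidth=h).fit(X)
        prior = len(X) / (len(X0) + len(X1))
        scores.append(np.log(prior) + kde.score_samples(x_new.reshape(1, -1))[0])
    return int(np.argmax(scores))

# The predicted class can change with the window width h, which is the
# sensitivity the paper tries to remove by learning a local transformation.
for h in (0.1, 0.5, 2.0):
    print(h, parzen_bayes(np.array([1.0, 1.0]), h))
```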

  8. Spectral Analysis

    CERN Document Server

    Cecconi, Jaures

    2011-01-01

    G. Bottaro: Some results of spectral analysis for differential operators with constant coefficients on unbounded domains.- L. Garding: Eigenfunction expansions.- C. Goulaouic: Eigenvalues of irregular boundary-value problems: applications.- G. Grubb: Essential spectra of elliptic systems on compact manifolds.- J.Cl. Guillot: Some recent results in scattering.- N. Schechter: Theory of perturbations of partial differential operators.- C.H. Wilcox: Spectral analysis of the Laplacian with a discontinuous coefficient.

  9. A Nonparametric Analogy of Analysis of Covariance

    Science.gov (United States)

    Burnett, Thomas D.; Barr, Donald R.

    1977-01-01

    A nonparametric test of the hypothesis of no treatment effect is suggested for a situation where measures of the severity of the condition treated can be obtained and ranked both pre- and post-treatment. The test allows the pre-treatment rank to be used as a concomitant variable. (Author/JKS)

  10. Lottery spending: a non-parametric analysis.

    Science.gov (United States)

    Garibaldi, Skip; Frisoli, Kayla; Ke, Li; Lim, Melody

    2015-01-01

    We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  11. Lottery spending: a non-parametric analysis.

    Directory of Open Access Journals (Sweden)

    Skip Garibaldi

    We analyze the spending of individuals in the United States on lottery tickets in an average month, as reported in surveys. We view these surveys as sampling from an unknown distribution, and we use non-parametric methods to compare properties of this distribution for various demographic groups, as well as claims that some properties of this distribution are constant across surveys. We find that the observed higher spending by Hispanic lottery players can be attributed to differences in education levels, and we dispute previous claims that the top 10% of lottery players consistently account for 50% of lottery sales.

  12. Investigating the cultural patterns of corruption: A nonparametric analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2011-01-01

    By using a sample of 77 countries our analysis applies several nonparametric techniques in order to reveal the link between national culture and corruption. Based on Hofstede’s cultural dimensions and the corruption perception index, the results reveal that countries with higher levels of corruption tend to have higher power distance and collectivism values in their society.

  13. A Bayesian Nonparametric Meta-Analysis Model

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G.

    2015-01-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall…

  14. Spatial and Spectral Nonparametric Linear Feature Extraction Method for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Jinn-Min Yang

    2016-11-01

    Feature extraction (FE), or dimensionality reduction (DR), plays quite an important role in the field of pattern recognition. Feature extraction aims to reduce the dimensionality of a high-dimensional dataset to enhance classification accuracy and speed, particularly when the training sample size is small, i.e., the small sample size (SSS) problem. Remotely sensed hyperspectral images (HSIs) often have hundreds of measured features (bands), which potentially provide more accurate and detailed information for classification but generally require more samples for parameter estimation to achieve a satisfactory result. Collecting ground truth for a remotely sensed hyperspectral scene can be considerably difficult and expensive. Therefore, FE techniques have been an important part of hyperspectral image classification. While many feature extraction methods are based only on the spectral (band) information of the training samples, feature extraction methods integrating both spatial and spectral information have shown more effective results in recent years. Spatial contextual information has been proven useful for improving HSI data representation and increasing classification accuracy. In this paper, we propose a spatial and spectral nonparametric linear feature extraction method for hyperspectral image classification. The spatial and spectral information is extracted for each training sample and used to design the within-class and between-class scatter matrices for constructing the feature extraction model. The experimental results on one benchmark hyperspectral image demonstrate that the proposed method obtains more stable and satisfactory results than some existing spectral-based feature extraction methods.

  15. Nonparametric inference procedures for multistate life table analysis.

    Science.gov (United States)

    Dow, M M

    1985-01-01

    Recent generalizations of the classical single-state life table procedures to the multistate case provide the means to analyze simultaneously the mobility and mortality experience of one or more cohorts. This paper examines fairly general nonparametric combinatorial matrix procedures, known as quadratic assignment, as a technique for analyzing various transitional patterns commonly generated by cohorts over the course of the life cycle. To some degree, the output from a multistate life table analysis suggests inference procedures. In his discussion of multistate life table construction features, the author focuses on the matrix formulation of the problem. He then presents several examples of the proposed nonparametric procedures. Data for the mobility and life expectancies at birth matrices come from the 458-member Cayo Santiago rhesus monkey colony. The author's matrix combinatorial approach to hypothesis testing may prove to be a useful inferential strategy in several multidimensional demographic areas.
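
    A minimal sketch of a quadratic assignment (QAP) permutation test of association between two square matrices, the combinatorial building block referred to above. The matrices here are small synthetic stand-ins, not the Cayo Santiago mobility and life-expectancy data.

```python
import numpy as np

def qap_test(A, B, n_perm=5000, seed=0):
    """Quadratic assignment test: correlate off-diagonal cells of A and B, then
    compare against correlations obtained by jointly permuting rows/columns of B."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    mask = ~np.eye(n, dtype=bool)

    def corr(M1, M2):
        return np.corrcoef(M1[mask], M2[mask])[0, 1]

    observed = corr(A, B)
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(n)
        null[i] = corr(A, B[np.ix_(p, p)])
    p_value = np.mean(np.abs(null) >= abs(observed))
    return observed, p_value

rng = np.random.default_rng(1)
A = rng.random((10, 10))
B = 0.5 * A + 0.5 * rng.random((10, 10))   # B partly structured like A
print(qap_test(A, B))
```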

  16. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used ... by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true" ...
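
    A minimal sketch of the kind of non-parametric regression such a specification check rests on: a Nadaraya-Watson kernel fit of log output on log input compared with a Cobb-Douglas (log-linear) fit. Single synthetic input, numpy only; this is not the authors' estimator or data.

```python
import numpy as np

rng = np.random.default_rng(0)
log_x = np.sort(rng.uniform(0, 3, 150))                                   # log input
log_y = 0.8 * log_x + 0.3 * np.sin(2 * log_x) + rng.normal(0, 0.1, 150)   # non-log-linear "technology"

def nadaraya_watson(x0, x, y, h):
    """Locally weighted mean with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

grid = np.linspace(0, 3, 50)
nw_fit = nadaraya_watson(grid, log_x, log_y, h=0.2)

# Cobb-Douglas corresponds to a straight line in logs.
beta = np.polyfit(log_x, log_y, 1)
cd_fit = np.polyval(beta, grid)

print("max gap between kernel fit and Cobb-Douglas fit:", np.max(np.abs(nw_fit - cd_fit)))
```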

  17. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb... results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used ... -Douglas function nor the Translog function are consistent with the "true" relationship between the inputs and the output in our data set. We solve this problem by using non-parametric regression. This approach delivers reasonable results, which are on average not too different from the results of the parametric ...

  18. Reducing Poisson noise and baseline drift in X-ray spectral images with bootstrap Poisson regression and robust nonparametric regression

    CERN Document Server

    Zhu, Feng; Feng, Weiyue; Wang, Huajian; Huang, Shaosen; Lv, Yisong; Chen, Yong

    2013-01-01

    X-ray spectral imaging provides quantitative imaging of trace elements in biological samples with high sensitivity. We propose a novel algorithm to improve the signal-to-noise ratio (SNR) of X-ray spectral images that have low photon counts. First, we estimate the image data area that belongs to the homogeneous parts through confidence interval testing. Then, we apply Poisson regression through its maximum likelihood estimation on this area to estimate the true photon counts from the Poisson-noise-corrupted data. Unlike other denoising methods based on regression analysis, we use bootstrap resampling methods to ensure the accuracy of the regression estimation. Finally, we use a robust local nonparametric regression method to estimate the baseline and subsequently subtract it from the X-ray spectral data to further improve the SNR of the data. Experiments on several real samples show that the proposed method performs better than some state-of-the-art approaches in ensuring accuracy and precision for quantit...

  19. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas a... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used to estimate production functions without the specification of a functional form. Therefore, they avoid possible misspecification errors due to the use of an unsuitable functional form. In this paper, we use parametric and non-parametric methods to identify the optimal size of Polish crop farms...

  20. Glaucoma Monitoring in a Clinical Setting Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  1. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb-Douglas or the Translog production function is used. However, the specification of a functional form for the production function involves the risk of specifying a functional form that is not similar to the “true” relationship between the inputs and the output. This misspecification might result in biased estimation results—including measures that are of interest to applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used...

  2. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    This thesis seeks to develop methodologies for the assessment of agricultural efficiency and to apply them to Lithuanian family farms. In particular, we focus on three objectives throughout the research: (i) to perform a fully non-parametric analysis of efficiency effects, (ii) to extend ... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques ... of stochasticity associated with Lithuanian family farm performance. The former technique showed that the farms differed in terms of the mean values and variance of the efficiency scores over time, with some clear patterns prevailing throughout the whole research period. The fuzzy Free Disposal Hull showed...

  3. Bayesian nonparametric meta-analysis using Polya tree mixture models.

    Science.gov (United States)

    Branscum, Adam J; Hanson, Timothy E

    2008-09-01

    A common goal in meta-analysis is estimation of a single effect measure using data from several studies that are each designed to address the same scientific inquiry. Because studies are typically conducted in geographically dispersed locations, recent developments in the statistical analysis of meta-analytic data involve the use of random effects models that account for study-to-study variability attributable to differences in environments, demographics, genetics, and other sources that lead to heterogeneity in populations. Stemming from asymptotic theory, study-specific summary statistics are modeled according to normal distributions with means representing latent true effect measures. A parametric approach subsequently models these latent measures using a normal distribution, which is strictly a convenient modeling assumption absent of theoretical justification. To eliminate the influence of overly restrictive parametric models on inferences, we consider a broader class of random effects distributions. We develop a novel hierarchical Bayesian nonparametric Polya tree mixture (PTM) model. We present methodology for testing the PTM versus a normal random effects model. These methods provide researchers a straightforward approach for conducting a sensitivity analysis of the normality assumption for random effects. An application involving meta-analysis of epidemiologic studies designed to characterize the association between alcohol consumption and breast cancer is presented, which together with results from simulated data highlights the performance of PTMs in the presence of nonnormality of effect measures in the source population.

  4. Nonparametric Cointegration Analysis of Fractional Systems With Unknown Integration Orders

    DEFF Research Database (Denmark)

    Nielsen, Morten Ørregaard

    2009-01-01

    In this paper a nonparametric variance ratio testing approach is proposed for determining the number of cointegrating relations in fractionally integrated systems. The test statistic is easily calculated without prior knowledge of the integration order of the data, the strength of the cointegrating...

  5. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move, but that this dependence vanishes after 2-3 years.

  6. Non-parametric analysis of rating transition and default data

    DEFF Research Database (Denmark)

    Fledelius, Peter; Lando, David; Perch Nielsen, Jens

    2004-01-01

    We demonstrate the use of non-parametric intensity estimation - including construction of pointwise confidence sets - for analyzing rating transition data. We find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move...

  7. Poverty and life cycle effects: A nonparametric analysis for Germany

    OpenAIRE

    Stich, Andreas

    1996-01-01

    Most empirical studies on poverty consider the extent of poverty either for the entire society or for separate groups like elderly people. However, these papers do not show what the situation looks like for persons of a certain age. In this paper, poverty measures depending on age are derived using the joint density of income and age. The density is nonparametrically estimated by weighted Gaussian kernel density estimation. Applying the conditional density of income to several poverty measures ...

  8. ANALYSIS OF TIED DATA: AN ALTERNATIVE NON-PARAMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    I. C. A. OYEKA

    2012-02-01

    This paper presents a non-parametric statistical method of analyzing two-sample data that makes provision for the possibility of ties in the data. A test statistic is developed and shown to be free of the effect of any possible ties in the data. An illustrative example is provided, and the method is shown to compare favourably with its competitor, the Mann-Whitney test, and to be more powerful than the latter when there are ties.

  9. A Bayesian nonparametric method for prediction in EST analysis

    Directory of Open Access Journals (Sweden)

    Prünster Igor

    2007-09-01

    Background: Expressed sequence tag (EST) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed with sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results: In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for (a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; (b) the number of new unique genes to be observed in a future sample; and (c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion: The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample.

  10. Central limit theorem of nonparametric estimate of spectral density functions of sample covariance matrices

    CERN Document Server

    Pan, Guangming; Zhou, Wang

    2010-01-01

    A consistent kernel estimator of the limiting spectral distribution of general sample covariance matrices was introduced in Jing, Pan, Shao and Zhou (2010). The central limit theorem of the kernel estimator is proved in this paper.

  11. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  12. Nonparametric estimate of spectral density functions of sample covariance matrices: A first step

    OpenAIRE

    2012-01-01

    The density function of the limiting spectral distribution of general sample covariance matrices is usually unknown. We propose to use kernel estimators which are proved to be consistent. A simulation study is also conducted to show the performance of the estimators.
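
    A minimal sketch of the object in question: a Gaussian kernel estimate of the eigenvalue density of a sample covariance matrix, compared with the Marchenko-Pastur limit. It assumes i.i.d. standard normal data and numpy/scipy; the consistency and CLT results of the papers are not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
p, n = 200, 800                      # dimension and sample size, ratio y = p/n = 0.25
X = rng.standard_normal((p, n))
S = X @ X.T / n                      # sample covariance matrix
eigvals = np.linalg.eigvalsh(S)

kde = gaussian_kde(eigvals)          # nonparametric estimate of the spectral density
grid = np.linspace(eigvals.min(), eigvals.max(), 200)

# Marchenko-Pastur density for unit-variance entries.
y = p / n
a, b = (1 - np.sqrt(y)) ** 2, (1 + np.sqrt(y)) ** 2
mp = np.where((grid > a) & (grid < b),
              np.sqrt(np.maximum((b - grid) * (grid - a), 0)) / (2 * np.pi * y * grid),
              0.0)

print("max |kernel estimate - MP density|:", np.max(np.abs(kde(grid) - mp)))
```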

  13. Local kernel nonparametric discriminant analysis for adaptive extraction of complex structures

    Science.gov (United States)

    Li, Quanbao; Wei, Fajie; Zhou, Shenghan

    2017-05-01

    Linear discriminant analysis (LDA) is one of the popular means of linear feature extraction. It usually performs well when the global data structure is consistent with the local data structure. Other frequently used approaches to feature extraction usually require linearity, independence, or large-sample conditions. However, in real-world applications, these assumptions are not always satisfied or cannot be tested. In this paper, we introduce an adaptive method, local kernel nonparametric discriminant analysis (LKNDA), which integrates conventional discriminant analysis with nonparametric statistics. LKNDA is adept at identifying both complex nonlinear structures and ad hoc rules. Six simulation cases demonstrate that LKNDA has both parametric and nonparametric algorithmic advantages and higher classification accuracy. The quartic unilateral kernel function may provide better robustness of prediction than other functions. LKNDA gives an alternative solution for discriminant cases of complex nonlinear feature extraction or unknown feature extraction. Finally, an application of LKNDA to the complex feature extraction of financial market activities is proposed.

  14. Non-parametric production analysis of pesticides use in the Netherlands

    NARCIS (Netherlands)

    Oude Lansink, A.G.J.M.; Silva, E.

    2004-01-01

    Many previous empirical studies on the productivity of pesticides suggest that pesticides are under-utilized in agriculture, despite the generally held belief that these inputs are substantially over-utilized. This paper uses data envelopment analysis (DEA) to calculate non-parametric measures of the
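
    A minimal sketch of an input-oriented, constant-returns DEA efficiency score computed as one linear program per decision-making unit, using scipy's linprog and a tiny synthetic farm data set; this is generic DEA, not the paper's pesticide-productivity specification.

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic data: 5 farms, 2 inputs (pesticides, labour), 1 output (yield).
X = np.array([[2.0, 5.0], [3.0, 4.0], [4.0, 6.0], [6.0, 3.0], [5.0, 5.0]])   # farms x inputs
Y = np.array([[10.0], [12.0], [11.0], [13.0], [9.0]])                        # farms x outputs
n, m = X.shape
s = Y.shape[1]

def dea_efficiency(k):
    """Input-oriented CCR efficiency of farm k: min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    # Input constraints: sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack([-X[k].reshape(-1, 1), X.T])
    # Output constraints: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for k in range(n):
    print(f"farm {k}: efficiency {dea_efficiency(k):.3f}")
```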

  15. Tremor Detection Using Parametric and Non-Parametric Spectral Estimation Methods : A Comparison with Clinical Assessment

    NARCIS (Netherlands)

    Martinez Manzanera, Octavio; Elting, Jan Willem; van der Hoeven, Johannes H; Maurits, Natasha M

    2016-01-01

    In the clinic, tremor is diagnosed during a time-limited process in which patients are observed and the characteristics of tremor are visually assessed. For some tremor disorders, a more detailed analysis of these characteristics is needed. Accelerometry and electromyography can be used to obtain a

  16. Comparison of Rank Analysis of Covariance and Nonparametric Randomized Blocks Analysis.

    Science.gov (United States)

    Porter, Andrew C.; McSweeney, Maryellen

    The relative power of three possible experimental designs under the condition that data is to be analyzed by nonparametric techniques; the comparison of the power of each nonparametric technique to its parametric analogue; and the comparison of relative powers using nonparametric and parametric techniques are discussed. The three nonparametric…

  17. Spline Nonparametric Regression Analysis of Stress-Strain Curve of Confined Concrete

    Directory of Open Access Journals (Sweden)

    Tavio Tavio

    2008-01-01

    Due to the enormous uncertainties in confinement models associated with the maximum compressive strength and ductility of concrete confined by rectilinear ties, the implementation of spline nonparametric regression analysis is proposed herein as an alternative approach. The statistical evaluation is carried out based on 128 large-scale column specimens of either normal- or high-strength concrete tested under uniaxial compression. The main advantage of this kind of analysis is that it can be applied when the trend of the relation between predictor and response variables is not obvious. The error in the analysis can, therefore, be minimized so that it does not depend on the assumption of a particular shape of the curve. This provides higher flexibility in the application. The results of the statistical analysis indicate that the stress-strain curves of confined concrete obtained from the spline nonparametric regression analysis are in good agreement with the experimental curves available in the literature.
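
    A minimal sketch of fitting a smoothing spline to noisy stress-strain measurements with scipy; the data are synthetic and only roughly shaped like a confined-concrete curve, and the smoothing parameter is illustrative rather than the paper's choice.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
strain = np.linspace(0, 0.02, 120)
# Synthetic curve: rise to a peak stress then gradual softening, plus measurement noise.
stress_true = 40 * (strain / 0.004) * np.exp(1 - strain / 0.004)
stress_obs = stress_true + rng.normal(0, 1.0, strain.size)

# The smoothing factor s controls the bias/variance trade-off; it plays the role
# of the flexibility the abstract refers to.
spline = UnivariateSpline(strain, stress_obs, k=3, s=len(strain) * 1.0)

peak_idx = np.argmax(spline(strain))
print(f"Estimated peak stress {spline(strain)[peak_idx]:.1f} MPa at strain {strain[peak_idx]:.4f}")
```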

  18. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression ... to avoid this problem. The main objective is to investigate the applicability of the nonparametric kernel regression method in applied production analysis. The focus of the empirical analyses included in this thesis is the agricultural sector in Poland. Data on Polish farms are used to investigate ... practically and politically relevant problems and to illustrate how nonparametric regression methods can be used in applied microeconomic production analysis both in panel data and cross-section data settings. The thesis consists of four papers. The first paper addresses problems of parametric...

  19. Rapid spectral analysis for spectral imaging.

    Science.gov (United States)

    Jacques, Steven L; Samatham, Ravikant; Choudhury, Niloy

    2010-07-15

    Spectral imaging requires rapid analysis of spectra associated with each pixel. A rapid algorithm has been developed that uses iterative matrix inversions to solve for the absorption spectra of a tissue using a lookup table for photon pathlength based on numerical simulations. The algorithm uses tissue water content as an internal standard to specify the strength of optical scattering. An experimental example is presented on the spectroscopy of portwine stain lesions. When implemented in MATLAB, the method is ~100-fold faster than using fminsearch().

  20. SPECTRAL ANALYSIS OF RADIOXENON

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Matthew W.; Bowyer, Ted W.; Hayes, James C.; Heimbigner, Tom R.; Hubbard, Charles W.; McIntyre, Justin I.; Schrom, Brian T.

    2008-09-23

    Monitoring changes in atmospheric radioxenon concentrations is a major tool in the detection of an underground nuclear explosion. Ground-based systems like the Automated Radioxenon Sampler/Analyzer (ARSA), the Swedish Unattended Noble gas Analyzer (SAUNA) and the Automatic portable radiometer of isotopes Xe (ARIX) can collect and detect several radioxenon isotopes by processing and transferring samples into a high-efficiency beta-gamma coincidence detector. The high-efficiency beta-gamma coincidence detector makes these systems highly sensitive to the radioxenon isotopes 133Xe, 131mXe, 133mXe and 135Xe. The standard analysis uses regions of interest (ROI) to determine the amount of a particular radioxenon isotope present. The ROI method relies on the peaks of interest falling within the energy limits of the ROI. Some potential problems inherent in this method are the reliance on stable detector gains and a fixed resolution for each energy peak. In addition, when a high-activity sample is measured there will be more interference among the ROIs, in particular within the 133Xe, 133mXe, and 131mXe regions. A solution to some of these problems can be obtained through spectral fitting of the data. Spectral fitting is simply the fitting of the peaks using known functions to determine the number of peaks and their relative positions and widths. Knowing this information makes it possible to determine which isotopes are present. The area under each peak can then be used to determine an overall concentration for each isotope. Using the areas of the peaks, several key detector characteristics can be determined: efficiency, energy calibration, energy resolution and ratios between interfering isotopes (radon daughters).
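
    A minimal sketch of the spectral-fitting idea: fit Gaussian peaks plus a flat background to a one-dimensional count spectrum and integrate the fitted peak areas instead of summing fixed regions of interest. The energies, widths and data are synthetic and illustrative, not calibration values for any of the systems named above.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_peaks(x, a1, mu1, s1, a2, mu2, s2, bg):
    """Two Gaussian photopeaks on a constant background."""
    g1 = a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2)
    return g1 + g2 + bg

# Synthetic spectrum with peaks near 81 keV and 250 keV (illustrative energies only).
energy = np.linspace(0, 400, 800)
rng = np.random.default_rng(0)
true = two_peaks(energy, 150, 81, 5, 60, 250, 8, 5)
counts = rng.poisson(true)

p0 = [100, 80, 4, 50, 245, 6, 1]                       # rough initial guesses
popt, _ = curve_fit(two_peaks, energy, counts, p0=p0)

# Area of a Gaussian peak = amplitude * sigma * sqrt(2*pi); proportional to activity.
for amp, mu, sig in (popt[0:3], popt[3:6]):
    print(f"peak at {mu:.1f} keV, area {amp * sig * np.sqrt(2 * np.pi):.0f} counts")
```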

  1. Multilevel Latent Class Analysis: Parametric and Nonparametric Models

    Science.gov (United States)

    Finch, W. Holmes; French, Brian F.

    2014-01-01

    Latent class analysis is an analytic technique often used in educational and psychological research to identify meaningful groups of individuals within a larger heterogeneous population based on a set of variables. This technique is flexible, encompassing not only a static set of variables but also longitudinal data in the form of growth mixture…

  2. Applications of non-parametric statistics and analysis of variance on sample variances

    Science.gov (United States)

    Myers, R. H.

    1981-01-01

    Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to offer recommendations as to when each method would be applicable, and to compare the methods, when possible, with the usual normal-theory procedures available for the Gaussian analogue. It is important here to point out the hypotheses that are being tested, the assumptions that are being made, and the limitations of the nonparametric procedures. The appropriateness of doing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects. On the surface this would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.
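
    A minimal sketch of two of the alternatives touched on above: a rank-based Kruskal-Wallis test in place of one-way ANOVA, and a Brown-Forsythe/Levene test as a more robust route to comparing group variances than running ANOVA directly on sample variances. Synthetic groups and scipy only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Three simulation "conditions" with equal means but one with inflated variance.
g1 = rng.normal(0.0, 1.0, 40)
g2 = rng.normal(0.0, 1.0, 40)
g3 = rng.normal(0.0, 2.5, 40)

# Rank-based alternative to one-way ANOVA on the means (no normality assumption).
h_stat, p_kw = stats.kruskal(g1, g2, g3)

# Brown-Forsythe variant of Levene's test: compares spread without assuming normality.
w_stat, p_lev = stats.levene(g1, g2, g3, center="median")

print(f"Kruskal-Wallis p = {p_kw:.3f} (means), Levene/Brown-Forsythe p = {p_lev:.3f} (variances)")
```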

  3. The Use of Nonparametric Kernel Regression Methods in Econometric Production Analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard

    This PhD thesis addresses one of the fundamental problems in applied econometric analysis, namely the econometric estimation of regression functions. The conventional approach to regression analysis is the parametric approach, which requires the researcher to specify the form of the regression function. However, the a priori specification of a functional form involves the risk of choosing one that is not similar to the “true” but unknown relationship between the regressors and the dependent variable. This problem, known as parametric misspecification, can result in biased parameter estimates ... and nonparametric estimations of production functions in order to evaluate the optimal firm size. The second paper discusses the use of parametric and nonparametric regression methods to estimate panel data regression models. The third paper analyses production risk, price uncertainty, and farmers' risk preferences...

  4. Multi-Directional Non-Parametric Analysis of Agricultural Efficiency

    DEFF Research Database (Denmark)

    Balezentis, Tomas

    ... the Multi-Directional Efficiency Analysis approach, and (iii) to account for uncertainties via the use of probabilistic and fuzzy measures. Therefore, the thesis encompasses six papers dedicated to (combinations of) these objectives. One of the main contributions of this thesis is a number of extensions ... relative to labour, intermediate consumption and land (in some cases land was not treated as a discretionary input). These findings call for further research on relationships among financial structure, investment decisions, and efficiency in Lithuanian family farms. Application of different techniques...

  5. SPECTRAL ANALYSIS OF EXCHANGE RATES

    Directory of Open Access Journals (Sweden)

    ALEŠA LOTRIČ DOLINAR

    2013-06-01

    Using spectral analysis is very common in technical areas but rather unusual in economics and finance, where ARIMA and GARCH modeling are much more in use. To show that spectral analysis can also be useful in determining hidden periodic components for high-frequency finance data, we use the example of foreign exchange rates.

  6. Timescale Analysis of Spectral Lags

    Institute of Scientific and Technical Information of China (English)

    Ti-Pei Li; Jin-Lu Qu; Hua Feng; Li-Ming Song; Guo-Qiang Ding; Li Chen

    2004-01-01

    A technique for timescale analysis of spectral lags, performed directly in the time domain, is developed. Simulation studies are made to compare the time-domain technique with Fourier frequency analysis for spectral time lags. The time-domain technique is applied to studying rapid variabilities of X-ray binaries and γ-ray bursts. The results indicate that, in comparison with Fourier analysis, the timescale analysis technique is more powerful for the study of spectral lags in rapid variabilities on short time scales and in short-duration flaring phenomena.
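
    A minimal sketch of a time-domain lag estimate between two energy-band light curves: smooth each series on the timescale of interest and locate the peak of their cross-correlation. The light curves are synthetic and this is a generic CCF lag, not the paper's exact statistic.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 0.01, 4096                                                        # 10 ms bins
base = np.convolve(rng.normal(size=n), np.ones(20) / 20, mode="same")     # smooth variability
soft = base + 0.2 * rng.normal(size=n)
hard = np.roll(base, 5) + 0.2 * rng.normal(size=n)                        # hard band lags by 5 bins = 50 ms

def ccf_lag(x, y, max_lag):
    """Lag (in bins) at which the cross-correlation of x and y peaks."""
    x = x - x.mean()
    y = y - y.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.sum(x[max(0, -L):n - max(0, L)] * y[max(0, L):n - max(0, -L)]) for L in lags]
    return lags[int(np.argmax(cc))]

print("estimated lag:", ccf_lag(soft, hard, max_lag=50) * dt, "s")
```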

  7. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

    Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns arising in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in omics data for studying their association with disease and health.

  8. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns arising in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring the complex relationships in the omics data for studying their association with disease and health.

  9. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns arising in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray...

  10. Spectral Analysis of Markov Chains

    OpenAIRE

    2007-01-01

    The paper deals with the statistical analysis of Markov chains in connection with the spectral density. We present expressions for the spectral density function, which may be used to estimate the parameters of the Markov chain.

  11. Non-parametric analysis of infrared spectra for recognition of glass and glass ceramic fragments in recycling plants.

    Science.gov (United States)

    Farcomeni, Alessio; Serranti, Silvia; Bonifazi, Giuseppe

    2008-01-01

    Glass ceramic detection in glass recycling plants represents a still unsolved problem, as glass ceramic material looks like normal glass and is usually detected only by specialized personnel. The presence of glass-like contaminants inside waste glass products, resulting from both industrial and differentiated urban waste collection, increases process production costs and reduces final product quality. In this paper an innovative approach for glass ceramic recognition, based on the non-parametric analysis of infrared spectra, is proposed and investigated. The work specifically addressed the spectral classification of glass and glass ceramic fragments collected in an actual recycling plant from three different production lines: flat glass, colored container-glass and white container-glass. The analyses, carried out in the near- and mid-infrared (NIR-MIR) spectral range (1280-4480 nm), show that glass ceramic and glass fragments can be recognized by applying a wavelet transform, with a small classification error. Moreover, a method for selecting only a small subset of relevant wavelength ratios is suggested, allowing fast recognition of the two classes of materials. The results show how the proposed approach can be utilized to develop a classification engine to be integrated into a hardware and software sorting architecture for fast "on-line" ceramic glass recognition and separation.
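
    A minimal sketch of the wavelet-plus-classifier pipeline described above, assuming the PyWavelets (pywt) package and synthetic spectra in place of the NIR-MIR measurements; the paper's wavelength-ratio selection step is not reproduced.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
wavelengths = np.linspace(1280, 4480, 256)

def synth_spectrum(is_ceramic):
    """Synthetic absorbance curve; 'glass ceramic' gets an extra broad absorption band."""
    base = 0.3 + 0.1 * np.sin(wavelengths / 400.0)
    if is_ceramic:
        base += 0.2 * np.exp(-0.5 * ((wavelengths - 2800) / 150.0) ** 2)
    return base + rng.normal(0, 0.02, wavelengths.size)

X_raw = np.array([synth_spectrum(i % 2 == 1) for i in range(200)])
y = np.array([i % 2 for i in range(200)])

def wavelet_features(spectrum, wavelet="db4", level=4):
    """Concatenate energies of the wavelet coefficient bands as a compact feature vector."""
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

X_feat = np.array([wavelet_features(s) for s in X_raw])
clf = KNeighborsClassifier(n_neighbors=5).fit(X_feat[:150], y[:150])
print("held-out accuracy:", clf.score(X_feat[150:], y[150:]))
```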

  12. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    Science.gov (United States)

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  13. Nonparametric Bayesian Dictionary Learning for Analysis of Noisy and Incomplete Images

    Science.gov (United States)

    2010-04-01

    [Abstract not available for this record; the indexed text consists of table fragments comparing KSVD and BPFA denoising and RGB image interpolation PSNR results on standard test images (patch sizes 8x8 and 7x7), plus a stray reference to T. Ferguson, "A Bayesian analysis of some nonparametric problems," Annals of Statistics, 1:209-230.]

  14. An adaptive nonparametric method in benchmark analysis for bioassay and environmental studies.

    Science.gov (United States)

    Bhattacharya, Rabi; Lin, Lizhen

    2010-12-01

    We present a novel nonparametric method for bioassay and benchmark analysis in risk assessment, which averages isotonic MLEs based on disjoint subgroups of dosages. The asymptotic theory for the methodology is derived, showing that the MISEs (mean integrated squared errors) of the estimates of both the dose-response curve F and its inverse F^{-1} achieve the optimal rate O(N^{-4/5}). Also, we compute the asymptotic distribution of the estimate ζ̃_p of the effective dosage ζ_p = F^{-1}(p), which is shown to have an optimally small asymptotic variance.
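
    A minimal sketch of the isotonic building block the method averages over: a monotone fit of response probability against dose, inverted numerically to read off an effective dosage such as ED50. It uses scikit-learn and synthetic bioassay data; the subgroup averaging and asymptotics of the paper are not reproduced.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
doses = np.repeat(np.linspace(0.5, 5.0, 10), 30)          # 10 dosage groups, 30 subjects each
p_true = 1 / (1 + np.exp(-(doses - 2.5)))                  # true monotone dose-response
responses = rng.binomial(1, p_true)

iso = IsotonicRegression(y_min=0.0, y_max=1.0, increasing=True)
fitted = iso.fit_transform(doses, responses)               # isotonic MLE of F(dose)

def effective_dose(p):
    """Smallest dose whose fitted response probability reaches p."""
    grid = np.linspace(doses.min(), doses.max(), 500)
    fhat = iso.predict(grid)
    idx = np.searchsorted(fhat, p)
    return grid[min(idx, len(grid) - 1)]

print("estimated ED50:", effective_dose(0.5))
```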

  15. Evolution of the CMB Power Spectrum Across WMAP Data Releases: A Nonparametric Analysis

    CERN Document Server

    Aghamousa, Amir; Souradeep, Tarun

    2011-01-01

    We present a comparative analysis of the WMAP 1-, 3-, 5-, and 7-year data releases for the CMB angular power spectrum, with respect to the following three key questions: (a) How well is the angular power spectrum determined by the data alone? (b) How well is the Lambda-CDM model supported by a model-independent, data-driven analysis? (c) What are the realistic uncertainties on peak/dip locations and heights? Our analysis is based on a nonparametric function estimation methodology [1,2]. Our results show that the height of the power spectrum is well determined by data alone for multipole index l approximately less than 600 (1-year), 800 (3-year), and 900 (5- and 7-year data realizations). We also show that parametric fits based on the Lambda-CDM model are remarkably close to our nonparametric fit in l-regions where the data are sufficiently precise. A contrasting example is provided by an H-Lambda-CDM model: As the data become precise with successive data realizations, the H-Lambda-CDM angular power spectrum g...

  16. Bayesian Nonparametric Regression Analysis of Data with Random Effects Covariates from Longitudinal Measurements

    KAUST Repository

    Ryu, Duchwan

    2010-09-28

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves. © 2010, The International Biometric Society.

  17. Bayesian nonparametric regression analysis of data with random effects covariates from longitudinal measurements.

    Science.gov (United States)

    Ryu, Duchwan; Li, Erning; Mallick, Bani K

    2011-06-01

    We consider nonparametric regression analysis in a generalized linear model (GLM) framework for data with covariates that are the subject-specific random effects of longitudinal measurements. The usual assumption that the effects of the longitudinal covariate processes are linear in the GLM may be unrealistic and if this happens it can cast doubt on the inference of observed covariate effects. Allowing the regression functions to be unknown, we propose to apply Bayesian nonparametric methods including cubic smoothing splines or P-splines for the possible nonlinearity and use an additive model in this complex setting. To improve computational efficiency, we propose the use of data-augmentation schemes. The approach allows flexible covariance structures for the random effects and within-subject measurement errors of the longitudinal processes. The posterior model space is explored through a Markov chain Monte Carlo (MCMC) sampler. The proposed methods are illustrated and compared to other approaches, the "naive" approach and the regression calibration, via simulations and by an application that investigates the relationship between obesity in adulthood and childhood growth curves.

  18. Towards Nonstationary, Nonparametric Independent Process Analysis with Unknown Source Component Dimensions

    CERN Document Server

    Szabo, Zoltan

    2010-01-01

    The goal of this paper is to extend independent subspace analysis (ISA) to the case of (i) nonparametric, not strictly stationary source dynamics and (ii) unknown source component dimensions. We make use of functional autoregressive (fAR) processes to model the temporal evolution of the hidden sources. An extension of the ISA separation principle--which states that the ISA problem can be solved by traditional independent component analysis (ICA) and clustering of the ICA elements--is derived for the solution of the defined fAR independent process analysis task (fAR-IPA): applying fAR identification we reduce the problem to ISA. A local averaging approach, the Nadaraya-Watson kernel regression technique is adapted to obtain strongly consistent fAR estimation. We extend the Amari-index to different dimensional components and illustrate the efficiency of the fAR-IPA approach by numerical examples.

  19. Substitution dynamical systems spectral analysis

    CERN Document Server

    Queffélec, Martine

    2010-01-01

    This volume mainly deals with the dynamics of finitely valued sequences, and more specifically, of sequences generated by substitutions and automata. Those sequences demonstrate fairly simple combinatorial and arithmetical properties and naturally appear in various domains. As the title suggests, the aim of the initial version of this book was the spectral study of the associated dynamical systems: the first chapters consisted of a detailed introduction to the mathematical notions involved, and the description of the spectral invariants followed in the closing chapters. This approach, combined with new material added to the new edition, results in a nearly self-contained book on the subject. New tools - which have also proven helpful in other contexts - had to be developed for this study. Moreover, its findings can be concretely applied, the method providing an algorithm to exhibit the spectral measures and the spectral multiplicity, as is demonstrated in several examples. Beyond this advanced analysis, many...

  20. A Level Set Analysis and A Nonparametric Regression on S&P 500 Daily Return

    Directory of Open Access Journals (Sweden)

    Yipeng Yang

    2016-02-01

    In this paper, a level set analysis is proposed which aims to analyze S&P 500 returns of a certain magnitude. It is found that the process of large jumps/drops in returns tends to have negative serial correlation, and the volatility clustering phenomenon is easily seen. Then, a nonparametric analysis is performed and new patterns are discovered. An ARCH model is constructed based on the patterns we discovered, and it is capable of manifesting the volatility skew in option pricing. A comparison of our model with the GARCH(1,1) model is carried out. An explanation of the validity of our model through prospect theory is provided, and, as a novelty, we link the volatility skew phenomenon to prospect theory in behavioral finance.

  1. Non-parametric frequency analysis of extreme values for integrated disaster management considering probable maximum events

    Science.gov (United States)

    Takara, K. T.

    2015-12-01

    This paper describes a non-parametric frequency analysis method for hydrological extreme-value samples with a size larger than 100, verifying the estimation accuracy with computer-intensive statistics (CIS) resampling such as the bootstrap. Probable maximum values are also incorporated into the analysis for extreme events larger than the design level for flood control. Traditional parametric frequency analysis of extreme values includes the following steps: Step 1: collecting and checking extreme-value data; Step 2: enumerating probability distributions that would fit the data well; Step 3: parameter estimation; Step 4: testing goodness of fit; Step 5: checking the variability of quantile (T-year event) estimates by the jackknife resampling method; and Step 6: selecting the best distribution (final model). The non-parametric method (NPM) proposed here can skip Steps 2, 3, 4 and 6. Comparing traditional parametric methods (PM) with the NPM, this paper shows that the PM often underestimates 100-year quantiles for annual maximum rainfall samples with records of more than 100 years. Overestimation examples are also demonstrated. Bootstrap resampling can correct the bias of the NPM and can also quantify the estimation accuracy through the bootstrap standard error. The NPM thus has the advantage of avoiding various difficulties in the above-mentioned steps of the traditional PM. Probable maximum events are also incorporated into the NPM as an upper bound on the hydrological variable; probable maximum precipitation (PMP) and probable maximum flood (PMF) can serve as new parameter values combined with the NPM. An idea of how to incorporate these values into frequency analysis is proposed for better management of disasters that exceed the design level. The idea stimulates a more integrated approach by geoscientists and statisticians, and encourages practitioners to consider worst-case disasters in their disaster management planning and practices.
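
    The sketch below illustrates the bootstrap step described above for a plug-in (non-parametric) 100-year quantile estimate; the annual-maximum sample is synthetic and the routine is a generic illustration rather than the author's procedure.

```python
import numpy as np

def bootstrap_quantile(sample, prob, n_boot=2000, seed=0):
    """Plug-in quantile estimate with bootstrap standard error and bias."""
    rng = np.random.default_rng(seed)
    point = np.quantile(sample, prob)
    boot = np.array([
        np.quantile(rng.choice(sample, size=sample.size, replace=True), prob)
        for _ in range(n_boot)
    ])
    return point, boot.std(ddof=1), boot.mean() - point  # estimate, SE, bias

# Illustrative annual-maximum rainfall sample (mm), size > 100 as in the abstract
rng = np.random.default_rng(1)
annual_max = rng.gumbel(loc=120.0, scale=35.0, size=120)

q100, se, bias = bootstrap_quantile(annual_max, prob=1 - 1 / 100)
print(f"100-year quantile ~ {q100:.1f} mm, bootstrap SE {se:.1f} mm, bias {bias:.1f} mm")
```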

  2. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    Science.gov (United States)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with input data being subdivided into clusters with identical statistical parameters, such as mean and standard deviation, for each cluster to estimate shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even if neutron fluences were increased.
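
    The surveillance database itself is not available here; as a loose stand-in for the BNP clustering described, the sketch below fits a truncated Dirichlet-process Gaussian mixture (scikit-learn's BayesianGaussianMixture) to synthetic embrittlement-like features, letting the effective number of clusters be inferred from the data.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic stand-in features: (Cu content, fluence, DBTT shift), three latent groups
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal([0.05, 1.0, 20.0],  [0.01, 0.2, 5.0],  size=(100, 3)),
    rng.normal([0.15, 3.0, 60.0],  [0.02, 0.4, 8.0],  size=(100, 3)),
    rng.normal([0.25, 6.0, 120.0], [0.03, 0.6, 12.0], size=(100, 3)),
])

# Truncated Dirichlet-process mixture: unused components receive ~zero weight,
# so the effective number of clusters is driven by the data.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

labels = dpgmm.predict(X)
print("effective clusters:", np.unique(labels).size)
print("mixture weights   :", np.round(dpgmm.weights_, 3))
```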

  3. Enveloping Spectral Surfaces: Covariate Dependent Spectral Analysis of Categorical Time Series.

    Science.gov (United States)

    Krafty, Robert T; Xiong, Shuangyan; Stoffer, David S; Buysse, Daniel J; Hall, Martica

    2012-09-01

    Motivated by problems in Sleep Medicine and Circadian Biology, we present a method for the analysis of cross-sectional categorical time series collected from multiple subjects where the effect of static continuous-valued covariates is of interest. Toward this goal, we extend the spectral envelope methodology for the frequency domain analysis of a single categorical process to cross-sectional categorical processes that are possibly covariate dependent. The analysis introduces an enveloping spectral surface for describing the association between the frequency domain properties of qualitative time series and covariates. The resulting surface offers an intuitively interpretable measure of association between covariates and a qualitative time series by finding the maximum possible conditional power at a given frequency from scalings of the qualitative time series conditional on the covariates. The optimal scalings that maximize the power provide scientific insight by identifying the aspects of the qualitative series which have the most pronounced periodic features at a given frequency conditional on the value of the covariates. To facilitate the assessment of the dependence of the enveloping spectral surface on the covariates, we include a theory for analyzing the partial derivatives of the surface. Our approach is entirely nonparametric, and we present estimation and asymptotics in the setting of local polynomial smoothing.

  4. Trend Analysis of Golestan's Rivers Discharges Using Parametric and Non-parametric Methods

    Science.gov (United States)

    Mosaedi, Abolfazl; Kouhestani, Nasrin

    2010-05-01

    One of the major challenges for human life is climate change and its consequences, which will alter river discharges. The aim of this research is to analyze trends in the seasonal and yearly river discharges of Golestan province (Iran). Four trend analysis methods, namely the conjunction point method, linear regression, Wald-Wolfowitz and Mann-Kendall, were applied to river discharges over seasonal and annual periods at significance levels of 95% and 99%. First, daily discharge data from 12 hydrometric stations with a record length of 42 years (1965-2007) were selected; after some common statistical tests, such as homogeneity tests (the G-B and M-W tests), the four trend analysis tests were applied. Results show that at all stations the summer time series exhibit decreasing trends at the 99% significance level according to the Mann-Kendall (M-K) test. For the autumn time series, all four methods give similar results. For the other periods, the results of the four tests were broadly similar, although for some stations they differed. Keywords: trend analysis, discharge, non-parametric methods, Wald-Wolfowitz, Mann-Kendall test, Golestan Province.
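
    A minimal implementation of the Mann-Kendall test used above is sketched below (normal approximation, no tie correction); the discharge series is synthetic and only illustrates a decreasing trend.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (normal approximation, no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    # S = number of increasing pairs minus number of decreasing pairs
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))   # two-sided p-value
    return s, z, p

# Synthetic summer discharge series with a mild downward trend (m3/s)
rng = np.random.default_rng(0)
years = np.arange(1965, 2008)
discharge = 30.0 - 0.15 * (years - 1965) + rng.normal(0, 2.0, years.size)
s, z, p = mann_kendall(discharge)
print(f"S = {s:.0f}, Z = {z:.2f}, p = {p:.4f}")   # a significant decreasing trend is expected
```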

  5. Spectral analysis of bedform dynamics

    DEFF Research Database (Denmark)

    Winter, Christian; Ernstsen, Verner Brandbyge; Noormets, Riko

    An assessment of bedform migration was achieved, as the growth and displacement of every single constituent can be distinguished. It can be shown that the changes in amplitude remain small for all harmonic constituents, whereas the phase shifts differ significantly; thus the harmonics can be classified. ... The proposed method overcomes the above-mentioned problems of common descriptive analysis, as it is an objective and straightforward mathematical process. The spectral decomposition of superimposed dunes allows a detailed description and analysis of dune patterns and migration. ...

  6. APPLICATION OF PARAMETRIC AND NON-PARAMETRIC BENCHMARKING METHODS IN COST EFFICIENCY ANALYSIS OF THE ELECTRICITY DISTRIBUTION SECTOR

    Directory of Open Access Journals (Sweden)

    Andrea Furková

    2007-06-01

    Full Text Available This paper explores the application of parametric and non-parametric benchmarking methods in measuring the cost efficiency of Slovak and Czech electricity distribution companies. We compare the relative cost efficiency of Slovak and Czech distribution companies using two benchmarking methods: the non-parametric Data Envelopment Analysis (DEA) and the parametric Stochastic Frontier Analysis (SFA). The first part of the analysis was based on DEA models: traditional cross-section CCR and BCC models were modified for cost efficiency estimation. In the further analysis we focus on two versions of the stochastic frontier cost function using panel data: an MLE model and a GLS model. These models have been applied to an unbalanced panel of 11 regional electricity distribution utilities (3 Slovak and 8 Czech) over the period from 2000 to 2004. The differences in estimated scores, parameters and rankings of utilities were analyzed. We observed significant differences between the parametric methods and the DEA approach.

  7. European regional efficiency and geographical externalities: a spatial nonparametric frontier analysis

    Science.gov (United States)

    Ramajo, Julián; Cordero, José Manuel; Márquez, Miguel Ángel

    2017-10-01

    This paper analyses region-level technical efficiency in nine European countries over the 1995-2007 period. We propose the application of a nonparametric conditional frontier approach to account for the presence of heterogeneous conditions in the form of geographical externalities. Such environmental factors are beyond the control of regional authorities, but may affect the production function. Therefore, they need to be considered in the frontier estimation. Specifically, a spatial autoregressive term is included as an external conditioning factor in a robust order-m model. Thus we can test the hypothesis of non-separability (the external factor impacts both the input-output space and the distribution of efficiencies), demonstrating the existence of significant global interregional spillovers into the production process. Our findings show that geographical externalities affect both the frontier level and the probability of being more or less efficient. Specifically, the results support the fact that the spatial lag variable has an inverted U-shaped non-linear impact on the performance of regions. This finding can be interpreted as a differential effect of interregional spillovers depending on the size of the neighboring economies: positive externalities for small values, possibly related to agglomeration economies, and negative externalities for high values, indicating the possibility of production congestion. Additionally, evidence of the existence of a strong geographic pattern of European regional efficiency is reported and the levels of technical efficiency are acknowledged to have converged during the period under analysis.

  8. SPECIES-SPECIFIC FOREST VARIABLE ESTIMATION USING NON-PARAMETRIC MODELING OF MULTI-SPECTRAL PHOTOGRAMMETRIC POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    J. Bohlin

    2012-07-01

    Full Text Available The recent development in software for automatic photogrammetric processing of multispectral aerial imagery, and the growing nation-wide availability of Digital Elevation Model (DEM) data, are about to revolutionize data capture for forest management planning in Scandinavia. Using only already available aerial imagery and ALS-assessed DEM data, raster estimates of the forest variables mean tree height, basal area, total stem volume, and species-specific stem volumes were produced and evaluated. The study was conducted at a coniferous hemi-boreal test site in southern Sweden (lat. 58° N, long. 13° E). Digital aerial images from the Zeiss/Intergraph Digital Mapping Camera system were used to produce 3D point-cloud data with spectral information. Metrics were calculated for 696 field plots (10 m radius) from the point-cloud data and used in k-MSN to estimate forest variables. For these stands, the tree height ranged from 1.4 to 33.0 m (mean 18.1 m), stem volume from 0 to 829 m3 ha-1 (mean 249 m3 ha-1) and basal area from 0 to 62.2 m2 ha-1 (mean 26.1 m2 ha-1), with a mean stand size of 2.8 ha. Estimates made using digital aerial images corresponding to the standard acquisition of the Swedish National Land Survey (Lantmäteriet) showed RMSEs (in percent of the surveyed stand mean) of 7.5% for tree height, 11.4% for basal area, 13.2% for total stem volume, 90.6% for pine stem volume, 26.4% for spruce stem volume, and 72.6% for deciduous stem volume. The results imply that photogrammetric matching of digital aerial images has significant potential for operational use in forestry.
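
    k-MSN itself is not implemented here; the sketch below uses a k-nearest-neighbour regressor on synthetic plot-level metrics as a simplified stand-in for the estimation idea, reporting a cross-validated RMSE in percent of the mean as in the abstract.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_predict

# Synthetic stand-ins for point-cloud metrics per field plot
# (e.g. height percentiles, canopy density, mean spectral values).
rng = np.random.default_rng(0)
n_plots = 696
X = rng.uniform(0, 1, size=(n_plots, 6))
# Synthetic "true" stem volume loosely driven by the first two metrics
volume = 400 * X[:, 0] + 200 * X[:, 1] ** 2 + rng.normal(0, 30, n_plots)

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
pred = cross_val_predict(knn, X, volume, cv=10)   # 10-fold cross-validated predictions

rmse = np.sqrt(np.mean((pred - volume) ** 2))
print(f"RMSE = {rmse:.1f} ({100 * rmse / volume.mean():.1f}% of the mean)")
```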

  9. Non-parametric seismic hazard analysis in the presence of incomplete data

    Science.gov (United States)

    Yazdani, Azad; Mirzaei, Sajjad; Dadkhah, Koroush

    2017-01-01

    The distribution of earthquake magnitudes plays a crucial role in the estimation of seismic hazard parameters. Due to the complexity of earthquake magnitude distributions, non-parametric approaches are recommended over classical parametric methods. The main deficiency of the non-parametric approach is the lack of complete magnitude data in almost all cases. This study introduces an imputation procedure for completing earthquake catalog data so that the catalog can be used for non-parametric density estimation. Using a Monte Carlo simulation, the efficiency of the introduced approach is investigated. This study indicates that when a magnitude catalog is incomplete, the imputation procedure can provide an appropriate tool for seismic hazard assessment. As an illustration, the imputation procedure was applied to estimate the earthquake magnitude distribution in Tehran, the capital city of Iran.
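
    The paper's imputation procedure is not detailed in the record; the sketch below only illustrates the general workflow on a synthetic catalog: complete the missing magnitudes (here by naive resampling of observed values) and then form a non-parametric kernel density estimate.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic catalog: exponentially distributed magnitudes above M4
# (Gutenberg-Richter-like), with 20% of the magnitude entries missing.
mags = 4.0 + rng.exponential(scale=0.5, size=500)
missing = rng.uniform(size=mags.size) < 0.2
observed = mags[~missing]

# Naive illustrative imputation: draw missing magnitudes from the observed ones.
# (The paper's procedure is more elaborate; this only completes the catalog
# so that a nonparametric density estimate can be formed.)
imputed = rng.choice(observed, size=missing.sum(), replace=True)
completed = np.concatenate([observed, imputed])

kde = gaussian_kde(completed)
grid = np.linspace(4.0, 7.0, 5)
print(np.round(kde(grid), 3))   # nonparametric magnitude density on a coarse grid
```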

  10. Analysis of intravenous glucose tolerance test data using parametric and nonparametric modeling: application to a population at risk for diabetes.

    Science.gov (United States)

    Marmarelis, Vasilis Z; Shin, Dae C; Zhang, Yaping; Kautzky-Willer, Alexandra; Pacini, Giovanni; D'Argenio, David Z

    2013-07-01

    Modeling studies of the insulin-glucose relationship have mainly utilized parametric models, most notably the minimal model (MM) of glucose disappearance. This article presents results from the comparative analysis of the parametric MM and a nonparametric Laguerre-based Volterra model (LVM) applied to the analysis of insulin-modified (IM) intravenous glucose tolerance test (IVGTT) data from a clinical study of gestational diabetes mellitus (GDM). An IM IVGTT study was performed 8 to 10 weeks postpartum in 125 women who were diagnosed with GDM during their pregnancy [population at risk of developing diabetes (PRD)] and in 39 control women with normal pregnancies (control subjects). The measured plasma glucose and insulin from the IM IVGTT in each group were analyzed via a population analysis approach to estimate the insulin sensitivity parameter of the parametric MM. In the nonparametric LVM analysis, the glucose and insulin data were used to calculate the first-order kernel, from which a diagnostic scalar index representing the integrated effect of insulin on glucose was derived. Both the parametric MM and the nonparametric LVM describe the glucose concentration data in each group with good fidelity, with an improved measured-versus-predicted r² value for the LVM of 0.99 versus 0.97 for the MM analysis in the PRD. However, application of the respective diagnostic indices of the two methods does result in a different classification of 20% of the individuals in the PRD. It was found that the data-based nonparametric LVM revealed additional insights about the manner in which infused insulin affects blood glucose concentration. © 2013 Diabetes Technology Society.

  11. An exact predictive recursion for Bayesian nonparametric analysis of incomplete data

    OpenAIRE

    Garibaldi, Ubaldo; Viarengo, Paolo

    2010-01-01

    This paper presents a new derivation of nonparametric distribution estimation with right-censored data. It is based on an extension of the predictive inferences to compound evidence. The estimate is recursive and exact, and no stochastic approximation is needed: it simply requires that the censored data are processed in decreasing order. Only in this case the recursion provides exact posterior predictive distributions for subsequent samples under a Dirichlet process prior. The resulting estim...

  12. Parametric and Nonparametric EEG Analysis for the Evaluation of EEG Activity in Young Children with Controlled Epilepsy

    Directory of Open Access Journals (Sweden)

    Vangelis Sakkalis

    2008-01-01

    Full Text Available There is important evidence of differences in the EEG frequency spectrum of control subjects as compared to epileptic subjects. In particular, the study of children presents difficulties due to the early stages of brain development and the various forms of epilepsy indications. In this study, we consider children who developed epileptic crises in the past but without any other clinical, psychological, or visible neurophysiological findings. The aim of the paper is to develop reliable techniques for testing whether such controlled epilepsy induces related spectral differences in the EEG. Spectral features extracted using nonparametric signal representation techniques (the Fourier and wavelet transforms) and a parametric signal modeling technique (ARMA) are compared, and their effect on the classification of the two groups is analyzed. The subjects performed two different tasks: a control (rest) task and a relatively difficult math task. The results show that spectral features extracted by modeling the EEG signals recorded from individual channels with an ARMA model give a higher discrimination between the two subject groups for the control task, where classification scores of up to 100% were obtained with a linear discriminant classifier.

  13. Spectral analysis by correlation; Analyse spectrale par correlation

    Energy Technology Data Exchange (ETDEWEB)

    Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G. [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1969-07-01

    The spectral density of a signal, which represents its power distribution along the frequency axis, is a function of great importance, finding many uses in all fields concerned with signal processing (process identification, vibration analysis, etc.). Among all the possible methods for calculating this function, the correlation method (correlation function calculation followed by Fourier transformation) is the most promising, mainly because of its simplicity and the results it yields. The study carried out here leads to the construction of an apparatus which, coupled with a correlator, constitutes a set of equipment for real-time spectral analysis covering the frequency range 0 to 5 MHz. (author)
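
    The apparatus described works in hardware, but the underlying correlation method translates directly into a digital sketch: estimate the autocorrelation function, apply a lag window, and Fourier transform it (the Blackman-Tukey approach). The signal and parameters below are illustrative.

```python
import numpy as np

def spectrum_via_correlation(x, max_lag):
    """Blackman-Tukey style estimate: autocorrelation -> lag window -> Fourier transform."""
    x = x - x.mean()
    n = x.size
    # Biased autocovariance estimates for lags 0..max_lag
    acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])
    lag_window = np.hanning(2 * max_lag + 1)[max_lag:]        # taper the lag sequence
    windowed = acov * lag_window
    # Even (circular) extension of the lags; the real FFT then gives the power spectrum
    full = np.concatenate([windowed, windowed[-2:0:-1]])
    return np.fft.rfft(full).real

fs = 1000.0                                  # sampling frequency, Hz
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)

max_lag = 256
spec = spectrum_via_correlation(x, max_lag)
freqs = np.fft.rfftfreq(2 * max_lag, d=1 / fs)
print("spectral peak near", round(freqs[np.argmax(spec)], 1), "Hz")   # expect ~50 Hz
```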

  14. CURRENT STATUS OF NONPARAMETRIC STATISTICS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-02-01

    Full Text Available Nonparametric statistics is one of the five growth points of applied mathematical statistics. Despite the large number of publications on specific issues of nonparametric statistics, the internal structure of this research direction has remained undeveloped. The purpose of this article is to consider its division into areas, based on the existing practice of scientific activity, and to classify investigations on nonparametric statistical methods. Nonparametric statistics allows one to make statistical inferences, in particular to estimate characteristics of a distribution and to test statistical hypotheses, without, as a rule, weakly substantiated assumptions that the distribution functions of the samples belong to a particular parametric family. An example is the widespread belief that statistical data often follow the normal distribution; meanwhile, analysis of observational results, in particular of measurement errors, always leads to the same conclusion: in most cases the actual distribution differs significantly from the normal one. Uncritical use of the hypothesis of normality often leads to significant errors, for example in the rejection of outlying observations (outliers), in statistical quality control, and in other cases. Therefore, it is advisable to use nonparametric methods, in which only weak requirements are imposed on the distribution functions of the observations; usually only their continuity is assumed. On the basis of a generalization of numerous studies, it can be stated that to date nonparametric methods can solve almost the same range of tasks as parametric methods. Statements in the literature that nonparametric methods have less power, or require larger sample sizes, than parametric methods are incorrect. Note that in nonparametric statistics, as in mathematical statistics in general, a number of problems remain unresolved

  15. Basic Functional Analysis Puzzles of Spectral Flow

    DEFF Research Database (Denmark)

    Booss-Bavnbek, Bernhelm

    2011-01-01

    We explain an array of basic functional analysis puzzles on the way to general spectral flow formulae and indicate a direction of future topological research for dealing with these puzzles.

  16. Nanocatalytic resonance scattering spectral analysis

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The resonance scattering spectral technique has been established using the synchronous scanning technique of spectrofluorometry. Because of its advantages of simplicity, rapidity and sensitivity, it has been widely applied to analyses of proteins, nucleic acids and inorganic ions. This paper summarizes the application of the immunonanogold and aptamer-modified nanogold (AptAu) catalytic resonance scattering spectral technique, in combination with the work of our group, citing 53 references.

  17. Nonparametric variance estimation in the analysis of microarray data: a measurement error approach.

    Science.gov (United States)

    Carroll, Raymond J; Wang, Yuedong

    2008-01-01

    This article investigates the effects of measurement error on the estimation of nonparametric variance functions. We show that either ignoring measurement error or direct application of the simulation extrapolation, SIMEX, method leads to inconsistent estimators. Nevertheless, the direct SIMEX method can reduce bias relative to a naive estimator. We further propose a permutation SIMEX method which leads to consistent estimators in theory. The performance of both SIMEX methods depends on approximations to the exact extrapolants. Simulations show that both SIMEX methods perform better than ignoring measurement error. The methodology is illustrated using microarray data from colon cancer patients.

  18. Non-parametric trend analysis of water quality data of rivers in Kansas

    Science.gov (United States)

    Yu, Y.-S.; Zou, S.; Whittemore, D.

    1993-01-01

    Surface water quality data for 15 sampling stations in the Arkansas, Verdigris, Neosho, and Walnut river basins inside the state of Kansas were analyzed to detect trends (or lack of trends) in 17 major constituents by using four different non-parametric methods. The results show that concentrations of specific conductance, total dissolved solids, calcium, total hardness, sodium, potassium, alkalinity, sulfate, chloride, total phosphorus, ammonia plus organic nitrogen, and suspended sediment generally have downward trends. Some of the downward trends are related to increases in discharge, while others could be caused by decreases in pollution sources. Homogeneity tests show that both station-wide trends and basinwide trends are non-homogeneous. © 1993.

  19. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population.

  20. NParCov3: A SAS/IML Macro for Nonparametric Randomization-Based Analysis of Covariance

    Directory of Open Access Journals (Sweden)

    Richard C. Zink

    2012-07-01

    Full Text Available Analysis of covariance serves two important purposes in a randomized clinical trial. First, there is a reduction of variance for the treatment effect which provides more powerful statistical tests and more precise confidence intervals. Second, it provides estimates of the treatment effect which are adjusted for random imbalances of covariates between the treatment groups. The nonparametric analysis of covariance method of Koch, Tangen, Jung, and Amara (1998) defines a very general methodology using weighted least-squares to generate covariate-adjusted treatment effects with minimal assumptions. This methodology is general in its applicability to a variety of outcomes, whether continuous, binary, ordinal, incidence density or time-to-event. Further, its use has been illustrated in many clinical trial settings, such as multi-center, dose-response and non-inferiority trials. NParCov3 is a SAS/IML macro written to conduct the nonparametric randomization-based covariance analyses of Koch et al. (1998). The software can analyze a variety of outcomes and can account for stratification. Data from multiple clinical trials will be used for illustration.

  1. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2014-01-01

    Thoroughly revised and reorganized, the fourth edition presents in-depth coverage of the theory and methods of the most widely used nonparametric procedures in statistical analysis and offers example applications appropriate for all areas of the social, behavioral, and life sciences. The book presents new material on the quantiles, the calculation of exact and simulated power, multiple comparisons, additional goodness-of-fit tests, methods of analysis of count data, and modern computer applications using MINITAB, SAS, and STATXACT. It includes tabular guides for simplified applications of tests and finding P values and confidence interval estimates.

  2. Nonparametric analysis of competing risks data with event category missing at random.

    Science.gov (United States)

    Gouskova, Natalia A; Lin, Feng-Chang; Fine, Jason P

    2017-03-01

    In competing risks setup, the data for each subject consist of the event time, censoring indicator, and event category. However, sometimes the information about the event category can be missing, as, for example, in a case when the date of death is known but the cause of death is not available. In such situations, treating subjects with missing event category as censored leads to the underestimation of the hazard functions. We suggest nonparametric estimators for the cumulative cause-specific hazards and the cumulative incidence functions which use the Nadaraya-Watson estimator to obtain the contribution of an event with missing category to each of the cause-specific hazards. We derive the properties of the proposed estimators. An optimal bandwidth is determined, which minimizes the mean integrated squared errors of the proposed estimators over time. The methodology is illustrated using data on lung infections in patients from the United States Cystic Fibrosis Foundation Patient Registry. © 2016, The International Biometric Society.

  3. Nonparametric Signal Extraction and Measurement Error in the Analysis of Electroencephalographic Activity During Sleep.

    Science.gov (United States)

    Crainiceanu, Ciprian M; Caffo, Brian S; Di, Chong-Zhi; Punjabi, Naresh M

    2009-06-01

    We introduce methods for signal and associated variability estimation based on hierarchical nonparametric smoothing with application to the Sleep Heart Health Study (SHHS). SHHS is the largest electroencephalographic (EEG) collection of sleep-related data, which contains, at each visit, two quasi-continuous EEG signals for each subject. The signal features extracted from EEG data are then used in second level analyses to investigate the relation between health, behavioral, or biometric outcomes and sleep. Using subject specific signals estimated with known variability in a second level regression becomes a nonstandard measurement error problem. We propose and implement methods that take into account cross-sectional and longitudinal measurement error. The research presented here forms the basis for EEG signal processing for the SHHS.

  4. Nonparametric analysis of the time structure of seismicity in a geographic region

    Directory of Open Access Journals (Sweden)

    A. Quintela-del-Río

    2002-06-01

    Full Text Available As an alternative to traditional parametric approaches, we suggest nonparametric methods for analyzing temporal data on earthquake occurrences. In particular, kernel methods for estimating the hazard function and the intensity function are presented. One novelty of our approach is that we take into account the possible dependence of the data when estimating the distribution of time intervals between earthquakes, which has not been considered in most statistical studies on seismicity. Kernel estimation of the hazard function has been used to study the occurrence process of cluster centers (main shocks). Kernel intensity estimation, on the other hand, has helped to describe the occurrence process of cluster members (aftershocks). Similar studies in two geographic areas of Spain (Granada and Galicia) have been carried out to illustrate the suggested estimation methods.

  5. Climatic, parametric and non-parametric analysis of energy performance of double-glazed windows in different climates

    Directory of Open Access Journals (Sweden)

    Saeed Banihashemi

    2015-12-01

    Full Text Available In line with the growing global trend toward energy efficiency in buildings, this paper aims, first, to investigate the energy performance of double-glazed windows in different climates and, second, to analyze the most commonly used parametric and non-parametric tests for dimension reduction in simulating this component. A four-story building representing a conventional type of residential apartment block in four climates (cold, temperate, hot-arid and hot-humid) was selected for simulation. Ten variables (U-factor, SHGC, emissivity, visible transmittance, monthly average dry bulb temperature, monthly average percent humidity, monthly average wind speed, monthly average direct solar radiation, monthly average diffuse solar radiation and orientation) constituted the parameters considered in the calculation of the cooling and heating loads of the case. Design of Experiments and Principal Component Analysis methods were applied to find the most significant factors and to reduce the dimension of the initial variables. It was observed that in the temperate and hot-arid climates, using double-glazed windows was beneficial in both cold and hot months, whereas in the cold and hot-humid climates, where heating and cooling loads are respectively dominant, they were advantageous only in those dominant months. Furthermore, an inconsistency was revealed between the parametric and non-parametric tests in terms of identifying the most significant variables.

  6. 'nparACT' package for R: A free software tool for the non-parametric analysis of actigraphy data.

    Science.gov (United States)

    Blume, Christine; Santhi, Nayantara; Schabus, Manuel

    2016-01-01

    For many studies, participants' sleep-wake patterns are monitored and recorded prior to, during and following an experimental or clinical intervention using actigraphy, i.e. the recording of data generated by movements. Often, these data are merely inspected visually without computation of descriptive parameters, in part due to the lack of user-friendly software. To address this deficit, we developed a package for R (R Core Team [6]) that allows the computation of several non-parametric measures from actigraphy data. Specifically, it computes the interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) of activity and gives the start times and average activity values of M10 (i.e. the ten hours with maximal activity) and L5 (i.e. the five hours with least activity). Two functions compute these 'classical' parameters and handle either single or multiple files. Two other functions additionally allow computing an L-value (i.e. the least activity value) for a user-defined time span, termed the 'Lflex' value. A plotting option is included in all functions. The package can be downloaded from the Comprehensive R Archive Network (CRAN). • The package 'nparACT' for R serves the non-parametric analysis of actigraphy data. • Computed parameters include interdaily stability (IS), intradaily variability (IV) and relative amplitude (RA) as well as start times and average activity during the 10 h with maximal and the 5 h with minimal activity (i.e. M10 and L5).
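
    nparACT itself is an R package; the sketch below is an independent Python illustration of the standard non-parametric measures it computes (IS, IV, M10, L5 and RA), using the commonly quoted textbook definitions and assuming hourly-binned activity over whole days.

```python
import numpy as np

def nonparametric_circadian_measures(activity, per_day=24):
    """IS, IV and RA from an activity series binned into `per_day` epochs per day."""
    x = np.asarray(activity, dtype=float)
    n = x.size
    days = x.reshape(-1, per_day)                       # assumes complete days
    hourly_mean = days.mean(axis=0)                     # average 24-h profile
    grand_mean = x.mean()

    # Interdaily stability: variance of the average profile / overall variance
    is_ = (np.sum((hourly_mean - grand_mean) ** 2) / per_day) / \
          (np.sum((x - grand_mean) ** 2) / n)
    # Intradaily variability: mean squared successive difference / overall variance
    iv = (np.sum(np.diff(x) ** 2) / (n - 1)) / (np.sum((x - grand_mean) ** 2) / n)

    # M10 / L5: most and least active 10-h and 5-h windows of the average profile
    prof2 = np.concatenate([hourly_mean, hourly_mean])  # allow wrap-around windows
    m10 = max(prof2[i:i + 10].mean() for i in range(per_day))
    l5 = min(prof2[i:i + 5].mean() for i in range(per_day))
    ra = (m10 - l5) / (m10 + l5)                        # relative amplitude
    return is_, iv, ra

# Synthetic week of hourly activity with a clear rest-activity rhythm
rng = np.random.default_rng(0)
hours = np.arange(7 * 24)
activity = np.clip(50 + 40 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 10, hours.size), 0, None)
print([round(v, 2) for v in nonparametric_circadian_measures(activity)])
```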

  7. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency

  8. Nonparametric randomization-based covariate adjustment for stratified analysis of time-to-event or dichotomous outcomes.

    Science.gov (United States)

    Hussey, Michael A; Koch, Gary G; Preisser, John S; Saville, Benjamin R

    2016-01-01

    Time-to-event or dichotomous outcomes in randomized clinical trials often have analyses using the Cox proportional hazards model or conditional logistic regression, respectively, to obtain covariate-adjusted log hazard (or odds) ratios. Nonparametric Randomization-Based Analysis of Covariance (NPANCOVA) can be applied to unadjusted log hazard (or odds) ratios estimated from a model containing treatment as the only explanatory variable. These adjusted estimates are stratified population-averaged treatment effects and only require a valid randomization to the two treatment groups and avoid key modeling assumptions (e.g., proportional hazards in the case of a Cox model) for the adjustment variables. The methodology has application in the regulatory environment where such assumptions cannot be verified a priori. Application of the methodology is illustrated through three examples on real data from two randomized trials.

  9. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    CERN Document Server

    Ford, Eric B; Steffen, Jason H; Carter, Joshua A; Fressin, Francois; Holman, Matthew J; Lissauer, Jack J; Moorhead, Althea V; Morehead, Robert C; Ragozzine, Darin; Rowe, Jason F; Welsh, William F; Allen, Christopher; Batalha, Natalie M; Borucki, William J; Bryson, Stephen T; Buchhave, Lars A; Burke, Christopher J; Caldwell, Douglas A; Charbonneau, David; Clarke, Bruce D; Cochran, William D; Désert, Jean-Michel; Endl, Michael; Everett, Mark E; Fischer, Debra A; Gautier, Thomas N; Gilliland, Ron L; Jenkins, Jon M; Haas, Michael R; Horch, Elliott; Howell, Steve B; Ibrahim, Khadeejah A; Isaacson, Howard; Koch, David G; Latham, David W; Li, Jie; Lucas, Philip; MacQueen, Phillip J; Marcy, Geoffrey W; McCauliff, Sean; Mullally, Fergal R; Quinn, Samuel N; Quintana, Elisa; Shporer, Avi; Still, Martin; Tenenbaum, Peter; Thompson, Susan E; Torres, Guillermo; Twicken, Joseph D; Wohler, Bill

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:...

  10. Augmented Classical Least Squares Multivariate Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, David M. (Albuquerque, NM); Melgaard, David K. (Albuquerque, NM)

    2005-01-11

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.
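
    The augmentation step that defines ACLS is not reproduced here; the sketch below only illustrates the underlying classical least squares (CLS) calibration and prediction steps, on synthetic spectra, which is the starting point the description above builds on.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_components, n_samples = 200, 3, 30

# Synthetic pure-component spectra K (components x channels): Gaussian bands
grid = np.linspace(0, 1, n_channels)
centers, widths = [0.25, 0.5, 0.75], [0.04, 0.06, 0.05]
K_true = np.array([np.exp(-0.5 * ((grid - c) / w) ** 2) for c, w in zip(centers, widths)])

# Calibration set: known concentrations C, measured spectra A = C K + noise
C = rng.uniform(0, 1, size=(n_samples, n_components))
A = C @ K_true + 0.01 * rng.standard_normal((n_samples, n_channels))

# CLS calibration: least-squares estimate of the pure-component spectra
K_hat, *_ = np.linalg.lstsq(C, A, rcond=None)

# CLS prediction for a new mixture spectrum: regress it onto the estimated spectra
c_new = np.array([0.2, 0.5, 0.3])
a_new = c_new @ K_true + 0.01 * rng.standard_normal(n_channels)
c_pred, *_ = np.linalg.lstsq(K_hat.T, a_new, rcond=None)
print(np.round(c_pred, 3))   # should be close to [0.2, 0.5, 0.3]
```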

  11. Augmented Classical Least Squares Multivariate Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, David M. (Albuquerque, NM); Melgaard, David K. (Albuquerque, NM)

    2005-07-26

    A method of multivariate spectral analysis, termed augmented classical least squares (ACLS), provides an improved CLS calibration model when unmodeled sources of spectral variation are contained in a calibration sample set. The ACLS methods use information derived from component or spectral residuals during the CLS calibration to provide an improved calibration-augmented CLS model. The ACLS methods are based on CLS so that they retain the qualitative benefits of CLS, yet they have the flexibility of PLS and other hybrid techniques in that they can define a prediction model even with unmodeled sources of spectral variation that are not explicitly included in the calibration model. The unmodeled sources of spectral variation may be unknown constituents, constituents with unknown concentrations, nonlinear responses, non-uniform and correlated errors, or other sources of spectral variation that are present in the calibration sample spectra. Also, since the various ACLS methods are based on CLS, they can incorporate the new prediction-augmented CLS (PACLS) method of updating the prediction model for new sources of spectral variation contained in the prediction sample set without having to return to the calibration process. The ACLS methods can also be applied to alternating least squares models. The ACLS methods can be applied to all types of multivariate data.

  12. Stationary Time Series Analysis Using Information and Spectral Analysis

    Science.gov (United States)

    1992-09-01

    spectral density function of the time series. The spectral density function f(w), 0 < w < 1, is defined as the Fourier transform of ... series with spectral density function f(w). An important result of Pinsker [(1964), p. 196] can be interpreted as providing a formula for asymptotic ... Analysis Papers, Holden-Day, San Francisco, California. Parzen, E. (1958), "On asymptotically efficient consistent estimates of the spectral density function

  13. SPAM- SPECTRAL ANALYSIS MANAGER (UNIX VERSION)

    Science.gov (United States)

    Solomon, J. E.

    1994-01-01

    The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user-friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a time-sequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different
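
    SPAM's exact binary spectral encoding is not specified in the record; the sketch below illustrates the general idea with a simple one-bit-per-band encoding (above/below the spectrum's own mean) and Hamming-distance matching against a tiny, made-up spectral library.

```python
import numpy as np

def binary_encode(spectrum):
    """One bit per band: 1 where the value exceeds the spectrum's own mean."""
    s = np.asarray(spectrum, dtype=float)
    return (s > s.mean()).astype(np.uint8)

def hamming(a, b):
    return int(np.count_nonzero(a != b))

# Tiny illustrative spectral library (reflectance vs. wavelength)
bands = np.linspace(0.4, 2.5, 64)                      # wavelengths in micrometres
library = {
    "mineral_A": np.exp(-((bands - 1.0) / 0.3) ** 2),
    "mineral_B": np.exp(-((bands - 2.0) / 0.4) ** 2),
    "mineral_C": 0.5 + 0.2 * np.sin(4 * bands),
}
codes = {name: binary_encode(s) for name, s in library.items()}

# A noisy pixel spectrum resembling mineral_B
rng = np.random.default_rng(0)
pixel = library["mineral_B"] + 0.05 * rng.standard_normal(bands.size)
pixel_code = binary_encode(pixel)

best = min(codes, key=lambda name: hamming(codes[name], pixel_code))
print("best match:", best)
```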

  14. Digital Forensics Analysis of Spectral Estimation Methods

    CERN Document Server

    Mataracioglu, Tolga

    2011-01-01

    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. In today's world, it is widely used to secure information. In this paper, the traditional spectral estimation methods are introduced. The performance of each method is examined by comparing all of the spectral estimation methods. Finally, drawing on those performance analyses, a brief summary of the pros and cons of the spectral estimation methods is given. We also give a steganography demo by hiding information in a sound signal and recovering the information (i.e., the true frequency of the information signal) from the sound by means of the spectral estimation methods.

  15. A Non-Parametric and Entropy Based Analysis of the Relationship between the VIX and S&P 500

    Directory of Open Access Journals (Sweden)

    Abhay K. Singh

    2013-10-01

    Full Text Available This paper features an analysis of the relationship between the S&P 500 Index and the VIX using daily data obtained from the CBOE website and SIRCA (The Securities Industry Research Centre of the Asia Pacific). We explore the relationship between the S&P 500 daily return series and a similar series for the VIX in terms of a long sample drawn from the CBOE, from 1990 to mid 2011, and a set of returns from SIRCA's TRTH datasets, from March 2005 to date. This shorter sample, which captures the behavior of the new VIX introduced in 2003, is divided into four sub-samples which permit the exploration of the impact of the Global Financial Crisis. We apply a series of non-parametric tests utilizing entropy-based metrics. These suggest that the PDFs and CDFs of these two return distributions change shape in the various sub-sample periods. The entropy and mutual information (MI) statistics suggest that the degree of uncertainty attached to these distributions changes through time and, using the S&P 500 return as the dependent variable, that the amount of information obtained from the VIX changes with time and reaches a relative maximum in the most recent period, from 2011 to 2012. The entropy-based non-parametric tests of the equivalence of the two distributions and of their symmetry all strongly reject their respective nulls. The results suggest that parametric techniques do not adequately capture the complexities displayed in the behavior of these series. This has practical implications for hedging utilizing derivatives written on the VIX.
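
    The paper's entropy metrics are not reproduced exactly; the sketch below shows a plug-in (histogram-based) estimate of the Shannon entropies and mutual information between two synthetic, negatively dependent return series.

```python
import numpy as np

def entropy_mi(x, y, bins=20):
    """Plug-in Shannon entropies and mutual information from a 2-D histogram (in bits)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def h(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    hx, hy, hxy = h(px), h(py), h(pxy.ravel())
    return hx, hy, hx + hy - hxy          # MI = H(X) + H(Y) - H(X,Y)

# Synthetic, negatively dependent "index return" and "volatility change" series
rng = np.random.default_rng(0)
ret = rng.standard_normal(2000)
dvix = -0.7 * ret + 0.5 * rng.standard_normal(2000)

hx, hy, mi = entropy_mi(ret, dvix)
print(f"H(ret) = {hx:.2f} bits, H(dVIX) = {hy:.2f} bits, MI = {mi:.2f} bits")
```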

  16. Bedform characterization through 2D spectral analysis

    DEFF Research Database (Denmark)

    Lefebvre, Alice; Ernstsen, Verner Brandbyge; Winter, Christian

    2011-01-01

    characteristics using two-dimensional (2D) spectral analysis is presented and tested on seabed elevation data from the Knudedyb tidal inlet in the Danish Wadden Sea, where large compound bedforms are found. The bathymetric data were divided into 20 x 20 m areas on which a 2D spectral analysis was applied. The most energetic peak of the 2D spectrum was found and its energy, frequency and direction were calculated. A power law was fitted to the average of slices taken through the 2D spectrum; its slope and y-intercept were calculated. Using these results the test area was morphologically classified into 4 distinct...
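
    The sketch below reproduces the core step described above on synthetic data: take the 2D spectrum of a small bathymetry patch, locate its most energetic peak, and convert the peak's position into a bedform wavelength and direction. The grid size and wavenumbers are illustrative.

```python
import numpy as np

# Synthetic 20 m x 20 m bathymetry patch (0.5 m grid) with one dune field;
# the wavenumber components are chosen on FFT bins so the peak is recovered exactly.
dx = 0.5
x = np.arange(0, 20, dx)
X, Y = np.meshgrid(x, x)
kx_true, ky_true = 0.15, 0.10            # cycles per metre -> wavelength ~5.5 m at ~34 deg
z = 0.3 * np.sin(2 * np.pi * (kx_true * X + ky_true * Y)) \
    + 0.02 * np.random.default_rng(0).standard_normal(X.shape)

# 2-D spectrum; remove the mean so the DC bin does not dominate
F = np.fft.fftshift(np.fft.fft2(z - z.mean()))
power = np.abs(F) ** 2
freqs = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))   # cycles per metre
FX, FY = np.meshgrid(freqs, freqs)

i, j = np.unravel_index(np.argmax(power), power.shape)  # most energetic peak
k = np.hypot(FX[i, j], FY[i, j])
print("wavelength ~", round(1 / k, 2), "m")
print("direction  ~", round(np.degrees(np.arctan2(FY[i, j], FX[i, j])) % 180, 1), "deg")
```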

  17. Nonparametric statistical inference

    CERN Document Server

    Gibbons, Jean Dickinson

    2010-01-01

    Overall, this remains a very fine book suitable for a graduate-level course in nonparametric statistics. I recommend it for all people interested in learning the basic ideas of nonparametric statistical inference. - Eugenia Stoimenova, Journal of Applied Statistics, June 2012 ... one of the best books available for a graduate (or advanced undergraduate) text for a theory course on nonparametric statistics. ... a very well-written and organized book on nonparametric statistics, especially useful and recommended for teachers and graduate students. - Biometrics, 67, September 2011. This excellently presente...

  18. [Comparison of two spectral mixture analysis models].

    Science.gov (United States)

    Wang, Qin-Jun; Lin, Qi-Zhong; Li, Ming-Xiao; Wang, Li-Ming

    2009-10-01

    A spectral mixture analysis experiment was designed to compare the spectral unmixing effects of linear spectral mixture analysis (LSMA) and constrained linear spectral mixture analysis (CLSMA). In the experiment, red, green, blue and yellow colors were printed on a coarse album as four end members. Thirty-nine mixed samples were made according to each end member's different percentage in one pixel. A field spectrometer was then positioned over the center of each mixed sample to measure the spectra one by one. The inversion percentage of each end member in the pixel was extracted using the LSMA and CLSMA models. Finally, the normalized mean squared error between the inversion and real percentages was calculated to compare the two models' spectral unmixing performance. Results from the experiment showed that the total error of LSMA was 0.30087 and that of CLSMA was 0.37552 when all bands in the spectrum were used; LSMA was therefore 0.075 lower than CLSMA with the whole bands of the four end members' spectra. On the other hand, the total error of LSMA was 0.28095 and that of CLSMA was 0.29805 after band selection, so LSMA was 0.017 lower than CLSMA when band selection was performed. Therefore, whether all or selected bands were used, the accuracy of LSMA was better than that of CLSMA, because errors caused by the instrument or the operator were introduced during the spectrum measurement, so that the measured data could not meet the strict requirements of CLSMA, reducing its accuracy. Furthermore, the total error of LSMA using selected bands was 0.02 lower than that using the whole bands, and the total error of CLSMA using selected bands was 0.077 lower than that using the whole bands. Thus, in the same model, spectral unmixing using selected bands to reduce the correlation of the end members' spectra was superior to that using the whole bands.
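
    A minimal sketch of the two unmixing schemes compared above, on synthetic endmember spectra: ordinary least squares for LSMA, and, as a stand-in for CLSMA, a non-negative least squares fit with the sum-to-one constraint enforced softly through a weighted augmentation row. The spectra and weights are made up for illustration.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_bands, n_end = 50, 4

# Synthetic endmember spectra (bands x endmembers) and a mixed pixel
E = np.abs(rng.normal(0.5, 0.2, size=(n_bands, n_end)))
abund_true = np.array([0.4, 0.3, 0.2, 0.1])                 # sums to one
pixel = E @ abund_true + 0.01 * rng.standard_normal(n_bands)

# Unconstrained linear spectral mixture analysis (LSMA)
a_lsma, *_ = np.linalg.lstsq(E, pixel, rcond=None)

# Constrained variant: nonnegativity via NNLS, sum-to-one enforced softly by
# appending a heavily weighted row of ones (a common augmentation trick).
w = 100.0
E_aug = np.vstack([E, w * np.ones(n_end)])
p_aug = np.append(pixel, w * 1.0)
a_clsma, _ = nnls(E_aug, p_aug)

print("true :", abund_true)
print("LSMA :", np.round(a_lsma, 3), " sum =", round(a_lsma.sum(), 3))
print("CLSMA:", np.round(a_clsma, 3), " sum =", round(a_clsma.sum(), 3))
```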

  19. GPU-accelerated nonparametric kinetic analysis of DCE-MRI data from glioblastoma patients treated with bevacizumab.

    Science.gov (United States)

    Hsu, Yu-Han H; Ferl, Gregory Z; Ng, Chee M

    2013-05-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is often used to examine vascular function in malignant tumors and noninvasively monitor drug efficacy of antivascular therapies in clinical studies. However, complex numerical methods used to derive tumor physiological properties from DCE-MRI images can be time-consuming and computationally challenging. Recent advancement of computing technology in graphics processing unit (GPU) makes it possible to build an energy-efficient and high-power parallel computing platform for solving complex numerical problems. This study develops the first reported fast GPU-based method for nonparametric kinetic analysis of DCE-MRI data using clinical scans of glioblastoma patients treated with bevacizumab (Avastin®). In the method, contrast agent concentration-time profiles in arterial blood and tumor tissue are smoothed using a robust kernel-based regression algorithm in order to remove artifacts due to patient motion and then deconvolved to produce the impulse response function (IRF). The area under the curve (AUC) and mean residence time (MRT) of the IRF are calculated using statistical moment analysis, and two tumor physiological properties that relate to vascular permeability, volume transfer constant between blood plasma and extravascular extracellular space (K(trans)) and fractional interstitial volume (ve) are estimated using the approximations AUC/MRT and AUC. The most significant feature in this method is the use of GPU-computing to analyze data from more than 60,000 voxels in each DCE-MRI image in parallel fashion. All analysis steps have been automated in a single program script that requires only blood and tumor data as the sole input. The GPU-accelerated method produces K(trans) and ve estimates that are comparable to results from previous studies but reduces computational time by more than 80-fold compared to a previously reported central processing unit-based nonparametric method. Furthermore, it is at
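
    The GPU implementation is not reproduced here; the sketch below only illustrates the statistical moment step described in the abstract: compute the AUC and MRT of a (here synthetic, mono-exponential) impulse response function and form the quoted approximations ve ~ AUC and K(trans) ~ AUC/MRT.

```python
import numpy as np
from scipy.integrate import trapezoid

# Synthetic impulse response function (IRF): Ktrans * exp(-t * Ktrans / ve)
ktrans_true, ve_true = 0.12, 0.30          # 1/min, dimensionless
t = np.linspace(0, 20, 2001)               # minutes
irf = ktrans_true * np.exp(-t * ktrans_true / ve_true)

# Statistical moment analysis of the IRF
auc = trapezoid(irf, t)                     # zeroth moment (area under the curve)
mrt = trapezoid(t * irf, t) / auc           # first moment / zeroth moment

# Approximations quoted in the abstract: ve ~ AUC, Ktrans ~ AUC / MRT
print(f"AUC = {auc:.3f}      (ve_true = {ve_true})")
print(f"AUC/MRT = {auc / mrt:.3f}  (Ktrans_true = {ktrans_true})")
```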

  20. SpecViz: Interactive Spectral Data Analysis

    Science.gov (United States)

    Earl, Nicholas Michael; STScI

    2016-06-01

    The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is a spectral 1-D interactive visualization and analysis application built with Python in an open source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data, incorporation of advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz sports advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Incorporation with Jupyter notebooks via connection with the active iPython kernel allows for SpecViz to be used in addition to a user’s normal workflow without demanding the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straight

  1. LASER SPECTRAL ANALYSIS OF STRAIN MEASUREMENT

    Institute of Scientific and Technical Information of China (English)

    姜耀东; 陈至达

    1994-01-01

    Modern optical theory has shown that a far-field or Fraunhofer diffraction apparatus is identical to a Fourier spectral analyzer. In the Fourier spectral analyzer, the Fourier spectrum, or Fraunhofer diffraction pattern, of a graph is formed on the back focal plane when a laser beam is directed onto the graph lying on the front focal plane; the Fourier spectrum of the graph also changes as the graph deforms. By analyzing the change of the Fourier spectrum, the deformation of the graph can be obtained. A few years ago, based on the above principles, the authors proposed a new technique of strain measurement by laser spectral analysis, which is demonstrated and discussed in detail in this paper.

  2. Spectral theory and nonlinear functional analysis

    CERN Document Server

    Lopez-Gomez, Julian

    2001-01-01

    This Research Note addresses several pivotal problems in spectral theory and nonlinear functional analysis in connection with the analysis of the structure of the set of zeroes of a general class of nonlinear operators. It features the construction of an optimal algebraic/analytic invariant for calculating the Leray-Schauder degree, new methods for solving nonlinear equations in Banach spaces, and general properties of components of solutions sets presented with minimal use of topological tools. The author also gives several applications of the abstract theory to reaction diffusion equations and systems.The results presented cover a thirty-year period and include recent, unpublished findings of the author and his coworkers. Appealing to a broad audience, Spectral Theory and Nonlinear Functional Analysis contains many important contributions to linear algebra, linear and nonlinear functional analysis, and topology and opens the door for further advances.

  3. SVD analysis of Aura TES spectral residuals

    Science.gov (United States)

    Beer, Reinhard; Kulawik, Susan S.; Rodgers, Clive D.; Bowman, Kevin W.

    2005-01-01

    Singular Value Decomposition (SVD) analysis is both a powerful diagnostic tool and an effective method of noise filtering. We present the results of an SVD analysis of an ensemble of spectral residuals acquired in September 2004 from a 16-orbit Aura Tropospheric Emission Spectrometer (TES) Global Survey and compare them to alternative methods such as zonal averages. In particular, the technique highlights issues such as the orbital variation of instrument response and incompletely modeled effects of surface emissivity and atmospheric composition.
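
    The decomposition described above can be illustrated with a minimal sketch in Python/NumPy, assuming a hypothetical matrix of residual spectra (one row per observed spectrum, one column per spectral channel); this is only a generic illustration of SVD-based diagnosis and noise filtering, not the TES processing pipeline.

```python
import numpy as np

# Hypothetical stand-in for an ensemble of spectral residuals.
rng = np.random.default_rng(0)
residuals = rng.normal(size=(500, 1024))

# Centre the ensemble so the singular vectors describe the variability.
mean_residual = residuals.mean(axis=0)
centered = residuals - mean_residual

# Economy-size SVD: rows of Vt are spectral patterns, U holds per-spectrum weights.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Fraction of total variance captured by each singular vector.
explained = s**2 / np.sum(s**2)
print("leading singular vectors explain:", explained[:5])

# Noise filtering: keep only the first k singular vectors in the reconstruction.
k = 5
filtered = centered @ Vt[:k].T @ Vt[:k] + mean_residual
```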

  4. nparLD: An R Software Package for the Nonparametric Analysis of Longitudinal Data in Factorial Experiments

    Directory of Open Access Journals (Sweden)

    Kimihiro Noguchi

    2012-09-01

    Full Text Available Longitudinal data from factorial experiments frequently arise in various fields of study, ranging from medicine and biology to public policy and sociology. In most practical situations, the distribution of observed data is unknown and there may exist a number of atypical measurements and outliers. Hence, use of parametric and semi-parametric procedures that impose restrictive distributional assumptions on observed longitudinal samples becomes questionable. This, in turn, has led to a substantial demand for statistical procedures that enable us to accurately and reliably analyze longitudinal measurements in factorial experiments with minimal conditions on available data, and robust nonparametric methodology offering such a possibility becomes of particular practical importance. In this article, we introduce a new R package nparLD which provides statisticians and researchers from other disciplines with easy and user-friendly access to the most up-to-date robust rank-based methods for the analysis of longitudinal data in factorial settings. We illustrate the implemented procedures by case studies from dentistry, biology, and medicine.

  5. Zero- vs. one-dimensional, parametric vs. non-parametric, and confidence interval vs. hypothesis testing procedures in one-dimensional biomechanical trajectory analysis.

    Science.gov (United States)

    Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A

    2015-05-01

    Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories.
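
    As a rough illustration of the nonparametric 1D bootstrap mentioned above, the following sketch computes a pointwise bootstrap band for a mean trajectory (Python/NumPy, hypothetical array shapes). Note that it does not implement the simultaneous 1D coverage or the RFT correction that the paper discusses.

```python
import numpy as np

def bootstrap_mean_band(trajectories, n_boot=1000, alpha=0.05, seed=0):
    """Pointwise bootstrap confidence band for the mean of 1D trajectories.

    trajectories : array of shape (n_subjects, n_time_points)
    Returns (lower, upper) arrays of shape (n_time_points,).
    Simplified pointwise version only, not a simultaneous 1D band.
    """
    rng = np.random.default_rng(seed)
    n = trajectories.shape[0]
    boot_means = np.empty((n_boot, trajectories.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample subjects with replacement
        boot_means[b] = trajectories[idx].mean(axis=0)
    lower = np.percentile(boot_means, 100 * alpha / 2, axis=0)
    upper = np.percentile(boot_means, 100 * (1 - alpha / 2), axis=0)
    return lower, upper
```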

  6. The Efficiency Change of Italian Public Universities in the New Millennium: A Non-Parametric Analysis

    Science.gov (United States)

    Guccio, Calogero; Martorana, Marco Ferdinando; Mazza, Isidoro

    2017-01-01

    The paper assesses the evolution of efficiency of Italian public universities for the period 2000-2010. It aims at investigating whether their levels of efficiency showed signs of convergence, and if the well-known disparity between northern and southern regions decreased. For this purpose, we use a refinement of data envelopment analysis, namely…

  7. Non-parametric group-level statistics for source-resolved ERP analysis.

    Science.gov (United States)

    Lee, Clement; Miyakoshi, Makoto; Delorme, Arnaud; Cauwenberghs, Gert; Makeig, Scott

    2015-01-01

    We have developed a new statistical framework for group-level event-related potential (ERP) analysis in EEGLAB. The framework calculates the variance of scalp channel signals accounted for by the activity of homogeneous clusters of sources found by independent component analysis (ICA). When ICA data decomposition is performed on each subject's data separately, functionally equivalent ICs can be grouped into EEGLAB clusters. Here, we report a new addition (statPvaf) to the EEGLAB plug-in std_envtopo to enable inferential statistics on main effects and interactions in event related potentials (ERPs) of independent component (IC) processes at the group level. We demonstrate the use of the updated plug-in on simulated and actual EEG data.

  8. Nonparametric Bayesian Inference for Mean Residual Life Functions in Survival Analysis

    OpenAIRE

    Poynor, Valerie; Kottas, Athanasios

    2014-01-01

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life function which provides the expected remaining lifetime given that a subject has survived (i.e., is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the mean residual life function characterizes the sur...
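
    For reference, and assuming an absolutely continuous survival time T with survival function S(t), the mean residual life function described above is usually written as

    $$\operatorname{mrl}(t) \;=\; E[\,T - t \mid T > t\,] \;=\; \frac{\int_t^{\infty} S(u)\,du}{S(t)}, \qquad S(t) > 0.$$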

  9. Transit Timing Observations from Kepler: II. Confirmation of Two Multiplanet Systems via a Non-parametric Correlation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric B.; /Florida U.; Fabrycky, Daniel C.; /Lick Observ.; Steffen, Jason H.; /Fermilab; Carter, Joshua A.; /Harvard-Smithsonian Ctr. Astrophys.; Fressin, Francois; /Harvard-Smithsonian Ctr. Astrophys.; Holman, Matthew J.; /Harvard-Smithsonian Ctr. Astrophys.; Lissauer, Jack J.; /NASA, Ames; Moorhead, Althea V.; /Florida U.; Morehead, Robert C.; /Florida U.; Ragozzine, Darin; /Harvard-Smithsonian Ctr. Astrophys.; Rowe, Jason F.; /NASA, Ames /SETI Inst., Mtn. View /San Diego State U., Astron. Dept.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data sets. We apply this method to an analysis of the transit timing variations of two stars with multiple transiting planet candidates identified by Kepler. We confirm four transiting planets in two multiple planet systems based on their TTVs and the constraints imposed by dynamical stability. An additional three candidates in these same systems are not confirmed as planets, but are likely to be validated as real planets once further observations and analyses are possible. If all were confirmed, these systems would be near 4:6:9 and 2:4:6:9 period commensurabilities. Our results demonstrate that TTVs provide a powerful tool for confirming transiting planets, including low-mass planets and planets around faint stars for which Doppler follow-up is not practical with existing facilities. Continued Kepler observations will dramatically improve the constraints on the planet masses and orbits and provide sensitivity for detecting additional non-transiting planets. If Kepler observations were extended to eight years, then a similar analysis could likely confirm systems with multiple closely spaced, small transiting planets in or near the habitable zone of solar-type stars.
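
    The statistic used in the paper is tailored to TTV time series; as a generic, hedged illustration of how correlation significance can be assessed nonparametrically, a permutation test on two hypothetical TTV series might look like the sketch below (this is not the authors' exact statistic).

```python
import numpy as np

def permutation_corr_pvalue(x, y, n_perm=10000, seed=0):
    """Nonparametric significance of the correlation between two series.

    Shuffles one series to build a null distribution of the Pearson
    correlation; an illustrative stand-in for a TTV correlation statistic.
    """
    rng = np.random.default_rng(seed)
    observed = np.corrcoef(x, y)[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        null[i] = np.corrcoef(x, rng.permutation(y))[0, 1]
    # Two-sided p-value: how often a shuffled correlation is at least as extreme.
    p_value = np.mean(np.abs(null) >= abs(observed))
    return observed, p_value
```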

  10. Exoplanetary Detection By Multifractal Spectral Analysis

    CERN Document Server

    Agarwal, Sahil; Wettlaufer, John S

    2016-01-01

    Owing to technological advances the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies time scales that characterize planetary orbital motion around the host star. Without fitting spectral data to stellar models, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the time scales obtained to primary transit and secondary exoplanet eclipse of the exoplanets. Making use of data obtained with ground-based and space-based observations we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via dop...

  11. Quantal Response: Nonparametric Modeling

    Science.gov (United States)

    2017-01-01

    [Only fragments of this report are indexed.] The report develops nonparametric quantal response (QR) models for the relation between stimulus and probability of response, using spline fits and logistic regression; the Generalized Linear Model approach does not make use of the limit distribution but allows arbitrary functional forms. The remaining indexed text consists of table-of-contents and figure fragments (sections on the Linear Model and the Generalized Linear Model).

  12. A Parcellation Based Nonparametric Algorithm for Independent Component Analysis with Application to fMRI Data

    Directory of Open Access Journals (Sweden)

    Shanshan eLi

    2016-01-01

    Full Text Available Independent Component analysis (ICA is a widely used technique for separating signals that have been mixed together. In this manuscript, we propose a novel ICA algorithm using density estimation and maximum likelihood, where the densities of the signals are estimated via p-spline based histogram smoothing and the mixing matrix is simultaneously estimated using an optimization algorithm. The algorithm is exceedingly simple, easy to implement and blind to the underlying distributions of the source signals. To relax the identically distributed assumption in the density function, a modified algorithm is proposed to allow for different density functions on different regions. The performance of the proposed algorithm is evaluated in different simulation settings. For illustration, the algorithm is applied to a research investigation with a large collection of resting state fMRI datasets. The results show that the algorithm successfully recovers the established brain networks.

  13. Adaptive Kernel Canonical Correlation Analysis Algorithms for Nonparametric Identification of Wiener and Hammerstein Systems

    Directory of Open Access Journals (Sweden)

    Ignacio Santamaría

    2008-04-01

    Full Text Available This paper treats the identification of nonlinear systems that consist of a cascade of a linear channel and a nonlinearity, such as the well-known Wiener and Hammerstein systems. In particular, we follow a supervised identification approach that simultaneously identifies both parts of the nonlinear system. Given the correct restrictions on the identification problem, we show how kernel canonical correlation analysis (KCCA) emerges as the logical solution to this problem. We then extend the proposed identification algorithm to an adaptive version allowing it to deal with time-varying systems. In order to avoid overfitting problems, we discuss and compare three possible regularization techniques for both the batch and the adaptive versions of the proposed algorithm. Simulations are included to demonstrate the effectiveness of the presented algorithm.

  14. Introduction to nonparametric statistics for the biological sciences using R

    CERN Document Server

    MacFarland, Thomas W

    2016-01-01

    This book contains a rich set of tools for nonparametric analyses, and the purpose of this supplemental text is to provide guidance to students and professional researchers on how R is used for nonparametric data analysis in the biological sciences: to introduce when nonparametric approaches to data analysis are appropriate; to introduce the leading nonparametric tests commonly used in biostatistics and how R is used to generate appropriate statistics for each test; and to introduce common figures typically associated with nonparametric data analysis and how R is used to generate appropriate figures in support of each data set. The book focuses on how R is used to distinguish between data that could be classified as nonparametric as opposed to data that could be classified as parametric, with both approaches to data classification covered extensively. Following an introductory lesson on nonparametric statistics for the biological sciences, the book is organized into eight self-contained lessons on various analyses a...

  15. Power of non-parametric linkage analysis in mapping genes contributing to human longevity in long-lived sib-pairs

    DEFF Research Database (Denmark)

    Tan, Qihua; Zhao, J H; Iachine, I

    2004-01-01

    This report investigates the power issue in applying the non-parametric linkage analysis of affected sib-pairs (ASP) [Kruglyak and Lander, 1995: Am J Hum Genet 57:439-454] to localize genes that contribute to human longevity using long-lived sib-pairs. Data were simulated by introducing a recently developed statistical model for measuring marker-longevity associations [Yashin et al., 1999: Am J Hum Genet 65:1178-1193], enabling direct power comparison between linkage and association approaches. The non-parametric linkage (NPL) scores estimated in the region harboring the causal allele are evaluated ... in the case of a dominant effect. Although the power issue may depend heavily on the true genetic nature in maintaining survival, our study suggests that results from small-scale sib-pair investigations should be interpreted with caution, given the complexity of human longevity.

  16. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Full Text Available Abstract Background With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN to those with normal functioning allograft. Results The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. Conclusion We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been

  17. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy.

    Science.gov (United States)

    Kong, Xiangrong; Mas, Valeria; Archer, Kellie J

    2008-02-26

    With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN) to those with normal functioning allograft. The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been reported to be relevant to renal diseases. Further study on the

  18. Spectral phasor analysis allows rapid and reliable unmixing of fluorescence microscopy spectral images

    NARCIS (Netherlands)

    Fereidouni, F.; Bader, A.N.; Gerritsen, H.C.

    2012-01-01

    A new global analysis algorithm to analyse (hyper-) spectral images is presented. It is based on the phasor representation that has been demonstrated to be very powerful for the analysis of lifetime imaging data. In spectral phasor analysis the fluorescence spectrum of each pixel in the image is Fou
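
    A minimal sketch of how per-pixel spectral phasor coordinates can be computed from the first Fourier harmonic of each spectrum (Python/NumPy, hypothetical hyperspectral cube). Actual implementations differ in details such as harmonic choice and calibration, so this only illustrates the representation in general terms.

```python
import numpy as np

def spectral_phasor(cube, harmonic=1):
    """Per-pixel spectral phasor coordinates (G, S).

    cube : array of shape (ny, nx, n_channels), emission spectra per pixel.
    Uses the chosen Fourier harmonic of each intensity-normalized spectrum,
    which is the essence of the phasor representation.
    """
    n = cube.shape[-1]
    phase = 2 * np.pi * harmonic * np.arange(n) / n
    total = cube.sum(axis=-1)
    total = np.where(total == 0, 1, total)        # avoid division by zero
    G = (cube * np.cos(phase)).sum(axis=-1) / total
    S = (cube * np.sin(phase)).sum(axis=-1) / total
    return G, S
```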

  19. Spectral Analysis of Nonstationary Spacecraft Vibration Data

    Science.gov (United States)

    1965-11-01

    the instantaneous power spectral density function for the process {y(t)}. This spectral function can take on negative values for certain cases... power spectral density function is not directly measurable in the frequency domain. An experimental estimate for the function can be obtained only by... called the generalized power spectral density function for the process {y(t)}. This spectral description for nonstationary data is of great value for

  20. Spectral analysis of allogeneic hydroxyapatite powders

    Science.gov (United States)

    Timchenko, P. E.; Timchenko, E. V.; Pisareva, E. V.; Vlasov, M. Yu; Red’kin, N. A.; Frolov, O. O.

    2017-01-01

    In this paper we discuss the application of Raman spectroscopy to the in vitro analysis of hydroxyapatite powder samples produced from different types of animal bone tissue during a demineralization process at various acid concentrations and exposure durations. The derivation of the Raman spectrum of hydroxyapatite is attempted by the analysis of the pure powders of its known constituents. Spectral features of hydroxyapatite were found experimentally, based on analysis of the line amplitudes at wave numbers 950-965 cm-1 (the (PO4)3- (ν1) vibration) and 1065-1075 cm-1 ((CO3)2- (ν1), B-type substitution). The physicochemical properties of the hydroxyapatite were monitored by Raman spectroscopy, and the results are compared with infrared Fourier spectroscopy.

  1. Multivariate Analysis of Solar Spectral Irradiance Measurements

    Science.gov (United States)

    Pilewskie, P.; Rabbette, M.

    2001-01-01

    Principal component analysis is used to characterize approximately 7000 downwelling solar irradiance spectra retrieved at the Southern Great Plains site during an Atmospheric Radiation Measurement (ARM) shortwave intensive operating period. This analysis technique has proven to be very effective in reducing a large set of variables into a much smaller set of independent variables while retaining the information content. It is used to determine the minimum number of parameters necessary to characterize atmospheric spectral irradiance or the dimensionality of atmospheric variability. It was found that well over 99% of the spectral information was contained in the first six mutually orthogonal linear combinations of the observed variables (flux at various wavelengths). Rotation of the principal components was effective in separating various components by their independent physical influences. The majority of the variability in the downwelling solar irradiance (380-1000 nm) was explained by the following fundamental atmospheric parameters (in order of their importance): cloud scattering, water vapor absorption, molecular scattering, and ozone absorption. In contrast to what has been proposed as a resolution to a clear-sky absorption anomaly, no unexpected gaseous absorption signature was found in any of the significant components.
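
    The dimensionality analysis described above can be mimicked with a small sketch using scikit-learn's PCA on a matrix of irradiance spectra; the random array below is only a placeholder for the ARM retrievals, and the component count is illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical matrix of downwelling irradiance spectra:
# one row per retrieval, one column per wavelength bin.
rng = np.random.default_rng(1)
spectra = rng.normal(size=(7000, 310))

pca = PCA(n_components=10)
pca.fit(spectra)

# Cumulative explained variance shows how few orthogonal components
# are needed to capture the spectral variability.
cumulative = np.cumsum(pca.explained_variance_ratio_)
print(cumulative[:6])
```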

  2. Spectral analysis of signals the missing data case

    CERN Document Server

    Wang, Yanwei

    2006-01-01

    Spectral estimation is important in many fields including astronomy, meteorology, seismology, communications, economics, speech analysis, medical imaging, radar, sonar, and underwater acoustics. Most existing spectral estimation algorithms are devised for uniformly sampled complete-data sequences. However, the spectral estimation for data sequences with missing samples is also important in many applications ranging from astronomical time series analysis to synthetic aperture radar imaging with angular diversity. For spectral estimation in the missing-data case, the challenge is how to extend t
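
    For unevenly sampled or gapped records, one standard baseline estimator is the Lomb-Scargle periodogram; the short SciPy sketch below illustrates that baseline only and is not one of the book's more advanced missing-data algorithms.

```python
import numpy as np
from scipy.signal import lombscargle

# Irregularly sampled sequence (e.g., a record with missing samples).
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, size=400))            # irregular sample times
y = np.sin(2 * np.pi * 0.3 * t) + 0.5 * rng.normal(size=t.size)

# Evaluate the periodogram on a grid of angular frequencies.
freqs = np.linspace(0.01, 2 * np.pi, 2000)
pgram = lombscargle(t, y - y.mean(), freqs)

peak_hz = freqs[np.argmax(pgram)] / (2 * np.pi)
print(f"dominant frequency ~ {peak_hz:.3f} cycles per unit time")
```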

  3. Least Squares Moving-Window Spectral Analysis.

    Science.gov (United States)

    Lee, Young Jong

    2017-01-01

    Least squares regression is proposed as a moving-windows method for analysis of a series of spectra acquired as a function of external perturbation. The least squares moving-window (LSMW) method can be considered an extended form of the Savitzky-Golay differentiation for nonuniform perturbation spacing. LSMW is characterized in terms of moving-window size, perturbation spacing type, and intensity noise. Simulation results from LSMW are compared with results from other numerical differentiation methods, such as single-interval differentiation, autocorrelation moving-window, and perturbation correlation moving-window methods. It is demonstrated that this simple LSMW method can be useful for quantitative analysis of nonuniformly spaced spectral data with high frequency noise.
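
    A simplified sketch of the moving-window least squares idea: within each window a first-degree polynomial is fitted against the (possibly nonuniform) perturbation values, and its slope serves as the local derivative estimate. Array names and window conventions here are illustrative, not those of the original LSMW implementation.

```python
import numpy as np

def lsmw_slope(perturbation, spectra, half_window=3):
    """Moving-window least squares estimate of d(spectrum)/d(perturbation).

    perturbation : 1D array of (possibly nonuniform) perturbation values.
    spectra      : array of shape (n_perturbations, n_wavenumbers).
    Returns slope estimates with the same shape as `spectra`.
    """
    n = len(perturbation)
    slopes = np.full_like(spectra, np.nan, dtype=float)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        x = perturbation[lo:hi]
        # First-degree fit within the window; the slope plays the role of a
        # Savitzky-Golay-style derivative for nonuniform perturbation spacing.
        coeffs = np.polyfit(x, spectra[lo:hi], 1)
        slopes[i] = coeffs[0]
    return slopes
```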

  4. Structural Vibration Monitoring Using Cumulative Spectral Analysis

    Directory of Open Access Journals (Sweden)

    Satoru Goto

    2013-01-01

    Full Text Available This paper describes a resonance decay estimation for structural health monitoring in the presence of nonstationary vibrations. In structural health monitoring, the structure's frequency response and resonant decay characteristics are very important for understanding how the structure changes. Cumulative spectral analysis (CSA) estimates the frequency decay by using the impulse response. However, measuring the impulse response of buildings is impractical due to the need to shake the building itself. In a previous study, we reported on system damping monitoring using cumulative harmonic analysis (CHA), which is based on CSA. The current study describes scale model experiments on estimating the hidden resonance decay under non-stationary noise conditions by using CSA for structural condition monitoring.

  5. Transit Timing Observations From Kepler: II. Confirmation of Two Multiplanet Systems via a Non-Parametric Correlation Analysis

    OpenAIRE

    Ford, Eric B.; Fabrycky, Daniel C.; Steffen, Jason H.; Carter, Joshua A.; Fressin, Francois; Holman, Matthew Jon; Lissauer, Jack J.; Moorhead, Althea V.; Morehead, Robert C.; Ragozzine, Darin; Rowe, Jason F.; Welsh, William F.; Allen, Christopher; Batalha, Natalie M.; Borucki, William J.

    2012-01-01

    We present a new method for confirming transiting planets based on the combination of transit timing variations (TTVs) and dynamical stability. Correlated TTVs provide evidence that the pair of bodies are in the same physical system. Orbital stability provides upper limits for the masses of the transiting companions that are in the planetary regime. This paper describes a non-parametric technique for quantifying the statistical significance of TTVs based on the correlation of two TTV data se...

  6. Spectral luminescence analysis of amniotic fluid

    Science.gov (United States)

    Slobozhanina, Ekaterina I.; Kozlova, Nataly M.; Kasko, Leonid P.; Mamontova, Marina V.; Chernitsky, Eugene A.

    1997-12-01

    It is shown that amniotic fluid has intense ultraviolet luminescence caused by proteins. In addition, amniotic fluid emits in the 380-650 nm region, with maxima at 430-450 nm and 520-560 nm. The first luminescence peak (λexc = 350 nm; λem = 430-440 nm) is most probably caused by the presence in amniotic fluid of some hormones, NADH2 and NADPH2. The longer-wavelength component (λexc = 460 nm; λem = 520-560 nm) is most likely connected with pigments present in amniotic fluid (bilirubin bound to protein, among others). It is shown that the intensity and maximum of the ultraviolet luminescence spectra of amniotic fluid are identical in normal and pathological cases. However, both the emission spectra and the excitation spectra of the long-wavelength (λ > 450 nm) luminescence of amniotic fluid from pregnant women carrying fetuses with such prenatal abnormalities as anencephaly and spina bifida are shifted toward longer wavelengths in comparison with the norm. The results indicate that spectral luminescence analysis of amniotic fluid can be used for screening of malformations of the neural tube. It is very difficult for a practical obstetrician to identify pregnant women with a high risk of congenital malformations of the fetus. Apart from ultrasonic examination, cytogenetic examination of amniotic fluid and determination of the concentrations of alpha-fetoprotein and acetylcholinesterase in the amniotic fluid and blood plasma are the most widely used diagnostic approaches. However, biochemical and cytogenetic diagnostic methods are time-consuming. In the present work the spectral luminescence properties of amniotic fluid are investigated to determine spectral parameters that can be used to identify pregnant women with a high risk of congenital malformations of their offspring.

  7. Spectral efficiency analysis of OCDMA systems

    Institute of Scientific and Technical Information of China (English)

    Hui Yan; Kun Qiu; Yun Ling

    2009-01-01

    We discuss several kinds of code schemes and analyze their spectral efficiency, code utilization efficiency, and maximal spectral efficiency. Error-correction coding is used to increase the spectral efficiency, and it can avoid the decrease in spectral efficiency as the code length increases. The extended prime code (EPC) has the highest spectral efficiency among unipolar code systems. The bipolar code system has a larger spectral efficiency than the unipolar code system, but lower code utilization efficiency and maximal spectral efficiency. From the numerical results, we can see that the spectral efficiency increases by 0.025 (b/s)/Hz when the bit error rate (BER) increases from 10^-9 to 10^-7.

  8. Nonparametric statistical methods

    CERN Document Server

    Hollander, Myles; Chicken, Eric

    2013-01-01

    Praise for the Second Edition"This book should be an essential part of the personal library of every practicing statistician."-Technometrics  Thoroughly revised and updated, the new edition of Nonparametric Statistical Methods includes additional modern topics and procedures, more practical data sets, and new problems from real-life situations. The book continues to emphasize the importance of nonparametric methods as a significant branch of modern statistics and equips readers with the conceptual and technical skills necessary to select and apply the appropriate procedures for any given sit

  9. Exoplanetary Detection by Multifractal Spectral Analysis

    Science.gov (United States)

    Agarwal, Sahil; Del Sordo, Fabio; Wettlaufer, John S.

    2017-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies timescales that characterize planetary orbital motion around the host star and those that arise from stellar features such as spots. Without fitting stellar models to spectral data, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the timescales obtained to primary and secondary eclipse of the exoplanets. Making use of data obtained with ground-based and space-based observations we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via Doppler shift detection. Finally, we have analyzed synthetic spectra obtained using the SOAP 2.0 tool, which simulates a stellar spectrum and the influence of the presence of a planet or a spot on that spectrum over one orbital period. We have demonstrated that, so long as the signal-to-noise-ratio ≥ 75, our approach reconstructs the planetary orbital period, as well as the rotation period of a spot on the stellar surface.

  10. Bayesian semiparametric power spectral density estimation in gravitational wave data analysis

    CERN Document Server

    Edwards, Matthew C; Christensen, Nelson

    2015-01-01

    The standard noise model in gravitational wave (GW) data analysis assumes detector noise is stationary and Gaussian distributed, with a known power spectral density (PSD) that is usually estimated using clean off-source data. Real GW data often depart from these assumptions, and misspecified parametric models of the PSD could result in misleading inferences. We propose a Bayesian semiparametric approach to improve this. We use a nonparametric Bernstein polynomial prior on the PSD, with weights attained via a Dirichlet process distribution, and update this using the Whittle likelihood. Posterior samples are obtained using a Metropolis-within-Gibbs sampler. We simultaneously estimate the reconstruction parameters of a rotating core collapse supernova GW burst that has been embedded in simulated Advanced LIGO noise. We also discuss an approach to deal with non-stationary data by breaking longer data streams into smaller and locally stationary components.
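
    For context, the Whittle likelihood mentioned above approximates the log-likelihood of a zero-mean stationary series in terms of its periodogram I(f_k) and the model PSD S(f_k), up to an additive constant:

    $$\ln L(S) \;\approx\; -\sum_{k} \left[ \ln S(f_k) + \frac{I(f_k)}{S(f_k)} \right],$$

    where the sum runs over the nonzero Fourier frequencies.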

  11. Spectral analysis on graph-like spaces

    CERN Document Server

    Post, Olaf

    2012-01-01

    Small-radius tubular structures have attracted considerable attention in the last few years, and are frequently used in different areas such as Mathematical Physics, Spectral Geometry and Global Analysis. In this monograph, we analyse Laplace-like operators on thin tubular structures ("graph-like spaces"), and their natural limits on metric graphs. In particular, we explore norm resolvent convergence, convergence of the spectra and resonances. Since the underlying spaces in the thin radius limit change, and become singular in the limit, we develop new tools such as norm convergence of operators acting in different Hilbert spaces, an extension of the concept of boundary triples to partial differential operators, and an abstract definition of resonances via boundary triples. These tools are formulated in an abstract framework, independent of the original problem of graph-like spaces, so that they can be applied in many other situations where the spaces are perturbed.

  12. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  13. CATDAT : A Program for Parametric and Nonparametric Categorical Data Analysis : User's Manual Version 1.0, 1998-1999 Progress Report.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-12-01

    Natural resource professionals are increasingly required to develop rigorous statistical models that relate environmental data to categorical response data. Recent advances in the statistical and computing sciences have led to the development of sophisticated methods for parametric and nonparametric analysis of data with categorical responses. The statistical software package CATDAT was designed to make some of these relatively new and powerful techniques available to scientists. The CATDAT statistical package includes 4 analytical techniques: generalized logit modeling; binary classification tree; extended K-nearest neighbor classification; and modular neural network.

  14. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    Science.gov (United States)

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
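
    A minimal Python sketch of the flipping device that lets right-censored Kaplan-Meier machinery handle left-censored (below detection limit) concentrations. The software described above layers percentiles, confidence limits and hypothesis tests on top of this basic estimator, none of which are shown here; all data and names below are illustrative.

```python
import numpy as np

def km_survival(times, events):
    """Right-censored Kaplan-Meier estimate S(t) at the distinct event times.

    times  : observed values (event or censoring times)
    events : 1 if the value is an observed event, 0 if right-censored
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        s *= 1.0 - d / at_risk
        surv.append((t, s))
    return surv

# Left-censored concentrations: censored entries store the detection limit,
# and censored == 1 means the true value lies somewhere below that limit.
concentrations = np.array([0.5, 1.2, 0.8, 2.0, 3.1, 0.7])
censored = np.array([1, 0, 1, 0, 0, 1])
M = concentrations.max() + 1.0        # any constant exceeding all observations
flipped = M - concentrations          # left-censoring becomes right-censoring
flipped_events = 1 - censored         # detected values are the "events"
print(km_survival(flipped, flipped_events))
```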

  15. Phase Spectral Analysis of EEG Signals

    Institute of Scientific and Technical Information of China (English)

    YOU Rong-yi; CHEN Zhong

    2004-01-01

    A new method of phase spectral analysis of EEG is proposed for the comparative analysis of phase spectra between normal and epileptic EEG signals, based on the wavelet decomposition technique. By using multiscale wavelet decomposition, the original EEGs are mapped to an orthogonal wavelet space, such that the variations of phase can be observed at multiple scales. It is found that the phase (and phase difference) spectra of normal EEGs are distinct from those of epileptic EEGs. That is, the variations of phase (and phase difference) of normal EEGs show a distinct periodic pattern as the electrical activity proceeds in the brain, whereas those of epileptic EEGs do not; for epileptic EEGs, the phase variations are obvious only at transient points. In order to verify these results with the observational data, the phase variations of EEGs in principal component space are examined, and it is found that the features of the phase spectra correspond with those in the wavelet space. These results make it possible to view the behavior of EEG rhythms as a dynamic spectrum.

  16. Spectral analysis of individual realization LDA data

    NARCIS (Netherlands)

    Tummers, M.J.; Passchier, D.M.

    1998-01-01

    The estimation of the autocorrelation function (acf) or the spectral density function (sdf) from LDA data poses unique data-processing problems. The random sampling times in LDA preclude the use of the spectral methods for equi-spaced samples. As a consequence, special data-processing algorithms are

  17. Nonparametric Predictive Regression

    OpenAIRE

    Ioannis Kasparis; Elena Andreou; Phillips, Peter C.B.

    2012-01-01

    A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as non-stationary fractional and near unit...

  18. Partial spectral analysis of hydrological time series

    Science.gov (United States)

    Jukić, D.; Denić-Jukić, V.

    2011-03-01

    Hydrological time series comprise the influences of numerous processes involved in the transfer of water in the hydrological cycle. This implies that an ambiguity exists with respect to the processes encoded in spectral and cross-spectral density functions. Previous studies have not paid adequate attention to this issue. Spectral and cross-spectral density functions represent the Fourier transforms of auto-covariance and cross-covariance functions. Using this basic property, the ambiguity is resolved by applying a novel approach based on the spectral representation of partial correlation. The mathematical background for partial spectral density, partial amplitude and partial phase functions is presented. The proposed functions yield estimates of spectral density, amplitude and phase that are not affected by a controlling process. If an input-output relation is the subject of interest, antecedent and subsequent influences of the controlling process can be distinguished by considering the input event as a reference point. The method is used for analyses of the relations between rainfall, air temperature and relative humidity, as well as the influences of air temperature and relative humidity on the discharge from a karst spring. Time series are collected in the catchment of the Jadro Spring located in the Dinaric karst area of Croatia.
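
    In standard partial spectral analysis notation (the paper's estimators may differ in detail), removing the influence of a controlling series z from the cross-spectrum between x and y gives the partial cross-spectral density

    $$S_{xy\cdot z}(f) \;=\; S_{xy}(f) \;-\; \frac{S_{xz}(f)\,S_{zy}(f)}{S_{zz}(f)},$$

    from whose modulus and argument the partial amplitude and partial phase functions follow.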

  19. Photometric Redshift Estimation Using Spectral Connectivity Analysis

    CERN Document Server

    Freeman, P E; Lee, A B; Richards, J W; Schafer, C M

    2009-01-01

    The development of fast and accurate methods of photometric redshift estimation is a vital step towards being able to fully utilize the data of next-generation surveys within precision cosmology. In this paper we apply a specific approach to spectral connectivity analysis (SCA; Lee & Wasserman 2009) called diffusion map. SCA is a class of non-linear techniques for transforming observed data (e.g., photometric colours for each galaxy, where the data lie on a complex subset of p-dimensional space) to a simpler, more natural coordinate system wherein we apply regression to make redshift predictions. As SCA relies upon eigen-decomposition, our training set size is limited to ~ 10,000 galaxies; we use the Nystrom extension to quickly estimate diffusion coordinates for objects not in the training set. We apply our method to 350,738 SDSS main sample galaxies, 29,816 SDSS luminous red galaxies, and 5,223 galaxies from DEEP2 with CFHTLS ugriz photometry. For all three datasets, we achieve prediction accuracies on ...

  20. Factor Analysis for Spectral Reconnaissance and Situational Understanding

    Science.gov (United States)

    2016-07-11

    The Army has a critical need for enhancing situational understanding for dismounted soldiers and rapidly deployed tactical... The report addresses ...based NP-hard design problems by associating them with corresponding estimation problems. (Only title-page and abstract fragments of this report are indexed.)

  1. Nonparametric factorial analysis of daily weigh-in-motion traffic: implications for the ozone "weekend effect" in Southern California

    Science.gov (United States)

    Gao, Oliver H.; Holmén, Britt A.; Niemeier, Debbie A.

    The Ozone Weekend Effect (OWE) has become increasingly more frequent and widespread in southern California since the mid-1970s. Although a number of hypotheses have been suggested to explain the effect, there remains uncertainty associated with the root factors contributing to elevated weekend ozone concentrations. Targeting the time window of the 1997 Southern California Ozone Study (SCOS97), this paper examines traffic activity data for 14 vehicle classes at 27 weigh-in-motion (WIM) stations in southern California. Nonparametric factorial analyses of light-duty vehicle (LDV) and heavy-duty truck (HDT) traffic volumes indicate significant differences in daily volumes by day of week and between the weekly patterns of daily LDV and HDT volumes. Across WIM stations, the daily LDV volume was highest on Friday and decreased by 10% on weekends compared to that on midweek days. In contrast, daily HDT volumes showed dramatic weekend drops of 53% on Saturday and 64% on Sunday. As a result, LDV to HDT ratios increased by 145% on weekends. Nonparametric tests also suggest that weekly traffic patterns varied significantly between WIM stations located close to (central) and far from (peripheral) the Los Angeles Metro area. Weekend increases in LDV/HDT ratios were more pronounced at central WIM sites due to greater weekend declines of HDT relative to LDV traffic. The implications of these weekly traffic patterns for the OWE in southern California were investigated by estimating daily WIM traffic on-road running exhaust emissions of total organic gas (TOG) and oxides of nitrogen (NOx) using EMFAC2002 emission factors. The results support the California Air Resources Board's (CARB's) NOx reduction hypothesis that greater weekend NOx reductions relative to volatile organic compound (VOC) emissions, in combination with the VOC-limited ozone system, contribute to the OWE observed in the region. The results from this study can be used to develop weekend on-road mobile emission

  2. Spectral Efficiency Analysis for Multicarrier Based 4G Systems

    DEFF Research Database (Denmark)

    Silva, Nuno; Rahman, Muhammad Imadur; Frederiksen, Flemming Bjerge;

    2006-01-01

    In this paper, a spectral efficiency definition is proposed. Spectral efficiency for multicarrier based multiaccess techniques, such as OFDMA, MC-CDMA and OFDMA-CDM, is analyzed. Simulations for different indoor and outdoor scenarios are carried out. Based on the simulations, we have discussed how different wireless channel conditions affect the performance of a system in terms of spectral efficiency. Based on our analysis, we have also recommended different access techniques for different scenarios.

  3. Spectral signature verification using statistical analysis and text mining

    Science.gov (United States)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database1 (SigDB) are proposed. The numerical data comprising a sample material's spectrum is validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum is qualitatively analyzed using lexical analysis text mining. This technique analyzes the syntax of the meta-data to provide local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security2 (text encryption/decryption), biomedical3, and marketing4 applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high and low quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
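
    The spectral angle mapper comparison mentioned above reduces to the angle between two spectra viewed as vectors; a minimal sketch (Python/NumPy, generic function and variable names) is:

```python
import numpy as np

def spectral_angle(test_spectrum, reference_spectrum):
    """Spectral angle mapper (SAM) score between two spectra, in radians.

    Smaller angles indicate closer agreement with the reference spectrum
    (e.g., the mean spectrum of an ideal population set).
    """
    t = np.asarray(test_spectrum, dtype=float)
    r = np.asarray(reference_spectrum, dtype=float)
    cos_angle = np.dot(t, r) / (np.linalg.norm(t) * np.linalg.norm(r))
    return np.arccos(np.clip(cos_angle, -1.0, 1.0))
```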

  4. Temporal shape analysis via the spectral signature.

    Science.gov (United States)

    Bernardis, Elena; Konukoglu, Ender; Ou, Yangming; Metaxas, Dimitris N; Desjardins, Benoit; Pohl, Kilian M

    2012-01-01

    In this paper, we adapt spectral signatures for capturing morphological changes over time. Advanced techniques for capturing temporal shape changes frequently rely on first registering the sequence of shapes and then analyzing the corresponding set of high dimensional deformation maps. Instead, we propose a simple encoding motivated by the observation that small shape deformations lead to minor refinements in the spectral signature composed of the eigenvalues of the Laplace operator. The proposed encoding does not require registration, since spectral signatures are invariant to pose changes. We apply our representation to the shapes of the ventricles extracted from 22 cine MR scans of healthy controls and Tetralogy of Fallot patients. We then measure the accuracy score of our encoding by training a linear classifier, which outperforms the same classifier based on volumetric measurements.

  5. Multiatlas segmentation as nonparametric regression.

    Science.gov (United States)

    Awate, Suyash P; Whitaker, Ross T

    2014-09-01

    This paper proposes a novel theoretical framework to model and analyze the statistical characteristics of a wide range of segmentation methods that incorporate a database of label maps or atlases; such methods are termed as label fusion or multiatlas segmentation. We model these multiatlas segmentation problems as nonparametric regression problems in the high-dimensional space of image patches. We analyze the nonparametric estimator's convergence behavior that characterizes expected segmentation error as a function of the size of the multiatlas database. We show that this error has an analytic form involving several parameters that are fundamental to the specific segmentation problem (determined by the chosen anatomical structure, imaging modality, registration algorithm, and label-fusion algorithm). We describe how to estimate these parameters and show that several human anatomical structures exhibit the trends modeled analytically. We use these parameter estimates to optimize the regression estimator. We show that the expected error for large database sizes is well predicted by models learned on small databases. Thus, a few expert segmentations can help predict the database sizes required to keep the expected error below a specified tolerance level. Such cost-benefit analysis is crucial for deploying clinical multiatlas segmentation systems.

  6. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulation results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major currencies. Results show the nonparametric model generally dominates the others when evaluating in-sample. However, the semiparametric model is best for out-of-sample analysis.

  7. A Censored Nonparametric Software Reliability Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyses the effect of censoring on the estimation of failure rate, and presents a framework for a censored nonparametric software reliability model. The model is based on nonparametric testing of a monotonically decreasing failure rate and weighted kernel failure rate estimation under the constraint that the failure rate is monotonically decreasing. Not only does the model have the advantages of few assumptions and weak constraints, but the number of residual defects of the software system can also be estimated. The numerical experiment and real data analysis show that the model performs well with censored data.

  8. Spectral Analysis of Rich Network Topology in Social Networks

    Science.gov (United States)

    Wu, Leting

    2013-01-01

    Social networks have received much attention these days. Researchers have developed different methods to study the structure and characteristics of the network topology. Our focus is on spectral analysis of the adjacency matrix of the underlying network. Recent work showed good properties in the adjacency spectral space but there are few…

  9. Visual category recognition using Spectral Regression and Kernel Discriminant Analysis

    NARCIS (Netherlands)

    Tahir, M.A.; Kittler, J.; Mikolajczyk, K.; Yan, F.; van de Sande, K.E.A.; Gevers, T.

    2009-01-01

    Visual category recognition (VCR) is one of the most important tasks in image and video indexing. Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. Recently, Spectral Regression combined with Kernel Discriminant Analysis (SR-KDA) has been s

  10. Face Recognition Algorithm of 2DPCA Nonparametric Subspace Analysis

    Institute of Scientific and Technical Information of China (English)

    王美; 梁久祯

    2011-01-01

    This paper proposes a novel face recognition algorithm, 2D Nonparametric Subspace Analysis (2DNSA), in a 2D Principal Component Analysis (2DPCA) subspace. The original face image matrices first undergo feature dimension reduction by 2DPCA, and the reduced feature matrices are used as a new training set on which 2D nonparametric subspace analysis is performed. This method not only reduces the feature dimensionality via 2DPCA, but also accounts for the impact of boundary samples on classification by taking full advantage of the classification capacity of 2DNSA, which avoids the irrationality of using class centers to measure the distances between different classes. Experimental results on two face databases (Yale and LARGE) show the improvements of the new algorithm over traditional subspace methods such as (2D)2PCA, 2DPCA, (2D)2LDA, 2DLDA, 2DPCA+2DLDA, and 2DNSA.
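
    As a hedged sketch of the 2DPCA reduction step only (the 2DNSA discriminant step is not shown), the image scatter matrix is built directly from the image matrices and its leading eigenvectors define the projection; names and shapes below are illustrative.

```python
import numpy as np

def two_d_pca(images, n_components):
    """2DPCA feature extraction on a stack of image matrices.

    images : array of shape (n_samples, height, width)
    Returns (features, W), where features has shape
    (n_samples, height, n_components) and W holds the projection vectors.
    """
    mean_image = images.mean(axis=0)
    centered = images - mean_image
    # Image scatter matrix of size (width, width):
    # G = (1/N) * sum_i (A_i - mean)^T (A_i - mean).
    G = np.einsum('nij,nik->jk', centered, centered) / images.shape[0]
    eigvals, eigvecs = np.linalg.eigh(G)          # eigenvalues in ascending order
    W = eigvecs[:, ::-1][:, :n_components]        # keep the leading eigenvectors
    features = images @ W                         # Y_i = A_i W
    return features, W
```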

  11. Nonparametric trend estimation in the presence of fractal noise: application to fMRI time-series analysis.

    Science.gov (United States)

    Afshinpour, Babak; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2008-06-30

    Unknown low frequency fluctuations called "trend" are observed in noisy time-series measured for different applications. In some disciplines, they carry primary information while in other fields such as functional magnetic resonance imaging (fMRI) they carry nuisance effects. In all cases, however, it is necessary to estimate them accurately. In this paper, a method for estimating trend in the presence of fractal noise is proposed and applied to fMRI time-series. To this end, a partly linear model (PLM) is fitted to each time-series. The parametric and nonparametric parts of PLM are considered as contributions of hemodynamic response and trend, respectively. Using the whitening property of wavelet transform, the unknown components of the model are estimated in the wavelet domain. The results of the proposed method are compared to those of other parametric trend-removal approaches such as spline and polynomial models. It is shown that the proposed method improves activation detection and decreases variance of the estimated parameters relative to the other methods.

  13. A Unified Discussion on the Concept of Score Functions Used in the Context of Nonparametric Linkage Analysis

    Directory of Open Access Journals (Sweden)

    Lars Ängquist

    2008-01-01

    Full Text Available In this article we discuss nonparametric linkage (NPL) score functions within a broad and quite general framework. The main focus of the paper is the structure, derivation principles and interpretations of the score function entity itself. We define and discuss several families of one-locus score function definitions, i.e. the implicit, explicit and optimal ones. Some generalizations and comments on the two-locus, unconditional and conditional, cases are included as well. Although this article mainly aims at serving as an overview, where the concept of score functions is put into a covering context, we generalize the noncentrality parameter (NCP) optimal score functions in Ängquist et al. (2007) to facilitate, through weighting, the incorporation of several plausible distinct genetic models. Since the genetic model itself is most often to some extent unknown, this allows weaker prior assumptions with respect to plausible true disease models without losing the property of NCP optimality. Moreover, we discuss general assumptions and properties of score functions in the above sense. For instance, the concepts of identical-by-descent (IBD) sharing structures and score function equivalence are discussed in some detail.

  14. Nonlinear physical systems spectral analysis, stability and bifurcations

    CERN Document Server

    Kirillov, Oleg N

    2013-01-01

    Bringing together 18 chapters written by leading experts in dynamical systems, operator theory, partial differential equations, and solid and fluid mechanics, this book presents state-of-the-art approaches to a wide spectrum of new and challenging stability problems.Nonlinear Physical Systems: Spectral Analysis, Stability and Bifurcations focuses on problems of spectral analysis, stability and bifurcations arising in the nonlinear partial differential equations of modern physics. Bifurcations and stability of solitary waves, geometrical optics stability analysis in hydro- and magnetohydrodynam

  15. Spectral Synthesis via Mean Field approach Independent Component Analysis

    CERN Document Server

    Hu, Ning; Kong, Xu

    2015-01-01

    In this paper, we apply a new statistical analysis technique, the Mean Field approach to Bayesian Independent Component Analysis (MF-ICA), to galaxy spectral analysis. This algorithm can compress the stellar spectral library into a few Independent Components (ICs), and a galaxy spectrum can be reconstructed from these ICs. Compared to other algorithms which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different tests are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) a consistency test of parameters from the Sloan Digital Sky Survey galaxies. We find that our MF-ICA method not only can fit the observed galaxy spectra efficiently, but also can recover the physical parameters of galaxies accurately. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find...

  16. How Are Teachers Teaching? A Nonparametric Approach

    Science.gov (United States)

    De Witte, Kristof; Van Klaveren, Chris

    2014-01-01

    This paper examines which configuration of teaching activities maximizes student performance. For this purpose a nonparametric efficiency model is formulated that accounts for (1) self-selection of students and teachers in better schools and (2) complementary teaching activities. The analysis distinguishes both individual teaching (i.e., a…

  17. Decompounding random sums: A nonparametric approach

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Pitts, Susan M.

    We review a number of applications and consider the nonlinear inverse problem of inferring the cumulative distribution function of the components in the random sum. We review the existing literature on non-parametric approaches to the problem. The models amenable to the analysis are generalized considerably...

  19. Accuracy analysis of a spectral Poisson solver

    Energy Technology Data Exchange (ETDEWEB)

    Rambaldi, S. [Dipartimento di Fisica Universita di Bologna and INFN, Bologna, Via Irnerio 46, 40126 (Italy)]. E-mail: rambaldi@bo.infn.it; Turchetti, G. [Dipartimento di Fisica Universita di Bologna and INFN, Bologna, Via Irnerio 46, 40126 (Italy); Benedetti, C. [Dipartimento di Fisica Universita di Bologna and INFN, Bologna, Via Irnerio 46, 40126 (Italy); Mattioli, F. [Dipartimento di Fisica Universita di Bologna, Bologna, Via Irnerio 46, 40126 (Italy); Franchi, A. [GSI, Darmstadt, Planckstr. 1, 64291 (Germany)

    2006-06-01

    We solve Poisson's equation in d=2,3 space dimensions by using a spectral method based on Fourier decomposition. The choice of the basis implies that Dirichlet boundary conditions on a box are satisfied. A Green's function-based procedure allows us to impose Dirichlet conditions on any smooth closed boundary, by doubling the computational complexity. The error introduced by the spectral truncation and the discretization of the charge distribution is evaluated by comparison with the exact solution, known in the case of elliptical symmetry. To this end boundary conditions on an equipotential ellipse (ellipsoid) are imposed on the numerical solution. Scaling laws for the error dependence on the number K of Fourier components for each space dimension and the number N of point charges used to simulate the charge distribution are presented and tested. A procedure to increase the accuracy of the method in the beam core region is briefly outlined.
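
    As a rough illustration of the Fourier/sine decomposition described above, the following sketch solves -Δu = f on the unit square with homogeneous Dirichlet conditions and checks it against a manufactured solution. The grid size and test source are assumed for illustration, and the Green's-function boundary correction mentioned in the abstract is not included.

```python
# Sketch of a sine-spectral solve of -Laplacian(u) = f on the unit square with
# homogeneous Dirichlet boundary conditions.  Grid size and the test source
# are illustrative; no boundary correction is applied.
import numpy as np

N = 63                                     # interior points per direction
x = np.arange(1, N + 1) / (N + 1)          # interior grid
X, Y = np.meshgrid(x, x, indexing="ij")

# Manufactured problem with known solution u = sin(pi x) sin(2 pi y).
u_exact = np.sin(np.pi * X) * np.sin(2 * np.pi * Y)
f = (np.pi**2 + (2 * np.pi)**2) * u_exact

# DST-I matrix: S @ S = ((N+1)/2) * I, so the inverse transform is S again.
j = np.arange(1, N + 1)
S = np.sin(np.pi * np.outer(j, j) / (N + 1))

F_hat = (2.0 / (N + 1))**2 * S @ f @ S     # sine coefficients of f
k2 = np.pi**2 * (j[:, None]**2 + j[None, :]**2)
U_hat = F_hat / k2                         # divide by eigenvalues of -Laplacian
u = S @ U_hat @ S                          # back to physical space

print("max error:", np.max(np.abs(u - u_exact)))   # ~machine precision here
```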

  20. Methods of Spectral Analysis in C++ (MOSAIC)

    Science.gov (United States)

    Engesser, Michael

    2016-06-01

    Stellar spectroscopic classification is most often still done by hand. MOSAIC is a project focused on the collection and classification of astronomical spectra using a computerized algorithm. The code itself attempts to accurately classify stellar spectra according to the broad spectral classes within the Morgan-Keenan system of spectral classification, based on estimated temperature and the relative abundances of certain notable elements (Hydrogen, Helium, etc.) in the stellar atmosphere. The methodology includes calibrating the wavelength for pixels across the image by using the wavelength dispersion of pixels inherent with the spectrograph used. It then calculates the location of the peak in the star's Planck spectrum in order to roughly classify the star. Fitting the graph to a blackbody curve is the final step for a correct classification. Future work will involve taking a closer look at emission lines and luminosity classes.
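
    A minimal sketch of the Planck-peak step described above: locate the spectral peak, apply Wien's displacement law, and map the temperature to a broad MK class. The spectrum is synthetic and the class temperature boundaries are the usual approximate values, assumed here for illustration.

```python
# Rough sketch of the "Planck peak" classification step: peak wavelength ->
# temperature via Wien's law -> broad MK class.  Spectrum and boundaries are
# illustrative assumptions, not the MOSAIC code.
import numpy as np

def planck(wl_m, T):
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2 * h * c**2 / wl_m**5) / np.expm1(h * c / (wl_m * k * T))

def mk_class(T):
    # Approximate MK temperature boundaries (assumed, for illustration).
    bounds = [(30000, "O"), (10000, "B"), (7500, "A"),
              (6000, "F"), (5200, "G"), (3700, "K")]
    for lo, cls in bounds:
        if T >= lo:
            return cls
    return "M"

wl = np.linspace(300e-9, 1100e-9, 2000)      # wavelength grid [m]
spectrum = planck(wl, 5800.0)                # stand-in for a calibrated spectrum

wl_peak = wl[np.argmax(spectrum)]
T_est = 2.898e-3 / wl_peak                   # Wien's displacement law [K]
print(f"T ~ {T_est:.0f} K -> class {mk_class(T_est)}")
```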

  1. Artifacts Of Spectral Analysis Of Instrument Readings

    Science.gov (United States)

    Wise, James H.

    1995-01-01

    Report presents experimental and theoretical study of some of artifacts introduced by processing outputs of two nominally identical low-frequency-reading instruments; high-sensitivity servo-accelerometers mounted together and operating, in conjunction with signal-conditioning circuits, as seismometers. Processing involved analog-to-digital conversion with anti-aliasing filtering, followed by digital processing including frequency weighting and computation of different measures of power spectral density (PSD).
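
    For reference, one common PSD measure of the kind discussed above is Welch's averaged periodogram; a hedged sketch with synthetic data follows. The sample rate, segment length and window are illustrative choices that trade variance against frequency resolution.

```python
# Welch's averaged periodogram via scipy.signal.welch.  The signal below is
# synthetic seismometer-like data; all parameter choices are illustrative.
import numpy as np
from scipy import signal

fs = 200.0                                  # sample rate [Hz], illustrative
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
x = 1e-3 * np.sin(2 * np.pi * 5.0 * t) + 1e-4 * rng.standard_normal(t.size)

f, pxx = signal.welch(x, fs=fs, window="hann", nperseg=4096)
print("peak at %.2f Hz, PSD %.3e (units^2/Hz)" % (f[np.argmax(pxx)], pxx.max()))
```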

  2. The 12-item World Health Organization Disability Assessment Schedule II (WHO-DAS II: a nonparametric item response analysis

    Directory of Open Access Journals (Sweden)

    Fernandez Ana

    2010-05-01

    Full Text Available Background: Previous studies have analyzed the psychometric properties of the World Health Organization Disability Assessment Schedule II (WHO-DAS II) using classical omnibus measures of scale quality. These analyses are sample dependent and do not model item responses as a function of the underlying trait level. The main objective of this study was to examine the effectiveness of the WHO-DAS II items and their options in discriminating between changes in the underlying disability level by means of item response analyses. We also explored differential item functioning (DIF) in men and women. Methods: The participants were 3615 adult general practice patients from 17 regions of Spain, with a first diagnosed major depressive episode. The 12-item WHO-DAS II was administered by the general practitioners during the consultation. We used a non-parametric item response method (kernel smoothing), implemented with the TestGraf software, to examine the effectiveness of each item (item characteristic curves) and their options (option characteristic curves) in discriminating between changes in the underlying disability level. We examined composite DIF to know whether women had a higher probability than men of endorsing each item. Results: Item response analyses indicated that the twelve items forming the WHO-DAS II perform very well. All items were determined to provide good discrimination across varying standardized levels of the trait. The items also had option characteristic curves that showed good discrimination, given that each increasing option became more likely than the previous as a function of increasing trait level. No gender-related DIF was found on any of the items. Conclusions: All WHO-DAS II items were very good at assessing overall disability. Our results supported the appropriateness of the weights assigned to response option categories and showed an absence of gender differences in item functioning.
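
    A minimal sketch of the kernel-smoothing idea behind such nonparametric item response analyses: estimate an option characteristic curve P(response = option | trait) with a Nadaraya-Watson estimator. The trait values and item responses below are simulated stand-ins, not WHO-DAS II data.

```python
# Kernel (Nadaraya-Watson) estimate of option characteristic curves for a
# simulated ordinal item.  All data and the bandwidth are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 3000
theta = rng.standard_normal(n)                       # latent disability level
# Simulated 3-option item whose option probabilities shift with theta.
p_top = 1 / (1 + np.exp(-(theta - 0.5)))
p_mid = 1 / (1 + np.exp(-(theta + 0.5))) - p_top
resp = np.array([rng.choice(3, p=[1 - p_top[i] - p_mid[i], p_mid[i], p_top[i]])
                 for i in range(n)])

def option_curve(grid, theta, resp, option, h=0.3):
    """Gaussian-kernel estimate of P(resp == option | theta)."""
    w = np.exp(-0.5 * ((grid[:, None] - theta[None, :]) / h) ** 2)
    return (w @ (resp == option).astype(float)) / w.sum(axis=1)

grid = np.linspace(-2, 2, 41)
for opt in range(3):
    curve = option_curve(grid, theta, resp, opt)
    print(f"option {opt}: P at theta=-2 -> {curve[0]:.2f}, at +2 -> {curve[-1]:.2f}")
```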

  3. spectral analysis of ground magnetic data in magadi area, southern ...

    African Journals Online (AJOL)

    Mgina

    issue from fractures distributed along the shores of the lake. Presence of ... Spectral analysis involving determining power spectrum was applied to magnetic data along selected profiles ... of Lake Magadi issuing from the base of fault scarps.

  4. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  5. Nonparametric tests for censored data

    CERN Document Server

    Bagdonavicus, Vilijandas; Nikulin, Mikhail

    2013-01-01

    This book concerns testing hypotheses in non-parametric models. Generalizations of many non-parametric tests to the case of censored and truncated data are considered. Most of the test results are proved and real applications are illustrated using examples. Theories and exercises are provided. The incorrect use of many tests applying most statistical software is highlighted and discussed.

  6. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations, via Markov chain Monte Carlo (MCMC) simulation, are carried out, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
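
    For reference, the spectral density of the ARMA model being estimated above can be evaluated directly from the AR and MA coefficients; a short sketch follows, with illustrative coefficient values.

```python
# ARMA(p, q) spectral density S(f) = sigma^2/(2*pi) *
# |theta(e^{-iw})|^2 / |phi(e^{-iw})|^2, with w = 2*pi*f.
# The coefficients below are illustrative only.
import numpy as np

def arma_spectrum(freqs, phi, theta, sigma2=1.0):
    w = 2 * np.pi * freqs
    num = np.ones_like(w, dtype=complex)
    den = np.ones_like(w, dtype=complex)
    for k, th in enumerate(theta, start=1):
        num += th * np.exp(-1j * w * k)
    for k, ph in enumerate(phi, start=1):
        den -= ph * np.exp(-1j * w * k)
    return sigma2 / (2 * np.pi) * np.abs(num) ** 2 / np.abs(den) ** 2

freqs = np.linspace(0.0, 0.5, 256)          # cycles per sample
S = arma_spectrum(freqs, phi=[0.6, -0.2], theta=[0.4])
print("spectral peak near f =", freqs[np.argmax(S)])
```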

  7. Spectral Analysis of Diffusions with Jump Boundary

    CERN Document Server

    Kolb, Martin

    2011-01-01

    In this paper we consider one-dimensional diffusions with constant coefficients in a finite interval with jump boundary and a certain deterministic jump distribution. We use coupling methods in order to identify the spectral gap in the case of a large drift and prove that there is a threshold drift above which the bottom of the spectrum no longer depends on the drift. As a corollary to our result we are able to answer two questions concerning elliptic eigenvalue problems with non-local boundary conditions formulated previously by Iddo Ben-Ari and Ross Pinsky.

  8. Multitemporal spectral analysis for cheatgrass (Bromus tectorum) classification.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Nagendra [ORNL; Glenn, Nancy F [Idaho State University

    2009-07-01

    Operational satellite remote sensing data can provide the temporal repeatability necessary to capture phenological differences among species. This study develops a multitemporal stacking method coupled with spectral analysis for extracting information from Landsat imagery to provide species-level information. Temporal stacking can, in an approximate mathematical sense, effectively increase the 'spectral' resolution of the system by adding spectral bands of several multitemporal images. As a demonstration, multitemporal linear spectral unmixing is used to successfully delineate cheatgrass (Bromus tectorum) from soil and surrounding vegetation (77% overall accuracy). This invasive plant is an ideal target for exploring multitemporal methods because of its phenological differences with other vegetation in early spring and, to a lesser degree, in late summer. The techniques developed in this work are directly applicable for other targets with temporally unique spectral differences.
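
    The stacking-plus-unmixing idea can be sketched in a few lines: endmember spectra from several dates are concatenated into one long vector per class and each stacked pixel is unmixed by non-negative least squares. The endmember and pixel values below are synthetic placeholders, not Landsat data.

```python
# Multitemporal stacking + non-negative least-squares unmixing sketch.
# Endmembers and the test pixel are synthetic stand-ins.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_bands, n_dates = 6, 3                       # e.g. Landsat bands x dates
# Columns: cheatgrass, other vegetation, soil (stacked over dates).
E = rng.uniform(0.05, 0.6, size=(n_bands * n_dates, 3))

true_abund = np.array([0.5, 0.2, 0.3])
pixel = E @ true_abund + 0.005 * rng.standard_normal(n_bands * n_dates)

abund, _ = nnls(E, pixel)                     # non-negative abundances
abund /= abund.sum()                          # renormalise to sum to one
print("estimated abundances:", np.round(abund, 3))
```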

  9. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
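
    The percentage differences quoted above come from spectral-mismatch calculations of the following kind: weight candidate spectra by the photodiode's spectral response and compare against a reference spectrum. The response curve and spectra in this sketch are crude synthetic stand-ins for the SMARTS model output used in the report.

```python
# Spectral-mismatch sketch: fraction of each spectrum "seen" by a photodiode
# response, compared against a reference spectrum.  All curves are synthetic.
import numpy as np
from scipy.integrate import trapezoid

wl = np.linspace(300, 1100, 801)                       # nm
# Toy silicon photodiode response rising to a plateau near 950 nm (assumed).
sr = np.clip((wl - 300) / 650.0, 0.0, 1.0)
# Toy spectra: a "reference" and a bluer "diffuse sky" distribution.
e_ref = np.exp(-0.5 * ((wl - 800) / 300.0) ** 2)
e_dif = np.exp(-0.5 * ((wl - 550) / 200.0) ** 2)

def weighted_fraction(e):
    """Fraction of the spectrum's energy seen by the photodiode response."""
    return trapezoid(sr * e, wl) / trapezoid(e, wl)

mismatch = weighted_fraction(e_dif) / weighted_fraction(e_ref)
print("spectral mismatch factor (diffuse vs reference): %.3f" % mismatch)
```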

  10. Hyper-spectral scanner design and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.; Moses, J.; Smith, R.

    1996-06-01

    This is the final report of a two-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). An earlier project produced rough designs for key components of a compact hyper-spectral sensor for environmental and ecological measurements. Such sensors could be deployed on unmanned vehicles, aircraft, or satellites for measurements important to agriculture, the environment, and ecologies. This represents an important advance in remote sensing. Motorola invited us to propose an add-on, proof-of-principle sensor for their Comet satellite, whose primary mission is to demonstrate a channel of the IRIDIUM satellite communications system. Our project converted the preliminary designs from the previous effort into final designs for the telescope, camera, computer and interfaces that constitute the hyper-spectral scanning sensor. The work concentrated on design, fabrication, preliminary integration, and testing of the electronic circuit boards for the computer, data compression board, and interface board for the camera-computer and computer-modulator (transmitter) interfaces.

  11. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    施沛德; 王海燕; 张利华

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in the literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right-censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data-driven criterion, we also obtain...

  12. 2nd Conference of the International Society for Nonparametric Statistics

    CERN Document Server

    Manteiga, Wenceslao; Romo, Juan

    2016-01-01

    This volume collects selected, peer-reviewed contributions from the 2nd Conference of the International Society for Nonparametric Statistics (ISNPS), held in Cádiz (Spain) between June 11–16 2014, and sponsored by the American Statistical Association, the Institute of Mathematical Statistics, the Bernoulli Society for Mathematical Statistics and Probability, the Journal of Nonparametric Statistics and Universidad Carlos III de Madrid. The 15 articles are a representative sample of the 336 contributed papers presented at the conference. They cover topics such as high-dimensional data modelling, inference for stochastic processes and for dependent data, nonparametric and goodness-of-fit testing, nonparametric curve estimation, object-oriented data analysis, and semiparametric inference. The aim of the ISNPS 2014 conference was to bring together recent advances and trends in several areas of nonparametric statistics in order to facilitate the exchange of research ideas, promote collaboration among researchers...

  13. Nonparametric statistical methods using R

    CERN Document Server

    Kloke, John

    2014-01-01

    A Practical Guide to Implementing Nonparametric and Rank-Based ProceduresNonparametric Statistical Methods Using R covers traditional nonparametric methods and rank-based analyses, including estimation and inference for models ranging from simple location models to general linear and nonlinear models for uncorrelated and correlated responses. The authors emphasize applications and statistical computation. They illustrate the methods with many real and simulated data examples using R, including the packages Rfit and npsm.The book first gives an overview of the R language and basic statistical c

  14. Broadband Spectral Analysis of Aql X-1

    CERN Document Server

    Raichur, H; Dewangan, G

    2011-01-01

    We present the results of a broadband spectral study of the transient Low Mass X-ray Binary Aql X-1 observed by the Suzaku and Rossi X-ray Timing Explorer satellites. The source was observed during its 2007 outburst in the High/Soft (Banana) state and in the Low/Hard (Extreme Island) state. Both the Banana state and the Extreme Island state spectra are best described by a two-component model consisting of a soft multi-colour blackbody emission likely originating from the accretion disk and a harder Comptonized emission from the boundary layer. Evidence for a hard tail (extending to ~50 keV) is found during the Banana state; this further (transient) component, accounting for at least ~1.5% of the source luminosity, is modeled by a power law. Aql X-1 is the second Atoll source after GX 13+1 to show a high energy tail. The presence of a weak but broad Fe line provides further support for a standard accretion disk extending nearly to the neutron star surface. The input photons for the Comptonizing boundary layer could...

  15. Spectral Analysis and Atmospheric Models of Microflares

    Institute of Scientific and Technical Information of China (English)

    Cheng Fang; Yu-Hua Tang; Zhi Xu

    2006-01-01

    By use of the high-resolution spectral data obtained with THEMIS on 2002 September 5, the spectra and characteristics of five well-observed microflares have been analyzed. Our results indicate that some of them are located near the longitudinal magnetic polarity inversion lines. All the microflares are accompanied by mass motions. The most obvious characteristic of the Hα microflare spectra is the emission at the center of both the Hα and Ca II 8542 Å lines. For the first time both thermal and non-thermal semi-empirical atmospheric models for the conspicuous and faint microflares are computed. In computing the non-thermal models, we assume that the electron beam resulting from magnetic reconnection is produced in the chromosphere, because it requires lower energies for the injected particles. It is found that there is obvious heating in the low chromosphere. The temperature enhancement is about 1000-2200 K in the thermal models. If the non-thermal effects are included, then the required temperature increase can be reduced by 100-150 K. These results imply that the Hα microflares can probably be produced by magnetic reconnection in the lower solar atmosphere. The radiative and kinetic energies of the Hα microflares are estimated and the total energy is found to be 10²⁷ to 4×10²⁸ erg.

  16. Simulated spectra for QA/QC of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Jackman, K. R. (Kevin R.); Biegalski, S. R.

    2004-01-01

    Monte Carlo simulated spectra have been developed to test the peak analysis algorithms of several spectral analysis software packages. Using MCNP 5, generic sample spectra were generated in order to perform ANSI N42.14 standard spectral tests on Canberra Genie-2000, Ortec GammaVision, and UniSampo. The reference spectra were generated in MCNP 5 using an F8, pulse height, tally with a detector model of an actual Germanium detector used in counting. The detector model matches the detector resolution, energy calibration, and efficiency. The simulated spectra have been found to be useful in testing the reliability and performance of spectral analysis programs. The detector model used was found to be useful in testing the performance of modern spectral analysis software tools. The software packages were analyzed and found to be in compliance with the ANSI 42.14 tests of the peak-search and peak-fitting algorithms. This method of using simulated spectra can be used to perform the ANSI 42.14 tests on the reliability and performance of spectral analysis programs in the absence of standard radioactive materials.

  17. SPAM- SPECTRAL ANALYSIS MANAGER (DEC VAX/VMS VERSION)

    Science.gov (United States)

    Solomon, J. E.

    1994-01-01

    The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a timesequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different

  18. Induction Motor Speed Estimation by Using Spectral Current Analysis

    OpenAIRE

    2009-01-01

    An interesting application for the FFT analysis is related to the induction motor speed estimation based on spectral current analysis. The paper presents the possibility of induction motor speed estimation by using the current harmonics generated because of the rotor slots and of the eccentricity.
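
    A hedged sketch of the idea: compute the stator-current spectrum, locate an eccentricity sideband near f_supply + f_rotation, and read the speed from the peak spacing. The simulated current and the single-sideband model are simplified assumptions, not the full slot-harmonic analysis.

```python
# Speed estimation from the stator-current spectrum via an eccentricity
# sideband at f_supply + f_rotation.  Signal and sideband model are simplified
# illustrative assumptions.
import numpy as np

fs, f_sup = 5000.0, 50.0                     # sampling and supply frequency [Hz]
f_rot_true = 24.5                            # true rotational frequency [Hz]
t = np.arange(0, 10, 1 / fs)
i_stator = (np.sin(2 * np.pi * f_sup * t)
            + 0.02 * np.sin(2 * np.pi * (f_sup + f_rot_true) * t)
            + 0.01 * np.random.default_rng(4).standard_normal(t.size))

spec = np.abs(np.fft.rfft(i_stator * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Search for the sideband in a plausible window above the supply frequency.
band = (freqs > f_sup + 10) & (freqs < f_sup + 40)
f_side = freqs[band][np.argmax(spec[band])]
rpm = 60.0 * (f_side - f_sup)
print("estimated speed: %.0f rpm" % rpm)     # ~1470 rpm for f_rot_true = 24.5 Hz
```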

  19. Multi-spectral Image Analysis for Astaxanthin Coating Classification

    DEFF Research Database (Denmark)

    Ljungqvist, Martin Georg; Ersbøll, Bjarne Kjær; Nielsen, Michael Engelbrecht

    2011-01-01

    Industrial quality inspection using image analysis on astaxanthin coating in aquaculture feed pellets is of great importance for automatic production control. In this study multi-spectral image analysis of pellets was performed using LDA, QDA, SNV and PCA on pixel level and mean value of pixels...

  20. International comparisons of the technical efficiency of the hospital sector: panel data analysis of OECD countries using parametric and non-parametric approaches.

    Science.gov (United States)

    Varabyova, Yauheniya; Schreyögg, Jonas

    2013-09-01

    There is a growing interest in the cross-country comparisons of the performance of national health care systems. The present work provides a comparison of the technical efficiency of the hospital sector using unbalanced panel data from OECD countries over the period 2000-2009. The estimation of the technical efficiency of the hospital sector is performed using nonparametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Internal and external validity of findings is assessed by estimating the Spearman rank correlations between the results obtained in different model specifications. The panel-data analyses using two-step DEA and one-stage SFA show that countries which have higher health care expenditure per capita tend to have a more technically efficient hospital sector. Whether the expenditure is financed through private or public sources is not related to the technical efficiency of the hospital sector. On the other hand, the hospital sector in countries with higher income inequality and longer average hospital length of stay is less technically efficient. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
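
    The first stage of such a two-step DEA can be sketched as a small linear programme; the input-oriented CCR formulation below uses SciPy's linprog. The hospital inputs and outputs are invented for illustration, and the second-stage regression and the SFA comparison are not shown.

```python
# Input-oriented CCR DEA sketch solved with scipy.optimize.linprog.
# The tiny input/output data are invented purely for illustration.
import numpy as np
from scipy.optimize import linprog

# Rows: hospitals (DMUs); inputs X (beds, staff), outputs Y (discharges).
X = np.array([[100.0, 300.0], [120.0, 280.0], [90.0, 350.0], [150.0, 400.0]])
Y = np.array([[5000.0], [5200.0], [4800.0], [5100.0]])

def ccr_efficiency(j0):
    n, m = X.shape[0], X.shape[1]
    s = Y.shape[1]
    # Variables: [theta, lambda_1..lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(m):                       # sum_j lam_j * x_ij <= theta * x_i,j0
        A_ub.append(np.r_[-X[j0, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):                       # sum_j lam_j * y_rj >= y_r,j0
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[j0, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for j in range(X.shape[0]):
    print(f"DMU {j}: technical efficiency = {ccr_efficiency(j):.3f}")
```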

  1. A Critical Look at the Mass-Metallicity-SFR Relation in the Local Universe: Non-parametric Analysis Framework and Confounding Systematics

    CERN Document Server

    Salim, Samir; Ly, Chun; Brinchmann, Jarle; Davé, Romeel; Dickinson, Mark; Salzer, John J; Charlot, Stéphane

    2014-01-01

    It has been proposed that the mass-metallicity relation of galaxies exhibits a secondary dependence on star formation rate (SFR), and that the resulting M-Z-SFR relation may be redshift-invariant, i.e., "fundamental." However, conflicting results on the character of the SFR dependence, and whether it exists, have been reported. To gain insight into the origins of the conflicting results, we (a) devise a non-parametric, astrophysically-motivated analysis framework based on the offset from the star-forming ("main") sequence at a given stellar mass (relative specific SFR), (b) apply this methodology and perform a comprehensive re-analysis of the local M-Z-SFR relation, based on SDSS, GALEX, and WISE data, and (c) study the impact of sample selection, and of using different metallicity and SFR indicators. We show that metallicity is anti-correlated with specific SFR regardless of the indicators used. We do not find that the relation is spurious due to correlations arising from biased metallicity measurements, or ...

  2. Spectral mixture analysis of EELS spectrum-images

    Energy Technology Data Exchange (ETDEWEB)

    Dobigeon, Nicolas [University of Toulouse, IRIT/INP-ENSEEIHT, 2 rue Camichel, 31071 Toulouse Cedex 7 (France); Brun, Nathalie, E-mail: nathalie.brun@u-psud.fr [University of Paris Sud, Laboratoire de Physique des Solides, CNRS, UMR 8502, 91405 Orsay Cedex (France)

    2012-09-15

    Recent advances in detectors and computer science have enabled the acquisition and the processing of multidimensional datasets, in particular in the field of spectral imaging. Benefiting from these new developments, Earth scientists try to recover the reflectance spectra of macroscopic materials (e.g., water, grass, mineral types…) present in an observed scene and to estimate their respective proportions in each mixed pixel of the acquired image. This task is usually referred to as spectral mixture analysis or spectral unmixing (SU). SU aims at decomposing the measured pixel spectrum into a collection of constituent spectra, called endmembers, and a set of corresponding fractions (abundances) that indicate the proportion of each endmember present in the pixel. Similarly, when processing spectrum-images, microscopists usually try to map elemental, physical and chemical state information of a given material. This paper reports how a SU algorithm dedicated to remote sensing hyperspectral images can be successfully applied to analyze spectrum-image resulting from electron energy-loss spectroscopy (EELS). SU generally overcomes standard limitations inherent to other multivariate statistical analysis methods, such as principal component analysis (PCA) or independent component analysis (ICA), that have been previously used to analyze EELS maps. Indeed, ICA and PCA may perform poorly for linear spectral mixture analysis due to the strong dependence between the abundances of the different materials. One example is presented here to demonstrate the potential of this technique for EELS analysis. Highlights: EELS spectrum images are identical to hyperspectral images for Earth science. Spectral unmixing algorithms have proliferated in the remote sensing field. These powerful techniques can be successfully applied to EELS mapping. Potential…

  3. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction but data rarely satisfies testable conditions. To overcome this we provide a way to estimate homothetic efficiency of...

  4. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  5. Mass Defect from Nuclear Physics to Mass Spectral Analysis

    Science.gov (United States)

    Pourshahian, Soheil

    2017-09-01

    Mass defect is associated with the binding energy of the nucleus. It is a fundamental property of the nucleus and the principle behind nuclear energy. Mass defect has also entered into the mass spectrometry terminology with the availability of high resolution mass spectrometry and has found application in mass spectral analysis. In this application, isobaric masses are differentiated and identified by their mass defect. What is the relationship between nuclear mass defect and mass defect used in mass spectral analysis, and are they the same?

  6. Spectral theory and nonlinear analysis with applications to spatial ecology

    CERN Document Server

    Cano-Casanova, S; Mora-Corral , C

    2005-01-01

    This volume details some of the latest advances in spectral theory and nonlinear analysis through various cutting-edge theories on algebraic multiplicities, global bifurcation theory, non-linear Schrödinger equations, non-linear boundary value problems, large solutions, metasolutions, dynamical systems, and applications to spatial ecology. The main scope of the book is bringing together a series of topics that have evolved separately during the last decades around the common denominator of spectral theory and nonlinear analysis - from the most abstract developments up to the most concrete applications to population dynamics and socio-biology - in an effort to fill the existing gaps between these fields.

  7. Online Fault Diagnosis Method Based on Nonlinear Spectral Analysis

    Institute of Scientific and Technical Information of China (English)

    WEI Rui-xuan; WU Li-xun; WANG Yong-chang; HAN Chong-zhao

    2005-01-01

    The fault diagnosis based on nonlinear spectral analysis is a new technique for nonlinear fault diagnosis, but its online application can be limited because of the enormous computation requirements for the estimation of general frequency response functions. Based on the fully decoupled Volterra identification algorithm, a new online fault diagnosis method based on nonlinear spectral analysis is presented, which can effectively reduce the online computation requirements for the general frequency response functions. The composition and working principle of the method are described, test experiments have been carried out on the damping spring of a vehicle suspension system using the new method, and the results indicate that the method is efficient.

  8. Spectral Analysis of Large Particle Systems

    DEFF Research Database (Denmark)

    Dahlbæk, Jonas

    2017-01-01

    The Fröhlich polaron model is defined as a quadratic form, and its discrete spectrum is studied for each fixed total momentum ξ ∈ R^d in the weak coupling regime. Criteria are determined by means of which the number of discrete eigenvalues may be deduced. The analysis is based on relating...

  9. Spectral analysis of the Chandra comet survey

    NARCIS (Netherlands)

    Bodewits, D.; Christian, D. J.; Torney, M.; Dryer, M.; Lisse, C. M.; Dennerl, K.; Zurbuchen, T. H.; Wolk, S. J.; Tielens, A. G. G. M.; Hoekstra, R.

    2007-01-01

    Aims. We present results of the analysis of cometary X-ray spectra with an extended version of our charge exchange emission model (Bodewits et al. 2006). We have applied this model to the sample of 8 comets thus far observed with the Chandra X-ray Observatory and ACIS spectrometer in the 300 - 1000...

  10. Spectral mixture analysis of EELS spectrum-images.

    Science.gov (United States)

    Dobigeon, Nicolas; Brun, Nathalie

    2012-09-01

    Recent advances in detectors and computer science have enabled the acquisition and the processing of multidimensional datasets, in particular in the field of spectral imaging. Benefiting from these new developments, Earth scientists try to recover the reflectance spectra of macroscopic materials (e.g., water, grass, mineral types…) present in an observed scene and to estimate their respective proportions in each mixed pixel of the acquired image. This task is usually referred to as spectral mixture analysis or spectral unmixing (SU). SU aims at decomposing the measured pixel spectrum into a collection of constituent spectra, called endmembers, and a set of corresponding fractions (abundances) that indicate the proportion of each endmember present in the pixel. Similarly, when processing spectrum-images, microscopists usually try to map elemental, physical and chemical state information of a given material. This paper reports how a SU algorithm dedicated to remote sensing hyperspectral images can be successfully applied to analyze spectrum-image resulting from electron energy-loss spectroscopy (EELS). SU generally overcomes standard limitations inherent to other multivariate statistical analysis methods, such as principal component analysis (PCA) or independent component analysis (ICA), that have been previously used to analyze EELS maps. Indeed, ICA and PCA may perform poorly for linear spectral mixture analysis due to the strong dependence between the abundances of the different materials. One example is presented here to demonstrate the potential of this technique for EELS analysis. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. HYPERSPECTRAL HYPERION IMAGERY ANALYSIS AND ITS APPLICATION USING SPECTRAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    W. Pervez

    2015-03-01

    Full Text Available Rapid advancement in remote sensing opens new avenues to explore hyperspectral Hyperion imagery pre-processing techniques, analysis and applications for land use mapping. The hyperspectral data consist of 242 bands, of which 196 calibrated/useful bands are available for hyperspectral applications. Atmospheric correction applied to the calibrated bands makes the data more useful for further processing and application. Principal component (PC) analysis applied to the calibrated bands reduces the dimensionality of the data, and it is found that 99% of the data is held in the first 10 PCs. Feature extraction is one of the important applications, using vegetation delineation and the normalized difference vegetation index. Machine learning classifiers use this information to identify pixels with significant differences in their spectral signatures, which is very useful for classification of an image. A supervised machine learning classifier has been used for classification of the hyperspectral image, which resulted in an overall efficiency of 86.67% and a Kappa coefficient of 0.7998.
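
    Two of the steps mentioned above are easy to sketch: checking how many principal components hold 99% of the variance of a band stack, and computing the normalized difference vegetation index. The reflectance cube below is random data standing in for Hyperion imagery, and the red/NIR band indices are assumed.

```python
# PCA variance check (via SVD) and NDVI computation for a synthetic band stack
# standing in for Hyperion data; band indices for red/NIR are assumed.
import numpy as np

rng = np.random.default_rng(5)
rows, cols, bands = 100, 100, 196              # calibrated Hyperion bands
cube = rng.random((rows, cols, bands))

Xc = cube.reshape(-1, bands)
Xc = Xc - Xc.mean(axis=0)                      # centre before PCA
_, svals, _ = np.linalg.svd(Xc, full_matrices=False)
var_ratio = np.cumsum(svals**2) / np.sum(svals**2)
n99 = int(np.searchsorted(var_ratio, 0.99)) + 1
print("components needed for 99% variance:", n99)

red, nir = cube[:, :, 29], cube[:, :, 49]      # assumed band indices
ndvi = (nir - red) / (nir + red + 1e-12)
print("mean NDVI:", float(ndvi.mean()))
```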

  12. Spectral Synthesis via Mean Field approach to Independent Component Analysis

    Science.gov (United States)

    Hu, Ning; Su, Shan-Shan; Kong, Xu

    2016-03-01

    We apply a new statistical analysis technique, the Mean Field approach to Independent Component Analysis (MF-ICA) in a Bayesian framework, to galaxy spectral analysis. This algorithm can compress a stellar spectral library into a few Independent Components (ICs), and the galaxy spectrum can be reconstructed from these ICs. Compared to other algorithms which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different methods are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) consistency test of parameters derived with galaxies from the Sloan Digital Sky Survey. We find that our MF-ICA method can not only fit the observed galaxy spectra efficiently, but can also accurately recover the physical parameters of galaxies. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find it can provide excellent fitting results for low signal-to-noise spectra.
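
    A hedged sketch of the decomposition idea: compress a (synthetic) spectral library into a few independent components and reconstruct a target spectrum as a linear combination of them. scikit-learn's FastICA is used here as a convenient stand-in for the Mean Field ICA algorithm of the paper.

```python
# ICA compression of a toy spectral library and least-squares reconstruction
# of a target spectrum.  FastICA is a stand-in for MF-ICA; all spectra are
# synthetic placeholders.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(6)
n_spectra, n_wl = 200, 500
wl = np.linspace(3500, 7500, n_wl)             # Angstrom grid, illustrative
# Toy "library": smooth continua plus an absorption feature of varying depth.
library = np.array([np.exp(-0.5 * ((wl - c) / 800.0) ** 2)
                    - 0.3 * rng.random() * np.exp(-0.5 * ((wl - 4861) / 20.0) ** 2)
                    for c in rng.uniform(4000, 7000, n_spectra)])

ica = FastICA(n_components=6, random_state=0, max_iter=1000)
ics = ica.fit_transform(library.T)             # (n_wl, 6): component spectra

# Fit an "observed" spectrum as a linear combination of the ICs (+ constant).
galaxy = library[:10].mean(axis=0)
design = np.column_stack([ics, np.ones(n_wl)])
coeffs, *_ = np.linalg.lstsq(design, galaxy, rcond=None)
model = design @ coeffs
print("fit residual (rms):", float(np.sqrt(np.mean((model - galaxy) ** 2))))
```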

  13. A Review of Unsupervised Spectral Target Analysis for Hyperspectral Imagery

    Directory of Open Access Journals (Sweden)

    Yingzi Du

    2010-01-01

    Full Text Available One of the great challenges in unsupervised hyperspectral target analysis is how to obtain the desired knowledge in an unsupervised manner directly from the data for image analysis. This paper provides a review of unsupervised target analysis by first addressing two fundamental issues, “what are material substances of interest, referred to as targets?” and “how can these targets be extracted from the data?”, and then further developing least squares (LS)-based unsupervised algorithms for finding spectral targets for analysis. In order to validate and substantiate the proposed unsupervised hyperspectral target analysis, three applications in endmember extraction, target detection and linear spectral unmixing are considered, where custom-designed synthetic images and real image scenes are used to conduct experiments.

  15. A note on the use of the non-parametric Wilcoxon-Mann-Whitney test in the analysis of medical studies

    Directory of Open Access Journals (Sweden)

    Kühnast, Corinna

    2008-04-01

    Full Text Available Background: Although non-normal data are widespread in biomedical research, parametric tests unnecessarily predominate in statistical analyses. Methods: We surveyed five biomedical journals and – for all studies which contain at least the unpaired t-test or the non-parametric Wilcoxon-Mann-Whitney test – investigated the relationship between the choice of a statistical test and other variables such as type of journal, sample size, randomization, sponsoring etc. Results: The non-parametric Wilcoxon-Mann-Whitney test was used in 30% of the studies. In a multivariable logistic regression, the type of journal, the test object, the scale of measurement and the statistical software were significant. The non-parametric test was more common in the case of non-continuous data, in high-impact journals, in studies in humans, and when the statistical software was specified, in particular when SPSS was used.
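
    For reference, the test discussed above is a one-liner with SciPy; the two samples below are simulated skewed data of the kind for which it is preferred over the unpaired t-test.

```python
# Wilcoxon-Mann-Whitney test on simulated skewed (lognormal) samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
treatment = rng.lognormal(mean=0.2, sigma=0.8, size=40)
control = rng.lognormal(mean=0.0, sigma=0.8, size=40)

u_stat, p_value = stats.mannwhitneyu(treatment, control, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```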

  16. Tomato sorting using independent component analysis on spectral images

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.; Young, I.T.

    2003-01-01

    Independent Component Analysis is one of the most widely used methods for blind source separation. In this paper we use this technique to estimate the most important compounds which play a role in the ripening of tomatoes. Spectral images of tomatoes were analyzed. Two main independent components

  17. PIXE-quantified AXSIA : elemental mapping by multivariate spectral analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, Barney Lee; Antolak, Arlyn J. (Sandia National Labs, Livermore, CA); Campbell, J. L. (University of Guelph, Guelph, ON, Canada); Ryan, C. G. (CSIRO Exploration and Mining Bayview Road, Clayton VIC, Australia); Provencio, Paula Polyak; Barrett, Keith E. (Primecore Systems, Albuquerque, NM,); Kotula, Paul Gabriel

    2005-07-01

    Automated, nonbiased, multivariate statistical analysis techniques are useful for converting very large amounts of data into a smaller, more manageable number of chemical components (spectra and images) that are needed to describe the measurement. We report the first use of the multivariate spectral analysis program AXSIA (Automated eXpert Spectral Image Analysis) developed at Sandia National Laboratories to quantitatively analyze micro-PIXE data maps. AXSIA implements a multivariate curve resolution technique that reduces the spectral image data sets into a limited number of physically realizable and easily interpretable components (including both spectra and images). We show that the principal component spectra can be further analyzed using conventional PIXE programs to convert the weighting images into quantitative concentration maps. A common elemental data set has been analyzed using three different PIXE analysis codes and the results compared to the cases when each of these codes is used to separately analyze the associated AXSIA principal component spectral data. We find that these comparisons are in good quantitative agreement with each other.

  18. Spectangular - Spectral Disentangling For Detailed Chemical Analysis Of Binaries

    Science.gov (United States)

    Sablowski, Daniel

    2016-08-01

    Disentangling of spectra helps to improve the orbit parameters and allows detailed chemical analysis. Spectangular is a GUI program written in C++ for spectral disentangling of spectra of SB1 and SB2 systems. It is based on singular value decomposition in the wavelength space and is coupled to an orbital solution. The results are the component spectra and the orbital parameters.

  19. Spectral Analysis using Linearly Chirped Gaussian Pulse Stacking

    Institute of Scientific and Technical Information of China (English)

    ZHENG Huan; WANG An-Ting; XU Li-Xin; MING Hai

    2009-01-01

    We analyze the spectrum of a stacked pulse with the technique of linearly chirped Gaussian pulse stacking.Our results show that there are modulation structures in the spectrum of the stacked pulse. The modulation frequencies are discussed in detail. By applying spectral analysis, we find that the intensity fluctuation cannot be smoothed by introducing an optical amplitude filter.

  20. Spectral derivative analysis of solar spectroradiometric measurements: Theoretical basis

    Science.gov (United States)

    Hansell, R. A.; Tsay, S.-C.; Pantina, P.; Lewis, J. R.; Ji, Q.; Herman, J. R.

    2014-07-01

    Spectral derivative analysis, a commonly used tool in analytical spectroscopy, is described for studying cirrus clouds and aerosols using hyperspectral, remote sensing data. The methodology employs spectral measurements from the 2006 Biomass-burning Aerosols in Southeast Asia field study to demonstrate the approach. Spectral peaks associated with the first two derivatives of measured/modeled transmitted spectral fluxes are examined in terms of their shapes, magnitudes, and positions from 350 to 750 nm, where variability is largest. Differences in spectral features between media are mainly associated with particle size and imaginary term of the complex refractive index. Differences in derivative spectra permit cirrus to be conservatively detected at optical depths near the optical thin limit of ~0.03 and yield valuable insight into the composition and hygroscopic nature of aerosols. Biomass-burning smoke aerosols/cirrus generally exhibit positive/negative slopes, respectively, across the 500-700 nm spectral band. The effect of cirrus in combined media is to increase/decrease the slope as cloud optical thickness decreases/increases. For thick cirrus, the slope tends to 0. An algorithm is also presented which employs a two model fit of derivative spectra for determining relative contributions of aerosols/clouds to measured data, thus enabling the optical thickness of the media to be partitioned. For the cases examined, aerosols/clouds explain ~83%/17% of the spectral signatures, respectively, yielding a mean cirrus cloud optical thickness of 0.08 ± 0.03, which compared reasonably well with those retrieved from a collocated Micropulse Lidar Network Instrument (0.09 ± 0.04). This method permits extracting the maximum informational content from hyperspectral data for atmospheric remote sensing applications.
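
    A minimal sketch of the derivative-spectrum computation: smooth the measured flux and take first and second derivatives (here with a Savitzky-Golay filter), then inspect the slope over the 500-700 nm band. The input spectrum, window length and polynomial order are assumed for illustration.

```python
# First/second derivative spectra via a Savitzky-Golay filter and the mean
# slope over 500-700 nm.  Spectrum and filter settings are illustrative.
import numpy as np
from scipy.signal import savgol_filter

wl = np.arange(350.0, 750.0, 1.0)                      # nm
rng = np.random.default_rng(8)
flux = 0.6 + 0.0008 * (wl - 350) + 0.01 * rng.standard_normal(wl.size)

d1 = savgol_filter(flux, window_length=31, polyorder=3, deriv=1, delta=1.0)
d2 = savgol_filter(flux, window_length=31, polyorder=3, deriv=2, delta=1.0)

band = (wl >= 500) & (wl <= 700)
print("mean first derivative (500-700 nm): %.2e per nm" % d1[band].mean())
print("mean second derivative (500-700 nm): %.2e per nm^2" % d2[band].mean())
```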

  1. Spectral characteristics analysis of red tide water in mesocosm experiment

    Science.gov (United States)

    Cui, Tingwei; Zhang, Jie; Zhang, Hongliang; Ma, Yi; Gao, Xuemin

    2003-05-01

    A mesocosm ecosystem experiment with enclosed red-tide seawater was carried out from July to September 2001. Four species whose cell quantities were dominant in the red tide were obtained. For each species, in situ spectral measurements were carried out during the whole process from onset to die-off. After data processing, characteristic spectra of red tides with different dominant species were obtained. Comparison and analysis of the characteristics of the different spectra show that in the band region between 685 and 735 nanometers the spectral characteristics of red tide are clearly different from those of normal water. Compared to spectra of normal water, spectra of red tide have a strong reflectance peak in this band region. For red tides dominated by different species, the positions of the reflectance peaks also differ: the second peak of the Mesodinium rubrum spectrum lies between 726~732 nm, which is more than 21 nm away from those of the other dominant species; Leptocylindrus danicus's second spectral peak covers 686~694 nm; that of Skeletonema costatum lies between 691~693 nm; Chattonella marina's second spectral peak lies at about 703~705 nm. Thus we can try to determine whether a red tide has occurred according to its spectral data. In order to monitor red tide events and identify the dominant species with hyperspectral remote sensing, acquiring spectral data for as many dominant red-tide species as possible becomes basic work to be achieved for spectral matching, information extraction and related tasks based on hyperspectral data.

  2. Image registration based on matrix perturbation analysis using spectral graph

    Institute of Scientific and Technical Information of China (English)

    Chengcai Leng; Zheng Tian; Jing Li; Mingtao Ding

    2009-01-01

    We present a novel perspective on characterizing the spectral correspondence between nodes of a weighted graph, with application to image registration. It is based on matrix perturbation analysis of the spectral graph. The contribution may be divided into three parts. Firstly, the perturbation matrix is obtained by perturbing the matrix of the graph model. Secondly, an orthogonal matrix is obtained based on an optimal parameter, which can better capture correspondence features. Thirdly, the optimal matching matrix is obtained by adjusting the signs of the orthogonal matrix for image registration. Experiments on both synthetic images and real-world images demonstrate the effectiveness and accuracy of the proposed method.

  3. The Use of Information Transmission as Nonparametric Correlation in the Analysis of Complex Behavior: A Preliminary Report.

    Science.gov (United States)

    1980-05-01

    Engineering Psychology Programs and Office of Naval Research (Code 455), Arlington, VA 22217. Approved for public release; distribution unlimited. Introduction: Experimental psychology has developed a sophisticated set of experimental designs for the analysis of behaviour when...

  4. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

    Full Text Available The article is devoted to nonparametric point and interval estimation of characteristics of a probability distribution (the expectation, median, variance, standard deviation and coefficient of variation) from sample results. The sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function possessing the required number of moments. The nonparametric analysis procedures are compared with the parametric procedures based on the assumption that the sample values follow a normal distribution. Point estimators are constructed in the obvious way, using sample analogs of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and of functions of them. The nonparametric asymptotic confidence intervals are obtained through a special technology for deriving asymptotic relations in applied statistics. In the first step this technology applies the multidimensional central limit theorem to sums of vectors whose coordinates are powers of the initial random variables. The second step transforms the limiting multivariate normal vector into the vector of interest, using linearization and discarding infinitesimal quantities. The third step is a rigorous justification of the results at the standard level of mathematical-statistical reasoning, which usually requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are the operating times to the limit state of 50 cutting tools. Using methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions in situations where the normality hypothesis fails. The practical recommendation is that for the analysis of real data one should use nonparametric confidence limits...
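
    A minimal illustration of such nonparametric asymptotic confidence limits, for the expectation and the variance, using only sample moments and the central limit theorem (no normality assumption). The data are simulated; the operating-time data of the article are not reproduced.

```python
# CLT-based (distribution-free) confidence limits for the mean and variance
# from sample moments.  The simulated "operating times" are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
x = rng.gamma(shape=2.0, scale=50.0, size=50)        # skewed sample, n = 50
n = x.size
z = stats.norm.ppf(0.975)                            # 95% two-sided

mean, s2 = x.mean(), x.var(ddof=1)
m4 = np.mean((x - mean) ** 4)                        # fourth central moment

ci_mean = (mean - z * np.sqrt(s2 / n), mean + z * np.sqrt(s2 / n))
se_var = np.sqrt((m4 - s2**2) / n)                   # asymptotic s.e. of s^2
ci_var = (s2 - z * se_var, s2 + z * se_var)

print("mean %.1f, 95%% CI (%.1f, %.1f)" % (mean, *ci_mean))
print("variance %.0f, 95%% CI (%.0f, %.0f)" % (s2, *ci_var))
```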

  5. Spectral Fatigue Analysis of Jacket Stuctures in Mumbai High Field

    Directory of Open Access Journals (Sweden)

    S. Nallayarasu

    2010-09-01

    Full Text Available Fatigue analysis is an integral part of the design of offshore structures and shall be carried out with a suitable method of discretising the seastate. Historically, for most fixed offshore structures, deterministic fatigue analysis has been found to predict the fatigue damage reasonably well and has been in use for several decades. Fixed structures with small topsides mostly exhibit static response characteristics, with natural periods of less than about 2 seconds. Offshore platforms with larger production capacity and in deeper water may require specialised treatment of the seastate, because their dynamic characteristics make them more vulnerable to fatigue damage. A spectral fatigue analysis has been performed for two different platforms in the Mumbai High field (MNP and RS14) and compared with a deterministic analysis. The spectral fatigue analysis predicts a lower fatigue life than the deterministic analysis, since the dynamic amplification of wave loads is treated only approximately in the deterministic approach. Hence, for large structures it is recommended to use spectral methods to assess the fatigue life of tubular joints.

  6. Time frequency analysis of Jovian and Saturnian radio spectral patterns

    Science.gov (United States)

    Boudjada, Mohammed Y.; Galopeau, Patrick H. M.; Al-Haddad, Emad; Lammer, Helmut

    2016-04-01

    Prominent radio spectral patterns were observed by the Cassini Radio and Plasma Wave Science experiment (RPWS), principally at Jupiter and Saturn. The spectral shapes are displayed in the usual dynamic spectra showing the flux density versus time and frequency. Those patterns exhibit well-organized shapes in the time-frequency plane connected with the rotation of the planet. We consider in this analysis the auroral emissions which occurred in the frequency range between 10 kHz and approximately 3 MHz. This concerns the Jovian hectometric emission (HOM) and the Saturnian kilometric radiation (SKR). We show in the case of Jupiter's HOM that the spectral patterns are well-arranged arc structures with curvatures depending on the Jovian rotation. Regarding the SKR emission, the spectral shapes generally exhibit complex patterns, and arc structures are only sometimes observed. We emphasize the curve alterations from vertex-early to vertex-late arcs (and vice versa) and we study whether or not they depend on the planetary rotations. We also discuss the common physical process at the origin of the HOM and SKR emissions, specifically the spectral patterns created by the interaction between planetary satellites (e.g. Io or Dione) and the Jovian and Saturnian magnetospheres.

  7. UN ANÁLISIS NO PARAMÉTRICO DE ÍTEMS DE LA PRUEBA DEL BENDER/A NONPARAMETRIC ITEM ANALYSIS OF THE BENDER GESTALT TEST MODIFIED

    Directory of Open Access Journals (Sweden)

    César Merino Soto

    2009-05-01

    Full Text Available This research presents a psychometric study of a new scoring system for the Bender Gestalt Test modified for children, the Qualitative Scoring System (Brannigan & Brunner, 2002), in a sample of 244 children entering first grade in four public schools in Lima. The approach used is nonparametric item analysis with the Testgraf program (Ramsay, 1991). The results indicate appropriate levels of internal consistency, unidimensionality, and good discriminative power of the scoring categories of this Qualitative System. No demographic differences by gender or age were found. The findings are discussed in the context of the potential use of the Qualitative Scoring System and of nonparametric item analysis in psychometric research.

  8. A nonparametric method for detecting fixations and saccades using cluster analysis: removing the need for arbitrary thresholds.

    Science.gov (United States)

    König, Seth D; Buffalo, Elizabeth A

    2014-04-30

    Eye tracking is an important component of many human and non-human primate behavioral experiments. As behavioral paradigms have become more complex, including unconstrained viewing of natural images, eye movements measured in these paradigms have become more variable and complex as well. Accordingly, the common practice of using acceleration, dispersion, or velocity thresholds to segment viewing behavior into periods of fixations and saccades may be insufficient. Here we propose a novel algorithm, called Cluster Fix, which uses k-means cluster analysis to take advantage of the qualitative differences between fixations and saccades. The algorithm finds natural divisions in 4 state space parameters-distance, velocity, acceleration, and angular velocity-to separate scan paths into periods of fixations and saccades. The number and size of clusters adjusts to the variability of individual scan paths. Cluster Fix can detect small saccades that were often indistinguishable from noisy fixations. Local analysis of fixations helped determine the transition times between fixations and saccades. Because Cluster Fix detects natural divisions in the data, predefined thresholds are not needed. A major advantage of Cluster Fix is the ability to precisely identify the beginning and end of saccades, which is essential for studying neural activity that is modulated by or time-locked to saccades. Our data suggest that Cluster Fix is more sensitive than threshold-based algorithms but comes at the cost of an increase in computational time. Copyright © 2014 Elsevier B.V. All rights reserved.
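
The published Cluster Fix code is not reproduced here, but the core idea, clustering kinematic state-space parameters rather than thresholding them, can be sketched in a few lines of Python; the feature set, the number of clusters and the "low-velocity clusters are fixations" rule below are simplifying assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

def label_fixations(x, y, fs, n_clusters=4):
    """Rough Cluster-Fix-style segmentation: k-means on kinematic features,
    then call the low-velocity clusters 'fixation'.

    x, y : gaze coordinates (deg), fs : sampling rate (Hz).
    This is only a sketch of the idea, not the published Cluster Fix algorithm.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    dt = 1.0 / fs
    vx, vy = np.gradient(x, dt), np.gradient(y, dt)
    velocity = np.hypot(vx, vy)
    acceleration = np.abs(np.gradient(velocity, dt))
    angle = np.unwrap(np.arctan2(vy, vx))
    angular_velocity = np.abs(np.gradient(angle, dt))
    distance = velocity * dt                       # point-to-point displacement

    feats = np.column_stack([distance, velocity, acceleration, angular_velocity])
    feats = (feats - feats.mean(0)) / (feats.std(0) + 1e-12)   # z-score each feature

    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(feats)

    # Clusters whose mean velocity lies below the median cluster velocity -> fixation.
    cluster_speed = np.array([velocity[labels == k].mean() for k in range(n_clusters)])
    fixation_clusters = np.where(cluster_speed < np.median(cluster_speed))[0]
    return np.isin(labels, fixation_clusters)      # True = sample belongs to a fixation
```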

  9. An introduction to random vibrations, spectral & wavelet analysis

    CERN Document Server

    Newland, D E

    2005-01-01

    One of the first engineering books to cover wavelet analysis, this classic text describes and illustrates basic theory, with a detailed explanation of the workings of discrete wavelet transforms. Computer algorithms are explained and supported by examples and a set of problems, and an appendix lists ten computer programs for calculating and displaying wavelet transforms.Starting with an introduction to probability distributions and averages, the text examines joint probability distributions, ensemble averages, and correlation; Fourier analysis; spectral density and excitation response relation

  10. Semi- and Nonparametric ARCH Processes

    Directory of Open Access Journals (Sweden)

    Oliver B. Linton

    2011-01-01

    Full Text Available ARCH/GARCH modelling has been successfully applied in empirical finance for many years. This paper surveys the semiparametric and nonparametric methods in univariate and multivariate ARCH/GARCH models. First, we introduce some specific semiparametric models and investigate the semiparametric and nonparametric estimation techniques applied to: the error density, the functional form of the volatility function, the relationship between mean and variance, long memory processes, locally stationary processes, continuous time processes and multivariate models. The second part of the paper is about the general properties of such processes, including stationarity conditions, ergodicity conditions and mixing conditions. The last part is on the estimation methods in ARCH/GARCH processes.

  11. Application of multivariate statistical analysis to STEM X-ray spectral images: interfacial analysis in microelectronics.

    Science.gov (United States)

    Kotula, Paul G; Keenan, Michael R

    2006-12-01

    Multivariate statistical analysis methods have been applied to scanning transmission electron microscopy (STEM) energy-dispersive X-ray spectral images. The particular application of the multivariate curve resolution (MCR) technique provides a high spectral contrast view of the raw spectral image. The power of this approach is demonstrated with a microelectronics failure analysis. Specifically, an unexpected component describing a chemical contaminant was found, as well as a component consistent with a foil thickness change associated with the focused ion beam specimen preparation process. The MCR solution is compared with a conventional analysis of the same spectral image data set.

  12. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature...... benefits the process of mapping data into a usually lower dimensional space. Outliers are then identified by spectral analysis of the eigenspace spanned by the set of leading eigenvectors derived from the mapping procedure. The proposed technique is purely data-driven and imposes no assumptions regarding...... the data distribution, making it particularly suitable for identification of outliers from irregular, non-convex shaped distributions and from data with diverse, varying densities....

  13. Multi spectral imaging analysis for meat spoilage discrimination

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Carstensen, Jens Michael; Papadopoulou, Olga

    with corresponding sensory data would be of great interest. The purpose of this research was to produce a method capable of quantifying and/or predicting the spoilage status (e.g. expressed in TVC counts as well as in sensory evaluation) using a multi spectral image of a meat sample and thereby avoid any time...... classification methods: Naive Bayes Classifier as a reference model, Canonical Discriminant Analysis (CDA) and Support Vector Classification (SVC). As the final step, generalization of the models was performed using k-fold validation (k=10). Results showed that image analysis provided good discrimination of meat...... samples. In the case where all data were taken together the misclassification error amounted to 16%. When spoilage status was based on visual sensory data, the model produced a MER of 22% for the combined dataset. These results suggest that it is feasible to employ a multi spectral image...

  14. Spectral analysis of sinus arrhythmia - A measure of mental effort

    Science.gov (United States)

    Vicente, Kim J.; Craig Thornton, D.; Moray, Neville

    1987-01-01

    The validity of the spectral analysis of sinus arrhythmia as a measure of mental effort was investigated using a computer simulation of a hovercraft piloted along a river as the experimental task. Strong correlation was observed between the subjective effort-ratings and the heart-rate variability (HRV) power spectrum between 0.06 and 0.14 Hz. Significant correlations were observed not only between subjects but, more importantly, within subjects as well, indicating that the spectral analysis of HRV is an accurate measure of the amount of effort being invested by a subject. Results also indicate that the intensity of effort invested by subjects cannot be inferred from the objective ratings of task difficulty or from performance.
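
A hedged Python sketch of the kind of band-power measure the abstract refers to (power of the heart-rate variability spectrum in the 0.06-0.14 Hz band); the resampling rate and Welch settings are illustrative choices, not those of the study.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hrv_band_power(rr_s, f_lo=0.06, f_hi=0.14, fs=4.0):
    """Power of heart-rate variability in a frequency band (here 0.06-0.14 Hz).

    rr_s : successive R-R intervals in seconds.  The unevenly spaced R-R series
    is resampled onto an even grid before Welch's method is applied.
    """
    rr_s = np.asarray(rr_s, dtype=float)
    beat_times = np.cumsum(rr_s)
    even_t = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    rr_even = interp1d(beat_times, rr_s, kind="cubic")(even_t)

    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, rr_even.size))
    band = (f >= f_lo) & (f <= f_hi)
    return pxx[band].sum() * (f[1] - f[0])         # band power (s^2), simple Riemann sum
```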

  15. Temporal spectral analysis of a laser plasma of mineral coal

    Science.gov (United States)

    Rebolledo, P.; Pacheco, P.; Sarmiento, R.; Cabanzo, R.; Mejía-Ospino, E.

    2013-11-01

    In this work we present results of a temporal spectral study of a laser plasma of mineral coal using the Laser-Induced Breakdown Spectroscopy (LIBS) technique. The plasma was generated by focusing the beam of an Nd:YAG laser emitting at 532 nm, with an energy per pulse of 35 mJ, on coal target pellets. The plasma radiation was conducted by an optical fiber to the entrance slit of a 0.5 m spectrograph, equipped with 1200 and 2400 grooves/mm diffraction gratings and an ICCD camera for recording the spectra at different delay times in the spectral range from 250 nm to 900 nm. The temporal spectral analysis allowed the identification of the elements Al, Fe, Ca, Mg, K, and Si, and of the CN and C2 molecules present in natural coals. The characteristics of the spectral lines and bands were studied at different delay times, allowing the temporal evolution of the electron temperature, electron density, and vibrational temperature of the plasma to be calculated. The delay times used were between 0.5 μs and 5 μs, and the calculated electron temperature ranged between 5 000 K and 1 000 K.

  16. Nonparametric estimation of ultrasound pulses

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Leeman, Sidney

    1994-01-01

    An algorithm for nonparametric estimation of 1D ultrasound pulses in echo sequences from human tissues is derived. The technique is a variation of the homomorphic filtering technique using the real cepstrum, and the underlying basis of the method is explained. The algorithm exploits a priori...

  17. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, it is often needed to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13] (H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100)

  18. Spectral Image Processing and Analysis of the Archimedes Palimpsest

    Science.gov (United States)

    2011-09-01

    Spectral Image Processing and Analysis of the Archimedes Palimpsest. Roger L. Easton, Jr., William A. Christens-Barry, Keith T. Knox, Chester F. ... (www.cis.rit.edu/people/faculty/easton). The Archimedes Palimpsest is a 10th-century parchment ... Almost everything known about the work of Archimedes has been gleaned from three codex manuscripts. The first

  19. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers

    Directory of Open Access Journals (Sweden)

    Stochl Jan

    2012-06-01

    Full Text Available Abstract Background Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Methods Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrates the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12 item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. Results and conclusions After an initial analysis example in which we select items by phrasing (six positively versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12) – when binary scored – were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech’s “well-being” and “distress” clinical scales). An illustration of ordinal item analysis

  20. Mokken scale analysis of mental health and well-being questionnaire item responses: a non-parametric IRT method in empirical research for applied health researchers.

    Science.gov (United States)

    Stochl, Jan; Jones, Peter B; Croudace, Tim J

    2012-06-11

    Mokken scaling techniques are a useful tool for researchers who wish to construct unidimensional tests or use questionnaires that comprise multiple binary or polytomous items. The stochastic cumulative scaling model offered by this approach is ideally suited when the intention is to score an underlying latent trait by simple addition of the item response values. In our experience, the Mokken model appears to be less well-known than for example the (related) Rasch model, but is seeing increasing use in contemporary clinical research and public health. Mokken's method is a generalisation of Guttman scaling that can assist in the determination of the dimensionality of tests or scales, and enables consideration of reliability, without reliance on Cronbach's alpha. This paper provides a practical guide to the application and interpretation of this non-parametric item response theory method in empirical research with health and well-being questionnaires. Scalability of data from 1) a cross-sectional health survey (the Scottish Health Education Population Survey) and 2) a general population birth cohort study (the National Child Development Study) illustrate the method and modeling steps for dichotomous and polytomous items respectively. The questionnaire data analyzed comprise responses to the 12 item General Health Questionnaire, under the binary recoding recommended for screening applications, and the ordinal/polytomous responses to the Warwick-Edinburgh Mental Well-being Scale. After an initial analysis example in which we select items by phrasing (six positive versus six negatively worded items) we show that all items from the 12-item General Health Questionnaire (GHQ-12)--when binary scored--were scalable according to the double monotonicity model, in two short scales comprising six items each (Bech's "well-being" and "distress" clinical scales). An illustration of ordinal item analysis confirmed that all 14 positively worded items of the Warwick-Edinburgh Mental
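
For readers who want to see the core quantity behind Mokken scaling in concrete form, the sketch below computes the overall Loevinger scalability coefficient H for binary items in Python; real analyses would normally rely on dedicated software (for example the R mokken package), and the H >= 0.3 rule of thumb in the comments is the conventional lower bound rather than a result of this paper.

```python
import numpy as np

def loevinger_H(X):
    """Overall Loevinger scalability coefficient H for binary item responses.

    X : (n_subjects, n_items) array of 0/1 responses.
    H = 1 - (observed Guttman errors) / (expected errors under independence),
    summed over all item pairs.  H >= 0.3 is the usual lower bound for a Mokken
    scale; this sketch ignores standard errors and item-level Hi coefficients.
    """
    X = np.asarray(X, dtype=int)
    n, k = X.shape
    p = X.mean(axis=0)                      # item popularities
    obs, exp = 0.0, 0.0
    for i in range(k):
        for j in range(i + 1, k):
            # Order the pair: 'easy' = higher popularity, 'hard' = lower.
            easy, hard = (i, j) if p[i] >= p[j] else (j, i)
            # Guttman error: passing the hard item while failing the easy one.
            obs += np.sum((X[:, hard] == 1) & (X[:, easy] == 0))
            exp += n * p[hard] * (1.0 - p[easy])
    return 1.0 - obs / exp
```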

  1. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    Science.gov (United States)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'Direct Transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high in frequency an accurate estimate can be made for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of the turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e. up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
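
A rough Python sketch of the slotting idea discussed above: products of sample pairs are binned by their time lag, the slotted autocorrelation is formed, and a spectrum is obtained from its transform. The plain O(N^2) loop, the normalisation and the cosine-transform step are simplifying choices, not the exact estimators analysed in the report.

```python
import numpy as np

def slotted_spectrum(t, u, max_lag, slot_width):
    """'Slotting' estimate of autocorrelation and spectrum for randomly sampled data.

    t : sample times (s), u : velocity samples, slot_width : lag bin width (s).
    Returns lag bins, the slotted autocorrelation and a one-sided spectrum estimate.
    """
    t = np.asarray(t, float)
    u = np.asarray(u, float) - np.mean(u)
    n_slots = int(np.ceil(max_lag / slot_width)) + 1
    num = np.zeros(n_slots)
    cnt = np.zeros(n_slots)

    for i in range(len(t)):
        lags = t[i:] - t[i]
        keep = lags <= max_lag
        k = np.round(lags[keep] / slot_width).astype(int)
        np.add.at(num, k, u[i] * u[i:][keep])
        np.add.at(cnt, k, 1)

    acf = np.where(cnt > 0, num / np.maximum(cnt, 1), 0.0)
    acf = acf / acf[0]                                # normalise by the zero-lag slot
    lags = np.arange(n_slots) * slot_width

    # One-sided spectrum estimate via a cosine transform of the even extension.
    ext = np.concatenate([acf, acf[-2:0:-1]])
    spec = 2.0 * slot_width * np.real(np.fft.rfft(ext))
    freqs = np.fft.rfftfreq(ext.size, d=slot_width)
    return lags, acf, freqs, spec
```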

  2. Harmonic component detection: Optimized Spectral Kurtosis for operational modal analysis

    Science.gov (United States)

    Dion, J.-L.; Tawfiq, I.; Chevallier, G.

    2012-01-01

    This work is a contribution in the field of Operational Modal Analysis to identify the modal parameters of mechanical structures using only measured responses. The study deals with structural responses coupled with harmonic components that are amplitude and frequency modulated over a short range, a common combination for mechanical systems with engines and other rotating machines in operation. These harmonic components generate misleading data that are interpreted erroneously by the classical methods used in OMA. The present work attempts to differentiate maxima in spectra stemming from harmonic components and structural modes. The detection method proposed is based on the so-called Optimized Spectral Kurtosis and is compared with other definitions of Spectral Kurtosis described in the literature. After a parametric study of the method, a critical study is performed on numerical simulations and then on an experimental structure in operation in order to assess the method's performance.
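
As a point of reference, the plain (non-optimized) spectral kurtosis can be computed from an STFT in a few lines of Python; this sketch uses the standard definition SK(f) = <|X(f)|^4>/<|X(f)|^2>^2 - 2 and is not the Optimized Spectral Kurtosis proposed in the paper. The window length is an illustrative choice.

```python
import numpy as np
from scipy.signal import stft

def spectral_kurtosis(x, fs, nperseg=256):
    """Spectral kurtosis from an STFT: SK(f) = <|X|^4> / <|X|^2>^2 - 2.

    For a stationary Gaussian (structural) response SK is close to 0, while a
    steady sinusoidal/harmonic component drives SK towards -1, which is what
    makes SK useful for flagging harmonic peaks in operational modal analysis.
    """
    f, _, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    mag2 = np.abs(Z) ** 2
    sk = np.mean(mag2 ** 2, axis=1) / np.mean(mag2, axis=1) ** 2 - 2.0
    return f, sk
```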

  3. Advanced spectral analysis of ionospheric waves observed with sparse arrays

    CERN Document Server

    Helmboldt, Joseph

    2014-01-01

    This paper presents a case study from a single, six-hour observing period to illustrate the application of techniques developed for interferometric radio telescopes to the spectral analysis of observations of ionospheric fluctuations with sparse arrays. We have adapted the deconvolution methods used for making high dynamic range images of cosmic sources with radio arrays to making comparably high dynamic range maps of spectral power of wavelike ionospheric phenomena. In the example presented here, we have used observations of the total electron content (TEC) gradient derived from Very Large Array (VLA) observations of synchrotron emission from two galaxy clusters at 330 MHz as well as GPS-based TEC measurements from a sparse array of 33 receivers located within New Mexico near the VLA. We show that these techniques provide a significant improvement in signal to noise (S/N) of detected wavelike structures by correcting for both measurement inaccuracies and wavefront distortions. This is especially true for the...

  4. A Spectral Analysis of Laser Induced Fluorescence of Iodine

    CERN Document Server

    Bayram, S B

    2015-01-01

    When optically excited, iodine absorbs in the 490- to 650-nm visible region of the spectrum and, after radiative relaxation, it displays an emission spectrum of discrete vibrational bands at moderate resolution. This makes the laser-induced fluorescence spectrum of molecular iodine especially suitable for studying the energy structure of homonuclear diatomic molecules at room temperature. In this spirit, we present a rather straightforward and inexpensive experimental setup and the associated spectral analysis, which provides an excellent exercise in applied quantum mechanics fit for advanced laboratory courses. The students would be required to assign spectral lines, fill a Deslandres table, process the data to estimate the harmonic and anharmonic characteristics of the ground vibronic state involved in the radiative transitions, and thenceforth calculate a set of molecular constants and discuss a model of the molecular vibrator.

  5. Spectral analysis of SMC X-2 during its 2015 outburst

    CERN Document Server

    La Palombara, N; Pintore, F; Esposito, P; Mereghetti, S; Tiengo, A

    2016-01-01

    We report on the results of Swift and XMM-Newton observations of SMC X-2 during its last outburst in 2015 October, the first one since 2000. The source reached a very high luminosity ($L \sim 10^{38}$ erg s$^{-1}$), which allowed us to perform a detailed analysis of its timing and spectral properties. We obtained a pulse period $P_{\rm spin}$ = 2.372267(5) s and a characterization of the pulse profile also at low energies. The main spectral component is a hard ($\Gamma \simeq 0$) power-law model with an exponential cut-off, but at low energies we also detected a soft thermal component (with kT $\simeq$ 0.15 keV). Several emission lines can be observed at various energies. The identification of these features with the transition lines of highly ionized N, O, Ne, Si, and Fe suggests the presence of photoionized matter around the accreting source.

  6. Non-Parametric Inference in Astrophysics

    CERN Document Server

    Wasserman, L H; Nichol, R C; Genovese, C; Jang, W; Connolly, A J; Moore, A W; Schneider, J; Wasserman, Larry; Miller, Christopher J.; Nichol, Robert C.; Genovese, Chris; Jang, Woncheol; Connolly, Andrew J.; Moore, Andrew W.; Schneider, Jeff; group, the PICA

    2001-01-01

    We discuss non-parametric density estimation and regression for astrophysics problems. In particular, we show how to compute non-parametric confidence intervals for the location and size of peaks of a function. We illustrate these ideas with recent data on the Cosmic Microwave Background. We also briefly discuss non-parametric Bayesian inference.
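
A small Python sketch of the kind of computation described above (nonparametric density estimation plus a confidence interval for the location of a peak); here the peak location comes from a Gaussian KDE and the interval from a simple bootstrap, which is only one of several ways to build such intervals. Function name, grid size and bootstrap settings are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

def peak_location_ci(x, n_boot=500, grid_size=512, alpha=0.05, seed=0):
    """KDE-based estimate of the main density peak location, with a bootstrap CI."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    grid = np.linspace(x.min(), x.max(), grid_size)

    peak = grid[np.argmax(gaussian_kde(x)(grid))]

    boot_peaks = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True)       # resample with replacement
        boot_peaks[b] = grid[np.argmax(gaussian_kde(xb)(grid))]

    lo, hi = np.quantile(boot_peaks, [alpha / 2, 1 - alpha / 2])
    return peak, (lo, hi)
```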

  7. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs...... rejects both the Cobb-Douglas and the Translog functional form, while a recently developed nonparametric kernel regression method with a fully nonparametric panel data specification delivers plausible results. On average, the nonparametric regression results are similar to results that are obtained from

  8. Nonparametric Maize TFP Measurement Analysis and Countermeasures

    Institute of Scientific and Technical Information of China (English)

    曲会朋; 李宁; 田玉英

    2014-01-01

    Maize production is constrained and affected by many inputs, such as seeds and seedlings, labour, land, pesticides, chemical fertilizer, agricultural plastic film, machinery and equipment, draught animals and other material inputs; raising the input-output efficiency of these factors is essential for promoting efficient and sustained increases in maize output. On the basis of a time-series comparison of maize production costs and yields between China and the United States, a DEA-based nonparametric frontier efficiency decomposition method is applied in an empirical analysis of maize total factor productivity in the main producing regions of the country, studying the evolution of China's maize production efficiency and production resource allocation, and their regional contrasts, from both a longitudinal (time-series) and a cross-regional perspective. Finally, targeted measures and approaches are proposed for promoting the effective allocation of maize production resources and improving maize production efficiency in China.
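
The DEA efficiency scores that underlie this kind of nonparametric frontier analysis can be obtained from one small linear program per production unit; the sketch below solves the input-oriented, constant-returns-to-scale model with scipy and is a generic illustration, not the decomposition actually used in the article. Matrix shapes and the solver choice are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """Input-oriented, constant-returns-to-scale DEA efficiency scores.

    X : (m_inputs, n_units) input matrix,  Y : (s_outputs, n_units) output matrix.
    For each unit o we solve  min theta  s.t.  X @ lam <= theta * x_o,
    Y @ lam >= y_o,  lam >= 0.  Decision vector z = [theta, lam_1..lam_n].
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    m, n = X.shape
    s = Y.shape[0]
    scores = np.empty(n)
    for o in range(n):
        c = np.r_[1.0, np.zeros(n)]                      # minimise theta
        A_in = np.hstack([-X[:, [o]], X])                # X @ lam - theta * x_o <= 0
        b_in = np.zeros(m)
        A_out = np.hstack([np.zeros((s, 1)), -Y])        # -Y @ lam <= -y_o
        b_out = -Y[:, o]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores[o] = res.fun
    return scores
```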

  9. Parametric versus non-parametric simulation

    OpenAIRE

    Dupeux, Bérénice; Buysse, Jeroen

    2014-01-01

    Most ex-ante impact assessment policy models have been based on a parametric approach. We develop a novel non-parametric approach, called Inverse DEA. We use non-parametric efficiency analysis to determine the farm's technology and behaviour. Then, we compare the parametric approach and the Inverse DEA models to a known data generating process. We use a bio-economic model as a data generating process reflecting a real-world situation where non-linear relationships often exist. Results s...

  10. Subpixel measurement of mangrove canopy closure via spectral mixture analysis

    Institute of Scientific and Technical Information of China (English)

    Minhe JI; Jing FENG

    2011-01-01

    Canopy closure can vary spatially within a remotely sensed image pixel, but Boolean logic inherent in traditional classification methods only works at the whole-pixel level. This study attempted to decompose mangrove closure information from spectrally-mixed pixels through spectral mixture analysis (SMA) for coastal wetland management. Endmembers of different surface categories were established through signature selection and training, and memberships of a pixel with respect to the surface categories were determined via a spectral mixture model. A case study involving DigitalGlobe's Quickbird high-resolution multispectral imagery of Beilun Estuary, China was used to demonstrate this approach. Mangrove canopy closure was first quantified as percent coverage through the model and then further grouped into eight ordinal categories. The model results were verified using Quickbird panchromatic data from the same acquisition. An overall accuracy of 84.4% (Kappa = 0.825) was achieved, indicating good application potential of the approach in coastal resource inventory and ecosystem management.
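
The core of linear spectral mixture analysis is a constrained least-squares unmixing step. The Python sketch below enforces non-negativity and (softly) the sum-to-one constraint via a standard NNLS augmentation trick; the synthetic endmembers and the delta weight are illustrative assumptions rather than the Quickbird data and settings of the study.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_fully_constrained(E, pixel, delta=1e3):
    """Fully constrained linear spectral unmixing (fractions >= 0, sum to 1).

    E : (n_bands, n_endmembers) endmember matrix, pixel : (n_bands,) spectrum.
    The sum-to-one constraint is enforced softly by appending a heavily
    weighted row of ones, a common trick that lets plain NNLS approximate FCLS.
    """
    E_aug = np.vstack([E, delta * np.ones(E.shape[1])])
    p_aug = np.append(pixel, delta)
    fractions, _ = nnls(E_aug, p_aug)
    return fractions

# Illustrative use: three synthetic endmembers and a 40/50/10 mixed pixel.
rng = np.random.default_rng(1)
E = np.abs(rng.normal(size=(50, 3)))
truth = np.array([0.4, 0.5, 0.1])
pixel = E @ truth + rng.normal(scale=0.01, size=50)
print(unmix_fully_constrained(E, pixel))          # approximately [0.4, 0.5, 0.1]
```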

  11. Effective dielectric constants and spectral density analysis of plasmonic nanocomposites

    Science.gov (United States)

    Lu, Jin You; Raza, Aikifa; Fang, Nicholas X.; Chen, Gang; Zhang, TieJun

    2016-10-01

    Cermet or ceramic-metal composite coatings promise great potentials in light harvesting, but the complicated composite structure at the nanoscale induces a design challenge to predict their optical properties. We find that the effective dielectric constants of nanocomposites predicted by finite-difference-time-domain (FDTD) simulation results match those of different classical effective medium theories in their respective validity range. However, a precise prediction of the fabricated nanocomposite properties for different filling factors is very challenging. In this work, we extract the spectral density functions in the Bergman representation from the analytical models, numerical simulations, and experimental data of plasmonic nanocomposites. The spectral density functions, which only depend on geometry of the nanocomposite material, provide a unique measure on the contribution of individual and percolated particles inside the nanocomposite. According to the spectral density analysis of measured dielectric constants, the material properties of nanocomposites fabricated by the co-sputtering approach are dominated by electromagnetic interaction among individual metallic particles. While in the case of the nanocomposites fabricated by the multilayer thin film approach, the material properties are dominated by percolated metallic particles inside the dielectric host, as indicated by our FDTD simulation results. This understanding provides new physical insight into the interaction between light and plasmonic nanocomposites.

  12. Experimental spectral analysis of SALMON/STERLING decoupling. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Blandford, R.R.; Woolson, J.R.

    1979-11-30

    Re-analysis of SALMON and STERLING initial short-period compressional and surface waves at station PLMS (Poplarville, Mississippi) at a distance of 27 km shows a SALMON/STERLING compressional phase spectral ratio tending to a ratio of only 17 at 25 Hz, in agreement with the theoretical calculations of Patterson (1966) and of Healy, King, and O'Neill (1971). The spectral ratio for the surface waves tends to a ratio of approximately 100 at 25 Hz, in agreement with spectral ratios previously reported by Springer, Denny, Healy, and Mickey (1968), whose data window at PLMS was large enough to consist predominantly of surface waves. The fact that the ratio varies as a function of phase suggests that decoupling varies as a function of takeoff angle, with the least decoupling occurring at high frequencies for the most steeply departing rays. Another topic discussed is the apparent variation in decoupling as defined by the ratio of STERLING/STERLING HE. The variation in this ratio is determined to be explainable by the variation in shot point between these two explosions, and not necessarily by a variation in decoupling as a function of azimuth.

  13. Coefficient of variation spectral analysis: An application to underwater acoustics

    Science.gov (United States)

    Herstein, P. D.; Laplante, R. F.

    1983-05-01

    Acoustic noise in the ocean is often described in terms of its power spectral density. Just as in other media, this noise consists of both narrowband and broadband frequency components. A major problem in the analysis of power spectral density measurements is distinguishing between narrowband spectral components of interest and contaminating narrowband components. In this paper, the use of the coefficient of variation (Cv) spectrum is examined as an adjunct to the conventional power spectrum to distinguish narrowband components of interest from contaminating components. The theory of the Cv is presented. Coefficients for several classical input distributions are developed. It is shown that Cv spectra can be easily implemented as an adjunct procedure during the computation of the ensemble of averaged power spectra. Power and Cv spectra derived from actual at-sea sonobuoy measurements of deep ocean ambient noise are presented; where contaminating narrowband components cannot be separated from narrowband lines of interest in the ensemble of averaged power spectra, these acoustic components of interest can be distinguished in the Cv spectra.
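
A compact Python sketch of the Cv spectrum idea: compute an ensemble of per-segment power spectra and, in each frequency bin, divide the standard deviation of the estimates by their mean. The segmentation and windowing choices below are illustrative, not those used in the report.

```python
import numpy as np

def cv_spectrum(x, fs, nperseg=1024):
    """Ensemble-averaged power spectrum and coefficient-of-variation (Cv) spectrum.

    The Cv at each frequency bin is the standard deviation of the per-segment
    power estimates divided by their mean; stable narrowband lines and broadband
    noise behave differently in Cv even when their averaged power levels look similar.
    """
    x = np.asarray(x, float)
    n_seg = x.size // nperseg
    segs = x[:n_seg * nperseg].reshape(n_seg, nperseg)
    win = np.hanning(nperseg)
    P = np.abs(np.fft.rfft(segs * win, axis=1)) ** 2      # per-segment power estimates
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    mean_p = P.mean(axis=0)
    cv = P.std(axis=0) / mean_p
    return freqs, mean_p, cv
```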

  14. Nonparametric Inference for Periodic Sequences

    KAUST Repository

    Sun, Ying

    2012-02-01

    This article proposes a nonparametric method for estimating the period and values of a periodic sequence when the data are evenly spaced in time. The period is estimated by a "leave-out-one-cycle" version of cross-validation (CV) and complements the periodogram, a widely used tool for period estimation. The CV method is computationally simple and implicitly penalizes multiples of the smallest period, leading to a "virtually" consistent estimator of integer periods. This estimator is investigated both theoretically and by simulation. We also propose a nonparametric test of the null hypothesis that the data have constant mean against the alternative that the sequence of means is periodic. Finally, our methodology is demonstrated on three well-known time series: the sunspots and lynx trapping data, and the El Niño series of sea surface temperatures. © 2012 American Statistical Association and the American Society for Quality.
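
A minimal Python sketch of the leave-out-one-cycle cross-validation idea for integer period estimation; the folding-and-prediction scheme below follows the description in the abstract only loosely, and it ignores the hypothesis test and the periodogram comparison discussed in the article.

```python
import numpy as np

def cv_period(y, candidate_periods):
    """Leave-out-one-cycle cross-validation estimate of an integer period.

    For each candidate period p the series is folded into cycles of length p;
    each cycle is predicted by the mean of the remaining cycles at the same
    phase, and the period minimising the prediction error is returned.
    """
    y = np.asarray(y, float)
    scores = {}
    for p in candidate_periods:
        n_cyc = len(y) // p
        if n_cyc < 2:
            continue
        cycles = y[:n_cyc * p].reshape(n_cyc, p)
        # Leave-one-cycle-out mean of the other cycles at each phase.
        loo_mean = (cycles.sum(axis=0, keepdims=True) - cycles) / (n_cyc - 1)
        scores[p] = np.mean((cycles - loo_mean) ** 2)
    best = min(scores, key=scores.get)
    return best, scores
```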

  15. Nonparametric Econometrics: The np Package

    Directory of Open Access Journals (Sweden)

    Tristen Hayfield

    2008-07-01

    Full Text Available We describe the R np package via a series of applications that may be of interest to applied econometricians. The np package implements a variety of nonparametric and semiparametric kernel-based estimators that are popular among econometricians. There are also procedures for nonparametric tests of significance and consistent model specification tests for parametric mean regression models and parametric quantile regression models, among others. The np package focuses on kernel methods appropriate for the mix of continuous, discrete, and categorical data often found in applied settings. Data-driven methods of bandwidth selection are emphasized throughout, though we caution the user that data-driven bandwidth selection methods can be computationally demanding.

  16. Astronomical Methods for Nonparametric Regression

    Science.gov (United States)

    Steinhardt, Charles L.; Jermyn, Adam

    2017-01-01

    I will discuss commonly used techniques for nonparametric regression in astronomy. We find that several of them, particularly running averages and running medians, are generically biased, asymmetric between dependent and independent variables, and perform poorly in recovering the underlying function, even when errors are present only in one variable. We then examine less-commonly used techniques such as Multivariate Adaptive Regressive Splines and Boosted Trees and find them superior in bias, asymmetry, and variance both theoretically and in practice under a wide range of numerical benchmarks. In this context the chief advantage of the common techniques is runtime, which even for large datasets is now measured in microseconds compared with milliseconds for the more statistically robust techniques. This points to a tradeoff between bias, variance, and computational resources which in recent years has shifted heavily in favor of the more advanced methods, primarily driven by Moore's Law. Along these lines, we also propose a new algorithm which has better overall statistical properties than all techniques examined thus far, at the cost of significantly worse runtime, in addition to providing guidance on choosing the nonparametric regression technique most suitable to any specific problem. We then examine the more general problem of errors in both variables and provide a new algorithm which performs well in most cases and lacks the clear asymmetry of existing non-parametric methods, which fail to account for errors in both variables.

  17. Spectral analysis of snoring events from an Emfit mattress.

    Science.gov (United States)

    Perez-Macias, Jose Maria; Viik, Jari; Varri, Alpo; Himanen, Sari-Leena; Tenhunen, Mirja

    2016-12-01

    The aim of this study is to explore the capability of an Emfit (electromechanical film transducer) mattress to detect snoring (SN) by analyzing the spectral differences between normal breathing (NB) and SN. Episodes of representative NB and SN of a maximum of 10 min were visually selected for analysis from 33 subjects. To define the bands of interest, we studied the statistical differences in the power spectral density (PSD) between both breathing types. Three bands were selected for further analysis: 6-16 Hz (BW1), 16-30 Hz (BW2) and 60-100 Hz (BW3). We characterized the differences between NB and SN periods in these bands using a set of spectral features estimated from the PSD. We found that 15 out of the 29 features reached statistical significance with the Mann-Whitney U-test. Diagnostic properties for each feature were assessed using receiver operating characteristic analysis. According to our results, the highest diagnostic performance was achieved using the power ratio between BW2 and BW3 (0.85 area under the receiver operating curve, 80% sensitivity, 80% specificity and 80% accuracy). We found that there are significant differences in the defined bands between the NB and SN periods. A peak was found in BW3 for SN epochs, which was best detected using power ratios. Our work suggests that it is possible to detect snoring with an Emfit mattress. The mattress-type movement sensors are inexpensive and unobtrusive, and thus provide an interesting tool for sleep research.
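
The most discriminative feature reported, a power ratio between the 16-30 Hz and 60-100 Hz bands, is easy to sketch in Python; the Welch settings and the commented evaluation with roc_auc_score are illustrative assumptions (and the direction of the ratio may need inverting depending on how the epochs are labelled).

```python
import numpy as np
from scipy.signal import welch
from sklearn.metrics import roc_auc_score

def band_power(x, fs, lo, hi, nperseg=1024):
    """Average power of signal x in the band [lo, hi] Hz (Welch estimate)."""
    f, pxx = welch(x, fs=fs, nperseg=nperseg)
    sel = (f >= lo) & (f <= hi)
    return pxx[sel].sum() * (f[1] - f[0])

def snoring_feature(epoch, fs):
    """Power ratio between the 16-30 Hz and 60-100 Hz bands of a mattress-sensor
    epoch, i.e. the kind of feature the abstract reports as most discriminative."""
    return band_power(epoch, fs, 16, 30) / band_power(epoch, fs, 60, 100)

# Illustrative evaluation on labelled epochs (1 = snoring, 0 = normal breathing):
# features = np.array([snoring_feature(e, fs) for e in epochs])
# print("AUC:", roc_auc_score(labels, features))   # or -features, depending on labelling
```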

  18. Spectral analysis in thin tubes with axial heterogeneities

    KAUST Repository

    Ferreira, Rita

    2015-01-01

    In this paper, we present the 3D-1D asymptotic analysis of the Dirichlet spectral problem associated with an elliptic operator with axial periodic heterogeneities. We extend to the 3D-1D case previous 3D-2D results (see [10]) and we analyze the special case where the scale of thickness is much smaller than the scale of the heterogeneities and the planar coefficient has a unique global minimum in the periodic cell. These results are of great relevance in the comprehension of the wave propagation in nanowires showing axial heterogeneities (see [17]).

  19. Incorporating Endmember Variability into Spectral Mixture Analysis Through Endmember Bundles

    Science.gov (United States)

    Bateson, C. Ann; Asner, Gregory P.; Wessman, Carol A.

    1998-01-01

    Variation in canopy structure and biochemistry induces a concomitant variation in the top-of-canopy spectral reflectance of a vegetation type. Hence, the use of a single endmember spectrum to track the fractional abundance of a given vegetation cover in a hyperspectral image may result in fractions with considerable error. One solution to the problem of endmember variability is to increase the number of endmembers used in a spectral mixture analysis of the image. For example, there could be several tree endmembers in the analysis because of differences in leaf area index (LAI) and multiple scatterings between leaves and stems. However, it is often difficult in terms of computer or human interaction time to select more than six or seven endmembers, and any non-removable noise, as well as the number of uncorrelated bands in the image, limits the number of endmembers that can be discriminated. Moreover, as endmembers proliferate, their interpretation becomes increasingly difficult, and often applications simply need the areal fractions of a few land cover components which comprise most of the scene. In order to incorporate endmember variability into spectral mixture analysis, we propose representing a landscape component type not with one endmember spectrum but with a set or bundle of spectra, each of which is feasible as the spectrum of an instance of the component (e.g., in the case of a tree component, each spectrum could reasonably be the spectral reflectance of a tree canopy). These endmember bundles can be used with nonlinear optimization algorithms to find upper and lower bounds on endmember fractions. This approach to endmember variability naturally evolved from previous work in deriving endmembers from the data itself by fitting a triangle, tetrahedron or, more generally, a simplex to the data cloud reduced in dimension by a principal component analysis. Conceptually, endmember variability could make it difficult to find a simplex that both surrounds the data

  20. Schwarzschild scalar wigs: spectral analysis and late time behavior

    CERN Document Server

    Barranco, Juan; Degollado, Juan Carlos; Diez-Tejedor, Alberto; Megevand, Miguel; Alcubierre, Miguel; Nunez, Dario; Sarbach, Olivier

    2013-01-01

    Using the Green's function representation technique, the late time behavior of localized scalar field distributions on Schwarzschild spacetimes is studied. Assuming arbitrary initial data we perform a spectral analysis, computing the amplitude of each excited quasi-bound mode without the necessity of performing dynamical evolutions. The resulting superposition of modes is compared with a traditional numerical evolution with excellent agreement; therefore, we have an efficient way to determine final black hole wigs. The astrophysical relevance of the quasi-bound modes is discussed in the context of scalar field dark matter models and the axiverse.

  1. Multiphoton autofluorescence spectral analysis for fungus imaging and identification

    Science.gov (United States)

    Lin, Sung-Jan; Tan, Hsin-Yuan; Kuo, Chien-Jui; Wu, Ruei-Jr; Wang, Shiou-Han; Chen, Wei-Liang; Jee, Shiou-Hwa; Dong, Chen-Yuan

    2009-07-01

    We performed multiphoton imaging on fungi of medical significance. Fungal hyphae and spores of Aspergillus flavus, Microsporum gypseum, Microsporum canis, Trichophyton rubrum, and Trichophyton tonsurans were found to be strongly autofluorescent but to generate a less prominent second harmonic signal. The cell wall and septum of fungal hyphae can be easily identified by autofluorescence imaging. We found that fungi of various species have distinct autofluorescence characteristics. Our result shows that the combination of multiphoton imaging and spectral analysis can be used to visualize and identify fungal species. This approach may be developed into an effective diagnostic tool for fungal identification.

  2. Spectral analysis of the Forbush decrease of 13 July 1982

    Science.gov (United States)

    Vainikka, E.; Torsti, J. J.; Valtonen, E.; Lumme, M.; Nieminen, M.; Peltonen, J.; Arvela, H.

    1985-01-01

    The maximum entropy method has been applied in the spectral analysis of high-energy cosmic-ray intensity during the large Forbush event of July 13, 1982. An oscillation with a period of about 2 hours and an amplitude of 1 to 3% was found to be present during the decrease phase. This oscillation can be related to a similar periodicity in the magnetospheric field. However, the variation was not observed at all neutron monitor stations. In the beginning of the recovery phase, the intensity oscillated with a period of about 10 hours and an amplitude of 3%.
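
For readers unfamiliar with the maximum entropy method, the sketch below implements Burg's recursion and the corresponding all-pole spectrum in Python; the model order, mean removal and normalisation are user choices, and this generic implementation is not the analysis pipeline of the study.

```python
import numpy as np

def burg_mem_spectrum(x, order, fs, n_freq=512):
    """Maximum entropy (Burg) spectral estimate of a uniformly sampled series.

    Fits an autoregressive model of the given order with Burg's recursion and
    evaluates the corresponding all-pole power spectrum at n_freq frequencies.
    """
    x = np.asarray(x, float) - np.mean(x)
    f_err, b_err = x.copy(), x.copy()
    a = np.array([1.0])                              # AR polynomial coefficients
    E = np.mean(x ** 2)                              # prediction error power
    for _ in range(order):
        ff, bb = f_err[1:], b_err[:-1]
        k = -2.0 * np.dot(ff, bb) / (np.dot(ff, ff) + np.dot(bb, bb))
        a = np.r_[a, 0.0] + k * np.r_[0.0, a[::-1]]  # Levinson-style coefficient update
        f_err, b_err = ff + k * bb, bb + k * ff
        E *= (1.0 - k ** 2)

    freqs = np.linspace(0.0, fs / 2.0, n_freq)
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(order + 1)) / fs)
    psd = E / fs / np.abs(z @ a) ** 2                # all-pole spectrum
    return freqs, psd
```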

  3. Understanding Boswellia papyrifera tree secondary metabolites through bark spectral analysis

    Science.gov (United States)

    Girma, Atkilt; Skidmore, Andrew K.; de Bie, C. A. J. M.; Bongers, Frans

    2015-07-01

    Decision makers are concerned whether to tap or rest Boswellia papyrifera trees. Tapping for the production of frankincense is known to deplete carbon reserves from the tree, leading to the production of less viable seeds, tree carbon starvation and ultimately tree mortality. Decision makers use traditional experience without considering the amount of metabolites stored in or depleted from the stem bark of the tree. This research was designed to develop a non-destructive, spectroscopy-based technique for estimating B. papyrifera tree metabolites that is relevant for management. The concentration of biochemicals (metabolites) found in the tree bark was estimated through spectral analysis. Initially, a random sample of 33 trees was selected and the bark spectra were measured with an Analytical Spectral Device (ASD) spectrometer. Bark samples were air dried and ground. Then, 10 g of sample was soaked in petroleum ether to extract crude metabolites. Further chemical analysis was conducted to quantify and isolate pure metabolite compounds such as incensole acetate and boswellic acid. The crude metabolites, which relate to frankincense produce, were compared to plant properties (such as diameter and crown area) and to the reflectance spectra of the bark. Moreover, the extract was compared to the ASD spectra using the partial least squares regression (PLSR) technique and continuum-removed spectral analysis. The continuum-removed spectral analysis was performed on two wavelength regions (1275-1663 nm and 1836-2217 nm), identified through PLSR, using absorption features such as band depth, area, position, asymmetry and width to characterize the spectra and relate them to the bark extracts. The results show that tree properties such as diameter at breast height (DBH) and the crown area of untapped and healthy trees were strongly correlated to the amount of stored crude metabolites. In addition, the PLSR technique applied to the first derivative transformation of the reflectance spectrum was found to estimate the
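
A hedged Python sketch of the PLSR step described above, using scikit-learn; the variable names, the number of latent components and the 5-fold cross-validation are assumptions for illustration, and the continuum-removal and derivative preprocessing steps are not shown.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Illustrative shapes: 33 bark spectra (e.g. first-derivative reflectance restricted
# to the wavelength windows named in the abstract) and the measured crude-metabolite
# content of each sample.
# spectra     : (33, n_bands) array
# metabolites : (33,) array
def fit_pls(spectra, metabolites, n_components=5):
    """Partial least squares regression of metabolite content on bark spectra,
    with cross-validated predictions for a quick accuracy check."""
    pls = PLSRegression(n_components=n_components)
    y_cv = cross_val_predict(pls, spectra, metabolites, cv=5).ravel()
    pls.fit(spectra, metabolites)
    r2_cv = 1.0 - np.sum((metabolites - y_cv) ** 2) / np.sum(
        (metabolites - metabolites.mean()) ** 2)
    return pls, r2_cv
```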

  4. Comparing parametric and nonparametric regression methods for panel data

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    We investigate and compare the suitability of parametric and non-parametric stochastic regression methods for analysing production technologies and the optimal firm size. Our theoretical analysis shows that the most commonly used functional forms in empirical production analysis, Cobb-Douglas and Translog, are unsuitable for analysing the optimal firm size. We show that the Translog functional form implies an implausible linear relationship between the (logarithmic) firm size and the elasticity of scale, where the slope is artificially related to the substitutability between the inputs.... The practical applicability of the parametric and non-parametric regression methods is scrutinised and compared by an empirical example: we analyse the production technology and investigate the optimal size of Polish crop farms based on a firm-level balanced panel data set. A nonparametric specification test

  5. Quantitative Phylogenomics of Within-Species Mitogenome Variation: Monte Carlo and Non-Parametric Analysis of Phylogeographic Structure among Discrete Transatlantic Breeding Areas of Harp Seals (Pagophilus groenlandicus).

    Directory of Open Access Journals (Sweden)

    Steven M Carr

    -stepping-stone biogeographic models, but not a simple 1-step trans-Atlantic model. Plots of the cumulative pairwise sequence difference curves among seals in each of the four populations provide continuous proxies for phylogenetic diversification within each. Non-parametric Kolmogorov-Smirnov (K-S) tests of maximum pairwise differences between these curves indicate that the Greenland Sea population has a markedly younger phylogenetic structure than either the White Sea population or the two Northwest Atlantic populations, which are of intermediate age and homogeneous structure. The Monte Carlo and K-S assessments provide sensitive quantitative tests of within-species mitogenomic phylogeography. This is the first study to indicate that the White Sea and Greenland Sea populations have different population genetic histories. The analysis supports the hypothesis that Harp Seals comprise three genetically distinguishable breeding populations, in the White Sea, Greenland Sea, and Northwest Atlantic. Implications for an ice-dependent species during ongoing climate change are discussed.

  6. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on the iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fanshape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fanshape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.

  7. Spectral analysis for automated exploration and sample acquisition

    Science.gov (United States)

    Eberlein, Susan; Yates, Gigi

    1992-01-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.

  8. MAC to VAX Connectivity: Heartrate Spectral Analysis System

    Science.gov (United States)

    Rahman, Hasan H.; Faruque, Monazer

    1993-01-01

    The heart rate Spectral Analysis System (SAS) acquires and analyzes, in real-time, the Space Shuttle onboard electrocardiograph (EKG) experiment signals, calculates the heartrate, and applies a Fast Fourier Transformation (FFT) to the heart rate. The system also calculates other statistical parameters such as the 'mean heart rate' over specific time period and heart rate histogram. This SAS is used by NASA Principal Investigators as a research tool to determine the effects of weightlessness on the human cardiovascular system. This is also used to determine if Lower Body Negative Pressure (LBNP) is an effective countermeasure to the orthostatic intolerance experienced by astronauts upon return to normal gravity. In microgravity, astronauts perform the LBNP experiment in the mid deck of the Space Shuttle. The experiment data are downlinked by the orbiter telemetry system, then processed and analyzed in real-time by the integrated Life Sciences Data Acquisition (LSDS) - Spectral Analysis System. The data system is integrated within the framework of two different computer systems, VAX and Macintosh (Mac), using the networking infrastructure to assist the investigators in further understanding the most complex machine on Earth--the human body.

  9. Asymptotic theory of nonparametric regression estimates with censored data

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    For regression analysis, some useful information may have been lost when the responses are right censored. To estimate nonparametric functions, several estimates based on censored data have been proposed and their consistency and convergence rates have been studied in the literature, but the optimal rates of global convergence have not been obtained yet. Because of the possible information loss, one may think that it is impossible for an estimate based on censored data to achieve the optimal rates of global convergence for nonparametric regression, which were established by Stone based on complete data. This paper constructs a regression spline estimate of a general nonparametric regression function based on right-censored response data, and proves, under some regularity conditions, that this estimate achieves the optimal rates of global convergence for nonparametric regression. Since the parameters for the nonparametric regression estimate have to be chosen based on a data driven criterion, we also obtain the asymptotic optimality of AIC, AICC, GCV, Cp and FPE criteria in the process of selecting the parameters.

  10. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    Science.gov (United States)

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library.
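
    As a minimal illustration of the frequentist side of this comparison, the sketch below computes the unsmoothed Good-Turing estimates: the discovery probability N1/n and the per-frequency probability (r+1)N_{r+1}/(N_r n). The smoothed and Bayesian nonparametric versions studied in the article are not reproduced here, and the toy "species" data are invented.

```python
# Minimal sketch of unsmoothed Good-Turing estimates (illustrative only).
from collections import Counter

def good_turing(sample):
    """Return (P_new, Good-Turing probability for a species seen r times)."""
    n = len(sample)
    species_counts = Counter(sample)                  # how often each species was seen
    freq_of_freq = Counter(species_counts.values())   # N_r: number of species seen exactly r times
    p_new = freq_of_freq.get(1, 0) / n                # discovery probability ~ N_1 / n
    prob_given_r = {r: (r + 1) * freq_of_freq.get(r + 1, 0) / (freq_of_freq[r] * n)
                    for r in freq_of_freq}
    return p_new, prob_given_r

sample = list("AAAABBBCCDDEFG")                       # toy "species" observations
p_new, prob_given_r = good_turing(sample)
print(f"P(next observation is a new species) ~ {p_new:.3f}")
```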

  11. Spectral line removal in the LIGO Data Analysis System (LDAS)

    Energy Technology Data Exchange (ETDEWEB)

    Searle, Antony C; Scott, Susan M; McClelland, David E [Department of Physics, Faculty of Science, Australian National University, Canberra ACT 0200 (Australia)

    2003-09-07

    High power in narrow frequency bands, known as spectral lines, is a feature of an interferometric gravitational wave detector's output. Some lines are coherent between interferometers, in particular, the 2 km and 4 km LIGO Hanford instruments. This is of concern to data analysis techniques, such as the stochastic background search, that use correlations between instruments to detect gravitational radiation. Several techniques of 'line removal' have been proposed. Where a line is attributable to a measurable environmental disturbance, a simple linear model may be fitted to predict, and subsequently subtract away, that line. This technique has been implemented (as the command oelslr) in the LIGO Data Analysis System (LDAS). We demonstrate its application to LIGO S1 data.
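
    The sketch below illustrates the line-removal idea described in the abstract: regress the detector channel on a measured environmental (witness) channel and subtract the fitted prediction. It is not the LDAS oelslr implementation; the witness signal, sampling rate, and coupling coefficient are invented for illustration.

```python
# Minimal sketch of line removal by regressing the detector channel on a
# measured environmental (witness) channel and subtracting the prediction.
import numpy as np

def remove_line(detector, witness):
    """Subtract the least-squares projection of `witness` from `detector`."""
    X = np.column_stack([witness, np.ones_like(witness)])
    coef, *_ = np.linalg.lstsq(X, detector, rcond=None)
    return detector - X @ coef

fs = 1024.0
t = np.arange(0, 8, 1 / fs)
witness = np.sin(2 * np.pi * 60.0 * t)                   # e.g. a power-line monitor channel
rng = np.random.default_rng(1)
detector = 0.5 * witness + rng.standard_normal(t.size)   # detector output = coupled line + noise
cleaned = remove_line(detector, witness)
print(np.std(detector), np.std(cleaned))                 # variance drops after removal
```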

  12. Software for the Spectral Analysis of Hot Stars

    CERN Document Server

    Rauch, Thomas; Stampa, Ulrike; Demleitner, Markus; Koesterke, Lars

    2009-01-01

    In a collaboration of the German Astrophysical Virtual Observatory (GAVO) and AstroGrid-D, the German Astronomy Community Grid (GACG), we provide a VO service for the access and the calculation of stellar synthetic energy distributions (SEDs) based on static as well as expanding non-LTE model atmospheres. At three levels, a VO user may directly compare observed and theoretical SEDs: The easiest and fastest way is to use pre-calculated SEDs from the GAVO database. For individual objects, grids of model atmospheres and SEDs can be calculated on the compute resources of AstroGrid-D within reasonable wallclock time. Experienced VO users may even create their own atomic-data files for a more detailed analysis. This VO service also opens the perspective of a new approach to the automated spectral analysis of a large number of observations, e.g. provided by multi-object spectrographs.

  13. Spectral Line Removal in the LIGO Data Analysis System (LDAS)

    CERN Document Server

    Searle, A C; McClelland, D E; Searle, Antony C.; Scott, Susan M.; Clelland, David E. Mc

    2003-01-01

    High power in narrow frequency bands, known as spectral lines, is a feature of an interferometric gravitational wave detector's output. Some lines are coherent between interferometers, in particular, the 2 km and 4 km LIGO Hanford instruments. This is of concern to data analysis techniques, such as the stochastic background search, that use correlations between instruments to detect gravitational radiation. Several techniques of `line removal' have been proposed. Where a line is attributable to a measurable environmental disturbance, a simple linear model may be fitted to predict, and subsequently subtract away, that line. This technique has been implemented (as the command oelslr) in the LIGO Data Analysis System (LDAS). We demonstrate its application to LIGO S1 data.

  14. Separability Analysis of Sentinel-2A Multi-Spectral Instrument (MSI) Data for Burned Area Discrimination

    Directory of Open Access Journals (Sweden)

    Haiyan Huang

    2016-10-01

    Biomass burning is a global phenomenon and systematic burned area mapping is of increasing importance for science and applications. With high spatial resolution and novelty in band design, the recently launched Sentinel-2A satellite provides a new opportunity for moderate spatial resolution burned area mapping. This study examines the performance of the Sentinel-2A Multi Spectral Instrument (MSI) bands and derived spectral indices to differentiate between unburned and burned areas. For this purpose, five pairs of pre-fire and post-fire top of atmosphere (TOA) reflectance and atmospherically corrected (surface) reflectance images were studied. The pixel values of locations that were unburned in the first image and burned in the second image, as well as the values of locations that were unburned in both images which served as a control, were compared and the discrimination of individual bands and spectral indices were evaluated using parametric (transformed divergence) and non-parametric (decision tree) approaches. Based on the results, the most suitable MSI bands to detect burned areas are the 20 m near-infrared, short wave infrared and red-edge bands, while the performance of the spectral indices varied with location. The atmospheric correction only significantly influenced the separability of the visible wavelength bands. The results provide insights that are useful for developing Sentinel-2 burned area mapping algorithms.
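
    A sketch of the parametric separability measure named above, the transformed divergence between two classes with Gaussian band statistics (here on a 0-2 scale), is shown below. The reflectance values are invented, and the formula follows the standard textbook definition rather than the exact settings of the study.

```python
# Minimal sketch of the transformed divergence (TD) between two classes with
# Gaussian statistics, a common band-separability measure (0-2 scale here).
import numpy as np

def transformed_divergence(x1, x2):
    """x1, x2: (n_samples, n_bands) arrays of pixel values for two classes."""
    m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
    c1 = np.atleast_2d(np.cov(x1, rowvar=False))
    c2 = np.atleast_2d(np.cov(x2, rowvar=False))
    c1i, c2i = np.linalg.inv(c1), np.linalg.inv(c2)
    dm = (m1 - m2).reshape(-1, 1)
    div = 0.5 * np.trace((c1 - c2) @ (c2i - c1i)) \
        + 0.5 * np.trace((c1i + c2i) @ (dm @ dm.T))
    return 2.0 * (1.0 - np.exp(-div / 8.0))

rng = np.random.default_rng(2)
burned = rng.normal([0.05, 0.12], 0.01, size=(200, 2))      # toy NIR/SWIR reflectances
unburned = rng.normal([0.25, 0.20], 0.02, size=(200, 2))
print(f"TD = {transformed_divergence(burned, unburned):.3f}")  # values near 2 mean fully separable
```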

  15. Nonparametric regression with filtered data

    CERN Document Server

    Linton, Oliver; Nielsen, Jens Perch; Van Keilegom, Ingrid; 10.3150/10-BEJ260

    2011-01-01

    We present a general principle for estimating a regression function nonparametrically, allowing for a wide variety of data filtering, for example, repeated left truncation and right censoring. Both the mean and the median regression cases are considered. The method works by first estimating the conditional hazard function or conditional survivor function and then integrating. We also investigate improved methods that take account of model structure such as independent errors and show that such methods can improve performance when the model structure is true. We establish the pointwise asymptotic normality of our estimators.

  16. Nonparametric identification of copula structures

    KAUST Repository

    Li, Bo

    2013-06-01

    We propose a unified framework for testing a variety of assumptions commonly made about the structure of copulas, including symmetry, radial symmetry, joint symmetry, associativity and Archimedeanity, and max-stability. Our test is nonparametric and based on the asymptotic distribution of the empirical copula process. We perform simulation experiments to evaluate our test and conclude that our method is reliable and powerful for assessing common assumptions on the structure of copulas, particularly when the sample size is moderately large. We illustrate our testing approach on two datasets. © 2013 American Statistical Association.
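
    The sketch below illustrates only the empirical-copula ingredient of such a test: pseudo-observations from ranks, the empirical copula evaluated on a grid, and a simple exchangeability (symmetry) statistic. The published test additionally requires the asymptotic or resampled distribution of the statistic for calibration, which is omitted here; all data are synthetic.

```python
# Minimal sketch: empirical copula on a grid and an exchangeability (symmetry)
# statistic sup |C_n(u, v) - C_n(v, u)|.  Illustrative only.
import numpy as np

def empirical_copula(x, y, grid):
    """Evaluate the empirical copula of (x, y) on grid x grid."""
    n = len(x)
    u = np.argsort(np.argsort(x)) / n + 1.0 / n     # pseudo-observations (ranks / n)
    v = np.argsort(np.argsort(y)) / n + 1.0 / n
    C = np.empty((len(grid), len(grid)))
    for i, a in enumerate(grid):
        for j, b in enumerate(grid):
            C[i, j] = np.mean((u <= a) & (v <= b))
    return C

rng = np.random.default_rng(3)
x = rng.standard_normal(500)
y = 0.7 * x + rng.standard_normal(500)              # symmetric (exchangeable) dependence
grid = np.linspace(0.05, 0.95, 19)
C = empirical_copula(x, y, grid)
symmetry_stat = np.max(np.abs(C - C.T))
print(f"max |C(u,v) - C(v,u)| = {symmetry_stat:.4f}")
```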

  17. Spectral Knowledge (SK-UTALCA): Software for Exploratory Analysis of High-Resolution Spectral Reflectance Data on Plant Breeding

    Science.gov (United States)

    Lobos, Gustavo A.; Poblete-Echeverría, Carlos

    2017-01-01

    This article describes public, free software that provides efficient exploratory analysis of high-resolution spectral reflectance data. Spectral reflectance data can suffer from problems such as poor signal-to-noise ratios in various wavebands or invalid measurements due to changes in incoming solar radiation or operator fatigue leading to poor orientation of sensors. Thus, exploratory data analysis is essential to identify appropriate data for further analyses. This software overcomes the problem that analysis tools such as Excel are cumbersome to use for the high number of wavelengths and samples typically acquired in these studies. The software, Spectral Knowledge (SK-UTALCA), was initially developed for plant breeding, but it is also suitable for other studies such as precision agriculture, crop protection, ecophysiology, plant nutrition, and soil fertility. Various spectral reflectance indices (SRIs) are often used to relate crop characteristics to spectral data and the software is loaded with 255 SRIs which can be applied quickly to the data. This article describes the architecture and functions of SK-UTALCA and the features of the data that led to the development of each of its modules. PMID:28119705

  18. Advanced spectral analysis of ionospheric waves observed with sparse arrays

    Science.gov (United States)

    Helmboldt, J. F.; Intema, H. T.

    2014-02-01

    This paper presents a case study from a single 6 h observing period to illustrate the application of techniques developed for interferometric radio telescopes to the spectral analysis of observations of ionospheric fluctuations with sparse arrays. We have adapted the deconvolution methods used for making high dynamic range images of cosmic sources with radio arrays to making comparably high dynamic range maps of spectral power of wavelike ionospheric phenomena. In the example presented here, we have used observations of the total electron content (TEC) gradient derived from Very Large Array (VLA) observations of synchrotron emission from two galaxy clusters at 330 MHz as well as GPS-based TEC measurements from a sparse array of 33 receivers located within New Mexico near the VLA. We show that these techniques provide a significant improvement in signal-to-noise ratio (S/N) of detected wavelike structures by correcting for both measurement inaccuracies and wavefront distortions. This is especially true for the GPS data when combining all available satellite/receiver pairs, which probe a larger physical area and likely have a wider variety of measurement errors than in the single-satellite case. In this instance, we found that the peak S/N of the detected waves was improved by more than an order of magnitude. The data products generated by the deconvolution procedure also allow for a reconstruction of the fluctuations as a two-dimensional waveform/phase screen that can be used to correct for their effects.

  19. Bayesian Spectral Analysis of Metal Abundance Deficient Stars

    CERN Document Server

    Sourlas, E; Kashyap, V L; Drake, J; Pease, D; Sourlas, Epaminondas; Dyk, David van; Kashyap, Vinay; Drake, Jeremy; Pease, Deron

    2002-01-01

    Metallicity can be measured by analyzing the spectra in the X-ray region and comparing the flux in spectral lines to the flux in the underlying Bremsstrahlung continuum. In this paper we propose new Bayesian methods which directly model the Poisson nature of the data and thus are expected to exhibit improved sampling properties. Our model also accounts for the Poisson nature of background contamination of the observations, image blurring due to instrument response, and the absorption of photons in space. The resulting highly structured hierarchical model is fit using the Gibbs sampler, data augmentation and Metropolis-Hastings. We demonstrate our methods with the X-ray spectral analysis of several "Metal Abundance Deficient" stars. The model is designed to summarize the relative frequency of the energy of photons (X-ray or gamma-ray) arriving at a detector. Independent Poisson distributions are more appropriate to model the counts than the commonly used normal approximation. We model the high energy tail of th...

  20. Accuracy Enhancement of Inertial Sensors Utilizing High Resolution Spectral Analysis

    Directory of Open Access Journals (Sweden)

    Michael Korenberg

    2012-08-01

    In both military and civilian applications, the inertial navigation system (INS) and the global positioning system (GPS) are two complementary technologies that can be integrated to provide reliable positioning and navigation information for land vehicles. The accuracy enhancement of INS sensors and the integration of INS with GPS are the subjects of widespread research. Wavelet de-noising of INS sensors has had limited success in removing the long-term (low-frequency) inertial sensor errors. The primary objective of this research is to develop a novel inertial sensor accuracy enhancement technique that can remove both short-term and long-term error components from inertial sensor measurements prior to INS mechanization and INS/GPS integration. A high resolution spectral analysis technique called the fast orthogonal search (FOS) algorithm is used to accurately model the low frequency range of the spectrum, which includes the vehicle motion dynamics and inertial sensor errors. FOS models the spectral components with the most energy first and uses an adaptive threshold to stop adding frequency terms when fitting a term does not reduce the mean squared error more than fitting white noise. The proposed method was developed, tested and validated through road test experiments involving both low-end tactical grade and low cost MEMS-based inertial systems. The results demonstrate that in most cases the position accuracy during GPS outages using FOS de-noised data is superior to the position accuracy using wavelet de-noising.
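
    A heavily simplified sketch of the idea behind this kind of de-noising follows: greedily fit the strongest sinusoid in the residual, subtract it, and stop when the mean-squared-error reduction becomes negligible. This is not Korenberg's fast orthogonal search; the stopping rule below is a crude heuristic threshold (the real FOS compares against fitting white noise), and the signal and noise levels are invented.

```python
# Simplified, illustrative stand-in for FOS-style de-noising: greedy sinusoid
# fitting with a heuristic stopping rule.  Not Korenberg's FOS algorithm.
import numpy as np

def greedy_sinusoid_model(x, fs, max_terms=20, rel_tol=0.01):
    n = x.size
    t = np.arange(n) / fs
    model = np.full(n, x.mean())
    residual = x - x.mean()
    for _ in range(max_terms):
        spec = np.abs(np.fft.rfft(residual)) ** 2
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        f = freqs[np.argmax(spec[1:]) + 1]            # strongest non-DC frequency
        X = np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(X, residual, rcond=None)
        fitted = X @ coef
        # Heuristic stop: keep adding terms only while they explain a
        # non-negligible fraction of the total variance.
        if np.mean(fitted ** 2) < rel_tol * np.var(x):
            break
        model += fitted
        residual -= fitted
    return model

fs = 100.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(4)
signal = 0.2 * np.sin(2 * np.pi * 0.05 * t) + 0.05 * np.sin(2 * np.pi * 0.5 * t)
noisy = signal + 0.1 * rng.standard_normal(t.size)
denoised = greedy_sinusoid_model(noisy, fs)
print(np.mean((noisy - signal) ** 2), np.mean((denoised - signal) ** 2))
```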

  1. ANALYSIS OF CAMOUFLAGE COVER SPECTRAL CHARACTERISTICS BY IMAGING SPECTROMETER

    Directory of Open Access Journals (Sweden)

    A. Y. Kouznetsov

    2016-03-01

    Subject of Research. The paper deals with the problems of detection and identification of objects in hyperspectral imagery. The possibility of determining object type by statistical methods is demonstrated, and the possibility of using a spectral image to identify its data type is considered. Method. The research was done by means of video-spectral equipment for object detection against a "Fregat" substrate. The postprocessing of hyperspectral information was done using a mathematical model of a pattern recognition system. The vegetation indexes TCHVI (Three-Channel Vegetation Index) and NDVI (Normalized Difference Vegetation Index) were applied for quality control of object recognition. The Neumann-Pearson criterion was offered as a tool for determining differences between objects. Main Results. We have carried out an analysis of the spectral characteristics of a summer-type camouflage cover (Germany). We have calculated the density distributions of the vegetation indexes and obtained the statistical characteristics needed to create a mathematical model for the pattern recognition system. We have shown the applicability of vegetation indices for detection of a summer camouflage cover against a verdure background. We have presented a mathematical model of object recognition based on the Neumann-Pearson criterion. Practical Relevance. The results may be useful for specialists in the field of hyperspectral data processing for surface state monitoring.

  2. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described

  3. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is

  4. Equity and efficiency in private and public education: a nonparametric comparison

    NARCIS (Netherlands)

    L. Cherchye; K. de Witte; E. Ooghe; I. Nicaise

    2007-01-01

    We present a nonparametric approach for the equity and efficiency evaluation of (private and public) primary schools in Flanders. First, we use a nonparametric (Data Envelopment Analysis) model that is specially tailored to assess educational efficiency at the pupil level. The model accounts for the

  5. Non-parametric tests of productive efficiency with errors-in-variables

    NARCIS (Netherlands)

    Kuosmanen, T.K.; Post, T.; Scholtes, S.

    2007-01-01

    We develop a non-parametric test of productive efficiency that accounts for errors-in-variables, following the approach of Varian [1985. Nonparametric analysis of optimizing behavior with measurement error. Journal of Econometrics 30(1/2), 445-458]. The test is based on the general Pareto-Koopmans

  6. Equity and efficiency in private and public education: a nonparametric comparison

    NARCIS (Netherlands)

    Cherchye, L.; de Witte, K.; Ooghe, E.; Nicaise, I.

    2007-01-01

    We present a nonparametric approach for the equity and efficiency evaluation of (private and public) primary schools in Flanders. First, we use a nonparametric (Data Envelopment Analysis) model that is specially tailored to assess educational efficiency at the pupil level. The model accounts for the

  7. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  8. Spectral reflectance of surface soils - A statistical analysis

    Science.gov (United States)

    Crouse, K. R.; Henninger, D. L.; Thompson, D. R.

    1983-01-01

    The relationship of the physical and chemical properties of soils to their spectral reflectance as measured at six wavebands of Thematic Mapper (TM) aboard NASA's Landsat-4 satellite was examined. The results of performing regressions of over 20 soil properties on the six TM bands indicated that organic matter, water, clay, cation exchange capacity, and calcium were the properties most readily predicted from TM data. The middle infrared bands, bands 5 and 7, were the best bands for predicting soil properties, and the near infrared band, band 4, was nearly as good. Clustering 234 soil samples on the TM bands and characterizing the clusters on the basis of soil properties revealed several clear relationships between properties and reflectance. Discriminant analysis found organic matter, fine sand, base saturation, sand, extractable acidity, and water to be significant in discriminating among clusters.

  9. Spectral analysis methods for vehicle interior vibro-acoustics identification

    Science.gov (United States)

    Hosseini Fouladi, Mohammad; Nor, Mohd. Jailani Mohd.; Ariffin, Ahmad Kamal

    2009-02-01

    Noise has various effects on human comfort, performance and health. Sounds are analysed by the human brain based on their frequencies and amplitudes. In a dynamic system, the transmission of sound and vibration depends on the frequency and direction of the input motion and on the characteristics of the output. Automotive manufacturers therefore invest considerable effort and money to improve and enhance the vibro-acoustic performance of their products. The enhancement effort can be very difficult and time-consuming if one relies only on 'trial and error' methods without prior knowledge of the sources themselves. Complex noise inside a vehicle cabin originates from various sources and travels through many pathways. The first stage of sound quality refinement is to find the source, so it is vital for automotive engineers to identify the dominant noise sources, such as engine noise, exhaust noise and noise due to vibration transmission inside the vehicle. The purpose of this paper is to find the vibro-acoustic sources of noise in a passenger vehicle compartment. The spectral analysis method is much faster than 'trial and error' methods, in which parts must be separated to measure transfer functions. Moreover, with the spectral analysis method the signals can be recorded under real operational conditions, which leads to more consistent results. A multi-channel analyser is utilised to measure and record the vibro-acoustic signals, and computational algorithms are employed to identify the contribution of the various sources to the measured interior signal. These results can be utilised to detect, control and optimise the interior noise performance of road transport vehicles.

  10. Estimation of Spatial Dynamic Nonparametric Durbin Models with Fixed Effects

    Science.gov (United States)

    Qian, Minghui; Hu, Ridong; Chen, Jianwei

    2016-01-01

    Spatial panel data models have been widely studied and applied in both scientific and social science disciplines, especially in the analysis of spatial influence. In this paper, we consider the spatial dynamic nonparametric Durbin model (SDNDM) with fixed effects, which takes the nonlinear factors into account based on the spatial dynamic panel…

  11. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    With reference to a specific data set, we consider how to perform a flexible non-parametric Bayesian analysis of an inhomogeneous point pattern modelled by a Markov point process, with a location dependent first order term and pairwise interaction only. A priori we assume that the first order term...

  12. Homothetic Efficiency and Test Power: A Non-Parametric Approach

    NARCIS (Netherlands)

    J. Heufer (Jan); P. Hjertstrand (Per)

    2015-01-01

    We provide a nonparametric revealed preference approach to demand analysis based on homothetic efficiency. Homotheticity is a useful restriction but data rarely satisfies testable conditions. To overcome this we provide a way to estimate homothetic efficiency of consump

  13. Water quality analysis in rivers with non-parametric probability distributions and fuzzy inference systems: application to the Cauca River, Colombia.

    Science.gov (United States)

    Ocampo-Duque, William; Osorio, Carolina; Piamba, Christian; Schuhmacher, Marta; Domingo, José L

    2013-02-01

    The integration of water quality monitoring variables is essential in environmental decision making. Nowadays, advanced techniques to manage subjectivity, imprecision, uncertainty, vagueness, and variability are required in such a complex evaluation process. We here propose a probabilistic fuzzy hybrid model to assess river water quality. Fuzzy logic reasoning has been used to compute a water quality integrative index. By applying a Monte Carlo technique based on non-parametric probability distributions, the randomness of model inputs was estimated. Annual histograms of nine water quality variables were built with monitoring data systematically collected in the Colombian Cauca River, and probability density estimations using the kernel smoothing method were applied to fit the data. Several years were assessed, and river sectors upstream and downstream of the city of Santiago de Cali, a big city with basic wastewater treatment and high industrial activity, were analyzed. The probabilistic fuzzy water quality index was able to explain the reduction in water quality as the river receives a larger number of agricultural, domestic, and industrial effluents. The results of the hybrid model were compared to traditional water quality indexes. The main advantage of the proposed method is that it considers flexible boundaries between the linguistic qualifiers used to define the water status; the membership of water quality in the diverse output fuzzy sets or classes is provided with percentiles and histograms, which allows a better classification of the real water condition. The results of this study show that fuzzy inference systems integrated with stochastic non-parametric techniques may be used as complementary tools in water quality indexing methodologies.
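
    The sketch below illustrates only the stochastic ingredients described above: a kernel (non-parametric) density fitted to monitoring data and Monte Carlo draws pushed through a membership/scoring step. The full method uses a complete fuzzy inference system over nine variables; the single-variable trapezoidal membership and the dissolved-oxygen data here are toy placeholders.

```python
# Minimal sketch: kernel density estimate of a monitoring variable, Monte
# Carlo sampling, and a toy membership function standing in for the fuzzy step.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
do_measurements = np.clip(rng.normal(6.5, 1.2, 120), 0, 12)   # dissolved oxygen, mg/L (toy data)

kde = gaussian_kde(do_measurements)                            # kernel-smoothed density
samples = kde.resample(10_000).ravel()                         # Monte Carlo draws

def membership_good_oxygen(do_mg_l):
    """Toy trapezoidal membership for the linguistic class 'good oxygen'."""
    return np.clip((do_mg_l - 4.0) / 2.0, 0.0, 1.0)

quality_scores = membership_good_oxygen(samples)
print(f"P(quality score > 0.8) = {np.mean(quality_scores > 0.8):.2f}")
```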

  14. Spectral analysis in ultraweak emissions of chemi- and electrochemiluminescence systems

    Institute of Scientific and Technical Information of China (English)

    K.Staninski; M.Kaczmarek; S.Lis; D.Komar; A.Szyczewski

    2009-01-01

    Investigation of ultraweak emissions in the processes of chemiluminescence (CL) and electrochemiluminescence (ECL) requires special techniques for their recording and spectral analysis. Among the methods proposed so far for detecting the emission spectra of these processes, the cut-off filter method is the most sensitive. The usefulness of this method for interpreting CL and ECL systems with quantum yields in the range 1×10⁻⁹-1×10⁻¹¹, containing ions and complexes of Eu(III), Tb(III) and Dy(III), is shown. The exceptional character of the emission bands of lanthanide ions, which result from f-f electron transitions and in particular have low FWHM, permitted the application of the cut-off filter method to their analysis. The results obtained for CL and ECL on the basis of the analysis of ultraweak emission proved successful in analytical applications. The systems containing Eu(III) ions, hydrated or complexed with organic ligands, enabled inferring changes in the coordination sphere of the ions.

  15. Spectral Analysis of Broadband Seismic Array Data, Tien Shan

    Science.gov (United States)

    Shamshy, S.; Pavlis, G. L.

    2003-12-01

    We used a spectral analysis method to examine amplitude variations of body waves recorded in the Tien Shan region of central Asia. We used broadband data from the Kyrgyz Network (KNET), the Kazakhstan Network (KZNET), and a set of temporary PASSCAL stations operated from 1997 to 2000 that we refer to as the Ghengis array. A spectral ratio method similar to that used by Wilson and Pavlis (2000) was employed, but with station AAK used as a reference instead of the array median. Spectral ratios were estimated for all teleseismic events and for larger, intermediate-depth events from the Hindu-Kush region, for all three components of ground motion and for the total signal strength on all components. Results are visualized by maps of amplitude for various frequency bands and through the 4-D animation method introduced by Wilson and Pavlis (2000). Data from Hindu-Kush events showed amplitude variations of as much as a factor of 100 across the study area, with a strong frequency dependence. The largest variations were at the highest frequencies observed, near 15 Hz. Stations in the northwestern part of the Tien Shan array show little variation in amplitude relative to the reference station, AAK. In the central and eastern part of the array, the amplitude estimates are significantly smaller at all frequencies. In contrast, for stations in the western Tien Shan near the Talas-Fergana Fault, and the southern Tien Shan near the Tarim Basin, the amplitude values become much larger than at the reference site. The teleseismic data show a different pattern, with a somewhat smaller overall amplitude variation at comparable frequencies. The northern part of the array again shows small variations relative to the reference station. There are some amplifications in the southern stations of the array, especially in the Tarim Basin. The higher frequency observations that show large amplifications at stations in the Tarim Basin are readily explained by site effects due to the thick deposits of sediments
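
    A minimal sketch of a spectral-ratio measurement of the kind described above is given below: Welch power spectra of a station and a reference station over the same window, ratioed within a frequency band. It is not the cited implementation; the sampling rate, band, and synthetic traces are assumptions.

```python
# Minimal sketch of a spectral-ratio measurement relative to a reference station.
import numpy as np
from scipy.signal import welch

def spectral_ratio(trace, reference, fs, band):
    f, p_sta = welch(trace, fs=fs, nperseg=1024)
    _, p_ref = welch(reference, fs=fs, nperseg=1024)
    sel = (f >= band[0]) & (f <= band[1])
    return np.sqrt(p_sta[sel].mean() / p_ref[sel].mean())      # amplitude ratio in the band

fs = 40.0                                                       # Hz, an assumed sampling rate
rng = np.random.default_rng(6)
reference = rng.standard_normal(int(fs) * 60)                   # one minute of "reference" motion
site = 3.0 * reference + 0.5 * rng.standard_normal(reference.size)  # amplified site response
print(f"amplitude ratio 5-15 Hz: {spectral_ratio(site, reference, fs, (5, 15)):.2f}")
```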

  16. Comprehensive spectral analysis of Cyg X-1 using RXTE data

    Institute of Scientific and Technical Information of China (English)

    Rizwan Shahid; Ranjeev Misra; S.N.A.Jaaffrey

    2012-01-01

    We analyze a large number (> 500) of pointed Rossi X-Ray Timing Explorer (RXTE) observations of Cyg X-1 and model the spectrum of each one. A subset of the observations for which there is a simultaneous reliable measure of the hardness ratio by the All Sky Monitor shows that the sample covers nearly all the spectral shapes of Cyg X-1. Each observation is fitted with a generic empirical model consisting of a disk black body spectrum, a Comptonized component whose input photon shape is the same as the disk emission, a Gaussian to represent the iron line and a reflection feature. The relative strength and width of the iron line and the reflection parameter are in general correlated with the high energy photon spectral index Γ. This is broadly consistent with a geometry where for the hard state (low Γ ~ 1.7) there is a hot inner Comptonizing region surrounded by a truncated cold disk. The inner edge of the disk moves inwards as the source becomes softer till finally in the soft state (high Γ > 2.2) the disk fills the inner region and active regions above the disk produce the Comptonized component. However, the reflection parameter shows non-monotonic behavior near the transition region (Γ ~ 2), which suggests a more complex geometry or physical state of the reflector. In addition, the inner disk temperature during the hard state is on average higher than in the soft one, albeit with large scatter. These inconsistencies could be due to limitations in the data and the empirical model used to fit them. The flux of each spectral component is well correlated with Γ, which shows that, unlike some other black hole systems, Cyg X-1 does not show any hysteresis behavior. In the soft state, the flux of the Comptonized component is always similar to the disk one, which confirms that the ultra-soft state (seen in other brighter black hole systems) is not exhibited by Cyg X-1. The rapid variation of the Compton amplification factor with Γ naturally explains the absence of spectra with Γ

  17. ENSO and its modulations on annual and multidecadal timescales revealed by Nonlinear Laplacian Spectral Analysis

    Science.gov (United States)

    Giannakis, D.; Slawinska, J. M.

    2016-12-01

    The variability of the Indo-Pacific Ocean on interannual to multidecadal timescales is investigated in a millennial control run of CCSM4 and in observations using a recently introduced technique called Nonlinear Laplacian Spectral Analysis (NLSA). Through this technique, drawbacks associated with ad hoc pre-filtering of the input data are avoided, enabling recovery of low-frequency and intermittent modes not accessible previously via classical approaches. Here, a multiscale hierarchy of modes is identified for Indo-Pacific SST and numerous linkages between these patterns are revealed. On interannual timescales, a mode with spatiotemporal pattern corresponding to the fundamental component of ENSO emerges, along with modulations of the annual cycle by ENSO in agreement with ENSO combination mode theory. In spatiotemporal reconstructions, these patterns capture the seasonal southward migration of SST and zonal wind anomalies associated with termination of El Niño and La Niña events. Notably, this family of modes explains a significant portion of SST variance in Eastern Indian Ocean regions employed in the definition of Indian Ocean dipole (IOD) indices, suggesting that it should be useful for understanding the linkage of these indices with ENSO and the interaction of the Indian and Pacific Oceans. In model data, we find that the ENSO and ENSO combination modes are modulated on multidecadal timescales by a mode predominantly active in the western tropical Pacific - we call this mode West Pacific Multidecadal Oscillation (WPMO). Despite the relatively low variance explained by this mode, its dynamical role appears to be significant as it has clear sign-dependent modulating relationships with the interannual modes carrying most of the variance. In particular, cold WPMO events are associated with anomalous Central Pacific westerlies favoring stronger ENSO events, while warm WPMO events suppress ENSO activity. Moreover, the WPMO has significant climatic impacts as

  18. Nonparametric Regression with Common Shocks

    Directory of Open Access Journals (Sweden)

    Eduardo A. Souza-Rodrigues

    2016-09-01

    This paper considers a nonparametric regression model for cross-sectional data in the presence of common shocks. Common shocks are allowed to be very general in nature; they do not need to be finite dimensional with a known (small) number of factors. I investigate the properties of the Nadaraya-Watson kernel estimator and determine how general the common shocks can be while still obtaining meaningful kernel estimates. Restrictions on the common shocks are necessary because kernel estimators typically manipulate conditional densities, and conditional densities do not necessarily exist in the present case. By appealing to disintegration theory, I provide sufficient conditions for the existence of such conditional densities and show that the estimator converges in probability to the Kolmogorov conditional expectation given the sigma-field generated by the common shocks. I also establish the rate of convergence and the asymptotic distribution of the kernel estimator.
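
    As a minimal illustration of the estimator analyzed above, the sketch below implements the ordinary Nadaraya-Watson kernel regression estimator with a Gaussian kernel on independent synthetic data; it does not attempt to model common shocks, and the bandwidth is fixed by hand.

```python
# Minimal sketch of the Nadaraya-Watson kernel regression estimator.
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    # Gaussian kernel weights K((x - X_i)/h) for every evaluation point at once
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + 0.3 * rng.standard_normal(x.size)
x_grid = np.linspace(0, 2 * np.pi, 50)
y_hat = nadaraya_watson(x, y, x_grid, h=0.3)
print(np.max(np.abs(y_hat - np.sin(x_grid))))        # rough measure of the fit error
```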

  19. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....

  20. An asymptotically optimal nonparametric adaptive controller

    Institute of Scientific and Technical Information of China (English)

    郭雷; 谢亮亮

    2000-01-01

    For discrete-time nonlinear stochastic systems with unknown nonparametric structure, a kernel estimation-based nonparametric adaptive controller is constructed based on the truncated certainty equivalence principle. Global stability and asymptotic optimality of the closed-loop systems are established without resorting to any external excitations.

  1. Spectral Image Analysis for Measuring Ripeness of Tomatoes

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.; Young, I.T.

    2002-01-01

    In this study, spectral images of five ripeness stages of tomatoes have been recorded and analyzed. The electromagnetic spectrum between 396 and 736 nm was recorded in 257 bands (every 1.3 nm). Results show that spectral images offer more discriminating power than standard RGB images for measuring ripeness.

  2. Spectral analysis based on compressive sensing in nanophotonic structures.

    Science.gov (United States)

    Wang, Zhu; Yu, Zongfu

    2014-10-20

    A method of spectral sensing based on compressive sensing is shown to have the potential to achieve high resolution in a compact device size. The random bases used in compressive sensing are created by the optical response of a set of different nanophotonic structures, such as photonic crystal slabs. The complex interferences in these nanostructures offer diverse spectral features suitable for compressive sensing.
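
    The sketch below illustrates the reconstruction step in generic form: measurements y = A s, where each row of A stands in for the (suitably centred) spectral response of one nanostructure and s is a sparse spectrum, recovered with plain iterative soft thresholding (ISTA). The response matrix, regularization weight, and spectrum are all invented for illustration and are not taken from the paper.

```python
# Minimal sketch of compressive spectral recovery with ISTA (soft thresholding).
import numpy as np

def ista(A, y, lam=0.01, n_iter=1000):
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
    s = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ s - y)
        z = s - grad / L
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return s

rng = np.random.default_rng(8)
n_bands, n_detectors = 256, 32
spectrum = np.zeros(n_bands)
spectrum[[40, 120, 200]] = [1.0, 0.8, 0.6]         # a few narrow spectral lines
A = rng.standard_normal((n_detectors, n_bands)) / np.sqrt(n_detectors)  # random sensing matrix
y = A @ spectrum
recovered = ista(A, y)
print(np.sort(np.argsort(recovered)[-3:]))          # indices of the strongest recovered lines
```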

  3. Spectral Efficiency Analysis for Multicarrier Based 4G Systems

    DEFF Research Database (Denmark)

    Silva, Nuno; Rahman, Muhammad Imadur; Frederiksen, Flemming Bjerge

    2006-01-01

    In this paper, a spectral efficiency definition is proposed. Spectral efficiency for multicarrier based multiaccess techniques, such as OFDMA, MC-CDMA and OFDMA-CDM, is analyzed. Simulations for different indoor and outdoor scenarios are carried out. Based on the simulations, we have discussed ho...

  4. [A New HAC Unsupervised Classifier Based on Spectral Harmonic Analysis].

    Science.gov (United States)

    Yang, Ke-ming; Wei, Hua-feng; Shi, Gang-qiang; Sun, Yang-yang; Liu, Fei

    2015-07-01

    Hyperspectral images classification is one of the important methods to identify image information, which has great significance for feature identification, dynamic monitoring and thematic information extraction, etc. Unsupervised classification without prior knowledge is widely used in hyperspectral image classification. This article proposes a new hyperspectral images unsupervised classification algorithm based on harmonic analysis(HA), which is called the harmonic analysis classifer (HAC). First, the HAC algorithm counts the first harmonic component and draws the histogram, so it can determine the initial feature categories and the pixel of cluster centers according to the number and location of the peak. Then, the algorithm is to map the waveform information of pixels to be classified spectrum into the feature space made up of harmonic decomposition times, amplitude and phase, and the similar features can be gotten together in the feature space, these pixels will be classified according to the principle of minimum distance. Finally, the algorithm computes the Euclidean distance of these pixels between cluster center, and merges the initial classification by setting the distance threshold. so the HAC can achieve the purpose of hyperspectral images classification. The paper collects spectral curves of two feature categories, and obtains harmonic decomposition times, amplitude and phase after harmonic analysis, the distribution of HA components in the feature space verified the correctness of the HAC. While the HAC algorithm is applied to EO-1 satellite Hyperion hyperspectral image and obtains the results of classification. Comparing with the hyperspectral image classifying results of K-MEANS, ISODATA and HAC classifiers, the HAC, as a unsupervised classification method, is confirmed to have better application on hyperspectral image classification.

  5. Nonlinear Laplacian spectral analysis of Rayleigh-Bénard convection

    Science.gov (United States)

    Brenowitz, N. D.; Giannakis, D.; Majda, A. J.

    2016-06-01

    The analysis of physical datasets using modern methods developed in machine learning presents unique challenges and opportunities. These datasets typically feature many degrees of freedom, which tends to increase the computational cost of statistical methods and complicate interpretation. In addition, physical systems frequently exhibit a high degree of symmetry that should be exploited by any data analysis technique. The classic problem of Rayleigh Benárd convection in a periodic domain is an example of such a physical system with trivial symmetries. This article presents a technique for analyzing the time variability of numerical simulations of two-dimensional Rayleigh-Bénard convection at large aspect ratio and intermediate Rayleigh number. The simulated dynamics are highly unsteady and consist of several convective rolls that are distributed across the domain and oscillate with a preferred frequency. Intermittent extreme events in the net heat transfer, as quantified by the time-weighted probability distribution function of the Nusselt number, are a hallmark of these simulations. Nonlinear Laplacian Spectral Analysis (NLSA) is a data-driven method which is ideally suited for the study of such highly nonlinear and intermittent dynamics, but the trivial symmetries of the Rayleigh-Bénard problem such as horizontal shift-invariance can mask the interesting dynamics. To overcome this issue, the vertical velocity is averaged over parcels of similar temperature and height, which substantially compresses the size of the dataset and removes trivial horizontal symmetries. This isothermally averaged dataset, which is shown to preserve the net convective heat-flux across horizontal surfaces, is then used as an input to NLSA. The analysis generates a small number of orthogonal modes which describe the spatiotemporal variability of the heat transfer. A regression analysis shows that the extreme events of the net heat transfer are primarily associated with a family of

  6. Power Spectral Analysis of Heart Rate Variability of Driver Fatigue

    Institute of Scientific and Technical Information of China (English)

    JIAO Kun; LI Zeng-yong; CHEN Ming; WANG Cheng-tao

    2005-01-01

    This investigation evaluated driving fatigue based on power spectral analysis of heart rate variability (HRV) under vertical vibration. Forty healthy male subjects (29.7±3.5 years) were randomly divided into two groups, Group A (28.8±4.3 years) and Group B (30.6±2.7 years). Group A (experiment group) performed simulated driving while Group B (control group) remained at rest for 90 min. The frequency-domain indices of HRV, such as low frequency (0.04-0.15 Hz, LF), high frequency (0.15-0.4 Hz, HF) and LF/HF, together with hemodynamic indices such as blood pressure (BP) and heart rate (HR), were calculated and compared between the two groups after the simulated driving. Significant differences in the former indices were found between the groups (P<0.05). The data collected from Group A after the experiment showed a marked linear correlation (P<0.05), and the parameters and errors of the corresponding linear regression equations are reported (α=0.05, P<0.001). The study showed that the sympathetic activity of the subjects increased after the simulated driving while parasympathetic activity decreased, and the sympathovagal balance shifted accordingly. As the autonomic function indicators of HRV reflect fatigue level, quantitative evaluation of driving mental fatigue from physiological responses may be possible.
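
    For reference, a minimal sketch of the frequency-domain HRV indices used above (LF 0.04-0.15 Hz, HF 0.15-0.4 Hz, and their ratio), computed from an evenly resampled R-R series with Welch's method, is given below; the R-R data are synthetic and the resampling rate is an assumption.

```python
# Minimal sketch of LF, HF, and LF/HF computed from an R-R interval series.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

def lf_hf_ratio(rr_intervals_s, fs_resample=4.0):
    t_beats = np.cumsum(rr_intervals_s)
    t_uniform = np.arange(t_beats[0], t_beats[-1], 1.0 / fs_resample)
    rr_uniform = np.interp(t_uniform, t_beats, rr_intervals_s)
    f, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs_resample, nperseg=256)
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = trapezoid(psd[lf_band], f[lf_band])
    hf = trapezoid(psd[hf_band], f[hf_band])
    return lf, hf, lf / hf

rng = np.random.default_rng(9)
rr = 0.85 + 0.05 * np.sin(2 * np.pi * 0.1 * np.arange(600)) + 0.02 * rng.standard_normal(600)
print("LF, HF, LF/HF:", lf_hf_ratio(rr))
```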

  7. Studying soil properties using visible and near infrared spectral analysis

    Science.gov (United States)

    Moretti, S.; Garfagnoli, F.; Innocenti, L.; Chiarantini, L.

    2009-04-01

    This research is carried out within the DIGISOIL Project, whose purposes are the integration and improvement of in situ and proximal measurement technologies for the assessment of soil properties and soil degradation indicators, going from the sensing technologies to their integration and their application in digital soil mapping. The study area is located in the Virginio river basin, about 30 km south of Firenze, in the Chianti area, where soils with agricultural suitability have a high economic value connected to the production of internationally famous wines and olive oils. The most common soil threats, such as erosion and landslides, may cause huge economic losses, which must be considered in farming management practices. This basin has a length of about 23 km for a basin area of around 60.3 km2. Geological formations outcropping in the area are Pliocene to Pleistocene marine and lacustrine sediments in beds with almost horizontal bedding. Vineyards, olive groves and annual crops are the main types of land use. A typical Mediterranean climate prevails, with a dry summer followed by intense and sometimes prolonged rainfall in autumn, decreasing in winter. In this study, three types of VNIR and SWIR techniques, operating at different scales and in different environments (laboratory spectroscopy, portable field spectroscopy), are integrated to rapidly quantify various soil characteristics, in order to acquire data for assessing the risk of occurrence of typically agricultural practice-related soil threats (swelling, compaction, erosion, landslides, organic matter decline, etc.) and to collect ground data in order to build up a spectral library to be used in image analysis from air-borne and satellite sensors. Difficulties encountered in imaging spectroscopy, such as the influence of measurement conditions, atmospheric attenuation, scene dependency and sampling representation, are investigated and mathematical pre-treatments, using proper algorithms, are applied and

  8. Spectral analysis of HIV seropositivity among migrant workers entering Kuwait

    Directory of Open Access Journals (Sweden)

    Mohammad Hameed GHH

    2008-03-01

    Background: There is a paucity of published data on human immunodeficiency virus (HIV) seroprevalence among migrant workers entering the Middle East, particularly Kuwait. We took advantage of the routine screening of migrants for HIV infection, upon arrival in Kuwait from areas with high HIV prevalence, to (1) estimate the HIV seroprevalence among migrant workers entering Kuwait and (2) ascertain whether any significant time trend or changes had occurred in HIV seroprevalence among these migrants over the study period. Methods: The monthly aggregates of the daily number of migrant workers tested and the number of HIV seropositives were used to generate the monthly series of proportions of HIV seropositive (per 100,000) migrants over a period of 120 months from January 1, 1997 to December 31, 2006. We carried out spectral analysis of these time series data on monthly proportions (per 100,000) of HIV seropositive migrants. Results: Overall HIV seroprevalence (per 100,000) among the migrants was 21 (494/2328582) (95% CI: 19-23), ranging from 11 (95% CI: 8-16) in 2003 to 31 (95% CI: 24-41) in 1998. There was no discernible pattern in the year-specific proportions of HIV seropositive migrants up to 2003; in subsequent years there was a slight but consistent increase in the proportions of HIV seropositive migrants. However, the Mann-Kendall test showed a non-significant (P = 0.741) trend in the de-seasonalized data series of proportions of HIV seropositive migrants. The spectral density had a statistically significant (P = 0.03) peak located at a frequency (radians) of 2.4, which corresponds to a regular cycle of three-month duration in this study. The auto-correlation function did not show any significant seasonality (correlation coefficient at lag 12 = -0.025, P = 0.575). Conclusion: During the study period, overall a low HIV seroprevalence (0.021%) was recorded. Towards the end of the study, a slight but non-significant upward trend in the proportions of HIV seropositive

  9. Spectral mixture analysis of EELS spectrum-images

    OpenAIRE

    Dobigeon, Nicolas; Brun, Nathalie

    2012-01-01

    Recent advances in detectors and computer science have enabled the acquisition and the processing of multidimensional datasets, in particular in the field of spectral imaging. Benefiting from these new developments, Earth scientists try to recover the reflectance spectra of macroscopic materials (e.g., water, grass, mineral types...) present in an observed scene and to estimate their respective proportions in each mixed pixel of the acquired image. This task is usually referred to as spectral...

  10. Analysis of wheezes using wavelet higher order spectral features.

    Science.gov (United States)

    Taplidou, Styliani A; Hadjileontiadis, Leontios J

    2010-07-01

    Wheezes are musical breath sounds, which usually imply an existing pulmonary obstruction, such as asthma and chronic obstructive pulmonary disease (COPD). Although many studies have addressed the problem of wheeze detection, a limited number of scientific works has focused on the analysis of wheeze characteristics, and in particular, their time-varying nonlinear characteristics. In this study, an effort is made to reveal and statistically analyze the nonlinear characteristics of wheezes and their evolution over time, as they are reflected in the quadratic phase coupling of their harmonics. To this end, the continuous wavelet transform (CWT) is used in combination with third-order spectra to define the analysis domain, where the nonlinear interactions of the harmonics of wheezes and their time variations are revealed by incorporating instantaneous wavelet bispectrum and bicoherence, which provide the instantaneous biamplitude and biphase curves. Based on this nonlinear information pool, a set of 23 features is proposed for the nonlinear analysis of wheezes. Two complementary perspectives, i.e., general and detailed, related to average performance and to localities, respectively, were used in the construction of the feature set, in order to embed trends and local behaviors, respectively, seen in the nonlinear interaction of the harmonic elements of wheezes over time. The proposed feature set was evaluated on a dataset of wheezes, acquired from adult patients with diagnosed asthma and COPD from a lung sound database. The statistical evaluation of the feature set revealed discrimination ability between the two pathologies for all data subgroupings. In particular, when the total breathing cycle was examined, all 23 features, but one, showed statistically significant difference between the COPD and asthma pathologies, whereas for the subgroupings of inspiratory and expiratory phases, 18 out of 23 and 22 out of 23 features exhibited discrimination power, respectively

  11. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than adults with Parkinson's when using the ISM variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization.
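
    A minimal sketch of the classical interdaily stability (IS) and intradaily variability (IV) computations for an hourly activity series is given below, using the standard definitions with p = 24 samples per day; the modified interval-dependent variants proposed in the paper (IVm, ISm) are not reproduced, and the activity data are simulated.

```python
# Minimal sketch of the classical nonparametric actigraphy variables IS and IV.
import numpy as np

def is_iv(activity_hourly, samples_per_day=24):
    x = np.asarray(activity_hourly, dtype=float)
    n = x.size
    xbar = x.mean()
    hourly_profile = x.reshape(-1, samples_per_day).mean(axis=0)   # average 24-h pattern
    IS = (n * np.sum((hourly_profile - xbar) ** 2)) / \
         (samples_per_day * np.sum((x - xbar) ** 2))
    IV = (n * np.sum(np.diff(x) ** 2)) / ((n - 1) * np.sum((x - xbar) ** 2))
    return IS, IV

rng = np.random.default_rng(10)
days = 14
template = np.concatenate([np.zeros(8), np.full(16, 100.0)])       # rest at night, active by day
activity = np.tile(template, days) + 10 * rng.standard_normal(days * 24)
IS, IV = is_iv(activity)
print(f"IS = {IS:.2f}, IV = {IV:.2f}")
```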

  12. Nonparametric methods in actigraphy: An update

    Science.gov (United States)

    Gonçalves, Bruno S.B.; Cavalcanti, Paula R.A.; Tavares, Gracilene R.; Campos, Tania F.; Araujo, John F.

    2014-01-01

    Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability and intradaily variability, to describe rest activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by modifying the time intervals of analysis. For each variable, we calculated the average value (IVm and ISm) results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using an IVm variable, while the variable IV60 was not identified. Rhythmic synchronization of activity and rest was significantly higher in young than adults with Parkinson's when using the ISM variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization. PMID:26483921

  13. Synthesis, spectral, computational and thermal analysis studies of metallocefotaxime antibiotics.

    Science.gov (United States)

    Masoud, Mamdouh S; Ali, Alaa E; Elasala, Gehan S

    2015-01-01

    Cefotaxime metal complexes of Cr(III), Mn(II), Fe(III), Co(II), Ni(II), Cu(II), Zn(II), Cd(II), Hg(II) and two mixed-metal complexes of (Fe,Cu) and (Fe,Ni) were synthesized and characterized by elemental analysis, IR, electronic spectra, magnetic susceptibility and ESR spectra. The studies proved that cefotaxime may act as a mono-, bi-, tri- or tetra-dentate ligand through the oxygen atoms of the lactam carbonyl, carboxylic or amide carbonyl groups and the nitrogen atom of the thiazole ring. From the magnetic measurements and electronic spectral data, octahedral structures were proposed for all complexes. Quantum chemical methods have been performed for cefotaxime to calculate charges, bond lengths, bond angles, dihedral angles, electronegativity (χ), chemical potential (μ), global hardness (η), softness (σ) and the electrophilicity index (ω). The thermal decomposition of the prepared metal complexes was studied by TGA, DTA and DSC techniques. Thermogravimetric studies revealed the presence of lattice or coordinated water molecules in all the prepared complexes. The decomposition mechanisms were suggested. The thermal decomposition of the complexes ended with the formation of metal oxides and carbon residue as a final product, except in the case of the Hg complex, where sublimation occurs in the temperature range 376.5-575.0 °C, so only carbon residue was produced during thermal decomposition. The orders of chemical reactions (n) were calculated via the peak symmetry method and the activation parameters were computed from the thermal decomposition data. The geometries of complexes may be converted from Oh to Td during the thermal decomposition steps.

  14. Estimating Financial Risk Measures for Futures Positions:A Non-Parametric Approach

    OpenAIRE

    Cotter, John; Dowd, Kevin

    2011-01-01

    This paper presents non-parametric estimates of spectral risk measures applied to long and short positions in 5 prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk (VaR) and Expected Shortfall (ES). The spectral risk measures are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level. Our findings indicate that all risk measures increase dramatically and the...

  15. Spectral image analysis of mutual illumination between fluorescent objects.

    Science.gov (United States)

    Tominaga, Shoji; Kato, Keiji; Hirai, Keita; Horiuchi, Takahiko

    2016-08-01

    This paper proposes a method for modeling and component estimation of the spectral images of the mutual illumination phenomenon between two fluorescent objects. First, we briefly describe the bispectral characteristics of a single fluorescent object, which are summarized as a Donaldson matrix. We suppose that two fluorescent objects with different bispectral characteristics are located close together under a uniform illumination. Second, we model the mutual illumination between two objects. It is shown that the spectral composition of the mutual illumination is summarized with four components: (1) diffuse reflection, (2) diffuse-diffuse interreflection, (3) fluorescent self-luminescence, and (4) interreflection by mutual fluorescent illumination. Third, we develop algorithms for estimating the spectral image components from the observed images influenced by the mutual illumination. When the exact Donaldson matrices caused by the mutual illumination influence are unknown, we have to solve a non-linear estimation problem to estimate both the spectral functions and the location weights. An iterative algorithm is then proposed to solve the problem based on the alternate estimation of the spectral functions and the location weights. In our experiments, the feasibility of the proposed method is shown in three cases: the known Donaldson matrices, weak interreflection, and strong interreflection.

  16. Estimation of sub-pixel water area on the Tibet Plateau using multiple endmember spectral mixture analysis from MODIS data

    Science.gov (United States)

    Cui, Qian; Shi, Jiancheng; Xu, Yuanliu

    2011-12-01

    Water is a basic need for human society and a determining factor in the stability of ecosystems. There are many lakes on the Tibet Plateau, which can lead to floods and mudslides when the water expands sharply. At present, water area is usually extracted from TM or SPOT data because of their high spatial resolution; however, their temporal resolution is insufficient. MODIS data have high temporal resolution and broad coverage, so they are a valuable resource for detecting changes in water area. Because of their low spatial resolution, mixed pixels are common. In this paper, four spectral libraries are built using the MOD09A1 product; based on these, water bodies are extracted at the sub-pixel level utilizing Multiple Endmember Spectral Mixture Analysis (MESMA) applied to the MODIS daily reflectance data MOD09GA. The unmixed result is compared with contemporaneous TM data, and it is shown that this method has high accuracy.
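
    As a rough illustration of the unmixing step, the sketch below (NumPy/SciPy assumed) solves, for one pixel and one candidate endmember model, a non-negative least-squares problem with a soft sum-to-one constraint, and then lets a MESMA-style loop keep the endmember combination with the lowest reconstruction RMSE. The function names, the one-spectrum-per-class model structure and the constraint weight are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import nnls
from itertools import product

def unmix(pixel, endmembers, w=1.0):
    """Non-negative unmixing with a soft sum-to-one constraint (augmented NNLS).
    `endmembers` has shape (n_endmembers, n_bands)."""
    A = np.vstack([endmembers.T, w * np.ones(endmembers.shape[0])])
    b = np.concatenate([pixel, [w]])
    fractions, _ = nnls(A, b)
    resid = pixel - endmembers.T @ fractions
    return fractions, np.sqrt(np.mean(resid ** 2))

def mesma(pixel, libraries):
    """Try every combination of one candidate spectrum per class library and
    keep the model (fractions, RMSE, endmembers) with the lowest RMSE."""
    best = None
    for combo in product(*libraries):
        fractions, rmse = unmix(pixel, np.vstack(combo))
        if best is None or rmse < best[1]:
            best = (fractions, rmse, combo)
    return best
```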

  17. Parametric and Non-Parametric System Modelling

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg

    1999-01-01

    ... the focus is on combinations of parametric and non-parametric methods of regression. This combination can be in terms of additive models where e.g. one or more non-parametric terms are added to a linear regression model. It can also be in terms of conditional parametric models where the coefficients ... considered. It is shown that adaptive estimation in conditional parametric models can be performed by combining the well-known methods of local polynomial regression and recursive least squares with exponential forgetting. The approach used for estimation in conditional parametric models also highlights how ... For this purpose non-parametric methods together with additive models are suggested. Also, a new approach specifically designed to detect non-linearities is introduced. Confidence intervals are constructed by use of bootstrapping. As a link between non-parametric and parametric methods a paper dealing with neural ...

  18. Rotating shadowband radiometer development and analysis of spectral shortwave data

    Energy Technology Data Exchange (ETDEWEB)

    Michalsky, J.; Harrison, L.; Min, Q. [State Univ. of New York, Albany, NY (United States)] [and others]

    1996-04-01

    Our goals in the Atmospheric Radiation Measurement (ARM) Program are improved measurements of spectral shortwave radiation and improved techniques for the retrieval of climatologically sensitive parameters. The multifilter rotating shadowband radiometer (MFRSR) that was developed during the first years of the ARM program has become a workhorse at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, and it is widely deployed in other climate programs. We have spent most of our effort this year developing techniques to retrieve column aerosol, water vapor, and ozone from direct beam spectral measurements of the MFRSR. Additionally, we have had some success in calculating shortwave surface diffuse spectral irradiance. Using the surface albedo and the global irradiance, we have calculated cloud optical depths. From cloud optical depth and liquid water measured with the microwave radiometer, we have calculated effective liquid cloud particle radii. The rest of the text will provide some detail regarding each of these efforts.

  19. Perturbative Analysis of Spectral Singularities and Their Optical Realizations

    CERN Document Server

    Mostafazadeh, Ali

    2012-01-01

    We develop a perturbative method of computing spectral singularities of a Schrödinger operator defined by a general complex potential that vanishes outside a closed interval. These can be realized as zero-width resonances in optical gain media and correspond to a lasing effect that occurs at the threshold gain. Their time-reversed copies yield coherent perfect absorption of light, also known as an antilaser. We use our general results to establish the exactness of the n-th order perturbation theory for an arbitrary complex potential consisting of n delta-functions, obtain an exact expression for the transfer matrix of these potentials, and examine spectral singularities of complex barrier potentials of arbitrary shape. In the context of optical spectral singularities, these correspond to inhomogeneous gain media.

  20. Inverse spectral analysis for singular differential operators with matrix coefficients

    Directory of Open Access Journals (Sweden)

    Nour el Houda Mahmoud

    2006-02-01

    Full Text Available Let $L_\alpha$ be the Bessel operator with matrix coefficients defined on $(0,\infty)$ by $$ L_\alpha U(t) = U''(t) + \frac{I/4-\alpha^2}{t^2}\,U(t), $$ where $\alpha$ is a fixed diagonal matrix. The aim of this study is to determine, on the positive half axis, a singular second-order differential operator of the form $L_\alpha+Q$ and its various properties from only its spectral characteristics. Here $Q$ is a matrix-valued function. Under suitable circumstances, the solution is constructed by means of the spectral function, with the help of the Gelfand-Levitan process. The hypotheses on the spectral function are inspired by the results of some direct problems. The resolution of Fredholm equations and properties of Fourier-Bessel transforms are also used here.

  1. Spectral analysis of the turbulent mixing of two fluids

    Energy Technology Data Exchange (ETDEWEB)

    Steinkamp, M.J.

    1996-02-01

    The authors describe a spectral approach to the investigation of fluid instability, generalized turbulence, and the interpenetration of fluids across an interface. The technique also applies to a single fluid with large variations in density. Departures of fluctuating velocity components from the local mean are far subsonic, but the mean Mach number can be large. Validity of the description is demonstrated by comparisons with experiments on turbulent mixing due to the late stages of Rayleigh-Taylor instability, when the dynamics become approximately self-similar in response to a constant body force. Generic forms for anisotropic spectral structure are described and used as a basis for deriving spectrally integrated moment equations that can be incorporated into computer codes for scientific and engineering analyses.

  2. Systematic Spectral Lag Analysis of Swift Known-z GRBs

    Directory of Open Access Journals (Sweden)

    Yuta Kawakubo

    2015-01-01

    In gamma-ray bursts, hard photons tend to arrive earlier than soft photons. The lag-luminosity relation is the empirical relationship between the isotropic peak luminosity and the spectral lag. We calculated the spectral lags for 40 known-redshift GRBs observed by Swift, in addition to the previous 31 GRB samples. We confirmed that most of our samples follow the lag-luminosity relation. However, we noticed that there are some GRBs which show a significant scatter from the relation. We also confirm that the relationship between the break time and the luminosity of the X-ray afterglow (the so-called Dainotti relation) extends up to the lag-luminosity relation.

  3. Bayesian nonparametric duration model with censorship

    Directory of Open Access Journals (Sweden)

    Joseph Hakizamungu

    2007-10-01

    Full Text Available This paper is concerned with nonparametric i.i.d. duration models with censored observations, and we establish by a simple and unified approach the general structure of a Bayesian nonparametric estimator for a survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.

  4. Bootstrap Estimation for Nonparametric Efficiency Estimates

    OpenAIRE

    1995-01-01

    This paper develops a consistent bootstrap estimation procedure to obtain confidence intervals for nonparametric measures of productive efficiency. Although the methodology is illustrated in terms of technical efficiency measured by output distance functions, the technique can be easily extended to other consistent nonparametric frontier models. Variation in estimated efficiency scores is assumed to result from variation in empirical approximations to the true boundary of the production set. ...

  5. Least Squares Spectral Analysis and Its Application to Superconducting Gravimeter Data Analysis

    Institute of Scientific and Technical Information of China (English)

    YIN Hui; Spiros D. Pagiatakis

    2004-01-01

    Detection of a periodic signal hidden in noise is the goal of Superconducting Gravimeter (SG) data analysis. Due to spikes, gaps, datum shifts (offsets) and other disturbances, the traditional FFT method shows inherent limitations. Instead, least squares spectral analysis (LSSA) has shown itself to be more suitable than Fourier analysis for gappy, unequally spaced and unequally weighted data series in a variety of applications in geodesy and geophysics. This paper reviews the principle of LSSA and gives a possible strategy for the analysis of time series obtained from the Canadian Superconducting Gravimeter Installation (CGSI), with gaps, offsets, unequal sampling and decimation of the data, and unequally weighted data points.
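
    As an illustration of the least-squares approach to spectra of gappy, unevenly spaced records, the sketch below fits a sinusoid plus a constant at each trial frequency by ordinary least squares and reports the fraction of variance explained. This is only the unweighted core of LSSA; unequally weighted observations would call for a weighted fit, and the function name and normalisation are assumptions made for illustration.

```python
import numpy as np

def ls_spectrum(t, y, freqs):
    """Least-squares spectrum: at each trial frequency f, fit
    y ~ a*cos(2*pi*f*t) + b*sin(2*pi*f*t) + c and report the fraction of the
    series' variance explained by that fit.  Works for irregular sampling."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    total = np.sum((y - y.mean()) ** 2)
    power = np.empty(len(freqs))
    for k, f in enumerate(freqs):
        X = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t),
                             np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        power[k] = 1.0 - np.sum((y - X @ coef) ** 2) / total
    return power
```

    Gaps and spikes are handled simply by omitting the affected epochs from t and y before the fit.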

  6. Accelerometer and gyroscope based gait analysis using spectral analysis of patients with osteoarthritis of the knee.

    Science.gov (United States)

    Staab, Wieland; Hottowitz, Ralf; Sohns, Christian; Sohns, Jan Martin; Gilbert, Fabian; Menke, Jan; Niklas, Andree; Lotz, Joachim

    2014-07-01

    [Purpose] A wide variety of accelerometer tools are used to estimate human movement, but there are no adequate data relating to gait symmetry parameters in the context of knee osteoarthritis. This study's purpose was to evaluate a 3D-kinematic system using body-mounted sensors (gyroscopes and accelerometers) on the trunk and limbs. This is the first study to use spectral analysis for data post processing. [Subjects] Twelve patients with unilateral knee osteoarthritis (OA) (10 male) and seven age-matched controls (6 male) were studied. [Methods] Measurements with 3-D accelerometers and gyroscopes were compared to video analysis with marker positions tracked by a six-camera optoelectronic system (VICON 460, Oxford Metrics). Data were recorded using the 3D-kinematic system. [Results] The results of both gait analysis systems were significantly correlated. Five parameters were significantly different between the knee OA and control groups. To overcome time spent in expensive post-processing routines, spectral analysis was performed for fast differentiation between normal gait and pathological gait signals using the 3D-kinematic system. [Conclusions] The 3D-kinematic system is objective, inexpensive, accurate and portable, and allows long-term recordings in clinical, sport as well as ergonomic or functional capacity evaluation (FCE) settings. For fast post-processing, spectral analysis of the recorded data is recommended.

  7. [The linearity analysis of ultrahigh temperature FTIR spectral emissivity measurement system].

    Science.gov (United States)

    Wang, Zong-wei; Dai, Jing-min; He, Xiao-wa; Yang, Chun-ling

    2012-02-01

    To study the thermal radiation properties of special materials at high temperature in the aerospace field, an ultrahigh temperature spectral emissivity measurement system based on a Fourier spectrometer has been established. The linearity of the system is what guarantees the emissivity measurement precision. By measuring the spectral radiation signals of a blackbody source at different temperatures, the functional relations between spectral signal values and blackbody spectral radiance at every spectral point were calculated with a multi-temperature, multi-spectrum linear fitting method. The spectral radiation signals of the blackbody were measured between 1 000 degrees C and 2 000 degrees C in the spectral region from 3 to 20 microm. The linear relations between the spectral signal and the theoretical line at a wavelength of 4 microm were calculated and are presented. The spectral response is good between 4 and 18 microm; the spectral linearity errors are less than 1% except in the strong CO2 absorption regions. The results show that, for given errors between the measured spectral radiation and the linear fitting theoretical lines, the higher the temperature, the smaller the resulting errors in emissivity. The linearity analysis of the spectral response helps eliminate errors caused by the disturbance of individual temperature points to the spectra.

  8. Spectral and Temporal Laser Fluorescence Analysis Such as for Natural Aquatic Environments

    Science.gov (United States)

    Chekalyuk, Alexander (Inventor)

    2015-01-01

    An Advanced Laser Fluorometer (ALF) can combine spectrally and temporally resolved measurements of laser-stimulated emission (LSE) for characterization of dissolved and particulate matter, including fluorescence constituents, in liquids. Spectral deconvolution (SDC) analysis of LSE spectral measurements can accurately retrieve information about individual fluorescent bands, such as can be attributed to chlorophyll-a (Chl-a), phycobiliprotein (PBP) pigments, or chromophoric dissolved organic matter (CDOM), among others. Improved physiological assessments of photosynthesizing organisms can use SDC analysis and temporal LSE measurements to assess variable fluorescence corrected for SDC-retrieved background fluorescence. Fluorescence assessments of Chl-a concentration based on LSE spectral measurements can be improved using photo-physiological information from temporal measurements. Quantitative assessments of PBP pigments, CDOM, and other fluorescent constituents, as well as basic structural characterizations of photosynthesizing populations, can be performed using SDC analysis of LSE spectral measurements.

  9. Phasor analysis of multiphoton spectral images distinguishes autofluorescence components of in vivo human skin.

    Science.gov (United States)

    Fereidouni, Farzad; Bader, Arjen N; Colonna, Anne; Gerritsen, Hans C

    2014-08-01

    Skin contains many autofluorescent components that can be studied using spectral imaging. We employed a spectral phasor method to analyse two photon excited autofluorescence and second harmonic generation images of in vivo human skin. This method allows segmentation of images based on spectral features. Various structures in the skin could be distinguished, including Stratum Corneum, epidermal cells and dermis. The spectral phasor analysis allowed investigation of their fluorescence composition and identification of signals from NADH, keratin, FAD, melanin, collagen and elastin. Interestingly, two populations of epidermal cells could be distinguished with different melanin content.
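
    The spectral phasor transform referred to above is, in its usual form, the first Fourier coefficient of each pixel's emission spectrum normalised by the total intensity, giving a coordinate pair (G, S) in which pixels with similar spectra cluster together and can be segmented. A minimal per-spectrum sketch follows (NumPy assumed; the harmonic number and function name are illustrative).

```python
import numpy as np

def spectral_phasor(spectrum, harmonic=1):
    """Map one emission spectrum to phasor coordinates (G, S): the real and
    imaginary parts of its n-th Fourier coefficient, normalised by total intensity."""
    spectrum = np.asarray(spectrum, dtype=float)
    k = np.arange(len(spectrum))
    phase = 2 * np.pi * harmonic * k / len(spectrum)
    total = spectrum.sum()
    g = np.sum(spectrum * np.cos(phase)) / total
    s = np.sum(spectrum * np.sin(phase)) / total
    return g, s
```

    Applied pixel-by-pixel to a spectral image cube, the resulting (G, S) scatter plot is what allows segmentation of structures such as Stratum Corneum, epidermal cells and dermis.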

  10. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain the range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.

  11. Test and analysis of spectral response for UV image intensifier

    Science.gov (United States)

    Qian, Yunsheng; Liu, Jian; Feng, Cheng; Lv, Yang; Zhang, Yijun

    2015-10-01

    The UV image intensifier is a kind of vacuum electronic imaging device based on the principle of photoelectronic imaging. To achieve solar-blind detection, knowledge of its spectral response characteristic is essential. A broad-spectrum response measurement system was developed. This instrument uses an EQ-99 laser-driven light source to obtain a broad spectrum in the range of 200 nm to 1700 nm. A special preamplifier as well as test software was developed, and the spectral response of an image intensifier can be tested over the range of 200-1700 nm. Using this spectral response measuring instrument, UV image intensifiers were tested and their spectral responses in the range of 200 nm to 600 nm were obtained. Because the quantum efficiency of the Te-Cs photocathode used in the image intensifier still exists above the 280 nm wavelength, especially from 280 nm to 320 nm, high-performance UV filters are required for solar-blind UV detection. Based on two sets of UV filters, the influence of solar radiation on solar-blind detection is calculated and analyzed.

  12. Two-body threshold spectral analysis, the critical case

    DEFF Research Database (Denmark)

    Skibsted, Erik; Wang, Xue Ping

    We study in dimension $d\geq 2$ low-energy spectral and scattering asymptotics for two-body $d$-dimensional Schrödinger operators with a radially symmetric potential falling off like $-\gamma r^{-2},\;\gamma>0$. We consider angular momentum sectors, labelled by $l=0,1,\dots$, for which $\gamma...

  13. Prostate cancer spectral multifeature analysis using TRUS images.

    Science.gov (United States)

    Mohamed, S S; Salama, M A

    2008-04-01

    This paper focuses on extracting and analyzing different spectral features from transrectal ultrasound (TRUS) images for prostate cancer recognition. First, the information about the images' frequency domain features and spatial domain features are combined using a Gabor filter and then integrated with the expert radiologist's information to identify the highly suspicious regions of interest (ROIs). The next stage of the proposed algorithm is to scan each identified region in order to generate the corresponding 1-D signal that represents each region. For each ROI, possible spectral feature sets are constructed using different new geometrical features extracted from the power spectrum density (PSD) of each region's signal. Next, a classifier-based algorithm for feature selection using particle swarm optimization (PSO) is adopted and used to select the optimal feature subset from the constructed feature sets. A new spectral feature set for the TRUS images using estimation of signal parameters via rotational invariance technique (ESPRIT) is also constructed, and its ability to represent tissue texture is compared to the PSD-based spectral feature sets using the support vector machines (SVMs) classifier. The accuracy obtained ranges from 72.2% to 94.4%, with the best accuracy achieved by the ESPRIT feature set.

  14. Time-resolved spectral analysis of Radachlorin luminescence in water

    Science.gov (United States)

    Belik, V. P.; Gadzhiev, I. M.; Semenova, I. V.; Vasyutinskii, O. S.

    2017-05-01

    We report results of a spectrally and time-resolved study of Radachlorin photosensitizer luminescence in water in the spectral range of 950-1350 nm, aimed at determination of the photosensitizer triplet-state and singlet-oxygen lifetimes responsible for singlet oxygen generation and degradation. At any wavelength within the explored spectral range the luminescence decay contained two major contributions: a fast decay on the ns time scale and a slow evolution on the μs time scale. The fast decay was attributed to electric dipole fluorescence transitions in photosensitizer molecules and the slow evolution to intercombination phosphorescence transitions in singlet oxygen and photosensitizer molecules. The relatively high-amplitude ns peak observed at all wavelengths suggests that singlet oxygen monitoring with spectral isolation methods alone, without additional temporal resolution, can be controversial. Under the applied experimental conditions the total phosphorescence signal at any wavelength contained a contribution from the photosensitizer triplet-state decay, while at 1274 nm the singlet oxygen phosphorescence dominated. The results obtained can be used for optimization of methods of singlet oxygen monitoring and imaging.

  15. Detecting gallbladders in chicken livers using spectral analysis

    DEFF Research Database (Denmark)

    Jørgensen, Anders; Mølvig Jensen, Eigil; Moeslund, Thomas B.

    2015-01-01

    This paper presents a method for detecting gallbladders attached to chicken livers using spectral imaging. Gallbladders can contaminate good livers, making them unfit for human consumption. A data set consisting of chicken livers with and without gallbladders, has been captured using 33 wavelengths

  16. Detecting gallbladders in chicken livers using spectral analysis

    DEFF Research Database (Denmark)

    Jørgensen, Anders; Mølvig Jensen, Eigil; Moeslund, Thomas B.

    2015-01-01

    This paper presents a method for detecting gallbladders attached to chicken livers using spectral imaging. Gallbladders can contaminate good livers, making them unfit for human consumption. A data set consisting of chicken livers with and without gallbladders, has been captured using 33 wavelengths...

  17. Ultra-wideband spectral analysis using S2 technology

    Energy Technology Data Exchange (ETDEWEB)

    Krishna Mohan, R. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States)]. E-mail: krishna@spectrum.montana.edu; Chang, T. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Tian, M. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Bekker, S. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Olson, A. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Ostrander, C. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Khallaayoun, A. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Dollinger, C. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Babbitt, W.R. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Cole, Z. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); S2 Corporation, Bozeman, MT 59718 (United States); Reibel, R.R. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); S2 Corporation, Bozeman, MT 59718 (United States); Merkel, K.D. [Spectrum Lab, Montana State University, Bozeman, MT 59717 (United States); S2 Corporation, Bozeman, MT 59718 (United States); Sun, Y. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Cone, R. [Department of Physics, Montana State University, Bozeman, MT 59717 (United States); Schlottau, F. [University of Colorado, Boulder, CO 80309 (United States); Wagner, K.H. [University of Colorado, Boulder, CO 80309 (United States)

    2007-11-15

    This paper outlines the efforts to develop an ultra-wideband spectrum analyzer that takes advantage of the broad spectral response and fine spectral resolution (~25 kHz) of spatial-spectral (S2) materials. The S2 material can process the full spectrum of broadband microwave transmissions, with adjustable time apertures (down to 100 μs) and fast update rates (up to 1 kHz). A cryogenically cooled Tm:YAG crystal that operates on microwave signals modulated onto a stabilized optical carrier at 793 nm is used as the core for the spectrum analyzer. Efforts to develop novel component technologies that enhance the performance of the system and meet the application requirements are discussed, including an end-to-end device model for parameter optimization. We discuss the characterization of new ultra-wide bandwidth S2 materials. Detection and post-processing module development, including the implementation of a novel spectral recovery algorithm using field programmable gate array (FPGA) technology, is also discussed.

  18. Statistical Analysis of Spectral Properties and Prosodic Parameters of Emotional Speech

    Science.gov (United States)

    Přibil, J.; Přibilová, A.

    2009-01-01

    The paper addresses reflection of microintonation and spectral properties in male and female acted emotional speech. Microintonation component of speech melody is analyzed regarding its spectral and statistical parameters. According to psychological research of emotional speech, different emotions are accompanied by different spectral noise. We control its amount by spectral flatness according to which the high frequency noise is mixed in voiced frames during cepstral speech synthesis. Our experiments are aimed at statistical analysis of cepstral coefficient values and ranges of spectral flatness in three emotions (joy, sadness, anger), and a neutral state for comparison. Calculated histograms of spectral flatness distribution are visually compared and modelled by Gamma probability distribution. Histograms of cepstral coefficient distribution are evaluated and compared using skewness and kurtosis. Achieved statistical results show good correlation comparing male and female voices for all emotional states portrayed by several Czech and Slovak professional actors.
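
    Spectral flatness, used above to control the amount of high-frequency noise, is conventionally the ratio of the geometric mean to the arithmetic mean of a frame's power spectrum. A minimal sketch of that definition follows (NumPy assumed); how the flatness value then governs the noise mixed into voiced frames during cepstral synthesis is specific to the authors' system and is not shown.

```python
import numpy as np

def spectral_flatness(frame, eps=1e-12):
    """Spectral flatness of one speech frame: geometric mean of the power
    spectrum divided by its arithmetic mean (near 1 = noise-like, near 0 = tonal)."""
    psd = np.abs(np.fft.rfft(frame)) ** 2 + eps
    return np.exp(np.mean(np.log(psd))) / np.mean(psd)
```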

  19. Practical Aspects of the Spectral Analysis of Irregularly Sampled Data With Time-Series Models

    NARCIS (Netherlands)

    Broersen, P.M.T.

    2009-01-01

    Several algorithms for the spectral analysis of irregularly sampled random processes can estimate the spectral density for a low frequency range. A new time-series method extended that frequency range with a factor of thousand or more. The new algorithm has two requirements to give useful results. F

  20. Investigating the fracture non-linear dynamics through multi-spectral time series analysis of fracture-induced electromagnetic emissions

    Science.gov (United States)

    Kalimeris, Anastasios; Potirakis, Stelios M.; Eftaxias, Konstantinos; Antonopoulos, George; Kopanas, John; Nomicos, Constantinos

    2013-04-01

    ... time-series excerpts. Then, each of them is studied through the aforementioned spectral decomposition methods. Following standard methodology we attempt to decompose each of the partial time-series into statistically significant non-linear trends, oscillatory modes and noise, by testing each spectral component against red noise and alternatively against a locally white noise background. Monte Carlo simulations of AR(1) red-noise processes are also utilized in this analysis phase, and finally the significant temporal empirical orthogonal functions (T-EOFs) and the associated temporal principal components (T-PCs) are specified. In this way, a non-parametric (and non-empirical) signal-to-noise resolution is achieved and the reconstruction of the statistically significant signal can be realized by adding the associated Reconstructed Components (RCs). By applying the aforementioned technique we reveal the significant signal characteristics for each period and try to interpret the underlying dynamics. Given the statistically significant reconstructed signal for each partial time-series we further attempt a comparison between their morphologies (also using a spectral cross-correlation study) in order to detect similarities and differences, mostly between the quiet and the active periods, that could signify the pre-seismic activity.
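
    The decomposition into temporal EOFs, principal components and reconstructed components described above is, in its basic form, singular spectrum analysis. The sketch below shows that basic machinery (trajectory-matrix embedding, SVD, reconstruction by diagonal averaging) under assumed names; the Monte Carlo significance test against AR(1) red-noise surrogates would compare the data's singular values with those of simulated AR(1) series and is not reproduced here.

```python
import numpy as np

def ssa_reconstruct(x, window, components):
    """Basic singular spectrum analysis: embed the series in a trajectory matrix,
    take its SVD (temporal EOFs / PCs), and rebuild the series from the selected
    components by diagonal averaging."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])   # window x k
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    elem = sum(s[j] * np.outer(U[:, j], Vt[j]) for j in components)
    rc = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):            # diagonal averaging over anti-diagonals
        for j in range(k):
            rc[i + j] += elem[i, j]
            counts[i + j] += 1
    return rc / counts
```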

  1. Nonparametric estimation for hazard rate monotonously decreasing system

    Institute of Scientific and Technical Information of China (English)

    Han Fengyan; Li Weisong

    2005-01-01

    Estimation of density and hazard rate is very important to the reliability analysis of a system. In order to estimate the density and hazard rate of a hazard rate monotonously decreasing system, a new nonparametric estimator is put forward. The estimator is based on the kernel function method and optimum algorithm. Numerical experiment shows that the method is accurate enough and can be used in many cases.

  2. Non-parametric versus parametric methods in environmental sciences

    Directory of Open Access Journals (Sweden)

    Muhammad Riaz

    2016-01-01

    Full Text Available This report intends to highlight the importance of considering the background assumptions required for the analysis of real datasets in different disciplines. We provide a comparative discussion of parametric methods (which depend on distributional assumptions such as normality) relative to non-parametric methods (which are free from many distributional assumptions). We have chosen a real dataset from the environmental sciences (one of the application areas). The findings may be extended to other disciplines in the same spirit.
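
    As a small, self-contained illustration of the parametric versus non-parametric contrast discussed above (with synthetic skewed data standing in for the report's environmental dataset, which is not reproduced here), the snippet below runs a normality check and a t-test alongside the rank-based Mann-Whitney test using SciPy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two skewed (log-normal) samples standing in for, e.g., a pollutant
# concentration measured at two sites.
site_a = rng.lognormal(mean=1.0, sigma=0.8, size=30)
site_b = rng.lognormal(mean=1.4, sigma=0.8, size=30)

# Parametric route: check the normality assumption, then a two-sample t-test.
print(stats.shapiro(site_a))
print(stats.ttest_ind(site_a, site_b))

# Non-parametric route: rank-based test, free of the normality assumption.
print(stats.mannwhitneyu(site_a, site_b, alternative="two-sided"))
```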

  3. Spectral compression algorithms for the analysis of very large multivariate images

    Science.gov (United States)

    Keenan, Michael R.

    2007-10-16

    A method for spectrally compressing data sets enables the efficient analysis of very large multivariate images. The spectral compression algorithm uses a factored representation of the data that can be obtained from Principal Components Analysis or other factorization technique. Furthermore, a block algorithm can be used for performing common operations more efficiently. An image analysis can be performed on the factored representation of the data, using only the most significant factors. The spectral compression algorithm can be combined with a spatial compression algorithm to provide further computational efficiencies.
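
    A minimal sketch of the factored representation that underlies such spectral compression is given below: the image cube is unfolded to a (pixels x bands) matrix and approximated by a truncated SVD, so that further analysis can work on the leading spatial scores and spectral loadings only. The block algorithm and the specifics of the patented method are not reproduced, and all names here are illustrative.

```python
import numpy as np

def compress(cube, n_factors):
    """Factor a hyperspectral cube (rows, cols, bands) into spatial scores and
    spectral loadings via a truncated SVD, keeping only the leading factors."""
    rows, cols, bands = cube.shape
    X = cube.reshape(rows * cols, bands)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    scores = U[:, :n_factors] * s[:n_factors]        # (pixels, k) spatial part
    loadings = Vt[:n_factors]                        # (k, bands) spectral part
    return scores.reshape(rows, cols, n_factors), loadings, mean

def reconstruct(scores, loadings, mean):
    """Approximate the original cube from the retained factors."""
    rows, cols, k = scores.shape
    return (scores.reshape(rows * cols, k) @ loadings + mean).reshape(rows, cols, -1)
```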

  4. Analysis of the spectral response of flourishing-withering vegetation changes based on ground spectral measurements

    Institute of Scientific and Technical Information of China (English)

    Guli·Japper; CHEN Xi; ZHAO Jin; MA ZhongGuo; CHANG Cun; ZHANG XueRen

    2007-01-01

    A structural mode was used to characterize vegetation composition at the plant leaf level and a flourishing-withering ratio was developed. The spectral responses of vegetation with different flourishing-withering ratios were analyzed, the change rates of the chlorophyll and moisture content indices of vegetation with different flourishing-withering ratios were compared, and correlations between the chlorophyll and moisture content indices were analyzed. The results reveal that leaves with an intermediate flourishing-withering ratio can increase the absorption signatures of vegetation and that band ranges of 570-700 nm and 1300-1540 nm can play a role in indicating changes in the flourishing-withering ratios of vegetation; NPQI, NPCI, R695/R420, R695/R760, R750/R700, the peak-value area of the red edge, the red edge amplitude, the ratio between the red edge amplitude and the minimum amplitude, and the NDVI of vegetation change regularly with the change in flourishing-withering ratios, and these nine vegetation indices are highly related to the chlorophyll content. The vegetation indices NDWI and PRI are very sensitive to the flourishing-withering change in vegetation and are closely related to the moisture content, with a correlation coefficient higher than 0.9. The derivative of the spectra is more effective in describing changes in the structural mode of vegetation with different flourishing-withering ratios, especially in the band ranges of 552-628 nm and 630-686 nm, and it is more sensitive to the mixed flourishing-withering ratios of leaves than are the vegetation indices. The red edge position in the spectrum is highly related to the chlorophyll content and is not sensitive to changes in the structural mode of mixed flourishing-withering leaves. The red edge parameters are sensitive to changes in the flourishing-withering ratio at the peak-value area of the red edge amplitude and the ratio between the red edge amplitude and the

  5. Photon propagation function: spectral analysis of its asymptotic form.

    Science.gov (United States)

    Schwinger, J

    1974-08-01

    The physical attitudes of source theory, displacing those of renormalized, perturbative, operator field theory, are used in a simple discussion of the asymptotic behavior of the photon propagation function. A guiding principle is the elementary consistency requirement that, under circumstances where a physical parameter cannot be accurately measured, no sensitivity to its precise value can enter the description of those circumstances. The mathematical tool is the spectral representation of the propagation function, supplemented by an equivalent phase representation. The Gell-Mann-Low equation is recovered, but with their function now interpreted physically as the spectral weight function. A crude inequality is established for the latter, which helps in interpolating between the initial rising behavior and the ultimate zero at infinite mass. There is a brief discussion of the aggressive source theory viewpoint that denies the existence of a "bare charge".

  6. Multivariate Spectral Analysis to Extract Materials from Multispectral Data

    Science.gov (United States)

    1993-09-01

    ... to be agricultural fields. Their spectral behavior ... can be understood by referring to Appendix A ... Using the 20 training classes ...

  7. Perturbative analysis of spectral singularities and their optical realizations

    OpenAIRE

    Mostafazadeh, Ali; Rostamzadeh, Saber

    2012-01-01

    We develop a perturbative method of computing spectral singularities of a Schrodinger operator defined by a general complex potential that vanishes outside a closed interval. These can be realized as zero-width resonances in optical gain media and correspond to a lasing effect that occurs at the threshold gain. Their time-reversed copies yield coherent perfect absorption of light that is also known as antilasing. We use our general results to establish the exactness of the nth-order perturbat...

  8. Spectral analysis of dike-induced earthquakes in Afar, Ethiopia

    Science.gov (United States)

    Tepp, Gabrielle; Ebinger, Cynthia J.; Yun, Sang-Ho

    2016-04-01

    Shallow dike intrusions may be accompanied by fault slip above the dikes, a superposition which complicates seismic and geodetic data analyses. The diverse volcano-tectonic and low-frequency local earthquakes accompanying the 2005-2010 large-volume dike intrusions in the Dabbahu-Manda Hararo rift (Afar), some with fault displacements of up to 3 m at the surface, provide an opportunity to examine the relations among the earthquakes, dike intrusions, and surface ruptures. We apply the frequency index (FI) method to characterize the spectra of swarm earthquakes from six of the dikes. These earthquakes often have broad spectra with multiple peaks, making the usual peak frequency classification method unreliable. Our results show a general bimodal character with high FI earthquakes associated with deeper dikes (top > 3 km subsurface) and low FI earthquakes associated with shallow dikes, indicating that shallow dikes result in earthquakes with more low-frequency content and larger-amplitude surface waves. Low FI earthquakes are more common during dike emplacement, suggesting that interactions between the dike and faults may lead to lower FI. Taken together, likely source processes for low FI earthquakes are shallow hypocenters (<3 km) possibly with surface rupture, slow rupture velocities, and interactions with dike fluids. Strong site effects also heavily influence the earthquake spectral content. Additionally, our results suggest a continuum of spectral responses, implying either that impulsive volcano-tectonic earthquakes and the unusual, emergent earthquakes have similar source processes or that simple spectral analyses, such as FI, cannot distinguish different source processes.
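
    The frequency index (FI) used above is, in its usual formulation, the base-10 logarithm of the ratio of the mean spectral amplitude in an upper frequency band to that in a lower band, so that low-frequency-rich events have negative FI. A minimal sketch follows; the band edges shown are placeholders rather than the values used in this study.

```python
import numpy as np

def frequency_index(waveform, fs, low=(1.0, 2.0), high=(10.0, 20.0)):
    """FI = log10(mean spectral amplitude in the upper band / mean amplitude in
    the lower band).  Negative values indicate low-frequency-rich earthquakes."""
    spec = np.abs(np.fft.rfft(waveform))
    f = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
    a_low = spec[(f >= low[0]) & (f <= low[1])].mean()
    a_high = spec[(f >= high[0]) & (f <= high[1])].mean()
    return np.log10(a_high / a_low)
```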

  9. Data Adaptive Spectral Analysis of Unsteady Leakage Flow in an Axial Turbine

    Directory of Open Access Journals (Sweden)

    Konstantinos G. Barmpalias

    2012-01-01

    Full Text Available A data adaptive spectral analysis method is applied to characterize the unsteady loss generation in the leakage flow of an axial turbine. Unlike conventional spectral analysis, this method adapts a model dataset to the actual data. The method is illustrated from the analysis of the unsteady wall pressures in the labyrinth seal of an axial turbine. Spectra from the method are shown to be in good agreement with conventional spectral estimates. Furthermore, the spectra using the method are obtained with data records that are 16 times shorter than for conventional spectral analysis, indicating that the unsteady processes in turbomachines can be studied with substantially shorter measurement schedules than is presently the norm.

  10. Spectral analysis of heart rate and blood pressure variability in primary Sjogren's syndrome

    NARCIS (Netherlands)

    P.J. Barendregt (Pieternella); J.H.M. Tulen (Joke); A.H. van den Meiracker (Anton); H.M. Markusse

    2002-01-01

    textabstractBACKGROUND: Autonomic dysfunction has been described in primary Sjogren's syndrome (SS). OBJECTIVE: To investigate the circulatory autonomic regulation in patients with primary SS by power spectral analysis of heart rate and blood pressure variability. METHODS: Forty th

  11. Spectral analysis of heart rate and blood pressure variability in primary Sjogren's syndrome

    NARCIS (Netherlands)

    P.J. Barendregt (Pieternella); J.H.M. Tulen (Joke); A.H. van den Meiracker (Anton); H.M. Markusse

    2002-01-01

    textabstractBACKGROUND: Autonomic dysfunction has been described in primary Sjogren's syndrome (SS). OBJECTIVE: To investigate the circulatory autonomic regulation in patients with primary SS by power spectral analysis of heart rate and blood pressure variability. METHODS: Forty th

  12. Why preferring parametric forecasting to nonparametric methods?

    Science.gov (United States)

    Jabot, Franck

    2015-05-07

    A recent series of papers by Charles T. Perretti and collaborators has shown that nonparametric forecasting methods can outperform parametric methods in noisy nonlinear systems. Such a situation can arise for two main reasons: the instability of parametric inference procedures in chaotic systems, which can lead to biased parameter estimates, and the discrepancy between the real system dynamics and the modeled one, a problem that Perretti and collaborators call "the true model myth". Should ecologists go on using the demanding parametric machinery when trying to forecast the dynamics of complex ecosystems? Or should they rely on the elegant nonparametric approach that appears so promising? It is argued here that ecological forecasting based on parametric models presents two key comparative advantages over nonparametric approaches. First, the likelihood of parametric forecasting failure can be diagnosed thanks to simple Bayesian model checking procedures. Second, when parametric forecasting is diagnosed to be reliable, forecasting uncertainty can be estimated on virtual data generated with the parametric model fitted to the data. In contrast, nonparametric techniques provide forecasts with unknown reliability. This argumentation is illustrated with the simple theta-logistic model that was previously used by Perretti and collaborators to make their point. It should convince ecologists to stick to standard parametric approaches, until methods have been developed to assess the reliability of nonparametric forecasting. Copyright © 2015 Elsevier Ltd. All rights reserved.
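
    For concreteness, a stochastic theta-logistic model of the kind referred to above can be simulated as in the sketch below (a Ricker-type form with Gaussian noise on the log growth rate; the parameter names and the exact formulation used by Perretti and collaborators may differ). Parametric forecasting would fit (r, K, theta, sigma) to an observed series and then propagate uncertainty by re-simulating from the fitted model, which is the "virtual data" check described above.

```python
import numpy as np

def theta_logistic(n0, r, K, theta, sigma, steps, rng=None):
    """Simulate N[t+1] = N[t] * exp(r * (1 - (N[t]/K)**theta) + e_t),
    with e_t ~ Normal(0, sigma**2)."""
    rng = np.random.default_rng() if rng is None else rng
    n = np.empty(steps)
    n[0] = n0
    for t in range(steps - 1):
        n[t + 1] = n[t] * np.exp(r * (1.0 - (n[t] / K) ** theta)
                                 + rng.normal(0.0, sigma))
    return n

# Illustrative parameter values only.
series = theta_logistic(n0=10.0, r=1.5, K=100.0, theta=1.0, sigma=0.1, steps=50)
```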

  13. Statistical Analysis of the Spectral Density Estimate Obtained via Coifman Scaling Function

    OpenAIRE

    2007-01-01

    The spectral density, built as the Fourier transform of the covariance sequence of a stationary random process, determines the process characteristics and allows analysis of its structure. Thus, one of the main problems in time series analysis is constructing consistent estimates of the spectral density from successive observations of a stationary random process taken at equal time intervals. This article is devoted to the investigation of problems dealing with the application of wavelet analysis...

  14. Spectral analysis of the electroencephalogram in the developing rat.

    Science.gov (United States)

    Bronzino, J D; Siok, C J; Austin, K; Austin-Lafrance, R J; Morgane, P J

    1987-10-01

    Power spectral measures of the EEG obtained from the frontal cortex and hippocampal formation during different vigilance states in the developing rat have been computed and compared. The most significant ontogenetic changes were observed in the hippocampal power spectra obtained during the vigilance state of REM sleep. These spectral analyses have revealed in the hippocampus: (1) a significant increase in the frequency at which the peak power occurs in the theta-frequency (4-11 Hz) band from 14 to 45 days of age; (2) a decrease in the quality factor of the peak from 14 to 45 days of age; (3) a decrease in the relative power co-ordinate for the center of spectral mass associated with the 0-4-Hz frequency band coupled with an increase in the frequency coordinate of the 4-11-Hz frequency band from 14 to 45 days of age, and; (4) a significant decrease in the average percent relative power associated with the 0-4-Hz frequency band from 14 to 22 days of age. For the EEG obtained from the frontal cortex, the major findings of note were: (1) a dominant contribution of relative power in the 0-4-Hz frequency band which was observed at every age and during every vigilance state tested, and; (2) a significant increase in the average percent relative power associated with this band at 18, 22, and 45 days of age. The results of this study provide a quantitative description of the electroencephalographic (EEG) ontogeny of the hippocampal formation and the frontal cortex in the rat. These ontogenetic changes in EEG activity relate closely to development of the internal circuitry and synaptic maturation in the hippocampal formation and frontal cortex.
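
    Relative power in bands such as 0-4 Hz and 4-11 Hz, as analysed above, can be computed from a Welch power spectrum as in the sketch below (SciPy assumed; the epoch length, windowing defaults and function name are assumptions rather than the authors' processing pipeline).

```python
import numpy as np
from scipy.signal import welch

def relative_band_power(epoch, fs, bands=None):
    """Welch PSD of one EEG epoch and the fraction of total power in each band."""
    if bands is None:
        bands = {"delta": (0.0, 4.0), "theta": (4.0, 11.0)}
    f, pxx = welch(epoch, fs=fs, nperseg=min(len(epoch), 2 * int(fs)))
    total = np.trapz(pxx, f)
    out = {}
    for name, (lo, hi) in bands.items():
        mask = (f >= lo) & (f < hi)
        out[name] = np.trapz(pxx[mask], f[mask]) / total
    return out
```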

  15. Single-sweep spectral analysis of contact heat evoked potentials

    DEFF Research Database (Denmark)

    Hansen, Tine M; Graversen, Carina; Frøkjaer, Jens B

    2015-01-01

    ... single-sweep characteristics to identify alterations induced by morphine. METHODS: In a crossover study, 15 single-sweep CHEPs were analyzed from 62 electroencephalography electrodes in 26 healthy volunteers before and after administration of morphine or placebo. Each sweep was decomposed by a continuous wavelet transform... to obtain normalized spectral indices in the delta (0.5-4 Hz), theta (4-8 Hz), alpha (8-12 Hz), beta (12-32 Hz) and gamma (32-80 Hz) bands. The average distribution over all sweeps and channels was calculated for the four recordings for each volunteer, and the two recordings before treatments were assessed...

  16. Spectral Analysis of Certain Schrödinger Operators

    Directory of Open Access Journals (Sweden)

    Mourad E.H. Ismail

    2012-09-01

    Full Text Available The J-matrix method is extended to difference and q-difference operators and is applied to several explicit differential, difference, q-difference and second order Askey-Wilson type operators. The spectrum and the spectral measures are discussed in each case and the corresponding eigenfunction expansion is written down explicitly in most cases. In some cases we encounter new orthogonal polynomials with explicit three term recurrence relations where nothing is known about their explicit representations or orthogonality measures. Each model we analyze is a discrete quantum mechanical model in the sense of Odake and Sasaki [J. Phys. A: Math. Theor. 44 (2011), 353001, 47 pages].

  17. Yield Stability of Maize Hybrids Evaluated in Maize Regional Trials in Southwestern China Using Nonparametric Methods

    Institute of Scientific and Technical Information of China (English)

    LIU Yong-jian; DUAN Chuan; TIAN Meng-liang; HU Er-liang; HUANG Yu-bi

    2010-01-01

    Analysis of multi-environment trials (METs) of crops for the evaluation and recommendation of varieties is an important issue in plant breeding research. Evaluating both stability of performance and high yield is essential in MET analyses. The objective of the present investigation was to compare 11 nonparametric stability statistics and apply nonparametric tests for genotype-by-environment interaction (GEI) to 14 maize (Zea mays L.) genotypes grown at 25 locations in southwestern China during 2005. Results of nonparametric tests of GEI and a combined ANOVA across locations showed both crossover and noncrossover GEI, and that genotypes varied highly significantly for yield. The results of principal component analysis, correlation analysis of nonparametric statistics, and yield indicated that the nonparametric statistics grouped into four distinct classes that corresponded to different agronomic and biological concepts of stability. Furthermore, high values of TOP and low values of rank-sum were associated with high mean yield, but the other nonparametric statistics were not positively correlated with mean yield. Therefore, only the rank-sum and TOP methods would be useful for simultaneous selection for high yield and stability. These two statistics recommended JY686 and HX 168 as desirable and ND 108, CM 12, CN36, and NK6661 as undesirable genotypes.
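
    Of the statistics mentioned, TOP and the rank-sum are straightforward to compute from a genotype-by-environment yield table. The sketch below follows the usual definitions (TOP: the share of environments in which a genotype ranks in the top third; rank-sum: the rank of mean yield plus the rank of a stability measure), but uses the across-environment variance as a simple stand-in for Shukla's stability variance, so it is an illustration rather than the exact statistic used in the paper.

```python
import numpy as np

def top_statistic(yields):
    """Fox et al.'s TOP: for each genotype, the proportion of environments in
    which it ranks in the top third.  `yields` is (genotypes x environments)."""
    g, _ = yields.shape
    ranks = (-yields).argsort(axis=0).argsort(axis=0) + 1   # 1 = highest yield
    return (ranks <= np.ceil(g / 3)).mean(axis=1)

def rank_sum(yields):
    """Kang-style rank-sum: rank of mean yield plus rank of a stability measure
    (here simply the across-environment variance); smaller is more desirable."""
    mean_rank = (-yields.mean(axis=1)).argsort().argsort() + 1
    stab_rank = yields.var(axis=1).argsort().argsort() + 1
    return mean_rank + stab_rank
```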

  18. MR PRISM - Spectral Analysis Tool for the CRISM

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    We describe a computer application designed to analyze hyperspectral data collected by the Compact Infrared Spectrometer for Mars (CRISM). The application links the spectral, imaging and mapping perspectives on the eventual CRISM dataset by presenting the user with three different ways to analyze the data. One of the goals when developing this instrument is to build in the latest algorithms for detection of spectrally compelling targets on the surface of the Red Planet, so they may be available to the Planetary Science community without cost and with a minimal learning barrier to cross. This will allow the Astrobiology community to look for targets of interest such as hydrothermal minerals, sulfate minerals and hydrous minerals and be able to map the extent of these minerals using the most up-to-date and effective algorithms. The application is programmed in Java and will be made available for Windows, Mac and Linux platforms. Users will be able to embed Groovy scripts into the program in order to extend its ...

  19. Stellar and wind parameters of massive stars from spectral analysis

    Science.gov (United States)

    Araya, I.; Curé, M.

    2017-07-01

    The only way to deduce information from stars is to decode the radiation they emit in an appropriate way. Spectroscopy can do this and derive many properties of stars. In this work we seek to derive simultaneously the stellar and wind characteristics of A and B supergiant stars. Our stellar properties encompass the effective temperature, the surface gravity, the stellar radius, the micro-turbulence velocity, the rotational velocity and, finally, the chemical composition. For wind properties we consider the mass-loss rate, the terminal velocity and the line-force parameters (α, k and δ) obtained from the standard line-driven wind theory. To model the data we use the radiative transport code Fastwind, considering the newest hydrodynamical solutions derived with the Hydwind code, which needs stellar and line-force parameters to obtain a wind solution. A grid of spectral models of massive stars is created, and together with the observed spectra their physical properties are determined through spectral line fittings. These fittings provide an estimate of the line-force parameters, whose theoretical calculation is extremely complex. Furthermore, we expect to confirm that the hydrodynamical solutions obtained with a value of δ slightly larger than ˜ 0.25, called δ-slow solutions, describe quite reliably the radiation line-driven winds of A and late B supergiant stars and at the same time explain disagreements between observational data and theoretical models for the Wind-Momentum Luminosity Relationship (WLR).

  20. Generative Temporal Modelling of Neuroimaging - Decomposition and Nonparametric Testing

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff

    The goal of this thesis is to explore two improvements for functional magnetic resonance imaging (fMRI) analysis, namely our proposed decomposition method and an extension to the non-parametric testing framework. Analysis of fMRI allows researchers to investigate the functional processes ... of the brain, and provides insight into neuronal coupling during mental processes or tasks. The decomposition method is a Gaussian process-based independent components analysis (GPICA), which incorporates a temporal dependency in the sources. A hierarchical model specification is used, featuring both...

  1. Recent Advances and Trends in Nonparametric Statistics

    CERN Document Server

    Akritas, MG

    2003-01-01

    The advent of high-speed, affordable computers in the last two decades has given a new boost to the nonparametric way of thinking. Classical nonparametric procedures, such as function smoothing, suddenly lost their abstract flavour as they became practically implementable. In addition, many previously unthinkable possibilities became mainstream; prime examples include the bootstrap and resampling methods, wavelets and nonlinear smoothers, graphical methods, data mining, bioinformatics, as well as the more recent algorithmic approaches such as bagging and boosting. This volume is a collection o

  2. Correlated Non-Parametric Latent Feature Models

    CERN Document Server

    Doshi-Velez, Finale

    2012-01-01

    We are often interested in explaining data through a set of hidden factors or features. When the number of hidden features is unknown, the Indian Buffet Process (IBP) is a nonparametric latent feature model that does not bound the number of active features in a dataset. However, the IBP assumes that all latent features are uncorrelated, making it inadequate for many real-world problems. We introduce a framework for correlated nonparametric feature models, generalising the IBP. We use this framework to generate several specific models and demonstrate applications on real-world datasets.

  3. Nonparametric correlation models for portfolio allocation

    DEFF Research Database (Denmark)

    Aslanidis, Nektarios; Casas, Isabel

    2013-01-01

    This article proposes time-varying nonparametric and semiparametric estimators of the conditional cross-correlation matrix in the context of portfolio allocation. Simulations results show that the nonparametric and semiparametric models are best in DGPs with substantial variability or structural...... breaks in correlations. Only when correlations are constant does the parametric DCC model deliver the best outcome. The methodologies are illustrated by evaluating two interesting portfolios. The first portfolio consists of the equity sector SPDRs and the S&P 500, while the second one contains major...

  4. Military target detection using spectrally modeled algorithms and independent component analysis

    Science.gov (United States)

    Tiwari, Kailash Chandra; Arora, Manoj K.; Singh, Dharmendra; Yadav, Deepti

    2013-02-01

    Most military targets of strategic importance are very small in size. Though some of them may get spatially resolved, most cannot be detected due to lack of adequate spectral resolution. Hyperspectral data, acquired over hundreds of narrow contiguous wavelength bands, are extremely suitable for most military target detection applications. Target detection, however, still remains complicated due to a host of other issues. These include, first, the heavy volume of hyperspectral data, which leads to computational complexity; second, most materials in nature exhibit spectral variability and remain unpredictable; and third, most target detection algorithms are based on spectral modeling, for which the a priori availability of target spectra is an essential requirement, a condition difficult to meet in practice. Independent component analysis (ICA) is a new evolving technique that aims at finding components that are statistically independent or as independent as possible. It does not require a priori availability of target spectra and is an attractive alternative. This paper presents a study of military target detection using four spectral matching algorithms, namely orthogonal subspace projection (OSP), constrained energy minimisation, spectral angle mapper and spectral correlation mapper, and four anomaly detection algorithms, namely the OSP anomaly detector (OSPAD), the Reed-Xiaoli anomaly detector (RXD), the uniform target detector (UTD), and a combination of RXD-UTD. The performances of these spectrally modeled algorithms are then compared with ICA using receiver operating characteristic analysis. The superior performance of ICA indicates that it may be considered a viable alternative for military target detection.

  5. [Estimation of Hunan forest carbon density based on spectral mixture analysis of MODIS data].

    Science.gov (United States)

    Yan, En-ping; Lin, Hui; Wang, Guang-xing; Chen, Zhen-xiong

    2015-11-01

    With the fast development of remote sensing technology, combining forest inventory sample plot data and remotely sensed images has become a widely used method to map forest carbon density. However, the existence of mixed pixels often impedes the improvement of forest carbon density mapping, especially when low spatial resolution images such as MODIS are used. In this study, MODIS images and national forest inventory sample plot data were used to estimate forest carbon density. Linear spectral mixture analysis with and without constraint, and nonlinear spectral mixture analysis, were compared to derive the fractions of different land use and land cover (LULC) types. Then a sequential Gaussian co-simulation algorithm, with and without the fraction images from the spectral mixture analyses, was employed to estimate the forest carbon density of Hunan Province. Results showed that 1) linear spectral mixture analysis with constraint, leading to a mean RMSE of 0.002, estimated the fractions of LULC types more accurately than the unconstrained linear and the nonlinear spectral mixture analyses; 2) integrating the spectral mixture analysis model and the sequential Gaussian co-simulation algorithm increased the estimation accuracy of forest carbon density to 81.5% from 74.1%, and decreased the RMSE to 5.18 from 7.26; and 3) the mean forest carbon density for the province was 30.06 t · hm(-2), ranging from 0.00 to 67.35 t · hm(-2). This implies that spectral mixture analysis provides great potential to increase the estimation accuracy of forest carbon density at regional and global levels.

  6. [Decoloring and spectral properties analysis of innoxious ultraviolet absorbents].

    Science.gov (United States)

    Fang, Yi-Wen; Ni, Wen-Xiu; Huang, Chong; Xue, Liang; Yu, Lin

    2006-07-01

    The ultraviolet absorbent extracted from mango leaves was decolored with several decoloring agents, and the spectral properties of the decolored ultraviolet absorbents were then analyzed. The decoloring methods were studied by comparing them with one another. The results showed that the decoloring effect was satisfactory when active carbon, H2O2, citric acid, or oxalic acid was used as the decoloring agent. In particular, when oxalic acid was used, the color of the product was slight, the yield was high, and the ultraviolet absorption was good. When the concentration of the ultraviolet absorbent solution is 0.5% (w/w), the ultraviolet transmittance is below 0.3% over 200-370 nm and increases slightly beyond 370 nm, reaching a maximum of approximately 12% at 400 nm.

  7. Cool DZ white dwarfs I: Identification and spectral analysis

    Science.gov (United States)

    Hollands, M. A.; Koester, D.; Alekseev, V.; Herbert, E. L.; Gänsicke, B. T.

    2017-01-01

    White dwarfs with metal lines in their spectra act as signposts for post-main sequence planetary systems. Searching the Sloan Digital Sky Survey (SDSS) data release 12, we have identified 231 cool DZ white dwarfs with metal absorption, extending the DZ cooling sequence to higher metal abundances, lower temperatures, and hence longer cooling ages. Of these 231 systems, 104 are previously unknown white dwarfs. Compared with previous work, our spectral fitting uses improved model atmospheres with updated line profiles and line-lists, which we use to derive effective temperatures and abundances for up to 8 elements. We also determine spectroscopic distances to our sample, identifying two halo members with tangential space velocities >300 km s-1. The implications of our results for remnant planetary systems will be discussed in a separate paper.

  8. Analysis of solvation and structural contributions in spectral characteristics of dipyrrin Zn(II) complexes.

    Science.gov (United States)

    Marfin, Yu S; Rumyantsev, E V

    2014-09-15

    Photophysical characteristics of several alkylated dipyrrin Zn(II) complexes in organic solvents were analyzed. Relations between the spectral properties of the complexes and the physical-chemical parameters of the solvents were determined using linear regression analysis. The contribution of each solvent parameter to the investigated spectral characteristics was estimated. The spectral properties of the complexes under study depend on specific interactions of zinc with the solvent molecules through axial coordination. Increasing alkyl substitution leads to bathochromic shifts in the spectra due to the positive inductive effect of the alkyl groups.

  9. Analysis of solvation and structural contributions in spectral characteristics of dipyrrin Zn(II) complexes

    Science.gov (United States)

    Marfin, Yu. S.; Rumyantsev, E. V.

    2014-09-01

    Photophysical characteristics of several alkylated dipyrrin Zn(II) complexes in organic solvents were analyzed. Relations between the spectral properties of the complexes and the physical-chemical parameters of the solvents were determined using linear regression analysis. The contribution of each solvent parameter to the investigated spectral characteristics was estimated. The spectral properties of the complexes under study depend on specific interactions of zinc with the solvent molecules through axial coordination. Increasing alkyl substitution leads to bathochromic shifts in the spectra due to the positive inductive effect of the alkyl groups.

  10. Highly sensitive index of sympathetic activity based on time-frequency spectral analysis of electrodermal activity.

    Science.gov (United States)

    Posada-Quintero, Hugo F; Florian, John P; Orjuela-Cañón, Álvaro D; Chon, Ki H

    2016-09-01

    Time-domain indices of electrodermal activity (EDA) have been used as a marker of sympathetic tone. However, they often show high variation between subjects and low consistency, which has precluded their general use as a marker of sympathetic tone. To examine whether power spectral density analysis of EDA can provide more consistent results, we recently performed a variety of sympathetic tone-evoking experiments (43). We found a significant increase in the spectral power in the frequency range of 0.045 to 0.25 Hz when sympathetic tone-evoking stimuli were induced. The sympathetic tone assessed by the power spectral density of EDA was found to have lower variation and greater sensitivity for certain, but not all, stimuli compared with the time-domain analysis of EDA. We surmise that this lack of sensitivity in certain sympathetic tone-inducing conditions with time-invariant spectral analysis of EDA may lie in its inability to characterize time-varying dynamics of the sympathetic tone. To overcome the disadvantages of time-domain and time-invariant power spectral indices of EDA, we developed a highly sensitive index of sympathetic tone based on time-frequency analysis of EDA signals. Its efficacy was tested using experiments designed to elicit sympathetic dynamics. Twelve subjects underwent four tests known to elicit sympathetic tone arousal: cold pressor, tilt table, stand test, and the Stroop task. We hypothesize that a more sensitive measure of sympathetic control can be developed using time-varying spectral analysis. Variable frequency complex demodulation, a recently developed technique for time-frequency analysis, was used to obtain spectral amplitudes associated with EDA. We found that the time-varying spectral frequency band 0.08-0.24 Hz was most responsive to stimulation. Spectral power at frequencies higher than 0.24 Hz was determined to be unrelated to the sympathetic dynamics because it comprised less than 5% of the total power. The mean value of time
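
    For orientation, the sketch below computes the time-invariant (Welch) band power of a synthetic EDA signal over the 0.045-0.25 Hz sympathetic band reported above; the sampling rate, recording length, and signal are hypothetical, and the time-varying index based on variable frequency complex demodulation is not reproduced here.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical EDA recording: 5 minutes sampled at 8 Hz with a slow 0.12 Hz component.
fs = 8.0
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(2)
eda = 2 + 0.3 * np.sin(2 * np.pi * 0.12 * t) + 0.05 * rng.standard_normal(t.size)

# Welch PSD, then integrate over the sympathetic band (0.045-0.25 Hz).
freqs, psd = welch(eda - eda.mean(), fs=fs, nperseg=int(fs * 60))
band = (freqs >= 0.045) & (freqs <= 0.25)
sympathetic_power = np.trapz(psd[band], freqs[band])
print(f"EDA sympathetic-band power: {sympathetic_power:.4f}")
```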

  11. Spatio-spectral analysis of ionization times in high-harmonic generation

    Energy Technology Data Exchange (ETDEWEB)

    Soifer, Hadas, E-mail: hadas.soifer@weizmann.ac.il [Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel); Dagan, Michal; Shafir, Dror; Bruner, Barry D. [Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel); Ivanov, Misha Yu. [Department of Physics, Imperial College London, South Kensington Campus, SW7 2AZ London (United Kingdom); Max-Born Institute for Nonlinear Optics and Short Pulse Spectroscopy, Max-Born-Strasse 2A, D-12489 Berlin (Germany); Serbinenko, Valeria; Barth, Ingo; Smirnova, Olga [Max-Born Institute for Nonlinear Optics and Short Pulse Spectroscopy, Max-Born-Strasse 2A, D-12489 Berlin (Germany); Dudovich, Nirit [Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 76100 (Israel)

    2013-03-12

    Graphical abstract: A spatio-spectral analysis of the two-color oscillation phase allows us to accurately separate short and long trajectories and reconstruct their ionization times. Highlights: we perform a complete spatio-spectral analysis of the high harmonic generation process; we analyze the ionization times across the entire spatio-spectral plane of the harmonics; we apply this analysis to reconstruct the ionization times of both short and long trajectories. Abstract: Recollision experiments have been very successful in resolving attosecond scale dynamics. However, such schemes rely on the single atom response, neglecting the macroscopic properties of the interaction and the effects of using multi-cycle laser fields. In this paper we perform a complete spatio-spectral analysis of the high harmonic generation process and resolve the distribution of the subcycle dynamics of the recolliding electron. Specifically, we focus on the measurement of ionization times. Recently, we have demonstrated that the addition of a weak, cross-polarized second harmonic field allows us to resolve the moment of ionization (Shafir, 2012) [1]. In this paper we extend this measurement and perform a complete spatio-spectral analysis. We apply this analysis to reconstruct the ionization times of both short and long trajectories, showing good agreement with the quantum path analysis.

  12. ANALYSIS OF SPECTRAL CHARACTERISTICS AMONG DIFFERENT SENSORS BY USE OF SIMULATED RS IMAGES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This research, by use of an RS image-simulating method, simulated apparent reflectance images at sensor level and ground-reflectance images for the corresponding bands of SPOT-HRV, CBERS-CCD, Landsat-TM and NOAA14-AVHRR. These images were used to analyze differences among sensors caused by spectral sensitivity and atmospheric effects. The differences were analyzed in terms of the Normalized Difference Vegetation Index (NDVI). The results showed that differences in the sensors' spectral characteristics cause changes in their NDVI and reflectance. When data from multiple sensors are used in digital analysis, this error should be taken into account. Atmospheric effects make NDVI smaller, and atmospheric correction tends to increase NDVI values. The reflectances and NDVIs of different sensors can be used to analyze the differences among sensors' features. The spectral analysis method based on simulated RS images can provide a new way to design the spectral characteristics of new sensors.
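
    Since the comparison above is framed in terms of NDVI, a minimal per-pixel NDVI computation and sensor-to-sensor difference is sketched below; the reflectance values are invented for illustration.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-12)

# Hypothetical reflectances of the same two pixels as seen by two simulated sensors.
red_a, nir_a = np.array([0.08, 0.12]), np.array([0.42, 0.35])
red_b, nir_b = np.array([0.09, 0.13]), np.array([0.40, 0.33])
print(ndvi(nir_a, red_a) - ndvi(nir_b, red_b))  # per-pixel NDVI difference between sensors
```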

  13. Spectral analysis of growing graphs a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...

  14. Near-Infrared Hyper-spectral Image Analysis of Astaxanthin Concentration in Fish Feed Coating

    DEFF Research Database (Denmark)

    Ljungqvist, Martin Georg; Ersbøll, Bjarne Kjær; Kobayashi, K.;

    2012-01-01

    The aim of this study was to investigate the possibility of predicting concentration levels of synthetic astaxanthin coating of aquaculture feed pellets by hyper-spectral image analysis in the near infra-red (NIR) range and optical filter design. The imaging devices used were a Videometer...... for prediction of the concentration level. The results show that it is possible to predict the level of synthetic astaxanthin coating using either hyper-spectral imaging or three bandpass filters (BPF)....

  15. Multiple endmember spectral-angle-mapper (SAM) analysis improves discrimination of Savanna tree species

    CSIR Research Space (South Africa)

    Cho, Moses A

    2009-08-01

    Full Text Available architecture. Several mapping methods are applied in remote sensing to quantify species or vegetation community distribution at the local to regional scale. The most commonly used methods include maximum likelihood, spectral mixture analysis (SMA)[1...] and spectral angle mapper (SAM)[2]. The application of some of these methods including SAM and SMA has become popular with the advent of hyperspectral remote sensing. SAM determines the degree of similarity between two spectra by treating the spectra...

  16. Spectral Analysis of Transition Operators, Automata Groups and Translation in BBS

    Science.gov (United States)

    Kato, Tsuyoshi; Tsujimoto, Satoshi; Zuk, Andrzej

    2016-06-01

    We give the automata that describe the time evolution rules of the box-ball system with a carrier. It can be shown by use of tropical geometry that such systems are ultradiscrete analogues of the KdV equation. We discuss their relation with the lamplighter group generated by an automaton. We present a spectral analysis of the stochastic matrices induced by these automata and verify their spectral coincidence.

  17. Technical Training on High-Order Spectral Analysis and Thermal Anemometry Applications

    Science.gov (United States)

    Maslov, A. A.; Shiplyuk, A. N.; Sidirenko, A. A.; Bountin, D. A.

    2003-01-01

    Thermal anemometry and high-order spectral analysis were the subjects of the technical training. Specifically, the objective of the training was to study: (i) the recently introduced constant voltage anemometer (CVA) for high-speed boundary layers; and (ii) newly developed high-order spectral analysis techniques (HOSA). Both CVA and HOSA are relevant tools for studies of boundary layer transition and stability.

  18. Application of spectral decomposition analysis to in vivo quantification of aluminum by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Comsa, D.C. E-mail: comsadc@mcmaster.ca; Prestwich, W.V.; McNeill, F.E.; Byun, S.H

    2004-12-01

    The toxic effects of aluminum are cumulative and result in painful forms of renal osteodystrophy, most notably adynamic bone disease and osteomalacia, but also other forms of disease. The Trace Element Group at McMaster University has developed an accelerator-based in vivo procedure for detecting aluminum body burden by neutron activation analysis (NAA). Further refinement of the method was necessary to increase its sensitivity. In this context, the present study proposes an improved algorithm for data analysis, based on spectral decomposition. A new minimum detectable limit (MDL) of (0.7±0.1) mg Al was reached for a local dose of (20±1) mSv. The study also addresses the feasibility of a new data acquisition technique, the electronic rejection of coincident events detected by a NaI(Tl) system. It is expected that the application of this technique, together with spectral decomposition analysis, would provide an acceptable MDL for the method to be valuable in a clinical setting.

  19. Stratified spectral mixture analysis of medium resolution imagery for impervious surface mapping

    Science.gov (United States)

    Sun, Genyun; Chen, Xiaolin; Ren, Jinchang; Zhang, Aizhu; Jia, Xiuping

    2017-08-01

    Linear spectral mixture analysis (LSMA) is widely employed in impervious surface estimation, especially for estimating impervious surface abundance in medium spatial resolution images. However, it suffers from difficulty in endmember selection due to within-class spectral variability and variation in the number and type of endmember classes contained from pixel to pixel, which may lead to over- or under-estimation of impervious surface. Stratification is considered a promising way to address this problem. This paper presents a stratified spectral mixture analysis in the spectral domain (Sp_SSMA) for impervious surface mapping. It categorizes the entire data set into three groups based on the Combinational Build-up Index (CBI), the intensity component in the color space and the Normalized Difference Vegetation Index (NDVI) values. A suitable endmember model is developed for each group to accommodate the spectral variation from group to group. Unmixing into the associated subset (or full set) of endmembers in each group makes the unmixing adaptive to the types of endmember classes that each pixel actually contains. Results indicate that the Sp_SSMA method achieves better performance than full-set-endmember SMA and prior-knowledge-based spectral mixture analysis (PKSMA) in terms of R, RMSE and SE.

  20. Thirty years of nonparametric item response theory

    NARCIS (Netherlands)

    Molenaar, W.

    2001-01-01

    Relationships between a mathematical measurement model and its real-world applications are discussed. A distinction is made between large data matrices commonly found in educational measurement and smaller matrices found in attitude and personality measurement. Nonparametric methods are evaluated fo

  1. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  2. Nonparametric confidence intervals for monotone functions

    NARCIS (Netherlands)

    Groeneboom, P.; Jongbloed, G.

    2015-01-01

    We study nonparametric isotonic confidence intervals for monotone functions. In [Ann. Statist. 29 (2001) 1699–1731], pointwise confidence intervals, based on likelihood ratio tests using the restricted and unrestricted MLE in the current status model, are introduced. We extend the method to the trea

  3. Nonparametric confidence intervals for monotone functions

    NARCIS (Netherlands)

    Groeneboom, P.; Jongbloed, G.

    2015-01-01

    We study nonparametric isotonic confidence intervals for monotone functions. In [Ann. Statist. 29 (2001) 1699–1731], pointwise confidence intervals, based on likelihood ratio tests using the restricted and unrestricted MLE in the current status model, are introduced. We extend the method to the

  4. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  5. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  6. An analysis of gamma-ray burst spectral break models

    CERN Document Server

    Zhang, B; Zhang, Bing; Meszaros, Peter

    2002-01-01

    Typical gamma-ray burst spectra are characterized by a spectral break, Ep, which for bright BATSE bursts is found to be narrowly clustered around 300 keV. Recently identified X-ray flashes, which may account for a significant portion of the whole GRB population, seem to extend the Ep distribution to a broader range below 40 keV. On the other hand, within the cosmological fireball model, the issues concerning the dominant energy ingredient of the fireball as well as the location of the GRB emission site are not unambiguously settled, leading to several variants of the fireball model. Here we analyze these models within a unified framework, and critically reexamine the Ep predictions in the various model variants, focusing on their predicted properties. Attention is focused on the ability of the models to match a narrowness of the Ep distribution, and the correlations among Ep and some other measurable observables, as well as the effect of extending these properties to X-ray flash sources. These model propertie...

  7. Spectral analysis of Kepler SPB and Beta Cep candidate stars

    CERN Document Server

    Lehmann, H; Semaan, T; Gutiérrez, J; Smalley, B; Briquet, M; Shulyak, D; Tsymbal, V; de Cat, P

    2010-01-01

    We determine the fundamental parameters of SPB and Beta Cep candidate stars observed by the Kepler satellite mission and estimate the expected types of non-radial pulsators by comparing newly obtained high-resolution spectra with synthetic spectra computed on a grid of stellar parameters assuming LTE and check for NLTE effects for the hottest stars. For comparison, we determine Teff independently from fitting the spectral energy distribution of the stars obtained from the available photometry. We determine Teff, log(g), micro-turbulent velocity, vsin(i), metallicity, and elemental abundance for 14 of the 16 candidate stars, two of the stars are spectroscopic binaries. No significant influence of NLTE effects on the results could be found. For hot stars, we find systematic deviations of the determined effective temperatures from those given in the Kepler Input Catalogue. The deviations are confirmed by the results obtained from ground-based photometry. Five stars show reduced metallicity, two stars are He-stro...

  8. Spectral analysis and markov switching model of Indonesia business cycle

    Science.gov (United States)

    Fajar, Muhammad; Darwis, Sutawanir; Darmawan, Gumgum

    2017-03-01

    This study aims to investigate the Indonesian business cycle, including the determination of the smoothing parameter (λ) of the Hodrick-Prescott filter. Subsequently, the cyclical component of the filter output was analyzed using spectral methods to characterize it, and Markov switching regime modeling was used to estimate the probabilities of the recession and expansion regimes. The data used in the study are real GDP (1983Q1 - 2016Q2). The results of the study are: a) the Hodrick-Prescott filter on the real GDP of Indonesia is optimal when the value of the smoothing parameter is 988.474; b) the Indonesian business cycle has an amplitude varying between ±0.0071 and ±0.01024, and a duration of between 4 and 22 quarters; c) the business cycle can be modelled by an MSIV-AR(2) model, although the regime periodization generated by this model does not exactly match the actual regime periodization; and d) based on the MSIV-AR(2) model, the long-run probabilities are 0.4858 for the expansion regime and 0.5142 for the recession regime.
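
    A minimal sketch of the Hodrick-Prescott decomposition step described above, using statsmodels; the GDP series is synthetic, and only the smoothing parameter 988.474 is taken from the abstract.

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

# Hypothetical quarterly log real-GDP series for 1983Q1-2016Q2 (134 quarters).
rng = np.random.default_rng(3)
log_gdp = np.cumsum(0.01 + 0.02 * rng.standard_normal(134))

# Decompose into cyclical and trend components with the smoothing parameter from the study.
cycle, trend = hpfilter(log_gdp, lamb=988.474)
print(cycle[:4])
print(trend[:4])
```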

  9. Spectral analysis of the Dirac operator on a 3-sphere

    CERN Document Server

    Fang, Yan-Long; Vassiliev, Dmitri

    2016-01-01

    We study the (massless) Dirac operator on a 3-sphere equipped with Riemannian metric. For the standard metric the spectrum is known. In particular, the eigenvalues closest to zero are the two double eigenvalues +3/2 and -3/2. Our aim is to analyse the behaviour of eigenvalues when the metric is perturbed in an arbitrary smooth fashion from the standard one. We derive explicit asymptotic formulae for the two eigenvalues closest to zero. Note that these eigenvalues remain double eigenvalues under perturbations of the metric: they cannot split because of a particular symmetry of the Dirac operator in dimension three (it commutes with the antilinear operator of charge conjugation). Our asymptotic formulae show that in the first approximation our two eigenvalues maintain symmetry about zero and are completely determined by the increment of Riemannian volume. Spectral asymmetry is observed only in the second approximation of the perturbation process. As an example we consider a special family of metrics, the so-cal...

  10. Spectral analysis of hearing protector impulsive insertion loss.

    Science.gov (United States)

    Fackler, Cameron J; Berger, Elliott H; Murphy, William J; Stergar, Michael E

    2017-01-01

    To characterise the performance of hearing protection devices (HPDs) in impulsive-noise conditions and to compare various protection metrics between impulsive and steady-state noise sources with different characteristics. HPDs were measured per the impulsive test methods of ANSI/ASA S12.42-2010. Protectors were measured with impulses generated by both an acoustic shock tube and an AR-15 rifle. The measured data were analysed for impulse peak insertion loss (IPIL) and impulsive spectral insertion loss (ISIL). These impulsive measurements were compared to insertion loss measured with steady-state noise and with real-ear attenuation at threshold (REAT). Tested HPDs included a foam earplug, a level-dependent earplug and an electronic sound-restoration earmuff. IPIL for a given protector varied between measurements with the two impulse noise sources, but ISIL agreed between the two sources. The level-dependent earplug demonstrated level-dependent effects both in IPIL and ISIL. Steady-state insertion loss and REAT measurements tended to provide a conservative estimate of the impulsively-measured attenuation. Measurements of IPIL depend strongly on the source used to measure them, especially for HPDs with less attenuation at low frequencies. ISIL provides an alternative measurement of impulse protection and appears to be a more complete description of an HPD's performance.

  11. Quantitative characterization of surface topography using spectral analysis

    Science.gov (United States)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
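
    As a worked illustration of a one-dimensional PSD estimate of a height profile (a sketch under assumed scan parameters, not code from the article), using SciPy's periodogram:

```python
import numpy as np
from scipy.signal import periodogram

# Hypothetical 1D line scan of surface height: 4096 points at 1 micrometre pixel spacing.
dx = 1e-6
rng = np.random.default_rng(4)
height = 1e-8 * np.cumsum(rng.standard_normal(4096))  # random-walk-like rough profile
height -= height.mean()

# One-dimensional PSD as a function of spatial frequency (cycles per metre).
q, psd = periodogram(height, fs=1.0 / dx, window="hann", detrend="linear")
rms_from_psd = np.sqrt(np.trapz(psd, q))  # should roughly match the profile's rms roughness
print(rms_from_psd, height.std())
```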

  12. Koopmans' Analysis of Chemical Hardness with Spectral-Like Resolution

    Science.gov (United States)

    2013-01-01

    Three approximation levels of Koopmans' theorem are explored and applied: the first refers to the inner quantum behavior of the orbital energies, which depart from the genuine ones in Fock space when the wave-functions' Hilbert-Banach basis set is specified to solve the many-electronic spectra of spin-orbitals' eigenstates; this is the most subtle issue regarding Koopmans' theorem, as it has attracted much criticism and refutation in recent decades, yet it is shown here to be an irrefutable “observational” effect of computation, specific to any in silico spectra of an eigenproblem; the second level assumes the “frozen spin-orbitals” approximation during the extraction or addition of electrons at the frontier of the chemical system through the ionization and affinity processes, respectively; this approximation is nevertheless workable for a great deal of chemical compounds, especially organic systems, and is justified for chemical reactivity and aromaticity hierarchies in a homologue series; the third and most severe approximation extends the second one to superior orders of ionization and affinity, studied here at the level of chemical hardness compact-finite expressions up to spectral-like resolution for a paradigmatic set of aromatic carbohydrates. PMID:23970834

  13. Koopmans' Analysis of Chemical Hardness with Spectral-Like Resolution

    Directory of Open Access Journals (Sweden)

    Mihai V. Putz

    2013-01-01

    Full Text Available Three approximation levels of Koopmans' theorem are explored and applied: the first refers to the inner quantum behavior of the orbital energies, which depart from the genuine ones in Fock space when the wave-functions' Hilbert-Banach basis set is specified to solve the many-electronic spectra of spin-orbitals' eigenstates; this is the most subtle issue regarding Koopmans' theorem, as it has attracted much criticism and refutation in recent decades, yet it is shown here to be an irrefutable “observational” effect of computation, specific to any in silico spectra of an eigenproblem; the second level assumes the “frozen spin-orbitals” approximation during the extraction or addition of electrons at the frontier of the chemical system through the ionization and affinity processes, respectively; this approximation is nevertheless workable for a great deal of chemical compounds, especially organic systems, and is justified for chemical reactivity and aromaticity hierarchies in a homologue series; the third and most severe approximation extends the second one to superior orders of ionization and affinity, studied here at the level of chemical hardness compact-finite expressions up to spectral-like resolution for a paradigmatic set of aromatic carbohydrates.

  14. Midinfrared spectral investigations of carbonates: Analysis of remotely sensed data

    Science.gov (United States)

    Roush, T.; Pollack, J. B.; Mckay, C. P.

    1991-01-01

    Recent airborne thermal infrared observations of Mars from the Kuiper Airborne Observatory (KAO) have provided evidence for the presence of carbonates, sulfates, and hydrates. Using the optical properties of calcite and anhydrite, it was estimated that CO3's and SO4's constituted about 1 to 3 and 10 to 15 wt. percent, respectively, of the materials composing the atmospheric dust. Using the derived value as an estimate of total CO3 abundance, and assuming that the CO3's were uniformly distributed within the Martian regolith, it was estimated that such a CO3 reservoir could contain roughly 2 to 5 bars of CO2. While the results indicate that several volatile-bearing materials are present on Mars, the observations from the KAO are inherently limited in their ability to determine the spatial distributions of these materials. However, previous spacecraft observations of Mars provide both the spectral coverage necessary to identify these materials and the potential for investigating their spatial variability. This has prompted us to pursue a reinvestigation of the Mariner 6 and 7 infrared spectrometer and Mariner 9 infrared interferometer spectrometer observations. The former data have recently been made available in digital format, and calibration of wavelengths and intensities is almost complete. Additionally, we are pursuing the derivation of optical constants of more appropriate carbonates and sulfates.

  15. LDA measurements and turbulence spectral analysis in an agitated vessel

    Directory of Open Access Journals (Sweden)

    Chára Zdeněk

    2013-04-01

    Full Text Available In recent years, considerable improvement has been achieved in deriving the turbulence power spectrum from Laser Doppler Anemometry (LDA). Several methods have been proposed to approximate the irregularly sampled LDA data, e.g. the Lomb-Scargle method, which estimates the amplitude and phase of spectral lines from missing data; methods based on the reconstruction of the auto-correlation function (referred to as the correlation slotting technique); and methods based on the reconstruction of the time series using interpolation between the uneven samples and subsequent resampling. These different methods were applied to LDA data measured in an agitated vessel, and the results of the power spectrum calculations were compared. The measurements were performed in a mixing vessel with a flat bottom. The vessel was equipped with four baffles and agitated with a six-blade pitched blade impeller. Three values of the impeller speed (Reynolds number) were tested. Long time series of the axial velocity component were measured at selected points. At each point the time series were analyzed and evaluated in the form of a power spectrum.

  16. LDA measurements and turbulence spectral analysis in an agitated vessel

    Science.gov (United States)

    Kysela, Bohuš; Konfršt, Jiří; Chára, Zdeněk

    2013-04-01

    In recent years, considerable improvement has been achieved in deriving the turbulence power spectrum from Laser Doppler Anemometry (LDA). Several methods have been proposed to approximate the irregularly sampled LDA data, e.g. the Lomb-Scargle method, which estimates the amplitude and phase of spectral lines from missing data; methods based on the reconstruction of the auto-correlation function (referred to as the correlation slotting technique); and methods based on the reconstruction of the time series using interpolation between the uneven samples and subsequent resampling. These different methods were applied to LDA data measured in an agitated vessel, and the results of the power spectrum calculations were compared. The measurements were performed in a mixing vessel with a flat bottom. The vessel was equipped with four baffles and agitated with a six-blade pitched blade impeller. Three values of the impeller speed (Reynolds number) were tested. Long time series of the axial velocity component were measured at selected points. At each point the time series were analyzed and evaluated in the form of a power spectrum.
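
    To make the Lomb-Scargle approach mentioned above concrete, a minimal sketch on synthetic, irregularly sampled velocity data follows; the sampling pattern, oscillation frequency, and noise level are all hypothetical.

```python
import numpy as np
from scipy.signal import lombscargle

# Hypothetical irregularly sampled axial-velocity record (LDA burst arrival times).
rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0.0, 10.0, 2000))  # non-uniform sample times [s]
velocity = 0.5 * np.sin(2 * np.pi * 3.0 * t) + 0.1 * rng.standard_normal(t.size)

# Lomb-Scargle periodogram on a grid of angular frequencies (here up to 20 Hz).
freqs_hz = np.linspace(0.1, 20.0, 2000)
pgram = lombscargle(t, velocity - velocity.mean(), 2 * np.pi * freqs_hz)
print(freqs_hz[np.argmax(pgram)])  # should recover the 3 Hz oscillation
```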

  17. Functional Spectral Analysis of Paleoclimatic Evolution in Lanzhou Area over the Last 15 ka

    Institute of Scientific and Technical Information of China (English)

    杨桂芳; 殷鸿福; 李长安; 陈中原

    2003-01-01

    In this paper, we make use of functional spectral analysis to infer the periodicity of the paleoclimate in the Hongzuisi section over roughly the last 15 ka. Through a combined analysis of organic carbon isotopes and CaCO3 content, the pattern of paleoclimatic evolution of the Hongzuisi section is obtained. Over the last 15 ka there were climatic cycles with periods ranging from about 10 ka down to about 0.1 ka. Among these cycles, the cycle of several ka is the most remarkable. The result indicates that functional spectral analysis is helpful for paleoclimatic study and can provide useful information for paleoclimatic reconstruction and future forecasting.

  18. VIBRATIONS DETECTION IN INDUSTRIAL PUMPS BASED ON SPECTRAL ANALYSIS TO INCREASE THEIR EFFICIENCY

    Directory of Open Access Journals (Sweden)

    Belhadef RACHID

    2016-01-01

    Full Text Available Spectral analysis is the key tool for the study of vibration signals in rotating machinery. In this work, vibration analysis applied to the conditional preventive maintenance of such machines is proposed, as part of solving problems related to vibration detection on the components of these machines. The vibration signal of a centrifugal pump was processed to demonstrate the benefits of the proposed approach. The obtained results present the estimation of a pump vibration signal using the Fourier transform technique, compared with spectral analysis methods based on the Prony approach.

  19. Hyperspectral image classification based on spatial and spectral features and sparse representation

    Institute of Scientific and Technical Information of China (English)

    Yang Jing-Hui; Wang Li-Guo; Qian Jin-Xi

    2014-01-01

    To address the low classification accuracy and poor utilization of spatial information in traditional hyperspectral image classification methods, we propose a new hyperspectral image classification method based on Gabor spatial texture features, nonparametric weighted spectral features, and the sparse representation classification method (Gabor–NWSF and SRC), abbreviated GNWSF–SRC. The proposed GNWSF–SRC method first combines the Gabor spatial features and nonparametric weighted spectral features to describe the hyperspectral image, and then applies the sparse representation method. Finally, the classification is obtained by analyzing the reconstruction error. We use the proposed method to process two typical hyperspectral data sets with different percentages of training samples. Theoretical analysis and simulation demonstrate that the proposed method improves the classification accuracy and Kappa coefficient compared with traditional classification methods and achieves better classification performance.

  20. Spectral characterization as a tool for parchment analysis

    Science.gov (United States)

    Radis, Michela; Iacomussi, Paola; Rossi, Giuseppe

    2015-06-01

    The paper presents an investigation of the correlation between spectral characteristics and conservation conditions of parchment, to define a non-invasive methodology able to detect and monitor deterioration processes in historical parchment without the need to take small samples. To verify the feasibility and define the most appropriate measurement method, several samples of contemporary parchments, produced following ancient recipes and coming from different animal species, with different degrees of artificially induced damage, were analyzed. The SRF and STF of each sample were measured at the same point, before and after each step of the artificial ageing treatment. Having at our disposal a parchment made from a whole lamb skin also allowed the study of the correlations between variations of SRF and STF and the intrinsic factors of a parchment, such as the variability of animal skin anatomy and of manufacturing. Analyzing different samples also allowed the definition of the sensitivity of the measuring method and of reference spectra for parchments from the different animal species, with accuracy limits. The definition of a reference spectrum of undamaged parchment with acceptability limits is a necessary step for understanding, through SRF and STF measurements, the conservation conditions of historical parchments: indeed it is necessary to know whether deviations from the reference spectrum are ascribable to damage or only to anatomic/production variability of the parchment. As a case study, the method has been applied to two historical parchment scrolls stored at the Archivio di Stato di Torino (Italy). The SRF and STF of both scrolls were acquired at several points, and the average spectrum of each scroll was compared with the reference spectra and their tolerance limits, recognizing the animal species and damage alterations and demonstrating the feasibility of the method.

  1. Spectral Estimation Methods Comparison and Performance Analysis on a Steganalysis Application

    CERN Document Server

    Mataracioglu, Tolga

    2011-01-01

    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. Today it is widely used to secure information. In this paper, the traditional spectral estimation methods are introduced. The performance of each method is examined by comparing all of the spectral estimation methods. Finally, drawing on those performance analyses, a brief summary of the pros and cons of the spectral estimation methods is given. We also give a steganography demo by hiding information in a sound signal and recovering the information (i.e., the true frequency of the information signal) from the sound by means of the spectral estimation methods.
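
    A toy version of the demo described above, offered as a sketch rather than the paper's code: a weak tone is hidden in a noise-like sound and its frequency is recovered with a classical non-parametric spectral estimate.

```python
import numpy as np
from scipy.signal import periodogram

# Hypothetical "stego" sound: noise carrier with a weak 440 Hz tone embedded in it.
fs = 8000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(6)
stego = rng.standard_normal(t.size) + 0.3 * np.sin(2 * np.pi * 440.0 * t)

# Classical spectral estimate; the hidden tone shows up as the dominant spectral peak.
f, pxx = periodogram(stego, fs=fs, window="hann")
print(f[np.argmax(pxx)])  # close to 440 Hz
```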

  2. Spectral analysis of time series of categorical variables in earth sciences

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Rodríguez-Tovar, Francisco J.; Dorador, Javier

    2016-10-01

    Time series of categorical variables often appear in Earth Science disciplines and there is considerable interest in studying their cyclic behavior. This is true, for example, when the type of facies, petrofabric features, ichnofabrics, fossil assemblages or mineral compositions are measured continuously over a core or throughout a stratigraphic succession. Here we deal with the problem of applying spectral analysis to such sequences. A full indicator approach is proposed to complement the spectral envelope often used in other disciplines. Additionally, a stand-alone computer program is provided for calculating the spectral envelope, in this case implementing the permutation test to assess the statistical significance of the spectral peaks. We studied simulated sequences as well as real data in order to illustrate the methodology.

  3. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules implementing recently proposed algorithms, as well as powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied in analyses, including feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  4. Are Public-Private Partnerships a Source of Greater Efficiency in Water Supply? Results of a Non-Parametric Performance Analysis Relating to the Italian Industry

    Directory of Open Access Journals (Sweden)

    Corrado lo Storto

    2013-12-01

    Full Text Available This article reports the outcome of a performance study of the water service provision industry in Italy. The study evaluates the efficiency of 21 “private or public-private” equity and 32 “public” equity water service operators and investigates controlling factors. In particular, the influence that the operator typology and service management nature - private vs. public - has on efficiency is assessed. The study employed a two-stage Data Envelopment Analysis methodology. In the first stage, the operational efficiency of water supply operators is calculated by implementing a conventional BCC DEA model that uses both physical infrastructure and financial input and output variables to explore economies of scale. In the second stage, bootstrapped DEA and Tobit regression are performed to estimate the influence that a number of environmental factors have on water supplier efficiency. The results show that the integrated water provision industry in Italy is characterized by operational inefficiencies of service operators, and scale and agglomeration economies may have a non-negligible effect on efficiency. In addition, the operator typology and its geographical location affect efficiency.

  5. IR spectral analysis for the diagnostics of crust earthquake precursors

    Science.gov (United States)

    Umarkhodgaev, R. M.; Liperovsky, V. A.; Mikhailin, V. V.; Meister, C.-V.; Naumov, D. Ju

    2012-04-01

    In regions of future earthquakes, a few days before the seismic shock, the emanation of radon and hydrogen is observed, which causes clouds of increased ionisation in the atmosphere. In the present work the possible diagnostics of these clouds using infrared (IR) spectroscopy is considered, which may be important and useful for the general geophysical system of earthquake prediction and for the observation of industrial emissions of radioactive materials into the atmosphere. Some possible physical processes are analysed which cause, under the condition of additional ionisation in a pre-breakdown electrical field, emissions in the IR interval. In doing so, the transparency region of the IR spectrum at wavelengths of 7-15 μm is taken into account. This transparency region corresponds to spectral lines of minor atmospheric constituents such as CH4, CO2, N2O, NO2, NO, and O3. The possible intensities of the IR emissions observable in laboratories and in nature are estimated. The acceleration of electrons in the pre-breakdown electrical field before their attachment to molecules is analysed. The laboratory equipment for the investigation of the IR absorption spectrum was constructed for the cases of normal and reduced atmospheric pressure. The syntheses of ozone and nitrous oxides were performed in a barrier discharge. It is studied whether the products of the syntheses may be used to model atmospheric processes in which these components take part. Spectra of the products of the syntheses in the wavelength region of 2-10 μm were observed and analysed. A device was created for the synthesis and accumulation of nitrous oxides. Experiments to observe the IR spectra of ozone and nitrous oxides during the syntheses and during the further evolution of these molecules were performed. For earthquake prediction, the investigation of emission spectra is practically the most important, but during the laboratory experiments, the radiation of the excited molecules is shifted by a

  6. Nonparametric survival analysis of traffic congestion duration time

    Institute of Scientific and Technical Information of China (English)

    杨小宝; 周映雪

    2013-01-01

    A hazard-based traffic congestion duration model was established using the nonparametric method of survival analysis, and the congestion duration on Beijing's Third Ring Road was estimated from empirical traffic flow data. The results show that 56% of the congestion durations of road segments on the Third Ring Road are no longer than four minutes, and 90% are no longer than 12 minutes. The hazard rate (the probability that congestion ends) is less than 10% once the duration exceeds 12 minutes. Congestion occurs more frequently on the outer ring than on the inner ring, while congestion on the inner ring lasts longer once it has occurred.
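
    A minimal sketch of the nonparametric (Kaplan-Meier) survival estimate underlying this kind of analysis; the duration sample is invented, and the third-party lifelines package is assumed to be available.

```python
import numpy as np
from lifelines import KaplanMeierFitter  # assumed third-party dependency

# Hypothetical congestion durations (minutes); 1 = congestion observed to end,
# 0 = still congested when observation stopped (right-censored).
durations = np.array([2, 3, 3, 4, 4, 5, 6, 8, 10, 12, 12, 15, 20, 25])
observed = np.array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0])

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
# S(t): probability that a congestion event lasts longer than t minutes.
print(kmf.predict([4, 12]))
```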

  7. Multi spectral imaging analysis for meat spoilage discrimination

    DEFF Research Database (Denmark)

    Christiansen, Asger Nyman; Carstensen, Jens Michael; Papadopoulou, Olga

    ) was performed in parallel with videometer image snapshots and sensory analysis. Odour and colour characteristics of meat were determined by a test panel and attributed into three pre-characterized quality classes, namely Fresh; Semi Fresh and Spoiled during the days of its shelf life. So far, different...... classification methods: Naive Bayes Classifier as a reference model, Canonical Discriminant Analysis (CDA) and Support Vector Classification (SVC). As the final step, generalization of the models was performed using k-fold validation (k=10). Results showed that image analysis provided good discrimination of meat...... samples regarding the spoilage process as evaluated from sensory as well as from microbiological data. The support vector classification (SVC) model outperformed other models. Specifically, the misclassification error rate (MER), derived from odour characteristics, was 18% for both aerobic and MAP meat...

  8. [Spectral analysis of self-oscillating motility in isolated plasmodial strand of Physarum polycephalum].

    Science.gov (United States)

    Proskurin, S G; Avsievich, T I

    2014-01-01

    In this study, the experimental dependencies of the velocity of shuttle endoplasmic motion in an isolated plasmodial strand of Physarum polycephalum, obtained by laser Doppler microscopy, are presented. Spectral analysis of the time dependencies of the endoplasm velocity reveals two distinct harmonic components. Application of KCN and SHAM, inhibitors of cellular respiration, leads to a complete cessation of endoplasmic motion in the strand. After removal of the inhibitors the respiratory system returns to normal, gradually restoring the activity of both harmonic oscillation sources. The time-dependent velocity of the endoplasmic motion simulated on the basis of the spectral analysis is in rather good agreement with the experimental data.

  9. EEG Signal Decomposition and Improved Spectral Analysis Using Wavelet Transform

    Science.gov (United States)

    2001-10-25

    ...research and medical applications. Wavelet transform (WT) is a new multi-resolution time-frequency analysis method. WT possesses localization feature both... wavelet transform, the EEG signals are successfully decomposed and denoised. In this paper we also use a ’quasi-detrending’ method for classification of EEG...

  10. Sleep EEG spectral analysis in a diurnal rodent : Eutamias sibiricus

    NARCIS (Netherlands)

    DIJK, DJ; DAAN, S

    1989-01-01

    1. Sleep was studied in the diurnal rodent Eutamias sibiricus, chronically implanted with EEG and EMG electrodes. Analysis of the distribution of wakefulness, nonrapid eye movement (NREM) sleep, and rapid eye movement (REM) sleep over the 24 h period (LD 12:12) showed that total sleep time was 27.5%

  11. Customized spectral band analysis compared with conventional Fourier analysis of heart rate variability in neonates.

    Science.gov (United States)

    de Beer, N A M; Andriessen, P; Berendsen, R C M; Oei, S G; Wijn, P F F; Oetomo, S Bambang

    2004-12-01

    A customized filtering technique is introduced and compared with the fast Fourier transform (FFT) for analyzing heart rate variability (HRV) in neonates from short-term recordings. FFT is classically the most commonly used spectral technique to investigate cardiovascular fluctuations. FFT requires stability of the physiological signal within the 300 s time window that is usually analyzed in adults. Preterm infants, however, show rapidly fluctuating heart rate and blood pressure due to immature autonomic regulation, resulting in non-stationarity of these signals. Therefore neonatal studies use (half-overlapping or moving) windows of 64 s length within a recording time of 2-5 min. The proposed filtering technique performs a filtering operation in the frequency range of interest before calculating the spectrum, which allows analysis of shorter periods of only 42 s. The frequency bands of interest are 0.04-0.15 Hz (low frequency, LF) and 0.4-1.5 Hz (high frequency, HF). Although conventional FFT analysis as well as the proposed alternative technique result in errors in the estimation of LF power, due to spectral leakage from the very low frequencies, FFT analysis is more sensitive to this effect. The response times show comparable behavior for both techniques. Applying both methods to heart rate data obtained from a neonate before and after atropine administration (inducing a wide range of HRV) shows a very significant correlation between the two methods in estimating LF and HF power. We conclude that a customized filtering technique might be beneficial for analyzing HRV in neonates because it reduces the necessary time window for signal stability.
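
    A simplified sketch of LF/HF band-power estimation from an RR-interval series (uniform resampling followed by a Welch spectrum over roughly 42 s segments); the synthetic data and all parameters are illustrative, and this is not the customized filtering technique proposed by the authors.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

# Hypothetical neonatal RR-interval series (seconds) with a 0.8 Hz respiratory modulation.
rng = np.random.default_rng(7)
nominal_t = np.cumsum(np.full(600, 0.4))
rr = 0.4 + 0.02 * np.sin(2 * np.pi * 0.8 * nominal_t) + 0.005 * rng.standard_normal(600)
t_beats = np.cumsum(rr)

# Resample the irregular RR series to a uniform 4 Hz grid before spectral analysis.
fs = 4.0
t_uniform = np.arange(t_beats[0], t_beats[-1], 1 / fs)
rr_uniform = interp1d(t_beats, rr, kind="cubic")(t_uniform)

f, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=168)  # ~42 s segments
lf_band = (f >= 0.04) & (f < 0.15)
hf_band = (f >= 0.4) & (f <= 1.5)
print(np.trapz(psd[lf_band], f[lf_band]), np.trapz(psd[hf_band], f[hf_band]))
```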

  12. Mapping tropical dry forest succession using multiple criteria spectral mixture analysis

    Science.gov (United States)

    Cao, Sen; Yu, Qiuyan; Sanchez-Azofeifa, Arturo; Feng, Jilu; Rivard, Benoit; Gu, Zhujun

    2015-11-01

    Tropical dry forests (TDFs) in the Americas are considered the first frontier of economic development, with less than 1% of their total original coverage under protection. Accordingly, accurate estimates of their spatial extent, fragmentation, and degree of regeneration are critical in evaluating the success of current conservation policies. This study focused on a well-protected secondary TDF in the Santa Rosa National Park (SRNP) Environmental Monitoring Super Site, Guanacaste, Costa Rica. We used spectral signature analysis of TDF ecosystem succession (early, intermediate, and late successional stages), and its intrinsic variability, to propose a new multiple criteria spectral mixture analysis (MCSMA) method on the shortwave infrared (SWIR) bands of a HyMap image. Unlike most existing iterative mixture analysis (IMA) techniques, MCSMA tries to extract and make use of representative endmembers with spectral and spatial information. MCSMA then considers three criteria that influence the comparative importance of different endmember combinations (endmember models): root mean square error (RMSE), spatial distance (SD), and fraction consistency (FC), to create an evaluation framework for selecting a best-fit model. The spectral analysis demonstrated that TDFs have high spectral variability as a result of biomass variability. By adopting two search strategies, the unmixing results showed that our new MCSMA approach had better performance in root mean square error (early: 0.160/0.159; intermediate: 0.322/0.321; and late: 0.239/0.235), mean absolute error (early: 0.132/0.128; intermediate: 0.254/0.251; and late: 0.191/0.188), and systematic error (early: 0.045/0.055; intermediate: -0.211/-0.214; and late: 0.161/0.160), compared to multiple endmember spectral mixture analysis (MESMA). This study highlights the importance of SWIR in differentiating successional stages in TDFs. The proposed MCSMA provides a more flexible and generalized means for best-fit model determination.

  13. a Multivariate Downscaling Model for Nonparametric Simulation of Daily Flows

    Science.gov (United States)

    Molina, J. M.; Ramirez, J. A.; Raff, D. A.

    2011-12-01

    A multivariate, stochastic nonparametric framework for stepwise disaggregation of seasonal runoff volumes to daily streamflow is presented. The downscaling process is conditional on volumes of spring runoff and large-scale ocean-atmosphere teleconnections and includes a two-level cascade scheme: seasonal-to-monthly disaggregation first, followed by monthly-to-daily disaggregation. The non-parametric and assumption-free character of the framework allows consideration of the random nature and nonlinearities of daily flows, which parametric models are unable to account for adequately. This paper examines statistical links between decadal/interannual climatic variations in the Pacific Ocean and hydrologic variability in the US northwest region, and includes a periodicity analysis of climate patterns to detect coherences of their cyclic behavior in the frequency domain. We explore the use of such relationships and selected signals (e.g., north Pacific gyre oscillation, southern oscillation, and Pacific decadal oscillation indices, NPGO, SOI and PDO, respectively) in the proposed data-driven framework by means of a combinatorial approach with the aim of simulating improved streamflow sequences when compared with disaggregated series generated from flows alone. A nearest neighbor time series bootstrapping approach is integrated with principal component analysis to resample from the empirical multivariate distribution. A volume-dependent scaling transformation is implemented to guarantee the summability condition. In addition, we present a new and simple algorithm, based on nonparametric resampling, that overcomes the common limitation of lack of preservation of historical correlation between daily flows across months. The downscaling framework presented here is parsimonious in parameters and model assumptions, does not generate negative values, and produces synthetic series that are statistically indistinguishable from the observations. We present evidence showing that both

  14. Spectral Methods

    CERN Document Server

    Shen, Jie; Wang, Li-Lian

    2011-01-01

    Along with finite differences and finite elements, spectral methods are one of the three main methodologies for solving partial differential equations on computers. This book provides a detailed presentation of basic spectral algorithms, as well as a systematical presentation of basic convergence theory and error analysis for spectral methods. Readers of this book will be exposed to a unified framework for designing and analyzing spectral algorithms for a variety of problems, including in particular high-order differential equations and problems in unbounded domains. The book contains a large

  15. Nonparametric tests for pathwise properties of semimartingales

    CERN Document Server

    Cont, Rama; 10.3150/10-BEJ293

    2011-01-01

    We propose two nonparametric tests for investigating the pathwise properties of a signal modeled as the sum of a Lévy process and a Brownian semimartingale. Using a nonparametric threshold estimator for the continuous component of the quadratic variation, we design a test for the presence of a continuous martingale component in the process and a test for establishing whether the jumps have finite or infinite variation, based on observations on a discrete-time grid. We evaluate the performance of our tests using simulations of various stochastic models and use the tests to investigate the fine structure of the DM/USD exchange rate fluctuations and SPX futures prices. In both cases, our tests reveal the presence of a non-zero Brownian component and a finite variation jump component.
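
    The threshold estimator mentioned above separates the continuous part of the quadratic variation by discarding increments larger than a vanishing cutoff. The snippet below only illustrates that ingredient on a simulated jump diffusion; the threshold constants and the simulated model are assumptions, not the paper's test statistics.

```python
# Illustrative sketch of a threshold (truncated) estimator of the continuous part of the
# quadratic variation, the ingredient behind the tests described above.
import numpy as np

rng = np.random.default_rng(2)
n, T = 10_000, 1.0
dt = T / n
sigma, lam, jump_scale = 0.3, 5.0, 0.5

# Simulate increments of X = Brownian part + compound Poisson jumps
dW = sigma * np.sqrt(dt) * rng.standard_normal(n)
jumps = (rng.random(n) < lam * dt) * jump_scale * rng.standard_normal(n)
dX = dW + jumps

threshold = 4.0 * sigma * dt ** 0.49              # u_n ~ alpha * dt^varpi with varpi < 1/2
trv = np.sum(dX[np.abs(dX) < threshold] ** 2)     # truncated realized variance
rv = np.sum(dX ** 2)                              # full realized variance (includes jumps)

print(f"integrated variance (true) = {sigma**2 * T:.4f}")
print(f"truncated RV = {trv:.4f}, full RV = {rv:.4f}")
```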

  16. Nonparametric Transient Classification using Adaptive Wavelets

    CERN Document Server

    Varughese, Melvin M; Stephanou, Michael; Bassett, Bruce A

    2015-01-01

    Classifying transients based on multi band light curves is a challenging but crucial problem in the era of GAIA and LSST since the sheer volume of transients will make spectroscopic classification unfeasible. Here we present a nonparametric classifier that uses the transient's light curve measurements to predict its class given training data. It implements two novel components: the first is the use of the BAGIDIS wavelet methodology - a characterization of functional data using hierarchical wavelet coefficients. The second novelty is the introduction of a ranked probability classifier on the wavelet coefficients that handles both the heteroscedasticity of the data in addition to the potential non-representativity of the training set. The ranked classifier is simple and quick to implement while a major advantage of the BAGIDIS wavelets is that they are translation invariant, hence they do not need the light curves to be aligned to extract features. Further, BAGIDIS is nonparametric so it can be used for blind ...

  17. An open source tool for heart rate variability spectral analysis.

    Science.gov (United States)

    Rodríguez-Liñares, L; Méndez, A J; Lado, M J; Olivieri, D N; Vila, X A; Gómez-Conde, I

    2011-07-01

    In this paper we describe a software package for developing heart rate variability analysis. This package, called RHRV, is a third party extension for the open source statistical environment R, and can be freely downloaded from the R-CRAN repository. We review the state of the art of software related to the analysis of heart rate variability (HRV). Based upon this review, we motivate the development of an open source software platform which can be used for developing new algorithms for studying HRV or for performing clinical experiments. In particular, we show how the RHRV package greatly simplifies and accelerates the work of the computer scientist or medical specialist in the HRV field. We illustrate the utility of our package with practical examples.

  18. Graph spectral analysis of protein interaction network evolution

    OpenAIRE

    Thorne, Thomas; Stumpf, Michael P. H.

    2012-01-01

    We present an analysis of protein interaction network data via the comparison of models of network evolution to the observed data. We take a Bayesian approach and perform posterior density estimation using an approximate Bayesian computation with sequential Monte Carlo method. Our approach allows us to perform model selection over a selection of potential network growth models. The methodology we apply uses a distance defined in terms of graph spectra which captures the network data more natu...

  19. Spectral analysis of optical emission of microplasma in sea water

    Science.gov (United States)

    Gamaleev, Vladislav; Morita, Hayato; Oh, Jun-Seok; Furuta, Hiroshi; Hatta, Akimitsu

    2016-09-01

    This work presents an analysis of optical emission spectra from microplasma in three types of liquid, namely artificial sea water composed of 10 typical agents (10ASW), reference solutions each containing a single agent (NaCl, MgCl2 + H2O, Na2SO4, CaCl2, KCl, NaHCO3, KBr, NaHCO3, H3BO3, SrCl2 + H2O, NaF) and naturally sampled deep sea water (DSW). Microplasma was operated using a needle (Pd)-to-plate (Pt) electrode system sunk into each liquid in a quartz cuvette. The radius of the tip of the needle was 50 μm and the gap between the electrodes was set at 20 μm. An impulse generator circuit, consisting of a MOSFET switch, a capacitor, an inductor and the resistance of the liquid between the electrodes, was used as a pulse current source for operation of the discharges. In the spectra, the emission peaks for the main components of sea water and contaminants from the electrodes were detected. Spectra for the reference solutions were examined to enable the identification of unassigned peaks in the spectra for sea water. Analysis of the Stark broadening of the Hα peak was carried out to estimate the electron density of the plasma under various conditions. The characteristics of microplasma discharge in sea water and the analysis of the optical emission spectra will be presented. This work was supported by JSPS KAKENHI Grant Number 26600129.

  20. Processing of spectral X-ray data with principal components analysis

    CERN Document Server

    Butler, A P H; Cook, N J; Butzer, J; Schleich, N; Tlustos, L; Scott, N; Grasset, R; de Ruiter, N; Anderson, N G

    2011-01-01

    The goal of the work was to develop a general method for processing spectral X-ray image data. Principal component analysis (PCA) is a well understood technique for multivariate data analysis and so was investigated. To assess this method, spectral (multi-energy) computed tomography (CT) data were obtained using a Medipix2 detector in a MARS-CT (Medipix All Resolution System). PCA was able to separate bone (calcium) from two elements with k-edges in the X-ray spectrum used (iodine and barium) within a mouse. This has potential clinical application in dual-energy CT systems and future Medipix3-based spectral imaging where up to eight energies can be recorded simultaneously with excellent energy resolution. (c) 2010 Elsevier B.V. All rights reserved.

  1. Spectral Analysis of Blood Pressure Variability as a Quantitative Indicator of Driving Fatigue

    Institute of Scientific and Technical Information of China (English)

    李增勇; 焦昆; 陈铭; 王成焘

    2004-01-01

    A quantitative detector of driver fatigue can issue timely warnings and help prevent traffic accidents. The aim of this study was to quantitatively evaluate driver mental fatigue using power spectral analysis of blood pressure variability (BPV) together with subjective evaluation. In this experiment, twenty healthy male subjects were required to perform a driving simulator task for 3 hours. The physiological variables used to evaluate driver mental fatigue were the spectral components of BPV, including very low frequency (VLF), low frequency (LF) and high frequency (HF) power. LF, HF and the LF/HF ratio showed high correlations with driver mental fatigue, whereas VLF did not. The findings indicate a possible utility of BPV spectral analysis for quantitatively evaluating driver mental fatigue.
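
    A minimal sketch of the underlying computation (conventional VLF/LF/HF band limits, a synthetic evenly resampled beat-to-beat series, Welch's method) is given below; it is not the study's protocol, only the kind of band-power and LF/HF calculation it relies on.

```python
# Hedged sketch: VLF/LF/HF band powers and the LF/HF ratio from an evenly resampled
# beat-to-beat series using Welch's method. Band limits are the conventional ones and
# the synthetic signal is illustrative, not the study's data.
import numpy as np
from scipy.signal import welch

fs = 4.0                                          # resampling rate of the beat series, Hz
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(3)
series = (np.sin(2 * np.pi * 0.10 * t)            # LF oscillation (~0.10 Hz)
          + 0.5 * np.sin(2 * np.pi * 0.25 * t)    # HF oscillation (~0.25 Hz)
          + 0.2 * rng.standard_normal(t.size))

f, psd = welch(series, fs=fs, nperseg=256)

def band_power(lo, hi):
    mask = (f >= lo) & (f < hi)
    return np.trapz(psd[mask], f[mask])

vlf = band_power(0.003, 0.04)
lf = band_power(0.04, 0.15)
hf = band_power(0.15, 0.40)
print(f"VLF={vlf:.3f}  LF={lf:.3f}  HF={hf:.3f}  LF/HF={lf / hf:.2f}")
```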

  2. Stochastic analysis of spectral broadening by a free turbulent shear layer

    Science.gov (United States)

    Hardin, J. C.; Preisser, J. S.

    1981-01-01

    The effect of the time-varying shear layer between a harmonic acoustic source and an observer on the frequency content of the observed sound is considered. Experimental data show that the spectral content of the acoustic signal is considerably broadened upon passing through such a shear layer. Theoretical analysis is presented which shows that such spectral broadening is entirely consistent with amplitude modulation of the acoustic signal by the time-varying shear layer. Thus, no actual frequency shift need be hypothesized to explain the spectral phenomenon. Experimental tests were conducted at 2, 4, and 6 kHz and at free jet flow velocities of 10, 20, and 30 m/s. Analysis of acoustic pressure time histories obtained from these tests confirms the above conclusion, at least for the low Mach numbers considered.
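
    The amplitude-modulation explanation can be checked numerically: multiplying a pure tone by a slowly varying random envelope broadens the observed spectrum around the carrier without any shift of the source frequency. The sketch below uses arbitrary parameters and a crude moving-average envelope, purely as an illustration of the mechanism.

```python
# Numerical illustration (assumed parameters): amplitude modulation of a pure tone by a
# slowly varying random envelope broadens the observed spectrum around the carrier.
import numpy as np

fs, f0, dur = 20_000.0, 4_000.0, 2.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(4)

# Slow random envelope: low-pass filtered noise (crude moving-average smoothing)
noise = rng.standard_normal(t.size)
kernel = np.ones(400) / 400.0                     # envelope bandwidth of a few tens of Hz
envelope = 1.0 + 0.5 * np.convolve(noise, kernel, mode="same")

tone = np.cos(2 * np.pi * f0 * t)
modulated = envelope * tone

for name, sig in [("pure tone", tone), ("modulated", modulated)]:
    spec = np.abs(np.fft.rfft(sig * np.hanning(sig.size))) ** 2
    freqs = np.fft.rfftfreq(sig.size, 1 / fs)
    band = (freqs > f0 - 200) & (freqs < f0 + 200)
    above_half = freqs[band][spec[band] > spec[band].max() / 2]
    # crude half-power width around the carrier as a broadening measure
    print(f"{name}: approx. half-power width = {above_half.max() - above_half.min():.1f} Hz")
```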

  3. Bayesian nonparametric estimation for Quantum Homodyne Tomography

    OpenAIRE

    Naulet, Zacharie; Barat, Eric

    2016-01-01

    We estimate the quantum state of a light beam from results of quantum homodyne tomography noisy measurements performed on identically prepared quantum systems. We propose two Bayesian nonparametric approaches. The first approach is based on mixture models and is illustrated through simulation examples. The second approach is based on random basis expansions. We study the theoretical performance of the second approach by quantifying the rate of contraction of the posterior distribution around ...

  4. portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    mahsa ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is deciding on appropriate stocks for investment and selecting an optimal portfolio. This process is carried out through the assessment of risk and expected return. In the portfolio selection problem, if the assets' expected returns are normally distributed, variance and standard deviation are used as risk measures. However, the expected returns on assets are not necessarily normal and sometimes differ dramatically from the normal distribution. This paper, by introducing conditional value at risk (CVaR) as a risk measure in a nonparametric framework, offers the optimal portfolio for a given expected return, and compares this method with the linear programming method. The data used in this study consist of monthly returns of 15 companies selected from the top 50 companies on the Tehran Stock Exchange in the winter of 1392, covering the period from April of 1388 to June of 1393 (Iranian calendar). The results of this study show the superiority of the nonparametric method over the linear programming method; the nonparametric method is also much faster than the linear programming method.
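
    For readers unfamiliar with the risk measure, the following sketch computes the nonparametric (historical-simulation) CVaR of a candidate portfolio from a return matrix; it evaluates the criterion only, is not the paper's optimizer, and uses synthetic returns.

```python
# Hedged sketch: nonparametric (historical-simulation) CVaR of a candidate portfolio.
# The return data are synthetic and the equally weighted portfolio is just an example.
import numpy as np

rng = np.random.default_rng(5)
n_months, n_assets = 60, 15
returns = rng.normal(0.01, 0.06, size=(n_months, n_assets))   # monthly returns

def cvar(portfolio_returns, beta=0.95):
    """Average loss beyond the empirical beta-quantile of losses (no distributional model)."""
    losses = -portfolio_returns
    var = np.quantile(losses, beta)
    return losses[losses >= var].mean()

weights = np.full(n_assets, 1.0 / n_assets)       # equally weighted candidate portfolio
port = returns @ weights
print(f"expected return = {port.mean():.4f}, CVaR(95%) = {cvar(port):.4f}")
```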

  5. Asymptotic Spectral Analysis of Cross-Product Matrices.

    Science.gov (United States)

    1982-11-01


  6. SPECTRAL ANALYSIS OF POLYMER MODIFIED BITUMEN USED IN WATERPROOFING

    Directory of Open Access Journals (Sweden)

    Maria RATAJCZAK

    Full Text Available Asphalt is one of the most commonly used building materials. The first attempts at modifying asphalt were made at the beginning of the twentieth century. Nowadays the most popular asphalt modifier is styrene-butadiene-styrene (SBS). This thermoplastic elastomer increases the thermal resistance of bitumen, widens its range of plasticity and improves its rheological properties. IR spectroscopy is by far the most common instrumental method used in analytical chemistry. The popularity of this method results from its simple measurement technique, universality and high precision. That is why IR spectroscopy is applied to the analysis of the polymer modified binder (PMB) used in waterproofing.

  7. Towards the Procedure Automation of Full Stochastic Spectral Based Fatigue Analysis

    Directory of Open Access Journals (Sweden)

    Khurram Shehzad

    2013-05-01

    Full Text Available Fatigue is one of the most significant failure modes for marine structures such as ships and offshore platforms. Among the numerous methods for fatigue life estimation, the spectral method is considered the most reliable owing to its ability to account for different sea states as well as their probabilities of occurrence. However, the spectral-based simulation procedure itself is quite complex and numerically intensive owing to various critical technical details. The present research study focuses on the application and automation of the spectral-based fatigue analysis procedure for ship structures using ANSYS software with the 3D linear seakeeping code AQWA. ANSYS Parametric Design Language (APDL) macros are created and subsequently implemented to automate the workflow of the simulation process by reducing the time spent on non-value-added repetitive activity. A MATLAB program based on the direct calculation procedure of spectral fatigue is developed to calculate total fatigue damage. The automation procedure is employed to predict the fatigue life of a ship structural detail using wave scatter data for the North Atlantic and worldwide trade. The current work provides a system for efficient implementation of the stochastic spectral fatigue analysis procedure for ship structures.
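
    As a much-reduced illustration of how spectral fatigue damage is obtained from a stress power spectral density, the sketch below applies the classical narrow-band approximation (PSD moments, zero up-crossing rate, an assumed S-N curve). The paper's workflow uses ANSYS/AQWA transfer functions, wave scatter diagrams and a direct-calculation MATLAB program; none of that is reproduced here, and the numbers are assumptions.

```python
# Hedged illustration, not the paper's ANSYS/AQWA/MATLAB workflow: the classical
# narrow-band approximation of spectral fatigue damage from stress-PSD moments.
# The S-N parameters and the example PSD are assumed values.
import numpy as np
from scipy.special import gamma as gamma_fn

f = np.linspace(0.01, 2.0, 2000)                              # Hz
psd = 1e4 * np.exp(-0.5 * ((f - 0.1) / 0.03) ** 2)            # example stress PSD, (MPa^2)/Hz

m0 = np.trapz(psd, f)                                         # spectral moments
m2 = np.trapz(f ** 2 * psd, f)
nu0 = np.sqrt(m2 / m0)                                        # zero up-crossing rate, Hz

# S-N curve N * S^m = C (illustrative values only)
m_sn, C = 3.0, 1.0e12
damage_rate = nu0 / C * (np.sqrt(2.0 * m0)) ** m_sn * gamma_fn(1.0 + m_sn / 2.0)
seconds_per_year = 3600 * 24 * 365.25
print(f"annual narrow-band damage for this sea state: {damage_rate * seconds_per_year:.3e}")
```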

  8. Spectral saliency via automatic adaptive amplitude spectrum analysis

    Science.gov (United States)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
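
    A fixed-scale, spectral-residual-style baseline conveys the core idea of suppressing non-salient patterns by smoothing the (log-)amplitude spectrum; the authors' contribution, automatic scale selection and adaptive weighted combination of maps, is not reproduced in this hedged sketch.

```python
# Fixed-scale sketch of frequency-domain saliency: smooth the log-amplitude spectrum and
# reconstruct with the original phase. The smoothing scale is fixed here; the method
# described above selects it automatically and combines several scales.
import numpy as np
from scipy.ndimage import gaussian_filter

def spectral_saliency(image, scale=3.0):
    spectrum = np.fft.fft2(image.astype(float))
    log_amp = np.log(np.abs(spectrum) + 1e-9)
    phase = np.angle(spectrum)
    residual = log_amp - gaussian_filter(log_amp, sigma=scale)   # suppress regular patterns
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(saliency, sigma=2.5)                  # mild smoothing of the map

rng = np.random.default_rng(6)
img = 0.1 * rng.random((128, 128))
img[40:60, 70:95] += 1.0                                         # synthetic "salient" region
sal = spectral_saliency(img)
print("saliency peak at:", np.unravel_index(np.argmax(sal), sal.shape))  # near the region
```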

  9. Micro-Raman Imaging for Biology with Multivariate Spectral Analysis

    KAUST Repository

    Malvaso, Federica

    2015-05-05

    Raman spectroscopy is a noninvasive technique that can provide complex information on the vibrational state of molecules. It defines the unique fingerprint that allows the identification of the various chemical components within a given sample. The aim of the following thesis work is to analyze Raman maps related to three pairs of different cells, highlighting differences and similarities through multivariate algorithms. The first pair of analyzed cells are human embryonic stem cells (hESCs), while the other two pairs are induced pluripotent stem cells (iPSCs) derived from T lymphocytes and keratinocytes, respectively. Although two different multivariate techniques were employed, i.e. Principal Component Analysis and Cluster Analysis, the same results were achieved: the iPSCs derived from T lymphocytes show a higher content of genetic material compared with both the iPSCs derived from keratinocytes and the hESCs. Equally evident was that the iPS cells derived from keratinocytes assume a molecular distribution very similar to that of hESCs.

  10. Spectral analysis of Gene co-expression network of Zebrafish

    CERN Document Server

    Jalan, S; Bhojwani, J; Li, B; Zhang, L; Lan, S H; Gong, Z

    2012-01-01

    We analyze the gene expression data of Zebrafish under the combined framework of complex networks and random matrix theory. The nearest neighbor spacing distribution of the corresponding matrix spectra follows random matrix predictions of Gaussian orthogonal statistics. Based on the eigenvector analysis we can divide the spectra into two parts, first part for which the eigenvector localization properties match with the random matrix theory predictions, and the second part for which they show deviation from the theory and hence are useful to understand the system dependent properties. Spectra with the localized eigenvectors can be characterized into three groups based on the eigenvalues. We explore the position of localized nodes from these different categories. Using an overlap measure, we find that the top contributing nodes in the different groups carry distinguished structural features. Furthermore, the top contributing nodes of the different localized eigenvectors corresponding to the lower eigenvalue reg...

  11. Unsupervised linear spectral mixture analysis with AVIRIS data

    Institute of Scientific and Technical Information of China (English)

    GU Yan-feng; YANG Dong-yun; ZHANG Ye

    2005-01-01

    A new algorithm for unsupervised hyperspectral data unmixing is investigated, which includes a modified minimum noise fraction (MNF) transformation and independent component analysis (ICA). The modified MNF transformation is used to reduce noise and remove correlation between neighboring bands. Then the ICA is applied to unmix hyperspectral images, and independent endmembers are obtained from unmixed images by using post-processing which includes image segmentation based on statistical histograms and morphological operations. The experimental results demonstrate that this algorithm can identify endmembers resident in mixed pixels. Meanwhile, the results show the high computational efficiency of the modified MNF transformation. The time consumed by the modified method is almost one fifth of the traditional MNF transformation.

  12. Spectral analysis of viscous static compressible fluid equilibria

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, Manuel [Departamento de Analisis Matematico, Universidad de Valladolid, Valladolid (Spain)

    2001-05-25

    It is generally assumed that the study of the spectrum of the linearized Navier-Stokes equations around a static state will provide information about the stability of the equilibrium. This is obvious for inviscid barotropic compressible fluids by the self-adjoint character of the relevant operator, and rather easy for viscous incompressible fluids by the compact character of the resolvent. The viscous compressible linearized system, both for periodic and homogeneous Dirichlet boundary problems, satisfies neither condition, but it does turn out to be the generator of an immediately continuous, almost stable semigroup, which justifies the analysis of the spectrum as predictive of the initial behaviour of the flow. As for the spectrum itself, except for a unique negative finite accumulation point, it is formed by eigenvalues with negative real part, and nonreal eigenvalues are confined to a certain bounded subset of complex numbers. (author)

  13. Spatio-temporal spectral analysis of a forced cylinder wake

    CERN Document Server

    D'Adamo, Juan; Wesfreid, José Eduardo

    2011-01-01

    The wake of a circular cylinder performing rotary oscillations is studied using hydrodynamic tunnel experiments at $Re=100$. Two-dimensional particle image velocimetry on the mid-plane perpendicular to the axis of the cylinder is used to characterize the spatial development of the flow and its stability properties. The lock-in phenomenon that determines the boundaries between regions of the forcing parameter space where the wake is globally unstable or convectively unstable is scrutinized using the experimental data. A novel method based on the analysis of power density spectra of the flow allows us to give a detailed description of the forced wake, shedding light on the energy distribution in the different frequency components and in particular on a cascade-like mechanism evidenced for a high amplitude of the forcing oscillation. In addition, a calculation of the drag from the velocity field is performed, allowing us to relate the resulting force on the body to the wake properties.

  14. Spectral analysis of musical sounds with emphasis on the piano

    CERN Document Server

    Koenig, David M

    2014-01-01

    There are three parts to this book which addresses the analysis of musical sounds from the viewpoint of someone at the intersection between physicists, engineers, piano technicians, and musicians. The reader is introduced to a variety of waves and a variety of ways of presenting, visualizing, and analyzing them in the first part. A tutorial on the tools used throughout the book accompanies this introduction. The mathematics behind the tools is left to the appendices. Part 2 is a graphical survey of the classical areas of acoustics that pertain to musical instruments: vibrating strings, bars, membranes, and plates. Part 3 is devoted almost exclusively to the piano. Several two- and three-dimensional graphical tools are introduced to study the following characteristics of pianos: individual notes and interactions among them, the missing fundamental, inharmonicity, tuning visualization, the different distribution of harmonic power for the various zones of the piano keyboard, and potential uses for quality contro...

  15. Analysis of the p-i-n-structures Electrophysical Characteristics Influence on the Spectral Characteristics Sensitivity

    Directory of Open Access Journals (Sweden)

    V.N. Murashev

    2015-06-01

    Full Text Available In this paper, the spectral sensitivity characteristics of silicon p-i-n photodiodes were simulated. The influence of the semiconductor material characteristics (doping level, carrier lifetime, surface recombination velocity) and of the device construction and operation modes on the characteristics of the photosensitive structures was analysed in order to optimize them.

  16. Spectral analysis of the light scattered from a chemically relaxing fluid: A ternary mixture

    NARCIS (Netherlands)

    Carle, D.L.; Laidlaw, W.G.; Lekkerkerker, H.N.W.

    1974-01-01

    The spectral distribution of light scattered by a ternary fluid mixture containing two chemically reactive species and one nonreactive species is considered and a normal mode analysis is carried out for a range of k-values for which the pressure fluctuations are decoupled from those in entropy and c

  17. Sex Differences in the Sleep EEG of Young Adults : Visual Scoring and Spectral Analysis

    NARCIS (Netherlands)

    Dijk, Derk Jan; Beersma, Domien G.M.; Bloem, Gerda M.

    1989-01-01

    Baseline sleep of 13 men (mean age of 23.5 years) and 15 women (21.9 years) was analyzed. Visual scoring of the electroencephalograms (EEGs) revealed no significant differences between the sexes in the amounts of slow-wave sleep and rapid-eye-movement (REM) sleep. Spectral analysis, however, detecte

  18. Syntax is from Mars while Semantics from Venus! Insights from Spectral Analysis of Distributional Similarity Networks

    CERN Document Server

    Biemann, Chris; Mukherjee, Animesh

    2009-01-01

    We study the global topology of the syntactic and semantic distributional similarity networks for English through the technique of spectral analysis. We observe that while the syntactic network has a hierarchical structure with strong communities and their mixtures, the semantic network has several tightly knit communities along with a large core without any such well-defined community structure.

  19. Mass spectral analysis of C3 and C4 aliphatic amino acid derivatives.

    Science.gov (United States)

    Lawless, J. G.; Chadha, M. S.

    1971-01-01

    Diagnostic criteria are obtained for the distinction of alpha, beta, gamma, and N-methyl isomers of the C3 and C4 aliphatic amino acids, using mass spectral analysis of the derivatives of these acids. The use of deuterium labeling has helped in the understanding of certain fragmentation pathways.

  20. All Night Spectral Analysis of EEG Sleep in Young Adult and Middle-Aged Male Subjects

    NARCIS (Netherlands)

    Dijk, Derk Jan; Beersma, Domien G.M.; Hoofdakker, Rutger H. van den

    1989-01-01

    The sleep EEGs of 9 young adult males (age 20-28 years) and 8 middle-aged males (42-56 years) were analyzed by visual scoring and spectral analysis. In the middle-aged subjects power density in the delta, theta and sigma frequencies were attenuated as compared to the young subjects. In both age grou

  1. Spectral Analysis and Musical Theory in Support to the Pianism of Samba and Related Genres

    Directory of Open Access Journals (Sweden)

    Luiz E. Castelões

    2013-02-01

    Full Text Available The present article proposes a methodology that integrates spectral analysis, music theory, and instrumental practice in order to relate the left-hand accompaniment of samba pianism to the muffled and loose tones of three different kinds of surdos (round-shaped drums used in the performance of samba).

  2. [Spectral Analysis of Trace Fluorine Phase in Phosphogypsum].

    Science.gov (United States)

    Zhao, Hong-tao; Li, Hui-quan; Bao, Wei-jun; Wang, Chen-ye; Li, Song-geng; Lin, Wei-gang

    2015-08-01

    Phosphogypsum, which contains more than 90% of the calcium sulfate dehydrate (CaSO4 · 2H2O), is a kind of important renewable gypsum resources. Unlike the natural gypsum, however, phosphorus, fluorine, organic matter and other harmful impurities in phosphogypsum limit its practical use. To ascertain the existence form, content and phase distribution of trace fluoride in phosphogypsum has important theoretical values in removing trace fluoride effectively. In this present paper, the main existence form and phase distribution of trace fluoride in phosphogypsum was investigated by the combination of X-ray photoelectron spectroscopy (XPS) and Electron microprobe analysis (EMPA). The results show that trace fluoride phase mainly includes NaF, KF, CaF2, K2SiF6, Na2SiF6, Na3AlF6, K3AlF6, AlF3 · 3H2O, AlF2.3(OH)0.7 · H2O, Ca5(PO4)3F, Ca10(PO4)6F2. Among them, 4.83% of fluorine exists in the form of fluoride (NaF, KF, CaF2); Accordingly, 8.43% in the form of fluoride phosphate (Ca5(PO4)3F, Ca10(PO4)6F2); 12.21% in the form of fluorine aluminate (Na3AlF6, K3AlF6); 41.52% in the form of fluorosilicate (K2SiF6, Na2SiF6); 33.02% in the form of aluminum fluoride with crystal water (AlF3 · 3H2O, AlF2.3(OH)0.7 · H2O). In the analysis of phase constitution for trace elements in solid samples, the method of combining XPS and EMPA has more advantages. This study also provides theoretical basis for the removal of trace fluorine impurity and the effective recovery of fluorine resources.

  3. Spectral Analysis of Spatial Series Data of Pathologic Tissue: A Study on Small Intestine in ICR Mouse

    Science.gov (United States)

    Mise, Keiji; Sumi, Ayako; Kobayashi, Nobumichi; Torigoe, Toshihiko; Ohtomo, Norio

    2009-01-01

    We examined the usefulness of spectral analysis for investigating quantitatively the spatial pattern of pathologic tissue. To interpret the results obtained from real tissue, we constructed a two-dimensional spatial model of the tissue. Spectral analysis was applied to the spatial series data, which were obtained from the real tissue and model. From the results of spectral analysis, spatial patterns of the tissue and model were characterized quantitatively in reference to the frequencies and powers of the spectral peaks in power spectral densities (PSDs). The results for the model were essentially consistent with those for the tissue. It was concluded that the model was capable of adequately explaining the spatial pattern of the tissue. It is anticipated that spectral analysis will become a useful tool for characterizing the spatial pattern of the tissue quantitatively, resulting in an automated first screening of pathological specimens.

  4. Synthesis, spectral, computational and thermal analysis studies of metalloceftriaxone antibiotic

    Science.gov (United States)

    Masoud, Mamdouh S.; Ali, Alaa E.; Elasala, Gehan S.

    2015-03-01

    Binary ceftriaxone metal complexes of Cr(III), Mn(II), Fe(III), Co(II), Ni(II), Cu(II), Zn(II), Cd(II), Hg(II) and six mixed metals complexes of (Fe, Cu), (Fe, Co), (Co, Ni), (Co, Cu), (Ni, Cu) and (Fe, Ni) were synthesized and characterized by elemental analysis, IR, electronic spectra, magnetic susceptibility and ESR spectra. The studies proved that the ligand has different combination modes and all complexes were of octahedral geometry. Molecular modeling techniques and quantum chemical methods have been performed for ceftriaxone to calculate charges, bond lengths, bond angles, dihedral angles, electronegativity (χ), chemical potential (μ), global hardness (η), softness (σ) and the electrophilicity index (ω). The thermal decomposition of the prepared metals complexes was studied by TGA, DTA and DSC techniques. The kinetic parameters and the reaction orders were estimated. The thermal decomposition of all the complexes ended with the formation of metal oxides and carbon residue as a final product except in case of Hg complex, sublimation occurs at the temperature range 297.7-413.7 °C so, only carbon residue was produced during thermal decomposition. The geometries of complexes may be altered from Oh to Td during the thermal decomposition steps. Decomposition mechanisms were suggested.

  5. Atomistic interpretation of solid solution hardening from spectral analysis.

    Science.gov (United States)

    Plendl, J N

    1971-05-01

    From analysis of a series of vibrational spectra of ir energy absorption and laser Raman, an attempt is made to interpret solid solution hardening from an atomistic point of view for the system CaF(2)/SrF(2). It is shown to be caused by the combined action of three atomic characteristics, i.e., their changes as a function of composition. They are deformation of the atomic coordination polyhedrons, overlap of the outer electron shells of the atom pairs, and the ratio of the ionic to covalent share of binding. A striking nonlinear behavior of the three characteristics, as a function of composition, gives maximum atomic bond strength to the 55/45 position of the system CaF(2)/SrF(2), in agreement with the measured data of the solid solution hardening. The curve for atomic bond strength, derived from the three characteristics, is almost identical to the curve for measured microhardness data. This result suggests that the atomistic interpretation, put forward in this paper, is correct.

  6. ON SPECTRAL METHODS FOR VOLTERRA INTEGRAL EQUATIONS AND THE CONVERGENCE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Tao Tang; Xiang Xu; Jin Cheng

    2008-01-01

    The main purpose of this work is to provide a novel numerical approach for the Volterra integral equations based on a spectral approach. A Legendre-collocation method is proposed to solve the Volterra integral equations of the second kind. We provide a rigorous error analysis for the proposed method, which indicates that the numerical errors decay exponentially provided that the kernel function and the source function are sufficiently smooth. Numerical results confirm the theoretical prediction of the exponential rate of convergence. The result in this work seems to be the first successful spectral approach (with theoretical justification) for Volterra-type equations.
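
    A compact sketch of a Legendre-Gauss collocation solver for a second-kind Volterra equation is given below, tested on u(t) = 1 + integral from 0 to t of u(s) ds, whose exact solution is exp(t). It follows the general collocation-plus-quadrature recipe but is an illustration under assumed settings, not the authors' scheme or error analysis.

```python
# Hedged sketch: Legendre-Gauss collocation for u(t) = g(t) + \int_0^t K(t,s) u(s) ds.
# Tested on K = 1, g = 1, whose exact solution is exp(t). Node count and test problem
# are assumptions for illustration.
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def volterra_collocation(K, g, T, n):
    x, w = np.polynomial.legendre.leggauss(n)      # Gauss nodes/weights on [-1, 1]
    t = 0.5 * T * (x + 1.0)                        # collocation points on [0, T]
    A = np.zeros((n, n))
    for i in range(n):
        s = 0.5 * t[i] * (x + 1.0)                 # quadrature points on [0, t_i]
        # Lagrange basis L_j (built on the collocation nodes) evaluated at s
        L = np.empty((n, n))                       # L[j, k] = L_j(s_k)
        for j in range(n):
            e = np.zeros(n); e[j] = 1.0
            L[j] = BarycentricInterpolator(t, e)(s)
        A[i] = 0.5 * t[i] * (w * K(t[i], s)) @ L.T
    u = np.linalg.solve(np.eye(n) - A, g(t))       # (I - A) u = g at the collocation points
    return t, u

t, u = volterra_collocation(lambda ti, s: np.ones_like(s), lambda t: np.ones_like(t), T=1.0, n=12)
print(np.max(np.abs(u - np.exp(t))))               # error close to machine precision
```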

  7. The spectral analysis of syllables in patients using dentures.

    Science.gov (United States)

    Jindra, Petr; Eber, Miroslav; Pesák, Josef

    2002-12-01

    Changes in the oral cavity resulting from the loss of teeth and the ensuing reconstruction of a set of teeth by dentures (partial or complete) may cause changes in the speech and voice of the patient. The aim of the present investigation was to study the changes in speech and voice in patients suffering from teeth loss and the degree of speech improvement using dentures. Voice and speech parameters of a set of tested syllables were analysed in 10 patients at the 2nd Clinic of Stomatology. The analysis was carried out by means of an FFT, SoundForge 5.0 programme. Differently expressed acoustic changes in both consonants and vowels were ascertained in a percentage of the patients under examination. These concerned especially the sibilant ("s", "(see text)"), labiodental ("f", "v") and vibrating ("r", "(see text)") consonants. Changes in the FFT spectrum and air leakage in constrictive consonants were also found. In some patients the vowels, especially the closed ones ("i", "u"), may change their fundamental frequency and show noise admixture manifested as a blurred delimitation of the formants. A denture should, inter alia, render it possible for the patient to produce the same articulation to which he/she had been accustomed before the loss of teeth. For the construction of dentures the most important factors from a phonetic point of view appear to be the following: overbite, overjet, the height of the plate, the thickness of the palatal material, the incisor position, and the modelling of the ruga palatina on the hard palate. In case of wrong denture construction the acoustic changes may continue, resulting in the patient's stress load dependent upon sex, age, psychic condition and seriousness of the problem.

  8. Spectral and gravimetric analysis of completely oxidized amalgam systems.

    Science.gov (United States)

    Mueller, H J

    1980-01-01

    Analysis of the soluble solution species, insoluble solution precipitate, adherent corrosion products, and microstructural changes of the substrate amalgam after selected polarization to -0.2 V and +0.5 V in a chloride solution is reported. Results indicate only small concentrations of soluble species, high concentrations of a Sn insoluble solution precipitate at -0.2 V, and high concentrations of a Cu precipitate at +0.5 V, related to CuCl2 · 3Cu(OH)2. The completely oxidized amalgam microstructure indicates a thin outermost layer of predominantly Sn-Cl, a thick corroded layer of Ag-Sn-Hg-Cl, and the remaining substrate amalgam. The compound of (SnO) 160 is also associated with the thick corroded layer. The microstructure of the substrate amalgam exhibits, besides the normally occurring phases and products, four new phases or alterations due to the redistribution of Sn, Cl and O from the gamma-2 corrosion products: (1) the reappearance of voids, (2) a grey Ag-Sn-Cl phase with and without Cu localized at specific sites in the gamma-1 matrix, (3) dark areas or partially filled voids containing the same elements as in (2) and formerly occupied by the gamma-2 products, and (4) a Cu-rich phase from the deterioration of the Cu6Sn5 phase and also included within the matrix. These changes, particularly (1), (2) and (3), occur with the onset of the gamma-1 deterioration. Unreacted Ag3Sn, including the additions of the Cu3Sn component, is the last phase to be attacked in the composite amalgam.

  9. RXTE Observation of Cygnus X-1 Spectral Analysis

    Science.gov (United States)

    Dove, J. B.; Wilms, Joern; Nowak, M. A.; Vaughan, B. A.; Begelman, M. C.

    1998-01-01

    We present the results of the analysis of the broad-band spectrum of Cygnus X-1 from 3.0 to 200 keV, using data from a 10 ksec observation by the Rossi X-ray Timing Explorer. Although the spectrum can be well described phenomenologically by an exponentially cut-off power law (photon index Gamma = 1.45 (+0.01/-0.02), e-folding energy E_f = 162 (+9/-8) keV, plus a deviation from a power law that formally can be modeled as a thermal blackbody with temperature kT_BB = 1.2 (+0.0/-0.1) keV), the inclusion of a reflection component does not improve the fit. As a physical description of this system, we apply accretion disc corona (ADC) models. A slab-geometry ADC model is unable to describe the data. However, a spherical corona, with a total optical depth tau = 1.6 ± 0.1 and an average temperature kT_c = 87 ± 5 keV, surrounded by an exterior cold disc, does provide a good description of the data (reduced chi-squared = 1.55). These models deviate from the data by up to 7% in the 5-10 keV range. However, considering how successfully the spherical corona reproduces the 10-200 keV data, such "photon-starved" coronal geometries seem very promising for explaining the accretion processes of Cygnus X-1.

  10. Quantitative analysis of the dual-energy CT virtual spectral curve for focal liver lesions characterization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qi, E-mail: wq20@hotmail.com; Shi, Gaofeng, E-mail: gaofengs62@sina.com; Qi, Xiaohui, E-mail: qixiaohui1984@163.com; Fan, Xueli, E-mail: 407849960@qq.com; Wang, Lijia, E-mail: 893197597@qq.com

    2014-10-15

    Highlights: • We establish a feasible method using the virtual spectral curves (VSC) to differentiate focal liver lesions using DECT. • Our study shows the slope of the VSC can be used to differentiate between hemangioma, HCC, metastasis and cyst. • Importantly, the diagnostic specificities associated with using the slope to diagnose both hemangioma and cysts were 100%. - Abstract: Objective: To assess the usefulness of the spectral curve slope of dual-energy CT (DECT) for differentiating between hepatocellular carcinoma (HCC), hepatic metastasis, hemangioma (HH) and cysts. Methods: In total, 121 patients were imaged in the portal venous phase using dual-energy mode. Of these patients, 23 patients had HH, 28 patients had HCC, 40 patients had metastases and 30 patients had simple cysts. The spectral curves of the hepatic lesions were derived from the 40–190 keV levels of virtual monochromatic spectral imaging. The spectral curve slopes were calculated from 40 to 110 keV. The slopes were compared using the Kruskal–Wallis test. Receiver operating characteristic curves (ROC) were used to determine the optimal cut-off value of the slope of the spectral curve to differentiate between the lesions. Results: The spectral curves of the four lesion types had different baseline levels. The HH baseline level was the highest followed by HCC, metastases and cysts. The slopes of the spectral curves of HH, HCC, metastases and cysts were 3.81 ± 1.19, 1.49 ± 0.57, 1.06 ± 0.76 and 0.13 ± 0.17, respectively. These values were significantly different (P < 0.008). Based on ROC analysis, the respective diagnostic sensitivity and specificity were 87% and 100% for hemangioma (cut-off value ≥ 2.988), 82.1% and 65.9% for HCC (cut-off value 1.167–2.998), 65.9% and 59% for metastasis (cut-off value 0.133–1.167) and 44.4% and 100% for cysts (cut-off value ≤ 0.133). Conclusion: Quantitative analysis of the DECT spectral curve in the portal venous phase can be used to

  11. Nonparametric Analyses of Log-Periodic Precursors to Financial Crashes

    Science.gov (United States)

    Zhou, Wei-Xing; Sornette, Didier

    We apply two nonparametric methods to further test the hypothesis that log-periodicity characterizes the detrended price trajectory of large financial indices prior to financial crashes or strong corrections. The term "parametric" refers here to the use of the log-periodic power law formula to fit the data; in contrast, "nonparametric" refers to the use of general tools such as Fourier transform, and in the present case the Hilbert transform and the so-called (H, q)-analysis. The analysis using the (H, q)-derivative is applied to seven time series ending with the October 1987 crash, the October 1997 correction and the April 2000 crash of the Dow Jones Industrial Average (DJIA), the Standard & Poor 500 and Nasdaq indices. The Hilbert transform is applied to two detrended price time series in terms of the ln(tc-t) variable, where tc is the time of the crash. Taking all results together, we find strong evidence for a universal fundamental log-frequency f=1.02±0.05 corresponding to the scaling ratio λ=2.67±0.12. These values are in very good agreement with those obtained in earlier works with different parametric techniques. This note is extracted from a long unpublished report with 58 figures available at , which extensively describes the evidence we have accumulated on these seven time series, in particular by presenting all relevant details so that the reader can judge for himself or herself the validity and robustness of the results.

  12. Wavelet Estimators in Nonparametric Regression: A Comparative Simulation Study

    Directory of Open Access Journals (Sweden)

    Anestis Antoniadis

    2001-06-01

    Full Text Available Wavelet analysis has been found to be a powerful tool for the nonparametric estimation of spatially-variable objects. We discuss in detail wavelet methods in nonparametric regression, where the data are modelled as observations of a signal contaminated with additive Gaussian noise, and provide an extensive review of the vast literature of wavelet shrinkage and wavelet thresholding estimators developed to denoise such data. These estimators arise from a wide range of classical and empirical Bayes methods treating either individual or blocks of wavelet coefficients. We compare various estimators in an extensive simulation study on a variety of sample sizes, test functions, signal-to-noise ratios and wavelet filters. Because there is no single criterion that can adequately summarise the behaviour of an estimator, we use various criteria to measure performance in finite sample situations. Insight into the performance of these estimators is obtained from graphical outputs and numerical tables. In order to provide some hints of how these estimators should be used to analyse real data sets, a detailed practical step-by-step illustration of a wavelet denoising analysis on electrical consumption is provided. Matlab codes are provided so that all figures and tables in this paper can be reproduced.
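
    As a runnable illustration of one estimator family from this literature, the sketch below applies soft thresholding at the universal (VisuShrink-style) threshold using the PyWavelets library; the wavelet, decomposition level and test signal are assumptions, and this is Python rather than the MATLAB code the paper provides.

```python
# Hedged sketch: wavelet shrinkage denoising by soft thresholding at the universal
# threshold. Wavelet, level and test signal are illustrative choices.
import numpy as np
import pywt

rng = np.random.default_rng(7)
n = 1024
t = np.linspace(0, 1, n)
signal = np.piecewise(t, [t < 0.3, (t >= 0.3) & (t < 0.7), t >= 0.7],
                      [lambda x: 4 * x, 2.0, lambda x: np.cos(6 * np.pi * x)])
noisy = signal + 0.2 * rng.standard_normal(n)

coeffs = pywt.wavedec(noisy, "sym8", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(n))                      # universal threshold
den_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(den_coeffs, "sym8")[:n]

print(f"RMSE noisy    = {np.sqrt(np.mean((noisy - signal) ** 2)):.4f}")
print(f"RMSE denoised = {np.sqrt(np.mean((denoised - signal) ** 2)):.4f}")
```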

  13. Spectral Analysis of Biosignals to Evaluate Heart Activity due to the Consumption of Energy Drinks

    Directory of Open Access Journals (Sweden)

    Md. Bashir Uddin

    2016-08-01

    Full Text Available Heart activity is evaluated in this study by analyzing the spectral (frequency) components of three biosignals: ECG, PPG and the blood perfusion signal. The study was performed on several healthy human subjects free from any type of cardiovascular disease. ECG and PPG recordings were performed with an electrode lead set and a pulse transducer, respectively, connected to the same MP36 (Biopac, USA) data acquisition unit. LDF measurements were performed with a skin surface probe connected to an LDF100C module on the middle fingertip; this LDF module was connected to an MP150 (Biopac, USA) data acquisition unit. ECG, PPG and blood perfusion signals were recorded before and after the consumption of energy drinks available in Bangladesh. After consuming energy drinks, the spectral components of the ECG as well as the PPG signal were observed to decrease at a significant rate from the instant of intake; that is, the spectral parameters of heart activity decrease due to the consumption of energy drinks. The spectral analysis of the LDF signal shows a similar decrease in its spectral parameters for the same energy drink consumption. These results reflect adverse impacts of energy drink consumption on heart activity.

  14. Performances and Spending Efficiency in Higher Education: A European Comparison through Non-Parametric Approaches

    Science.gov (United States)

    Agasisti, Tommaso

    2011-01-01

    The objective of this paper is an efficiency analysis concerning higher education systems in European countries. Data have been extracted from OECD data-sets (Education at a Glance, several years), using a non-parametric technique--data envelopment analysis--to calculate efficiency scores. This paper represents the first attempt to conduct such an…

  15. Spectral quantitation by principal component analysis using complex singular value decomposition.

    Science.gov (United States)

    Elliott, M A; Walter, G A; Swift, A; Vandenborne, K; Schotland, J C; Leigh, J S

    1999-03-01

    Principal component analysis (PCA) is a powerful method for quantitative analysis of nuclear magnetic resonance spectral data sets. It has the advantage of being model independent, making it well suited for the analysis of spectra with complicated or unknown line shapes. Previous applications of PCA have required that all spectra in a data set be in phase or have implemented iterative methods to analyze spectra that are not perfectly phased. However, improper phasing or imperfect convergence of the iterative methods has resulted in systematic errors in the estimation of peak areas with PCA. Presented here is a modified method of PCA, which utilizes complex singular value decomposition (SVD) to analyze spectral data sets with any amount of variation in spectral phase. The new method is shown to be completely insensitive to spectral phase. In the presence of noise, PCA with complex SVD yields a lower variation in the estimation of peak area than conventional PCA by a factor of approximately 2. The performance of the method is demonstrated with simulated data and in vivo 31P spectra from human skeletal muscle.
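
    The phase insensitivity can be demonstrated with a small simulation: for spectra that differ only in amplitude and an arbitrary zero-order phase, the complex SVD of the data matrix is essentially rank one, and the magnitudes of the first left singular vector recover the relative amplitudes. The line shape, noise level and sizes below are assumptions for illustration.

```python
# Illustrative sketch (assumed line shape and noise level): a rank-one complex SVD recovers
# relative peak amplitudes from spectra carrying arbitrary zero-order phase errors, the
# property exploited by PCA with complex SVD.
import numpy as np

rng = np.random.default_rng(8)
n_spectra, n_points = 40, 512
freq = np.linspace(-5, 5, n_points)
line_shape = 1.0 / (1.0 + ((freq - 1.0) / 0.3) ** 2)       # Lorentzian peak

true_amps = rng.uniform(0.5, 2.0, n_spectra)
phases = rng.uniform(0, 2 * np.pi, n_spectra)              # uncorrected zero-order phases
data = (true_amps * np.exp(1j * phases))[:, None] * line_shape
data += 0.01 * (rng.standard_normal(data.shape) + 1j * rng.standard_normal(data.shape))

U, S, Vh = np.linalg.svd(data, full_matrices=False)         # complex SVD
est_amps = np.abs(U[:, 0]) * S[0]                           # phase-insensitive amplitude estimates

corr = np.corrcoef(true_amps, est_amps)[0, 1]
print(f"correlation with true amplitudes despite random phases: {corr:.4f}")
```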

  16. Spectral Analysis of Acceleration Data for Detection of Generalized Tonic-Clonic Seizures

    Science.gov (United States)

    Joo, Hyo Sung; Han, Su-Hyun; Lee, Jongshill; Jang, Dong Pyo; Kang, Joong Koo; Woo, Jihwan

    2017-01-01

    Generalized tonic-clonic seizures (GTCSs) can be underestimated and can also increase mortality rates. The monitoring devices used to detect GTCS events in daily life are very helpful for early intervention and precise estimation of seizure events. Several studies have introduced methods for GTCS detection using an accelerometer (ACM), electromyography, or electroencephalography. However, these studies need to be improved with respect to accuracy and user convenience. This study proposes the use of an ACM banded to the wrist and spectral analysis of ACM data to detect GTCS in daily life. The spectral weight function dependent on GTCS was used to compute a GTCS-correlated score that can effectively discriminate between GTCS and normal movement. Compared to the performance of the previous temporal method, which used a standard deviation method, the spectral analysis method resulted in better sensitivity and fewer false positive alerts. Finally, the spectral analysis method can be implemented in a GTCS monitoring device using an ACM and can provide early alerts to caregivers to prevent risks associated with GTCS. PMID:28264522

  17. Nonparametric signal processing validation in T-wave alternans detection and estimation.

    Science.gov (United States)

    Goya-Esteban, R; Barquero-Pérez, O; Blanco-Velasco, M; Caamaño-Fernández, A J; García-Alberola, A; Rojo-Álvarez, J L

    2014-04-01

    Although a number of methods have been proposed for T-Wave Alternans (TWA) detection and estimation, their performance strongly depends on their signal processing stages and on their free parameters tuning. The dependence of the system quality with respect to the main signal processing stages in TWA algorithms has not yet been studied. This study seeks to optimize the final performance of the system by successive comparisons of pairs of TWA analysis systems, with one single processing difference between them. For this purpose, a set of decision statistics are proposed to evaluate the performance, and a nonparametric hypothesis test (from Bootstrap resampling) is used to make systematic decisions. Both the temporal method (TM) and the spectral method (SM) are analyzed in this study. The experiments were carried out in two datasets: first, in semisynthetic signals with artificial alternant waves and added noise; second, in two public Holter databases with different documented risk of sudden cardiac death. For semisynthetic signals (SNR = 15 dB), after the optimization procedure, a reduction of 34.0% (TM) and 5.2% (SM) of the power of TWA amplitude estimation errors was achieved, and the power of error probability was reduced by 74.7% (SM). For Holter databases, appropriate tuning of several processing blocks, led to a larger intergroup separation between the two populations for TWA amplitude estimation. Our proposal can be used as a systematic procedure for signal processing block optimization in TWA algorithmic implementations.
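
    The decision machinery can be sketched generically: compute a decision statistic for the same records under two configurations that differ in a single processing block, bootstrap the paired differences, and keep the change only if the confidence interval excludes zero. The statistic values below are synthetic; the paper's statistics and datasets are not reproduced.

```python
# Hedged sketch of a bootstrap comparison between two processing configurations evaluated
# on the same records; the numbers are synthetic stand-ins for a real decision statistic.
import numpy as np

rng = np.random.default_rng(9)
n_records = 60
stat_config_a = rng.normal(1.00, 0.30, n_records)     # e.g. estimation error, pipeline A
stat_config_b = rng.normal(0.85, 0.30, n_records)     # same records, pipeline B (one block changed)
diff = stat_config_a - stat_config_b

boot_means = np.array([rng.choice(diff, size=n_records, replace=True).mean()
                       for _ in range(5000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean difference = {diff.mean():.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")
print("keep change" if lo > 0 else "no systematic improvement detected")
```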

  18. Correlative Spectral Analysis of Gamma-Ray Bursts using Swift-BAT and GLAST-GBM

    CERN Document Server

    Stamatikos, Michael; Band, David L

    2008-01-01

    We discuss the preliminary results of spectral analysis simulations involving anticipated correlated multi-wavelength observations of gamma-ray bursts (GRBs) using Swift's Burst Alert Telescope (BAT) and the Gamma-Ray Large Area Space Telescope's (GLAST) Burst Monitor (GLAST-GBM), resulting in joint spectral fits, including characteristic photon energy (Epeak) values, for a conservative annual estimate of ~30 GRBs. The addition of BAT's spectral response will (i) complement in-orbit calibration efforts of GBM's detector response matrices, (ii) augment GLAST's low energy sensitivity by increasing the ~20-100 keV effective area, (iii) facilitate ground-based follow-up efforts of GLAST GRBs by increasing GBM's source localization precision, and (iv) help identify a subset of non-triggered GRBs discovered via off-line GBM data analysis. Such multi-wavelength correlative analyses, which have been demonstrated by successful joint-spectral fits of Swift-BAT GRBs with other higher energy detectors such as Konus-WIND ...

  19. Chemometric analysis for near-infrared spectral detection of beef in fish meal

    Science.gov (United States)

    Yang, Chun-Chieh; Garrido-Novell, Cristóbal; Pérez-Marín, Dolores; Guerrero-Ginel, José E.; Garrido-Varo, Ana; Kim, Moon S.

    2015-05-01

    This paper reports the chemometric analysis of near-infrared spectra drawn from hyperspectral images to develop, evaluate, and compare statistical models for the detection of beef in fish meal. There were 40 pure-fish meal samples, 15 pure-beef meal samples, and 127 fish/beef mixture meal samples prepared for hyperspectral line-scan imaging by a machine vision system. Spectral data for 3600 pixels per sample, each providing an individual spectrum, were retrieved from the region of interest (ROI) in every sample image. The spectral data spanning 969 nm to 1551 nm (across 176 spectral bands) were analyzed. Statistical models were built using the principal component analysis (PCA) and partial least squares regression (PLSR) methods. The models were created and developed using the spectral data from the pure-fish meal and pure-beef meal samples, and were tested and evaluated using the data from the ROI in the mixture meal samples. The results showed that, with a ROI as large as 3600 pixels to cover a sufficient area of a mixture meal sample, the detection rate of beef in fish meal reached a satisfactory 99.2% by PCA and 98.4% by PLSR.
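
    A hedged sketch of the PLSR side of such a workflow is shown below using scikit-learn: the model is trained on pure-class spectra and then used to score mixture spectra. The synthetic spectra, the number of latent components and the 0.1 decision threshold are assumptions, not the paper's calibration.

```python
# Hedged sketch (synthetic spectra, not the paper's data): PLS regression trained on
# pure-class spectra and used to score mixture spectra for the presence of beef.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(10)
n_bands = 176
fish_mean = np.sin(np.linspace(0, 3, n_bands))
beef_mean = np.cos(np.linspace(0, 3, n_bands))

def simulate(mean_spec, n):
    return mean_spec + 0.05 * rng.standard_normal((n, n_bands))

X_train = np.vstack([simulate(fish_mean, 40), simulate(beef_mean, 15)])
y_train = np.r_[np.zeros(40), np.ones(15)]                 # 0 = pure fish, 1 = pure beef

pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)

# "Mixture" pixels: 80% fish + 20% beef; flag a pixel if its PLS score exceeds 0.1
mixture = simulate(0.8 * fish_mean + 0.2 * beef_mean, 200)
scores = pls.predict(mixture).ravel()
print(f"flagged {np.mean(scores > 0.1):.1%} of mixture pixels as containing beef")
```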

  20. Duplex-Doppler spectral analysis in the physiopathology of the temporomandibular joint.

    Science.gov (United States)

    Marini, M; Odoardi, G L; Bolle, G; Tartaglia, P

    1994-01-01

    We introduce a new method of analysis of the normal and abnormal behavior of the TMJ, using a duplex-doppler spectral analysis. The method consists in monitoring the joint movement by means of a study of the Fourier transformed signals, which give information on the velocity distribution of the condylo-meniscal complex during the opening and closing phases of the jaw. Using repeated sampling over short time intervals we get a detailed description of the motion which allows to discriminate the normal and abnormal action of the condylomeniscal complex. We are able to identify various physiopathological conditions, among which opening and/or closing clicks, complex locking conditions and anomalies related to an asymmetrical behavior during the operation cycle. Duplex-doppler spectral analysis is correlated to a clinical examination in order to define various classes of anomalies.

  1. 2-D Prony-Huang Transform: A New Tool for 2-D Spectral Analysis

    CERN Document Server

    Schmitt, Jérémy; Borgnat, Pierre; Flandrin, Patrick; Condat, Laurent

    2014-01-01

    This work proposes an extension of the 1-D Hilbert Huang transform for the analysis of images. The proposed method consists in (i) adaptively decomposing an image into oscillating parts called intrinsic mode functions (IMFs) using a mode decomposition procedure, and (ii) providing a local spectral analysis of the obtained IMFs in order to get the local amplitudes, frequencies, and orientations. For the decomposition step, we propose two robust 2-D mode decompositions based on non-smooth convex optimization: a "Genuine 2-D" approach, that constrains the local extrema of the IMFs, and a "Pseudo 2-D" approach, which constrains separately the extrema of lines, columns, and diagonals. The spectral analysis step is based on Prony annihilation property that is applied on small square patches of the IMFs. The resulting 2-D Prony-Huang transform is validated on simulated and real data.

  2. Determination of defect in rotor of induction machine by spectral analysis of stator phase current

    Directory of Open Access Journals (Sweden)

    Myrteza Braneshi

    2010-10-01

    Full Text Available Induction motors are an important part of the safe and efficient running of any industrial plant. These motors are often used in industrial applications thanks to their usability and robustness. Faults and failures of an induction machine can lead to excessive process downtime, generate large losses in revenue and require long-term maintenance. Early detection of motor abnormalities would help avoid costly breakdowns. In this paper a diagnostic technique for induction motor broken rotor bars is presented. The applied method is the so-called Motor Current Signature Analysis (MCSA), which utilizes the results of spectral analysis of the stator current. Broken rotor bars and rings cause sidebands at twice the slip frequency around the supply frequency. The fault detection method consists of monitoring the stator phase current spectrum: twice-slip-frequency sidebands around the main frequency, detected by spectral analysis, are an indicator of broken bars. The experimental results show the efficiency of the method.
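
    The sideband signature is easy to illustrate: with supply frequency f and slip s, broken bars produce components at (1 ± 2s)f. The sketch below injects such sidebands into a synthetic stator current and reads their level off the FFT spectrum; the motor parameters and fault amplitude are assumed values.

```python
# Hedged sketch (synthetic current, assumed motor parameters): locate the (1 ± 2s)·f_supply
# sidebands in the stator-current spectrum that indicate broken rotor bars.
import numpy as np

fs, f_supply, slip = 5000.0, 50.0, 0.03
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(11)

current = np.sin(2 * np.pi * f_supply * t)                      # fundamental
for fault_f in [(1 - 2 * slip) * f_supply, (1 + 2 * slip) * f_supply]:
    current += 0.02 * np.sin(2 * np.pi * fault_f * t)           # fault sidebands (~ -34 dB)
current += 0.01 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(current * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
db = 20 * np.log10(spectrum / spectrum.max())

for name, f0 in [("lower sideband", (1 - 2 * slip) * f_supply),
                 ("upper sideband", (1 + 2 * slip) * f_supply)]:
    idx = np.argmin(np.abs(freqs - f0))
    print(f"{name} at {freqs[idx]:.1f} Hz: {db[idx]:.1f} dB relative to the fundamental")
```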

  3. Joint Spectral Analysis for Early Bright X-ray Flares of γ-Ray Bursts with Swift BAT and XRT Data

    Indian Academy of Sciences (India)

    Fang-Kun Peng; You-Dong Hu; Xiang-Gao Wang; Rui-Jing Lu; En-Wei Liang

    2014-09-01

    A joint spectral analysis of early bright X-ray flares that were simultaneously observed with Swift BAT and XRT is presented. Both the BAT and XRT lightcurves of these flares are correlated. Our joint spectral analysis shows that the radiation in the two energy bands comes from the same spectral component, which can be well fitted with a single power law. Except for the flares in GRBs 060904B and 100906A, the photon spectral indices are < 2.0, indicating that the peak energies (E_p) of the prompt γ-rays should be above the high-energy end of the BAT band.

  4. A nonparametric and diversified portfolio model

    Science.gov (United States)

    Shirazi, Yasaman Izadparast; Sabiruzzaman, Md.; Hamzah, Nor Aishah

    2014-07-01

    Traditional portfolio models, like mean-variance (MV) suffer from estimation error and lack of diversity. Alternatives, like mean-entropy (ME) or mean-variance-entropy (MVE) portfolio models focus independently on the issue of either a proper risk measure or the diversity. In this paper, we propose an asset allocation model that compromise between risk of historical data and future uncertainty. In the new model, entropy is presented as a nonparametric risk measure as well as an index of diversity. Our empirical evaluation with a variety of performance measures shows that this model has better out-of-sample performances and lower portfolio turnover than its competitors.

  5. Non-Parametric Estimation of Correlation Functions

    DEFF Research Database (Denmark)

    Brincker, Rune; Rytter, Anders; Krenk, Steen

    In this paper three methods of non-parametric correlation function estimation are reviewed and evaluated: the direct method, estimation by the Fast Fourier Transform and finally estimation by the Random Decrement technique. The basic ideas of the techniques are reviewed, sources of bias are pointed out, and methods to prevent bias are presented. The techniques are evaluated by comparing their speed and accuracy on the simple case of estimating auto-correlation functions for the response of a single degree-of-freedom system loaded with white noise.
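
    As a minimal sketch of the FFT-based route (the Wiener-Khinchin estimate), the snippet below estimates a normalised auto-correlation function; zero-padding avoids circular wrap-around bias. The AR(2) surrogate of a single degree-of-freedom response is an illustrative assumption.

```python
import numpy as np

def autocorrelation_fft(x, max_lag):
    """Biased, FFT-based estimate of the normalised auto-correlation function."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    nfft = int(2 ** np.ceil(np.log2(2 * n)))           # zero-pad to avoid circular bias
    spectrum = np.abs(np.fft.rfft(x, nfft)) ** 2        # periodogram
    acov = np.fft.irfft(spectrum)[:max_lag + 1] / n     # biased auto-covariance
    return acov / acov[0]                                # normalise so rho(0) = 1

# Lightly damped AR(2) process standing in for a white-noise-excited SDOF response
rng = np.random.default_rng(0)
x = np.zeros(20000)
for k in range(2, len(x)):
    x[k] = 1.95 * x[k - 1] - 0.96 * x[k - 2] + rng.standard_normal()

print(np.round(autocorrelation_fft(x, 4), 3))
```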

  6. Nonparametric inferences for kurtosis and conditional kurtosis

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-heng; HE You-hua

    2009-01-01

    Under the assumption of a strictly stationary process, this paper proposes a nonparametric model to test the kurtosis and conditional kurtosis of risk time series. We apply this method to the daily returns of the S&P500 index and the Shanghai Composite Index, and simulate GARCH data to verify the efficiency of the presented model. Our results indicate that the risk series distribution is heavily tailed, but the historical information can make its future distribution light-tailed. However, the tails of the far-future distribution are little affected by the historical data.

  7. Preliminary results on nonparametric facial occlusion detection

    Directory of Open Access Journals (Sweden)

    Daniel LÓPEZ SÁNCHEZ

    2016-10-01

    Full Text Available The problem of face recognition has been extensively studied in the available literature; however, some aspects of this field require further research. The design and implementation of face recognition systems that can efficiently handle unconstrained conditions (e.g., pose variations, illumination, partial occlusion...) is still an area under active research. This work focuses on the design of a new nonparametric occlusion detection technique. In addition, we present some preliminary results that indicate that the proposed technique might be useful to face recognition systems, allowing them to dynamically discard occluded face parts.

  8. The spectral analysis of motion: An "open field" activity test example

    Directory of Open Access Journals (Sweden)

    Obradović Z.

    2013-01-01

    Full Text Available In this work we describe a new mathematical approach, based on spectral analysis of the data, to evaluate position and motion in "open field" experiments. The aim of this work is to introduce several new parameters mathematically derived from experimental data by means of spectral analysis, and to quantitatively estimate the quality of the motion. Two original software packages (TRACKER and POSTPROC) were used to transform video data into a log file suitable for further computational analysis, and to perform the analysis from the log file. As an example, results obtained from experiments with Wistar rats in the "open field" test are included. The test group of animals was treated with diazepam. Our results demonstrate that all the calculated parameters, such as movement variability, acceleration and deceleration, were significantly lower in the test group compared to the control group. We believe that the application of parameters obtained by spectral analysis could be of great significance in assessing locomotion impairment in any kind of motion. [Project of the Ministry of Science of the Republic of Serbia, no. III41007 and no. ON174028]

  9. Non-Parametric Bayesian Updating within the Assessment of Reliability for Offshore Wind Turbine Support Structures

    DEFF Research Database (Denmark)

    Ramirez, José Rangel; Sørensen, John Dalsgaard

    2011-01-01

    This work illustrates the updating and incorporation of information in the assessment of fatigue reliability for offshore wind turbines. The new information, coming from external and condition monitoring, can be used for direct updating of the stochastic variables through a non-parametric Bayesian updating approach and can be integrated in the reliability analysis by a third-order polynomial chaos expansion approximation. Although classical Bayesian updating approaches are often used because of their parametric formulation, non-parametric approaches are better alternatives for multi-parametric updating with a non-conjugating formulation. The results in this paper show the influence on the time-dependent updated reliability when non-parametric and classical Bayesian approaches are used. Further, the influence on the reliability of the number of updated parameters is illustrated.

  10. Parallel implementation of the multiple endmember spectral mixture analysis algorithm for hyperspectral unmixing

    Science.gov (United States)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    In the last decade, the issue of endmember variability has received considerable attention, particularly when each pixel is modeled as a linear combination of endmembers or pure materials. As a result, several models and algorithms have been developed to account for the effect of endmember variability in spectral unmixing and possibly include multiple endmembers in the spectral unmixing stage. One of the most popular approaches for this purpose is the multiple endmember spectral mixture analysis (MESMA) algorithm. The procedure executed by MESMA can be summarized as follows: (i) first, a standard linear spectral unmixing (LSU) or fully constrained linear spectral unmixing (FCLSU) algorithm is run in an iterative fashion; (ii) then, different endmember combinations, randomly selected from a spectral library, are used to decompose each mixed pixel; (iii) finally, the model with the best fit, i.e., with the lowest root mean square error (RMSE) in the reconstruction of the original pixel, is adopted. However, this procedure can be computationally very expensive because several endmember combinations need to be tested and several abundance estimation steps need to be conducted, a fact that compromises the use of MESMA in applications under real-time constraints. In this paper we develop (for the first time in the literature) an efficient implementation of MESMA on different platforms using OpenCL, an open standard for parallel programming on heterogeneous systems. Our experiments have been conducted using a simulated data set and the clMAGMA mathematical library. Implementations of this kind, using the same descriptive language on different architectures, are very important in order to assess the feasibility of using heterogeneous platforms for efficient hyperspectral image processing in real remote sensing missions.
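
    The per-pixel model search in steps (i)-(iii) can be sketched as below: every candidate endmember combination unmixes the pixel (non-negative least squares stands in for FCLSU here) and the combination with the lowest reconstruction RMSE wins. Library, pixel and problem sizes are simulated assumptions; the paper's contribution is the OpenCL parallelisation of exactly this search.

```python
import itertools
import numpy as np
from scipy.optimize import nnls

def mesma_pixel(pixel, library, n_endmembers=2):
    """Try all endmember combinations and keep the one with the lowest RMSE."""
    best = (np.inf, None, None)
    for combo in itertools.combinations(range(library.shape[1]), n_endmembers):
        E = library[:, combo]                       # candidate endmember matrix
        abundances, _ = nnls(E, pixel)              # abundance estimation step
        rmse = np.sqrt(np.mean((pixel - E @ abundances) ** 2))
        if rmse < best[0]:
            best = (rmse, combo, abundances)
    return best

rng = np.random.default_rng(1)
library = rng.random((50, 8))                       # 50 bands, 8 library spectra (assumed)
pixel = 0.6 * library[:, 2] + 0.4 * library[:, 5]   # synthetic mixed pixel
rmse, combo, abund = mesma_pixel(pixel, library)
print(combo, np.round(abund, 2), round(rmse, 4))    # expected: (2, 5), ~[0.6, 0.4]
```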

  11. Clues to Coral Reef Ecosystem Health: Spectral Analysis Coupled with Radiative Transfer Modeling

    Science.gov (United States)

    Guild, L.; Ganapol, B.; Kramer, P.; Armstrong, R.; Gleason, A.; Torres, J.; Johnson, L.; Garfield, N.

    2003-12-01

    Coral reefs are among the world's most productive and biologically rich ecosystems and are some of the oldest ecosystems on Earth. Coralline structures protect coastlines from storms, maintain a high diversity of marine life, and provide nurseries for marine species. Coral reefs play a role in carbon cycling through high rates of organic carbon metabolism and calcification. Coral reefs provide fisheries habitats that are the sole protein source for humans on remote islands. Reefs respond immediately to environmental change and therefore are considered "canaries" of the oceans. However, the world's reefs are in peril: they have shrunk 10-50% from their historical extent due to climate change and anthropogenic activity. An important contribution to coral reef research is improved spectral distinction of reef species' health where anthropogenic activity and climate change impacts are high. Relatively little is known concerning the spectral properties of coral or how coral structures reflect and transmit light. New insights into optical processes of corals under stressed conditions can lead to improved interpretation of airborne and satellite data and forecasting of immediate or long-term impacts of events such as bleaching and disease in coral. We are investigating the spatial and spectral resolution required to remotely detect changes in reef health by coupling spectral analysis of in situ spectra and airborne spectral data with a new radiative transfer model called CorMOD2. Challenges include light attenuation by the water column, atmospheric scattering, and scattering caused by the corals themselves, all of which confound the spectral signal. In CorMOD2, input coral reflectance measurements produce modeled absorption through an inversion at each visible wavelength. The first model development phase of CorMOD2 imposes a scattering baseline that is constant regardless of coral condition, and further specifies that coral is optically thick. Evolution of CorMOD2 is towards a coral ...

  12. EEG Resolutions in Detecting and Decoding Finger Movements from Spectral Analysis

    Directory of Open Access Journals (Sweden)

    Ran Xiao

    2015-09-01

    Full Text Available Mu/beta rhythms are well-studied brain activities that originate from sensorimotor cortices. These rhythms reveal spectral changes in the alpha and beta bands induced by movements of different body parts, e.g. hands and limbs, in electroencephalography (EEG) signals. However, less can be revealed in them about movements of different fine body parts that activate adjacent brain regions, such as individual fingers from one hand. Several studies have reported spatial and temporal couplings of rhythmic activities at different frequency bands, suggesting the existence of well-defined spectral structures across multiple frequency bands. In the present study, spectral principal component analysis (PCA) was applied to EEG data, obtained from a finger movement task, to identify cross-frequency spectral structures. Features from the identified spectral structures were examined in terms of their spatial patterns, cross-condition pattern changes, capability to detect finger movements from resting, and performance in decoding individual finger movements, in comparison to classic mu/beta rhythms. These new features reveal spatial and spectral patterns that are partly similar to, but largely distinct from, classic mu/beta rhythms. Decoding results further indicate that these new features (91%) can detect finger movements much better than classic mu/beta rhythms (75.6%). More importantly, these new features reveal discriminative information about movements of different fingers (fine body-part movements), which is not available in classic mu/beta rhythms. The capability of decoding fingers (and, in the future, hand gestures) from EEG will contribute significantly to the development of noninvasive brain-computer interfaces (BCI) and neuroprostheses with intuitive and flexible controls.
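
    A toy sketch of the "spectral PCA" step, under the assumption that Welch spectra are computed per trial and PCA across frequency bins extracts cross-frequency structures whose scores serve as features; the random single-channel trials below stand in for real EEG and this is not the authors' pipeline.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA

fs = 250                                     # sampling rate [Hz], assumed
rng = np.random.default_rng(0)
trials = rng.standard_normal((40, fs * 2))   # 40 single-channel trials of 2 s each

freqs, psd = welch(trials, fs=fs, nperseg=fs)    # one power spectrum per trial
log_psd = np.log10(psd)                          # stabilise variance across bins

pca = PCA(n_components=3)                        # spectral PCA across frequency bins
scores = pca.fit_transform(log_psd)              # trial-wise spectral features
print(scores.shape, pca.explained_variance_ratio_.round(2))
```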

  13. NONPARAMETRIC FIXED EFFECT PANEL DATA MODELS: RELATIONSHIP BETWEEN AIR POLLUTION AND INCOME FOR TURKEY

    Directory of Open Access Journals (Sweden)

    Rabia Ece OMAY

    2013-06-01

    Full Text Available In this study, the relationship between gross domestic product (GDP) per capita and sulfur dioxide (SO2) and particulate matter (PM10) per capita is modeled for Turkey. Nonparametric fixed effect panel data analysis is used for the modeling. The panel data cover 12 territories, at the first level of the Nomenclature of Territorial Units for Statistics (NUTS), for the period 1990-2001. In modeling the relationship between GDP and SO2 and PM10 for Turkey, the nonparametric models gave good results.

  14. Spectral analysis of bacanora (agave-derived liquor) by using FT-Raman spectroscopy

    Science.gov (United States)

    Ortega Clavero, Valentin; Weber, Andreas; Schröder, Werner; Curticapean, Dan

    2016-04-01

    The industry of the agave-derived bacanora, in the northern Mexican state of Sonora, has been growing substantially in recent years. However, this higher demand is still influenced by a variety of social, legal, cultural, ecological and economic factors. The governmental institutions of the state have tried to encourage sustainable development and certain levels of standardization in the production of bacanora by applying different economic and legal strategies. However, a large portion of this alcoholic beverage is still produced in a traditional and rudimentary fashion. Beyond the quality of the beverage, the lack of proper control using adequate instrumental methods might represent a health risk, as in several cases traditionally distilled beverages can contain elevated levels of harmful materials. The present article describes the qualitative spectral analysis of samples of the traditionally produced distilled beverage bacanora in the range from 0 cm-1 to 3500 cm-1 using a Fourier transform Raman spectrometer. This particular technique has not previously been explored for the analysis of bacanora, as it has for other beverages, including tequila. The proposed instrumental arrangement for the spectral analysis has been built by combining conventional hardware parts (Michelson interferometer, photo-diodes, visible laser, etc.) and a set of self-developed evaluation algorithms. The resulting spectral information has been compared to that of pure ethanol samples and to the spectra of different samples of the alcoholic beverage tequila. The proposed instrumental arrangement can be used for the analysis of bacanora.

  15. [Analysis of software for identifying spectral line of laser-induced breakdown spectroscopy based on LabVIEW].

    Science.gov (United States)

    Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang

    2012-03-01

    A self-designed spectral line identification program for LIBS is introduced. Integrated with LabVIEW, the software can smooth spectral lines and pick peaks. Second-difference and threshold methods are employed. Characteristic spectra of several elements are matched against the NIST database, enabling automatic spectral line identification and qualitative analysis of the basic composition of a sample. This software can analyze spectra handily and rapidly and will be a useful tool for LIBS.
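
    A minimal NumPy stand-in (not the LabVIEW code) for the smoothing plus second-difference plus threshold peak picking mentioned above; the synthetic spectrum, window length and threshold are illustrative assumptions.

```python
import numpy as np

def pick_peaks(spectrum, smooth_window=5, threshold=0.02):
    """Moving-average smoothing, then keep local maxima with a strongly negative second difference."""
    kernel = np.ones(smooth_window) / smooth_window
    smoothed = np.convolve(spectrum, kernel, mode="same")
    second_diff = np.diff(smoothed, 2)                       # second difference
    candidates = np.where(second_diff < -threshold)[0] + 1   # threshold test
    return [i for i in candidates
            if smoothed[i] >= smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]]

# Synthetic spectrum with two narrow lines
x = np.linspace(0, 1, 500)
spectrum = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((x - 0.7) / 0.01) ** 2)
print(pick_peaks(spectrum))   # indices near the two line centres (~150 and ~349)
```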

  16. Novel Spectral Representations and Sparsity-Driven Algorithms for Shape Modeling and Analysis

    Science.gov (United States)

    Zhong, Ming

    In this dissertation, we focus on extending classical spectral shape analysis by incorporating spectral graph wavelets and sparsity-seeking algorithms. Defined with the graph Laplacian eigenbasis, spectral graph wavelets are localized both in the vertex domain and the graph spectral domain, and thus are very effective in describing local geometry. With a rich dictionary of elementary vectors and suitable sparsity constraints, a real-life signal can often be well approximated by a very sparse coefficient representation. The many successful applications of sparse signal representation in computer vision and image processing inspire us to explore the idea of employing sparse modeling techniques with a dictionary of spectral bases to solve various shape modeling problems. Conventional spectral mesh compression uses the eigenfunctions of the mesh Laplacian as shape bases, which are highly inefficient in representing local geometry. To ameliorate this, we advocate an innovative approach to 3D mesh compression using spectral graph wavelets as the dictionary to encode mesh geometry. The spectral graph wavelets are locally defined at individual vertices and can better capture local shape information than the Laplacian eigenbasis. The multi-scale SGWs form a redundant dictionary as shape basis, so we formulate the compression of a 3D shape as a sparse approximation problem that can be readily handled by greedy pursuit algorithms. Surface inpainting refers to the completion or recovery of missing shape geometry based on the shape information that is currently available. We devise a new surface inpainting algorithm founded upon the theory and techniques of sparse signal recovery. Instead of estimating the missing geometry directly, our method finds a low-dimensional representation that describes the entire original shape. More specifically, we find that, for many shapes, the vertex coordinate function can be well approximated by a very sparse coefficient representation with ...

  17. Detector level ABI spectral response function: FM4 analysis and comparison for different ABI modules

    Science.gov (United States)

    Efremova, Boryana; Pearlman, Aaron J.; Padula, Frank; Wu, Xiangqian

    2016-09-01

    A new generation of imaging instrument, the Advanced Baseline Imager (ABI), is to be launched aboard the Geostationary Operational Environmental Satellites - R Series (GOES-R). Four ABI flight modules (FM) are planned to be launched on GOES-R, S, T and U, the first one in the fall of 2016. Pre-launch testing is ongoing for FM3 and FM4. ABI has 16 spectral channels, six in the visible/near infrared (VNIR 0.47 - 2.25 μm), and ten in the thermal infrared (TIR 3.9 - 13.3 μm) spectral regions, to be calibrated on-orbit by observing respectively a solar diffuser and a blackbody. Each channel has hundreds of detectors arranged in columns. Operationally, one Analytic Generation of Spectral Response (ANGEN) function will be used to represent the spectral response function (SRF) of all detectors in a band. The Vendor conducted prelaunch end-to-end SRF testing to compare to ANGEN; detector-specific SRF data were taken for: i) best detector selected (BDS) mode - for FM 2, 3, and 4; and ii) all detectors (column mode) - for four spectral bands in FM3 and FM4. The GOES-R calibration working group (CWG) has independently used the SRF test data for FM2 and FM3 to study the potential impact of detector-to-detector SRF differences on the ABI-detected Earth-view radiances. In this paper we expand the CWG analysis to include the FM4 SRF test data - the results are in agreement with the Vendor analysis and show excellent instrument performance; we also compare the detector-to-detector SRF differences and their potential impact on the detected Earth-view radiances for all of the tested ABI modules.

  18. Bayesian Nonparametric Clustering for Positive Definite Matrices.

    Science.gov (United States)

    Cherian, Anoop; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2016-05-01

    Symmetric Positive Definite (SPD) matrices emerge as data descriptors in several applications of computer vision such as object tracking, texture recognition, and diffusion tensor imaging. Clustering these data matrices forms an integral part of these applications, for which soft-clustering algorithms (K-Means, expectation maximization, etc.) are generally used. As is well known, these algorithms need the number of clusters to be specified, which is difficult when the dataset scales. To address this issue, we resort to the classical nonparametric Bayesian framework by modeling the data as a mixture model using the Dirichlet process (DP) prior. Since these matrices do not conform to Euclidean geometry, but rather belong to a curved Riemannian manifold, existing DP models cannot be directly applied. Thus, in this paper, we propose a novel DP mixture model framework for SPD matrices. Using the log-determinant divergence as the underlying dissimilarity measure to compare these matrices, and further using the connection between this measure and the Wishart distribution, we derive a novel DPM model based on the Wishart-Inverse-Wishart conjugate pair. We apply this model to several applications in computer vision. Our experiments demonstrate that our model is scalable to the dataset size and at the same time achieves superior accuracy compared to several state-of-the-art parametric and nonparametric clustering algorithms.

  19. Analyzing single-molecule time series via nonparametric Bayesian inference.

    Science.gov (United States)

    Hines, Keegan E; Bankston, John R; Aldrich, Richard W

    2015-02-03

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  20. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    Science.gov (United States)

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and nonlinearity between the property and the spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signal of the calibration samples and the unknown samples; then the Euclidean distance between the net analyte signal of each unknown sample and those of the calibration samples is calculated and used as a similarity index. According to the defined similarity index, a local calibration set is individually selected for each unknown sample. Finally, a local PLS regression model is built on each local calibration set for each unknown sample. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
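
    A hedged sketch of the local-regression idea: for each unknown sample, a subset of the most similar calibration spectra is selected and a local PLS model is fitted on it. For brevity, plain spectral Euclidean distance stands in for the NAS-based distance used in the paper, and all data below are simulated assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def local_pls_predict(X_cal, y_cal, x_new, k=30, n_components=3):
    """Fit a sample-specific PLS model on the k most similar calibration spectra."""
    distances = np.linalg.norm(X_cal - x_new, axis=1)     # similarity index (simplified)
    local_idx = np.argsort(distances)[:k]                 # local calibration set
    model = PLSRegression(n_components=n_components)
    model.fit(X_cal[local_idx], y_cal[local_idx])
    return float(np.ravel(model.predict(x_new[None, :]))[0])

rng = np.random.default_rng(0)
X_cal = rng.random((200, 100))                  # 200 calibration spectra, 100 wavelengths
y_cal = 2 * X_cal[:, 10] + X_cal[:, 50] + rng.normal(0, 0.01, 200)
x_new = rng.random(100)
print(round(local_pls_predict(X_cal, y_cal, x_new), 3))
```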

  1. Polarized spectral features of human breast tissues through wavelet transform and principal component analysis

    Indian Academy of Sciences (India)

    Anita Gharekhan; Ashok N Oza; M B Sureshkumar; Asima Pradhan; Prasanta K Panigrahi

    2010-12-01

    Fluorescence characteristics of human breast tissues are investigated through wavelet transform and principal component analysis (PCA). The wavelet transform of polarized fluorescence spectra of human breast tissues is found to localize spectral features that can reliably differentiate different tissue types. The emission range in the visible wavelength regime of 500–700 nm is analysed, with excitation at 488 nm using a laser as the excitation source, where flavin and porphyrin are some of the active fluorophores. A number of global and local parameters obtained from principal component analysis of both the high- and low-pass coefficients extracted in the wavelet domain capture spectral variations and subtle changes in the diseased tissues and are clearly identifiable.

  2. [Optimizing spectral region in using near-infrared spectroscopy for donkey milk analysis].

    Science.gov (United States)

    Zheng, Li-Min; Zhang, Lu-Da; Guo, Hui-Yuan; Pang, Kun; Zhang, Wen-Juan; Ren, Fa-Zheng

    2007-11-01

    Donkey milk has attracted more attention in recent years since its nutritional composition shows a higher similarity to human milk than others. Due to the compositional difference between cow milk and donkey milk, the models presently available for cow milk analysis cannot be applied to donkey milk without modification. A rapid and reliable analysis method is required to measure the nutritional composition of donkey milk. Near-infrared spectroscopy is a newly developed method in the food industry, but no literature report was found regarding its application to the analysis of donkey milk. Protein, fat and ash contents and energy value are the major nutritional factors of milk. In the present paper, these factors of donkey milk were investigated by Fourier transform near-infrared (FT-NIR) spectroscopy. The ranges of protein, fat and ash contents, and energy value in the donkey milk samples were 1.15%-2.54%, 0.34%-2.67%, 0.28%-0.57% and 355.87-565.17 cal x kg(-1), respectively. The IR spectra ranged from 3 899.6 to 12 493.4 cm(-1), with a 1 cm(-1) scanning interval. When the partial least squares (PLS) regression algorithm is used to extract information from spectral regions, the additional constraint makes the principal components of matrix X related to the components of Y which is to be analyzed. Various spectral regions and data pretreatment methods were selected for PLS regression model development. A comparison of the whole and optimized spectral regions indicated that models built on the optimum spectral regions were better than those built on the whole spectral region. It was shown that the protein, fat and ash contents, and energy value in donkey milk obtained by chemical methods were well correlated with the respective values predicted by the NIR spectroscopy quantitative analysis model (alpha = 0.05). The RMSEP values were 0.18, 0.117, 0.040 6 and 23.5, respectively, indicating that these predicted values were reliable. These results ...

  3. Toward compressed DMD: spectral analysis of fluid flows using sub-Nyquist-rate PIV data

    CERN Document Server

    Tu, Jonathan H; Kutz, J Nathan; Shang, Jessica K

    2014-01-01

    Dynamic mode decomposition (DMD) is a powerful and increasingly popular tool for performing spectral analysis of fluid flows. However, it requires data that satisfy the Nyquist-Shannon sampling criterion. In many fluid flow experiments, such data are impossible to capture. We propose a new approach that combines ideas from DMD and compressed sensing. Given a vector-valued signal, we take measurements randomly in time (at a sub-Nyquist rate) and project the data onto a low-dimensional subspace. We then use compressed sensing to identify the dominant frequencies in the signal and their corresponding modes. We demonstrate this method using two examples, analyzing both an artificially constructed test dataset and particle image velocimetry data collected from the flow past a cylinder. In each case, our method correctly identifies the characteristic frequencies and oscillatory modes dominating the signal, proving the proposed method to be a capable tool for spectral analysis using sub-Nyquist-rate sampling.
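
    The frequency-identification idea can be illustrated with a hedged single-channel sketch: randomly timed (sub-Nyquist on average) samples are fit with a sparse combination of candidate oscillations via orthogonal matching pursuit, and the selected atoms reveal the dominant frequencies. The frequencies, grid, sample count and the assumption that the number of modes is known are all illustrative; the paper works with vector-valued PIV fields and recovers DMD modes.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
true_freqs = [3.0, 11.0]                       # Hz, assumed
t = np.sort(rng.uniform(0.0, 4.0, 60))         # 60 random sample times over 4 s
signal = sum(np.cos(2 * np.pi * f * t) for f in true_freqs)

grid = np.arange(0.5, 20.0, 0.5)               # candidate frequency grid [Hz]
dictionary = np.hstack([np.cos(2 * np.pi * grid * t[:, None]),
                        np.sin(2 * np.pi * grid * t[:, None])])

# Sparse recovery: number of nonzero atoms is assumed known here
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2).fit(dictionary, signal)
active = np.flatnonzero(omp.coef_) % len(grid)  # fold cosine/sine atoms together
print(sorted(set(grid[active])))                # expected: [3.0, 11.0]
```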

  4. Localized Spectral Analysis of Fluctuating Power Generation from Solar Energy Systems

    Directory of Open Access Journals (Sweden)

    Johan Nijs

    2007-01-01

    Full Text Available Fluctuations in solar irradiance are a serious obstacle for the future large-scale application of photovoltaics. Occurring regularly with the passage of clouds, they can cause unexpected power variations and introduce voltage dips into the power distribution system. This paper proposes treating such fluctuating time series as realizations of a stochastic, locally stationary, wavelet process. Its local spectral density can be estimated from empirical data by means of wavelet periodograms. The wavelet approach allows the analysis of the amplitude of fluctuations per characteristic scale, and hence the persistence of the fluctuations. Furthermore, conclusions can be drawn on the frequency of occurrence of fluctuations of different scales. This localized spectral analysis was applied to empirical data from two successive years. The approach is especially useful for network planning and load management of power distribution systems containing a high density of photovoltaic generation units.

  5. Voyager 2 solar plasma and magnetic field spectral analysis for intermediate data sparsity

    CERN Document Server

    Gallana, Luca; Iovieno, Michele; Fosson, Sophie M; Magli, Enrico; Opher, Merav; Richardson, John D; Tordella, Daniela

    2015-01-01

    The Voyager probes are the farthest still-active spacecraft ever launched from Earth. During their 38-year trip, they have collected data on solar wind properties (such as the plasma velocity and magnetic field intensity). Unfortunately, a complete time evolution of the measured physical quantities is not available: the time series contain many gaps, which increase in frequency and duration at larger distances. The aim of this work is to perform a spectral and statistical analysis of the solar wind plasma velocity and magnetic field using Voyager 2 data measured in 1979, when the gap/signal ratio is of order unity. This analysis is achieved using four different data reconstruction techniques: averages on linearly interpolated subsets, correlation of linearly interpolated data, compressed sensing spectral estimation, and maximum likelihood data reconstruction. Spanning five frequency decades, the spectra we obtained have the largest frequency range ever computed at 5 astronomical units from the Sun; ...

  6. A distributed microcomputer-controlled system for data acquisition and power spectral analysis of EEG.

    Science.gov (United States)

    Vo, T D; Dwyer, G; Szeto, H H

    1986-04-01

    A relatively powerful and inexpensive microcomputer-based system for the spectral analysis of the EEG is presented. High resolution and speed are achieved with the use of recently available large-scale integrated circuit technology with enhanced functionality (the Intel 8087 math co-processor), which can perform transcendental functions rapidly. The versatility of the system comes from a hardware organization with distributed data acquisition capability, performed by a microprocessor-based analog-to-digital converter with large resident memory (Cyborg ISAAC-2000). Compiled BASIC programs and assembly-language subroutines perform, on-line or off-line, the fast Fourier transform and spectral analysis of the EEG, which is stored as soft as well as hard copy. Some results obtained from test application of the entire system in animal studies are presented.

  7. Spectral decomposition in advection-diffusion analysis by finite element methods

    Energy Technology Data Exchange (ETDEWEB)

    Nickell, R.E.; Gartling, D.K.; Strang, G.

    1978-08-11

    In a recent study of the convergence properties of finite element methods in nonlinear fluid mechanics, an indirect approach was taken. A two-dimensional example with a known exact solution was chosen as the vehicle for the study, and various mesh refinements were tested in an attempt to extract information on the effect of the local Reynolds number. However, more direct approaches are usually preferred. In this study one such direct approach is followed, based upon the spectral decomposition of the solution operator. Spectral decomposition is widely employed as a solution technique for linear structural dynamics problems and can be applied readily to linear, transient heat transfer analysis; in this case, the extension to nonlinear problems is of interest. It was shown previously that spectral techniques were applicable to stiff systems of rate equations, while recent studies of geometrically and materially nonlinear structural dynamics have demonstrated the increased information content of the numerical results. The use of spectral decomposition in nonlinear problems of heat and mass transfer would be expected to yield equally increased flow of information to the analyst, and this information could include a quantitative comparison of various solution strategies, meshes, and element hierarchies.

  8. Attometer resolution spectral analysis based on polarization pulling assisted Brillouin scattering merged with heterodyne detection.

    Science.gov (United States)

    Preussler, Stefan; Schneider, Thomas

    2015-10-05

    Spectral analysis is essential for measuring and monitoring advanced optical communication systems and the characterization of active and passive devices like amplifiers, filters and especially frequency combs. Conventional devices have a limited resolution or tuning range. Therefore, the true spectral shape of the signal remains hidden. In this work, a small part of the signal under test is preselected with help of the polarization pulling effect of stimulated Brillouin scattering where all unwanted spectral components are suppressed. Subsequently, this part is analyzed more deeply through heterodyne detection. Thereby, the local oscillator is generated from a narrow linewidth fiber laser which acts also as pump wave for Brillouin scattering. By scanning the pump wave together with the local oscillator through the signal spectrum, the whole signal is measured. The method is tunable over a broad wavelength range, is not affected by unwanted mixing products and utilizes a conventional narrow bandwidth photo diode. First proof of concept experiments show the measurement of the power spectral density function with a resolution in the attometer or lower kilohertz range at 1550 nm.

  9. Weeds: a CLASS extension for the analysis of millimeter and sub-millimeter spectral surveys

    CERN Document Server

    Maret, S; Pety, J; Bardeau, S; Reynier, E

    2010-01-01

    The advent of large instantaneous bandwidth receivers and high spectral resolution spectrometers on (sub-)millimeter telescopes has opened up the possibilities for unbiased spectral surveys. Because of the large amount of data they contain, any analysis of these surveys requires dedicated software tools. Here we present an extension of the widely used CLASS software that we developed for that purpose. This extension, named Weeds, allows for searches in atomic and molecular line databases (e.g. JPL or CDMS) that may be accessed over the internet using a virtual observatory (VO) compliant protocol. The package permits quick navigation across a spectral survey to search for lines of a given species. Weeds is also capable of modeling a spectrum, as is often needed for line identification. We expect that Weeds will be useful for analyzing and interpreting the spectral surveys that will be done with the HIFI instrument on board Herschel, but also observations carried out with ground-based millimeter and sub-millimeter telescopes ...

  10. Spectral and Image Integrated Analysis of Hyperspectral Data for Waxy Corn Seed Variety Classification

    Directory of Open Access Journals (Sweden)

    Xiaoling Yang

    2015-07-01

    Full Text Available The purity of waxy corn seed is a very important index of seed quality. A novel procedure for the classification of corn seed varieties was developed based on combined spectral, morphological, and texture features extracted from visible and near-infrared (VIS/NIR) hyperspectral images. For the purpose of exploration and comparison, images of both sides of the corn kernels (150 kernels of each variety) were captured and analyzed. The raw spectra were preprocessed with Savitzky-Golay (SG) smoothing and derivation. To reduce the dimension of the spectral data, the spectral feature vectors were constructed using the successive projections algorithm (SPA). Five morphological features (area, circularity, aspect ratio, roundness, and solidity) and eight texture features (energy, contrast, correlation, entropy, and their standard deviations) were extracted as appearance characteristics from every corn kernel. Support vector machines (SVM) and a partial least squares-discriminant analysis (PLS-DA) model were employed to build the classification models for seed variety classification based on different groups of features. The results demonstrate that combining spectral and appearance characteristics yields better classification results. The recognition accuracy achieved with the SVM model (98.2% and 96.3% for the germ side and endosperm side, respectively) was more satisfactory than with the PLS-DA model. This procedure has the potential for use as a new method for seed purity testing.
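
    Two steps of this pipeline can be sketched as below: Savitzky-Golay smoothing/derivation of the raw spectra followed by an SVM classifier. The spectra are simulated assumptions, and SPA wavelength selection and the morphological and texture features are omitted for brevity.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_per_class, n_bands = 150, 200
base_a = np.sin(np.linspace(0, 6, n_bands))      # two synthetic "variety" signatures
base_b = np.cos(np.linspace(0, 6, n_bands))
X = np.vstack([base_a + 0.3 * rng.standard_normal((n_per_class, n_bands)),
               base_b + 0.3 * rng.standard_normal((n_per_class, n_bands))])
y = np.repeat([0, 1], n_per_class)

# Savitzky-Golay smoothing with a first-derivative pretreatment (window/order assumed)
X_sg = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X_sg, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```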

  11. Spectral variability analysis of an XMM-Newton observation of Ark 564

    CERN Document Server

    Brinkmann, W; Raeth, C

    2007-01-01

    We present a spectral variability analysis of the X-ray emission of the Narrow Line Seyfert 1 galaxy Ark 564 using data from a ~100 ks XMM-Newton observation. Taking advantage of the high sensitivity of this long observation and the simple spectral shape of Ark 564, we accurately determine the spectral variability patterns in the source. We use standard cross-correlation methods to investigate the correlations between the soft and hard energy band light curves. We also generated 200 energy spectra from data stretches of 500 s duration each and fitted each one of them with a power law plus a bremsstrahlung component (for the soft excess), and we investigated the correlations between the various best-fit model parameter values. The "power law plus bremsstrahlung" model describes the spectrum well at all times. The iron line and the absorption features, which are found in the time-averaged spectrum of the source, are too weak to affect the results of the time-resolved spectral fits. We find that the power law ...

  12. Numerical Solution of Nonlinear Fredholm Integro-Differential Equations Using Spectral Homotopy Analysis Method

    Directory of Open Access Journals (Sweden)

    Z. Pashazadeh Atabakan

    2013-01-01

    Full Text Available The spectral homotopy analysis method (SHAM), a modification of the homotopy analysis method (HAM), is applied to obtain solutions of high-order nonlinear Fredholm integro-differential problems. The existence and uniqueness of the solution and the convergence of the proposed method are proved. Some examples are given to demonstrate the efficiency and accuracy of the proposed method. The SHAM results show that the proposed approach is quite reasonable when compared to the homotopy analysis method, Lagrange interpolation solutions, and exact solutions.

  13. Spectral analysis, digital integration, and measurement of low backscatter in coherent laser radar

    Science.gov (United States)

    Vaughan, J. M.; Callan, R. D.; Bowdle, D. A.; Rothermel, J.

    1989-01-01

    A method of surface acoustic wave (SAW) spectral analysis and digital integration that has been used previously in coherent CW laser work with CO2 lasers at 10.6 microns is described. Expressions are derived for the signal to noise ratio in the measured voltage spectrum with an approximation for the general case and rigorous treatment for the low signal case. The atmospheric backscatter data accumulated by the airborne LATAS (laser true airspeed) coherent laser radar system are analyzed.

  14. Spectral Analysis Related to Bare-Metal and Drug-Eluting Coronary Stent Implantation

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Rose Mary Ferreira Lisboa da, E-mail: roselisboa@cardiol.br [Faculdade de Medicina da UFMG, Divinópolis, MG (Brazil); Silva, Carlos Augusto Bueno [Faculdade de Medicina da UFMG, Divinópolis, MG (Brazil); Belo Horizonte, Hospital São João de Deus, Divinópolis, MG (Brazil); Greco, Otaviano José [Belo Horizonte, Hospital São João de Deus, Divinópolis, MG (Brazil); Moreira, Maria da Consolação Vieira [Faculdade de Medicina da UFMG, Divinópolis, MG (Brazil)

    2014-08-15

    The autonomic nervous system plays a central role in cardiovascular regulation; sympathetic activation occurs during myocardial ischemia. The objective of this study was to assess the spectral analysis of heart rate variability during stent implantation, comparing the types of stent. The study assessed 61 patients (mean age, 64.0 years; 35 men) with ischemic heart disease and an indication for stenting. Stent implantation was performed under Holter monitoring to record the spectral analysis of heart rate variability (Fourier transform), measuring the low-frequency (LF) and high-frequency (HF) components and the LF/HF ratio before and during the procedure. A bare-metal stent was implanted in 34 patients, while the others received drug-eluting stents. The right coronary artery was approached in 21 patients, the left anterior descending in 28, and the circumflex in 9. As compared with the pre-stenting period, all patients showed an increase in LF and HF during stent implantation (658 versus 185 ms2, p = 0.00; 322 versus 121, p = 0.00, respectively), with no change in LF/HF. During stent implantation, LF was 864 ms2 in patients with bare-metal stents and 398 ms2 in those with drug-eluting stents (p = 0.00). The spectral analysis of heart rate variability showed no association with diabetes mellitus, family history, clinical presentation, beta-blockers, age, or the vessel or its segment. Stent implantation resulted in concomitant sympathetic and vagal activation. Diabetes mellitus, use of beta-blockers, and the vessel approached showed no influence on the spectral analysis of heart rate variability. Sympathetic activation was lower during the implantation of drug-eluting stents.

  15. Spectral analysis of CFB data: Predictive models of Circulating Fluidized Bed combustors

    Energy Technology Data Exchange (ETDEWEB)

    Gamwo, I.K.; Miller, A.; Gidaspow, D.

    1992-04-01

    The overall objective of this investigation is to develop experimentally verified models for circulating fluidized bed (CFB) combustors. Spectral analysis of CFB data obtained at Illinois Institute of Technology shows that the frequencies of pressure oscillations are less than 0.1 Hertz and that they increase with solids volume fraction to the usual value of one Hertz obtained in bubbling beds. These data are consistent with the kinetic theory interpretation of density wave propagation.

  16. High Dynamic Range Spectral Analysis in the kHz Band

    CERN Document Server

    Boccardi, A

    2009-01-01

    Many beam instrumentation signals of large circular accelerators are in the kHz range and can thus be digitised with powerful high resolution ADCs. A particularly large dynamic range can be achieved if the signals are analysed in the frequency domain. This report presents a system employing audio ADCs and FPGA-based spectral analysis, initially developed for tune measurement applications. Technical choices allowing frequency domain dynamic ranges beyond 140 dB are summarised.

  17. Power spectral analysis of cardiovascular variability in patients at risk for sudden cardiac death.

    Science.gov (United States)

    Malliani, A; Lombardi, F; Pagani, M; Cerutti, S

    1994-03-01

    The time series of successive heart periods presents important variations around its mean value, determining the phenomenon of heart rate variability (HRV), which is assessed with both time and frequency domain approaches. A low standard deviation of the heart period (a time domain index of HRV) is a powerful prognostic indicator of sudden coronary death in patients recovering from acute myocardial infarction. Spectral analysis of HRV usually demonstrates two major components, indicated as LF (low frequency, approximately 0.1 Hz) and HF (high frequency, approximately 0.25 Hz). They are defined by center frequency and associated power, which is expressed in msec2 or normalized units. When assessed in normalized units, LF and HF provide quantitative indicators of neural control of the sinoatrial node. Numerous experimental and clinical studies have consistently indicated that the LF component is a marker of sympathetic modulation and HF a marker of vagal modulation; the LF/HF ratio is a synthetic index of sympathovagal balance. In the analysis of 24-hour Holter recordings of normal subjects, a circadian rhythmicity of spectral markers of sympathetic and vagal modulation is clearly present, with a sympathetic predominance during the day and a vagal predominance during the night. In patients recovering from an acute myocardial infarction, spectral analysis of HRV revealed increased sympathetic and decreased vagal activity during early convalescence, and a return to their normal balance by 6 to 12 months. A clear increase of LF was also evident in patients studied within a few hours of the onset of symptoms related to an acute myocardial infarction, independent of its location. Similarly, LF increased during transient myocardial ischemia. An increase in markers of sympathetic activity has also been observed prior to episodes of malignant arrhythmias. Spectral analysis of HRV could help in the understanding of the role of abnormal neural mechanisms in sudden coronary death ...
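
    As a hedged sketch of how LF and HF powers of the kind cited above can be computed, the snippet below resamples a simulated RR-interval series to an evenly spaced tachogram, estimates a Welch spectrum and integrates the conventional 0.04-0.15 Hz and 0.15-0.4 Hz bands; the RR series and band limits are assumptions, not data from this review.

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

rng = np.random.default_rng(0)
n_beats = 600
t_beats = np.cumsum(0.85 + 0.05 * rng.standard_normal(n_beats))  # beat times [s], simulated
rr = np.diff(t_beats)                                             # RR intervals [s]

fs = 4.0                                                          # resampling rate [Hz]
t_even = np.arange(t_beats[1], t_beats[-1], 1 / fs)
rr_even = interp1d(t_beats[1:], rr, kind="cubic")(t_even)         # evenly spaced tachogram

freqs, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])                # integrate the band

lf, hf = band_power(0.04, 0.15), band_power(0.15, 0.40)
print(f"LF = {lf:.2e} s^2, HF = {hf:.2e} s^2, LF/HF = {lf / hf:.2f}")
```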

  18. Spectral Analysis Related to Bare-Metal and Drug-Eluting Coronary Stent Implantation

    Directory of Open Access Journals (Sweden)

    Rose Mary Ferreira Lisboa da Silva

    2014-08-01

    Full Text Available Background: The autonomic nervous system plays a central role in cardiovascular regulation; sympathetic activation occurs during myocardial ischemia. Objective: To assess the spectral analysis of heart rate variability during stent implantation, comparing the types of stent. Methods: This study assessed 61 patients (mean age, 64.0 years; 35 men) with ischemic heart disease and an indication for stenting. Stent implantation was performed under Holter monitoring to record the spectral analysis of heart rate variability (Fourier transform), measuring the low-frequency (LF) and high-frequency (HF) components, and the LF/HF ratio before and during the procedure. Results: A bare-metal stent was implanted in 34 patients, while the others received drug-eluting stents. The right coronary artery was approached in 21 patients, the left anterior descending, in 28, and the circumflex, in 9. As compared with the pre-stenting period, all patients showed an increase in LF and HF during stent implantation (658 versus 185 ms2, p = 0.00; 322 versus 121, p = 0.00, respectively), with no change in LF/HF. During stent implantation, LF was 864 ms2 in patients with bare-metal stents, and 398 ms2 in those with drug-eluting stents (p = 0.00). The spectral analysis of heart rate variability showed no association with diabetes mellitus, family history, clinical presentation, beta-blockers, age, and vessel or its segment. Conclusions: Stent implantation resulted in concomitant sympathetic and vagal activations. Diabetes mellitus, use of beta-blockers, and the vessel approached showed no influence on the spectral analysis of heart rate variability. Sympathetic activation was lower during the implantation of drug-eluting stents.

  19. Nonparametric estimation of population density for line transect sampling using FOURIER series

    Science.gov (United States)

    Crain, B.R.; Burnham, K.P.; Anderson, D.R.; Lake, J.L.

    1979-01-01

    A nonparametric, robust density estimation method is explored for the analysis of right-angle distances from a transect line to the objects sighted. The method is based on the FOURIER series expansion of a probability density function over an interval. With only mild assumptions, a general population density estimator of wide applicability is obtained.
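
    A minimal sketch of the Fourier (cosine) series density estimator on [0, w] that this approach is based on: fhat(x) = 1/w + sum_k a_k cos(k*pi*x/w) with a_k = (2/(n*w)) * sum_i cos(k*pi*x_i/w), where fhat(0) feeds the line-transect density estimate. The half-normal distances, truncation distance and number of terms below are simulated assumptions.

```python
import numpy as np

def fourier_density(distances, w, n_terms=4):
    """Fourier cosine series estimate of the perpendicular-distance density on [0, w]."""
    n = len(distances)
    a = [(2.0 / (n * w)) * np.sum(np.cos(k * np.pi * distances / w))
         for k in range(1, n_terms + 1)]

    def fhat(x):
        x = np.asarray(x, float)
        return 1.0 / w + sum(ak * np.cos(k * np.pi * x / w)
                             for k, ak in enumerate(a, start=1))
    return fhat

rng = np.random.default_rng(0)
w = 50.0                                     # truncation distance [m], assumed
dists = np.abs(rng.normal(0, 15, 200))       # half-normal sighting distances, simulated
dists = dists[dists <= w]
fhat = fourier_density(dists, w)
print(round(float(fhat(0.0)), 4))            # estimate of f(0)
```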

  20. A non-parametric peak finder algorithm and its application in searches for new physics

    CERN Document Server

    Chekanov, S

    2011-01-01

    We have developed an algorithm for non-parametric fitting and extraction of statistically significant peaks in the presence of statistical and systematic uncertainties. Applications of this algorithm for analysis of high-energy collision data are discussed. In particular, we illustrate how to use this algorithm in general searches for new physics in invariant-mass spectra using pp Monte Carlo simulations.

  1. Technical progress report: Completion of spectral rotating shadowband radiometers and analysis of atmospheric radiation measurement spectral shortwave data

    Energy Technology Data Exchange (ETDEWEB)

    Michalsky, J.; Harrison, L. [State Univ. of New York, Albany, NY (United States)

    1996-04-01

    Our goal in the Atmospheric Radiation Measurement (ARM) Program is the improvement of radiation models used in general circulation models (GCMs), especially in the shortwave, (1) by providing improved shortwave radiometric measurements for the testing of models and (2) by developing methods for retrieving climatologically sensitive parameters that serve as input to shortwave and longwave models. At the Atmospheric Sciences Research Center (ASRC) in Albany, New York, we are acquiring downwelling direct and diffuse spectral irradiance, at six wavelengths, plus downwelling broadband longwave, and upwelling and downwelling broadband shortwave irradiances that we combine with National Weather Service surface and upper air data from the Albany airport as a test data set for ARM modelers. We have also developed algorithms to improve shortwave measurements made at the Southern Great Plains (SGP) ARM site by standard thermopile instruments and by the multifilter rotating shadowband radiometer (MFRSR) based on these Albany data sets. Much time has been spent developing techniques to retrieve column aerosol, water vapor, and ozone from the direct beam spectral measurements of the MFRSR. Additionally, we have had success in calculating shortwave surface albedo and aerosol optical depth from the ratio of direct to diffuse spectral reflectance.

  2. Identification of neuronal network properties from the spectral analysis of calcium imaging signals in neuronal cultures.

    Science.gov (United States)

    Tibau, Elisenda; Valencia, Miguel; Soriano, Jordi

    2013-01-01

    Neuronal networks in vitro are prominent systems for studying the development of connections in living neuronal networks and the interplay between connectivity, activity and function. These cultured networks show a rich spontaneous activity that evolves concurrently with the connectivity of the underlying network. In this work we monitor the development of neuronal cultures, and record their activity using calcium fluorescence imaging. We use spectral analysis to characterize global dynamical and structural traits of the neuronal cultures. We first observe that the power spectrum can be used as a signature of the state of the network, for instance when inhibition is active or silent, as well as a measure of the network's connectivity strength. Second, the power spectrum identifies prominent developmental changes in the network such as the GABAA switch. And third, the analysis of the spatial distribution of the spectral density, in experiments with a controlled disintegration of the network through CNQX, an AMPA-glutamate receptor antagonist in excitatory neurons, reveals the existence of communities of strongly connected, highly active neurons that display synchronous oscillations. Our work illustrates the value of spectral analysis for the study of in vitro networks, and its potential use as a network-state indicator, for instance to compare healthy and diseased neuronal networks.

  3. Identification of Neuronal Network Properties from the Spectral Analysis of Calcium Imaging Signals in Neuronal Cultures

    Directory of Open Access Journals (Sweden)

    Elisenda Tibau

    2013-12-01

    Full Text Available Neuronal networks in vitro are prominent systems for studying the development of connections in living neuronal networks and the interplay between connectivity, activity and function. These cultured networks show a rich spontaneous activity that evolves concurrently with the connectivity of the underlying network. In this work we monitor the development of neuronal cultures, and record their activity using calcium fluorescence imaging. We use spectral analysis to characterize global dynamical and structural traits of the neuronal cultures. We first observe that the power spectrum can be used as a signature of the state of the network, for instance when inhibition is active or silent, as well as a measure of the network's connectivity strength. Second, the power spectrum identifies prominent developmental changes in the network such as the GABAA switch. And third, the analysis of the spatial distribution of the spectral density, in experiments with a controlled disintegration of the network through CNQX, an AMPA-glutamate receptor antagonist in excitatory neurons, reveals the existence of communities of strongly connected, highly active neurons that display synchronous oscillations. Our work illustrates the value of spectral analysis for the study of in vitro networks, and its potential use as a network-state indicator, for instance to compare healthy and diseased neuronal networks.

  4. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome, in the binary scheme of NAEP, has three parts: the first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. In order to demonstrate the effectiveness of the method, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method are all used to build component prediction models. Experimental results verify that the proposed method predicts more accurately and robustly and is a practical spectral analysis tool.

  5. Evaluating the validity of spectral calibration models for quantitative analysis following signal preprocessing.

    Science.gov (United States)

    Chen, Da; Grant, Edward

    2012-11-01

    When paired with high-powered chemometric analysis, spectrometric methods offer great promise for the high-throughput analysis of complex systems. Effective classification or quantification often relies on signal preprocessing to reduce spectral interference and optimize the apparent performance of a calibration model. However, less frequently addressed by systematic research is the effect of preprocessing on the statistical accuracy of a calibration result. The present work demonstrates the effectiveness of two criteria for validating the performance of signal preprocessing in multivariate models in the important dimensions of bias and precision. To assess the extent of bias, we explore the applicability of the elliptic joint confidence region (EJCR) test, and we devise a new means to evaluate precision by a bias-corrected root mean square error of prediction. We show how these criteria can effectively gauge the success of signal pretreatments in suppressing spectral interference while providing a straightforward means to determine the optimal level of model complexity. This methodology offers a graphical diagnostic by which to visualize the consequences of pretreatment on complex multivariate models, enabling optimization with greater confidence. To demonstrate the application of the EJCR criterion in this context, we evaluate the validity of representative calibration models using standard pretreatment strategies on three spectral data sets. The results indicate that the proposed methodology facilitates the reliable optimization of a well-validated calibration model, thus improving the capability of spectrophotometric analysis.

  6. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    Science.gov (United States)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2004-01-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly, exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland: it does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three performance metrics derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric, which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature, based on visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery. The quantitative ...

  7. Spectral analysis for the mineralogical characterization of planosols in NE Brazil

    Science.gov (United States)

    Costa, Diego; Souza, Deorgia; Rocha, Washington

    2016-04-01

    This paper conducts a spectral characterization of two soil profiles located in the northeast of Brazil, proposing relations between the pedogenetic evolution and the environmental settings based on the characteristics of the Planosols analyzed and the minerals identified from spectral patterns obtained in the laboratory. The methodological procedures comprised characterization of the study area, a theoretical framework, field work with sampling, sample preparation, laboratory measurement, processing of the spectral data, analysis and interpretation of the results, and a vegetation index calculation to aid the environmental characterization. The results show that: i) both profiles have similar spectral patterns; ii) the A and E horizons show higher reflectance than the B and C horizons; iii) 2:1 and 1:1 minerals, such as montmorillonite and kaolinite, can be identified; and iv) Planosols are fragile with respect to erosion. In both profiles, the less-weathered C horizon and the illuvial B horizon show intense absorption bands at 1400 nm, 1900 nm and 2200 nm, indicating the presence of 2:1 mineralogy in the horizons of the soils analyzed. In both profiles, small absorption peaks were found at 2265 nm, corresponding to gibbsite. This type of mineral is more common in highly weathered soils or on old erosion surfaces, which is reflected in the small absorption intensities observed in this analysis, since these are little-weathered soils of the Brazilian semiarid region. The spectral analysis and the morphology described in the two profiles indicate difficult conditions for vegetation growth, which is consistent with the NDVI values found, ranging from -0.32 to 0.61 with a predominance of 0.19. These factors lead to the intensification of erosion, which is one of the main indicators of environmental degradation, causing loss of important soil elements and, consequently, a reduction in fertility.
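
    The NDVI values quoted above follow the standard normalized-difference formula; a minimal sketch (the band reflectances are hypothetical):

      import numpy as np

      def ndvi(nir, red):
          """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
          nir = np.asarray(nir, dtype=float)
          red = np.asarray(red, dtype=float)
          return (nir - red) / (nir + red)

      # toy reflectances for the near-infrared and red bands
      print(ndvi([0.45, 0.30, 0.20], [0.10, 0.25, 0.38]))   # dense, sparse, and bare/stressed cover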

  8. An Excel-based implementation of the spectral method of action potential alternans analysis.

    Science.gov (United States)

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results.
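
    The spectral method summarized above can also be sketched outside Excel; the following Python version is a simplified, hedged reading of it (the feature series, noise band, and names are assumptions, not the article's VBA code). It takes one beat-to-beat measurement per action potential and reads the alternans power at 0.5 cycles/beat:

      import numpy as np

      def alternans_k_score(beat_values):
          """Spectral alternans analysis of a beat-to-beat series of one AP measurement
          (e.g. APD90); alternans appears as power at 0.5 cycles/beat."""
          x = np.asarray(beat_values, dtype=float)
          x = x - x.mean()
          spectrum = np.abs(np.fft.rfft(x)) ** 2 / len(x)
          freqs = np.fft.rfftfreq(len(x), d=1.0)              # cycles per beat
          alt_power = spectrum[-1]                            # bin at 0.5 cycles/beat (even-length series)
          noise = spectrum[(freqs > 0.33) & (freqs < 0.48)]   # reference noise band
          k_score = (alt_power - noise.mean()) / noise.std(ddof=1)
          return alt_power, k_score

      # toy series of 64 beats with a small superimposed alternans
      rng = np.random.default_rng(0)
      beats = 200 + 2 * (-1) ** np.arange(64) + rng.normal(0, 1, 64)
      print(alternans_k_score(beats))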

  9. Comparative spectral analysis between the functionality of the human eye and of the optical part of a digital camera

    Science.gov (United States)

    Toadere, Florin

    2015-02-01

    Software that comparatively analyzes the spectral functionality of the optical part of the human eye and of the optical image acquisition system of a digital camera is presented. Comparisons are made using demonstrative images that show the spectral color transformations of an image considered as the test object. To perform the simulations, the spectral models are presented and their effects on the colors of the spectral image are computed during the propagation of D48 sunlight through the eye and through the optics of the digital camera. The simulations use a spectral image processing algorithm that converts the spectral image into the XYZ color space, then into the CIECAM02 color appearance model, and finally into the RGB color space.

  10. TOF plotter—a program to perform routine analysis of time-of-flight mass spectral data

    Science.gov (United States)

    Knippel, Brad C.; Padgett, Clifford W.; Marcus, R. Kenneth

    2004-03-01

    The main article discusses the operation and application of the program to mass spectral data files. This laboratory has recently reported the construction and characterization of a linear time-of-flight mass spectrometer (ToF-MS) utilizing a radio frequency glow discharge ionization source. Data acquisition and analysis were performed using a digital oscilloscope and Microsoft Excel, respectively. Presently, no software package is available that is specifically designed for time-of-flight mass spectral analysis and is not instrument dependent. While spreadsheet applications such as Excel offer tremendous utility, they can be cumbersome when repeatedly performing tasks that are too complex or too user intensive for macros to be viable. To address this situation and make data analysis a faster, simpler task, our laboratory has developed a Microsoft Windows-based software program coded in Microsoft Visual Basic. This program enables the user to rapidly perform routine data analysis tasks such as mass calibration, plotting, and smoothing on x-y data sets. In addition to a suite of tools for data analysis, a number of calculators are built into the software to simplify routine calculations pertaining to linear ToF-MS. These include mass resolution, ion kinetic energy, and single peak identification calculators. A detailed description of the software and its associated functions is presented, followed by a characterization of its performance in the analysis of several representative ToF-MS spectra obtained from different GD-ToF-MS systems.
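
    Two of the calculators mentioned above reduce to short formulas for a linear ToF instrument: the time-to-mass calibration follows sqrt(m/z) = a(t - t0), and mass resolution is the mass divided by the peak width. The sketch below is illustrative only (calibrant times and masses are hypothetical), not the TOF plotter code:

      import numpy as np

      def calibrate_tof(flight_times, known_mz):
          """Fit the linear-ToF relation sqrt(m/z) = a*t + b from two or more calibrant peaks."""
          t = np.asarray(flight_times, dtype=float)
          root_mz = np.sqrt(np.asarray(known_mz, dtype=float))
          a, b = np.polyfit(t, root_mz, 1)            # sqrt(m/z) = a*t + b, i.e. t0 = -b/a
          return lambda times: (a * np.asarray(times) + b) ** 2

      def mass_resolution(m, delta_m):
          """Mass resolution R = m / delta_m, with delta_m the FWHM of the peak."""
          return m / delta_m

      mz_of = calibrate_tof([10.2e-6, 14.4e-6], [100.0, 200.0])   # two calibrant ions (s, Da)
      print(mz_of(12.5e-6))                  # m/z of an unknown peak at 12.5 microseconds
      print(mass_resolution(200.0, 0.25))    # R = 800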

  11. An Augmented Classical Least Squares Method for Quantitative Raman Spectral Analysis against Component Information Loss

    Directory of Open Access Journals (Sweden)

    Yan Zhou

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis against component information loss. The Raman spectral signals with low analyte concentration correlations were selected and used as substitutes for the unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment on analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and the existing methods. The results indicate that the proposed method is effective at increasing the robust predictive power of the traditional CLS model against component information loss, and that its predictive power is comparable to that of PLS or PCR.
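
    For readers unfamiliar with the baseline being augmented, a bare classical least squares calibration and prediction step looks as follows; the augmentation described above appends the selected low-correlation signals as extra columns of the concentration matrix before re-estimating the pure-component spectra. All arrays here are simulated placeholders:

      import numpy as np

      rng = np.random.default_rng(1)
      K_true = rng.random((2, 50))                    # pure-component spectra (components x wavelengths)
      C = rng.random((10, 2))                         # known calibration concentrations
      A = C @ K_true + rng.normal(0, 0.01, (10, 50))  # measured calibration spectra

      # CLS calibration: estimate pure-component spectra K from A = C K
      K_hat = np.linalg.lstsq(C, A, rcond=None)[0]

      # prediction for a new spectrum: solve K^T c = a in the least squares sense
      a_new = np.array([0.3, 0.7]) @ K_true
      c_hat = np.linalg.lstsq(K_hat.T, a_new, rcond=None)[0]
      print(c_hat)   # close to [0.3, 0.7]

      # ACLS idea (schematic): append selected interference/residual signals as extra columns of C
      # so that un-modeled components no longer bias the predicted concentrations.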

  12. The Swift X-ray Telescope Cluster Survey III: X-ray spectral analysis

    CERN Document Server

    Tozzi, P; Tundo, E; Liu, T; Rosati, P; Borgani, S; Tagliaferri, G; Campana, S; Fugazza, D; D'Avanzo, P

    2014-01-01

    (Abridged) We present a spectral analysis of a new, flux-limited sample of 72 X-ray selected clusters of galaxies identified with the X-ray Telescope (XRT) on board the Swift satellite down to a flux limit of ~10^-14 erg/s/cm^2 (SWXCS, Tundo et al. 2012). We carry out a detailed X-ray spectral analysis with the twofold aim of measuring redshifts and characterizing the properties of the Intra-Cluster Medium (ICM). Optical counterparts and spectroscopic or photometric redshifts are obtained with a cross-correlation with NED. Additional photometric redshifts are computed with a dedicated follow-up program with the TNG and a cross-correlation with the SDSS. We also detect the iron emission lines in 35% of the sample, and hence obtain a robust measure of the X-ray redshift zX. We use zX whenever the optical redshift is not available. Finally, for all the sources with measured redshift, background-subtracted spectra are fitted with a mekal model. We perform extensive spectral simulations to derive an empirical formul...

  13. Identification of mineral compositions in some renal calculi by FT Raman and IR spectral analysis

    Science.gov (United States)

    Tonannavar, J.; Deshpande, Gouri; Yenagi, Jayashree; Patil, Siddanagouda B.; Patil, Nikhil A.; Mulimani, B. G.

    2016-02-01

    We present in this paper accurate and reliable Raman and IR spectral identification of mineral constituents in nine samples of renal calculi (kidney stones) removed from patients suffering from nephrolithiasis. The identified mineral components include Calcium Oxalate Monohydrate (COM, whewellite), Calcium Oxalate Dihydrate (COD, weddellite), Magnesium Ammonium Phosphate Hexahydrate (MAPH, struvite), Calcium Hydrogen Phosphate Dihydrate (CHPD, brushite), Pentacalcium Hydroxy Triphosphate (PCHT, hydroxyapatite) and Uric Acid (UA). The identification is based on a satisfactory assignment of all the observed IR and Raman bands (3500-400 cm^-1) to chemical functional groups of the mineral components in the samples, aided by spectral analysis of pure materials of COM, MAPH, CHPD and UA. Eight samples are found to contain COM as the common component; the other mineral species occurring as common components are MAPH in five samples, PCHT in three samples, COD in three samples, UA in three samples and CHPD in two samples. One sample is wholly composed of UA as a single component; this inference is supported by the good agreement between ab initio density functional theoretical spectra and experimental spectral measurements of both the sample and the pure material. A combined application of Raman and IR techniques has shown that, where the IR is ambiguous, the Raman analysis can differentiate COD from COM and PCHT from MAPH.
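
    A band-assignment step of the kind described can be approximated by locating peaks in a measured spectrum and matching them against tabulated reference positions within a tolerance. The sketch below uses scipy.signal.find_peaks; the reference wavenumbers, tolerance, and toy spectrum are illustrative only, not an authoritative mineral library:

      import numpy as np
      from scipy.signal import find_peaks

      REFERENCE_BANDS = {                        # approximate Raman band positions, cm^-1
          "whewellite (COM)": [1462, 1490, 895],
          "uric acid (UA)": [1405, 631],
      }

      def assign_bands(wavenumbers, intensities, tol=10.0):
          """Find peaks and report reference bands lying within +/- tol cm^-1 of a peak."""
          idx, _ = find_peaks(intensities, prominence=0.05 * np.max(intensities))
          peaks = np.asarray(wavenumbers)[idx]
          matches = {m: [b for b in bands if np.any(np.abs(peaks - b) <= tol)]
                     for m, bands in REFERENCE_BANDS.items()}
          return peaks, {m: hits for m, hits in matches.items() if hits}

      # toy spectrum with two Gaussian bands near the whewellite positions
      wn = np.arange(400, 1800, 1.0)
      spec = np.exp(-((wn - 1462) / 8) ** 2) + 0.6 * np.exp(-((wn - 895) / 8) ** 2)
      print(assign_bands(wn, spec))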

  14. Spectral analysis of bilateral or alternate-site kindling-induced afterdischarges in the rabbit hippocampi.

    Science.gov (United States)

    Tsuchiya, Komei; Kogure, Shinichi

    2012-09-01

    Kindling is one of the popular animal models of temporal lobe epilepsy. In the present study, following previous results obtained with unilateral hippocampal kindling (UK), we performed spectral analysis of bilateral or alternate-site kindling-induced afterdischarges (ADs) in the rabbit hippocampi. Eight and ten adult rabbits were used for bilateral kindling (BK) and alternate-site kindling (AK), respectively. Kindling stimuli consisted of a train of biphasic pulses (1 ms duration each) at 50 Hz for 1 s, with suprathreshold intensity for AD. The stimulations were applied simultaneously to the bilateral hippocampi in BK and were delivered to the right and left hippocampus once every 24 h in AK. Motor responses were classified into five stages according to the conventional criteria. All animals in BK as well as AK developed stage 5 convulsions, in contrast to the result of UK (kindled: 50%; incomplete: 50%). We normalized the power spectral density (PSD) and monitored the changes in the proportions of the lower frequency band (LFB: 0-9 Hz) and the higher frequency band (HFB: 12-30 Hz). BK animals showed a significantly large decrement (0.5 times) in the LFB proportion with kindling progression. Very strong positive correlations were found in both kindling groups. Chronological spectral analysis of seizure discharges, yielding a pattern of LFB decrement accompanied by HFB increment, is a convenient tool to investigate epileptic disorders and diagnose epileptic states.
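
    The band-proportion measure used above (fractions of the normalized PSD falling in the 0-9 Hz and 12-30 Hz bands) can be sketched with a standard Welch estimate; the sampling rate and the toy signal below are hypothetical:

      import numpy as np
      from scipy.signal import welch

      def band_proportions(x, fs, lfb=(0, 9), hfb=(12, 30)):
          """Fractions of total PSD power in the lower (LFB) and higher (HFB) frequency bands."""
          f, psd = welch(x, fs=fs, nperseg=min(len(x), 1024))
          total = psd.sum()
          lf = psd[(f >= lfb[0]) & (f <= lfb[1])].sum() / total
          hf = psd[(f >= hfb[0]) & (f <= hfb[1])].sum() / total
          return lf, hf

      # toy afterdischarge-like signal: 4 Hz and 20 Hz components plus noise, sampled at 200 Hz
      rng = np.random.default_rng(6)
      fs = 200.0
      t = np.arange(0, 10, 1 / fs)
      x = np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 20 * t) + 0.2 * rng.standard_normal(len(t))
      print(band_proportions(x, fs))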

  15. Uterine EMG spectral analysis and relationship to mechanical activity in pregnant monkeys.

    Science.gov (United States)

    Mansour, S; Devedeux, D; Germain, G; Marque, C; Duchêne, J

    1996-03-01

    The objective is to analyse internal and external recordings of uterine EMG in order to reveal common features and to assess the relationship between electrical activity and intra-uterine pressure modification. Three monkeys participated in the study, one as a reference and the others for data. EMGs are recorded simultaneously, internally by unipolar wire electrodes and externally by bipolar Ag/AgCl electrodes. Intra-uterine pressure is recorded as a mechanical index. Except for delay measurements, parameters are derived from spectral analysis and relationships between recordings are assessed by studying the coherence. Spectral analysis exhibits two basic activities in the analysed frequency band, and frequency limits are defined as relevant parameters for electrical activity description. Parameter values do not depend on the internal electrode location. Internal and external EMGs present a similar spectral shape, despite differences in electrode configuration and tissue filtering. It is deduced that external uterine EMG is a good image of the genuine uterine electrical activity. To some extent, it can be related to an average cellular electrical activity.
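
    The coherence analysis referred to above can be reproduced in outline with scipy.signal.coherence; the simulated internal and external signals below are hypothetical stand-ins for the actual recordings:

      import numpy as np
      from scipy.signal import coherence

      fs = 16.0                        # Hz, illustrative sampling rate for uterine EMG
      t = np.arange(0, 600, 1 / fs)    # ten minutes of signal
      rng = np.random.default_rng(2)
      burst = np.sin(2 * np.pi * 0.7 * t) * (np.sin(2 * np.pi * 0.01 * t) > 0.8)   # bursting activity

      internal = burst + 0.3 * rng.standard_normal(len(t))
      external = 0.8 * burst + 0.5 * rng.standard_normal(len(t))   # attenuated, noisier surface signal

      f, Cxy = coherence(internal, external, fs=fs, nperseg=512)
      print(f[np.argmax(Cxy)], Cxy.max())   # frequency of strongest internal/external coupling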

  16. XMM-Newton and Swift observations of WZ Sge: spectral and timing analysis

    CERN Document Server

    Nucita, A A; De Paolis, F; Mukai, K; Ingrosso, G; Maiolo, B M T

    2014-01-01

    WZ Sagittae is the prototype object of a subclass of dwarf novae, with rare and long (super)outbursts, in which a white dwarf primary accretes matter from a low mass companion. High-energy observations offer the possibility of a better understanding of the disk-accretion mechanism in WZ Sge-like binaries. We used archival XMM-Newton and Swift data to characterize the X-ray spectral and temporal properties of WZ Sge in quiescence. We performed a detailed timing analysis of the simultaneous X-ray and UV light curves obtained with the EPIC and OM instruments on board XMM-Newton in 2003. We employed several techniques in this study, including a correlation study between the two curves. We also performed an X-ray spectral analysis using the EPIC data, as well as Swift/XRT data obtained in 2011. We find that the X-ray intensity is clearly modulated at a period of about 28.96 s, confirming previously published preliminary results. We find that the X-ray spectral shape of WZ Sge remains practically unchanged between ...
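
    A periodicity such as the ~28.96 s modulation reported above is typically recovered with a periodogram of the (possibly unevenly sampled) light curve. The sketch below uses astropy's Lomb-Scargle implementation with a simulated light curve standing in for the EPIC data:

      import numpy as np
      from astropy.timeseries import LombScargle

      rng = np.random.default_rng(3)
      t = np.sort(rng.uniform(0, 5000, 2000))                # seconds, unevenly sampled
      rate = 1.0 + 0.2 * np.sin(2 * np.pi * t / 28.96) + 0.1 * rng.standard_normal(len(t))

      freq, power = LombScargle(t, rate).autopower(minimum_frequency=1 / 100,
                                                   maximum_frequency=1 / 10)
      print(1 / freq[np.argmax(power)])      # best period, close to 28.96 s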

  17. Assessment of Infrared Sounder Radiometric Noise from Analysis of Spectral Residuals

    Science.gov (United States)

    Dufour, E.; Klonecki, A.; Standfuss, C.; Tournier, B.; Serio, C.; Masiello, G.; Tjemkes, S.; Stuhlmann, R.

    2016-08-01

    For the preparation and performance monitoring of the future generation of hyperspectral infrared sounders dedicated to precise vertical profiling of the atmospheric state, such as the Meteosat Third Generation hyperspectral InfraRed Sounder (MTG-IRS), a reliable assessment of the instrument radiometric error covariance matrix is needed. Ideally, an in-flight estimation of the radiometric noise is recommended, as certain sources of noise can be driven by the spectral signature of the observed Earth/atmosphere radiance. Also, unknown correlated noise sources, generally related to incomplete knowledge of the instrument state, can be present, so a characterisation of the spectral correlation of the noise is also needed. A methodology relying on the analysis of post-retrieval spectral residuals is designed and implemented to derive the covariance matrix in flight on the basis of Earth-scene measurements. This methodology is successfully demonstrated using IASI observations as MTG-IRS proxy data and makes it possible to highlight anticipated correlation structures explained by apodization and micro-vibration effects (ghost). This analysis is corroborated by a parallel estimation based on an IASI black-body measurement dataset and the results of an independent micro-vibration model.
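
    The covariance estimate described above amounts to treating each post-retrieval spectral residual as one realization of the radiometric noise vector; a minimal sketch in which the residual matrix is simulated rather than derived from IASI data:

      import numpy as np

      def noise_covariance(residuals):
          """Empirical noise covariance and correlation across channels from a stack of
          post-retrieval spectral residuals (observations x channels)."""
          R = np.asarray(residuals, dtype=float)
          R = R - R.mean(axis=0)                 # remove any residual per-channel bias
          cov = R.T @ R / (R.shape[0] - 1)
          std = np.sqrt(np.diag(cov))
          return cov, cov / np.outer(std, std)   # off-diagonal structure reveals correlated noise

      # toy residuals: 500 spectra x 100 channels with apodization-like neighbor correlation
      rng = np.random.default_rng(4)
      white = rng.standard_normal((500, 101))
      residuals = 0.5 * (white[:, :-1] + white[:, 1:])   # moving average -> correlated noise
      cov, corr = noise_covariance(residuals)
      print(corr[0, 1])                                  # clearly non-zero for adjacent channels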

  18. Analyzing multiple spike trains with nonparametric Granger causality.

    Science.gov (United States)

    Nedungadi, Aatira G; Rangarajan, Govindan; Jain, Neeraj; Ding, Mingzhou

    2009-08-01

    Simultaneous recordings of spike trains from multiple single neurons are becoming commonplace. Understanding the interaction patterns among these spike trains remains a key research area. A question of interest is the evaluation of information flow between neurons through the analysis of whether one spike train exerts causal influence on another. For continuous-valued time series data, Granger causality has proven an effective method for this purpose. However, the basis for Granger causality estimation is autoregressive data modeling, which is not directly applicable to spike trains. Various filtering options distort the properties of spike trains as point processes. Here we propose a new nonparametric approach to estimate Granger causality directly from the Fourier transforms of spike train data. We validate the method on synthetic spike trains generated by model networks of neurons with known connectivity patterns and then apply it to neurons simultaneously recorded from the thalamus and the primary somatosensory cortex of a squirrel monkey undergoing tactile stimulation.

  19. Nonparametric Estimation of Distributions in Random Effects Models

    KAUST Repository

    Hart, Jeffrey D.

    2011-01-01

    We propose using minimum distance to obtain nonparametric estimates of the distributions of components in random effects models. A main setting considered is equivalent to having a large number of small datasets whose locations, and perhaps scales, vary randomly, but which otherwise have a common distribution. Interest focuses on estimating the distribution that is common to all datasets, knowledge of which is crucial in multiple testing problems where a location/scale invariant test is applied to every small dataset. A detailed algorithm for computing minimum distance estimates is proposed, and the usefulness of our methodology is illustrated by a simulation study and an analysis of microarray data. Supplemental materials for the article, including R-code and a dataset, are available online. © 2011 American Statistical Association.

  20. Nonparametric estimation of stochastic differential equations with sparse Gaussian processes

    Science.gov (United States)

    García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.

    2017-08-01

    The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working in a function-space view, so the inference takes place directly in this space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudo-samples. The proposed method has been validated using both simulated data and real data from economics and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.
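
    As a simple point of comparison for the Gaussian-process approach above, the drift and diffusion of a densely observed SDE can also be estimated nonparametrically from binned conditional moments (a Kramers-Moyal-style estimate, not the sparse-GP method of the paper); everything below is an illustrative sketch:

      import numpy as np

      def binned_drift_diffusion(x, dt, n_bins=30):
          """Estimate drift f(x) ~ E[dX | X=x]/dt and squared diffusion g(x)^2 ~ E[dX^2 | X=x]/dt
          by binning the state variable."""
          dx, xs = np.diff(x), x[:-1]
          edges = np.linspace(xs.min(), xs.max(), n_bins + 1)
          centers = 0.5 * (edges[:-1] + edges[1:])
          drift, diff2 = np.full(n_bins, np.nan), np.full(n_bins, np.nan)
          idx = np.digitize(xs, edges[1:-1])
          for b in range(n_bins):
              m = idx == b
              if m.sum() > 10:
                  drift[b] = dx[m].mean() / dt
                  diff2[b] = (dx[m] ** 2).mean() / dt
          return centers, drift, diff2

      # simulate an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW and recover its terms
      rng = np.random.default_rng(5)
      dt, theta, sigma = 1e-3, 1.0, 0.5
      x = np.zeros(100_000)
      for i in range(1, len(x)):
          x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
      centers, drift, diff2 = binned_drift_diffusion(x, dt)
      # drift should follow -theta*centers and diff2 should be roughly sigma**2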