WorldWideScience

Sample records for wavelet-based multivariable approach

  1. Finding the multipath propagation of multivariable crude oil prices using a wavelet-based network approach

    Science.gov (United States)

    Jia, Xiaoliang; An, Haizhong; Sun, Xiaoqi; Huang, Xuan; Gao, Xiangyun

    2016-04-01

    The globalization and regionalization of crude oil trade inevitably give rise to differences in crude oil prices, and understanding how these prices propagate to one another is essential for analyzing the development of global oil trade. Previous research has focused mainly on the fuzzy long- or short-term one-to-one propagation between bivariate oil prices, generally ignoring the various patterns of periodical multivariate propagation. This study presents a wavelet-based network approach to help uncover the multipath propagation of multivariable crude oil prices in a joint time-frequency period. The weekly oil spot prices of the OPEC member states from June 1999 to March 2011 are adopted as the sample data. First, we used wavelet analysis to derive subseries at an optimal decomposing scale to describe the periodical features of the original oil price time series. Second, a complex network model was constructed, based on an optimal threshold selection, to describe the structural features of the multivariable oil prices. Third, Bayesian network analysis (BNA) was conducted to find probabilistic causal relationships based on the periodical structural features, describing the various patterns of periodical multivariable propagation. Finally, the significance of the leading and intermediary oil prices is discussed. These findings are beneficial for the implementation of periodical target-oriented pricing policies and investment strategies.
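
    The first two steps above (multiscale decomposition, then a threshold-based correlation network) can be sketched as follows. This is a minimal illustration on synthetic price series with a hand-rolled Haar transform; the function names, the two-level decomposition and the 0.6 threshold are illustrative choices, not taken from the paper.

```python
import numpy as np

def haar_dwt(x):
    """One level of an orthonormal Haar transform: approximation + detail."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def decompose(x, levels):
    """Return [detail_1, ..., detail_L, approx_L] subseries."""
    details, approx = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        details.append(d)
    return details + [approx]

def correlation_network(series_dict, threshold=0.6):
    """Edges between series whose absolute correlation exceeds the threshold."""
    names = list(series_dict)
    edges = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = np.corrcoef(series_dict[a], series_dict[b])[0, 1]
            if abs(r) >= threshold:
                edges.append((a, b, round(float(r), 3)))
    return edges

rng = np.random.default_rng(0)
base = np.cumsum(rng.normal(size=256))            # shared trend
prices = {"A": base + rng.normal(size=256),
          "B": base + rng.normal(size=256),
          "C": np.cumsum(rng.normal(size=256))}   # independent walk
# build the network on the level-2 approximation (slow-moving) component
smooth = {k: decompose(v, 2)[-1] for k, v in prices.items()}
print(correlation_network(smooth))
```

    The real study builds such networks per frequency band and then runs Bayesian network analysis on top; this sketch stops at the network construction.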

  2. Wavelet based approach for facial expression recognition

    Directory of Open Access Journals (Sweden)

    Zaenal Abidin

    2015-03-01

    Facial expression recognition is one of the most active fields of research. Many facial expression recognition methods have been developed and implemented. Neural networks (NNs) are capable of undertaking such pattern recognition tasks, owing to their ability to learn and generalize, perform non-linear mappings, and compute in parallel. Backpropagation neural networks (BPNNs) are the most commonly used variant. In this study, BPNNs were used as classifiers to categorize facial expression images into seven classes of expression: anger, disgust, fear, happiness, sadness, neutral and surprise. For feature extraction, three discrete wavelet transforms were used to decompose the images: the Haar, Daubechies (4) and Coiflet (1) wavelets. To evaluate the proposed method, a facial expression recognition system was built and tested on static images from the JAFFE database.

  3. A Wavelet-Based Approach to Fall Detection

    Directory of Open Access Journals (Sweden)

    Luca Palmerini

    2015-05-01

    Falls among older people are a widely documented public health problem. Automatic fall detection has recently gained huge importance because it could allow for the immediate communication of falls to medical assistance. The aim of this work is to present a novel wavelet-based approach to fall detection, focusing on the impact phase and using a dataset of real-world falls. Since recorded falls result in a non-stationary signal, a wavelet transform was chosen to examine fall patterns. The idea is to consider the average fall pattern as the “prototype fall”. In order to detect falls, every acceleration signal can be compared to this prototype through wavelet analysis. The similarity of the recorded signal to the prototype fall is a feature that can be used to discriminate between falls and daily activities. The discriminative ability of this feature is evaluated on real-world data. It outperforms other features that are commonly used in fall detection studies, with an Area Under the Curve of 0.918. This result suggests that the proposed wavelet-based feature is promising, and future studies could use this feature (in combination with others considering different fall phases) in order to improve the performance of fall detection algorithms.
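
    The core idea of scoring a signal by its similarity to a prototype can be sketched as follows. For simplicity, this toy version uses plain normalized cross-correlation on synthetic signals rather than the paper's wavelet-domain comparison; all names and signal shapes are illustrative.

```python
import numpy as np

def similarity_to_prototype(signal, prototype):
    """Max normalized cross-correlation of a sliding window with the prototype."""
    n = len(prototype)
    proto = (prototype - prototype.mean()) / (prototype.std() + 1e-12)
    best = -1.0
    for i in range(len(signal) - n + 1):
        w = signal[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        best = max(best, float(np.dot(w, proto) / n))
    return best

t = np.linspace(0, 1, 64)
prototype = np.exp(-60 * (t - 0.5) ** 2)   # stand-in for the impact-phase shape
fall = np.concatenate([np.zeros(40), prototype, np.zeros(40)])
walk = 0.05 * np.sin(2 * np.pi * 8 * np.linspace(0, 2, 144))
print(similarity_to_prototype(fall, prototype), similarity_to_prototype(walk, prototype))
```

    A classifier would then threshold this similarity feature (possibly combined with others) to separate falls from daily activities.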

  4. Wavelet-Based Diffusion Approach for DTI Image Restoration

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiang-fen; CHEN Wu-fan; TIAN Wei-feng; YE Hong

    2008-01-01

    The Rician noise introduced into diffusion tensor images (DTIs) can seriously affect tensor calculation and fiber tracking. To decrease the effects of this noise, we propose a wavelet-based diffusion method to denoise multichannel diffusion-weighted (DW) images. The presented smoothing strategy, which applies anisotropic nonlinear diffusion in the wavelet domain, successfully removes noise while preserving both texture and edges. To quantitatively evaluate how well the presented method accounts for the Rician noise introduced into the DW images, the peak signal-to-noise ratio (PSNR) and signal-to-mean-squared-error ratio (SMSE) metrics are adopted. Based on synthetic and real data, we calculated the apparent diffusion coefficient (ADC) and tracked the fibers. We compared the presented model with wavelet shrinkage and with the regularized nonlinear diffusion smoothing method. All experimental results demonstrate, quantitatively and visually, the better performance of the presented filter.

  5. Dependence and risk assessment for oil prices and exchange rate portfolios: A wavelet based approach

    Science.gov (United States)

    Aloui, Chaker; Jammazi, Rania

    2015-10-01

    In this article, we propose a wavelet-based approach to accommodate the stylized facts and complex structure of financial data, caused by frequent and abrupt changes in markets and by noise. Specifically, we show how combining both continuous and discrete wavelet transforms with traditional financial models helps improve portfolio market risk assessment. In the empirical stage, three wavelet-based models (wavelet-EGARCH with dynamic conditional correlations, wavelet-copula, and wavelet-extreme value) are considered and applied to crude oil price and US dollar exchange rate data. Our findings show that the wavelet-based approach provides an effective and powerful tool for detecting extreme moments and improving the accuracy of VaR and Expected Shortfall estimates of oil-exchange rate portfolios after noise is removed from the original data.

  6. Examination of the wavelet-based approach for measuring self-similarity of epileptic electroencephalogram data

    Institute of Scientific and Technical Information of China (English)

    Suparerk JANJARASJITT

    2014-01-01

    Self-similarity or scale-invariance is a fascinating characteristic found in various signals including electroencephalogram (EEG) signals. A common measure used for characterizing self-similarity or scale-invariance is the spectral exponent. In this study, a computational method for estimating the spectral exponent based on wavelet transform was examined. A series of Daubechies wavelet bases with various numbers of vanishing moments were applied to analyze the self-similar characteristics of intracranial EEG data corresponding to different pathological states of the brain, i.e., ictal and interictal states, in patients with epilepsy. The computational results show that the spectral exponents of intracranial EEG signals obtained during epileptic seizure activity tend to be higher than those obtained during non-seizure periods. This suggests that the intracranial EEG signals obtained during epileptic seizure activity tend to be more self-similar than those obtained during non-seizure periods. The computational results obtained using the wavelet-based approach were validated by comparison with results obtained using the power spectrum method.
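
    A minimal numerical sketch of such a wavelet-based spectral-exponent estimator: regress the log-variance of the detail coefficients against the decomposition level. The Haar wavelet, synthetic signals and numeric choices below are illustrative, not those of the study (which used Daubechies wavelets on intracranial EEG).

```python
import numpy as np

def haar_details(x, levels):
    """Detail coefficients of an orthonormal Haar decomposition, per level."""
    approx, out = np.asarray(x, dtype=float), []
    for _ in range(levels):
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        approx = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        out.append(d)
    return out

def spectral_exponent(x, levels=6):
    """Slope of log2(detail variance) vs. level: a wavelet estimate of gamma
    for a 1/f^gamma process (0 for white noise, ~2 for Brownian motion)."""
    logvar = [np.log2(np.var(d)) for d in haar_details(x, levels)]
    j = np.arange(1, levels + 1)
    return float(np.polyfit(j, logvar, 1)[0])

rng = np.random.default_rng(1)
white = rng.normal(size=4096)    # gamma ~ 0
brown = np.cumsum(white)         # gamma ~ 2
print(spectral_exponent(white), spectral_exponent(brown))
```

    A higher exponent means variance concentrates at coarse scales, i.e. a smoother, more self-similar signal, which matches the reported seizure vs. non-seizure contrast.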

  7. A wavelet-based structural damage assessment approach with progressively downloaded sensor data

    Science.gov (United States)

    Li, Jian; Zhang, Yunfeng; Zhu, Songye

    2008-02-01

    This paper presents a wavelet-based on-line damage assessment approach based on the use of progressively transmitted multi-resolution sensor data. In extreme events like strong earthquakes, real-time retrieval of structural monitoring data and on-line damage assessment of civil infrastructures are crucial for emergency relief and disaster assistance efforts such as resource allocation and evacuation route arrangement. Due to the limited communication bandwidth available to data transmission during and immediately after major earthquakes, innovative methods for integrated sensor data transmission and on-line damage assessment are highly desired. The proposed approach utilizes a lifting scheme wavelet transform to generate multi-resolution sensor data, which are transmitted progressively in increasing resolution. Multi-resolution sensor data enable interactive on-line condition assessment of structural damages. To validate this concept, a hysteresis-based damage assessment method, proposed by Iwan for extreme-event use, is selected in this study. A sensitivity study on the hysteresis-based damage assessment method under varying data resolution levels was conducted using simulation data from a six-story steel braced frame building subjected to earthquake ground motion. The results of this study show that the proposed approach is capable of reducing the raw sensor data size by a significant amount while having a minor effect on the accuracy of hysteresis-based damage assessment. The proposed approach provides a valuable decision support tool for engineers and emergency response personnel who want to access the data in real time and perform on-line damage assessment in an efficient manner.
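
    The lifting-scheme Haar transform that generates such multi-resolution data can be sketched in a few lines. This is the generic textbook lifting step on a toy array, not the authors' implementation; reconstructing with the detail coefficients zeroed mimics receiving only the coarse resolution before the full-detail packets arrive.

```python
import numpy as np

def lift_haar(x):
    """One lifting step of the Haar transform: predict, then update."""
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    d = odd - even        # predict: detail
    a = even + d / 2      # update: pairwise average
    return a, d

def unlift_haar(a, d):
    """Exact inverse of lift_haar."""
    even = a - d / 2
    odd = even + d
    out = np.empty(2 * len(a))
    out[0::2], out[1::2] = even, odd
    return out

x = np.array([3.0, 5.0, 4.0, 8.0, 2.0, 2.0, 7.0, 1.0])
a1, d1 = lift_haar(x)
a2, d2 = lift_haar(a1)
# progressive transmission: send a2 first (coarse), then d2, then d1 (full detail)
coarse_only = unlift_haar(unlift_haar(a2, np.zeros_like(d2)), np.zeros_like(d1))
exact = unlift_haar(unlift_haar(a2, d2), d1)
print(coarse_only)
print(exact)
```

    Each additional detail band halves the approximation error until the sensor record is recovered exactly, which is what makes interactive on-line assessment on partial data possible.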

  8. A novel wavelet based approach for near lossless image compression using modified duplicate free run length coding

    Directory of Open Access Journals (Sweden)

    Pacha Sreenivasulu

    2014-12-01

    In this paper we present a three-stage near-lossless image compression scheme. It belongs to the class of lossless coding and consists of wavelet-based decomposition followed by modified duplicate-free run-length coding. An optimum bit rate is selected to guarantee minimum MSE (mean square error) and high PSNR (peak signal-to-noise ratio), while keeping the computation time much lower than that of other compression schemes. The proposed wavelet-based near-lossless approach is therefore well suited to real-time applications; in comparisons with EZW, SPIHT and SOFM, the proposed method performs best.

  9. Wavelet-based Evapotranspiration Forecasts

    Science.gov (United States)

    Bachour, R.; Maslova, I.; Ticlavilca, A. M.; McKee, M.; Walker, W.

    2012-12-01

    Providing a reliable short-term forecast of evapotranspiration (ET) could be a valuable element for improving the efficiency of irrigation water delivery systems. In the last decade, the wavelet transform has become a useful technique for analyzing the frequency domain of hydrological time series. This study shows how the wavelet transform can be used to assess statistical properties of evapotranspiration. The objective of the research reported here is to use wavelet-based techniques to forecast ET up to 16 days ahead, which corresponds to the LANDSAT 7 overpass cycle. The properties of the ET time series, both physical and statistical, are examined in the time and frequency domains. We use the information about the energy decomposition in the wavelet domain to extract meaningful components that are used as inputs for ET forecasting models. Seasonal autoregressive integrated moving average (SARIMA) and multivariate relevance vector machine (MVRVM) models are coupled with the wavelet-based multiresolution analysis (MRA) results and used to generate short-term ET forecasts. Accuracy of the models is estimated and model robustness is evaluated using the bootstrap approach.

  10. A new approach to pre-processing digital image for wavelet-based watermark

    Science.gov (United States)

    Agreste, Santa; Andaloro, Guido

    2008-11-01

    The growth of the Internet has increased the phenomenon of digital piracy of multimedia objects such as software, images, video, audio and text. It is therefore strategically important to develop stable, computationally cheap methods and numerical algorithms to address these problems. We describe a digital watermarking algorithm for color image protection and authentication that is robust, non-blind, and wavelet-based. The use of the Discrete Wavelet Transform is motivated by its good time-frequency features and its good match with Human Visual System directives. These two combined elements are important for building an invisible and robust watermark. Moreover, our algorithm can work with any image, thanks to a pre-processing step that resizes the original image to fit the wavelet transform. The watermark signal is calculated from the image features and statistical properties. In the detection step we apply a re-synchronization between the original and watermarked images according to the Neyman-Pearson statistical criterion. Experiments on a large set of different images show the method to be resistant against geometric, filtering, and StirMark attacks, with a low false-alarm rate.

  11. Multivariate Bioclimatic Ecosystem Change Approaches

    Science.gov (United States)

    2015-02-06

    …conclude that an analogous patch did not exist. It must exist somewhere, but some of the other MVA techniques were restricted by the mathematical… found that the Principally Analogous Multivariate (PAM) approach developed during this research clearly distinguished itself from the other five approaches in…

  12. A wavelet-based time frequency analysis approach for classification of motor imagery for brain computer interface applications

    Science.gov (United States)

    Qin, Lei; He, Bin

    2005-12-01

    Electroencephalogram (EEG) recordings during motor imagery tasks are often used as input signals for brain-computer interfaces (BCIs). The translation of these EEG signals to control signals of a device is based on a good classification of various kinds of imagination. We have developed a wavelet-based time-frequency analysis approach for classifying motor imagery tasks. Time-frequency distributions (TFDs) were constructed based on wavelet decomposition and event-related (de)synchronization patterns were extracted from symmetric electrode pairs. The weighted energy difference of the electrode pairs was then compared to classify the imaginary movement. The present method has been tested in nine human subjects and reached an averaged classification rate of 78%. The simplicity of the present technique suggests that it may provide an alternative method for EEG-based BCI applications.
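
    A toy version of the band-energy comparison might look like this, assuming (purely for illustration) two synthetic channels, a Haar detail level standing in for the mu band, and a simple sign rule on the energy difference; none of these choices are taken from the paper.

```python
import numpy as np

def band_energy(x, level=3):
    """Energy of the Haar detail at a given level (a crude frequency band).
    At 128 Hz sampling, level 3 covers roughly 8-16 Hz (the mu band)."""
    a = np.asarray(x, dtype=float)
    for _ in range(level):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
    return float(np.sum(d ** 2))

def classify(c3, c4):
    """Sign of the band-energy difference between symmetric channels."""
    return "left" if band_energy(c3) - band_energy(c4) > 0 else "right"

rng = np.random.default_rng(2)
n = 512
mu = np.sin(2 * np.pi * 11 * np.arange(n) / 128)   # ~11 Hz rhythm stand-in
c3 = 2.0 * mu + 0.3 * rng.normal(size=n)           # stronger rhythm on C3
c4 = 0.5 * mu + 0.3 * rng.normal(size=n)
print(classify(c3, c4))
```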

  13. Wavelet-Based Mixed-Resolution Coding Approach Incorporating with SPT for the Stereo Image

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    With the advances of display technology, three-dimensional (3-D) imaging systems are becoming increasingly popular. One way of simulating 3-D perception is to use stereo pairs: a pair of images of the same scene acquired from different perspectives. Since there is an inherent redundancy between the images of a stereo pair, data compression algorithms should be employed to represent stereo pairs efficiently. Existing techniques generally use block-based disparity compensation. In order to obtain a higher compression ratio, this paper combines wavelet-based mixed-resolution coding with SPT-based disparity compensation to compress the stereo image data. Mixed-resolution coding is a perceptually justified technique in which one eye is presented with a low-resolution image and the other with a high-resolution image. Psychophysical experiments show that stereo image pairs with one high-resolution and one low-resolution image provide almost the same stereo depth as a pair of two high-resolution images. By combining the mixed-resolution coding and SPT-based disparity-compensation techniques, the reference (left) high-resolution image can be compressed by a hierarchical wavelet transform followed by vector quantization and a Huffman encoder. After two-level wavelet decomposition of the low-resolution right and left images, subspace projection with fixed-block-size disparity-compensation estimation is used. At the decoder, the low-resolution right subimage is estimated using the disparity from the low-resolution left subimage. A full-size reconstruction is obtained by upsampling by a factor of 4 and reconstructing with the synthesis low-pass filter. Finally, experimental results are presented, which show that our scheme achieves a PSNR gain of about 0.92 dB compared to current block-based disparity-compensation coding techniques.

  14. Joint discrepancy evaluation of an existing steel bridge using time-frequency and wavelet-based approach

    Science.gov (United States)

    Walia, Suresh Kumar; Patel, Raj Kumar; Vinayak, Hemant Kumar; Parti, Raman

    2013-12-01

    The objective of this study is to bring out errors introduced during construction that are overlooked during the physical verification of a bridge. Such errors can be pointed out if the symmetry of the structure is challenged. This paper thus presents a study of the downstream and upstream trusses of a newly constructed steel bridge using time-frequency and wavelet-based approaches. The variation in the behavior of the bridge's truss joints with vehicle speed has been worked out to determine their flexibility. The testing was carried out with the same instrument setup on both the upstream and downstream trusses at two different speeds with the same moving vehicle. The nodal flexibility investigation is carried out using power spectral density, the short-time Fourier transform, and the wavelet packet transform with respect to both trusses and both speeds. The results show that the joints of the upstream and downstream trusses behave differently, even though they were designed for the same loading, due to constructional variations and vehicle movement, in spite of the fact that analytical models present a simplistic picture for analysis and design. The difficulty of modal parameter extraction for this bridge increased with speed due to the decreased excitation time.

  15. Multi scale risk measurement in electricity market:a wavelet based value at risk approach

    Institute of Scientific and Technical Information of China (English)

    Guu, Sy-Ming; Lai, Kin Keung

    2008-01-01

    Value at risk (VaR) is adopted to measure the risk level in the electricity market. To estimate VaR with higher accuracy and reliability, a wavelet variance decomposed approach to value at risk estimation (WVDVaR) is proposed. Empirical studies were conducted in five Australian electricity markets, evaluating the performance of both the proposed approach and the traditional ARMA-GARCH approach using the Kupiec backtesting procedure. Experimental results suggest that the proposed approach measures electricity ...
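
    The VaR-estimation and Kupiec-backtesting pieces can be sketched as follows: historical-simulation VaR on synthetic returns plus the standard Kupiec unconditional-coverage statistic. The wavelet variance decomposition step of the actual WVDVaR approach is omitted, and all data here are synthetic.

```python
import numpy as np

def historical_var(returns, alpha=0.99):
    """Historical-simulation VaR: the alpha-quantile loss."""
    return float(-np.quantile(returns, 1 - alpha))

def kupiec_lr(n_obs, n_violations, alpha=0.99):
    """Kupiec unconditional-coverage likelihood ratio (chi-square, 1 dof).
    Large values mean the observed violation rate contradicts 1 - alpha."""
    p, x, n = 1 - alpha, n_violations, n_obs
    if x == 0:
        return float(-2 * n * np.log(1 - p))
    phat = x / n
    return float(-2 * (x * np.log(p) + (n - x) * np.log(1 - p)
                       - x * np.log(phat) - (n - x) * np.log(1 - phat)))

rng = np.random.default_rng(3)
rets = rng.normal(0, 0.02, size=1000)        # stand-in electricity returns
var99 = historical_var(rets[:500])           # estimate on the first half
viol = int(np.sum(rets[500:] < -var99))      # out-of-sample violations
print(var99, viol, kupiec_lr(500, viol))
```

    A well-calibrated 99% VaR should be breached about 1% of the time; the Kupiec statistic exceeds the chi-square critical value 3.84 when the breach rate deviates too much from that.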

  16. A wavelet-based approach to assessing timing errors in hydrologic predictions

    Science.gov (United States)

    Liu, Yuqiong; Brown, James; Demargne, Julie; Seo, Dong-Jun

    2011-02-01

    Streamflow predictions typically contain errors in both the timing and the magnitude of peak flows. These two types of error often originate from different sources (e.g. rainfall-runoff modeling vs. routing) and hence may have different implications and ramifications for both model diagnosis and decision support. Thus, where possible and relevant, they should be distinguished and separated in model evaluation and forecast verification applications. Distinct information on timing errors in hydrologic prediction could lead to more targeted model improvements in a diagnostic evaluation context, as well as better-informed decisions in many practical applications, such as flood prediction, water supply forecasting, river regulation, navigation, and engineering design. However, information on timing errors in hydrologic predictions is rarely evaluated or provided. In this paper, we discuss the importance of assessing and quantifying timing errors in hydrologic predictions and present a new approach, which is based on the cross wavelet transform (XWT) technique. The XWT technique transforms the time series of predictions and corresponding observations into a two-dimensional time-scale space and provides information on scale- and time-dependent timing differences between the two time series. The results for synthetic timing errors (both constant and time-varying) indicate that the XWT-based approach can estimate timing errors in streamflow predictions with reasonable reliability. The approach is then employed to analyze the timing errors in real streamflow simulations for a number of headwater basins in the US state of Texas. The resulting timing error estimates were consistent with the physiographic and climatic characteristics of these basins.
A simple post-factum timing adjustment based on these estimates led to considerably improved agreement between streamflow observations and simulations, further illustrating the potential for using the XWT-based approach for
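
    A toy illustration of quantifying a timing error between observed and simulated series, using plain cross-correlation as a simplified single-scale stand-in for the paper's cross-wavelet transform (the signals and the six-step shift below are synthetic):

```python
import numpy as np

def timing_error(obs, sim):
    """Lag (in time steps) that best aligns the simulation with the
    observation, from the peak of the full cross-correlation. Positive
    means the simulated peak arrives early."""
    o = obs - obs.mean()
    s = sim - sim.mean()
    xc = np.correlate(o, s, mode="full")
    return int(np.argmax(xc) - (len(s) - 1))

t = np.arange(200)
obs = np.exp(-0.5 * ((t - 100) / 8.0) ** 2)   # observed flow peak at t = 100
sim = np.exp(-0.5 * ((t - 94) / 8.0) ** 2)    # simulated peak 6 steps early
print(timing_error(obs, sim))
```

    The XWT approach refines this idea by estimating such lags per scale and per time window, so time-varying timing errors can also be resolved.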

  17. A wavelet-based approach to the discovery of themes and sections in monophonic melodies

    DEFF Research Database (Denmark)

    Velarde, Gissel; Meredith, David

    We present the computational method submitted to the MIREX 2014 Discovery of Repeated Themes & Sections task, and the results on the monophonic version of the JKU Patterns Development Database. In the context of pattern discovery in monophonic music, the idea behind our method is that, with a good...... melodic structure in terms of segments, it should be possible to gather similar segments into clusters and rank their salience within the piece. We present an approach to this problem and how we address it. In general terms, we represent melodies either as raw 1D pitch signals or as these signals filtered...

  18. A Novel Wavelet-Based Approach for Predicting Nucleosome Positions Using DNA Structural Information.

    Science.gov (United States)

    Gan, Yanglan; Zou, Guobing; Guan, Jihong; Xu, Guangwei

    2014-01-01

    Nucleosomes are basic elements of chromatin structure. The positioning of nucleosomes along a genome is very important in dictating eukaryotic DNA compaction and access. Current computational methods have focused on the analysis of nucleosome occupancy and the positioning of well-positioned nucleosomes. However, fuzzy nucleosomes have more complex configurations, and their positions are more difficult to predict. We analyzed the positioning of well-positioned and fuzzy nucleosomes from a novel structural perspective, and propose WaveNuc, a computational approach for inferring their positions based on the continuous wavelet transform. Our comparative analysis demonstrates that these two kinds of nucleosomes exhibit different propeller-twist structural characteristics. Well-positioned nucleosomes tend to locate at sharp peaks of the propeller-twist profile, whereas fuzzy nucleosomes correspond to broader peaks. The sharpness of these peaks shows that the propeller-twist profile may contain nucleosome positioning information. Exploiting this knowledge, we applied WaveNuc to detect the two different kinds of peaks of the propeller-twist profile along the genome. We compared the performance of our method with that of existing methods on real data sets. The results show that the proposed method can accurately resolve complex configurations of fuzzy nucleosomes, which leads to better nucleosome positioning prediction on the whole genome.

  19. Wavelet Based Image Denoising Technique

    Directory of Open Access Journals (Sweden)

    Sachin D Ruikar

    2011-03-01

    This paper surveys different approaches to wavelet-based image denoising. The search for efficient image denoising methods remains a valid challenge at the crossing of functional analysis and statistics; in spite of the sophistication of recently proposed methods, most algorithms have not yet attained a desirable level of applicability. Wavelet algorithms are a useful tool for signal processing tasks such as image compression and denoising, and multiwavelets can be considered an extension of scalar wavelets. The main idea is to modify the wavelet coefficients in the new basis so that the noise can be removed from the data. In this paper, we extend an existing technique and provide a comprehensive evaluation of the proposed method. Results are reported for different noise types: Gaussian, Poisson, salt-and-pepper, and speckle. Signal-to-noise ratio is used as the measure of denoising quality.
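
    The coefficient-modification idea can be sketched with a standard baseline: soft-thresholding the Haar detail coefficients with the universal threshold sigma*sqrt(2 ln n). This is the classic wavelet-shrinkage recipe on a synthetic 1-D signal, not the specific method evaluated in the paper.

```python
import numpy as np

def haar_dwt(x):
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    out = np.empty(2 * len(a))
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def soft(w, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def denoise(x, levels=3):
    """Universal-threshold wavelet shrinkage; sigma is estimated from the
    median absolute deviation of the finest detail coefficients."""
    a, details = np.asarray(x, dtype=float), []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    sigma = np.median(np.abs(details[0])) / 0.6745
    t = sigma * np.sqrt(2 * np.log(len(x)))
    for d in reversed(details):
        a = haar_idwt(a, soft(d, t))
    return a

rng = np.random.default_rng(4)
n = 1024
clean = np.sin(2 * np.pi * np.arange(n) / 256)
noisy = clean + 0.3 * rng.normal(size=n)
den = denoise(noisy)
print(np.mean((noisy - clean) ** 2), np.mean((den - clean) ** 2))
```

    For images the same shrinkage is applied to the 2-D subbands; the surveyed methods differ mainly in how the threshold is chosen and adapted per subband.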

  20. Measuring equilibrium models: a multivariate approach

    Directory of Open Access Journals (Sweden)

    Nadji RAHMANIA

    2011-04-01

    This paper presents a multivariate methodology for obtaining measures of unobserved macroeconomic variables. The procedure used is the multivariate Hodrick-Prescott filter, which depends on smoothing parameters. The choice of these parameters is crucial. Our approach is based on consistent estimators of these parameters, depending only on the observed data.

  1. Drought prediction using a wavelet based approach to model the temporal consequences of different types of droughts

    Science.gov (United States)

    Maity, Rajib; Suman, Mayank; Verma, Nitesh Kumar

    2016-08-01

    Droughts are expected to propagate from one type to another: meteorological to agricultural to hydrological to socio-economic. However, they do not possess a universal, straightforward temporal dependence. Rather, assessing one type of drought (the successor) from another (the predecessor) is a complex problem that depends on the basin's physiographic and climatic characteristics, such as spatial extent, topography, land use, land cover and climate regime. In this paper, a wavelet-decomposition-based approach is proposed to model the temporal dependence between different types of droughts. The idea behind it is to separate the rapidly and slowly moving components of drought indices. It is shown that the temporal dependence of the successor (say, hydrological drought) on the predecessor (say, meteorological drought) can be better captured at the level of its constituent components. Such components are obtained through wavelet decomposition, retaining their temporal correspondence. Thus, in the proposed approach, the predictand drought index is predicted using the decomposed components of the predecessor drought. Several alternative models are investigated to arrive at the best possible model structure for predicting different types of drought. The proposed approach is found to be very useful for foreseeing agricultural or hydrological droughts from the meteorological drought status, offering scope for better management of drought consequences. The mathematical framework of the proposed approach is general in nature and can be applied to different basins. Its limitation is the requirement of region- or catchment-specific calibration of some parameters before using the proposed model, though such calibration is neither very difficult nor uncommon.

  2. Approaches to Assessment in Multivariate Analysis.

    Science.gov (United States)

    O'Connell, Ann A.

    This paper reviews trends in assessment in quantitative courses and illustrates several options and approaches to assessment for advanced courses at the graduate level, especially in multivariate analysis. The paper provides a summary of how a researcher has used alternatives to traditional methods of assessment in a course on multivariate…

  3. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content; the problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks such as DDoS attacks and network scanning.
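
    The correlation-coefficient-matrix idea can be sketched as follows: compare the correlation matrix of each traffic window against a baseline window and flag large deviations. The features, windows and attack model below are synthetic stand-ins, not from the paper.

```python
import numpy as np

def corr_distance(window_a, window_b):
    """Frobenius distance between the correlation matrices of two traffic
    windows (rows = features, columns = time samples)."""
    return float(np.linalg.norm(np.corrcoef(window_a) - np.corrcoef(window_b)))

rng = np.random.default_rng(5)
# rows: packets/s, bytes/s, connections/s (hypothetical features)
base = rng.normal(size=(3, 200))                 # baseline traffic window
normal = base + 0.1 * rng.normal(size=(3, 200))  # ordinary fluctuation
attack = rng.normal(size=(3, 200))
attack[0] = 5 * attack[1] + 0.01 * rng.normal(size=200)  # scan: features lock together
print(corr_distance(base, normal), corr_distance(base, attack))
```

    A scan or flooding attack drives normally weakly related features into lock-step, which shows up as a large change in the off-diagonal correlations.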

  4. Multivariate Approaches to Classification in Extragalactic Astronomy

    Science.gov (United States)

    Fraix-Burnet, Didier; Thuillard, Marc; Chattopadhyay, Asis Kumar

    2015-08-01

    Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  5. Multivariate Approaches to Classification in Extragalactic Astronomy

    Directory of Open Access Journals (Sweden)

    Didier Fraix-Burnet

    2015-08-01

    Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  6. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

    This Ph.D. thesis, A Multivariate Approach to Functional Neuro Modeling, deals with the analysis and modeling of data from functional neuro imaging experiments. A multivariate dataset description is provided which facilitates efficient representation of typical datasets and, more importantly...... and overall conditions governing the functional experiment, via associated micro- and macroscopic variables. The description facilitates an efficient microscopic re-representation, as well as a handle on the link between brain and behavior; the latter is achieved by hypothesizing variations in the micro...... a generalization theoretical framework centered around measures of model generalization error. - Only few, if any, examples of the application of generalization theory to functional neuro modeling currently exist in the literature. - Exemplification of the proposed generalization theoretical framework...

  7. Multivariate Approaches to Classification in Extragalactic Astronomy

    CERN Document Server

    Fraix-Burnet, Didier; Chattopadhyay, Asis Kumar

    2015-01-01

    Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.

  8. Wavelet based Non LSB Steganography

    Directory of Open Access Journals (Sweden)

    H S Manjunatha Reddy

    2011-11-01

    Full Text Available Steganography is the method of communicating secret information hidden in a cover object. The hidden messages are embedded in host data such as digital image, video or audio files and then transmitted secretly to the destination. In this paper we propose Wavelet based Non LSB Steganography (WNLS). The cover image is segmented into 4*4 cells and DWT/IWT is applied on each cell. The 2*2 cells of the HH band of the DWT/IWT are manipulated with payload bit pairs using an identity matrix to generate the stego image. A key is used to extract the payload bit pairs at the destination. It is observed that the PSNR values are better in the case of IWT compared to DWT for all image formats. The algorithm cannot be detected by existing steganalysis techniques such as chi-square and pairs-of-values analysis. The PSNR values are higher for raw images than for formatted images.
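A minimal sketch of this style of integer-wavelet embedding (a simplified stand-in: a 1-D lifting Haar step with LSB placement in the detail coefficients, not the paper's 4*4-cell, identity-matrix manipulation of HH pairs; all names are illustrative):

```python
import numpy as np

def s_transform(x):
    """Integer (lifting) Haar step: approximation s and detail d, exactly invertible."""
    x = np.asarray(x, dtype=np.int64)
    d = x[0::2] - x[1::2]
    s = x[1::2] + (d >> 1)
    return s, d

def inv_s_transform(s, d):
    x1 = s - (d >> 1)
    x0 = x1 + d
    out = np.empty(s.size * 2, dtype=np.int64)
    out[0::2], out[1::2] = x0, x1
    return out

def embed_bits(pixels, bits):
    """Hide one payload bit in the LSB of each integer detail coefficient."""
    s, d = s_transform(pixels)
    d = (d & ~1) | np.asarray(bits, dtype=np.int64)
    return inv_s_transform(s, d)

def extract_bits(stego):
    _, d = s_transform(stego)
    return (d & 1).tolist()
```

Because the lifting transform maps integers to integers, the payload survives reconstruction exactly, which is the property that makes IWT-based embedding attractive over the floating-point DWT.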

  9. Wavelet-based multispectral face recognition

    Institute of Scientific and Technical Information of China (English)

    LIU Dian-ting; ZHOU Xiao-dan; WANG Cheng-wen

    2008-01-01

    This paper proposes a novel wavelet-based face recognition method using thermal infrared (IR) and visible-light face images. The method applies the combination of Gabor filters and the Fisherfaces method to the reconstructed IR and visible images derived from wavelet frequency subbands. Our objective is to search for the subbands that are insensitive to variations in expression and illumination. The classification performance is improved by combining the multispectral information coming from the subbands that individually attain low equal error rates. Experimental results on the Notre Dame face database show that the proposed wavelet-based algorithm outperforms previous multispectral image fusion methods as well as monospectral methods.

  10. Wavelet-based prediction of oil prices

    Energy Technology Data Exchange (ETDEWEB)

    Yousefi, Shahriar [Econometric Group, Department of Economics, University of Southern Denmark, DK-5230 Odense M (Denmark); Weinreich, Ilona [Department of Mathematics and Technology, University of Applied Sciences Koblenz, RheinAhr Campus, D-53424 Remagen (Germany)]. E-mail: weinreich@rheinahrcampus.de; Reinarz, Dominik [Department of Mathematics and Technology, University of Applied Sciences Koblenz, RheinAhr Campus, D-53424 Remagen (Germany)

    2005-07-01

    This paper illustrates an application of wavelets as a possible vehicle for investigating the issue of market efficiency in futures markets for oil. The paper provides a short introduction to wavelets, and a few interesting wavelet-based contributions in economics and finance are briefly reviewed. A wavelet-based prediction procedure is introduced, and market data on crude oil are used to provide forecasts over different forecasting horizons. The results are compared with data from futures markets for oil, and the relative performance of this procedure is used to investigate whether futures markets are efficiently priced.

  11. A multivariate approach in measuring innovation performance

    Directory of Open Access Journals (Sweden)

    Elżbieta Roszko-Wójtowicz

    2016-12-01

    Full Text Available The goal of this research is to propose a procedure for measuring innovativeness, taking the Summary Innovation Index methodology as a starting point. In the contemporary world, innovative activity is perceived as a source of competitiveness and economic growth. New products, utility models, trademarks and creative projects are an important element of present socio-economic reality. In particular, the authors focus on the selection and application of multivariate statistical analysis to distinguish the factors influencing the innovativeness of EU economies to the highest degree. The result of the quantitative analyses is a linear ordering of EU countries by the level of their innovativeness, based on a reduced set of diagnostic variables. The rating was compared with the outcome presented in the Innovation Union Scoreboard (IUS) with the Summary Innovation Index (SII). The conducted analysis shows a convergence between the authors' results and existing innovativeness ratings. Nevertheless, the main conclusion is that the methodology of innovativeness assessment remains an open issue and requires further research, which should first and foremost concentrate on deeper verification of the small set of variables that have the strongest impact on innovativeness. It is in both the economic and the social interest to get a clear picture of the driving forces of innovativeness.
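As a toy illustration of the linear-ordering step (standardize the diagnostic variables, aggregate, rank), with entirely invented data; the actual SII weighting is more elaborate:

```python
import numpy as np

def rank_by_composite(X):
    """Rows = countries, columns = innovation indicators (stimulants).
    Standardize each indicator, average into a composite score, and
    return row indices ordered from most to least innovative."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    return np.argsort(-Z.mean(axis=1))
```

With three hypothetical countries whose indicator levels are clearly ordered, the composite ranking reproduces that order.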

  12. Construction of a class of Daubechies type wavelet bases

    Energy Technology Data Exchange (ETDEWEB)

    Li Dengfeng [Institute of Applied Mathematics, School of Mathematics and Information Sciences Henan University, Kaifeng 475001 (China); Wu Guochang [College of Information, Henan University of Finance and Economics, Zhengzhou 450002 (China)], E-mail: archang-0111@163.com

    2009-10-15

    Extensive work has been done in the theory and the construction of compactly supported orthonormal wavelet bases of L{sup 2}(R). Some of the most distinguished work was done by Daubechies, who constructed a whole family of such wavelet bases. In this paper, we construct a class of orthonormal wavelet bases by using the principle of Daubechies, and investigate the length of support and the regularity of these wavelet bases.

  13. Removal of muscle artifact from EEG data: comparison between stochastic (ICA and CCA) and deterministic (EMD and wavelet-based) approaches

    Science.gov (United States)

    Safieddine, Doha; Kachenoura, Amar; Albera, Laurent; Birot, Gwénaël; Karfoul, Ahmad; Pasnicu, Anca; Biraben, Arnaud; Wendling, Fabrice; Senhadji, Lotfi; Merlet, Isabelle

    2012-12-01

    Electroencephalographic (EEG) recordings are often contaminated with muscle artifacts. This disturbing myogenic activity not only strongly affects the visual analysis of EEG, but also most surely impairs the results of EEG signal processing tools such as source localization. This article focuses on the particular context of the contamination of epileptic signals (interictal spikes) by muscle artifact, as EEG is a key diagnostic tool for this pathology. In this context, our aim was to compare the ability of two stochastic approaches to blind source separation, namely independent component analysis (ICA) and canonical correlation analysis (CCA), and of two deterministic approaches, namely empirical mode decomposition (EMD) and the wavelet transform (WT), to remove muscle artifacts from EEG signals. To quantitatively compare the performance of these four algorithms, epileptic spike-like EEG signals were simulated from two different source configurations and artificially contaminated with different levels of real EEG-recorded myogenic activity. The efficiency of CCA, ICA, EMD, and WT in correcting the muscular artifact was evaluated both by calculating the normalized mean-squared error between denoised and original signals and by comparing the results of source localization obtained from artifact-free as well as noisy signals, before and after artifact correction. Tests on real data recorded in an epileptic patient are also presented. The results obtained for simulations and real data show that EMD outperformed the three other algorithms for the denoising of data highly contaminated by muscular activity. For less noisy data, and when spikes arose from a single cortical source, the myogenic artifact was best corrected with CCA and ICA. When spikes originated from two distinct sources, either EMD or ICA offered the most reliable denoising result for highly noisy data, while WT offered the better denoising result for less noisy data. These results suggest that
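The WT-based correction compared in such studies is typically a decompose / threshold / reconstruct loop. A minimal numpy sketch using a Haar filter bank and the universal soft threshold (these specific choices are assumptions; the article's wavelet and threshold rule may differ):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform (length must be even)."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(a, d):
    out = np.empty(a.size * 2)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def soft(x, t):
    """Soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def wt_denoise(signal, levels=3):
    approx, details = signal.astype(float), []
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        details.append(d)
    # noise level estimated from the finest-scale details (MAD rule),
    # then the universal threshold sigma * sqrt(2 log n)
    sigma = np.median(np.abs(details[0])) / 0.6745
    t = sigma * np.sqrt(2 * np.log(signal.size))
    for d in reversed(details):
        approx = haar_idwt(approx, soft(d, t))
    return approx
```

On a smooth signal buried in broadband noise, this shrinks mostly noise-dominated detail coefficients and leaves the coarse approximation intact.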

  14. Multivariate analysis of 2-DE protein patterns - Practical approaches

    DEFF Research Database (Denmark)

    Jacobsen, Charlotte; Jacobsen, Susanne; Grove, H.

    2007-01-01

    Practical approaches to the use of multivariate data analysis of 2-DE protein patterns are demonstrated by three independent strategies for the image analysis and the multivariate analysis on the same set of 2-DE data. Four wheat varieties were selected on the basis of their baking quality. Two...... of the varieties were of strong baking quality and hard wheat kernel and two were of weak baking quality and soft kernel. Gliadins at different stages of grain development were analyzed by the application of multivariate data analysis on images of 2-DEs. Patterns related to the wheat varieties, harvest times...... and quality were detected on images of 2-DE protein patterns for all the three strategies. The use of the multivariate methods was evaluated in the alignment and matching procedures of 2-DE gels. All the three strategies were able to discriminate the samples according to quality, harvest time and variety...

  15. Wavelet-based LASSO in functional linear regression.

    Science.gov (United States)

    Zhao, Yihong; Ogden, R Todd; Reiss, Philip T

    2012-07-01

    In linear regression with functional predictors and scalar responses, it may be advantageous, particularly if the function is thought to contain features at many scales, to restrict the coefficient function to the span of a wavelet basis, thereby converting the problem into one of variable selection. If the coefficient function is sparsely represented in the wavelet domain, we may employ the well-known LASSO to select a relatively small number of nonzero wavelet coefficients. This is a natural approach to take but to date, the properties of such an estimator have not been studied. In this paper we describe the wavelet-based LASSO approach to regressing scalars on functions and investigate both its asymptotic convergence and its finite-sample performance through both simulation and real-data application. We compare the performance of this approach with existing methods and find that the wavelet-based LASSO performs relatively well, particularly when the true coefficient function is spiky. Source code to implement the method and data sets used in the study are provided as supplemental materials available online.
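In outline, the estimator expands the coefficient function in an orthonormal wavelet basis and applies the LASSO to the resulting coefficients. The sketch below uses a Haar basis and plain ISTA as the solver, both of which are assumptions rather than the authors' exact choices:

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix (n must be a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])
    return np.vstack([top, bot]) / np.sqrt(2)

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def wavelet_lasso(X, y, lam, iters=500):
    """ISTA on wavelet-domain coefficients: y ~ (X W^T) beta with beta sparse."""
    W = haar_matrix(X.shape[1])
    Z = X @ W.T                      # functional predictors in the wavelet domain
    L = np.linalg.norm(Z, 2) ** 2    # Lipschitz constant of the gradient
    beta = np.zeros(Z.shape[1])
    for _ in range(iters):
        beta = soft(beta - Z.T @ (Z @ beta - y) / L, lam / L)
    return W.T @ beta                # coefficient function back on the original grid
```

Sparsity is imposed in the wavelet domain, so a spiky coefficient function (few large wavelet coefficients) is exactly the favorable case the abstract mentions.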

  16. Complex Wavelet Based Modulation Analysis

    DEFF Research Database (Denmark)

    Luneau, Jean-Marc; Lebrun, Jérôme; Jensen, Søren Holdt

    2008-01-01

     because only the magnitudes are taken into account and the phase data is often neglected. We remedy this problem with the use of a complex wavelet transform as a more appropriate envelope and phase processing tool. Complex wavelets carry both magnitude and phase explicitly with great sparsity and preserve well...... polynomial trends. Moreover an analytic Hilbert-like transform is possible with complex wavelets implemented as an orthogonal filter bank. By working in an alternative transform domain coined as “Modulation Subbands”, this transform shows very promising denoising capabilities and suggests new approaches for joint...

  17. Wavelet-based deconvolution of ultrasonic signals in nondestructive evaluation

    Institute of Scientific and Technical Information of China (English)

    HERRERA Roberto Henry; OROZCO Rubén; RODRIGUEZ Manuel

    2006-01-01

    In this paper, the inverse problem of reconstructing reflectivity function of a medium is examined within a blind deconvolution framework. The ultrasound pulse is estimated using higher-order statistics, and Wiener filter is used to obtain the ultrasonic reflectivity function through wavelet-based models. A new approach to the parameter estimation of the inverse filtering step is proposed in the nondestructive evaluation field, which is based on the theory of Fourier-Wavelet regularized deconvolution (ForWaRD). This new approach can be viewed as a solution to the open problem of adaptation of the ForWaRD framework to perform the convolution kernel estimation and deconvolution interdependently. The results indicate stable solutions of the estimated pulse and an improvement in the radio-frequency (RF) signal taking into account its signal-to-noise ratio (SNR) and axial resolution. Simulations and experiments showed that the proposed approach can provide robust and optimal estimates of the reflectivity function.

  18. Research on Wavelet-Based Algorithm for Image Contrast Enhancement

    Institute of Scientific and Technical Information of China (English)

    Wu Ying-qian; Du Pei-jun; Shi Peng-fei

    2004-01-01

    A novel wavelet-based algorithm for image enhancement is proposed in this paper. On the basis of multiscale analysis, the proposed algorithm efficiently solves the problem of noise over-enhancement, which commonly occurs in traditional methods for contrast enhancement. The decomposed coefficients at the same scale are processed by a nonlinear method, and the coefficients at different scales are enhanced to different degrees. During the procedure, the method takes full advantage of the properties of the human visual system so as to achieve better performance. The simulations demonstrate that these characteristics of the proposed approach enable it to fully enhance the content of images, to efficiently alleviate the enhancement of noise and to achieve a much better enhancement effect than the traditional approaches.
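A one-level 2D Haar version of such a decompose / nonlinearly-remap / reconstruct scheme might look as follows; the threshold-and-gain nonlinearity is a generic stand-in for the paper's HVS-motivated mapping:

```python
import numpy as np

def haar2(img):
    """One-level 2D Haar: approximation LL plus detail subbands LH, HL, HH."""
    a = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)
    d = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)
    ll = (a[0::2] + a[1::2]) / np.sqrt(2)
    lh = (a[0::2] - a[1::2]) / np.sqrt(2)
    hl = (d[0::2] + d[1::2]) / np.sqrt(2)
    hh = (d[0::2] - d[1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    a = np.empty((ll.shape[0] * 2, ll.shape[1]))
    d = np.empty_like(a)
    a[0::2] = (ll + lh) / np.sqrt(2)
    a[1::2] = (ll - lh) / np.sqrt(2)
    d[0::2] = (hl + hh) / np.sqrt(2)
    d[1::2] = (hl - hh) / np.sqrt(2)
    img = np.empty((a.shape[0], a.shape[1] * 2))
    img[:, 0::2] = (a + d) / np.sqrt(2)
    img[:, 1::2] = (a - d) / np.sqrt(2)
    return img

def enhance(img, noise_t=2.0, gain=2.0):
    """Suppress sub-threshold (noise) detail coefficients, amplify the rest."""
    ll, lh, hl, hh = haar2(img.astype(float))
    boost = lambda c: np.where(np.abs(c) < noise_t, 0.0, gain * c)
    return ihaar2(ll, boost(lh), boost(hl), boost(hh))
```

Leaving the LL band untouched preserves overall brightness; only the detail subbands, which carry edges and noise, are remapped.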

  19. Palmprint Recognition by Applying Wavelet-Based Kernel PCA

    Institute of Scientific and Technical Information of China (English)

    Murat Ekinci; Murat Aykut

    2008-01-01

    This paper presents a wavelet-based kernel Principal Component Analysis (PCA) method for palmprint recognition, integrating the Daubechies wavelet representation of palm images with the kernel PCA method. Kernel PCA is a technique for nonlinear dimension reduction of data with an underlying nonlinear spatial structure. The intensity values of the palmprint image are first normalized using the mean and standard deviation. The palmprint is then transformed into the wavelet domain to decompose the palm images, and the lowest-resolution subband coefficients are chosen for palm representation. The kernel PCA method is then applied to extract nonlinear features from the subband coefficients. Finally, similarity measurement is accomplished by using a weighted Euclidean distance-based nearest neighbor classifier. Experimental results on the PolyU Palmprint Databases demonstrate that the proposed approach achieves highly competitive performance with respect to published palmprint recognition approaches.

  20. Mulch materials in processing tomato: a multivariate approach

    Directory of Open Access Journals (Sweden)

    Marta María Moreno

    2013-08-01

    Full Text Available Mulch materials of different origins have been introduced into the agricultural sector in recent years as alternatives to standard polyethylene, due to its environmental impact. This study aimed to evaluate the multivariate response of mulch materials over three consecutive years in a processing tomato (Solanum lycopersicum L.) crop in Central Spain. Two biodegradable plastic mulches (BD1, BD2), one oxo-biodegradable material (OB), two types of paper (PP1, PP2), and one barley straw cover (BS) were compared using two control treatments (standard black polyethylene [PE] and manual weed control [MW]). A total of 17 variables relating to yield, fruit quality, and weed control were investigated. Several multivariate statistical techniques were applied, including principal component analysis, cluster analysis, and discriminant analysis. A group of mulch materials comprised of OB and BD2 was found to be comparable to black polyethylene with regard to all the variables considered. The weed control variables were found to be an important source of discrimination. The two paper mulches tested did not share the same treatment group membership in any case: PP2 presented a multivariate response more similar to the biodegradable plastics, while PP1 was more similar to BS and MW. Based on our multivariate approach, the materials OB and BD2 can be used as effective, more environmentally friendly alternatives to polyethylene mulches.

  1. A Simplified Approach to Multivariable Model Predictive Control

    Directory of Open Access Journals (Sweden)

    Michael Short

    2015-01-01

    Full Text Available The benefits of applying the range of technologies generally known as Model Predictive Control (MPC) to the control of industrial processes have been well documented in recent years. One of the principal drawbacks of MPC schemes is the relatively high on-line computational burden when used with adaptive, constrained and/or multivariable processes, which has prompted some researchers and practitioners to seek simplified approaches for its implementation. To date, several schemes have been proposed based around a simplified 1-norm formulation of multivariable MPC, which is solved online using the simplex algorithm in both the unconstrained and constrained cases. In this paper a 2-norm approach to simplified multivariable MPC is formulated, which is solved online using a vector-matrix product or a simple iterative coordinate descent algorithm for the unconstrained and constrained cases, respectively. A CARIMA model is employed to ensure offset-free control, and a simple scheme to produce the optimal predictions is described. A small simulation study and further discussion help to illustrate that this quadratic formulation performs well and can be considered a useful adjunct to its linear counterpart, while still retaining beneficial features such as ease of computer-based implementation.

  2. Evaluation of droplet size distributions using univariate and multivariate approaches

    DEFF Research Database (Denmark)

    Gauno, M.H.; Larsen, C.C.; Vilhelmsen, T.

    2013-01-01

    of the distribution. The current study was aiming to compare univariate and multivariate approach in evaluating droplet size distributions. As a model system, the atomization of a coating solution from a two-fluid nozzle was investigated. The effect of three process parameters (concentration of ethyl cellulose....... Investigation of loading and score plots from principal component analysis (PCA) revealed additional information on the droplet size distributions and it was possible to identify univariate statistics (volume median droplet size), which were similar, however, originating from varying droplet size distributions....... The multivariate data analysis was proven to be an efficient tool for evaluating the full information contained in a distribution. © 2013 Informa Healthcare USA, Inc....

  3. Causal Information Approach to Partial Conditioning in Multivariate Data Sets

    Directory of Open Access Journals (Sweden)

    D. Marinazzo

    2012-01-01

    Full Text Available When evaluating causal influence from one time series to another in a multivariate data set it is necessary to take into account the conditioning effect of the other variables. In the presence of many variables and possibly of a reduced number of samples, full conditioning can lead to computational and numerical problems. In this paper, we address the problem of partial conditioning to a limited subset of variables, in the framework of information theory. The proposed approach is tested on simulated data sets and on an example of intracranial EEG recording from an epileptic subject. We show that, in many instances, conditioning on a small number of variables, chosen as the most informative ones for the driver node, leads to results very close to those obtained with a fully multivariate analysis and even better in the presence of a small number of samples. This is particularly relevant when the pattern of causalities is sparse.

  4. Wavelet-based acoustic recognition of aircraft

    Energy Technology Data Exchange (ETDEWEB)

    Dress, W.B.; Kercel, S.W.

    1994-09-01

    We describe a wavelet-based technique for identifying aircraft from acoustic emissions during take-off and landing. Tests show that the sensor can be a single, inexpensive hearing-aid microphone placed close to the ground. The paper describes data collection, analysis by various techniques, methods of event classification, and extraction of certain physical parameters from wavelet subspace projections. The primary goal of this paper is to show that wavelet analysis can be used as a divide-and-conquer first step in signal processing, providing both simplification and noise filtering. The idea is to project the original signal onto the orthogonal wavelet subspaces, both details and approximations. Subsequent analysis, such as system identification, nonlinear systems analysis, and feature extraction, is then carried out on the various signal subspaces.

  5. Wavelet-Based Monitoring for Biosurveillance

    Directory of Open Access Journals (Sweden)

    Galit Shmueli

    2013-07-01

    Full Text Available Biosurveillance, focused on the early detection of disease outbreaks, relies on classical statistical control charts for detecting disease outbreaks. However, such methods are not always suitable in this context. Assumptions of normality, independence and stationarity are typically violated in syndromic data. Furthermore, outbreak signatures are typically of unknown pattern and therefore call for general detectors. We propose wavelet-based methods, which make fewer assumptions and are suitable for detecting abnormalities of unknown form. Wavelets have been widely used for data denoising and compression, but little work has been published on using them for monitoring. We discuss monitoring-related issues and illustrate them using data on military clinic visits in the USA.

  6. Anisotropy in wavelet-based phase field models

    KAUST Repository

    Korzec, Maciek

    2016-04-01

    When describing the anisotropic evolution of microstructures in solids using phase-field models, the anisotropy of the crystalline phases is usually introduced into the interfacial energy by directional dependencies of the gradient energy coefficients. We consider an alternative approach based on a wavelet analogue of the Laplace operator that is intrinsically anisotropic and linear. The paper focuses on the classical coupled temperature/Ginzburg--Landau type phase-field model for dendritic growth. For the model based on the wavelet analogue, existence, uniqueness and continuous dependence on initial data are proved for weak solutions. Numerical studies of the wavelet based phase-field model show dendritic growth similar to the results obtained for classical phase-field models.

  7. A multivariable approach toward predicting dental motor skill performance.

    Science.gov (United States)

    Wilson, S G; Husak, W S

    1988-08-01

    The purpose of the present study was to examine the potential of a multivariable approach in predicting dental motor skill performance. Variables measuring cognitive knowledge, motor abilities, educational background, and family demographics were examined. Data were obtained from 33 first-year dental students. Scaling and root planing tests were administered to each student at the beginning and end of a 14-week preclinical periodontal course. Correlations were low and no variable significantly predicted pre- or posttest scaling and root planing performance. Results are discussed in terms of the problems associated with predicting motor performance.

  8. An approach to the linear multivariable servomechanism problem.

    Science.gov (United States)

    Young, P. C.; Willems, J. C.

    1972-01-01

    This paper presents a state-space approach to the multivariable 'type one' servomechanism problem. Necessary and sufficient conditions for the controllability of such systems are derived and applied to the observability of the (dual) state reconstructor problem for a system with an unknown constant input. The paper also presents a simple systematic design algorithm which provides type one servomechanism performance to command inputs, together with pre-specified closed-loop pole locations. Examples are given to illustrate the utility of the design procedure.

  9. Evaluation of droplet size distributions using univariate and multivariate approaches.

    Science.gov (United States)

    Gaunø, Mette Høg; Larsen, Crilles Casper; Vilhelmsen, Thomas; Møller-Sonnergaard, Jørn; Wittendorff, Jørgen; Rantanen, Jukka

    2013-01-01

    Pharmaceutically relevant material characteristics are often analyzed on the basis of univariate descriptors instead of utilizing the whole information available in the full distribution. One example is the droplet size distribution, which is often described by the median droplet size and the width of the distribution. The current study aimed to compare univariate and multivariate approaches for evaluating droplet size distributions. As a model system, the atomization of a coating solution from a two-fluid nozzle was investigated. The effect of three process parameters (concentration of ethyl cellulose in ethanol, atomizing air pressure, and flow rate of coating solution) on the droplet size and droplet size distribution was investigated using a full mixed factorial design. The droplet size produced by the two-fluid nozzle was measured by laser diffraction and reported as a volume-based size distribution. Investigation of loading and score plots from principal component analysis (PCA) revealed additional information on the droplet size distributions, and it was possible to identify univariate statistics (volume median droplet size) that were similar but originated from differing droplet size distributions. Multivariate data analysis proved to be an efficient tool for evaluating the full information contained in a distribution.
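The loading/score inspection can be sketched with PCA via SVD on binned distributions. The two synthetic families below share a median but differ in width, mimicking the situation the authors describe (all data invented):

```python
import numpy as np

def pca(X, k=2):
    """PCA via thin SVD: rows = measured distributions, columns = size bins."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]    # sample coordinates (score plot)
    loadings = Vt[:k]            # size-bin contributions (loading plot)
    return scores, loadings
```

Two groups with identical medians but different widths are indistinguishable by the median alone, yet separate cleanly along the first principal component.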

  10. Approaches to sample size determination for multivariate data

    NARCIS (Netherlands)

    Saccenti, Edoardo; Timmerman, Marieke E.

    2016-01-01

    Sample size determination is a fundamental step in the design of experiments. Methods for sample size determination are abundant for univariate analysis methods, but scarce in the multivariate case. Omics data are multivariate in nature and are commonly investigated using multivariate statistical

  11. TOURISM SEGMENTATION BASED ON TOURISTS PREFERENCES: A MULTIVARIATE APPROACH

    Directory of Open Access Journals (Sweden)

    Sérgio Dominique Ferreira

    2010-11-01

    Full Text Available Over the last decades, tourism has become one of the most important sectors of the international economy. In Portugal and Brazil specifically, its contribution to Gross Domestic Product (GDP) and job creation is quite relevant. In this sense, it becomes paramount to follow a strong marketing approach in the management of the tourism resources of a country. Such an approach should be based on innovations which help unveil the preferences of tourists with accuracy, turning them into a competitive advantage. In this context, the main objective of the present study is to illustrate the importance and benefits associated with the use of multivariate methodologies for market segmentation. Another objective of this work is to illustrate the importance of post hoc segmentation. The authors applied a cluster analysis, with a hierarchical method followed by an optimization method. The main results of this study allow the identification of five clusters that are distinguished by assigning special importance to certain tourism attributes when choosing a specific destination. Thus, the authors present the advantages of post hoc segmentation based on tourists' preferences, in opposition to a priori segmentation based on socio-demographic characteristics.

  12. A wavelet-based two-stage near-lossless coder.

    Science.gov (United States)

    Yea, Sehoon; Pearlman, William A

    2006-11-01

    In this paper, we present a two-stage near-lossless compression scheme. It belongs to the class of "lossy plus residual coding" and consists of a wavelet-based lossy layer followed by arithmetic coding of the quantized residual to guarantee a given L(infinity) error bound in the pixel domain. We focus on the selection of the optimum bit rate for the lossy layer to achieve the minimum total bit rate. Unlike other similar lossy plus lossless approaches using a wavelet-based lossy layer, the proposed method does not require iteration of decoding and inverse discrete wavelet transform in succession to locate the optimum bit rate. We propose a simple method to estimate the optimal bit rate, with a theoretical justification based on the critical rate argument from the rate-distortion theory and the independence of the residual error.
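The L(infinity) guarantee of the residual layer follows from uniform quantization of the integer residual with step 2*delta + 1; a sketch (the entropy coding of the indices is omitted):

```python
import numpy as np

def near_lossless_residual(residual, delta):
    """Quantize the lossy-layer residual so the pixel-domain error is at most delta.

    With integer residuals and step = 2*delta + 1, rounding to the nearest
    multiple of the step leaves an integer error of magnitude <= delta.
    """
    step = 2 * delta + 1
    q = np.round(residual / step).astype(int)   # indices to entropy-code
    return q, q * step                          # dequantized residual
```

Setting delta = 0 degenerates to lossless coding of the residual, which is why this family is called "lossy plus residual" coding.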

  13. multivariate approach to the study of aquatic species diversity of ...

    African Journals Online (AJOL)

    User

    2016-12-02

    Dec 2, 2016 ... Generalized Linear Model further revealed the pattern in ... Hierarchical framework at multiple spatial levels can ... Limitations of most multivariate applications included the absence of ... HANNA Instruction Manual. Five days ...

  14. AN IMPROVED MULTIVARIATE LOSS FUNCTION APPROACH TO OPTIMIZATION

    Institute of Scientific and Technical Information of China (English)

    Yizhong MA; Fengyu ZHAO

    2004-01-01

    The basic purpose of a quality loss function is to evaluate a loss to customers in a quantitative manner. Although several multivariate loss functions have been proposed and studied in the literature, there is room for improvement. A good multivariate loss function should represent an appropriate compromise in terms of both process economics and the correlation structure among the various responses. More importantly, it should be easily understood and implemented in practice. According to this criterion, we first introduce a pragmatic dimensionless multivariate loss function proposed by Artiles-Leon; then we improve the multivariate loss function in two respects: one is making it suitable for all three types of quality characteristics; the other is considering the correlation structure among the various responses, which makes the improved multivariate loss function more adequate for the real world. On this basis, an example from industrial practice is provided to compare our improved method with other methods, and finally some concluding remarks are presented.

  15. Univariate and multivariate Chen-Stein characterizations -- a parametric approach

    CERN Document Server

    Ley, Christophe

    2011-01-01

    We provide a general framework for characterizing families of (univariate, multivariate, discrete and continuous) distributions in terms of a parameter of interest. We show how this allows for recovering known Chen-Stein characterizations, and for constructing many more. Several examples are worked out in full, and different potential applications are discussed.

  16. 3D Wavelet-Based Filter and Method

    Science.gov (United States)

    Moss, William C.; Haase, Sebastian; Sedat, John W.

    2008-08-12

    A 3D wavelet-based filter for visualizing and locating structural features of a user-specified linear size in 2D or 3D image data. The only input parameter is a characteristic linear size of the feature of interest, and the filter output contains only those regions that are correlated with the characteristic size, thus denoising the image.

  17. WAVELET BASED SPECTRAL CORRELATION METHOD FOR DPSK CHIP RATE ESTIMATION

    Institute of Scientific and Technical Information of China (English)

    Li Yingxiang; Xiao Xianci; Tai Hengming

    2004-01-01

    A wavelet-based spectral correlation algorithm to detect and estimate BPSK signal chip rate is proposed. Simulation results show that the proposed method can correctly estimate the BPSK signal chip rate, which may be corrupted by the quadratic characteristics of the spectral correlation function, in a low SNR environment.
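    The premise that a PSK waveform carries a signature at its chip rate can be illustrated with a deliberately simplified toy (the paper's actual wavelet-based spectral correlation algorithm is more involved); the sample rate and chip length below are arbitrary assumptions:

```python
import numpy as np

fs = 1000.0                          # sample rate in Hz (assumed)
chip_len = 20                        # samples per chip -> true chip rate 50 Hz
chips = np.tile([1.0, -1.0], 100)    # alternating chips keep the toy deterministic
x = np.repeat(chips, chip_len)       # rectangular-pulse PSK baseband

# Phase transitions appear as impulses in |diff(x)|; their spacing
# gives the chip period, hence the chip rate.
edges = np.flatnonzero(np.abs(np.diff(x)) > 0)
chip_period = float(np.median(np.diff(edges)))
chip_rate = fs / chip_period
```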

  18. A Speckle Reduction Filter Using Wavelet-Based Methods for Medical Imaging Application

    Science.gov (United States)

    2001-10-25


  19. Evaluating Wall Street Journal survey forecasters: a multivariate approach

    OpenAIRE

    Eisenbeis, Robert; Waggoner, Daniel; Zha, Tao

    2002-01-01

    This paper proposes a methodology for assessing the joint performance of multivariate forecasts of economic variables. The methodology is illustrated by comparing the rankings of forecasters by the Wall Street Journal with the authors’ alternative rankings. The results show that the methodology can provide useful insights as to the certainty of forecasts as well as the extent to which various forecasts are similar or different.

  1. Mulch materials in processing tomato: a multivariate approach

    OpenAIRE

    Marta María Moreno; Carmen Moreno; Ana María Tarquis

    2013-01-01

    Mulch materials of different origins have been introduced into the agricultural sector in recent years as alternatives to standard polyethylene due to its environmental impact. This study aimed to evaluate the multivariate response of mulch materials over three consecutive years in a processing tomato (Solanum lycopersicon L.) crop in Central Spain. Two biodegradable plastic mulches (BD1, BD2), one oxo-biodegradable material (OB), two types of paper (PP1, PP2), and one barley straw cover (B...

  2. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution in dependence of data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen...

  3. A Wavelet-Based Multiresolution Reconstruction Method for Fluorescent Molecular Tomography

    Directory of Open Access Journals (Sweden)

    Wei Zou

    2009-01-01

    Full Text Available Image reconstruction of fluorescent molecular tomography (FMT often involves repeatedly solving large-dimensional matrix equations, which are computationally expensive, especially for the case where there are large deviations in the optical properties between the target and the reference medium. In this paper, a wavelet-based multiresolution reconstruction approach is proposed for the FMT reconstruction in combination with a parallel forward computing strategy, in which both the forward and the inverse problems of FMT are solved in the wavelet domain. Simulation results demonstrate that the proposed approach can significantly speed up the reconstruction process and improve the image quality of FMT.

  4. A correlation consistency based multivariate alarm thresholds optimization approach.

    Science.gov (United States)

    Gao, Huihui; Liu, Feifei; Zhu, Qunxiong

    2016-11-01

    Different alarm thresholds could generate different alarm data, resulting in different correlations. A new multivariate alarm thresholds optimization methodology based on the correlation consistency between process data and alarm data is proposed in this paper. Interpretative structural modeling is adopted to select the key variables. For the key variables, the correlation coefficients of the process data are calculated by Pearson correlation analysis, while the correlation coefficients of the alarm data are calculated by kernel density estimation. To ensure correlation consistency, the objective function is established as the sum of the absolute differences between these two types of correlations. The optimal thresholds are obtained using a particle swarm optimization algorithm. A case study of the Tennessee Eastman process is given to demonstrate the effectiveness of the proposed method.
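    A minimal sketch of the correlation-consistency objective, using synthetic process data and a coarse grid search standing in for the paper's particle swarm optimization (the kernel-density step is omitted; simple binary alarm correlations are used instead):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic process data: two correlated plant variables
x1 = rng.normal(size=5000)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=5000)
proc_corr = np.corrcoef(x1, x2)[0, 1]

def consistency_gap(t1, t2):
    # Alarm data: 1 whenever a variable exceeds its threshold
    a1 = (x1 > t1).astype(float)
    a2 = (x2 > t2).astype(float)
    alarm_corr = np.corrcoef(a1, a2)[0, 1]
    # Objective: |corr(process data) - corr(alarm data)|
    return abs(proc_corr - alarm_corr)

# Coarse grid search standing in for particle swarm optimization
grid = np.linspace(0.0, 2.5, 26)
best_gap, best_t1, best_t2 = min(
    (consistency_gap(t1, t2), t1, t2) for t1 in grid for t2 in grid)
```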

  5. Energy and economic growth in the USA: a multivariate approach

    Energy Technology Data Exchange (ETDEWEB)

    Stern, D.I. (Boston Univ., MA (United States). Center for Energy and Environmental Studies)

    1993-04-01

    This paper examines the causal relationship between Gross Domestic Product and energy use for the period 1947-90 in the United States of America. The relationship between energy use and economic growth has been examined by both biophysical and neoclassical economists. In particular, several studies have tested for the presence of causal relationships (in the Granger sense) between energy use and economic growth. However, these tests do not allow a direct test of the relative explanatory powers of the neoclassical and biophysical models. A multivariate adaptation of the test-vector autoregression (VAR) does allow such a test. A VAR of GDP, energy use, capital stock and employment is estimated and Granger tests for causal relationships between the variables are carried out. Although there is no evidence that gross energy use Granger causes GDP, a measure of final energy use adjusted for changing fuel composition does Granger cause GDP. (author)
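    The Granger test underlying this kind of analysis can be sketched as follows: regress a variable on its own lags, add lags of the candidate cause, and check whether the fit improves significantly. This toy uses one lag and synthetic data, not the GDP/energy series of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    # y depends on its own past and on lagged x, so x Granger-causes y
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

def rss(design, target):
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ beta
    return float(resid @ resid)

ones = np.ones(n - 1)
restricted = rss(np.column_stack([ones, y[:-1]]), y[1:])            # own lag only
unrestricted = rss(np.column_stack([ones, y[:-1], x[:-1]]), y[1:])  # + lagged x
# F statistic for adding one regressor (lag order 1 assumed)
F = (restricted - unrestricted) / (unrestricted / (n - 1 - 3))
```

    A large F rejects the null that lagged x adds no explanatory power, i.e. x Granger-causes y.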

  6. Multivariate respiratory motion prediction

    Science.gov (United States)

    Dürichen, R.; Wissel, T.; Ernst, F.; Schlaefer, A.; Schweikard, A.

    2014-10-01

    In extracranial robotic radiotherapy, tumour motion is compensated by tracking external and internal surrogates. To compensate system specific time delays, time series prediction of the external optical surrogates is used. We investigate whether the prediction accuracy can be increased by expanding the current clinical setup by an accelerometer, a strain belt and a flow sensor. Four previously published prediction algorithms are adapted to multivariate inputs—normalized least mean squares (nLMS), wavelet-based least mean squares (wLMS), support vector regression (SVR) and relevance vector machines (RVM)—and evaluated for three different prediction horizons. The measurement involves 18 subjects and consists of two phases, focusing on long term trends (M1) and breathing artefacts (M2). To select the most relevant and least redundant sensors, a sequential forward selection (SFS) method is proposed. Using a multivariate setting, the results show that the clinically used nLMS algorithm is susceptible to large outliers. In the case of irregular breathing (M2), the mean root mean square error (RMSE) of a univariate nLMS algorithm is 0.66 mm and can be decreased to 0.46 mm by a multivariate RVM model (best algorithm on average). To investigate the full potential of this approach, the optimal sensor combination was also estimated on the complete test set. The results indicate that a further decrease in RMSE is possible for RVM (to 0.42 mm). This motivates further research about sensor selection methods. Besides the optical surrogates, the sensors most frequently selected by the algorithms are the accelerometer and the strain belt. These sensors could be easily integrated in the current clinical setup and would allow a more precise motion compensation.
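    A minimal multivariate normalized LMS (nLMS) predictor in the spirit of the clinical baseline described above; the sinusoidal "breathing" and "accelerometer" channels, horizon, and tap count are illustrative assumptions:

```python
import numpy as np

def nlms_predict(signals, horizon=5, taps=10, mu=0.5, eps=1e-6):
    # Predict the first channel `horizon` samples ahead from recent samples
    # of all channels, stacked into one regressor (multivariate nLMS sketch).
    m, n = signals.shape
    w = np.zeros(m * taps)
    preds = np.zeros(n)
    for t in range(taps, n - horizon):
        u = signals[:, t - taps:t].ravel()
        preds[t + horizon] = w @ u
        err = signals[0, t + horizon] - preds[t + horizon]
        w += mu * err * u / (eps + u @ u)   # normalized LMS weight update
    return preds

t = np.arange(2000)
resp = np.sin(2 * np.pi * t / 100)        # surrogate optical breathing signal
acc = np.cos(2 * np.pi * t / 100)         # hypothetical accelerometer channel
preds = nlms_predict(np.vstack([resp, acc]))
rmse = np.sqrt(np.mean((preds[1000:1995] - resp[1000:1995]) ** 2))
```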

  7. A multivariate approach to oil hydrocarbon fingerprinting and spill source identification

    DEFF Research Database (Denmark)

    Christensen, Jan H.; Tomasi, Giorgio

    2016-01-01

    of the available data. A framework (integrated multivariate oil hydrocarbon fingerprinting - IMOF) for the use of chemometric approaches in tiered oil spill fingerprinting is presented in this chapter. It consists of four main steps where a suite of analytical instruments, data preprocessing and multivariate...

  8. Fingerprint spoof detection using wavelet based local binary pattern

    Science.gov (United States)

    Kumpituck, Supawan; Li, Dongju; Kunieda, Hiroaki; Isshiki, Tsuyoshi

    2017-02-01

    In this work, a fingerprint spoof detection method using an extended feature, namely the Wavelet-based Local Binary Pattern (Wavelet-LBP), is introduced. Conventional wavelet-based methods calculate the wavelet energy of sub-band images as the feature for discrimination, while we propose to use the Local Binary Pattern (LBP) operation to capture the local appearance of the sub-band images instead. The fingerprint image is first decomposed by the two-dimensional discrete wavelet transform (2D-DWT), and LBP is then applied to the derived wavelet sub-band images. Furthermore, the extracted features are used to train a Support Vector Machine (SVM) classifier to create the model for classifying fingerprint images as genuine or spoof. Experiments conducted on Fingerprint Liveness Detection Competition (LivDet) datasets show that the proposed feature improves fingerprint spoof detection.
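    The LBP operation at the core of the proposed feature can be sketched for a single 3x3 patch; in Wavelet-LBP it would be applied to the 2D-DWT sub-band images, and the neighbour ordering below is just one common convention:

```python
import numpy as np

def lbp_code(patch):
    # 8-neighbour LBP of a 3x3 patch; bits taken clockwise from top-left
    c = patch[1, 1]
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    bits = [1 if v >= c else 0 for v in neighbours]
    return sum(b << i for i, b in enumerate(bits))

patch = np.array([[9, 1, 9],
                  [1, 5, 1],
                  [9, 1, 9]])
code = lbp_code(patch)    # alternating bright/dark corners -> 0b01010101
```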

  9. Enhanced ATM Security using Biometric Authentication and Wavelet Based AES

    Directory of Open Access Journals (Sweden)

    Sreedharan Ajish

    2016-01-01

    Full Text Available Traditional ATM terminal customer recognition systems rely only on bank cards and passwords; such identity verification methods are imperfect and serve only a single function. Biometrics-based authentication offers several advantages over other authentication methods, and there has been a significant surge in the use of biometrics for user authentication in recent years. This paper presents a highly secured ATM banking system using biometric authentication and a wavelet-based Advanced Encryption Standard (AES) algorithm. Two levels of security are provided in the proposed design. First, we address security at the client side through a biometric authentication scheme combined with a 4-digit password; biometric authentication is achieved using the fingerprint image of the client. Second, we ensure a secured communication link between the client machine and the bank server using an optimized, energy-efficient, wavelet-based AES processor. The fingerprint image is the data for the encryption process, and the 4-digit password is the symmetric key. The performance of the ATM machine depends on ultra-high-speed encryption, very low power consumption, and algorithmic integrity. To achieve low power consumption and ultra-high-speed encryption at the ATM machine, an optimized wavelet-based AES algorithm is proposed. In this system, biometric and cryptographic techniques are used together for personal identity authentication to improve the security level. The design of the energy-efficient, wavelet-based AES processor is simulated in Quartus-II software, and the simulation results confirm its proper functionality. A comparison with other research works demonstrates its superiority.

  11. Analysis of a wavelet-based robust hash algorithm

    Science.gov (United States)

    Meixner, Albert; Uhl, Andreas

    2004-06-01

    This paper is a quantitative evaluation of a wavelet-based, robust authentication hashing algorithm. Based on the results of a series of robustness and tampering sensitivity tests, we describe possible shortcomings and propose various modifications to the algorithm to improve its performance. The second part of the paper describes an attack against the scheme. It allows an attacker to modify a tampered image such that its hash value closely matches the hash value of the original.

  12. Wavelet-Based Denoising Attack on Image Watermarking

    Institute of Scientific and Technical Information of China (English)

    XUAN Jian-hui; WANG Li-na; ZHANG Huan-guo

    2005-01-01

    In this paper, we propose wavelet-based denoising attack methods on image watermarking in the discrete cosine transform (DCT), discrete Fourier transform (DFT), or discrete wavelet transform (DWT) domain. Wiener filtering based on the wavelet transform is performed in the approximation subband to remove DCT- or DFT-domain watermarks, and adaptive wavelet soft thresholding is employed to remove the watermark residing in the detail subbands of the DWT domain.
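    The soft-thresholding step applied to detail subbands has a simple closed form; the paper's threshold selection is adaptive, while the fixed threshold here is purely illustrative:

```python
import numpy as np

def soft_threshold(coeffs, t):
    # Shrink detail coefficients toward zero by t; those below t vanish,
    # which suppresses low-amplitude watermark energy in the subband
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

detail = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
shrunk = soft_threshold(detail, 1.0)
```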

  13. Classification of Underwater Signals Using Wavelet-Based Decompositions

    Science.gov (United States)

    1998-06-01


  14. Wavelet-based Multiresolution Particle Methods

    Science.gov (United States)

    Bergdorf, Michael; Koumoutsakos, Petros

    2006-03-01

    Particle methods offer a robust numerical tool for solving transport problems across disciplines, such as fluid dynamics, quantitative biology or computer graphics. Their strength lies in their stability, as they do not discretize the convection operator, and appealing numerical properties, such as small dissipation and dispersion errors. Many problems of interest are inherently multiscale, and their efficient solution requires either multiscale modeling approaches or spatially adaptive numerical schemes. We present a hybrid particle method that employs a multiresolution analysis to identify and adapt to small scales in the solution. The method combines the versatility and efficiency of grid-based wavelet collocation methods while retaining the numerical properties and stability of particle methods. The accuracy and efficiency of this method are then assessed for transport and interface capturing problems in two and three dimensions, illustrating the capabilities and limitations of our approach.

  15. Wavelet-based coding of ultraspectral sounder data

    Science.gov (United States)

    Garcia-Vilchez, Fernando; Serra-Sagrista, Joan; Auli-Llinas, Francesc

    2005-08-01

    In this paper we provide a study concerning the suitability of well-known image coding techniques originally devised for lossy compression of still natural images when applied to lossless compression of ultraspectral sounder data. We present here the experimental results of six wavelet-based widespread coding techniques, namely EZW, IC, SPIHT, JPEG2000, SPECK and CCSDS-IDC. Since the considered techniques are 2-dimensional (2D) in nature but the ultraspectral data are 3D, a pre-processing stage is applied to convert the two spatial dimensions into a single spatial dimension. All the wavelet-based techniques are competitive when compared either to the benchmark prediction-based methods for lossless compression, CALIC and JPEG-LS, or to two common compression utilities, GZIP and BZIP2. EZW, SPIHT, SPECK and CCSDS-IDC provide a very similar performance, while IC and JPEG2000 improve the compression factor when compared to the other wavelet-based methods. Nevertheless, they are not competitive when compared to a fast precomputed vector quantizer. The benefits of applying a pre-processing stage, the Bias Adjusted Reordering, prior to the coding process in order to further exploit the spectral and/or spatial correlation when 2D techniques are employed, are also presented.

  16. WAVELET-BASED FINE GRANULARITY SCALABLE VIDEO CODING

    Institute of Scientific and Technical Information of China (English)

    Zhang Jiangshan; Zhu Guangxi

    2003-01-01

    This letter proposes an efficient wavelet-based Fine Granularity Scalable (FGS) coding scheme, where the base layer is encoded with a newly designed wavelet-based coder, and the enhancement layer is encoded with Progressive Fine Granularity Scalable (PFGS) coding. This algorithm involves multi-frame motion compensation, a rate-distortion optimizing strategy with a Lagrangian cost function, and context-based adaptive arithmetic coding. In order to improve the efficiency of the enhancement layer coding, an improved motion estimation scheme that uses information from both the base layer and the enhancement layer is also proposed in this letter. The wavelet-based coder significantly improves the coding efficiency of the base layer compared with MPEG-4 ASP (Advanced Simple Profile) and H.26L TML9. The PFGS coding is a significant improvement over MPEG-4 FGS coding at the enhancement layer. Experiments show that the single-layer coding efficiency gain of the proposed scheme is about 2.0-3.0dB and 0.3-1.0dB higher than that of MPEG-4 ASP and H.26L TML9, respectively. The overall coding efficiency gain of the proposed scheme is about 4.0-5.0dB higher than that of MPEG-4 FGS.

  17. Fast Wavelet-Based Visual Classification

    CERN Document Server

    Yu, Guoshen

    2008-01-01

    We investigate a biologically motivated approach to fast visual classification, directly inspired by the recent work of Serre et al. Specifically, trading off biological accuracy for computational efficiency, we explore using wavelet and grouplet-like transforms to parallel the tuning of visual cortex V1 and V2 cells, alternated with max operations to achieve scale and translation invariance. A feature selection procedure is applied during learning to accelerate recognition. We introduce a simple attention-like feedback mechanism, significantly improving recognition and robustness in multiple-object scenes. In experiments, the proposed algorithm achieves or exceeds state-of-the-art success rates on object recognition, texture and satellite image classification, language identification and sound classification.

  18. Fast wavelet based sparse approximate inverse preconditioner

    Energy Technology Data Exchange (ETDEWEB)

    Wan, W.L. [Univ. of California, Los Angeles, CA (United States)

    1996-12-31

    Incomplete LU factorization is a robust preconditioner for both general and PDE problems but is unfortunately not easy to parallelize. Recent studies by Huckle and Grote and by Chow and Saad showed that a sparse approximate inverse could be a potential alternative that is readily parallelizable. However, for the special class of matrices A that come from elliptic PDE problems, their preconditioners are not optimal in the sense of being independent of the mesh size. A reason may be that no good sparse approximate inverse exists for the dense inverse matrix. Our observation is that for this kind of matrix, the inverse entries typically exhibit piecewise smooth changes. We can take advantage of this fact and use wavelet compression techniques to construct a better sparse approximate inverse preconditioner. We shall show numerically that our approach is effective for this kind of matrix.
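    The observation that a piecewise smooth inverse compresses well in a wavelet basis can be demonstrated on a 1D Laplacian; the Haar basis, matrix size, and 5% threshold below are illustrative choices, not the author's exact construction:

```python
import numpy as np

def haar_matrix(n):
    # Orthonormal Haar transform matrix; n must be a power of two
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                # averaging (coarse) rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])  # differencing (detail) rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D Laplacian
Ainv = np.linalg.inv(A)       # dense, but its entries vary piecewise smoothly
W = haar_matrix(n)
B = W @ Ainv @ W.T            # the inverse represented in the wavelet basis
thresh = 0.05 * np.abs(B).max()
M = np.where(np.abs(B) > thresh, B, 0.0)      # sparsified approximate inverse
sparsity = np.count_nonzero(M) / n ** 2
```

    Most wavelet-domain entries fall below the threshold, so a sparse approximation of the dense inverse becomes feasible.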

  19. A MULTI-VARIABLE APPROACH TO SUPPLIER SEGMENTATION.

    OpenAIRE

    Rezaei, Jafar; Ortt, Roland

    2011-01-01

    The aim of this paper is to develop a new approach to supplier segmentation that considers the various variables used in existing literature to segment suppliers. A literature review reveals a serious problem from a management perspective. The problem is that many different supplier segmentation methods have been proposed in the last three decades, each of which uses different segmentation variables and hence results in different segments. An overarching supplier segmentat...

  20. Random matrix approach to multivariate categorical data analysis

    CERN Document Server

    Patil, Aashay

    2015-01-01

    Correlation and similarity measures are widely used in all the areas of sciences and social sciences. Often the variables are not numbers but are instead qualitative descriptors called categorical data. We define and study similarity matrix, as a measure of similarity, for the case of categorical data. This is of interest due to a deluge of categorical data, such as movie ratings, top-10 rankings and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow those from random matrix theory. We demonstrate this approach by applying it to the data of Indian general elections and sea level pressures in North Atlantic ocean.
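    One simple similarity measure for categorical data, the fraction of matching categories between two records, can be sketched as follows (the paper's exact definition may differ in detail):

```python
import numpy as np

# Toy categorical data: rows are respondents, columns are categorical variables
data = np.array([["a", "x", "p"],
                 ["a", "y", "p"],
                 ["b", "y", "q"],
                 ["a", "x", "q"]])

n = data.shape[0]
# Similarity between two respondents: fraction of variables where they match
S = np.array([[np.mean(data[i] == data[j]) for j in range(n)]
              for i in range(n)])
eigvals = np.linalg.eigvalsh(S)   # spectrum to compare against random matrix theory
```

    The resulting matrix is symmetric with unit diagonal, so its real spectrum can be studied against random-matrix predictions as the abstract describes.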

  1. Characterising powder flow properties - the need for a multivariate approach

    Science.gov (United States)

    Freeman, Tim; Brockbank, Katrina; Sabathier, Jerome

    2017-06-01

    Despite their widespread and well-established use, powders are challenging materials to work with, as evidenced by the common problems encountered during storage and processing, as well as in the quality and consistency of final products. The diverse range of unit operations used to handle and manipulate powders subject them to extremes of stress and flow regimes; from the high stress, static conditions present in hoppers to the dispersed, dynamic state of a fluidised bed dryer. It is therefore possible for a powder to behave a certain way in a given unit operation, but entirely differently in another. Many existing powder testing techniques don't deliver the required information as the test conditions do not represent the conditions in the process. Modern powder rheometers generate process-relevant data by accurately measuring dynamic flow, bulk and shear properties. This approach enables a powder's response to aeration, consolidation, forced flow and changes in flow rate to be reliably quantified, thereby simulating the conditions which a powder will be subjected to in process. This paper provides an introduction to powder rheology, including a comparison with traditional techniques, and uses case studies to demonstrate how powder rheology can be applied to optimise production processes and enhance product quality.

  2. An operational modal analysis approach based on parametrically identified multivariable transmissibilities

    Science.gov (United States)

    Devriendt, Christof; De Sitter, Gert; Guillaume, Patrick

    2010-07-01

    In this contribution the approach to identify modal parameters from output-only (scalar) transmissibility measurements [C. Devriendt, P. Guillaume, The use of transmissibility measurements in output-only modal analysis, Mechanical Systems and Signal Processing 21 (7) (2007) 2689-2696] is generalized to multivariable transmissibilities. In general, the poles that are identified from (scalar as well as multivariable) transmissibility measurements do not correspond with the system's poles. However, by combining transmissibility measurements under different loading conditions, it is shown in this paper how model parameters can be identified from multivariable transmissibility measurements.

  3. A Wavelet-Based Assessment of Topographic-Isostatic Reductions for GOCE Gravity Gradients

    Science.gov (United States)

    Grombein, Thomas; Luo, Xiaoguang; Seitz, Kurt; Heck, Bernhard

    2014-07-01

    Gravity gradient measurements from ESA's satellite mission Gravity field and steady-state Ocean Circulation Explorer (GOCE) contain significant high- and mid-frequency signal components, which are primarily caused by the attraction of the Earth's topographic and isostatic masses. In order to mitigate the resulting numerical instability of a harmonic downward continuation, the observed gradients can be smoothed with respect to topographic-isostatic effects using a remove-compute-restore technique. For this reason, topographic-isostatic reductions are calculated by forward modeling that employs the advanced Rock-Water-Ice methodology. The basis of this approach is a three-layer decomposition of the topography with variable density values and a modified Airy-Heiskanen isostatic concept incorporating a depth model of the Mohorovičić discontinuity. Moreover, tesseroid bodies are utilized for mass discretization and arranged on an ellipsoidal reference surface. To evaluate the degree of smoothing via topographic-isostatic reduction of GOCE gravity gradients, a wavelet-based assessment is presented in this paper and compared with statistical inferences in the space domain. Using the Morlet wavelet, continuous wavelet transforms are applied to measured GOCE gravity gradients before and after reducing topographic-isostatic signals. By analyzing a representative data set in the Himalayan region, an employment of the reductions leads to significantly smoothed gradients. In addition, smoothing effects that are invisible in the space domain can be detected in wavelet scalograms, making a wavelet-based spectral analysis a powerful tool.

  4. Wavelet-based image compression using fixed residual value

    Science.gov (United States)

    Muzaffar, Tanzeem; Choi, Tae-Sun

    2000-12-01

    Wavelet based compression is getting popular due to its promising compaction properties at low bitrate. Zerotree wavelet image coding scheme efficiently exploits multi-level redundancy present in transformed data to minimize coding bits. In this paper, a new technique is proposed to achieve high compression by adding new zerotree and significant symbols to original EZW coder. Contrary to four symbols present in basic EZW scheme, modified algorithm uses eight symbols to generate fewer bits for a given data. Subordinate pass of EZW is eliminated and replaced with fixed residual value transmission for easy implementation. This modification simplifies the coding technique as well and speeds up the process, retaining the property of embeddedness.

  5. A Robust Wavelet Based Watermarking System for Color Video

    Directory of Open Access Journals (Sweden)

    Mohsen Ashourian

    2011-09-01

    Full Text Available In this paper, we propose a wavelet-based watermarking system. The system applies the wavelet transform to the red, green and blue channels independently. We use space-time coding to encode the watermark message before data embedding, and the bit-error rate of the recovered message is calculated. The embedding factor is selected in such a way that the host video maintains the same quality with and without space-time coding. The developed system is further examined when the host video undergoes compression and noise addition. The results show the effectiveness of the proposed watermarking system, especially when space-time coding is used.

  6. Wavelet based hierarchical coding scheme for radar image compression

    Science.gov (United States)

    Sheng, Wen; Jiao, Xiaoli; He, Jifeng

    2007-12-01

    This paper presents a wavelet-based hierarchical coding scheme for radar image compression. The radar signal is first quantized to a digital signal and reorganized as a raster-scanned image according to the radar's pulse repetition frequency. After reorganization, the reformed image is decomposed into image blocks of different frequency bands by a 2-D wavelet transformation, and each block is quantized and coded by the Huffman coding scheme. A demonstration system is developed, showing that under the requirement of real-time processing, the compression ratio can be very high, while with no significant loss of target signal in the restored radar image.
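    The Huffman coding step applied to each quantized sub-band block can be sketched with Python's standard library; the toy symbol stream below stands in for one block of wavelet-decomposed radar samples:

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    # Build a prefix-free Huffman code table from symbol frequencies
    freq = Counter(symbols)
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    uid = len(heap)   # tie-breaker so code dicts are never compared directly
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, uid, merged))
        uid += 1
    return heap[0][2]

# Toy quantized samples standing in for one wavelet sub-band block
samples = [0, 0, 0, 0, 1, 1, 2, 3]
codes = huffman_codes(samples)
encoded = "".join(codes[s] for s in samples)
```

    Frequent symbols receive the shortest codewords, which is what drives the high compression ratios reported for the skewed sub-band statistics.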

  7. A multivariate piecing-together approach with an application to operational loss data

    CERN Document Server

    Aulbach, Stefan; Falk, Michael; 10.3150/10-BEJ343

    2012-01-01

    The univariate piecing-together approach (PT) fits a univariate generalized Pareto distribution (GPD) to the upper tail of a given distribution function in a continuous manner. We propose a multivariate extension. First it is shown that an arbitrary copula is in the domain of attraction of a multivariate extreme value distribution if and only if its upper tail can be approximated by the upper tail of a multivariate GPD with uniform margins. The multivariate PT then consists of two steps: The upper tail of a given copula $C$ is cut off and substituted by a multivariate GPD copula in a continuous manner. The result is again a copula. The other step consists of the transformation of each margin of this new copula by a given univariate distribution function. This provides, altogether, a multivariate distribution function with prescribed margins whose copula coincides in its central part with $C$ and in its upper tail with a GPD copula. When applied to data, this approach also enables the evaluation of a wide rang...

  8. Wavelet-based multifractal analysis of laser biopsy imagery

    CERN Document Server

    Jagtap, Jaidip; Panigrahi, Prasanta K; Pradhan, Asima

    2011-01-01

    In this work, we report a wavelet-based multifractal study of images of dysplastic and neoplastic HE-stained human cervical tissues captured in the transmission mode when illuminated by a laser light (He-Ne 632.8 nm laser). It is well known that the morphological changes occurring during the progression of diseases like cancer manifest in their optical properties, which can be probed to differentiate the various stages of cancer. Here, we use the multi-resolution properties of the wavelet transform to analyze the optical changes. For this, we have used a novel laser imagery technique which provides us with a composite image of the absorption by the different cellular organelles. As the disease progresses, due to the growth of new cells, the ratio of organelle to cellular volume changes, manifesting in the laser imagery of such tissues. In order to develop a metric that can quantify the changes in such systems, we make use of wavelet-based fluctuation analysis. The changing self-similarity during di...

  9. Wavelet-based denoising using local Laplace prior

    Science.gov (United States)

    Rabbani, Hossein; Vafadust, Mansur; Selesnick, Ivan

    2007-09-01

Although wavelet-based image denoising is a powerful tool for image processing applications, relatively few publications have so far addressed wavelet-based video denoising. The main reason is that the standard 3-D data transforms do not provide useful representations with a good energy compaction property for most video data. For example, the multi-dimensional standard separable discrete wavelet transform (M-D DWT) mixes orientations and motions in its subbands and produces checkerboard artifacts. So, instead of M-D DWT, oriented transforms such as the multi-dimensional complex wavelet transform (M-D DCWT) are usually proposed for video processing. In this paper we use a Laplace distribution with local variance to model the statistical properties of noise-free wavelet coefficients. This distribution is able to simultaneously model the heavy-tailed and intrascale dependency properties of wavelets. Using this model, simple shrinkage functions are obtained employing maximum a posteriori (MAP) and minimum mean squared error (MMSE) estimators. These shrinkage functions are proposed for video denoising in the DCWT domain. The simulation results show that this simple denoising method has impressive performance visually and quantitatively.
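As a rough illustration of this class of estimators, a MAP rule under a Laplace prior with locally estimated variance reduces to soft-thresholding with a spatially varying threshold. The sketch below uses the textbook threshold T = sqrt(2)·sigma_n^2/sigma_local on a 1-D coefficient array; the paper's exact estimator and the DCWT domain are not reproduced:

```python
import numpy as np

def map_shrink(coeffs, noise_sigma, win=5):
    """MAP (soft-threshold) shrinkage for a Laplace prior with a locally
    estimated signal standard deviation; illustrative, not the paper's code."""
    c = np.asarray(coeffs, float)
    # local second moment via a moving average over a small window
    kernel = np.ones(win) / win
    m2 = np.convolve(c * c, kernel, mode="same")
    # local signal std: subtract the (known) noise variance, floor at ~0
    sig = np.sqrt(np.maximum(m2 - noise_sigma**2, 1e-12))
    thresh = np.sqrt(2.0) * noise_sigma**2 / sig
    return np.sign(c) * np.maximum(np.abs(c) - thresh, 0.0)
```

Coefficients in low-activity neighborhoods get a large threshold (and are zeroed), while coefficients near strong features are shrunk only slightly.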

  10. Majorization-minimization algorithms for wavelet-based image restoration.

    Science.gov (United States)

    Figueiredo, Mário A T; Bioucas-Dias, José M; Nowak, Robert D

    2007-12-01

Standard formulations of image/signal deconvolution under wavelet-based priors/regularizers lead to very high-dimensional optimization problems involving the following difficulties: the non-Gaussian (heavy-tailed) wavelet priors lead to objective functions which are nonquadratic, usually nondifferentiable, and sometimes even nonconvex; the presence of the convolution operator destroys the separability which underlies the simplicity of wavelet-based denoising. This paper presents a unified view of several recently proposed algorithms for handling this class of optimization problems, placing them in a common majorization-minimization (MM) framework. One of the classes of algorithms considered (when using quadratic bounds on nondifferentiable log-priors) shares the infamous "singularity issue" (SI) of "iteratively reweighted least squares" (IRLS) algorithms: the possibility of having to handle infinite weights, which may cause both numerical and convergence issues. In this paper, we prove several new results which strongly support the claim that the SI does not compromise the usefulness of this class of algorithms. Exploiting the unified MM perspective, we introduce a new algorithm, resulting from using l1 bounds for nonconvex regularizers; the experiments confirm the superior performance of this method when compared to the one based on quadratic majorization. Finally, an experimental comparison of the several algorithms reveals their relative merits for different standard types of scenarios.

  11. Wavelet based characterization of ex vivo vertebral trabecular bone structure with 3T MRI compared to microCT

    Energy Technology Data Exchange (ETDEWEB)

    Krug, R; Carballido-Gamio, J; Burghardt, A; Haase, S; Sedat, J W; Moss, W C; Majumdar, S

    2005-04-11

Trabecular bone structure and bone density contribute to the strength of bone and are important in the study of osteoporosis. Wavelets are a powerful tool to characterize and quantify texture in an image. In this study the thickness of trabecular bone was analyzed in 8 cylindrical cores of the vertebral spine. Images were obtained from 3 Tesla (T) magnetic resonance imaging (MRI) and micro-computed tomography (µCT). Results from the wavelet-based analysis of trabecular bone were compared with standard two-dimensional structural parameters (analogous to bone histomorphometry) obtained using mean intercept length (MR images) and direct 3D distance transformation methods (µCT images). Additionally, the bone volume fraction was determined from MR images. We conclude that the wavelet-based analysis delivers results comparable to the established MR histomorphometric measurements. The average deviation in trabecular thickness was less than one pixel size between the wavelet and the standard approach for both MR and µCT analysis. Since the wavelet-based method is less sensitive to image noise, we see an advantage of wavelet analysis of trabecular bone for MR imaging when going to higher resolution.

  12. Incorporation of wavelet-based denoising in iterative deconvolution for partial volume correction in whole-body PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Boussion, N.; Cheze Le Rest, C.; Hatt, M.; Visvikis, D. [INSERM, U650, Laboratoire de Traitement de l' Information Medicale (LaTIM) CHU MORVAN, Brest (France)

    2009-07-15

    Partial volume effects (PVEs) are consequences of the limited resolution of emission tomography. The aim of the present study was to compare two new voxel-wise PVE correction algorithms based on deconvolution and wavelet-based denoising. Deconvolution was performed using the Lucy-Richardson and the Van-Cittert algorithms. Both of these methods were tested using simulated and real FDG PET images. Wavelet-based denoising was incorporated into the process in order to eliminate the noise observed in classical deconvolution methods. Both deconvolution approaches led to significant intensity recovery, but the Van-Cittert algorithm provided images of inferior qualitative appearance. Furthermore, this method added massive levels of noise, even with the associated use of wavelet-denoising. On the other hand, the Lucy-Richardson algorithm combined with the same denoising process gave the best compromise between intensity recovery, noise attenuation and qualitative aspect of the images. The appropriate combination of deconvolution and wavelet-based denoising is an efficient method for reducing PVEs in emission tomography. (orig.)
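Of the two deconvolution methods compared, the Lucy-Richardson iteration is the simpler to sketch. A minimal 1-D NumPy version, without the wavelet denoising step and assuming a normalized, shift-invariant PSF:

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Plain 1-D Richardson-Lucy deconvolution: multiplicative updates that
    preserve non-negativity. The wavelet denoising stage of the paper is
    deliberately omitted from this sketch."""
    psf = psf / psf.sum()                       # normalized PSF
    est = np.full_like(blurred, blurred.mean(), dtype=float)
    psf_flip = psf[::-1]                        # correlation = conv with flip
    for _ in range(n_iter):
        conv = np.convolve(est, psf, mode="same")
        ratio = blurred / np.maximum(conv, 1e-12)
        est *= np.convolve(ratio, psf_flip, mode="same")
    return est
```

On a noiseless spike blurred by a small kernel, the iteration progressively restores intensity to the spike location, which is the "intensity recovery" behavior the study measures.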

  13. Optimum wavelet based masking for the contrast enhancement of medical images using enhanced cuckoo search algorithm.

    Science.gov (United States)

    Daniel, Ebenezer; Anitha, J

    2016-04-01

Unsharp masking techniques are a prominent approach in contrast enhancement. The generalized masking formulation uses static scale-value selection, which limits the contrast gain. In this paper, we propose an Optimum Wavelet Based Masking (OWBM) using an Enhanced Cuckoo Search Algorithm (ECSA) for the contrast improvement of medical images. The ECSA can automatically adjust the ratio of nest rebuilding using genetic operators such as adaptive crossover and mutation. First, the proposed contrast enhancement approach is validated quantitatively using BrainWeb and MIAS database images. Later, the conventional nest rebuilding of cuckoo search optimization is modified using Adaptive Rebuilding of Worst Nests (ARWN). Experimental results are analyzed using various performance metrics, and our OWBM shows improved results compared with other reported methods.

  14. A wavelet based approach to Solar-Terrestrial Coupling

    Science.gov (United States)

    Katsavrias, Ch.; Hillaris, A.; Preka-Papadema, P.

    2016-05-01

Transient and recurrent solar activity drive geomagnetic disturbances; these are quantified (amongst others) by the DST and AE index time-series. Transient disturbances are related to Interplanetary Coronal Mass Ejections (ICMEs), while recurrent disturbances are related to corotating interaction regions (CIRs). We study the relationship of the geomagnetic disturbances to the solar wind drivers within solar cycle 23, where the drivers are represented by the ICME and CIR occurrence rates and compared to DST and AE as follows: terms with common periodicity in both the geomagnetic disturbances and the solar drivers are first detected using the continuous wavelet transform (CWT). Then, common power and phase coherence of these periodic terms are calculated from the cross-wavelet spectra (XWT) and wavelet-coherence (WTC), respectively. In time-scales of ≈27 days, our results indicate an anti-correlation of the effects of ICMEs and CIRs on the geomagnetic disturbances. The former modulates the DST and AE time series during the cycle maximum, the latter during periods of reduced solar activity. The phase relationship of these modulations is highly non-linear. Only the annual frequency component of the ICMEs is phase-locked with DST and AE. In time-scales of ≈1.3-1.7 years the CIRs seem to be the dominant driver for both geomagnetic indices throughout the whole solar cycle 23.
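The first step, detecting common periodicities with the CWT, can be sketched with a Morlet wavelet implemented as FFT-domain multiplication (Torrence-and-Compo-style conventions; the XWT/WTC steps are omitted):

```python
import numpy as np

def morlet_cwt_power(x, scales, w0=6.0):
    """Time-averaged CWT power per scale for a real signal, using a Morlet
    wavelet evaluated in the frequency domain (unit sampling interval)."""
    n = len(x)
    omega = 2.0 * np.pi * np.fft.fftfreq(n)   # angular frequencies
    xf = np.fft.fft(x)
    power = np.empty(len(scales))
    for i, s in enumerate(scales):
        # Morlet in the frequency domain, analytic (positive frequencies only)
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
        wt = np.fft.ifft(xf * np.conj(psi_hat)) * np.sqrt(s)
        power[i] = np.mean(np.abs(wt) ** 2)
    return power
```

For a sinusoid of period T, the time-averaged power peaks near scale s ≈ w0·T/(2π); this is how a ≈27-day component would surface when scanning the index time series over a range of scales.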

  15. A Wavelet Based Approach to Solar--Terrestrial Coupling

    CERN Document Server

Katsavrias, Ch.; Preka-Papadema, P.

    2016-01-01

Transient and recurrent solar activity drive geomagnetic disturbances; these are quantified (amongst others) by the DST and AE index time-series. Transient disturbances are related to Interplanetary Coronal Mass Ejections (ICMEs), while recurrent disturbances are related to corotating interaction regions (CIRs). We study the relationship of the geomagnetic disturbances to the solar wind drivers within solar cycle 23, where the drivers are represented by the ICME and CIR occurrence rates and compared to DST and AE as follows: terms with common periodicity in both the geomagnetic disturbances and the solar drivers are first detected using the continuous wavelet transform (CWT). Then, common power and phase coherence of these periodic terms are calculated from the cross-wavelet spectra (XWT) and wavelet-coherence (WTC), respectively. In time-scales of about 27 days our results indicate an anti-correlation of the effects of ICMEs and CIRs on the geomagnetic disturbances. The former modulates the DST and AE time series...

  16. Automatic key frame selection using a wavelet-based approach

    Science.gov (United States)

    Campisi, Patrizio; Longari, Andrea; Neri, Alessandro

    1999-10-01

    In a multimedia framework, digital image sequences (videos) are by far the most demanding as far as storage, search, browsing and retrieval requirements are concerned. In order to reduce the computational burden associated to video browsing and retrieval, a video sequence is usually decomposed into several scenes (shots) and each of them is characterized by means of some key frames. The proper selection of these key frames, i.e. the most representative frames in the scene, is of paramount importance for computational efficiency. In this contribution a novel key frame extraction technique based on the wavelet analysis is presented. Experimental results show the capability of the proposed algorithm to select key frames properly summarizing the shot.

17. Analysis of the real EADGENE data set: Multivariate approaches and post analysis

    NARCIS (Netherlands)

    Sorensen, P.; Bonnet, A.; Buitenhuis, B.; Closset, R.; Dejean, S.; Delmas, C.; Duval, M.; Glass, L.; Hedegaard, J.; Hornshoj, H.; Hulsegge, B.; Jaffrezic, F.; Jensen, K.; Jiang, L.; Koning, de D.J.; Lê Cao, K.A.; Nie, H.; Petzl, W.; Pool, M.H.; Robert-Granie, C.; San Cristobal, M.; Lund, M.S.; Schothorst, van E.M.; Schuberth, H.J.; Seyfert, H.M.; Tosser-klopp, G.; Waddington, D.; Watson, D.; Yang, W.; Zerbe, H.

    2007-01-01

    The aim of this paper was to describe, and when possible compare, the multivariate methods used by the participants in the EADGENE WP1.4 workshop. The first approach was for class discovery and class prediction using evidence from the data at hand. Several teams used hierarchical clustering (HC) or

  18. A note on a simplified and general approach to simulating from multivariate copula functions

    Science.gov (United States)

    Barry K. Goodwin

    2013-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘Probability-...
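One concrete instance of such a probability-integral-transform scheme is the Gaussian copula, where simulation needs only a Cholesky factor and the normal CDF (a generic sketch, not the paper's exact procedure):

```python
import numpy as np
from math import erf

def gaussian_copula_uniforms(corr, n, rng):
    """Sample n rows from a Gaussian copula: draw correlated standard normals,
    then push them through the normal CDF to obtain dependent Uniform(0,1)
    margins. Any target marginal follows by applying its inverse CDF."""
    L = np.linalg.cholesky(np.asarray(corr, float))
    z = rng.standard_normal((n, L.shape[0])) @ L.T
    phi = np.vectorize(lambda v: 0.5 * (1.0 + erf(v / np.sqrt(2.0))))
    return phi(z)
```

Usage: with `u = gaussian_copula_uniforms(R, n, rng)`, a column with an Exponential(1) margin is simply `-np.log(1.0 - u[:, 0])`; the dependence structure of `R` is preserved while each margin is chosen freely.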

  19. A Simplified, General Approach to Simulating from Multivariate Copula Functions

    Science.gov (United States)

    Barry Goodwin

    2012-01-01

Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘probability-...

  20. Wavelet-based Image Compression using Subband Threshold

    Science.gov (United States)

    Muzaffar, Tanzeem; Choi, Tae-Sun

    2002-11-01

Wavelet-based image compression has been a focus of research in recent years. In this paper, we propose a compression technique based on a modification of the original EZW coding. In this lossy technique, we try to discard less significant information in the image data in order to achieve further compression with minimal effect on output image quality. The algorithm calculates the weight of each subband and finds the subband with minimum weight in every level. This minimum-weight subband in each level, which contributes least to image reconstruction, undergoes a threshold process to eliminate low-valued data in it. Zerotree coding is then applied to the resultant output for compression. Different threshold values were applied during experiments to see the effect on compression ratio and reconstructed image quality. The proposed method results in a further increase in compression ratio with negligible loss in image quality.
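The subband-weighting idea can be sketched with a one-level 2-D Haar transform: compute an energy weight per detail subband, then threshold the minimum-weight subband. The energy weight and the quantile threshold below are illustrative stand-ins for the paper's exact rules:

```python
import numpy as np

def haar2d(img):
    """One level of a 2-D Haar transform; returns (LL, LH, HL, HH) subbands."""
    a = img.astype(float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0      # horizontal low-pass
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0      # horizontal high-pass
    ll = (lo[0::2] + lo[1::2]) / 2.0
    lh = (lo[0::2] - lo[1::2]) / 2.0
    hl = (hi[0::2] + hi[1::2]) / 2.0
    hh = (hi[0::2] - hi[1::2]) / 2.0
    return ll, lh, hl, hh

def threshold_weakest_subband(subbands, keep=0.5):
    """Zero out the smallest coefficients of the detail subband with the least
    energy (an illustrative version of per-level minimum-weight selection).
    Mutates and returns the list, plus the index of the chosen subband."""
    energies = [np.sum(b * b) for b in subbands]
    k = int(np.argmin(energies))
    b = subbands[k]
    cut = np.quantile(np.abs(b), 1.0 - keep)
    subbands[k] = np.where(np.abs(b) >= cut, b, 0.0)
    return k, subbands
```

Zeroing the weakest subband's small coefficients lengthens the zerotrees seen by the subsequent EZW pass, which is where the extra compression comes from.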

  1. Wavelet-based zerotree coding of aerospace images

    Science.gov (United States)

    Franques, Victoria T.; Jain, Vijay K.

    1996-06-01

This paper presents a wavelet-based image coding method achieving high levels of compression. A multi-resolution subband decomposition system is constructed using Quadrature Mirror Filters. Symmetric extension and windowing of the multi-scaled subbands are incorporated to minimize boundary effects. The Embedded Zerotree Wavelet (EZW) coding algorithm is then used for data compression. Elimination of the isolated-zero symbol for certain subbands leads to an improved EZW algorithm. Further compression is obtained with an adaptive arithmetic coder. We achieve a PSNR of 26.91 dB at 0.018 bits/pixel, 35.59 dB at 0.149 bits/pixel, and 43.05 dB at 0.892 bits/pixel for the aerospace image, Refuel.

  2. Complete quantum circuit of Haar wavelet based MRA

    Institute of Scientific and Technical Information of China (English)

    HE Yuguo; SUN Jigui

    2005-01-01

Wavelet analysis has applications in many areas, such as signal analysis and image processing. We propose a method for generating the complete circuit of Haar wavelet based MRA by factoring butterfly matrices and conditional perfect shuffle permutation matrices. The factorization of butterfly matrices is the essential part of the design; as a result, the key point is to obtain circuits for matrices of the form I_{2^t} ⊗ W ⊗ I_{2^{n-2t-2}}. In this paper, we use a simple means to develop quantum circuits for this kind of matrix. Similarly, the conditional permutation matrix is implemented entirely, combined with the scheme of Fijany and Williams. The circuits and the ideas adopted in the design are simple and intelligible.
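The Kronecker (tensor) structure these factorizations reduce to is easy to state numerically; the factor below is I ⊗ W ⊗ I for the 2×2 Haar butterfly W, with illustrative index conventions rather than the paper's exact ones:

```python
import numpy as np

def haar_butterfly_factor(t, m):
    """I_{2^t} (x) W (x) I_{2^m}, where W is the 2x2 Haar butterfly.
    Each such factor is orthogonal, hence realizable as a quantum gate layer.
    Index conventions here are illustrative, not the paper's."""
    W = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    return np.kron(np.kron(np.eye(2 ** t), W), np.eye(2 ** m))
```

Because the Kronecker product of orthogonal matrices is orthogonal, every factor of this form is unitary, which is the property the circuit construction relies on.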

  3. Wavelet-based gray-level digital image watermarking

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

The watermarking technique has been proposed as a method of hiding secret information in an image to protect the copyright of multimedia data. But most previous work focuses on algorithms for embedding one-dimensional watermarks or two-dimensional binary digital watermarks. In this paper, a wavelet-based method for embedding a gray-level digital watermark into an image is proposed. By a still-image decomposition technique, the gray-level digital watermark is decomposed into a series of bitplanes. By the discrete wavelet transform (DWT), the host image is decomposed into multiresolution representations with hierarchical structure. The different bitplanes of the gray-level watermark are embedded into the corresponding resolutions of the decomposed host image. The experimental results show that the proposed techniques can successfully survive image processing operations and lossy compression techniques such as Joint Photographic Experts Group (JPEG).

  4. Adaptively wavelet-based image denoising algorithm with edge preserving

    Institute of Scientific and Technical Information of China (English)

    Yihua Tan; Jinwen Tian; Jian Liu

    2006-01-01

A new wavelet-based image denoising algorithm, which exploits the edge information hidden in the corrupted image, is presented. Firstly, a Canny-like edge detector identifies the edges in each subband. Secondly, wavelet coefficients in neighboring scales are multiplied to suppress the noise while magnifying the edge information, and the result is used to exclude fake edges; isolated edge pixels are also identified as noise. Unlike thresholding methods, we then use a local window filter in the wavelet domain to remove noise, in which the variance estimation is elaborated to utilize the edge information. This method is adaptive to local image details, and achieves better performance than state-of-the-art methods.

  5. Adaptive Image Transmission Scheme over Wavelet-Based OFDM System

    Institute of Scientific and Technical Information of China (English)

GAO Xinying; YUAN Dongfeng; ZHANG Haixia

    2005-01-01

In this paper an adaptive image transmission scheme is proposed over a Wavelet-based OFDM (WOFDM) system with Unequal error protection (UEP) by the design of a non-uniform signal constellation in MLC. Two different data-division schemes, byte-based and bit-based, are analyzed and compared. In the bit-based scheme, different bits are protected unequally according to their different contributions to image quality, which makes UEP combined with this scheme more powerful than with the byte-based scheme. Simulation results demonstrate that image transmission by UEP with bit-based data division presents much higher PSNR values and markedly better image quality. Furthermore, considering the trade-off between complexity and BER performance, the Haar wavelet, with the shortest compactly supported filter length, is the most suitable among the orthogonal Daubechies wavelet series for our proposed system.

  6. Research on the Sparse Representation for Gearbox Compound Fault Features Using Wavelet Bases

    Directory of Open Access Journals (Sweden)

    Chunyan Luo

    2015-01-01

The research on gearbox fault diagnosis has been gaining increasing attention in recent years, especially single-fault diagnosis. In engineering practice there is often more than one fault in a gearbox, presenting as a compound fault, so gearbox compound-fault diagnosis is equally important. Both bearing and gear faults in the gearbox tend to result in different kinds of transient impulse responses in the captured signal, and thus it is necessary to propose a potential approach for compound-fault diagnosis. Sparse representation is one of the effective methods for feature extraction from strong background noise. Therefore, sparse representation under wavelet bases for compound-fault feature extraction is developed in this paper. With the proposed method, the different transient features of both bearing and gear can be separated and extracted. Both a simulation study and a practical application to a gearbox with a compound fault verify the effectiveness of the proposed method.

  7. Design of wavelet-based ECG detector for implantable cardiac pacemakers.

    Science.gov (United States)

    Min, Young-Jae; Kim, Hoon-Ki; Kang, Yu-Ri; Kim, Gil-Su; Park, Jongsun; Kim, Soo-Won

    2013-08-01

A wavelet Electrocardiogram (ECG) detector for low-power implantable cardiac pacemakers is presented in this paper. The proposed wavelet-based ECG detector consists of a wavelet decomposer with wavelet filter banks, a QRS complex detector performing hypothesis testing on wavelet-demodulated ECG signals, and a noise detector based on zero-crossing points. In order to achieve high detection accuracy with low power consumption, a multi-scaled product algorithm and a soft-threshold algorithm are efficiently exploited in our ECG detector implementation. Our algorithmic- and architectural-level approaches have been implemented and fabricated in a standard 0.35 μm CMOS technology. The test chip, including a low-power analog-to-digital converter (ADC), shows a low detection error rate of 0.196% and low power consumption of 19.02 μW with a 3 V supply voltage.
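The multi-scaled product idea, multiplying wavelet-like detail signals across dyadic scales so that sharp QRS-like transients survive while smooth fluctuations cancel, can be sketched as follows (a generic illustration using crude moving-average details, not the fabricated detector's filter banks):

```python
import numpy as np

def multiscale_product(x, levels=(1, 2, 3)):
    """Pointwise product of derivative-like detail signals at several dyadic
    scales. A sharp transient produces large details at every scale, so it
    survives the product; features present at only one scale are attenuated."""
    prod = np.ones_like(x, dtype=float)
    for j in levels:
        k = 2 ** j
        # crude detail at scale 2^j: difference of shifted moving averages
        sm = np.convolve(x, np.ones(k) / k, mode="same")
        detail = np.roll(sm, -k // 2) - np.roll(sm, k // 2)
        prod *= detail
    return np.abs(prod)
```

A threshold applied to this product (soft or hard) then marks candidate QRS locations, which is the role the soft-threshold algorithm plays in the detector.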

  8. A Comparative Analysis of Exemplar Based and Wavelet Based Inpainting Technique

    Directory of Open Access Journals (Sweden)

    Vaibhav V Nalawade

    2012-06-01

Image inpainting is the process of filling in missing regions so as to preserve the image's overall continuity; it is the manipulation and modification of an image in a form that is not easily detected. Digital image inpainting is a relatively new area of research, but numerous different approaches to the inpainting problem have been proposed since the concept was first introduced. This paper compares two separate techniques, viz. the exemplar-based and the wavelet-based inpainting technique, each portraying a different set of characteristics. The algorithms analyzed under the exemplar technique are large-object removal by exemplar-based inpainting (Criminisi's) and modified exemplar inpainting (Cheng's). The algorithm analyzed under wavelets is Chen's visual image inpainting method. A number of examples on real and synthetic images are demonstrated to compare the results of the different algorithms using both qualitative and quantitative parameters.

  9. A new digital approach to design multivariable robust optimal control systems

    Institute of Scientific and Technical Information of China (English)

    LIU Xiang; CHEN Lin; SUN You-xian

    2005-01-01

This paper presents a new design of a robust optimal controller for multivariable systems. The row characteristic functions of a linear multivariable system and dynamic decoupling of its equivalent system were applied to change the transfer function matrix of a closed-loop system into a normal function matrix, so that robust H∞ optimal stability is guaranteed. Furthermore, for the decoupled equivalent control system the l∞ optimization approach is used so that the closed-loop system achieves optimal time-domain indexes. A successful application to a heater control system verified the excellence of the new control scheme.

  10. Multivariable robust controller design of ACLS using loop-shaping approach

    Science.gov (United States)

    Dong, Chaoyang; Cui, Haihua; Wang, Qing

    2008-10-01

In this paper a multivariable robust controller design approach for the ACLS is accomplished using robust loop-shaping techniques. In order to avoid the inefficient trial-and-error way of choosing the weight functions, the structured genetic algorithm (SGA) approach is introduced, which is capable of simultaneously searching the orders and coefficients of the pre- and post-compensator weight matrices. With this approach, engineers can achieve an ideal loop shape lying in an appropriate region relating to the desired performance specifications. The effectiveness of this approach is illustrated by a design example based on the longitudinal equations of motion of a carrier-based aircraft.

  11. Chemical Discrimination of Cortex Phellodendri amurensis and Cortex Phellodendri chinensis by Multivariate Analysis Approach.

    Science.gov (United States)

    Sun, Hui; Wang, Huiyu; Zhang, Aihua; Yan, Guangli; Han, Ying; Li, Yuan; Wu, Xiuhong; Meng, Xiangcai; Wang, Xijun

    2016-01-01

As herbal medicines have an important position in health care systems worldwide, their assessment and quality control are a major bottleneck. Cortex Phellodendri chinensis (CPC) and Cortex Phellodendri amurensis (CPA) are widely used in China; however, identifying the species of CPA and CPC has become urgent. In this study, a multivariate analysis approach was applied to the chemical discrimination of CPA and CPC. Principal component analysis showed that the two herbs could be separated clearly. Chemical markers such as berberine, palmatine, phellodendrine, magnoflorine, obacunone, and obaculactone were identified through orthogonal partial least squares discriminant analysis, and were identified tentatively by the accurate mass of quadrupole time-of-flight mass spectrometry. A total of 29 components can be used as chemical markers for discrimination of CPA and CPC. Of them, phellodendrine is significantly higher in CPC than in CPA, whereas obacunone and obaculactone are significantly higher in CPA than in CPC. The present study proves that a multivariate-analysis-based chemical analysis greatly contributes to the investigation of CPA and CPC, and shows that the identified chemical markers as a whole should be used to discriminate the two herbal medicines; the results also provide chemical information for their quality assessment. Abbreviations used: CPC: Cortex Phellodendri chinensis; CPA: Cortex Phellodendri amurensis; PCA: principal component analysis; OPLS-DA: orthogonal partial least squares discriminant analysis; BPI: base peak ion intensity.
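The PCA step used to separate the two herbs is standard; a minimal SVD-based sketch on synthetic two-group data (hypothetical features, not the UPLC-Q/TOF-MS measurements):

```python
import numpy as np

def pca_scores(X, n_comp=2):
    """Project mean-centered samples onto the leading principal components
    (SVD-based PCA); well-separated groups show up as separated score clusters."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_comp].T
```

When one group's feature means are shifted relative to the other's, the first principal component aligns with that shift and the PC1 scores of the two groups separate, which is the "separated clearly" pattern reported for CPA versus CPC.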

  12. Quality by design case study: an integrated multivariate approach to drug product and process development.

    Science.gov (United States)

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.

  13. Multivariate meta-analysis: a robust approach based on the theory of U-statistic.

    Science.gov (United States)

    Ma, Yan; Mazumdar, Madhu

    2011-10-30

Meta-analysis is the methodology for combining findings from similar research studies asking the same question. When the question of interest involves multiple outcomes, multivariate meta-analysis is used to synthesize the outcomes simultaneously, taking into account the correlation between the outcomes. Likelihood-based approaches, in particular the restricted maximum likelihood (REML) method, are commonly utilized in this context. REML assumes a multivariate normal distribution for the random-effects model. This assumption is difficult to verify, especially for meta-analysis with a small number of component studies. The use of REML also requires iterative estimation between parameters, needing moderately high computation time, especially when the dimension of outcomes is large. A multivariate method of moments (MMM) is available and is shown to perform equally well to REML. However, there is a lack of information on the performance of these two methods when the true data distribution is far from normality. In this paper, we propose a new nonparametric and non-iterative method for multivariate meta-analysis on the basis of the theory of U-statistics and compare the properties of these three procedures under both normal and skewed data through simulation studies. It is shown that the effect on estimates from REML because of non-normal data distribution is marginal and that the estimates from MMM and U-statistic-based approaches are very similar. Therefore, we conclude that for performing multivariate meta-analysis, the U-statistic estimation procedure is a viable alternative to REML and MMM. Easy implementation of all three methods is illustrated by their application to data from two published meta-analyses from the fields of hip fracture and periodontal disease. We discuss ideas for future research based on U-statistics for testing the significance of between-study heterogeneity and for extending the work to the meta-regression setting.

  14. Analysis of the real EADGENE data set: Multivariate approaches and post analysis

    OpenAIRE

    Schuberth Hans-Joachim; van Schothorst Evert M; Lund Mogens; San Cristobal Magali; Robert-Granié Christèle; Pool Marco H; Petzl Wolfram; Nie Haisheng; Cao Kim-Anh; de Koning Dirk-Jan; Jiang Li; Jensen Kirsty; Hulsegge Ina; Jaffrézic Florence; Hornshøj Henrik

    2007-01-01

The aim of this paper was to describe, and when possible compare, the multivariate methods used by the participants in the EADGENE WP1.4 workshop. The first approach was for class discovery and class prediction using evidence from the data at hand. Several teams used hierarchical clustering (HC) or principal component analysis (PCA) to identify groups of differentially expressed genes with a similar expression pattern over time points and infective agent (E. coli or S. aureus). The m...

  15. Multivariate wavelet frames

    CERN Document Server

    Skopina, Maria; Protasov, Vladimir

    2016-01-01

    This book presents a systematic study of multivariate wavelet frames with matrix dilation, in particular, orthogonal and bi-orthogonal bases, which are a special case of frames. Further, it provides algorithmic methods for the construction of dual and tight wavelet frames with a desirable approximation order, namely compactly supported wavelet frames, which are commonly required by engineers. It particularly focuses on methods of constructing them. Wavelet bases and frames are actively used in numerous applications such as audio and graphic signal processing, compression and transmission of information. They are especially useful in image recovery from incomplete observed data due to the redundancy of frame systems. The construction of multivariate wavelet frames, especially bases, with desirable properties remains a challenging problem as although a general scheme of construction is well known, its practical implementation in the multidimensional setting is difficult. Another important feature of wavelet is ...

  16. Unified structural equation modeling approach for the analysis of multisubject, multivariate functional MRI data.

    Science.gov (United States)

    Kim, Jieun; Zhu, Wei; Chang, Linda; Bentler, Peter M; Ernst, Thomas

    2007-02-01

    The ultimate goal of brain connectivity studies is to propose, test, modify, and compare certain directional brain pathways. Path analysis or structural equation modeling (SEM) is an ideal statistical method for such studies. In this work, we propose a two-stage unified SEM plus GLM (General Linear Model) approach for the analysis of multisubject, multivariate functional magnetic resonance imaging (fMRI) time series data with subject-level covariates. In Stage 1, we analyze the fMRI multivariate time series for each subject individually via a unified SEM model by combining longitudinal pathways represented by a multivariate autoregressive (MAR) model, and contemporaneous pathways represented by a conventional SEM. In Stage 2, the resulting subject-level path coefficients are merged with subject-level covariates such as gender, age, IQ, etc., to examine the impact of these covariates on effective connectivity via a GLM. Our approach is exemplified via the analysis of an fMRI visual attention experiment. Furthermore, the significant path network from the unified SEM analysis is compared to that from a conventional SEM analysis without incorporating the longitudinal information as well as that from a Dynamic Causal Modeling (DCM) approach.

  17. Defining critical habitats of threatened and endemic reef fishes with a multivariate approach.

    Science.gov (United States)

    Purcell, Steven W; Clarke, K Robert; Rushworth, Kelvin; Dalton, Steven J

    2014-12-01

    Understanding critical habitats of threatened and endemic animals is essential for mitigating extinction risks, developing recovery plans, and siting reserves, but assessment methods are generally lacking. We evaluated critical habitats of 8 threatened or endemic fish species on coral and rocky reefs of subtropical eastern Australia, by measuring physical and substratum-type variables of habitats at fish sightings. We used nonmetric and metric multidimensional scaling (nMDS, mMDS), Analysis of similarities (ANOSIM), similarity percentages analysis (SIMPER), permutational analysis of multivariate dispersions (PERMDISP), and other multivariate tools to distinguish critical habitats. Niche breadth was widest for 2 endemic wrasses, and reef inclination was important for several species, often found in relatively deep microhabitats. Critical habitats of mainland reef species included small caves or habitat-forming hosts such as gorgonian corals and black coral trees. Hard corals appeared important for reef fishes at Lord Howe Island, and red algae for mainland reef fishes. A wide range of habitat variables are required to assess critical habitats owing to varied affinities of species to different habitat features. We advocate assessments of critical habitats matched to the spatial scale used by the animals and a combination of multivariate methods. Our multivariate approach furnishes a general template for assessing the critical habitats of species, understanding how these vary among species, and determining differences in the degree of habitat specificity. © 2014 Society for Conservation Biology.

  18. Wavelet-based embedded zerotree extension to color coding

    Science.gov (United States)

    Franques, Victoria T.

    1998-03-01

    Recently, a new image compression algorithm was developed which employs the wavelet transform and a simple binary linear quantization scheme with an embedded coding technique to perform data compaction. This new family of coders, Embedded Zerotree Wavelet (EZW), provides better compression performance than the current JPEG coding standard at low bit rates. Since the EZW coding algorithm emerged, all of the published coding results related to this technique have been on monochrome images. In this paper the author enhances the original coding algorithm to yield a better compression ratio, and extends wavelet-based zerotree coding to color images. Color imagery is often represented by several components, such as RGB, in which each component is generally processed separately. With color coding, each component could be compressed individually in the same manner as a monochrome image, therefore requiring a threefold increase in processing time. Most image coding standards employ de-correlated components, such as YIQ or Y, CB, CR, and subsampling of the 'chroma' components; such a coding technique is employed here. Results of the coding, including reconstructed images and coding performance, will be presented.

  19. A wavelet based investigation of long memory in stock returns

    Science.gov (United States)

    Tan, Pei P.; Galagedera, Don U. A.; Maharaj, Elizabeth A.

    2012-04-01

    Using a wavelet-based maximum likelihood fractional integration estimator, we test long memory (return predictability) in the returns at the market, industry and firm level. In an analysis of emerging market daily returns over the full sample period, we find that long memory is not present at the aggregate market level, while in approximately twenty percent of the 175 stocks there is evidence of long memory. The absence of long memory in the market returns may be a consequence of contemporaneous aggregation of stock returns. However, when the analysis is carried out with rolling windows, evidence of long memory is observed in certain time frames. These results are largely consistent with those of detrended fluctuation analysis. A test of firm-level information in explaining stock return predictability using a logistic regression model reveals that returns of large firms are more likely to possess the long memory feature than returns of small firms. There is no evidence to suggest that turnover, earnings per share, book-to-market ratio, systematic risk or abnormal return with respect to the market model are associated with return predictability. However, the degree of long-range dependence appears to be associated positively with earnings per share, systematic risk and abnormal return, and negatively with book-to-market ratio.
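
The estimator used in the paper is involved; as a rough, hypothetical illustration of the underlying idea (the variance of wavelet detail coefficients scales with level when long memory is present), the following sketch regresses log2 detail variance on level using a hand-rolled Haar transform. It is not the authors' maximum likelihood estimator.

```python
import numpy as np

def haar_details(x):
    """One level of the orthonormal Haar transform: (approximation, detail)."""
    x = x[: len(x) // 2 * 2]
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

def wavelet_logvar_slope(x, levels=6):
    """Regress log2(detail variance) on level j; the slope is near zero for
    white noise and grows with the strength of long-range dependence."""
    approx = np.asarray(x, dtype=float)
    logvars = []
    for _ in range(levels):
        approx, detail = haar_details(approx)
        logvars.append(np.log2(np.var(detail)))
    return np.polyfit(np.arange(1, levels + 1), logvars, 1)[0]

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
print(wavelet_logvar_slope(white))  # near 0: no long memory in white noise
```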

  20. The Boundary Processing of Wavelet Based Image Compression

    Institute of Scientific and Technical Information of China (English)

    Yu Sheng-sheng; He Xiao-cheng; Zhou Jing-li; Chen Jia-zhong

    2004-01-01

    When an image decomposed by bi-orthogonal wavelet bases is reconstructed, some information will be lost at the four edges of the image, and artificial discontinuities will be introduced. We use a method called symmetric extension to solve the problem. We only consider the case of two-band filter banks, but the results can be applied to M-band filter banks. There are only two types of symmetric extension in the analysis phase, namely whole-sample symmetry (WS) and half-sample symmetry (HS), while there are four types of symmetric extension in the synthesis phase, namely WS, HS, whole-sample anti-symmetry (WA), and half-sample anti-symmetry (HA), respectively. We can select the exact type according to the image length and the filter length, and we show how to do this. The image can be perfectly reconstructed without any edge effects in this way. Finally, simulation results are reported.
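
The two analysis-phase extension types can be illustrated with NumPy's padding modes, which implement exactly these symmetries. This is a sketch of the boundary handling only, not the authors' filter-bank code.

```python
import numpy as np

def symmetric_extend(x, n, mode="WS"):
    """Extend a signal by n samples on each side.
    WS (whole-sample symmetry) mirrors about the edge sample: numpy 'reflect'.
    HS (half-sample symmetry) duplicates the edge sample: numpy 'symmetric'."""
    return np.pad(np.asarray(x), n, mode={"WS": "reflect", "HS": "symmetric"}[mode])

print(symmetric_extend([1, 2, 3], 2, "WS"))  # [3 2 1 2 3 2 1]
print(symmetric_extend([1, 2, 3], 2, "HS"))  # [2 1 1 2 3 3 2]
```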

  1. A wavelet-based method for multispectral face recognition

    Science.gov (United States)

    Zheng, Yufeng; Zhang, Chaoyang; Zhou, Zhaoxian

    2012-06-01

    A wavelet-based method is proposed for multispectral face recognition in this paper. The Gabor wavelet transform is a common tool for orientation analysis of a 2D image, whereas the Hamming distance is an efficient distance measurement for face identification. Specifically, at each frequency band, an index number representing the strongest orientational response is selected and then encoded in binary format to favor the Hamming distance calculation. Multiband orientation bit codes are then organized into a face pattern byte (FPB) by using order statistics. With the FPB, Hamming distances are calculated and compared to achieve face identification. The FPB algorithm was initially created using thermal images, while the EBGM method originated with visible images. When two or more spectral images from the same subject are available, the identification accuracy and reliability can be enhanced using score fusion. We compare the identification performance of applying five recognition algorithms to the three-band (visible, near infrared, thermal) face images, and explore the fusion performance of combining the multiple scores from three recognition algorithms and from three-band face images, respectively. The experimental results show that the FPB is the best recognition algorithm, the HMM yields the best fusion result, and the thermal dataset results in the best fusion performance compared to the other two datasets.
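
A minimal sketch of the bit-coding and Hamming-distance step described above; the band count, 3-bit codes and index values here are illustrative, not the paper's actual FPB layout.

```python
import numpy as np

def orientation_code(indices, nbits=3):
    """Binary-encode the strongest-orientation index selected in each band."""
    bits = []
    for idx in indices:
        bits.extend((idx >> b) & 1 for b in range(nbits))
    return np.array(bits, dtype=np.uint8)

def hamming(a, b):
    """Number of differing bits between two binary codes."""
    return int(np.sum(a != b))

probe = orientation_code([0, 1, 7])      # hypothetical per-band winners
gallery = orientation_code([0, 3, 7])
print(hamming(probe, gallery))  # 1: indices 1 and 3 differ in a single bit
```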

  2. Multivariate skew-t approach to the design of accumulation risk scenarios for the flooding hazard

    Science.gov (United States)

    Ghizzoni, Tatiana; Roth, Giorgio; Rudari, Roberto

    2010-10-01

    The multivariate version of the skew-t distribution provides a powerful analytical description of the joint behavior of multivariate processes. It enjoys valuable properties: from the aptitude to model skewed as well as leptokurtic datasets to the availability of analytical expressions for moments and likelihood. Moreover, it offers a wide range of extremal dependence strength, allowing for upper and lower tail dependence. The idea underneath this work is to employ the multivariate skew-t distribution to provide an estimation of the joint probability of flood events in a multi-site multi-basin approach. This constitutes the basis for the design and evaluation of flood hazard scenarios for large areas in terms of their intensity, extension and frequency, i.e. the information required by civil protection agencies to put mitigation strategies into action and by insurance companies to price the flooding risk and to evaluate portfolios. Performances of the skew-t distribution and the corresponding t copula function, introduced to represent the state of the art for multivariate simulations, are discussed with reference to the Tanaro Basin, North-western Italy. To enhance the characteristics of the correlation structure, three nested and non-nested gauging stations are selected with contributing areas from 1500 to 8000 km². A dataset of 76 trivariate flood events is extracted from a mean daily discharges database available for the time period from January 1995 to December 2003. Applications include the generation of multivariate skew-t and t copula samples and models' comparison through the principle of minimum cross-entropy, here revised for the application to multivariate samples. Copula and skew-t based scenario return period estimations are provided for the November 1994 flood event, i.e. the worst on record in the 1801-2001 period. Results are encouraging: the skew-t distribution seems able to describe the joint behavior, being close to the observations. Marginal ...
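
Sampling from a multivariate t, the building block of the t copula used as the benchmark above, can be sketched with the normal/chi-square mixture representation. NumPy only; the skew-t itself requires an additional skewing step not shown here, and the correlation and degrees of freedom below are illustrative.

```python
import numpy as np

def multivariate_t(corr, df, n, seed=0):
    """Multivariate t samples via the Gaussian / chi-square mixture
    representation; mapping margins through the t CDF yields the t copula."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(np.asarray(corr))     # correlated Gaussian part
    z = rng.standard_normal((n, L.shape[0])) @ L.T
    w = rng.chisquare(df, size=(n, 1))           # common mixing variable
    return z * np.sqrt(df / w)

x = multivariate_t([[1.0, 0.7], [0.7, 1.0]], df=5, n=20000)
print(np.corrcoef(x.T)[0, 1])  # close to the target 0.7
```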

  3. A framework for evaluating wavelet based watermarking for scalable coded digital item adaptation attacks

    Science.gov (United States)

    Bhowmik, Deepayan; Abhayaratne, Charith

    2009-02-01

    A framework for evaluating wavelet based watermarking schemes against scalable coded visual media content adaptation attacks is presented. The framework, Watermark Evaluation Bench for Content Adaptation Modes (WEBCAM), aims to facilitate controlled evaluation of wavelet based watermarking schemes under MPEG-21 part-7 digital item adaptations (DIA). WEBCAM accommodates all major wavelet based watermarking schemes in a single generalised framework by considering a global parameter space, from which the optimum parameters for a specific algorithm may be chosen. WEBCAM considers the traversing of media content along various links and the required content adaptations at various nodes of media supply chains. In this paper, the content adaptation is emulated by JPEG2000 coded bit stream extraction for various spatial resolutions and quality levels of the content. The proposed framework is beneficial not only as an evaluation tool but also as a design tool for new wavelet based watermarking algorithms, allowing picking and mixing of available tools and finding of the optimum design parameters.

  4. Wavelet-Based Adaptive Solvers on Multi-core Architectures for the Simulation of Complex Systems

    Science.gov (United States)

    Rossinelli, Diego; Bergdorf, Michael; Hejazialhosseini, Babak; Koumoutsakos, Petros

    We build wavelet-based adaptive numerical methods for the simulation of advection dominated flows that develop multiple spatial scales, with an emphasis on fluid mechanics problems. Wavelet based adaptivity is inherently sequential and in this work we demonstrate that these numerical methods can be implemented in software that is capable of harnessing the capabilities of multi-core architectures while maintaining their computational efficiency. Recent designs in frameworks for multi-core software development allow us to rethink parallelism as task-based, where parallel tasks are specified and automatically mapped into physical threads. This way of exposing parallelism enables the parallelization of algorithms that were considered inherently sequential, such as wavelet-based adaptive simulations. In this paper we present a framework that combines wavelet-based adaptivity with the task-based parallelism. We demonstrate good scaling performance obtained by simulating diverse physical systems on different multi-core and SMP architectures using up to 16 cores.
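
The task-based pattern described above, parallel tasks specified once and mapped onto physical threads by the framework, can be sketched in Python; the authors work in compiled multi-core frameworks, and `refine_block` is a hypothetical stand-in for per-block wavelet refinement work.

```python
from concurrent.futures import ThreadPoolExecutor

def refine_block(block):
    """Hypothetical stand-in for the per-block work of an adaptive solver."""
    return [v * 0.5 for v in block]

blocks = [[1, 2], [3, 4], [5, 6]]
with ThreadPoolExecutor() as pool:       # tasks are mapped onto threads
    refined = list(pool.map(refine_block, blocks))
print(refined)  # [[0.5, 1.0], [1.5, 2.0], [2.5, 3.0]]
```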

  5. High Order Wavelet-Based Multiresolution Technology for Airframe Noise Prediction Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a novel, high-accuracy, high-fidelity, multiresolution (MRES), wavelet-based framework for efficient prediction of airframe noise sources and...

  6. High Order Wavelet-Based Multiresolution Technology for Airframe Noise Prediction Project

    Data.gov (United States)

    National Aeronautics and Space Administration — An integrated framework is proposed for efficient prediction of rotorcraft and airframe noise. A novel wavelet-based multiresolution technique and high-order...

  7. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series

    Directory of Open Access Journals (Sweden)

    Charmaine eDemanuele

    2015-10-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from fMRI blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads, but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel ...
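
A toy version of the classification step: a linear (nearest-centroid) classifier separating two simulated stage-specific voxel patterns. The data, sizes and noise level are invented for illustration and are unrelated to the study's fMRI recordings.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox, n_trials = 20, 50
mu_a = rng.normal(0, 1, n_vox)              # hypothetical stage-A pattern
mu_b = rng.normal(0, 1, n_vox)              # hypothetical stage-B pattern
train_a = mu_a + 0.5 * rng.standard_normal((n_trials, n_vox))
train_b = mu_b + 0.5 * rng.standard_normal((n_trials, n_vox))
test_a = mu_a + 0.5 * rng.standard_normal((n_trials, n_vox))

ca, cb = train_a.mean(axis=0), train_b.mean(axis=0)

def predict(x):
    """Nearest-centroid rule, one of the simplest linear classifiers."""
    return "A" if np.linalg.norm(x - ca) < np.linalg.norm(x - cb) else "B"

accuracy = np.mean([predict(x) == "A" for x in test_a])
print(accuracy)  # high: held-out stage-A patterns are recognized
```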

  8. A frailty model approach for regression analysis of multivariate current status data.

    Science.gov (United States)

    Chen, Man-Hua; Tong, Xingwei; Sun, Jianguo

    2009-11-30

    This paper discusses regression analysis of multivariate current status failure time data (The Statistical Analysis of Interval-censoring Failure Time Data. Springer: New York, 2006), which occur quite often in, for example, tumorigenicity experiments and epidemiologic investigations of the natural history of a disease. For the problem, several marginal approaches have been proposed that model each failure time of interest individually (Biometrics 2000; 56:940-943; Statist. Med. 2002; 21:3715-3726). In this paper, we present a full likelihood approach based on the proportional hazards frailty model. For estimation, an Expectation Maximization (EM) algorithm is developed and simulation studies suggest that the presented approach performs well for practical situations. The approach is applied to a set of bivariate current status data arising from a tumorigenicity experiment.

  9. Linking multimetric and multivariate approaches to assess the ecological condition of streams.

    Science.gov (United States)

    Collier, Kevin J

    2009-10-01

    Few attempts have been made to combine multimetric and multivariate analyses for bioassessment despite recognition that an integrated method could yield powerful tools for bioassessment. An approach is described that integrates eight macroinvertebrate community metrics into a Principal Components Analysis to develop a Multivariate Condition Score (MCS) from a calibration dataset of 511 samples. The MCS is compared to an Index of Biotic Integrity (IBI) derived using the same metrics based on the ratio to the reference site mean. Both approaches were highly correlated although the MCS appeared to offer greater potential for discriminating a wider range of impaired conditions. Both the MCS and IBI displayed low temporal variability within reference sites, and were able to distinguish between reference conditions and low levels of catchment modification and local habitat degradation, although neither discriminated among three levels of low impact. Pseudosamples developed to test the response of the metric aggregation approaches to organic enrichment, urban, mining, pastoral and logging stressor scenarios ranked pressures in the same order, but the MCS provided a lower score for the urban scenario and a higher score for the pastoral scenario. The MCS was calculated for an independent test dataset of urban and reference sites, and yielded similar results to the IBI. Although both methods performed comparably, the MCS approach may have some advantages because it removes the subjectivity of assigning thresholds for scoring biological condition, and it appears to discriminate a wider range of degraded conditions.
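
The MCS construction, metrics standardized and projected onto the first principal component, can be sketched as follows. The data are synthetic, and the sign-orientation rule and metric count are assumptions for illustration, not the paper's calibration procedure.

```python
import numpy as np

def multivariate_condition_score(metrics):
    """Standardize a site-by-metric matrix and project it onto the first
    principal component, oriented so that a higher score means better
    condition (assuming all metrics increase with condition)."""
    X = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X.T))
    pc1 = vecs[:, np.argmax(vals)]
    if pc1.sum() < 0:                 # flip sign for a consistent orientation
        pc1 = -pc1
    return X @ pc1

rng = np.random.default_rng(2)
condition = np.linspace(0, 1, 100)    # latent condition gradient across sites
metrics = condition[:, None] + 0.1 * rng.standard_normal((100, 8))
scores = multivariate_condition_score(metrics)
print(np.corrcoef(scores, condition)[0, 1])  # close to 1
```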

  10. Experimental validation of wavelet based solution for dynamic response of railway track subjected to a moving train

    Science.gov (United States)

    Koziol, Piotr

    2016-10-01

    New approaches allowing effective analysis of the dynamic behaviour of railway structures are needed for appropriate modelling and understanding of phenomena associated with train transportation. The literature highlights the fact that nonlinear assumptions are of importance in dynamic analysis of railway tracks. This paper presents a wavelet based semi-analytical solution for the infinite Euler-Bernoulli beam resting on a nonlinear foundation and subjected to a set of moving forces, representing a railway track with a moving train, along with its preliminary experimental validation. It is shown that this model, although very simplified, with an assumption of viscous damping of the foundation, can be considered a good enough approximation of the behaviour of realistic structures. The steady-state response of the beam is obtained by applying the Galilean co-ordinate system and the Adomian decomposition method combined with coiflet based approximation, leading to an analytical estimation of transverse displacements. The applied approach, using parameters taken from real measurements carried out on the Polish Railways network for the fast train Pendolino EMU-250, shows the ability of the proposed method to parametrically analyse dynamic systems associated with transportation. The obtained results are in accordance with measurement data over a wide range of physical parameters, which can be treated as a validation of the developed wavelet based approach. The conducted investigation is supplemented by several numerical examples.

  11. Operational modal analysis approach based on multivariable transmissibility with different transferring outputs

    Science.gov (United States)

    Gómez Araújo, Iván; Laier, Jose Elias

    2015-09-01

    In recent years, transmissibility functions have been used as alternatives to identify the modal parameters of structures under operating conditions. The scalar power spectrum density transmissibility (PSDT), which relates only two responses, was proposed to extract modal parameters by combining different PSDTs with different transferring outputs. In this sense, this paper proposes extending the scalar PSDT concept to a multivariable PSDT by relating multiple responses instead of only two. This extension implies the definition of a transmissibility matrix, relating the cross-spectral density matrix among the responses at coordinates Z and U with the cross-spectral density matrix among the responses at coordinates Z and K. The coordinates in Z are known as the transferring outputs. By defining the same coordinates K and U, but with different transferring outputs Z, we prove that the multivariable PSDT converges to the same matrix when it approaches the system poles. This property is used to define a single matrix from different multivariable PSDTs with the same coordinates K and U but different transferring outputs. The resulting matrix is singular at the system poles, meaning that by applying the inverse of the matrix, the modal parameters can be identified. Here, a numerical example of a beam model subjected to excitations and data from an operational bridge vibration test show that the proposed method is capable of identifying modal parameters. Furthermore, the results demonstrate the possibility of estimating the same modal parameters by changing only the coordinates K and U, providing greater reliability during modal parameter identification.
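
The scalar PSDT underlying the multivariable extension is a ratio of cross-spectral densities between responses. A single-segment toy example with a pure sine source and two proportional responses (not the paper's estimator, which works on measured operational data):

```python
import numpy as np

def cross_spectrum(x, y):
    """Single-segment cross-spectral density estimate X(f) * conj(Y(f))."""
    return np.fft.rfft(x) * np.conj(np.fft.rfft(y))

t = np.arange(1024) / 1024.0
z = np.sin(2 * np.pi * 50 * t)       # transferring output
y1, y2 = 2.0 * z, 3.0 * z            # two responses driven by the same source

k = int(np.argmax(np.abs(np.fft.rfft(z))))     # dominant frequency bin
psdt = cross_spectrum(y1, z)[k] / cross_spectrum(y2, z)[k]
print(psdt.real)  # 2/3, the ratio of the two transmission gains
```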

  12. Multivariate time delay analysis based local KPCA fault prognosis approach for nonlinear processes

    Institute of Scientific and Technical Information of China (English)

    Yuan Xu; Ying Liu; Qunxiong Zhu

    2016-01-01

    Currently, some fault prognosis technologies show relatively unsatisfactory performance, especially for incipient faults in nonlinear processes, due to their large time delay and complex internal connections. To overcome this deficiency, multivariate time delay analysis is incorporated into highly sensitive local kernel principal component analysis. In this approach, mutual information estimation and the Bayesian information criterion (BIC) are used separately to acquire the correlation degree and time delay of the process variables. Moreover, in order to achieve prediction, time series prediction by back propagation (BP) network is applied, whose input is the multivariate correlated time series rather than the original time series. Then the multivariate time-delayed series and the future values obtained by time series prediction are combined to construct the input of the local kernel principal component analysis (LKPCA) model for incipient fault prognosis. The new method has been exemplified on a simple nonlinear process and the complicated Tennessee Eastman (TE) benchmark process. The results indicate that the new method has superior fault prognosis sensitivity over other traditional fault prognosis methods. © 2016 The Chemical Industry and Engineering Society of China, and Chemical Industry Press. All rights reserved.
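
The paper selects inter-variable delays with mutual information and BIC; as a much simpler stand-in for the same idea, the lag that best aligns two process variables can be sketched with cross-correlation:

```python
import numpy as np

def estimate_delay(x, y, max_lag):
    """Return the lag (in samples) at which y best matches x; a simple
    stand-in for the mutual-information/BIC delay selection in the paper."""
    def corr_at(lag):
        if lag >= 0:
            a, b = x[: len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[: len(y) + lag]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr_at)

rng = np.random.default_rng(3)
u = rng.standard_normal(500)
v = np.roll(u, 7) + 0.1 * rng.standard_normal(500)   # v lags u by 7 samples
print(estimate_delay(u, v, 20))  # 7
```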

  13. Wavelet-based multicomponent matching pursuit trace interpolation

    Science.gov (United States)

    Choi, Jihun; Byun, Joongmoo; Seol, Soon Jee; Kim, Young

    2016-09-01

    Typically, seismic data are sparsely and irregularly sampled due to limitations in the survey environment, and this causes problems for key seismic processing steps such as surface-related multiple elimination or wave-equation-based migration. Various interpolation techniques have been developed to alleviate the problems caused by sparse and irregular sampling. Among the many interpolation techniques, matching pursuit interpolation is a robust tool for interpolating regularly sampled data with large receiver separation, such as crossline data in marine seismic acquisition, when both pressure and particle velocity data are used. Multicomponent matching pursuit methods have generally used sinusoidal basis functions, which have been shown to be effective for interpolating multicomponent marine seismic data in the crossline direction. In this paper, we report the use of wavelet basis functions, which further enhance the de-aliasing performance of matching pursuit methods relative to sinusoidal basis functions. We also found that the range of the peak wavenumber of the wavelet is critical to the stability of the interpolation results and the de-aliasing performance, and that this range should be determined based on the Nyquist criteria. In addition, we reduced the computational cost by adopting the inner product of the wavelet and the input data to find the parameters of the wavelet basis function instead of using L-2 norm minimization. Using synthetic data, we illustrate that for aliased data, wavelet-based matching pursuit interpolation yields more stable results than the sinusoidal-function-based approach, whether we use pressure data alone or both pressure and particle velocity data together.
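
Generic matching pursuit, greedily selecting the best-matching unit-norm atom and subtracting its projection, can be sketched as follows. Gaussian atoms stand in for the paper's wavelet basis functions, and the signal is a synthetic two-event trace, not seismic data.

```python
import numpy as np

def matching_pursuit(signal, atoms, n_iter):
    """Greedy MP: pick the unit-norm atom with the largest inner product
    against the residual, subtract its projection, and repeat."""
    residual = signal.astype(float).copy()
    picks = []
    for _ in range(n_iter):
        scores = atoms @ residual
        k = int(np.argmax(np.abs(scores)))
        picks.append((k, float(scores[k])))
        residual -= scores[k] * atoms[k]
    return picks, residual

n = 256
centers = np.arange(16, n, 16)
atoms = np.exp(-0.5 * ((np.arange(n)[None, :] - centers[:, None]) / 3.0) ** 2)
atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)   # unit-norm dictionary

signal = 2.0 * atoms[3] + 1.0 * atoms[7]        # two well-separated events
picks, residual = matching_pursuit(signal, atoms, 2)
print(sorted(k for k, _ in picks))  # [3, 7]
```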

  14. Sustainability Multivariate Analysis of the Energy Consumption of Ecuador Using MuSIASEM and BIPLOT Approach

    Directory of Open Access Journals (Sweden)

    Nathalia Tejedor-Flores

    2017-06-01

    Rapid economic growth, expanding populations and increasing prosperity are driving up demand for energy, water and food, especially in developing countries. To understand the energy consumption of a country, we used the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach. MuSIASEM is an innovative approach to accounting that integrates quantitative information generated by distinct types of conventional models based on different dimensions and scales of analysis. The main objective of this work is to enrich the MuSIASEM approach with information from multivariate methods in order to improve the efficiency of existing models of sustainability. The Biplot method permits the joint plotting, in a reduced dimension, of the rows (individuals) and columns (variables) of a multivariate data matrix. We found, in the case study of Ecuador, that the highest values of the Exosomatic Metabolic Rate (EMR) per economic sector and Economic Labor Productivity (ELP) are located in the Productive Sector (PS). We conclude that combining the MuSIASEM variables with the HJ-Biplot makes it easy to examine the detailed behavior of the labor productivity and energy consumption of a country.

  15. An empirical approach to update multivariate regression models intended for routine industrial use

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Mencia, M.V.; Andrade, J.M.; Lopez-Mahia, P.; Prada, D. [University of La Coruna, La Coruna (Spain). Dept. of Analytical Chemistry

    2000-11-01

    Many problems currently tackled by analysts are highly complex and, accordingly, multivariate regression models need to be developed. Two intertwined topics are important when such models are to be applied within industrial routines: (1) Did the model account for the 'natural' variance of the production samples? (2) Is the model stable over time? This paper focuses on the second topic and presents an empirical approach in which predictive models developed using Mid-FTIR with PLS and PCR held their utility for about nine months when used to predict the octane number of platforming naphthas in a petrochemical refinery. 41 refs., 10 figs., 1 tab.

  16. Uniform approach to linear and nonlinear interrelation patterns in multivariate time series

    Science.gov (United States)

    Rummel, Christian; Abela, Eugenio; Müller, Markus; Hauf, Martinus; Scheidegger, Olivier; Wiest, Roland; Schindler, Kaspar

    2011-06-01

    Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects and interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model allowing for separate tuning of coupling and nonlinearity of interrelation. To demonstrate applicability of the approach to multivariate real-world time series we investigate resting state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal onset seizures. The main findings are that for our rsfMRI data interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
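
The surrogate-based logic, comparing an observed interrelation against phase-randomized surrogates that preserve each signal's power spectrum, can be sketched as follows with toy coupled signals (not fMRI/iEEG data, and a plain correlation rather than the paper's full battery of measures):

```python
import numpy as np

def phase_surrogate(x, rng):
    """Fourier surrogate: keep the amplitude spectrum, randomize phases.
    This preserves the signal's spectrum but destroys cross-signal coupling."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                               # keep the mean real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

rng = np.random.default_rng(4)
x = rng.standard_normal(1024)
y = x + 0.3 * rng.standard_normal(1024)           # strongly coupled pair

observed = abs(np.corrcoef(x, y)[0, 1])
null = [abs(np.corrcoef(phase_surrogate(x, rng), y)[0, 1]) for _ in range(99)]
p = (1 + sum(s >= observed for s in null)) / 100.0
print(p)  # 0.01: the observed interrelation exceeds every surrogate
```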

  17. Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Angel D. Sappa

    2016-06-01

    This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state of the art approaches together with adaptations proposed in the current work. The options evaluated in the current work result from the combination of different setups in the wavelet image decomposition stage together with different fusion strategies for the final merging stage that generates the resulting representation. Most of the approaches evaluate results according to the application for which they are intended. Sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and the performance of the obtained results; these correlations can be used to define criteria for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR).
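
A minimal wavelet fusion sketch: one-level 2D Haar decomposition, averaged approximations, and max-absolute detail selection. The max-abs rule is one common choice among the many setups the paper compares, and the hand-rolled Haar transform stands in for a full wavelet library.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar transform via 2x2 block sums/differences."""
    A, B = img[0::2, 0::2], img[0::2, 1::2]
    C, D = img[1::2, 0::2], img[1::2, 1::2]
    return ((A + B + C + D) / 2, (A - B + C - D) / 2,
            (A + B - C - D) / 2, (A - B - C + D) / 2)

def ihaar2d(LL, HL, LH, HH):
    """Exact inverse of haar2d."""
    h, w = LL.shape
    img = np.empty((2 * h, 2 * w))
    img[0::2, 0::2] = (LL + HL + LH + HH) / 2
    img[0::2, 1::2] = (LL - HL + LH - HH) / 2
    img[1::2, 0::2] = (LL + HL - LH - HH) / 2
    img[1::2, 1::2] = (LL - HL - LH + HH) / 2
    return img

def fuse(img1, img2):
    """Average the approximations; keep the larger-magnitude detail
    coefficient from either image (max-abs selection rule)."""
    c1, c2 = haar2d(img1), haar2d(img2)
    LL = (c1[0] + c2[0]) / 2
    details = [np.where(np.abs(a) >= np.abs(b), a, b)
               for a, b in zip(c1[1:], c2[1:])]
    return ihaar2d(LL, *details)

img = np.arange(64, dtype=float).reshape(8, 8)
print(np.allclose(fuse(img, img), img))  # True: self-fusion is lossless
```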

  18. Quantum transport: A unified approach via a multivariate hypergeometric generating function

    Science.gov (United States)

    Macedo-Junior, A. F.; Macêdo, A. M. S.

    2014-07-01

    We introduce a characteristic function method to describe charge-counting statistics (CCS) in phase coherent systems that directly connects the three most successful approaches to quantum transport: random-matrix theory (RMT), the nonlinear σ-model and the trajectory-based semiclassical method. The central idea is the construction of a generating function based on a multivariate hypergeometric function, which can be naturally represented in terms of quantities that are well-defined in each approach. We illustrate the power of our scheme by obtaining exact analytical results for the first four cumulants of CCS in a chaotic quantum dot coupled ideally to electron reservoirs via perfectly conducting leads with arbitrary number of open scattering channels.

  19. Targeting sources of drought tolerance within an Avena spp. collection through multivariate approaches.

    Science.gov (United States)

    Sánchez-Martín, Javier; Mur, Luis A J; Rubiales, Diego; Prats, Elena

    2012-11-01

    In this study, we identify and characterize sources of drought tolerance within an oat (Avena sativa L.) germplasm collection of 174 landraces and cultivars. We used multivariate analyses, non-supervised principal component analysis (PCA) and supervised discriminant function analysis (DFA), to suggest the key mechanism(s) responsible for coping with drought stress. Following an initial assessment of drought symptoms and the area under the drought progress curve, a subset of 14 accessions was selected for further analysis. The collection was assessed for relative water content (RWC), cell membrane stability, stomatal conductance (gs), leaf temperature, water use efficiency (WUE), lipid peroxidation, lipoxygenase activity, chlorophyll levels and antioxidant capacity during a drought time-course experiment. Without the use of multivariate approaches, it proved difficult to unequivocally link drought tolerance to specific physiological processes in the different resistant oat accessions. These approaches allowed the ranking of many supposed drought tolerance traits in order of importance within this crop, thereby highlighting those with a causal relationship to drought stress tolerance. Analyses of the loading vectors used to derive the PCA and DFA models indicated that two traits involved in water relations, temperature and RWC, together with the area under the drought curves, were important indicators of drought tolerance. However, other parameters involved in water use, such as gs and WUE, were less able to discriminate between the accessions. These observations validate our approach, which should be seen as a cost-effective initial screen that could subsequently be employed to target drought tolerance in segregating populations.
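
The PCA step described above, scoring accessions and reading trait importance off the loading vectors, can be sketched with plain numpy. The trait matrix here is simulated, and the trait names in the comment are only a reminder of the record's variables, not its data.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical trait matrix: 14 accessions x 5 drought-related traits
# (e.g. RWC, leaf temperature, stomatal conductance, WUE, chlorophyll);
# values are simulated, not the paper's measurements.
traits = rng.standard_normal((14, 5))
traits = (traits - traits.mean(0)) / traits.std(0)   # standardize first

# PCA via SVD: scores place the accessions, loading vectors rank the traits
U, S, Vt = np.linalg.svd(traits, full_matrices=False)
scores = U[:, :2] * S[:2]          # accession coordinates on PC1/PC2
loadings = Vt[:2]                  # trait weights for each component
explained = S ** 2 / (S ** 2).sum()
print(scores.shape, loadings.shape)
```

Traits with large absolute loadings on the discriminating components are the candidates for a causal link to tolerance, which is the logic the record applies to temperature and RWC.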

  20. Multivariate chemometric approach to thermal solid-state FT-IR monitoring of pharmaceutical drug compound.

    Science.gov (United States)

    Tan, Wei Jian; Widjaja, Effendi

    2008-08-01

    The study of thermal-related solid-state reactions monitored by spectroscopic methods requires advanced multivariate chemometric approaches, because visual inspection of spectral data on particular functional groups or spectral bands rarely reveals the complete physical and chemical information. The spectral contributions from the various species involved in the solid-state changes are generally highly overlapping, and the spectral differences between reactant and product are usually quite minute. In this article, we demonstrate the use of a multivariate chemometric approach to resolve the in situ thermal-dependent Fourier-transform infrared (FT-IR) mixture spectra of lisinopril dihydrate heated from 24 to 170 degrees C. The collected FT-IR mixture spectra were first subjected to singular value decomposition (SVD) to obtain the right singular vectors. The right singular vectors were rotated into a set of pure component spectral estimates based on entropy minimization and spectral dissimilarity objective functions. The resulting pure component spectral estimates were then further refined using alternating least squares (ALS). In the current study, four pure component spectra, that is, lisinopril dihydrate, monohydrate, anhydrate, and diketopiperazine (DKP), were all resolved, and the relative thermal-dependent contributions of each component were also obtained. These relative contributions revealed the critical temperature for each transformation and degradation. This novel approach provides better interpretation of the pathway of dehydration and intramolecular cyclization of lisinopril dihydrate in the solid state. In addition, it can be used to complement the information obtained from differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA).
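
The SVD-then-ALS pipeline can be sketched on synthetic data. This toy uses two components instead of the paper's four, simple nonnegativity clipping in place of the entropy-minimization rotation, and random "spectra"; every number below is an assumption of the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setup: 60 temperature-resolved "FT-IR spectra" that are
# nonnegative mixtures of 2 pure-component spectra plus noise.
n_spec, n_wn, k = 60, 120, 2
S_true = np.abs(rng.standard_normal((k, n_wn)))       # pure spectra
C_true = np.abs(rng.standard_normal((n_spec, k)))     # contributions
D = C_true @ S_true + 0.01 * rng.standard_normal((n_spec, n_wn))

# Step 1: SVD reveals the rank (number of species) and a starting subspace.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
S = np.abs(Vt[:k])                                    # crude initial spectra

# Step 2: alternating least squares with nonnegativity via clipping,
# standing in for the rotation + ALS refinement used in the paper.
for _ in range(200):
    C = np.clip(D @ np.linalg.pinv(S), 0, None)       # update contributions
    S = np.clip(np.linalg.pinv(C) @ D, 0, None)       # update spectra
C = np.clip(D @ np.linalg.pinv(S), 0, None)

resid = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
print(resid)
```

The rows of C, contributions as a function of temperature index, are the quantities from which the record reads off critical transformation temperatures.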

  1. Determinants of Food Crop Diversity and Profitability in Southeastern Nigeria: A Multivariate Tobit Approach

    Directory of Open Access Journals (Sweden)

    Sanzidur Rahman

    2016-04-01

    Full Text Available The present study jointly determines the factors influencing decisions to diversify into multiple food crops (i.e., rice, yam and cassava) vis-à-vis the profitability of 400 farmers from Ebonyi and Anambra states of Southeastern Nigeria using a multivariate Tobit model. Model diagnostics reveal that the decisions to diversify into multiple crops and the profits generated therefrom are significantly correlated, thereby justifying the use of a multivariate approach. Results reveal that 68% of the farmers grew at least two food crops and that profitability is highest for rice-only producers, followed by joint rice and yam producers, whose output is mainly for sale. Farm size is the most dominant determinant of crop diversity vis-à-vis profitability. A rise in the relative price of plowing significantly reduces the profitability of yam and rice. High yield is the main motive for growing yam and cassava, whereas a ready market is the motive for rice. Other determinants with varying levels of influence are proximity to market and/or extension office, extension contact, training, agricultural credit, subsistence pressure and location. Policy recommendations include investments in market infrastructure and credit services, land and/or tenurial reform, and input price stabilization to promote food crop diversity vis-à-vis profitability in Southeastern Nigeria.

  2. Portfolio Value at Risk Estimate for Crude Oil Markets: A Multivariate Wavelet Denoising Approach

    Directory of Open Access Journals (Sweden)

    Kin Keung Lai

    2012-04-01

    Full Text Available In today's increasingly globalized economy, the major crude oil markets worldwide are seeing a higher level of integration, which results in a higher level of dependency and transmission of risks among different markets. Thus the risk of the typical multi-asset crude oil portfolio is influenced by the dynamic correlation among different assets, which has both normal and transient behaviors. This paper proposes a novel multivariate wavelet denoising based approach for estimating Portfolio Value at Risk (PVaR). Multivariate wavelet analysis is introduced to analyze the multi-scale behaviors of the correlation among different markets and the portfolio volatility behavior in the higher-dimensional time-scale domain. The heterogeneous data and noise behavior are addressed in the proposed multi-scale denoising based PVaR estimation algorithm, which also incorporates mainstream time series models to address other well-known data features such as autocorrelation and volatility clustering. Empirical studies suggest that the proposed algorithm outperforms the benchmark Exponentially Weighted Moving Average (EWMA) and DCC-GARCH models in terms of conventional performance evaluation criteria for model reliability.
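
A stripped-down version of the denoise-then-estimate idea can be sketched as follows. This is an assumption-laden toy, not the paper's algorithm: simulated returns, per-asset universal soft thresholding in place of the multivariate denoising step, and a RiskMetrics-style EWMA variance in place of the full time-series model.

```python
import numpy as np
import pywt
from statistics import NormalDist

rng = np.random.default_rng(4)
# Hypothetical daily log-returns of a 3-asset crude oil portfolio
returns = 0.02 * rng.standard_normal((512, 3))
weights = np.array([0.5, 0.3, 0.2])

def wavelet_denoise(x, wavelet="haar", level=3):
    """Suppress fine-scale noise with universal soft thresholding,
    a simplified stand-in for the paper's multivariate denoising."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise scale
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

denoised = np.column_stack([wavelet_denoise(returns[:, i]) for i in range(3)])
port = denoised @ weights

# EWMA (RiskMetrics-style) volatility of the denoised portfolio series
lam, var = 0.94, port.var()
for r in port:
    var = lam * var + (1 - lam) * r * r
var_95 = NormalDist().inv_cdf(0.95) * np.sqrt(var)   # one-day 95% PVaR
print(var_95)
```

The benchmark comparison in the record amounts to computing the same VaR without the denoising step (plain EWMA) or with a DCC-GARCH volatility model instead.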

  3. Value-at-risk estimation with wavelet-based extreme value theory: Evidence from emerging markets

    Science.gov (United States)

    Cifter, Atilla

    2011-06-01

    This paper introduces wavelet-based extreme value theory (EVT) for univariate value-at-risk estimation. Wavelets and EVT are combined for volatility forecasting to estimate a hybrid model. In the first stage, wavelets are used as a threshold in the generalized Pareto distribution, and in the second stage, EVT is applied with a wavelet-based threshold. This new model is applied to two major emerging stock markets: the Istanbul Stock Exchange (ISE) and the Budapest Stock Exchange (BUX). The relative performance of wavelet-based EVT is benchmarked against the Riskmetrics-EWMA, ARMA-GARCH, generalized Pareto distribution, and conditional generalized Pareto distribution models. The empirical results show that wavelet-based extreme value theory increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. The superior forecasting performance of the wavelet-based EVT model is also consistent with Basel II requirements, and this new model can be used by financial institutions as well.
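
The EVT half of the hybrid model, a peaks-over-threshold fit and the resulting VaR quantile, can be sketched with scipy. This is illustrative only: the losses are simulated Student-t draws, and the threshold is a plain empirical quantile rather than the paper's wavelet-based threshold.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)
# Hypothetical daily losses (negated returns), heavy-tailed
losses = rng.standard_t(df=4, size=5000)

# Peaks-over-threshold: fit a generalized Pareto distribution (GPD) to
# the exceedances above a threshold u. (In the paper, u itself comes
# from a wavelet decomposition; here it is an empirical quantile.)
u = np.quantile(losses, 0.95)
exceed = losses[losses > u] - u
xi, loc, beta = genpareto.fit(exceed, floc=0)   # shape, location, scale

# EVT VaR at level q: u + (beta/xi) * ((n*(1-q)/n_u)**(-xi) - 1)
n, nu, q = len(losses), len(exceed), 0.99
var_q = u + (beta / xi) * (((1 - q) * n / nu) ** (-xi) - 1)
print(var_q)
```

Backtesting then counts how often realized losses exceed `var_q` (the "number of violations" criterion mentioned in the record).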

  4. Wavelet based deseasonalization for modelling and forecasting of daily discharge series considering long range dependence

    Directory of Open Access Journals (Sweden)

    Szolgayová Elena

    2014-03-01

    Full Text Available Short term streamflow forecasting is important for operational control and risk management in hydrology. Despite the wide range of models available, the impact of long range dependence is often neglected when considering short term forecasting. In this paper, the forecasting performance of a new model combining a long range dependent autoregressive fractionally integrated moving average (ARFIMA) model with a wavelet transform used as a method of deseasonalization is examined. It is analysed whether applying wavelets to model the seasonal component of a hydrological time series is an alternative to moving average deseasonalization in combination with an ARFIMA model. The one-to-ten-steps-ahead forecasting performance of this model is compared with that of two other models: an ARFIMA model with moving average deseasonalization, and a multiresolution wavelet based model. All models are applied to a time series of mean daily discharge exhibiting long range dependence. For one and two day forecasting horizons, the combined wavelet-ARFIMA approach shows a performance similar to that of the other models tested. However, for longer forecasting horizons, the wavelet deseasonalization-ARFIMA combination outperforms the other two models. The results show that wavelets provide an attractive alternative to moving average deseasonalization.
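
The wavelet deseasonalization step can be sketched as follows: reconstruct the coarse approximation at a scale deep enough to contain the annual cycle, treat it as the seasonal/trend component, and subtract it. The synthetic discharge series, wavelet choice and decomposition level are all assumptions of this sketch; an ARFIMA model would then be fitted to the residual.

```python
import numpy as np
import pywt

rng = np.random.default_rng(6)
# Hypothetical daily discharge: annual cycle + slow drift + noise
t = np.arange(4096)
series = (10 + 3 * np.sin(2 * np.pi * t / 365.0)
          + np.cumsum(0.005 * rng.standard_normal(4096))
          + 0.3 * rng.standard_normal(4096))

# Level 7 approximation covers frequencies below ~1/256 cycles/day,
# so the 1/365 annual cycle falls inside the approximation band.
level = 7
coeffs = pywt.wavedec(series, "db4", level=level)
seasonal_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
seasonal = pywt.waverec(seasonal_only, "db4")[: len(series)]
deseasonalized = series - seasonal   # residual for the ARFIMA stage

print(np.std(deseasonalized), np.std(series - series.mean()))
```

The residual has far smaller variance than the raw series, which is exactly what makes the subsequent short-memory/long-memory modelling tractable.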

  5. Wavelet-based density estimation for noise reduction in plasma simulations using particles

    Science.gov (United States)

    van yen, Romain Nguyen; del-Castillo-Negrete, Diego; Schneider, Kai; Farge, Marie; Chen, Guangye

    2010-04-01

    For given computational resources, the accuracy of plasma simulations using particles is mainly limited by the noise due to limited statistical sampling in the reconstruction of the particle distribution function. A method based on wavelet analysis is proposed and tested to reduce this noise. The method, known as wavelet-based density estimation (WBDE), was previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its novel application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. The proposed method preserves the moments of the particle distribution function to a good level of accuracy, has no constraints on the dimensionality of the system, does not require an a priori selection of a global smoothing scale, and is able to adapt locally to the smoothness of the density based on the given discrete particle data. Moreover, the computational cost of the denoising stage is of the same order as one time step of an FSP simulation. The method is compared with a recently proposed proper orthogonal decomposition based method, and it is tested with three particle data sets involving different levels of collisionality and interaction with external and self-consistent fields.
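
A one-dimensional stand-in for WBDE can be sketched as: bin the particles into a noisy empirical density, threshold small wavelet coefficients, and reconstruct. The particle distribution, wavelet, and universal-threshold rule below are assumptions of this sketch, not the paper's exact estimator.

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
# Hypothetical "particles" drawn from a bimodal 1D distribution
particles = np.concatenate([rng.normal(-2, 0.5, 4000),
                            rng.normal(2, 0.5, 4000)])

# Noisy empirical density from binning, then wavelet shrinkage
bins = 256
hist, edges = np.histogram(particles, bins=bins, range=(-6, 6), density=True)
coeffs = pywt.wavedec(hist, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise scale
thr = sigma * np.sqrt(2 * np.log(bins))               # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
density = np.clip(pywt.waverec(coeffs, "db4")[:bins], 0, None)

centers = 0.5 * (edges[:-1] + edges[1:])
print(density.shape)
```

Because only small detail coefficients are shrunk, the low-order moments of the estimate stay close to those of the raw histogram, mirroring the moment-preservation property claimed in the record.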

  6. Joint Source-Channel Coding for Wavelet-Based Scalable Video Transmission Using an Adaptive Turbo Code

    Directory of Open Access Journals (Sweden)

    Ramzan Naeem

    2007-01-01

    Full Text Available An efficient approach for joint source and channel coding is presented. The proposed approach exploits the joint optimization of a wavelet-based scalable video coding framework and a forward error correction method based on turbo codes. The scheme minimizes the reconstructed video distortion at the decoder subject to a constraint on the overall transmission bitrate budget. The minimization is achieved by exploiting the source rate distortion characteristics and the statistics of the available codes. Here, the critical problem of estimating the bit error rate probability in error-prone applications is discussed. Aiming at improving the overall performance of the underlying joint source-channel coding, the combination of the packet size, interleaver, and channel coding rate is optimized using Lagrangian optimization. Experimental results show that the proposed approach outperforms conventional forward error correction techniques at all bit error rates. It also significantly improves the performance of end-to-end scalable video transmission at all channel bit rates.
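
The rate-allocation idea, trading source bits against channel protection under a fixed budget, can be sketched with a toy model. Everything numeric here (the distortion curve, packet size, residual bit error rates per code rate) is an assumption for illustration, not the paper's turbo-code statistics or Lagrangian machinery.

```python
import numpy as np

# Hypothetical operating points: more source bits lower the source
# distortion, while a stronger (lower-rate) channel code lowers the
# residual BER after decoding.
total_budget = 1.0e6                                  # channel bits
code_rates = np.array([1 / 3, 1 / 2, 2 / 3, 4 / 5])
residual_ber = np.array([1e-7, 1e-5, 1e-3, 1e-2])     # assumed values

def expected_distortion(source_bits, ber, packet_bits=1000):
    """Toy model: source distortion if the packet decodes,
    a ceiling distortion of 1.0 if it does not."""
    d_source = 1.0 / (1.0 + 1e-5 * source_bits)       # decays with rate
    p_ok = (1.0 - ber) ** packet_bits                 # packet survival
    return d_source * p_ok + 1.0 * (1.0 - p_ok)

# Exhaustive search over code rates: source bits = budget * code rate
costs = [expected_distortion(total_budget * r, b)
         for r, b in zip(code_rates, residual_ber)]
best = int(np.argmin(costs))
print(code_rates[best])
```

The sketch shows the trade-off the record optimizes: too little protection lets channel errors dominate, too much starves the source coder; the optimum sits in between.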

  8. Time-varying correlations in global real estate markets: A multivariate GARCH with spatial effects approach

    Science.gov (United States)

    Gu, Huaying; Liu, Zhixue; Weng, Yingliang

    2017-04-01

    The present study applies the multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) with spatial effects approach for the analysis of the time-varying conditional correlations and contagion effects among global real estate markets. A distinguishing feature of the proposed model is that it can simultaneously capture spatial interactions and dynamic conditional correlations, unlike traditional MGARCH models. Results reveal that the estimated dynamic conditional correlations exhibited significant increases during the global financial crisis from 2007 to 2009, suggesting contagion effects among global real estate markets. The analysis further indicates that the returns of regional real estate markets in close geographic and economic proximity exhibit strong co-movement. In addition, evidence of significantly positive leverage effects in global real estate markets is also found. The findings have significant implications for global portfolio diversification opportunities and risk management practices.
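
The notion of a time-varying conditional correlation can be illustrated with the simplest recursion of this family, an EWMA (RiskMetrics-style) covariance update. This is a stand-in for the MGARCH/DCC dynamics in the record, on simulated returns with a common factor; it omits the spatial-effects component entirely.

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical returns for 3 real estate markets sharing a common factor
common = rng.standard_normal(1000)
rets = 0.6 * common[:, None] + 0.8 * rng.standard_normal((1000, 3))

# EWMA covariance recursion: S_t = lam * S_{t-1} + (1-lam) * r_t r_t'
lam = 0.94
S = np.cov(rets.T)
corrs = []
for r in rets:
    S = lam * S + (1 - lam) * np.outer(r, r)
    d = np.sqrt(np.diag(S))
    corrs.append(S / np.outer(d, d))
corrs = np.array(corrs)            # (T, 3, 3) time-varying correlations

avg_offdiag = corrs[:, 0, 1].mean()
print(avg_offdiag)
```

A crisis-driven rise in co-movement, the contagion effect described in the record, would show up as a sustained increase in these off-diagonal correlation paths.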

  9. Time and spectral domain relative entropy: A new approach to multivariate spectral estimation

    CERN Document Server

    Ferrante, Augusto; Pavon, Michele

    2011-01-01

    The concept of spectral relative entropy rate is introduced for jointly stationary Gaussian processes. Using a classical result of Marc Pinsker, we establish a remarkable connection between the time and spectral domain relative entropy rates. This naturally leads to a new multivariate spectral estimation technique. The latter may be viewed as an extension of the approach called THREE, introduced by Byrnes, Georgiou and Lindquist in 2000, which, in turn, followed in the footsteps of the Burg-Jaynes maximum entropy method. Spectral estimation is here recast in the form of a constrained spectrum approximation problem where the distance is equal to the processes' relative entropy rate. The corresponding solution entails a complexity upper bound which improves on the one so far available in the multichannel framework. Indeed, it is equal to the one featured by THREE in the scalar case. The solution is computed via a globally convergent matricial Newton-type algorithm. Simulations suggest the effectiveness of the new technique.

  10. A multivariate partial least squares approach to joint association analysis for multiple correlated traits

    Directory of Open Access Journals (Sweden)

    Yang Xu

    2016-02-01

    Full Text Available Many complex traits are highly correlated rather than independent. By taking the correlation structure of multiple traits into account, joint association analyses can achieve both higher statistical power and more accurate estimation. To develop a statistical approach to joint association analysis that includes allele detection and genetic effect estimation, we combined multivariate partial least squares regression with variable selection strategies and selected the optimal model using the Bayesian Information Criterion (BIC). We then performed extensive simulations under varying heritabilities and sample sizes to compare the performance achieved using our method with those obtained by single-trait multilocus methods. Joint association analysis has measurable advantages over single-trait methods, as it exhibits superior gene detection power, especially for pleiotropic genes. Sample size, heritability, polymorphic information content (PIC), and magnitude of gene effects influence the statistical power, accuracy and precision of effect estimation by the joint association analysis.
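
The core of the method, multivariate PLS regressing several correlated traits on many markers at once, can be sketched with a bare-bones NIPALS-style implementation. The simulated genotype/trait data, the number of components, and the coefficient-norm ranking below are assumptions of this sketch; the paper additionally wraps variable selection and BIC model choice around this step.

```python
import numpy as np

rng = np.random.default_rng(9)
# Hypothetical data: 200 individuals, 50 marker predictors, 3 correlated
# traits driven by 5 pleiotropic markers
n, p, q = 200, 50, 3
X = rng.standard_normal((n, p))
B_true = np.zeros((p, q))
B_true[:5] = rng.standard_normal((5, q))
Y = X @ B_true + 0.5 * rng.standard_normal((n, q))

def pls_fit(X, Y, n_comp):
    """Bare-bones multivariate PLS with deflation of X."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Xr = Xc.copy()
    W, P, T = [], [], []
    for _ in range(n_comp):
        u, _, _ = np.linalg.svd(Xr.T @ Yc, full_matrices=False)
        w = u[:, 0]                       # weight: top left singular vector
        t = Xr @ w                        # latent score
        p_load = Xr.T @ t / (t @ t)       # X loading
        Xr = Xr - np.outer(t, p_load)     # deflate
        W.append(w); P.append(p_load); T.append(t)
    W, P, T = map(np.column_stack, (W, P, T))
    R = W @ np.linalg.inv(P.T @ W)        # so that T == Xc @ R
    C = np.linalg.lstsq(T, Yc, rcond=None)[0]
    return R @ C                          # (p x q) coefficient matrix

B_hat = pls_fit(X, Y, n_comp=5)
# Markers with the largest coefficient norms across traits are the
# candidate (pleiotropic) loci.
top5 = set(np.argsort(-np.linalg.norm(B_hat, axis=1))[:5])
print(sorted(top5))
```

Ranking markers by the norm of their coefficient row across all traits is what gives the joint analysis its edge for pleiotropic genes over single-trait scans.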

  13. On the Time Varying Relationship between Oil Price and G7 Equity index: a Multivariate Approach

    Directory of Open Access Journals (Sweden)

    Khaled Guesmi

    2016-06-01

    Full Text Available The aim of this paper is to investigate the interaction between G7 stock markets and oil prices during the period 1998-2013. We employ a multivariate approach based on the c-DCC-FIAPARCH framework that incorporates the asymmetry and persistence typically observed in stock markets and oil prices. We show that the origin of an oil price shock is the main driver of the relationship between stock and oil markets. More specifically, our results show, on the one hand, that G7 equity markets have a high correlation with the oil market in the presence of aggregate demand oil price shocks (the Asian crisis, the housing market boom, Chinese growth, the subprime crisis). On the other hand, our results highlight that G7 equity markets did not react to precautionary demand shocks (the 9/11 US terrorist attacks, and the second Iraq war in 2003).

  14. Image Denoising of Wavelet based Compressed Images Corrupted by Additive White Gaussian Noise

    Directory of Open Access Journals (Sweden)

    Shyam Lal

    2012-08-01

    Full Text Available In this study, an efficient algorithm is proposed for the removal of additive white Gaussian noise from compressed natural images in the wavelet domain. First, the natural image is compressed by the discrete wavelet transform, and then the proposed hybrid filter is applied to denoise the compressed images corrupted by Additive White Gaussian Noise (AWGN). The proposed hybrid filter (HMCD) is a combination of a non-linear fourth-order partial differential equation and a bivariate shrinkage function. It provides better results in terms of noise suppression while keeping edge blurring to a minimum, compared with other existing image denoising techniques for wavelet-compressed images. Simulation and experimental results on benchmark test images demonstrate that the proposed hybrid filter attains competitive denoising performance compared with other state-of-the-art image denoising algorithms. It is particularly effective for highly corrupted images in the wavelet-compressed domain.

  15. Wavelet Based Analytical Expressions to Steady State Biofilm Model Arising in Biochemical Engineering.

    Science.gov (United States)

    Padma, S; Hariharan, G

    2016-06-01

    In this paper, we have developed an efficient wavelet-based approximation method for a steady-state biofilm model arising in enzyme kinetics. A Chebyshev wavelet based approximation method is introduced for solving the nonlinear steady-state biofilm reaction model. To the best of our knowledge, no rigorous wavelet-based solution has previously been reported for the proposed model. Analytical solutions for the substrate concentration have been derived for all values of the parameters δ and SL. The power and manageability of the method are confirmed. Some numerical examples are presented to demonstrate the validity and applicability of the wavelet method. Moreover, the Chebyshev wavelet approach is found to be simple, efficient, flexible and convenient, with small computational costs.

  16. A new wavelet-based thin plate element using B-spline wavelet on the interval

    Science.gov (United States)

    Jiawei, Xiang; Xuefeng, Chen; Zhengjia, He; Yinghong, Zhang

    2008-01-01

    By combining wavelet theory from mathematics with the variational principle of the finite element method, a class of wavelet-based plate elements is constructed. In the construction of a wavelet-based plate element, the element displacement field, represented by the coefficients of wavelet expansions in wavelet space, is transformed into physical degrees of freedom in finite element space via the corresponding two-dimensional C1-type transformation matrix. Then, based on the generalized potential energy functional of thin plate bending and vibration problems, the scaling functions of the B-spline wavelet on the interval (BSWI) at different scales are employed directly to form the multi-scale finite element approximation basis, so as to construct the BSWI plate element via the variational principle. The BSWI plate element combines the accuracy of B-spline function approximation with the adaptability of wavelet-based elements for structural analysis. Some static and dynamic numerical examples are studied to demonstrate the performance of the present element.

  17. Approaches to Sample Size Determination for Multivariate Data: Applications to PCA and PLS-DA of Omics Data.

    Science.gov (United States)

    Saccenti, Edoardo; Timmerman, Marieke E

    2016-08-01

    Sample size determination is a fundamental step in the design of experiments. Methods for sample size determination are abundant for univariate analysis methods, but scarce in the multivariate case. Omics data are multivariate in nature and are commonly investigated using multivariate statistical methods, such as principal component analysis (PCA) and partial least-squares discriminant analysis (PLS-DA). No simple approaches to sample size determination exist for PCA and PLS-DA. In this paper we will introduce important concepts and offer strategies for (minimally) required sample size estimation when planning experiments to be analyzed using PCA and/or PLS-DA.

  18. Analysis of the real EADGENE data set: Multivariate approaches and post analysis (Open Access publication

    Directory of Open Access Journals (Sweden)

    Schuberth Hans-Joachim

    2007-11-01

    Full Text Available Abstract The aim of this paper was to describe, and when possible compare, the multivariate methods used by the participants in the EADGENE WP1.4 workshop. The first approach was for class discovery and class prediction using evidence from the data at hand. Several teams used hierarchical clustering (HC) or principal component analysis (PCA) to identify groups of differentially expressed genes with a similar expression pattern over time points and infective agent (E. coli or S. aureus). The main result from these analyses was that HC and PCA were able to separate tissue samples taken at 24 h following E. coli infection from the other samples. The second approach identified groups of differentially co-expressed genes, by identifying clusters of genes highly correlated when animals were infected with E. coli but not correlated more than expected by chance when the infective pathogen was S. aureus. The third approach looked at differential expression of predefined gene sets. Gene sets were defined based on information retrieved from biological databases such as Gene Ontology. Based on these annotation sources, the teams used either the GlobalTest or the Fisher exact test to identify differentially expressed gene sets. The main result from these analyses was that gene sets involved in immune defence responses were differentially expressed.
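
The hierarchical clustering step, grouping genes by similarity of expression pattern rather than by expression level, can be sketched with scipy. The expression matrix below is simulated with two planted gene groups; the metric and linkage choices are assumptions of this sketch, not the workshop teams' exact settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(10)
# Hypothetical expression matrix: 40 genes x 8 samples, with two groups
# of genes each sharing an expression pattern across the samples
pattern_a = rng.standard_normal(8)
pattern_b = rng.standard_normal(8)
expr = np.vstack([pattern_a + 0.2 * rng.standard_normal((20, 8)),
                  pattern_b + 0.2 * rng.standard_normal((20, 8))])

# Correlation distance groups genes with similar *patterns*, regardless
# of their absolute expression levels
dist = pdist(expr, metric="correlation")
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")
print(labels)
```

Cutting the dendrogram at two clusters (via `criterion="maxclust"`) recovers the planted gene groups, which is the class-discovery use of HC described in the record.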

  19. Monitoring and assessment of a eutrophicated coastal lake using multivariate approaches

    Directory of Open Access Journals (Sweden)

    U.G. Abhjna

    2016-05-01

    Full Text Available Multivariate statistical techniques such as cluster analysis, multidimensional scaling and principal component analysis were applied to evaluate the temporal and spatial variations in a water quality data set generated over two years (2008-2010) from six monitoring stations of Veli-Akkulam Lake, compared with a regional reference lake, Vellayani, of south India. Seasonal variations of the 14 physicochemical parameters analyzed were as follows: pH (6.42-7.48), water temperature (26.0-31.28°C), salinity (0.50-26.81 ppt), electrical conductivity (47-20656.31 µs/cm), dissolved oxygen (0.078-7.65 mg/L), free carbon dioxide (3.8-51.8 mg/L), total hardness (27.20-2166.6 mg/L), total dissolved solids (84.66-4195 mg/L), biochemical oxygen demand (1.57-25.78 mg/L), chemical oxygen demand (5.35-71.14 mg/L), nitrate (0.012-0.321 µg/ml), nitrite (0.24-0.79 µg/ml), phosphate (0.04-5.88 mg/L), and sulfate (0.27-27.8 mg/L). Cluster analysis showed four clusters based on the similarity of water quality characteristics among sampling stations during three different seasons (pre-monsoon, monsoon and post-monsoon). Multidimensional scaling in conjunction with cluster analysis identified four distinct groups of sites with varied water quality conditions, such as upstream, transitional and downstream conditions in Veli-Akkulam Lake and a reference condition at Vellayani Lake. Principal component analysis showed that water quality was seriously deteriorated in Veli-Akkulam Lake, while acceptable conditions were observed at the reference lake Vellayani. The present study thus demonstrates the effectiveness of multivariate statistical approaches for assessing water quality conditions in lakes.

  20. A Hydraulic Tomographic Approach: Coupling of Travel Time and Amplitude Inversion Using Multivariate Statistics

    Science.gov (United States)

    Brauchler, R.; Cheng, J.; Dietrich, P.; Everett, M.; Johnson, B.; Sauter, M.

    2005-12-01

    Knowledge about the spatial variations in hydraulic properties plays an important role in controlling solute movement in saturated flow systems. Traditional hydrogeological approaches have difficulty providing high-resolution parameter estimates. Thus, we have developed an approach coupling the two existing hydraulic tomographic approaches: (a) inversion of the drawdown as a function of time (amplitude inversion) and (b) inversion of travel times of the pressure disturbance. The advantages of hydraulic travel time tomography are its high structural resolution and computational efficiency. However, travel times are primarily controlled by the aquifer diffusivity, making it difficult to determine hydraulic conductivity and storage separately. Amplitude inversion, on the other hand, is able to determine hydraulic conductivity and storage separately, but its heavy computational burden is often a shortcoming, especially for larger data sets. Our coupled inversion approach was developed and tested using synthetic data sets. The data base of the inversion comprises simulated slug tests, in which the positions of the sources (injection ports), isolated with packers, are varied between the tests. The first step was the inversion of several characteristic travel times (e.g., early, intermediate and late travel times) in order to determine the diffusivity distribution. Secondly, the resulting diffusivity distributions were classified into homogeneous groups in order to differentiate between hydrogeological units characterized by significant diffusivity contrasts. The classification was performed using multivariate statistics. Finally, the amplitude inversion was performed with a numerical flow model and an automatic parameter estimator. The classified diffusivity distribution is an excellent starting model for the amplitude inversion and greatly reduces the calculation time. The final amplitude inversion overcomes

  1. Multivariate dimensionality reduction approaches to identify gene-gene and gene-environment interactions underlying multiple complex traits.

    Directory of Open Access Journals (Sweden)

    Hai-Ming Xu

    Full Text Available The elusive but ubiquitous multifactor interactions represent a stumbling block that urgently needs to be removed in the search for determinants of human complex diseases. Dimensionality reduction approaches are a promising tool for this task. Many complex diseases exhibit composite syndromes that must be measured through a cluster of clinical traits with varying correlations and/or are inherently longitudinal in nature (changing over time and measured dynamically at multiple time points). A multivariate approach for detecting interactions is thus greatly needed, both for handling a multifaceted phenotype and longitudinal data and for improving statistical power in multiple significance testing via a two-stage testing procedure that involves a multivariate analysis for grouped phenotypes followed by univariate analysis for the phenotypes in the significant group(s). In this article, we propose a multivariate extension of generalized multifactor dimensionality reduction (GMDR) based on multivariate generalized linear, multivariate quasi-likelihood and generalized estimating equations models. Simulations and real data analysis for the cohort from the Study of Addiction: Genetics and Environment are performed to investigate the properties and performance of the proposed method, as compared with the univariate method. The results suggest that the proposed multivariate GMDR substantially boosts statistical power.

  2. A wavelet-based evaluation of time-varying long memory of equity markets: A paradigm in crisis

    Science.gov (United States)

    Tan, Pei P.; Chin, Cheong W.; Galagedera, Don U. A.

    2014-09-01

    This study, using a wavelet-based method, investigates the dynamics of long memory in the returns and volatility of equity markets. In a sample of five developed and five emerging markets, we find that the daily return series from January 1988 to June 2013 may be considered a mix of weak long memory and mean-reverting processes. In the case of volatility in the returns, there is evidence of long memory, which is stronger in emerging markets than in developed markets. We find that although the long memory parameter may vary during crisis periods (the 1997 Asian financial crisis, the 2001 US recession and the 2008 subprime crisis), the direction of change may not be consistent across all equity markets. The degree of return predictability is likely to diminish during crisis periods. Robustness of the results is checked with a detrended fluctuation analysis approach.
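The detrended fluctuation analysis used above as a robustness check can be sketched in a few lines. This is a minimal sketch run on an illustrative white-noise series, not the study's market data; the scales and sample size are arbitrary choices:

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        # detrend each segment with a least-squares line, collect RMS
        rms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
               for seg in segs]
        F.append(np.sqrt(np.mean(rms)))
    # slope of log F(s) vs log s is the DFA exponent
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
alpha = dfa(white, scales=[16, 32, 64, 128, 256])
print(round(alpha, 2))                       # near 0.5 for uncorrelated noise
```

For uncorrelated noise the exponent sits near 0.5; persistent long-memory series push it toward 1, which is how the crisis-period comparison above is read.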

  3. Controlling the Beam Halo-Chaos via Wavelet-Based Feedback Periodically

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In our recent work, worth mentioning in particular is the wavelet-based feedback controller, which works much better than the others for controlling the proton beam halo-chaos. The master wavelet function is given in its original, simplified and generalized forms (Eqs. (1)-(3), not reproduced here), where a and b are scaling and translation constants, respectively, and C is a selected constant.

  4. Examining carbon emissions economic growth nexus for India: A multivariate cointegration approach

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, Sajal, E-mail: sghosh@mdi.ac.i [Management Development Institute, Gurgaon, Haryana 122001 (India)

    2010-06-15

    The study probes cointegration and causality between carbon emissions and economic growth for India using the ARDL bounds testing approach, complemented by the Johansen-Juselius maximum likelihood procedure, in a multivariate framework incorporating energy supply, investment and employment for the time span 1971-2006. The study fails to establish a long-run equilibrium relationship and long-term causality between carbon emissions and economic growth; however, there exists a bi-directional short-run causality between the two. Hence, in the short run, any effort to reduce carbon emissions could lead to a fall in the national income. This study also establishes unidirectional short-run causality running from economic growth to energy supply and from energy supply to carbon emissions. The absence of causality running from energy supply to economic growth implies that in India, energy conservation and energy efficiency measures can be implemented to minimize the wastage of energy across the value chain. Such measures would narrow the energy demand-supply gap. The absence of long-run causality between carbon emissions and economic growth implies that in the long run, focus should be placed on harnessing energy from clean sources to curb carbon emissions, which would not affect the country's economic growth.

  5. Examining carbon emissions economic growth nexus for India. A multivariate cointegration approach

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, Sajal [Assistant, Professor, Management Development Institute, Gurgaon, Haryana 122001 (India)

    2010-06-15

    The study probes cointegration and causality between carbon emissions and economic growth for India using the ARDL bounds testing approach, complemented by the Johansen-Juselius maximum likelihood procedure, in a multivariate framework incorporating energy supply, investment and employment for the time span 1971-2006. The study fails to establish a long-run equilibrium relationship and long-term causality between carbon emissions and economic growth; however, there exists a bi-directional short-run causality between the two. Hence, in the short run, any effort to reduce carbon emissions could lead to a fall in the national income. This study also establishes unidirectional short-run causality running from economic growth to energy supply and from energy supply to carbon emissions. The absence of causality running from energy supply to economic growth implies that in India, energy conservation and energy efficiency measures can be implemented to minimize the wastage of energy across the value chain. Such measures would narrow the energy demand-supply gap. The absence of long-run causality between carbon emissions and economic growth implies that in the long run, focus should be placed on harnessing energy from clean sources to curb carbon emissions, which would not affect the country's economic growth. (author)

  6. Multivariate approach to determination of intermediate target of monetary policy strategy in CEE countries

    Directory of Open Access Journals (Sweden)

    Mario Pečarić

    2014-12-01

    Full Text Available The main aim of this paper is to investigate the characteristics of Central and East European (CEE) countries with regard to their choice of an intermediate target of monetary policy strategy. The theoretical choice of an intermediate target refers to inflation targeting, exchange rate targeting and monetary targeting, with the latter not being a practical choice in these countries. This research seeks to determine whether this choice corresponds to different economic characteristics in terms of macroeconomic and financial variables, and whether the choice of an individual intermediate target implies better overall economic performance. Eleven characteristics are classified for 14 chosen CEE countries, using multivariate and multicriteria approaches. The research was conducted for four years (2005, 2007, 2009 and 2011) in order to see whether the performance and ranking of countries change when the financial crisis is taken into account. The results show that, when all 11 indicators are considered, the countries cannot be classified into two equal clusters according to the choice of the intermediate target. However, when clustering is done using foreign-currency-denominated loans and three forms of central bank independence, the countries are clustered according to our expectations, i.e., foreign-currency-denominated loans and two forms of central bank independence contribute to a difference between countries. Furthermore, the countries were ranked by overall economic performance, but no significant difference related to the choice of the intermediate target was found.

  7. Multi-variate flood damage assessment: a tree-based data-mining approach

    Science.gov (United States)

    Merz, B.; Kreibich, H.; Lall, U.

    2013-01-01

    The usual approach for flood damage assessment consists of stage-damage functions which relate the relative or absolute damage for a certain class of objects to the inundation depth. Other characteristics of the flooding situation and of the flooded object are rarely taken into account, although flood damage is influenced by a variety of factors. We apply a group of data-mining techniques, known as tree-structured models, to flood damage assessment. A very comprehensive data set of more than 1000 records of direct building damage of private households in Germany is used. Each record contains details about a large variety of potential damage-influencing characteristics, such as hydrological and hydraulic aspects of the flooding situation, early warning and emergency measures undertaken, state of precaution of the household, building characteristics and socio-economic status of the household. Regression trees and bagging decision trees are used to select the more important damage-influencing variables and to derive multi-variate flood damage models. It is shown that these models outperform existing models, and that tree-structured models are a promising alternative to traditional damage models.
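A regression tree of the kind applied in the record above can be sketched with scikit-learn; the damage-generating model, variable names and coefficients below are synthetic stand-ins for the German damage records, assumed only for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
n = 1000
depth = rng.uniform(0, 3, n)                    # inundation depth [m] (synthetic)
warning = rng.integers(0, 2, n).astype(float)   # early warning received (0/1)
# hypothetical relative-damage model: warning mitigates depth-driven damage
damage = 0.2 * depth - 0.1 * warning * depth + rng.normal(0, 0.02, n)

X = np.column_stack([depth, warning])
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, damage)
print(tree.feature_importances_)                # depth dominates, as built in
```

The learned feature importances play the variable-selection role described above; bagging many such trees would smooth the piecewise-constant predictions.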

  8. A New Multivariate Approach for Prognostics Based on Extreme Learning Machine and Fuzzy Clustering.

    Science.gov (United States)

    Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine

    2015-12-01

    Prognostics is a core process of the prognostics and health management (PHM) discipline that estimates the remaining useful life (RUL) of degrading machinery to optimize its service delivery potential. However, machinery operates in a dynamic environment, and the acquired condition monitoring data are usually noisy and subject to a high level of uncertainty/unpredictability, which complicates prognostics. The complexity further increases when prior knowledge about the ground truth (or failure definition) is absent. For such issues, data-driven prognostics can be a valuable solution that does not require a deep understanding of system physics. This paper contributes a new data-driven prognostics approach, namely "enhanced multivariate degradation modeling," which enables modeling the degrading states of machinery without assuming a homogeneous pattern. In brief, a predictability scheme is introduced to reduce the dimensionality of the data. Following that, the proposed prognostics model is achieved by integrating two new algorithms, namely the summation wavelet-extreme learning machine and subtractive-maximum entropy fuzzy clustering, to show the evolution of machine degradation by simultaneous predictions and discrete state estimation. The prognostics model is equipped with a dynamic failure threshold assignment procedure to estimate the RUL in a realistic manner. To validate the proposition, a case study is performed on turbofan engine data from the PHM challenge 2008 (NASA), and results are compared with recent publications.

  9. Detecting change in biological rhythms: a multivariate permutation test approach to Fourier-transformed data.

    Science.gov (United States)

    Blackford, Jennifer Urbano; Salomon, Ronald M; Waller, Niels G

    2009-02-01

    Treatment-related changes in neurobiological rhythms are of increasing interest to psychologists, psychiatrists, and biological rhythms researchers. New methods for analyzing change in rhythms are needed, as most common methods disregard the rich complexity of biological processes. Large time series data sets reflect the intricacies of underlying neurobiological processes, but can be difficult to analyze. We propose the use of Fourier methods with multivariate permutation test (MPT) methods for analyzing change in rhythms from time series data. To validate the use of MPT for Fourier-transformed data, we performed Monte Carlo simulations and compared statistical power and family-wise error for MPT to Bonferroni-corrected and uncorrected methods. Results show that MPT provides greater statistical power than Bonferroni-corrected tests, while appropriately controlling family-wise error. We applied this method to human, pre- and post-treatment, serially-sampled neurotransmitter data to confirm the utility of this method using real data. Together, Fourier with MPT methods provides a statistically powerful approach for detecting change in biological rhythms from time series data.
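The core of the MPT idea, controlling family-wise error via the permutation distribution of a maximum statistic over Fourier amplitudes, can be sketched as follows; the pre/post series, the sign-flip scheme and the injected rhythm are illustrative assumptions, not the neurotransmitter data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sub, n_time = 12, 64
t = np.arange(n_time)
pre = rng.standard_normal((n_sub, n_time))
# post-treatment series gain a rhythm at one frequency (synthetic effect)
post = rng.standard_normal((n_sub, n_time)) + np.sin(2 * np.pi * 4 * t / n_time)

amp = lambda x: np.abs(np.fft.rfft(x, axis=1))   # amplitude spectra
diff = amp(post) - amp(pre)                      # per-subject amplitude change

def max_stat(d):
    se = d.std(0, ddof=1) / np.sqrt(len(d))
    return np.max(np.abs(d.mean(0) / se))        # max |t| across frequencies

obs = max_stat(diff)
null = np.array([max_stat(diff * rng.choice([-1.0, 1.0], size=(n_sub, 1)))
                 for _ in range(500)])           # sign-flip permutations
p = np.mean(null >= obs)                         # family-wise corrected p
print(p < 0.05)
```

Because the null distribution is built from the maximum over all frequency bins, a single p-value controls the family-wise error without a Bonferroni penalty.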

  10. Mapping Natural Terroir Units using a multivariate approach and legacy data

    Science.gov (United States)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

    was then subdivided into 9 NTUs, statistically differentiated by the variables used. The study demonstrated the strength of a multivariate approach for NTU mapping at the province scale (1:125,000), using viticultural legacy data. Identification and mapping of terroir diversity within the DOC and DOCG at the province scale suggest the adoption of viticultural subzones. The subzones, based on the NTUs, could lead to different wine-production systems that enhance the peculiarities of the terroir.

  11. Multivariate regression approaches for surrogate-based diffeomorphic estimation of respiratory motion in radiation therapy

    Science.gov (United States)

    Wilms, M.; Werner, R.; Ehrhardt, J.; Schmidt-Richberg, A.; Schlemmer, H.-P.; Handels, H.

    2014-03-01

    Breathing-induced location uncertainties of internal structures are still a relevant issue in the radiation therapy of thoracic and abdominal tumours. Motion compensation approaches like gating or tumour tracking are usually driven by low-dimensional breathing signals, which are acquired in real-time during the treatment. These signals are only surrogates of the internal motion of target structures and organs at risk, and, consequently, appropriate models are needed to establish correspondence between the acquired signals and the sought internal motion patterns. In this work, we present a diffeomorphic framework for correspondence modelling based on the Log-Euclidean framework and multivariate regression. Within the framework, we systematically compare standard and subspace regression approaches (principal component regression, partial least squares, canonical correlation analysis) for different types of common breathing signals (1D: spirometry, abdominal belt, diaphragm tracking; multi-dimensional: skin surface tracking). Experiments are based on 4D CT and 4D MRI data sets and cover intra- and inter-cycle as well as intra- and inter-session motion variations. Only small differences in internal motion estimation accuracy are observed between the 1D surrogates. Increasing the surrogate dimensionality, however, improved the accuracy significantly; this is shown for both 2D signals, which consist of a common 1D signal and its time derivative, and high-dimensional signals containing the motion of many skin surface points. Eventually, comparing the standard and subspace regression variants when applied to the high-dimensional breathing signals, only small differences in terms of motion estimation accuracy are found.

  12. Classification of Sunflower Oil Blends Stabilized by Oleoresin Rosemary (Rosmarinus officinalis L.) Using Multivariate Kinetic Approach.

    Science.gov (United States)

    Upadhyay, Rohit; Mishra, Hari Niwas

    2015-08-01

    The sunflower oil-oleoresin rosemary (Rosmarinus officinalis L.) blends (SORB) at 9 different concentrations (200 to 2000 mg/kg), sunflower oil-tertiary butyl hydroquinone (SOTBHQ) at 200 mg/kg and a control without preservatives (SOcontrol) were oxidized using Rancimat (temperature: 100 to 130 °C; airflow rate: 20 L/h). The oxidative stability of the blends was expressed using the induction period (IP), oil stability index and a photochemiluminescence assay. Linear regression models were generated by plotting ln IP against temperature to estimate the shelf life at 20 °C (SL20; R(2) > 0.90). Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used to classify the oil blends according to oxidative stability and kinetic parameters. The Arrhenius equation adequately described the temperature-dependent kinetics (R(2) > 0.90, P < 0.05), and the kinetic parameters, viz. activation energies, activation enthalpies and entropies, were calculated in the ranges 92.07 to 100.50 kJ/mol, 88.85 to 97.28 kJ/mol and -33.33 to -1.13 J/mol K, respectively. Using PCA, a satisfactory discrimination was noted among the SORB, SOTBHQ and SOcontrol samples. HCA classified the oil blends into 3 clusters (I, II and III), where SORB1200 and SORB1500 were grouped together in close proximity to SOTBHQ, indicating comparable oxidative stability. The SL20 was estimated to be 3790, 6974 and 4179 h for SOcontrol, SOTBHQ and SORB1500, respectively. The multivariate kinetic approach effectively screened SORB1500 as the best blend, conferring the highest oxidative stability to sunflower oil. This approach can be adopted for quick and reliable estimation of the oxidative stability of oil samples.
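The shelf-life extrapolation step, fitting ln IP linearly against temperature and evaluating the fit at 20 °C, can be sketched as follows; the induction periods below are hypothetical illustrations, not the measured SORB values:

```python
import numpy as np

# Rancimat induction periods (h) at elevated temperatures (°C); the values
# are hypothetical, chosen only to show the extrapolation mechanics
T = np.array([100.0, 110.0, 120.0, 130.0])
IP = np.array([8.0, 4.1, 2.0, 1.05])

# ln(IP) is approximately linear in temperature over this range
slope, intercept = np.polyfit(T, np.log(IP), 1)
SL20 = np.exp(intercept + slope * 20.0)          # extrapolated shelf life at 20 °C
print(round(SL20))                               # shelf life in hours
```

The Arrhenius parameters reported above come from the companion fit of ln(rate) against 1/T in kelvin; the mechanics are the same single `polyfit` call on transformed axes.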

  13. A numerical multivariate approach to optimization of photovoltaic solar low energy building designs

    Energy Technology Data Exchange (ETDEWEB)

    Peippo, K.

    1997-12-31

    The large number of options available to the energy-conscious building designer calls for careful assessment of the competitiveness of the various technologies, in order to find the best technology mix for each design. Here, a simplified but rigorous numerical multivariate optimization scheme is introduced to address the energy-efficient solar low energy building design problem at the early design stages. The approach is based on elementary building modeling and non-linear optimization techniques in which the basic physical, technical and economic interactions between the building design options and energy flows are accounted for. The design features considered include building geometry, thermal insulation, windows, daylighting, solar thermal systems and photovoltaics. The applicability of the approach is assessed through a set of case studies for a single-family residential house and a large office building in three different climates in Europe: Helsinki, Finland (60 deg N), Paris, France (49 deg N) and Trapani, Italy (38 deg N). The analysis is based on annual hourly simulations with Test Reference Years. First, the design minimizing the sum of annual capital and energy cost of the building is determined. Then, the optimal path is computed that gives the most economical design reaching a given level of annual primary energy requirements, should this desired level be lower than the one in the least-cost option. The optimal values of the key design variables for the case studies are presented. As a typical low energy building design feature, photovoltaics is to be introduced at a relatively early stage in the designs, in order to further decrease the auxiliary energy requirements. This is due to the rather high share of electricity in the energy balance of low energy buildings. However, it is crucial for the integrity of low energy building design that the other design features also be optimized before one resorts to PV.
In addition, it is notable that the somewhat

  14. Individual Differences in Male Rats in a Behavioral Test Battery: A Multivariate Statistical Approach

    Science.gov (United States)

    Feyissa, Daniel D.; Aher, Yogesh D.; Engidawork, Ephrem; Höger, Harald; Lubec, Gert; Korz, Volker

    2017-01-01

    Animal models for anxiety, depressive-like and cognitive diseases or aging often involve testing of subjects in behavioral test batteries. The large number of test variables with different mean variations and within- and between-test correlations often constitutes a significant problem in determining the essential variables for assessing behavioral patterns and their variation in individual animals, as well as the appropriate statistical treatment. We therefore applied a multivariate approach (principal component analysis) to analyse the behavioral data of 162 male adult Sprague-Dawley rats that underwent a behavioral test battery including commonly used tests for spatial learning and memory (holeboard) and different behavioral patterns (open field, elevated plus maze, forced swim test) as well as for motor abilities (Rota rod). The high-dimensional behavioral results were reduced to fewer components associated with spatial cognition, general activity, anxiety- and depression-like behavior, and motor ability. The loading scores of individual rats on these components allow an assessment of the distribution of individual features in a population of animals. The reduced number of components can also be used for statistical calculations, such as determining appropriate sample sizes for valid discrimination between experimental groups, which otherwise would have to be done for each variable. Because the animals were intact, untreated and experimentally naïve, the results reflect trait patterns of behavior and thus individuality. The distribution of animals with high or low levels of anxiety, depressive-like behavior, general activity and cognitive features in a local population provides information on the probability of their appearance in experimental samples and thus may help to avoid biases. However, such an analysis initially requires a large cohort of animals in order to gain a valid assessment.
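The dimensionality reduction step can be sketched as a PCA on standardized scores; the latent-trait generating model and the variable labels below are hypothetical stand-ins for the battery data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 162
# two latent traits generate six observed test variables (hypothetical battery)
anxiety = rng.standard_normal(n)
activity = rng.standard_normal(n)
noise = lambda: 0.3 * rng.standard_normal(n)
X = np.column_stack([
    anxiety + noise(),    # plus-maze open-arm avoidance (assumed label)
    anxiety + noise(),    # open-field centre avoidance (assumed label)
    anxiety + noise(),    # forced-swim immobility (assumed label)
    activity + noise(),   # open-field distance (assumed label)
    activity + noise(),   # holeboard exploration (assumed label)
    activity + noise(),   # Rota-rod latency (assumed label)
])

Xc = (X - X.mean(0)) / X.std(0)              # standardize the battery scores
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)          # variance explained per component
print(np.round(explained[:2], 2))            # first two components dominate
```

The rows of `U[:, :2] * S[:2]` are the individual loading scores that the record uses to place single animals along the trait dimensions.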

  15. Optimization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    Science.gov (United States)

    Fu, Zhibiao; Leighton, Julie; Cheng, Aili; Appelbaum, Edward; Aon, Juan C

    2012-07-01

    Various approaches have been applied to optimize biological product fermentation processes and define the design space. In this article, we present a stepwise approach to optimize a Saccharomyces cerevisiae fermentation process through risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical process parameters (CPPs) were first identified through a risk assessment. The response surface for each attribute was modeled using the results from the DoE study, with consideration given to interactions between CPPs. A multivariate Bayesian predictive approach was then used to identify the region of process operating conditions where all attributes met their specifications simultaneously. The model prediction was verified by twelve consistency runs, in which all batches achieved a broth titer of more than 1.53 g/L and quality attributes within the expected ranges. The calculated probability was used to define the reliable operating region. To our knowledge, this is the first case study to apply the multivariate Bayesian predictive approach to process optimization for an industrial application, with verification at two different production scales. This approach can be extended to other fermentation process optimizations and to quantitation of reliable operating regions.

  16. Power analysis for multivariate and repeated measures designs: a flexible approach using the SPSS MANOVA procedure.

    Science.gov (United States)

    D'Amico, E J; Neilands, T B; Zambarano, R

    2001-11-01

    Although power analysis is an important component in the planning and implementation of research designs, it is often ignored. Computer programs for performing power analysis are available, but most have limitations, particularly for complex multivariate designs. An SPSS procedure is presented that can be used for calculating power for univariate, multivariate, and repeated measures models with and without time-varying and time-constant covariates. Three examples provide a framework for calculating power via this method: an ANCOVA, a MANOVA, and a repeated measures ANOVA with two or more groups. The benefits and limitations of this procedure are discussed.
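Where the SPSS MANOVA procedure is not at hand, the same planning question can be approached by Monte Carlo simulation; this sketch estimates power for a simple two-group comparison, an assumption standing in for the more complex ANCOVA/MANOVA designs discussed, with the critical value 1.98 approximating the two-sided t threshold:

```python
import numpy as np

def simulated_power(n, effect, n_sims=2000, crit=1.98, seed=5):
    """Monte Carlo power for a two-group comparison: per-group size n,
    standardized effect size `effect`, crit ~ two-sided t critical value."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        a = rng.standard_normal(n) + effect
        b = rng.standard_normal(n)
        sp = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)   # pooled SD
        t = (a.mean() - b.mean()) / (sp * np.sqrt(2 / n))
        hits += abs(t) > crit
    return hits / n_sims

power = simulated_power(n=64, effect=0.5)
print(round(power, 2))                       # ~0.80 for d = 0.5, n = 64/group
```

Replacing the data-generating step with draws from a multivariate normal extends the same loop to repeated measures or covariate-adjusted designs.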

  17. Wavelet-based image coding using saliency map

    Science.gov (United States)

    Vargic, Radoslav; Kučerová, Júlia; Polec, Jaroslav

    2016-11-01

    Visual information is very important in human perception of the surrounding world. During observation of a scene, some image parts are more salient than others. This fact is conventionally addressed using the regions-of-interest approach. We present an approach that captures the saliency information on a per-pixel basis, using one continuous saliency map for the whole image, which is used directly in the lossy image compression algorithm. Although the notion of a region is no longer necessary for the encoding/decoding part of the algorithm, the resulting method can, due to its nature, efficiently emulate large numbers of regions of interest with varying significance. We provide a reference implementation of this approach based on the set partitioning in hierarchical trees (SPIHT) algorithm and show that the proposed method is effective and has the potential to achieve significantly better results than the original SPIHT algorithm. The approach is not limited to SPIHT and can be coupled with, e.g., JPEG 2000 as well.

  18. A Quality by Design approach to investigate tablet dissolution shift upon accelerated stability by multivariate methods.

    Science.gov (United States)

    Huang, Jun; Goolcharran, Chimanlall; Ghosh, Krishnendu

    2011-05-01

    This paper presents the use of experimental design, optimization and multivariate techniques to investigate the root cause of a tablet dissolution shift (slow-down) upon stability and to develop control strategies for a drug product during formulation and process development. The effectiveness and usefulness of these methodologies were demonstrated through two application examples. In both applications, dissolution slow-down was observed during a 4-week accelerated stability test under a 51°C/75%RH storage condition. In Application I, an experimental design was carried out to evaluate the interactions and effects of the design factors on the critical quality attribute (CQA) of dissolution upon stability. The design space was studied by design of experiments (DoE) and multivariate analysis to ensure the desired dissolution profile and minimal dissolution shift upon stability. Multivariate techniques, such as multi-way principal component analysis (MPCA) of the entire dissolution profiles upon stability, were performed to reveal batch relationships and to evaluate the impact of design factors on dissolution. In Application II, an experiment was conducted to study the impact of varying tablet breaking force on dissolution upon stability, utilizing MPCA. It was demonstrated that the use of multivariate methods, defined as Quality by Design (QbD) principles and tools in the ICH Q8 guidance, provides an effective means to achieve a greater understanding of tablet dissolution upon stability.

  19. The Relationship between Burnout and Job Satisfaction among Physical Education Teachers: A Multivariate Approach

    Science.gov (United States)

    Koustelios, Athanasios; Tsigilis, Nikolaos

    2005-01-01

    The present study examined the multivariate relationship between job satisfaction and burnout, experienced by Greek physical education school-based teachers. The sample consisted of 175 physical education teachers, from primary and secondary education. The Maslach Burnout Inventory (Maslach and Jackson, 1986) and the Employee Satisfaction…

  20. Fusion of Thresholding Rules During Wavelet-Based Noisy Image Compression

    Directory of Open Access Journals (Sweden)

    Bekhtin Yury

    2016-01-01

    Full Text Available The new method for combining semisoft thresholding rules during wavelet-based data compression of images with multiplicative noise is suggested. The method chooses the best thresholding rule and the threshold value using the proposed criteria which provide the best nonlinear approximations and take into consideration errors of quantization. The results of computer modeling have shown that the suggested method provides relatively good image quality after restoration in the sense of some criteria such as PSNR, SSIM, etc.
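One ingredient of such schemes, soft thresholding of wavelet details after a log transform that turns multiplicative noise additive, can be sketched with a one-level Haar transform; the piecewise-constant signal, noise level and threshold are illustrative assumptions, and the paper's semisoft rule and quantization-aware threshold selection are not reproduced:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar transform, soft-threshold the details, reconstruct."""
    pairs = x[: len(x) // 2 * 2].reshape(-1, 2)
    a = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)          # approximation
    d = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)          # detail
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft rule
    out = np.empty_like(pairs, dtype=float)
    out[:, 0] = (a + d) / np.sqrt(2)                      # inverse transform
    out[:, 1] = (a - d) / np.sqrt(2)
    return out.ravel()

rng = np.random.default_rng(7)
clean = np.repeat([1.0, 4.0, 2.0, 3.0], 64)               # piecewise-constant row
noisy = clean * np.exp(rng.normal(0, 0.1, clean.size))    # multiplicative noise
# the log transform makes the multiplicative noise additive before thresholding
denoised = np.exp(haar_denoise(np.log(noisy), thresh=0.15))
print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

A semisoft rule interpolates between this soft shrinkage and hard keep/kill decisions, which is where the fusion criteria of the record come in.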

  1. The analysis of VF and VT with wavelet-based Tsallis information measure

    Energy Technology Data Exchange (ETDEWEB)

    Huang Hai [Department of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai (China)]. E-mail: hai_h@sjtu.edu.cn; Xie Hongbo [Department of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai (China); Wang Zhizhong [Department of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai (China)

    2005-03-07

    We undertake the study of ventricular fibrillation (VF) and ventricular tachycardia (VT) by recourse to wavelet-based multiresolution analysis. Compared with conventional Shannon entropy analysis of the signal, we propose a new application of Tsallis entropy analysis. It is shown that, as a criterion for discriminating between ventricular fibrillation and ventricular tachycardia, Tsallis' multiresolution entropy (MRET) provides better discrimination power than Shannon's multiresolution entropy (MRE).
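The multiresolution Tsallis entropy can be sketched from relative wavelet energies per decomposition level; here a plain Haar decomposition, the entropic index q = 2 and synthetic regular vs. irregular signals stand in for the paper's wavelet choice and ECG data:

```python
import numpy as np

def haar_detail_energies(x, levels=4):
    """Energies of the Haar detail coefficients at each decomposition level."""
    a, energies = np.asarray(x, float), []
    for _ in range(levels):
        pairs = a[: len(a) // 2 * 2].reshape(-1, 2)
        d = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)   # detail
        a = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)   # approximation
        energies.append(np.sum(d ** 2))
    return np.array(energies)

def tsallis_mre(x, q=2.0):
    p = haar_detail_energies(x)
    p = p / p.sum()                              # relative wavelet energies
    return (1.0 - np.sum(p ** q)) / (q - 1.0)    # Tsallis entropy, index q

t = np.arange(2048)
rng = np.random.default_rng(6)
vt_like = np.sin(2 * np.pi * t / 64)             # regular, narrow-band rhythm
vf_like = rng.standard_normal(2048)              # irregular, broad-band signal
print(tsallis_mre(vt_like) < tsallis_mre(vf_like))  # True: broad-band spreads energy
```

A narrow-band rhythm concentrates its energy in one level and yields a lower entropy than a broad-band signal, which is the discrimination principle behind the MRET criterion.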

  2. WAVELET-BASED ESTIMATORS OF MEAN REGRESSION FUNCTION WITH LONG MEMORY DATA

    Institute of Scientific and Technical Information of China (English)

    LI Lin-yuan; XIAO Yi-min

    2006-01-01

    This paper provides an asymptotic expansion for the mean integrated squared error (MISE) of nonlinear wavelet-based mean regression function estimators with long memory data. This MISE expansion, when the underlying mean regression function is only piecewise smooth, is the same as the analogous expansion for kernel estimators. However, for kernel estimators, this MISE expansion generally fails if the additional smoothness assumption is absent.

  3. Theory of Wavelet-Based Coarse-Graining Hierarchies for Molecular Dynamics

    Science.gov (United States)

    2017-04-01

    Diffusion wavelet-based decompositions for coarse-graining of polymer chains. Paper presented at: University of Delaware, Applied Mathematics. 2015

  4. Model-free stochastic processes studied with q-wavelet-based informational tools

    Energy Technology Data Exchange (ETDEWEB)

    Perez, D.G. [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso (PUCV), 23-40025 Valparaiso (Chile)]. E-mail: dario.perez@ucv.cl; Zunino, L. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Ciencias Basicas, Facultad de Ingenieria, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: lucianoz@ciop.unlp.edu.ar; Martin, M.T. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina's National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: mtmartin@venus.unlp.edu.ar; Garavaglia, M. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: garavagliam@ciop.unlp.edu.ar; Plastino, A. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina's National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: plastino@venus.unlp.edu.ar; Rosso, O.A. [Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina)]. E-mail: oarosso@fibertel.com.ar

    2007-04-30

    We undertake a model-free investigation of stochastic processes employing q-wavelet-based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in this way, (ii) for special q values the quantifiers are more sensitive than their Shannon counterparts, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.
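
    The idea of wavelet-based quantifiers that generalize the Shannon ones can be sketched with a toy example: compute relative wavelet energies from a Haar decomposition and evaluate a Tsallis q-entropy on them. This is an illustrative stand-in, not the authors' quantifiers; the Haar filter, the level count, and the white-noise test signal are all assumptions.

```python
import numpy as np

def haar_detail_energies(x, levels):
    """Relative energy of Haar detail coefficients at each scale (toy decomposition)."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation (low-pass)
        d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (high-pass)
        energies.append(np.sum(d**2))
        x = a
    e = np.array(energies)
    return e / e.sum()

def tsallis_entropy(p, q):
    """Tsallis q-entropy; tends to the Shannon entropy as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p**q)) / (q - 1.0)

rng = np.random.default_rng(0)
signal = rng.standard_normal(1024)  # white noise: energy spread over scales
p = haar_detail_energies(signal, 5)
print(tsallis_entropy(p, 1.0), tsallis_entropy(p, 2.0))
```

    As q -> 1 the Tsallis expression recovers the Shannon entropy, which is the sense in which q-quantifiers generalize their Shannon counterparts.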

  5. WAVELET BASED CLASSIFICATION OF VOLTAGE SAG, SWELL & TRANSIENTS

    Directory of Open Access Journals (Sweden)

    Vijay Gajanan Neve

    2013-05-01

    Full Text Available When the time localization of the spectral components is needed, the wavelet transform (WT) can be used to obtain the optimal time-frequency representation of the signal. This paper deals with the use of a wavelet transform to detect and analyze voltage sags, voltage swells, and transients. It introduces a voltage disturbance detection approach based on the wavelet transform, identifies voltage disturbances, and discriminates the type of event that caused the disturbance, e.g. either a fault or a capacitor-switching incident. Feasibility of the proposed disturbance detection approach is demonstrated through digital time-domain simulation of a distribution power system using the PSCAD software package, and is implemented in MATLAB. The developed algorithm has been applied to the 14-bus IEEE system to illustrate its application. Results are analyzed.
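
    The detection step can be illustrated with a minimal sketch: undecimated Haar-style detail coefficients of a synthetic 50 Hz waveform spike at the abrupt amplitude changes bounding a sag. This is a toy in NumPy, not the paper's PSCAD/MATLAB implementation; the sampling rate, sag depth, sag timing, and the 3x threshold are assumptions.

```python
import numpy as np

# Synthetic 50 Hz waveform sampled at 12.8 kHz with a 40 % sag; the sag
# boundaries are placed away from zero crossings so the step is visible
fs, f0 = 12800, 50
t = np.arange(0, 0.2, 1 / fs)
v = np.sin(2 * np.pi * f0 * t)
sag = (t >= 0.0853) & (t < 0.1455)
v[sag] *= 0.6

# Undecimated Haar detail coefficients: large only at abrupt waveform changes
d = np.diff(v) / np.sqrt(2)

# Threshold set from a clean leading segment (first 0.05 s)
baseline = np.max(np.abs(d[: int(0.05 * fs)]))
hits = np.where(np.abs(d) > 3 * baseline)[0]
print("disturbance edges near samples:", hits)
```

    The two flagged indices bracket the sag; classifying the event type (fault vs. capacitor switching) would require further features, as in the paper.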

  6. On exploiting wavelet bases in statistical region-based segmentation

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Forchhammer, Søren

    2002-01-01

    Statistical region-based segmentation methods such as the Active Appearance Models establish dense correspondences by modelling variation of shape and pixel intensities in low-resolution 2D images. Unfortunately, for high-resolution 2D and 3D images, this approach is rendered infeasible due to ex...... 9-7 wavelet on cardiac MRIs and human faces show that the segmentation accuracy is minimally degraded at compression ratios of 1:10 and 1:20, respectively....

  7. Construction of Hilbert Transform Pairs of Wavelet Bases and Gabor-like Transforms

    CERN Document Server

    Chaudhury, Kunal Narayan

    2009-01-01

    We propose a novel method for constructing Hilbert transform (HT) pairs of wavelet bases based on a fundamental approximation-theoretic characterization of scaling functions--the B-spline factorization theorem. In particular, starting from well-localized scaling functions, we construct HT pairs of biorthogonal wavelet bases of L^2(R) by relating the corresponding wavelet filters via a discrete form of the continuous HT filter. As a concrete application of this methodology, we identify HT pairs of spline wavelets of a specific flavor, which are then combined to realize a family of complex wavelets that resemble the optimally-localized Gabor function for sufficiently large orders. Analytic wavelets, derived from the complexification of HT wavelet pairs, exhibit a one-sided spectrum. Based on the tensor-product of such analytic wavelets, and, in effect, by appropriately combining four separable biorthogonal wavelet bases of L^2(R^2), we then discuss a methodology for constructing 2D directional-selective complex...
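
    The one-sided spectrum of an analytic wavelet mentioned above can be checked numerically: forming x + i*HT(x) with the standard FFT-domain discrete Hilbert transform suppresses all negative frequencies. This is a sketch of the generic discrete construction, not the paper's B-spline factorization; the Morlet-like test wavelet is an assumption.

```python
import numpy as np

def analytic_signal(x):
    """Discrete analytic signal x + i*HT(x) via the FFT (assumes even length)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0   # double the positive frequencies
    h[N // 2] = 1.0     # keep the Nyquist bin
    return np.fft.ifft(X * h)

# A real, well-localized "wavelet": a modulated Gaussian (Morlet-like)
t = np.linspace(-4, 4, 512, endpoint=False)
psi = np.exp(-t**2) * np.cos(6 * t)
zpsi = analytic_signal(psi)

# The negative-frequency half of the spectrum is (numerically) empty
Z = np.fft.fft(zpsi)
neg = np.abs(Z[512 // 2 + 1:])
print(neg.max() / np.abs(Z).max())
```

    The real part of the analytic signal recovers the original wavelet exactly, so the pair (psi, HT(psi)) behaves like the HT wavelet pairs described in the abstract.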

  8. Assessing heart rate variability through wavelet-based statistical measures.

    Science.gov (United States)

    Wachowiak, Mark P; Hay, Dean C; Johnson, Michel J

    2016-10-01

    Because of its utility in the investigation and diagnosis of clinical abnormalities, heart rate variability (HRV) has been quantified with both time and frequency analysis tools. Recently, time-frequency methods, especially wavelet transforms, have been applied to HRV. In the current study, a complementary computational approach is proposed wherein continuous wavelet transforms are applied directly to ECG signals to quantify time-varying frequency changes in the lower bands. Such variations are compared for resting and lower body negative pressure (LBNP) conditions using statistical and information-theoretic measures, and compared with standard HRV metrics. The latter confirm the expected lower variability in the LBNP condition due to sympathetic nerve activity (e.g. RMSSD: p=0.023; SDSD: p=0.023; LF/HF: p=0.018). Conversely, using the standard Morlet wavelet and a new transform based on windowed complex sinusoids, wavelet analysis of the ECG within the observed range of heart rate (0.5-1.25Hz) exhibits significantly higher variability, as measured by frequency band roughness (Morlet CWT: p=0.041), entropy (Morlet CWT: p=0.001), and approximate entropy (Morlet CWT: p=0.004). Consequently, this paper proposes that, when used with well-established HRV approaches, time-frequency analysis of ECG can provide additional insights into the complex phenomenon of heart rate variability.
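
    A continuous Morlet wavelet transform restricted to the 0.5-1.25 Hz band can be sketched directly in NumPy by direct convolution on a synthetic tone. This is an illustrative implementation, not the authors' code; the sampling rate, the crude normalization, and w0 = 6 are assumptions.

```python
import numpy as np

def morlet_cwt(x, fs, freqs, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet (direct convolution)."""
    out = np.empty((len(freqs), len(x)), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)               # scale (seconds) for this analysis frequency
        tt = np.arange(-4 * s, 4 * s, 1 / fs)  # wavelet support
        psi = np.exp(1j * w0 * tt / s) * np.exp(-(tt / s)**2 / 2)
        psi /= np.sqrt(s * fs)                 # crude L2-style normalization
        out[i] = np.convolve(x, np.conj(psi)[::-1], mode="same")
    return out

fs = 10.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 0.9 * t)               # stand-in for the 0.5-1.25 Hz heart-rate band
freqs = np.linspace(0.5, 1.25, 16)
power = np.abs(morlet_cwt(x, fs, freqs))**2
ridge = freqs[np.argmax(power.mean(axis=1))]
print("dominant frequency:", ridge)
```

    Tracking how the ridge frequency and band power vary over time is the kind of within-band variability the study quantifies with roughness and entropy measures.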

  9. Effects of PAHs and dioxins on the earthworm Eisenia andrei: a multivariate approach for biomarker interpretation.

    Science.gov (United States)

    Sforzini, Susanna; Moore, Michael N; Boeri, Marta; Bencivenga, Mauro; Viarengo, Aldo

    2015-01-01

    In this study, a battery of biomarkers was utilised to evaluate the stress syndrome induced in the earthworm Eisenia andrei by exposure to environmentally realistic concentrations of benzo[a]pyrene (B[a]P) and 2,3,7,8-tetrachlorodibenzo-para-dioxin (TCDD) in OECD soil. The set of tests was then employed to assess the toxicity of field soils contaminated with organic xenobiotic compounds (such as PAHs, dioxins and PCBs). The results highlighted an impairment of immune and metabolic functions and genotoxic damage even in worms exposed to the lower bioavailable concentrations of toxic chemicals. Multivariate analysis of biomarker data showed that all the different contaminated soils had a detrimental effect on the earthworms. A separation between temporal and concentration factors was also evident for the B[a]P and TCDD treatments, and the field-contaminated soils were further differentiated, reflecting their diverse contamination. Multivariate analysis also demonstrated that lysosomal membrane stability can be considered a prognostic indicator of worm health status.

  10. Modeling international stock market contagion using multivariate fractionally integrated APARCH approach

    OpenAIRE

    Zouheir Mighri; Faysal Mansouri

    2014-01-01

    The aim of this article is to examine how the dynamics of correlations between two emerging countries (Brazil and Mexico) and the US evolved from January 2003 to December 2013. The main contribution of this study is to explore whether the plunging stock market in the US, in the aftermath of global financial crisis (2007 - 2009), exerts contagion effects on emerging stock markets. To this end, we rely on a multivariate fractionally integrated asymmetric power autoregressive conditional heteros...

  11. Multivariate Analysis Approach to the Serum Peptide Profile of Morbidly Obese Patients

    Directory of Open Access Journals (Sweden)

    M. Agostini

    2013-01-01

    Full Text Available Background: Obesity is currently epidemic in many countries worldwide and is strongly related to diabetes and cardiovascular disease. Mass spectrometry, in particular matrix-assisted laser desorption/ionization time of flight (MALDI-TOF), is currently used for detecting different patterns of expressed proteins. This study investigated the differences in low molecular weight (LMW) peptide profiles between obese and normal-weight subjects in combination with multivariate statistical analysis.

  12. A wavelet-based quadtree driven stereo image coding

    Science.gov (United States)

    Bensalma, Rafik; Larabi, Mohamed-Chaker

    2009-02-01

    In this work, a new stereo image coding technique is proposed. The new approach integrates the coding of the residual image with that of the disparity map, the latter computed in the wavelet transform domain. The motivation behind using this transform is that it imitates some properties of the human visual system (HVS), particularly its decomposition into perceptual channels; using the wavelet transform therefore allows for better preservation of perceptual image quality. In order to estimate the disparity map, we used a quadtree segmentation in each wavelet frequency band. This segmentation has the advantage of minimizing the entropy. Dyadic squares in the subbands of the target image that are not matched with any in the reference image constitute the residual, which is coded using an arithmetic codec. The obtained results are evaluated using the SSIM and PSNR criteria.

  13. Image superresolution of cytology images using wavelet based patch search

    Science.gov (United States)

    Vargas, Carlos; García-Arteaga, Juan D.; Romero, Eduardo

    2015-01-01

    Telecytology is a new research area that holds the potential of significantly reducing the number of deaths due to cervical cancer in developing countries. This work presents a novel super-resolution technique that couples high- and low-frequency information in order to reduce the bandwidth consumption of cervical image transmission. The proposed approach starts by decomposing the high-resolution images into wavelets and transmitting only the lower-frequency coefficients. The transmitted coefficients are used to reconstruct an image of the original size. Additional details are added by iteratively replacing patches of the wavelet-reconstructed image with equivalent high-resolution patches from a previously acquired image database. Finally, the original transmitted low-frequency coefficients are used to correct the final image. Results show a higher signal-to-noise ratio for the proposed method over simply discarding high-frequency wavelet coefficients or directly replacing down-sampled patches from the image database.
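
    The decompose-and-transmit step can be sketched with a one-level 2D Haar transform: the sender keeps only the approximation (low-frequency) band, and the receiver reconstructs with the detail bands zeroed. The Haar filter and the random test image are assumptions standing in for the wavelet and imagery used in the paper.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar analysis: returns LL, LH, HL, HH subbands."""
    a = (img[0::2] + img[1::2]) / np.sqrt(2)   # rows: low-pass
    d = (img[0::2] - img[1::2]) / np.sqrt(2)   # rows: high-pass
    LL = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    LH = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    HL = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    HH = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return LL, LH, HL, HH

def haar2d_inverse(LL, LH, HL, HH):
    """Inverse of haar2d (exact up to roundoff)."""
    a = np.empty((LL.shape[0], 2 * LL.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2] = (LL + LH) / np.sqrt(2)
    a[:, 1::2] = (LL - LH) / np.sqrt(2)
    d[:, 0::2] = (HL + HH) / np.sqrt(2)
    d[:, 1::2] = (HL - HH) / np.sqrt(2)
    img = np.empty((2 * a.shape[0], a.shape[1]))
    img[0::2] = (a + d) / np.sqrt(2)
    img[1::2] = (a - d) / np.sqrt(2)
    return img

rng = np.random.default_rng(1)
img = rng.random((64, 64))
LL, LH, HL, HH = haar2d(img)
# "Transmit" only LL; the receiver reconstructs with the detail bands zeroed
approx = haar2d_inverse(LL, np.zeros_like(LH), np.zeros_like(HL), np.zeros_like(HH))
exact = haar2d_inverse(LL, LH, HL, HH)
print("exact recon error:", np.abs(exact - img).max())
```

    The patch-search stage of the paper then restores the detail lost in `approx` from a previously acquired database, which this sketch does not attempt.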

  14. A Hybrid ICA-SVM Approach for Determining the Quality Variables at Fault in a Multivariate Process

    Directory of Open Access Journals (Sweden)

    Yuehjen E. Shao

    2012-01-01

    Full Text Available The monitoring of a multivariate process with the use of multivariate statistical process control (MSPC) charts has received considerable attention. However, in practice, the use of an MSPC chart typically encounters a difficulty: determining which quality variable, or which set of quality variables, is responsible for the generation of the signal. This study proposes a hybrid scheme composed of independent component analysis (ICA) and a support vector machine (SVM) to determine the faulty quality variables when a step-change disturbance exists in a multivariate process. The proposed hybrid ICA-SVM scheme initially applies ICA to the Hotelling T2 MSPC chart to generate independent components (ICs). The hidden information about the faulty quality variables can be identified in these ICs. The ICs then serve as the input variables of the SVM classifier for performing the classification process. The performance of various process designs is investigated and compared with a typical classification method. Using the proposed approach, the faulty quality variables in a multivariate process can be accurately and reliably determined.
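
    The difficulty the abstract starts from can be seen in a small sketch of the Hotelling T^2 chart itself: the statistic signals a step change but does not say which variable shifted, which is exactly the gap the ICA-SVM scheme addresses. This is a toy in NumPy; the dimension, the 3-sigma shift, and the chi-square control limit are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n_ref = 4, 500

# Phase-I reference data from the in-control process
ref = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n_ref)
mu, S = ref.mean(axis=0), np.cov(ref, rowvar=False)
S_inv = np.linalg.inv(S)

def t2(x):
    """Hotelling T^2 statistic for one multivariate observation."""
    d = x - mu
    return d @ S_inv @ d

# New observations: a 3-sigma step change hits variable 2 only
incontrol = rng.multivariate_normal(np.zeros(p), np.eye(p), size=200)
fault = incontrol.copy()
fault[:, 2] += 3.0

ucl = 18.47  # approx. chi-square(4) 0.999 quantile, used as a stand-in control limit
rate_ic = np.mean([t2(x) > ucl for x in incontrol])
rate_f = np.mean([t2(x) > ucl for x in fault])
print(rate_ic, rate_f)
# T^2 flags the fault far more often, but not WHICH variable shifted --
# recovering that is the job of the ICA-SVM stage.
```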

  15. Target Identification Using Harmonic Wavelet Based ISAR Imaging

    Science.gov (United States)

    Shreyamsha Kumar, B. K.; Prabhakar, B.; Suryanarayana, K.; Thilagavathi, V.; Rajagopal, R.

    2006-12-01

    A new approach has been proposed to reduce the computations involved in ISAR imaging, which uses a harmonic wavelet (HW) based time-frequency representation (TFR). Since the HW-based TFR falls into the category of nonparametric time-frequency (T-F) analysis tools, it is computationally efficient compared to parametric T-F analysis tools such as the adaptive joint time-frequency transform (AJTFT), the adaptive wavelet transform (AWT), and the evolutionary AWT (EAWT). Further, the performance of the proposed method of ISAR imaging is compared with ISAR imaging by other nonparametric T-F analysis tools such as the short-time Fourier transform (STFT) and the Choi-Williams distribution (CWD). In ISAR imaging, the use of the HW-based TFR provides similar or better results with a significant (92%) computational advantage compared to that obtained by the CWD. The ISAR images thus obtained are identified using a neural network-based classification scheme with a feature set invariant to translation, rotation, and scaling.

  16. Wavelet-based pavement image compression and noise reduction

    Science.gov (United States)

    Zhou, Jian; Huang, Peisen S.; Chiang, Fu-Pen

    2005-08-01

    For any automated distress inspection system, typically a huge number of pavement images are collected. Use of an appropriate image compression algorithm can save disk space, reduce the saving time, increase the inspection distance, and increase the processing speed. In this research, a modified EZW (Embedded Zero-tree Wavelet) coding method, which is an improved version of the widely used EZW coding method, is proposed. This method, unlike the two-pass approach used in the original EZW method, uses only one pass to encode both the coordinates and magnitudes of wavelet coefficients. An adaptive arithmetic encoding method is also implemented to encode four symbols assigned by the modified EZW into binary bits. By applying a thresholding technique to terminate the coding process, the modified EZW coding method can compress the image and reduce noise simultaneously. The new method is much simpler and faster. Experimental results also show that the compression ratio was increased one and one-half times compared to the EZW coding method. The compressed and de-noised data can be used to reconstruct wavelet coefficients for off-line pavement image processing such as distress classification and quantification.
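
    The thresholding idea behind the modified coder (keep only coefficients above a magnitude threshold, so that compression and noise reduction happen at once) can be illustrated in one dimension. This is a simplified stand-in, not the modified EZW coder itself; the Haar filter, the threshold value, and the piecewise-constant "crack" profile are assumptions.

```python
import numpy as np

def haar_fwd(x):
    """Full multi-level Haar transform of a length-2^k signal."""
    x = np.asarray(x, dtype=float).copy()
    coeffs = []
    while len(x) > 1:
        a = (x[0::2] + x[1::2]) / np.sqrt(2)
        d = (x[0::2] - x[1::2]) / np.sqrt(2)
        coeffs.append(d)
        x = a
    coeffs.append(x)          # final approximation
    return coeffs

def haar_inv(coeffs):
    x = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(d))
        out[0::2] = (x + d) / np.sqrt(2)
        out[1::2] = (x - d) / np.sqrt(2)
        x = out
    return x

rng = np.random.default_rng(3)
n = 1024
clean = np.sign(np.sin(2 * np.pi * np.arange(n) / 256))  # piecewise-constant profile
noisy = clean + 0.2 * rng.standard_normal(n)

coeffs = haar_fwd(noisy)
thr = 0.6
kept = 0
for d in coeffs[:-1]:
    keep = np.abs(d) > thr
    kept += keep.sum()
    d[...] = np.where(keep, d, 0.0)   # zero small (mostly noise) coefficients
denoised = haar_inv(coeffs)

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
print(kept, err_noisy, err_denoised)
```

    Only a small fraction of coefficients survive the threshold (compression), yet the reconstruction error against the clean signal drops (denoising), mirroring the simultaneous effect described in the abstract.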

  17. Improved successive refinement for wavelet-based embedded image compression

    Science.gov (United States)

    Creusere, Charles D.

    1999-10-01

    In this paper we consider a new form of successive coefficient refinement which can be used in conjunction with embedded compression algorithms like Shapiro's EZW (Embedded Zerotree Wavelet) and Said & Pearlman's SPIHT (Set Partitioning in Hierarchical Trees). Using the conventional refinement process, the approximation of a coefficient that was earlier determined to be significant is refined by transmitting one of two symbols--an `up' symbol if the actual coefficient value is in the top half of the current uncertainty interval or a `down' symbol if it is in the bottom half. In the modified scheme developed here, we transmit one of three symbols instead--`up', `down', or `exact'. The new `exact' symbol tells the decoder that its current approximation of a wavelet coefficient is `exact' to the level of precision desired. By applying this scheme in earlier work to lossless embedded compression (also called lossy/lossless compression), we achieved significant reductions in encoder and decoder execution times with no adverse impact on compression efficiency. These excellent results for lossless systems have inspired us to adapt this refinement approach to lossy embedded compression. Unfortunately, the results we have achieved thus far for lossy compression are not as good.
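
    The three-symbol refinement can be sketched in a few lines: the encoder emits `up`/`down` bisection symbols and an `exact` symbol that terminates refinement early once the decoder's midpoint approximation is within the desired precision. This is a sketch of the mechanism as described, not the authors' coder; the interval and precision values are assumptions.

```python
def refine_encode(value, lo, hi, precision):
    """Refine `value` within [lo, hi): emit 'up'/'down', or 'exact' and stop early."""
    symbols = []
    while hi - lo > precision:
        mid = (lo + hi) / 2.0
        if abs(value - mid) <= precision / 2.0:
            symbols.append("exact")      # current approximation is already good enough
            break
        if value >= mid:
            symbols.append("up")
            lo = mid
        else:
            symbols.append("down")
            hi = mid
    return symbols

def refine_decode(symbols, lo, hi):
    """Replay the symbol stream; 'exact' pins the value at the current midpoint."""
    for s in symbols:
        mid = (lo + hi) / 2.0
        if s == "exact":
            return mid
        if s == "up":
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

syms = refine_encode(37.0, 0.0, 64.0, 0.5)
print(syms, refine_decode(syms, 0.0, 64.0))
```

    When the value happens to sit at a midpoint, the stream ends in `exact` and later refinement passes can skip the coefficient entirely, which is the source of the speedups reported for the lossless case.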

  18. Wavelet-based Poisson rate estimation using the Skellam distribution

    Science.gov (United States)

    Hirakawa, Keigo; Baqai, Farhan; Wolfe, Patrick J.

    2009-02-01

    Owing to the stochastic nature of discrete processes such as photon counts in imaging, real-world data measurements often exhibit heteroscedastic behavior. In particular, time series components and other measurements may frequently be assumed to be non-iid Poisson random variables, whose rate parameter is proportional to the underlying signal of interest; witness the literature in digital communications, signal processing, astronomy, and magnetic resonance imaging applications. In this work, we show that certain wavelet and filterbank transform coefficients corresponding to vector-valued measurements of this type are distributed as sums and differences of independent Poisson counts, taking the so-called Skellam distribution. While exact estimates rarely admit analytical forms, we present Skellam mean estimators under both frequentist and Bayes models, as well as computationally efficient approximations and shrinkage rules, that may be interpreted as Poisson rate estimation performed in certain wavelet/filterbank transform domains. This indicates a promising approach for denoising of Poisson counts in the above-mentioned applications.
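
    The basic Skellam fact used above, that a difference of independent Poisson counts has mean lam1 - lam2 and variance lam1 + lam2, is easy to verify by simulation, along with the simple moment-matching rate estimates it suggests. This is an illustrative check only, not the paper's frequentist/Bayes estimators; the rates are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
lam1, lam2, n = 7.0, 3.0, 200_000

# Difference of two independent Poisson counts follows a Skellam distribution
k = rng.poisson(lam1, n) - rng.poisson(lam2, n)

# Theory: mean = lam1 - lam2, variance = lam1 + lam2, so the two rates can be
# recovered from the first two sample moments
lam1_hat = (k.var() + k.mean()) / 2.0
lam2_hat = (k.var() - k.mean()) / 2.0
print(k.mean(), k.var(), lam1_hat, lam2_hat)
```

    In the wavelet setting, difference-type filter outputs of Poisson data play the role of `k`, which is why rate estimation can be carried out in the transform domain.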

  19. Wavelet-based texture image classification using vector quantization

    Science.gov (United States)

    Lam, Eric P.

    2007-02-01

    Classification of image segments on textures can be helpful for target recognition. Sometimes target cueing is performed before target recognition. Textures are sometimes used to cue an image processor to a potential region of interest. In certain imaging sensors, such as those used in synthetic aperture radar, textures may be abundant. The textures may be caused by the object material or by speckle noise. Even speckle noise can create the illusion of texture, which must be compensated for in image pre-processing. In this paper, we discuss how to perform texture classification while constraining the number of wavelet packet node decompositions. The new approach performs a two-channel wavelet decomposition. Comparing the strength of each new subband with the others at the same level of the wavelet packet determines when to stop further decomposition. This type of decomposition is performed recursively. Once the decomposition stops, the structure of the packet is stored in a data structure. Using the information from the data structure, dominating channels are extracted. These are defined as paths from the root of the packet to the leaves with the highest strengths. The list of dominating channels is used to train a learning vector quantization neural network.
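
    The recursive decompose-and-compare procedure can be sketched as follows: split a band with a two-channel Haar filter, and recurse only while one child band clearly dominates its sibling in energy. The stopping rule here is a guessed stand-in for the paper's subband-strength comparison; the Haar filter, the 4x energy ratio, and the minimum band length are all assumptions.

```python
import numpy as np

def haar_split(x):
    """One two-channel (low/high) Haar split."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def packet_decompose(x, path="", min_len=32, ratio=4.0):
    """Recursive wavelet-packet split; recurse only while one child band
    clearly dominates its sibling in energy."""
    node = {"path": path, "energy": float(np.sum(x**2))}
    low, high = haar_split(x)
    e_low, e_high = np.sum(low**2), np.sum(high**2)
    dominated = max(e_low, e_high) > ratio * (min(e_low, e_high) + 1e-12)
    if len(low) >= min_len and dominated:
        return ([node] + packet_decompose(low, path + "L", min_len, ratio)
                       + packet_decompose(high, path + "H", min_len, ratio))
    return [node]                      # stop: siblings of comparable strength

rng = np.random.default_rng(5)
n = 1024
x = np.sin(2 * np.pi * np.arange(n) / 64) + 0.05 * rng.standard_normal(n)
nodes = packet_decompose(x)
print([nd["path"] for nd in nodes])
```

    The stored paths (e.g. "LL", "LLH") are the packet structure from which the dominating channels would be extracted as classifier features.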

  20. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    Science.gov (United States)

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016.

  1. A Weighting Sequence Approach to the Analysis and Design of Multivariable Control Systems.

    Science.gov (United States)

    1987-01-01

    Bradford; 1975. [WEL1] Wellstead, P.E., Prager, D., and Zanker, P.; "Pole assignment self-tuning regulator" Proc IEE, Vol 126(8), pp 781-787; 1979. "...for a class of multivariable systems" Automatica, Vol 15(2), pp 209-215; 1979. [CAD1] Cadzow, J.A. and Martens, H.R.; Discrete-time and Computer... and Gawthrop, P.J.; "Self-tuning control" Proc IEE, Vol 126(6), pp 633-640; 1979. [CLA3] Clarke, D.W., Mohtadi, C., and Tuffs, P.S.; "Generalized...

  2. The mass transfer approach to multivariate discrete first order stochastic dominance

    DEFF Research Database (Denmark)

    Østerdal, Lars Peter Raahave

    2010-01-01

    A fundamental result in the theory of stochastic dominance states that first order dominance between two finite multivariate distributions is equivalent to the property that the one can be obtained from the other by shifting probability mass from one outcome to another that is worse a finite number of times. This paper provides a new and elementary proof of that result by showing that, starting with an arbitrary system of mass transfers, whenever the resulting distribution is first order dominated one can gradually rearrange transfers, according to a certain decentralized procedure, and obtain a system of transfers all shifting mass to outcomes that are worse.
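
    In the univariate special case the equivalence is easy to make concrete: a greedy pass from the best outcome downwards either produces a finite system of one-step-downward mass transfers turning p into q, or certifies that p does not first-order dominate q. This is an illustrative sketch only; the multivariate case treated in the paper requires the decentralized rearrangement procedure.

```python
def fosd_transfers(p, q, tol=1e-12):
    """If p first-order dominates q (outcomes indexed worst -> best), return
    transfers (src, dst, mass) with dst = src - 1 that turn p into q; else None."""
    cur = list(p)
    transfers = []
    for src in range(len(p) - 1, 0, -1):   # sweep from the best outcome down
        surplus = cur[src] - q[src]
        if surplus < -tol:
            return None                    # p carries too little mass this high up
        if surplus > tol:
            cur[src] -= surplus
            cur[src - 1] += surplus        # shift the surplus one outcome worse
            transfers.append((src, src - 1, surplus))
    if abs(cur[0] - q[0]) > 1e-9:
        return None
    return transfers

p = [0.1, 0.2, 0.3, 0.4]   # first-order dominates q below
q = [0.2, 0.3, 0.3, 0.2]
print(fosd_transfers(p, q))
print(fosd_transfers(q, p))   # None: q does not dominate p
```

    The surplus at each step equals the gap between the two survival functions, so the greedy pass succeeds exactly when the usual CDF criterion for univariate first-order dominance holds.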

  3. Search for Heavy Stable Charged Particles at $\\sqrt{s}$ = 13 TeV Utilizing a Multivariate Approach

    CERN Document Server

    Ackert, Andrew Kenjiro

    Heavy stable charged particles (HSCPs) have been searched for at the Large Hadron Collider since its initial data taking in 2010. The search for heavy stable charged particles provides a means of directly probing the new-physics realm, as they would produce a detector signature unlike any particle discovered to date. The goal of this research is to investigate an idea that was introduced in the later stages of the 2010-2012 data-taking period. Rather than utilizing the current tight selection on the calculated particle mass, the hypothesis is that by incorporating a multivariate approach, specifically an artificial neural network, the remaining selection criteria could be loosened, allowing for a greater signal acceptance while maintaining acceptable background rejection via the multivariate discriminator from the artificial neural network. The increase in signal acceptance and the retention or increase in background rejection increases the discovery potential for HSCPs, and as a secondary objective calculates improved limit...

  4. A Bayesian approach to joint analysis of multivariate longitudinal data and parametric accelerated failure time.

    Science.gov (United States)

    Luo, Sheng

    2014-02-20

    Impairment caused by Parkinson's disease (PD) is multidimensional (e.g., sensoria, functions, and cognition) and progressive. Its multidimensional nature precludes a single outcome to measure disease progression. Clinical trials of PD use multiple categorical and continuous longitudinal outcomes to assess the treatment effects on overall improvement. A terminal event such as death or dropout can stop the follow-up process. Moreover, the time to the terminal event may be dependent on the multivariate longitudinal measurements. In this article, we consider a joint random-effects model for the correlated outcomes. A multilevel item response theory model is used for the multivariate longitudinal outcomes and a parametric accelerated failure time model is used for the failure time because of the violation of proportional hazard assumption. These two models are linked via random effects. The Bayesian inference via MCMC is implemented in 'BUGS' language. Our proposed method is evaluated by a simulation study and is applied to DATATOP study, a motivating clinical trial to determine if deprenyl slows the progression of PD. © 2013 The authors. Statistics in Medicine published by John Wiley & Sons, Ltd.

  5. Multivariate analysis of prognostic factors for salvage nasopharyngectomy via the maxillary swing approach.

    Science.gov (United States)

    Chan, Jimmy Yu Wai; To, Victor Shing Howe; Chow, Velda Ling Yu; Wong, Stanley Thian Sze; Wei, William Ignace

    2014-07-01

    The purpose of this study was to investigate the prognostic factors for salvage nasopharyngectomy. A retrospective review was conducted on maxillary swing nasopharyngectomy performed between 1998 and 2010. Univariate and multivariate analyses identified prognostic factors affecting actuarial local tumor control and overall survival. The median follow-up duration was 52 months. Among the 268 patients, 79.1% had clear resection margins. The 5-year actuarial local tumor control and overall survival was 74% and 62.1%, respectively. On multivariate analysis, tumor size, resection margin status, and gross tumor in the sphenoid sinus were independent prognostic factors for local tumor control. For overall survival, resection margin status, synchronous cervical nodal recurrence, and cavernous sinus invasion had a negative influence on overall survival after surgery. Extent of nasopharyngectomy should be tailored to the individual tumor to achieve clear resection margins. Cavernous sinus invasion is associated with poor survival outcome, and detailed counseling and meticulous surgical planning is crucial in such circumstances. Copyright © 2014 Wiley Periodicals, Inc.

  6. A new approach for the quantification of synchrony of multivariate non-stationary psychophysiological variables during emotion eliciting stimuli

    Directory of Open Access Journals (Sweden)

    Augustin Kelava

    2015-01-01

    Full Text Available Emotion eliciting situations are accompanied by reactions on multiple response variables on subjective, physiological, and behavioral levels. The quantification of the overall simultaneous synchrony of psychophysiological reactions plays a major role in emotion theories and has received increasing attention in recent research. From a psychometric perspective, the reactions represent multivariate non-stationary intra-individual time series. In this paper, we present a new time-frequency based latent variable approach for the quantification of the synchrony of the responses. The approach is applied to empirical data collected during an emotion eliciting situation. The results are compared with a complementary inter-individual approach of Hsieh et al. (2011). Finally, the proposed approach is discussed in the context of emotion theories, and possible future applications and limitations are provided.

  7. Evaluation of multivariate surveillance

    OpenAIRE

    Frisén, Marianne; Andersson, Eva; Schiöler, Linus

    2009-01-01

    Multivariate surveillance is of interest in many areas such as industrial production, bioterrorism detection, spatial surveillance, and financial transaction strategies. Some of the suggested approaches to multivariate surveillance have been multivariate counterparts to the univariate Shewhart, EWMA, and CUSUM methods. Our emphasis is on the special challenges of evaluating multivariate surveillance methods. Some new measures are suggested and the properties of several measures are demonstrat...

  8. The role of middle-class status in payday loan borrowing: a multivariate approach.

    Science.gov (United States)

    Lim, Younghee; Bickham, Trey; Broussard, Julia; Dinecola, Cassie M; Gregory, Alethia; Weber, Brittany E

    2014-10-01

    Payday loans refer to small-dollar, high-interest, short-term loans usually extended to lower-income consumers. Despite much research to the contrary, the payday loan industry asserts that it primarily serves middle-class Americans. This article discusses the authors' investigation of the industry's claim, by analyzing data from a U.S. bankruptcy court serving a Southern district. Results of the multivariate binary logistic regression analysis showed that, controlling for various sociodemographic and economic variables, two middle-class indicators--home-ownership and annual income at or greater than the median income--are associated with a decreased likelihood of using payday loans. The article concludes with a discussion of the implications of the results for social work practice and advocacy in regard to financial capability, particularly asset development, income maintenance, and payday loan regulation.

  9. Predictors of neurobehavioral symptoms in a university population: a multivariate approach using a postconcussive symptom questionnaire.

    Science.gov (United States)

    Ettenhofer, Mark L; Reinhardt, Lindsay E; Barry, David M

    2013-10-01

    Several factors have been linked to severity of postconcussive-type (neurobehavioral) symptoms. In this study, predictors of neurobehavioral symptoms were examined using multivariate methods to determine the relative importance of each. Data regarding demographics, symptoms, current alcohol use, history of traumatic brain injury (TBI), orthopedic injuries, and psychiatric/developmental diagnoses were collected via questionnaire from 3027 university students. The most prominent predictors of symptoms were gender, history of depression or anxiety, history of attention-deficit/hyperactivity disorder or learning disability diagnosis, and frequency of alcohol use. Prior mild TBI was significantly related to overall symptoms, but this effect was small in comparison to other predictors. These results provide further evidence that neurobehavioral symptoms are multi-determined phenomena, and highlight the importance of psychiatric comorbidity, demographic factors, and health behaviors to neurobehavioral symptom presentation after mild TBI.

  10. CAUSAL RELATIONSHIP BETWEEN FOSSIL FUEL CONSUMPTION AND ECONOMIC GROWTH IN JAPAN: A MULTIVARIATE APPROACH

    Directory of Open Access Journals (Sweden)

    Hazuki Ishida

    2013-01-01

    Full Text Available This paper explores whether Japanese economy can continue to grow without extensive dependence on fossil fuels. The paper conducts time series analysis using a multivariate model of fossil fuels, non-fossil energy, labor, stock and GDP to investigate the relationship between fossil fuel consumption and economic growth in Japan. The results of cointegration tests indicate long-run relationships among the variables. Using a vector error-correction model, the study reveals bidirectional causality between fossil fuels and GDP. The results also show that there is no causal relationship between non-fossil energy and GDP. The results of cointegration analysis, Granger causality tests, and variance decomposition analysis imply that non-fossil energy may not necessarily be able to play the role of fossil fuels. Japan cannot seem to realize both continuous economic growth and the departure from dependence on fossil fuels. Hence, growth-oriented macroeconomic policies should be re-examined.

  11. Behavioral event occurrence differs between behavioral states in Sotalia guianensis (Cetartiodactyla: Delphinidae) dolphins: a multivariate approach

    Directory of Open Access Journals (Sweden)

    Rodrigo H. Tardin

    2014-02-01

    Full Text Available Difficulties in quantifying behavioral events can cause loss of information about cetacean behavior, especially for behaviors whose functions are still debated. The lack of knowledge is greater for South American species such as Sotalia guianensis (Van Benédén, 1864). Our objective was to contextualize the behavioral events inside behavioral states using a Permutational Multivariate Analysis of Variance (MANOVA). Three events occurred in the Feeding, Socio-Sexual and Travelling states (Porpoising, Side flop, Tail out dive), and five events occurred in the Feeding and Travelling states (Back flop, Horizontal jump, Lobtail, Spy-hop, Partial flop ahead). Three events (Belly exposure, Club, and Heading) occurred exclusively in the Socio-sexual state. Partial Back flop and Head flop occurred exclusively in the Feeding state. For the events that occurred in multiple states, we observed that some events occurred more frequently in one of the states (p < 0.001), such as Lobtail, Tail out dive, Horizontal jump, Partial flop ahead and Side flop. Our multivariate analysis, which separated Socio-sexual behavior from Feeding and Travelling, showed that the abundance of behavioral events differs between states. This differentiation indicates that some events are associated with specific behavioral states. Almost 40% of the events observed were performed exclusively in one state, which indicates a high specialization for some events. Proper discrimination and contextualization of behavioral events may be efficient tools to better understand dolphin behaviors. Similar studies in other habitats and with other species will help build a broader scenario to aid our understanding of the functions of dolphin behavioral events.

  12. Strategies to optimize monitoring schemes of recreational waters from Salta, Argentina: a multivariate approach

    Science.gov (United States)

    Gutiérrez-Cacciabue, Dolores; Teich, Ingrid; Poma, Hugo Ramiro; Cruz, Mercedes Cecilia; Balzarini, Mónica; Rajal, Verónica Beatriz

    2014-01-01

    Several recreational surface waters in Salta, Argentina, were selected to assess their quality. Seventy percent of the measurements exceeded at least one of the limits established by international legislation, making the waters unsuitable for their intended use. To interpret these complex data, multivariate techniques were applied. The Arenales River, due to the variability observed in the data, was divided in two: upstream and downstream, representing low- and high-pollution sites, respectively; Cluster Analysis supported that differentiation. Arenales River downstream and Campo Alegre Reservoir were the most different environments, and the Vaqueros and La Caldera Rivers were the most similar. Canonical Correlation Analysis allowed exploration of correlations between physicochemical and microbiological variables, except in both parts of the Arenales River, and Principal Component Analysis revealed relationships among the 9 measured variables in all aquatic environments. Variable loadings showed that Arenales River downstream was impacted by industrial and domestic activities, Arenales River upstream was affected by agricultural activities, Campo Alegre Reservoir was disturbed by anthropogenic and ecological effects, and the La Caldera and Vaqueros Rivers were influenced by recreational activities. Discriminant Analysis allowed identification of the subgroups of variables responsible for seasonal and spatial variations. Enterococcus, dissolved oxygen, conductivity, E. coli, pH, and fecal coliforms are sufficient to spatially describe the quality of the aquatic environments. Regarding seasonal variations, dissolved oxygen, conductivity, fecal coliforms, and pH can be used to describe water quality during the dry season, and dissolved oxygen, conductivity, total coliforms, E. coli, and Enterococcus during the wet season. Thus, the use of multivariate techniques allowed optimizing monitoring tasks and minimizing the costs involved. PMID:25190636
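The Principal Component Analysis step used in studies like this one reduces to an SVD of the standardized station-by-variable matrix. A minimal sketch on synthetic data — the variable names mimic the abstract's but every number is invented, and a single latent "pollution gradient" stands in for the real hydrological structure:

```python
import numpy as np

def pca(X, n_components=2):
    """PCA on standardized variables via SVD.
    Returns station scores, variable loadings, and explained-variance ratios."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components].T
    return scores, loadings, explained

# Synthetic stations x variables (conductivity, dissolved oxygen, coliforms, pH)
rng = np.random.default_rng(0)
pollution = rng.normal(size=40)                    # latent pollution gradient
X = np.column_stack([
    1.0 * pollution + 0.3 * rng.normal(size=40),   # conductivity: rises with pollution
    -0.8 * pollution + 0.3 * rng.normal(size=40),  # dissolved oxygen: falls
    0.9 * pollution + 0.3 * rng.normal(size=40),   # fecal coliforms: rise
    rng.normal(size=40),                           # pH: unrelated here
])
scores, loadings, explained = pca(X)
print(explained)   # the first component captures the pollution gradient
```

High-magnitude loadings on the first component identify which variables drive the dominant gradient, which is how the abstract attributes each zone's loadings to industrial, agricultural, or recreational influences.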

  13. Multivariate drought index: An information theory based approach for integrated drought assessment

    Science.gov (United States)

    Rajsekhar, Deepthi; Singh, Vijay. P.; Mishra, Ashok. K.

    2015-07-01

    Most existing drought indices are based on a single variable (e.g. precipitation) or a combination of two variables (e.g., precipitation and streamflow). This may not be sufficient for reliable quantification of the existing drought condition. A region might be experiencing only a single type of drought at times, but multiple drought types affecting a region simultaneously is quite common too. For a comprehensive representation, it is better to consider all the variables that lead to different physical forms of drought, such as meteorological, hydrological, and agricultural droughts. Therefore, we propose a multivariate drought index (MDI) that utilizes information from hydroclimatic variables, including precipitation, runoff, evapotranspiration and soil moisture as indicator variables, thus accounting for all the physical forms of drought. Entropy theory was utilized to develop the proposed index, leading to the smallest set of features that maximally preserves the information of the input data set. MDI was then compared with the Palmer drought severity index (PDSI) for all climate regions within Texas for the period 1950-2012, with particular attention to the two major drought occurrences in Texas, viz. the droughts of 1950-1957 and 2010-2011. The proposed MDI was found to represent drought conditions well, owing to its multivariate, multiscalar, and nonlinear properties. To help the user choose the right time scale for further analysis, entropy maps of MDI at different time scales were used as a guideline: the MDI time scale with the highest entropy value may be chosen, since a higher entropy indicates a higher information content.
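The scale-selection heuristic in the closing sentences — pick the time scale at which the index has the highest Shannon entropy — can be sketched directly. The moving-average aggregation, the bin count, and the synthetic monthly anomalies below are illustrative assumptions, not the paper's MDI construction.

```python
import numpy as np

def shannon_entropy(series, bins=10):
    """Shannon entropy (nats) of a series' histogram distribution."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def entropy_by_scale(series, scales=(1, 3, 6, 12)):
    """Entropy of the moving-average index at several time scales (months)."""
    out = {}
    for w in scales:
        smoothed = np.convolve(series, np.ones(w) / w, mode="valid")
        out[w] = shannon_entropy(smoothed)
    return out

rng = np.random.default_rng(0)
index = rng.normal(size=720)          # 60 years of synthetic monthly anomalies
ent = entropy_by_scale(index)
best = max(ent, key=ent.get)          # scale with the highest information content
print(ent, best)
```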

  14. BioIMAX: A Web 2.0 approach for easy exploratory and collaborative access to multivariate bioimage data

    Directory of Open Access Journals (Sweden)

    Khan Michael

    2011-07-01

    Full Text Available Abstract Background Innovations in biological and biomedical imaging produce complex high-content and multivariate image data. For decision-making and generation of hypotheses, scientists need novel information technology tools that enable them to visually explore and analyze the data and to discuss and communicate results or findings with collaborating experts in various places. Results In this paper, we present a novel Web 2.0 approach, BioIMAX, for the collaborative exploration and analysis of multivariate image data. It combines the web's collaboration and distribution architecture with the interface interactivity and computational power of desktop applications, an approach often called a rich internet application. Conclusions BioIMAX allows scientists to discuss and share data or results with collaborating experts and to visualize, annotate, and explore multivariate image data within one web-based platform from any location via a standard web browser, requiring only a username and a password. BioIMAX can be accessed at http://ani.cebitec.uni-bielefeld.de/BioIMAX with the username "test" and the password "test1" for testing purposes.

  15. A multivariate approach to correlate bacterial surface properties to biofilm formation by lipopolysaccharide mutants of Pseudomonas aeruginosa.

    Science.gov (United States)

    Ruhal, Rohit; Antti, Henrik; Rzhepishevska, Olena; Boulanger, Nicolas; Barbero, David R; Wai, Sun Nyunt; Uhlin, Bernt Eric; Ramstedt, Madeleine

    2015-03-01

    Bacterial biofilms are involved in various medical infections, and for this reason it is of great importance to better understand the process of biofilm formation in order to eradicate or mitigate it. It is a very complex process, and a large range of variables has been suggested to influence biofilm formation; however, their relative importance is still not well understood. In the present study, a range of surface properties of Pseudomonas aeruginosa lipopolysaccharide mutants was studied in relation to biofilm formation, measured in different kinds of multi-well plates and growth conditions, in order to better understand the complexity of biofilm formation. Multivariate analysis was used to simultaneously evaluate the role of a range of physicochemical parameters under different conditions. Our results suggest that the presence of serum inhibited biofilm formation due to changes in twitching motility. From the multivariate analysis it was observed that the most important parameters positively correlated to biofilm formation on two types of plates were high hydrophobicity, near-neutral zeta potential and motility. Negative correlation was observed with cell aggregation, as well as with formation of outer membrane vesicles and exopolysaccharides. This work shows that the complexity of biofilm formation can be better understood using a multivariate approach that can interpret and rank the importance of different factors present simultaneously under several different environmental conditions, enabling a better understanding of this complex process.

  16. Evaluating the Performance of Wavelet-based Data-driven Models for Multistep-ahead Flood Forecasting in an Urbanized Watershed

    Science.gov (United States)

    Kasaee Roodsari, B.; Chandler, D. G.

    2015-12-01

    A real-time flood forecast system is presented to provide emergency management authorities sufficient lead time to execute plans for evacuation and asset protection in urban watersheds. This study investigates the performance of two hybrid models for real-time flood forecasting at different subcatchments of the Ley Creek watershed, a heavily urbanized watershed in the vicinity of Syracuse, New York. The hybrid models are a Wavelet-Based Artificial Neural Network (WANN) and a Wavelet-Based Adaptive Neuro-Fuzzy Inference System (WANFIS). Both models are developed on the basis of real-time stream network sensing. The wavelet approach is applied to decompose the collected water-depth time series into approximation and detail components. The approximation component is then used as an input to the ANN and ANFIS models to forecast water level at lead times of 1 to 10 hours. The performance of the WANN and WANFIS models is compared to that of ANN and ANFIS models for different lead times. Initial results demonstrate the greater predictive power of the hybrid models.
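The decomposition step described here — splitting a depth series into an approximation (slow trend) and detail (fluctuation) part — can be sketched with a one-level Haar transform. Haar is the simplest choice; the abstract does not name the wavelet family actually used, and the depth series below is synthetic.

```python
import numpy as np

def haar_dwt(signal):
    """One-level Haar DWT: approximation (trend) and detail coefficients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                      # pad to even length by repeating the end
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level, for checking the decomposition."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# A noisy rising water-depth series: the approximation keeps the slow rise
t = np.linspace(0, 1, 64)
depth = 2.0 * t + 0.1 * np.sin(40 * t)
approx, detail = haar_dwt(depth)
reconstructed = haar_idwt(approx, detail)
print(np.allclose(reconstructed, depth))   # True: perfect reconstruction
```

In the forecasting setup described, only the denoised approximation coefficients would be fed to the ANN/ANFIS models.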

  17. Probing Tissue Multifractality Using Wavelet based Multifractal Detrended Fluctuation Analysis: Applications in Precancer Detection

    CERN Document Server

    Soni, Jalpa; Ghosh, Sayantan; Pradhan, Asima; Sengupta, Tapas K; Panigrahi, Prasanta K; Ghosh, Nirmalya

    2011-01-01

    The refractive index fluctuations in the connective tissue layer (stroma) of human cervical tissues having different grades of precancer (dysplasia) were quantified using a wavelet-based multifractal detrended fluctuation analysis model. The results show a clear signature of multi-scale self-similarity in the index fluctuations of the tissues. Importantly, the refractive index fluctuations were found to be more anti-correlated at higher grades of precancer. Moreover, the strength of multifractality was observed to be considerably weaker at higher grades of precancer. These results were further complemented by Fourier-domain analysis of the spectral fluctuations.

  18. Serial identification of EEG patterns using adaptive wavelet-based analysis

    Science.gov (United States)

    Nazimov, A. I.; Pavlov, A. N.; Nazimova, A. A.; Grubov, V. V.; Koronovskii, A. A.; Sitnikova, E.; Hramov, A. E.

    2013-10-01

    The problem of recognizing specific oscillatory patterns in electroencephalograms with the continuous wavelet transform is discussed. Aiming to improve the abilities of wavelet-based tools, we propose a serial adaptive method for sequential identification of EEG patterns such as sleep spindles and spike-wave discharges. This method provides an optimal selection of parameters based on objective functions and enables extraction of the most informative features of the recognized structures. Different ways of increasing the quality of pattern recognition within the proposed serial adaptive technique are considered.

  19. Research of Wavelet Based Multicarrier Modulation System with Near Shannon Limited Codes

    Institute of Scientific and Technical Information of China (English)

    ZHANG Haixia; YUAN Dongfeng; ZHAO Feng

    2005-01-01

    In this paper, using turbo codes and low-density parity-check (LDPC) codes as the channel error-correcting schemes, wavelet-based multicarrier modulation (WMCM) systems are proposed and investigated under different transmission scenarios. The bit error rate (BER) performance of these two near-Shannon-limit codes is simulated and compared for various code parameters. The simulation results show that turbo-coded WMCM (TCWMCM) performs better than LDPC-coded WMCM (LDPC-CWMCM) on both AWGN and Rayleigh fading channels when the two kinds of codes use the same code parameters.

  20. A novel 3D wavelet based filter for visualizing features in noisy biological data

    Energy Technology Data Exchange (ETDEWEB)

    Moss, W C; Haase, S; Lyle, J M; Agard, D A; Sedat, J W

    2005-01-05

    We have developed a 3D wavelet-based filter for visualizing structural features in volumetric data. The only variable parameter is a characteristic linear size of the feature of interest. The filtered output contains only those regions that are correlated with the characteristic size, thus denoising the image. We demonstrate the use of the filter by applying it to 3D data from a variety of electron microscopy samples including low contrast vitreous ice cryogenic preparations, as well as 3D optical microscopy specimens.

  1. Better Crunching: Recommendations for Multivariate Data Analysis Approaches for Program Impact Evaluations

    Science.gov (United States)

    Braverman, Marc T.

    2016-01-01

    Extension program evaluations often present opportunities to analyze data in multiple ways. This article suggests that program evaluations can involve more sophisticated data analysis approaches than are often used. On the basis of a hypothetical program scenario and corresponding data set, two approaches to testing for evidence of program impact…

  2. Improving the Performance of Machine Learning Based Multi Attribute Face Recognition Algorithm Using Wavelet Based Image Decomposition Technique

    Directory of Open Access Journals (Sweden)

    S. Sakthivel

    2011-01-01

    Full Text Available Problem statement: Recognizing a face based on its attributes is an easy task for a human to perform; it is nearly automatic and requires little mental effort. A computer, on the other hand, has no innate ability to recognize a face or a facial feature and must be programmed with an algorithm to do so. Generally, to recognize a face, different kinds of facial features are used, separately or in combination. In previous work, we developed a machine learning based multi-attribute face recognition algorithm and evaluated it with different sets of weights for each input attribute; its performance is low compared to the proposed wavelet decomposition technique. Approach: In this study, wavelet decomposition was applied as a preprocessing technique to enhance the input face images in order to reduce the loss of classification performance due to changes in facial appearance. The experiment was specifically designed to investigate the gain in robustness against illumination and facial expression changes. Results: The proposed wavelet-based image decomposition technique enhanced the performance of the previously designed system by 8.54 percent. Conclusion: The proposed model was tested on face images with differences in expression and illumination conditions, using a dataset obtained from the Olivetti Research Laboratory face image database.

  3. Wavelet-based denoising of the Fourier metric in real-time wavefront correction for single molecule localization microscopy

    Science.gov (United States)

    Tehrani, Kayvan Forouhesh; Mortensen, Luke J.; Kner, Peter

    2016-03-01

    Wavefront sensorless schemes for correction of aberrations induced by biological specimens require a time-invariant property of an image as a measure of fitness. Image intensity cannot be used as a metric for Single Molecule Localization (SML) microscopy because the intensity of blinking fluorophores follows exponential statistics. Therefore a robust intensity-independent metric is required. We previously reported a Fourier Metric (FM) that is relatively intensity independent. The Fourier metric has been successfully tested with two optimization algorithms, a genetic algorithm and particle swarm optimization, for wavefront correction about 50 μm deep inside the central nervous system (CNS) of Drosophila. However, since the spatial frequencies that need to be optimized fall into regions of the Optical Transfer Function (OTF) that are more susceptible to noise, adding a level of denoising can improve performance. Here we present wavelet-based approaches to lower the noise level and produce a more consistent metric. We compare the performance of different wavelets, such as Daubechies, biorthogonal, and reverse biorthogonal wavelets of different degrees and orders, for pre-processing of the images.

  4. Wavelet-based study of valence-arousal model of emotions on EEG signals with LabVIEW.

    Science.gov (United States)

    Guzel Aydin, Seda; Kaya, Turgay; Guler, Hasan

    2016-06-01

    This paper illustrates wavelet-based feature extraction for emotion assessment from electroencephalogram (EEG) signals using a graphical coding design. A two-dimensional (valence-arousal) emotion model was studied. Different emotions (happiness, joy, melancholy, and disgust) were studied for assessment. These emotions were stimulated by video clips. EEG signals obtained from four subjects were decomposed into five frequency bands (gamma, beta, alpha, theta, and delta) using the "db5" wavelet function. Relative features were calculated to obtain further information. The impact of the emotions according to valence was observed to be strongest on the power spectral density of the gamma band. The main objective of this work is not only to investigate the influence of the emotions on different frequency bands but also to overcome the difficulties of text-based programming. This work offers an alternative approach for emotion evaluation through EEG processing. There are a number of methods for emotion recognition, such as wavelet transform-based, Fourier transform-based, and Hilbert-Huang transform-based methods. However, the majority of these methods have been applied with text-based programming languages. In this study, we proposed and implemented experimental feature extraction with a graphics-based language, which provides great convenience in bioelectrical signal processing.
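The band-wise relative-power features described here can be illustrated numerically. The study used a "db5" wavelet decomposition; the sketch below substitutes a plain FFT periodogram (an assumed, simpler stand-in) to compute relative power in the same five bands on a synthetic alpha-dominated signal. The band edges are conventional values, not taken from the paper.

```python
import numpy as np

# Conventional EEG band edges in Hz (assumed, not from the paper)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def relative_band_power(signal, fs):
    """Relative power in the classic EEG bands from an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    total = psd[(freqs >= 0.5) & (freqs < 45)].sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

# Synthetic EEG: a strong 10 Hz (alpha) rhythm plus noise
fs = 256
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.normal(size=len(t))
rbp = relative_band_power(eeg, fs)
print(rbp)   # alpha should dominate
```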

  5. Modeling international stock market contagion using multivariate fractionally integrated APARCH approach

    Directory of Open Access Journals (Sweden)

    Zouheir Mighri

    2014-12-01

    Full Text Available The aim of this article is to examine how the dynamics of correlations between two emerging countries (Brazil and Mexico) and the US evolved from January 2003 to December 2013. The main contribution of this study is to explore whether the plunging US stock market, in the aftermath of the global financial crisis (2007–2009), exerted contagion effects on emerging stock markets. To this end, we rely on a multivariate fractionally integrated asymmetric power autoregressive conditional heteroskedasticity dynamic conditional correlation framework, which accounts for long memory, power effects, leverage terms, and time-varying correlations. The empirical analysis shows a contagion effect for Brazil and Mexico during the early stages of the global financial crisis, indicating signs of "recoupling." Nevertheless, linkages show a general pattern of "decoupling" after the Lehman Brothers collapse. Furthermore, correlations between Brazil and the US decreased from early 2009 onwards, implying that their dependence is larger in bearish than in bullish markets.

  6. Spatial characterization of water quality in a karstic coastal lagoon without anthropogenic disturbance: a multivariate approach

    Science.gov (United States)

    Medina-Gómez, Israel; Herrera-Silveira, Jorge A.

    2003-11-01

    Dzilam Lagoon, located on the central coast of Yucatan, Gulf of Mexico, is a shallow water body with an average depth of 0.6 m and an area of 9.4 km². Numerous groundwater inputs are distributed along the system, representing a continuous source of nitrates and silicates. Due to scarce anthropogenic activity, it is well preserved. Such pristine conditions suggest that changes in nutrient dynamics are mostly related to natural behavior. Monthly samples were taken from September 1998 to August 1999. Physicochemical parameters, inorganic nutrients and chlorophyll-a were measured at nine stations. A multivariate analysis showed the salinity gradient and nutrient concentration to be the most significant variables in describing the lagoon's hydrologic heterogeneity. On the basis of those critical parameters, classification analysis of Dzilam Lagoon identified three hydrological affinity zones (HAZ): the East and West Zones, characterized by higher water residence time and lower salinities during the rainy season, and the Central Zone, with lower residence time and lower inorganic nutrient concentrations. Dzilam Lagoon was a NO3− sink and a net source of NO2− and NH4+. Soluble reactive phosphorus was slightly defined and soluble reactive silica was close to conservative behavior.

  7. Snow white and the seven dwarfs: a multivariate approach to classification of cold tolerance.

    Science.gov (United States)

    Nedved, O

    2000-01-01

    Two main cold hardiness strategies of insects - freeze tolerance in some species, and overwintering in a supercooled state without tolerance of freezing in many others - were recently reclassified. However, I present several problems with the current system. My suggested classification is based on clearer definitions of the causes of cold injury. I recognize three main mortality factors: freezing of body liquids, cold shock, and cumulative chill injury. The presence or absence of each of these factors produces eight combinations. I have named the eight classes after Snow White and the Seven Dwarfs to avoid nomenclatural confusion. Some of these classes are probably not used as tactics against cold injury by any insect species. Other classes contain so many species that they might be reclassified in more detail, using values of the supercooling point and other quantitative parameters. However, widely comparable parameters, like the upper limit of the cold injury zone and the sum of injurious temperatures, are still rarely published, so we still lack comprehensive data for multivariate analyses. Every cold hardiness strategy should be characterized by a meaningful class or subclass together with the physiological, biochemical, and behavioural mechanisms employed by the insects. I also point out the existence of strategies that combine two tactics - either a switching strategy (during preparation for winter, the population "chooses" which tactic will be used) or a dual strategy (individuals are ready to use one of the tactics depending on the prevailing environmental conditions).
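The combinatorics behind the classification — three binary mortality factors giving 2³ = 8 classes — can be made concrete with a tiny enumeration. The abstract does not specify which combination corresponds to which dwarf's name, so no such mapping is invented here; the factor labels are shorthand for the three causes of cold injury it lists.

```python
from itertools import product

# The three mortality factors from the classification: susceptibility to
# freezing of body liquids, to cold shock, and to cumulative chill injury.
FACTORS = ("freezing", "cold_shock", "chill_injury")

def cold_hardiness_classes():
    """Enumerate the 2**3 = 8 presence/absence combinations of the three
    cold-injury factors that define the eight classes."""
    return [dict(zip(FACTORS, combo))
            for combo in product((False, True), repeat=3)]

classes = cold_hardiness_classes()
print(len(classes))   # 8 classes: Snow White and the seven dwarfs
```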

  8. Multivariate Autoregressive Model Based Heart Motion Prediction Approach for Beating Heart Surgery

    Directory of Open Access Journals (Sweden)

    Fan Liang

    2013-02-01

    Full Text Available A robotic tool can enable a surgeon to conduct off-pump coronary artery bypass graft surgery on a beating heart. The robotic tool actively alleviates the relative motion between the point of interest (POI) on the heart surface and the surgical tool and allows the surgeon to operate as if the heart were stationary. Since the beating heart's motion is relatively high-bandwidth, with nonlinear and nonstationary characteristics, it is difficult to follow. Thus, precise beating-heart motion prediction is necessary for the tracking control procedure during surgery. In the research presented here, we first observe that the electrocardiography (ECG) signal contains causal phase information on heart motion and non-stationary heart rate dynamic variations. Then, we investigate the relationship between the ECG signal and beating heart motion using Granger causality analysis, which establishes the feasibility of improved prediction of heart motion. Next, we propose a nonlinear time-varying multivariate vector autoregressive (MVAR) model based adaptive prediction method. In this model, the significant correlation between ECG and heart motion enables improved prediction of sharp changes in heart motion and approximation of the motion with sufficient detail. Dual Kalman filters (DKF) estimate the states and parameters of the model, respectively. Last, we evaluate the proposed algorithm through comparative experiments using two sets of collected in vivo data.
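The core of the prediction scheme — a vector autoregressive model fitted to multichannel measurements and used for one-step-ahead forecasting — can be sketched with ordinary least squares. The paper's full method adds time-varying parameters and dual Kalman filtering, which are omitted here; synthetic two-channel data stand in for the ECG/heart-motion signals.

```python
import numpy as np

def fit_var(Y, p):
    """Fit a VAR(p) by least squares. Y: (T, d) multivariate series.
    Returns the intercept c and coefficient matrices A_1..A_p."""
    T, d = Y.shape
    rows = T - p
    X = np.column_stack(
        [np.ones(rows)] + [Y[p - k: T - k] for k in range(1, p + 1)])
    B, *_ = np.linalg.lstsq(X, Y[p:], rcond=None)
    c = B[0]
    A = [B[1 + (k - 1) * d: 1 + k * d].T for k in range(1, p + 1)]
    return c, A

def predict_next(Y, c, A):
    """One-step-ahead forecast from the last p observations of Y."""
    y = c.copy()
    for k, Ak in enumerate(A, start=1):
        y = y + Ak @ Y[-k]
    return y

# Synthetic 2-channel series: "heart motion" coupled to an "ECG-like" driver
rng = np.random.default_rng(0)
T, d = 400, 2
Y = np.zeros((T, d))
A_true = np.array([[0.6, 0.3], [0.0, 0.7]])
for t in range(1, T):
    Y[t] = A_true @ Y[t - 1] + 0.05 * rng.normal(size=d)

c, A = fit_var(Y[:-1], p=1)            # fit on all but the last observation
forecast = predict_next(Y[:-1], c, A)  # predict the held-out observation
err = float(np.abs(forecast - Y[-1]).max())
print(err)   # small one-step-ahead error
```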

  9. Phenotypic consequences of polyploidy and genome size at the microevolutionary scale: a multivariate morphological approach.

    Science.gov (United States)

    Balao, Francisco; Herrera, Javier; Talavera, Salvador

    2011-10-01

    • Chromosomal duplications and increases in DNA amount have the potential to alter quantitative plant traits like flower number, plant stature or stomata size. This has been documented often across species, but information on whether such effects also occur within species (i.e. at the microevolutionary or population scale) is scarce. • We studied trait covariation associated with polyploidy and genome size (both monoploid and total) in 22 populations of Dianthus broteri s.l., a perennial herb with several cytotypes (2x, 4x, 6x and 12x) that do not coexist spatially. Principal component scores of organ size/number variations were assessed as correlates of polyploidy, and phylogenetic relatedness among populations was controlled using phylogenetic generalized least squares. • Polyploidy covaried with organ dimensions, causing multivariate characters to increase, remain unchanged, or decrease with DNA amount. Variations in monoploid DNA amount had detectable consequences on some phenotypic traits. According to the analyses, some traits would experience phenotypic selection, while others would not. • We show that polyploidy contributes to decouple variation among traits in D. broteri, and hypothesize that polyploids may experience an evolutionary advantage in this plant lineage, for example, if it helps to overcome the constraints imposed by trait integration.

  10. Multivariate statistical approach for the assessment of groundwater quality in Ujjain City, India.

    Science.gov (United States)

    Vishwakarma, Vikas; Thakur, Lokendra Singh

    2012-10-01

    Groundwater quality assessment is an essential study which plays an important role in the rational development and utilization of groundwater. Groundwater quality greatly influences the health of local people. Variations in water quality are essentially the combination of both anthropogenic and natural contributions. In order to understand the underlying physical and chemical processes, this study analyzes 8 chemical and physicochemical water quality parameters, viz. pH, turbidity, electrical conductivity, total dissolved solids, total alkalinity, total hardness, chloride and fluoride, recorded at 54 sampling stations during the summer season of 2011, using multivariate statistical techniques. Hierarchical cluster analysis (CA) is first applied to distinguish groundwater quality patterns among the stations, followed by principal component analysis (PCA) and factor analysis (FA) to extract and recognize the major underlying factors contributing to the variations among the water quality measures. The first three components were chosen for interpretation of the data, accounting for 72.502% of the total variance in the data set. The largest number of variables, i.e. turbidity, EC, TDS and chloride, was characterized by the first component, while the second and third were characterized by total alkalinity, total hardness, fluoride and pH, respectively. This shows that the hydrochemical constituents of the groundwater are mainly controlled by EC, TDS, and fluoride. The findings of the cluster analysis are presented in the form of dendrograms of the sampling stations (cases) as well as the hydrochemical variables, which produced four major groupings, suggesting that groundwater monitoring can be consolidated.

  11. One approach in using multivariate statistical process control in analyzing cheese quality

    Directory of Open Access Journals (Sweden)

    Ilija Djekic

    2015-05-01

    Full Text Available The objective of this paper was to investigate the possibility of using multivariate statistical process control in analysing cheese quality parameters. Two cheese types (white brined cheese and soft cheese from ultra-filtered milk) were selected and analysed for several quality parameters such as dry matter, milk fat, protein content, pH, NaCl, fat in dry matter and moisture in non-fat solids. The obtained results showed significant variations for most of the quality characteristics examined between the two types of cheese. The only stable parameter in both types of cheese was moisture in non-fat solids. All of the other cheese quality characteristics fell above or below the control limits for most of the samples. Such results indicate high instability and variation within cheese production. Although the use of statistical process control is not mandatory in the dairy industry, it might provide benefits to organizations in improving the quality control of dairy products.
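The control-limit logic underlying this kind of analysis can be sketched without any multivariate machinery. Below is a minimal univariate Shewhart-style example on hypothetical dry-matter values: limits are estimated from an assumed in-control reference batch and then applied to new measurements. All numbers are invented.

```python
def control_limits(samples, sigmas=3.0):
    """Shewhart-style individual limits: mean +/- 3 sample standard deviations."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    sd = var ** 0.5
    return mean - sigmas * sd, mean + sigmas * sd

def out_of_control(samples, lcl, ucl):
    """Measurements falling outside the control limits."""
    return [x for x in samples if x < lcl or x > ucl]

# Hypothetical in-control reference measurements of dry matter (%)
reference = [55.2, 54.8, 55.0, 55.4, 54.9, 55.1, 55.0, 54.7, 55.3, 55.1]
lcl, ucl = control_limits(reference)

# New batch: the second measurement is well above the reference spread
new_batch = [55.0, 58.9, 54.8]
flagged = out_of_control(new_batch, lcl, ucl)
print(flagged)   # [58.9]
```

Estimating the limits from a reference phase, rather than from the data being judged, keeps an outlier from inflating the limits and masking itself.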

  12. Assessing heavy metal sources in sugarcane Brazilian soils: an approach using multivariate analysis.

    Science.gov (United States)

    da Silva, Fernando Bruno Vieira; do Nascimento, Clístenes Williams Araújo; Araújo, Paula Renata Muniz; da Silva, Luiz Henrique Vieira; da Silva, Roberto Felipe

    2016-08-01

    Brazil is the world's largest sugarcane producer, and soils in the northeastern part of the country have been cultivated with the crop for over 450 years. However, so far, there has been no study on the status of heavy metal accumulation in these long-cultivated soils. To fill the gap, we collected soil samples from 60 sugarcane fields in order to determine the contents of Cd, Cr, Cu, Ni, Pb, and Zn. We used multivariate analysis to distinguish between natural and anthropogenic sources of these metals in soils. Analytical determinations were performed by ICP-OES after microwave-assisted acid digestion. Mean concentrations of Cd, Cr, Cu, Ni, Pb, and Zn were 1.9, 18.8, 6.4, 4.9, 11.2, and 16.2 mg kg(-1), respectively. The first principal component was associated with lithogenic origin and comprised the metals Cr, Cu, Ni, and Zn. Cluster analysis confirmed that 68 % of the evaluated sites have soil heavy metal concentrations close to the natural background. The Cd concentration (second principal component) was clearly associated with anthropogenic sources, with P fertilization being the most likely source of Cd in soils. On the other hand, the third component (Pb concentration) indicates a mixed origin for this metal (natural and anthropogenic); hence, Pb concentrations are probably related not only to the soil parent material but also to industrial emissions and urbanization in the vicinity of the agricultural areas.

  13. Taking a comparative approach: analysing personality as a multivariate behavioural response across species.

    Directory of Open Access Journals (Sweden)

    Alecia J Carter

    Full Text Available Animal personality, repeatable behaviour through time and across contexts, is ecologically and evolutionarily important, as it can account for the exhibition of sub-optimal behaviours. Interspecific comparisons have been suggested as important for understanding the evolution of animal personality; however, these are seldom accomplished due, in part, to the lack of statistical tools for quantifying differences and similarities in behaviour between groups of individuals. We used nine species of closely related coral reef fishes to investigate the usefulness of ecological community analyses for the analysis of between-species behavioural differences and behavioural heterogeneity. We first documented behavioural carryover across species by observing the fishes' behaviour and measuring their response to a threatening stimulus to quantify boldness. Bold fish spent more time away from the reef and fed more than shy fish. We then used ecological community analysis tools (canonical variate analysis, multi-response permutation procedure, and permutational analysis of multivariate dispersion), identified four 'clusters' of behaviourally similar fishes, and found that the species differ in the behavioural variation expressed; some species are more behaviourally heterogeneous than others. We found that ecological community analysis tools are easily and fruitfully applied to comparative studies of personality, and we encourage their use in future studies.

  14. Multivariate approach of inter-relationships among growth, consumption and carcass traits in Nellore cattle

    Directory of Open Access Journals (Sweden)

    Cláudio Ulhôa Magnabosco

    Full Text Available The objective of the present study was to analyze the phenotypic inter-relationships between growth, feed intake and carcass traits in polled Nellore cattle, as well as to determine which bulls produced the most efficient progeny. The experiment was conducted in the feedlot of the Guaporé Pecuária livestock company (OB Brand). The following traits were analyzed: initial live weight (ILW); final live weight (FLW); average daily gain (ADG); dry matter intake (DMI); gain:feed (G:F); residual feed intake (RFI); rib-eye area (REA); rump fat thickness (RF); backfat thickness at the 12th-13th rib (BF); weighted fat score (WF); and intramuscular fat percentage (IMF). Both univariate and multivariate analyses were performed to analyze the inter-relationships between the studied traits. No significant phenotypic associations were observed between growth, carcass traits and residual feed intake, while the correlation between RFI and G:F was negative. Therefore, RFI may be used to select more nutritionally efficient animals without compromising growth or adult size. The selection of bulls with progeny showing low residual feed intake is recommended, as selection for low RFI tends to improve feed efficiency without compromising growth and development.

  15. A multivariate approach for the study of the environmental drivers of wine production structure

    Science.gov (United States)

    Lorenzetti, Romina; Costantini, Edoardo A. C.; Malorgio, Giulio

    2015-04-01

    Vitivinicultural "terroir" is a concept referring to an area in which the collective knowledge of the interactions between environment and vitivinicultural practices develops, providing distinctive characteristics to the products. The effect of the environmental components on the terroir has already been widely demonstrated. What has not yet been studied is their possible effect on the structure of wine production. Therefore, the aim of this work was to determine whether environmental drivers influence the wine production structure. This kind of investigation necessarily involves a change of scale towards wide territories. We used the Italian Denomination of Origin territories, which were grouped into macro-areas (reference scale 1:500,000) with respect to geographic proximity, environmental features, viticultural affinity and tradition. The characterization of the structure of the wine transformation industry was based on the official data reported in the wine production declarations for the year 2008. Statistics were compiled on general quantitative variables of wine farms, presence of associative forms, degree of vertical integration of wineries, quality orientation of wine producers, and acreage of vineyard. The environmental variables climate, soil, and vegetation vigour were selected for their direct influence on vine growing. A second set of variables was chosen to express the effect of land morphology on viticultural management. The third was intended to uncover possible relationships between viticultural structures and land quality, such as the indexes of sensitivity to desertification, soil resistance to water erosion, and land vulnerability. A PCA was carried out separately for the environmental and economic data to reduce the database dimensions. 
The new economic and environmental synthetic descriptors were involved in three multivariate analyses: i) the correlation between economic and environmental descriptors through the

  16. Risk management and statistical multivariate analysis approach for design and optimization of satranidazole nanoparticles.

    Science.gov (United States)

    Dhat, Shalaka; Pund, Swati; Kokare, Chandrakant; Sharma, Pankaj; Shrivastava, Birendra

    2017-01-01

    Rapidly evolving technical and regulatory landscapes of pharmaceutical product development necessitate risk management with application of multivariate analysis using Process Analytical Technology (PAT) and Quality by Design (QbD). The poorly soluble, high-dose drug satranidazole was optimally nanoprecipitated (SAT-NP) employing principles of Formulation by Design (FbD). The potential risk factors influencing the critical quality attributes (CQA) of SAT-NP were identified using an Ishikawa diagram. A Plackett-Burman screening design was adopted to screen the eight critical formulation and process parameters influencing the mean particle size, zeta potential and dissolution efficiency at 30 min in pH 7.4 dissolution medium. Pareto charts (individual and cumulative) revealed the three most critical factors influencing the CQA of SAT-NP, viz. aqueous stabilizer (polyvinyl alcohol), release modifier (Eudragit® S 100) and volume of aqueous phase. The levels of these three critical formulation attributes were optimized by FbD within the established design space to minimize mean particle size and polydispersity index, and maximize encapsulation efficiency of SAT-NP. Lenth's and Bayesian analysis along with mathematical modeling of results allowed identification and quantification of critical formulation attributes significantly active on the selected CQAs. The optimized SAT-NP exhibited a mean particle size of 216 nm, polydispersity index of 0.250, zeta potential of -3.75 mV and encapsulation efficiency of 78.3%. The product was lyophilized using mannitol to form a readily redispersible powder. X-ray diffraction analysis confirmed the conversion of crystalline SAT to the amorphous form. In vitro release of SAT-NP in gradually pH-changing media showed minimal release at acidic pH followed by extensive release (95%) at pH 7.4 over the next 3 h, indicative of burst release after a lag time. This investigation demonstrated effective application of risk management and QbD tools in developing site-specific release SAT-NP by nanoprecipitation.

  17. Impact of Coal-Coking Effluent on Sediment Microbial Communities: a Multivariate Approach

    Science.gov (United States)

    Sayler, Gary S.; Sherrill, Timothy W.; Perkins, Richard E.; Mallory, Lawrence M.; Shiaris, Michael P.; Pedersen, Deana

    1982-01-01

    The functional response to and recovery from coal-coking waste effluent was evaluated for sediment microbial communities. Twenty estimates of microbial population density, biomass, and activity were measured five times during a 15-month period. Significant effects on microbial communities were observed in response to both wastewater contamination and diversion of the wastewater. Multivariate analysis of variance and discriminant analysis indicated that accurate differentiation between uncontaminated and contaminated sediments required a minimum of nine estimates of community response. Total viable population density, ATP, alkaline phosphatase, naphthalene, and phenanthrene mineralization rates were found to be highly weighted variables in site discrimination. Lipid and glucose mineralization, nitrogen fixation, and sediment protein also contributed significantly to explaining variation among sites. Estimates of anaerobic population densities and rates of methane production contributed little to discrimination among sites in the environment examined. In general, total viable population density, ATP, and alkaline phosphatase activity were significantly depressed in contaminated sediments. However, after removal of this contamination, the previously affected sites demonstrated greater temporal variability but a closer approximation of the mean response at the control site. Naphthalene and phenanthrene mineralization did not follow the general trend and were elevated at the contaminated sites throughout the investigation. Results of the investigation supported the hypothesis that multiple functional measures of microbial community response are required to evaluate the effect of and recovery from environmental contamination. In addition, when long-term effects are evaluated, select physiological traits, i.e., polyaromatic hydrocarbon mineralization, may not reflect population and biomass estimates of community response. PMID:16346132
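The site-discrimination step described above (discriminant analysis over multiple community-response estimates) can be sketched as a two-class Fisher linear discriminant. The three "estimates" and the class means below are invented for illustration only; the study used a minimum of nine variables.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic community-response estimates (stand-ins for e.g. viable
# counts, ATP, alkaline phosphatase) at control vs contaminated sites.
control = rng.normal([5.0, 3.0, 2.0], 0.5, size=(30, 3))
contaminated = rng.normal([3.5, 1.5, 1.0], 0.5, size=(30, 3))

# Fisher's linear discriminant: w = Sw^{-1} (m1 - m2).
m1, m2 = control.mean(0), contaminated.mean(0)
Sw = np.cov(control, rowvar=False) + np.cov(contaminated, rowvar=False)
w = np.linalg.solve(Sw, m1 - m2)

# Project each sample onto w and classify against the midpoint
# of the two projected class means.
threshold = (control @ w).mean() / 2 + (contaminated @ w).mean() / 2
pred_control = control @ w > threshold
pred_contam = contaminated @ w <= threshold
accuracy = (pred_control.sum() + pred_contam.sum()) / 60
print(round(accuracy, 2))
```

The magnitudes of the entries of `w` play the role of the "highly weighted variables" in the abstract: variables with large discriminant weights drive the separation between site classes.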

  18. Reliability and predictors of resistive load detection in children with persistent asthma: a multivariate approach.

    Science.gov (United States)

    Harver, Andrew; Dyer, Allison; Ersek, Jennifer L; Kotses, Harry; Humphries, C Thomas

    2015-03-01

    Resistive load detection tasks enable analysis of individual differences in psychophysical outcomes. The purpose of this study was to determine both the reliability and predictors of resistive load detection in children with persistent asthma who completed multiple testing sessions. Both the University of North Carolina (UNC) Charlotte and Ohio University institutional review boards approved the research protocol. The detection of inspiratory resistive loads was evaluated in 75 children with asthma between 8 and 15 years of age. Each child participated in four experimental sessions that occurred approximately once every 2 weeks. Multivariate analyses were used to delineate predictors of task performance. Reliability of resistive load detection was determined for each child, and predictors of load detection outcomes were investigated in two groups of children: those who performed reliably in all four sessions (n = 31) and those who performed reliably in three or fewer sessions (n = 44). Three factors (development, symptoms, and compliance) accounted for 66.3% of the variance among variables that predicted 38.7% of the variance in load detection outcomes (multiple R = 0.62, p = 0.004) and correctly classified performance as reliable or less reliable in 80.6% of the children, χ²(12) = 28.88, p = 0.004. Cognitive and physical development, appraisal of symptom experiences, and adherence-related behaviors (1) account for a significant proportion of the interrelationships among variables that affect perception of airflow obstruction in children with asthma and (2) differentiate between children who perform more or less reliably in a resistive load detection task.

  19. Environmental controls on microbial abundance and activity on the Greenland ice sheet: a multivariate analysis approach.

    Science.gov (United States)

    Stibal, Marek; Telling, Jon; Cook, Joe; Mak, Ka Man; Hodson, Andy; Anesio, Alexandre M

    2012-01-01

    Microbes in supraglacial ecosystems have been proposed to be significant contributors to regional and possibly global carbon cycling, and quantifying the biogeochemical cycling of carbon in glacial ecosystems is of great significance for global carbon flow estimations. Here we present data on microbial abundance and productivity, collected along a transect across the ablation zone of the Greenland ice sheet (GrIS) in summer 2010. We analyse the relationships between the physical, chemical and biological variables using multivariate statistical analysis. Concentrations of debris-bound nutrients increased with distance from the ice sheet margin, as did both cell numbers and activity rates before reaching a peak (photosynthesis) or a plateau (respiration, abundance) between 10 and 20 km from the margin. The results of productivity measurements suggest an overall net autotrophy on the GrIS and support the proposed role of ice sheet ecosystems in carbon cycling as regional sinks of CO(2) and places of production of organic matter that can be a potential source of nutrients for downstream ecosystems. Principal component analysis based on chemical and biological data revealed three clusters of sites, corresponding to three 'glacier ecological zones', confirmed by a redundancy analysis (RDA) using physical data as predictors. RDA using data from the largest 'bare ice zone' showed that glacier surface slope, a proxy for melt water flow, accounted for most of the variation in the data. Variation in the chemical data was fully explainable by the determined physical variables. Abundance of phototrophic microbes and their proportion in the community were identified as significant controls of the carbon cycling-related microbial processes.

  20. Assessing the impact of thrombolysis on progress through inpatient rehabilitation after stroke: a multivariable approach.

    Science.gov (United States)

    Meyer, M; Murie-Fernandez, M; Hall, R; Liu, Y; Fang, J; Salter, K; Foley, N; Teasell, R

    2012-08-01

    Acute administration of tissue plasminogen activator has been shown to improve immediate and long-term patient recovery after ischaemic stroke. Yet, despite widespread clinical application, many patients who receive acute tissue plasminogen activator still require inpatient rehabilitation. This study aimed to examine the effect of tissue plasminogen activator administration on recovery among patients requiring inpatient rehabilitation after stroke in Ontario, Canada. It was hypothesized that after covariate adjustment, administration of tissue plasminogen activator would be associated with accelerated progress through inpatient rehabilitation. Acute and rehabilitation data were retrieved from the Registry of the Canadian Stroke Network and the National Rehabilitation Reporting System for all ischaemic stroke patients admitted to an acute facility and a rehabilitation unit between July 1, 2003 and March 31, 2008. Patients were divided into two groups: those who received tissue plasminogen activator and those who were medically eligible but did not receive tissue plasminogen activator. Three rehabilitation progress indicators were compared between groups: Functional Independence Measure gain, active length of stay, and discharge destination. Indicators were modelled using multivariable generalized linear models or logistic regression as appropriate. Patients who received tissue plasminogen activator experienced shorter active lengths of stay (log estimate ± standard error: -0.04 ± 0.01 days), and were slightly more likely to be discharged home compared to controls (adjusted odds ratio 1.35, 95% confidence interval 1.004-1.82). No differences were noted on Functional Independence Measure gain during rehabilitation. Results suggest that tissue plasminogen activator may contribute to accelerated progress through inpatient rehabilitation; however, there is no evidence to suggest that it contributes to greater functional improvement as measured by the

  1. Wavelet-based neural network analysis of internal carotid arterial Doppler signals.

    Science.gov (United States)

    Ubeyli, Elif Derya; Güler, Inan

    2006-06-01

    In this study, internal carotid arterial Doppler signals recorded from 130 subjects (45 with internal carotid artery stenosis, 44 with internal carotid artery occlusion, and the rest healthy) were classified using a wavelet-based neural network. The wavelet-based neural network model, employing the multilayer perceptron, was used for analysis of the internal carotid arterial Doppler signals. A multilayer perceptron neural network (MLPNN) trained with the Levenberg-Marquardt algorithm was used to detect stenosis and occlusion in internal carotid arteries. In order to determine the MLPNN inputs, spectral analysis of the internal carotid arterial Doppler signals was performed using the wavelet transform (WT). The MLPNN was trained, cross validated, and tested with training, cross validation, and testing sets, respectively. All these data sets were obtained from the internal carotid arteries of healthy subjects and of subjects suffering from internal carotid artery stenosis or occlusion. The correct classification rate was 96% for healthy subjects, 96.15% for subjects with internal carotid artery stenosis and 96.30% for subjects with internal carotid artery occlusion. The classification results showed that the MLPNN trained with the Levenberg-Marquardt algorithm was effective in detecting internal carotid artery stenosis and occlusion.
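The feature-extraction idea behind this record, decomposing a signal into wavelet sub-bands whose energies become classifier inputs, can be sketched with a hand-rolled Haar transform. The signal, decomposition depth, and feature definition below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform: approximation, detail."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_energy_features(x, levels=3):
    """Relative energy in each detail band plus the final approximation,
    a common compact feature vector for feeding a neural network."""
    energies = []
    for _ in range(levels):
        x, d = haar_dwt(x)
        energies.append(np.sum(d ** 2))
    energies.append(np.sum(x ** 2))
    energies = np.array(energies)
    return energies / energies.sum()

# A synthetic "Doppler-like" signal: a slow oscillation plus noise.
t = np.linspace(0, 1, 512)
signal = np.sin(2 * np.pi * 8 * t) \
    + 0.1 * np.random.default_rng(2).normal(size=512)
features = wavelet_energy_features(signal, levels=3)
print(np.round(features, 3))
```

Because the Haar transform is orthonormal, the band energies sum to the signal energy, so the normalized vector is a valid spectral-distribution feature; for this low-frequency test signal almost all energy lands in the final approximation band.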

  2. Wavelet-based neural network analysis of ophthalmic artery Doppler signals.

    Science.gov (United States)

    Güler, Nihal Fatma; Ubeyli, Elif Derya

    2004-10-01

    In this study, ophthalmic artery Doppler signals were recorded from 115 subjects, 52 of whom had ophthalmic artery stenosis while the rest were healthy controls. Results were classified using a wavelet-based neural network. The wavelet-based neural network model, employing the multilayer perceptron, was used for analysis of ophthalmic artery Doppler signals. A multilayer perceptron neural network (MLPNN) trained with the Levenberg-Marquardt algorithm was used to detect stenosis in ophthalmic arteries. In order to determine the MLPNN inputs, spectral analysis of ophthalmic artery Doppler signals was performed using the wavelet transform. The MLPNN was trained, cross validated, and tested with training, cross validation, and testing sets, respectively. All data sets were obtained from ophthalmic arteries of healthy subjects and subjects suffering from ophthalmic artery stenosis. The correct classification rate was 97.22% for healthy subjects, and 96.77% for subjects having ophthalmic artery stenosis. The classification results showed that the MLPNN trained with the Levenberg-Marquardt algorithm was effective in detecting ophthalmic artery stenosis.

  3. A data-distributed parallel algorithm for wavelet-based fusion of remote sensing images

    Institute of Scientific and Technical Information of China (English)

    YANG Xuejun; WANG Panfeng; DU Yunfei; ZHOU Haifang

    2007-01-01

    With the increasing importance of multiplatform remote sensing missions, the fast integration or fusion of digital images from disparate sources has become critical to the success of these endeavors. In this paper, to speed up the fusion process, a Data-distributed Parallel Algorithm for wavelet-based Fusion (DPAF for short) of remote sensing images that are not geo-registered is presented for the first time. To overcome the limitations on memory space as well as the computing capability of a single processor, data distribution, data-parallel processing and load balancing techniques are integrated into DPAF. To avoid the inherent communication overhead of a wavelet-based fusion method, a special design called redundant partitioning is used, which is inspired by the characteristics of the wavelet transform. Finally, DPAF is evaluated in theory and tested on a 32-CPU cluster of workstations. The experimental results show that the algorithm has good parallel performance and scalability.

  4. Evaluation of Effectiveness of Wavelet Based Denoising Schemes Using ANN and SVM for Bearing Condition Classification

    Directory of Open Access Journals (Sweden)

    Vijay G. S.

    2012-01-01

    Full Text Available Wavelet-based denoising has proven its ability to denoise bearing vibration signals by improving the signal-to-noise ratio (SNR) and reducing the root-mean-square error (RMSE). In this paper, seven wavelet-based denoising schemes are evaluated based on the performance of the Artificial Neural Network (ANN) and the Support Vector Machine (SVM) for bearing condition classification. The work consists of two parts. In the first part, a synthetic signal simulating a defective bearing vibration signal with Gaussian noise was subjected to these denoising schemes, and the best scheme based on the SNR and the RMSE was identified. In the second part, vibration signals collected from a customized Rolling Element Bearing (REB) test rig for four bearing conditions were subjected to these denoising schemes. Several time and frequency domain features were extracted from the denoised signals, out of which a few sensitive features were selected using Fisher's Criterion (FC). The extracted features were used to train and test the ANN and the SVM. The best denoising scheme identified, based on the classification performances of the ANN and the SVM, was found to be the same as the one obtained using the synthetic signal.
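The family of denoising schemes evaluated above can be illustrated with its simplest member: one-level Haar decomposition with soft thresholding of the detail coefficients, scored by SNR against a known clean signal. The test signal, noise level, and threshold below are arbitrary stand-ins, not the paper's settings.

```python
import numpy as np

def haar_dwt(x):
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_idwt(approx, detail):
    x = np.empty(2 * approx.size)
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def denoise(x, threshold):
    """One-level Haar decomposition with soft thresholding of details:
    shrink every detail coefficient toward zero by `threshold`."""
    a, d = haar_dwt(x)
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)
    return haar_idwt(a, d)

def snr_db(clean, estimate):
    return 10 * np.log10(np.sum(clean ** 2) / np.sum((clean - estimate) ** 2))

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)              # stand-in "bearing" signature
noisy = clean + 0.3 * rng.normal(size=t.size)  # additive Gaussian noise

denoised = denoise(noisy, threshold=0.5)
print(round(snr_db(clean, noisy), 1), round(snr_db(clean, denoised), 1))
```

This is exactly the kind of SNR-before/after comparison the first part of the study uses to rank schemes; the real schemes differ in wavelet family, depth, and threshold rule rather than in this basic structure.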

  5. Wavelet-Based Digital Image Fusion on Reconfigurable FPGA Using Handel-C Language

    Directory of Open Access Journals (Sweden)

    Dr. G. Mohan

    2013-07-01

    Full Text Available Field Programmable Gate Array (FPGA) technology has become a viable target for the implementation of real-time fusion algorithms, and different fusion methods have been proposed mainly in the fields of remote sensing and computer vision. Image fusion is basically a process in which multiple images (more than one) are combined to form a single resultant fused image. This fused image is more informative than any of its original inputs. In most papers, image fusion algorithms have been implemented only at the simulation level. In this paper, a wavelet-based image fusion algorithm is employed and implemented on an FPGA-based hardware system using Xilinx Platform Studio EDK 11.1 on a Spartan 3E device. FPGA technology offers basic digital blocks with flexible interconnections to achieve high-speed digital hardware realization. An FPGA consists of a system of logic blocks, such as look-up tables, gates, or flip-flops, and some amount of memory. The algorithm is transferred from the computer to the FPGA board using a JTAG cable. In the proposed work, the algorithm is developed in the Handel-C language to perform wavelet-based image fusion. The result is transferred back to the host system to analyse the hardware resources used by the FPGA.

  6. A Wavelet-Based Method to Predict Muscle Forces From Surface Electromyography Signals in Weightlifting

    Institute of Scientific and Technical Information of China (English)

    Gaofeng Wei; Feng Tian; Gang Tang; Chengtao Wang

    2012-01-01

    The purpose of this study was to develop a wavelet-based method to predict muscle forces from surface electromyography (EMG) signals in vivo. The weightlifting motor task was implemented as the case study. EMG signals of the biceps brachii, triceps brachii and deltoid muscles were recorded while the subject carried out a standard weightlifting motor task. The wavelet-based algorithm was used to process the raw EMG signals and extract features that could be input to Hill-type muscle force models to predict muscle forces. At the same time, the musculoskeletal model of the subject's weightlifting motor task was built and simulated using the Computed Muscle Control (CMC) method via a motion capture experiment. The results of CMC were compared with the muscle force predictions of the proposed method. The correlation coefficient between the two results was 0.99 (p<0.01). However, the proposed method was easier and more efficient than the CMC method. It has the potential to be used clinically to predict muscle forces in vivo.
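A heavily simplified version of the EMG-to-force pipeline can be sketched as follows. Assumptions to note: the paper's wavelet feature extraction is replaced here by a plain rectify-and-smooth envelope, the Hill-type model is reduced to scaling by a maximum isometric force, and the `fmax` value and synthetic activation burst are invented for illustration.

```python
import numpy as np

def emg_to_force(emg, fmax, window=50):
    """Simplified activation-to-force pipeline: rectify the raw EMG,
    smooth it into an envelope with a moving average, normalise to [0, 1],
    then scale by the muscle's maximum isometric force. This is a crude
    stand-in for wavelet features driving a full Hill-type model."""
    rectified = np.abs(emg)
    kernel = np.ones(window) / window
    envelope = np.convolve(rectified, kernel, mode="same")
    activation = envelope / envelope.max()
    return activation * fmax

rng = np.random.default_rng(4)
t = np.linspace(0, 2, 2000)
burst = np.exp(-((t - 1.0) ** 2) / 0.05)    # activation burst mid-lift
emg = burst * rng.normal(size=t.size)       # amplitude-modulated noise
force = emg_to_force(emg, fmax=600.0)       # fmax in newtons (assumed)
print(round(force.max(), 1), round(force[:200].mean(), 1))
```

The predicted force peaks where the EMG burst peaks and stays near zero elsewhere, which is the qualitative behaviour the CMC comparison in the abstract validates quantitatively.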

  7. Regional trends in short-duration precipitation extremes: a flexible multivariate monotone quantile regression approach

    Science.gov (United States)

    Cannon, Alex

    2017-04-01

    Monotone quantile regression (MQR) is a univariate technique and cannot incorporate information from additional covariates, for example ENSO state or physiographic controls on extreme rainfall within a region. Here, the univariate MQR model is extended to allow the use of multiple covariates. Multivariate monotone quantile regression (MMQR) is based on a single hidden-layer feedforward network with the quantile regression error function and partial monotonicity constraints. The MMQR model is demonstrated via Monte Carlo simulations and the estimation and visualization of regional trends in moderate rainfall extremes based on homogenized sub-daily precipitation data at stations in Canada.
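The error function at the heart of MQR and MMQR is the pinball (quantile regression) loss. The sketch below fits plain linear quantile regression by subgradient descent on that loss, without the hidden layer or monotonicity constraints of the full MMQR model, on synthetic heteroscedastic data; all data and hyperparameters are assumptions for illustration.

```python
import numpy as np

def fit_quantile(X, y, tau, lr=0.05, epochs=2000):
    """Linear quantile regression fitted by subgradient descent on the
    pinball loss rho_tau(r) = r * (tau - 1[r < 0])."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        r = y - Xb @ w
        # Subgradient of the mean pinball loss with respect to w.
        grad = -Xb.T @ np.where(r > 0, tau, tau - 1) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 500)
y = 2.0 * x + rng.normal(0, 1 + 0.3 * x)   # heteroscedastic noise

w90 = fit_quantile(x.reshape(-1, 1), y, tau=0.9)
w10 = fit_quantile(x.reshape(-1, 1), y, tau=0.1)
print(np.round(w90, 2), np.round(w10, 2))
```

Because the noise spread grows with x, the fitted 90th-percentile line is steeper than the 10th-percentile line; in MMQR the linear map is replaced by a monotonicity-constrained neural network minimising the same loss.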

  8. A latent dynamic factor approach to forecasting multivariate stock market volatility

    OpenAIRE

    Gribisch, Bastian

    2013-01-01

    This paper proposes a latent dynamic factor model for low- as well as high-dimensional realized covariance matrices of stock returns. The approach is based on the matrix logarithm and allows for flexible dynamic dependence patterns by combining common latent factors driven by HAR dynamics and idiosyncratic AR(1) factors. The model accounts for symmetry and positive definiteness of covariance matrices without imposing parametric restrictions. Simulated Bayesian parameter estimates as well as p...
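The matrix-logarithm device mentioned above maps a symmetric positive-definite covariance matrix to an unconstrained symmetric matrix, so dynamics can be modelled freely in log-space and mapped back without ever losing symmetry or positive definiteness. A sketch via eigendecomposition (the covariance matrix below is synthetic, not realized stock-return data):

```python
import numpy as np

def sym_logm(S):
    """Matrix logarithm of a symmetric positive-definite matrix via its
    eigendecomposition; the result is symmetric but unconstrained."""
    vals, vecs = np.linalg.eigh(S)
    return vecs @ np.diag(np.log(vals)) @ vecs.T

def sym_expm(A):
    """Matrix exponential of a symmetric matrix; always returns an SPD
    matrix, which is what guarantees valid covariances after modelling."""
    vals, vecs = np.linalg.eigh(A)
    return vecs @ np.diag(np.exp(vals)) @ vecs.T

# A stand-in "realized covariance" matrix (symmetric positive definite).
S = np.array([[2.0, 0.8, 0.3],
              [0.8, 1.5, 0.5],
              [0.3, 0.5, 1.0]])

A = sym_logm(S)        # model factor dynamics on this unconstrained matrix
S_back = sym_expm(A)   # mapping back recovers a valid covariance matrix
print(np.allclose(S, S_back))
```

Any perturbation of `A` (for example by latent HAR or AR(1) factor dynamics) still exponentiates to a proper covariance matrix, which is why the model needs no parametric positivity restrictions.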

  9. Elliptic Curves, Algebraic Geometry Approach in Gravity Theory and Uniformization of Multivariable Cubic Algebraic Equations

    OpenAIRE

    2008-01-01

    Based on the distinction between the covariant and contravariant metric tensor components in the framework of the affine geometry approach and the so-called "gravitational theories with covariant and contravariant connection and metrics", it is shown that a wide variety of third-, fourth-, fifth-, seventh- and tenth-degree algebraic equations exists in gravity theory. This is important in view of finding new solutions of the Einstein equations, if they are treated as algebraic ones. Since the obtained...

  10. Multivariate approach to quantitative analysis of Aphis gossypii Glover (Hemiptera: Aphididae) and their natural enemy populations at different cotton spacings

    Science.gov (United States)

    Malaquias, José B.; Ramalho, Francisco S.; Dos S. Dias, Carlos T.; Brugger, Bruno P.; S. Lira, Aline Cristina; Wilcken, Carlos F.; Pachú, Jéssica K. S.; Zanuncio, José C.

    2017-02-01

    The relationship between pests and natural enemies on cotton at different spacings has not yet been documented using multivariate analysis. With multivariate approaches it is possible to optimize strategies to control Aphis gossypii at different crop spacings, because of the possibility of better use of aphid sampling strategies as well as the conservation and release of its natural enemies. The aims of the study were (i) to characterize the temporal abundance data of aphids and their natural enemies using principal components, (ii) to analyze the degree of correlation between the insects and between groups of variables (pests and natural enemies), (iii) to identify the main natural enemies responsible for regulating A. gossypii populations, and (iv) to investigate the similarities in arthropod occurrence patterns at different spacings of cotton crops over two seasons. High correlations between the occurrence of Scymnus rubicundus and aphids are shown by principal component analysis and by the important role the species plays in canonical correlation analysis. The clustering of apterous aphid presence matches the pattern verified for Chrysoperla externa at the three different spacings between rows. Our results indicate that S. rubicundus is the main candidate to regulate the aphid populations at all spacings studied.
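The canonical correlation step used in this record can be sketched by whitening each variable block and taking the SVD of the whitened cross-covariance; the singular values are the canonical correlations. The aphid and predator data below are synthetic stand-ins for the study's counts, with one predator deliberately tracking the aphids (the S. rubicundus pattern).

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two variable blocks: whiten each
    block with the inverse square root of its covariance, then take the
    singular values of the whitened cross-covariance."""
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)

    def inv_sqrt(C):
        vals, vecs = np.linalg.eigh(C)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    Xw = Xc @ inv_sqrt(np.cov(Xc, rowvar=False))
    Yw = Yc @ inv_sqrt(np.cov(Yc, rowvar=False))
    cross = Xw.T @ Yw / (len(X) - 1)
    return np.linalg.svd(cross, compute_uv=False)

rng = np.random.default_rng(6)
aphids = rng.normal(size=(100, 2))         # e.g. apterous and alate counts
predators = np.column_stack([
    aphids[:, 0] + 0.2 * rng.normal(size=100),  # tracks the aphids closely
    rng.normal(size=100),                       # varies independently
])
rho = canonical_correlations(aphids, predators)
print(np.round(rho, 2))
```

A first canonical correlation near 1 with a small second one is the signature of a single strong pest-enemy linkage, which is how a species like S. rubicundus stands out in the analysis.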

  11. Faecal sterols as sewage markers in the Langat River, Malaysia: Integration of biomarker and multivariate statistical approaches

    Institute of Scientific and Technical Information of China (English)

    Nur Hazirah Adnan; Mohamad Pauzi Zakaria; Hafizan Juahir; Masni Mohd Ali

    2012-01-01

    The Langat River in Malaysia has been experiencing anthropogenic input from urban, rural and industrial activities for many years. Sewage contamination, possibly originating from the greater than three million inhabitants of the Langat River Basin, was examined. Sediment samples from 22 stations (SL01-SL22) along the Langat River were collected, extracted and analysed by GC-MS. Six different sterols were identified and quantified. The highest sterol concentration was found at station SL02 (618.29 ng/g dry weight), which is situated in the Balak River, whereas the other sediment samples ranged between 11.60 and 446.52 ng/g dry weight. Sterol ratios were used to identify sources, occurrence and partitioning of faecal matter in sediments, and the majority of the ratios clearly demonstrated that sewage contamination was occurring at most stations in the Langat River. A multivariate statistical analysis was used in conjunction with a combination of biomarkers to better understand the data, which clearly separated the compounds. Most sediments of the Langat River were found to contain low to mid-range sewage contamination, with some containing 'significant' levels of contamination. This is the first report on sewage pollution in the Langat River based on a combination of biomarker and multivariate statistical approaches, which will establish a new standard for sewage detection using faecal sterols.

  12. Assessment of Near-Bottom Water Quality of Southwestern Coast of Sarawak, Borneo, Malaysia: A Multivariate Statistical Approach

    Directory of Open Access Journals (Sweden)

    Chen-Lin Soo

    2017-01-01

    Full Text Available Studies of Sarawak coastal water quality are scarce, not to mention the application of multivariate statistical approaches to investigate the spatial variation of water quality and to identify pollution sources in Sarawak coastal water. Hence, the present study aimed to evaluate the spatial variation of water quality along the coastline of the southwestern region of Sarawak using multivariate statistical techniques. Seventeen physicochemical parameters were measured at 11 stations along the coastline of approximately 225 km length. The coastal water quality showed spatial heterogeneity, with cluster analysis grouping the 11 stations into four different clusters. Deterioration in coastal water quality was observed in different regions of Sarawak corresponding to land use patterns in the region. Nevertheless, nitrate-nitrogen exceeded the guideline value at all sampling stations along the coastline. Principal component analysis (PCA) identified five principal components that explain 89.0% of the data set variance. The first PC indicated that nutrients were the dominant polluting factors, attributed to domestic, agricultural, and aquaculture activities, followed by suspended solids in the second PC, which are related to logging activities.

  13. Distribution of chlorinated organic pollutants in harbor sediments of Livorno (Italy): a multivariate approach to evaluate dredging sediments.

    Science.gov (United States)

    Cicero, A M; Mecozzi, M; Morlino, R; Pellegrini, D; Veschetti, E

    2001-10-01

    Dredging is a very important procedure for harbor management. In Italy, the guidelines for the offshore dumping of dredged materials are issued by the Ministry of Environment. They describe a few steps of dredging activities, such as the sampling strategy, but do not deal with limits or guide values for the chemical, physical and biological composition of the resulting sediments. The quality of dredged materials is mainly dependent on the presence of inorganic and organic pollutants. In particular, polychlorinated biphenyls (PCBs) and organochlorine pesticides are regarded as a high priority in the marine environment by international organizations because of their persistence, toxicity and bioaccumulation capacity. In this article, the presence of some PCBs and organochlorine pesticides in sediment samples collected from the harbor of Livorno (Northern Tyrrhenian Sea) was investigated. The concentrations of HCHs, Aldrin, Chlordanes, DDEs, DDTs, and PCBs in 12 representative sites ranged between <1 microg kg(-1) and 95, 19, 32, 35, 107, and 111 microg kg(-1), respectively. The application of univariate and multivariate statistical techniques, such as linear regression analysis and principal component analysis, to the experimental data showed a different distribution of PCBs in the two sediment layers. On the contrary, the vertical distribution of the other investigated pollutants was more homogeneous and affected by random variability. The multivariate approach was an important tool for establishing more rational criteria for the management of dredged materials.

  14. Optimisation of Oil Spill Dispersants on Weathered Oils. A New Approach Using Experimental Design and Multivariate Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Brandvik, Per Johan

    1997-12-31

    This thesis describes how laboratory experiments combined with numerical modelling were used to predict the weathering of an oil slick under different environmental conditions (temperature, wind etc.). It also applies laboratory test methods to screen dispersant effectiveness at different temperatures and salinities. A new approach to dispersant optimization based on statistical design and multivariate analysis is developed; this resulted in a new dispersant with low toxicity and high effectiveness on a broad selection of oil types. The thesis illustrates the potential of dispersants as an operational response method for oil spills by discussing three different oil spill scenarios and comparing the effect of using dispersants to mechanical recovery and to doing nothing. Some recommendations that may increase the effectiveness of the Norwegian oil spill contingency are also given. 172 refs., 65 figs., 9 tabs.

  15. Biogeochemical regions of the Mediterranean Sea: An objective multidimensional and multivariate environmental approach

    Science.gov (United States)

    Reygondeau, Gabriel; Guieu, Cécile; Benedetti, Fabio; Irisson, Jean-Olivier; Ayata, Sakina-Dorothée; Gasparini, Stéphane; Koubbi, Philippe

    2017-02-01

    When dividing the ocean, the aim is generally to summarise a complex system into a representative number of units, each representing a specific environment, a biological community or a socio-economic specificity. Recently, several geographical partitions of the global ocean have been proposed using statistical approaches applied to remote sensing or observations gathered during oceanographic cruises. Such geographical frameworks, defined at a macroscale, are hardly applicable to characterising the biogeochemical features of semi-enclosed seas, which are driven by smaller-scale chemical and physical processes. Following Longhurst's biogeochemical partitioning of the pelagic realm, this study investigates the environmental divisions of the Mediterranean Sea using a large set of environmental parameters. These parameters were compiled in both the horizontal and the vertical dimensions to provide a 3D spatial framework for environmental management (12 regions found for the epipelagic, 12 for the mesopelagic, 13 for the bathypelagic and 26 for the seafloor). We show that: (1) the contribution of the longitudinal environmental gradient to the biogeochemical partitions decreases with depth; (2) the partition of the surface layer cannot be extrapolated to other vertical layers, as each partition is driven by a different set of environmental variables. This new partitioning of the Mediterranean Sea has strong implications for conservation, as it highlights that management must account for the differences in zoning with depth at a regional scale.

  16. A multivariate and stochastic approach to identify key variables to rank dairy farms on profitability.

    Science.gov (United States)

    Atzori, A S; Tedeschi, L O; Cannas, A

    2013-05-01

    The economic efficiency of dairy farms is the main goal of farmers. The objective of this work was to use routinely available information at the dairy farm level to develop an index of profitability to rank dairy farms and to assist the decision-making process of farmers to increase the economic efficiency of the entire system. A stochastic modeling approach was used to study the relationships between inputs and profitability (i.e., income over feed cost; IOFC) of dairy cattle farms. The IOFC was calculated as: milk revenue + value of male calves + culling revenue - herd feed costs. Two databases were created. The first one was a development database, which was created from technical and economic variables collected in 135 dairy farms. The second one was a synthetic database (sDB) created from 5,000 synthetic dairy farms using the Monte Carlo technique and based on the characteristics of the development database data. The sDB was used to develop a ranking index as follows: (1) principal component analysis (PCA), excluding IOFC, was used to identify principal components (sPC); and (2) coefficient estimates of a multiple regression of the IOFC on the sPC were obtained. Then, the eigenvectors of the sPC were used to compute the principal component values for the original 135 dairy farms that were used with the multiple regression coefficient estimates to predict IOFC (dRI; ranking index from development database). The dRI was used to rank the original 135 dairy farms. The PCA explained 77.6% of the sDB variability and 4 sPC were selected. The sPC were associated with herd profile, milk quality and payment, poor management, and reproduction based on the significant variables of the sPC. The mean IOFC in the sDB was 0.1377 ± 0.0162 euros per liter of milk (€/L). The dRI explained 81% of the variability of the IOFC calculated for the 135 original farms. When the number of farms below and above 1 standard deviation (SD) of the dRI were calculated, we found that 21
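
The two-step procedure described above (PCA on the predictors, then a multiple regression of IOFC on the retained component scores) can be sketched as follows. The data are a synthetic stand-in for the paper's sDB: the farm variables, loadings and noise levels below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the synthetic database (sDB): 5000 farms described
# by 6 correlated technical variables, plus an IOFC response (EUR/L) that
# depends on them and on noise.  All numbers are illustrative.
n = 5000
factors = rng.normal(size=(n, 4))                 # latent farm profiles
loadings = rng.normal(size=(4, 6))
Xs = factors @ loadings + 0.1 * rng.normal(size=(n, 6))
iofc = 0.14 + 0.01 * Xs[:, 0] - 0.005 * Xs[:, 2] + rng.normal(0.0, 0.002, n)

# Step 1: PCA on the predictors (IOFC excluded), via SVD of the scaled data.
mu, sd = Xs.mean(0), Xs.std(0)
Z = (Xs - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
k = 4                                             # retained components (sPC)
scores = Z @ Vt[:k].T

# Step 2: multiple regression of IOFC on the retained component scores.
A = np.column_stack([np.ones(n), scores])
beta, *_ = np.linalg.lstsq(A, iofc, rcond=None)

def ranking_index(x_new):
    """Predicted IOFC (the ranking index) for farm records: project on the
    stored eigenvectors, then apply the regression coefficients."""
    z = (x_new - mu) / sd
    return beta[0] + (z @ Vt[:k].T) @ beta[1:]

ri = ranking_index(Xs)
r2 = 1 - np.var(iofc - ri) / np.var(iofc)
print("variance of IOFC explained by the index:", round(r2, 3))
```

Farms are then ranked by `ri`; in the study the analogous index explained 81% of the IOFC variability of the 135 original farms.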

  17. A multi-variable box model approach to the soft tissue carbon pump

    Directory of Open Access Journals (Sweden)

    A. M. de Boer

    2010-12-01

    The canonical question of which physical, chemical or biological mechanisms were responsible for oceanic uptake of atmospheric CO2 during the last glacial remains unanswered. Insight from paleo-proxies has led to a multitude of hypotheses, but none so far have been convincingly supported in three-dimensional numerical modelling experiments. The processes that influence the CO2 uptake and export production are inter-related and too complex to solve conceptually, while complex numerical models are time-consuming and expensive to run, which severely limits the combinations of mechanisms that can be explored. Instead, an intermediate inverse box model approach of the soft tissue pump is used here, in which the whole parameter space is explored. The glacial circulation and biological production states are derived from these using proxies of glacial export production and the need to draw down CO2 into the ocean. We find that circulation patterns which explain glacial observations include reduced Antarctic Bottom Water formation and high latitude upwelling and mixing of deep water and, to a lesser extent, reduced equatorial upwelling. The proposed mechanism of CO2 uptake by an increase of eddies in the Southern Ocean, leading to a reduced residual circulation, is not supported. Regarding biological mechanisms, an increase in the nutrient utilization in either the equatorial regions or the northern polar latitudes can reduce atmospheric CO2 and satisfy proxies of glacial export production. Consistent with previous studies, CO2 is drawn down more easily through increased productivity in the Antarctic region than the sub-Antarctic, but that violates observations of lower export production there. The glacial states are more sensitive to changes in the circulation and less sensitive to changes in nutrient utilization rates than the interglacial states.

  18. Explaining nitrate pollution pressure on the groundwater resource in Kinshasa using a multivariate statistical modelling approach

    Science.gov (United States)

    Mfumu Kihumba, Antoine; Vanclooster, Marnik

    2013-04-01

    Drinking water in Kinshasa, the capital of the Democratic Republic of Congo, is provided by extracting groundwater from the local aquifer, particularly in peripheral areas. The exploited groundwater body is mainly unconfined and located within a continuous detrital aquifer, primarily composed of sedimentary formations. However, the aquifer is subjected to an increasing threat of anthropogenic pollution pressure. Understanding the detailed origin of this pollution pressure is important for sustainable drinking water management in Kinshasa. The present study aims to explain the observed nitrate pollution problem, nitrate being considered a good tracer for other pollution threats. The analysis is made in terms of readily available physical attributes, using a statistical modelling approach. For the nitrate data, use was made of a historical groundwater quality assessment study, for which the data were re-analysed. The physical attributes are related to the topography, land use, geology and hydrogeology of the region. Prior to the statistical modelling, the intrinsic and specific vulnerability to nitrate pollution was assessed. This vulnerability assessment showed that the alluvium area in the northern part of the region is the most vulnerable area. This area consists of urban land use with poor sanitation. Re-analysis of the nitrate pollution data demonstrated that the spatial variability of nitrate concentrations in the groundwater body is high, and coherent with the fragmented land use of the region and with the intrinsic and specific vulnerability maps. For the statistical modelling, multiple regression and regression tree analysis were used. The results demonstrated the significant impact of land-use variables on the Kinshasa groundwater nitrate pollution and the need for a detailed delineation of groundwater capture zones around the monitoring stations. Key words: groundwater, isotopic, Kinshasa, modelling, pollution, physico-chemical.
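
The core step of the regression tree analysis mentioned above is finding, for a candidate attribute, the split threshold that best separates the response. A minimal sketch (the land-use fractions and nitrate values are hypothetical, not the study's data):

```python
import numpy as np

def best_split(x, y):
    """Threshold on one attribute minimising the total within-group sum of
    squared errors -- the elementary step of CART-style regression trees."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = (None, np.inf)
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue                       # cannot split between equal values
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if sse < best[1]:
            best = ((xs[i - 1] + xs[i]) / 2.0, sse)
    return best

# Hypothetical data: urban land-use fraction in a capture zone vs nitrate (mg/L).
urban = np.array([0.1, 0.2, 0.25, 0.3, 0.6, 0.7, 0.8, 0.9])
nitrate = np.array([5.0, 6.0, 7.0, 6.5, 40.0, 45.0, 50.0, 48.0])

threshold, sse = best_split(urban, nitrate)
print("split at urban fraction ~", threshold)
```

A full tree would apply this search recursively over all attributes; here the single split already separates the low- and high-nitrate groups.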

  19. Wavelet-based EEG processing for computer-aided seizure detection and epilepsy diagnosis

    Directory of Open Access Journals (Sweden)

    Krishnaveni

    2017-01-01

    Many neurological disorders are very difficult to detect. One such disorder, discussed in this paper, is epilepsy. Epilepsy is marked by a sudden change in the behaviour of a person over a short period of time, caused by seizures in the brain. Much research is underway on detecting epilepsy by analyzing the EEG. One such method of epilepsy detection is proposed in this paper. The technique employs the Discrete Wavelet Transform (DWT) for pre-processing, Approximate Entropy (ApEn) to extract features, and an Artificial Neural Network (ANN) for classification. The paper presents a detailed survey of the various methods being used for epilepsy detection and proposes a wavelet-based epilepsy detection method.
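
The ApEn feature used in pipelines of this kind can be computed directly from its definition. A minimal sketch (standard ApEn with embedding dimension m and tolerance r = 0.2 times the signal's standard deviation; the signals are synthetic stand-ins for EEG subbands):

```python
import numpy as np

def apen(x, m=2, r_factor=0.2):
    """Approximate Entropy (ApEn) of a 1-D signal: low for regular signals,
    high for irregular ones.  Used as a feature after DWT decomposition."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        # Embed the signal in m-dimensional delay vectors.
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between all pairs of template vectors.
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = (d <= r).mean(axis=1)          # fraction of matching templates
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))   # predictable oscillation
noisy = rng.normal(size=500)                        # irregular signal
print("ApEn regular:", apen(regular), "ApEn noisy:", apen(noisy))
```

The regular signal yields the lower entropy; in seizure-detection work the ApEn values of the wavelet subbands feed the ANN classifier.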

  20. Efficient wavelet-based voice/data discriminator for telephone networks

    Science.gov (United States)

    Quirk, Patrick J.; Tseng, Yi-Chyun; Adhami, Reza R.

    1996-06-01

    A broad array of applications in the Public Switched Telephone Network (PSTN) require detailed information about the type of call being carried. This information can be used to enhance service, diagnose transmission impairments, and increase available call capacity. The increase in modem data rates and the increased usage of speech compression in the PSTN have rendered existing detection algorithms obsolete. Wavelets, specifically the Discrete Wavelet Transform (DWT), are a relatively new analysis tool in digital signal processing. The DWT has been applied to signal processing problems ranging from speech compression to astrophysics. In this paper, we present a wavelet-based method of categorizing telephony traffic by call type. Calls are categorized as voice or data. Data calls, primarily modem and fax transmissions, are further divided by the International Telecommunication Union (ITU-T, formerly CCITT) V-series designations (V.22bis, V.32, V.32bis, and V.34).

  1. Wavelet-based correlations of impedance cardiography signals and heart rate variability

    Science.gov (United States)

    Podtaev, Sergey; Dumler, Andrew; Stepanov, Rodion; Frick, Peter; Tziberkin, Kirill

    2010-04-01

    Wavelet-based correlation analysis is employed to study impedance cardiography signals (variation in the impedance of the thorax z(t) and the time derivative of the thoracic impedance (-dz/dt)) and heart rate variability (HRV). A method of computer thoracic tetrapolar polyrheocardiography is used for hemodynamic registration. The modulus of the wavelet-correlation function shows the level of correlation, and the phase indicates the mean phase shift of oscillations at the given scale (frequency). Significant correlations, essentially exceeding the values obtained for noise signals, are found within two spectral ranges, corresponding to respiratory activity (0.14-0.5 Hz) and to endothelium-related metabolic activity and neuroendocrine rhythms (0.0095-0.02 Hz). The phase shift of oscillations in all frequency ranges is probably related to the peculiarities of parasympathetic and neuro-humoral regulation of the cardiovascular system.
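
The modulus/phase decomposition described above can be sketched with a complex Morlet wavelet: correlate the coefficients of the two signals at one scale and read off the modulus (correlation level) and argument (mean phase shift). The signals and scale below are synthetic illustrations, not the study's data:

```python
import numpy as np

def morlet_cwt(sig, scale, w0=6.0):
    """Complex Morlet wavelet coefficients of `sig` at one scale (in samples),
    computed by direct convolution -- a sketch, not an optimised transform."""
    t = np.arange(-int(4 * scale), int(4 * scale) + 1)
    psi = np.exp(1j * w0 * t / scale) * np.exp(-0.5 * (t / scale) ** 2)
    return np.convolve(sig, psi / np.sqrt(scale), mode="same")

def wavelet_correlation(a, b, scale):
    """Complex wavelet correlation at one scale: the modulus measures the
    correlation level, the argument the mean phase shift."""
    wa, wb = morlet_cwt(a, scale), morlet_cwt(b, scale)
    num = np.mean(wa * np.conj(wb))
    return num / np.sqrt(np.mean(np.abs(wa) ** 2) * np.mean(np.abs(wb) ** 2))

# Two signals sharing a 0.25 Hz rhythm (respiratory band), sampled at 2 Hz,
# the second delayed by a quarter period (1 s).
t = np.arange(0.0, 300.0, 0.5)
a = np.sin(2 * np.pi * 0.25 * t)
b = np.sin(2 * np.pi * 0.25 * (t - 1.0))
c = wavelet_correlation(a, b, scale=7.6)   # scale matching the 8-sample period
print("modulus:", abs(c), "phase (rad):", np.angle(c))
```

For this quarter-period lag the modulus is close to 1 and the phase close to pi/2, which is how the study reads off phase shifts between thoracic impedance and HRV oscillations.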

  2. Wavelet-based method for computing elastic band gaps of one-dimensional phononic crystals

    Institute of Scientific and Technical Information of China (English)

    YAN ZhiZhong; WANG YueSheng

    2007-01-01

    A wavelet-based method was developed to compute elastic band gaps of one-dimensional phononic crystals. The wave field was expanded in the wavelet basis and an equivalent eigenvalue problem was derived in a matrix form involving the adaptive computation of integrals of the wavelets. The method was then applied to a binary system. For comparison, the elastic band gaps of the same one-dimensional phononic crystals computed with the wavelet method and the well-known plane wave expansion (PWE) method are both presented in this paper. The numerical results of the two methods are in good agreement, while the computation costs of the wavelet method are much lower than those of the PWE method. In addition, the adaptability of wavelets makes efficient band gap computation of more complex phononic structures possible.

  3. JPEG2000-Compatible Scalable Scheme for Wavelet-Based Video Coding

    Directory of Open Access Journals (Sweden)

    André Thomas

    2007-01-01

    We present a simple yet efficient scalable scheme for wavelet-based video coders, able to provide on-demand spatial, temporal, and SNR scalability, and fully compatible with the still-image coding standard JPEG2000. Whereas hybrid video coders must undergo significant changes in order to support scalability, our coder only requires a specific wavelet filter for temporal analysis, as well as an adapted bit allocation procedure based on models of rate-distortion curves. Our study shows that scalably encoded sequences have the same or almost the same quality as nonscalably encoded ones, without a significant increase in complexity. Full compatibility with Motion JPEG2000, which is emerging as a serious candidate for the compression of high-definition video sequences, is ensured.

  5. Self-similar prior and wavelet bases for hidden incompressible turbulent motion

    CERN Document Server

    Héas, Patrick; Kadri-Harouna, Souleymane

    2013-01-01

    This work is concerned with the ill-posed inverse problem of estimating turbulent flows from the observation of an image sequence. From a Bayesian perspective, a divergence-free isotropic fractional Brownian motion (fBm) is chosen as a prior model for instantaneous turbulent velocity fields. This self-similar prior accurately characterizes the second-order statistics of velocity fields in incompressible isotropic turbulence. Nevertheless, the associated maximum a posteriori involves a fractional Laplacian operator which is delicate to implement in practice. To deal with this issue, we propose to decompose the divergence-free fBm on well-chosen wavelet bases. As a first alternative, we propose to design wavelets as whitening filters. We show that these filters are fractional Laplacian wavelets composed with the Leray projector. As a second alternative, we use a divergence-free wavelet basis, which implicitly takes into account the incompressibility constraint arising from physics. Although the latter decomposition ...

  6. Application of Wavelet-based Active Power Filter in Accelerator Magnet Power Supply

    CERN Document Server

    Xiaoling, Guo

    2013-01-01

    As modern accelerators demand excellent stability of the magnet power supply (PS), it is necessary to decrease the harmonic currents passing through the magnets. Aiming to suppress ripple current from the PS in the Beijing Electron-Positron Collider II, a wavelet-based active power filter (APF) is proposed in this paper. The APF is an effective device for improving current quality: as a countermeasure to harmonic currents, the APF circuit generates a harmonic current that cancels the harmonic current from the PS. Discrete wavelet transform is used to analyze the harmonic components in the supply current, and the active power filter circuit acts according to the analysis results. Finally, simulation and experimental results are given to demonstrate the effectiveness of the proposed APF.
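
The wavelet analysis step can be sketched with a single Haar DWT level: split the supply current into a low-frequency band (fundamental) and a high-frequency band (ripple/harmonics), from which the APF would synthesise its compensation current. The signal, sampling rate and ripple frequency below are all hypothetical:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT: approximation (low band), detail (high band)."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] + x[1::2]) / np.sqrt(2.0), (x[0::2] - x[1::2]) / np.sqrt(2.0)

def haar_idwt(a, d):
    """Inverse of one Haar level (perfect reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

# Hypothetical supply current: 50 Hz fundamental plus high-frequency ripple,
# sampled at 6.4 kHz.  Zeroing the detail band approximates the fundamental;
# the APF would inject the negated detail component as a counter-current.
t = np.arange(1024) / 6400.0
fundamental = np.sin(2 * np.pi * 50.0 * t)
ripple = 0.2 * np.sin(2 * np.pi * 2400.0 * t)
current = fundamental + ripple

a, d = haar_dwt(current)
cleaned = haar_idwt(a, np.zeros_like(d))
print("residual ripple variance:", np.var(cleaned - fundamental))
```

A practical filter would use several decomposition levels and a longer wavelet than Haar for sharper band separation; this single level already removes most of the ripple energy.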

  7. An Investigation of Wavelet Bases for Grid-Based Multi-Scale Simulations Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Baty, R.S.; Burns, S.P.; Christon, M.A.; Roach, D.W.; Trucano, T.G.; Voth, T.E.; Weatherby, J.R.; Womble, D.E.

    1998-11-01

    The research summarized in this report is the result of a two-year effort that has focused on evaluating the viability of wavelet bases for the solution of partial differential equations. The primary objective for this work has been to establish a foundation for hierarchical/wavelet simulation methods based upon numerical performance, computational efficiency, and the ability to exploit the hierarchical adaptive nature of wavelets. This work has demonstrated that hierarchical bases can be effective for problems with a dominant elliptic character. However, the strict enforcement of orthogonality was found to be less desirable than weaker semi-orthogonality or bi-orthogonality for solving partial differential equations. This conclusion has led to the development of a multi-scale linear finite element based on a hierarchical change of basis. The reproducing kernel particle method has been found to yield extremely accurate phase characteristics for hyperbolic problems while providing a convenient framework for multi-scale analyses.

  8. Passive microrheology of soft materials with atomic force microscopy: A wavelet-based spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Torres, C.; Streppa, L. [CNRS, UMR5672, Laboratoire de Physique, Ecole Normale Supérieure de Lyon, 46 Allée d'Italie, Université de Lyon, 69007 Lyon (France); Arneodo, A.; Argoul, F. [CNRS, UMR5672, Laboratoire de Physique, Ecole Normale Supérieure de Lyon, 46 Allée d'Italie, Université de Lyon, 69007 Lyon (France); CNRS, UMR5798, Laboratoire Ondes et Matière d'Aquitaine, Université de Bordeaux, 351 Cours de la Libération, 33405 Talence (France); Argoul, P. [Université Paris-Est, Ecole des Ponts ParisTech, SDOA, MAST, IFSTTAR, 14-20 Bd Newton, Cité Descartes, 77420 Champs sur Marne (France)

    2016-01-18

    Compared to active microrheology where a known force or modulation is periodically imposed to a soft material, passive microrheology relies on the spectral analysis of the spontaneous motion of tracers inherent or external to the material. Passive microrheology studies of soft or living materials with atomic force microscopy (AFM) cantilever tips are rather rare because, in the spectral densities, the rheological response of the materials is hardly distinguishable from other sources of random or periodic perturbations. To circumvent this difficulty, we propose here a wavelet-based decomposition of AFM cantilever tip fluctuations and we show that when applying this multi-scale method to soft polymer layers and to living myoblasts, the structural damping exponents of these soft materials can be retrieved.

  9. Neuro-Fuzzy Wavelet Based Adaptive MPPT Algorithm for Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Syed Zulqadar Hassan

    2017-03-01

    An intelligent control of photovoltaics is necessary to ensure fast response and high efficiency under different weather conditions. This is often arduous to accomplish using traditional linear controllers, as photovoltaic systems are nonlinear and contain several uncertainties. Based on an analysis of the existing literature on Maximum Power Point Tracking (MPPT) techniques, a high-performance neuro-fuzzy indirect wavelet-based adaptive MPPT control is developed in this work. The proposed controller combines the reasoning capability of fuzzy logic, the learning capability of neural networks and the localization properties of wavelets. In the proposed system, the Hermite Wavelet-embedded Neural Fuzzy (HWNF)-based gradient estimator is adopted to estimate the gradient term and makes the controller indirect. The performance of the proposed controller is compared with different conventional and intelligent MPPT control techniques. MATLAB results show its superiority over other existing techniques in terms of fast response, power quality and efficiency.

  10. A parameter-tuned genetic algorithm for statistically constrained economic design of multivariate CUSUM control charts: a Taguchi loss approach

    Science.gov (United States)

    Niaki, Seyed Taghi Akhavan; Javad Ershadi, Mohammad

    2012-12-01

    In this research, the main parameters of the multivariate cumulative sum (CUSUM) control chart (the reference value k, the control limit H, the sample size n and the sampling interval h) are determined by minimising the Lorenzen-Vance cost function [Lorenzen, T.J., and Vance, L.C. (1986), 'The Economic Design of Control Charts: A Unified Approach', Technometrics, 28, 3-10], in which the external costs of employing the chart are added. In addition, the model is statistically constrained to achieve desired in-control and out-of-control average run lengths. The Taguchi loss approach is used to model the problem and a genetic algorithm, for which its main parameters are tuned using the response surface methodology (RSM), is proposed to solve it. At the end, sensitivity analyses on the main parameters of the cost function are presented and their practical conclusions are drawn. The results show that RSM significantly improves the performance of the proposed algorithm and the external costs of applying the chart, which are due to real-world constraints, do not increase the average total loss very much.
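
The chart being designed above accumulates multivariate deviations from target; a common formulation is Crosier's multivariate CUSUM, sketched below with the reference value k and control limit H as free parameters (the values, shift size and covariance here are illustrative, not the paper's optimised design):

```python
import numpy as np

def mcusum_run_length(x, siginv, k, h):
    """Run length of Crosier's multivariate CUSUM: index of the first sample
    at which the chart statistic exceeds the control limit h."""
    s = np.zeros(x.shape[1])
    for t, xt in enumerate(x):
        v = s + xt
        c = np.sqrt(v @ siginv @ v)        # Mahalanobis length of s + x_t
        s = np.zeros_like(s) if c <= k else v * (1.0 - k / c)
        if np.sqrt(s @ siginv @ s) > h:
            return t + 1
    return len(x)

rng = np.random.default_rng(2)
p, k, h = 2, 0.5, 5.5
siginv = np.eye(p)                          # inverse in-control covariance

in_ctrl = rng.normal(size=(2000, p))        # process on target
out_ctrl = rng.normal(size=(2000, p)) + 2.0 # mean shifted by 2 per variable

rl_in = mcusum_run_length(in_ctrl, siginv, k, h)
rl_out = mcusum_run_length(out_ctrl, siginv, k, h)
print("in-control RL:", rl_in, "| out-of-control RL:", rl_out)
```

The economic design problem wraps a search (here, the tuned genetic algorithm) around such run-length simulations to choose n, h, k and H subject to the ARL constraints.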

  11. Diversity trends in bread wheat in Italy during the 20th century assessed by traditional and multivariate approaches.

    Science.gov (United States)

    Ormoli, Leonardo; Costa, Corrado; Negri, Stefano; Perenzin, Maurizio; Vaccino, Patrizia

    2015-02-25

    A collection of 157 Triticum aestivum accessions, representative of wheat breeding in Italy during the 20th century, was assembled to describe the evolutionary trends of cultivated varieties throughout this period. The lines were cultivated in Italy, in two locations, over two growing seasons, and evaluated for several agronomical, morphological and qualitative traits. Analyses were conducted using the most common univariate approach on individual plant traits, coupled with a multivariate correspondence approach. ANOVA showed a clear trend from old to new varieties, leading towards earliness, plant height reduction and denser spikes with smaller seeds. The average protein content gradually decreased over time; however, this trend did not affect bread-making quality, because it was counterbalanced by a gradual increase of SDS sedimentation volume, achieved by the incorporation of favourable alleles into recent cultivars. Correspondence analysis allowed an overall view of the breeding activity. A clear-cut separation was observed between ancient lines and all the others, matched with a two-step gradient: the first, corresponding roughly to the period 1920-1940, can be ascribed mostly to genetics; the second, from the 1940s onward, can be ascribed also to farming practice innovations, such as the improvement of mechanical devices and the optimised use of fertilizers.

  12. Assessment on the pollution of nitrogen and phosphorus of Beijing surface water based on GIS system and multivariate statistical approaches

    Institute of Scientific and Technical Information of China (English)

    LI Lian-fang; LI Guo-xue; LIAO Xiao-yong

    2004-01-01

    This paper presents the characteristics of nitrogen and phosphorus pollution in Beijing surface water during the survey. A significant difference was found in the concentration distribution of the various nitrogen and phosphorus parameters. Most water bodies in the five water systems were polluted by total nitrogen, with contents up to 120 mg/L, exceeding the fifth-class limit of the national surface water quality standard GB3838-2002, except for several segments of the Chaobaihe and Yongdinghe. Ammonia and phosphorus showed a similar distribution, with higher contents in the Daqinghe, Beiyunhe and Jiyunhe water systems but relatively low concentrations in the Chaobaihe and Yongdinghe water systems. Meanwhile, nitrate was found at comparatively low contents (mostly less than 10 mg/L) and met the corresponding water quality requirements. Overall, the water quality of the Daqinghe, Jiyunhe and Beiyunhe river systems, as well as the lower reaches of the Yongdinghe and Chaobaihe, was seriously contaminated with high contents of total nitrogen and phosphorus. Through multivariate statistical approaches, it can be concluded that total nitrogen, ammonia and total phosphorus were highly correlated with chemical oxygen demand, biochemical oxygen demand, dissolved oxygen and electrical conductivity, which points to a common pollution source in anthropogenic activities.

  13. A Multivariate Approach to Evaluate Biomass Production, Biochemical Composition and Stress Compounds of Spirulina platensis Cultivated in Wastewater.

    Science.gov (United States)

    Çelekli, Abuzer; Topyürek, Ali; Markou, Giorgos; Bozkurt, Hüseyin

    2016-10-01

    The study was performed to investigate the effects of using cow effluent for the cultivation of Spirulina platensis on its biomass production and cell physiology. S. platensis was cultivated for 15 days in three different cow effluents (CE) used as cultivation medium. CE was prepared using dry cow manure, and it was further modified with supplements of NaNO3 (CEN) and NaNO3 + NaCl (CENS). The high nitrate value stimulated the chlorophyll-a and total protein content of the cyanobacterium, and also biomass production, in the standard medium (SM) and CEN media. Total carbohydrate content of S. platensis grown in CE media was found to be higher (p biomass and biochemical compounds by the cyanobacterium grown on the CE and SM media were evaluated using a multivariate approach. Conductivity, oxidation reduction potential (ORP), salinity, pH, and TDS played an important role (p biomass production, filament length, and proline. Canonical correspondence analysis suggested that the biochemical compounds of S. platensis were affected not only by the salinity and nutrition of the media but also by pH and ORP. The present study indicated that CEN, as a low-cost model medium, had high potential for the production of S. platensis biomass with high protein content.

  14. A wavelet based approach to measure and manage contagion at different time scales

    Science.gov (United States)

    Berger, Theo

    2015-10-01

    We decompose financial return series of US stocks into different time scales with respect to different market regimes. First, we examine dependence structure of decomposed financial return series and analyze the impact of the current financial crisis on contagion and changing interdependencies as well as upper and lower tail dependence for different time scales. Second, we demonstrate to which extent the information of different time scales can be used in the context of portfolio management. As a result, minimizing the variance of short-run noise outperforms a portfolio that minimizes the variance of the return series.
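
The portfolio application described above can be sketched as follows: extract the short-run (finest-scale) component of each return series with a simple undecimated Haar-style filter, then compare minimum-variance weights computed from the raw returns with weights computed from the short-run noise alone. The return-generating process below is synthetic, not the paper's data:

```python
import numpy as np

def haar_detail(x):
    """Short-run component of a series: x minus its 2-point moving average
    (the finest-scale detail of an undecimated Haar-type decomposition)."""
    return x - np.convolve(x, [0.5, 0.5], mode="same")

def min_var_weights(cov):
    """Weights of the unconstrained minimum-variance portfolio."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

rng = np.random.default_rng(3)
# Hypothetical daily returns for three assets sharing a common factor,
# with increasing idiosyncratic volatility.
n = 1000
common = rng.normal(size=n)
rets = np.column_stack([0.6 * common + rng.normal(size=n) * s
                        for s in (0.5, 1.0, 1.5)])

w_raw = min_var_weights(np.cov(rets.T))
w_noise = min_var_weights(np.cov(np.apply_along_axis(haar_detail, 0, rets).T))
print("raw-variance weights:   ", np.round(w_raw, 3))
print("short-run-noise weights:", np.round(w_noise, 3))
```

In the study's framing, minimising the variance of the short-run noise component (rather than of the raw returns) produced the better-performing portfolio.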

  15. A time-scale analysis of systematic risk: wavelet-based approach

    OpenAIRE

    Khalfaoui Rabeh, K; Boutahar Mohamed, B

    2011-01-01

    The paper studies the impact of different time-scales on the market risk of individual stock market returns and of a given portfolio in Paris Stock Market by applying the wavelet analysis. To investigate the scaling properties of stock market returns and the lead/lag relationship between them at different scales, wavelet variance and crosscorrelations analyses are used. According to wavelet variance, stock returns exhibit long memory dynamics. The wavelet cross-correlation analysis shows that...

  16. HIRDLS observations of global gravity wave absolute momentum fluxes: A wavelet based approach

    Science.gov (United States)

    John, Sherine Rachel; Kishore Kumar, Karanam

    2016-02-01

    Using wavelet technique for detection of height varying vertical and horizontal wavelengths of gravity waves, the absolute values of gravity wave momentum fluxes are estimated from High Resolution Dynamics Limb Sounder (HIRDLS) temperature measurements. Two years of temperature measurements (2005 December-2007 November) from HIRDLS onboard EOS-Aura satellite over the globe are used for this purpose. The least square fitting method is employed to extract the 0-6 zonal wavenumber planetary wave amplitudes, which are removed from the instantaneous temperature profiles to extract gravity wave fields. The vertical and horizontal wavelengths of the prominent waves are computed using wavelet and cross correlation techniques respectively. The absolute momentum fluxes are then estimated using prominent gravity wave perturbations and their vertical and horizontal wavelengths. The momentum fluxes obtained from HIRDLS are compared with the fluxes obtained from ground based Rayleigh LIDAR observations over a low latitude station, Gadanki (13.5°N, 79.2°E) and are found to be in good agreement. After validation, the absolute gravity wave momentum fluxes over the entire globe are estimated. It is found that the winter hemisphere has the maximum momentum flux magnitudes over the high latitudes with a secondary maximum over the summer hemispheric low-latitudes. The significance of the present study lies in introducing the wavelet technique for estimating the height varying vertical and horizontal wavelengths of gravity waves and validating space based momentum flux estimations using ground based lidar observations.
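
Once the temperature amplitude and the vertical and horizontal wavelengths of a wave have been extracted, the absolute momentum flux follows from the standard midfrequency relation |M| = (rho/2)(lambda_z/lambda_h)(g/N)^2 (T'/T)^2 used with limb-sounder data. A worked example with illustrative stratospheric values (all numbers below are assumptions, not HIRDLS retrievals):

```python
# Absolute gravity-wave momentum flux from a temperature perturbation.
g = 9.81          # m s^-2, gravity
N = 0.02          # s^-1, buoyancy frequency
rho = 0.4         # kg m^-3, background density (hypothetical, ~30 km)
T_bar = 230.0     # K, background temperature
T_prime = 2.0     # K, wave temperature amplitude from the wavelet analysis
lambda_z = 5e3    # m, vertical wavelength (wavelet technique)
lambda_h = 500e3  # m, horizontal wavelength (cross-correlation technique)

flux = 0.5 * rho * (lambda_z / lambda_h) * (g / N) ** 2 * (T_prime / T_bar) ** 2
print("absolute momentum flux:", flux, "Pa")
```

The result is a few tens of mPa, the order of magnitude typically reported for stratospheric gravity-wave momentum fluxes.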

  17. A comparison between wavelet based static and dynamic neural network approaches for runoff prediction

    Science.gov (United States)

    Shoaib, Muhammad; Shamseldin, Asaad Y.; Melville, Bruce W.; Khan, Mudasser Muneer

    2016-04-01

    In order to predict runoff accurately from a rainfall event, multilayer perceptron neural network models are commonly used in hydrology. Furthermore, wavelet-coupled multilayer perceptron neural network (MLPNN) models have also been found superior to simple neural network models that are not coupled with wavelets. However, MLPNN models are static, memoryless networks and lack the ability to examine the temporal dimension of the data. Recurrent neural network models, on the other hand, have the ability to learn from the preceding conditions of the system and are hence considered dynamic models. This study for the first time explores the potential of wavelet-coupled time lagged recurrent neural network (TLRNN) models for runoff prediction using rainfall data. The Discrete Wavelet Transform (DWT) is employed to decompose the input rainfall data using six of the most commonly used wavelet functions. The performance of the simple and the wavelet-coupled static MLPNN models is compared with that of their dynamic TLRNN counterparts. The study found that the dynamic wavelet-coupled TLRNN models can be considered an alternative to the static wavelet MLPNN models. The study also investigated the effect of memory depth on the performance of the static and dynamic neural network models; the memory depth refers to how much past information (lagged data) is required, as it is not known a priori. The db8 wavelet function is found to yield the best results with the static MLPNN models and with TLRNN models having small memory depths. The performance of the wavelet-coupled TLRNN models with large memory depths is found to be insensitive to the selection of the wavelet function, as all wavelet functions yield similar performance.
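
The "memory depth" that a static network needs explicitly can be made concrete: each training pattern for an MLPNN is a window of past (lagged) inputs, whereas a recurrent TLRNN carries that history in its internal state. A minimal sketch with a toy rainfall series:

```python
import numpy as np

def lagged_matrix(x, depth):
    """Input matrix with `depth` past values per row -- the explicit
    'memory depth' fed to a static network."""
    rows = [x[i - depth:i] for i in range(depth, len(x))]
    return np.array(rows)

rain = np.arange(10.0)            # toy rainfall series: 0, 1, ..., 9
X = lagged_matrix(rain, depth=3)
print(X[0])                       # first training pattern: [0. 1. 2.]
print(X.shape)                    # (7, 3)
```

In a wavelet-coupled setup, each DWT subband of the rainfall series would be lagged this way before being fed to the static network.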

  18. Construction of compactly supported biorthogonal wavelet based on Human Visual System

    Science.gov (United States)

    Hu, Haiping; Hou, Weidong; Liu, Hong; Mo, Yu L.

    2000-11-01

    As an important analysis tool, the wavelet transform has seen great development in image compression coding since Daubechies constructed a family of compactly supported orthogonal wavelets and Mallat presented a fast pyramid algorithm for wavelet decomposition and reconstruction. In order to raise the compression ratio and improve the visual quality of the reconstruction, it becomes very important to find a wavelet basis that fits the human visual system (HVS). The Marr wavelet is known to model the HVS well, but it is not compactly supported, so it is not suitable for implementing image compression coding. In this paper, a new method is provided to construct a kind of compactly supported biorthogonal wavelet based on the human visual system: we employ a genetic algorithm to construct compactly supported biorthogonal wavelets that approximate the modulation transfer function of the HVS. The newly constructed wavelet is applied to image compression coding in our experiments. The experimental results indicate that the visual quality of reconstruction with the new wavelet is equivalent to that of other compactly supported biorthogonal wavelets at the same bit rate. It shows good reconstruction performance, especially for texture image compression coding.

  19. Wavelet-based multifractal analysis of dynamic infrared thermograms to assist in early breast cancer diagnosis

    Directory of Open Access Journals (Sweden)

    Evgeniya Gerasimova

    2014-05-01

    Full Text Available Breast cancer is the most common type of cancer among women and despite recent advances in the medical field, there are still some inherent limitations in the currently used screening techniques. The radiological interpretation of screening X-ray mammograms often leads to over-diagnosis and, as a consequence, to unnecessary traumatic and painful biopsies. Here we propose a computer-aided multifractal analysis of dynamic infrared (IR imaging as an efficient method for identifying women with risk of breast cancer. Using a wavelet-based multi-scale method to analyze the temporal fluctuations of breast skin temperature collected from a panel of patients with diagnosed breast cancer and some female volunteers with healthy breasts, we show that the multifractal complexity of temperature fluctuations observed in healthy breasts is lost in mammary glands with malignant tumor. Besides potential clinical impact, these results open new perspectives in the investigation of physiological changes that may precede anatomical alterations in breast cancer development.

  20. A Wavelet-Based ECG Delineation Method: Adaptation to an Experimental Electrograms with Manifested Global Ischemia.

    Science.gov (United States)

    Hejč, Jakub; Vítek, Martin; Ronzhina, Marina; Nováková, Marie; Kolářová, Jana

    2015-09-01

    We present a novel wavelet-based ECG delineation method with robust classification of the P wave and T wave. The work aims to adapt the method to long-term experimental electrograms (EGs) measured on isolated rabbit hearts and to evaluate the effect of global ischemia in experimental EGs on delineation performance. The algorithm was tested on a set of 263 rabbit EGs with established reference points and on human signals from the standard Common Standards for Quantitative Electrocardiography Database (CSEDB). On CSEDB, the standard deviation (SD) of measured errors satisfies the given criteria at each point and the results are comparable to other published works. In rabbit signals, our QRS detector reached a sensitivity of 99.87% and a positive predictivity of 99.89% despite an overlap of the spectral components of the QRS complex, P wave and power-line noise. The algorithm performs well in suppressing J-point elevation and reached a low overall error in both QRS onset (SD = 2.8 ms) and QRS offset (SD = 4.3 ms) delineation. The T wave offset is detected with acceptable error (SD = 12.9 ms) and a sensitivity of nearly 99%. The variance of the errors during global ischemia remains relatively stable; however, more failures in the detection of the T wave and P wave occur. Due to differences in spectral and timing characteristics, the parameters of the rabbit-based algorithm have to be highly adaptable and set more precisely than for human ECG signals to reach acceptable performance.

  1. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios where communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or has been destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  2. Error Estimations in an Approximation on a Compact Interval with a Wavelet Basis

    Directory of Open Access Journals (Sweden)

    Dr. Marco Schuchmann

    2013-11-01

    Full Text Available In an approximation with a wavelet basis we have, in practice, not only an error if the function y is not in Vj; there is a second error because we do not use all basis functions. If the wavelet has compact support, we introduce no error by using only part of the basis functions. If we need an approximation on a compact interval I (which we can obtain even if y is not square integrable on R, because in that case it must only be square integrable on I), calculating an orthogonal projection of 1I y onto Vj leads to worse approximations. We can get much better approximations if we apply a least squares approximation with points in I. Here we will see that this approximation can be much better than an orthogonal projection of y or 1I y onto Vj. With the Shannon wavelet, which has no compact support, we saw in many simulations that a least squares approximation can lead to much better results than with well-known wavelets with compact support. So in this article we derive an error estimate for the Shannon wavelet when not all basis coefficients are used.
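
    The article's point can be illustrated numerically with NumPy: a least squares fit over Shannon scaling functions 2^(j/2) sinc(2^j x - k), using only sample points in the interval I. The test function, interval and scale j below are arbitrary choices for the sketch, not taken from the article:

```python
import numpy as np

def shannon_phi(x, j, k):
    # Shannon scaling function 2^{j/2} sinc(2^j x - k); np.sinc(t) = sin(pi t)/(pi t)
    return 2.0 ** (j / 2) * np.sinc(2.0 ** j * x - k)

j = 2
x = np.linspace(-3.0, 3.0, 400)          # sample points in the interval I = [-3, 3]
y = np.exp(-x ** 2)                      # function to approximate on I

ks = np.arange(-3 * 2 ** j, 3 * 2 ** j + 1)  # only translates centered in I
A = np.column_stack([shannon_phi(x, j, k) for k in ks])

# Least squares approximation with points in I (instead of an orthogonal
# projection of 1_I * y onto V_j, and using only part of the basis)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
max_err = np.max(np.abs(A @ coef - y))   # small despite the truncated basis
```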

  3. Wavelet-Based ECG Steganography for Protecting Patient Confidential Information in Point-of-Care Systems.

    Science.gov (United States)

    Ibaida, Ayman; Khalil, Ibrahim

    2013-12-01

    With a growing aging population, a significant portion of which suffers from cardiac diseases, it is conceivable that remote ECG patient monitoring systems will be widely used as point-of-care (PoC) applications in hospitals around the world. Therefore, huge amounts of ECG signals collected by body sensor networks from remote patients at home will be transmitted along with other physiological readings, such as blood pressure, temperature and glucose level, and diagnosed by those remote patient monitoring systems. It is vitally important that patient confidentiality is protected while data are transmitted over the public network as well as when they are stored in the hospital servers used by remote monitoring systems. In this paper, a wavelet-based steganography technique is introduced which combines encryption and scrambling to protect patient confidential data. The proposed method allows an ECG signal to hide the corresponding patient's confidential data and other physiological information, thus guaranteeing the integration between the ECG and the rest. To evaluate the effect of the proposed technique on the ECG signal, two distortion metrics have been used: the percentage residual difference and the wavelet weighted PRD. It is found that the proposed technique provides high-security protection for patient data with low (less than 1%) distortion, and the ECG data remain diagnosable after watermarking (i.e., hiding the patient's confidential data) as well as after the watermarks (i.e., the hidden data) are removed from the watermarked data.
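
    The first of the two distortion metrics, the percentage residual difference (PRD), is straightforward to state. A minimal NumPy sketch, with a synthetic sinusoid standing in for an ECG trace and small additive noise mimicking a low-amplitude hidden payload:

```python
import numpy as np

def prd(original, watermarked):
    """Percentage residual difference between the host and watermarked signal."""
    num = np.sum((original - watermarked) ** 2)
    den = np.sum(original ** 2)
    return 100.0 * np.sqrt(num / den)

rng = np.random.default_rng(1)
ecg = np.sin(np.linspace(0.0, 8.0 * np.pi, 1000))      # stand-in for an ECG trace
stego = ecg + 1e-4 * rng.standard_normal(ecg.size)     # tiny embedded payload
distortion = prd(ecg, stego)                           # well under the 1% criterion
```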

  4. A NOVEL BIOMETRICS TRIGGERED WATERMARKING OF IMAGES BASED ON WAVELET BASED CONTOURLET TRANSFORM

    Directory of Open Access Journals (Sweden)

    Elakkiya Soundar

    2013-01-01

    Full Text Available The rapid development of network and digital technology has led to several issues for digital content. A technical solution that provides law enforcement and copyright protection is digital watermarking, the process of embedding information into a digital image in a way that is difficult to remove. The proposed method contains the following phases: (i) pre-processing of the biometric image; (ii) key generation from the biometrics of the owner/user and randomization of the host image using Speeded-Up Robust Features (SURF); (iii) application of the Wavelet-Based Contourlet Transform (WBCT) to the host image; the WBCT can give an anisotropy-optimal representation of the edges and contours in the image by virtue of its multi-scale framework and multi-directionality; (iv) Singular Value Decomposition (SVD) applied to the watermark image; (v) embedding of the watermark image into the host image. The comparative analysis confirms the efficiency and robustness of the proposed system. Index Terms: digital watermarking, copyright, pre-processing, wavelet, Speeded-Up Robust Features.

  5. Adaptive Audio Watermarking via the Optimization Point of View on the Wavelet-Based Entropy

    CERN Document Server

    Chen, Shuo-Tsung; Chen, Chur-Jen

    2011-01-01

    This study aims to present an adaptive audio watermarking method using ideas of wavelet-based entropy (WBE). The method converts low-frequency coefficients of the discrete wavelet transform (DWT) into the WBE domain, followed by the calculation of mean values for each audio signal and the derivation of some essential properties of the WBE. A characteristic curve relating the WBE and the DWT coefficients is also presented. The foundation of the embedding process lies in the approximately invariant property demonstrated by the mean of each audio signal and the characteristic curve. Besides, the quality of the watermarked audio is optimized. In the detecting process, the watermark can be extracted using only the values of the WBE. Finally, the performance of the proposed watermarking method is analyzed in terms of signal-to-noise ratio, mean opinion score and robustness. Experimental results confirm that the embedded data are robust against common attacks such as resampling, MP3 compression, low-pass filtering and amplitude scaling.
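
    The abstract does not reproduce the paper's exact WBE formula. One plausible construction (an assumption for illustration only, computed with PyWavelets) is the Shannon entropy of the normalized energy distribution over DWT subbands:

```python
import numpy as np
import pywt

def wavelet_based_entropy(signal, wavelet='db4', level=4):
    # Entropy of the normalized subband energies of a DWT decomposition.
    # A common "wavelet entropy" construction; the paper's exact form may differ.
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    return float(-np.sum(p * np.log2(p + 1e-12)))

fs = 8000
t = np.arange(0, 1, 1 / fs)
tone = np.sin(2 * np.pi * 100 * t)                        # energy in one subband
noise = np.random.default_rng(2).standard_normal(t.size)  # energy in every subband
```

A pure tone concentrates its energy in a single subband and yields a low WBE, while broadband noise spreads energy across all subbands and yields a high WBE.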

  6. Selective error detection for error-resilient wavelet-based image coding.

    Science.gov (United States)

    Karam, Lina J; Lam, Tuyet-Trang

    2007-12-01

    This paper introduces the concept of a similarity check function for error-resilient multimedia data transmission. The proposed similarity check function provides information about the effects of corrupted data on the quality of the reconstructed image. The degree of data corruption is measured by the similarity check function at the receiver, without explicit knowledge of the original source data. The design of a perceptual similarity check function is presented for wavelet-based coders such as the JPEG2000 standard, and used with a proposed "progressive similarity-based ARQ" (ProS-ARQ) scheme to significantly decrease the retransmission rate of corrupted data while maintaining very good visual quality of images transmitted over noisy channels. Simulation results with JPEG2000-coded images transmitted over the Binary Symmetric Channel show that the proposed ProS-ARQ scheme significantly reduces the number of retransmissions as compared to conventional ARQ-based schemes. The presented results also show that, for the same number of retransmitted data packets, the proposed ProS-ARQ scheme can achieve significantly higher PSNR and better visual quality as compared to the selective-repeat ARQ scheme.

  7. Performance evaluation of wavelet-based ECG compression algorithms for telecardiology application over CDMA network.

    Science.gov (United States)

    Kim, Byung S; Yoo, Sun K

    2007-09-01

    The use of wireless networks bears great practical importance for the instantaneous transmission of ECG signals during movement. In this paper, three typical wavelet-based ECG compression algorithms, Rajoub (RA), Embedded Zerotree Wavelet (EZ), and Wavelet Transform Higher-Order Statistics Coding (WH), were evaluated to find an appropriate ECG compression algorithm for scalable and reliable wireless telecardiology applications, particularly over a CDMA network. The short-term and long-term performance characteristics of the three algorithms were analyzed using normal, abnormal, and measurement noise-contaminated ECG signals from the MIT-BIH database. In addition to the processing delay measurement, compression efficiency and reconstruction sensitivity to error were also evaluated via simulation models including the noise-free channel model, random noise channel model, and CDMA channel model, as well as over an actual CDMA network currently operating in Korea. This study found that the EZ algorithm achieves the best compression efficiency within a low-noise environment, and that the WH algorithm is competitive for use in high-error environments, although its short-term performance degrades with abnormal or contaminated ECG signals.

  8. Wavelet-based AR-SVM for health monitoring of smart structures

    Science.gov (United States)

    Kim, Yeesock; Chong, Jo Woon; Chon, Ki H.; Kim, JungMi

    2013-01-01

    This paper proposes a novel structural health monitoring framework for damage detection of smart structures. The framework is developed through the integration of the discrete wavelet transform, an autoregressive (AR) model, damage-sensitive features, and a support vector machine (SVM). The steps of the method are the following: (1) the wavelet-based AR (WAR) model is estimated from vibration signals obtained from both the undamaged and damaged smart structures under a variety of random excitations; (2) a new damage-sensitive feature is formulated in terms of the AR parameters estimated from the structural velocity responses; and then (3) the SVM is applied to each group of damaged and undamaged data sets in order to optimally separate them into damaged and healthy groups. To demonstrate the effectiveness of the proposed structural health monitoring framework, a three-story smart building equipped with a magnetorheological (MR) damper under artificial earthquake signals is studied. It is shown from the simulation that the proposed health monitoring scheme is effective in detecting damage of the smart structures in an efficient way.
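
    Steps (1)-(3) can be caricatured in a few lines. The sketch below assumes scikit-learn and substitutes plain least-squares AR fits on synthetic AR processes for the paper's wavelet-decomposed structural responses; it only shows AR coefficients acting as damage-sensitive features for an SVM:

```python
import numpy as np
from sklearn.svm import SVC

def ar_features(signal, order=4):
    # Least-squares AR(order) fit: signal[t] ~ sum_i a_i * signal[t - i]
    X = np.column_stack([signal[order - i:-i] for i in range(1, order + 1)])
    a, *_ = np.linalg.lstsq(X, signal[order:], rcond=None)
    return a

def simulate_ar(a, n=500, seed=0):
    # Synthetic stand-in for a (wavelet-filtered) structural velocity response
    a = np.asarray(a)
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(a.size, n):
        x[t] = a @ x[t - a.size:t][::-1] + rng.standard_normal()
    return x

# "Healthy" and "damaged" structures have different dynamics (AR coefficients)
healthy = [ar_features(simulate_ar([0.6, -0.3], seed=s)) for s in range(20)]
damaged = [ar_features(simulate_ar([0.2, -0.1], seed=s)) for s in range(20)]

X = np.vstack(healthy + damaged)       # AR parameters as damage-sensitive features
y = np.array([0] * 20 + [1] * 20)      # 0 = healthy, 1 = damaged
clf = SVC(kernel='rbf').fit(X, y)      # step (3): SVM separation
```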

  9. Mean square error approximation for wavelet-based semiregular mesh compression.

    Science.gov (United States)

    Payan, Frédéric; Antonini, Marc

    2006-01-01

    The objective of this paper is to propose an efficient model-based bit allocation process optimizing the performance of a wavelet coder for semiregular meshes. More precisely, this process should compute the best quantizers for the wavelet coefficient subbands that minimize the reconstructed mean square error for one specific target bitrate. In order to design a fast and low-complexity allocation process, we propose an approximation of the reconstructed mean square error relative to the coding of semiregular mesh geometry. This error is expressed directly from the quantization errors of each coefficient subband. For that purpose, we have to take into account the influence of the wavelet filters on the quantized coefficients. Furthermore, we propose a specific approximation for wavelet transforms based on lifting schemes. Experimentally, we show that, in comparison with a "naive" approximation (depending only on the subband levels), using the proposed approximation as the distortion criterion during the model-based allocation process improves the performance of a wavelet-based coder for any model, any bitrate, and any lifting scheme.
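
    The baseline identity behind such allocation models, that for an orthogonal wavelet the reconstruction MSE equals the summed subband quantization errors, can be checked numerically. A sketch assuming PyWavelets, with a 1-D signal standing in for mesh geometry; the lifting-scheme subband weights the paper derives are not modeled here:

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
signal = np.cumsum(rng.standard_normal(1024))   # toy stand-in for geometry data

# Orthogonal DWT ('periodization' keeps exactly len(signal) coefficients)
coeffs = pywt.wavedec(signal, 'db2', level=3, mode='periodization')

step = 0.5                                      # uniform quantizer step size
quant = [step * np.round(c / step) for c in coeffs]

# MSE predicted from the per-subband quantization errors alone
subband_mse = sum(np.sum((c - q) ** 2) for c, q in zip(coeffs, quant)) / signal.size

# MSE actually measured after reconstruction; for an orthogonal transform
# Parseval's relation makes the two agree
recon = pywt.waverec(quant, 'db2', mode='periodization')[:signal.size]
recon_mse = np.mean((recon - signal) ** 2)
```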

  10. Wavelet Based Hilbert Transform with Digital Design and Application to QCM-SS Watermarking

    Directory of Open Access Journals (Sweden)

    S. P. Maity

    2008-04-01

    Full Text Available In recent times, wavelet transforms have been used extensively for efficient storage, transmission and representation of multimedia signals. Hilbert transform pairs of wavelets are the basic unit of many wavelet theories such as complex filter banks, complex wavelets and phaselets. Moreover, the Hilbert transform finds various applications in communications and signal processing, such as the generation of single-sideband (SSB) modulation, quadrature carrier multiplexing (QCM) and the bandpass representation of a signal. Thus wavelet-based discrete Hilbert transform design has drawn much attention from researchers in recent years. This paper proposes (i) an algorithm for the generation of low-computation-cost Hilbert transform pairs of symmetric filter coefficients using biorthogonal wavelets, (ii) an approximation of the coefficients in rational form for efficient hardware realization without much loss in signal representation, and finally (iii) the development of a QCM-SS (spread spectrum) image watermarking scheme for doubling the payload capacity. Simulation results show the novelty of the proposed Hilbert transform design and its application to watermarking compared to existing algorithms.
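
    For readers unfamiliar with the QCM idea, the quadrature channel comes from the Hilbert transform's 90-degree phase shift. A generic FFT-based illustration with SciPy (not the paper's biorthogonal-wavelet design):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000
t = np.arange(0, 1, 1 / fs)
in_phase = np.cos(2 * np.pi * 50 * t)     # carrier for the first message
analytic = hilbert(in_phase)              # analytic signal via FFT
quadrature = np.imag(analytic)            # 90-degree-shifted copy of the carrier

# The two carriers are orthogonal, so two messages can share one band (QCM)
correlation = np.dot(in_phase, quadrature) / fs
```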

  11. A Wavelet-based Fast Discrimination of Transformer Magnetizing Inrush Current

    Science.gov (United States)

    Kitayama, Masashi

    Recently, customers who need electricity of higher quality have been installing co-generation facilities. They can avoid voltage sags and other distribution-system disturbances by supplying electricity to important loads from their own generators. As another example, FRIENDS, a highly reliable distribution system using semiconductor switches and storage devices based on power electronics technology, has been proposed. These examples illustrate that the demand for high reliability in distribution systems is increasing. In order to realize such systems, fast relaying algorithms are indispensable. The author proposes a new method of detecting magnetizing inrush current using the discrete wavelet transform (DWT), which provides a means of detecting discontinuities in the current waveform. Inrush current occurs when the transformer core becomes saturated. The proposed method detects spikes in the DWT components arising from the discontinuity of the current waveform at both the beginning and the end of the inrush current. Wavelet thresholding, a wavelet-based statistical modeling technique, was applied to detect the DWT component spikes. The proposed method is verified using experimental data from a single-phase transformer and is shown to be effective.
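
    The spike-detection idea is easy to reproduce in outline. A sketch with PyWavelets on a synthetic current waveform; the threshold rule below is the standard universal threshold, which may differ from the author's exact wavelet-thresholding model:

```python
import numpy as np
import pywt

fs = 3200                                   # samples per second
t = np.arange(0.0, 0.2, 1.0 / fs)
current = np.sin(2 * np.pi * 50 * t)        # steady-state 50 Hz current
onset = t.size // 2
current[onset:] += 2.0 * np.exp(-40.0 * (t[onset:] - t[onset]))  # inrush-like surge

cA, cD = pywt.dwt(current, 'db4')           # one-level DWT
sigma = np.median(np.abs(cD)) / 0.6745      # robust noise-scale estimate
threshold = sigma * np.sqrt(2 * np.log(cD.size))
spikes = np.flatnonzero(np.abs(cD) > threshold)  # flagged discontinuities (cD index)
```

The abrupt onset of the surge produces a large detail coefficient near half the onset index (the DWT halves the sampling rate), which the threshold isolates from the smooth sinusoid.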

  12. Wavelet-Based Linear-Response Time-Dependent Density-Functional Theory

    CERN Document Server

    Natarajan, Bhaarathi; Casida, Mark E; Deutsch, Thierry; Burchak, Olga N; Philouze, Christian; Balakirev, Maxim Y

    2011-01-01

    Linear-response time-dependent (TD) density-functional theory (DFT) has been implemented in the pseudopotential wavelet-based electronic structure program BigDFT, and results are compared against those obtained with the all-electron Gaussian-type orbital program deMon2k for the calculation of electronic absorption spectra of N2 using the TD local density approximation (LDA). The two programs give comparable excitation energies and absorption spectra once suitably extensive basis sets are used. Convergence of LDA density orbitals and orbital energies to the basis-set limit is significantly faster for BigDFT than for deMon2k. However, the number of virtual orbitals used in TD-DFT calculations is a parameter in BigDFT, while all virtual orbitals are included in TD-DFT calculations in deMon2k. As a reality check, we report the x-ray crystal structure and the measured and calculated absorption spectrum (excitation energies and oscillator strengths) of the small organic molecule N-cyclohexyl-2-(4-methoxyphenyl)imidaz...

  13. A wavelet-based PWTD algorithm-accelerated time domain surface integral equation solver

    KAUST Repository

    Liu, Yang

    2015-10-26

    © 2015 IEEE. The multilevel plane-wave time-domain (PWTD) algorithm allows for fast and accurate analysis of transient scattering from, and radiation by, electrically large and complex structures. When used in tandem with marching-on-in-time (MOT)-based surface integral equation (SIE) solvers, it reduces the computational and memory costs of transient analysis from O(NtNs^2) and O(Ns^2) to O(NtNs log^2 Ns) and O(Ns^1.5), respectively, where Nt and Ns denote the number of temporal and spatial unknowns (Ergin et al., IEEE Trans. Antennas Mag., 41, 39-52, 1999). In the past, PWTD-accelerated MOT-SIE solvers have been applied to transient problems involving half a million spatial unknowns (Shanker et al., IEEE Trans. Antennas Propag., 51, 628-641, 2003). Recently, a scalable parallel PWTD-accelerated MOT-SIE solver that leverages a hierarchical parallelization strategy has been developed and successfully applied to transient problems involving ten million spatial unknowns (Liu et al., in URSI Digest, 2013). We further enhanced the capabilities of this solver by implementing a compression scheme based on local cosine wavelet bases (LCBs) that exploits the sparsity in the temporal dimension (Liu et al., in URSI Digest, 2014). Specifically, the LCB compression scheme was used to reduce the memory requirement of the PWTD ray data and the computational cost of operations in the PWTD translation stage.

  14. A real-time wavelet-based video decoder using SIMD technology

    Science.gov (United States)

    Klepko, Robert; Wang, Demin

    2008-02-01

    This paper presents a fast implementation of a wavelet-based video codec. The codec consists of motion-compensated temporal filtering (MCTF), a 2-D spatial wavelet transform, and SPIHT for wavelet coefficient coding. It offers compression efficiency that is competitive with H.264. The codec is implemented in software running on a general-purpose PC, using the C programming language and Streaming SIMD Extensions (SSE) intrinsics, without assembly language. This high-level software implementation allows the codec to be portable to other general-purpose computing platforms. Testing with a Pentium 4 HT at 3.6 GHz (running under Linux and using the GCC compiler, version 4) shows that the software decoder is able to decode 4CIF video in real time, over 2 times faster than software written only in C. This paper describes the structure of the codec, the fast algorithms chosen for the most computationally intensive elements in the codec, and the use of SIMD to implement these algorithms.

  15. CARDIAD approach to system dominance with application to turbofan engine models. [Complex Acceptability Region for DIAgonal Dominance in multivariable systems analysis and design

    Science.gov (United States)

    Schafer, R. M.; Sain, M. K.

    1980-01-01

    The paper presents the CARDIAD (complex acceptability region for diagonal dominance) method for achieving the diagonal dominance condition in the inverse Nyquist array approach to the analysis and design of multivariable systems in the frequency domain. A design example is given for a sixth order, 4-input, 4-output model of a turbofan engine.

  16. A Novel Method for Brain MRI Super-resolution by Wavelet-based POCS and Adaptive Edge Zoom

    Directory of Open Access Journals (Sweden)

    N. Hema Rajini

    2010-10-01

    Full Text Available This paper aims to reconstruct a high-resolution image from a sequence of low-resolution frames containing non-stationary objects. The challenges of super-resolution, such as unavoidable smoothing effects, the introduction of artifacts, and computational efficiency in both time and memory requirements, are considered, and a novel method is proposed to solve these problems. The proposed method handles the super-resolution process using wavelet-based projection-onto-convex-sets (POCS) with an adaptive edge zoom algorithm. The adaptive edge zoom algorithm addresses the problem of producing an enlarged picture from a given digital image. The wavelet-based POCS method is used to enhance the spatial resolution of MRI brain images from a temporal sequence. This method produces more clarity with a high peak signal-to-noise ratio.

  17. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  18. WaVPeak: Picking NMR peaks through wavelet-based smoothing and volume-based filtering

    KAUST Repository

    Liu, Zhi

    2012-02-10

    Motivation: Nuclear magnetic resonance (NMR) has been widely used as a powerful tool to determine the 3D structures of proteins in vivo. However, the post-spectra processing stage of NMR structure determination usually involves a tremendous amount of time and expert knowledge, spanning peak picking, chemical shift assignment and structure calculation steps. Accurately detecting peaks in NMR spectra is a prerequisite for all following steps, and thus remains a key problem in automatic NMR structure determination. Results: We introduce WaVPeak, a fully automatic peak detection method. WaVPeak first smoothes the given NMR spectrum by wavelets. The peaks are then identified as the local maxima. The false positive peaks are filtered out efficiently by considering the volume of the peaks. WaVPeak has two major advantages over the state-of-the-art peak-picking methods. First, through wavelet-based smoothing, WaVPeak does not eliminate any data point in the spectra. Therefore, WaVPeak is able to detect weak peaks that are embedded in the noise level; NMR spectroscopists need the most help isolating these weak peaks. Second, WaVPeak estimates the volume of the peaks to filter the false positives. This is more reliable than the intensity-based filters that are widely used in existing methods. We evaluate the performance of WaVPeak on the benchmark set proposed by PICKY (Alipanahi et al., 2009), one of the most accurate methods in the literature. The dataset comprises 32 2D and 3D spectra from eight different proteins. Experimental results demonstrate that WaVPeak achieves an average of 96%, 91%, 88%, 76% and 85% recall on 15N-HSQC, HNCO, HNCA, HNCACB and CBCA(CO)NH, respectively. When the same number of peaks is considered, WaVPeak significantly outperforms PICKY. © The Author(s) 2012. Published by Oxford University Press.
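
    A 1-D caricature of the pipeline, assuming PyWavelets (WaVPeak itself operates on 2-D/3-D spectra and estimates peak volumes more carefully): smooth by wavelet shrinkage without discarding data points, take local maxima as candidates, then filter by an integral "volume" rather than raw intensity:

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
x = np.linspace(0.0, 10.0, 1024)
spectrum = (1.0 * np.exp(-(x - 3.0) ** 2 / 0.01)        # strong peak
            + 0.3 * np.exp(-(x - 7.0) ** 2 / 0.01)      # weak peak near the noise
            + 0.05 * rng.standard_normal(x.size))       # noise floor

# 1) Wavelet shrinkage: soft-threshold detail coefficients; no point is dropped
coeffs = pywt.wavedec(spectrum, 'sym6', level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2.0 * np.log(spectrum.size))
den = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
smoothed = pywt.waverec(den, 'sym6')[:spectrum.size]

# 2) Candidate peaks: local maxima of the smoothed spectrum
candidates = [i for i in range(1, spectrum.size - 1)
              if smoothed[i] > smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]]

# 3) Volume filter: keep candidates whose local integral is large enough
def volume(i, half_width=10):
    lo, hi = max(0, i - half_width), min(spectrum.size, i + half_width + 1)
    return float(np.sum(smoothed[lo:hi]))

peaks = [i for i in candidates if volume(i) > 1.0]
```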

  19. A wavelet-based intermittency detection technique from PIV investigations in transitional boundary layers

    Science.gov (United States)

    Simoni, Daniele; Lengani, Davide; Guida, Roberto

    2016-09-01

    The transition process of the boundary layer growing over a flat plate, with a pressure gradient simulating the suction side of a low-pressure turbine blade and an elevated free-stream turbulence intensity level, has been analyzed by means of PIV and hot-wire measurements. A detailed view of the instantaneous flow field in the wall-normal plane highlights the physics of the complex process leading to the formation of large-scale coherent structures during the breakdown of the ordered motion of the flow, which generates randomized oscillations (i.e., turbulent spots). This analysis provides the basis for a new procedure aimed at determining the intermittency function that statistically describes the transition process. To this end, a wavelet-based method has been employed for the identification of the large-scale structures created during the transition process. Subsequently, a probability density function of these events is defined, from which an intermittency function is deduced. The latter closely corresponds to the intermittency function of the transitional flow computed through a classical procedure based on hot-wire data. The agreement between the two procedures in the intermittency shape and spot production rate proves the capability of the method to provide a statistical representation of the transition process. The main advantages of the proposed procedure are that it is applicable to PIV data; that it does not require a threshold level on the first- and/or second-order time derivatives of hot-wire time traces (which makes the method operator-independent); and that it provides clear evidence of the connection between the flow physics and the statistical representation of transition based on the theory of turbulent spot propagation.

  20. A wavelet based neural model to optimize and read out a temporal population code

    Directory of Open Access Journals (Sweden)

    Andre Luvizotto

    2012-05-01

    wavelet based decoders.

  1. Analysis of mosses and soils for quantifying heavy metal concentrations in Sicily: a multivariate and spatial analytical approach.

    Science.gov (United States)

    Gramatica, Paola; Battaini, Francesca; Giani, Elisa; Papa, Ester; Jones, Robert J A; Preatoni, Damiano; Cenci, Roberto M

    2006-01-01

    The use of plant organisms as indicators of environmental contamination is partially replacing traditional monitoring techniques. Amongst the plant organisms available, mosses appear to be good bioindicators and are used for monitoring anthropogenic and natural fall-out on soils. This study has two objectives: the evaluation of the concentrations of heavy metals in soils and mosses of the Sicily Region of Italy, and the identification of the origin of the heavy-metal fall-out. Mosses and the surface soil were sampled at 28 sites; only the youngest segments of Hylocomium splendens and Hypnum cupressiforme, corresponding to the plant tissues produced during the last 3 years, were taken. The elements Cd, Cu, Ni, Pb and Zn were analysed by ICP-MS and Hg by AAS. Statistical analysis was performed by PCA and spatial representation by GIS. In the mosses sampled in Sicily, the highest concentrations of Cd were found around the cities of Palermo and Messina. The highest concentrations of Hg were recorded in the northern part of the island between Trapani and Messina, similar to the distribution of Cu. Different areas with the highest concentrations of Ni were found near the south coast, in the vicinity of Palermo and around Mount Etna. The highest concentrations of Pb were found on the south-west coast near Agrigento, where important chemical plants and petroleum refineries are located. Except for a few locations, Zn fall-out was found to be evenly distributed throughout Sicily. The sites where the concentrations of heavy metals cause greatest concern have been revealed by the PCA analysis and portrayed using GIS. Also of some concern is the diffuse anthropogenic origin of Hg and Cd. The combined approach of using soil and mosses, together with pedological interpretation and the application of multivariate statistical techniques, has provided valuable insight into the environmental aspects of heavy-metal deposition in a region of southern Europe. Further insight into

  2. Detecting relationships between the interannual variability in climate records and ecological time series using a multivariate statistical approach - four case studies for the North Sea region

    Energy Technology Data Exchange (ETDEWEB)

    Heyen, H. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Gewaesserphysik

    1998-12-31

    A multivariate statistical approach is presented that allows a systematic search for relationships between the interannual variability in climate records and ecological time series. Statistical models are built between climatological predictor fields and the variables of interest. Relationships are sought on different temporal scales and for different seasons and time lags. The possibilities and limitations of this approach are discussed in four case studies dealing with salinity in the German Bight, the abundance of zooplankton at Helgoland Roads, macrofauna communities off Norderney and the arrival of migratory birds on Helgoland. (orig.) [German: A statistical multivariate model is presented that permits a systematic search for potential relationships between variability in climate and ecological time series. Four case studies examine the influence of climate on salinity in the German Bight, zooplankton off Helgoland, macrofauna off Norderney, and the arrival of migratory birds on Helgoland. (orig.)]

  3. Avoiding hard chromatographic segmentation: A moving window approach for the automated resolution of gas chromatography-mass spectrometry-based metabolomics signals by multivariate methods.

    Science.gov (United States)

    Domingo-Almenara, Xavier; Perera, Alexandre; Brezmes, Jesus

    2016-11-25

    Gas chromatography-mass spectrometry (GC-MS) produces large and complex datasets characterized by co-eluting compounds, often at trace levels, and by a distinct ion redundancy per compound resulting from the strong fragmentation caused by electron impact ionization. Compounds in GC-MS can be resolved by taking advantage of the multivariate nature of GC-MS data through multivariate resolution methods. However, multivariate methods have to be applied to small regions of the chromatogram, and therefore chromatograms are segmented prior to the application of the algorithms. The automation of this segmentation is a challenging task, as it implies separating informative data from noise in the chromatogram. This study demonstrates the capabilities of independent component analysis-orthogonal signal deconvolution (ICA-OSD) and multivariate curve resolution-alternating least squares (MCR-ALS) with an overlapping moving window implementation that avoids the typical hard chromatographic segmentation. Also, after being resolved, compounds are aligned across samples by an automated alignment algorithm. We evaluated the proposed methods through a quantitative analysis of GC-qTOF MS data from 25 serum samples. The quantitative performance of both the moving window ICA-OSD and the MCR-ALS-based implementations was compared with the quantification of 33 compounds by the XCMS package. Results showed that most of the coefficients of determination exhibited a high correlation (R(2)>0.90) in both the ICA-OSD and the MCR-ALS moving window-based approaches.
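
    The overlapping moving-window idea can be sketched independently of the resolution method itself. The function below (a minimal pure-Python sketch; the window width and overlap values are hypothetical) generates overlapping scan-index windows, within each of which a method such as MCR-ALS or ICA-OSD would then be applied:

```python
def moving_windows(n_scans, width, overlap):
    """Overlapping (start, stop) scan-index windows covering a chromatogram
    of n_scans scans, so that no hard segmentation boundary is imposed."""
    step = width - overlap
    if step <= 0:
        raise ValueError("overlap must be smaller than width")
    windows = []
    start = 0
    while True:
        windows.append((start, min(start + width, n_scans)))
        if start + width >= n_scans:
            break
        start += step  # advance by width minus overlap
    return windows
```

For example, `moving_windows(6000, 200, 50)` yields 200-scan windows overlapping by 50 scans, so a compound eluting near one window's edge is fully contained in a neighbouring window.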

  4. Multivariate Birkhoff interpolation

    CERN Document Server

    Lorentz, Rudolph A

    1992-01-01

    The subject of this book is Lagrange, Hermite and Birkhoff (lacunary Hermite) interpolation by multivariate algebraic polynomials. It unifies and extends a new algorithmic approach to this subject, which was introduced and developed by G.G. Lorentz and the author. One particularly interesting feature of this algorithmic approach is that it obviates the necessity of finding a formula for the Vandermonde determinant of a multivariate interpolation in order to determine its regularity (such formulas are practically unknown anyway), by determining the regularity through simple geometric manipulations in Euclidean space. Although interpolation is a classical problem, it is surprising how little is known about its basic properties in the multivariate case. The book therefore starts by exploring its fundamental properties and its limitations. The main part of the book is devoted to a complete and detailed elaboration of the new technique. A chapter with an extensive selection of finite elements follows as well a...

  5. Wavelet-based detection of transcriptional activity on a novel Staphylococcus aureus tiling microarray

    Directory of Open Access Journals (Sweden)

    Segura Víctor

    2012-09-01

    Full Text Available Abstract Background High-density oligonucleotide microarrays are an appropriate technology for genomic analysis, and are particularly useful in the generation of transcriptional maps, ChIP-on-chip studies and re-sequencing of the genome. Transcriptome analysis of tiling microarray data facilitates the discovery of novel transcripts and the assessment of differential expression in diverse experimental conditions. Although new technologies such as next-generation sequencing have appeared, microarrays may still be useful for the study of small genomes or for the analysis of genomic regions with custom microarrays, due to their lower price and good accuracy in expression quantification. Results Here, we propose a novel wavelet-based method, named ZCL (zero-crossing lines), for the combined denoising and segmentation of tiling signals. The denoising is performed with the classical SUREshrink method and the detection of transcriptionally active regions is based on the computation of the Continuous Wavelet Transform (CWT). In particular, the detection of the transitions is implemented as the thresholding of the zero-crossing lines. The algorithm described has been applied to the public Saccharomyces cerevisiae dataset and compared with two well-known algorithms: the pseudo-median sliding window (PMSW) and the structural change model (SCM). As a proof of principle, we applied the ZCL algorithm to the analysis of the custom tiling microarray hybridization results of a S. aureus mutant deficient in the sigma B transcription factor. The challenge was to identify those transcripts whose expression decreases in the absence of sigma B. Conclusions The proposed method achieves the best performance in terms of positive predictive value (PPV) while its sensitivity is similar to that of the other algorithms used for the comparison. The computation time needed to process the transcriptional signals is low compared with model-based methods and in the same range as those
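
    The transition-detection step of ZCL can be reduced, for illustration, to scanning a row of CWT coefficients for sign changes whose jump exceeds a threshold. This is a deliberate simplification of the published method (which thresholds zero-crossing lines tracked across CWT scales); the `min_jump` parameter is a hypothetical stand-in for that threshold:

```python
def zero_crossings(coeffs, min_jump):
    """Indices where a sequence of wavelet coefficients crosses zero with a
    jump of at least min_jump, marking candidate expression transitions."""
    hits = []
    for i in range(1, len(coeffs)):
        a, b = coeffs[i - 1], coeffs[i]
        crossed = (a < 0 <= b) or (a > 0 >= b)  # sign change between i-1 and i
        if crossed and abs(b - a) >= min_jump:  # keep only strong transitions
            hits.append(i)
    return hits
```

Weak crossings caused by residual noise are discarded by the amplitude threshold, which is the role the thresholding of zero-crossing lines plays in ZCL.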

  6. New Wavelet Bases and Isometric Between Symbolic Operators Spaces OpS1,δm and Kernel Distributions Spaces

    Institute of Scientific and Technical Information of China (English)

    YANG Qi Xiang

    2002-01-01

    In the fifties, Calderón established a formal relation between symbol and kernel distribu-the C-Z operators, and Hormander, Kohn and Nirenberg, et al. studied the symbolic operators. Herewe apply a refinement of the Littlewood-Paley (L-P) decomposition, analyse under new wavelet bases,to characterize both symbolic operators spaces OpS1,δm and kernel distributions spaces with other spacescomposed of some almost diagonal matrices, then get an isometric between OpS1,δm and kernel distri-bution spaces

  7. Heart Rate Variability and Wavelet-based Studies on ECG Signals from Smokers and Non-smokers

    Science.gov (United States)

    Pal, K.; Goel, R.; Champaty, B.; Samantray, S.; Tibarewala, D. N.

    2013-12-01

    The current study deals with heart rate variability (HRV) and wavelet-based ECG signal analysis of smokers and non-smokers. The results of HRV indicated dominance of sympathetic nervous system activity in smokers. The heart rate was found to be higher in the case of smokers as compared to non-smokers ( p 90 % was achieved. The wavelet decomposition of the ECG signal was done using the Daubechies (db 6) wavelet family. No difference was observed between the smokers and non-smokers, which apparently suggests that smoking does not affect the conduction pathway of the heart.
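
    A multi-level wavelet decomposition of this kind can be sketched as a filter-bank cascade. The sketch below uses the Haar filter rather than db6 purely for brevity; in practice a package such as PyWavelets provides `wavedec` with the full Daubechies family.

```python
import math

def haar_step(signal):
    """One Haar analysis step: approximation (low-pass) and detail (high-pass)."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def wavedec(signal, levels):
    """Multi-level DWT: returns [d1, d2, ..., dL, aL] (finest detail first)."""
    coeffs, approx = [], list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        coeffs.append(detail)
    coeffs.append(approx)
    return coeffs
```

Each level halves the sampling rate, so successive detail bands isolate successively lower-frequency components of the ECG waveform.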

  8. Multivariate alteration detection (MAD) in multispectral, bi-temporal image data: A new approach to change detection studies

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut

    for the definition of the MAD transformation is proven. As opposed to traditional univariate change detection schemes our scheme transforms two sets of multivariate observations (e.g. two multispectral satellite images covering the same geographical area acquired at different points in time) into a difference......-processing introduces a new spatial element into our change detection scheme which is highly relevant for image data. Two case studies using multispectral SPOT HRV data from 5 February 1987 and 12 February 1989 covering coffee and pineapple plantations in central Kenya, and Landsat TM data from 6 June 1986 and 27 June...

  9. A hybrid wavelet-based adaptive immersed boundary finite-difference lattice Boltzmann method for two-dimensional fluid-structure interaction

    Science.gov (United States)

    Cui, Xiongwei; Yao, Xiongliang; Wang, Zhikai; Liu, Minghao

    2017-03-01

    A second-generation wavelet-based adaptive finite-difference Lattice Boltzmann method (FD-LBM) is developed in this paper. In this approach, the adaptive wavelet collocation method (AWCM) is, to the best of our knowledge, incorporated into the FD-LBM for the first time. According to a grid refinement criterion based on the wavelet amplitudes of the density distribution functions, an adaptive sparse grid is generated by the omission and addition of collocation points. On the sparse grid, finite differences are used to approximate the derivatives. To eliminate the special treatments needed when using the FD-based derivative approximation near boundaries, the immersed boundary method (IBM) is also introduced into the FD-LBM. By using the adaptive technique, the adaptive code requires far fewer grid points than the uniform-mesh code; as a consequence, the computational efficiency is improved. To validate the proposed method, a series of test cases, including fixed-boundary and moving-boundary cases, are investigated. A good agreement between the present results and the data in the previous literature is obtained, which demonstrates the accuracy and effectiveness of the present AWCM-IB-LBM.
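
    The amplitude-based refinement criterion can be illustrated with a second-generation (interpolating) wavelet detail: a grid point is retained when its value deviates from the linear interpolation of its neighbours by more than a threshold. The one-dimensional function below is an illustration of that idea, not the paper's scheme, and the threshold `eps` is a hypothetical parameter:

```python
def refine_grid(values, eps):
    """Indices of grid points to keep: boundaries, plus interior points whose
    interpolating-wavelet detail amplitude exceeds eps."""
    keep = {0, len(values) - 1}  # always retain the domain boundaries
    for i in range(1, len(values) - 1):
        # detail coefficient: deviation from linear interpolation of neighbours
        d = values[i] - 0.5 * (values[i - 1] + values[i + 1])
        if abs(d) > eps:
            keep.add(i)
    return sorted(keep)
```

Points in smooth regions have small detail amplitudes and are omitted, which is how the adaptive code concentrates collocation points near sharp features such as moving boundaries.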

  10. Multivariate bubbles and antibubbles

    Science.gov (United States)

    Fry, John

    2014-08-01

    In this paper we develop models for multivariate financial bubbles and antibubbles based on statistical physics. In particular, we extend a rich set of univariate models to higher dimensions. Changes in market regime can be explicitly shown to represent a phase transition from random to deterministic behaviour in prices. Moreover, our multivariate models are able to capture some of the contagious effects that occur during such episodes. We are able to show that declining lending quality helped fuel a bubble in the US stock market prior to 2008. Further, our approach offers interesting insights into the spatial development of UK house prices.

  11. DEVELOPMENT OF AN APPROXIMATE MULTIVARIATE TWO-STEP APPROACH FOR THE JOINT GENETIC EVALUATION OF AUSTRIAN AND GERMAN DAIRY CATTLE

    Directory of Open Access Journals (Sweden)

    Christina Pfeiffer

    2015-09-01

    Full Text Available Multivariate genetic evaluation has become important in modern dairy cattle breeding programs in recent decades. The simultaneous estimation of all production and functional traits is still demanding, and different meta-models are used to overcome several constraints. The aim of this study was to apply an approximate multivariate two-step procedure to de-regressed breeding values and yield deviations of five fertility traits of Austrian Pinzgau cattle and to compare the results with routinely estimated breeding values. The approximate two-step procedure applied to de-regressed breeding values performed better than the procedure applied to yield deviations. Spearman rank correlations for all animals, sires and cows were between 0.996 and 0.999 for the procedure applied to de-regressed breeding values and between 0.866 and 0.995 for the procedure applied to yield deviations. The results are encouraging for moving from the selection index currently used in routine genetic evaluation towards an approximate two-step procedure applied to de-regressed breeding values.

  12. THE CONSTRUCTION OF WAVELET-BASED TRUNCATED CONICAL SHELL ELEMENT USING B-SPLINE WAVELET ON THE INTERVAL

    Institute of Scientific and Technical Information of China (English)

    Xiang Jiawei; He Zhengjia; Chen Xuefeng

    2006-01-01

    Based on the B-spline wavelet on the interval (BSWI), two classes of truncated conical shell elements were constructed to solve axisymmetric problems, i.e. a BSWI thin truncated conical shell element and a BSWI moderately thick truncated conical shell element with independent slope-deformation interpolation. In the construction of the wavelet-based elements, instead of traditional polynomial interpolation, the scaling functions of BSWI were employed to form the shape functions through the constructed elemental transformation matrix, and the BSWI elements were then constructed via the variational principle. Unlike the process of directly adding wavelets in the wavelet Galerkin method, the elemental displacement field represented by the coefficients of the wavelet expansion was transformed into edge and internal modes via the constructed transformation matrix. The BSWI elements combine the accuracy of B-spline function approximation with the advantages of wavelet-based elements for structural analysis. Some static and dynamic numerical examples of conical shells were studied to demonstrate that the present elements offer higher efficiency and precision than traditional elements.

  13. Wavelet-based estimation of EEG synchronization associated with native and second language processing in Stroop task

    Institute of Scientific and Technical Information of China (English)

    LIU Xiao-feng

    2007-01-01

    Objective: To examine and compare the synchronization of different brain regions during Chinese and English Stroop tasks. Methods: Ten native Chinese speakers with a moderate command of English participated in this study, and event-related potentials were recorded while participants performed the Stroop task. Wavelet-based estimation of instantaneous EEG coherence was then applied to investigate the synchronization of different brain regions during the task. Results: A greater negativity for the incongruent situation than for the congruent situation appeared from 350 ms to 600 ms post-stimulus onset over frontal, central, and parietal regions in the Chinese Stroop task, while this negativity was absent in the English Stroop task. However, in both the Chinese and the English Stroop tasks, significantly higher EEG coherence was found for the incongruent situation than for the congruent situation over frontal, parietal, and frontoparietal regions before 400 ms post-stimulus onset in the β (13-30 Hz) frequency band. Conclusion: These findings indicate that wavelet-based coherence is a more refined tool than ERP components for analyzing brain electrophysiological signals associated with complex cognitive tasks, and that functional synchronization indexed by EEG coherence is enhanced at an early stage while processing the conflicting information of incongruent stimuli.
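
    Epoch-averaged magnitude-squared coherence between two channels can be sketched at a single frequency bin. Note this Fourier-based estimate is a simplification of the wavelet-based instantaneous coherence used in the study; the epoch layout and bin index are illustrative assumptions:

```python
import cmath

def coherence_at_freq(x_epochs, y_epochs, k):
    """Epoch-averaged magnitude-squared coherence at DFT bin k between two
    channels, each given as a list of equal-length epochs."""
    def dft_bin(sig, k):
        n = len(sig)
        return sum(sig[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                   for t in range(n))
    sxy = 0 + 0j          # averaged cross-spectrum
    sxx = syy = 0.0       # averaged auto-spectra
    for xe, ye in zip(x_epochs, y_epochs):
        X, Y = dft_bin(xe, k), dft_bin(ye, k)
        sxy += X * Y.conjugate()
        sxx += abs(X) ** 2
        syy += abs(Y) ** 2
    return abs(sxy) ** 2 / (sxx * syy)
```

Coherence near 1 indicates a stable phase relationship between the two regions across epochs; averaging the bins spanning 13-30 Hz would give a β-band summary like the one reported.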

  14. TME12/400: Application Oriented Wavelet-based Coding of Volumetric Medical Data

    Science.gov (United States)

    Menegaz, G; Grewe, L; Lozano, A; Thiran, J-Ph

    1999-01-01

    Introduction While medical data are increasingly acquired in a multidimensional space, in clinical practice they are still mainly analyzed as images. We propose a wavelet-based coding technique exploiting the full dimensionality of the data distribution while allowing a single image to be recovered without any need to decode the whole volume. The proposed compression scheme is based on the Layered Zero Coding (LZC) method. Two modes are considered. In the progressive (PROG) mode, the volume is processed as a whole, while in the layer-per-layer (LPL) mode each layer of each sub-band is encoded independently. The three-dimensional extension of the Embedded Zerotree Wavelet (EZW) coder is used as the reference for coding efficiency. All working modalities provide a fully embedded bit-stream allowing progressive-by-quality recovery of the encoded information. Methods The 3D DWT is performed mapping integers to integers, thus allowing lossless compression. Two different coding systems have been considered: EZW and LZC. LZC models the expected statistical dependencies among coefficients by defining conditional terms (contexts) which summarize the significance state of the samples belonging to a generalized neighborhood of the coefficient being encoded. Such terms are then used by a context-adaptive arithmetic coder. The LPL mode has been designed to allow any image of the dataset to be decoded independently, and it is derived from the PROG mode by over-constraining the system. The sub-bands are quantized and encoded according to a sequence of uniform quantizers with decreasing step size. This ensures progressiveness capabilities when decoding both the whole volume and a single image. Results Performance has been evaluated on two datasets: DSR and ANGIO, an ophthalmologic angiographic sequence. For each mode the best context has been retained. Results show that the proposed system is competitive with EZW, and the PROG mode is the more performant of the two. The main factors

  15. A cost-based empirical model of the aggregate price determination for the Turkish economy: A multivariate cointegration approach

    Directory of Open Access Journals (Sweden)

    Zeren Fatma

    2010-01-01

    Full Text Available This paper examines the long-run relationships between aggregate consumer prices and some cost-based components for the Turkish economy. Based on a simple economic model of macro-scale price formation, multivariate cointegration techniques have been applied to test whether the real data support the a priori model construction. The results reveal that all of the factors related to price determination have a positive impact on consumer prices, as expected. We find that the most significant component contributing to price setting is nominal exchange rate depreciation. We also cannot reject linear homogeneity of the sum of all the price data with respect to domestic inflation. The paper concludes that Turkish consumer prices in fact have a strong cost-push component that contributes to aggregate pricing.

  16. Accurate predictions of iron redox state in silicate glasses: A multivariate approach using X-ray absorption spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Dyar, M. Darby; McCanta, Molly; Breves, Elly; Carey, C. J.; Lanzirotti, Antonio

    2016-03-01

    Pre-edge features in the K absorption edge of X-ray absorption spectra are commonly used to predict Fe3+ valence state in silicate glasses. However, this study shows that using the entire spectral region from the pre-edge into the extended X-ray absorption fine-structure region provides more accurate results when combined with multivariate analysis techniques. The least absolute shrinkage and selection operator (lasso) regression technique yields %Fe3+ values that are accurate to ±3.6% absolute when the full spectral region is employed. This method can be used across a broad range of glass compositions, is easily automated, and is demonstrated to yield accurate results from different synchrotrons. It will enable future studies involving X-ray mapping of redox gradients on standard thin sections at 1 × 1 μm pixel sizes.

  18. Longitudinal Relationships Between Productive Activities and Functional Health in Later Years: A Multivariate Latent Growth Curve Modeling Approach.

    Science.gov (United States)

    Choi, Eunhee; Tang, Fengyan; Kim, Sung-Geun; Turk, Phillip

    2016-10-01

    This study examined the longitudinal relationships between functional health in later years and three types of productive activities: volunteering, full-time work, and part-time work. Using data from five waves (2000-2008) of the Health and Retirement Study, we applied multivariate latent growth curve modeling to examine the longitudinal relationships among individuals aged 50 or over. Functional health was measured by limitations in activities of daily living. Individuals who volunteered or worked either full time or part time exhibited a slower decline in functional health than nonparticipants. Significant associations were also found between initial functional health and longitudinal changes in productive activity participation. This study provides additional support for the benefits of productive activities later in life; engagement in volunteering and employment is indeed associated with better functional health in middle and old age.

  19. A Cyber War Modeling Approach Based on Multivariate Networks

    Institute of Scientific and Technical Information of China (English)

    邓志宏; 老松杨; 白亮

    2013-01-01

    Cyberspace is a newly emerging concept. We analyze the concept and features of Cyberspace and propose a hierarchical conceptual framework for it. On this basis, we propose a Cyber war modeling approach based on multivariate networks, in which Cyber war modeling comprises two parts: (i) modeling the static structure of the warfare system and (ii) modeling the dynamic interaction of engagement. Accordingly, a multivariate network model of the warfare system and a multivariate network evolution model of the engagement process of Cyber war are established. Finally, the established models are applied to the analysis of a typical Cyber war.

  20. Dynamic ultrasound imaging—A multivariate approach for the analysis and comparison of time-dependent musculoskeletal movements

    Directory of Open Access Journals (Sweden)

    Löfstedt Tommy

    2012-09-01

    Full Text Available Abstract Background Muscle functions are generally assumed to affect a wide variety of conditions and activities, including pain, ischemic and neurological disorders, exercise and injury. It is therefore very desirable to obtain more information on musculoskeletal contributions to and activity during clinical processes such as the treatment of muscle injuries, post-surgery evaluations, and the monitoring of progressive degeneration in neuromuscular disorders. The spatial image resolution achievable with ultrasound systems has improved tremendously in the last few years and it is nowadays possible to study skeletal muscles in real-time during activity. However, ultrasound imaging has an inherent problem that makes it difficult to compare different measurement series or image sequences from two or more subjects. Due to physiological differences between different subjects, the ultrasound sequences will be visually different – partly because of variation in probe placement and partly because of the difficulty of perfectly reproducing any given movement. Methods Ultrasound images of the biceps and calf of a single subject were transformed to achieve congruence and then efficiently compressed and stacked to facilitate analysis using a multivariate method known as O2PLS. O2PLS identifies related and unrelated variation in and between two sets of data such that different phases of the studied movements can be analysed. The methodology was used to study the dynamics of the Achilles tendon and the calf and also the Biceps brachii and upper arm. The movements of these parts of the body are both of interest in clinical orthopaedic research. Results This study extends the novel method of multivariate analysis of congruent images (MACI to facilitate comparisons between two series of ultrasound images. This increases its potential range of medical applications and its utility for detecting, visualising and quantifying the dynamics and functions of skeletal

  1. Comparative study of different approaches for multivariate image analysis in HPTLC fingerprinting of natural products such as plant resin.

    Science.gov (United States)

    Ristivojević, Petar; Trifković, Jelena; Vovk, Irena; Milojković-Opsenica, Dušanka

    2017-01-01

    With the introduction of phytochemical fingerprint analysis as a method of screening complex natural products for the presence of the most bioactive compounds, the use of chemometric classification methods, the application of powerful scanning, image-capturing and processing devices and algorithms, and advances in the development of novel stationary phases as well as various separation modalities, high-performance thin-layer chromatography (HPTLC) fingerprinting is becoming an attractive and fruitful field of separation science. Multivariate image analysis is crucial for proper data acquisition. In the current study, different image processing procedures were studied and compared in detail on the example of HPTLC chromatograms of plant resins. In that sense, variables such as the gray intensities of pixels along the solvent front, peak areas, and mean peak values were used as input data and compared to obtain the best classification models. Important steps in image analysis, namely baseline removal, denoising, target peak alignment and normalization, are pointed out. The numerical data set based on the mean values of selected bands and the intensities of pixels along the solvent front proved to be the most convenient for planar-chromatographic profiling, although it requires at least basic knowledge of image processing methodology, and could be proposed for further investigation in HPTLC fingerprinting.

  2. A New Predictive Model of Centerline Segregation in Continuous Cast Steel Slabs by Using Multivariate Adaptive Regression Splines Approach

    Directory of Open Access Journals (Sweden)

    Paulino José García Nieto

    2015-06-01

    Full Text Available The aim of this study was to obtain a predictive model able to perform early detection of central segregation severity in continuous cast steel slabs. Segregation in steel cast products is an internal defect that can be very harmful when slabs are rolled in heavy plate mills. In this research work, the central segregation was studied with success using a data mining methodology based on the multivariate adaptive regression splines (MARS) technique. For this purpose, the most important physical-chemical parameters are considered. The results of the present study are two-fold. In the first place, the significance of each physical-chemical variable for segregation is presented through the model. Second, a model for forecasting segregation is obtained. Regression with optimal hyperparameters was performed, and coefficients of determination of 0.93 for continuity factor estimation and 0.95 for average width were obtained when the MARS technique was applied to the experimental dataset. The agreement between the experimental data and the model confirmed the good performance of the latter.
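
    A fitted MARS model is an intercept plus a sum of products of hinge functions max(0, ±(x − t)). The sketch below evaluates such a model on a single predictor; the coefficients and knots are hypothetical, and the fitting itself (forward knot selection plus backward pruning) would be done with a dedicated package:

```python
def hinge(x, knot, sign):
    """A MARS hinge basis function: max(0, sign * (x - knot))."""
    return max(0.0, sign * (x - knot))

def mars_predict(x, intercept, terms):
    """Evaluate a fitted MARS model on one predictor value.
    terms: list of (coef, [(knot, sign), ...]) where each term is the product
    of its hinge factors. Illustrative evaluation only, not a fitting routine."""
    y = intercept
    for coef, hinges in terms:
        prod = 1.0
        for knot, sign in hinges:
            prod *= hinge(x, knot, sign)
        y += coef * prod
    return y
```

Because each hinge is zero on one side of its knot, the model is piecewise linear, which is what lets MARS localize the influence of a physical-chemical variable to the range where it actually matters.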

  3. Unifying Amplitude and Phase Analysis: A Compositional Data Approach to Functional Multivariate Mixed-Effects Modeling of Mandarin Chinese.

    Science.gov (United States)

    Hadjipantelis, P Z; Aston, J A D; Müller, H G; Evans, J P

    2015-04-03

    Mandarin Chinese is a tonal language; the pitch (or F0) of its utterances carries considerable linguistic information. However, speech samples from different individuals are subject to changes in amplitude and phase, which must be accounted for in any analysis that attempts to provide a linguistically meaningful description of the language. A joint model for amplitude, phase, and duration is presented, which combines elements from functional data analysis, compositional data analysis, and linear mixed-effects models. By decomposing functions via functional principal component analysis, and connecting registration functions to compositional data analysis, a joint multivariate mixed-effects model can be formulated, which gives insights into the relationship between the different modes of variation as well as their dependence on linguistic and nonlinguistic covariates. The model is applied to the COSPRO-1 dataset, a comprehensive database of spoken Taiwanese Mandarin containing approximately 50,000 phonetically diverse sample F0 contours (syllables), and reveals that phonetic information is jointly carried by both amplitude and phase variation. Supplementary materials for this article are available online.

  4. A New Predictive Model of Centerline Segregation in Continuous Cast Steel Slabs by Using Multivariate Adaptive Regression Splines Approach

    Science.gov (United States)

    García Nieto, Paulino José; González Suárez, Victor Manuel; Álvarez Antón, Juan Carlos; Mayo Bayón, Ricardo; Sirgo Blanco, José Ángel; Díaz Fernández, Ana María

    2015-01-01

    The aim of this study was to obtain a predictive model able to perform early detection of central segregation severity in continuous cast steel slabs. Segregation in steel cast products is an internal defect that can be very harmful when slabs are rolled in heavy plate mills. In this research work, the central segregation was studied with success using a data mining methodology based on the multivariate adaptive regression splines (MARS) technique. For this purpose, the most important physical-chemical parameters are considered. The results of the present study are two-fold. In the first place, the significance of each physical-chemical variable for segregation is presented through the model. Second, a model for forecasting segregation is obtained. Regression with optimal hyperparameters was performed, and coefficients of determination of 0.93 for continuity factor estimation and 0.95 for average width were obtained when the MARS technique was applied to the experimental dataset. The agreement between the experimental data and the model confirmed the good performance of the latter.

  5. Multivariate approach to the chemical mapping of uranium in sandstone-hosted uranium ores analyzed using double pulse Laser-Induced Breakdown Spectroscopy

    Science.gov (United States)

    Klus, Jakub; Mikysek, Petr; Prochazka, David; Pořízka, Pavel; Prochazková, Petra; Novotný, Jan; Trojek, Tomáš; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef

    2016-09-01

    The goal of this work is to provide high-resolution mapping of uranium in sandstone-hosted uranium ores using the Laser-Induced Breakdown Spectroscopy (LIBS) technique. In order to obtain chemical images with the highest possible spatial resolution, a LIBS system in the orthogonal double pulse (DP LIBS) arrangement was employed. Owing to this experimental arrangement, a spot size of 50 μm in diameter, resulting in a lateral resolution of 100 μm, was achieved. Despite the increase in signal intensity with the DP LIBS modification, the detection of uranium is challenging. The main cause is the high density of uranium spectral lines, which, together with the broadening of LIBS spectral lines, exceeds the resolution of commonly used spectrometers. This results in increased overall background radiation with only a few distinguishable uranium lines. Three different approaches to the treatment of LIBS data for uranium detection were utilized: i) spectral line intensity, ii) region of apparent background, and iii) multivariate data analysis. By utilizing multivariate statistical methods, specific specimen features (in our case, uranium content) were revealed by processing the complete spectral information obtained from a broadband echelle spectrograph. Our results are in good agreement with conventional approaches such as line fitting and show new possibilities for processing spectral data in mapping. X-ray Fluorescence (XRF) was employed as a reference technique to LIBS. The XRF chemical images used in this paper have lower resolution (approximately 1-2 mm per image point); nevertheless, the elemental distribution is apparent and corresponds to the presented LIBS experiments.

  6. Unintentional Interpersonal Synchronization Represented as a Reciprocal Visuo-Postural Feedback System: A Multivariate Autoregressive Modeling Approach.

    Directory of Open Access Journals (Sweden)

    Shuntaro Okazaki

    Full Text Available People's behaviors synchronize. It is difficult, however, to determine whether synchronized behaviors occur in a mutual direction--two individuals influencing one another--or in one direction--one individual leading the other, and what the underlying mechanism for synchronization is. To answer these questions, we hypothesized a non-leader-follower postural sway synchronization, caused by a reciprocal visuo-postural feedback system operating on pairs of individuals, and tested that hypothesis both experimentally and via simulation. In the behavioral experiment, 22 participant pairs stood face to face either 20 or 70 cm away from each other wearing glasses with or without vision blocking lenses. The existence and direction of visual information exchanged between pairs of participants were systematically manipulated. The time series data for the postural sway of these pairs were recorded and analyzed with cross correlation and causality. Results of cross correlation showed that postural sway of paired participants was synchronized, with a shorter time lag when participant pairs could see one another's head motion than when one of the participants was blindfolded. In addition, there was less of a time lag in the observed synchronization when the distance between participant pairs was smaller. As for the causality analysis, noise contribution ratio (NCR), the measure of influence using a multivariate autoregressive model, was also computed to identify the degree to which one's postural sway is explained by that of the other's and how visual information (sighted vs. blindfolded) interacts with paired participants' postural sway. It was found that for synchronization to take place, it is crucial that paired participants be sighted and exert equal influence on one another by simultaneously exchanging visual information. Furthermore, a simulation for the proposed system with a wider range of visual input showed a pattern of results similar to the behavioral results.

  7. Unintentional Interpersonal Synchronization Represented as a Reciprocal Visuo-Postural Feedback System: A Multivariate Autoregressive Modeling Approach.

    Science.gov (United States)

    Okazaki, Shuntaro; Hirotani, Masako; Koike, Takahiko; Bosch-Bayard, Jorge; Takahashi, Haruka K; Hashiguchi, Maho; Sadato, Norihiro

    2015-01-01

    People's behaviors synchronize. It is difficult, however, to determine whether synchronized behaviors occur in a mutual direction--two individuals influencing one another--or in one direction--one individual leading the other, and what the underlying mechanism for synchronization is. To answer these questions, we hypothesized a non-leader-follower postural sway synchronization, caused by a reciprocal visuo-postural feedback system operating on pairs of individuals, and tested that hypothesis both experimentally and via simulation. In the behavioral experiment, 22 participant pairs stood face to face either 20 or 70 cm away from each other wearing glasses with or without vision blocking lenses. The existence and direction of visual information exchanged between pairs of participants were systematically manipulated. The time series data for the postural sway of these pairs were recorded and analyzed with cross correlation and causality. Results of cross correlation showed that postural sway of paired participants was synchronized, with a shorter time lag when participant pairs could see one another's head motion than when one of the participants was blindfolded. In addition, there was less of a time lag in the observed synchronization when the distance between participant pairs was smaller. As for the causality analysis, noise contribution ratio (NCR), the measure of influence using a multivariate autoregressive model, was also computed to identify the degree to which one's postural sway is explained by that of the other's and how visual information (sighted vs. blindfolded) interacts with paired participants' postural sway. It was found that for synchronization to take place, it is crucial that paired participants be sighted and exert equal influence on one another by simultaneously exchanging visual information. Furthermore, a simulation for the proposed system with a wider range of visual input showed a pattern of results similar to the behavioral results.
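
The lag analysis reported above can be sketched with a minimal cross-correlation estimator; the synthetic series and delay below are illustrative, not the authors' sway recordings.

```python
import numpy as np

rng = np.random.default_rng(1)
n, true_lag = 500, 7                # true_lag plays the role of a visuo-motor delay

base = rng.normal(size=n + true_lag)
leader = base[true_lag:]            # leader[t] = base[t + true_lag]
follower = base[:n]                 # follower[t] = leader[t - true_lag]

# Full cross-correlation; the argmax over lags recovers the delay
c = np.correlate(follower, leader, mode="full")
lags = np.arange(-(n - 1), n)
est_lag = lags[np.argmax(c)]
print(est_lag)   # 7: the follower trails the leader by 7 samples
```

A symmetric, near-zero peak lag would instead indicate the mutual (non-leader-follower) synchronization the study argues for.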

  8. Water quality assessment in the Bétaré-Oya gold mining area (East-Cameroon): Multivariate Statistical Analysis approach.

    Science.gov (United States)

    Rakotondrabe, Felaniaina; Ndam Ngoupayou, Jules Remy; Mfonka, Zakari; Rasolomanana, Eddy Harilala; Nyangono Abolo, Alexis Jacob; Ako Ako, Andrew

    2018-01-01

    The influence of gold mining activities on the water quality in the Mari catchment in Bétaré-Oya (East Cameroon) was assessed in this study. Sampling was performed within the period of one hydrological year (2015 to 2016), with 22 sampling sites consisting of groundwater (06) and surface water (16). In addition to measuring the physicochemical parameters, such as pH, electrical conductivity, alkalinity, turbidity, suspended solids and CN(-), eleven major elements (Na(+), K(+), Ca(2+), Mg(2+), NH4(+), Cl(-), NO3(-), HCO3(-), SO4(2-), PO4(3-) and F(-)) and eight heavy metals (Pb, Zn, Cd, Fe, Cu, As, Mn and Cr) were also analyzed using conventional hydrochemical methods, Multivariate Statistical Analysis and the Heavy metal Pollution Index (HPI). The results showed that the water from the Mari catchment and Lom River was acidic to basic (pH values from 5.40) and of generally good quality, except for nitrates in some wells, which were found at concentrations >50 mg NO3(-)/L. This water was found as two main types: calcium magnesium bicarbonate (CaMg-HCO3), which was the most represented, and sodium potassium bicarbonate (NaK-HCO3). As for trace elements in surface water, the contents of Pb, Cd, Mn, Cr and Fe were higher than recommended by the WHO guidelines, and therefore, the surface water was unsuitable for human consumption. Three phenomena were responsible for controlling the quality of the water in the study area: hydrolysis of silicate minerals of plutono-metamorphic rocks, which constitute the geological basement of this area; vegetation and soil leaching; and mining activities. The high concentrations of TSS and trace elements found in this basin were mainly due to gold mining activities (exploration and exploitation) as well as digging of river beds, excavation and gold amalgamation. Copyright © 2017 Elsevier B.V. All rights reserved.
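
The Heavy metal Pollution Index mentioned above has several variants; one commonly used formulation (unit weight Wi = 1/Si, sub-index Qi = 100·Ci/Si, with HPI > 100 read as unsuitable) can be sketched as follows. The standards and sample concentrations are illustrative WHO-style values in µg/L, not the study's measurements.

```python
# Hypothetical limits (Si) and one measured sample (Ci), both in µg/L.
standards = {"Pb": 10.0, "Cd": 3.0, "Cr": 50.0, "Mn": 400.0, "Fe": 300.0}
sample    = {"Pb": 25.0, "Cd": 5.0, "Cr": 60.0, "Mn": 100.0, "Fe": 900.0}

def hpi(conc, std):
    """HPI = sum(Wi*Qi) / sum(Wi), with Wi = 1/Si and Qi = 100*Ci/Si."""
    w = {m: 1.0 / s for m, s in std.items()}       # unit weights
    q = {m: 100.0 * conc[m] / std[m] for m in std} # sub-indices
    return sum(w[m] * q[m] for m in std) / sum(w.values())

value = hpi(sample, standards)
print(value > 100)   # True: this synthetic sample exceeds the critical value
```

The inverse weighting makes exceedances of the most stringently limited metals (here Cd and Pb) dominate the index.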

  9. Implementation of Wavelet-Based Neural Network for the detection of Very Low Frequency (VLF) Whistlers Transients

    Science.gov (United States)

    Sondhiya, Deepak Kumar; Gwal, Ashok Kumar; Verma, Shivali; Kasde, Satish Kumar

    In this paper, a wavelet-based neural network system for the detection and identification of four types of VLF whistler transients (i.e. dispersive, diffuse, spiky and multipath) is implemented and tested. The discrete wavelet transform (DWT) technique is integrated with the feed-forward neural network (FFNN) model to construct the identifier. First, the multi-resolution analysis (MRA) technique of the DWT and Parseval's theorem are employed to extract the characteristic features of the transients at different resolution levels. Second, the FFNN classifies the transients according to the extracted features. The proposed methodology greatly reduces the number of transient features without losing their original properties, so less memory space and computing time are required. Various transient events were tested; the results show that the identifier can detect whistler transients efficiently. Keywords: Discrete wavelet transform, Multi-resolution analysis, Parseval's theorem, Feed-forward neural network
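
The feature-extraction step (multi-resolution Haar decomposition plus Parseval's theorem) can be sketched as per-level detail energies; the test signal and number of levels below are illustrative.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform."""
    x = np.asarray(x, dtype=float)
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return s, d

def energy_features(x, levels):
    """Detail energy per level plus final approximation energy (Parseval)."""
    feats, approx = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(np.sum(detail ** 2))
    feats.append(np.sum(approx ** 2))
    return np.array(feats)

t = np.linspace(0, 1, 1024, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
feats = energy_features(signal, levels=4)

# Orthogonal transform: the level energies sum to the signal energy
print(np.isclose(feats.sum(), np.sum(signal ** 2)))   # True
```

The short vector `feats` (5 numbers instead of 1024 samples) is the kind of compact input an FFNN classifier would receive.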

  10. Improving quality of medical image compression using biorthogonal CDF wavelet based on lifting scheme and SPIHT coding

    Directory of Open Access Journals (Sweden)

    Beladgham Mohammed

    2011-01-01

    Full Text Available As the coming era is that of digitized medical information, an important challenge is the storage and transmission of enormous amounts of data, including medical images. Compression is one of the indispensable techniques for solving this problem. In this work, we propose an algorithm for medical image compression based on the biorthogonal wavelet transform CDF 9/7 coupled with the SPIHT coding algorithm, applying the lifting structure to overcome the drawbacks of the classical wavelet transform. To assess the compression achieved by our algorithm, we compared the results obtained with those of wavelet-based filter banks. Experimental results show that the proposed algorithm is superior to traditional methods in both lossy and lossless compression for all tested images. Our algorithm provides very good PSNR and MSSIM values for MRI images.
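
The lifting structure mentioned above can be sketched for the CDF 9/7 wavelet with its standard (JPEG2000) lifting coefficients; periodic boundary handling is a simplification chosen here for clarity, and the final scaling convention is one of several in use.

```python
import numpy as np

A, B = -1.586134342, -0.05298011854      # predict/update 1 coefficients
G, D = 0.8829110762, 0.4435068522        # predict/update 2 coefficients
K = 1.149604398                          # scaling factor

def cdf97_forward(x):
    s, d = x[0::2].copy(), x[1::2].copy()    # lazy wavelet split
    d += A * (s + np.roll(s, -1))            # predict 1
    s += B * (d + np.roll(d, 1))             # update 1
    d += G * (s + np.roll(s, -1))            # predict 2
    s += D * (d + np.roll(d, 1))             # update 2
    return s * K, d / K

def cdf97_inverse(s, d):
    s, d = s / K, d * K                      # undo the steps in reverse order
    s -= D * (d + np.roll(d, 1))
    d -= G * (s + np.roll(s, -1))
    s -= B * (d + np.roll(d, 1))
    d -= A * (s + np.roll(s, -1))
    x = np.empty(s.size + d.size)
    x[0::2], x[1::2] = s, d
    return x

x = np.random.default_rng(2).normal(size=256)
s, d = cdf97_forward(x)
print(np.allclose(cdf97_inverse(s, d), x))   # True: lifting inverts exactly
```

Exact invertibility of each lifting step, regardless of coefficient precision, is the property that makes the lifting scheme attractive for lossless medical-image coding.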

  11. A Sequential, Implicit, Wavelet-Based Solver for Multi-Scale Time-Dependent Partial Differential Equations

    Directory of Open Access Journals (Sweden)

    Donald A. McLaren

    2013-04-01

    Full Text Available This paper describes and tests a wavelet-based implicit numerical method for solving partial differential equations. Intended for problems with localized small-scale interactions, the method exploits the form of the wavelet decomposition to divide the implicit system created by the time-discretization into multiple smaller systems that can be solved sequentially. Included is a test on a basic non-linear problem, with both the results of the test and the time required to calculate them compared against control results based on a single system with fine resolution. The method is then tested on a non-trivial problem, with its computational time and accuracy checked against control results. In both tests, the method was found to require less computational expense than the control. Furthermore, the method showed convergence towards the fine-resolution control results.

  12. Haar Wavelet Based Implementation Method of the Non–integer Order Differentiation and its Application to Signal Enhancement

    Directory of Open Access Journals (Sweden)

    Li Yuanlu

    2015-06-01

    Full Text Available Non-integer order differentiation is extending the applications of traditional differentiation because it provides a continuous interpolation between integer-order derivatives. However, implementation of non-integer order differentiation is much more complex than that of integer order differentiation. For this purpose, a Haar wavelet-based implementation method of non-integer order differentiation is proposed. The basic idea of the proposed method is to compute the non-integer order differentiation of a signal by expanding the signal in the Haar wavelet basis and constructing the Haar wavelet operational matrix of non-integer order differentiation. The effectiveness of the proposed method was verified by comparing theoretical results with those obtained by another non-integer order differential filtering method. Finally, non-integer order differentiation was applied to signal enhancement.
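
As a point of comparison for the operational-matrix idea, the classical Grünwald-Letnikov (GL) scheme, a standard non-integer order differentiation baseline of the kind the paper compares against, can be sketched as follows; this is not the paper's Haar method, and the test function is illustrative.

```python
import numpy as np
from math import gamma

def gl_fracdiff(f, alpha, h):
    """Grünwald-Letnikov fractional derivative of sampled f (terminal at f[0])."""
    n = len(f)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):                        # recursive binomial weights
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    out = np.array([np.dot(w[:i + 1], f[i::-1]) for i in range(n)])
    return out / h ** alpha

t = np.linspace(0.0, 1.0, 1001)
h = t[1] - t[0]
approx = gl_fracdiff(t, 0.5, h)                  # D^0.5 applied to f(t) = t
exact = t ** 0.5 / gamma(1.5)                    # known closed form

print(abs(approx[-1] - exact[-1]) < 1e-2)        # True
```

The half-derivative of f(t) = t is t^0.5 / Γ(1.5), which the first-order GL scheme reproduces to within O(h) at this step size.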

  13. Landscape genomics of Sphaeralcea ambigua in the Mojave Desert: a multivariate, spatially-explicit approach to guide ecological restoration

    Science.gov (United States)

    Shryock, Daniel F.; Havrilla, Caroline A.; DeFalco, Lesley; Esque, Todd C.; Custer, Nathan; Wood, Troy E.

    2015-01-01

    Local adaptation influences plant species' responses to climate change and their performance in ecological restoration. Fine-scale physiological or phenological adaptations that direct demographic processes may drive intraspecific variability when baseline environmental conditions change. Landscape genomics characterizes adaptive differentiation by identifying environmental drivers of adaptive genetic variability and mapping the associated landscape patterns. We applied such an approach to Sphaeralcea ambigua, an important restoration plant in the arid southwestern United States, by analyzing variation at 153 amplified fragment length polymorphism loci in the context of environmental gradients separating 47 Mojave Desert populations. We identified 37 potentially adaptive loci through a combination of genome scan approaches. We then used a generalized dissimilarity model (GDM) to relate variability in potentially adaptive loci to spatial gradients in temperature, precipitation, and topography. We identified non-linear thresholds in locus frequencies driven by summer maximum temperature and water stress, along with continuous variation corresponding to temperature seasonality. Two GDM-based approaches for mapping predicted patterns of local adaptation are compared. Additionally, we assess uncertainty in spatial interpolations through a novel spatial bootstrapping approach. Our study presents robust, accessible methods for deriving spatially-explicit models of adaptive genetic variability in non-model species that will inform climate change modelling and ecological restoration.

  14. Prediction Method of El Nino Southern Oscillation: ENSO by Means of Wavelet Based Data Compression with Appropriate Support Length of Base Function

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-08-01

    Full Text Available A method for predicting the El Nino/Southern Oscillation (ENSO) by means of wavelet-based data compression with an appropriate support length of the base function is proposed. The proposed method is validated through experiments with the observed Southern Oscillation Index. A method for determining the appropriate support length is also proposed and validated.

  15. Multivariate compressive sensing for image reconstruction in the wavelet domain: using scale mixture models.

    Science.gov (United States)

    Wu, Jiao; Liu, Fang; Jiao, L C; Wang, Xiaodong; Hou, Biao

    2011-12-01

    Most wavelet-based reconstruction methods of compressive sensing (CS) are developed under the independence assumption of the wavelet coefficients. However, the wavelet coefficients of images have significant statistical dependencies. Many multivariate prior models for the wavelet coefficients of images have been proposed and successfully applied to image estimation problems. In this paper, the statistical structures of the wavelet coefficients are considered for CS reconstruction of images that are sparse or compressible in the wavelet domain. A multivariate pursuit algorithm (MPA) based on the multivariate models is developed. Several multivariate scale mixture models are used as the prior distributions of MPA. Our method reconstructs the images by modeling the statistical dependencies of the wavelet coefficients in a neighborhood. The proposed algorithm based on these scale mixture models provides superior performance compared with many state-of-the-art compressive sensing reconstruction algorithms.

  16. Application of a multivariate approach for analyte focusing by micelle collapse-micellar electrokinetic chromatography for analyzing sunscreen agents in cosmetics.

    Science.gov (United States)

    Lin, Yi-Hui; Lu, Chi-Yu; Jiang, Shiuh-Jen; Hsiao, Wen-Yao; Cheng, Hui-Ling; Chen, Yen-Ling

    2015-10-01

    The operating parameters that affect the performance of the online preconcentration technique "analyte focusing by micelle collapse-MEKC (AFMC-MEKC)" were examined using a multivariate approach involving experimental design to determine the sunscreen agents in cosmetics. Compared to the single-variable approach, the advantage of the multivariate approach is that many factors can be investigated simultaneously to obtain the best separation condition. A fractional factorial design was used to identify the fewest significant factors for the central composite design (CCD). The CCD was adopted for locating the minimum or maximum response in this study. The influences of the experimental variables on the response were investigated by applying a chromatographic exponential function. The optimized condition and the relationship between the experimental variables were acquired using the JMP software. The ANOVA indicated that the Tris pH value, SDS concentration, and ethanol percentage influenced the separation quality and contributed significantly to the model. The optimized running buffer was 10 mM Tris buffer (pH 9.5) containing 60 mM SDS, 7 mM γ-CD, and 20% v/v ethanol. The sample was prepared in 100 mM Tris buffer (pH 9.0) containing 7.5 mM SDS and 20% v/v ethanol. The SDS concentration in the sample matrix was slightly greater than the CMC value, so that the micelles easily collapse and the analytes accumulate in the capillary. In addition, sunscreen agents in cosmetics after 1000-fold dilution were successfully determined by AFMC-MEKC. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
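
The central composite design used above can be sketched in coded units; the three factors stand in for settings such as Tris pH, SDS concentration and ethanol percentage, and the rotatable axial distance is one common choice.

```python
import itertools
import numpy as np

def central_composite(k, alpha=None, n_center=4):
    """CCD in coded units: 2^k factorial + 2k axial + center runs."""
    alpha = alpha if alpha is not None else (2 ** k) ** 0.25   # rotatable CCD
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i], axial[2 * i + 1, i] = -alpha, alpha
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

design = central_composite(3)
print(design.shape)   # (18, 3): 8 factorial + 6 axial + 4 center runs
```

Each row is one experiment; a quadratic response-surface model fitted to the measured responses then locates the optimum, which is the role the CCD plays in the study.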

  17. A wavelet-based estimator of the degrees of freedom in denoised fMRI time series for probabilistic testing of functional connectivity and brain graphs.

    Science.gov (United States)

    Patel, Ameera X; Bullmore, Edward T

    2016-11-15

    Connectome mapping using techniques such as functional magnetic resonance imaging (fMRI) has become a focus of systems neuroscience. There remain many statistical challenges in analysis of functional connectivity and network architecture from BOLD fMRI multivariate time series. One key statistic for any time series is its (effective) degrees of freedom, df, which will generally be less than the number of time points (or nominal degrees of freedom, N). If we know the df, then probabilistic inference on other fMRI statistics, such as the correlation between two voxel or regional time series, is feasible. However, we currently lack good estimators of df in fMRI time series, especially after the degrees of freedom of the "raw" data have been modified substantially by denoising algorithms for head movement. Here, we used a wavelet-based method both to denoise fMRI data and to estimate the (effective) df of the denoised process. We show that seed voxel correlations corrected for locally variable df could be tested for false positive connectivity with better control over Type I error and greater specificity of anatomical mapping than probabilistic connectivity maps using the nominal degrees of freedom. We also show that wavelet despiked statistics can be used to estimate all pairwise correlations between a set of regional nodes, assign a P value to each edge, and then iteratively add edges to the graph in order of increasing P. These probabilistically thresholded graphs are likely more robust to regional variation in head movement effects than comparable graphs constructed by thresholding correlations. Finally, we show that time-windowed estimates of df can be used for probabilistic connectivity testing or dynamic network analysis so that apparent changes in the functional connectome are appropriately corrected for the effects of transient noise bursts. 
Wavelet despiking is thus both an algorithm for fMRI time series denoising and an estimator of the (effective) df of the denoised process.
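
The inferential consequence of using effective rather than nominal df can be sketched with a Fisher z test of a correlation; the correlation value and df figures below are invented, and the Fisher z approximation stands in for the paper's wavelet-derived estimator.

```python
import math

def correlation_p(r, df_eff):
    """Two-sided p-value for a correlation via Fisher's z with given df."""
    z = math.atanh(r) * math.sqrt(df_eff - 3)
    return math.erfc(abs(z) / math.sqrt(2.0))

r = 0.35
p_nominal = correlation_p(r, 200)    # df = nominal number of time points
p_effective = correlation_p(r, 80)   # df after denoising is much smaller

print(p_nominal < p_effective)   # True: nominal df overstates significance
```

Thresholding graph edges by such p-values, computed with the smaller and more honest df, is the probabilistic graph-building step the abstract describes.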

  18. Seizure-Onset Mapping Based on Time-Variant Multivariate Functional Connectivity Analysis of High-Dimensional Intracranial EEG: A Kalman Filter Approach.

    Science.gov (United States)

    Lie, Octavian V; van Mierlo, Pieter

    2017-01-01

    The visual interpretation of intracranial EEG (iEEG) is the standard method used in complex epilepsy surgery cases to map the regions of seizure onset targeted for resection. Still, visual iEEG analysis is labor-intensive and biased due to interpreter dependency. Multivariate parametric functional connectivity measures using adaptive autoregressive (AR) modeling of the iEEG signals based on the Kalman filter algorithm have been used successfully to localize the electrographic seizure onsets. Due to their high computational cost, these methods have previously been applied only to a limited number of iEEG time series. In this study, two adaptive AR model implementations were evaluated on high-dimensional iEEG recordings, with seizure onset mapped through a graph-theory index of outward directed connectivity. Simulation results indicated high levels of mapping accuracy for the two models in the presence of low-to-moderate noise cross-correlation. Accordingly, both AR models correctly mapped the real seizure onset to the resection volume. This study supports the possibility of conducting fully data-driven multivariate connectivity estimations on high-dimensional iEEG datasets using the Kalman filter approach.
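
The core estimation idea, Kalman-filter tracking of adaptive AR coefficients, can be sketched in a deliberately simplified bivariate form (random-walk state model, scalar observation); the dimensions, noise levels and coupling below are illustrative, not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)                  # "source" channel
y = np.zeros(n)
for t in range(1, n):                   # y driven by past x (x -> y, gain 0.8)
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

a = np.zeros(2)                         # state: AR coefficients [a_yy, a_yx]
P = np.eye(2)                           # state covariance
Q, R = 1e-5 * np.eye(2), 0.1            # assumed state / observation noise

for t in range(1, n):
    H = np.array([y[t - 1], x[t - 1]])  # regressors: own and other past samples
    P = P + Q                           # predict (random-walk coefficients)
    K = P @ H / (H @ P @ H + R)         # Kalman gain
    a = a + K * (y[t] - H @ a)          # update with the prediction error
    P = P - np.outer(K, H) @ P

print(abs(a[1] - 0.8) < 0.15)           # a_yx converges near the true coupling
```

Directed connectivity indices are then derived from such time-varying coefficient estimates across all channel pairs, which is what becomes expensive at high channel counts.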

  19. Foreign Exchange Value-at-Risk with Multiple Currency Exposure: A Multivariate and Copula Generalized Autoregressive Conditional Heteroskedasticity Approach

    Science.gov (United States)

    2014-11-01

    strength of natural diversification benefits within value-at-risk (VaR) analyses. We examine two methods to attack this problem. Our first approach uses... portfolio VaR, correcting the independence-assumed VaR estimate by 25% in cases in which multiple currency exposures are of similar size.
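
The diversification effect at issue can be sketched with a plain variance-covariance VaR for two currency exposures; the Gaussian model here is a simplification of the report's multivariate/copula GARCH machinery, and all numbers are invented.

```python
import math

z = 2.326                      # ~99% one-sided normal quantile
w = [1.0e6, 1.0e6]             # two equal currency exposures (base currency)
vol = [0.006, 0.008]           # daily FX return volatilities
rho = 0.3                      # return correlation

# Stand-alone VaRs ignore correlation; portfolio VaR accounts for it
var_alone = [z * wi * vi for wi, vi in zip(w, vol)]
port_sd = math.sqrt((w[0] * vol[0]) ** 2 + (w[1] * vol[1]) ** 2
                    + 2 * rho * w[0] * vol[0] * w[1] * vol[1])
var_port = z * port_sd

print(var_port < sum(var_alone))   # True: diversification lowers the total VaR
```

The gap between `sum(var_alone)` and `var_port` is the natural diversification benefit the abstract quantifies.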

  20. An omic approach for the identification of oil sands process-affected water compounds using multivariate statistical analysis of ultrahigh resolution mass spectrometry datasets.

    Science.gov (United States)

    Chen, Yuan; McPhedran, Kerry N; Perez-Estrada, Leonidas; Gamal El-Din, Mohamed

    2015-04-01

    Oil sands process-affected water (OSPW) is a major environmental issue due to its acute and chronic toxicity to aquatic life. Advanced oxidation processes are promising treatments for successfully degrading toxic OSPW compounds. This study applied high resolution mass spectrometry to detect over 1000 compounds in OSPW samples after treatments including general ozonation, and ozone with carbonate, tert-butyl-alcohol, carbonate/tert-butyl-alcohol, tetranitromethane, or iron. Hierarchical clustering analysis showed that samples clustered based on sampling time, and principal component analysis corroborated these results while also providing information on the significant markers responsible for the clustering. Some markers were uniquely present in certain treatment conditions, while others showed variable behaviors in two or more treatments due to the presence of scavengers/catalysts. This advanced approach to monitoring significant changes in markers using multivariate analysis can be invaluable for future work on OSPW treatment by-products and their potential toxicity to organisms in the receiving environment.

  1. Climatic spatialization and analyses of longitudinal data of beef cattle Nellore raising Maranhão, Pará and Tocantins using univariate and multivariate approach

    Directory of Open Access Journals (Sweden)

    Jorge Luís Ferreira

    2014-09-01

    Full Text Available This study was carried out to spatialize the climatic factors that best discriminate the states of Maranhão, Pará and Tocantins, to analyze the structure of correlation between the phenotypic variables of standardized weights at 120, 210, 365, 450 and 550 days of age, and to propose phenotypic indices for animal selection in these states. The climate variables analyzed were maximum temperature, minimum temperature, average temperature, precipitation, normalized difference vegetation index, humidity, altitude, and the temperature and humidity index. Univariate and multivariate approaches, implemented with Statistical Analysis System (SAS) procedures, were used to explain the intra-variable relationships and the phenotypic and environmental variation. The expected differences in the progenies (EDPs) were predicted using the software MTDFREML. All climatic and phenotypic variables were effective in discriminating the Maranhão, Pará and Tocantins states. Thus, we suggest the use of phenotypic indices for classification and animal selection within each state.

  2. Multivariate covariance generalized linear models

    DEFF Research Database (Denmark)

    Bonat, W. H.; Jørgensen, Bent

    2016-01-01

    We propose a general framework for non-normal multivariate data analysis called multivariate covariance generalized linear models, designed to handle multivariate response variables, along with a wide range of temporal and spatial correlation structures defined in terms of a covariance link function combined with a matrix linear predictor involving known matrices. The method is motivated by three data examples that are not easily handled by existing methods. The first example concerns multivariate count data; the second involves response variables of mixed types, combined with repeated measures. The models are fitted by using an efficient Newton scoring algorithm based on quasi-likelihood and Pearson estimating functions, using only second-moment assumptions. This provides a unified approach to a wide variety of types of response variables and covariance structures, including multivariate extensions.

  3. Multivariate statistical approach to the temporal and spatial patterns of selected bioindicators observed in the North Sea during the years 1995-1997

    Science.gov (United States)

    Schmolke, S. R.; Broeg, K.; Zander, S.; Bissinger, V.; Hansen, P. D.; Kress, N.; Herut, B.; Jantzen, E.; Krüner, G.; Sturm, A.; Körting, W.; von Westernhagen, H.

    A comprehensive database containing biological and chemical information, collected in the framework of the bilateral interdisciplinary MARS project ("biological indicators of natural and man-made changes in marine and coastal waters") during the years 1995-1997 in the coastal environment of the North Sea, was subjected to a multivariate statistical evaluation. The MARS project was designed to combine a variety of approaches and to develop a set of methods for the employment of biological indicators in pollution monitoring and environmental quality assessment. In total, nine ship cruises to four coastal sampling sites were conducted; 765 fish and 384 mussel samples were analysed for biological and chemical parameters. Additional information on the chemical background at the sampling sites was derived from sediment samples collected at each of the four sampling sites. Based on the available chemical data in sediments and black mussel (Mytilus edulis), a pollution gradient between the selected sites was established. The chemical body burden of flounder (Platichthys flesus) from these sites, though, did not reflect this gradient equally clearly. In contrast, the biological information derived from measurements in fish samples displayed a significant regional as well as temporal pattern. A multivariate bioindicator data matrix was evaluated employing a factor analysis model to identify relations between selected biological indicators and to improve the understanding of the regional and temporal components of the parameter response. In a second approach, applying the k-means algorithm to the data matrix, two significantly different clusters of samples, characterised by the current health status of the fish, were extracted. Using this classification, a temporal and, in the second order, a less pronounced spatial effect was evident. In particular, during July 1996, a clear sign of deteriorating environmental conditions was extracted from the biological data matrix.
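
The two-cluster classification step can be sketched with a small k-means (k = 2) on a synthetic bioindicator matrix; the data and the deterministic far-apart initialization below are illustrative, not the MARS measurements.

```python
import numpy as np

def kmeans2(X, n_iter=50):
    centers = X[[0, -1]].astype(float)          # deterministic, far-apart init
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)               # assign to the nearest center
        centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

rng = np.random.default_rng(5)
healthy = rng.normal(0.0, 1.0, (50, 4))         # 4 bioindicators per sample
impaired = rng.normal(3.0, 1.0, (50, 4))
X = np.vstack([healthy, impaired])

labels = kmeans2(X)
truth = np.r_[np.zeros(50, int), np.ones(50, int)]
acc = max(np.mean(labels == truth), np.mean(labels == 1 - truth))
print(acc > 0.95)   # the two synthetic "health status" groups are recovered
```

On real data the cluster labels would then be cross-tabulated against site and season to expose the temporal and spatial effects described above.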

  4. Regional-scale controls on the spatial activity of rockfalls (Turtmann Valley, Swiss Alps) - A multivariate modeling approach

    Science.gov (United States)

    Messenzehl, Karoline; Meyer, Hanna; Otto, Jan-Christoph; Hoffmann, Thomas; Dikau, Richard

    2017-06-01

    In mountain geosystems, rockfalls are among the most effective sediment transfer processes, as reflected in the regional-scale distribution of talus slopes. However, the understanding of the key controlling factors seems to decrease with increasing spatial scale, due to emergent and complex system behavior and not least to recent methodological shortcomings in rockfall modeling research. In this study, we aim (i) to develop a new approach to identify major regional-scale rockfall controls and (ii) to quantify the relative importance of these controls. Using a talus slope inventory in the Turtmann Valley (Swiss Alps), we applied for the first time the decision-tree-based random forest algorithm (RF) in combination with a principal component logistic regression (PCLR) to evaluate the spatial distribution of rockfall activity. This study presents new insights into the discussion on whether periglacial rockfall events are controlled more by topo-climatic, cryospheric, paraglacial or/and rock mechanical properties. Both models explain the spatial rockfall pattern very well, given the high areas under the Receiver Operating Characteristic (ROC) curves of > 0.83. The highest accuracy was obtained by the RF, which correctly predicted 88% of the rockfall source areas. The RF appears to have great potential in geomorphic research involving multicollinear data. The regional permafrost distribution, coupled to the bedrock curvature and valley topography, was detected to be the primary rockfall control. Rockfall source areas cluster within a low-radiation elevation belt (2900-3300 m a.s.l.) consistent with a permafrost probability of > 90%. The second most important factor is the time since deglaciation, reflected by the high abundance of rockfalls along recently deglaciated (< 100 years), north-facing slopes. However, our findings also indicate a strong rock mechanical control on the paraglacial rockfall activity, declining either exponentially or linearly since deglaciation.
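
The model comparison above rests on the area under the ROC curve; the AUC can be computed directly via the rank-sum (Mann-Whitney) identity, sketched here on invented scores rather than the study's predictions.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC = P(score_pos > score_neg), counting ties as 1/2."""
    wins = sum((p > q) + 0.5 * (p == q)
               for p in scores_pos for q in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model scores for cells with / without mapped rockfall sources
pos = [0.91, 0.85, 0.78, 0.66, 0.60]
neg = [0.70, 0.45, 0.40, 0.33, 0.20]

auc = roc_auc(pos, neg)
print(auc)   # 0.92: well above the 0.5 chance level
```

An AUC > 0.83, as reported above, means a randomly chosen source cell outranks a randomly chosen non-source cell more than 83% of the time.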

  5. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    Science.gov (United States)

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach for identifying sources of variability within a manufacturing process, based on NIR measurements of samples of intermediate material after each consecutive unit operation (the interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases where historical process data are not readily available. In the presented case, changes in lactose characteristics influenced the performance of the extrusion/spheronization process step. The pellet cores produced using one (considered less suitable) lactose source were on average larger and more fragile, leading to breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process that leads to compromised film coating.

  6. A novel multivariate approach using science-based calibration for direct coating thickness determination in real-time NIR process monitoring.

    Science.gov (United States)

    Möltgen, C-V; Herdling, T; Reich, G

    2013-11-01

    This study demonstrates an approach, using science-based calibration (SBC), for direct coating thickness determination on heart-shaped tablets in real time. Near-infrared (NIR) spectra were collected during four full industrial pan coating operations. The tablets were coated with a thin hydroxypropyl methylcellulose (HPMC) film up to a film thickness of 28 μm. The application of SBC permits the calibration of the NIR spectral data without costly determination of reference values, because SBC combines classical methods to estimate the coating signal with statistical methods for the noise estimation. The approach enabled the use of NIR to measure the film thickness increase from around 8 to 28 μm in four independent batches in real time. The developed model provided a spectroscopic limit of detection for the coating thickness of 0.64 ± 0.03 μm root-mean-square (RMS). In the commonly used statistical methods for calibration, such as Partial Least Squares (PLS), sufficiently varying reference values are needed for calibration. For thin non-functional coatings this is a challenge, because the quality of the model depends on the accuracy of the selected calibration standards. The straightforward SBC approach eliminates many of the problems associated with the conventional statistical methods and offers an alternative for multivariate calibration.

  7. A primer of multivariate statistics

    CERN Document Server

    Harris, Richard J

    2014-01-01

    Drawing upon more than 30 years of experience in working with statistics, Dr. Richard J. Harris has updated A Primer of Multivariate Statistics to provide a model of balance between how-to and why. This classic text covers multivariate techniques with a taste of latent variable approaches. Throughout the book there is a focus on the importance of describing and testing one's interpretations of the emergent variables that are produced by multivariate analysis. This edition retains its conversational writing style while focusing on classical techniques. The book gives the reader a feel for why

  8. Multivariate extensions of expectiles risk measures

    Directory of Open Access Journals (Sweden)

    Maume-Deschamps Véronique

    2017-01-01

    Full Text Available This paper is devoted to the introduction and study of a new family of multivariate elicitable risk measures. We call the obtained vector-valued measures multivariate expectiles. We present the different approaches used to construct our measures. We discuss the coherence properties of these multivariate expectiles. Furthermore, we propose a stochastic approximation tool of these risk measures.
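    The vector-valued construction builds on the scalar τ-expectile, the minimizer of an asymmetrically weighted squared loss. A minimal sketch of the scalar case — the fixed-point iteration and the sample data are illustrative assumptions, not the paper's multivariate construction:

```python
import numpy as np

def expectile(x, tau, iters=100):
    """tau-expectile via asymmetric-least-squares fixed-point iteration:
    e minimizes sum(w_i * (x_i - e)^2) with w_i = tau above e, 1-tau below."""
    x = np.asarray(x, dtype=float)
    e = x.mean()                        # tau = 0.5 gives the mean exactly
    for _ in range(iters):
        w = np.where(x > e, tau, 1.0 - tau)
        e = np.average(x, weights=w)    # weighted-mean fixed point
    return e

rng = np.random.default_rng(1)
sample = rng.standard_normal(100_000)
e50 = expectile(sample, 0.5)   # coincides with the sample mean
e90 = expectile(sample, 0.9)   # lies above the mean for tau > 0.5
```

    Unlike quantiles, expectiles are elicitable with a strictly convex scoring function, which is the property the paper extends to the multivariate setting.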

  9. How many taxa can be recognized within the complex Tillandsia capillaris (Bromeliaceae, Tillandsioideae)? Analysis of the available classifications using a multivariate approach

    Directory of Open Access Journals (Sweden)

    Lucía Castello

    2013-05-01

    Full Text Available Tillandsia capillaris Ruiz & Pav., which belongs to the subgenus Diaphoranthema, is distributed in Ecuador, Peru, Bolivia, northern and central Argentina, and Chile, and includes forms that are difficult to circumscribe, thus considered to form a complex. The entities of this complex are predominantly small-sized epiphytes, adapted to xeric environments. The most widely used classification defines 5 forms for this complex based on a few morphological reproductive traits: T. capillaris Ruiz & Pav. f. capillaris, T. capillaris f. incana (Mez) L.B. Sm., T. capillaris f. cordobensis (Hieron.) L.B. Sm., T. capillaris f. hieronymi (Mez) L.B. Sm. and T. capillaris f. virescens (Ruiz & Pav.) L.B. Sm. In this study, 35 floral and vegetative characters were analyzed with a multivariate approach in order to assess and discuss different proposals for classification of the T. capillaris complex, which presents morphotypes that co-occur in central and northern Argentina. To accomplish this, data on quantitative and categorical morphological characters of flowers and leaves were collected from herbarium specimens and field collections and were analyzed with statistical multivariate techniques. The results suggest that the latest classification for the complex is the most comprehensive, and three taxa were delimited: T. capillaris (=T. capillaris f. incana-hieronymi), T. virescens s. str. (=T. capillaris f. cordobensis) and T. virescens s. l. (=T. capillaris f. virescens). While T. capillaris and T. virescens s. str. co-occur, T. virescens s. l. is restricted to altitudes above 2000 m in Argentina. Characters previously used for taxa delimitation showed continuous variation and therefore were not useful. New diagnostic characters are proposed and a key is provided for delimiting these three taxa within the complex.

  10. Multivariate approaches in plant science

    DEFF Research Database (Denmark)

    Gottlieb, D.M.; Schultz, j.; Bruun, Susanne Wrang

    2004-01-01

    The objective of proteomics is to get an overview of the proteins expressed at a given point in time in a given tissue and to identify the connection to the biochemical status of that tissue. Therefore sample throughput and analysis time are important issues in proteomics. The concept of proteomi...

  11. Wavelet-based analysis and power law classification of C/NOFS high-resolution electron density data

    Science.gov (United States)

    Rino, C. L.; Carrano, C. S.; Roddy, Patrick

    2014-08-01

    This paper applies new wavelet-based analysis procedures to low Earth-orbiting satellite measurements of equatorial ionospheric structure. The analysis was applied to high-resolution data from 285 Communications/Navigation Outage Forecasting System (C/NOFS) satellite orbits sampling the postsunset period at geomagnetic equatorial latitudes. The data were acquired during a period of progressively intensifying equatorial structure. The sampled altitude range varied from 400 to 800 km. The varying scan velocity remained within 20° of the cross-field direction. Time-to-space interpolation generated uniform samples at approximately 8 m. A maximum segmentation length that supports stochastic structure characterization was identified. A two-component inverse power law model was fit to scale spectra derived from each segment together with a goodness-of-fit measure. Inverse power law parameters derived from the scale spectra were used to classify the scale spectra by type. The largest category was characterized by a single inverse power law with a mean spectral index somewhat larger than 2. No systematic departure from the inverse power law was observed to scales greater than 100 km. A small subset of the most highly disturbed passes at the lowest sampled altitudes could be categorized by two-component power law spectra with a range of break scales from less than 100 m to several kilometers. The results are discussed within the context of other analyses of in situ data and spectral characteristics used for scintillation analyses.
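    The paper fits a two-component inverse power law to each scale spectrum; the single-component version reduces to an ordinary least-squares line in log-log coordinates. A hedged numpy sketch with synthetic data — the spectral index, frequency grid, and noise level below are invented for illustration, not taken from the C/NOFS results:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic one-sided power spectrum S(k) ~ k^-p with multiplicative noise.
p_true = 2.2
k = np.logspace(-3, 0, 200)                # spatial frequency (arbitrary units)
S = k ** (-p_true) * np.exp(0.1 * rng.standard_normal(k.size))

# Single-component inverse power-law fit: log S = log C - p log k,
# i.e. a straight line in log-log coordinates.
A = np.vstack([np.ones_like(k), -np.log(k)]).T
coef, *_ = np.linalg.lstsq(A, np.log(S), rcond=None)
logC, p_hat = coef
```

    The two-component model in the paper adds a second slope and a break scale; the residual of a fit like this one is what a goodness-of-fit measure would flag when a single power law is insufficient.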

  12. Application of Wavelet-Based Tools to Study the Dynamics of Biological Processes

    DEFF Research Database (Denmark)

    Pavlov, A. N.; Makarov, V. A.; Mosekilde, Erik

    2006-01-01

    The article makes use of three different examples (sensory information processing in the rat trigeminal complex, intracellular interaction in snail neurons and multimodal dynamics in nephron autoregulation) to demonstrate how modern approaches to time-series analysis based on the wavelet...

  13. Proposing Wavelet-Based Low-Pass Filter and Input Filter to Improve Transient Response of Grid-Connected Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Bijan Rahmani

    2016-08-01

    Full Text Available Available photovoltaic (PV) systems show a prolonged transient response when integrated into the power grid via active filters. On the one hand, the conventional low-pass filter employed within the integrated PV system works with a large delay, particularly in the presence of the system's low-order harmonics. On the other hand, the switching of the DC (direct current)-DC converters within PV units also prolongs the transient response of an integrated system, injecting harmonics and distortion through the PV-end current. This paper first develops a wavelet-based low-pass filter to improve the transient response of PV systems interconnected to grid lines. Further, a damped input filter is proposed within the PV system to address the converter switching issue. Finally, Matlab/Simulink simulations validate the effectiveness of the proposed wavelet-based low-pass filter and damped input filter within an integrated PV system.
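    One simple way to realize a wavelet-based low-pass filter is to zero the detail bands of a multilevel decomposition and reconstruct from the approximation alone. The sketch below does this with a Haar wavelet; the wavelet choice, test signal, and level count are illustrative assumptions, not the authors' filter design:

```python
import numpy as np

def haar_lowpass(x, levels):
    """Zero all Haar detail bands down to `levels` scales and reconstruct.
    Equivalent to replacing each block of 2**levels samples by its mean.
    Assumes len(x) is divisible by 2**levels."""
    approx = np.asarray(x, dtype=float)
    for _ in range(levels):                   # analysis: keep approximations
        approx = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)
    for _ in range(levels):                   # synthesis with zeroed details
        up = np.empty(2 * approx.size)
        up[0::2] = up[1::2] = approx / np.sqrt(2.0)
        approx = up
    return approx

# Fundamental plus a high-order harmonic ripple (both synthetic).
t = np.linspace(0.0, 1.0, 512, endpoint=False)
fundamental = np.sin(2 * np.pi * 5 * t)
noisy = fundamental + 0.3 * np.sin(2 * np.pi * 120 * t)
smooth = haar_lowpass(noisy, levels=2)
```

    Because the detail bands are simply discarded rather than phase-filtered, the pass-band delay is only the block length, which is the property that makes such filters attractive against a conventional IIR low-pass.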

  14. A Wavelet-Based Robust Relevance Vector Machine Based on Sensor Data Scheduling Control for Modeling Mine Gas Gushing Forecasting on Virtual Environment

    OpenAIRE

    Wang Ting; Cai Lin-qin; Fu Yao; Zhu Tingcheng

    2013-01-01

    It is well known that mine gas gushing forecasting is very significant to ensure the safety of mining. A wavelet-based robust relevance vector machine based on sensor data scheduling control for modeling mine gas gushing forecasting is presented in this paper. The Morlet wavelet function can be used as the kernel function of the robust relevance vector machine. The mean percentage error has been used to measure the performance of the proposed method in this study. As the mean prediction error of mine gas g...

  15. Perceptual Copyright Protection Using Multiresolution Wavelet-Based Watermarking And Fuzzy Logic

    CERN Document Server

    Hsieh, Ming-Shing

    2010-01-01

    In this paper, an efficient DWT-based watermarking technique is proposed to embed signatures in images to attest owner identification and discourage unauthorized copying. A fuzzy inference filter is used to choose the higher-entropy coefficients in which to embed watermarks. Unlike most previous watermarking frameworks, which embedded watermarks in the larger coefficients of inner coarser subbands, the proposed technique utilizes a context model and a fuzzy inference filter, embedding watermarks in the larger-entropy coefficients of coarser DWT subbands. The proposed approaches allow the casting degree of the watermarks to be adapted for transparency and robustness to general image-processing attacks such as smoothing, sharpening, and JPEG compression. The approach does not need the original host image to extract the watermarks. Our schemes have been shown to provide very good results in both image transparency and robustness.

  16. Perceptual Copyright Protection Using Multiresolution Wavelet-Based Watermarking And Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Ming-Shing Hsieh

    2010-07-01

    Full Text Available In this paper, an efficient DWT-based watermarking technique is proposed to embed signatures in images to attest owner identification and discourage unauthorized copying. A fuzzy inference filter is used to choose the higher-entropy coefficients in which to embed watermarks. Unlike most previous watermarking frameworks, which embedded watermarks in the larger coefficients of inner coarser subbands, the proposed technique utilizes a context model and a fuzzy inference filter, embedding watermarks in the larger-entropy coefficients of coarser DWT subbands. The proposed approaches allow the casting degree of the watermarks to be adapted for transparency and robustness to general image-processing attacks such as smoothing, sharpening, and JPEG compression. The approach does not need the original host image to extract the watermarks. Our schemes have been shown to provide very good results in both image transparency and robustness.

  18. Wavelet-based calculation of cerebral angiographic data from time-resolved CT perfusion acquisitions

    Energy Technology Data Exchange (ETDEWEB)

    Havla, Lukas; Dietrich, Olaf [Ludwig-Maximilians-University Hospital Munich, Josef-Lissner-Laboratory for Biomedical Imaging, Institute for Clinical Radiology, Munich (Germany); Thierfelder, Kolja M.; Beyer, Sebastian E.; Sommer, Wieland H. [Ludwig-Maximilians-University Hospital Munich, Institute for Clinical Radiology, Munich (Germany)

    2015-08-15

    To evaluate a new approach for reconstructing angiographic images by application of wavelet transforms on CT perfusion data. Fifteen consecutive patients with suspected stroke were examined with a multi-detector CT acquiring 32 dynamic phases (Δt = 1.5 s) of 99 slices (total slab thickness 99 mm) at 80 kV/200 mAs. Thirty-five mL of iomeprol-350 was injected (flow rate = 4.5 mL/s). Angiographic datasets were calculated after initial rigid-body motion correction using (a) temporally filtered maximum intensity projections (tMIP) and (b) the wavelet transform (Paul wavelet, order 1) of each voxel time course. The maximum of the wavelet power spectrum was defined as the angiographic signal intensity. The contrast-to-noise ratio (CNR) of 18 different vessel segments was quantified, and two blinded readers rated the images qualitatively using 5-point Likert scales. The CNR for the wavelet angiography (501.8 ± 433.0) was significantly higher than for the tMIP approach (55.7 ± 29.7, Wilcoxon test p < 0.00001). Image quality was rated significantly higher (p < 0.001) for the wavelet angiography, with median scores of 4/4 (reader 1/reader 2), than for the tMIP (scores of 3/3). The proposed calculation approach for angiography data using temporal wavelet transforms of intracranial CT perfusion datasets provides higher vascular contrast and intrinsic removal of non-enhancing structures such as bone. (orig.)

  19. Optimal mother wavelet-based Lamb wave analyses and damage detection for composite structures

    Institute of Scientific and Technical Information of China (English)

    Li Fucai; Meng Guang; Ye Lin

    2007-01-01

    With the purpose of on-line structural health monitoring, a transducer network was embedded into a composite structure to minimize the influence of the surroundings. The intrinsic dispersion characteristic of Lamb waves makes the wavelet transform an effective signal processing method for guided waves. To achieve high precision in feature extraction, an information-entropy-based optimal mother wavelet selection approach was proposed, which chooses the most appropriate basis function for a particular Lamb wave analysis. By using the embedded sensor network and extracting the time-of-flight, delamination in the composite laminate was identified and located. The results demonstrate the effectiveness of the proposed methods.

  20. A Low-complexity Wavelet Based Algorithm for Inter-frame Image Prediction

    Directory of Open Access Journals (Sweden)

    S. Usama

    2002-01-01

    Full Text Available In this paper, a novel multi-resolution variable block size algorithm (MRVBS) is introduced. It is based on: (1) using the wavelet components of the seven sub-bands from two layers of the wavelet pyramid in the lowest resolution; (2) performing a block-matching estimation within a nine-block region in each sub-band of the lower layer; (3) scaling the estimated motion vectors and using them as a new search center for the finest resolution. The motivation for using the multi-resolution approach is the inherent structure of the wavelet representation. A multi-resolution scheme significantly reduces the searching time and provides a smooth motion vector field. The approach presented in this paper provides an accurate motion estimate even in the presence of single and mixed noise. As part of this framework, a comparison of the full search (FS) algorithm, the three-step search (TSS) algorithm and the new algorithm (MRVBS) is presented. For a small addition in computational complexity over a simple TSS algorithm, the new algorithm achieves good results in the presence of noise.

  1. A Discrete Wavelet Based Feature Extraction and Hybrid Classification Technique for Microarray Data Analysis

    Directory of Open Access Journals (Sweden)

    Jaison Bennet

    2014-01-01

    Full Text Available In earlier days, cancer classification by doctors and radiologists was based on morphological and clinical features and had limited diagnostic ability. The recent arrival of DNA microarray technology has enabled the concurrent monitoring of thousands of gene expressions on a single chip, which has stimulated progress in cancer classification. In this paper, we propose a hybrid approach for microarray data classification based on k-nearest neighbor (KNN), naive Bayes, and support vector machine (SVM) classifiers. Feature selection prior to classification plays a vital role, and a feature selection technique which combines the discrete wavelet transform (DWT) and a moving window technique (MWT) is used. The performance of the proposed method is compared with the conventional classifiers, namely support vector machine, nearest neighbor, and naive Bayes. Experiments have been conducted on both real and benchmark datasets, and the results indicate that the ensemble approach produces higher classification accuracy than the conventional classifiers. This work serves as an automated system for the classification of cancer that can be applied by doctors in real cases, which serves as a boon to the medical community. It further reduces the misclassification of cancers, which is unacceptable in cancer detection.
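    The ensemble step — combining several base classifiers by majority vote — can be sketched as follows. For a self-contained illustration the SVM is replaced by a nearest-centroid rule, and the "expression profiles" are synthetic; this is a hedged sketch of the voting idea, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two well-separated synthetic "expression profile" classes.
X0 = rng.standard_normal((30, 5))
X1 = rng.standard_normal((30, 5)) + 3.0
X = np.vstack([X0, X1])
y = np.array([0] * 30 + [1] * 30)

def knn_predict(Xtr, ytr, x, k=3):
    d = np.linalg.norm(Xtr - x, axis=1)
    votes = ytr[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

def centroid_predict(Xtr, ytr, x):          # stand-in for the SVM
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    return int(np.linalg.norm(x - c1) < np.linalg.norm(x - c0))

def gaussian_nb_predict(Xtr, ytr, x):
    score = []
    for c in (0, 1):
        mu, var = Xtr[ytr == c].mean(0), Xtr[ytr == c].var(0) + 1e-9
        score.append(-0.5 * np.sum(np.log(var) + (x - mu) ** 2 / var))
    return int(score[1] > score[0])

def ensemble_predict(Xtr, ytr, x):
    votes = [knn_predict(Xtr, ytr, x), centroid_predict(Xtr, ytr, x),
             gaussian_nb_predict(Xtr, ytr, x)]
    return max(set(votes), key=votes.count)  # majority of three voters

test_points = np.array([[0.1, -0.2, 0.0, 0.3, -0.1],   # near class 0
                        [3.2, 2.8, 3.1, 2.9, 3.0]])    # near class 1
preds = [ensemble_predict(X, y, x) for x in test_points]
```

    With three voters a tie is impossible, so the vote is always decisive; in the paper the votes come after DWT/MWT feature selection rather than on raw features.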

  2. Wavelet-based Characterization of Small-scale Solar Emission Features at Low Radio Frequencies

    Science.gov (United States)

    Suresh, A.; Sharma, R.; Oberoi, D.; Das, S. B.; Pankratius, V.; Timar, B.; Lonsdale, C. J.; Bowman, J. D.; Briggs, F.; Cappallo, R. J.; Corey, B. E.; Deshpande, A. A.; Emrich, D.; Goeke, R.; Greenhill, L. J.; Hazelton, B. J.; Johnston-Hollitt, M.; Kaplan, D. L.; Kasper, J. C.; Kratzenberg, E.; Lynch, M. J.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Ord, S. M.; Prabu, T.; Rogers, A. E. E.; Roshi, A.; Udaya Shankar, N.; Srivani, K. S.; Subrahmanyan, R.; Tingay, S. J.; Waterson, M.; Wayth, R. B.; Webster, R. L.; Whitney, A. R.; Williams, A.; Williams, C. L.

    2017-07-01

    Low radio frequency solar observations using the Murchison Widefield Array have recently revealed the presence of numerous weak short-lived narrowband emission features, even during moderately quiet solar conditions. These nonthermal features occur at rates of many thousands per hour in the 30.72 MHz observing bandwidth, and hence necessarily require an automated approach for their detection and characterization. Here, we employ continuous wavelet transform using a mother Ricker wavelet for feature detection from the dynamic spectrum. We establish the efficacy of this approach and present the first statistically robust characterization of the properties of these features. In particular, we examine distributions of their peak flux densities, spectral spans, temporal spans, and peak frequencies. We can reliably detect features weaker than 1 SFU, making them, to the best of our knowledge, the weakest bursts reported in literature. The distribution of their peak flux densities follows a power law with an index of -2.23 in the 12-155 SFU range, implying that they can provide an energetically significant contribution to coronal and chromospheric heating. These features typically last for 1-2 s and possess bandwidths of about 4-5 MHz. Their occurrence rate remains fairly flat in the 140-210 MHz frequency range. At the time resolution of the data, they appear as stationary bursts, exhibiting no perceptible frequency drift. These features also appear to ride on a broadband background continuum, hinting at the likelihood of them being weak type-I bursts.
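    The detection step described above — a continuous wavelet transform with a Ricker mother wavelet, then locating power maxima — can be sketched in one dimension. The direct-convolution CWT, the burst shape, and the noise level below are illustrative assumptions, not the MWA processing chain:

```python
import numpy as np

def ricker(width, n):
    """Ricker (Mexican-hat) mother wavelet sampled on n points."""
    t = np.arange(n) - (n - 1) / 2.0
    a = t / width
    return (1.0 - a ** 2) * np.exp(-0.5 * a ** 2)

def cwt_ricker(signal, widths):
    """CWT by direct convolution; one row of coefficients per width."""
    out = np.empty((len(widths), signal.size))
    for i, w in enumerate(widths):
        out[i] = np.convolve(signal, ricker(w, 10 * w), mode="same")
    return out

# A weak, short-lived burst on a noisy background "dynamic-spectrum" slice.
rng = np.random.default_rng(4)
n = 1024
x = 0.2 * rng.standard_normal(n)
x[500:508] += 2.0 * np.hanning(8)           # the transient feature

coeffs = cwt_ricker(x, widths=[2, 4, 8])
scale, pos = np.unravel_index(np.abs(coeffs).argmax(), coeffs.shape)
```

    The peak coefficient localizes the burst in time while the best-responding width estimates its span — the two quantities the paper characterizes statistically over thousands of features.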

  3. Geochemistry of natural and anthropogenic fall-out (aerosol and precipitation) collected from the NW Mediterranean: two different multivariate statistical approaches

    Energy Technology Data Exchange (ETDEWEB)

    Molinaroli, E.; Pistolato, M.; Rampazzo, G. [Dipartimento di Scienze Ambientali, Universita di Venezia, Dorsoduro 2137, 30123 Venezia (Italy); Guerzoni, S. [CNR, Istituto di Geologia Marina, via Gobetti 101, 40129 Bologna (Italy)

    1999-06-01

    The chemical characteristics of the mineral fractions of aerosol and precipitation collected in Sardinia (NW Mediterranean) are highlighted by means of two multivariate statistical approaches. Two different combinations of classification and statistical methods for geochemical data are presented. It is shown that the application of cluster analysis subsequent to Q-Factor analysis better distinguishes among Saharan dust, background pollution (Europe-Mediterranean) and local aerosol from various source regions (Sardinia). Conversely, the application of simple cluster analysis was able to distinguish only between aerosols and precipitation particles, without assigning the sources (local or distant) to the aerosol. This method also highlighted the fact that crust-enriched precipitation is similar to desert-derived aerosol. Major elements (Al, Na) and trace metal (Pb) turn out to be the most discriminating elements of the analysed data set. Independent use of mineralogical, granulometric and meteorological data confirmed the results derived from the statistical methods employed. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  4. Study of a Multivariate Approach for the Background Rejection in the Scattering of Two Like-Charge $W^{\\pm}$ Bosons with the ATLAS Detector at the LHC

    CERN Document Server

    AUTHOR|(CDS)2100403; Kobel, Michael; Straessner, Arno

    This thesis presents the study of a multivariate approach for the background rejection in the scattering of two like-charge $W^{\\pm}$ bosons with the ATLAS detector at the Large Hadron Collider. The scattering process can be accessed through the measurement of purely electroweak production of two like-charge $W^{\\pm}$ bosons and two jets in the fully leptonic decay channel of the $W^{\\pm}$ bosons. Although the characteristic signature of the final state of this production process already reduces most Standard Model backgrounds, other processes exist that leave the same experimental signature in the detector. QCD-initiated production of a $W^{\\pm}$ boson and a $Z$ boson in association with two jets with leptonic decay of the $W^{\\pm}$ and the $Z$ boson accounts for the largest background contribution. Thus, the focus of this thesis is set on the rejection of this background. As a very promising technique for this classification problem, boosted decision trees are studied in this thesis. The variable ranking of...

  5. A nonlinear Granger causality test between stock returns and investor sentiment for Chinese stock market: a wavelet-based approach

    NARCIS (Netherlands)

    Chu, X.; Wu, C.; Qiu, J.

    2016-01-01

    In this article, we re-examine the causality between the stock returns and investor sentiment in China. The number of net added accounts is used as a proxy for investor sentiment. To mimic the different investment horizons of market participants, we use the wavelet method to decompose stock returns

  6. The EM Method in a Probabilistic Wavelet-Based MRI Denoising

    Directory of Open Access Journals (Sweden)

    Marcos Martin-Fernandez

    2015-01-01

    Full Text Available Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed, and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method based on these facts. The method performs shrinkage of the wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation-maximization (EM) method, which avoids the need for an estimator of the noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images.
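    The spirit of the method — classify each wavelet coefficient as noise or detail, estimate the model parameters by EM without a separate noise-variance estimator, and shrink by the posterior probability — can be sketched with a simplified two-component zero-mean Gaussian mixture in place of the paper's Gaussian/Laplacian wavelet model (the data and all parameters below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(5)

# Wavelet-domain toy model: most coefficients are pure noise (small
# variance), a few carry signal detail (large variance).
coeffs = np.concatenate([rng.normal(0.0, 1.0, 5000),
                         rng.normal(0.0, 8.0, 500)])

# EM for a two-component zero-mean Gaussian mixture (noise vs. detail);
# note that no external noise-variance estimator is required.
pi, v0, v1 = 0.5, np.var(coeffs) / 10, np.var(coeffs)   # crude init
for _ in range(50):
    p0 = (1 - pi) * np.exp(-coeffs ** 2 / (2 * v0)) / np.sqrt(v0)
    p1 = pi * np.exp(-coeffs ** 2 / (2 * v1)) / np.sqrt(v1)
    r = p1 / (p0 + p1)                  # responsibility: "is detail"
    pi = r.mean()
    v1 = np.sum(r * coeffs ** 2) / np.sum(r)
    v0 = np.sum((1 - r) * coeffs ** 2) / np.sum(1 - r)

# Shrinkage: keep each coefficient in proportion to P(detail | coeff).
shrunk = r * coeffs
```

    Small coefficients (likely noise) are driven toward zero while large ones pass almost unchanged, which is the qualitative behavior of the conditional-probability shrinkage in the paper.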

  7. Wavelet-based texture analysis of EEG signal for prediction of epileptic seizure

    Science.gov (United States)

    Petrosian, Arthur A.; Homan, Richard; Pemmaraju, Suryalakshmi; Mitra, Sunanda

    1995-09-01

    Electroencephalographic (EEG) signal texture content analysis has been proposed for early warning of an epileptic seizure. This approach was evaluated by investigating the interrelationship between texture features and basic signal informational characteristics, such as Kolmogorov complexity and fractal dimension. A comparison of several traditional techniques, including higher-order FIR digital filtering, chaos, autoregressive and FFT time-frequency analysis, was also carried out on the same epileptic EEG recording. The purpose of this study is to investigate whether the wavelet transform can be used to further enhance the developed methods for prediction of epileptic seizures. The combined consideration of texture and entropy characteristics extracted from subsignals decomposed by the wavelet transform is explored for that purpose. In addition, a novel neuro-fuzzy clustering algorithm is performed on the wavelet coefficients to segment a given EEG recording into different stages prior to an actual seizure onset.

  8. Comparison on Integer Wavelet Transforms in Spherical Wavelet Based Image Based Relighting

    Institute of Scientific and Technical Information of China (English)

    WANG Ze; LEE Yin; LEUNG Chi-sing; WONG Tien-tsin; ZHU Yisheng

    2003-01-01

    To provide good-quality rendering in an image-based relighting (IBL) system, a tremendous number of reference images under various illumination conditions are needed; therefore, data compression is essential to enable interactive use, and rendering speed is another crucial consideration for real applications. Based on the spherical wavelet transform (SWT), this paper presents a fast representation method using the integer wavelet transform (IWT) for the IBL system. It focuses on a comparison of different IWTs with the embedded zerotree wavelet (EZW) coder used in the IBL system. The whole compression procedure contains two major steps. First, the SWT is applied to exploit the correlation among different reference images. Second, the SW-transformed images are compressed with an IWT-based image compression approach. Two IWTs are used, and good results are shown in the simulations.
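    What makes an integer wavelet transform attractive ahead of an EZW-style coder is that it maps integers to integers and is exactly invertible. A minimal sketch of the one-level integer Haar (S) transform via lifting — illustrative only, not one of the specific IWTs compared in the paper:

```python
import numpy as np

def int_haar_forward(x):
    """One level of the integer Haar (S) transform via lifting.
    Integer-to-integer, hence exactly invertible (lossless)."""
    x = np.asarray(x, dtype=np.int64)
    d = x[1::2] - x[0::2]              # detail: pairwise difference
    a = x[0::2] + (d >> 1)             # approximation: floor of pair mean
    return a, d

def int_haar_inverse(a, d):
    x = np.empty(2 * a.size, dtype=np.int64)
    x[0::2] = a - (d >> 1)             # same floor division, so exact undo
    x[1::2] = x[0::2] + d
    return x

pixels = np.array([12, 14, 200, 203, 7, 7, 90, 40], dtype=np.int64)
a, d = int_haar_forward(pixels)
restored = int_haar_inverse(a, d)
```

    Because the same floored lifting step is applied in both directions, the round-off cancels and reconstruction is bit-exact, regardless of parity or sign of the differences.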

  9. Wavelet-based improved Chan-Vese model for image segmentation

    Science.gov (United States)

    Zhao, Xiaoli; Zhou, Pucheng; Xue, Mogen

    2016-10-01

    In this paper, an image segmentation approach based on an improved Chan-Vese (CV) model and the wavelet transform is proposed. First, one-level wavelet decomposition is adopted to obtain the low-frequency approximation image. Then, the improved CV model, which contains a global term, a local term and a regularization term, is utilized to segment the low-frequency approximation image, yielding a coarse segmentation result. Finally, the coarse segmentation result is interpolated to the fine scale as an initial contour, and the improved CV model is applied again to obtain the fine-scale segmentation result. Experimental results show that our method can segment low-contrast and/or intensity-inhomogeneous images more effectively than traditional level set methods.
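    The coarse-to-fine scheme can be sketched by pairing a one-level Haar approximation with the data term of the piecewise-constant CV energy. The length penalty, local term, and level-set evolution are omitted; the two-means iteration below is a deliberate simplification under synthetic data, not the authors' model:

```python
import numpy as np

def haar_approx(img):
    """One-level 2-D Haar approximation (coarse image at half resolution)."""
    return (img[0::2, 0::2] + img[0::2, 1::2]
            + img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def two_means_segment(img, iters=20):
    """Minimize only the data term of the piecewise-constant CV energy:
    alternate between region means c1/c2 and the induced labeling."""
    thr = img.mean()
    for _ in range(iters):
        inside = img >= thr
        c1 = img[inside].mean() if inside.any() else img.max()
        c2 = img[~inside].mean() if (~inside).any() else img.min()
        thr = 0.5 * (c1 + c2)
    return img >= thr

# Synthetic low-contrast image: bright square on a noisy dark background.
rng = np.random.default_rng(6)
img = rng.normal(0.3, 0.05, (64, 64))
img[16:48, 16:48] += 0.2                      # object region

coarse_mask = two_means_segment(haar_approx(img))        # 32x32 coarse result
init_mask = np.kron(coarse_mask.astype(int),
                    np.ones((2, 2), dtype=int)).astype(bool)  # upsampled init
fine_mask = two_means_segment(img)            # fine-scale refinement
```

    In the full method `init_mask` would seed the fine-scale level-set contour rather than the fine stage being rerun independently; the sketch only shows why segmenting the approximation image first is cheap and stable.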

  10. Wavelet-Based Watermarking and Compression for ECG Signals with Verification Evaluation

    Directory of Open Access Journals (Sweden)

    Kuo-Kun Tseng

    2014-02-01

    Full Text Available In the current open society and with the growth of human rights, people are more and more concerned about the privacy of their information and other important data. This study makes use of electrocardiography (ECG) data in order to protect individual information. An ECG signal can not only be used to analyze disease, but also to provide crucial biometric information for identification and authentication. In this study, we propose a new idea of integrating electrocardiogram watermarking and compression approach, which has never been researched before. ECG watermarking can ensure the confidentiality and reliability of a user’s data while reducing the amount of data. In the evaluation, we apply the embedding capacity, bit error rate (BER), signal-to-noise ratio (SNR), compression ratio (CR), and compressed-signal to noise ratio (CNR) methods to assess the proposed algorithm. After comprehensive evaluation the final results show that our algorithm is robust and feasible.

  11. Wavelet-based watermarking and compression for ECG signals with verification evaluation.

    Science.gov (United States)

    Tseng, Kuo-Kun; He, Xialong; Kung, Woon-Man; Chen, Shuo-Tsung; Liao, Minghong; Huang, Huang-Nan

    2014-01-01

    In the current open society and with the growth of human rights, people are more and more concerned about the privacy of their information and other important data. This study makes use of electrocardiography (ECG) data in order to protect individual information. An ECG signal can not only be used to analyze disease, but also to provide crucial biometric information for identification and authentication. In this study, we propose a new idea of integrating electrocardiogram watermarking and compression approach, which has never been researched before. ECG watermarking can ensure the confidentiality and reliability of a user's data while reducing the amount of data. In the evaluation, we apply the embedding capacity, bit error rate (BER), signal-to-noise ratio (SNR), compression ratio (CR), and compressed-signal to noise ratio (CNR) methods to assess the proposed algorithm. After comprehensive evaluation the final results show that our algorithm is robust and feasible.
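    The evaluation metrics named above are straightforward to compute. A hedged sketch with synthetic numbers — the toy signal, the distortion model, and the byte counts are illustrative, not ECG data or results from the study:

```python
import numpy as np

def snr_db(original, processed):
    """Signal-to-noise ratio (dB) of a processed signal vs. the original."""
    err = original - processed
    return 10.0 * np.log10(np.sum(original ** 2) / np.sum(err ** 2))

def bit_error_rate(sent_bits, recovered_bits):
    """Fraction of watermark bits recovered incorrectly."""
    sent = np.asarray(sent_bits)
    got = np.asarray(recovered_bits)
    return np.mean(sent != got)

def compression_ratio(n_original_bytes, n_compressed_bytes):
    return n_original_bytes / n_compressed_bytes

# Toy check on an "ECG-like" signal with a tiny watermarking distortion.
t = np.linspace(0.0, 1.0, 500)
ecg = np.sin(2 * np.pi * 1.3 * t) + 0.25 * np.sin(2 * np.pi * 12 * t)
watermarked = ecg + 0.001 * np.sign(np.sin(2 * np.pi * 50 * t))

snr = snr_db(ecg, watermarked)
ber = bit_error_rate([1, 0, 1, 1, 0, 0, 1, 0], [1, 0, 1, 1, 0, 1, 1, 0])
cr = compression_ratio(4000, 500)
```

    A high SNR with a near-zero BER is exactly the trade-off the paper evaluates: the watermark must survive compression without visibly distorting the host signal.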

  12. A novel application of wavelet based SVM to transient phenomena identification of power transformers

    Energy Technology Data Exchange (ETDEWEB)

    Jazebi, S., E-mail: jazebi@aut.ac.i [Department of Electrical Engineering, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Vahidi, B., E-mail: vahidi@aut.ac.i [Department of Electrical Engineering, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Jannati, M., E-mail: M.jannati@uok.ac.i [Department of Electrical Engineering, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of)

    2011-02-15

    A novel differential protection approach is introduced in the present paper. The proposed scheme is a combination of Support Vector Machine (SVM) and wavelet transform theories. Two common transients such as magnetizing inrush current and internal fault are considered. A new wavelet feature is extracted which reduces the computational cost and enhances the discrimination accuracy of SVM. Particle swarm optimization technique (PSO) has been applied to tune SVM parameters. The suitable performance of this method is demonstrated by simulation of different faults and switching conditions on a power transformer in PSCAD/EMTDC software. The method has the advantages of high accuracy and low computational burden (less than a quarter of a cycle). The other advantage is that the method is not dependent on a specific threshold. Sympathetic and recovery inrush currents also have been simulated and investigated. Results show that the proposed method could remain stable even in noisy environments.

  13. Adaptive multiple subtraction with wavelet-based complex unary Wiener filters

    CERN Document Server

    Ventosa, Sergi; Huard, Irène; Pica, Antonio; Rabeson, Hérald; Ricarte, Patrice; Duval, Laurent

    2011-01-01

    Multiple attenuation is a crucial task in seismic data processing because multiples usually cover primaries from fundamental reflectors. Predictive multiple suppression methods remove these multiples by building an adapted model, aiming at being subtracted from the original signal. However, before the subtraction is applied, a matching filter is required to minimize amplitude differences and misalignments between actual multiples and their prediction, and thus to minimize multiples in the input dataset after the subtraction. In this work we focus on the subtraction element. We propose an adaptive multiple removal technique in a 1-D complex wavelet frame combined with a non-stationary adaptation performed via single-sample (unary) Wiener filters, consistently estimated on overlapping windows in the transformed domain. This approach greatly simplifies the matching filter estimation and, despite its simplicity, compares promisingly with standard adaptive 2-D methods, both in terms of results and retained speed a...

  14. Expression Detail Synthesis Based on Wavelet-Based Image Fusion

    Institute of Scientific and Technical Information of China (English)

    王晓慧; 贾珈; 蔡莲红

    2013-01-01

    Expression details are the texture changes caused by facial expressions, such as wrinkles at the corners of the mouth when smiling and wrinkles on the forehead when surprised. Expression details help to enhance the realism of synthesized face images. In this paper, we propose to synthesize expression details using wavelet-based image fusion. We mine the texture features that constitute expression details in order to generate natural expressions. To meet the requirements of individualized expression detail synthesis, we use different wavelet transforms, such as the traditional wavelet transform and the dual-tree complex wavelet transform, together with various fusion operators, to obtain rich results. To seamlessly integrate the synthesized expression-detail image into the output expressive face image, we select the optimal replacement region for both images by clustering and a graph-cut method. The proposed approach applies not only to grayscale images but also, via color-space conversion, to color images. The experimental results show that the proposed method is effective for expression detail synthesis and can enhance the realism of synthesized face images.
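A minimal sketch of wavelet-based image fusion, assuming a one-level 2-D Haar transform and a max-magnitude fusion operator (the paper also uses the dual-tree complex wavelet transform and other operators): approximation bands are averaged, while for each detail band the coefficient with the larger magnitude is kept, so the sharper texture wins.

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar analysis into approximation + 3 detail bands."""
    a = np.asarray(img, dtype=float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0   # row low-pass
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0   # row high-pass
    ll = (lo[0::2] + lo[1::2]) / 2.0
    lh = (lo[0::2] - lo[1::2]) / 2.0
    hl = (hi[0::2] + hi[1::2]) / 2.0
    hh = (hi[0::2] - hi[1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    r2, c2 = ll.shape
    lo = np.empty((2 * r2, c2)); hi = np.empty((2 * r2, c2))
    lo[0::2], lo[1::2] = ll + lh, ll - lh
    hi[0::2], hi[1::2] = hl + hh, hl - hh
    out = np.empty((2 * r2, 2 * c2))
    out[:, 0::2], out[:, 1::2] = lo + hi, lo - hi
    return out

def fuse(base, detail_img):
    """Average the approximations; keep the max-magnitude detail
    coefficient from either image (one common fusion operator)."""
    A, B = haar2d(base), haar2d(detail_img)
    ll = (A[0] + B[0]) / 2.0
    bands = [np.where(np.abs(x) >= np.abs(y), x, y) for x, y in zip(A[1:], B[1:])]
    return ihaar2d(ll, *bands)

rng = np.random.default_rng(0)
face = rng.uniform(0, 255, size=(16, 16))
print(np.allclose(ihaar2d(*haar2d(face)), face))  # True: transform is invertible
print(np.allclose(fuse(face, face), face))        # True: self-fusion is the identity
```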

  15. Wavelet-Based Geometry Coding for Three Dimensional Mesh Using Space Frequency Quantization

    Directory of Open Access Journals (Sweden)

    Shymaa T. El-Leithy

    2009-01-01

    Full Text Available Problem statement: Recently, 3D objects have been used in several applications, such as internet games, virtual reality, and scientific visualization. These applications require real-time rendering and fast transmission of large objects over the internet. However, due to bandwidth limitations, the compression and streaming of 3D objects is still an open research problem. Approach: A novel procedure for the compression and coding of 3-Dimensional (3-D) semi-regular meshes using the wavelet transform is introduced. This procedure is based on Space Frequency Quantization (SFQ), which is used to minimize the distortion error of the reconstructed mesh under a given bit-rate constraint. Results: Experiments were carried out on five datasets of differing mesh density and irregularity. Results were evaluated using the peak signal-to-noise ratio as an error measurement. Experiments showed that the 3D SFQ coder outperforms the Progressive Geometry Coder (PGC) in terms of the quality of compressed meshes. Conclusion: A pure wavelet-based 3D geometry coding algorithm has been introduced. The proposed procedure showed its superiority over state-of-the-art coding techniques. Moreover, the bit-stream can be truncated at any point and still decode meshes of reasonable visual quality.

  16. 3D Inversion of Magnetic Data through Wavelet based Regularization Method

    Directory of Open Access Journals (Sweden)

    Maysam Abedi

    2015-06-01

    Full Text Available This study deals with the 3D recovery of a magnetic susceptibility model by incorporating sparsity-based constraints in the inversion algorithm. For this purpose, the area under prospect was divided into a large number of rectangular prisms in a mesh with unknown susceptibilities. Tikhonov cost functions with two sparsity functions were used to recover the smooth parts as well as the sharp boundaries of the model parameters. A pre-selected basis, namely wavelets, can recover the regions of smooth behaviour of the susceptibility distribution, while the Haar or finite-difference (FD) domains yield a solution with rough boundaries. Therefore, a regularizer function which can benefit from the advantages of both wavelets and Haar/FD operators in representing the 3D magnetic susceptibility distribution was chosen as a candidate for modeling magnetic anomalies. The optimum wavelet and the parameter β, which controls the weight of the two sparsifying operators, were also considered. The algorithm assumed that there is no remanent magnetization and that the observed magnetometry data represent only the induced magnetization effect. The proposed approach was applied to noise-corrupted synthetic data in order to demonstrate its suitability for the 3D inversion of magnetic data. On obtaining satisfactory results, a case study pertaining to ground-based measurements of the magnetic anomaly over the Now Chun porphyry-Cu deposit, located in Kerman province of Iran, was presented for 3D inversion. The low susceptibility in the constructed model coincides with the known location of copper ore mineralization.

  17. Wavelet-based multiresolution with n-th-root-of-2 Subdivision

    Energy Technology Data Exchange (ETDEWEB)

    Linsen, L; Pascucci, V; Duchaineau, M A; Hamann, B; Joy, K I

    2004-12-16

    Multiresolution methods are a common technique used for dealing with large-scale data and representing it at multiple levels of detail. The authors present a multiresolution hierarchy construction based on ⁿ√2 subdivision, which has all the advantages of a regular data organization scheme while reducing the drawback of coarse granularity. The ⁿ√2-subdivision scheme only doubles the number of vertices in each subdivision step regardless of dimension n. They describe the construction of 2D, 3D, and 4D hierarchies representing surfaces, volume data, and time-varying volume data, respectively. The 4D approach supports spatial and temporal scalability. For high-quality data approximation on each level of detail, they use downsampling filters based on n-variate B-spline wavelets. They present a B-spline wavelet lifting scheme for ⁿ√2-subdivision steps to obtain small or narrow filters. Narrow filters support adaptive refinement and out-of-core data exploration techniques.
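The lifting construction mentioned above can be shown on the simplest possible case. The sketch below is a one-step 1-D Haar lifting scheme (split, predict, update), not the paper's B-spline wavelets for ⁿ√2 subdivision, but it illustrates how lifting yields short filters and an exactly invertible transform:

```python
import numpy as np

def haar_lift_forward(x):
    """Haar wavelet via lifting: split into even/odd, predict, update."""
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    d = odd - even          # predict: odd samples from their even neighbours
    s = even + d / 2.0      # update: s becomes the pairwise average
    return s, d

def haar_lift_inverse(s, d):
    """Invert by running the lifting steps backwards."""
    even = s - d / 2.0
    odd = even + d
    x = np.empty(s.size + d.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 2.0, 0.0])
s, d = haar_lift_forward(x)
print(s)  # [ 5. 11.  8.  1.]  -- coarse level: pairwise means
print(np.allclose(haar_lift_inverse(s, d), x))  # True: perfect reconstruction
```

Each lifting step touches only a sample and its immediate neighbour, which is exactly the "small or narrow filters" property the abstract highlights.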

  18. A wavelet based algorithm for DTM extraction from airborne laser scanning data

    Science.gov (United States)

    Xu, Liang; Yang, Yan; Tian, Qingjiu

    2007-06-01

    The automatic extraction of a Digital Terrain Model (DTM) from point clouds acquired by airborne laser scanning (ALS) equipment remains a problem in ALS data filtering today. Many filter algorithms have been developed to remove object points and outliers and to extract the DTM automatically. However, it is difficult to filter in areas where few points have the morphological or geological features that represent the bare earth. Especially in sloped terrain covered by dense vegetation, points representing bare earth are often identified as noisy data below ground. To extract the terrain surface in these areas, a new algorithm is proposed. First, the point clouds are cut into profiles based on a scan-line segmentation algorithm. In each profile, a 1D filtering procedure derived from wavelet theory, which is superior at detecting high-frequency discontinuities, is applied. After combining profiles from different directions, interpolated grid data representing the DTM are generated. In order to evaluate the performance of this new approach, we applied it to the data set used in the ISPRS filter test in 2003. Two samples containing mostly vegetation on slopes were processed by the proposed algorithm. The algorithm filtered most of the objects, such as vegetation and buildings, in sloped areas and smoothed the hilly terrain to be closer to the real terrain surface.

  19. A wavelet-based method to discriminate internal faults from inrush currents using correlation coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Vahidi, B.; Ghaffarzadeh, N.; Hosseinian, S.H. [Dept. of Electrical Engineering, Amirkabir University of Technology, Tehran (Iran)

    2010-09-15

    In this paper a new method based on the discrete wavelet transform and the correlation coefficient is presented for digital differential protection. The algorithm includes offline and online operations. In the offline operation, the discrete wavelet transform is used to decompose typical three-phase differential currents for inrush current. Then an index is defined and computed. The index is based on the sum of the energy of the detail coefficients at level 5 of the three-phase differential currents in each half cycle. The online operation consists of capturing the three-phase differential currents at a 10 kHz sampling rate and decomposing them with the db1 wavelet. Finally, inrush current and internal fault are discriminated based on the correlation coefficient between the computed index of a pre-stored typical inrush current and that of a newly recorded, unclassified signal. The effectiveness of the approach is tested using numerous inrush and internal fault currents. Simulations confirm the aptness and capability of the proposed method to discriminate inrush current from internal fault. (author)
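A sketch of the discrimination step: compute a per-half-cycle detail-energy index and correlate it with a stored inrush template. The index values and the 0.9 decision level below are illustrative assumptions, not the paper's numbers; inrush currents characteristically concentrate energy in alternating half cycles, which the correlation picks up.

```python
import numpy as np

def half_cycle_energy_index(detail, halves):
    """Sum of squared wavelet detail coefficients per half cycle
    (the paper uses level-5 coefficients; decomposition is elided here)."""
    segs = np.array_split(np.asarray(detail, dtype=float) ** 2, halves)
    return np.array([s.sum() for s in segs])

# Bursts of detail energy in alternating half cycles:
demo = half_cycle_energy_index(np.array([2.0, 2.0, 0.0, 0.0, 2.0, 2.0, 0.0, 0.0]), 4)
print(demo)  # [8. 0. 8. 0.]

# Discrimination: correlate a freshly computed index with a pre-stored
# typical inrush index (values below are illustrative, not from the paper).
template = np.array([8.0, 1.0, 7.5, 1.2])        # gapped pattern typical of inrush
recorded_inrush = np.array([7.6, 1.1, 7.9, 0.9])
recorded_fault = np.array([9.0, 7.0, 5.0, 3.0])  # fault energy decays smoothly
r_in = np.corrcoef(template, recorded_inrush)[0, 1]
r_fl = np.corrcoef(template, recorded_fault)[0, 1]
print(r_in > 0.9 > abs(r_fl))  # True: high correlation flags inrush, restraining the relay
```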

  20. AUTOMATED SEGMENTATION OF CORTICAL NECROSIS USING A WAVELET BASED ABNORMALITY DETECTION SYSTEM.

    Science.gov (United States)

    Gaonkar, Bilwaj; Erus, Guray; Pohl, Kilian M; Tanwar, Manoj; Margiewicz, Stefan; Bryan, R Nick; Davatzikos, Christos

    2011-03-01

    We propose an automated method to segment cortical necrosis from brain FLAIR-MR images. Cortical necrosis consists of regions of dead brain tissue in the cortex caused by cerebrovascular disease (CVD). The accurate segmentation of these regions is difficult because their intensity patterns are similar to the adjoining cerebrospinal fluid (CSF). We generate a model of normal variation using MR scans of healthy controls. The model is based on the Jacobians of warps obtained by registering scans of normal subjects to a common coordinate system. For each patient scan, a Jacobian is obtained by warping it to the same coordinate system. Large deviations between the model and subject-specific Jacobians are flagged as 'abnormalities'. Abnormalities are segmented as cortical necrosis if they are in the cortex and have the intensity profile of CSF. We evaluate our method by using a set of 72 healthy subjects to model cortical variation. We use this model to successfully detect and segment cortical necrosis in a set of 37 patients with CVD. A comparison of the results with segmentations from two independent human experts shows that the overlap between our approach and either of the human experts is in the range of the overlap between the two human experts themselves.

  1. Multidimensional Wavelet-based Regularized Reconstruction for Parallel Acquisition in Neuroimaging

    CERN Document Server

    Chaari, Lotfi; Badillo, Solveig; Pesquet, Jean-Christophe; Ciuciu, Philippe

    2012-01-01

    Parallel MRI is a fast imaging technique that enables the acquisition of highly resolved images in space and/or time. The performance of parallel imaging strongly depends on the reconstruction algorithm, which can proceed either in the original k-space (GRAPPA, SMASH) or in the image domain (SENSE-like methods). To improve the performance of the widely used SENSE algorithm, 2D or slice-specific regularization in the wavelet domain has been deeply investigated. In this paper, we extend this approach using 3D wavelet representations in order to handle all slices together and address reconstruction artifacts which propagate across adjacent slices. The gain induced by such an extension (3D-Unconstrained Wavelet Regularized SENSE: 3D-UWR-SENSE) is validated on anatomical image reconstruction where no temporal acquisition is considered. Another important extension accounts for temporal correlations that exist between successive scans in functional MRI (fMRI). In addition to the case of 2D+t acquisition schemes ad...

  2. A wavelets-based analysis of the phillips curve hypothesis for the Brazilian economy, 1980-2011

    Directory of Open Access Journals (Sweden)

    Edgard Almeida Pimentel

    2013-03-01

    Full Text Available This paper implements a wavelets-based analysis of the Phillips curve hypothesis, as formulated by Friedman and Phelps, for the Brazilian economy over the last thirty years. We provide an introductory discussion of the Phillips curve's main arguments and an exploratory data analysis of the variables under consideration: prices, unemployment, and real wages. We then estimate variances and correlation structures between these aggregates through wavelets, disentangling short-run from long-run effects. Our findings reject the Phillips curve hypothesis for the Brazilian economy in the short run while suggesting that it does hold in the long run. Finally, the correlation structure obtained in the paper captures particular aspects of Brazilian economic policy within the period.

  3. Wavelet-based filter methods for the detection of small transiting planets: Application to Kepler and K2 light curves

    Science.gov (United States)

    Grziwa, Sascha; Korth, Judith; Paetzold, Martin; KEST

    2016-10-01

    The Rheinisches Institut für Umweltforschung (RIU-PF) has developed the software package EXOTRANS for the detection of transits of exoplanets in stellar light curves. This software package was in use during the CoRoT space mission (2006-2013). EXOTRANS was improved with different wavelet-based filter methods during the following years to separate stellar variation, orbital disturbances, and instrumental effects from stellar light curves taken by space telescopes (Kepler, K2, TESS and PLATO). The VARLET filter separates faint transit signals from stellar variations without using a-priori information about the target star. VARLET considers variations in frequency, amplitude, and shape simultaneously. VARLET is also able to extract most instrumental jumps and glitches. The PHALET filter separates periodic features independent of their shape and is used with the intention of separating diluting stellar binaries. It is also applied for the multi-transit search. Stellar light curves of the K2 mission are constructed by processing target pixel files, which corrects disturbances caused by the reduced pointing precision of the Kepler telescope after the failure of two reaction wheels. The combination of target-pixel-file processing with both filter techniques and the proven detection pipeline EXOTRANS lowers the detection limit, reduces false alarms, and simplifies the detection of faint transits in light curves of the K2 mission. Many new candidates detected in K2 light curves with EXOTRANS were successfully confirmed by ground-based follow-up observations of the KEST collaboration. New candidates and confirmed planets are presented.

  4. Pigmented skin lesion detection using random forest and wavelet-based texture

    Science.gov (United States)

    Hu, Ping; Yang, Tie-jun

    2016-10-01

    The incidence of cutaneous malignant melanoma, a disease of worldwide distribution and the deadliest form of skin cancer, has been rapidly increasing over the last few decades. Because advanced cutaneous melanoma is still incurable, early detection is an important step toward a reduction in mortality. Dermoscopy photographs are commonly used in melanoma diagnosis and can capture detailed features of a lesion. Great variability exists in the visual appearance of pigmented skin lesions. Therefore, to minimize the diagnostic errors that result from the difficulty and subjectivity of visual interpretation, an automatic detection approach is required. The objectives of this paper were to propose a hybrid method using random forests and the Gabor wavelet transformation to accurately differentiate which parts of a dermoscopy photograph belong to the lesion area and which do not, and to analyze segmentation accuracy. A random forest classifier consisting of a set of decision trees was used for classification. Gabor wavelets are a mathematical model of the visual cortical cells of the mammalian brain, and an image can be decomposed into multiple scales and multiple orientations using them. The Gabor function has been recognized as a very useful tool in texture analysis, due to its optimal localization properties in both the spatial and frequency domains. Texture features based on the Gabor wavelet transformation are computed from the Gabor-filtered image. Experimental results indicate the following: (1) the proposed algorithm based on random forests outperformed the state of the art in pigmented skin lesion detection; and (2) the inclusion of Gabor-wavelet-based texture features improved segmentation accuracy significantly.
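A sketch of the Gabor texture front end, with illustrative parameters (kernel size, wavelength, and orientations are assumptions, not the paper's settings): a bank of oriented Gabor kernels is correlated with the image, and the mean absolute response per orientation gives a small feature vector that could feed a classifier such as a random forest.

```python
import numpy as np

def gabor_kernel(ksize, theta, wavelength, sigma, gamma=0.5):
    """Real Gabor kernel: a Gaussian envelope times an oriented cosine."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2)) \
        * np.cos(2 * np.pi * xr / wavelength)

def gabor_features(image, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean absolute filter response per orientation: one texture
    feature vector per image (or per patch)."""
    feats = []
    for th in thetas:
        k = gabor_kernel(15, th, wavelength=6.0, sigma=3.0)
        win = np.lib.stride_tricks.sliding_window_view(image, k.shape)
        resp = np.einsum('ijkl,kl->ij', win, k)    # 'valid' 2-D correlation
        feats.append(np.abs(resp).mean())
    return np.array(feats)

# Horizontal stripes (intensity varies along y) excite the pi/2 kernel,
# whose cosine oscillates along y, far more than the theta=0 kernel.
yy = np.arange(40)[:, None] * np.ones((1, 40))
stripes = np.sin(2 * np.pi * yy / 6.0)
f = gabor_features(stripes)
print(f[2] > f[0])  # True
```

In the paper, feature vectors like `f`, computed per pixel neighbourhood, would be the inputs on which the random forest votes lesion vs. non-lesion.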

  5. A wavelet-based ECG delineation algorithm for 32-bit integer online processing

    Directory of Open Access Journals (Sweden)

    Chiari Lorenzo

    2011-04-01

    Full Text Available Abstract Background Since the first well-known electrocardiogram (ECG delineator based on Wavelet Transform (WT presented by Li et al. in 1995, a significant research effort has been devoted to the exploitation of this promising method. Its ability to reliably delineate the major waveform components (mono- or bi-phasic P wave, QRS, and mono- or bi-phasic T wave would make it a suitable candidate for efficient online processing of ambulatory ECG signals. Unfortunately, previous implementations of this method adopt non-linear operators such as root mean square (RMS or floating point algebra, which are computationally demanding. Methods This paper presents a 32-bit integer, linear algebra advanced approach to online QRS detection and P-QRS-T waves delineation of a single lead ECG signal, based on WT. Results The QRS detector performance was validated on the MIT-BIH Arrhythmia Database (sensitivity Se = 99.77%, positive predictive value P+ = 99.86%, on 109010 annotated beats and on the European ST-T Database (Se = 99.81%, P+ = 99.56%, on 788050 annotated beats. The ECG delineator was validated on the QT Database, showing a mean error between manual and automatic annotation below 1.5 samples for all fiducial points: P-onset, P-peak, P-offset, QRS-onset, QRS-offset, T-peak, T-offset, and a mean standard deviation comparable to other established methods. Conclusions The proposed algorithm exhibits reliable QRS detection as well as accurate ECG delineation, in spite of a simple structure built on integer linear algebra.

  6. Wavelet based automated postural event detection and activity classification with single imu - biomed 2013.

    Science.gov (United States)

    Lockhart, Thurmon E; Soangra, Rahul; Zhang, Jian; Wu, Xuefan

    2013-01-01

    and classification algorithm using denoised signals from a single wireless IMU placed at the sternum. The algorithm was further validated and verified with a motion capture system in a laboratory environment. Wavelet denoising highlighted postural events and transition durations that further provided clinical information on postural control and motor coordination. The presented method can be applied in real-life ambulatory monitoring approaches for assessing the condition of the elderly.
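A generic wavelet-denoising sketch of the kind such a pipeline relies on, here with a hand-rolled Haar transform, a MAD noise estimate, and the universal soft threshold sigma*sqrt(2 ln N). The mother wavelet, level count, and signal are assumptions; the paper's exact IMU processing is not specified in the abstract.

```python
import numpy as np

def haar_denoise(x, levels=3):
    """Soft-threshold wavelet denoising with the universal threshold."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    coeffs, approx = [], x
    for _ in range(levels):                       # orthonormal Haar analysis
        approx, detail = ((approx[0::2] + approx[1::2]) / np.sqrt(2.0),
                          (approx[0::2] - approx[1::2]) / np.sqrt(2.0))
        coeffs.append(detail)
    sigma = np.median(np.abs(coeffs[0])) / 0.6745 # robust noise estimate (finest level)
    thr = sigma * np.sqrt(2.0 * np.log(n))
    coeffs = [np.sign(d) * np.maximum(np.abs(d) - thr, 0.0) for d in coeffs]
    for detail in reversed(coeffs):               # synthesis
        out = np.empty(2 * len(approx))
        out[0::2] = (approx + detail) / np.sqrt(2.0)
        out[1::2] = (approx - detail) / np.sqrt(2.0)
        approx = out
    return approx

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)
clean = np.sign(np.sin(2 * np.pi * 2 * t))        # step-like "postural transitions"
noisy = clean + 0.3 * rng.standard_normal(512)
den = haar_denoise(noisy)
print(np.mean((den - clean) ** 2) < np.mean((noisy - clean) ** 2))  # True
```

Soft thresholding removes most of the small noise-only coefficients while the large coefficients at the step edges survive, which is why denoising "highlights postural events" rather than smearing them.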

  7. Wavelet-based reconstruction of fossil-fuel CO2 emissions from sparse measurements

    Science.gov (United States)

    McKenna, S. A.; Ray, J.; Yadav, V.; Van Bloemen Waanders, B.; Michalak, A. M.

    2012-12-01

    We present a method to estimate spatially resolved fossil-fuel CO2 (ffCO2) emissions from sparse measurements of time-varying CO2 concentrations. It is based on wavelet modeling of the strongly non-stationary spatial distribution of ffCO2 emissions. The dimensionality of the wavelet model is first reduced using images of nightlights, which identify regions of human habitation. Since wavelets are a multiresolution basis set, most of the reduction is accomplished by removing fine-scale wavelets in regions with low nightlight radiances. The (reduced) wavelet model of emissions is propagated through an atmospheric transport model (WRF) to predict CO2 concentrations at a handful of measurement sites. The estimation of the wavelet model of emissions, i.e., inferring the wavelet weights, is performed by fitting to observations at the measurement sites. This is done using Stagewise Orthogonal Matching Pursuit (StOMP), which first identifies (and sets to zero) the wavelet coefficients that cannot be estimated from the observations, before estimating the remaining coefficients. This model sparsification and fitting is performed simultaneously, allowing us to explore multiple wavelet models of differing complexity. The technique is borrowed from the field of compressive sensing and is generally used in image and video processing. We test this approach using synthetic observations generated from emissions from the Vulcan database. Thirty-five sensor sites are chosen over the USA. ffCO2 emissions, averaged over 8-day periods, are estimated at a 1-degree spatial resolution. We find that only about 40% of the wavelets in the emission model can be estimated from the data; however, the mix of coefficients that are estimated changes with time. Total US emissions can be reconstructed with roughly 5% error. The inferred emissions, if aggregated monthly, have a correlation of 0.9 with Vulcan fluxes. We find that the estimated emissions in the Northeast US are the most accurate.
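The sparse-recovery core can be sketched with plain Orthogonal Matching Pursuit (StOMP, used in the paper, is a stagewise variant that selects several atoms per iteration; plain OMP is shown here as the simpler instance of the same compressive-sensing idea). The 35x120 sizes below loosely echo the 35 sensor sites versus a larger wavelet basis, but the matrix and coefficients are synthetic assumptions.

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedy sparse recovery of x from y = A x."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # re-fit on the whole support
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((35, 120))                   # 35 "sensors", 120 wavelet weights
A /= np.linalg.norm(A, axis=0)                       # unit-norm columns
x_true = np.zeros(120)
x_true[[5, 40, 77]] = [2.0, -1.5, 1.0]               # 3 active wavelet coefficients
y = A @ x_true
x_hat = omp(A, y, n_nonzero=3)
print(np.allclose(x_hat, x_true, atol=1e-6))
```

As in the paper, most coefficients come back exactly zero: the data simply cannot constrain them, so only the identified support is estimated.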

  8. Multivariate Approaches for Simultaneous Determination of Avanafil and Dapoxetine by UV Chemometrics and HPLC-QbD in Binary Mixtures and Pharmaceutical Product.

    Science.gov (United States)

    2016-04-07

    Multivariate UV-spectrophotometric methods and Quality by Design (QbD) HPLC are described for the concurrent estimation of avanafil (AV) and dapoxetine (DP) in a binary mixture and in the dosage form. Chemometric methods have been developed, including classical least squares, principal component regression, partial least squares, and multiway partial least squares. Analytical figures of merit, such as sensitivity, selectivity, analytical sensitivity, LOD, and LOQ, were determined. QbD consists of three steps, starting with a screening approach to determine the critical process parameters and response variables, followed by an understanding of factors and levels, and lastly the application of a Box-Behnken design containing four critical factors that affect the method. From an Ishikawa diagram and a risk-assessment tool, four main factors were selected for optimization. Design optimization, statistical calculation, and final-condition optimization of all the reactions were carried out. Twenty-five experiments were done, and a quadratic model was used for all response variables. Desirability plots, surface plots, the design space, and three-dimensional plots were calculated. Under the optimized conditions, HPLC separation was achieved on a Phenomenex Gemini C18 column (250 × 4.6 mm, 5 μm) using acetonitrile-buffer (ammonium acetate buffer at pH 3.7 with acetic acid) as the mobile phase at a flow rate of 0.7 mL/min. Quantification was done at 239 nm, and the temperature was set at 20°C. The developed methods were validated and successfully applied for the simultaneous determination of AV and DP in the dosage form.
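Classical least squares, the first chemometric method listed, reduces to one linear solve once the pure-component spectra are known: Beer's law gives mixture absorbance A = K c, so c is recovered by least squares. The spectra below are synthetic Gaussian bands, not real AV/DP absorptivities.

```python
import numpy as np

# Wavelength grid for the synthetic UV spectra (nm).
wavelengths = np.linspace(200.0, 320.0, 121)

def band(center, width, height):
    """A Gaussian absorption band (purely illustrative)."""
    return height * np.exp(-((wavelengths - center) / width) ** 2)

k_av = band(239, 12, 1.0) + band(280, 20, 0.3)   # pure-component spectrum, "AV"
k_dp = band(230, 10, 0.8) + band(292, 15, 0.5)   # pure-component spectrum, "DP"
K = np.column_stack([k_av, k_dp])                 # absorptivity matrix: A = K c

c_true = np.array([0.6, 1.4])                     # concentrations in the mixture
mixture = K @ c_true                              # noiseless mixture spectrum
c_hat, *_ = np.linalg.lstsq(K, mixture, rcond=None)
print(np.allclose(c_hat, c_true))  # True
```

With noisy spectra the same solve returns the least-squares estimate; PCR and PLS differ mainly in projecting onto latent factors before regressing.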

  9. Hydrogeochemistry and quality of surface water and groundwater in the vicinity of Lake Monoun, West Cameroon: approach from multivariate statistical analysis and stable isotopic characterization.

    Science.gov (United States)

    Kamtchueng, Brice T; Fantong, Wilson Y; Wirmvem, Mengnjo J; Tiodjio, Rosine E; Takounjou, Alain F; Ndam Ngoupayou, Jules R; Kusakabe, Minoru; Zhang, Jing; Ohba, Takeshi; Tanyileke, Gregory; Hell, Joseph V; Ueda, Akira

    2016-09-01

    With the use of conventional hydrogeochemical techniques, multivariate statistical analysis, and stable isotope approaches, this paper investigates for the first time surface water and groundwater from the surrounding areas of Lake Monoun (LM), West Cameroon. The results reveal that the waters are generally slightly acidic to neutral. The relative abundances of the major dissolved species are Ca²⁺ > Mg²⁺ > Na⁺ > K⁺ for cations and HCO₃⁻ ≫ NO₃⁻ > Cl⁻ > SO₄²⁻ for anions. The main water type is Ca-Mg-HCO₃. The observed salinity is related to water-rock interaction, ion exchange processes, and anthropogenic activities. Nitrate and chloride have been identified as the most common pollutants. These pollutants are attributed to the chlorination of wells and leaching from pit latrines and refuse dumps. The stable isotopic compositions of the investigated water sources suggest evidence of evaporation before recharge. Four major groups of waters were identified by salinity and NO₃ concentrations using Q-mode hierarchical cluster analysis (HCA). Consistent with the isotopic results, group 1 represents fresh unpolluted water occurring near the recharge zone in the general flow regime; groups 2 and 3 are mixed waters whose composition is controlled by both the weathering of rock-forming minerals and anthropogenic activities; group 4 represents water under high vulnerability to anthropogenic pollution. Moreover, the isotopic results and the HCA showed that the CO2-rich bottom water of LM belongs to an isolated hydrological system within the Foumbot plain. Except for some springs, groundwater in the area is inappropriate for drinking and domestic purposes but good to excellent for irrigation.

  10. Genome scan for loci predisposing to anxiety disorders using a novel multivariate approach: strong evidence for a chromosome 4 risk locus.

    Science.gov (United States)

    Kaabi, Belhassen; Gelernter, Joel; Woods, Scott W; Goddard, Andrew; Page, Grier P; Elston, Robert C

    2006-04-01

    We conducted a 10-centimorgan linkage autosomal genome scan in a set of 19 extended American pedigrees (219 subjects) ascertained through probands with panic disorder. Several anxiety disorders--including social phobia, agoraphobia, and simple phobia--in addition to panic disorder segregate in these families. In previous studies of this sample, linkage analyses were based separately on each of the individual categorical affection diagnoses. Given the substantial comorbidity between anxiety disorders and their probable shared genetic liability, it is clear that this method discards a considerable amount of information. In this article, we propose a new approach that considers panic disorder, simple phobia, social phobia, and agoraphobia as expressions of the same multivariate, putatively genetically influenced trait. We applied the most powerful multipoint Haseman-Elston method, using the grade of membership score generated from a fuzzy clustering of these phenotypes as the dependent variable in Haseman-Elston regression. One region on chromosome 4q31-q34, at marker D4S413 (with multipoint and single-point nominal P values < .00001), showed strong evidence of linkage (genomewide significance at P<.05). The same region is known to be the site of a neuropeptide Y receptor gene, NPY1R (4q31-q32), that was recently connected to anxiolytic-like effects in rats. Several other regions on four chromosomes (4q21.21-22.3, 5q14.2-14.3, 8p23.1, and 14q22.3-23.3) met criteria for suggestive linkage (multipoint nominal P values < .01). Family-by-family analysis did not show any strong evidence of heterogeneity. Our findings support the notion that the major anxiety disorders, including phobias and panic disorder, are complex traits that share at least one susceptibility locus. This method could be applied to other complex traits for which shared genetic-liability factors are thought to be important, such as substance dependencies.

  11. Applied multivariate statistics with R

    CERN Document Server

    Zelterman, Daniel

    2015-01-01

    This book brings the power of multivariate statistics to graduate-level practitioners, making these analytical methods accessible without lengthy mathematical derivations. Using the open source, shareware program R, Professor Zelterman demonstrates the process and outcomes for a wide array of multivariate statistical applications. Chapters cover graphical displays, linear algebra, univariate, bivariate and multivariate normal distributions, factor methods, linear regression, discrimination and classification, clustering, time series models, and additional methods. Zelterman uses practical examples from diverse disciplines to welcome readers from a variety of academic specialties. Those with backgrounds in statistics will learn new methods while they review more familiar topics. Chapters include exercises, real data sets, and R implementations. The data are interesting, real-world topics, particularly from health and biology-related contexts. As an example of the approach, the text examines a sample from the B...

  12. Multivariate analysis with LISREL

    CERN Document Server

    Jöreskog, Karl G; Y Wallentin, Fan

    2016-01-01

    This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.

  13. Assessing the response of area burned to changing climate in western boreal North America using a Multivariate Adaptive Regression Splines (MARS) approach

    Science.gov (United States)

    Balshi, M. S.; McGuire, A.D.; Duffy, P.; Flannigan, M.; Walsh, J.; Melillo, J.

    2009-01-01

    Fire is a common disturbance in the North American boreal forest that influences ecosystem structure and function. The temporal and spatial dynamics of fire are likely to be altered as climate continues to change. In this study, we ask the question: how will the area burned by wildfire in boreal North America respond to future changes in climate? To evaluate this question, we developed temporally and spatially explicit relationships between air temperature and fuel moisture codes derived from the Canadian Fire Weather Index System to estimate annual area burned at 2.5° (latitude × longitude) resolution using a Multivariate Adaptive Regression Splines (MARS) approach across Alaska and Canada. Burned area was substantially more predictable in the western portion of boreal North America than in eastern Canada. Burned area was also not very predictable in areas of substantial topographic relief and in areas along the transition between boreal forest and tundra. At the scale of Alaska and western Canada, the empirical fire models explain on the order of 82% of the variation in annual area burned for the period 1960-2002. July temperature was the most frequently occurring predictor across all models, but the fuel moisture codes for the months June through August (as a group) entered the models as the most important predictors of annual area burned. To predict changes in the temporal and spatial dynamics of fire under future climate, the empirical fire models used output from the Canadian Climate Centre CGCM2 global climate model to predict annual area burned through the year 2100 across Alaska and western Canada. Relative to 1991-2000, the results suggest that average area burned per decade will double by 2041-2050 and will increase on the order of 3.5-5.5 times by the last decade of the 21st century. To improve the ability to predict wildfire across Alaska and Canada, future research should focus on incorporating additional effects of long-term and successional
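MARS builds its fit from hinge (truncated linear) basis functions, pairs max(0, x - t) and max(0, t - x) at each knot t, so the regression surface is piecewise linear with adaptively placed kinks. The sketch below is a hand-rolled one-predictor version with fixed knots (real MARS selects knots and interaction terms adaptively); the "temperature vs. area burned" data are synthetic, not the study's.

```python
import numpy as np

def hinge_basis(x, knots):
    """MARS-style hinge basis: an intercept plus, for each knot t,
    the pair max(0, x - t) and max(0, t - x)."""
    cols = [np.ones_like(x)]
    for t in knots:
        cols.append(np.maximum(0.0, x - t))
        cols.append(np.maximum(0.0, t - x))
    return np.column_stack(cols)

# Fit a piecewise-linear response of "area burned" to "July temperature":
# flat below 15 C, rising linearly above (a synthetic kink, exactly
# representable by the hinge at t = 15).
temp = np.linspace(5.0, 25.0, 81)
burned = np.where(temp < 15.0, 1.0, 1.0 + 0.8 * (temp - 15.0))
B = hinge_basis(temp, knots=[10.0, 15.0, 20.0])
coef, *_ = np.linalg.lstsq(B, burned, rcond=None)
fitted = B @ coef
print(np.max(np.abs(fitted - burned)) < 1e-8)  # True: exact piecewise-linear fit
```

Because the target has its kink at a knot, least squares recovers it exactly; with real data the knots and the subset of hinges are what MARS searches over.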

  14. Basics of Multivariate Analysis in Neuroimaging Data

    Science.gov (United States)

    Habeck, Christian Georg

    2010-01-01

    Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques [1,5-9]. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice.
A conceptual introduction is followed by a very simple application to a diagnostic

  15. Exchange Rate Forecasting Using Entropy Optimized Multivariate Wavelet Denoising Model

    Directory of Open Access Journals (Sweden)

    Kaijian He

    2014-01-01

    Full Text Available Exchange rate is one of the key variables in international economics and international trade. Its movement constitutes one of the most important dynamic systems, characterized by nonlinear behaviors. It becomes more volatile and sensitive to increasingly diversified influencing factors with higher levels of deregulation and global integration worldwide. Facing the increasingly diversified and more integrated market environment, forecasting models for the exchange markets need to address both individual and interdependent heterogeneity. In this paper, we propose a heterogeneous market hypothesis (HMH) based exchange rate modeling methodology to model the micromarket structure. We then propose an entropy-optimized wavelet-based forecasting algorithm under this methodology to forecast exchange rate movement. A multivariate wavelet denoising algorithm is used to separate and extract the underlying data components with distinct features, which are modeled with multivariate time series models of different specifications and parameters. Maximum entropy is introduced to select the best basis and model parameters to construct the most effective forecasting algorithm. Empirical studies in both Chinese and European markets have been conducted to confirm the significant performance improvement when the proposed model is tested against the benchmark models.
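
As a rough sketch of the wavelet denoising step described above (univariate, a single Haar level, with the common "universal" soft threshold; the synthetic series is an illustrative assumption, not exchange-rate data):

```python
import numpy as np

def haar_dwt(x):
    """One-level orthonormal Haar transform: approximation and detail parts."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt (perfect reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft_threshold(c, t):
    """Zero small coefficients and shrink the survivors toward zero."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(1)
n = 512
clean = np.sin(np.linspace(0.0, 6.0 * np.pi, n))   # smooth underlying series
noisy = clean + rng.normal(0.0, 0.3, n)            # observed noisy series

a, d = haar_dwt(noisy)
sigma = np.median(np.abs(d)) / 0.6745              # robust MAD noise estimate
thr = sigma * np.sqrt(2.0 * np.log(n))             # universal threshold
denoised = haar_idwt(a, soft_threshold(d, thr))

noisy_rmse = float(np.sqrt(np.mean((noisy - clean) ** 2)))
den_rmse = float(np.sqrt(np.mean((denoised - clean) ** 2)))
print(f"noisy RMSE: {noisy_rmse:.3f}, denoised RMSE: {den_rmse:.3f}")
```

The paper's multivariate, entropy-optimized scheme goes further (joint thresholding across series, basis selection), but the shrink-the-details mechanism is the same.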

  16. Relations between the development of school investment, self-confidence, and language achievement in elementary education: A multivariate latent growth curve approach

    NARCIS (Netherlands)

    R.D. Stoel; T.T.D. Peetsma; J. Roeleveld

    2001-01-01

    Latent growth curve (LGC) analysis of longitudinal data for pupils' school investment, self-confidence and language ability is presented. A multivariate model is tested that relates the three developmental processes to each other and to intelligence. All processes show significant differences between

  17. Wavelets-Based Multiresolution Representation and Editing of B-Spline Curves

    Institute of Scientific and Technical Information of China (English)

    赵罡; 朱心雄

    2001-01-01

    Multiresolution representation provides a flexible approach to editing curves and surfaces at different resolution levels, and wavelet techniques are a novel way of realizing such representations. The paper describes, from a geometric viewpoint, the principles and methods of realizing a wavelets-based multiresolution representation of quasi-uniform cubic B-spline curves. An example is given to illustrate the multiresolution editing of B-spline curves.

  18. Switching Between Multivariable Controllers

    DEFF Research Database (Denmark)

    Niemann, H.; Stoustrup, Jakob; Abrahamsen, R.B.

    2004-01-01

    A concept for implementation of multivariable controllers is presented in this paper. The concept is based on the Youla-Jabr-Bongiorno-Kucera (YJBK) parameterization of all stabilizing controllers. By using this architecture for implementation of multivariable controllers, it is shown how...

  19. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...... in which several multivariate GARCH models are fitted to the same data set and the results compared....

  2. Multivariate irregular sampling theorem

    Institute of Scientific and Technical Information of China (English)

    CHEN GuangGui; FANG GenSun

    2009-01-01

    In this paper, we prove a Marcinkiewicz-Zygmund type inequality for multivariate entire functions of exponential type with non-equidistant spaced sampling points. And from this result, we establish a multivariate irregular Whittaker-Kotelnikov-Shannon type sampling theorem.

  3. Switching Between Multivariable Controllers

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob; Abrahamsen, Rune

    2004-01-01

    it is possible to smoothly switch between multivariable controllers with guaranteed closed-loop stability. This includes also the case where one or more controllers are unstable. The concept for smooth online changes of multivariable controllers based on the YJBK architecture can also handle the start up...

  4. Wavelets-Based Representation of Uniform B-Spline Curves and Surfaces

    Institute of Scientific and Technical Information of China (English)

    赵罡; 穆国旺; 闫光荣; 朱心雄

    2001-01-01

    Wavelets-based representation provides a flexible method for expressing curves and surfaces at different resolution levels. For uniform B-spline curves and surfaces, a unified expression can be adopted after decomposition to describe the wavelets both in the interior and on the boundaries of the domain of definition, so that multiplication is the only operation needed for wavelet reconstruction. This results in highly efficient computation. The paper describes, from a geometric point of view, the principles and methods of realizing a wavelets-based multiresolution representation of uniform cubic B-spline curves and surfaces.

  5. Wavelet-Based Sit-To-Stand Detection and Assessment of Fall Risk in Older People Using a Wearable Pendant Device.

    Science.gov (United States)

    Ejupi, Andreas; Brodie, Matthew; Lord, Stephen R; Annegarn, Janneke; Redmond, Stephen J; Delbaere, Kim

    2017-07-01

    Wearable devices provide new ways to identify people who are at risk of falls and track long-term changes of mobility in daily life of older people. The aim of this study was to develop a wavelet-based algorithm to detect and assess quality of sit-to-stand movements with a wearable pendant device. The algorithm used wavelet transformations of the accelerometer and barometric air pressure sensor data. Detection accuracy was tested in 25 older people performing 30 min of typical daily activities. The ability to differentiate between people who are at risk of falls from people who are not at risk was investigated by assessing group differences of sensor-based sit-to-stand measurements in 34 fallers and 60 nonfallers (based on 12-month fall history) performing sit-to-stand movements as part of a laboratory study. Sit-to-stand movements were detected with 93.1% sensitivity and a false positive rate of 2.9% during activities of daily living. In the laboratory study, fallers had significantly lower maximum acceleration, velocity, and power during the sit-to-stand movement compared to nonfallers. The new wavelet-based algorithm accurately detected sit-to-stand movements in older people and differed significantly between older fallers and nonfallers. Accurate detection and quantification of sit-to-stand movements may provide objective assessment and monitoring of fall risk during daily life in older people.

  6. Multivariate Evolutionary Analyses in Astrophysics

    CERN Document Server

    Fraix-Burnet, Didier

    2011-01-01

    The large amount of data on galaxies, up to higher and higher redshifts, calls for sophisticated statistical approaches to build adequate classifications. Multivariate cluster analyses, which compare objects for their global similarities, are still little used in astrophysics, probably because their results are somewhat difficult to interpret. We believe that the missing key is an unavoidable characteristic of our Universe: evolution. Our approach, known as Astrocladistics, is based on the evolutionary nature of both galaxies and their properties. It gathers objects according to their "histories" and establishes an evolutionary scenario among groups of objects. In this presentation, I show two recent results on globular clusters and early-type galaxies to illustrate how the evolutionary concepts of Astrocladistics can also be useful for multivariate analyses such as K-means Cluster Analysis.

  7. Simultaneous determination of propranolol and amiloride in synthetic binary mixtures and pharmaceutical dosage forms by synchronous fluorescence spectroscopy: a multivariate approach

    Science.gov (United States)

    Divya, O.; Shinde, Mandakini

    2013-07-01

    A multivariate calibration model for the simultaneous estimation of propranolol (PRO) and amiloride (AMI) using synchronous fluorescence spectroscopic data has been presented in this paper. Two multivariate techniques, PCR (Principal Component Regression) and PLSR (Partial Least Square Regression), have been successfully applied for the simultaneous determination of AMI and PRO in synthetic binary mixtures and pharmaceutical dosage forms. The SF spectra of AMI and PRO (calibration mixtures) were recorded at several concentrations within their linear range between wavelengths of 310 and 500 nm at an interval of 1 nm. Calibration models were constructed using 32 samples and validated by varying the concentrations of AMI and PRO in the calibration range. The results indicated that the model developed was very robust and able to efficiently analyze the mixtures with low RMSEP values.

  8. Morphological Filters and Wavelet-Based Histogram Equalization Image Enhancement for Weak Target Detection

    Institute of Scientific and Technical Information of China (English)

    吉书鹏; 丁小青

    2003-01-01

    Image enhancement methods are typically aimed at improving the overall visibility of features. Though histogram equalization can enhance contrast by redistributing the gray levels, it has the drawback of reducing the information in the processed image. In this paper, we present a new image enhancement algorithm. After histogram equalization is carried out, morphological filters and a wavelet-based enhancement algorithm are used to remove unwanted details, further enhance the image, and compensate for the information loss incurred during histogram equalization. Experimental results show that the morphological filters and wavelet-based histogram equalization algorithm can significantly enhance the contrast and increase the information entropy of the image.
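
The histogram equalization step this record builds on is a remapping of gray levels through the image's cumulative distribution function. A minimal numpy sketch (the low-contrast test image is synthetic, and the code assumes the image contains at least two distinct gray levels):

```python
import numpy as np

def equalize_histogram(img):
    """Classic histogram equalization: map gray levels through the CDF."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf_min = cdf[cdf > 0].min()
    # Normalize so the darkest occupied level maps to 0 and the total to 255.
    lut = np.round(255.0 * (cdf - cdf_min) / (cdf[-1] - cdf_min)).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(3)
# Low-contrast test image: gray values squeezed into [100, 140].
img = rng.integers(100, 141, size=(64, 64), dtype=np.uint8)
eq = equalize_histogram(img)
print("input range :", img.min(), img.max())
print("output range:", eq.min(), eq.max())
```

The record's point is that this stretch redistributes levels but can also discard information, which the subsequent morphological/wavelet stage compensates for.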

  9. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  10. Wavelet-based regularization and edge preservation for submillimetre 3D list-mode reconstruction data from a high resolution small animal PET system

    Energy Technology Data Exchange (ETDEWEB)

    Jesus Ochoa Dominguez, Humberto de, E-mail: hochoa@uacj.mx [Departamento de Ingenieria Electrica y Computacion, Universidad Autonoma de Ciudad Juarez, Avenida del Charro 450 Norte, C.P. 32310 Ciudad Juarez, Chihuahua (Mexico); Ortega Maynez, Leticia; Osiris Vergara Villegas, Osslan; Gordillo Castillo, Nelly; Guadalupe Cruz Sanchez, Vianey; Gutierrez Casas, Efren David [Departamento de Ingenieria Electrica y Computacion, Universidad Autonoma de Ciudad Juarez, Avenida del Charro 450 Norte, C.P. 32310 Ciudad Juarez, Chihuahua (Mexico)

    2011-10-01

    The data obtained from a PET system tend to be noisy because of the limitations of the current instrumentation and the detector efficiency. This problem is particularly severe in images of small animals, as the noise contaminates areas of interest within small organs. Therefore, denoising becomes a challenging task. In this paper, a novel wavelet-based regularization and edge preservation method is proposed to reduce such noise. To demonstrate this method, image reconstruction using a small mouse ¹⁸F NEMA phantom and a ¹⁸F mouse was performed. The effect on image quality was investigated for each reconstruction case. Results show that the proposed method drastically reduces the noise and preserves the image details.

  11. The Identification of Internal and External Faults for ±800 kV UHVDC Transmission Line Using Wavelet-based Multi-Resolution Analysis

    Directory of Open Access Journals (Sweden)

    Shu Hongchun

    2011-05-01

    Full Text Available There is a smoothing reactor and a DC filter between the inverter and the direct current line that form a boundary in the HVDC transmission system. Since this boundary presents a stop-band characteristic to high-frequency transient voltage signals, high-frequency transients caused by external faults are attenuated in passing through the boundary, while those caused by internal faults arrive unchanged. Wavelet analysis can therefore be used as a tool to extract fault features and distinguish internal from external faults in an HVDC transmission system. This paper explores a new method that uses wavelet-based multi-resolution analysis for signal decomposition to classify the different fault types.

  12. A Wavelet-Based Robust Relevance Vector Machine Based on Sensor Data Scheduling Control for Modeling Mine Gas Gushing Forecasting on Virtual Environment

    Directory of Open Access Journals (Sweden)

    Wang Ting

    2013-01-01

    Full Text Available It is well known that mine gas gushing forecasting is very significant to ensuring the safety of mining. A wavelet-based robust relevance vector machine (WRRVM) based on sensor data scheduling control for modeling mine gas gushing forecasting is presented in the paper. The Morlet wavelet function is used as the kernel function of the robust relevance vector machine, and mean percentage error is used to measure the performance of the proposed method. As the mean prediction error of mine gas gushing of the WRRVM model is less than 1.5%, while that of the RVM model is more than 2.5%, the prediction accuracy for mine gas gushing of the WRRVM model is better than that of the RVM model.

  13. New and Simple Approach for Preventing Postoperative Peritoneal Adhesions: Do not Touch the Peritoneum without Viscous Liquid—A Multivariate Analysis

    Science.gov (United States)

    Aysan, Erhan; Bektas, Hasan; Ersoz, Feyzullah; Sari, Serkan; Kaygusuz, Arslan; Huq, Gulben Erdem

    2012-01-01

    Background. Postoperative peritoneal adhesions (PPAs) are an unsolved and serious problem in abdominal surgery. Method. Viscous liquids of soybean oil, octyl methoxycinnamate, flax oil, aloe vera gel, and glycerol were used in five experiments, using the same methodology for each. Liquids were applied in the peritoneal cavity before and after mechanical peritoneal trauma. Results were evaluated by multivariate analysis. Results. Compared with the control group, macroscopic and microscopic adhesion values before (P < .001) and after (P < .05) application of viscous liquids significantly reduced PPAs. Values were significantly lower when liquids were applied before rather than after peritoneal trauma (P < .0001). Discussion. Viscous liquids injected into the peritoneal cavity before or after mechanical peritoneal trauma decrease PPA. Injection before trauma was more effective than after trauma. In surgical practice, PPA formation may be prevented or decreased by covering the peritoneal cavity with an appropriate viscous liquid before abdominal surgery. PMID:22363347

  15. [Multivariate geostatistics and GIS-based approach to study the spatial distribution and sources of heavy metals in agricultural soil in the Pearl River Delta, China].

    Science.gov (United States)

    Cai, Li-mei; Ma, Jin; Zhou, Yong-zhang; Huang, Lan-chun; Dou, Lei; Zhang, Cheng-bo; Fu, Shan-ming

    2008-12-01

    One hundred and eighteen surface soil samples were collected from the Dongguan City, and analyzed for concentration of Cu, Zn, Ni, Cr, Pb, Cd, As, Hg, pH and OM. The spatial distribution and sources of soil heavy metals were studied using multivariate geostatistical methods and GIS technique. The results indicated concentrations of Cu, Zn, Ni, Pb, Cd and Hg were beyond the soil background content in Guangdong province, and especially concentrations of Pb, Cd and Hg were greatly beyond the content. The results of factor analysis group Cu, Zn, Ni, Cr and As in Factor 1, Pb and Hg in Factor 2 and Cd in Factor 3. The spatial maps based on geostatistical analysis show definite association of Factor 1 with the soil parent material, Factor 2 was mainly affected by industries. The spatial distribution of Factor 3 was attributed to anthropogenic influence.

  16. Testing for Expected Return and Market Price of Risk in Chinese A-B Share Market: A Geometric Brownian Motion and Multivariate GARCH Model Approach

    DEFF Research Database (Denmark)

    Zhu, Jie

    There exist dual-listed stocks which are issued by the same company in some stock markets. Although these stocks bear the same firm-specific risk and enjoy identical dividends and voting policies, they are priced differently. Some previous studies show this seeming deviation from the law of one...... price can be solved due to different expected return and market price of risk for investors holding heterogeneous beliefs. This paper provides empirical evidence for that argument by testing the expected return and market price of risk between Chinese A and B shares listed in Shanghai and Shenzhen...... stock markets. Models with dynamics of Geometric Brownian Motion are adopted, and multivariate GARCH models are also introduced to capture the feature of time-varying volatility in stock returns. The results suggest that the different pricing can be explained by the difference in expected returns between

  18. Dissolution comparisons using a Multivariate Statistical Distance (MSD) test and a comparison of various approaches for calculating the measurements of dissolution profile comparison.

    Science.gov (United States)

    Cardot, J-M; Roudier, B; Schütz, H

    2017-07-01

    The f2 test is generally used for comparing dissolution profiles. In cases of high variability, the f2 test is not applicable, and the Multivariate Statistical Distance (MSD) test is frequently proposed as an alternative by the FDA and EMA. The guidelines provide only general recommendations. MSD tests can be performed either on raw data, with or without time as a variable, or on parameters of models. In addition, data can be limited, as in the case of the f2 test, to dissolutions of up to 85%, or extended to all available data. In the context of the present paper, the recommended calculation included all raw dissolution data up to the first point greater than 85% as a variable, without the various times as parameters. The proposed MSD overcomes several drawbacks found in other methods.
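
For reference, the f2 similarity factor mentioned above has the closed form f2 = 50·log10(100/√(1 + MSD)), where MSD is the mean squared difference between the reference and test profiles; f2 ≥ 50 is conventionally read as "similar". A minimal sketch with hypothetical dissolution profiles:

```python
import math

def f2_similarity(ref, test):
    """Moore-Flanner f2 similarity factor for two dissolution profiles (% dissolved)."""
    if len(ref) != len(test) or not ref:
        raise ValueError("profiles must be non-empty and of equal length")
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

# Hypothetical % dissolved at successive time points (not the paper's data).
reference = [15, 35, 55, 75, 88]
test_a = [14, 33, 57, 74, 90]    # close profile  -> f2 well above 50
test_b = [5, 15, 30, 50, 65]     # distant profile -> f2 below 50

print(f"f2 (similar)   : {f2_similarity(reference, test_a):.1f}")
print(f"f2 (dissimilar): {f2_similarity(reference, test_b):.1f}")
```

The MSD alternative replaces this pointwise average with a Mahalanobis-type distance that accounts for the covariance between time points, which is why it remains usable when within-batch variability is high.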

  19. Multivariate Approach to the Measurement of Tomato (Lycopersicon esculentum) Quality Based on Color Parameters

    Directory of Open Access Journals (Sweden)

    Rudiati Evi Masithoh

    2012-05-01

    Full Text Available In this study, multivariate linear regression (MLR) was used to predict the Brix, total carotene, citric acid, and vitamin C content of tomatoes from RGB color parameters. Tomatoes were stored at 6 °C and 28 °C and their quality parameters were then measured. R, G, and B values were measured non-destructively using a computer vision system developed in a previous study; Brix, total carotene, citric acid, and vitamin C were determined by conventional procedures in the laboratory. Data analysis showed that the MLR calibration models could be used to predict Brix, total carotene, citric acid, and vitamin C with R² of 0.77 and 0.72, 0.902 and 0.85, 0.71 and 0.77, as well as 0.88 and 0.82 for temperatures of 6 °C and 28 °C, respectively.

  20. Testing for Multivariate Normality in Mass Spectrometry Imaging Data: A Robust Statistical Approach for Clustering Evaluation and the Generation of Synthetic Mass Spectrometry Imaging Data Sets.

    Science.gov (United States)

    Dexter, Alex; Race, Alan M; Styles, Iain B; Bunch, Josephine

    2016-11-15

    Spatial clustering is a powerful tool in mass spectrometry imaging (MSI) and has been demonstrated to be capable of differentiating tumor types, visualizing intratumor heterogeneity, and segmenting anatomical structures. Several clustering methods have been applied to mass spectrometry imaging data, but a principled comparison and evaluation of different clustering techniques presents a significant challenge. We propose that testing whether the data has a multivariate normal distribution within clusters can be used to evaluate the performance when using algorithms that assume normality in the data, such as k-means clustering. In cases where clustering has been performed using the cosine distance, conversion of the data to polar coordinates prior to normality testing should be performed to ensure normality is tested in the correct coordinate system. In addition to these evaluations of internal consistency, we demonstrate that the multivariate normal distribution can then be used as a basis for statistical modeling of MSI data. This allows the generation of synthetic MSI data sets with known ground truth, providing a means of external clustering evaluation. To demonstrate this, reference data from seven anatomical regions of an MSI image of a coronal section of mouse brain were modeled. From this, a set of synthetic data based on this model was generated. Results of r² fitting of the chi-squared quantile-quantile plots on the seven anatomical regions confirmed that the data acquired from each spatial region was found to be closer to normally distributed in polar space than in Euclidean. Finally, principal component analysis was applied to a single data set that included synthetic and real data. No significant differences were found between the two data types, indicating the suitability of these methods for generating realistic synthetic data.
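
The chi-squared quantile-quantile check described above rests on the fact that squared Mahalanobis distances of p-variate normal data are approximately chi-squared distributed with p degrees of freedom. A minimal numpy sketch on synthetic data (the reference quantiles are approximated by Monte Carlo sampling rather than an exact inverse CDF, and the data are illustrative, not MSI spectra):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "cluster": 500 points from a 3-D multivariate normal.
mean = np.array([1.0, -2.0, 0.5])
cov = np.array([[1.0, 0.3, 0.0],
                [0.3, 2.0, 0.4],
                [0.0, 0.4, 0.5]])
X = rng.multivariate_normal(mean, cov, size=500)

# Squared Mahalanobis distances of the points from the sample mean.
diff = X - X.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
d2 = np.sort(np.einsum('ij,jk,ik->i', diff, inv_cov, diff))

# Q-Q comparison against chi-squared(df=3), quantiles via Monte Carlo.
chi2_samples = np.sort(rng.chisquare(df=3, size=500))
r2 = float(np.corrcoef(d2, chi2_samples)[0, 1] ** 2)
print(f"r^2 of chi-squared Q-Q fit: {r2:.3f}")
```

An r² near 1 supports multivariate normality within the cluster; markedly lower values flag clusters where normality-assuming algorithms such as k-means may be inappropriate.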

  1. A Wavelet-Based Unified Power Quality Conditioner to Eliminate Wind Turbine Non-Ideality Consequences on Grid-Connected Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Bijan Rahmani

    2016-05-01

    Full Text Available The integration of renewable power sources with power grids presents many challenges, such as synchronization with the grid, power quality problems and so on. The shunt active power filter (SAPF can be a solution to address the issue while suppressing the grid-end current harmonics and distortions. Nonetheless, available SAPFs work somewhat unpredictably in practice. This is attributed to the dependency of the SAPF controller on nonlinear complicated equations and two distorted variables, such as load current and voltage, to produce the current reference. This condition will worsen when the plant includes wind turbines which inherently produce 3rd, 5th, 7th and 11th voltage harmonics. Moreover, the inability of the typical phase locked loop (PLL used to synchronize the SAPF reference with the power grid also disrupts SAPF operation. This paper proposes an improved synchronous reference frame (SRF which is equipped with a wavelet-based PLL to control the SAPF, using one variable such as load current. Firstly the fundamental positive sequence of the source voltage, obtained using a wavelet, is used as the input signal of the PLL through an orthogonal signal generator process. Then, the generated orthogonal signals are applied through the SRF-based compensation algorithm to synchronize the SAPF’s reference with power grid. To further force the remained uncompensated grid current harmonics to pass through the SAPF, an improved series filter (SF equipped with a current harmonic suppression loop is proposed. Concurrent operation of the improved SAPF and SF is coordinated through a unified power quality conditioner (UPQC. The DC-link capacitor of the proposed UPQC, used to interconnect a photovoltaic (PV system to the power grid, is regulated by an adaptive controller. Matlab/Simulink results confirm that the proposed wavelet-based UPQC results in purely sinusoidal grid-end currents with total harmonic distortion (THD = 1.29%, which leads to high

  2. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  3. Wavelet threshold image denoising algorithm based on different wavelet bases in MATLAB

    Institute of Scientific and Technical Information of China (English)

    曾敬枫

    2016-01-01

    This paper introduces wavelet image denoising methods and the steps of wavelet threshold denoising, discusses the role of the wavelet basis in threshold denoising, and describes the characteristics of several common wavelet bases and compares their properties. Finally, using the db2 and sym4 wavelet bases in MATLAB, wavelet threshold denoising is applied to filter the high-frequency coefficients of an image and reconstruct it, leading to the conclusion that the choice of wavelet basis affects the image denoising result.
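The workflow the abstract describes (decompose, threshold the high-frequency coefficients, reconstruct) can be sketched in pure NumPy with a one-level Haar transform and soft thresholding; the test image, noise level, and universal-threshold rule are illustrative assumptions, and MATLAB's Wavelet Toolbox functions play the same role for db2 or sym4 bases:

```python
import numpy as np

def haar2d(x):
    """One level of the 2-D Haar transform (x must have even dimensions)."""
    a = (x[:, ::2] + x[:, 1::2]) / np.sqrt(2)   # column pairs: average
    d = (x[:, ::2] - x[:, 1::2]) / np.sqrt(2)   # column pairs: detail
    ll = (a[::2] + a[1::2]) / np.sqrt(2)        # then row pairs
    lh = (a[::2] - a[1::2]) / np.sqrt(2)
    hl = (d[::2] + d[1::2]) / np.sqrt(2)
    hh = (d[::2] - d[1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    a = np.empty((2*ll.shape[0], ll.shape[1]))
    a[::2], a[1::2] = (ll + lh)/np.sqrt(2), (ll - lh)/np.sqrt(2)
    d = np.empty_like(a)
    d[::2], d[1::2] = (hl + hh)/np.sqrt(2), (hl - hh)/np.sqrt(2)
    x = np.empty((a.shape[0], 2*a.shape[1]))
    x[:, ::2], x[:, 1::2] = (a + d)/np.sqrt(2), (a - d)/np.sqrt(2)
    return x

def soft(c, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(0)
img = np.outer(np.sin(np.linspace(0, np.pi, 64)), np.ones(64))  # smooth test image
noisy = img + 0.1 * rng.standard_normal(img.shape)
ll, lh, hl, hh = haar2d(noisy)
t = 0.1 * np.sqrt(2 * np.log(noisy.size))   # universal threshold, sigma = 0.1
den = ihaar2d(ll, soft(lh, t), soft(hl, t), soft(hh, t))
```

Swapping the Haar filters for db2 or sym4 changes only the analysis/synthesis filter pair, which is exactly the comparison the paper performs.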

  4. Multivariate normative comparisons using an aggregated database

    Science.gov (United States)

    Murre, Jaap M. J.; Huizenga, Hilde M.

    2017-01-01

    In multivariate normative comparisons, a patient’s profile of test scores is compared to those in a normative sample. Recently, it has been shown that these multivariate normative comparisons enhance the sensitivity of neuropsychological assessment. However, multivariate normative comparisons require multivariate normative data, which are often unavailable. In this paper, we show how a multivariate normative database can be constructed by combining healthy control group data from published neuropsychological studies. We show that three issues should be addressed to construct a multivariate normative database. First, the database may have a multilevel structure, with participants nested within studies. Second, not all tests are administered in every study, so many data may be missing. Third, a patient should be compared to controls of similar age, gender and educational background rather than to the entire normative sample. To address these issues, we propose a multilevel approach for multivariate normative comparisons that accounts for missing data and includes covariates for age, gender and educational background. Simulations show that this approach controls the number of false positives and has high sensitivity to detect genuine deviations from the norm. An empirical example is provided. Implications for other domains than neuropsychology are also discussed. To facilitate broader adoption of these methods, we provide code implementing the entire analysis in the open source software package R. PMID:28267796
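A single-level version of the comparison (ignoring the paper's multilevel structure, missing-data handling, and covariates) reduces to the squared Mahalanobis distance between a patient's profile and the normative sample; the data below are simulated, not from any normative database:

```python
import numpy as np

def mahalanobis_d2(patient, controls):
    """Squared Mahalanobis distance of a patient's test-score profile
    from a normative (control) sample: (x-mu)' Sigma^-1 (x-mu)."""
    mu = controls.mean(axis=0)
    cov = np.cov(controls, rowvar=False)
    diff = patient - mu
    return float(diff @ np.linalg.solve(cov, diff))

rng = np.random.default_rng(1)
controls = rng.standard_normal((200, 4))   # 200 controls, 4 cognitive tests
normal_profile = np.zeros(4)               # at the norm on every test
deviant_profile = np.full(4, 2.5)          # 2.5 SD from the norm on every test
d2_normal = mahalanobis_d2(normal_profile, controls)
d2_deviant = mahalanobis_d2(deviant_profile, controls)
# Flag a deviation when d2 exceeds the chi-square(4) 95% quantile, ~9.49.
```

The multilevel approach in the paper replaces the single pooled covariance with study-level and participant-level components; the distance logic stays the same.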

  5. Multivariate statistical and lead isotopic analyses approach to identify heavy metal sources in topsoil from the industrial zone of Beijing Capital Iron and Steel Factory.

    Science.gov (United States)

    Zhu, Guangxu; Guo, Qingjun; Xiao, Huayun; Chen, Tongbin; Yang, Jun

    2017-06-01

    Heavy metals are considered toxic to humans and ecosystems. In the present study, heavy metal concentrations in soil were investigated using the single pollution index (PIi), the integrated Nemerow pollution index (PIN), and the geoaccumulation index (Igeo) to determine metal accumulation and its pollution status at the abandoned site of the Capital Iron and Steel Factory in Beijing and its surrounding area. Multivariate statistical analysis (principal component analysis and correlation analysis) and geostatistical analysis (ArcGIS), combined with stable Pb isotope ratios, were applied to explore the characteristics of heavy metal pollution and the possible sources of pollutants. The results indicated that the heavy metal elements show different degrees of accumulation in the study area; the observed trend of the enrichment factors and the geoaccumulation index was Hg > Cd > Zn > Cr > Pb > Cu ≈ As > Ni. Hg, Cd, Zn, and Cr were the dominant elements that influenced soil quality in the study area. The Nemerow index method indicated that all of the heavy metals caused serious pollution except Ni. Multivariate statistical analysis indicated that Cd, Zn, Cu, and Pb show obvious correlation and have high loadings on the same principal component, suggesting that they had the same sources, which are related to industrial activities and vehicle emissions. The spatial distribution maps based on ordinary kriging showed that high concentrations of heavy metals were located in the local factory area and in the southeast-northwest part of the study region, corresponding with the predominant wind directions. Analyses of lead isotopes confirmed that Pb in the study soils is predominantly derived from three sources: dust generated during steel production, coal combustion, and the natural background. Moreover, the ternary mixture model based on lead isotope analysis indicates that lead in the study soils originates mainly from anthropogenic sources, which contribute much more
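The three indices named above have standard textbook forms; a sketch under the usual definitions (the background values and the 1.5 factor in Igeo are the conventional choices, not values from this study):

```python
import numpy as np

def single_pollution_index(conc, background):
    """PI_i = C_i / S_i: measured concentration over background value."""
    return np.asarray(conc, float) / np.asarray(background, float)

def nemerow_index(pi):
    """Integrated Nemerow index PIN = sqrt((mean(PI)^2 + max(PI)^2) / 2);
    the max term keeps one hotspot metal from being averaged away."""
    pi = np.asarray(pi, float)
    return float(np.sqrt((pi.mean()**2 + pi.max()**2) / 2))

def geoaccumulation_index(conc, background):
    """Igeo = log2(C / (1.5 * B)); the 1.5 buffers natural variation."""
    return np.log2(np.asarray(conc, float) / (1.5 * np.asarray(background, float)))

# Hypothetical concentrations and backgrounds (mg/kg), for illustration only.
pi = single_pollution_index([0.30, 2.1, 150.0], [0.15, 0.7, 60.0])
pn = nemerow_index(pi)
igeo = geoaccumulation_index([0.30, 2.1, 150.0], [0.15, 0.7, 60.0])
```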

  6. Multivariable Feedback Control of Nuclear Reactors

    Directory of Open Access Journals (Sweden)

    Rune Moen

    1982-07-01

    Full Text Available Multivariable feedback control has been adapted for optimal control of the spatial power distribution in nuclear reactor cores. Two design techniques, based on the theory of automatic control, were developed: the State Variable Feedback (SVF) is an application of linear optimal control theory, and the Multivariable Frequency Response (MFR) is based on a generalization of the traditional frequency response approach to control system design.
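State variable feedback in the linear-optimal (LQR) sense can be illustrated with a discrete-time Riccati iteration; the two-node "spatial power" model below is an illustrative toy, not reactor data:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """State-feedback gain K for u = -K x minimizing sum of x'Qx + u'Ru,
    by fixed-point iteration of the discrete algebraic Riccati equation."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Toy coupled two-node model: slightly unstable open loop.
A = np.array([[1.01, 0.02],
              [0.02, 1.01]])
B = 0.1 * np.eye(2)
K = dlqr_gain(A, B, np.eye(2), np.eye(2))
closed = A - B @ K        # closed-loop dynamics under state feedback
```

The feedback moves the spectral radius of the dynamics matrix inside the unit circle, which is the discrete-time analogue of stabilizing the spatial power distribution.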

  7. Multivariate data analysis

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg

    Interest in statistical methodology is increasing so rapidly in the astronomical community that accessible introductory material in this area is long overdue. This book fills the gap by providing a presentation of the most useful techniques in multivariate statistics. A wide-ranging annotated set...

  8. Multivariate realised kernels

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Hansen, Peter Reinhard; Lunde, Asger

    2011-01-01

    We propose a multivariate realised kernel to estimate the ex-post covariation of log-prices. We show that this new consistent estimator is guaranteed to be positive semi-definite, is robust to certain types of measurement error, and can handle non-synchronous trading. It is the first estimator...

  10. A MULTIVARIATE WEIBULL DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Cheng Lee

    2010-07-01

    Full Text Available A multivariate survival function of Weibull Distribution is developed by expanding the theorem by Lu and Bhattacharyya. From the survival function, the probability density function, the cumulative probability function, the determinant of the Jacobian Matrix, and the general moment are derived.
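One commonly cited Lu–Bhattacharyya-type multivariate Weibull survival function (the paper's exact construction may differ) can be evaluated directly; setting the dependence parameter δ = 1 recovers independent Weibull margins:

```python
import numpy as np

def mv_weibull_survival(x, theta, beta, delta):
    """Assumed Lu-Bhattacharyya-type multivariate Weibull survival function
    with Gumbel-logistic dependence:
        S(x) = exp(-(sum_i (x_i/theta_i)**(beta_i/delta))**delta),
    0 < delta <= 1; delta = 1 gives independent Weibull margins."""
    x = np.asarray(x, float)
    theta = np.asarray(theta, float)
    beta = np.asarray(beta, float)
    return float(np.exp(-np.sum((x / theta) ** (beta / delta)) ** delta))

# delta = 1: product of univariate Weibull survivals exp(-(x/theta)^beta).
s_indep = mv_weibull_survival([1.0, 2.0], [1, 1], [2, 2], 1.0)
```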

  11. Simulation of multivariate diffusion bridges

    DEFF Research Database (Denmark)

    Bladt, Mogens; Finch, Samuel; Sørensen, Michael

    We propose simple methods for multivariate diffusion bridge simulation, which plays a fundamental role in simulation-based likelihood and Bayesian inference for stochastic differential equations. By a novel application of classical coupling methods, the new approach generalizes a previously...... proposed simulation method for one-dimensional bridges to the multivariate setting. First, a method of simulating approximate, but often very accurate, diffusion bridges is proposed. These approximate bridges are used as proposals for easily implementable MCMC algorithms that produce exact diffusion bridges...

  12. Likelihood estimators for multivariate extremes

    KAUST Repository

    Huser, Raphaël

    2015-11-17

    The main approach to inference for multivariate extremes consists in approximating the joint upper tail of the observations by a parametric family arising in the limit for extreme events. The latter may be expressed in terms of componentwise maxima, high threshold exceedances or point processes, yielding different but related asymptotic characterizations and estimators. The present paper clarifies the connections between the main likelihood estimators, and assesses their practical performance. We investigate their ability to estimate the extremal dependence structure and to predict future extremes, using exact calculations and simulation, in the case of the logistic model.
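For the logistic model mentioned at the end of the abstract, the bivariate extreme-value distribution function on unit Fréchet margins has a closed form; a small sketch with the usual parameterization (α in (0, 1], α = 1 giving independence, α → 0 complete dependence):

```python
import numpy as np

def logistic_mev_cdf(z1, z2, alpha):
    """Bivariate extreme-value CDF with logistic dependence on unit
    Frechet margins: G(z1, z2) = exp(-V), where the exponent measure is
    V = (z1**(-1/alpha) + z2**(-1/alpha))**alpha."""
    v = (z1 ** (-1 / alpha) + z2 ** (-1 / alpha)) ** alpha
    return np.exp(-v)

g_indep = logistic_mev_cdf(1.0, 2.0, 1.0)   # independence: product of margins
g_dep = logistic_mev_cdf(2.0, 2.0, 0.3)     # stronger dependence, larger joint CDF
```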

  13. A New Predictive Model Based on the ABC Optimized Multivariate Adaptive Regression Splines Approach for Predicting the Remaining Useful Life in Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Paulino José García Nieto

    2016-05-01

    Full Text Available Remaining useful life (RUL) estimation is considered one of the most central points in prognostics and health management (PHM). The present paper describes a nonlinear hybrid ABC–MARS-based model for the prediction of the remaining useful life of aircraft engines. Indeed, it is well known that an accurate RUL estimation allows failure prevention in a more controllable way, so that effective maintenance can be carried out in appropriate time to correct impending faults. The proposed hybrid model combines multivariate adaptive regression splines (MARS), which have been successfully adopted for regression problems, with the artificial bee colony (ABC) technique. This optimization technique involves parameter setting in the MARS training procedure, which significantly influences the regression accuracy. However, its use in reliability applications has not yet been widely explored. Bearing this in mind, remaining useful life values have been predicted here by using the hybrid ABC–MARS-based model from the remaining measured parameters (input variables) for aircraft engines with success. A correlation coefficient equal to 0.92 was obtained when this hybrid ABC–MARS-based model was applied to experimental data. The agreement of this model with experimental data confirmed its good performance. The main advantage of this predictive model is that it does not require information about the previous operation states of the aircraft engine.

  14. Discriminating among multiple components affecting bulk atmospheric deposition chemistry: a multivariate approach using data from a forest plot in Calabria (Southern Italy

    Directory of Open Access Journals (Sweden)

    Maurizio BADIANI

    2007-02-01

    Full Text Available This study examines the relationships between meteorology and atmospheric deposition chemistry on the basis of 4 years of monitoring in an area of Calabria (Piano Limina) under the National Integrated Programme for the Control of Forest Ecosystems. The location of the area and its low anthropogenic impact meant that phenomena of locally originating alkaline dust deposition could be distinguished from those originating long distances away. The analysis performed on the whole dataset revealed the interaction between temperature, solar radiation and ionic concentrations; the effects of the atmospheric transport of compounds, with lower concentrations during calm conditions; and a marked increase of calcium, alkalinity and pH with winds from W-SW, owing to the transport of alkaline dust from North Africa, in agreement with thematic maps on the synoptic scale. The possible influence of two volcanic events deriving from Stromboli and Etna is discussed. After elimination of the Saharan dust and volcanic events, a multivariate analysis showed the effects of compounds deriving from anthropogenic activities. Sulphate, nitrate and ammonium were closely correlated with NW winds; air masses from this direction come from the continental land mass and the sea, crossing the Calabrian plain before being deposited as precipitation on the Apennine chain. The component from NW also includes a high marine contribution, with maximum values of chloride and sodium.

  15. A multivariate approach using attenuated total reflectance mid-infrared spectroscopy to measure the surface mannoproteins and β-glucans of yeast cell walls during wine fermentations.

    Science.gov (United States)

    Moore, John P; Zhang, Song-Lei; Nieuwoudt, Hélène; Divol, Benoit; Trygg, Johan; Bauer, Florian F

    2015-11-18

    Yeast cells possess a cell wall comprising primarily glycoproteins, mannans, and glucan polymers. Several yeast phenotypes relevant for fermentation, wine processing, and wine quality are correlated with cell wall properties. To investigate the effect of wine fermentation on cell wall composition, a study was performed using mid-infrared (MIR) spectroscopy coupled with multivariate methods (i.e., PCA and OPLS-DA). A total of 40 yeast strains were evaluated, including Saccharomyces strains (laboratory and industrial) and non-Saccharomyces species. Cells were fermented in both synthetic MS300 and Chardonnay grape must to stationary phase, processed, and scanned in the MIR spectrum. PCA of the fingerprint spectral region showed distinct separation of Saccharomyces strains from non-Saccharomyces species; furthermore, industrial wine yeast strains separated from laboratory strains. PCA loading plots and the application of OPLS-DA to the data sets suggested that industrial strains were enriched with cell wall proteins (e.g., mannoproteins), whereas laboratory strains were composed mainly of mannan and glucan polymers.

  16. A study of pH-dependent photodegradation of amiloride by a multivariate curve resolution approach to combined kinetic and acid-base titration UV data.

    Science.gov (United States)

    De Luca, Michele; Ioele, Giuseppina; Mas, Sílvia; Tauler, Romà; Ragno, Gaetano

    2012-11-21

    Amiloride photostability at different pH values was studied in depth by applying Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) to the UV spectrophotometric data from drug solutions exposed to stressing irradiation. Resolution of all degradation photoproducts was possible by simultaneous spectrophotometric analysis of kinetic photodegradation and acid-base titration experiments. Amiloride photodegradation was shown to be strongly dependent on pH. Two hard modelling constraints were sequentially used in MCR-ALS for the unambiguous resolution of all the species involved in the photodegradation process. An amiloride acid-base system was defined by using the equilibrium constraint, and the photodegradation pathway was modelled taking into account the kinetic constraint. The simultaneous analysis of photodegradation and titration experiments revealed the presence of eight different species, which were differently distributed according to pH and time. Concentration profiles of all the species as well as their pure spectra were resolved and kinetic rate constants were estimated. The values of rate constants changed with pH, and under alkaline conditions the degradation pathway and photoproducts also changed. These results were compared to those obtained by LC-MS analysis from drug photodegradation experiments. MS analysis allowed the identification of up to five species and showed the simultaneous presence of more than one acid-base equilibrium.
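The bilinear model behind MCR-ALS, D ≈ C Sᵀ with non-negative concentration profiles C and pure spectra S, can be sketched with plain alternating least squares; non-negativity clipping here is a simplified stand-in for the paper's equilibrium and kinetic hard-modelling constraints, and the data are synthetic:

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Minimal MCR-ALS: factor D (spectra x wavelengths) into non-negative
    concentration profiles C and pure spectra S such that D ~ C @ S.T."""
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_components))
    for _ in range(n_iter):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    return C, S

# Two overlapping Gaussian "pure spectra" mixed with first-order kinetics.
x = np.linspace(0, 10, 80)
S_true = np.vstack([np.exp(-(x - 3)**2), np.exp(-(x - 6)**2)])
t = np.linspace(0, 1, 40)
C_true = np.vstack([np.exp(-3*t), 1 - np.exp(-3*t)]).T
D = C_true @ S_true
C, S = mcr_als(D, 2)
```

Real MCR-ALS implementations add the equilibrium and kinetic constraints inside the loop, which is what resolves rotational ambiguity in the recovered profiles.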

  17. Potential of LC-MS phenolic profiling combined with multivariate analysis as an approach for the determination of the geographical origin of north Moroccan virgin olive oils.

    Science.gov (United States)

    Bajoub, Aadil; Carrasco-Pancorbo, Alegría; Ajal, El Amine; Ouazzani, Noureddine; Fernández-Gutiérrez, Alberto

    2015-01-01

    The applicability of two different platforms (LC-ESI-TOF MS and LC-ESI-IT MS) as powerful tools for the characterisation and subsequent quantification of the phenolic compounds present in north Moroccan virgin olive oils was assessed in this study. 156 olive samples of the "Picholine Marocaine" cultivar grown in 7 Moroccan regions were collected and olive oils extracted. The phenolic profiles of these olive oils were studied using a resolutive chromatographic method coupled to ESI-TOF MS (for initial characterisation purposes) and coupled to ESI-IT MS (for further identification and quantification). 25 phenolic compounds belonging to different chemical families were identified and quantified. Secoiridoids were the most abundant phenols in all the samples, followed by phenolic alcohols, lignans and flavonoids, respectively. For testing the ability of phenolic profiles to trace the geographical origin of the investigated oils, multivariate analysis tools were used, achieving a good rate of correct classification and prediction using a cross-validation procedure. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Application of multivariate statistical approach to identify trace elements sources in surface waters: a case study of Kowalskie and Stare Miasto reservoirs, Poland.

    Science.gov (United States)

    Siepak, Marcin; Sojka, Mariusz

    2017-08-01

    The paper reports the results of measurements of trace elements concentrations in surface water samples collected at the lowland retention reservoirs of Stare Miasto and Kowalskie (Poland). The samples were collected once a month from October 2011 to November 2012. Al, As, Cd, Co, Cr, Cu, Li, Mn, Ni, Pb, Sb, V, and Zn were determined in water samples using the inductively coupled plasma with mass detection (ICP-QQQ). To assess the chemical composition of surface water, multivariate statistical methods of data analysis were used, viz. cluster analysis (CA), principal components analysis (PCA), and discriminant analysis (DA). They made it possible to observe similarities and differences in the chemical composition of water in the points of water samples collection, to uncover hidden factors accounting for the structure of the data, and to assess the impact of natural and anthropogenic sources on the content of trace elements in the water of retention reservoirs. The conducted statistical analyses made it possible to distinguish groups of trace elements allowing for the analysis of time and spatial variation of water in the studied reservoirs.
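PCA, the centerpiece of such source-apportionment studies, reduces to an SVD of the standardized concentration matrix; the "measurements" below are simulated (two correlated anthropogenic-like metals plus one independent one), not the reservoirs' data:

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the standardized data matrix (samples x variables).
    Returns component scores, loadings, and explained-variance ratios."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components].T
    explained = s**2 / np.sum(s**2)
    return scores, loadings, explained[:n_components]

# 14 monthly samples, 3 "metals": columns 0 and 1 share a common source.
rng = np.random.default_rng(42)
src = rng.standard_normal(14)
X = np.column_stack([src + 0.1 * rng.standard_normal(14),
                     src + 0.1 * rng.standard_normal(14),
                     rng.standard_normal(14)])
scores, loadings, explained = pca(X, 2)
```

Metals loading strongly on the same component are the candidates for a shared (natural or anthropogenic) source, which is the interpretive step the abstract describes.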

  19. Investigation on the antidepressant effect of sea buckthorn seed oil through the GC-MS-based metabolomics approach coupled with multivariate analysis.

    Science.gov (United States)

    Tian, Jun-sheng; Liu, Cai-chun; Xiang, Huan; Zheng, Xiao-fen; Peng, Guo-jiang; Zhang, Xiang; Du, Guan-hua; Qin, Xue-mei

    2015-11-01

    Depression is one of the most prevalent and serious mental disorders, and the number of depressed patients has been on the rise globally during recent decades. Sea buckthorn seed oil from traditional Chinese medicine (TCM) is edible and has been widely used for treatment of different diseases for a long time. However, there are few published reports on the antidepressant effect of sea buckthorn seed oil. With the objective of finding potential biomarkers of the therapeutic response of sea buckthorn seed oil in chronic unpredictable mild stress (CUMS) rats, urine metabolomics based on gas chromatography-mass spectrometry (GC-MS) coupled with multivariate analysis was applied. In this study, we discovered a higher level of pimelic acid as well as palmitic acid and a lower level of suberic acid, citrate, phthalic acid, cinnamic acid and Sumiki's acid in the urine of rats exposed to CUMS procedures after sea buckthorn seed oil was administered. These changes of metabolites are involved in energy metabolism, fatty acid metabolism and other metabolic pathways, as well as in the synthesis of neurotransmitters, which helps to evaluate the efficacy and elucidate the mechanism of sea buckthorn seed oil in depression management.

  20. Measurements of natural gamma radiation in beach sediments of north east coast of Tamilnadu, India by gamma ray spectrometry with multivariate statistical approach

    Directory of Open Access Journals (Sweden)

    M. SureshGandhi

    2014-01-01

    Full Text Available The distribution of the natural gamma-ray emitting radionuclides 238U, 232Th and 40K in beach sediments along the north east coast of Tamilnadu, India has been studied using a NaI(Tl) gamma ray spectrometric technique. The total average concentrations of the radionuclides 238U, 232Th, and 40K were 35.12, 713.16, and 349.60 Bq kg−1, respectively. Correlations made among these radionuclides prove the existence of secular equilibrium in the investigated sediments. The total average absorbed dose rate in the study areas is found to be 504.75 nGy h−1, whereas the annual effective dose rate has an average value of 0.62 mSv y−1. The mean activity concentrations of the measured radionuclides were compared with other literature values. The ratios between the detected radioisotopes have been calculated for the spatial distribution of natural radionuclides in the studied area. Also, the radiological hazard of the natural radionuclide content, the radium equivalent activity and the external hazard index of the sediment samples in the area under consideration were calculated. Multivariate statistical analyses (Pearson correlation, cluster and factor analysis) were carried out between the parameters obtained from the radioactivity measurements to identify existing relations.
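The radium equivalent activity and external hazard index mentioned in the abstract follow standard formulas; applying them to the reported mean activities is a one-liner (here the 238U activity stands in for 226Ra, since the abstract reports 238U and secular equilibrium):

```python
def radium_equivalent(a_ra, a_th, a_k):
    """Ra_eq (Bq/kg) = A_Ra + 1.43*A_Th + 0.077*A_K; the weights equalize
    the gamma dose contributions of the three activity concentrations."""
    return a_ra + 1.43 * a_th + 0.077 * a_k

def external_hazard_index(a_ra, a_th, a_k):
    """H_ex = A_Ra/370 + A_Th/259 + A_K/4810; values <= 1 meet the
    usual external-exposure criterion."""
    return a_ra / 370 + a_th / 259 + a_k / 4810

# Mean activities reported in the abstract (Bq/kg).
raeq = radium_equivalent(35.12, 713.16, 349.60)
hex_ = external_hazard_index(35.12, 713.16, 349.60)
```

With these means both indices come out well above the reference limits (Ra_eq of 370 Bq/kg and H_ex of 1), consistent with the elevated thorium content the abstract reports.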

  1. Survival to parasitoids in an insect hosting defensive symbionts: a multivariate approach to polymorphic traits affecting host use by its natural enemy.

    Science.gov (United States)

    Bilodeau, Emilie; Guay, Jean-Frédéric; Turgeon, Julie; Cloutier, Conrad

    2013-01-01

    Insect parasitoids and their insect hosts represent a wide range of parasitic trophic relations that can be used to understand the evolution of biotic diversity on earth. Testing theories of coevolution between hosts and parasites is based on factors directly involved in host susceptibility and parasitoid virulence. We used controlled encounters with potential hosts of the Aphidius ervi wasp to elucidate behavioral and other phenotypic traits of host Acyrthosiphon pisum that most contribute to success or failure of parasitism. The host aphid is at an advanced stage of specialization on different crop plants, and exhibits intra-population polymorphism for traits of parasitoid avoidance and resistance based on clonal variation of color morph and anti-parasitoid bacterial symbionts. Randomly selected aphid clones from alfalfa and clover were matched in 5 minute encounters with wasps of two parasitoid lineages deriving from hosts of each plant biotype in a replicated transplant experimental design. In addition to crop plant affiliation (alfalfa, clover), aphid clones were characterized for color morph (green, pink), Hamiltonella defensa and Regiella insecticola symbionts, and frequently used behaviors in encounters with A. ervi wasps. A total of 12 explanatory variables were examined using redundancy analysis (RDA) to predict host survival or failure to A. ervi parasitism. Aphid color was the best univariate predictor, but was poorly predictive in the RDA model. In contrast, aphid host plant and symbionts were not significant univariate predictors, but significant predictors in the multivariate model. Aphid susceptibility to wasp acceptance as reflected in host attacks and oviposition clearly differed from its suitability to parasitism and progeny development. Parasitoid progeny were three times more likely to survive on clover than alfalfa host aphids, which was compensated by behaviorally adjusting eggs invested per host. 
Strong variation of the predictive power of

  2. Joint modeling of repeated multivariate cognitive measures and competing risks of dementia and death: a latent process and latent class approach.

    Science.gov (United States)

    Proust-Lima, Cécile; Dartigues, Jean-François; Jacqmin-Gadda, Hélène

    2016-02-10

    Joint models initially dedicated to a single longitudinal marker and a single time-to-event need to be extended to account for the rich longitudinal data of cohort studies. Multiple causes of clinical progression are indeed usually observed, and multiple longitudinal markers are collected when the true latent trait of interest is hard to capture (e.g., quality of life, functional dependency, and cognitive level). These multivariate and longitudinal data also usually have nonstandard distributions (discrete, asymmetric, bounded, etc.). We propose a joint model based on a latent process and latent classes to analyze simultaneously such multiple longitudinal markers of different natures, and multiple causes of progression. A latent process model describes the latent trait of interest and links it to the observed longitudinal outcomes using flexible measurement models adapted to different types of data, and a latent class structure links the longitudinal and cause-specific survival models. The joint model is estimated in the maximum likelihood framework. A score test is developed to evaluate the assumption of conditional independence of the longitudinal markers and each cause of progression given the latent classes. In addition, individual dynamic cumulative incidences of each cause of progression based on the repeated marker data are derived. The methodology is validated in a simulation study and applied on real data about cognitive aging obtained from a large population-based study. The aim is to predict the risk of dementia by accounting for the competing death according to the profiles of semantic memory measured by two asymmetric psychometric tests.

  3. Survival to parasitoids in an insect hosting defensive symbionts: a multivariate approach to polymorphic traits affecting host use by its natural enemy.

    Directory of Open Access Journals (Sweden)

    Emilie Bilodeau

    Full Text Available Insect parasitoids and their insect hosts represent a wide range of parasitic trophic relations that can be used to understand the evolution of biotic diversity on earth. Testing theories of coevolution between hosts and parasites is based on factors directly involved in host susceptibility and parasitoid virulence. We used controlled encounters with potential hosts of the Aphidius ervi wasp to elucidate behavioral and other phenotypic traits of host Acyrthosiphon pisum that most contribute to success or failure of parasitism. The host aphid is at an advanced stage of specialization on different crop plants, and exhibits intra-population polymorphism for traits of parasitoid avoidance and resistance based on clonal variation of color morph and anti-parasitoid bacterial symbionts. Randomly selected aphid clones from alfalfa and clover were matched in 5 minute encounters with wasps of two parasitoid lineages deriving from hosts of each plant biotype in a replicated transplant experimental design. In addition to crop plant affiliation (alfalfa, clover), aphid clones were characterized for color morph (green, pink), Hamiltonella defensa and Regiella insecticola symbionts, and frequently used behaviors in encounters with A. ervi wasps. A total of 12 explanatory variables were examined using redundancy analysis (RDA) to predict host survival or failure to A. ervi parasitism. Aphid color was the best univariate predictor, but was poorly predictive in the RDA model. In contrast, aphid host plant and symbionts were not significant univariate predictors, but significant predictors in the multivariate model. Aphid susceptibility to wasp acceptance as reflected in host attacks and oviposition clearly differed from its suitability to parasitism and progeny development. Parasitoid progeny were three times more likely to survive on clover than alfalfa host aphids, which was compensated by behaviorally adjusting eggs invested per host. Strong variation of the

  4. An Integrated and Multivariate Model along with Designing Experiments Approach for Assessment of Micro- and Macro- Ergonomic Factors: The Case of a Gas Refinery.

    Science.gov (United States)

    Azadeh, A; Mohammadfam, I; Sadjadi, M; Hamidi, Y; Kianfar, A

    2008-12-28

    The objectives of this paper are three folds. First, an integrated framework for designing and development of the integrated health, safety and environment (HSE) model is presented. Second, it is implemented and tested for a large gas refinery in Iran. Third, it is shown whether the total ergonomics model is superior to the conventional ergonomics approach. This study is among the first to examine total ergonomics components in a manufacturing system. This study was conducted in Sarkhoon & Qeshm Gas refinery- Iran in 2006. To achieve the above objectives, an integrated approach based on total ergonomics factors was developed. Second, it is applied to the refinery and the advantages of total ergonomics approach are discussed. Third, the impacts of total ergonomics factors on local factors are examined through non-parametric statistical analysis. It was shown that total ergonomics model is much more beneficial than conventional approach. It should be noted that the traditional ergonomics methodology is not capable of locating the findings of total ergonomics model. The distinguished aspect of this study is the employment of a total system approach based on integration of the conventional ergonomics factors with HSE factors.

  5. Multivariate calculus and geometry

    CERN Document Server

    Dineen, Seán

    2014-01-01

    Multivariate calculus can be understood best by combining geometric insight, intuitive arguments, detailed explanations and mathematical reasoning. This textbook has successfully followed this programme. It additionally provides a solid description of the basic concepts, via familiar examples, which are then tested in technically demanding situations. In this new edition the introductory chapter and two of the chapters on the geometry of surfaces have been revised. Some exercises have been replaced and others provided with expanded solutions. Familiarity with partial derivatives and a course in linear algebra are essential prerequisites for readers of this book. Multivariate Calculus and Geometry is aimed primarily at higher level undergraduates in the mathematical sciences. The inclusion of many practical examples involving problems of several variables will appeal to mathematics, science and engineering students.

  6. Multivariate image analysis in biomedicine.

    Science.gov (United States)

    Nattkemper, Tim W

    2004-10-01

    In recent years, multivariate imaging techniques have been developed and applied in biomedical research to an increasing degree. In research projects as well as in clinical studies, m-dimensional multivariate images (MVI) are recorded and stored in databases for subsequent analysis. The complexity of the m-dimensional data and the growing number of high-throughput applications call for new strategies for the application of image processing and data mining to support direct interactive analysis by human experts. This article provides an overview of proposed approaches for MVI analysis in biomedicine. After summarizing the biomedical MVI techniques, the two-level framework for MVI analysis is illustrated. Following this framework, state-of-the-art solutions from the fields of image processing and data mining are reviewed and discussed. Motivations for MVI data mining in biology and medicine are characterized, followed by an overview of graphical and auditory approaches for interactive data exploration. The paper concludes by summarizing open problems in MVI analysis and remarks upon the future development of biomedical MVI analysis.

  7. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
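The proportion-recovery step this record describes can be sketched as a least-squares unmixing problem: given reference responses of the two pure components, solve for the mixing proportions that best explain an observed measurement. The "spectra" below are invented for illustration; the record does not specify the actual measurement channels.

```python
# Toy sketch of two-component multivariate quantitative analysis:
# estimate mixing proportions by least squares against pure-component
# reference responses. All numbers are synthetic.
import numpy as np

# Reference responses of components A and B over 5 measurement channels.
pure_a = np.array([1.0, 0.8, 0.2, 0.1, 0.0])
pure_b = np.array([0.0, 0.1, 0.3, 0.9, 1.0])

# An observed mixture: 70% A + 30% B (what we hope to recover).
observed = 0.7 * pure_a + 0.3 * pure_b

# Solve min ||M p - observed|| for the proportion vector p.
M = np.column_stack([pure_a, pure_b])
p, *_ = np.linalg.lstsq(M, observed, rcond=None)
print(np.round(p, 6))  # -> [0.7 0.3]
```

In a process-monitoring setting the same solve would run per sample, flagging drifts in the recovered proportions.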

  8. Multivariate $\\alpha$-molecules

    OpenAIRE

    Flinth, Axel; Schäfer, Martin

    2015-01-01

    The suboptimal performance of wavelets with regard to the approximation of multivariate data gave rise to new representation systems, specifically designed for data with anisotropic features. Some prominent examples of these are given by ridgelets, curvelets, and shearlets, to name a few. The great variety of such so-called directional systems motivated the search for a common framework, which unites many under one roof and enables a simultaneous analysis, for example with respect to approxim...

  9. Transient multivariable sensor evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Vilim, Richard B.; Heifetz, Alexander

    2017-02-21

    A method and system for performing transient multivariable sensor evaluation. The method and system include a computer system for identifying a model form, providing training measurement data, generating a basis vector, monitoring system data from a sensor, loading the system data into a non-transient memory, performing an estimation to provide desired data, comparing the system data to the desired data, and outputting an alarm for a defective sensor.

  12. Multivariate determinants of self-management in Health Care: assessing Health Empowerment Model by comparison between structural equation and graphical models approaches

    Directory of Open Access Journals (Sweden)

    Filippo Trentini

    2015-03-01

    Full Text Available Background. In public health, one debated issue relates to the consequences of improper self-management in health care. Several theoretical models proposed in health communication theory highlight how components such as general literacy and specific knowledge of the disease can be very important for effective action in the healthcare system. Methods. This paper investigates the consistency of the Health Empowerment Model by means of both a graphical models approach, which is a "data driven" method, and a Structural Equation Modeling (SEM) approach, which is instead "theory driven", showing the different information patterns that can be revealed in a health care research context. The analyzed dataset provides data on the relationship between the Health Empowerment Model constructs and the behavioral and health status of 263 chronic low back pain (cLBP) patients. We used the graphical models approach to evaluate the dependence structure in a "blind" way, thus learning the structure from the data. Results. The estimated dependence structure confirms the link design assumed in the SEM approach directly by the researchers, thus validating the hypotheses which generated the Health Empowerment Model constructs. Conclusions. This model comparison helps to avoid confirmation bias. For Structural Equation Modeling, we used SPSS AMOS 21 software. Graphical modeling algorithms were implemented in the R software environment.

  13. Using the Onto-Semiotic Approach to Identify and Analyze Mathematical Meaning when Transiting between Different Coordinate Systems in a Multivariate Context

    Science.gov (United States)

    Montiel, Mariana; Wilhelmi, Miguel R.; Vidakovic, Draga; Elstak, Iwan

    2009-01-01

    The main objective of this paper is to apply the onto-semiotic approach to analyze the mathematical concept of different coordinate systems, as well as some situations and university students' actions related to these coordinate systems. The identification of objects that emerge from the mathematical activity and a first intent to describe an…

  14. The past, present, and future of the U.S. electric power sector: Examining regulatory changes using multivariate time series approaches

    Science.gov (United States)

    Binder, Kyle Edwin

    The U.S. energy sector has undergone continuous change in the regulatory, technological, and market environments. These developments show no signs of slowing. Accordingly, it is imperative that energy market regulators and participants develop a strong comprehension of market dynamics and the potential implications of their actions. This dissertation contributes to a better understanding of the past, present, and future of U.S. energy market dynamics and interactions with policy. Advancements in multivariate time series analysis are employed in three related studies of the electric power sector. Overall, results suggest that regulatory changes have had and will continue to have important implications for the electric power sector. The sector, however, has exhibited adaptability to past regulatory changes and is projected to remain resilient in the future. Tests for constancy of the long run parameters in a vector error correction model are applied to determine whether relationships among coal inventories in the electric power sector, input prices, output prices, and opportunity costs have remained constant over the past 38 years. Two periods of instability are found, the first following railroad deregulation in the U.S. and the second corresponding to a number of major regulatory changes in the electric power and natural gas sectors. Relationships among Renewable Energy Credit prices, electricity prices, and natural gas prices are estimated using a vector error correction model. Results suggest that Renewable Energy Credit prices do not completely behave as previously theorized in the literature. Potential reasons for the divergence between theory and empirical evidence are the relative immaturity of current markets and continuous institutional intervention. Potential impacts of future CO2 emissions reductions under the Clean Power Plan on economic and energy sector activity are estimated. Conditional forecasts based on an outlined path for CO2 emissions are

  15. Multivariate Voronoi Outlier Detection for Time Series.

    Science.gov (United States)

    Zwilling, Chris E; Wang, Michelle Yongmei

    2014-10-01

    Outlier detection is a primary step in many data mining and analysis applications, including healthcare and medical research. This paper presents a general method to identify outliers in multivariate time series based on a Voronoi diagram, which we call Multivariate Voronoi Outlier Detection (MVOD). The approach copes with outliers in a multivariate framework, via designing and extracting effective attributes or features from the data that can take parametric or nonparametric forms. Voronoi diagrams allow for automatic configuration of the neighborhood relationship of the data points, which facilitates the differentiation of outliers and non-outliers. Experimental evaluation demonstrates that our MVOD is an accurate, sensitive, and robust method for detecting outliers in multivariate time series data.
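The neighborhood idea behind MVOD can be sketched compactly. A faithful implementation would build a true Voronoi diagram to define each point's neighbors; the toy below approximates the neighborhood with k nearest neighbors and scores each point by its mean distance to them, which captures the same intuition on synthetic data.

```python
# Toy sketch of neighborhood-based outlier scoring in the spirit of MVOD.
# A real implementation derives neighborhoods from a Voronoi diagram; here
# k nearest neighbors stand in for the Voronoi neighbors. Data synthetic.
import numpy as np

rng = np.random.default_rng(0)
series = rng.normal(0.0, 1.0, size=(50, 2))   # 50 time points, 2 variables
series[25] = [8.0, 8.0]                       # inject one obvious outlier

def knn_outlier_scores(x, k=5):
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # ignore self-distance
    nearest = np.sort(d, axis=1)[:, :k]       # k smallest distances per point
    return nearest.mean(axis=1)

scores = knn_outlier_scores(series)
print(int(np.argmax(scores)))                 # -> 25
```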

  16. A wavelet-based non-linear autoregressive with exogenous inputs (WNARX) dynamic neural network model for real-time flood forecasting using satellite-based rainfall products

    Science.gov (United States)

    Nanda, Trushnamayee; Sahoo, Bhabagrahi; Beria, Harsh; Chatterjee, Chandranath

    2016-08-01

    Although a flood forecasting and warning system is a very important non-structural measure in flood-prone river basins, a poor raingauge network as well as the unavailability of rainfall data in real time can hinder its accuracy at different lead times. Conversely, since real-time satellite-based rainfall products are now becoming available for data-scarce regions, their integration with data-driven models could be used effectively for real-time flood forecasting. To address these issues in operational streamflow forecasting, a new data-driven model, namely the wavelet-based non-linear autoregressive with exogenous inputs (WNARX) model, is proposed and evaluated in comparison with four other data-driven models, viz., the linear autoregressive moving average with exogenous inputs (ARMAX), static artificial neural network (ANN), wavelet-based ANN (WANN), and dynamic nonlinear autoregressive with exogenous inputs (NARX) models. First, the quality of the input rainfall products of the Tropical Rainfall Measuring Mission Multi-satellite Precipitation Analysis (TMPA), viz., the TRMM and TRMM-real-time (RT) rainfall products, is assessed through statistical evaluation. The results reveal that the satellite rainfall products moderately correlate with the observed rainfall, with the gauge-adjusted TRMM product outperforming the real-time TRMM-RT product. The TRMM rainfall product better captures the ground observations up to the 95th percentile range (30.11 mm/day), although the hit rate decreases for high rainfall intensity. The effect of antecedent rainfall (AR) and the climate forecast system reanalysis (CFSR) temperature product on the catchment response is tested in all the developed models. The results reveal that, during real-time flow simulation, the satellite-based rainfall products generally perform worse than the gauge-based rainfall. Moreover, as compared to the existing models, flow forecasting by the WNARX model is substantially better than that of the other four models studied herein, with the
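The wavelet preprocessing that distinguishes models like WNARX and WANN from their plain counterparts can be sketched with a one-level Haar transform: the input series is split into low-frequency (approximation) and high-frequency (detail) sub-series, which are then fed to the regression model of choice. Haar is a stand-in here; the record does not specify the mother wavelet.

```python
# Minimal one-level Haar DWT and its inverse, the preprocessing step for
# wavelet-hybrid forecasting models. The rainfall series is synthetic.
import numpy as np

def haar_dwt(x):
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-frequency content
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-frequency content
    return approx, detail

def haar_idwt(approx, detail):
    even = (approx + detail) / np.sqrt(2.0)
    odd = (approx - detail) / np.sqrt(2.0)
    out = np.empty(2 * len(approx))
    out[0::2], out[1::2] = even, odd
    return out

rainfall = np.array([0.0, 2.0, 5.0, 3.0, 0.0, 0.0, 7.0, 1.0])
a, d = haar_dwt(rainfall)
print(np.allclose(haar_idwt(a, d), rainfall))  # -> True (perfect reconstruction)
```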

  17. Chemical modeling of groundwater in the Banat Plain, southwestern Romania, with elevated As content and co-occurring species by combining diagrams and unsupervised multivariate statistical approaches.

    Science.gov (United States)

    Butaciu, Sinziana; Senila, Marin; Sarbu, Costel; Ponta, Michaela; Tanaselia, Claudiu; Cadar, Oana; Roman, Marius; Radu, Emil; Sima, Mihaela; Frentiu, Tiberiu

    2017-04-01

    The study proposes a combined model based on diagrams (Gibbs, Piper, Stuyfzand Hydrogeochemical Classification System) and unsupervised statistical approaches (Cluster Analysis, Principal Component Analysis, Fuzzy Principal Component Analysis, Fuzzy Hierarchical Cross-Clustering) to describe the natural enrichment of inorganic arsenic and co-occurring species in groundwater in the Banat Plain, southwestern Romania. Speciation of inorganic As (arsenite, arsenate) and determination of ion concentrations (Na(+), K(+), Ca(2+), Mg(2+), HCO3(-), Cl(-), F(-), SO4(2-), PO4(3-), NO3(-)), pH, redox potential, conductivity and total dissolved substances were performed. The classical diagrams provided the hydrochemical characterization, while the statistical approaches were helpful to establish (i) the mechanism of natural occurrence of the As and F(-) species and the anthropogenic origin of NO3(-), SO4(2-), PO4(3-) and K(+), and (ii) a classification of the groundwater based on the content of arsenic species. The HCO3(-) type of the local groundwater and its alkaline pH (8.31-8.49) were found to be responsible for the enrichment of arsenic species and the occurrence of F(-), but by different paths. The PO4(3-)-AsO4(3-) ion exchange and water-rock interaction (silicate hydrolysis and desorption from clay) were associated with arsenate enrichment in the oxidizing aquifer. Fuzzy Hierarchical Cross-Clustering was the strongest tool for the rapid simultaneous classification of groundwaters as a function of arsenic content and hydrogeochemical characteristics. The approach indicated the Na(+)-F(-)-pH cluster as a marker for groundwater with naturally elevated As and highlighted which parameters need to be monitored. A chemical conceptual model illustrating the natural and anthropogenic paths and the enrichment of As and co-occurring species in the local groundwater, supported by mineralogical analysis of rocks, was established.
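The unsupervised step can be illustrated with plain PCA via the SVD on standardised concentration data. The 4-sample, 3-ion table below is invented purely for illustration and bears no relation to the Banat Plain measurements.

```python
# Minimal PCA sketch: standardise columns, take the SVD, and report the
# variance explained by the first principal component. Data are invented.
import numpy as np

# rows: groundwater samples; columns: e.g. Na+, F-, As (arbitrary units)
data = np.array([[10.0, 0.2, 1.0],
                 [12.0, 0.3, 1.2],
                 [40.0, 1.5, 8.0],
                 [42.0, 1.6, 8.5]])

z = (data - data.mean(axis=0)) / data.std(axis=0)   # standardise columns
U, s, Vt = np.linalg.svd(z, full_matrices=False)
scores = U * s                                      # PC scores per sample

explained = s ** 2 / np.sum(s ** 2)
print(round(float(explained[0]), 3))   # first PC dominates this toy table
```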

  18. Multivariate Statistical Process Control

    DEFF Research Database (Denmark)

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it has become a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim...... is to identify the “out-of-control” state of a process using control charts in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high dimensional data with excessive...
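The Hotelling T² statistic mentioned above is straightforward to compute: with mean and covariance estimated from an in-control reference sample, each new observation x is scored as (x − x̄)ᵀ S⁻¹ (x − x̄). The data below are synthetic.

```python
# Sketch of the Hotelling T^2 monitoring statistic with mean and
# covariance estimated from in-control ("phase I") reference data.
import numpy as np

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, size=(200, 3))   # in-control phase-I data

mean = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def t2(x):
    delta = x - mean
    return float(delta @ cov_inv @ delta)

in_control = np.zeros(3)
out_of_control = np.array([4.0, -4.0, 4.0])
print(t2(in_control) < t2(out_of_control))        # -> True
```

In a chart, T² values would be compared against a control limit derived from the F distribution; that calibration is omitted here.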

  19. A Robust and Effective Multivariate Post-processing approach: Application on North American Multi-Model Ensemble Climate Forecast over the CONUS

    Science.gov (United States)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2017-04-01

    The North American Multi-model Ensemble (NMME) forecasting system has been providing valuable information using a large number of contributing models, each consisting of several ensemble members. Despite all the potential benefits that the NMME offers, the forecasts are prone to bias in many regions. In this study, monthly precipitation from 11 contributing models totaling 128 ensemble members in the NMME is assessed and bias corrected. All the models are regridded to 0.5 degree spatial resolution for a more detailed assessment. The goals of this study are as follows: 1. Evaluating the performance of the NMME models over the Contiguous United States using probabilistic and deterministic measures. 2. Introducing the Copula-based ensemble post-processing (COP-EPP) method, rooted in Bayesian methods, for conditioning the forecast on the observations to improve the performance of NMME predictions. 3. Comparing the forecast skill of the NMME at four different lead times (lead-0 to lead-3) across the western US, and assessing the effectiveness of COP-EPP in post-processing of precipitation forecasts. Results revealed that the NMME models are highly biased in the central and western US, while they provide acceptable performance in the eastern regions. The new approach demonstrates substantial improvement over the raw NMME forecasts, and regional assessment indicates that COP-EPP is superior to the commonly used Quantile Matching (QM) approach. Also, this method shows considerable improvement in the seasonal NMME forecasts at all lead times.
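The Quantile Matching baseline that COP-EPP is compared against is simple to sketch: each forecast value is mapped to the observed climatology's value at the same empirical quantile, which removes distributional bias. The biased "forecast" and "observations" below are synthetic.

```python
# Sketch of empirical quantile-matching bias correction: look up each
# forecast's quantile in the model climatology, then read off the observed
# climatology at that quantile. Data synthetic.
import numpy as np

rng = np.random.default_rng(2)
observed = rng.gamma(shape=2.0, scale=10.0, size=5000)   # observed precip
forecast = observed * 0.5 + 5.0                          # biased model output

def quantile_match(x, model_clim, obs_clim):
    # Empirical CDF position of x in the model climatology ...
    q = np.searchsorted(np.sort(model_clim), x) / len(model_clim)
    # ... mapped onto the observed climatology at the same quantile.
    return np.quantile(obs_clim, np.clip(q, 0.0, 1.0))

corrected = quantile_match(forecast, forecast, observed)
print("raw mean bias:", round(float(forecast.mean() - observed.mean()), 2))
print("after QM:     ", round(float(corrected.mean() - observed.mean()), 2))
```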

  20. Oil condition monitoring of gears onboard ships using a regression approach for multivariate T2 control charts

    DEFF Research Database (Denmark)

    Henneberg, Morten; Jørgensen, Bent; Eriksen, René Lynge

    2016-01-01

    In this paper, we present an oil condition and wear debris evaluation method for ship thruster gears using T2 statistics to form control charts from a multi-sensor platform. The proposed method takes the different ambient conditions into account by multiple linear regression on the mean value...... as a substitute for the usual empirical mean value. This regression approach accounts for the bias imposed on the empirical mean value by geographical and seasonal differences in the multi-sensor inputs. Data from a gearbox are used to evaluate the length of the run-in period in order to ensure...... only quasi-stationary data are included in phase I of the T2 statistics. Data from two thruster gears onboard two different ships are presented and analyzed, and the selection of the phase I data size is discussed. A graphic overview for quick localization of T2 signaling is also demonstrated using
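The regression idea above can be sketched in isolation: rather than centring each sensor reading on a global empirical mean, predict its expected level from ambient conditions by linear regression and monitor the residuals. The single "sea temperature" covariate and wear readings below are invented for illustration.

```python
# Sketch of regression-adjusted centring for T2-style monitoring: fit the
# sensor's expected level as a linear function of an ambient covariate,
# then work with residuals instead of deviations from a global mean.
import numpy as np

rng = np.random.default_rng(3)
temp = rng.uniform(5.0, 30.0, size=300)             # ambient condition
wear = 2.0 + 0.1 * temp + rng.normal(0, 0.05, 300)  # sensor tracks temperature

# Fit wear ~ b0 + b1 * temp on run-in ("phase I") data.
X = np.column_stack([np.ones_like(temp), temp])
beta, *_ = np.linalg.lstsq(X, wear, rcond=None)

# Centring on the regression mean removes the ambient-driven variation
# that a global empirical mean would leave in the monitored statistic.
global_resid = wear - wear.mean()
regr_resid = wear - X @ beta
print(regr_resid.var() < global_resid.var())        # -> True
```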

  1. Nondestructive Damage Assessment of Composite Structures Based on Wavelet Analysis of Modal Curvatures: State-of-the-Art Review and Description of Wavelet-Based Damage Assessment Benchmark

    Directory of Open Access Journals (Sweden)

    Andrzej Katunin

    2015-01-01

    Full Text Available The application of composite structures as elements of machines and vehicles working under various operational conditions causes degradation and the occurrence of damage. Considering that composites are often used for critical elements, for example, parts of aircraft and other vehicles, it is extremely important to maintain them properly and to detect, localize, and identify damage occurring during their operation at the earliest possible stage of its development. Of the great variety of nondestructive testing methods developed to date, the vibration-based methods are among the least expensive and, with appropriate processing of measurement data, simultaneously effective. Over the last decades, wavelet analysis has gained great popularity in vibration-based structural testing due to its high sensitivity to damage. This paper presents an overview of the results of numerous researchers working in the area of vibration-based damage assessment supported by wavelet analysis, and a detailed description of the Wavelet-based Structural Damage Assessment (WavStructDamAs) Benchmark, which summarizes the author's 5-year research in this area. The benchmark covers example problems of damage identification in various composite structures with various damage types using numerous wavelet transforms and supporting tools. The benchmark is openly available and allows performing the analysis on the example problems as well as on one's own problems using the available analysis tools.
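The core sensitivity of wavelet analysis to damage can be shown on a toy signal: a smooth modal "curvature" with a small local perturbation (the damage) produces a clear spike in the Haar detail coefficients at the damage location. The numbers are synthetic; real assessments use measured mode shapes and a variety of wavelets.

```python
# Toy illustration of wavelet-based damage localisation: Haar detail
# coefficients of a mode-shape curvature spike where a local perturbation
# (simulated stiffness loss) breaks the signal's smoothness.
import numpy as np

n = 64
x = np.linspace(0, np.pi, n)
curvature = np.sin(x)            # healthy mode-shape curvature
curvature[40] += 0.2             # local damage at sample 40

# One-level Haar detail coefficients (scaled pairwise differences).
detail = (curvature[0::2] - curvature[1::2]) / np.sqrt(2.0)

print(int(np.argmax(np.abs(detail))))   # -> 20 (the pair covering sample 40)
```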

  2. Wavelet-based peak detection and a new charge inference procedure for MS/MS implemented in ProteoWizard's msConvert.

    Science.gov (United States)

    French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L

    2015-02-06

    We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.
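The core of wavelet-based peak picking can be sketched briefly: correlate the spectrum with a Mexican-hat (Ricker) wavelet whose width matches the expected peak width, then report local maxima of the response above a threshold. CantWaiT itself is more elaborate; this only shows the underlying idea on a synthetic spectrum.

```python
# Sketch of wavelet-based peak detection: convolve with a Ricker wavelet
# and keep thresholded local maxima. Spectrum and peak positions synthetic.
import numpy as np

def ricker(points, a):
    t = np.arange(points) - (points - 1) / 2.0
    return (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

rng = np.random.default_rng(4)
mz = np.arange(200)
spectrum = rng.normal(0.0, 0.05, size=200)          # baseline noise
for center in (50, 120):                            # two true peaks
    spectrum += np.exp(-((mz - center) ** 2) / (2 * 2.0 ** 2))

response = np.convolve(spectrum, ricker(21, 2.0), mode="same")
peaks = [i for i in range(1, 199)
         if response[i] > 0.5
         and response[i] >= response[i - 1]
         and response[i] >= response[i + 1]]
print(peaks)
```

The wavelet response suppresses broadband noise, so the thresholded maxima land on (or immediately next to) the true peak centres.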

  3. The evolution of spillover effects between oil and stock markets across multi-scales using a wavelet-based GARCH-BEKK model

    Science.gov (United States)

    Liu, Xueyong; An, Haizhong; Huang, Shupei; Wen, Shaobo

    2017-01-01

    Aiming to investigate the evolution of mean and volatility spillovers between oil and stock markets in the time and frequency dimensions, we employed WTI crude oil prices, the S&P 500 (USA) index and the MICEX index (Russia) for the period Jan. 2003-Dec. 2014 as sample data. We first applied a wavelet-based GARCH-BEKK method to examine the spillover features in the frequency dimension. To consider the evolution of spillover effects in the time dimension at multiple scales, we then divided the full sample period into three sub-periods: the pre-crisis period, the crisis period, and the post-crisis period. The results indicate that spillover effects vary across wavelet scales in terms of strength and direction. By analyzing the time-varying linkages, we found different evolution features of the spillover effects between the oil-US and oil-Russia stock markets. The spillover relationship between oil and the US stock market is shifting to the short term, while the spillover relationship between oil and the Russian stock market is extending to all time scales. This implies that the linkage between oil and the US stock market is weakening in the long term, while the linkage between oil and the Russian stock market is tightening across all time scales. This may explain the phenomenon that the US and Russian stock indices showed opposite trends as the oil price fell in the post-crisis period.
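The frequency-dimension part of such an analysis can be sketched without the GARCH-BEKK estimation itself: decompose two return series into Haar detail scales and compare their correlation scale by scale. The returns below are synthetic, built so that the two series share only a slow-moving component.

```python
# Sketch of scale-wise co-movement: multi-level Haar decomposition of two
# return series, with correlation computed per scale. Data synthetic: a
# shared slow component plus independent noise.
import numpy as np

rng = np.random.default_rng(6)
n = 512
slow = np.repeat(rng.normal(size=n // 16), 16)   # common long-run component
oil = slow + rng.normal(size=n)
stock = slow + rng.normal(size=n)

def haar_details(x, levels=4):
    details, a = [], np.asarray(x, float)
    for _ in range(levels):
        details.append((a[0::2] - a[1::2]) / np.sqrt(2.0))
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)
    return details, a                             # details + final approximation

d_oil, a_oil = haar_details(oil)
d_stock, a_stock = haar_details(stock)

for lvl, (do, ds) in enumerate(zip(d_oil, d_stock), start=1):
    print("detail level", lvl, round(float(np.corrcoef(do, ds)[0, 1]), 2))
print("approximation ", round(float(np.corrcoef(a_oil, a_stock)[0, 1]), 2))
```

By construction, the detail (short-run) scales are nearly uncorrelated while the coarse approximation, which carries the shared component, is strongly correlated.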

  4. Approximation by Multivariate Singular Integrals

    CERN Document Server

    Anastassiou, George A

    2011-01-01

    Approximation by Multivariate Singular Integrals is the first monograph to illustrate the approximation of multivariate singular integrals to the identity-unit operator. The basic approximation properties of the general multivariate singular integral operators are presented quantitatively; in particular, special cases such as the multivariate Picard, Gauss-Weierstrass, Poisson-Cauchy and trigonometric singular integral operators are examined thoroughly. This book studies the rate of convergence of these operators to the unit operator as well as the related simultaneous approximation. The last cha

  5. An efficient approach to identify different chemical markers between fibrous root and rhizome of Anemarrhena asphodeloides by ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry with multivariate statistical analysis.

    Science.gov (United States)

    Wang, Fang-Xu; Yuan, Jian-Chao; Kang, Li-Ping; Pang, Xu; Yan, Ren-Yi; Zhao, Yang; Zhang, Jie; Sun, Xin-Guang; Ma, Bai-Ping

    2016-09-10

    An ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry approach coupled with multivariate statistical analysis was established and applied to rapidly distinguish the chemical differences between the fibrous root and the rhizome of Anemarrhena asphodeloides. The datasets of tR-m/z pairs, ion intensity and sample code were processed by principal component analysis and orthogonal partial least squares discriminant analysis. Chemical markers could be identified based on their exact mass data, fragmentation characteristics, and retention times. New compounds among the chemical markers could then be rapidly isolated, guided by ultra high-performance liquid chromatography quadrupole time-of-flight tandem mass spectrometry, and their definitive structures further elucidated by NMR spectra. Using this approach, twenty-four markers were identified online, including nine new saponins, and five of the new steroidal saponins were obtained in pure form. The study validated the proposed approach as a suitable method for identifying the chemical differences between various medicinal parts, in order to expand the usable medicinal parts and increase the utilization rate of resources.

  6. Low rank Multivariate regression

    CERN Document Server

    Giraud, Christophe

    2010-01-01

    We consider in this paper the multivariate regression problem, when the target regression matrix $A$ is close to a low rank matrix. Our primary interest is in the practical case where the variance of the noise is unknown. Our main contribution is to propose in this setting a criterion to select among a family of low rank estimators and to prove a non-asymptotic oracle inequality for the resulting estimator. We also investigate the easier case where the variance of the noise is known, and outline that the penalties appearing in our criteria are minimal (in some sense). These penalties involve the expected value of the Ky-Fan quasi-norm of certain random matrices. These quantities can be evaluated easily in practice, and upper bounds can be derived from recent results in random matrix theory.
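The family of low rank estimators the criterion selects among can be sketched as SVD truncations of the ordinary least squares estimate. The paper's contribution is the selection criterion under unknown noise variance; the toy below simply sweeps the rank and checks that the true rank fits best on synthetic data.

```python
# Sketch of low-rank multivariate regression: estimate A in Y = X A + E
# by truncating the SVD of the OLS estimate at a candidate rank.
import numpy as np

rng = np.random.default_rng(5)
n, p, q, true_rank = 100, 8, 6, 2
A = rng.normal(size=(p, true_rank)) @ rng.normal(size=(true_rank, q))
X = rng.normal(size=(n, p))
Y = X @ A + 0.1 * rng.normal(size=(n, q))

A_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

def truncate(B, r):
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

err = {r: float(np.linalg.norm(truncate(A_ols, r) - A)) for r in (1, 2, 6)}
print(min(err, key=err.get))   # expect 2: the true rank gives the smallest error
```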

  7. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...

  8. Multivariate Model for Test Response Analysis

    NARCIS (Netherlands)

    Krishnan, Shaji; Krishnan, Shaji; Kerkhoff, Hans G.

    2010-01-01

    A systematic approach to construct an effective multivariate test response model for capturing manufacturing defects in electronic products is described. The effectiveness of the model is demonstrated by its capability in reducing the number of test-points, while achieving the maximal coverage

  11. Multivariate statistical methods a primer

    CERN Document Server

    Manly, Bryan FJ

    2004-01-01

    THE MATERIAL OF MULTIVARIATE ANALYSIS: Examples of Multivariate Data; Preview of Multivariate Methods; The Multivariate Normal Distribution; Computer Programs; Graphical Methods; Chapter Summary; References. MATRIX ALGEBRA: The Need for Matrix Algebra; Matrices and Vectors; Operations on Matrices; Matrix Inversion; Quadratic Forms; Eigenvalues and Eigenvectors; Vectors of Means and Covariance Matrices; Further Reading; Chapter Summary; References. DISPLAYING MULTIVARIATE DATA: The Problem of Displaying Many Variables in Two Dimensions; Plotting Index Variables; The Draftsman's Plot; The Representation of Individual Data Points; Profiles o

  12. A new wavelet-based reconstruction algorithm for twin image removal in digital in-line holography

    Science.gov (United States)

    Hattay, Jamel; Belaid, Samir; Aguili, Taoufik; Lebrun, Denis

    2016-07-01

    Two original methods are proposed here for digital in-line hologram processing. First, we propose an entropy-based method to retrieve the focus plane, which is very useful for digital hologram reconstruction. Second, we introduce a new approach to remove the so-called twin images that arise in hologram reconstruction. This is achieved by means of the Blind Source Separation (BSS) technique. The proposed method is made up of two steps: an Adaptive Quincunx Lifting Scheme (AQLS) and a statistical unmixing algorithm. The AQLS tool is based on the wavelet packet transform, and its role is to maximize the sparseness of the input holograms. The unmixing algorithm uses the Independent Component Analysis (ICA) tool. Experimental results confirm the ability of convolutive blind source separation to discard the unwanted twin image from in-line digital holograms.

  13. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their spec....... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances....

  14. Wavelet-based automatic cry recognition system for detecting infants with hearing-loss from normal infants

    Directory of Open Access Journals (Sweden)

    Mahmoud Mansouri Jam

    2013-11-01

    Full Text Available Infant cry is a multimodal and dynamic behaviour that contains a great deal of information. The goal of this investigation is the recognition of two groups of infants by means of a new acoustic feature that has not previously been used in infant cry classification. The cries of deaf infants and of normal-hearing infants are studied. 'Mel filter-bank discrete wavelet coefficients (MFDWCs)' were extracted as the feature vector. Infant cry classification is a pattern recognition problem akin to 'automatic speech recognition'; in the signal processing stage the authors performed pre-processing comprising silence elimination, filtering, pre-emphasis, and segmentation. After applying the discrete wavelet transform to the Mel-scaled log filter bank energies of the cry signal frames, the MFDWC feature vector was extracted. Since the MFDWC feature vector of each cry sample is long, principal component analysis was used to reduce the feature space dimension; after training a neural network as classifier, a 93.2% correct recognition rate was achieved on the test data set. This result shows better efficiency in comparison with previous approaches.
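The MFDWC feature idea can be sketched end to end on one frame: take log filter-bank energies on a Mel-like scale, then apply a discrete wavelet transform (here one Haar level) where MFCCs would use a DCT. The 8 triangular filters below are schematic, not a calibrated Mel bank, and the "cry" frame is a plain sine.

```python
# Rough sketch of MFDWC extraction for a single frame: power spectrum ->
# schematic Mel-like triangular filter bank -> log energies -> one-level
# Haar DWT. Filter placement and the frame itself are illustrative only.
import numpy as np

fs = 8000
frame = np.sin(2 * np.pi * 440 * np.arange(256) / fs)   # synthetic frame
power = np.abs(np.fft.rfft(frame * np.hamming(256))) ** 2

# Schematic "Mel" bank: 8 triangular filters with log-spaced centres.
freqs = np.fft.rfftfreq(256, 1 / fs)
centres = np.geomspace(100.0, 3500.0, 10)
bank = np.zeros((8, len(freqs)))
for i in range(8):
    lo, mid, hi = centres[i], centres[i + 1], centres[i + 2]
    up = (freqs - lo) / (mid - lo)
    down = (hi - freqs) / (hi - mid)
    bank[i] = np.clip(np.minimum(up, down), 0.0, None)

log_e = np.log(bank @ power + 1e-12)                    # log filter-bank energies
mfdwc = np.concatenate([(log_e[0::2] + log_e[1::2]) / np.sqrt(2.0),
                        (log_e[0::2] - log_e[1::2]) / np.sqrt(2.0)])
print(mfdwc.shape)   # -> (8,)
```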

  15. Bayesian wavelet-based image deconvolution: a GEM algorithm exploiting a class of heavy-tailed priors.

    Science.gov (United States)

    Bioucas-Dias, José M

    2006-04-01

    Image deconvolution is formulated in the wavelet domain under the Bayesian framework. The well-known sparsity of the wavelet coefficients of real-world images is modeled by heavy-tailed priors belonging to the Gaussian scale mixture (GSM) class; i.e., priors given by a linear (finite or infinite) combination of Gaussian densities. This class includes, among others, the generalized Gaussian, the Jeffreys, and the Gaussian mixture priors. Necessary and sufficient conditions are stated under which the prior induced by a thresholding/shrinking denoising rule is a GSM. This result is then used to show that the prior induced by the "nonnegative garrote" thresholding/shrinking rule, herein termed the garrote prior, is a GSM. To compute the maximum a posteriori estimate, we propose a new generalized expectation maximization (GEM) algorithm, where the missing variables are the scale factors of the GSM densities. The maximization step of the underlying expectation maximization algorithm is replaced with a linear stationary second-order iterative method. The result is a GEM algorithm of O(N log N) computational complexity. In a series of benchmark tests, the proposed approach outperforms or performs similarly to state-of-the-art methods, demanding comparable (in some cases, much less) computational complexity.
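    The "nonnegative garrote" rule mentioned above has a simple closed form: coefficients inside the threshold are zeroed, and coefficients outside are shrunk toward zero by a factor that vanishes as the coefficient grows. A minimal sketch (threshold value illustrative):

```python
def garrote(y, t):
    """Nonnegative-garrote shrinkage of a wavelet coefficient y with
    threshold t: zero on [-t, t], y * (1 - t^2 / y^2) outside."""
    if abs(y) <= t:
        return 0.0
    return y * (1.0 - (t * t) / (y * y))
```

    Unlike hard thresholding, the garrote is continuous at the threshold; unlike soft thresholding, large coefficients are nearly unbiased, which is what makes its induced prior heavy-tailed.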

  16. Schmidt decomposition and multivariate statistical analysis

    Science.gov (United States)

    Bogdanov, Yu. I.; Bogdanova, N. A.; Fastovets, D. V.; Luckichev, V. F.

    2016-12-01

    A new method of multivariate data analysis, based on complementing a classical probability distribution to a quantum state and on the Schmidt decomposition, is presented. We consider the application of the Schmidt formalism to problems of statistical correlation analysis. The correlation of photons in the beam-splitter output channels, when the input photon statistics are given by a compound Poisson distribution, is examined. The developed formalism allows us to analyze multidimensional systems, and we have obtained analytical formulas for the Schmidt decomposition of multivariate Gaussian states. It is shown that the mathematical tools of quantum mechanics can significantly improve classical statistical analysis. The presented formalism is a natural approach to the analysis of both classical and quantum multivariate systems and can be applied in various tasks associated with the study of dependences.
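    For a bivariate case, the Schmidt decomposition of the "amplitude" obtained from a joint distribution reduces to a singular value decomposition. The toy 2x2 example below (values and variable names are illustrative, not from the paper) shows the idea: Schmidt coefficients are the singular values, and their participation ratio measures correlation.

```python
import numpy as np

# Joint distribution p[i, j] of two binary variables (toy values)
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
psi = np.sqrt(p / p.sum())           # amplitude matrix of the "quantum state"
u, s, vt = np.linalg.svd(psi)        # s holds the Schmidt coefficients
lam = s ** 2                         # Schmidt probabilities, sum to 1
schmidt_number = 1.0 / np.sum(lam ** 2)  # 1 for independent, >1 if correlated
```

    For a product (independent) distribution the amplitude matrix has rank one, so a single Schmidt coefficient equals 1 and the Schmidt number is exactly 1.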

  17. The Study of a Periodic Wavelet Based Numerical Method Applied to Burgers' Equation

    Institute of Scientific and Technical Information of China (English)

    许伯强; 田立新

    2001-01-01

    This paper studies the numerical solution of the Galerkin projection onto a periodic wavelet basis of the Burgers partial differential equation with periodic boundary conditions. Based on the orthogonal transformation of the periodic spline wavelets within each scale and on the symmetry of Burgers' equation, the nonlinear Burgers equation is reduced to a system of ODEs and its numerical solution is obtained. In phase space, an analysis is given of combinations of wavelets which represent 'global' functions. It is shown that the local modes of the numerical solution based on periodic wavelets are more distinguishable than those of Fourier modes. This study provides a foundation for further work in which wavelet bases are used to extract local modes of nonlinear evolution equations.

  18. Some New Approaches to Multivariate Probability Distributions.

    Science.gov (United States)

    1986-12-01

    Forte, B. (1985). Mutual dependence of random variables and maximum discretized entropy, Ann. Prob., 13, 630-637. ... Billingsley, P. (1968) ... characterizations of distributions, such as the Marshall-Olkin bivariate distribution or Fréchet's multivariate distribution with continuous marginals ... problem mentioned in Remark 8. He has given in this context a uniqueness theorem in the bivariate case under certain assumptions. The following

  19. Multivariate statistics exercises and solutions

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    The authors present tools and concepts of multivariate data analysis by means of exercises and their solutions. The first part is devoted to graphical techniques. The second part deals with multivariate random variables and presents the derivation of estimators and tests for various practical situations. The last part introduces a wide variety of exercises in applied multivariate data analysis. The book demonstrates the application of simple calculus and basic multivariate methods in real life situations. It contains altogether more than 250 solved exercises which can assist a university teacher in setting up a modern multivariate analysis course. All computer-based exercises are available in the R language. All R codes and data sets may be downloaded via the quantlet download center  www.quantlet.org or via the Springer webpage. For interactive display of low-dimensional projections of a multivariate data set, we recommend GGobi.

  20. Analysis of multivariate extreme intakes of food chemicals

    NARCIS (Netherlands)

    Paulo, M.J.; Voet, van der H.; Wood, J.C.; Marion, G.R.; Klaveren, van J.D.

    2006-01-01

    A recently published multivariate Extreme Value Theory (EVT) model [Heffernan, J.E., Tawn, J.A., 2004. A conditional approach for multivariate extreme values (with discussion). Journal of the Royal Statistical Society Series B 66 (3), 497–546] is applied to the estimation of population risks associa

  1. Multivariate Bonferroni-type inequalities theory and applications

    CERN Document Server

    Chen, John

    2014-01-01

    Multivariate Bonferroni-Type Inequalities: Theory and Applications presents a systematic account of research discoveries on multivariate Bonferroni-type inequalities published in the past decade. The emergence of new bounding approaches pushes the conventional definitions of optimal inequalities and demands new insights into linear and Fréchet optimality. The book explores these advances in bounding techniques with corresponding innovative applications. It presents the method of linear programming for multivariate bounds, multivariate hybrid bounds, sub-Markovian bounds, and bounds using Hamil

  2. A New Model of Peaks over Threshold for Multivariate Extremes

    Institute of Scientific and Technical Information of China (English)

    罗耀; 朱良生

    2014-01-01

    The peaks over threshold (POT) methods are used for the univariate and multivariate extreme value analyses of the wave and wind records collected from a hydrometric station in the South China Sea. A new multivariate POT method, the multivariate GPD (MGPD) model, is proposed, which can be built easily from established parametric models and is a natural distribution for multivariate POT methods. A joint threshold selection approach is used in the MGPD model as well. Finally, sensitivity analyses are carried out to calculate the return values of the base shear, and two declustering schemes are compared in this study.
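    The univariate building block of any POT analysis is a generalized Pareto (GPD) fit to threshold exceedances. A minimal method-of-moments sketch is shown below (the MGPD model itself is multivariate and not reproduced here; the fitting method and data are illustrative):

```python
import statistics

def gpd_moment_fit(data, threshold):
    """Method-of-moments GPD fit to exceedances over `threshold`.

    With exceedance mean m and variance v, the moment estimators are
    shape xi = (1 - m^2/v) / 2 and scale sigma = m (m^2/v + 1) / 2.
    Returns (xi, sigma, number_of_exceedances).
    """
    exc = [x - threshold for x in data if x > threshold]
    m = statistics.mean(exc)
    v = statistics.variance(exc)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma, len(exc)
```

    In practice maximum likelihood is usually preferred, but the moment fit is a useful starting point and sanity check.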

  3. SU-F-BRB-12: A Novel Haar Wavelet Based Approach to Deliver Non-Coplanar Intensity Modulated Radiotherapy Using Sparse Orthogonal Collimators

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, D; Ruan, D; Low, D; Sheng, K [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, CA (United States); O’Connor, D [Department of Mathematics, University of California Los Angeles, Los Angeles, CA (United States); Boucher, S [RadiaBeam Technologies, Santa Monica, CA (United States)

    2015-06-15

    Purpose: Existing efforts to replace complex multileaf collimator (MLC) by simple jaws for intensity modulated radiation therapy (IMRT) resulted in unacceptable compromise in plan quality and delivery efficiency. We introduce a novel fluence map segmentation method based on compressed sensing for plan delivery using a simplified sparse orthogonal collimator (SOC) on the 4π non-coplanar radiotherapy platform. Methods: 4π plans with varying prescription doses were first created by automatically selecting and optimizing 20 non-coplanar beams for 2 GBM, 2 head & neck, and 2 lung patients. To create deliverable 4π plans using SOC, which are two pairs of orthogonal collimators with 1 to 4 leaves in each collimator bank, a Haar Fluence Optimization (HFO) method was used to regulate the number of Haar wavelet coefficients while maximizing the dose fidelity to the ideal prescription. The plans were directly stratified utilizing the optimized Haar wavelet rectangular basis. A matching number of deliverable segments were stratified for the MLC-based plans. Results: Compared to the MLC-based 4π plans, the SOC-based 4π plans increased the average PTV dose homogeneity from 0.811 to 0.913. PTV D98 and D99 were improved by 3.53% and 5.60% of the corresponding prescription doses. The average mean and maximal OAR doses slightly increased by 0.57% and 2.57% of the prescription doses. The average number of segments ranged between 5 and 30 per beam. The collimator travel time to create the segments decreased with increasing leaf numbers in the SOC. The two and four leaf designs were 1.71 and 1.93 times more efficient, on average, than the single leaf design. Conclusion: The innovative dose domain optimization based on compressed sensing enables uncompromised 4π non-coplanar IMRT dose delivery using simple rectangular segments that are deliverable using a sparse orthogonal collimator, which only requires 8 to 16 leaves yet is unlimited in modulation resolution. 
This work is supported in part by Varian Medical Systems, Inc. and NIH R43 CA18339.
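    The Haar regularization above exploits the fact that fluence maps built from rectangular segments compress into very few 2-D Haar coefficients. The sketch below (a toy 4x4 map; function names and the keep-K rule are illustrative, not the HFO algorithm) shows a one-level 2-D Haar transform and sparsification of the coefficients:

```python
import numpy as np

def haar2(a):
    """One-level 2-D Haar transform of an array with even dimensions."""
    # transform along columns of each row pair of samples
    lo = (a[:, ::2] + a[:, 1::2]) / np.sqrt(2.0)
    hi = (a[:, ::2] - a[:, 1::2]) / np.sqrt(2.0)
    a = np.hstack([lo, hi])
    # then along rows
    lo = (a[::2, :] + a[1::2, :]) / np.sqrt(2.0)
    hi = (a[::2, :] - a[1::2, :]) / np.sqrt(2.0)
    return np.vstack([lo, hi])

# Piecewise-rectangular toy fluence map: deliverable with few segments
f = np.outer([1.0, 1.0, 2.0, 2.0], [1.0, 1.0, 3.0, 3.0])
c = haar2(f)

# Keep only the k largest-magnitude Haar coefficients
k = 4
thresh = np.sort(np.abs(c).ravel())[-k]
sparse = np.where(np.abs(c) >= thresh, c, 0.0)
```

    Because the map is piecewise rectangular, its energy concentrates in a handful of coefficients, so truncation loses nothing; a smooth MLC-style map would need many more.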

  4. WAVELET-BASED FREQUENCY BAND WEIGHTING APPROACH FOR IDENTIFICATION

    Institute of Scientific and Technical Information of China (English)

    宋执环; 王海清; 李平

    2001-01-01

    In recent years, control-relevant identification (CRID) has received increasing attention [1-3]; the interplay between identification and control places new demands on traditional identification methods. As L. Ljung pointed out [4], the true system and the mathematical model are two different things: the criterion for accepting a model should be its "usefulness", not its "truthfulness". Different understandings of usefulness, however, lead to different model evaluation criteria. Research on CRID shows that traditional identification methods rarely take the characteristics of control into account, so their model evaluation criteria do not match the requirements of control system design. For example, [1,2] note that the open-loop frequency response of the model need not be accurate over all frequency bands (not even the low-frequency band); it suffices for it to be accurate near the crossover frequency or within the controller's working band. Traditional identification methods cannot effectively satisfy such criteria; that is, it is difficult during identification to emphasize accurate modelling of a particular frequency band (rather than of isolated frequency points or of the whole time domain) [1,2]. In this paper, based on the discrete wavelet transform, the identification data are decomposed into different frequency bands, and weighting is then applied so that the importance of each band is reflected in the final identified model, achieving accurate modelling of the bands of interest. This identification method cooperates easily with control-system design, places modest demands on model order selection, and is suitable for small identification data sets.
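    The band-weighting idea can be sketched as follows: split a signal (for example an identification residual) into low- and high-frequency bands with a one-level Haar transform, then form a weighted sum of the band energies. The weights and function names below are illustrative, not the paper's algorithm:

```python
import math

def split_bands(x):
    """One-level Haar split of a signal into low- and high-frequency
    bands; len(x) must be even."""
    s = math.sqrt(2.0)
    low = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    high = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return low, high

def band_weighted_sse(err, w_low, w_high):
    """Weighted squared identification error: emphasise the frequency
    band that matters for the control design."""
    low, high = split_bands(err)
    return (w_low * sum(e * e for e in low)
            + w_high * sum(e * e for e in high))
```

    Minimising such a weighted criterion instead of the plain sum of squares steers the identified model toward accuracy in the chosen band, e.g. near the crossover frequency.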

  5. Multivariate Modelling via Matrix Subordination

    DEFF Research Database (Denmark)

    Nicolato, Elisa

    stochastic volatility via time-change is quite ineffective when applied to the multivariate setting. In this work we propose a new class of models, which is obtained by conditioning a multivariate Brownian Motion to a so-called matrix subordinator. The obtained model-class encompasses the vast majority...

  6. Modeling Multivariate Volatility Processes: Theory and Evidence

    Directory of Open Access Journals (Sweden)

    Jelena Z. Minovic

    2009-05-01

    Full Text Available This article presents the theoretical and empirical methodology for the estimation and modeling of multivariate volatility processes. It surveys model specifications and estimation methods. The multivariate GARCH models covered are VEC (initially due to Bollerslev, Engle and Wooldridge, 1988), diagonal VEC (DVEC), BEKK (named after Baba, Engle, Kraft and Kroner, 1995), the Constant Conditional Correlation model (CCC; Bollerslev, 1990), and the Dynamic Conditional Correlation (DCC) models of Tse and Tsui (2002) and Engle (2002). I illustrate the approach by applying it to daily data from the Belgrade stock exchange: I examine two pairs of daily log returns for stocks and an index, report the results obtained, and compare them with the restricted versions of the BEKK, DVEC and CCC representations. The estimation methods used are maximum log-likelihood (in the BEKK and DVEC models) and a two-step approach (in the CCC model).
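    As a sketch of the simplest specification above, the CCC model couples univariate GARCH(1,1) conditional variances through a constant correlation. The parameter values below are illustrative, not estimates from the Belgrade data:

```python
def garch11_sigma2(returns, omega, alpha, beta):
    """Univariate GARCH(1,1) conditional-variance recursion,
    started at the unconditional variance (requires alpha+beta < 1)."""
    s2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        s2.append(omega + alpha * r * r + beta * s2[-1])
    return s2

def ccc_covariance(r1, r2, rho, params1, params2):
    """Constant Conditional Correlation model: the conditional
    covariance is h12_t = rho * sqrt(h1_t * h2_t)."""
    h1 = garch11_sigma2(r1, *params1)
    h2 = garch11_sigma2(r2, *params2)
    return [rho * (a * b) ** 0.5 for a, b in zip(h1, h2)]
```

    The two-step estimation mentioned in the abstract fits the univariate GARCH parameters first and then estimates rho from the standardized residuals.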

  7. Random matrix theory and multivariate statistics

    OpenAIRE

    Diaz-Garcia, Jose A.; Jáimez, Ramon Gutiérrez

    2009-01-01

    Some tools and ideas are interchanged between random matrix theory and multivariate statistics. In the context of the random matrix theory, classes of spherical and generalised Wishart random matrix ensemble, containing as particular cases the classical random matrix ensembles, are proposed. Some properties of these classes of ensemble are analysed. In addition, the random matrix ensemble approach is extended and a unified theory proposed for the study of distributions for real normed divisio...

  8. Robust Evaluation of Multivariate Density Forecasts

    OpenAIRE

    Dovern, Jonas; Manner, Hans

    2016-01-01

    We derive new tests for proper calibration of multivariate density forecasts based on Rosenblatt probability integral transforms. These tests have the advantage that they i) do not depend on the ordering of variables in the forecasting model, ii) are applicable to densities of arbitrary dimensions, and iii) have superior power relative to existing approaches. We furthermore develop adjusted tests that allow for estimated parameters and, consequently, can be used as in-sample specification tes...
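    For a bivariate Gaussian forecast, the Rosenblatt probability integral transform used by these tests has a closed form: condition the second variable on the first. A minimal sketch (the forecast distribution is an assumption chosen for illustration):

```python
import math

def ncdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def rosenblatt_pit(x1, x2, rho):
    """Rosenblatt transform of (x1, x2) under a standard bivariate
    normal forecast with correlation rho. If the forecast is correctly
    calibrated, (U1, U2) are i.i.d. Uniform(0, 1)."""
    u1 = ncdf(x1)
    u2 = ncdf((x2 - rho * x1) / math.sqrt(1.0 - rho * rho))
    return u1, u2
```

    The ordering dependence the paper removes arises because conditioning x1 on x2 instead yields a different, equally valid transform.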

  9. Model Checking Multivariate State Rewards

    DEFF Research Database (Denmark)

    Nielsen, Bo Friis; Nielson, Flemming; Nielson, Hanne Riis

    2010-01-01

    We consider continuous stochastic logics with state rewards that are interpreted over continuous time Markov chains. We show how results from multivariate phase type distributions can be used to obtain higher-order moments for multivariate state rewards (including covariance). We also generalise the treatment of eventuality to unbounded path formulae. For all extensions we show how to obtain closed form definitions that are straightforward to implement, and we illustrate our development on a small example.

  10. Strategies for Industrial Multivariable Control

    DEFF Research Database (Denmark)

    Hangstrup, M.

    Multivariable control strategies well-suited for industrial applications are suggested. The strategies combine the practical advantages of conventional SISO control schemes and technology with the potential of multivariable controllers. Special emphasis is put on parameter-varying systems whose dynamics and gains strongly depend upon one or more physical parameters characterizing the operating point. This class covers many industrial systems such as airplanes, ships, robots and process control systems. Power plant boilers are representative of process control systems in general. The dynamics...

  11. Multivariate analysis in thoracic research.

    Science.gov (United States)

    Mengual-Macenlle, Noemí; Marcos, Pedro J; Golpe, Rafael; González-Rivas, Diego

    2015-03-01

    Multivariate analysis is based on the observation and analysis of more than one statistical outcome variable at a time. In design and analysis, the technique is used to perform trade studies across multiple dimensions while taking into account the effects of all variables on the responses of interest. Multivariate methods emerged to analyze large databases and increasingly complex data. Since the best way to represent knowledge of reality is modeling, we should use multivariate statistical methods. Multivariate methods are designed to analyze data sets simultaneously, i.e., to analyze different variables for each person or object studied. Keep in mind at all times that all variables must be treated in a way that accurately reflects the reality of the problem addressed. There are different types of multivariate analysis, and each should be employed according to the type of variables to be analyzed: dependence, interdependence and structural methods. In conclusion, multivariate methods are ideal for the analysis of large data sets and for finding cause-and-effect relationships between variables; there is a wide range of analysis types that we can use.

  12. Fast Filtering and Smoothing for Multivariate State Space Models

    NARCIS (Netherlands)

    Koopman, S.J.M.; Durbin, J.

    1998-01-01

    This paper gives a new approach to diffuse filtering and smoothing for multivariate state space models. The standard approach treats the observations as vectors while our approach treats each element of the observational vector individually. This strategy leads to computationally efficient methods f

  13. Transfer entropy between multivariate time series

    Science.gov (United States)

    Mao, Xuegeng; Shang, Pengjian

    2017-06-01

    Identifying the direction and strength of the interdependence between time series in multivariate systems is a crucial topic. In this paper, we propose a method of transfer entropy based on the theory of time-delay reconstruction of a phase space, which is a model-free approach to detecting causalities in multivariate time series. This method overcomes the limitation that the original transfer entropy can only capture how one system drives the transition probabilities of another in scalar time series. Using artificial time series, we show that the driving character is clearly reflected as the coupling strength between two signals increases, and we confirm the effectiveness of the method when noise is added. Furthermore, we apply it to real-world data, namely financial time series, in order to characterize the information flow among different stocks.
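    A minimal histogram version of transfer entropy for discrete series can be sketched as follows (one-step histories only; the paper's time-delay phase-space reconstruction is not reproduced here, and the estimator is a standard plug-in, not the authors' code):

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of transfer entropy T(Y -> X) in bits for
    discrete-valued series, using one-step histories:
    sum over p(x_{t+1}, x_t, y_t) of
    log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]."""
    n = len(x) - 1
    c_xxy = Counter((x[t + 1], x[t], y[t]) for t in range(n))
    c_xy = Counter((x[t], y[t]) for t in range(n))
    c_xx = Counter((x[t + 1], x[t]) for t in range(n))
    c_x = Counter(x[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), cnt in c_xxy.items():
        p = cnt / n
        te += p * math.log2((cnt / c_xy[(x0, y0)])
                            / (c_xx[(x1, x0)] / c_x[x0]))
    return te
```

    When x simply copies y with a one-step lag, knowing y_t removes all uncertainty about x_{t+1}, so the estimate is positive; for a constant x it is exactly zero.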

  14. Multivariate temporal dictionary learning for EEG.

    Science.gov (United States)

    Barthélemy, Q; Gouy-Pailler, C; Isaac, Y; Souloumiac, A; Larue, A; Mars, J I

    2013-04-30

    This article addresses the issue of representing electroencephalographic (EEG) signals in an efficient way. While classical approaches use a fixed Gabor dictionary to analyze EEG signals, this article proposes a data-driven method to obtain an adapted dictionary. To reach efficient dictionary learning, appropriate spatial and temporal modeling is required. Inter-channel links are taken into account in the spatial multivariate model, and shift-invariance is used for the temporal model. Multivariate learned kernels are informative (a few atoms code plentiful energy) and interpretable (the atoms can have a physiological meaning). Using real EEG data, the proposed method is shown to outperform the classical multichannel matching pursuit used with a Gabor dictionary, as measured by the representative power of the learned dictionary and its spatial flexibility. Moreover, dictionary learning can capture interpretable patterns: this ability is illustrated on real data, learning a P300 evoked potential.
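    The matching pursuit baseline mentioned above greedily picks, at each step, the dictionary atom most correlated with the current residual. A minimal scalar sketch is shown below (this is classic matching pursuit over a fixed list of unit-norm atoms, not the article's multivariate shift-invariant learning):

```python
def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy matching pursuit: repeatedly select the unit-norm atom
    with the largest absolute inner product with the residual, subtract
    its projection, and record (atom_index, coefficient)."""
    residual = list(signal)
    chosen = []
    for _ in range(n_atoms):
        best = max(range(len(dictionary)),
                   key=lambda k: abs(sum(r * a for r, a in
                                         zip(residual, dictionary[k]))))
        coef = sum(r * a for r, a in zip(residual, dictionary[best]))
        residual = [r - coef * a
                    for r, a in zip(residual, dictionary[best])]
        chosen.append((best, coef))
    return chosen, residual
```

    Dictionary learning replaces the fixed atoms with ones fitted to the data, which is why a few learned atoms can code most of the signal energy.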

  15. Multivariate Generalized Multiscale Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Anne Humeau-Heurtier

    2016-11-01

    Full Text Available Multiscale entropy (MSE) was introduced in the 2000s to quantify systems’ complexity. MSE relies on (i) a coarse-graining procedure to derive a set of time series representing the system dynamics on different time scales; (ii) the computation of the sample entropy for each coarse-grained time series. A refined composite MSE (rcMSE)—based on the same steps as MSE—also exists. Compared to MSE, rcMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy for short time series. The multivariate versions of MSE (MMSE) and rcMSE (MrcMSE) have also been introduced. In the coarse-graining step used in MSE, rcMSE, MMSE, and MrcMSE, the mean value is used to derive representations of the original data at different resolutions. A generalization of MSE was recently published, using the computation of different moments in the coarse-graining procedure. However, so far, this generalization only exists for univariate signals. We therefore herein propose an extension of this generalized MSE to multivariate data. The multivariate generalized algorithms of MMSE and MrcMSE presented herein (MGMSE and MGrcMSE, respectively) are first analyzed through the processing of synthetic signals. We reveal that MGrcMSE shows better performance than MGMSE for short multivariate data. We then study the performance of MGrcMSE on two sets of short multivariate electroencephalograms (EEG) available in the public domain. We report that MGrcMSE may show better performance than MrcMSE in distinguishing different types of multivariate EEG data. MGrcMSE could therefore supplement MMSE or MrcMSE in the processing of multivariate datasets.
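    The coarse-graining step that these generalized variants modify is simple: average (classical MSE) or otherwise summarize (generalized MSE) non-overlapping windows of the signal. A minimal univariate sketch (the two-moment API is an illustrative choice, not the authors' implementation):

```python
import statistics

def coarse_grain(x, scale, moment="mean"):
    """Coarse-graining step of (generalized) multiscale entropy:
    summarize non-overlapping windows of length `scale` by the mean
    (classical MSE) or the population variance (a generalized variant)."""
    stat = statistics.mean if moment == "mean" else statistics.pvariance
    return [stat(x[i:i + scale])
            for i in range(0, len(x) - scale + 1, scale)]
```

    The sample entropy of each coarse-grained series is then computed per scale; the multivariate algorithms apply the same window summarization channel by channel.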

  16. Multivariable modeling and multivariate analysis for the behavioral sciences

    CERN Document Server

    Everitt, Brian S

    2009-01-01

    Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences shows students how to apply statistical methods to behavioral science data in a sensible manner. Assuming some familiarity with introductory statistics, the book analyzes a host of real-world data to provide useful answers to real-life issues.The author begins by exploring the types and design of behavioral studies. He also explains how models are used in the analysis of data. After describing graphical methods, such as scatterplot matrices, the text covers simple linear regression, locally weighted regression, multip

  17. Fractional and multivariable calculus model building and optimization problems

    CERN Document Server

    Mathai, A M

    2017-01-01

    This textbook presents a rigorous approach to multivariable calculus in the context of model building and optimization problems. This comprehensive overview is based on lectures given at five SERC Schools from 2008 to 2012 and covers a broad range of topics that will enable readers to understand and create deterministic and nondeterministic models. Researchers, advanced undergraduate, and graduate students in mathematics, statistics, physics, engineering, and biological sciences will find this book to be a valuable resource for finding appropriate models to describe real-life situations. The first chapter begins with an introduction to fractional calculus moving on to discuss fractional integrals, fractional derivatives, fractional differential equations and their solutions. Multivariable calculus is covered in the second chapter and introduces the fundamentals of multivariable calculus (multivariable functions, limits and continuity, differentiability, directional derivatives and expansions of multivariable ...

  18. Multivariate Modelling via Matrix Subordination

    DEFF Research Database (Denmark)

    Nicolato, Elisa

    Extending the vast library of univariate models to price multi-asset derivatives is still a challenge in the field of Quantitative Finance. Within the literature on multivariate modelling, a dichotomy may be noticed. On one hand, the focus has been on the construction of models displaying...... stochastic correlation within the framework of diffusion processes (see e.g. Pigorsch and Stelzer (2008), Hubalek and Nicolato (2008) and Zhu (2000)). On the other hand a number of authors have proposed multivariate Levy models, which allow for flexible modelling of returns, but at the expense of a constant...... correlation structure (see e.g. Leoni and Schoutens (2007) among others). Tractable multivariate models displaying flexible and stochastic correlation structures combined with jumps is proving to be rather problematic. In particular, the classical technique of introducing...

  19. Multivariable q-Racah polynomials

    CERN Document Server

    Van Diejen, J F

    1996-01-01

    The Koornwinder-Macdonald multivariable generalization of the Askey-Wilson polynomials is studied for parameters satisfying a truncation condition such that the orthogonality measure becomes discrete with support on a finite grid. For this parameter regime the polynomials may be seen as a multivariable counterpart of the (one-variable) q-Racah polynomials. We present the discrete orthogonality measure, expressions for the normalization constants converting the polynomials into an orthonormal system (in terms of the normalization constant for the unit polynomial), and we discuss the limit q\\rightarrow 1 leading to multivariable Racah type polynomials. Of special interest is the situation that q lies on the unit circle; in that case it is found that there exists a natural parameter domain for which the discrete orthogonality measure (which is complex in general) becomes real-valued and positive. We investigate the properties of a finite-dimensional discrete integral transform for functions over the grid, whose ...

  20. Multivariate stochastic simulation with subjective multivariate normal distributions

    Science.gov (United States)

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
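    The standard way to honor the correlations this record argues should not be ignored is to sample through the Cholesky factor of the covariance matrix. A bivariate sketch (written out with the explicit 2x2 factor; parameters and function name are illustrative):

```python
import math
import random

def correlated_normals(rho, n, seed=0):
    """Draw n samples from a standard bivariate normal with
    correlation rho using the 2x2 Cholesky factor:
    X1 = Z1, X2 = rho*Z1 + sqrt(1 - rho^2)*Z2."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        out.append((z1, rho * z1 + math.sqrt(1.0 - rho * rho) * z2))
    return out
```

    In higher dimensions the same idea applies with a full Cholesky decomposition of the (subjectively assessed) covariance matrix.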