Constrained principal component analysis and related techniques
Takane, Yoshio
2013-01-01
In multivariate data analysis, regression techniques predict one set of variables from another, while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kinds of benefits do they provide? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches. The book begins with four concrete ...
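The regression-plus-PCA decomposition that CPCA formalizes can be sketched in a few lines of numpy: regress the data on an external design matrix, then run PCA separately on the explained and residual parts. The data and design matrix below are synthetic stand-ins; this shows only the core decomposition idea, not the full CPCA method.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))          # data matrix (samples x variables)
G = rng.normal(size=(100, 2))          # external predictors (design matrix)

X = X - X.mean(axis=0)                 # column-center the data
# Regression step: split X into a part explained by G and a residual part
P = G @ np.linalg.lstsq(G, X, rcond=None)[0]   # fitted values
E = X - P                                       # residuals

# PCA step: separate SVDs of the explained and residual parts
s_explained = np.linalg.svd(P, compute_uv=False)
s_residual = np.linalg.svd(E, compute_uv=False)

var_explained = (s_explained**2).sum() / (X**2).sum()
print(f"share of variance tied to G: {var_explained:.3f}")
```

Because P is an orthogonal projection of X onto the column space of G, the two parts add back to X exactly, and each part can then be analyzed by PCA in its own right.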
Principal Components as a Data Reduction and Noise Reduction Technique
Imhoff, M. L.; Campbell, W. J.
1982-01-01
The potential of principal components as a pipeline data reduction technique for thematic mapper data was assessed and principal components analysis and its transformation as a noise reduction technique was examined. Two primary factors were considered: (1) how might data reduction and noise reduction using the principal components transformation affect the extraction of accurate spectral classifications; and (2) what are the real savings in terms of computer processing and storage costs of using reduced data over the full 7-band TM complement. An area in central Pennsylvania was chosen for a study area. The image data for the project were collected using the Earth Resources Laboratory's thematic mapper simulator (TMS) instrument.
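The data and noise reduction idea the abstract assesses, namely transforming to principal components, keeping the leading ones, and reconstructing, can be sketched with numpy on a synthetic stand-in for multiband imagery. The band count matches the 7-band TM complement, but the signal rank and noise level below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulate a 7-band image flattened to (pixels x bands): correlated bands
# (low-rank "scene signal") plus independent sensor noise
base = rng.normal(size=(1000, 2)) @ rng.normal(size=(2, 7))
X = base + 0.1 * rng.normal(size=(1000, 7))

mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)

k = 2                                            # keep the leading components
X_denoised = (U[:, :k] * s[:k]) @ Vt[:k] + mu    # reconstruct from k PCs

retained = (s[:k]**2).sum() / (s**2).sum()
print(f"variance retained with {k} of 7 components: {retained:.3f}")
```

Storing k component images instead of 7 bands is the pipeline data reduction; discarding the trailing components, which here contain mostly noise, is the noise reduction.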
Efficacy of the Principal Components Analysis Techniques Using ...
African Journals Online (AJOL)
Second, the paper reports results of principal components analysis after the artificial data were submitted to three commonly used procedures: scree plot, Kaiser rule, and modified Horn's parallel analysis, and demonstrates the pedagogical utility of using artificial data in teaching advanced quantitative concepts. The results ...
Bro, R.; Smilde, A.K.
2014-01-01
Principal component analysis is one of the most important and powerful methods in chemometrics as well as in a wealth of other areas. This paper provides a description of how to understand, use, and interpret principal component analysis. The paper focuses on the use of principal component analysis
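A minimal numpy illustration of how PCA is computed and interpreted in practice (mean-center, SVD, scores, loadings, variance explained); the data here are random stand-ins:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))            # samples x variables

Xc = X - X.mean(axis=0)                 # mean-center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U * s                          # sample coordinates on the PCs
loadings = Vt.T                         # variable weights defining each PC
explained = s**2 / (s**2).sum()         # fraction of variance per component

# Reconstruction check: scores @ loadings.T recovers the centered data
assert np.allclose(scores @ loadings.T, Xc)
print(np.round(explained, 3))
```

The scores are what one plots to look for sample groupings, and the loadings tell which original variables drive each component; the explained fractions always sum to one and decrease from the first component onward.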
Directory of Open Access Journals (Sweden)
Erika Pittella
2017-01-01
Full Text Available In this paper, the Principal Component Analysis technique is applied on the signal measured by an ultra wide-band radar to compute the breath and heart rate of volunteers. The measurement set-up is based on an indirect time domain reflectometry technique, using an ultra wide-band antenna in contact with the subject’s thorax, at heart height, and a vector network analyzer. The Principal Component Analysis is applied on the signal reflected by the thorax, and the obtained breath frequencies are compared against measurements acquired by a piezoelectric belt, a widely used commercial system for respiratory activity monitoring. Breath frequency results show that the proposed approach is suitable for breath activity monitoring. Moreover, the wearable ultra wide-band radar also gives promising results for heart activity frequency detection.
Directory of Open Access Journals (Sweden)
Gheorghe Gîlcă
2015-06-01
Full Text Available This article deals with a recognition system using an algorithm based on the Principal Component Analysis (PCA) technique. The recognition system consists only of a PC and an integrated video camera. The algorithm is developed in MATLAB and calculates the eigenfaces considered as features of the face. The PCA technique is based on the matching between the facial test image and the training prototype vectors. The matching score between the facial test image and the training prototype vectors is calculated between their coefficient vectors, and a high matching score yields the best recognition. The results of the algorithm based on the PCA technique are very good, even if the person looks at the video camera from one side.
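The eigenface pipeline described above, computing eigenfaces from training images, representing each face by its coefficient vector, and matching by distance between coefficient vectors, can be sketched with numpy. Random vectors stand in for real face images here, and the gallery size and component count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in data: 20 "face images" of 32x32 pixels, flattened to vectors
faces = rng.normal(size=(20, 32 * 32))

mean_face = faces.mean(axis=0)
A = faces - mean_face
U, s, Vt = np.linalg.svd(A, full_matrices=False)
eigenfaces = Vt[:10]                       # top 10 eigenfaces as features

train_coeffs = A @ eigenfaces.T            # coefficient vectors of the gallery
probe = faces[7] + 0.05 * rng.normal(size=faces.shape[1])   # noisy test image
probe_coeffs = (probe - mean_face) @ eigenfaces.T

# Match by nearest coefficient vector (Euclidean distance)
match = int(np.argmin(np.linalg.norm(train_coeffs - probe_coeffs, axis=1)))
print(f"probe matched to gallery face {match}")
```

Matching in the low-dimensional coefficient space, rather than on raw pixels, is what makes the approach fast enough for a plain PC with a webcam.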
The application of principal component analysis to quantify technique in sports.
Federolf, P; Reid, R; Gilgien, M; Haugen, P; Smith, G
2014-06-01
Analyzing an athlete's "technique," sport scientists often focus on preselected variables that quantify important aspects of movement. In contrast, coaches and practitioners typically describe movements in terms of basic postures and movement components using subjective and qualitative features. A challenge for sport scientists is finding an appropriate quantitative methodology that incorporates the holistic perspective of human observers. Using alpine ski racing as an example, this study explores principal component analysis (PCA) as a mathematical method to decompose a complex movement pattern into its main movement components. Ski racing movements were recorded by determining the three-dimensional coordinates of 26 points on each skier, which were subsequently interpreted as a 78-dimensional posture vector at each time point. PCA was then used to determine the mean posture and principal movements (PMk) carried out by the athletes. The first four PMk contained 95.5 ± 0.5% of the variance in the posture vectors and quantified changes in body inclination, vertical or fore-aft movement of the trunk, and distance between skis. In summary, calculating PMk offered a data-driven, quantitative, and objective method of analyzing human movement that is similar to how human observers such as coaches or ski instructors would describe the movement. © 2012 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
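The posture-vector decomposition can be sketched as follows. A synthetic 78-dimensional posture time series driven by two underlying movement components stands in for real motion-capture data; the 95.5% threshold matches the figure quoted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 200
t = np.linspace(0, 4 * np.pi, T)

# Stand-in motion data: 78-dimensional posture vectors (26 markers x 3 coords)
# driven by two underlying movement components plus measurement noise
w1, w2 = rng.normal(size=78), rng.normal(size=78)
postures = np.outer(np.sin(t), w1) + np.outer(0.4 * np.cos(2 * t), w2)
postures += 0.01 * rng.normal(size=(T, 78))

mean_posture = postures.mean(axis=0)                 # the "mean posture"
U, s, Vt = np.linalg.svd(postures - mean_posture, full_matrices=False)

explained = np.cumsum(s**2) / (s**2).sum()
n_pm = int(np.searchsorted(explained, 0.955)) + 1
print(f"{n_pm} principal movements capture 95.5% of posture variance")
```

With two planted movement components, PCA recovers them as the two leading principal movements; on real skiing data the abstract reports four.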
Energy Technology Data Exchange (ETDEWEB)
Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyar, Melinda D [MT HOLYOKE COLLEGE
2008-01-01
Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
A simple noniterative principal component technique for rapid noise reduction in parallel MR images.
Patel, Anand S; Duan, Qi; Robson, Philip M; McKenzie, Charles A; Sodickson, Daniel K
2012-01-01
The utilization of parallel imaging permits increased MR acquisition speed and efficiency; however, parallel MRI usually leads to a deterioration in the signal-to-noise ratio when compared with otherwise equivalent unaccelerated acquisitions. At high accelerations, the parallel image reconstruction matrix tends to become dominated by one principal component. This has been utilized to enable substantial reductions in g-factor-related noise. A previously published technique achieved noise reductions via a computationally intensive search for multiples of the dominant singular vector which, when subtracted from the image, minimized joint entropy between the accelerated image and a reference image. We describe a simple algorithm that can accomplish similar results without a time-consuming search. Significant reductions in g-factor-related noise were achieved using this new algorithm with in vivo acquisitions at 1.5 T with an eight-element array. Copyright © 2011 John Wiley & Sons, Ltd.
Multiscale principal component analysis
International Nuclear Information System (INIS)
Akinduko, A A; Gorban, A N
2014-01-01
Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours the structures with large variances. This is sensitive to outliers and could obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of pairwise distances between projections for pairs of datapoints with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on point (l,u) on the plane and for each point we define projectors onto principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of the cluster which represent the structure. We also use the distortion of projections as a criterion for choosing an appropriate scale especially for data with outliers. The method was tested on both artificial and real data. For data with multiscale structures, the method was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis.
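The scale-restricted objective can be implemented directly: accumulate the scatter of differences only over point pairs whose distance falls in [l, u], then take leading eigenvectors of that scatter. The sketch below (synthetic data, brute-force pair loop) also checks the equivalence stated in the abstract: when every pair qualifies, the result coincides with ordinary PCA.

```python
import numpy as np

def multiscale_pca(X, l, u, k=1):
    """First k principal directions using only point pairs whose Euclidean
    distance lies in [l, u]; reduces to ordinary PCA if all pairs qualify."""
    S = np.zeros((X.shape[1], X.shape[1]))
    n = len(X)
    for i in range(n):
        for j in range(i + 1, n):
            d = X[i] - X[j]
            if l <= np.linalg.norm(d) <= u:
                S += np.outer(d, d)        # scatter of admitted pair differences
    vals, vecs = np.linalg.eigh(S)
    return vecs[:, ::-1][:, :k]            # leading eigenvectors

rng = np.random.default_rng(5)
X = rng.normal(size=(60, 3)) * np.array([3.0, 1.0, 0.2])

# With [l, u] wide enough to admit every pair, this reduces to standard PCA,
# since the all-pairs difference scatter equals 2n times the centered scatter
v_ms = multiscale_pca(X, 0.0, np.inf)[:, 0]
v_pca = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)[2][0]
print(f"|cosine| with ordinary PC1: {abs(float(v_ms @ v_pca)):.6f}")
```

Choosing a narrower [l, u] makes the decomposition respond only to structure at that scale, which is the point of the method.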
Euler principal component analysis
Liwicki, Stephan; Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja
Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA ...
International Nuclear Information System (INIS)
Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.
1981-01-01
A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
Hearty, Aine P; Gibney, Michael J
2009-02-01
The aims of the present study were to examine and compare dietary patterns in adults using cluster and factor analyses and to examine the format of the dietary variables on the pattern solutions (i.e. expressed as grams/day (g/d) of each food group or as the percentage contribution to total energy intake). Food intake data were derived from the North/South Ireland Food Consumption Survey 1997-9, which was a randomised cross-sectional study of 7 d recorded food and nutrient intakes of a representative sample of 1379 Irish adults aged 18-64 years. Cluster analysis was performed using the k-means algorithm and principal component analysis (PCA) was used to extract dietary factors. Food data were reduced to thirty-three food groups. For cluster analysis, the most suitable format of the food-group variable was found to be the percentage contribution to energy intake, which produced six clusters: 'Traditional Irish'; 'Continental'; 'Unhealthy foods'; 'Light-meal foods & low-fat milk'; 'Healthy foods'; 'Wholemeal bread & desserts'. For PCA, food groups in the format of g/d were found to be the most suitable format, and this revealed four dietary patterns: 'Unhealthy foods & high alcohol'; 'Traditional Irish'; 'Healthy foods'; 'Sweet convenience foods & low alcohol'. In summary, cluster and PCA identified similar dietary patterns when presented with the same dataset. However, the two dietary pattern methods required a different format of the food-group variable, and the most appropriate format of the input variable should be considered in future studies.
Nika, Varvara; Babyn, Paul; Zhu, Hongmei
2014-07-01
Automatic change detection methods for identifying the changes of serial MR images taken at different times are of great interest to radiologists. The majority of existing change detection methods in medical imaging, and those of brain images in particular, include many preprocessing steps and rely mostly on statistical analysis of magnetic resonance imaging (MRI) scans. Although most methods utilize registration software, tissue classification remains a difficult and overwhelming task. Recently, dictionary learning techniques are being used in many areas of image processing, such as image surveillance, face recognition, remote sensing, and medical imaging. We present an improved version of the EigenBlockCD algorithm, named the EigenBlockCD-2. The EigenBlockCD-2 algorithm performs an initial global registration and identifies the changes between serial MR images of the brain. Blocks of pixels from a baseline scan are used to train local dictionaries to detect changes in the follow-up scan. We use PCA to reduce the dimensionality of the local dictionaries and the redundancy of data. Choosing the appropriate distance measure significantly affects the performance of our algorithm. We examine the differences between [Formula: see text] and [Formula: see text] norms as two possible similarity measures in the improved EigenBlockCD-2 algorithm. We show the advantages of the [Formula: see text] norm over the [Formula: see text] norm both theoretically and numerically. We also demonstrate the performance of the new EigenBlockCD-2 algorithm for detecting changes of MR images and compare our results with those provided in the recent literature. Experimental results with both simulated and real MRI scans show that our improved EigenBlockCD-2 algorithm outperforms the previous methods. It detects clinical changes while ignoring the changes due to the patient's position and other acquisition artifacts.
Rampe, E. B.; Lanza, N. L.
2012-01-01
Orbital near-infrared (NIR) reflectance spectra of the martian surface from the OMEGA and CRISM instruments have identified a variety of phyllosilicates in Noachian terrains. The types of phyllosilicates present on Mars have important implications for the aqueous environments in which they formed, and, thus, for recognizing locales that may have been habitable. Current identifications of phyllosilicates from martian NIR data are based on the positions of spectral absorptions relative to laboratory data of well-characterized samples and from spectral ratios; however, some phyllosilicates can be difficult to distinguish from one another with these methods (i.e. illite vs. muscovite). Here we employ a multivariate statistical technique, principal component analysis (PCA), to differentiate between spectrally similar phyllosilicate minerals. PCA is commonly used in a variety of industries (pharmaceutical, agricultural, viticultural) to discriminate between samples. Previous work using PCA to analyze raw NIR reflectance data from mineral mixtures has shown that this is a viable technique for identifying mineral types, abundances, and particle sizes. Here, we evaluate PCA of second-derivative NIR reflectance data as a method for classifying phyllosilicates and test whether this method can be used to identify phyllosilicates on Mars.
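The second-derivative preprocessing step can be sketched with numpy: a second difference removes any linear continuum slope exactly, after which PCA separates two synthetic "minerals" whose absorption bands are offset. Band positions, depths, widths, and noise levels below are illustrative, not actual phyllosilicate values.

```python
import numpy as np

rng = np.random.default_rng(6)
wav = np.linspace(1.0, 2.5, 200)        # wavelength axis, micrometers

def band(center, width):
    return np.exp(-0.5 * ((wav - center) / width) ** 2)

# Two spectrally similar minerals: absorption bands offset near 2.2 um,
# plus random continuum slopes that confound PCA of raw reflectance
def spectrum(kind):
    core = 1 - 0.4 * band(2.20 if kind == 0 else 2.26, 0.02)
    slope = rng.uniform(-0.3, 0.3)
    return core + slope * (wav - wav.mean()) + 0.001 * rng.normal(size=wav.size)

labels = np.array([0] * 10 + [1] * 10)
X = np.array([spectrum(k) for k in labels])

# Second difference removes the linear continuum term exactly before PCA
D2 = np.diff(X, n=2, axis=1)
D2c = D2 - D2.mean(axis=0)
scores = np.linalg.svd(D2c, full_matrices=False)[0][:, :2]
sep = abs(scores[labels == 0, 0].mean() - scores[labels == 1, 0].mean())
print(f"class separation on PC1 of second-derivative spectra: {sep:.3f}")
```

On the raw reflectance the leading component would largely track the continuum slope; on the second-derivative spectra it tracks the band position, which is the discriminating feature.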
Teaching Principal Components Using Correlations.
Westfall, Peter H; Arias, Andrea L; Fulton, Lawrence V
2017-01-01
Introducing principal components (PCs) to students is difficult. First, the matrix algebra and mathematical maximization lemmas are daunting, especially for students in the social and behavioral sciences. Second, the standard motivation involving variance maximization subject to unit length constraint does not directly connect to the "variance explained" interpretation. Third, the unit length and uncorrelatedness constraints of the standard motivation do not allow re-scaling or oblique rotations, which are common in practice. Instead, we propose to motivate the subject in terms of optimizing (weighted) average proportions of variance explained in the original variables; this approach may be more intuitive, and hence easier to understand because it links directly to the familiar "R-squared" statistic. It also removes the need for unit length and uncorrelatedness constraints, provides a direct interpretation of "variance explained," and provides a direct answer to the question of whether to use covariance-based or correlation-based PCs. Furthermore, the presentation can be made without matrix algebra or optimization proofs. Modern tools from data science, including heat maps and text mining, provide further help in the interpretation and application of PCs; examples are given. Together, these techniques may be used to revise currently used methods for teaching and learning PCs in the behavioral sciences.
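The proposed motivation, PCs as optimizers of the average proportion of variance explained in the original variables, can be demonstrated numerically: the per-variable R-squared values on the first k PCs average exactly to the usual "proportion of variance explained". Synthetic correlated data stand in for a real example.

```python
import numpy as np

rng = np.random.default_rng(7)
z = rng.normal(size=(200, 1))
X = np.hstack([z + 0.3 * rng.normal(size=(200, 1)) for _ in range(4)])

Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # correlation-based PCs
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)

k = 1
fitted = (U[:, :k] * s[:k]) @ Vt[:k]            # rank-k approximation
# R-squared of each original (standardized) variable on the first k PCs
r2 = 1 - ((Xs - fitted) ** 2).sum(axis=0) / (Xs ** 2).sum(axis=0)
print(np.round(r2, 3), "average:", round(float(r2.mean()), 3))
```

Each entry of r2 is the familiar per-variable statistic, and their average equals the first component's share of total variance, which is the link the article exploits for teaching.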
Young, Cole; Reinkensmeyer, David J
2014-08-01
Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring - gross body path, splash area, and board tip motion - to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization. Copyright © 2014 Elsevier B.V. All rights reserved.
Osis, Sean T; Hettinga, Blayne A; Leitch, Jessica; Ferber, Reed
2014-08-22
As 3-dimensional (3D) motion-capture for clinical gait analysis continues to evolve, new methods must be developed to improve the detection of gait cycle events based on kinematic data. Recently, the application of principal component analysis (PCA) to gait data has shown promise in detecting important biomechanical features. Therefore, the purpose of this study was to define a new foot strike detection method for a continuum of striking techniques, by applying PCA to joint angle waveforms. In accordance with Newtonian mechanics, it was hypothesized that transient features in the sagittal-plane accelerations of the lower extremity would be linked with the impulsive application of force to the foot at foot strike. Kinematic and kinetic data from treadmill running were selected for 154 subjects, from a database of gait biomechanics. Ankle, knee and hip sagittal plane angular acceleration kinematic curves were chained together to form a row input to a PCA matrix. A linear polynomial was calculated based on PCA scores, and a 10-fold cross-validation was performed to evaluate prediction accuracy against gold-standard foot strike as determined by a 10 N rise in the vertical ground reaction force. Results show 89-94% of all predicted foot strikes were within 4 frames (20 ms) of the gold standard with the largest error being 28 ms. It is concluded that this new foot strike detection is an improvement on existing methods and can be applied regardless of whether the runner exhibits a rearfoot, midfoot, or forefoot strike pattern. Copyright © 2014 Elsevier Ltd. All rights reserved.
Parametric functional principal component analysis.
Sang, Peijun; Wang, Liangliang; Cao, Jiguo
2017-09-01
Functional principal component analysis (FPCA) is a popular approach in functional data analysis to explore major sources of variation in a sample of random curves. These major sources of variation are represented by functional principal components (FPCs). Most existing FPCA approaches use a set of flexible basis functions such as B-spline basis to represent the FPCs, and control the smoothness of the FPCs by adding roughness penalties. However, the flexible representations pose difficulties for users to understand and interpret the FPCs. In this article, we consider a variety of applications of FPCA and find that, in many situations, the shapes of top FPCs are simple enough to be approximated using simple parametric functions. We propose a parametric approach to estimate the top FPCs to enhance their interpretability for users. Our parametric approach can also circumvent the smoothing parameter selecting process in conventional nonparametric FPCA methods. In addition, our simulation study shows that the proposed parametric FPCA is more robust when outlier curves exist. The parametric FPCA method is demonstrated by analyzing several datasets from a variety of applications. © 2017, The International Biometric Society.
Principal component regression for crop yield estimation
Suryanarayana, T M V
2016-01-01
This book highlights the estimation of crop yield in Central Gujarat, especially with regard to the development of Multiple Regression Models and Principal Component Regression (PCR) models using climatological parameters as independent variables and crop yield as a dependent variable. It subsequently compares the multiple linear regression (MLR) and PCR results, and discusses the significance of PCR for crop yield estimation. In this context, the book also covers Principal Component Analysis (PCA), a statistical procedure used to reduce a number of correlated variables into a smaller number of uncorrelated variables called principal components (PC). This book will be helpful to students and researchers starting their work on climate and agriculture, mainly focussing on estimation models. The flow of chapters takes readers along a smooth path, from understanding climate, weather and the impact of climate change, through downscaling techniques, and finally to the development of ...
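Principal component regression itself is a short computation: PCA on the predictors, ordinary regression of the response on the retained scores, then a mapping back to coefficients on the original variables. The climate and yield data below are synthetic stand-ins with one shared underlying signal.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 120
temp = rng.normal(size=n)
# Strongly correlated climate predictors (e.g. related temperature measures)
X = np.column_stack([temp + 0.1 * rng.normal(size=n) for _ in range(3)])
y = 2.0 * temp + 0.2 * rng.normal(size=n)       # crop yield stand-in

Xc, yc = X - X.mean(axis=0), y - y.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 1                                   # retain only the informative PC
scores = U[:, :k] * s[:k]
gamma = np.linalg.lstsq(scores, yc, rcond=None)[0]  # regress y on PC scores
beta_pcr = Vt[:k].T @ gamma             # map back to original predictors

resid = yc - Xc @ beta_pcr
r2 = 1 - (resid**2).sum() / (yc**2).sum()
print(f"PCR with {k} component: R^2 = {r2:.3f}")
```

Because the three predictors are nearly collinear, ordinary MLR coefficients would be unstable, while regressing on the single dominant component gives a stable fit; that is the case the book makes for PCR.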
On Bayesian Principal Component Analysis
Czech Academy of Sciences Publication Activity Database
Šmídl, Václav; Quinn, A.
2007-01-01
Roč. 51, č. 9 (2007), s. 4101-4123 ISSN 0167-9473 R&D Projects: GA MŠk(CZ) 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords: Principal component analysis (PCA) * Variational Bayes (VB) * von Mises–Fisher distribution Subject RIV: BC - Control Systems Theory Impact factor: 1.029, year: 2007 http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V8V-4MYD60N-6&_user=10&_coverDate=05%2F15%2F2007&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=b8ea629d48df926fe18f9e5724c9003a
Maktabdar Oghaz, Mahdi; Maarof, Mohd Aizaini; Zainal, Anazida; Rohani, Mohd Foad; Yaghoubyan, S Hadi
2015-01-01
Color is one of the most prominent features of an image and used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance which can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. Genetic Algorithm heuristic is used to find the optimal color component combination setup in terms of skin detection accuracy while the Principal Component Analysis projects the optimal Genetic Algorithm solution to a less complex dimension. Pixel wise skin detection was used to evaluate the performance of the proposed color space. We have employed four classifiers including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron in order to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that by using Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and False Positive Rate of 0.0482 which outperformed the existing color spaces in terms of pixel wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable classifier for pixel wise skin detection applications.
Probabilistic Principal Component Analysis for Metabolomic Data.
LENUS (Irish Health Repository)
Nyamundanda, Gift
2010-11-23
Abstract Background Data from metabolomic studies are typically complex and high-dimensional. Principal component analysis (PCA) is currently the most widely used statistical technique for analyzing metabolomic data. However, PCA is limited by the fact that it is not based on a statistical model. Results Here, probabilistic principal component analysis (PPCA) which addresses some of the limitations of PCA, is reviewed and extended. A novel extension of PPCA, called probabilistic principal component and covariates analysis (PPCCA), is introduced which provides a flexible approach to jointly model metabolomic data and additional covariate information. The use of a mixture of PPCA models for discovering the number of inherent groups in metabolomic data is demonstrated. The jackknife technique is employed to construct confidence intervals for estimated model parameters throughout. The optimal number of principal components is determined through the use of the Bayesian Information Criterion model selection tool, which is modified to address the high dimensionality of the data. Conclusions The methods presented are illustrated through an application to metabolomic data sets. Jointly modeling metabolomic data and covariates was successfully achieved and has the potential to provide deeper insight to the underlying data structure. Examination of confidence intervals for the model parameters, such as loadings, allows for principled and clear interpretation of the underlying data structure. A software package called MetabolAnalyze, freely available through the R statistical software, has been developed to facilitate implementation of the presented methods in the metabolomics field.
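The model-based view of PCA that PPCA provides has a closed-form maximum-likelihood solution (due to Tipping and Bishop, which the reviewed extensions build on): the noise variance is the mean of the discarded covariance eigenvalues, and the loadings are the top eigenvectors shrunk by that noise floor. A numpy sketch on synthetic data with a known noise level:

```python
import numpy as np

rng = np.random.default_rng(9)
n, d, q = 300, 6, 2
W_true = rng.normal(size=(d, q))
X = rng.normal(size=(n, q)) @ W_true.T + 0.1 * rng.normal(size=(n, d))

# Closed-form ML estimates for PPCA (Tipping & Bishop):
#   sigma^2 = average of the (d - q) discarded eigenvalues of the covariance
#   W       = top-q eigenvectors scaled by sqrt(eigenvalue - sigma^2)
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n)
vals, vecs = vals[::-1], vecs[:, ::-1]          # sort descending

sigma2 = vals[q:].mean()                        # ML noise variance
W = vecs[:, :q] * np.sqrt(vals[:q] - sigma2)    # ML loading matrix
print(f"estimated noise variance: {sigma2:.4f} (true 0.01)")
```

Having an explicit likelihood is what enables the extensions described in the abstract: covariate adjustment (PPCCA), mixtures for group discovery, and BIC-based selection of the number of components.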
Use of Sparse Principal Component Analysis (SPCA) for Fault Detection
DEFF Research Database (Denmark)
Gajjar, Shriram; Kulahci, Murat; Palazoglu, Ahmet
2016-01-01
Principal component analysis (PCA) has been widely used for data dimension reduction and process fault detection. However, interpreting the principal components and the outcomes of PCA-based monitoring techniques is a challenging task since each principal component is a linear combination of the ...
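One simple route to sparse loadings (not necessarily the exact SPCA formulation used in the paper) is a soft-thresholded power iteration on the covariance matrix. Loadings on variables that carry no shared signal are driven exactly to zero, which is what makes the resulting component interpretable for fault diagnosis.

```python
import numpy as np

def sparse_pc1(X, alpha=0.5, n_iter=200):
    """First sparse loading vector via a soft-thresholded power method,
    a simple illustrative approach to sparse PCA."""
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(Xc)
    v = np.linalg.eigh(S)[1][:, -1]             # start from ordinary PC1
    for _ in range(n_iter):
        w = S @ v
        w = np.sign(w) * np.maximum(np.abs(w) - alpha, 0)   # soft threshold
        if np.linalg.norm(w) == 0:
            break
        v = w / np.linalg.norm(w)
    return v

rng = np.random.default_rng(10)
# Only the first 3 of 10 process variables carry the common signal
z = rng.normal(size=(500, 1))
X = np.hstack([z + 0.2 * rng.normal(size=(500, 1)) for _ in range(3)]
              + [rng.normal(size=(500, 7))])

v = sparse_pc1(X, alpha=0.5)
print(np.round(v, 2))
```

An ordinary PC1 would assign small nonzero weights to all ten variables; the sparse version names the three implicated variables outright.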
PCA: Principal Component Analysis for spectra modeling
Hurley, Peter D.; Oliver, Seb; Farrah, Duncan; Wang, Lingyu; Efstathiou, Andreas
2012-07-01
The mid-infrared spectra of ultraluminous infrared galaxies (ULIRGs) contain a variety of spectral features that can be used as diagnostics to characterize the spectra. However, such diagnostics are biased by our prior prejudices on the origin of the features. Moreover, by using only part of the spectrum they do not utilize the full information content of the spectra. Blind statistical techniques such as principal component analysis (PCA) consider the whole spectrum, find correlated features and separate them out into distinct components. This code, written in IDL, classifies principal components of IRS spectra to define a new classification scheme using 5D Gaussian mixtures modelling. The five PCs and average spectra for the four classifications to classify objects are made available with the code.
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices of multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. The paper uses an example to describe how to do principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and the operation of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. A simplified, faster and accurate statistical analysis is achieved through principal component regression with SPSS.
Digital Repository Service at National Institute of Oceanography (India)
Prerna, R.; Naidu, V.S.; Soniya, S.
PRINCIPAL COMPONENT ANALYSIS AND GEO-SPATIAL TECHNIQUES: A CASE STUDY. R. Prerna (Project Assistant II), V.S. Naidu (Senior Scientist), S. Soniya; National Institute of Oceanography, Regional Center.
Surface analysis the principal techniques
Vickerman, John C
2009-01-01
This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they can ...
Kernel principal component analysis for change detection
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg; Morton, J.C.
2008-01-01
Principal component analysis (PCA) is often used to detect change over time in remotely sensed images. A commonly used technique consists of finding the projections along the two eigenvectors for data consisting of two variables which represent the same spectral band covering the same geographical region acquired at two different time points. If change over time does not dominate the scene, the projection of the original two bands onto the second eigenvector will show change over time. In this paper a kernel version of PCA is used to carry out the analysis. Unlike ordinary PCA, kernel PCA ...
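Kernel PCA needs only a kernel matrix and a double-centering step, so a small numpy sketch suffices. Two concentric rings (a standard toy example rather than bitemporal imagery) stand in for structure that no linear projection of the two input variables could separate; the RBF bandwidth below is chosen by hand.

```python
import numpy as np

rng = np.random.default_rng(11)
# Two concentric rings in 2-D: linearly inseparable structure
theta = rng.uniform(0, 2 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 3.0)])
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
X += 0.05 * rng.normal(size=X.shape)

# RBF kernel matrix, then double-centering in feature space
sq = ((X[:, None] - X[None]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)                       # bandwidth chosen by hand
n = len(K)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

vals, vecs = np.linalg.eigh(Kc)
pc1 = vecs[:, -1] * np.sqrt(max(vals[-1], 0.0))   # first kernel PC scores

inner, outer = pc1[:100].mean(), pc1[100:].mean()
print(f"ring means on kernel PC1: {inner:.2f} vs {outer:.2f}")
```

The first kernel PC assigns the two rings clearly different score levels, which is the nonlinear analogue of reading change off an eigenvector projection in the linear case.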
Incremental Tensor Principal Component Analysis for Handwritten Digit Recognition
Directory of Open Access Journals (Sweden)
Chang Liu
2014-01-01
Full Text Available To overcome the shortcomings of traditional dimensionality reduction algorithms, an incremental tensor principal component analysis (ITPCA) algorithm based on an updated-SVD technique is proposed in this paper. This paper proves the relationship between PCA, 2DPCA, MPCA, and the graph embedding framework theoretically and derives the incremental learning procedure for adding single samples and multiple samples in detail. The experiments on handwritten digit recognition have demonstrated that ITPCA achieves better recognition performance than vector-based principal component analysis (PCA), incremental principal component analysis (IPCA), and multilinear principal component analysis (MPCA) algorithms. At the same time, ITPCA also has lower time and space complexity.
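The incremental idea can be illustrated with a much simpler vector-based sketch: a Welford-style one-sample-at-a-time update of the mean and scatter matrix reproduces the batch covariance, and hence the same principal components. This is a stand-in for intuition only, not the paper's updated-SVD tensor procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated data

# Incremental update of mean and scatter, one sample at a time.
n, mean, scatter = 0, np.zeros(5), np.zeros((5, 5))
for x in X:
    n += 1
    delta = x - mean                       # deviation from old mean
    mean += delta / n
    scatter += np.outer(delta, x - mean)   # Welford-style scatter update

cov_inc = scatter / (n - 1)
cov_batch = np.cov(X, rowvar=False)
print(np.allclose(cov_inc, cov_batch))     # same covariance, same PCs
```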
Process Knowledge Discovery Using Sparse Principal Component Analysis
DEFF Research Database (Denmark)
Gao, Huihui; Gajjar, Shriram; Kulahci, Murat
2016-01-01
have been an active area of research. Among the methods used, principal component analysis (PCA) is a well-established technique that allows for dimensionality reduction for large data sets by finding new uncorrelated variables, namely principal components (PCs). However, it is difficult to interpret...
Directory of Open Access Journals (Sweden)
Frank Peprah
2017-10-01
Full Text Available Face recognition systems employ a variety of feature extraction projection techniques, which are grouped into appearance-based and feature-based. In the vast majority of studies undertaken in the field of face recognition, special attention is given to the appearance-based methods, which represent the dominant and most popular feature extraction techniques used. Even though a number of comparative studies exist, researchers have not reached consensus within the scientific community regarding the relative ranking of the efficiency of the appearance-based methods (LDA, PCA, etc.) for the face recognition task. This paper studied two appearance-based methods (LDA and PCA) separately with three (3) distance metrics (similarity measures), namely Euclidean distance, City Block and Cosine, to ascertain which projection-metric combination was relatively more efficient in terms of the time it takes to recognise a face. The study considered the effect of varying the image data size in a training database on all the projection-metric methods implemented. The LDA-Cosine distance metric was consequently ascertained to be the most efficient when tested with two separate standard databases, the AT&T Face Database and the Indian Face Database. It was also concluded that LDA outperformed PCA.
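The three distance metrics compared in the study are simple to state directly; the two-identity "gallery" and probe vector below are hypothetical toy features, not actual face projections.

```python
import numpy as np

def euclidean(a, b):
    return np.linalg.norm(a - b)

def city_block(a, b):
    return np.abs(a - b).sum()

def cosine_distance(a, b):
    return 1.0 - (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

gallery = np.array([[1.0, 0.0], [0.0, 1.0]])   # two enrolled identities
probe = np.array([0.9, 0.1])                   # probe close to identity 0

for metric in (euclidean, city_block, cosine_distance):
    d = [metric(probe, g) for g in gallery]
    print(metric.__name__, int(np.argmin(d)))  # all three match identity 0
```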
Nonlinear principal component analysis and its applications
Mori, Yuichi; Makino, Naomichi
2016-01-01
This book expounds the principle and related applications of nonlinear principal component analysis (PCA), which is a useful method for analyzing data with mixed measurement levels. In the part dealing with the principle, after a brief introduction of ordinary PCA, a PCA for categorical data (nominal and ordinal) is introduced as nonlinear PCA, in which an optimal scaling technique is used to quantify the categorical variables. Alternating least squares (ALS) is the main algorithm in the method. Multiple correspondence analysis (MCA), a special case of nonlinear PCA, is also introduced. All formulations in these methods are integrated in the same manner as matrix operations. Because data of any measurement level can be treated consistently as numerical data and ALS is a very powerful tool for estimation, the methods can be utilized in a variety of fields such as biometrics, econometrics, psychometrics, and sociology. In the applications part of the book, four applications are introduced: variable selection for mixed...
COPD phenotype description using principal components analysis
DEFF Research Database (Denmark)
Roy, Kay; Smith, Jacky; Kolsum, Umme
2009-01-01
BACKGROUND: Airway inflammation in COPD can be measured using biomarkers such as induced sputum and Fe(NO). This study set out to explore the heterogeneity of COPD using biomarkers of airway and systemic inflammation and pulmonary function by principal components analysis (PCA). SUBJECTS...... AND METHODS: In 127 COPD patients (mean FEV1 61%), pulmonary function, Fe(NO), plasma CRP and TNF-alpha, sputum differential cell counts and sputum IL8 (pg/ml) were measured. Principal components analysis as well as multivariate analysis was performed. RESULTS: PCA identified four main components (% variance...... associations between the variables within components 1 and 2. CONCLUSION: COPD is a multi dimensional disease. Unrelated components of disease were identified, including neutrophilic airway inflammation which was associated with systemic inflammation, and sputum eosinophils which were related to increased Fe...
Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham
2018-01-01
Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions different from damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T 2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% for a
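The Q (squared prediction error) damage index mentioned here is easy to sketch for plain linear PCA; the 32-sensor baseline below is synthetic stand-in data, and the paper's hierarchical nonlinear PCA, T² index, and OBS clustering are omitted.

```python
import numpy as np

rng = np.random.default_rng(10)
# Hypothetical pristine strain field: 32 sensors driven by 3 load patterns.
loads = rng.normal(size=(300, 3))
mixing = rng.normal(size=(3, 32))
baseline = loads @ mixing + 0.01 * rng.normal(size=(300, 32))

mean = baseline.mean(axis=0)
_, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
P = Vt[:3].T                                  # retained principal subspace

def q_index(x):
    """Q statistic: squared residual off the PCA subspace."""
    r = (x - mean) - P @ (P.T @ (x - mean))
    return r @ r

healthy = baseline[0]
damaged = healthy.copy()
damaged[5] += 1.0                             # local strain anomaly at sensor 5
print(q_index(damaged) > 50 * q_index(healthy))
```

A local anomaly falls largely outside the load-driven subspace, so Q jumps while healthy readings stay near the noise floor.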
Stochastic convex sparse principal component analysis.
Baytas, Inci M; Lin, Kaixiang; Wang, Fei; Jain, Anil K; Zhou, Jiayu
2016-12-01
Principal component analysis (PCA) is a dimensionality reduction and data analysis tool commonly used in many areas. The main idea of PCA is to represent high-dimensional data with a few representative components that capture most of the variance present in the data. However, there is an obvious disadvantage of traditional PCA when it is applied to analyze data where interpretability is important. In applications, where the features have some physical meanings, we lose the ability to interpret the principal components extracted by conventional PCA because each principal component is a linear combination of all the original features. For this reason, sparse PCA has been proposed to improve the interpretability of traditional PCA by introducing sparsity to the loading vectors of principal components. The sparse PCA can be formulated as an ℓ 1 regularized optimization problem, which can be solved by proximal gradient methods. However, these methods do not scale well because computation of the exact gradient is generally required at each iteration. Stochastic gradient framework addresses this challenge by computing an expected gradient at each iteration. Nevertheless, stochastic approaches typically have low convergence rates due to the high variance. In this paper, we propose a convex sparse principal component analysis (Cvx-SPCA), which leverages a proximal variance reduced stochastic scheme to achieve a geometric convergence rate. We further show that the convergence analysis can be significantly simplified by using a weak condition which allows a broader class of objectives to be applied. The efficiency and effectiveness of the proposed method are demonstrated on a large-scale electronic medical record cohort.
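A minimal sketch of how an ℓ1 proximal (soft-threshold) step induces sparse loadings, using a deterministic truncated power iteration rather than the paper's stochastic variance-reduced Cvx-SPCA; the data and the penalty value are made up for illustration.

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

rng = np.random.default_rng(2)
# Hypothetical data: a shared factor loads only on the first three of
# ten features, so the leading loading vector should be sparse.
X = rng.normal(size=(500, 10))
X[:, :3] += 3.0 * rng.normal(size=(500, 1))
S = np.cov(X - X.mean(axis=0), rowvar=False)

# Truncated power iteration: a gradient step on v'Sv followed by the
# l1 proximal (soft-threshold) step, then renormalization.
v = np.ones(10) / np.sqrt(10)
for _ in range(100):
    v = soft_threshold(S @ v, lam=1.0)
    v /= np.linalg.norm(v)

print(np.flatnonzero(np.abs(v) > 1e-8))    # only features 0, 1, 2 load
```

Unlike a dense PCA loading, the sparse loading names exactly the features that drive the component, which is the interpretability gain the abstract describes.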
Experimental and principal component analysis of waste ...
African Journals Online (AJOL)
The present study is aimed at determining through principal component analysis the most important variables affecting bacterial degradation in ponds. Data were collected from literature. In addition, samples were also collected from the waste stabilization ponds at the University of Nigeria, Nsukka and analyzed to ...
Principal Component Analysis as an Efficient Performance ...
African Journals Online (AJOL)
This paper uses the principal component analysis (PCA) to examine the possibility of using few explanatory variables (X's) to explain the variation in Y. It applied PCA to assess the performance of students in Abia State Polytechnic, Aba, Nigeria. This was done by estimating the coefficients of eight explanatory variables in a ...
Principal component analysis of psoriasis lesions images
DEFF Research Database (Denmark)
Maletti, Gabriela Mariel; Ersbøll, Bjarne Kjær
2003-01-01
A set of RGB images of psoriasis lesions is used. By visual examination of these images, there seem to be no common pattern that could be used to find and align the lesions within and between sessions. It is expected that the principal components of the original images could be useful during future...
Principal component analysis implementation in Java
Wójtowicz, Sebastian; Belka, Radosław; Sławiński, Tomasz; Parian, Mahnaz
2015-09-01
In this paper we show how the PCA (Principal Component Analysis) method can be implemented using the Java programming language. We consider using the PCA algorithm especially for analysing data obtained from Raman spectroscopy measurements, but other applications of the developed software should also be possible. Our goal is to create a general purpose PCA application, ready to run on every platform which is supported by Java.
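The pipeline such an application implements can be sketched in a few lines (shown here in Python rather than Java, for consistency with the other examples in this collection): center, eigendecompose the covariance, sort by variance, project. The spectra below are random stand-ins.

```python
import numpy as np

def pca(X, n_components):
    """Minimal PCA: center, eigendecompose the covariance, project."""
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(evals)[::-1]              # descending variance
    components = evecs[:, order[:n_components]]
    return Xc @ components, evals[order[:n_components]]

rng = np.random.default_rng(3)
spectra = rng.normal(size=(30, 6))               # e.g. 30 spectra, 6 bins
scores, variances = pca(spectra, 2)
print(scores.shape)                              # (30, 2)
```

By construction, the sample variance of the first score column equals the largest covariance eigenvalue.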
Principal component analysis applied to remote sensing
Directory of Open Access Journals (Sweden)
Javier Estornell
2013-06-01
Full Text Available The main objective of this article is to show an application of principal component analysis (PCA) which is used in two science degrees. In particular, PCA was used to obtain land cover information from satellite images. Three Landsat images were selected from two areas located in the municipalities of Gandia and Vallat, both in the Valencia province (Spain). In the first study area, just one Landsat image from 2005 was used. In the second study area, two Landsat images taken in 1994 and 2000 were used to analyse the most significant changes in land cover. According to the results, the second principal component of the Gandia area image allowed detection of the presence of vegetation. The same component in the Vallat area allowed detection of a forestry area affected by a forest fire. Consequently, this study confirmed the feasibility of using PCA in remote sensing to extract land use information.
Principal Component Analysis: Most Favourite Tool in Chemometrics
Indian Academy of Sciences (India)
Abstract. Principal component analysis (PCA) is the most commonly used chemometric technique. It is an unsupervised pattern recognition technique. PCA has found applications in chemistry, biology, medicine and economics. The present work attempts to understand how PCA works and how we can interpret its results.
A principal components model of soundscape perception.
Axelsson, Östen; Nilsson, Mats E; Berglund, Birgitta
2010-11-01
There is a need for a model that identifies underlying dimensions of soundscape perception, and which may guide measurement and improvement of soundscape quality. With the purpose to develop such a model, a listening experiment was conducted. One hundred listeners measured 50 excerpts of binaural recordings of urban outdoor soundscapes on 116 attribute scales. The average attribute scale values were subjected to principal components analysis, resulting in three components: Pleasantness, eventfulness, and familiarity, explaining 50, 18 and 6% of the total variance, respectively. The principal-component scores were correlated with physical soundscape properties, including categories of dominant sounds and acoustic variables. Soundscape excerpts dominated by technological sounds were found to be unpleasant, whereas soundscape excerpts dominated by natural sounds were pleasant, and soundscape excerpts dominated by human sounds were eventful. These relationships remained after controlling for the overall soundscape loudness (Zwicker's N(10)), which shows that 'informational' properties are substantial contributors to the perception of soundscape. The proposed principal components model provides a framework for future soundscape research and practice. In particular, it suggests which basic dimensions are necessary to measure, how to measure them by a defined set of attribute scales, and how to promote high-quality soundscapes.
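The explained-variance percentages quoted here (50, 18 and 6%) come from the eigenvalue spectrum of the attribute-scale correlations; the computation can be sketched on hypothetical ratings data, since the real attribute scales are not reproduced in the abstract.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical attribute ratings: 50 excerpts x 8 attribute scales.
ratings = rng.normal(size=(50, 8))
Xc = ratings - ratings.mean(axis=0)

evals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]  # descending
explained_pct = 100 * evals / evals.sum()

print(np.isclose(explained_pct.sum(), 100.0))       # percentages sum to 100
print(bool(np.all(np.diff(explained_pct) <= 0)))    # sorted descending
```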
A Genealogical Interpretation of Principal Components Analysis
McVean, Gil
2009-01-01
Principal components analysis, PCA, is a statistical method commonly used in population genetics to identify structure in the distribution of genetic variation across geographical location and ethnic background. However, while the method is often used to inform about historical demographic processes, little is known about the relationship between fundamental demographic parameters and the projection of samples onto the primary axes. Here I show that for SNP data the projection of samples onto the principal components can be obtained directly from considering the average coalescent times between pairs of haploid genomes. The result provides a framework for interpreting PCA projections in terms of underlying processes, including migration, geographical isolation, and admixture. I also demonstrate a link between PCA and Wright's fst and show that SNP ascertainment has a largely simple and predictable effect on the projection of samples. Using examples from human genetics, I discuss the application of these results to empirical data and the implications for inference. PMID:19834557
Principal Component Analysis of Anhui Agricultural Industrialization
Chen, Li
2011-01-01
Part 1: Simulation, Optimization, Monitoring and Control Technology; International audience; This paper discusses Anhui agricultural industrialization using the method of principal component analysis. The indexes include per capita net income of farmers in Anhui province, non-agricultural employment rate, urbanization rate, the total power of agricultural mechanization, universal ratio of rural water, car villages, the proportion of industrial waste water by sewage treatment in total e...
Multilevel sparse functional principal component analysis.
Di, Chongzhi; Crainiceanu, Ciprian M; Jank, Wolfgang S
2014-01-29
We consider analysis of sparsely sampled multilevel functional data, where the basic observational unit is a function and data have a natural hierarchy of basic units. An example is when functions are recorded at multiple visits for each subject. Multilevel functional principal component analysis (MFPCA; Di et al. 2009) was proposed for such data when functions are densely recorded. Here we consider the case when functions are sparsely sampled and may contain only a few observations per function. We exploit the multilevel structure of covariance operators and achieve data reduction by principal component decompositions at both between and within subject levels. We address inherent methodological differences in the sparse sampling context to: 1) estimate the covariance operators; 2) estimate the functional principal component scores; 3) predict the underlying curves. Through simulations the proposed method is able to discover dominating modes of variations and reconstruct underlying curves well even in sparse settings. Our approach is illustrated by two applications, the Sleep Heart Health Study and eBay auctions.
Azilawati, M I; Hashim, D M; Jamilah, B; Amin, I
2015-04-01
The amino acid compositions of bovine, porcine and fish gelatin were determined by amino acid analysis using 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate as derivatization reagent. Sixteen amino acids were identified with similar spectral chromatograms. Data pre-treatment via centering and transformation of data by normalization were performed to provide data that are more suitable for analysis and easier to interpret. Principal component analysis (PCA) transformed the original data matrix into a number of principal components (PCs). Three principal components (PCs) described 96.5% of the total variance, and 2 PCs (91%) explained the highest variances. The PCA model demonstrated the relationships among amino acids in the correlation loadings plot to the group of gelatins in the scores plot. Fish gelatin was correlated to threonine, serine and methionine on the positive side of PC1; bovine gelatin was correlated to the non-polar side chain amino acids, namely proline, hydroxyproline, leucine, isoleucine and valine, on the negative side of PC1; and porcine gelatin was correlated to the polar side chain amino acids, namely aspartate, glutamic acid, lysine and tyrosine, on the negative side of PC2. Verification on the database using 12 samples from commercial gelatin-based products confirmed the grouping patterns and the variable correlations. Therefore, this quantitative method is very useful as a screening method to determine gelatin from various sources. Copyright © 2014 Elsevier Ltd. All rights reserved.
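The pre-treatment described (centering, then normalization) can be sketched as column autoscaling before PCA; the composition table below is random stand-in data, not the measured amino acid profiles.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical composition table: 24 gelatin samples x 16 amino acids,
# with one column on a much larger scale than the rest.
X = rng.normal(loc=50, scale=5, size=(24, 16))
X[:, 0] *= 100                                   # one dominant amino acid

# Centering, then normalization (autoscaling each column to unit variance)
# so no single amino acid dominates the PCA purely by scale.
Xc = X - X.mean(axis=0)
Xs = Xc / Xc.std(axis=0, ddof=1)

print(np.allclose(Xs.mean(axis=0), 0.0))         # centered
print(np.allclose(Xs.std(axis=0, ddof=1), 1.0))  # unit variance
```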
Analysis of breast cancer progression using principal component ...
Indian Academy of Sciences (India)
PRAKASH KUMAR
We use this set of genes and a new consensus ensemble k-clustering technique, which averages over several clustering methods and many data perturbations, to identify strong, stable clusters. We also define a simple criterion to find the optimum number of
Aida, S.; Matsuno, T.; Hasegawa, T.; Tsuji, K.
2017-01-01
Micro X-ray fluorescence (micro-XRF) analysis is repeated as a means of producing elemental maps. In some cases, however, the XRF images of trace elements that are obtained are not clear due to high background intensity. To solve this problem, we applied principal component analysis (PCA) to XRF spectra. We focused on improving the quality of XRF images by applying PCA. XRF images of the dried residue of standard solution on the glass substrate were taken. The XRF intensities for the dried re...
Integrating Data Transformation in Principal Components Analysis
Maadooliat, Mehdi
2015-01-02
Principal component analysis (PCA) is a popular dimension reduction method to reduce the complexity and obtain the informative aspects of high-dimensional datasets. When the data distribution is skewed, data transformation is commonly used prior to applying PCA. Such transformation is usually obtained from previous studies, prior knowledge, or trial-and-error. In this work, we develop a model-based method that integrates data transformation in PCA and finds an appropriate data transformation using the maximum profile likelihood. Extensions of the method to handle functional data and missing values are also developed. Several numerical algorithms are provided for efficient computation. The proposed method is illustrated using simulated and real-world data examples.
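A fixed log transform on skewed positive data illustrates why transformation before PCA helps; note that the paper's contribution is to *estimate* the transform by maximum profile likelihood rather than fix it in advance, so the log here is only an assumed illustration.

```python
import numpy as np

rng = np.random.default_rng(6)
# Skewed, strictly positive data (lognormal), as often met before PCA.
X = rng.lognormal(mean=0.0, sigma=1.0, size=(1000, 3))

def skewness(x):
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

raw_skew = np.mean([abs(skewness(X[:, j])) for j in range(3)])
log_skew = np.mean([abs(skewness(np.log(X[:, j]))) for j in range(3)])
print(log_skew < raw_skew)     # the transform makes columns far less skewed
```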
Dimensionality reduction using Principal Component Analysis for network intrusion detection
Directory of Open Access Journals (Sweden)
K. Keerthi Vasan
2016-09-01
Full Text Available Intrusion detection is the identification of malicious activities in a given network by analyzing its traffic. Data mining techniques used for this analysis study the traffic traces and identify hostile flows in the traffic. Dimensionality reduction in data mining focuses on representing data with the minimum number of dimensions such that its properties are not lost and hence the underlying complexity in processing the data is reduced. Principal Component Analysis (PCA) is one of the prominent dimensionality reduction techniques widely used in network traffic analysis. In this paper, we focus on the efficiency of PCA for intrusion detection and determine its Reduction Ratio (RR), the ideal number of Principal Components needed for intrusion detection, and the impact of noisy data on PCA. We carried out experiments with PCA using various classifier algorithms on two benchmark datasets, namely KDD CUP and UNB ISCX. Experiments show that the first 10 Principal Components are effective for classification. The classification accuracy for 10 Principal Components is about 99.7% and 98.8%, nearly the same as the accuracy obtained using the original 41 features for KDD and 28 features for ISCX, respectively.
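The reduce-then-classify pipeline can be sketched with SVD-based PCA and a nearest-centroid classifier standing in for the paper's classifiers; the 41-feature synthetic "traffic" below is hypothetical, not KDD CUP or ISCX data.

```python
import numpy as np

rng = np.random.default_rng(7)
n, d, k = 400, 41, 10                         # samples, features, kept PCs
X = rng.normal(size=(n, d))
y = np.repeat([0, 1], n // 2)                 # benign vs hostile labels
X[y == 1, :3] += 2.5                          # class signal in 3 dimensions

# PCA via SVD of the centered data; keep the first k components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T

# Nearest-centroid classification in the reduced space.
c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
pred = np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)
acc = (pred.astype(int) == y).mean()
print(acc > 0.95)   # 10 of 41 dimensions suffice for this toy signal
```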
Gosav, Steluţa; Praisler, Mirela; Birsa, Mihail Lucian
2011-01-01
In this paper we present several expert systems that predict the class identity of the modeled compounds, based on a preprocessed spectral database. The expert systems were built using Artificial Neural Networks (ANN) and are designed to predict if an unknown compound has the toxicological activity of amphetamines (stimulant and hallucinogen), or whether it is a nonamphetamine. In attempts to circumvent the laws controlling drugs of abuse, new chemical structures are very frequently introduced on the black market. They are obtained by slightly modifying the controlled molecular structures by adding or changing substituents at various positions on the banned molecules. As a result, no substance similar to those forming a prohibited class may be used nowadays, even if it has not been specifically listed. Therefore, reliable, fast and accessible systems capable of modeling and then identifying similarities at molecular level, are highly needed for epidemiological, clinical, and forensic purposes. In order to obtain the expert systems, we have preprocessed a concatenated spectral database, representing the GC-FTIR (gas chromatography-Fourier transform infrared spectrometry) and GC-MS (gas chromatography-mass spectrometry) spectra of 103 forensic compounds. The database was used as input for a Principal Component Analysis (PCA). The scores of the forensic compounds on the main principal components (PCs) were then used as inputs for the ANN systems. We have built eight PC-ANN systems (principal component analysis coupled with artificial neural network) with a different number of input variables: 15 PCs, 16 PCs, 17 PCs, 18 PCs, 19 PCs, 20 PCs, 21 PCs and 22 PCs. The best expert system was found to be the ANN network built with 18 PCs, which accounts for an explained variance of 77%. This expert system has the best sensitivity (a rate of classification C = 100% and a rate of true positives TP = 100%), as well as a good selectivity (a rate of true negatives TN = 92.77%). A
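Choosing the number of PCs for a target explained variance (here the 77% quoted for 18 PCs) can be sketched on synthetic low-rank "spectra"; the shapes and rank below are assumptions for illustration, not the 103-compound forensic database.

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical low-rank spectra: 103 compounds x 120 channels built
# from 20 latent factors plus noise, standing in for concatenated
# GC-FTIR / GC-MS data.
latent = rng.normal(size=(103, 20))
spectra = latent @ rng.normal(size=(20, 120)) + 0.1 * rng.normal(size=(103, 120))

Xc = spectra - spectra.mean(axis=0)
s = np.linalg.svd(Xc, compute_uv=False)
explained = np.cumsum(s ** 2) / np.sum(s ** 2)

# Smallest number of PCs reaching at least 77% explained variance.
n_pcs = int(np.searchsorted(explained, 0.77) + 1)
print(1 <= n_pcs <= 20)   # within the latent rank of the synthetic data
```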
Principal component analysis for fermionic critical points
Costa, Natanael C.; Hu, Wenjian; Bai, Z. J.; Scalettar, Richard T.; Singh, Rajiv R. P.
2017-11-01
We use determinant quantum Monte Carlo (DQMC), in combination with the principal component analysis (PCA) approach to unsupervised learning, to extract information about phase transitions in several of the most fundamental Hamiltonians describing strongly correlated materials. We first explore the zero-temperature antiferromagnet to singlet transition in the periodic Anderson model, the Mott insulating transition in the Hubbard model on a honeycomb lattice, and the magnetic transition in the 1/6-filled Lieb lattice. We then discuss the prospects for learning finite temperature superconducting transitions in the attractive Hubbard model, for which there is no sign problem. Finally, we investigate finite temperature charge density wave (CDW) transitions in the Holstein model, where the electrons are coupled to phonon degrees of freedom, and carry out a finite size scaling analysis to determine Tc. We examine the different behaviors associated with Hubbard-Stratonovich auxiliary field configurations on both the entire space-time lattice and on a single imaginary time slice, or other quantities, such as equal-time Green's and pair-pair correlation functions.
Principal Component Analysis (Pca) Dan Aplikasinya Dengan Spss
Bus Umar, Hermita
2009-01-01
PCA (Principal Component Analysis) is a statistical technique applied to a single set of variables when the researcher is interested in discovering which variables in the set form coherent subsets that are relatively independent of one another. Variables that are correlated with one another but largely independent of other subsets of variables are combined into factors. The goal of PCA is to determine the extent to which each variable is explained by each dimension. Steps in PCA include selecting and measuring a set...
PEMBUATAN PERANGKAT LUNAK PENGENALAN WAJAH MENGGUNAKAN PRINCIPAL COMPONENTS ANALYSIS
Directory of Open Access Journals (Sweden)
Kartika Gunadi
2001-01-01
Full Text Available Face recognition is one of many important research areas, and today many applications have implemented it. Through the development of techniques like Principal Components Analysis (PCA), computers can now outperform humans in many face recognition tasks, particularly those in which a large database of faces must be searched. Principal Components Analysis was used to reduce the facial image dimension into fewer variables, which are easier to observe and handle. Those variables were then fed into an artificial neural network using the backpropagation method to recognise the given facial image. The test results show that PCA can provide high face recognition accuracy. For the training faces, a correct identification rate of 100% could be obtained. From the network combinations that were tested, a best average correct identification rate of 91.11% could be obtained for the test faces, while the worst average result was 46.67% correct identification.
Improved pulsar timing via principal component mode tracking
Lin, Hsiu-Hsien; Masui, Kiyoshi; Pen, Ue-Li; Peterson, Jeffrey B.
2018-03-01
We present a principal component analysis method that tracks and compensates for short-time-scale variability in pulsar profiles, with a goal of improving pulsar timing precision. We couple this with a fast likelihood technique for determining pulse time of arrival, marginalizing over the principal component amplitudes. This allows accurate estimation of timing errors in the presence of pulsar variability. We apply the algorithm to the slow pulsar PSR J2139+0040 using an archived set of untargeted raster-scan observations at arbitrary epochs across four years, obtaining an improved timing solution. The method permits accurate pulsar timing in data sets with short contiguous on-source observations, opening opportunities for commensality between pulsar timing and mapping surveys.
Functional Principal Components Analysis of Shanghai Stock Exchange 50 Index
Directory of Open Access Journals (Sweden)
Zhiliang Wang
2014-01-01
Full Text Available The main purpose of this paper is to explore the principal components of the Shanghai stock exchange 50 index by means of functional principal component analysis (FPCA). Functional data analysis (FDA) deals with random variables (or processes) with realizations in a smooth functional space. One of the most popular FDA techniques is functional principal component analysis, which was introduced for the statistical analysis of a set of financial time series from an explorative point of view. FPCA is the functional analogue of the well-known dimension reduction technique in multivariate statistical analysis, searching for linear transformations of the random vector with maximal variance. In this paper, we studied the monthly return volatility of the Shanghai stock exchange 50 index (SSE50). Using FPCA to reduce the dimension to a finite level, we extracted the most significant components of the data and some relevant statistical features of such related datasets. The calculated results show that regarding the samples as random functions is rational. Compared with ordinary principal component analysis, FPCA can solve the problem of different dimensions in the samples. And FPCA is a convenient approach for extracting the main variance factors.
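On a discretization grid, FPCA reduces to PCA of the sampled curve values with a quadrature weight approximating the covariance operator; the two-mode curves below are synthetic stand-ins, not SSE50 volatility data.

```python
import numpy as np

rng = np.random.default_rng(9)
# Each "function" is a curve sampled on a grid of 50 points; the 60
# curves mix two known functional modes plus small noise.
t = np.linspace(0, 1, 50)
curves = (rng.normal(size=(60, 1)) * np.sin(2 * np.pi * t)
          + rng.normal(size=(60, 1)) * np.cos(2 * np.pi * t)
          + 0.05 * rng.normal(size=(60, 50)))

Xc = curves - curves.mean(axis=0)
dt = t[1] - t[0]
cov = (Xc.T @ Xc) * dt / (len(curves) - 1)   # covariance operator on grid
evals = np.linalg.eigvalsh(cov)[::-1]

explained = evals[:2].sum() / evals.sum()
print(explained > 0.9)   # two functional modes dominate the variance
```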
Mapping ash properties using principal components analysis
Pereira, Paulo; Brevik, Eric; Cerda, Artemi; Ubeda, Xavier; Novara, Agata; Francos, Marcos; Rodrigo-Comino, Jesus; Bogunovic, Igor; Khaledian, Yones
2017-04-01
In post-fire environments ash has important benefits for soils, such as protection and a source of nutrients, crucial for vegetation recuperation (Jordan et al., 2016; Pereira et al., 2015a; 2016a,b). The thickness and distribution of ash are fundamental aspects for soil protection (Cerdà and Doerr, 2008; Pereira et al., 2015b), and the severity at which it was produced is important for the type and amount of elements released into the soil solution (Bodi et al., 2014). Ash is a very mobile material, so where it will be deposited matters. Until the first rainfalls it is highly mobile; afterwards it binds to the soil surface and is harder to erode. Mapping ash properties in the immediate post-fire period is complex, since the ash is constantly moving (Pereira et al., 2015b). It is nevertheless an important task, since from the amount and type of ash produced we can identify the degree of soil protection and the nutrients that will be dissolved. The objective of this work is to map ash properties (CaCO3, pH, and selected extractable elements) using principal component analysis (PCA) in the immediate period after a fire. Four days after the fire we established a grid in a 9x27 m area and took ash samples every 3 meters, for a total of 40 sampling points (Pereira et al., 2017). The PCA identified 5 different factors. Factor 1 had high positive loadings in electrical conductivity, calcium, and magnesium and negative loadings in aluminum and iron, while Factor 2 had high positive loadings in total phosphorous and silica. Factor 3 showed high positive loadings in sodium and potassium, Factor 4 high negative loadings in CaCO3 and pH, and Factor 5 high loadings in sodium and potassium. The experimental variograms of the extracted factors showed that the Gaussian model was the most precise for modelling Factor 1, the linear model for Factor 2, and the wave hole effect model for Factors 3, 4 and 5. The maps produced confirm the patterns observed in the experimental variograms. Factor 1 and 2
Principal-vector-directed fringe-tracking technique.
Zhang, Zhihui; Guo, Hongwei
2014-11-01
Fringe tracking is one of the most straightforward techniques for analyzing a single fringe pattern. This work presents a principal-vector-directed fringe-tracking technique. It uses Gaussian derivatives for estimating fringe gradients and uses hysteresis thresholding for segmenting singular points, thus improving the principal component analysis method. Using it allows us to estimate the principal vectors of fringes from a pattern with high noise. The fringe-tracking procedure is directed by these principal vectors, so that erroneous results induced by noise and other error-inducing factors are avoided. At the same time, the singular point regions of the fringe pattern are identified automatically. Using them allows us to determine paths through which the "seed" point for each fringe skeleton is easy to find, thus alleviating the computational burden in processing the fringe pattern. The results of a numerical simulation and experiment demonstrate this method to be valid.
Efficient training of multilayer perceptrons using principal component analysis
International Nuclear Information System (INIS)
Bunzmann, Christoph; Urbanczik, Robert; Biehl, Michael
2005-01-01
A training algorithm for multilayer perceptrons is discussed and studied in detail, which relates to the technique of principal component analysis. The latter is performed with respect to a correlation matrix computed from the example inputs and their target outputs. Typical properties of the training procedure are investigated by means of a statistical physics analysis in models of learning regression and classification tasks. We demonstrate that the procedure requires far fewer examples for good generalization than traditional online training. For networks with a large number of hidden units we derive the training prescription which achieves, within our model, the optimal generalization behavior.
Validation in Principal Components Analysis Applied to EEG Data
Directory of Open Access Journals (Sweden)
João Carlos G. D. Costa
2014-01-01
Full Text Available The well-known multivariate technique Principal Components Analysis (PCA) is usually applied to a sample, and so component scores are subject to sampling variability. However, few studies address their stability, an important topic when the sample size is small. This work presents three validation procedures applied to PCA, based on confidence regions generated by a variant of the nonparametric bootstrap called the partial bootstrap: (i) the assessment of PC score variability by the spread and overlapping of "confidence regions" plotted around these scores; (ii) the use of the confidence region centroids as a validation set; and (iii) the definition of the number of nontrivial axes to be retained for analysis. The methods were applied to EEG data collected during a postural control protocol with twenty-four volunteers. Two axes were retained for analysis, with 91.6% of explained variance. Results showed that the area of the confidence regions provided useful insights on the variability of scores and suggested that some subjects were not distinguishable from others, which was not evident from the principal planes. In addition, potential outliers, initially suggested by an analysis of the first principal plane, could not be confirmed by the confidence regions.
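A rough numerical sketch of bootstrap-based confidence regions for PC scores follows. This is a simplified stand-in for the paper's partial-bootstrap procedure (sign alignment only, no rotation step), and the data, sample size, and variable count are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 24, 6                                    # e.g., 24 volunteers, 6 variables
X = rng.normal(size=(n, p)) @ rng.normal(size=(p, p))  # toy correlated data

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
axes = Vt[:2].T                                 # reference axes from the full sample

B = 500
boot_scores = np.empty((B, n, 2))
for b in range(B):
    idx = rng.integers(0, n, size=n)            # resample subjects with replacement
    Xb = X[idx] - X[idx].mean(axis=0)
    _, _, Vbt = np.linalg.svd(Xb, full_matrices=False)
    axes_b = Vbt[:2].T
    signs = np.sign(np.sum(axes * axes_b, axis=0))  # align signs with reference axes
    signs[signs == 0] = 1.0
    boot_scores[b] = Xc @ (axes_b * signs)      # original subjects on bootstrap axes

# per-subject spread of bootstrap scores sketches a "confidence region"
spread = boot_scores.std(axis=0)                # (n, 2) score standard errors
```

Overlap between the regions implied by `spread` for two subjects would indicate, as in the abstract, that those subjects are not reliably distinguishable on the principal plane.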
Lee, Jae K.; Mausel, Paul W.; Lulla, Kamlesh P.
1989-01-01
Both principal component analysis (PCA) and principal factor analysis (PFA) were used to analyze an experimental multispectral data structure in terms of common and unique variance. Only the common variance of the multispectral data was associated with the principal factor, while higher-order principal components were associated with both common and unique variance. The unique variance was found to represent small spectral variations within each cover type as well as noise vectors, and was most abundant in the lower-order principal components. The lower-order principal components can be useful in research designed to discriminate minor physical variations within features, and to highlight localized change when using multitemporal-multispectral data. Conversely, PFA of the multispectral data provided an insight into a great potential for discriminating basic land-cover types by excluding the unique variance which was related to the noise and minor spectral variations.
Digital Repository Service at National Institute of Oceanography (India)
Prerna, R.; Naidu, V.S.; Soniya, S.; Gajbhiye, S.N.
– Ground Control Points Index; GoK – Gulf of Kachchh; NHO – National Hydrographic Office; ITZ – Inter-tidal Zone; NIO – National Institute of Oceanography; IRS – Indian Remote Sensing; NIR – Near Infra-red; LISS – Linear Imaging Self Scanner; PCA – Principal Component Analysis; LTL – Low Tide Line; MNP – Marine National Park; MSS – Multispectral Scanner System; NDVI – Normalized Difference Vegetation Index
Preliminary Test Estimation for Multi-Sample Principal Components
Paindaveine, Davy; Rasoafaraniaina, Rondrotiana J; Verdebout, Thomas
2016-01-01
In this paper, we consider point estimation in a multi-sample principal components setup, in a situation where it is suspected that the hypothesis of common principal components (CPC) holds. We propose preliminary test estimators of the various principal eigenvectors. We derive their asymptotic distributions (i) under the CPC hypothesis, (ii) under sequences of hypotheses that are contiguous to the CPC hypothesis, and (iii) away from the CPC hypothesis. We conduct a Monte-Carlo study that sho...
An Introductory Application of Principal Components to Cricket Data
Manage, Ananda B. W.; Scariano, Stephen M.
2013-01-01
Principal Component Analysis is widely used in applied multivariate data analysis, and this article shows how to motivate student interest in this topic using cricket sports data. Here, principal component analysis is successfully used to rank the cricket batsmen and bowlers who played in the 2012 Indian Premier League (IPL) competition. In…
Exploratory principal components analysis of growth traits in Red ...
African Journals Online (AJOL)
Exploratory principal components analysis of growth traits in Red Sokoto goats. ... Similar prediction pattern is obtained for CWT. ... similarity of intercepts of regression equations and those of average values for growth traits in this study indicated the possibility of improvement of goat stocks through the principal components.
Principal Component Analysis In Radar Polarimetry
Directory of Open Access Journals (Sweden)
A. Danklmayer
2005-01-01
Full Text Available Second order moments of multivariate (often Gaussian) joint probability density functions can be described by the covariance or normalised correlation matrices, or by the Kennaugh matrix (Kronecker matrix). In Radar Polarimetry the application of the covariance matrix is known as target decomposition theory, which is a special application of the extremely versatile Principal Component Analysis (PCA). The basic idea of PCA is to convert a data set consisting of correlated random variables into a new set of uncorrelated variables, and to order the new variables according to the value of their variances. It is important to stress that uncorrelatedness does not necessarily mean independence, the much stronger concept used in Independent Component Analysis (ICA). Both concepts agree for multivariate Gaussian distribution functions, representing the most random and least structured distribution. In this contribution, we propose a new approach to applying the concept of PCA to Radar Polarimetry. New uncorrelated random variables are introduced by means of linear transformations with well determined loading coefficients. This, in turn, allows the decomposition of the original random backscattering target variables into three point targets with new random uncorrelated variables whose variances agree with the eigenvalues of the covariance matrix. This allows a new interpretation of existing decomposition theorems.
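The core PCA idea stated above, turning correlated variables into uncorrelated ones ordered by variance, with variances equal to the covariance eigenvalues, can be checked directly. The data below are generic correlated Gaussians, not polarimetric measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
# three correlated Gaussian variables (purely illustrative data)
A = rng.normal(size=(3, 3))
X = rng.normal(size=(5000, 3)) @ A.T            # rows: samples of the 3 variables

C = np.cov(X, rowvar=False)                     # 3x3 covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]               # order by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

Y = (X - X.mean(axis=0)) @ eigvecs              # new uncorrelated variables
C_new = np.cov(Y, rowvar=False)
# off-diagonal covariances of Y are (numerically) zero, and the variances
# of the new variables agree with the eigenvalues of C
```

This is exactly the decorrelation property the contribution builds on: the loading coefficients are the eigenvectors, and the new variances are the eigenvalues.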
Quantitative analysis of planetary reflectance spectra with principal components analysis
Johnson, P. E.; Smith, M. O.; Adams, J. B.
1985-01-01
A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
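The linear (macroscopic) mixing model underlying such an analysis can be sketched as a least-squares fit of endmember spectra to an observed spectrum. The endmember spectra below are random stand-ins, not the lunar endmembers of the paper, and the abundance-sum and non-negativity constraints are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)
wavelengths = 60
# hypothetical endmember spectra (stand-ins for basalt, soils, clinopyroxene, ...)
E = np.abs(rng.normal(size=(wavelengths, 3)))   # columns = endmember spectra
true_f = np.array([0.5, 0.3, 0.2])              # true abundances
obs = E @ true_f + 0.001 * rng.normal(size=wavelengths)  # observed mixed spectrum

# least-squares abundance estimate under the linear mixing assumption
f_hat = np.linalg.lstsq(E, obs, rcond=None)[0]
rms = np.sqrt(np.mean((E @ f_hat - obs) ** 2))  # model misfit, cf. the 2% rms fit
```

In the paper, PCA of a spectral library is what justifies the number and identity of endmembers before such a fit; the fit itself then yields the abundances and the rms goodness of fit quoted in the abstract.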
Principal Component Analysis: Most Favourite Tool in Chemometrics
Indian Academy of Sciences (India)
Data compression by PCA involves finding a new space spanned by a smaller number of dimensions onto which the original data set is projected. The dimensions of the ... The variable that explains the maximum variation is called the first principal component. The second principal component explains the second highest variation ...
A modified principal component analysis-based utility theory ...
African Journals Online (AJOL)
modified PCA-based utility theory (UT) approach for optimization of correlated responses. ... Keywords: EDM; Correlated responses; Optimization; Principal component analysis; Proportion of quality loss reduction; ...... On stochastic optimization: Taguchi MethodsTM demystified; its limitations and fallacy clarified.
A modified principal component analysis-based utility theory ...
African Journals Online (AJOL)
traditional machining processes having multiple performance characteristics, some of which are usually correlated. So, ideally, use of principal component analysis (PCA)-based approaches that take into account the possible correlations ...
Nonparametric inference in nonlinear principal components analysis : exploration and beyond
Linting, Mariëlle
2007-01-01
In the social and behavioral sciences, data sets often do not meet the assumptions of traditional analysis methods. Therefore, nonlinear alternatives to traditional methods have been developed. This thesis starts with a didactic discussion of nonlinear principal components analysis (NLPCA),
Integrative sparse principal component analysis of gene expression data.
Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge
2017-12-01
In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis and other multidatasets techniques and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance. © 2017 WILEY PERIODICALS, INC.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 Million.
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject ( p ) is much larger than the number of subjects ( n ), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n -dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n -dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p -dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings ( p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) ( p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods.
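The paper's central trick, that all bootstrap samples occupy the same n-dimensional subspace as the original sample so bootstrap PCA never needs p-dimensional computations inside the loop, can be sketched as follows. Sizes and data here are invented, and only eigenvalue standard errors are computed for brevity.

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 2000, 40                                 # p measurements >> n subjects
X = rng.normal(size=(p, 5)) @ rng.normal(size=(5, n)) + 0.1 * rng.normal(size=(p, n))

Xc = X - X.mean(axis=1, keepdims=True)
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
Y = d[:, None] * Vt                             # n-dim coordinates of all subjects

B = 200
top_eigvals = np.empty((B, 3))
for b in range(B):
    idx = rng.integers(0, n, size=n)            # resample subjects
    Yb = Y[:, idx]
    Yb = Yb - Yb.mean(axis=1, keepdims=True)    # recenter inside the subspace
    s = np.linalg.svd(Yb, compute_uv=False)     # an n x n problem, not p x n
    top_eigvals[b] = s[:3] ** 2 / (n - 1)

se = top_eigvals.std(axis=0)                    # bootstrap SEs for top 3 eigenvalues

# exactness check on one resample: the n-dim SVD matches the full p-dim SVD
idx = rng.integers(0, n, size=n)
Xb = Xc[:, idx]
Xb = Xb - Xb.mean(axis=1, keepdims=True)
Yb = Y[:, idx]
Yb = Yb - Yb.mean(axis=1, keepdims=True)
s_full = np.linalg.svd(Xb, compute_uv=False)
s_low = np.linalg.svd(Yb, compute_uv=False)
```

Bootstrap principal components in the original space, when needed, are recovered as `U` times the low-dimensional components, which is why the method scales to p in the millions.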
Principal Component Analysis of Body Measurements In Three ...
African Journals Online (AJOL)
This study was conducted to explore the relationship among body measurements in 3 strains of broilers chicken (Arbor Acre, Marshal and Ross) using principal component analysis with the view of identifying those components that define body conformation in broilers. A total of 180 birds were used, 60 per strain.
Longitudinal functional principal component modelling via Stochastic Approximation Monte Carlo
Martinez, Josue G.
2010-06-01
The authors consider the analysis of hierarchical longitudinal functional data based upon a functional principal components approach. In contrast to standard frequentist approaches to selecting the number of principal components, the authors do model averaging using a Bayesian formulation. A relatively straightforward reversible jump Markov Chain Monte Carlo formulation has poor mixing properties and in simulated data often becomes trapped at the wrong number of principal components. In order to overcome this, the authors show how to apply Stochastic Approximation Monte Carlo (SAMC) to this problem, a method that has the potential to explore the entire space and does not become trapped in local extrema. The combination of reversible jump methods and SAMC in hierarchical longitudinal functional data is simplified by a polar coordinate representation of the principal components. The approach is easy to implement and does well in simulated data in determining the distribution of the number of principal components, and in terms of its frequentist estimation properties. Empirical applications are also presented.
Principal component analysis for ataxic gait using a triaxial accelerometer.
Matsushima, Akira; Yoshida, Kunihiro; Genno, Hirokazu; Ikeda, Shu-Ichi
2017-05-02
It is quite difficult to evaluate ataxic gait quantitatively in clinical practice. The aim of this study was to analyze the characteristics of ataxic gait using a triaxial accelerometer and to develop a novel biomarker of integrated gait parameters for ataxic gait. Sixty-one patients with spinocerebellar ataxia (SCA) or multiple system atrophy with predominant cerebellar ataxia (MSA-C) and 57 healthy control subjects were enrolled. The subjects were instructed to walk 10 m a total of 12 times on a flat floor at their usual walking speed with a triaxial accelerometer attached to their back. Gait velocity, cadence, step length, step regularity, step symmetry, and degree of body sway were evaluated. Principal component analysis (PCA) was used to analyze the multivariate gait parameters. The Scale for the Assessment and Rating of Ataxia (SARA) was evaluated on the same day as the 10-m walk trial. PCA divided the gait parameters into four principal components in the controls and into two principal components in the patients. The four principal components in the controls were similar to those found in earlier studies. The second principal component in the patients had relevant factor loading values for gait velocity, step length, regularity, and symmetry, in addition to the degree of body sway in the medio-lateral direction. The second principal component score (PCS) in the patients was significantly correlated with disease duration and the SARA score of gait (ρ = -0.363, p = 0.004; ρ = -0.574, p < 0.001, respectively). The PCS of the main component was significantly different between the patients and controls, and it was well correlated with disease duration and the SARA score of gait in the patients. We propose that this score provides a novel method to assess the severity of ataxic gait quantitatively using a triaxial accelerometer.
Sparse logistic principal components analysis for binary data
Lee, Seokho
2010-09-01
We develop a new principal components analysis (PCA) type dimension reduction method for binary data. Different from the standard PCA, which is defined on the observed data, the proposed PCA is defined on the logit transform of the success probabilities of the binary observations. Sparsity is introduced to the principal component (PC) loading vectors for enhanced interpretability and more stable extraction of the principal components. Our sparse PCA is formulated as solving an optimization problem with a criterion function motivated from a penalized Bernoulli likelihood. A Majorization-Minimization algorithm is developed to efficiently solve the optimization problem. The effectiveness of the proposed sparse logistic PCA method is illustrated by application to a single nucleotide polymorphism data set and a simulation study. © Institute of Mathematical Statistics, 2010.
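To illustrate the general idea of sparse loading vectors, here is a generic sparse-PCA sketch using soft-thresholded power iteration on a covariance matrix. This is not the paper's Bernoulli-likelihood Majorization-Minimization algorithm (which works on the logit scale of binary data); the planted signal and threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 100, 20
# planted sparse direction: only the first 4 variables carry signal
v_true = np.zeros(p)
v_true[:4] = 0.5
X = rng.normal(size=(n, 1)) @ v_true[None, :] * 3 + rng.normal(size=(n, p))

def soft(x, lam):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

C = np.cov(X, rowvar=False)
v = rng.normal(size=p)
v /= np.linalg.norm(v)
for _ in range(200):                            # thresholded power iteration
    w = soft(C @ v, lam=0.3)
    nrm = np.linalg.norm(w)
    if nrm == 0:
        break
    v = w / nrm                                 # sparse, unit-norm loading vector

support = np.nonzero(np.abs(v) > 1e-8)[0]       # variables retained in the loading
```

As in the abstract's motivation, zeroing small loadings makes the component interpretable: the retained support concentrates on the few variables that genuinely drive the leading direction.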
Sparse Principal Component Analysis in Medical Shape Modeling
DEFF Research Database (Denmark)
Sjöstrand, Karl; Stegmann, Mikkel Bille; Larsen, Rasmus
2006-01-01
Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims...... analysis in medicine. Results for three different data sets are given in relation to standard PCA and sparse PCA by simple thresholding of sufficiently small loadings. Focus is on a recent algorithm for computing sparse principal components, but a review of other approaches is supplied as well. The SPCA...
Principal Component Clustering Approach to Teaching Quality Discriminant Analysis
Xian, Sidong; Xia, Haibo; Yin, Yubo; Zhai, Zhansheng; Shang, Yan
2016-01-01
Teaching quality is the lifeline of the higher education. Many universities have made some effective achievement about evaluating the teaching quality. In this paper, we establish the Students' evaluation of teaching (SET) discriminant analysis model and algorithm based on principal component clustering analysis. Additionally, we classify the SET…
Principal components regression of body measurements in five ...
African Journals Online (AJOL)
This study aimed at unfolding the interdependence among the linear body measurements in chickens and to predict body weight from their orthogonal body measurements using principal component regression. Body weight and seven biometric traits that are; body length (BL), breast girth (BG), wing length (WL), wing span ...
Principal Component Surface (2011) for Fish Bay, St. John
National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a 0.3x0.3 meter principal component analysis (PCA) surface for areas inside Fish Bay, St. John in the U.S. Virgin Islands (USVI). It was...
Assessment of drinking water quality using principal component ...
African Journals Online (AJOL)
The Partial Least Square Discriminant Analysis (PLS-DA) model showed a high correlation matrix of analysis for physicochemical quality of two types of water with 99.43% significant value. The classification matrix accuracy of the principal component analysis (PCA) highlighted 13 significant physico-chemical water quality ...
Water quality of the Chhoti Gandak River using principal component ...
Indian Academy of Sciences (India)
The principal components of water quality are controlled by lithology, gentle slope gradient, poor drainage, long residence of water, ion exchange, weathering of minerals, heavy use of fertilizers, and domestic wastes. At some stations water is hard with an excess alkalinity and is not suitable for drinking and irrigation ...
Group-wise Principal Component Analysis for Exploratory Data Analysis
Camacho, J.; Rodriquez-Gomez, Rafael A.; Saccenti, E.
2017-01-01
In this paper, we propose a new framework for matrix factorization based on Principal Component Analysis (PCA) where sparsity is imposed. The structure to impose sparsity is defined in terms of groups of correlated variables found in correlation matrices or maps. The framework is based on three new
Principal component analysis of image gradient orientations for face recognition
Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja
We introduce the notion of Principal Component Analysis (PCA) of image gradient orientations. As image data is typically noisy, but noise is substantially different from Gaussian, traditional PCA of pixel intensities very often fails to estimate reliably the low-dimensional subspace of a given data
Water quality of the Chhoti Gandak River using principal component ...
Indian Academy of Sciences (India)
Water quality of the Chhoti Gandak River using principal component analysis, Ganga Plain, India. Vikram Bhardwaj1,∗, Dhruv Sen Singh1 and A K Singh. 2. 1. Centre of Advanced Study in Geology, University of Lucknow, Lucknow 226 007, India. 2. Central Institute of Mining and Fuel Research, Barwa Road, Dhanbad 826 ...
Principal Component Surface (2011) for Coral Bay, St. John
National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a 0.3x0.3 meter principal component analysis (PCA) surface for areas inside Coral Bay, St. John in the U.S. Virgin Islands (USVI). It was...
Principal Components Analysis of Job Burnout and Coping ...
African Journals Online (AJOL)
The component structure of a 44-item scale measuring different aspects of job burnout incidence and 31-item scale on coping strategies were investigated, among extension officers in North West Province, South Africa. Items on job burnout and coping strategies s were measured at interval level and analyzed with Principal ...
Principal Component Analysis: Most Favourite Tool in Chemometrics
Indian Academy of Sciences (India)
differentiate olive oils from non-olive vegetable oils. Moreover, manual analysis of such a large volume of data is laborious and time consuming, and may not provide any meaningful interpretation. Figure 4: Amount of variance captured by different principal components (PCs). The plot indicates that the first two PCs are sufficient to ...
Dimensionality reduction and visualization in principal component analysis.
Ivosev, Gordana; Burton, Lyle; Bonner, Ron
2008-07-01
Many modern applications of analytical chemistry involve the collection of large megavariate data sets and subsequent processing with multivariate analysis techniques (MVA), two of the more common goals being data analysis (also known as data mining and exploratory data analysis) and classification. Classification attempts to determine variables that can distinguish known classes, allowing unknown samples to be correctly assigned, whereas data analysis seeks to uncover and understand or confirm relationships between the samples and the variables. An important part of analysis is visualization, which allows analysts to apply their expertise and knowledge; it is often easier for the samples than for the variables, since there are frequently far more of the latter. Here we describe principal component variable grouping (PCVG), an unsupervised, intuitive method that assigns a large number of variables to a smaller number of groups that can be more readily visualized and understood. Knowledge of the source or nature of the variables in a group allows them all to be appropriately treated, for example, removed if they result from uninteresting effects or replaced by a single representative for further processing.
Multistage principal component analysis based method for abdominal ECG decomposition.
Petrolis, Robertas; Gintautas, Vladas; Krisciukaitis, Algimantas
2015-02-01
Reflection of fetal heart electrical activity is present in registered abdominal ECG signals. However, this signal component has noticeably less energy than concurrent signals, especially the maternal ECG. Therefore the traditionally recommended independent component analysis fails to separate these two ECG signals. Multistage principal component analysis (PCA) is proposed for step-by-step extraction of abdominal ECG signal components. Truncated representation and subsequent subtraction of cardio cycles of the maternal ECG are the first steps. The energy of the fetal ECG component then becomes comparable to, or even exceeds, the energy of other components in the remaining signal. Second-stage PCA concentrates the energy of the sought signal in one principal component, assuring its maximal amplitude regardless of the orientation of the fetus in multilead recordings. Third-stage PCA is performed on signal excerpts representing detected fetal heart beats, with the aim of forming their truncated representation and reconstructing their shape for further analysis. The algorithm was tested with PhysioNet Challenge 2013 signals and signals recorded in the Department of Obstetrics and Gynecology, Lithuanian University of Health Sciences. The result of our method on the PhysioNet Challenge 2013 open data set was an average score of 341.503 bpm² and 32.81 ms.
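A much-simplified, single-lead illustration of the first stage (truncated PCA representation and subtraction of maternal cycles) follows. Beat locations are assumed already detected, and all waveforms, rates, and amplitudes are synthetic; the paper's full pipeline adds two further PCA stages and multilead recordings.

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 250                                        # Hz, hypothetical sampling rate
t = np.arange(20 * fs) / fs                     # 20 s of abdominal signal

def beat(center, width, amp, t):
    """Gaussian stand-in for a QRS complex."""
    return amp * np.exp(-0.5 * ((t - center) / width) ** 2)

maternal_peaks = np.arange(0.5, 19.5, 0.8)      # ~75 bpm, large amplitude
fetal_peaks = np.arange(0.3, 19.7, 0.45)        # ~133 bpm, small amplitude
sig = sum(beat(c, 0.02, 1.0, t) for c in maternal_peaks)
sig += sum(beat(c, 0.015, 0.15, t) for c in fetal_peaks)
sig += 0.01 * rng.normal(size=t.size)

# stage 1: truncated PCA representation of the maternal cycles (peak locations
# assumed already detected), reconstructed and subtracted from the recording
half = int(0.2 * fs)
idx = (maternal_peaks * fs).astype(int)
epochs = np.stack([sig[i - half:i + half] for i in idx])
mu = epochs.mean(axis=0)
E = epochs - mu
U, d, Vt = np.linalg.svd(E, full_matrices=False)
recon = mu + (U[:, :2] * d[:2]) @ Vt[:2]        # keep only 2 components
residual = sig.copy()
for k, i in enumerate(idx):
    residual[i - half:i + half] -= recon[k]     # maternal cycles removed
```

After the subtraction, the small fetal beats dominate the residual, which is the point at which the paper's second-stage PCA takes over.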
Spectral decomposition of asteroid Itokawa based on principal component analysis
Koga, Sumire C.; Sugita, Seiji; Kamata, Shunichi; Ishiguro, Masateru; Hiroi, Takahiro; Tatsumi, Eri; Sasaki, Sho
2018-01-01
The heliocentric stratification of asteroid spectral types may hold important information on the early evolution of the Solar System. Asteroid spectral taxonomy is based largely on principal component analysis. However, how the surface properties of asteroids, such as composition and age, are projected in the principal-component (PC) space is not well understood. We decompose multi-band disk-resolved visible spectra of the Itokawa surface with principal component analysis (PCA) in comparison with main-belt asteroids. The obtained distribution of Itokawa spectra projected in the PC space of main-belt asteroids follows a linear trend linking the Q-type and S-type regions and is consistent with the results of space-weathering experiments on ordinary chondrites and olivine, suggesting that this trend may be a space-weathering-induced spectral evolution track for S-type asteroids. Comparison with space-weathering experiments also yields a short average surface age. The first principal component of Itokawa surface spectra is consistent with spectral change due to space weathering, and the spatial variation in the degree of space weathering is very large (a factor of three in surface age), which would strongly suggest the presence of strong regional/local resurfacing process(es) on this small asteroid.
Fast principal component analysis for stacking seismic data
Wu, Juan; Bai, Min
2018-04-01
Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and enhance the principal components to a great extent. Traditional average-based seismic stacking methods cannot obtain optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data without being sensitive to noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm to make the proposed method readily applicable for industrial applications. Two numerically designed examples and one real seismic data are used to demonstrate the performance of the presented method.
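A minimal sketch of PCA-based stacking on a synthetic gather: the first principal component (a rank-1 SVD approximation) denoises the gather before averaging. The wavelet, noise level, and gather size are invented, and this is the classic PCA route rather than the paper's efficient variant.

```python
import numpy as np

rng = np.random.default_rng(7)
nt, ntr = 400, 50                               # time samples, traces in the gather
t = np.linspace(0, 1, nt)
wavelet = np.exp(-((t - 0.5) / 0.03) ** 2) * np.sin(2 * np.pi * 25 * (t - 0.5))
gather = wavelet[:, None] + 0.2 * rng.normal(size=(nt, ntr))  # noisy aligned traces

mean_stack = gather.mean(axis=1)                # conventional average-based stack

# PCA stack: rank-1 (first principal component) approximation of the gather,
# then average the denoised traces
U, d, Vt = np.linalg.svd(gather, full_matrices=False)
rank1 = d[0] * np.outer(U[:, 0], Vt[0])
pca_stack = rank1.mean(axis=1)
if pca_stack @ wavelet < 0:                     # SVD sign ambiguity; flip for
    pca_stack = -pca_stack                      # comparison with the known wavelet

energy_share = d[0] ** 2 / (d ** 2).sum()       # energy captured by the first PC
```

Because coherent signal concentrates in the leading singular triplet while incoherent noise spreads across all of them, the rank-1 stack tracks the underlying wavelet even when individual traces are heavily contaminated.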
Principal Component Analysis - A Powerful Tool in Computing Marketing Information
Directory of Open Access Journals (Sweden)
Constantin C.
2014-12-01
Full Text Available This paper presents instrumental research on a powerful multivariate data analysis method that researchers can use to obtain valuable information for decision makers who need to solve a marketing problem a company faces. The literature stresses the need to avoid the multicollinearity phenomenon in multivariate analysis, and the ability of Principal Component Analysis (PCA) to reduce a large number of variables that may be correlated with each other to a small number of uncorrelated principal components. In this respect, the paper presents step by step the process of applying PCA in marketing research when we use a large number of variables that are naturally collinear.
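The PCA remedy for multicollinearity described above can be sketched on synthetic survey-style data (the variable counts, loadings, and constructs are invented): many collinear observed variables are replaced by a few exactly uncorrelated component scores.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200
latent = rng.normal(size=(n, 2))                # 2 underlying "constructs"
# 6 observed survey variables, collinear because each loads on the constructs
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.1 * rng.normal(size=(n, 6))

Xs = (X - X.mean(0)) / X.std(0)                 # standardize before PCA
corr = np.corrcoef(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
pcs = Xs @ eigvecs[:, :2]                       # two uncorrelated component scores

# the component scores are uncorrelated, removing the multicollinearity
pc_corr = np.corrcoef(pcs, rowvar=False)
```

The component scores can then be fed into a regression or segmentation model in place of the original collinear survey items, which is exactly the workflow the paper walks through.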
Research on Air Quality Evaluation based on Principal Component Analysis
Wang, Xing; Wang, Zilin; Guo, Min; Chen, Wei; Zhang, Huan
2018-01-01
Economic growth has led to a decline in environmental capacity and the deterioration of air quality. Air quality evaluation, as a fundamental part of environmental monitoring and air pollution control, has become increasingly important. Based on principal component analysis (PCA), this paper evaluates the air quality of a large city in the Beijing-Tianjin-Hebei Area over the past 10 years and identifies influencing factors, in order to provide a reference for air quality management and air pollution control.
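One common way to turn PCA output into a single evaluation score is to weight the retained component scores by their explained-variance shares; the abstract does not specify the authors' exact scheme, so the sketch below (with invented pollutant indicators and scales) is only an illustrative stand-in.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 10-year record of 6 pollutant indicators for one city
# (e.g. PM2.5, PM10, SO2, NO2, CO, O3); the scales below are made up.
X = rng.random((10, 6)) * np.array([150.0, 200.0, 80.0, 60.0, 2.0, 120.0])

# Standardize, run PCA, keep components covering ~85% of the variance, and
# weight their scores by explained-variance share to form one composite score.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.85)) + 1
weights = eigvals[:k] / eigvals[:k].sum()
composite = (Z @ eigvecs[:, :k]) @ weights         # one score per year (sign arbitrary)
worst_year = int(np.argmax(composite))
```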
Extraction of Independent Structural Images for Principal Component Thermography
Directory of Open Access Journals (Sweden)
Dmitry Gavrilov
2018-03-01
Full Text Available Thermography is a powerful tool for non-destructive testing of a wide range of materials. Thermography has a number of approaches differing in both experiment setup and the way the collected data are processed. Among such approaches is the Principal Component Thermography (PCT) method, which is based on the statistical processing of raw thermal images collected by a thermal camera. The processed images (principal components or empirical orthogonal functions) form an orthonormal basis, and often look like a superposition of all possible structural features found in the object under inspection, i.e., surface heating non-uniformity, internal defects and material structure. At the same time, from a practical point of view it is desirable to have images representing independent structural features. The work presented in this paper proposes an approach for the separation of independent image patterns (archetypes) from a set of principal component images. The approach is demonstrated on the inspection of composite materials as well as the non-invasive analysis of works of art.
Selecting the Number of Principal Components in Functional Data
Li, Yehua
2013-12-01
Functional principal component analysis (FPCA) has become the most widely used dimension reduction tool for functional data analysis. We consider functional data measured at random, subject-specific time points, contaminated with measurement error, allowing for both sparse and dense functional data, and propose novel information criteria to select the number of principal components in such data. We propose a Bayesian information criterion based on marginal modeling that can consistently select the number of principal components for both sparse and dense functional data. For dense functional data, we also develop an Akaike information criterion based on the expected Kullback-Leibler information under a Gaussian assumption. Connecting with the time-series literature, we also consider a class of information criteria proposed for factor analysis of multivariate time series and show that they are still consistent for dense functional data if a prescribed undersmoothing scheme is undertaken in the FPCA algorithm. We perform intensive simulation studies and show that the proposed information criteria vastly outperform existing methods for this type of data. Surprisingly, our empirical evidence shows that our information criteria proposed for dense functional data also perform well for sparse functional data. An empirical example using colon carcinogenesis data is also provided to illustrate the results. Supplementary materials for this article are available online. © 2013 American Statistical Association.
Source Signals Separation and Reconstruction Following Principal Component Analysis
Directory of Open Access Journals (Sweden)
WANG Cheng
2014-02-01
Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical meaning of the blind source separation model and of independent component analysis is not very clear, and the solution is not unique. To address these drawbacks, a new linear, instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of the new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. Using the concepts of a linear separation matrix and uncorrelated source signals, a one-to-one relationship is demonstrated between the linear, instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and between the uncorrelated source signals and the principal components. Based on this theoretical link, the source signal separation and reconstruction problem is transformed into PCA of the observed signals. Theoretical derivation and numerical simulation show that, despite Gaussian measurement noise, both the waveform and the amplitude information of uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only the waveform information can be separated and reconstructed when the mixing matrix is column-orthogonal but not normalized; and the source signals cannot be separated and reconstructed at all when the mixing matrix is not column-orthogonal.
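The column-orthonormal case can be checked numerically in a short sketch (sources, mixing angle, and noise level below are invented): with uncorrelated sources of distinct variances and an orthonormal mixing matrix, the covariance eigenvectors recover the mixing directions, so the PC scores recover the sources up to sign and order.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two uncorrelated sources with distinct variances (uncorrelatedness, not full
# independence, is all this model assumes); data are synthetic.
t = np.linspace(0.0, 1.0, 1000)
s1 = 3.0 * np.sin(2 * np.pi * 5 * t)
s2 = np.sign(np.sin(2 * np.pi * 3 * t))
S = np.stack([s1, s2])

# Column-orthonormal mixing matrix: the case where PCA recovers both waveform
# and amplitude according to the abstract.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = A @ S + 0.05 * rng.standard_normal(S.shape)

# PCA of the observations: eigenvectors of the covariance matrix recover the
# mixing directions (up to sign/order), so the PC scores recover the sources.
C = np.cov(X)
eigvals, V = np.linalg.eigh(C)
V = V[:, np.argsort(eigvals)[::-1]]
recovered = V.T @ X
```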
Quality Aware Compression of Electrocardiogram Using Principal Component Analysis.
Gupta, Rajarshi
2016-05-01
Electrocardiogram (ECG) compression finds wide application in various patient monitoring purposes. Quality control in ECG compression ensures reconstruction quality and its clinical acceptance for diagnostic decision making. In this paper, a quality aware compression method for single-lead ECG is described using principal component analysis (PCA). After pre-processing, beat extraction and PCA decomposition, two independent quality criteria, namely, bit rate control (BRC) or error control (EC) criteria were set to select optimal principal components, eigenvectors and their quantization level to achieve the desired bit rate or error measure. The selected principal components and eigenvectors were finally compressed using a modified delta and Huffman encoder. The algorithms were validated with 32 sets of MIT-BIH Arrhythmia data and 60 normal and 30 sets of diagnostic ECG data from the PTB Diagnostic ECG database (ptbdb), all at 1 kHz sampling. For BRC with a CR threshold of 40, an average Compression Ratio (CR), percentage root mean squared difference normalized (PRDN) and maximum absolute error (MAE) of 50.74, 16.22% and 0.243 mV respectively were obtained. For EC with an upper limit of 5% PRDN and 0.1 mV MAE, an average CR, PRDN and MAE of 9.48, 4.13% and 0.049 mV respectively were obtained. For mitdb record 117, the reconstruction quality could be preserved up to a CR of 68.96 by extending the BRC threshold. The proposed method yields better results than recently published works on quality-controlled ECG compression.
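The error-control idea can be sketched with simulated beats (the template, beat count, and PRD target below are hypothetical, and the quantization/Huffman stages of the paper are omitted): keep the fewest principal components whose reconstruction meets an error target, then report the resulting compression ratio.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical beat matrix: 60 aligned beats, each a spiky template plus
# a low-rank beat-to-beat variation and a little noise (not real ECG).
n, m = 60, 180
x = np.linspace(0.0, 1.0, m)
template = np.exp(-((x - 0.5) ** 2) / 0.001)
beats = (template
         + 0.1 * np.outer(rng.standard_normal(n), np.sin(2 * np.pi * x))
         + 0.005 * rng.standard_normal((n, m)))

# Error-control (EC) loop: keep the fewest PCs whose reconstruction meets
# a percentage-RMS-difference (PRD) target, then report the compression ratio.
mean = beats.mean(axis=0)
U, s, Vt = np.linalg.svd(beats - mean, full_matrices=False)
target_prd = 5.0                                  # percent
for k in range(1, min(n, m) + 1):
    recon = mean + U[:, :k] * s[:k] @ Vt[:k]
    prd = 100.0 * np.linalg.norm(beats - recon) / np.linalg.norm(beats)
    if prd <= target_prd:
        break

# Stored: k score columns, k basis vectors, and the mean beat.
cr = beats.size / (k * (n + m) + m)
```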
Sparse principal component analysis in hyperspectral change detection
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg; Larsen, Rasmus; Vestergaard, Jacob Schack
2011-01-01
This contribution deals with change detection by means of sparse principal component analysis (PCA) of simple differences of calibrated, bi-temporal HyMap data. Results show that if we retain only 15 nonzero loadings (out of 126) in the sparse PCA, the resulting change scores appear visually very similar, although the loadings are very different from their usual non-sparse counterparts. The choice of three wavelength regions as being most important for change detection demonstrates the feature selection capability of sparse PCA.
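The band-selection effect can be reproduced on simulated difference data (the pixel count, band layout, and thresholding rule below are invented, and the soft-thresholded power iteration is only a simple stand-in for the penalized sparse-PCA solvers used in practice): the sparse loading vector ends up nonzero only on the bands that actually changed.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical bi-temporal difference image: 500 pixels x 126 bands, where
# only bands 40-45 carry a genuine change signal (simulated, not HyMap).
n_pix, n_bands = 500, 126
diff = 0.1 * rng.standard_normal((n_pix, n_bands))
change = rng.standard_normal(n_pix)
diff[:, 40:46] += change[:, None]

# First sparse PC via power iteration with soft-thresholded loadings.
X = diff - diff.mean(axis=0)
w = rng.standard_normal(n_bands)
for _ in range(100):
    w = X.T @ (X @ w)
    w = np.sign(w) * np.maximum(np.abs(w) - 0.3 * np.abs(w).max(), 0.0)
    w /= np.linalg.norm(w)

nonzero = np.flatnonzero(w)     # bands selected by the sparse loading vector
scores = X @ w                  # sparse change scores per pixel
```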
Keithley, Richard B; Heien, Michael L; Wightman, R Mark
2009-10-01
Data analysis is an essential tenet of analytical chemistry, extending the possible information obtained from the measurement of chemical phenomena. Chemometric methods have grown considerably in recent years, but their wide use is hindered because some still consider them too complicated. The purpose of this review is to describe a multivariate chemometric method, principal component regression, in a simple manner from the point of view of an analytical chemist, to demonstrate the need for proper quality-control (QC) measures in multivariate analysis and to advocate the use of residuals as a proper QC method.
Association test with the principal component analysis in case ...
Indian Academy of Sciences (India)
with variance Var(Y_k) = λ_k. Note that the p principal components Y_1, …, Y_p are uncorrelated with each other, i.e., Cov(Y_k, Y_l) = 0 (k, l = 1, …, p; k ≠ l). Let S be the sample covariance matrix for the observed case–parent trio data X_i = (x_i1, x_i2, …, x_ip)^T (i = 1, …, n). Assume that the p sample eigenvalue–eigenvector pairs ...
Principal Component Analysis of Terrestrial and Venusian Topography
Stoddard, P. R.; Jurdy, D. M.
2015-12-01
We use Principal Component Analysis (PCA) as an objective tool in analyzing, comparing, and contrasting topographic profiles of different/similar features from different locations and planets. To do so, we take average profiles of a set of features and form a cross-correlation matrix, which is then diagonalized to determine its principal components. These components, not merely numbers, represent actual profile shapes that give a quantitative basis for comparing different sets of features. For example, PCA for terrestrial hotspots shows the main component as a generic dome shape. Secondary components show a more sinusoidal shape, related to the lithospheric loading response, and thus give information about the nature of the lithospheric setting of the various hotspots. We examine a range of terrestrial spreading centers: fast, slow, ultra-slow, incipient, and extinct, and compare these to several chasmata on Venus (including Devana, Ganis, Juno, Parga, and Kuanja). For upwelling regions, we consider the oceanic Hawaii, Reunion, and Iceland hotspots and Yellowstone, a prototypical continental hotspot. Venus has approximately one dozen broad topographic and geoid highs called regiones. Our analysis includes Atla, Beta, and W. Eistla regiones. Atla and Beta are widely thought to be the most likely to be currently or recently active. Analysis of terrestrial rifts suggests increasing uniformity of shape among rifts with increasing spreading rates. Venus' uniformity correlations rank considerably lower than the terrestrial ones. Extrapolating the correlation/spreading-rate relationship suggests that Venus' chasmata, if analogous to terrestrial spreading centers, most resemble the ultra-slow spreading level (less than 12 mm/yr) of the Arctic Gakkel ridge. PCA will provide an objective measurement of this correlation.
Mining gene expression data by interpreting principal components
Directory of Open Access Journals (Sweden)
Mortazavi Ali
2006-04-01
Full Text Available Abstract Background There are many methods for analyzing microarray data that group together genes having similar patterns of expression over all conditions tested. However, in many instances the biologically important goal is to identify relatively small sets of genes that share coherent expression across only some conditions, rather than all or most conditions as required in traditional clustering; e.g. genes that are highly up-regulated and/or down-regulated similarly across only a subset of conditions. Equally important is the need to learn which conditions are the decisive ones in forming such gene sets of interest, and how they relate to diverse conditional covariates, such as disease diagnosis or prognosis. Results We present a method for automatically identifying such candidate sets of biologically relevant genes using a combination of principal components analysis and information theoretic metrics. To enable easy use of our methods, we have developed a data analysis package that facilitates visualization and subsequent data mining of the independent sources of significant variation present in gene microarray expression datasets (or in any other similarly structured high-dimensional dataset). We applied these tools to two public datasets, and highlight sets of genes most affected by specific subsets of conditions (e.g. tissues, treatments, samples, etc.). Statistically significant associations for highlighted gene sets were shown via global analysis for Gene Ontology term enrichment. Together with covariate associations, the tool provides a basis for building testable hypotheses about the biological or experimental causes of observed variation. Conclusion We provide an unsupervised data mining technique for diverse microarray expression datasets that is distinct from major methods now in routine use. In test uses, this method, based on publicly available gene annotations, appears to identify numerous sets of biologically relevant genes.
Wal JT van der; Vaal MA; Hoekstra JA; Hermens JLM; ECO; RITOX
1995-01-01
This report is part of a project studying the variation in the sensitivity of species to toxicants. Patterns in the chronic toxicity of compounds to aquatic species are studied using a multivariate statistical technique called principal component analysis. The matrix consists of 15 aquatic species
Fast grasping of unknown objects using principal component analysis
Lei, Qujiang; Chen, Guangming; Wisse, Martijn
2017-09-01
Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined using the single-view partial point cloud. To cope with grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance in grasping a series of unknown objects. To minimize the grasping uncertainty, the robot hardware's two 3D cameras can be used to complete the partial point cloud. As a result of utilizing the robot hardware, grasping reliability is greatly enhanced. Therefore, this research has practical significance for increasing grasping speed and thus robot efficiency in unpredictable environments.
Sparse principal component analysis in medical shape modeling
Sjöstrand, Karl; Stegmann, Mikkel B.; Larsen, Rasmus
2006-03-01
Principal component analysis (PCA) is a widely used tool in medical image analysis for data reduction, model building, and data understanding and exploration. While PCA is a holistic approach where each new variable is a linear combination of all original variables, sparse PCA (SPCA) aims at producing easily interpreted models through sparse loadings, i.e. each new variable is a linear combination of a subset of the original variables. One of the aims of using SPCA is the possible separation of the results into isolated and easily identifiable effects. This article introduces SPCA for shape analysis in medicine. Results for three different data sets are given in relation to standard PCA and sparse PCA by simple thresholding of small loadings. Focus is on a recent algorithm for computing sparse principal components, but a review of other approaches is supplied as well. The SPCA algorithm has been implemented using Matlab and is available for download. The general behavior of the algorithm is investigated, and strengths and weaknesses are discussed. The original report on the SPCA algorithm argues that the ordering of modes is not an issue. We disagree on this point and propose several approaches to establish sensible orderings. A method that orders modes by decreasing variance and maximizes the sum of variances for all modes is presented and investigated in detail.
MPCA: Multilinear Principal Component Analysis of Tensor Objects.
Lu, Haiping; Plataniotis, Konstantinos N Kostas; Venetsanopoulos, Anastasios N
2008-01-01
This paper introduces a multilinear principal component analysis (MPCA) framework for tensor object feature extraction. Objects of interest in many computer vision and pattern recognition applications, such as 2-D/3-D images and video sequences are naturally described as tensors or multilinear arrays. The proposed framework performs feature extraction by determining a multilinear projection that captures most of the original tensorial input variation. The solution is iterative in nature and it proceeds by decomposing the original problem to a series of multiple projection subproblems. As part of this work, methods for subspace dimensionality determination are proposed and analyzed. It is shown that the MPCA framework discussed in this work supplants existing heterogeneous solutions such as the classical principal component analysis (PCA) and its 2-D variant (2-D PCA). Finally, a tensor object recognition system is proposed with the introduction of a discriminative tensor feature selection mechanism and a novel classification strategy, and applied to the problem of gait recognition. Results presented here indicate MPCA's utility as a feature extraction tool. It is shown that even without a fully optimized design, an MPCA-based gait recognition module achieves highly competitive performance and compares favorably to the state-of-the-art gait recognizers.
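A single pass of the multilinear projection can be sketched in numpy (the tensor sizes and ranks below are invented, and the paper's full algorithm iterates these mode-wise updates rather than doing one pass): for each tensor mode, keep the leading eigenvectors of that mode's total scatter matrix, then project every sample into the reduced core.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical sample set: 50 third-order tensor objects of size 20 x 15 x 10
# (stand-ins for, e.g., gait silhouette sequences).
A = rng.standard_normal((50, 20, 15, 10))

# One pass of the MPCA idea: for each tensor mode, keep the leading eigenvectors
# of that mode's total scatter matrix (the full algorithm iterates these updates).
ranks = (5, 4, 3)
U = []
for mode, r in enumerate(ranks, start=1):
    mat = np.moveaxis(A, mode, 1).reshape(A.shape[0], A.shape[mode], -1)
    scatter = np.einsum('nij,nkj->ik', mat, mat)    # sum over samples of X X^T
    vals, vecs = np.linalg.eigh(scatter)
    U.append(vecs[:, np.argsort(vals)[::-1][:r]])

# Multilinear projection: each 20x15x10 object becomes a 5x4x3 feature tensor,
# reducing 3000 numbers per object to 60.
core = np.einsum('nabc,ax,by,cz->nxyz', A, U[0], U[1], U[2])
```

Unlike flattening every object into a 3000-dimensional vector for ordinary PCA, the projection matrices here act per mode, which is what keeps the parameter count small.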
Palynological applications of principal component and cluster analyses
Adam, David P.
1974-01-01
Two multivariate statistical methods are suggested to help describe patterns in pollen data that result from changes in the relative frequencies of pollen types produced by past climatic and environmental variations. These methods, based on a geometric model, compare samples by use of the product-moment correlation coefficient computed from data subjected to a centering transformation. If there are m samples and n pollen types, then the data can be regarded as a set of m points in an n-dimensional space. The first method, cluster analysis, produces a dendrograph or clustering tree in which samples are grouped with other samples on the basis of their similarity to each other. The second method, principal component analysis, produces a set of variates that are linear combinations of the pollen samples, are uncorrelated with each other, and best describe the data using a minimum number of dimensions. This method is useful in reducing the dimensionality of data sets. A further transformation known as varimax rotation acts on a subset of the principal components to make them easier to interpret. Both methods offer the advantages of reproducibility of results and speed in pattern description. Once the patterns in the data have been described, however, they must be interpreted by the palynologist. An application of the methods in palynology is shown by using data from Osgood Swamp, Calif.
Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.
Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J
2018-03-01
Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
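The serial structure can be sketched compactly (the process data, ranks, and kernel bandwidth below are invented, and the paper's T²/SPE monitoring statistics and fault-identification step are omitted): linear PCA first removes the linear part, then kernel PCA on the residual subspace extracts the nonlinear features.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical process data: two linear factors plus one quadratic (nonlinear) effect.
n = 300
f = rng.standard_normal((n, 2))
X = np.column_stack([f, f @ rng.standard_normal((2, 3)), f[:, 0] ** 2])
X = X - X.mean(axis=0)

# Step 1 (linear part): ordinary PCA keeps 2 PCs; the quadratic structure
# is left behind in the residual subspace.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P = Vt[:2].T
residual = X - X @ P @ P.T

# Step 2 (nonlinear part): kernel PCA with an RBF kernel on the residuals
# extracts nonlinear features from what linear PCA could not explain.
sq = ((residual[:, None, :] - residual[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq / (2.0 * np.median(sq)))             # median-heuristic bandwidth
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H                                      # centered Gram matrix
eigvals, eigvecs = np.linalg.eigh(Kc)
nl_scores = eigvecs[:, -2:][:, ::-1] * np.sqrt(np.abs(eigvals[-2:][::-1]))
```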
CMB constraints on principal components of the inflaton potential
International Nuclear Information System (INIS)
Dvorkin, Cora; Hu, Wayne
2010-01-01
We place functional constraints on the shape of the inflaton potential from the cosmic microwave background through a variant of the generalized slow-roll approximation that allows large amplitude, rapidly changing deviations from scale-free conditions. Employing a principal component decomposition of the source function G′ ≈ 3(V′/V)² - 2V″/V and keeping only those components measured to better than 10% results in 5 nearly independent Gaussian constraints that may be used to test any single-field inflationary model where such deviations are expected. The first component implies <3% variations at the 100 Mpc scale. One component shows a 95% CL preference for deviations around the 300 Mpc scale at the ∼10% level, but the global significance is reduced considering the 5 components examined. This deviation also requires a change in the cold dark matter density which in a flat ΛCDM model is disfavored by current supernova and Hubble constant data, and can be tested with future polarization or high-multipole temperature data. Its impact resembles a local running of the tilt from multipoles 30-800 but is only marginally consistent with a constant running beyond this range. For this analysis, we have implemented a ∼40x faster WMAP7 likelihood method which we have made publicly available.
Principal semantic components of language and the measurement of meaning.
Samsonovich, Alexei V; Ascoli, Giorgio A
2010-06-11
Metric systems for semantics, or semantic cognitive maps, are allocations of words or other representations in a metric space based on their meaning. Existing methods for semantic mapping, such as Latent Semantic Analysis and Latent Dirichlet Allocation, are based on paradigms involving dissimilarity metrics. They typically do not take into account relations of antonymy and yield a large number of domain-specific semantic dimensions. Here, using a novel self-organization approach, we construct a low-dimensional, context-independent semantic map of natural language that represents simultaneously synonymy and antonymy. Emergent semantics of the map principal components are clearly identifiable: the first three correspond to the meanings of "good/bad" (valence), "calm/excited" (arousal), and "open/closed" (freedom), respectively. The semantic map is sufficiently robust to allow the automated extraction of synonyms and antonyms not originally in the dictionaries used to construct the map and to predict connotation from their coordinates. The map geometric characteristics include a limited number ( approximately 4) of statistically significant dimensions, a bimodal distribution of the first component, increasing kurtosis of subsequent (unimodal) components, and a U-shaped maximum-spread planar projection. Both the semantic content and the main geometric features of the map are consistent between dictionaries (Microsoft Word and Princeton's WordNet), among Western languages (English, French, German, and Spanish), and with previously established psychometric measures. By defining the semantics of its dimensions, the constructed map provides a foundational metric system for the quantitative analysis of word meaning. Language can be viewed as a cumulative product of human experiences. Therefore, the extracted principal semantic dimensions may be useful to characterize the general semantic dimensions of the content of mental states. This is a fundamental step toward a
Principal component analysis of FDG PET in amnestic MCI
International Nuclear Information System (INIS)
Nobili, Flavio; Girtler, Nicola; Brugnolo, Andrea; Dessi, Barbara; Rodriguez, Guido; Salmaso, Dario; Morbelli, Silvia; Piccardo, Arnoldo; Larsson, Stig A.; Pagani, Marco
2008-01-01
The purpose of the study is to evaluate the combined accuracy of episodic memory performance and 18F-FDG PET in identifying patients with amnestic mild cognitive impairment (aMCI) converting to Alzheimer's disease (AD), aMCI non-converters, and controls. Thirty-three patients with aMCI and 15 controls (CTR) were followed up for a mean of 21 months. Eleven patients developed AD (MCI/AD) and 22 remained with aMCI (MCI/MCI). 18F-FDG PET volumetric regions of interest underwent principal component analysis (PCA) that identified 12 principal components (PC), expressed by coarse component scores (CCS). Discriminant analysis was performed using the significant PCs and episodic memory scores. PCA highlighted relative hypometabolism in PC5, including bilateral posterior cingulate and left temporal pole, and in PC7, including the bilateral orbitofrontal cortex, both in MCI/MCI and MCI/AD vs CTR. PC5 itself plus PC12, including the left lateral frontal cortex (LFC: BAs 44, 45, 46, 47), were significantly different between MCI/AD and MCI/MCI. By a three-group discriminant analysis, CTR were more accurately identified by PET-CCS + delayed recall score (100%), MCI/MCI by PET-CCS + either immediate or delayed recall scores (91%), while MCI/AD was identified by PET-CCS alone (82%). PET increased by 25% the correct allocations achieved by memory scores, while memory scores increased by 15% the correct allocations achieved by PET. Combining memory performance and 18F-FDG PET yielded a higher accuracy than each single tool in identifying CTR and MCI/MCI. The PC containing bilateral posterior cingulate and left temporal pole was the hallmark of MCI/MCI patients, while the PC including the left LFC was the hallmark of conversion to AD. (orig.)
Preliminary study of soil permeability properties using principal component analysis
Yulianti, M.; Sudriani, Y.; Rustini, H. A.
2018-02-01
Soil permeability measurement is undoubtedly important in carrying out soil-water research such as rainfall-runoff modelling, irrigation water distribution systems, etc. It is also known that acquiring reliable soil permeability data is rather laborious, time-consuming, and costly. Therefore, it is desirable to develop a prediction model. Several studies of empirical equations for predicting permeability have been undertaken by many researchers. These studies derived their models from areas whose soil characteristics differ from those of Indonesian soils, which suggests that these permeability models may be site-specific. The purpose of this study is to identify which soil parameters correspond most strongly to soil permeability and to propose a preliminary model for permeability prediction. Principal component analysis (PCA) was applied to 16 parameters analysed from 37 sites comprising 91 samples obtained from the Batanghari Watershed. Findings indicated five variables that have a strong correlation with soil permeability, and we recommend a preliminary permeability model with potential for further development.
Recursive Principal Components Analysis Using Eigenvector Matrix Perturbation
Directory of Open Access Journals (Sweden)
Deniz Erdogmus
2004-10-01
Full Text Available Principal components analysis is an important and well-studied subject in statistics and signal processing. The literature has an abundance of algorithms for solving this problem, where most of these algorithms could be grouped into one of the following three approaches: adaptation based on Hebbian updates and deflation, optimization of a second-order statistical criterion (like reconstruction error or output variance), and fixed-point update rules with deflation. In this paper, we take a completely different approach that avoids deflation and the optimization of a cost function using gradients. The proposed method updates the eigenvector and eigenvalue matrices simultaneously with every new sample such that the estimates approximately track their true values as would be calculated from the current sample estimate of the data covariance matrix. The performance of this algorithm is compared with that of traditional methods like Sanger's rule and APEX, as well as a structurally similar matrix perturbation-based method.
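The quantity being tracked can be illustrated with a recursive covariance update on a synthetic sample stream (the covariance and sample count below are invented; the cited algorithm avoids the full re-eigendecomposition at each step by perturbing the previous eigenvector matrix, whereas this sketch simply re-diagonalizes):

```python
import numpy as np

rng = np.random.default_rng(8)

# Zero-mean 2-D sample stream with a fixed (hypothetical) covariance.
true_cov = np.array([[4.0, 1.2],
                     [1.2, 1.0]])
L = np.linalg.cholesky(true_cov)

# Recursive PCA target: fold each new sample into a running covariance estimate
# and re-diagonalize, so eigenvector/eigenvalue estimates track their true values.
cov_est = np.zeros((2, 2))
for k in range(1, 2001):
    x = L @ rng.standard_normal(2)
    cov_est += (np.outer(x, x) - cov_est) / k   # rank-one recursive update
    eigvals, eigvecs = np.linalg.eigh(cov_est)

true_vals = np.linalg.eigvalsh(true_cov)
```

After a few thousand samples the running estimates sit close to the true eigenvalues; the perturbation trick matters when the data dimension makes per-sample eigendecomposition too expensive.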
International Nuclear Information System (INIS)
Waheed, S.; Rahman, S.; Siddique, N.
2013-01-01
Different types of Ca supplements are available in the local markets of Pakistan. It is sometimes difficult to classify these with respect to their composition. In the present work the principal component analysis (PCA) technique was applied to classify different Ca supplements on the basis of their elemental data obtained using instrumental neutron activation analysis (INAA) and atomic absorption spectrometry (AAS). The graphical representation of PCA scores from the intricate analytical data successfully separated the Ca supplements into four groups, with compatible samples grouped together: supplements with CaCO₃ as the Ca source along with vitamin C, supplements with CaCO₃ as the Ca source along with vitamin D, supplements with Ca from bone meal, and supplements with chelated calcium. (author)
Principal Component Analysis of Process Datasets with Missing Values
Directory of Open Access Journals (Sweden)
Kristen A. Severson
2017-07-01
Datasets with missing values arising from causes such as sensor failure, inconsistent sampling rates, and merging data from different systems are common in the process industry. Methods for handling missing data typically operate during data pre-processing, but can also occur during model building. This article considers missing data within the context of principal component analysis (PCA), a method originally developed for complete data that has widespread industrial application in multivariate statistical process control. Due to the prevalence of missing data and the success of PCA for handling complete data, several PCA algorithms that can act on incomplete data have been proposed. Here, algorithms for applying PCA to datasets with missing values are reviewed. A case study is presented to demonstrate the performance of the algorithms, and suggestions are made with respect to choosing which algorithm is most appropriate for particular settings. An alternating algorithm based on the singular value decomposition achieved the best results in the majority of test cases involving process datasets.
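The general idea behind an alternating SVD algorithm can be sketched as follows: fill missing entries with a starting guess, fit a low-rank model, replace the missing entries with the model's reconstruction, and repeat. This is a minimal illustration on synthetic low-rank data, not the exact algorithm favoured in the article:

```python
import numpy as np

def pca_impute(X, rank, n_iter=200):
    """Alternating SVD imputation: observed entries stay fixed,
    missing entries are iteratively refit by a rank-`rank` model."""
    missing = np.isnan(X)
    filled = np.where(missing, 0.0, X)
    col_means = filled.sum(axis=0) / (~missing).sum(axis=0)
    filled[missing] = np.broadcast_to(col_means, X.shape)[missing]
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        filled[missing] = low_rank[missing]   # only missing cells change
    return filled

rng = np.random.default_rng(1)
truth = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 12))  # rank-2 data
X = truth.copy()
X[rng.random(X.shape) < 0.2] = np.nan                        # 20% missing
recovered = pca_impute(X, rank=2)
```

On exactly low-rank data with a modest fraction of missing entries, the missing cells are recovered almost exactly; noisy data would leave a residual error at the noise level.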
Comparison of analytical eddy current models using principal components analysis
Contant, S.; Luloff, M.; Morelli, J.; Krause, T. W.
2017-02-01
Monitoring the gap between the pressure tube (PT) and the calandria tube (CT) in CANDU® fuel channels is essential, as contact between the two tubes can lead to delayed hydride cracking of the pressure tube. Multifrequency transmit-receive eddy current non-destructive evaluation is used to determine this gap, as this method has different depths of penetration and variable sensitivity to noise, unlike single-frequency eddy current non-destructive evaluation. An analytical model based on the Dodd and Deeds solutions, and a second model that accounts for normal and lossy self-inductances and a non-coaxial pickup coil, are examined for representing the response of an eddy current transmit-receive probe when considering factors that affect the gap response, such as pressure tube wall thickness and pressure tube resistivity. The multifrequency model data were analyzed using principal components analysis (PCA), a statistical method used to reduce a data set to one of fewer variables. The results of the PCA of the analytical models were then compared to PCA performed on a previously obtained experimental data set. The models gave similar results under variable PT wall thickness conditions, but the non-coaxial coil model, which accounts for self-inductive losses, performed significantly better than the Dodd and Deeds model under variable resistivity conditions.
Principal component analysis of 1/f^α noise
International Nuclear Information System (INIS)
Gao, J.B.; Cao Yinhe; Lee, J.-M.
2003-01-01
Principal component analysis (PCA) is a popular data analysis method. One of the motivations for using PCA in practice is to reduce the dimension of the original data by projecting the raw data onto a few dominant eigenvectors with large variance (energy). Due to the ubiquity of 1/f^α noise in science and engineering, in this Letter we study the prototypical stochastic model for 1/f^α processes, the fractional Brownian motion (fBm) processes, using PCA, and find that the eigenvalues from PCA of fBm processes follow a power law, with the exponent being the key parameter defining the fBm processes. We also study random-walk-type processes constructed from DNA sequences, and find that the eigenvalue spectrum from PCA of those random-walk processes also follows power-law relations, with the exponent characterizing the correlation structure of the DNA sequence. In fact, it is observed that PCA can automatically remove linear trends induced by patchiness in the DNA sequence; hence, PCA has a capability similar to that of detrended fluctuation analysis. Implications of the power-law-distributed eigenvalue spectrum are discussed.
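The power-law eigenvalue spectrum can be checked numerically for ordinary Brownian motion (fBm with Hurst exponent H = 1/2), whose covariance E[B_s B_t] = min(s, t) is known in closed form. Its PCA eigenvalues scale as λ_k ∝ (k − 1/2)^(−2), i.e. a power law with exponent −2; a sketch using the exact discretized covariance (sample PCA of simulated paths would approximate the same spectrum):

```python
import numpy as np

# Discretized covariance matrix of Brownian motion: K[i, j] = min(t_i, t_j)
n = 500
t = np.arange(1, n + 1)
K = np.minimum.outer(t, t).astype(float)

# PCA eigenvalues in descending order
lam = np.linalg.eigvalsh(K)[::-1]

# Fit log(lambda_k) against log(k - 1/2) over the leading eigenvalues;
# the slope is the power-law exponent, close to -2 for H = 1/2.
k = np.arange(1, 21)
slope = np.polyfit(np.log(k - 0.5), np.log(lam[:20]), 1)[0]
print(f"fitted power-law exponent: {slope:.3f}")
```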
PRINCIPAL COMPONENT ANALYSIS STUDIES OF TURBULENCE IN OPTICALLY THICK GAS
International Nuclear Information System (INIS)
Correia, C.; Medeiros, J. R. De; Lazarian, A.; Burkhart, B.; Pogosyan, D.
2016-01-01
In this work we investigate the sensitivity of principal component analysis (PCA) to the velocity power spectrum in high-opacity regimes of the interstellar medium (ISM). For our analysis we use synthetic position–position–velocity (PPV) cubes of fractional Brownian motion and magnetohydrodynamics (MHD) simulations, post-processed to include radiative transfer effects from CO. We find that PCA behaves very differently from tools based on the traditional power spectrum of PPV data cubes. Our major finding is that PCA is also sensitive to the phase information of PPV cubes, and this allows PCA to detect changes in the underlying velocity and density spectra at high opacities, where the spectral analysis of the maps provides the universal −3 spectrum in accordance with the predictions of the Lazarian and Pogosyan theory. This makes PCA a potentially valuable tool for studies of turbulence at high opacities, provided that proper gauging of the PCA index is made. However, we found the latter not to be easy, as the PCA results change in an irregular way for data with high sonic Mach numbers. This is in contrast to synthetic Brownian noise data used for velocity and density fields, which show monotonic PCA behavior. We attribute this difference to PCA's sensitivity to Fourier phase information.
Sensor Failure Detection of FASSIP System using Principal Component Analysis
Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina
2018-02-01
In the Fukushima Daiichi nuclear reactor accident in Japan, the damage to the core and pressure vessel was caused by the failure of the active cooling system (the diesel generators were inundated by the tsunami). Research on passive cooling systems for nuclear power plants is therefore being performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. The accuracy of the sensor measurements of the FASSIP system is essential, because they serve as the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failure can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimension of the sensor data, with the Squared Prediction Error (SPE) and Hotelling's T² statistic as the criteria for detecting a sensor failure indication. The results show that the PCA method is capable of detecting the occurrence of a failure at any sensor.
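The SPE-based part of such a detector can be sketched generically. The data below are a hypothetical stand-in for FASSIP measurements (correlated sensors driven by a few latent process variables), not the facility's actual instrumentation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in: 6 correlated "sensors" driven by 2 latent
# process variables plus small measurement noise.
mixing = rng.normal(size=(2, 6))
def read_sensors(n):
    return rng.normal(size=(n, 2)) @ mixing + 0.05 * rng.normal(size=(n, 6))

train = read_sensors(2000)
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
P = Vt[:2].T                               # retained principal directions

def spe(X):
    """Squared Prediction Error: squared norm of the residual left after
    projecting each sample onto the retained principal subspace."""
    R = (X - mean) - (X - mean) @ P @ P.T
    return (R ** 2).sum(axis=1)

threshold = np.percentile(spe(train), 99)  # empirical control limit

test = read_sensors(500)
faulty = test[0].copy()
faulty[3] += 5.0                           # inject a bias fault on sensor 3
```

A bias fault on any single sensor breaks the learned correlation structure, so its sample's SPE jumps well above the control limit while healthy samples stay below it.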
A principal component analysis of 39 scientific impact measures.
Directory of Open Access Journals (Sweden)
Johan Bollen
BACKGROUND: The impact of scientific publications has traditionally been expressed in terms of citation counts. However, scientific activity has moved online over the past decade. To better capture scientific impact in the digital era, a variety of new impact measures has been proposed on the basis of social network analysis and usage log data. Here we investigate how these new measures relate to each other, and how accurately and completely they express scientific impact. METHODOLOGY: We performed a principal component analysis of the rankings produced by 39 existing and proposed measures of scholarly impact that were calculated on the basis of both citation and usage log data. CONCLUSIONS: Our results indicate that the notion of scientific impact is a multi-dimensional construct that cannot be adequately measured by any single indicator, although some measures are more suitable than others. The commonly used citation Impact Factor is not positioned at the core of this construct, but at its periphery, and should thus be used with caution.
Directory of Open Access Journals (Sweden)
Qiang Lin
Lung cancer is the leading cause of cancer death worldwide, but techniques for effective early diagnosis are still lacking. Proteomics technology has been applied extensively to the study of the proteins involved in carcinogenesis. In this paper, a classification method was developed based on principal components of surface-enhanced laser desorption/ionization (SELDI) spectral data. This method was applied to SELDI spectral data from 71 lung adenocarcinoma patients and 24 healthy individuals. Unlike other peak-selection-based methods, this method treats each spectrum as a whole. The aim of this paper was to demonstrate that this whole-spectrum classification method is more robust and powerful as a method of diagnosis than peak-selection-based methods. The results showed that this classification method, based on principal components, has outstanding performance with respect to distinguishing lung adenocarcinoma patients from normal individuals. Through leave-one-out, 19-fold, 5-fold and 2-fold cross-validation studies, we found that this classification method based on principal components completely outperforms peak-selection-based methods such as decision trees, classification and regression trees, support vector machines, and linear discriminant analysis. The classification method based on principal components of SELDI spectral data is a robust and powerful means of diagnosing lung adenocarcinoma. We assert that the high efficiency of this classification method renders it feasible for large-scale clinical use.
Principal component approach in variance component estimation for international sire evaluation
Directory of Open Access Journals (Sweden)
Jakobsen Jette
2011-05-01
Abstract Background The dairy cattle breeding industry is a highly globalized business, which needs internationally comparable and reliable breeding values of sires. The international Bull Evaluation Service, Interbull, was established in 1983 to respond to this need. Currently, Interbull performs multiple-trait across country evaluations (MACE) for several traits and breeds in dairy cattle and provides international breeding values to its member countries. Estimating parameters for MACE is challenging, since the structure of the datasets and the conventional use of multiple-trait models easily result in over-parameterized genetic covariance matrices. The number of parameters to be estimated can be reduced by taking into account only the leading principal components of the traits considered. For MACE, this is readily implemented in a random regression model. Methods This article compares two principal component approaches to estimate variance components for MACE using real datasets. The methods tested were a REML approach that directly estimates the genetic principal components (direct PC) and the so-called bottom-up REML approach (bottom-up PC), in which traits are sequentially added to the analysis and the statistically significant genetic principal components are retained. Furthermore, this article evaluates the utility of the bottom-up PC approach for determining the appropriate rank of the (co)variance matrix. Results Our study demonstrates the usefulness of both approaches and shows that they can be applied to large multi-country models considering all concerned countries simultaneously. These strategies can thus replace the current practice of estimating the covariance components required through a series of analyses involving selected subsets of traits. Our results support the importance of using the appropriate rank in the genetic (co)variance matrix. Using too low a rank resulted in biased parameter estimates, whereas too high a rank did not result in
Lim, Hoong-Ta; Murukeshan, Vadakke Matham
2017-06-01
Hyperspectral imaging combines imaging and spectroscopy to provide detailed spectral information for each spatial point in the image. This gives a three-dimensional spatial-spatial-spectral datacube with hundreds of spectral images. Probe-based hyperspectral imaging systems have been developed so that they can be used in regions that conventional table-top platforms would find difficult to access. A fiber bundle, made up of specially arranged optical fibers, has recently been developed and integrated with a spectrograph-based hyperspectral imager. This forms a snapshot hyperspectral imaging probe, which is able to form a datacube from the information in each scan. Compared to other configurations, which require sequential scanning to form a datacube, the snapshot configuration is preferred in real-time applications, where motion artifacts and pixel misregistration can be minimized. Principal component analysis is a dimension-reducing technique that can be applied in hyperspectral imaging to convert the spectral information into uncorrelated variables known as principal components. A confidence ellipse can be used to define the region of each class in the principal component feature space and serve as a basis for classification. This paper demonstrates the use of the snapshot hyperspectral imaging probe to acquire data from samples of different colors. The spectral library of each sample was acquired and then analyzed using principal component analysis. A confidence ellipse was then applied to the principal components of each sample and used as the classification criterion. The results show that the applied analysis can be used to classify the spectral data acquired using the snapshot hyperspectral imaging probe.
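The confidence-ellipse classification step can be sketched under the simplifying assumptions of two principal components per class and a 95% chi-square cutoff (χ² ≈ 5.991 for 2 degrees of freedom); the "spectral libraries" below are synthetic stand-ins for the probe's data:

```python
import numpy as np

rng = np.random.default_rng(3)
CHI2_95_2DOF = 5.991                       # 95% quantile of chi-square, 2 dof

# Synthetic "spectral libraries" for two well-separated color classes
spectra_a = rng.normal(loc=0.0, scale=1.0, size=(300, 5))
spectra_b = rng.normal(loc=10.0, scale=1.0, size=(300, 5))

# Shared 2-component PCA basis fitted on both libraries together
stack = np.vstack([spectra_a, spectra_b])
mean = stack.mean(axis=0)
_, _, Vt = np.linalg.svd(stack - mean, full_matrices=False)

def to_pc(X):
    return (X - mean) @ Vt[:2].T

def inside_ellipse(points, reference):
    """True where a point's squared Mahalanobis distance to the reference
    class, in PC space, falls within the 95% confidence ellipse."""
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    d = points - mu
    d2 = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return d2 <= CHI2_95_2DOF

pc_a, pc_b = to_pc(spectra_a), to_pc(spectra_b)
new = to_pc(rng.normal(loc=0.0, scale=1.0, size=(200, 5)))  # class-a samples
```

New class-a spectra fall inside class a's ellipse roughly 95% of the time and essentially never inside class b's, which is what makes the ellipse usable as a classification criterion.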
Selection of non-zero loadings in sparse principal component analysis
DEFF Research Database (Denmark)
Gajjar, Shriram; Kulahci, Murat; Palazoglu, Ahmet
2017-01-01
Principal component analysis (PCA) is a widely accepted procedure for summarizing data through dimensional reduction. In PCA, the selection of the appropriate number of components and the interpretation of those components have been the key challenging features. Sparse principal component analysis...
Principal Component and Independent Component Calculation of ECG Signal in Different Posture
Gupta, Varun; Singh, Dilbag; Sharma, Arvind Kumar
2011-12-01
A correct diagnosis of the problem can only lead to the correct treatment. The use of various signals (biomedical signals) produced by the human body for the purpose of diagnosis is not new, and it is well known that a proper analysis of biomedical signals leads to a correct diagnosis of the problem. Presently, ECG signals are collected in the supine position, which may not be comfortable for all patients. In this paper, a method of analysing ECG signals collected in various postures, i.e. sitting, standing and supine, is presented. ECG signals of 16 patients were collected in all three postures and analysed using Principal Component Analysis (PCA) and Independent Component Analysis (ICA). PCA and ICA together are suitable for signal analysis because PCA reduces the dimension of the acquired data whereas ICA reduces the noise of the dimension-reduced signal. The eigenvalue variance for all three postures was calculated; the first principal component accounts for 99.95% of the variance in the standing posture, 99.94% in the sitting posture, and 99.66% in the supine posture. The result indicates a large reduction of the data with considerable accuracy.
de Souza, Juliana Bottoni; Reisen, Valdério Anselmo; Santos, Jane Méri; Franco, Glaura Conceição
2014-01-01
OBJECTIVE To analyze the association between concentrations of air pollutants and admissions for respiratory causes in children. METHODS Ecological time series study. Daily figures for hospital admissions of children aged < 6, and daily concentrations of air pollutants (PM10, SO2, NO2, O3 and CO), were analyzed in the Região da Grande Vitória, ES, Southeastern Brazil, from January 2005 to December 2010. For the statistical analysis, two techniques were combined: Poisson regression with generalized additive models and principal component analysis. These analysis techniques complemented each other and provided more significant estimates of relative risk. The models were adjusted for temporal trend, seasonality, day of the week, meteorological factors and autocorrelation. In the final adjustment of the model, it was necessary to include Autoregressive Moving Average (p, q) models in the residuals in order to eliminate the autocorrelation structures present in the components. RESULTS For every 10.49 μg/m³ increase (interquartile range) in levels of the pollutant PM10 there was a 3.0% increase in the relative risk estimated using the generalized additive model with principal components and seasonal autoregressive terms, while in the usual generalized additive model the estimate was 2.0%. CONCLUSIONS Compared to the usual generalized additive model, the proposed generalized additive model with principal component analysis showed, in general, better results in estimating relative risk and quality of fit. PMID:25119940
Zhang, Qiong; Peng, Cong; Lu, Yiming; Wang, Hao; Zhu, Kaiguang
2018-04-01
A novel technique is developed to level airborne geophysical data using principal component analysis based on flight line differences. In this paper, flight line differencing is introduced to enhance the features of the levelling error in airborne electromagnetic (AEM) data and to improve the correlation between pseudo tie lines. We therefore apply levelling to the flight line difference data instead of directly to the original AEM data. Pseudo tie lines are selected distributively across the profile direction, avoiding anomalous regions. Since the levelling errors of the selected pseudo tie lines show high correlations, principal component analysis is applied to extract the local levelling errors by low-order principal component reconstruction. Furthermore, we can obtain the levelling errors of the original AEM data through inverse differencing after spatial interpolation. This levelling method requires neither flying tie lines nor designing a levelling fitting function. The effectiveness of the method is demonstrated by the levelling results on survey data, compared with the results from tie-line levelling and flight-line correlation levelling.
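The low-order reconstruction step can be sketched on synthetic stand-in data: a correlated (here rank-one) levelling-error pattern shared across pseudo tie lines is recovered from the first principal component and subtracted. The levelling-specific pre- and post-processing (flight line differencing, spatial interpolation) is omitted:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic pseudo tie lines: a shared levelling-error pattern with a
# line-dependent amplitude, buried in independent geological "signal".
n_lines, n_samples = 30, 400
pattern = np.sin(np.linspace(0, 3 * np.pi, n_samples))   # error shape
amplitude = 1.0 + 0.5 * rng.normal(size=n_lines)         # per-line scale
error = np.outer(amplitude, pattern)                     # rank-1 correlated error
signal = 0.3 * rng.normal(size=(n_lines, n_samples))     # uncorrelated part
data = error + signal

# Low-order principal component reconstruction: the correlated levelling
# error dominates the first component, so reconstructing with it alone
# isolates the error, which is then subtracted from the data.
mean = data.mean(axis=0)
U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)
estimated_error = mean + (U[:, :1] * s[:1]) @ Vt[:1]
levelled = data - estimated_error
```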
The Reduction of the Dimensionality of Redundant Sensor Data Using Principal Component Analysis
Energy Technology Data Exchange (ETDEWEB)
Shin, Ho Cheol; Park, Moon Gue; Lee, Eun Gi; Bae, Sung Man [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)
2007-07-01
The safety-related sensors of a nuclear power plant are redundant. Redundant sensor systems achieve fault tolerance by duplication of components. This increases the ability of systems to interact with their environment by combining independent sensor readings into logical representations. Sensor integration in highly redundant systems offers these advantages: 1) multiple inaccurate sensors can cost less than a few accurate sensors; 2) sensor reliability may increase; 3) sensor efficiency and performance can be enhanced; 4) self-calibration can be attained. However, the feasibility of such systems requires that attention be paid to both reliability bounds and cost. Several on-line monitoring techniques have been developed that calculate the parameter estimate using only the measurements from a group of redundant instrument channels. These techniques are commonly referred to as redundant sensor calibration monitoring models. In this paper, we reduce the dimensionality of redundant sensor data using principal component analysis.
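As an illustration of the dimensionality reduction (not the authors' specific implementation), the common signal behind a group of redundant channels can be summarized by the first principal component alone:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical redundant channels: one true process value observed by
# 8 instruments, each with independent measurement noise.
true_value = np.cumsum(rng.normal(size=1000)) * 0.1   # slowly drifting process
channels = true_value[:, None] + 0.1 * rng.normal(size=(1000, 8))

# First principal component of the channel matrix: since the channels
# share a single underlying signal, PC1 recovers it (up to scale/sign).
mean = channels.mean(axis=0)
U, s, Vt = np.linalg.svd(channels - mean, full_matrices=False)
pc1_score = U[:, 0] * s[0]

corr = np.corrcoef(pc1_score, true_value)[0, 1]
print(f"|correlation with true value|: {abs(corr):.3f}")
```

The single PC1 score replaces eight redundant readings while remaining almost perfectly correlated with the underlying process value, which is the sense in which the dimensionality is reduced.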
Portable XRF and principal component analysis for bill characterization in forensic science
International Nuclear Information System (INIS)
Appoloni, C.R.; Melquiades, F.L.
2014-01-01
Several modern techniques have been applied to prevent the counterfeiting of money bills. The objective of this study was to demonstrate the potential of the portable X-ray fluorescence (PXRF) technique and the multivariate analysis method of principal component analysis (PCA) for the classification of bills, in order to use it in forensic science. Bills of the Dollar, Euro and Real (Brazilian currency) were measured directly at different colored regions, without any previous preparation. Spectra interpretation allowed the identification of Ca, Ti, Fe, Cu, Sr, Y, Zr and Pb. PCA separated the bills into three groups, with subgroups among the Brazilian currency. In conclusion, the samples were classified according to their origin, identifying the elements responsible for differentiation and basic pigment composition. PXRF allied with multivariate discriminant methods is a promising technique for the rapid and non-destructive identification of false bills in forensic science. - Highlights: • The paper presents a direct method for bill discrimination by EDXRF and principal component analysis. • The bills are analyzed directly, without sample preparation and non-destructively. • The results demonstrate that the methodology is feasible and could be applied in forensic science for identification of origin and of false banknotes. • The novelty is that portable EDXRF is very fast and efficient for bill characterization.
Improving three-dimensional mechanical imaging of breast lesions with principal component analysis.
Tyagi, Mohit; Wang, Yuqi; Hall, Timothy J; Barbone, Paul E; Oberai, Assad A
2017-08-01
Elastography has emerged as a new tool for detecting and diagnosing many types of diseases, including breast cancer. To date, most clinical applications of elastography have utilized two-dimensional strain images. The goal of this paper is to present a new quasi-static elastography technique that yields shear modulus images in three dimensions. An automated breast volume scanner was used to acquire ultrasound images of the breast as it was gently compressed. Cross-correlation between successive images was used to determine the displacement within the tissue. The resulting displacement field was filtered of all but compressive motion through principal component analysis. This displacement field was used to infer the spatial distribution of shear modulus by solving a 3D elastic inverse problem. Three-dimensional shear modulus images of benign breast lesions for two subjects were generated using the techniques described above. It was found that the lesions were visualized more clearly in images generated using the displacement data de-noised through the use of principal components. We have presented experimental and algorithmic techniques that lead to three-dimensional imaging of the shear modulus using quasi-static elastography. This work demonstrates the feasibility of this approach, and lays the foundation for images of other, more informative, mechanical parameters. © 2017 American Association of Physicists in Medicine.
(NDSI) and Normalised Difference Principal Component Snow Index
African Journals Online (AJOL)
Phila Sibandze
overview of some of the common remote sensing datasets and methods used in snow mapping. One of the most successful image-based snow mapping techniques is the Normalized Difference Snow Index (NDSI) proposed by Hall et al. (2001). This technique exploits the ratio between snow's high reflectance and strong ...
Improving hierarchical clustering of genotypic data via principal component analysis
Odong, T.L.; Heerwaarden, van J.; Hintum, van T.J.L.; Eeuwijk, van F.A.; Jansen, J.
2013-01-01
Understanding the genetic structure of germplasm collections is a prerequisite for effective and efficient use of crop genetic resources in genebanks. Currently, hierarchical clustering techniques are most popular for describing genetic structure in germplasm collections. Traditionally performed
Speed up Robust Features (SURF) with Principal Component ...
African Journals Online (AJOL)
A novel Computer Aided Diagnosis (CADx) component is proposed for breast cancer classification. Four major phases were conducted in this research. The first phase is pre-processing; this is followed by a feature extraction phase using Speeded Up Robust Features (SURF). The next phase is feature selection by ...
Application of the Model of Principal Components Analysis on Romanian Insurance Market
Directory of Open Access Journals (Sweden)
Dan Armeanu
2008-06-01
Principal components analysis (PCA) is a multivariate data analysis technique whose main purpose is to reduce the dimension of the observations and thus simplify the analysis and interpretation of data, as well as facilitate the construction of predictive models. A rigorous definition of PCA has been given by Bishop (1995), who states that PCA is a linear dimensionality reduction technique which identifies orthogonal directions of maximum variance in the original data and projects the data into a lower-dimensional space formed of a subset of the highest-variance components. PCA is commonly used in economic research, as well as in other fields of activity. When faced with the complexity of economic and financial processes, researchers have to analyze a large number of variables (or indicators), which often proves troublesome because it is difficult to collect such a large amount of data and perform calculations on it. In addition, there is a good chance that the initial data are powerfully correlated; therefore, the signification of variables is seriously diminished and it is virtually impossible to establish causal relationships between variables. Researchers thus require a simple yet powerful analytical tool to solve these problems and perform a coherent and conclusive analysis. This tool is PCA. The essence of PCA consists of transforming the space of the initial data into another space of lower dimension while maximising the quantity of information recovered from the initial space. Mathematically speaking, PCA is a method of determining a new space (called the principal component space or factor space) onto which the original space of variables can be projected. The axes of the new space (called factor axes) are defined by the principal components determined as a result of PCA. Principal components (PCs) are standardized linear combinations (SLCs) of the original variables and are uncorrelated. Theoretically, the number of PCs equals the number of original variables.
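The textbook construction described above can be sketched directly: center the data, diagonalize the covariance matrix, and project onto the eigenvectors (the factor axes), which yields uncorrelated components ordered by variance:

```python
import numpy as np

def pca(X, n_components):
    """Plain PCA via the eigendecomposition of the covariance matrix."""
    Xc = X - X.mean(axis=0)                  # center each variable
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalue order
    order = np.argsort(eigvals)[::-1]        # re-sort descending
    axes = eigvecs[:, order[:n_components]]  # factor axes
    return Xc @ axes, eigvals[order]         # component scores, variances

rng = np.random.default_rng(6)
# correlated synthetic "indicators"
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 6))
scores, variances = pca(X, n_components=3)
```

The scores have a diagonal covariance matrix (the components are uncorrelated) whose diagonal entries are the leading eigenvalues, in decreasing order.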
Identifying apple surface defects using principal components analysis and artificial neural networks
Artificial neural networks and principal components were used to detect surface defects on apples in near-infrared images. Neural networks were trained and tested on sets of principal components derived from columns of pixels from images of apples acquired at two wavelengths (740 nm and 950 nm). I...
Principal Component Analysis of Long-Lag, Wide-Pulse Gamma ...
Indian Academy of Sciences (India)
2016-01-27
We have carried out a Principal Component Analysis (PCA) of the temporal and spectral variables of 24 long-lag, wide-pulse gamma-ray bursts (GRBs) presented by Norris et al. (2005). Taking all eight temporal and spectral parameters into account, our analysis shows that four principal components are ...
Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S
2017-06-01
Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan
2017-09-01
Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems depends mainly on the selection of the most relevant features. This becomes harder when the dataset contains missing values for different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for handling missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset, and provides diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts the projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
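For the complete-data case, the maximum-likelihood PPCA solution has a closed form (Tipping and Bishop); the sketch below applies it to synthetic data standing in for medical-test features. The missing-value EM variant the paper relies on is not shown, and all sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for medical-test features: 200 subjects, 8 features
# driven by 2 latent factors.
W_true = rng.normal(size=(8, 2))
X = rng.normal(size=(200, 2)) @ W_true.T + 0.1 * rng.normal(size=(200, 8))

def ppca_fit(X, q):
    """Closed-form maximum-likelihood PPCA (complete data): W spans the top-q
    eigenvectors scaled by sqrt(eigenvalue - sigma2), and sigma2 is the mean
    of the discarded eigenvalues."""
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    evals, evecs = evals[::-1], evecs[:, ::-1]          # descending order
    sigma2 = evals[q:].mean()                           # isotropic noise variance
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2

W, sigma2 = ppca_fit(X, q=2)
# Columns of W play the role of the projection vectors; the reduced features
# below would feed a downstream classifier such as an RBF SVM.
scores = (X - X.mean(axis=0)) @ W
```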
IMPROVED SEARCH OF PRINCIPAL COMPONENT ANALYSIS DATABASES FOR SPECTRO-POLARIMETRIC INVERSION
International Nuclear Information System (INIS)
Casini, R.; Lites, B. W.; Ramos, A. Asensio; Ariste, A. López
2013-01-01
We describe a simple technique for the acceleration of spectro-polarimetric inversions based on principal component analysis (PCA) of Stokes profiles. This technique involves the indexing of the database models based on the sign of the projections (PCA coefficients) of the first few relevant orders of principal components of the four Stokes parameters. In this way, each model in the database can be attributed a distinctive binary number of 4n bits, where n is the number of PCA orders used for the indexing. Each of these binary numbers (indices) identifies a group of "compatible" models for the inversion of a given set of observed Stokes profiles sharing the same index. The complete set of the binary numbers so constructed evidently determines a partition of the database. The search of the database for the PCA inversion of spectro-polarimetric data can profit greatly from this indexing. In practical cases it becomes possible to approach the ideal acceleration factor of 2^(4n) as compared to the systematic search of a non-indexed database for a traditional PCA inversion. This indexing method relies on the existence of a physical meaning in the sign of the PCA coefficients of a model. For this reason, the presence of model ambiguities and of spectro-polarimetric noise in the observations limits in practice the number n of relevant PCA orders that can be used for the indexing.
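The sign-indexing idea can be sketched in a few lines: compute the first few PCA coefficients of each database model, pack their signs into a binary index, and then search only the "compatible" group for an observed profile. Everything below (database size, profile length, noise level) is made up for illustration, and a single block of n sign bits stands in for the paper's 4n bits over the four Stokes parameters:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

# Database of synthetic "Stokes profiles": 4096 models, 128 points each.
database = rng.normal(size=(4096, 128))
n = 3                                           # PCA orders used for indexing

mean = database.mean(axis=0)
_, _, Vt = np.linalg.svd(database - mean, full_matrices=False)

def index_of(profile):
    """Binary index built from the signs of the first n PCA coefficients."""
    coeffs = (profile - mean) @ Vt[:n].T
    return int("".join("1" if c > 0 else "0" for c in coeffs), 2)

# The indices partition the database into groups of "compatible" models.
partition = defaultdict(list)
for i, model in enumerate(database):
    partition[index_of(model)].append(i)

# A slightly noisy observation of model 42 falls in model 42's group,
# so only that group needs to be searched during inversion.
obs = database[42] + 1e-6 * rng.normal(size=128)
group = partition[index_of(obs)]
```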
International Nuclear Information System (INIS)
Jung, Young Mee
2003-01-01
Principal component analysis based two-dimensional (PCA-2D) correlation analysis is applied to FTIR spectra of a polystyrene/methyl ethyl ketone/toluene solution mixture during solvent evaporation. A substantial amount of artificial noise was added to the experimental data to demonstrate the practical noise-suppressing benefit of the PCA-2D technique. 2D correlation analysis of the data matrix reconstructed from PCA loading vectors and scores successfully extracted only the most important features of synchronicity and asynchronicity, without interference from noise or insignificant minor components. 2D correlation spectra constructed with only one principal component yield a strictly synchronous response with no discernible asynchronous features, while those involving at least two principal components generated meaningful asynchronous 2D correlation spectra. Deliberate manipulation of the rank of the reconstructed data matrix, by choosing the appropriate number and type of PCs, yields potentially more refined 2D correlation spectra.
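The noise-suppressing reconstruction step used here — rebuild the data matrix from only the first few loading vectors and scores — can be sketched with synthetic two-component "spectra" (band shapes and the noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic two-component "spectra": 100 mixtures over 300 wavenumber points.
t = np.linspace(0, 1, 300)
band1 = np.exp(-((t - 0.3) ** 2) / 0.002)
band2 = np.exp(-((t - 0.7) ** 2) / 0.002)
concentrations = rng.random((100, 2))
clean = concentrations @ np.vstack([band1, band2])
noisy = clean + 0.2 * rng.normal(size=clean.shape)

# Reconstruct the data matrix from only the first k loading vectors and
# scores; the discarded components carry mostly noise.
mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
k = 2                            # rank matched to the number of chemical components
denoised = mean + (U[:, :k] * s[:k]) @ Vt[:k]

err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)
```

The rank-2 reconstruction keeps the mixture signal while discarding most of the added noise, so its error against the clean data is far below that of the raw noisy matrix.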
Machine learning of frustrated classical spin models. I. Principal component analysis
Wang, Ce; Zhai, Hui
2017-10-01
This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model in frustrated triangular and union jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
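A toy version of the idea — using an unfrustrated Ising-like ferromagnet instead of the paper's frustrated XY models, to keep the sketch short — shows how the leading principal component of raw spin configurations recovers the order parameter:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy Monte Carlo stand-in: Ising-like configurations of 100 spins.
def sample(disordered, n_samples=200, n_spins=100):
    if disordered:                   # high temperature: independent random spins
        return rng.choice([-1.0, 1.0], size=(n_samples, n_spins))
    # low temperature: spins follow a random global sign, with a few flips
    base = rng.choice([-1.0, 1.0], size=(n_samples, 1)) * np.ones((n_samples, n_spins))
    flips = rng.random((n_samples, n_spins)) < 0.05
    return np.where(flips, -base, base)

X = np.vstack([sample(False), sample(True)])    # ordered first, disordered second
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

# The leading principal component is nearly uniform over spins, so its score
# measures the magnetization -- the order parameter separating the phases.
pc1 = Xc @ Vt[0]
ordered_spread = np.abs(pc1[:200]).mean()
disordered_spread = np.abs(pc1[200:]).mean()
```

Ordered configurations produce large-magnitude scores of either sign along the first component, while disordered ones cluster near zero, which is exactly the kind of signature the paper reads off its PCA outputs.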
Lin, Mengshi; Al-Holy, Murad; Al-Qadiri, Hamzah; Kang, Dong-Hyun; Cavinato, Anna G; Huang, Yiqun; Rasco, Barbara A
2004-09-22
Fourier transform infrared spectroscopy (FT-IR, 4000-600 cm⁻¹) was used to discriminate between intact and sonication-injured Listeria monocytogenes ATCC 19114 and to distinguish this strain from other selected Listeria strains (L. innocua ATCC 51742, L. innocua ATCC 33090, and L. monocytogenes ATCC 7644). FT-IR vibrational overtone and combination bands from mid-IR active components of intact and injured bacterial cells produced distinctive "fingerprints" at wavenumbers between 1500 and 800 cm⁻¹. Spectral data were analyzed by principal component analysis. Clear segregations of different intact and injured strains of Listeria were observed, suggesting that FT-IR can detect biochemical differences between intact and injured bacterial cells. This technique may provide a tool for the rapid assessment of cell viability and thereby the control of foodborne pathogens.
Directory of Open Access Journals (Sweden)
Shengkun Xie
2014-01-01
Classification of electroencephalography (EEG) signals is the most useful diagnostic and monitoring procedure in epilepsy study. A reliable algorithm that can be easily implemented is the key to this procedure. In this paper a novel signal feature extraction method based on dynamic principal component analysis and non-overlapping moving windows is proposed. Along with this new technique, two detection methods based on extracted sparse features are applied to deal with signal classification. The obtained results demonstrate that our proposed methodologies are able to differentiate control EEGs from interictal EEGs for epilepsy diagnosis, and to separate interictal EEGs from ictal EEGs for seizure detection. Our approach yields high classification accuracy for both single-channel short-term EEGs and multichannel long-term EEGs. The classification performance of the method is also compared with other state-of-the-art techniques on the same datasets, and the effect of signal variability on the presented methods is studied.
Duong, Tuan A.; Duong, Vu A.
2009-01-01
This paper presents the JPL-developed Sequential Principal Component Analysis (SPCA) algorithm for feature extraction / image compression, based on a "dominant-term selection" unsupervised learning technique that requires an order of magnitude less computation and has a simpler architecture compared to state-of-the-art gradient-descent techniques. This algorithm is inherently amenable to a compact, low-power and high-speed VLSI hardware embodiment. The paper compares the lossless image compression performance of JPL's SPCA algorithm with the state-of-the-art JPEG2000, widely used due to its simplified hardware implementability. JPEG2000 is not an optimal data compression technique because of its fixed transform characteristics, regardless of the data structure. On the other hand, the conventional Principal Component Analysis based transform (PCA-transform) is a data-dependent-structure transform. However, it is not easy to implement PCA in compact VLSI hardware, due to its high computational and architectural complexity. In contrast, JPL's "dominant-term selection" SPCA algorithm allows, for the first time, a compact, low-power hardware implementation of the powerful PCA algorithm. This paper presents a direct comparison of JPL's SPCA versus JPEG2000, incorporating Huffman and arithmetic coding for completeness of the data compression operation. The simulation results show that JPL's SPCA algorithm is superior as an optimal data-dependent transform over the state-of-the-art JPEG2000. When implemented in hardware, this technique is projected to be ideally suited to future NASA missions for autonomous on-board image data processing to improve the bandwidth of communication.
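A generic sequential-extraction scheme — power iteration plus deflation, computing one component at a time — conveys the flavor of sequential PCA, though it is not JPL's dominant-term selection rule itself:

```python
import numpy as np

rng = np.random.default_rng(4)

def sequential_pca(X, n_components, n_iter=200):
    """Extract principal components one at a time by power iteration with
    deflation -- a generic sequential scheme, not the JPL dominant-term
    selection rule."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc
    components = []
    for _ in range(n_components):
        v = rng.normal(size=C.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):          # power iteration toward the top eigenvector
            v = C @ v
            v /= np.linalg.norm(v)
        components.append(v)
        C = C - (v @ C @ v) * np.outer(v, v)   # deflate the direction just found
    return np.array(components)

X = rng.normal(size=(500, 20)) @ rng.normal(size=(20, 20))   # correlated synthetic data
W_seq = sequential_pca(X, 3)

# Agreement (up to sign) with components from a full SVD.
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
agreement = [abs(W_seq[i] @ Vt[i]) for i in range(3)]
```

Because each component needs only matrix-vector products, this style of computation maps naturally onto the kind of compact hardware the abstract describes.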
Energy Technology Data Exchange (ETDEWEB)
Bruno, P.; Caselli, M.; de Gennaro, G.; Traini, A. [Dept. of Chemistry, Univ. of Bari (Italy)
2001-12-01
A multivariate statistical method has been applied to apportion the atmospheric pollutant concentrations measured by automatic gas analyzers placed on a mobile laboratory for air quality monitoring in Taranto (Italy). In particular, Principal Component Analysis (PCA) followed by the Absolute Principal Component Scores (APCS) technique was performed to identify the number of emission sources and their contribution to the measured concentrations of CO, NOx, benzene, toluene and m+p-xylene (BTX). This procedure singled out two different sources that explain about 85% of the variance in the collected data. (orig.)
Local Prediction Models on Mid-Atlantic Ridge MORB by Principal Component Regression
Ling, X.; Snow, J. E.; Chin, W.
2017-12-01
The isotopic compositions of the daughter isotopes of long-lived radioactive systems (Sr, Nd, Hf and Pb) can be used to map the scale and history of mantle heterogeneities beneath mid-ocean ridges. Our goal is to relate the multidimensional structure in the existing isotopic dataset to an underlying physical reality of mantle sources. The numerical technique of Principal Component Analysis is useful to reduce the linear dependence of the data to a minimum set of orthogonal eigenvectors encapsulating the information contained (cf. Agranier et al. 2005). The dataset used for this study covers almost all the MORBs along the Mid-Atlantic Ridge (MAR), from 54°S to 77°N and 8.8°W to -46.7°W, replicating the published dataset of Agranier et al. (2005) plus 53 basalt samples dredged and analyzed since then (data from PetDB). The principal components PC1 and PC2 account for 61.56% and 29.21%, respectively, of the total isotope ratio variability. Samples with compositions similar to HIMU, EM and DM were identified to better understand the PCs. PC1 and PC2 account for HIMU and EM, whereas PC2 has limited control over the DM source. PC3 is more strongly controlled by the depleted mantle source than PC2. This means that all three principal components have a high degree of significance relevant to the established mantle sources. We also tested the relationship between mantle heterogeneity and sample locality. The k-means clustering algorithm is a type of unsupervised learning that finds groups in the data based on feature similarity. The PC factor scores of each sample were clustered into three groups. Clusters one and three alternate along the northern and southern MAR. Cluster two appears from 45.18°N to 0.79°N and -27.9°W to -30.40°W, alternating with cluster one. The ridge has been preliminarily divided into 16 sections considering both the clusters and ridge segments. The principal component regression models each section based on 6 isotope ratios and the PCs.
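Principal component regression itself is compact: project the (collinear) predictors onto their first k principal components and regress the response on the scores. The sketch below uses synthetic correlated predictors in place of the isotope ratios; all values are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic collinear predictors standing in for the 6 isotope ratios,
# driven by 2 latent "mantle source" factors.
latent = rng.normal(size=(300, 2))
X = latent @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(300, 6))
y = latent @ np.array([1.5, -2.0]) + 0.1 * rng.normal(size=300)

def pcr(X, y, k):
    """Principal component regression: OLS of y on the first k PC scores,
    mapped back to the original predictor space."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:k].T                                   # PC scores
    gamma = np.linalg.lstsq(T, y - y_mean, rcond=None)[0]
    beta = Vt[:k].T @ gamma
    return beta, y_mean - x_mean @ beta                 # slope vector, intercept

beta, intercept = pcr(X, y, k=2)
pred = X @ beta + intercept
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

Keeping only the leading components stabilizes the fit when the predictors are nearly linearly dependent, which is why PCR suits isotope-ratio data.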
On the Performance of Principal Component Liu-Type Estimator under the Mean Square Error Criterion
Directory of Open Access Journals (Sweden)
Jibo Wu
2013-01-01
Wu (2013) proposed the principal component Liu-type estimator to overcome multicollinearity. This estimator is a general estimator which includes the ordinary least squares estimator, principal component regression estimator, ridge estimator, Liu estimator, Liu-type estimator, r-k class estimator, and r-d class estimator. In this paper, we first use a new method to derive the principal component Liu-type estimator; then we study the superiority of the new estimator under the scalar mean squared error criterion. Finally, we give a numerical example to illustrate the theoretical results.
Directory of Open Access Journals (Sweden)
Jinlu Sheng
2016-07-01
To effectively extract the typical features of a bearing, a new method was proposed that combines local mean decomposition, Shannon entropy, and an improved kernel principal component analysis model. First, features are extracted by the time-frequency domain method of local mean decomposition, and the Shannon entropy is used to process the original separated product functions, so as to obtain the original features. However, the extracted features still contain superfluous information, so a nonlinear multi-feature fusion technique, kernel principal component analysis, is introduced to fuse the characters. The kernel principal component analysis is improved by a weight factor. The extracted characteristic features were input into a Morlet wavelet kernel support vector machine to obtain a bearing running state classification model, by which the bearing running state was identified. Both test cases and actual cases were analyzed.
Principal components analysis for quality evaluation of cooled banana 'Nanicão' in different packing
Directory of Open Access Journals (Sweden)
Sanches Juliana
2003-01-01
This work aims to evaluate the quality of 'Nanicão' bananas submitted to two storage temperature conditions and three different kinds of package, using principal component analysis (PCA) as a basis for an analysis of variance. The fruits used were 'Nanicão' bananas at ripening degree 3, that is, more green than yellow. The packages tested were: "Torito" wood boxes, load capacity 18 kg; "½ box" wood boxes, load capacity 13 kg; and cardboard boxes, load capacity 18 kg. The temperatures assessed were room temperature (control) and 13±1°C, with humidity controlled to 90±2.5%. Fruits were discarded when a sensory analysis determined they had become unfit for consumption. Peel coloration, percentage of imperfections, fresh mass, total acidity, pH, total soluble solids and percentage of sucrose were assessed. A completely randomized design with a 2-factorial treatment structure (packing × temperature) was used. The obtained data were analyzed through the multivariate technique known as principal components analysis, using S-plus 4.2. The conclusion was that the best packages for preserving the fruit were the ½ box ones, which shows that reducing the number of fruits per package allows better ventilation, decreases mechanical injuries and preserves quality for a longer time.
Directory of Open Access Journals (Sweden)
Manoj Tripathy
2012-01-01
This paper describes a new approach for power transformer differential protection which is based on the wave-shape recognition technique. An algorithm based on neural network principal component analysis (NNPCA) with back-propagation learning is proposed for digital differential protection of power transformers. The principal component analysis is used to preprocess the data from the power system in order to eliminate redundant information and enhance the hidden pattern of the differential current, so as to discriminate internal faults from inrush and overexcitation conditions. This algorithm has been developed by considering the optimal number of neurons in the hidden layer and at the output layer. The proposed algorithm makes use of the ratio of voltage to frequency and the amplitude of the differential current for transformer operating condition detection. This paper presents a comparative study of power transformer differential protection algorithms based on the harmonic restraint method, NNPCA, feed forward back propagation neural network (FFBPNN), space vector analysis of the differential signal, and their time characteristic shapes in Park's plane. The algorithms are compared as to their speed of response, computational burden, and capability to distinguish between a magnetizing inrush and a power transformer internal fault. The mathematical basis for each algorithm is briefly described. All the algorithms are evaluated using simulations performed with PSCAD/EMTDC and MATLAB.
Directory of Open Access Journals (Sweden)
Rockson Dobgegah
2011-03-01
The study adopts a data reduction technique to examine the presence of any complex structure among a set of project management competency variables. A structured survey questionnaire was administered to 100 project managers to elicit relevant data, and this achieved a relatively high response rate of 54%. After satisfying all the necessary tests of reliability of the survey instrument, sample size adequacy and population matrix, the data were subjected to principal component analysis, resulting in the identification of six new thematic project management competency areas, explained in terms of: human resource management and project control; construction innovation and communication; project financial resources management; project risk and quality management; business ethics; and physical resources and procurement management. These knowledge areas now form the basis for lateral project management training requirements in the context of the Ghanaian construction industry. A key contribution of the paper is its use of principal component analysis, which has rigorously provided insight into the complex structure of, and the relationships between, the various knowledge areas. The originality and value of the paper are embedded in the use of contextual-task conceptual knowledge to expound the empirical utility of the six uncorrelated project management competencies.
Adam, Craig D; Sherratt, Sarah L; Zholobenko, Vladimir L
2008-01-15
The technique of principal component analysis has been applied to the UV-vis spectra of inks obtained from a wide range of black ballpoint pens available in the UK market. Both the pen ink and material extracted from the ink line on paper have been examined. Here, principal component analysis characterised each spectrum within a group through the numerical loadings attached to the first few principal components. Analysis of the spectra from multiple measurements on the same brand of pen showed excellent reproducibility and clear discrimination between inks that was supported by statistical analysis. Indeed it was possible to discriminate between the pen ink and the ink line from all brands examined in this way, suggesting that the solvent extraction process may have an influence on these results. For the complete set of 25 pens, interpretation of the loadings for the first few principal components showed that both the pen inks and the extracted ink lines may be classified in an objective manner and in agreement with the results of parallel thin layer chromatography studies. Within each class almost all inks could be individualised. Further work has shown that principal component analysis may be used to identify a particular ink from a database of reference UV-vis spectra and a strategy for developing this approach is suggested.
Oil classification using X-ray scattering and principal component analysis
International Nuclear Information System (INIS)
Almeida, Danielle S.; Souza, Amanda S.; Lopes, Ricardo T.; Oliveira, Davi F.; Anjos, Marcelino J.
2015-01-01
X-ray scattering techniques have been considered promising for the classification and characterization of many types of samples. This study employed this technique combined with chemical analysis and multivariate analysis to characterize 54 vegetable oil samples (25 of them olive oils) with different properties obtained in commercial establishments in Rio de Janeiro city. The samples were chemically analyzed using the following indexes: iodine, acidity, saponification and peroxide. In order to obtain the X-ray scattering spectrum, an X-ray tube with a silver anode operating at 40 kV and 50 μA was used. The results showed that the oils can be divided into two large groups: olive oils and non-olive oils. Additionally, in a multivariate analysis (Principal Component Analysis - PCA), two components were obtained that accounted for more than 80% of the variance. One component was associated with the chemical parameters and the other with the scattering profiles of each sample. The results showed that the use of X-ray scattering spectra combined with chemical analysis and PCA can be a fast, cheap and efficient method for vegetable oil characterization. (author)
Burst and Principal Components Analyses of MEA Data Separates Chemicals by Class
Microelectrode arrays (MEAs) detect drug and chemical induced changes in action potential "spikes" in neuronal networks and can be used to screen chemicals for neurotoxicity. Analytical "fingerprinting," using Principal Components Analysis (PCA) on spike trains recorded from prim...
Introduction to uses and interpretation of principal component analyses in forest biology.
J. G. Isebrands; Thomas R. Crow
1975-01-01
The application of principal component analysis for interpretation of multivariate data sets is reviewed with emphasis on (1) reduction of the number of variables, (2) ordination of variables, and (3) applications in conjunction with multiple regression.
International Nuclear Information System (INIS)
Zeng, J.; Li, G.; Sun, J.
2013-01-01
Principal components analysis and cluster analysis were used to investigate the properties of different corn varieties. The chemical compositions and some properties of corn flour processed by dry milling were determined. The results showed that the chemical compositions and physicochemical properties differed significantly among the twenty-six corn varieties. Principal component analysis related corn flour quality to five principal components, among which starch pasting properties made the most important contribution, accounting for 48.90%. The twenty-six corn varieties could be classified into four groups by cluster analysis. The consistency between principal components analysis and cluster analysis indicated that multivariate analyses are feasible in the study of corn variety properties. (author)
Kernel principal component and maximum autocorrelation factor analyses for change detection
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg; Canty, Morton John
2009-01-01
Principal component analysis (PCA) has often been used to detect change over time in remotely sensed images. A commonly used technique consists of finding the projections along the eigenvectors for data consisting of pair-wise (perhaps generalized) differences between corresponding spectral bands...... in Nevada acquired on successive passes of the Landsat-5 satellite in August-September 1991. The six-band images (the thermal band is omitted) with 1,000 by 1,000 28.5 m pixels were first processed with the iteratively re-weighted MAD (IR-MAD) algorithm in order to discriminate change. Then the MAD image...... was post-processed with both ordinary and kernel versions of PCA and MAF analysis. Kernel MAF suppresses the noisy no-change background much more successfully than ordinary MAF. The ratio between variances of the ordinary MAF 1 and the kernel MAF 1 (both scaled to unit variance) calculated in a no...
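The kernel PCA step — eigendecomposition of a doubly centered RBF kernel matrix — can be sketched on a toy two-cluster dataset standing in for change/no-change pixels; the MAD/MAF machinery is not reproduced, and all sizes and parameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy data: two well-separated clusters standing in for change/no-change pixels.
X = np.vstack([rng.normal(0.0, 0.5, size=(100, 3)),
               rng.normal(5.0, 0.5, size=(100, 3))])

def kernel_pca(X, n_components, gamma):
    """Kernel PCA with an RBF kernel: eigendecompose the doubly centered
    kernel matrix; scores are eigenvectors scaled by sqrt(eigenvalue)."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    evals, evecs = np.linalg.eigh(J @ K @ J)            # ascending eigenvalues
    order = np.argsort(evals)[::-1][:n_components]
    return evecs[:, order] * np.sqrt(np.maximum(evals[order], 0.0))

scores = kernel_pca(X, n_components=2, gamma=0.5)
pc1_a, pc1_b = scores[:100, 0], scores[100:, 0]         # first kernel PC per cluster
```

On this toy data the first kernel component cleanly separates the two groups, mirroring how the kernel variants in the abstract suppress the no-change background better than their linear counterparts.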
Application of Principal Component Analysis in Prompt Gamma Spectra for Material Sorting
Energy Technology Data Exchange (ETDEWEB)
Im, Hee Jung; Lee, Yun Hee; Song, Byoung Chul; Park, Yong Joon; Kim, Won Ho
2006-11-15
For the detection of illicit materials in a very short time by comparing unknown samples' gamma spectra to pre-programmed material signatures, we first selected a method to reduce the noise of the obtained gamma spectra. After noise reduction, a pattern recognition technique was applied to discriminate the illicit materials from the innocuous materials in the noise-reduced data. Principal component analysis was applied for both noise reduction and pattern recognition in prompt gamma spectra. A computer program for the detection of illicit materials based on the PCA method was developed in our lab and can be applied to the PGNAA system for baggage checking at all ports of entry in a very short time.
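One way to combine the two PCA roles named here — noise reduction and pattern recognition — is to project spectra onto a few principal components and match in that reduced space. The sketch below uses synthetic Gaussian-peak "spectra" and a nearest-centroid rule; the lab's actual program is not described in detail, so every peak, amplitude, and parameter is illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "prompt gamma spectra": two material classes with different
# Gaussian peak patterns over 256 channels.
channels = np.arange(256)
def spectrum(peaks, n):
    base = np.zeros((n, 256))
    for mu, amp in peaks:
        base += amp * np.exp(-((channels - mu) ** 2) / 18.0)
    return base + 0.05 * rng.normal(size=(n, 256))

illicit = spectrum([(60, 1.0), (180, 0.6)], 50)
innocuous = spectrum([(100, 1.0), (220, 0.8)], 50)
train = np.vstack([illicit, innocuous])
labels = np.array([1] * 50 + [0] * 50)          # 1 = illicit, 0 = innocuous

# Noise reduction and pattern recognition in one step: keep a few PCs,
# then match an unknown spectrum to the nearest class centroid in PC space.
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
k = 3
pc_scores = (train - mean) @ Vt[:k].T
centroids = np.array([pc_scores[labels == c].mean(axis=0) for c in (0, 1)])

def classify(spec):
    z = (spec - mean) @ Vt[:k].T
    return int(np.argmin(np.linalg.norm(centroids - z, axis=1)))

unknown = spectrum([(60, 1.0), (180, 0.6)], 1)[0]   # an unseen "illicit" sample
pred = classify(unknown)
```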
Zalameda, Joseph N.; Bolduc, Sean; Harman, Rebecca
2017-05-01
A composite fuselage aircraft forward section was inspected with flash thermography. The fuselage section is 24 feet long and approximately 8 feet in diameter. The structure is primarily configured as a composite sandwich structure of carbon fiber face sheets with a Nomex® honeycomb core. The outer surface area was inspected. The thermal data consisted of 477 data sets totaling over 227 gigabytes in size. Principal component analysis (PCA) was used to process the data sets for substructure and defect detection. A fixed eigenvector approach using a global covariance matrix was used and compared to a varying eigenvector approach. The fixed eigenvector approach was demonstrated to be a practical analysis method for the detection and interpretation of various defects such as paint thickness variation, possible water intrusion damage, and delamination damage. In addition, inspection considerations are discussed, including coordinate system layout, manipulation of the fuselage section, and the manual scanning technique used for full coverage.
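The fixed-eigenvector approach can be sketched as: estimate one covariance matrix from all data sets pooled, then project every set onto the same eigenvectors so the resulting components are directly comparable across sets. The sizes and the injected cooling trend below are illustrative, not the inspection's actual data:

```python
import numpy as np

rng = np.random.default_rng(8)

# Five thermography "data sets", each 50 frames x 400 pixels, sharing a
# common cooling trend along the frame axis.
trend = np.linspace(0, 1, 50)[:, None]
datasets = [rng.normal(size=(50, 400)) + trend for _ in range(5)]

# Fixed-eigenvector approach: estimate ONE covariance matrix from all data
# sets pooled, then project every set onto the same eigenvectors, so the
# resulting components are directly comparable across sets.
stacked = np.vstack(datasets)
mean = stacked.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(stacked - mean, rowvar=False))
fixed_basis = evecs[:, ::-1][:, :3]                 # top 3 global eigenvectors

# Each row holds a frame's coordinates in the shared component basis.
projections = [(d - mean) @ fixed_basis for d in datasets]
```

With a per-set (varying) eigenvector approach the bases would differ between sets, making component-to-component comparison across data sets harder; the shared basis avoids that.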
Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.
2015-03-01
Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, from which the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
Directory of Open Access Journals (Sweden)
Glogovac Svetlana
2012-01-01
This study investigates the variability of tomato genotypes based on morphological and biochemical fruit traits. The experimental material is part of the tomato genetic collection of the Institute of Field and Vegetable Crops in Novi Sad, Serbia. Genotypes were analyzed for fruit mass, locule number, index of fruit shape, fruit colour, dry matter content, total sugars, total acidity, lycopene and vitamin C. Minimum, maximum and average values and the main indicators of variability (CV and σ) were calculated. Principal component analysis was performed to determine the structure of the sources of variability. Four principal components, which together contribute 93.75% of the total variability, were selected for analysis. The first principal component is defined by vitamin C, locule number and index of fruit shape. The second component is determined by dry matter content and total acidity, the third by lycopene, fruit mass and fruit colour. Total sugars had the greatest part in the fourth component.
Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis
Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.;
2015-01-01
The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used a 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution a factor of two better than that achievable with digital optimal filters.
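The core of the approach above, separating pulse records into orthogonal components of largest variance and combining scores into an energy estimate, can be sketched on toy data. The pulse shape, noise level, and linear calibration below are assumptions for illustration, not the authors' detector model.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)

# Toy stand-in for TES pulse records: exponential pulses whose height
# scales with "energy", plus white noise (shapes purely illustrative).
energies = rng.uniform(0.8, 1.2, 300)
shape = np.exp(-t / 0.2) * (1 - np.exp(-t / 0.02))
pulses = energies[:, None] * shape + 0.01 * rng.standard_normal((300, 200))

# PCA: subtract the mean pulse, then take the leading right-singular
# vectors as the orthogonal components of largest variance.
mean_pulse = pulses.mean(axis=0)
_, _, Vt = np.linalg.svd(pulses - mean_pulse, full_matrices=False)
scores = (pulses - mean_pulse) @ Vt[:2].T   # per-pulse component amplitudes

# The first score tracks pulse height, so a simple linear calibration
# maps it to an energy estimate.
coef = np.polyfit(scores[:, 0], energies, 1)
est = np.polyval(coef, scores[:, 0])
print(round(float(np.corrcoef(est, energies)[0, 1]), 3))
```

In the paper's setting the gain lies in keeping several component scores (shape, arrival time, temperature drift) rather than collapsing each pulse to one filtered amplitude.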
Principal component analysis of the cytokine and chemokine response to human traumatic brain injury.
Directory of Open Access Journals (Sweden)
Adel Helmy
There is a growing realisation that neuro-inflammation plays a fundamental role in the pathology of Traumatic Brain Injury (TBI). This has led to the search for biomarkers that reflect these underlying inflammatory processes using techniques such as cerebral microdialysis. The interpretation of such biomarker data has been limited by the statistical methods used. When analysing data of this sort, the multiple putative interactions between mediators need to be considered, as well as the timing of production and the high degree of statistical co-variance in the levels of these mediators. Here we present a cytokine and chemokine dataset from human brain following traumatic brain injury and use principal component analysis and partial least squares discriminant analysis to demonstrate the pattern of production following TBI, distinct phases of the humoral inflammatory response, and the differing patterns of response in brain and in peripheral blood. This technique has the added advantage of making no assumptions about the Relative Recovery (RR) of microdialysis-derived parameters. Taken together, these techniques can be used in complex microdialysis datasets to summarise the data succinctly and generate hypotheses for future study.
Magnetic Flux Leakage and Principal Component Analysis for metal loss approximation in a pipeline
International Nuclear Information System (INIS)
Ruiz, M; Mujica, L E; Quintero, M; Florez, J; Quintero, S
2015-01-01
Safety and reliability of hydrocarbon transportation pipelines represent a critical aspect for the Oil and Gas industry. Pipeline failures caused by corrosion, external agents, among others, can develop leaks or even rupture, which can negatively impact population, natural environment, infrastructure and economy. It is imperative to have accurate inspection tools traveling through the pipeline to diagnose its integrity. In this way, over the last few years, different techniques under the concept of structural health monitoring (SHM) have continuously been in development. This work is based on a hybrid methodology that combines the Magnetic Flux Leakage (MFL) and Principal Component Analysis (PCA) approaches. The MFL technique induces a magnetic field in the pipeline's walls. The data are recorded by sensors measuring the leakage magnetic field in segments with loss of metal, such as cracking, corrosion, among others. The data provide information on a pipeline with approximately 15 years of operation, which transports gas, has a diameter of 20 inches and a total length of 110 km (with several changes in the topography). On the other hand, PCA is a well-known technique that compresses the information and extracts the most relevant features, facilitating the detection of damage in several types of structures. The goal of this work is to detect and localize critical metal loss in a pipeline currently in operation. (paper)
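A common way to use PCA for damage detection of the kind described above is the squared prediction error (Q statistic): fit PCA on baseline data and flag samples whose residual, after projection onto the retained subspace, is large. The sketch below uses synthetic channels and a standard Q statistic; it is not the paper's exact metric.

```python
import numpy as np

rng = np.random.default_rng(2)

# Baseline "MFL" measurements: 8 correlated sensor channels recorded on
# an undamaged pipe segment (synthetic stand-in data).
latent = rng.standard_normal((500, 2))
mixing = rng.standard_normal((2, 8))
baseline = latent @ mixing + 0.05 * rng.standard_normal((500, 8))

# Fit PCA on the baseline and retain the components spanning the normal
# correlation structure of the signals.
mu = baseline.mean(axis=0)
_, _, Vt = np.linalg.svd(baseline - mu, full_matrices=False)
P = Vt[:2].T                       # retained loadings (8 x 2)

def spe(x):
    """Squared prediction error (Q statistic): the residual left after
    projecting a sample onto the retained principal subspace."""
    r = (x - mu) - ((x - mu) @ P) @ P.T
    return float(r @ r)

normal_sample = latent[0] @ mixing
anomaly = np.zeros(8)
anomaly[2] = 3.0                   # localized "metal-loss" signature
print(round(spe(normal_sample), 3), round(spe(normal_sample + anomaly), 3))
```

The anomalous sample breaks the learned inter-channel correlation, so its residual is large while normal samples stay near zero; localization follows from inspecting which channels dominate the residual.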
Principal component structure and sport-specific differences in the running one-leg vertical jump.
Laffaye, G; Bardy, B G; Durey, A
2007-05-01
The aim of this study is to identify the kinetic principal components involved in one-leg running vertical jumps, as well as the potential differences between specialists from different sports. The sample was composed of 25 regionally skilled athletes who play different jumping sports (volleyball players, handball players, basketball players, high jumpers and novices), who performed a running one-leg jump. A principal component analysis was performed on the data obtained from the 200 tested jumps in order to identify the principal components summarizing the six variables extracted from the force-time curve. Two principal components including six variables accounted for 78% of the variance in jump height. Running one-leg vertical jump performance was predicted by a temporal component (which brings together impulse time, eccentric time and vertical displacement of the center of mass) and a force component (which brings together relative peak force and power, and rate of force development). A comparison made among athletes revealed a temporal-prevailing profile for volleyball players, and a force-dominant profile for Fosbury high jumpers. Novices showed an ineffective utilization of the force component, while handball and basketball players showed heterogeneous and neutral component profiles. Participants use a jumping strategy in which variables related to either the magnitude or timing of force production are closely coupled; athletes from different sporting backgrounds use a jumping strategy that reflects the inherent demands of their chosen sport.
A hybrid least squares and principal component analysis algorithm for Raman spectroscopy.
Directory of Open Access Journals (Sweden)
Dominique Van de Sompel
Raman spectroscopy is a powerful technique for detecting and quantifying analytes in chemical mixtures. A critical part of Raman spectroscopy is the use of a computer algorithm to analyze the measured Raman spectra. The most commonly used algorithm is the classical least squares method, which is popular due to its speed and ease of implementation. However, it is sensitive to inaccuracies or variations in the reference spectra of the analytes (compounds of interest) and the background. Many algorithms, primarily multivariate calibration methods, have been proposed that increase robustness to such variations. In this study, we propose a novel method that improves robustness even further by explicitly modeling variations in both the background and analyte signals. More specifically, it extends the classical least squares model by allowing the declared reference spectra to vary in accordance with the principal components obtained from training sets of spectra measured in prior characterization experiments. The amount of variation allowed is constrained by the eigenvalues of this principal component analysis. We compare the novel algorithm to the least squares method with a low-order polynomial residual model, as well as to a state-of-the-art hybrid linear analysis method. The latter is a multivariate calibration method designed specifically to improve robustness to background variability in cases where training spectra of the background, as well as the mean spectrum of the analyte, are available. We demonstrate the novel algorithm's superior performance by comparing quantitative error metrics generated by each method. The experiments consider both simulated data and experimental data acquired from in vitro solutions of Raman-enhanced gold-silica nanoparticles.
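The basic mechanism, augmenting a classical least squares fit with principal components of background variation learned from training spectra, can be sketched as follows. All spectra are synthetic, and the sketch omits the paper's eigenvalue constraint on the allowed variation.

```python
import numpy as np

rng = np.random.default_rng(3)
w = np.linspace(0, 1, 120)                  # "wavenumber" axis (arbitrary units)

def peak(center, width):
    return np.exp(-0.5 * ((w - center) / width) ** 2)

analyte_ref = peak(0.3, 0.03) + 0.6 * peak(0.7, 0.04)   # illustrative reference

# Training backgrounds: a slowly varying baseline whose shape drifts
# between measurements; PCA captures the drift directions.
train_bg = np.array([1.0 + a * w + b * w**2
                     for a, b in rng.uniform(-0.5, 0.5, (30, 2))])
bg_mean = train_bg.mean(axis=0)
_, _, Vt = np.linalg.svd(train_bg - bg_mean, full_matrices=False)
bg_pcs = Vt[:2]

# Measurement: analyte at concentration 0.8 on an unseen background.
truth = 0.8
measured = truth * analyte_ref + 1.0 + 0.3 * w - 0.2 * w**2 \
           + 0.002 * rng.standard_normal(w.size)

# Extended least squares: fit the analyte reference, the mean background,
# and the principal components of background variation jointly.
A = np.column_stack([analyte_ref, bg_mean, bg_pcs.T])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(round(coeffs[0], 3))   # estimated concentration, close to 0.8
```

Because the unseen background lies in the span of the mean background plus its principal components, the analyte coefficient is recovered almost exactly; a plain least squares fit against `analyte_ref` alone would absorb background into the concentration estimate.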
Principal component analysis of MSBAS DInSAR time series from Campi Flegrei, Italy
Tiampo, Kristy F.; González, Pablo J.; Samsonov, Sergey; Fernández, Jose; Camacho, Antonio
2017-09-01
Because of its proximity to the city of Naples and with a population of nearly 1 million people within its caldera, Campi Flegrei is one of the highest risk volcanic areas in the world. Since the last major eruption in 1538, the caldera has undergone frequent episodes of ground subsidence and uplift accompanied by seismic activity that has been interpreted as the result of a stationary, deeper source below the caldera that feeds shallower eruptions. However, the location and depth of the deeper source is not well-characterized and its relationship to current activity is poorly understood. Recently, a significant increase in the uplift rate has occurred, resulting in almost 13 cm of uplift by 2013 (De Martino et al., 2014; Samsonov et al., 2014b; Di Vito et al., 2016). Here we apply a principal component decomposition to high resolution time series from the region produced by the advanced Multidimensional SBAS DInSAR technique in order to better delineate both the deeper source and the recent shallow activity. We analyzed both a period of substantial subsidence (1993-1999) and a second of significant uplift (2007-2013) and inverted the associated vertical surface displacement for the most likely source models. Results suggest that the underlying dynamics of the caldera changed in the late 1990s, from one in which the primary signal arises from a shallow deflating source above a deeper, expanding source to one dominated by a shallow inflating source. In general, the shallow source lies between 2700 and 3400 m below the caldera while the deeper source lies at 7600 m or more in depth. The combination of principal component analysis with high resolution MSBAS time series data allows for these new insights and confirms the applicability of both to areas at risk from dynamic natural hazards.
Optimized principal component analysis on coronagraphic images of the fomalhaut system
Energy Technology Data Exchange (ETDEWEB)
Meshkat, Tiffany; Kenworthy, Matthew A. [Sterrewacht Leiden, P.O. Box 9513, Niels Bohrweg 2, 2300-RA Leiden (Netherlands); Quanz, Sascha P.; Amara, Adam [Institute for Astronomy, ETH Zurich, Wolfgang-Pauli-Strasse 27, 8093-CH Zurich (Switzerland)
2014-01-01
We present the results of a study to optimize the principal component analysis (PCA) algorithm for planet detection, a new algorithm complementing angular differential imaging and locally optimized combination of images (LOCI) for increasing the contrast achievable next to a bright star. The stellar point spread function (PSF) is constructed by removing linear combinations of principal components, allowing the flux from an extrasolar planet to shine through. The number of principal components used determines how well the stellar PSF is globally modeled. Using more principal components may decrease the number of speckles in the final image, but also increases the background noise. We apply PCA to Fomalhaut Very Large Telescope NaCo images acquired at 4.05 μm with an apodized phase plate. We do not detect any companions, with a model dependent upper mass limit of 13-18 M_Jup from 4-10 AU. PCA achieves greater sensitivity than the LOCI algorithm for the Fomalhaut coronagraphic data by up to 1 mag. We make several adaptations to the PCA code and determine which of these prove the most effective at maximizing the signal-to-noise from a planet very close to its parent star. We demonstrate that optimizing the number of principal components used in PCA proves most effective for pulling out a planet signal.
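PCA-based PSF subtraction of the kind described above can be sketched in a few lines: build principal components from a reference PSF library, model the science frame as their linear combination, and subtract, leaving the planet in the residual. The one-dimensional "frames" below are synthetic stand-ins for coronagraphic images.

```python
import numpy as np

rng = np.random.default_rng(4)

# Reference PSF library: 40 frames of a speckle-dominated stellar halo,
# flattened to 64-pixel vectors (synthetic stand-in for NaCo frames).
npix = 64
modes = rng.standard_normal((5, npix))
refs = rng.standard_normal((40, 5)) @ modes + 0.1 * rng.standard_normal((40, npix))

# Science frame: the same kind of stellar halo plus a faint planet signal.
planet = np.zeros(npix)
planet[30] = 1.5
science = rng.standard_normal(5) @ modes + planet + 0.1 * rng.standard_normal(npix)

# PCA PSF subtraction: principal components of the reference library
# model the stellar PSF; subtracting the projection reveals the planet.
ref_mean = refs.mean(axis=0)
_, _, Vt = np.linalg.svd(refs - ref_mean, full_matrices=False)

def residual(n_components):
    Z = Vt[:n_components]
    centered = science - ref_mean
    return centered - (centered @ Z.T) @ Z

r = residual(5)
print(round(float(abs(r[30])), 2))   # planet pixel stands out in the residual
```

The trade-off the abstract describes is visible here: more components model the stellar halo better but begin to absorb the planet flux itself, which is why the authors optimize the number of components.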
Barzin, Razieh; Shirvani, Amin; Lotfi, Hossein
2017-01-01
Downward shortwave radiation is a key quantity in the land-atmosphere interaction. Since the moderate resolution imaging spectroradiometer data has a coarse temporal resolution, which is not suitable for estimating daily average radiation, many efforts have been undertaken to estimate instantaneous solar radiation using moderate resolution imaging spectroradiometer data. In this study, the principal components analysis technique was applied to capture the information of moderate resolution imaging spectroradiometer bands, extraterrestrial radiation, aerosol optical depth, and atmospheric water vapour. A regression model based on the principal components was used to estimate daily average shortwave radiation for ten synoptic stations in the Fars province, Iran, for the period 2009-2012. The Durbin-Watson statistic and autocorrelation function of the residuals of the fitted principal components regression model indicated that the residuals were serially independent. The results indicated that the fitted principal components regression models accounted for about 86-96% of the total variance of the observed shortwave radiation values and the root mean square error was about 0.9-2.04 MJ m⁻² d⁻¹. The results also indicated that the model accuracy decreased as the aerosol optical depth increased, and that extraterrestrial radiation was the most important predictor variable among all.
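Principal components regression, as used in the study above, replaces collinear predictors with a few orthogonal component scores before the regression step. A minimal sketch on synthetic data (the predictors merely stand in for satellite bands and atmospheric variables):

```python
import numpy as np

rng = np.random.default_rng(5)

# Correlated predictors standing in for MODIS bands plus atmospheric
# variables: six columns driven by two latent factors.
n, p = 200, 6
base = rng.standard_normal((n, 2))
X = base @ rng.standard_normal((2, p)) + 0.1 * rng.standard_normal((n, p))
y = 2.0 * base[:, 0] - base[:, 1] + 0.1 * rng.standard_normal(n)

# Principal components regression: standardize, keep the leading PC
# scores, and regress the response on them (plus an intercept).
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Xs, full_matrices=False)
T = Xs @ Vt[:2].T                                  # scores on two PCs
D = np.column_stack([np.ones(n), T])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
pred = D @ beta

r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(round(float(r2), 2))   # high: two PCs capture the latent structure
```

Because the component scores are orthogonal, the regression coefficients are stable even though the original predictors are strongly collinear, which is the usual motivation for PCR over ordinary least squares on the raw bands.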
Medina, José M; Díaz, José A; Vukusic, Pete
2015-04-20
Iridescent structural colors in biology exhibit sophisticated spatially-varying reflectance properties that depend on both the illumination and viewing angles. The classification of such spectral and spatial information in iridescent structurally colored surfaces is important to elucidate the functional role of irregularity and to improve understanding of color pattern formation at different length scales. In this study, we propose a non-invasive method for the spectral classification of spatial reflectance patterns at the micron scale based on the multispectral imaging technique and the principal component analysis similarity factor (PCASF). We demonstrate the effectiveness of this approach and its component methods by detailing its use in the study of the angle-dependent reflectance properties of Pavo cristatus (the common peacock) feathers, a species of peafowl very well known to exhibit bright and saturated iridescent colors. We show that multispectral reflectance imaging and PCASF approaches can be used as effective tools for spectral recognition of iridescent patterns in the visible spectrum and provide meaningful information for spectral classification of the irregularity of the microstructure in iridescent plumage.
Kirkwood, Renata N; Resende, Renan A; Magalhães, Cláudio M B; Gomes, Henrique A; Mingoti, Sueli A; Sampaio, Rosana F
2011-01-01
The applicability of gait analysis has been extended by the introduction of principal component analysis (PCA), a statistical data-reduction technique that allows comparison of the whole gait cycle between groups of individuals. PCA was applied to compare the kinematics of the knee joint during gait, in the frontal and sagittal planes, between groups of elderly women with and without a diagnosis of osteoarthritis (OA) in its initial and moderate stages. A total of 38 elderly women (69.6±8.1 years) with knee OA and 40 asymptomatic women (70.3±7.7 years) participated in this study. The kinematics was obtained using the Qualisys Pro-reflex system. The OA group showed decreased gait velocity and stride length. The knee flexion angle during gait was the variable with the highest discrimination power between groups. PCA is an effective multivariate statistical technique for analysing kinematic waveforms during the gait cycle. The smaller knee flexion angle in the OA group was identified as a discriminatory factor between groups and should therefore be considered in the physical therapy evaluation and treatment of elderly women with knee OA.
Toupin, S; de Senneville, B Denis; Ozenne, V; Bour, P; Lepetit-Coiffe, M; Boissenin, M; Jais, P; Quesson, B
2017-02-21
The use of magnetic resonance (MR) thermometry for the monitoring of thermal ablation is rapidly expanding. However, this technique remains challenging for monitoring the treatment of cardiac arrhythmia by radiofrequency ablation, due to heart displacement with respiration and contraction. Recent studies have addressed this problem by compensating in-plane motion in real time with optical-flow-based tracking techniques. However, these algorithms are sensitive to local variations of signal intensity on magnitude images associated with tissue heating. In this study, an optical-flow algorithm was combined with a principal component analysis method to reduce the impact of such effects. The proposed method was integrated into a fully automatic cardiac MR thermometry pipeline, compatible with a future clinical workflow. It was evaluated on nine healthy volunteers under free-breathing conditions, on a phantom, and in vivo on the left ventricle of a sheep. The results showed that local intensity changes in magnitude images had a lower impact on motion estimation with the proposed method. Using this strategy, the temperature mapping accuracy was significantly improved.
Dimensionality reduction of hyperspectral imaging data using local principal components transforms
Manolakis, Dimitris G.; Marden, David B.
2004-08-01
The spectral exploitation of hyperspectral imaging (HSI) data is based on their representation as vectors in a high-dimensional space defined by a set of orthogonal coordinate axes, where each axis corresponds to one spectral band. The large number of bands, which varies from 100 to 400 in existing sensors, makes the storage, transmission, and processing of HSI data a challenging task. A practical way to facilitate these tasks is to reduce the dimensionality of HSI data without significant loss of information. The purpose of this paper is twofold. First, we provide a concise review of various approaches that have been used to reduce the dimensionality of HSI data as a preprocessing step for compression, visualization, classification, and detection applications. Second, we show that the nonlinear and nonnormal structure of HSI data can often be more effectively exploited by using a nonlinear dimensionality reduction technique known as local principal component analyzers. The performance of the various techniques is illustrated using HYDICE and AVIRIS HSI data.
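A minimal sketch of the global PCA step (the paper's local variant fits separate PCA models per cluster, which this sketch omits): treat each pixel spectrum as a vector, keep a few components, and check how much spectral variance survives the compression. The cube below is a synthetic stand-in for HYDICE/AVIRIS data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy hyperspectral cube: 32x32 pixels, 100 bands, each pixel a mixture
# of three endmember spectra plus sensor noise (all values illustrative).
bands, npx = 100, 32 * 32
endmembers = np.abs(rng.standard_normal((3, bands)))
abundances = rng.dirichlet(np.ones(3), size=npx)
cube = abundances @ endmembers + 0.01 * rng.standard_normal((npx, bands))

# PCA dimensionality reduction: keep enough components to retain most of
# the spectral variance, then reconstruct to measure the information loss.
mu = cube.mean(axis=0)
_, S, Vt = np.linalg.svd(cube - mu, full_matrices=False)
k = 3
reduced = (cube - mu) @ Vt[:k].T            # (npx, k) compressed spectra
restored = reduced @ Vt[:k] + mu

explained = float((S[:k] ** 2).sum() / (S ** 2).sum())
rel_err = float(np.linalg.norm(cube - restored) / np.linalg.norm(cube))
print(round(explained, 4), round(rel_err, 4))
```

Here 100 bands compress to 3 scores per pixel with negligible reconstruction error, because the spectra live near a low-dimensional mixing subspace; real scenes are less tidy, which motivates the local (cluster-wise) analyzers the paper advocates.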
Support vector machine and principal component analysis for microarray data classification
Astuti, Widi; Adiwijaya
2018-03-01
Cancer is a leading cause of death worldwide, although a significant proportion of cases can be cured if detected early. In recent decades, microarray technology has taken an important role in the diagnosis of cancer. Using data mining techniques, microarray data classification can improve the accuracy of cancer diagnosis compared to traditional techniques. Microarray data are characterized by small sample sizes and very high dimensionality, which challenges researchers to provide classification solutions with high performance in both accuracy and running time. This research proposed the use of Principal Component Analysis (PCA) as a dimension-reduction method along with a Support Vector Machine (SVM) with optimized kernel functions as the classifier for microarray data. The proposed scheme was applied to seven data sets using 5-fold cross-validation and then evaluated in terms of both accuracy and running time. The results showed that the scheme obtained 100% accuracy for the ovarian and lung cancer data when linear and cubic kernel functions were used. In terms of running time, PCA greatly reduced the running time for every data set.
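The small-n, large-p pipeline above can be sketched with numpy alone: PCA via SVD collapses thousands of gene-expression features into a few scores, and a classifier runs on the scores. A nearest-centroid rule stands in for the paper's SVM to keep the sketch dependency-free; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "microarray" data: few samples, many genes, two classes that
# differ in the first 100 genes (illustrating the small-n, large-p setting).
n_per_class, n_genes = 20, 1000
shift = np.zeros(n_genes)
shift[:100] = 2.0
X = np.vstack([rng.standard_normal((n_per_class, n_genes)),
               rng.standard_normal((n_per_class, n_genes)) + shift])
y = np.array([0] * n_per_class + [1] * n_per_class)

# PCA via SVD of the centered data: with n << p at most n-1 components
# exist, so a handful of scores replaces thousands of gene expressions.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ Vt[:5].T            # samples in 5-dimensional PCA space

# Nearest-centroid classifier on the PCA scores: a dependency-free
# stand-in for the SVM classifier used in the paper.
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
print((pred == y).mean())          # training accuracy on separable classes
```

The running-time saving reported in the paper comes from the same reduction shown here: the classifier operates on 5 features instead of 1000, while the SVD cost is bounded by the small sample count.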
Directory of Open Access Journals (Sweden)
Paul Robert Martin Werfette
2010-06-01
Analysis of the quantitative structure-activity relationship (QSAR) for a series of antimalarial artemisinin derivatives has been carried out using principal component regression. The descriptors for the QSAR study were representations of electronic structure, i.e. atomic net charges of the artemisinin skeleton calculated by the AM1 semi-empirical method. The antimalarial activity of the compounds was expressed as log 1/IC50, which is an experimental quantity. The main purpose of the principal component analysis approach is to transform a large data set of atomic net charges into a simpler set of latent variables. The best QSAR equation for log 1/IC50 was obtained from the regression method as a linear function of five latent variables, x1, x2, x3, x4 and x5. Keywords: QSAR, antimalarial, artemisinin, principal component regression
Rosacea assessment by erythema index and principal component analysis segmentation maps
Kuzmina, Ilona; Rubins, Uldis; Saknite, Inga; Spigulis, Janis
2017-12-01
RGB images of rosacea were analyzed using segmentation maps of principal component analysis (PCA) and the erythema index (EI). Areas of segmented clusters were compared to Clinician's Erythema Assessment (CEA) values given by two dermatologists. The results show that visible blood vessels are segmented more precisely on maps of the erythema index and the third principal component (PC3). In many cases, the distributions of clusters on EI and PC3 maps are very similar. Mean values of the clusters' areas on these maps show a decrease in the area of blood vessels and erythema and an increase in lighter skin area after therapy for the patients with CEA = 2 on the first visit and CEA = 1 on the second visit. This study shows that EI and PC3 maps are more useful than the maps of the first (PC1) and second (PC2) principal components for indicating vascular structures and erythema on the skin of rosacea patients and for therapy monitoring.
Directory of Open Access Journals (Sweden)
Stefania Salvatore
2016-07-01
Abstract Background Wastewater-based epidemiology (WBE) is a novel approach in drug use epidemiology which aims to monitor the extent of use of various drugs in a community. In this study, we investigate functional principal component analysis (FPCA) as a tool for analysing WBE data and compare it to traditional principal component analysis (PCA) and to wavelet principal component analysis (WPCA), which is more flexible temporally. Methods We analysed temporal wastewater data from 42 European cities collected daily over one week in March 2013. The main temporal features of ecstasy (MDMA) were extracted using FPCA with both Fourier and B-spline basis functions and three different smoothing parameters, along with PCA and WPCA with different mother wavelets and shrinkage rules. The stability of FPCA was explored through bootstrapping and analysis of sensitivity to missing data. Results The first three principal components (PCs), functional principal components (FPCs) and wavelet principal components (WPCs) explained 87.5-99.6% of the temporal variation between cities, depending on the choice of basis and smoothing. The extracted temporal features from PCA, FPCA and WPCA were consistent. FPCA using a Fourier basis and common-optimal smoothing was the most stable and least sensitive to missing data. Conclusion FPCA is a flexible and analytically tractable method for analysing temporal changes in wastewater data, and is robust to missing data. WPCA did not reveal any rapid temporal changes in the data not captured by FPCA. Overall the results suggest FPCA with Fourier basis functions and a common-optimal smoothing parameter as the most accurate approach when analysing WBE data.
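One common route to FPCA, consistent with the Fourier-basis approach above, is to smooth each observed curve by projecting it onto a small Fourier basis and then run ordinary PCA on the basis coefficients. The weekly city curves below are synthetic stand-ins for the wastewater loads, and the basis size and smoothing are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy stand-in for the wastewater data: one week of daily loads in 42
# cities sharing a weekend peak whose strength varies between cities.
days = np.arange(7)
weekend = np.exp(-0.5 * (days - 5.5) ** 2)
loads = 1 + rng.uniform(0, 2, (42, 1)) * weekend + 0.1 * rng.standard_normal((42, 7))

# FPCA sketch: smooth each curve by least-squares projection onto a small
# Fourier basis, then run ordinary PCA on the basis coefficients.
t = days / 7
basis = np.column_stack([np.ones(7),
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                         np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
coefs, *_ = np.linalg.lstsq(basis, loads.T, rcond=None)   # (5, 42)

C = coefs.T - coefs.T.mean(axis=0)
_, S, Vt = np.linalg.svd(C, full_matrices=False)
explained = S ** 2 / (S ** 2).sum()
fpc1 = basis @ Vt[0]        # the first functional PC, evaluated as a curve
print(round(float(explained[0]), 2))
```

Because the between-city variation is a single mode (weekend-peak strength), the first functional component dominates; the choice of basis and smoothing parameter plays the role the abstract attributes to the Fourier/B-spline comparison.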
Schürks, Markus; Buring, Julie E.; Kurth, Tobias
2011-01-01
Aims Migraine has a wide clinical spectrum. Our aim was to group information on migraine characteristics into meaningful components and to identify key components of the migraine phenotype. Methods We performed two principal component analyses, one among participants in the Women's Health Study enrolment cohort and one in a sub-cohort with additional migraine-specific information. Results Among the 9,427 women with migraine attack-related information at enrolment, the three most important components pertained to central nervous system (CNS) sensitization, attack frequency/pain location, and aura/visual phenomena. In the sub-group of 1,675 women with more detailed information, food triggers and unspecific symptoms constituted two principal components that explain more of the variance of the migraine phenotype than the three attack-related components. Conclusions Our results indicate that information on migraine-associated features, symptoms, and triggers is highly correlated allowing the extraction of principal components. Migraine attack-related symptoms are best summarized by symptoms related to CNS sensitization, attack frequency/pain location, and aura/visual phenomena. Taking a more general view, unspecific symptoms and food triggers appear to carry stronger importance in characterizing the migraine phenotype. These components are useful for future research on the pathophysiology and genetics of migraine and may have implications for diagnosing and treating patients. PMID:21398421
Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat
2015-01-01
A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic, rendering sampled...... are generated using a stationary first-order vector autoregressive model. The results show that the descriptive ability of PCA may be seriously affected by autocorrelation, causing a need to incorporate additional principal components to maintain the model's explanatory ability. When all variables have equal...
Principal component analysis of the summertime winds over the Gulf of California: A gulf surge index
Bordoni, S.; Stevens, B.
2006-01-01
A principal component analysis of the summertime near-surface Quick Scatterometer (QuikSCAT) winds is used to identify the leading mode of synoptic-scale variability of the low-level flow along the Gulf of California during the North American monsoon season. A gulf surge mode emerges from this analysis as the leading EOF, with the corresponding principal component time series interpretable as an objective index for gulf surge occurrence. This index is used as a reference time series for regre...
Optimal pattern synthesis for speech recognition based on principal component analysis
Korsun, O. N.; Poliyev, A. V.
2018-02-01
An algorithm for building an optimal pattern for automatic speech recognition, which increases the probability of correct recognition, is developed and presented in this work. The optimal pattern formation is based on the decomposition of an initial pattern into principal components, which reduces the dimension of the multi-parameter optimization problem. At the next step, training samples are introduced and the optimal estimates for the principal component decomposition coefficients are obtained by a numerical parameter optimization algorithm. Finally, we consider experimental results that show the improvement in speech recognition introduced by the proposed optimization algorithm.
Directory of Open Access Journals (Sweden)
Anna Maria Stellacci
2012-07-01
Hyperspectral (HS) data represent an extremely powerful means for rapidly detecting crop stress and thus aiding the rational management of natural resources in agriculture. However, the large volume of data poses a challenge for data processing and for extracting crucial information. Multivariate statistical techniques can play a key role in the analysis of HS data, as they may allow both the elimination of redundant information and the identification of synthetic indices which maximize differences among levels of stress. In this paper we propose an integrated approach, based on the combined use of Principal Component Analysis (PCA) and Canonical Discriminant Analysis (CDA), to investigate HS plant response and discriminate plant status. The approach was preliminarily evaluated on a data set collected on durum wheat plants grown under different nitrogen (N) stress levels. Hyperspectral measurements were performed at anthesis with a high-resolution field spectroradiometer, ASD FieldSpec HandHeld, covering the 325-1075 nm region. Reflectance data were first restricted to the interval 510-1000 nm and then divided into five bands of the electromagnetic spectrum [green: 510-580 nm; yellow: 581-630 nm; red: 631-690 nm; red-edge: 705-770 nm; near-infrared (NIR): 771-1000 nm]. PCA was applied to each spectral interval. CDA was performed on the extracted components to identify the factors maximizing the differences among plants fertilised with increasing N rates. Within the intervals of green, yellow and red, only the first principal component (PC) had an eigenvalue greater than 1 and explained more than 95% of the total variance; within the ranges of red-edge and NIR, the first two PCs had eigenvalues higher than 1. Two canonical variables explained cumulatively more than 81% of the total variance, and the first was able to discriminate wheat plants fertilised differently, as confirmed also by the significant correlation with aboveground biomass and grain yield parameters. The combined
Krishnan, M.; Bhowmik, B.; Hazra, B.; Pakrashi, V.
2018-02-01
In this paper, a novel baseline-free approach for continuous online damage detection of multi-degree-of-freedom vibrating structures, using Recursive Principal Component Analysis (RPCA) in conjunction with Time Varying Auto-Regressive (TVAR) modeling, is proposed. In this method, acceleration data are used to obtain recursive proper orthogonal components online using the rank-one perturbation method, followed by TVAR modeling of the first transformed response, to detect the change in the dynamic behavior of the vibrating system from its pristine state to contiguous linear/nonlinear states that indicate damage. Most of the works available in the literature deal with algorithms that require windowing of the gathered data owing to their data-driven nature, which renders them ineffective for online implementation. Algorithms focused on mathematically consistent recursive techniques within a rigorous theoretical framework of structural damage detection are missing, which motivates the development of the present framework, amenable to online implementation and usable alongside a suite of experimental and numerical investigations. The RPCA algorithm iterates the eigenvector and eigenvalue estimates for the sample covariance matrix and the new data point at each successive time instant, using the rank-one perturbation method. TVAR modeling on the principal component explaining maximum variance is utilized, and damage is identified by tracking the TVAR coefficients. This eliminates the need for offline post-processing and facilitates online damage detection, especially when applied to streaming data, without requiring any baseline data. Numerical simulations performed on a 5-dof nonlinear system under white noise excitation and El Centro (1940 Imperial Valley earthquake) excitation, for different damage scenarios, demonstrate the robustness of the proposed algorithm. The method is further validated on results obtained from case studies involving
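A rough illustration of tracking principal directions from streaming acceleration data follows. This is a simplified sketch, not the paper's method: each sample contributes a rank-one update to the covariance estimate, but the eigenproblem is re-solved in full at every step rather than eigen-updated, and the 5-DOF response is an invented one-mode signal.

```python
import numpy as np

rng = np.random.default_rng(1)

def recursive_pca_stream(samples, forgetting=0.01):
    """Track the leading eigenvector of a recursively updated covariance.
    Each sample adds a rank-one term to the covariance estimate; for brevity
    the eigenproblem is re-solved in full instead of being eigen-updated."""
    n = samples.shape[1]
    cov = np.eye(n) * 1e-6
    history = []
    for x in samples:
        cov = (1.0 - forgetting) * cov + forgetting * np.outer(x, x)
        _, vecs = np.linalg.eigh(cov)
        history.append(vecs[:, -1])       # eigenvector of the largest eigenvalue
    return np.array(history)

# Synthetic 5-DOF-style response: one dominant mode shape plus sensor noise
t = np.linspace(0.0, 10.0, 500)
mode = np.array([1.0, 0.8, 0.6, 0.4, 0.2])
acc = np.outer(np.sin(2 * np.pi * t), mode) + 0.05 * rng.normal(size=(500, 5))

tracked = recursive_pca_stream(acc)
# After convergence, the tracked direction aligns with the dominant mode
final = tracked[-1] * np.sign(tracked[-1][0])
```

A damage-detection scheme along the paper's lines would then fit a TVAR model to the first transformed response and monitor its coefficients over time.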
Principal component analysis for neural electron/jet discrimination in highly segmented calorimeters
International Nuclear Information System (INIS)
Vassali, M.R.; Seixas, J.M.
2001-01-01
A neural electron/jet discriminator based on calorimetry is developed for the second-level trigger system of the ATLAS detector. As preprocessing of the calorimeter information, a principal component analysis is performed on each segment of the two sections (electromagnetic and hadronic) of the calorimeter system, in order to significantly reduce the dimension of the input data space and to fully exploit the detailed energy deposition profile provided by the highly segmented calorimeter system. It is shown that, projecting calorimeter data onto 33 segmented principal components, the discrimination efficiency of the neural classifier reaches 98.9% for electrons (with only 1% false-alarm probability). Furthermore, restricting data projection onto only 9 components, an electron efficiency of 99.1% is achieved (with 3% false alarm), which confirms that a fast triggering system may be designed using a few components.
International Nuclear Information System (INIS)
Sengupta, S.K.; Boyle, J.S.
1993-05-01
Variables describing atmospheric circulation and other climate parameters derived from various GCMs and obtained from observations can be represented on a spatio-temporal grid (lattice) structure. The primary objective of this paper is to explore existing as well as some new statistical methods to analyze such data structures for the purpose of model diagnostics and intercomparison from a statistical perspective. Among the several statistical methods considered here, a new method based on common principal components appears most promising for the purpose of intercomparison of spatio-temporal data structures arising in the task of model/model and model/data intercomparison. A complete strategy for such an intercomparison is outlined. The strategy includes two steps. First, the commonality of spatial structures in two (or more) fields is captured in the common principal vectors. Second, the corresponding principal components obtained as time series are then compared on the basis of similarities in their temporal evolution
Yousefi, Bardia; Sfarra, Stefano; Ibarra Castanedo, Clemente; Maldague, Xavier P. V.
2017-09-01
Thermal and infrared imagery has driven considerable developments in the Non-Destructive Testing (NDT) area. Here, a thermography method for NDT specimen inspection is addressed by applying a technique for computing the eigen-decomposition, referred to as Candid Covariance-Free Incremental Principal Component Thermography (CCIPCT). The proposed approach uses a computationally cheaper alternative to estimating the covariance matrix and applying Singular Value Decomposition (SVD) to obtain the result of Principal Component Thermography (PCT), and ultimately segments the defects in the specimens using a color-based K-medoids clustering approach. The problem of computational expense for high-dimensional thermal image acquisition is also investigated. Three types of specimens (CFRP, Plexiglas and aluminium) were used for comparative benchmarking. The results conclusively indicate promising performance and confirm the outlined properties.
Efficient real time OD matrix estimation based on principal component analysis
Djukic, T.; Flötteröd, G.; Van Lint, H.; Hoogendoorn, S.P.
2012-01-01
In this paper we explore the idea of dimensionality reduction and approximation of OD demand based on principal component analysis (PCA). First, we show how we can apply PCA to linearly transform the high dimensional OD matrices into the lower dimensional space without significant loss of accuracy.
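The dimensionality-reduction idea for OD matrices can be sketched with synthetic demand data; the zone count, demand model and number of retained components below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic demand: 200 time slices of a 20x20 OD matrix, flattened row-wise
n_zones, n_slices = 20, 200
base = rng.gamma(2.0, 50.0, n_zones * n_zones)           # static OD pattern
profile = 1.0 + 0.5 * np.sin(np.linspace(0, 4 * np.pi, n_slices))
D = np.outer(profile, base) + rng.normal(0.0, 5.0, (n_slices, n_zones * n_zones))

# PCA via SVD of the centered demand matrix
mean = D.mean(0)
U, s, Vt = np.linalg.svd(D - mean, full_matrices=False)

k = 3                                    # a handful of demand patterns
D_hat = mean + (U[:, :k] * s[:k]) @ Vt[:k]

# A few components reconstruct the high-dimensional matrices accurately
rel_err = np.linalg.norm(D - D_hat) / np.linalg.norm(D)
```

The low-dimensional scores `U[:, :k] * s[:k]` are what an online estimator would update, instead of the full OD matrix.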
Witjes, H.; Rijpkema, M.J.P.; Graaf, M. van der; Melssen, W.J.; Heerschap, A.; Buydens, L.M.C.
2003-01-01
PURPOSE: To explore the possibilities of combining multispectral magnetic resonance (MR) images of different patients within one data matrix. MATERIALS AND METHODS: Principal component and linear discriminant analysis was applied to multispectral MR images of 12 patients with different brain tumors.
An Analysis of Rural Poverty in Oyo State: A principal Component ...
African Journals Online (AJOL)
The choice of expenditure as a proxy for measuring poverty was further corroborated. These findings indicated that factor analysis is very helpful in poverty -targeting and alleviation. Keywords: Fundamental Freedoms of Action, Unacceptable Deprivation, Economic Growth, Rural household poverty, Principal Component ...
Application of principal component analysis to time series of daily air pollution and mortality
Quant C; Fischer P; Buringh E; Ameling C; Houthuijs D; Cassee F; MGO
2004-01-01
We investigated whether cause-specific daily mortality can be attributed to specific sources of air pollution. To construct indicators of source-specific air pollution, we applied a principal component analysis (PCA) on routinely collected air pollution data in the Netherlands during the period
Directory of Open Access Journals (Sweden)
Mohebodini Mehdi
2017-08-01
Landraces of spinach in Iran have not been sufficiently characterised for their morpho-agronomic traits. Such characterisation would be helpful in the development of new genetically improved cultivars. In this study 54 spinach accessions collected from the major spinach-growing areas of Iran were evaluated to determine the phenotypic diversity profile of spinach genotypes on the basis of 10 quantitative and 9 qualitative morpho-agronomic traits. High coefficients of variation were recorded in some quantitative traits (dry yield and leaf area) and in all of the qualitative traits. Using principal component analysis, the first four principal components with eigenvalues greater than 1 contributed 87% of the variability among accessions for quantitative traits, whereas the first four principal components with eigenvalues greater than 0.8 contributed 79% of the variability among accessions for qualitative traits. The most important relations observed on the first two principal components were a strong positive association between leaf width and petiole length, between leaf length and leaf number at flowering, and among fresh yield, dry yield and petiole diameter, and a near-zero correlation of days to flowering with leaf width and petiole length. Prickly seeds, a high percentage of female plants, smooth leaf texture, a high number of leaves at flowering, grey-green leaves, erect petiole attitude and long petioles are important characters for spinach breeding programmes.
Principal Component Analysis: Resources for an Essential Application of Linear Algebra
Pankavich, Stephen; Swanson, Rebecca
2015-01-01
Principal Component Analysis (PCA) is a highly useful topic within an introductory Linear Algebra course, especially since it can be used to incorporate a number of applied projects. This method represents an essential application and extension of the Spectral Theorem and is commonly used within a variety of fields, including statistics,…
Giesen, E.B.W.; Ding, M.; Dalstra, M.; Eijden, T.M. van
2003-01-01
As several morphological parameters of cancellous bone express more or less the same architectural measure, we applied principal components analysis to group these measures and correlated these to the mechanical properties. Cylindrical specimens (n = 24) were obtained in different orientations from
Khodasevich, M. A.; Sinitsyn, G. V.; Gres'ko, M. A.; Dolya, V. M.; Rogovaya, M. V.; Kazberuk, A. V.
2017-07-01
A study of 153 brands of commercial vodka products showed that counterfeit samples could be identified by introducing a unified additive at the minimum concentration acceptable for instrumental detection and multivariate analysis of UV-Vis transmission spectra. Counterfeit products were detected with 100% probability by using hierarchical cluster analysis or the C-means method in two-dimensional principal-component space.
Principal Component Surface (2011) for St. Thomas East End Reserve, St. Thomas
National Oceanic and Atmospheric Administration, Department of Commerce — This image represents a 0.3x0.3 meter principal component analysis (PCA) surface for areas of the St. Thomas East End Reserve (STEER) in the U.S. Virgin Islands (USVI)....
Combined principal component preprocessing and n-tuple neural networks for improved classification
DEFF Research Database (Denmark)
Høskuldsson, Agnar; Linneberg, Christian
2000-01-01
We present a combined principal component analysis/neural network scheme for classification. The data used to illustrate the method consist of spectral fluorescence recordings from seven different production facilities, and the task is to relate an unknown sample to one of these seven factories. ...
k-t PCA: temporally constrained k-t BLAST reconstruction using principal component analysis
DEFF Research Database (Denmark)
Pedersen, Henrik; Kozerke, Sebastian; Ringgaard, Steffen
2009-01-01
in applications exhibiting a broad range of temporal frequencies such as free-breathing myocardial perfusion imaging. We show that temporal basis functions calculated by subjecting the training data to principal component analysis (PCA) can be used to constrain the reconstruction such that the temporal resolution...... is improved. The presented method is called k-t PCA....
Gao, Yang; Chen, Maomao; Wu, Junyu; Zhou, Yuan; Cai, Chuangjian; Wang, Daliang; Luo, Jianwen
2017-09-01
Fluorescence molecular imaging has been used to target tumors in mice with xenograft tumors. However, tumor imaging is largely distorted by the aggregation of fluorescent probes in the liver. A principal component analysis (PCA)-based strategy was applied to the in vivo dynamic fluorescence imaging results of three mice with xenograft tumors to facilitate tumor imaging, with the help of a tumor-specific fluorescent probe. Tumor-relevant features were extracted from the original images by PCA and represented by the principal component (PC) maps. The second principal component (PC2) map represented the tumor-related features, and the first principal component (PC1) map retained the original pharmacokinetic profiles, especially of the liver. The distribution patterns of the PC2 map of the tumor-bearing mice were in good agreement with the actual tumor location. The tumor-to-liver ratio and contrast-to-noise ratio were significantly higher on the PC2 map than on the original images, thus distinguishing the tumor from the nearby fluorescence noise of the liver. The results suggest that the PC2 map could serve as a bioimaging marker to facilitate in vivo tumor localization, and dynamic fluorescence molecular imaging with PCA could be a valuable tool for future studies of in vivo tumor metabolism and progression.
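Computing PC maps from a dynamic image sequence can be sketched on synthetic data. The two-pool kinetics below (fast washout vs. slow accumulation) are invented; note that in this toy example the opposing kinetics land on the first component, whereas the paper reports the tumor features on PC2 of its real data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic dynamic sequence: 100 frames of a 16x16 image with two kinetic pools
frames, h, w = 100, 16, 16
t = np.linspace(0.0, 1.0, frames)
liver = np.zeros((h, w)); liver[2:8, 2:8] = 1.0      # fast-washout region
tumor = np.zeros((h, w)); tumor[10:14, 10:14] = 1.0  # slow-accumulation region
series = (np.exp(-3 * t)[:, None] * liver.ravel()
          + (1 - np.exp(-3 * t))[:, None] * tumor.ravel()
          + 0.01 * rng.normal(size=(frames, h * w)))

# PCA over time: rows are frames, columns are pixels
X = series - series.mean(0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

pc1_map = Vt[0].reshape(h, w)    # map of the dominant kinetic contrast
pc2_map = Vt[1].reshape(h, w)
```

Regions with opposite temporal behaviour receive loadings of opposite sign on the same component, which is what makes the PC maps usable as contrast images.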
ten Berge, Jos M.F.; Kiers, Henk A.L.
When r Principal Components are available for k variables, the correlation matrix is approximated in the least squares sense by the loading matrix times its transpose. The approximation is generally not perfect unless r = k. In the present paper it is shown that, when r is at or above the Ledermann
DEFF Research Database (Denmark)
Tian, Fang; Rades, Thomas; Sandler, Niklas
2008-01-01
The purpose of this research is to gain a greater insight into the hydrate formation processes of different carbamazepine (CBZ) anhydrate forms in aqueous suspension, where principal component analysis (PCA) was applied for data analysis. The capability of PCA to visualize and to reveal simplified...
A case of extreme simplicity of the core matrix in three-mode principal components analysis
Murakami, Takashi; Ten Berge, Jos M.F.; Kiers, Henk A.L.
In three-mode Principal Components Analysis, the P x Q x R core matrix G can be transformed to simple structure before it is interpreted. It is well-known that, when P = QR, G can be transformed to the identity matrix, which implies that all elements become equal to values specified a priori. In the
Hunley-Jenkins, Keisha Janine
2012-01-01
This qualitative study explores large, urban, mid-western principal perspectives about cyberbullying and the policy components and practices that they have found effective and ineffective at reducing its occurrence and/or negative effect on their schools' learning environments. More specifically, the researcher was interested in learning more…
Students' Perceptions of Teaching and Learning Practices: A Principal Component Approach
Mukorera, Sophia; Nyatanga, Phocenah
2017-01-01
Students' attendance and engagement with teaching and learning practices is perceived as a critical element for academic performance. Even with stipulated attendance policies, students still choose not to engage. The study employed a principal component analysis to analyze first- and second-year students' perceptions of the importance of the 12…
Fall detection in walking robots by multi-way principal component analysis
Karssen, J.G.; Wisse, M.
2008-01-01
Large disturbances can cause a biped to fall. If an upcoming fall can be detected, damage can be minimized or the fall can be prevented. We introduce the multi-way principal component analysis (MPCA) method for the detection of upcoming falls. We study the detection capability of the MPCA method in
International Nuclear Information System (INIS)
Nigran, K.S.; Barber, D.C.
1985-01-01
A method is proposed for automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered as a possible alternative to the conventional manual regions-of-interest method widely used. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted representing the particular dynamic structure of interest. Two spaces, 'study space, S' and 'theory space, T', are defined in the formation of the concept of intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic 99mTc-DTPA kidney study is used to demonstrate the principle inherent in the method proposed. The method requires no correction for blood background activity, which is necessary when processing by the manual method. Careful isolation of the kidney by means of a region of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)
Directory of Open Access Journals (Sweden)
Ehsan Nasr Esfahani
2017-12-01
Local domain structures of ferroelectrics have been studied extensively using various modes of scanning probes at the nanoscale, including piezoresponse force microscopy (PFM) and Kelvin probe force microscopy (KPFM), though none of these techniques measures the polarization directly, and the fast formation kinetics of domains and screening charges cannot be captured by these quasi-static measurements. In this study, we used charge gradient microscopy (CGM) to image ferroelectric domains of lithium niobate based on the current measured during fast scanning, and applied principal component analysis (PCA) to enhance the signal-to-noise ratio of the noisy raw data. We found that the CGM signal increases linearly with the scan speed while decreasing with temperature following a power law, consistent with the proposed imaging mechanisms of scraping and refilling of surface charges within domains and polarization change across domain walls. Based on the CGM mappings, we then estimated the spontaneous polarization and the density of surface charges, with order-of-magnitude agreement with literature data. The study demonstrates that PCA is a powerful method for imaging analysis in scanning probe microscopy (SPM), making quantitative analysis of noisy raw data possible.
Recognition of grasp types through principal components of DWT based EMG features.
Kakoty, Nayan M; Hazarika, Shyamanta M
2011-01-01
With the advancement of machine learning and signal processing techniques, electromyogram (EMG) signals have increasingly gained importance in man-machine interaction. Multifingered hand prostheses using surface EMG for control have appeared on the market. However, EMG-based control is still rudimentary, being limited to a few hand postures based on a higher number of EMG channels. Moreover, control is non-intuitive, in the sense that the user is required to learn to associate muscle remnant actions with unrelated postures of the prosthesis. Herein lies the promise of a low-channel EMG-based grasp classification architecture for the development of an embedded intelligent prosthetic controller. This paper reports classification of six grasp types used during 70% of daily living activities, based on two-channel forearm EMG. A feature vector is derived through principal component analysis of discrete wavelet transform coefficient based features of the EMG signal. Classification is through a radial basis function kernel based support vector machine, following preprocessing and maximum voluntary contraction normalization of the EMG signals. 10-fold cross-validation is performed. We have achieved an average recognition rate of 97.5%. © 2011 IEEE
International Nuclear Information System (INIS)
Kaistha, Nitin; Upadhyaya, Belle R.
2001-01-01
An integrated method for the detection and isolation of incipient faults in common field devices, such as sensors and actuators, using plant operational data is presented. The approach is based on the premise that data for normal operation lie on a surface and abnormal situations lead to deviations from the surface in a particular way. Statistically significant deviations from the surface result in the detection of faults, and the characteristic directions of deviations are used for isolation of one or more faults from the set of typical faults. Principal component analysis (PCA), a multivariate data-driven technique, is used to capture the relationships in the data and fit a hyperplane to the data. The fault direction for each of the scenarios is obtained using the singular value decomposition on the state and control function prediction errors, and fault isolation is then accomplished from projections on the fault directions. This approach is demonstrated for a simulated pressurized water reactor steam generator system and for a laboratory process control system under single device fault conditions. Enhanced fault isolation capability is also illustrated by incorporating realistic nonlinear terms in the PCA data matrix
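The premise that normal-operation data lie on a surface, with faults detected as statistically significant deviations from it, can be sketched with a PCA hyperplane fit and a squared-prediction-error statistic. The plant model below is a made-up linear mixing, not the steam generator system of the paper, and the fault is a simple sensor bias.

```python
import numpy as np

rng = np.random.default_rng(9)
# Made-up plant: 3 underlying states observed through 6 sensors
states = rng.normal(size=(500, 3))
mixing = rng.normal(size=(3, 6))
normal_ops = states @ mixing + 0.01 * rng.normal(size=(500, 6))

# Fit the "surface" of normal operation with PCA
mean = normal_ops.mean(0)
U, s, Vt = np.linalg.svd(normal_ops - mean, full_matrices=False)
P = Vt[:3]                                # basis of the principal hyperplane

def spe(x):
    """Squared prediction error: squared distance from the normal surface."""
    r = (x - mean) - ((x - mean) @ P.T) @ P
    return float(r @ r)

healthy = states[0] @ mixing              # lies (almost) on the surface
faulty = healthy.copy()
faulty[2] += 3.0                          # bias fault on sensor 3
```

In the paper's scheme, the direction of the residual `r` is additionally compared against characteristic fault directions to isolate which device failed.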
Li, Jiangtong; Luo, Yongdao; Dai, Honglin
2018-01-01
Water is the source and essential foundation of all life. With the development of industrialization, water pollution has become more and more frequent, directly affecting human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which partial least squares regression (PLSR) is becoming the predominant technology; however, in some special cases PLSR produces considerable errors. To solve this problem, the traditional principal component regression (PCR) method is improved in this paper by using the principle of PLSR. The experimental results show that for some special data sets the improved PCR performs better than PLSR. PCR and PLSR are the focus of this paper. First, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components, which carry most of the original data information, are extracted using the principle of PLSR. Second, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components can be obtained. Finally, the same water spectral data set is processed by both PLSR and improved PCR, and the two results are analyzed and compared: improved PCR and PLSR are similar for most data, but improved PCR is better than PLSR for data near the detection limit. Both PLSR and improved PCR can be used in UV spectral analysis of water, but for data near the detection limit the improved PCR gives better results than PLSR.
Directory of Open Access Journals (Sweden)
Panazzolo Diogo G
2012-11-01
Abstract Background We aimed to evaluate the multivariate association between functional microvascular variables and clinical-laboratorial-anthropometrical measurements. Methods Data from 189 female subjects (34.0±15.5 years, 30.5±7.1 kg/m², non-smokers, non-regular drug users, without a history of diabetes and/or hypertension) were analyzed by principal component analysis (PCA). PCA is a classical multivariate exploratory tool because it highlights common variation between variables, allowing inferences about the possible biological meaning of associations between them without pre-establishing cause-effect relationships. In total, 15 variables were used for PCA: body mass index (BMI), waist circumference, systolic and diastolic blood pressure (BP), fasting plasma glucose, levels of total cholesterol, high-density lipoprotein cholesterol (HDL-c), low-density lipoprotein cholesterol (LDL-c), triglycerides (TG), insulin, C-reactive protein (CRP), and functional microvascular variables measured by nailfold videocapillaroscopy. Nailfold videocapillaroscopy was used for direct visualization of nutritive capillaries, assessing functional capillary density, red blood cell velocity at rest (RBCV) and at peak after 1 min of arterial occlusion (RBCVmax), and the time taken to reach RBCVmax (TRBCVmax). Results A total of 35% of subjects had metabolic syndrome, 77% were overweight/obese, and 9.5% had impaired fasting glucose. PCA was able to recognize that functional microvascular variables and clinical-laboratorial-anthropometrical measurements had a similar variation. The first five principal components explained most of the intrinsic variation of the data. For example, principal component 1 was associated with BMI, waist circumference, systolic BP, diastolic BP, insulin, TG, CRP, and TRBCVmax varying in the same way. Principal component 1 also showed a strong association among HDL-c, RBCV, and RBCVmax, but in the opposite way. Principal component 3 was
International Nuclear Information System (INIS)
Zarzo, Manuel; Marti, Pau
2011-01-01
Research highlights: → Principal components analysis was applied to Rs data recorded at 30 stations. → Four principal components explain 97% of the data variability. → The latent variables can be fitted according to latitude, longitude and altitude. → The PCA approach is more effective for gap infilling than conventional approaches. → The proposed method allows daily Rs estimations at locations in the area of study. - Abstract: Measurements of global terrestrial solar radiation (Rs) are commonly recorded in meteorological stations. The daily variability of Rs has to be taken into account in the design of photovoltaic systems and energy-efficient buildings. Principal components analysis (PCA) was applied to Rs data recorded at 30 stations on the Mediterranean coast of Spain. Due to equipment failures and site operation problems, time series of Rs often present data gaps or discontinuities. The PCA approach copes with this problem and allows estimation of present and past values by taking advantage of Rs records from nearby stations. The gap-infilling performance of this methodology is compared with neural networks and alternative conventional approaches. Four principal components explain 66% of the data variability with respect to the average trajectory (97% if non-centered values are considered). A new method based on principal components regression was also developed for Rs estimation when previous measurements are not available. By means of multiple linear regression, it was found that the latent variables associated with the four relevant principal components can be fitted according to the latitude, longitude and altitude of the station where the data were recorded. Additional geographical or climatic variables did not increase the predictive goodness of fit. The resulting models allow the estimation of daily Rs values at any location in the area under study and present higher accuracy than artificial neural networks and some conventional approaches.
Salvatore, Stefania; Røislien, Jo; Baz-Lomba, Jose A; Bramness, Jørgen G
2017-03-01
Wastewater-based epidemiology is an alternative method for estimating the collective drug use in a community. We applied functional data analysis, a statistical framework developed for analysing curve data, to investigate weekly temporal patterns in wastewater measurements of three prescription drugs with known abuse potential: methadone, oxazepam and methylphenidate, comparing them to positive and negative control drugs. Sewage samples were collected in February 2014 from a wastewater treatment plant in Oslo, Norway. The weekly pattern of each drug was extracted by fitting of generalized additive models, using trigonometric functions to model the cyclic behaviour. From the weekly component, the main temporal features were then extracted using functional principal component analysis. Results are presented through the functional principal components (FPCs) and corresponding FPC scores. Clinically, the most important weekly feature of the wastewater-based epidemiology data was the second FPC, representing the difference between average midweek level and a peak during the weekend, representing possible recreational use of a drug in the weekend. Estimated scores on this FPC indicated recreational use of methylphenidate, with a high weekend peak, but not for methadone and oxazepam. The functional principal component analysis uncovered clinically important temporal features of the weekly patterns of the use of prescription drugs detected from wastewater analysis. This may be used as a post-marketing surveillance method to monitor prescription drugs with abuse potential. Copyright © 2016 John Wiley & Sons, Ltd.
Directory of Open Access Journals (Sweden)
Suwicha Jirayucharoensak
2014-01-01
Automatic emotion recognition is one of the most challenging tasks. To detect emotion from nonstationary EEG signals, a sophisticated learning algorithm that can represent high-level abstraction is required. This study proposes the utilization of a deep learning network (DLN) to discover unknown feature correlations between input signals that are crucial for the learning task. The DLN is implemented with a stacked autoencoder (SAE) using a hierarchical feature learning approach. Input features of the network are power spectral densities of 32-channel EEG signals from 32 subjects. To alleviate the overfitting problem, principal component analysis (PCA) is applied to extract the most important components of the initial input features. Furthermore, covariate shift adaptation of the principal components is implemented to minimize the nonstationary effect of the EEG signals. Experimental results show that the DLN is capable of classifying three different levels of valence and arousal with accuracies of 49.52% and 46.03%, respectively. Principal-component-based covariate shift adaptation enhances the respective classification accuracies by 5.55% and 6.53%. Moreover, the DLN provides better performance compared to SVM and naive Bayes classifiers.
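The PCA preprocessing step, retaining the leading components of the PSD features before they reach the network, might look like this (the feature variances and the 95% variance cut-off are illustrative assumptions; the paper does not state its retention rule):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical inputs: 32 subjects x 160 power-spectral-density features,
# with variance decaying across feature directions
variances = np.linspace(3.0, 0.05, 160)
X = rng.normal(size=(32, 160)) * np.sqrt(variances)

# Keep the fewest principal components explaining 95% of the variance
Xc = X - X.mean(0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = np.cumsum(s ** 2) / np.sum(s ** 2)
k = int(np.searchsorted(explained, 0.95)) + 1
Z = Xc @ Vt[:k].T        # reduced features for the downstream network
```

Shrinking the input dimension this way is what limits overfitting when the sample count (here 32 subjects) is far smaller than the raw feature count.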
Principal Component Analysis of Working Memory Variables during Child and Adolescent Development.
Barriga-Paulino, Catarina I; Rodríguez-Martínez, Elena I; Rojas-Benjumea, María Ángeles; Gómez, Carlos M
2016-10-03
Correlation and Principal Component Analysis (PCA) of behavioral measures from two experimental tasks (Delayed Match-to-Sample and Oddball), and standard scores from a neuropsychological test battery (Working Memory Test Battery for Children) was performed on data from participants between 6-18 years old. The correlation analysis (p 1), the scores of the first extracted component were significantly correlated (p < .05) to most behavioral measures, suggesting some commonalities of the processes of age-related changes in the measured variables. The results suggest that this first component would be related to age but also to individual differences during the cognitive maturation process across childhood and adolescence stages. The fourth component would represent the speed-accuracy trade-off phenomenon as it presents loading components with different signs for reaction times and errors.
International Nuclear Information System (INIS)
Li, Yanfu; Liu, Hongli; Ma, Ziji
2016-01-01
Rail corrugation dynamic measurement techniques are critical to guarantee transport security and guide rail maintenance. During the inspection process, low-frequency trends caused by rail fluctuation are usually superimposed on rail corrugation and seriously affect the assessment of rail maintenance quality. In order to extract and remove the nonlinear and non-stationary trends from the original mixed signals, a hybrid model based on ensemble empirical mode decomposition (EEMD) and modified principal component analysis (MPCA) is proposed in this paper. Compared with the existing de-trending methods based on EMD, this method first considers that the low-frequency intrinsic mode functions (IMFs) thought to be underlying trend components may contain some unrelated components, such as white noise and the low-frequency signal itself, and proposes to use PCA to accurately extract the pure trends from the IMFs containing multiple components. On the other hand, because the energy contribution ratio between trends and mixed signals is unknown a priori, and the principal components (PCs) decomposed by PCA are arranged in order of decreasing energy without considering frequency distribution, the proposed method modifies traditional PCA and selects only the relevant low-frequency PCs to reconstruct the trends, based on the zero-crossing number (ZCN) of each PC. Extensive tests are presented to illustrate the effectiveness of the proposed method. The results show the proposed EEMD-PCA-ZCN is an effective tool for trend extraction of dynamically measured rail corrugation. (paper)
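The zero-crossing criterion for deciding which components belong to the trend can be illustrated in isolation: the components below are constructed directly rather than obtained from EEMD and PCA, and the threshold of 10 crossings is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0.0, 1.0, 1000)
trend = 0.5 * t ** 2 + 0.2 * t                   # slow nonlinear trend
ripple = 0.1 * np.sin(2 * np.pi * 40 * t)        # corrugation-like oscillation
noise = 0.02 * rng.normal(size=t.size)
components = [trend - trend.mean(), ripple, noise]

def zero_crossings(x):
    """Count sign changes between consecutive samples."""
    return int(np.sum(np.signbit(x[:-1]) != np.signbit(x[1:])))

# Keep only components whose zero-crossing number marks them as low-frequency
threshold = 10
recovered_trend = sum(c for c in components if zero_crossings(c) <= threshold)
```

The appeal of the criterion is that it sorts components by oscillation rate rather than by energy, which is exactly the property PCA ordering lacks.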
Cloud Masking for Remotely Sensed Data Using Spectral and Principal Components Analysis
Directory of Open Access Journals (Sweden)
A. Ahmad
2012-06-01
Full Text Available Two methods of cloud masking tuned to tropical conditions have been developed, based on spectral analysis and Principal Components Analysis (PCA) of Moderate Resolution Imaging Spectroradiometer (MODIS) data. In the spectral approach, thresholds were applied to four reflective bands (1, 2, 3, and 4), three thermal bands (29, 31 and 32), the band 2/band 1 ratio, and the difference between bands 29 and 31 in order to detect clouds. The PCA approach applied a threshold to the first principal component derived from the seven quantities used for spectral analysis. Cloud detections were compared with the standard MODIS cloud mask, and their accuracy was assessed using reference images and geographical information on the study area.
Directory of Open Access Journals (Sweden)
S. Roy
2013-12-01
Full Text Available The present investigation is an experimental approach to depositing an electroless Ni-P-W coating on a mild steel substrate and finding the optimum combination of various tribological performances, on the basis of minimum friction and wear, using weighted principal component analysis (WPCA). In this study three main tribological parameters are chosen, viz. load (A), speed (B) and time (C). The responses are coefficient of friction and wear depth. The WPCA method is adopted to convert the multiple responses into a single performance index called the multiple performance index (MPI), and a Taguchi L27 orthogonal array is used to design the experiment and to find the optimum combination of tribological parameters for minimum coefficient of friction and wear depth. ANOVA is performed to find the significance of each tribological process parameter and their interactions. EDX analysis, SEM and XRD are performed to study the compositional and structural aspects.
Directory of Open Access Journals (Sweden)
Pengyu Gao
2016-03-01
Full Text Available It is difficult to forecast well productivity because of the complexity of vertical and horizontal development in fluvial facies reservoirs. This paper proposes a method based on principal component analysis and an artificial neural network to predict the well productivity of fluvial facies reservoirs. The method summarizes the statistical reservoir factors and engineering factors that affect well productivity, extracts information by applying principal component analysis, and exploits the arbitrary-function approximation capability of the neural network to realize an accurate and efficient prediction of fluvial facies reservoir well productivity. The method provides an effective way to forecast the productivity of fluvial facies reservoirs, which is affected by multiple factors and complex mechanisms. The study results show that this is a practical, effective and accurate indirect productivity forecasting method suitable for field application.
Variability search in M 31 using Principal Component Analysis and the Hubble Source Catalog
Moretti, M. I.; Hatzidimitriou, D.; Karampelas, A.; Sokolovsky, K. V.; Bonanos, A. Z.; Gavras, P.; Yang, M.
2018-03-01
Principal Component Analysis (PCA) is being extensively used in Astronomy but not yet exhaustively exploited for variability search. The aim of this work is to investigate the effectiveness of using the PCA as a method to search for variable stars in large photometric data sets. We apply PCA to variability indices computed for light curves of 18152 stars in three fields in M 31 extracted from the Hubble Source Catalogue. The projection of the data into the principal components is used as a stellar variability detection and classification tool, capable of distinguishing between RR Lyrae stars, long period variables (LPVs) and non-variables. This projection recovered more than 90% of the known variables and revealed 38 previously unknown variable stars (about 30% more), all LPVs except for one object of uncertain variability type. We conclude that this methodology can indeed successfully identify candidate variable stars.
A Cure for Variance Inflation in High Dimensional Kernel Principal Component Analysis
DEFF Research Database (Denmark)
Abrahamsen, Trine Julie; Hansen, Lars Kai
2011-01-01
Small sample high-dimensional principal component analysis (PCA) suffers from variance inflation and lack of generalizability. It has earlier been pointed out that a simple leave-one-out variance renormalization scheme can cure the problem. In this paper we generalize the cure in two directions: first, we propose a computationally less intensive approximate leave-one-out estimator; secondly, we show that variance inflation is also present in kernel principal component analysis (kPCA) and provide a non-parametric renormalization scheme which can quite efficiently restore generalizability in kPCA. As for PCA, our analysis also suggests a simplified approximate expression. © 2011 Trine J. Abrahamsen and Lars K. Hansen.
Directory of Open Access Journals (Sweden)
Haorui Liu
2016-01-01
Full Text Available In car control systems, it is hard to measure some key vehicle states directly and accurately when running on the road, and the cost of measurement is high as well. To address these problems, a vehicle state estimation method based on kernel principal component analysis and an improved Elman neural network is proposed. Combining this with a nonlinear vehicle model of three degrees of freedom (3 DOF: longitudinal, lateral, and yaw motion), this paper applies the method to soft sensing of the vehicle states. Simulation results for a double lane change, tested by Matlab/SIMULINK cosimulation, prove the KPCA-IENN algorithm (kernel principal component analysis and improved Elman neural network) to be quick and precise when tracking the vehicle states within the nonlinear region. The method can meet the software performance requirements of vehicle state estimation in precision, tracking speed, noise suppression, and other aspects.
Principal component analysis of the nonlinear coupling of harmonic modes in heavy-ion collisions
Bożek, Piotr
2018-03-01
The principal component analysis of flow correlations in heavy-ion collisions is studied. The correlation matrix of harmonic flow is generalized to correlations involving several different flow vectors. The method can be applied to study the nonlinear coupling between different harmonic modes in a double differential way in transverse momentum or pseudorapidity. The procedure is illustrated with results from the hydrodynamic model applied to Pb + Pb collisions at √(s_NN) = 2760 GeV. Three examples of generalized correlation matrices in transverse momentum are constructed, corresponding to the coupling of v2^2 and v4, of v2v3 and v5, or of v2^3, v3^3, and v6. The principal component decomposition is applied to the correlation matrices and the dominant modes are calculated.
Kanoga, Suguru; Murai, Akihiko; Tada, Mitsunori
2017-07-01
Forearm movements realize various functions needed in daily life. To reproduce such motion sequences, active myoelectric devices have been developed. Feature indices are usually extracted from the observed signals in the control strategy; however, the optimal combination of indices is still unclear. This paper introduces a sparsity-inducing penalty term into principal component analysis (PCA) to explore optimal myoelectric feature indices. An electromyographic database including seven forearm movements from 30 subjects was used for performance comparison. A linear classifier with sparse features showed the best performance (7.86±3.82% error rate), significantly better than a linear classifier with all features, owing to the recovery of the low-rank structure of the original data. Furthermore, the sparse features captured the underlying data structure with fewer principal components than PCA. Root-mean-square value, time-domain features, autoregressive coefficients, and histogram features appeared to be important in the projected feature space; these feature indices are therefore important to myoelectric control strategies.
Influencing Factors of Catering and Food Service Industry Based on Principal Component Analysis
Zi Tang
2014-01-01
Scientific analysis of influencing factors is of great importance for the healthy development of catering and food service industry. This study attempts to present a set of critical indicators for evaluating the contribution of influencing factors to catering and food service industry in the particular context of Harbin City, Northeast China. Ten indicators that correlate closely with catering and food service industry were identified and performed by the principal component analysis method u...
Karpuzcu, M Ekrem; Fairbairn, David; Arnold, William A; Barber, Brian L; Kaufenberg, Elizabeth; Koskinen, William C; Novak, Paige J; Rice, Pamela J; Swackhamer, Deborah L
2014-01-01
Principal components analysis (PCA) was used to identify sources of emerging organic contaminants in the Zumbro River watershed in Southeastern Minnesota. Two main principal components (PCs) were identified, which together explained more than 50% of the variance in the data. Principal Component 1 (PC1) was attributed to urban wastewater-derived sources, including municipal wastewater and residential septic tank effluents, while Principal Component 2 (PC2) was attributed to agricultural sources. The variances of the concentrations of cotinine, DEET and the prescription drugs carbamazepine, erythromycin and sulfamethoxazole were best explained by PC1, while the variances of the concentrations of the agricultural pesticides atrazine, metolachlor and acetochlor were best explained by PC2. Mixed use compounds carbaryl, iprodione and daidzein did not specifically group with either PC1 or PC2. Furthermore, despite the fact that caffeine and acetaminophen have been historically associated with human use, they could not be attributed to a single dominant land use category (e.g., urban/residential or agricultural). Contributions from septic systems did not clarify the source for these two compounds, suggesting that additional sources, such as runoff from biosolid-amended soils, may exist. Based on these results, PCA may be a useful way to broadly categorize the sources of new and previously uncharacterized emerging contaminants or may help to clarify transport pathways in a given area. Acetaminophen and caffeine were not ideal markers for urban/residential contamination sources in the study area and may need to be reconsidered as such in other areas as well.
An application of principal component analysis to the clavicle and clavicle fixation devices.
LENUS (Irish Health Repository)
Daruwalla, Zubin J
2010-01-01
Principal component analysis (PCA) enables the building of statistical shape models of bones and joints. This has been used in conjunction with computer assisted surgery in the past. However, PCA of the clavicle has not been performed. Using PCA, we present a novel method that examines the major modes of size and three-dimensional shape variation in male and female clavicles and suggests a method of grouping the clavicle into size and shape categories.
Classification of Basmati Rice Grain Variety using Image Processing and Principal Component Analysis
Kambo, Rubi; Yerpude, Amit
2014-01-01
All important decisions about the variety of a rice grain end product are based on the different features of the rice grain. Various methods are available for the classification of basmati rice. This paper proposes a new principal component analysis-based approach for the classification of different varieties of basmati rice. The experimental results show the effectiveness of the proposed methodology for various samples of different varieties of basmati rice.
Reconstruction Error and Principal Component Based Anomaly Detection in Hyperspectral Imagery
2014-03-27
Thesis presented to the Faculty, Department of Aeronautics and Astronautics, Graduate School of Engineering and Management, Air Force Institute of Technology. The term 'principal components' was coined by Hotelling (1933).
A principal components approach to parent-to-newborn body composition associations in South India
Veena, Sargoor R; Krishnaveni, Ghattu V; Wills, Andrew K; Hill, Jacqueline C; Fall, Caroline HD
2009-01-01
Abstract Background Size at birth is influenced by environmental factors, like maternal nutrition and parity, and by genes. Birth weight is a composite measure, encompassing bone, fat and lean mass. These may have different determinants. The main purpose of this paper was to use anthropometry and principal components analysis (PCA) to describe maternal and newborn body composition, and associations between them, in an Indian population. We also compared maternal and paternal measurements (bod...
Directory of Open Access Journals (Sweden)
Christian NZENGUE PEGNET
2011-07-01
Full Text Available The recent financial turmoil has clearly highlighted the potential role of financial factors in the amplification of macroeconomic developments and stressed the importance of analyzing the relationship between banks' balance sheets and economic activity. This paper assesses the impact of the bank capital channel in the transmission of shocks in Europe on the basis of banks' balance sheet data. The empirical analysis is carried out through a Principal Component Analysis and a Vector Error Correction Model.
Duforet-Frebourg, Nicolas; Luu, Keurcien; Laval, Guillaume; Bazin, Eric; Blum, Michael G B
2016-04-01
To characterize natural selection, various analytical methods for detecting candidate genomic regions have been developed. We propose to perform genome-wide scans of natural selection using principal component analysis (PCA). We show that the common FST index of genetic differentiation between populations can be viewed as the proportion of variance explained by the principal components. Considering the correlations between genetic variants and each principal component provides a conceptual framework to detect genetic variants involved in local adaptation without any prior definition of populations. To validate the PCA-based approach, we consider the 1000 Genomes data (phase 1), comprising 850 individuals from Africa, Asia, and Europe. The number of genetic variants is of the order of 36 million, obtained with a low-coverage sequencing depth (3×). The correlations between genetic variation and each principal component provide well-known targets for positive selection (EDAR, SLC24A5, SLC45A2, DARC), and also new candidate genes (APPBPP2, TP1A1, RTTN, KCNMA, MYO5C) and noncoding RNAs. In addition to identifying genes involved in biological adaptation, we identify two biological pathways involved in polygenic adaptation that are related to the innate immune system (beta defensins) and to lipid metabolism (fatty acid omega oxidation). An additional analysis of European data shows that a genome scan based on PCA retrieves classical examples of local adaptation even when there are no well-defined populations. PCA-based statistics, implemented in the PCAdapt R package and the PCAdapt fast open-source software, retrieve well-known signals of human adaptation, which is encouraging for future whole-genome sequencing projects, especially when defining populations is difficult. © The Author(s) 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
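The scan statistic described here (the correlation of each variant with a principal component, with no populations predefined) can be sketched on simulated genotypes. The population sizes, allele frequencies, and number of differentiated loci below are illustrative assumptions, not the 1000 Genomes data:

```python
import numpy as np

rng = np.random.default_rng(1)

# toy genotype matrix: 60 individuals x 200 SNPs coded 0/1/2, with two
# groups of 30 that differ strongly at the first 20 (hypothetical
# "locally adapted") loci and nowhere else
n_ind, n_snp, n_sel = 60, 200, 20
G = rng.binomial(2, 0.5, size=(n_ind, n_snp)).astype(float)
G[:30, :n_sel] = rng.binomial(2, 0.9, size=(30, n_sel))
G[30:, :n_sel] = rng.binomial(2, 0.1, size=(30, n_sel))

# standardize and take the leading principal component of the individuals;
# it picks up the population structure without it being declared anywhere
X = (G - G.mean(axis=0)) / (G.std(axis=0) + 1e-12)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = U[:, 0] * s[0]

# squared correlation of each SNP with PC1 is the genome-scan statistic:
# outlier loci are candidates for selection
r2 = np.array([np.corrcoef(X[:, j], pc1)[0, 1] ** 2 for j in range(n_snp)])
```

In this toy run the differentiated loci stand out as r² outliers while the neutral loci stay near zero, mirroring the outlier logic of the PCA-based scan.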
[Content of mineral elements of Gastrodia elata by principal components analysis].
Li, Jin-ling; Zhao, Zhi; Liu, Hong-chang; Luo, Chun-li; Huang, Ming-jin; Luo, Fu-lai; Wang, Hua-lei
2015-03-01
To study the content of mineral elements and the principal components in Gastrodia elata, mineral elements were determined by ICP and the data were analyzed with SPSS. K had the highest content, with an average of 15.31 g x kg(-1); the average content of N was 8.99 g x kg(-1), second only to K. The coefficients of variation of K and N were small, while that of Mn was the largest, at 51.39%. Highly significant positive correlations were found among N, P and K. Three principal components were selected by principal components analysis to evaluate the quality of G. elata; P, B, N, K, Cu, Mn, Fe and Mg were the characteristic elements of G. elata. The contents of K and N were higher and relatively stable, whereas the variation in Mn content was the largest. From the perspective of mineral elements, the quality of G. elata from Guizhou and Yunnan was better.
Dynamic of consumer groups and response of commodity markets by principal component analysis
Nobi, Ashadun; Alam, Shafiqul; Lee, Jae Woo
2017-09-01
This study investigates financial states and group dynamics by applying principal component analysis to the cross-correlation coefficients of the daily returns of commodity futures. The eigenvalues of the cross-correlation matrix in the 6-month timeframe display similar values during 2010-2011, but decline following 2012. A sharp drop in the eigenvalue implies a significant change of the market state. Three commodity sectors, energy, metals and agriculture, are projected into a two-dimensional space consisting of the first two principal components (PCs). We observe that they form three distinct clusters corresponding to the sectors. However, commodities with distinct features intermingled and scattered during severe crises, such as the European sovereign debt crisis. We observe notable changes in the positions of the groups in the two-dimensional space during financial crises. By considering the first principal component (PC1) within the 6-month moving timeframe, we observe that commodities of the same group change states in a similar pattern, and the change of states of one group can be used as a warning for other groups.
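The eigenvalue diagnostic described above can be sketched in a few lines. The toy return series, number of assets, and window length below are illustrative, not the commodity futures data:

```python
import numpy as np

rng = np.random.default_rng(7)

def leading_eigenvalue(returns):
    """Largest eigenvalue of the cross-correlation matrix of daily
    returns (assets in columns); a sharp drop in this value signals a
    change of market state."""
    C = np.corrcoef(returns, rowvar=False)
    return float(np.linalg.eigvalsh(C)[-1])

# toy data: 10 commodities over two 120-day windows, first strongly
# coupled through a common factor, then trading almost independently
common = rng.standard_normal((120, 1))
coupled = 0.9 * common + 0.1 * rng.standard_normal((120, 10))
independent = rng.standard_normal((120, 10))

lam_coupled = leading_eigenvalue(coupled)          # close to the number of assets
lam_independent = leading_eigenvalue(independent)  # near the random-matrix level
```

Tracking this quantity over a moving window reproduces the kind of state indicator the abstract describes: the coupled regime pushes the leading eigenvalue toward the number of assets, while decoupling drops it toward the noise level.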
Yan, Han-Jing; Fang, Zhi-Jian
2008-02-01
To explore the characteristics of inorganic elements in Polygonum multiflorum, the contents of elements such as Al, B, Ba, Ca, Cu, Mn, Mg, Fe, Na, Ni, P, Se, Sr, Ti and Zn in nine P. multiflorum samples were determined by means of ICP-AES. The results were used to develop element distribution diagrams. Principal component analysis and one-way ANOVA in SPSS were applied to study the characteristic elements in P. multiflorum. The contents of Al, Ca, K, Mg, Sr and Ti in wild P. multiflorum were remarkably higher than those in cultivated P. multiflorum, and there was no significant difference between cultivated and wild samples for the other elements. Five principal components accounting for over 90% of the total variance were extracted from the original data. The analysis results show that Al, B, Ba, Fe, Na, Ni, Ti, Ca and Sr may be the characteristic elements in P. multiflorum. The element distribution diagram of the sample from Tianyang was remarkably different from the others. Principal component analysis can be used for data processing of inorganic elements.
Dovbeshko, G. I.; Repnytska, O. P.; Pererva, T.; Miruta, A.; Kosenkov, D.
2004-07-01
Conformation analysis of mutated DNA-bacteriophages (PLys-23, P23-2, P47; the numbers were assigned by T. Pererva) induced by MS2 virus incorporated in E. coli AB259 Hfr 3000 has been performed. Surface enhanced infrared absorption (SEIRA) spectroscopy and principal component analysis were applied to this problem. The nucleic acids isolated from the mutated phages had the form of double-stranded DNA with different modifications. The nucleic acid from phage P47 underwent structural rearrangement to the greatest degree. The shape and position of the fine structure of the asymmetric phosphate band at 1071 cm(-1), as well as the OH stretching vibration at 3370-3390 cm(-1), indicated the appearance of additional OH groups. Z-form features were found in the base vibration region (1694 cm(-1)) and the sugar region (932 cm(-1)). A modification of the DNA structure by Z-fragments is proposed for the P47 phage. The P23-2 and PLys-23 phages also showed numerous minor structural changes. On the basis of the SEIRA spectra, we determined the characteristic parameters of the nucleic acid marker bands used for construction of the principal components, and estimated the contribution of different spectral parameters of the nucleic acids to the principal components.
Principal component analysis of Raman spectra for TiO2 nanoparticle characterization
Ilie, Alina Georgiana; Scarisoareanu, Monica; Morjan, Ion; Dutu, Elena; Badiceanu, Maria; Mihailescu, Ion
2017-09-01
The Raman spectra of anatase/rutile mixed-phase Sn-doped TiO2 nanoparticles and undoped TiO2 nanoparticles, synthesised by laser pyrolysis, with nanocrystallite dimensions varying from 8 to 28 nm, were processed with self-written software that applies Principal Component Analysis (PCA) to the measured spectra to verify the possibility of objective auto-characterization of nanoparticles from their vibrational modes. The photo-excited process of Raman scattering is very sensitive to the material characteristics, especially in the case of nanomaterials, where more properties become relevant to the vibrational behaviour. We used PCA, a statistical procedure that performs eigenvalue decomposition of the data covariance, to automatically analyse each sample's measured Raman spectrum and to infer the correlation between nanoparticle dimensions, tin and carbon concentrations, and their principal component values (PCs). This type of application allows an approximation of the crystallite size, or the tin concentration, to be made only by measuring the Raman spectrum of the sample. The study of the loadings of the principal components provides information on the way the vibrational modes are affected by the nanoparticle features and on the spectral regions relevant for the classification.
Walker, Matthew D; Bradley, Kevin M; McGowan, Daniel R
2018-02-08
Respiratory motion can degrade PET image quality and lead to inaccurate quantification of lesion uptake. Such motion can be mitigated via respiratory gating. Our objective was to evaluate a data driven gating (DDG) technique that is being developed commercially for clinical PET/CT. A data driven respiratory gating algorithm based on principal component analysis (PCA) was applied to phantom and FDG patient data. An anthropomorphic phantom and a NEMA IEC Body phantom were filled with 18F, placed on a respiratory motion platform, and imaged using a PET/CT scanner. Motion waveforms were measured using an infra-red camera (the Real-time Position Management™ system (RPM)) and also extracted from the PET data using the DDG algorithm. The waveforms were compared via calculation of Pearson's correlation coefficients. PET data were reconstructed using quiescent period gating (QPG) and compared via measurement of recovery percentage and background variability. Data driven gating had similar performance to the external gating system, with correlation coefficients in excess of 0.97. Phantom and patient images were visually clearer with improved contrast when QPG was applied as compared to no motion compensation. Recovery coefficients in the phantoms were not significantly different between DDG- and RPM-based QPG, but were significantly higher than those found for no motion compensation (p<0.05). A PCA-based DDG algorithm was evaluated and found to provide a reliable respiratory gating signal in anthropomorphic phantom studies and in example patients. Advances in knowledge: The prototype commercial DDG algorithm may enable reliable respiratory gating in routine clinical PET-CT.
Kernel Principal Component Analysis for dimensionality reduction in fMRI-based diagnosis of ADHD.
Sidhu, Gagan S; Asgarian, Nasimeh; Greiner, Russell; Brown, Matthew R G
2012-01-01
This study explored various feature extraction methods for use in automated diagnosis of Attention-Deficit Hyperactivity Disorder (ADHD) from functional Magnetic Resonance Image (fMRI) data. Each participant's data consisted of a resting state fMRI scan as well as phenotypic data (age, gender, handedness, IQ, and site of scanning) from the ADHD-200 dataset. We used machine learning techniques to produce support vector machine (SVM) classifiers that attempted to differentiate between (1) all ADHD patients vs. healthy controls and (2) ADHD combined (ADHD-c) type vs. ADHD inattentive (ADHD-i) type vs. controls. In different tests, we used only the phenotypic data, only the imaging data, or else both the phenotypic and imaging data. For feature extraction on fMRI data, we tested the Fast Fourier Transform (FFT), different variants of Principal Component Analysis (PCA), and combinations of FFT and PCA. PCA variants included PCA over time (PCA-t), PCA over space and time (PCA-st), and kernelized PCA (kPCA-st). Baseline chance accuracy was 64.2% produced by guessing healthy control (the majority class) for all participants. Using only phenotypic data produced 72.9% accuracy on two class diagnosis and 66.8% on three class diagnosis. Diagnosis using only imaging data did not perform as well as phenotypic-only approaches. Using both phenotypic and imaging data with combined FFT and kPCA-st feature extraction yielded accuracies of 76.0% on two class diagnosis and 68.6% on three class diagnosis-better than phenotypic-only approaches. Our results demonstrate the potential of using FFT and kPCA-st with resting-state fMRI data as well as phenotypic data for automated diagnosis of ADHD. These results are encouraging given known challenges of learning ADHD diagnostic classifiers using the ADHD-200 dataset (see Brown et al., 2012).
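As a rough illustration of the kernelized PCA feature-extraction step, the sketch below implements a generic RBF kernel PCA in numpy (not the study's kPCA-st variant over space and time, nor its fMRI data). Two classes that are not linearly separable in the input space become separable along the leading kernel principal component:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Minimal RBF kernel PCA: build the Gram matrix, double-center it
    (centering in feature space), and project onto the top eigenvectors."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# two concentric rings: not linearly separable in the input space,
# but separated along the leading kernel principal component
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, 200)
r = np.r_[np.full(100, 1.0), np.full(100, 3.0)] + 0.05 * rng.standard_normal(200)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
Z = kernel_pca(X, n_components=2, gamma=0.5)
```

In a classification pipeline like the one in the abstract, features such as Z would then be fed to an SVM together with the phenotypic variables.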
Race, Alan M; Steven, Rory T; Palmer, Andrew D; Styles, Iain B; Bunch, Josephine
2013-03-19
A memory efficient algorithm for the computation of principal component analysis (PCA) of large mass spectrometry imaging data sets is presented. Mass spectrometry imaging (MSI) enables two- and three-dimensional overviews of hundreds of unlabeled molecular species in complex samples such as intact tissue. PCA, in combination with data binning or other reduction algorithms, has been widely used in the unsupervised processing of MSI data and as a dimensionality reduction method prior to clustering and spatial segmentation. Standard implementations of PCA require the data to be stored in random access memory. This imposes an upper limit on the amount of data that can be processed, necessitating a compromise between the number of pixels and the number of peaks to include. With increasing interest in multivariate analysis of large 3D multislice data sets and ongoing improvements in instrumentation, the ability to retain all pixels and many more peaks is increasingly important. We present a new method which has no limitation on the number of pixels and allows an increased number of peaks to be retained. The new technique was validated against the MATLAB (The MathWorks Inc., Natick, Massachusetts) implementation of PCA (princomp) and then used to reduce, without discarding peaks or pixels, multiple serial sections acquired from a single mouse brain which was too large to be analyzed with princomp. Then, k-means clustering was performed on the reduced data set. We further demonstrate with simulated data of 83 slices, comprising 20,535 pixels per slice and equaling 44 GB of data, that the new method can be used in combination with existing tools to process an entire organ. MATLAB code implementing the memory efficient PCA algorithm is provided.
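The general idea of running PCA on data too large for memory can be illustrated by accumulating the feature covariance chunk by chunk, so that only one slice of pixels is held at a time. This is a generic sketch of out-of-core PCA under that assumption, not the authors' published algorithm:

```python
import numpy as np

def pca_chunked(chunks, n_features):
    """Accumulate sufficient statistics (sum of outer products and sum of
    rows) over chunks, so the full pixels x peaks matrix never has to be
    held in RAM, then eigendecompose the resulting covariance."""
    S = np.zeros((n_features, n_features))
    mu = np.zeros(n_features)
    n = 0
    for X in chunks:                       # X: (pixels_in_chunk, peaks)
        S += X.T @ X
        mu += X.sum(axis=0)
        n += X.shape[0]
    mu /= n
    cov = S / n - np.outer(mu, mu)         # E[x x^T] - mu mu^T
    w, V = np.linalg.eigh(cov)
    order = np.argsort(w)[::-1]
    return w[order], V[:, order], mu

rng = np.random.default_rng(3)
data = rng.standard_normal((1000, 5)) @ np.diag([5.0, 2.0, 1.0, 0.5, 0.1])
evals, evecs, mean = pca_chunked(
    (data[i:i + 100] for i in range(0, 1000, 100)), 5)
```

Because only the n_features x n_features statistics are kept, the memory cost is independent of the number of pixels, which is the property the abstract's method exploits for whole-organ data sets.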
The Langat River water quality index based on principal component analysis
Mohd Ali, Zalina; Ibrahim, Noor Akma; Mengersen, Kerrie; Shitan, Mahendran; Juahir, Hafizan
2013-04-01
River Water Quality Index (WQI) is calculated using an aggregation function of six water quality sub-indices variables, together with their relative importance or weights. The formula is used by the Department of Environment to indicate the general status of rivers in Malaysia. The six selected water quality variables used in the formula are: suspended solids (SS), biochemical oxygen demand (BOD), ammoniacal nitrogen (AN), chemical oxygen demand (COD), dissolved oxygen (DO) and pH. The sub-indices calculations, determined by quality rating curves, and their weights were based on expert opinion. However, the sub-indices and relative importance established in the formula are subjective in nature and do not consider the inter-relationships among the variables. These relationships are important because of the multi-dimensional and complex characteristics of river water. Therefore, a well-known multivariate technique, Principal Component Analysis (PCA), is proposed to re-calculate the water quality index, specifically for the Langat River, based on this inter-relationship approach. The application of this approach has not been well studied in river water quality index development in Malaysia. Hence, the approach in this study is relevant and important, since the first river water quality index development took place in 1981. The PCA results showed that the weights obtained rank the relative importance of particular variables differently from the classical approach used in the WQI-DOE. Based on the new weights, the Langat River water quality index was calculated, and a comparison between the two indexes is also discussed in this paper.
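The inter-relationship approach can be sketched by deriving relative weights from the loadings of the first principal component of standardized data. The simulated records and the normalized-absolute-loading weighting below are illustrative assumptions, not the DOE formula or the Langat data:

```python
import numpy as np

rng = np.random.default_rng(5)

# hypothetical monthly records of the six WQI variables (columns in the
# order SS, BOD, AN, COD, DO, pH); the values are simulated so that the
# variables share a common latent factor, i.e. they are inter-related
latent = rng.standard_normal((120, 1))
X = latent @ rng.standard_normal((1, 6)) + 0.5 * rng.standard_normal((120, 6))

# standardize, then take the loadings of the first principal component
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
w, V = np.linalg.eigh(corr)
loadings = V[:, -1]                   # eigenvector of the largest eigenvalue

# PCA-derived relative importance: normalized absolute loadings
weights = np.abs(loadings) / np.abs(loadings).sum()
```

Unlike expert-assigned weights, these weights reflect how strongly each variable participates in the dominant pattern of co-variation among the six measurements.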
Strale, Mathieu; Krysinska, Karolina; Overmeiren, Gaëtan Van; Andriessen, Karl
2017-06-01
This study investigated the geographic distribution of suicide and railway suicide in Belgium over 2008-2013 at the local (i.e., district or arrondissement) level. There were differences in the regional distribution of suicides and railway suicides in Belgium over the study period. Principal component analysis identified three groups of correlations among population variables and socio-economic indicators, such as population density, unemployment, and age group distribution, on two components that helped explain the variance of railway suicide at the local (arrondissement) level. This information is of particular importance for preventing suicides in high-risk areas on the Belgian railway network.
Directory of Open Access Journals (Sweden)
Alia Colniță
2017-09-01
Full Text Available Raman scattering and its particular effect, surface-enhanced Raman scattering (SERS), are whole-organism fingerprinting spectroscopic techniques that gain more and more popularity in bacterial detection. In this work, two relevant Gram-positive bacteria species, Lactobacillus casei (L. casei) and Listeria monocytogenes (L. monocytogenes), were characterized based on their Raman and SERS spectral fingerprints. The SERS spectra were used to identify the biochemical structures of the bacterial cell wall. Two synthesis methods for the SERS-active nanomaterials were used and the recorded spectra were analyzed. L. casei and L. monocytogenes were successfully discriminated by applying Principal Component Analysis (PCA) to their specific spectral data.
Trusiak, Maciej; Służewski, Łukasz; Patorski, Krzysztof
2016-02-22
A hybrid single-shot algorithm for accurate phase demodulation of complex fringe patterns is proposed. It employs empirical mode decomposition based adaptive fringe pattern enhancement (i.e., denoising, background removal and amplitude normalization) and subsequent boosted phase demodulation using the 2D Hilbert spiral transform, aided by the Principal Component Analysis method for novel, correct and accurate local fringe direction map calculation. Robustness to significant fringe pattern noise, uneven background and amplitude modulation, as well as to local fringe period and shape variations, is corroborated by numerical simulations and experiments. The proposed automatic, adaptive, fast and comprehensive fringe analysis solution compares favorably with other previously reported techniques.
Bayati, Mohsen; Rashidian, Arash; Akbari Sari, Ali; Emamgholipour, Sara
2017-01-01
Background: Based on the target income hypothesis, the economic behavior of physicians is mainly affected by their target income. This study aimed at designing an instrument to explain how general practitioners (GPs) set their desired income. Methods: A self-administered questionnaire on factors affecting GPs' target income was developed from literature reviews and a small qualitative study. Respondents were 666 GPs who completed the questionnaire (response rate = 52%) during 2 seasonal congresses of Iranian GPs. Principal component analysis (PCA) with varimax rotation was used to classify the variables and reduce the data. Sample adequacy, sphericity, the eigenvalues of components, and the scree plot were evaluated for the PCA. Cronbach's alpha was also checked to assess the internal consistency of the principal components. Results: The results of the KMO measure of sampling adequacy (0.657) and Bartlett's test of sphericity (809.05, p [...] luxury services were selected, which explained 65.19% of the total variance. Finally, only those with a Cronbach's alpha value higher than 0.6 were considered reliable (the first 4 components). Conclusion: Based on the target income hypothesis, a physician's desired level of income affects their behavior. Our developed instrument and its components can be used in future studies of GPs' behavior, especially those related to its economic aspects. It also helps formulate a better payment mechanism for primary care providers.
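The PCA workflow this abstract describes — retain components by the Kaiser (eigenvalue greater than one) rule, then apply a varimax rotation — can be sketched in numpy. This is an illustrative sketch on synthetic data, not the authors' code; the varimax routine follows the standard Kaiser algorithm, and all variable names are hypothetical.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Orthogonal varimax rotation of a loading matrix (p variables x k components)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Gradient step of the varimax criterion, solved via SVD
        b = loadings.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(b)
        rotation = u @ vt
        d_new = s.sum()
        if d_new < d * (1.0 + tol):
            break
        d = d_new
    return loadings @ rotation

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 3] += X[:, 0]                               # induce some correlation structure
Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize, as for a correlation-matrix PCA
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int((eigvals > 1.0).sum())                   # Kaiser rule: keep eigenvalues > 1
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
rotated = varimax(loadings)
```

Because the rotation is orthogonal, the communalities (row sums of squared loadings) are unchanged; only the distribution of loading magnitudes across components is simplified.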
Iwamori, Hikaru; Yoshida, Kenta; Nakamura, Hitomi; Kuwatani, Tatsu; Hamada, Morihisa; Haraguchi, Satoru; Ueki, Kenta
2017-03-01
Identifying the data structure including trends and groups/clusters in geochemical problems is essential to discuss the origin of sources and processes from the observed variability of data. An increasing number and high dimensionality of recent geochemical data require efficient and accurate multivariate statistical analysis methods. In this paper, we show the relationship and complementary roles of k-means cluster analysis (KCA), principal component analysis (PCA), and independent component analysis (ICA) to capture the true data structure. When the data are preprocessed by primary standardization (i.e., with the zero mean and normalized by the standard deviation), KCA and PCA provide essentially the same results, although the former returns the solution in a discretized space. When the data are preprocessed by whitening (i.e., normalized by eigenvalues along the principal components), KCA and ICA may identify a set of independent trends and groups, irrespective of the amplitude (power) of variance. As an example, basalt isotopic compositions have been analyzed with KCA on the whitened data, demonstrating clear rock type/tectonic occurrence/mantle end-member discrimination. Therefore, the combination of these methods, particularly KCA on whitened data, is useful to capture and discuss the data structure of various geochemical systems, for which an Excel program is provided.
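The distinction the authors draw between primary standardization and whitening can be illustrated in numpy. The data below are synthetic stand-ins for geochemical measurements (not the basalt data of the study); a k-means step (KCA) could then be run on either preprocessed matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
# Correlated synthetic data: 300 samples, 4 variables
X = rng.normal(size=(300, 4)) @ rng.normal(size=(4, 4))

# Primary standardization: zero mean and unit variance per variable.
# Correlations between variables remain.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Whitening: project onto the principal components and normalize by the
# singular values, so every direction carries unit variance regardless of
# its original amplitude (power).
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
W = Z @ Vt.T / s * np.sqrt(len(Z) - 1)
```

After whitening, the sample covariance of `W` is the identity, which is why clustering on `W` identifies trends irrespective of variance amplitude, as the abstract notes.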
International Nuclear Information System (INIS)
Jesse, Stephen; Kalinin, Sergei V
2009-01-01
An approach for the analysis of multi-dimensional, spectroscopic-imaging data based on principal component analysis (PCA) is explored. PCA selects and ranks relevant response components based on variance within the data. It is shown that for examples with small relative variations between spectra, the first few PCA components closely coincide with results obtained using model fitting, and this is achieved at rates approximately four orders of magnitude faster. For cases with strong response variations, PCA allows an effective approach to rapidly process, de-noise, and compress data. The prospects for PCA combined with correlation function analysis of component maps as a universal tool for data analysis and representation in microscopy are discussed.
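The de-noising and compression use of PCA described above amounts to a truncated SVD reconstruction of the spectral data matrix. A minimal numpy sketch on a synthetic rank-1 "spectral" data set (not the authors' microscopy data):

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 128)
# 50 noisy copies of one underlying spectral response (rank-1 signal)
clean = np.outer(rng.normal(size=50), np.sin(2 * np.pi * 3 * t))
noisy = clean + 0.1 * rng.normal(size=clean.shape)

def pca_denoise(data, n_components):
    """Reconstruct the data from its first n_components principal components."""
    mean = data.mean(axis=0)
    U, s, Vt = np.linalg.svd(data - mean, full_matrices=False)
    return mean + U[:, :n_components] * s[:n_components] @ Vt[:n_components]

denoised = pca_denoise(noisy, 1)
```

Keeping only the high-variance components discards most of the noise, so the reconstruction is closer to the clean signal than the raw measurement; storing the retained scores and components in place of the full matrix is the compression the abstract refers to.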
Directory of Open Access Journals (Sweden)
Selvin J. PITCHAIKANI
2017-06-01
Principal component analysis (PCA) is a technique used to emphasize variation and bring out strong patterns in a dataset. It is often used to make data easy to explore and visualize. The primary objective of the present study was to record information on zooplankton diversity in a systematic way and to study the variability and relationships among the seasons prevailing in the Gulf of Mannar. The PCA of seasonal zooplankton diversity was carried out on the four seasonal datasets to assess the statistical significance of differences among the four seasons. Two different principal components (PCs) were segregated in all seasons homogeneously. The PCA revealed that Temora turbinata is an opportunistic species, that zooplankton diversity was significantly different from season to season, and that zooplankton abundance and its dynamics in the Gulf of Mannar are principally structured by seasonal current patterns. The factor loadings of zooplankton for different seasons in Tiruchendur coastal water (GOM) differ from those for the southwest coast of India; in particular, routine and opportunistic species were found within the positive and negative factors. The copepods Acrocalanus gracilis and Acartia erythrea were dominant in summer and the southwest monsoon due to rainfall and freshwater discharge during the summer season; however, these species were replaced by Temora turbinata during the northeast monsoon season.
Principal components analysis of reward prediction errors in a reinforcement learning task.
Sambrook, Thomas D; Goslin, Jeremy
2016-01-01
Models of reinforcement learning represent reward and punishment in terms of reward prediction errors (RPEs), quantitative signed terms describing the degree to which outcomes are better than expected (positive RPEs) or worse (negative RPEs). An electrophysiological component known as feedback-related negativity (FRN) occurs at frontocentral sites 240-340 ms after feedback on whether a reward or punishment is obtained, and has been claimed to neurally encode an RPE. An outstanding question, however, is whether the FRN is sensitive to the size of both positive RPEs and negative RPEs. Previous attempts to answer this question have examined the simple effects of RPE size for positive RPEs and negative RPEs separately. However, this methodology can be compromised by overlap from components coding for unsigned prediction error size, or "salience", which are sensitive to the absolute size of a prediction error but not its valence. In our study, positive and negative RPEs were parametrically modulated using both reward likelihood and magnitude, with principal components analysis used to separate out overlying components. This revealed a single RPE-encoding component responsive to the size of positive RPEs, peaking at ~330 ms and occupying the delta frequency band. Other components responsive to unsigned prediction error size were shown, but no component sensitive to negative RPE size was found. Copyright © 2015 Elsevier Inc. All rights reserved.
Dien, Joseph; Spencer, Kevin M; Donchin, Emanuel
2003-10-01
Recent research indicates that novel stimuli elicit at least two distinct components, the Novelty P3 and the P300. The P300 is thought to be elicited when a context updating mechanism is activated by a wide class of deviant events. The functional significance of the Novelty P3 is uncertain. Identification of the generator sources of the two components could provide additional information about their functional significance. Previous localization efforts have yielded conflicting results. The present report demonstrates that the use of principal components analysis (PCA) results in better convergence with knowledge about functional neuroanatomy than did previous localization efforts. The results are also more convincing than those obtained by two alternative methods, MUSIC-RAP and the Minimum Norm. Source modeling on 129-channel data with BESA and BrainVoyager suggests the P300 has sources in the temporal-parietal junction, whereas the Novelty P3 has sources in the anterior cingulate.
Directory of Open Access Journals (Sweden)
Shu-zhi Gao
2016-01-01
In view of the fact that the polymerization process in polyvinyl chloride (PVC) production involves many fault types of a complex nature, a fault diagnosis algorithm based on the hybrid Dynamic Kernel Principal Component Analysis-Fisher Discriminant Analysis (DKPCA-FDA) method is proposed in this paper. Kernel principal component analysis and dynamic kernel principal component analysis are used for fault diagnosis of the PVC polymerization process, while Fisher Discriminant Analysis (FDA) is adopted to further separate the failure data. The simulation results show that dynamic kernel principal component analysis achieves better diagnostic accuracy for the PVC polymerization process, that FDA can further realize fault isolation, and that actual faults in PVC polymerization production can be monitored by dynamic kernel principal component analysis.
The use of principal components and univariate charts to control multivariate processes
Directory of Open Access Journals (Sweden)
Marcela A. G. Machado
2008-04-01
In this article, we evaluate the performance of the T² chart based on the principal components (PC chart) and of the simultaneous univariate control charts based on the original variables (SU X̄ charts) or on the principal components (SUPC charts). The main reason to consider the PC chart lies in the dimensionality reduction. However, depending on the disturbance and on the way the original variables are related, the chart is very slow in signaling, except when all variables are negatively correlated and the principal component is wisely selected. Comparing the SU X̄, the SUPC and the T² charts, we conclude that the SU X̄ charts (SUPC charts) have a better overall performance when the variables are positively (negatively) correlated. We also develop the expression to obtain the power of two S² charts designed for monitoring the covariance matrix. These joint S² charts are, in the majority of cases, more efficient than the generalized variance chart.
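Computing a T² statistic from principal-component scores, as the PC chart above requires, can be sketched in numpy. This is a generic illustration on synthetic process data, not the authors' charts; control limits (typically taken from an F or beta distribution) are omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 5))
X[:, 1] += 0.8 * X[:, 0]                  # correlated process variables

centered = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
eigvals = s ** 2 / (len(X) - 1)           # variances along the principal components
a = 2                                     # number of retained components
scores = centered @ Vt[:a].T              # PC scores of each observation

# Hotelling's T² computed from the first `a` principal-component scores:
# each squared score is normalized by the variance of its component.
t2 = (scores ** 2 / eigvals[:a]).sum(axis=1)
```

Points whose `t2` exceeds the chosen control limit would signal on the PC chart; a univariate SUPC scheme would instead chart each standardized score column separately.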
Directory of Open Access Journals (Sweden)
Yuliana Yuliana
2010-06-01
Quantitative Electronic Structure-Activity Relationship (QSAR) analysis of a series of benzalacetones has been investigated based on semi-empirical PM3 calculation data using Principal Components Regression (PCR). The investigation was based on the antimutagenic activity of the benzalacetone compounds (represented by log 1/IC50), studied as a linear correlation with latent variables (Tx) resulting from the transformation of atomic net charges using Principal Component Analysis (PCA). The QSAR equation was determined based on the distribution of selected components and then analysed with PCR. The result is described by the following QSAR equation: log 1/IC50 = 6.555 + 2.177·T1 + 2.284·T2 + 1.933·T3. The equation is significant at the 95% level with statistical parameters n = 28, r = 0.766, SE = 0.245, Fcalculation/Ftable = 3.780, and gave a PRESS result of 0.002, meaning that there were only relatively few deviations between the experimental and theoretical antimutagenic activity data. New types of benzalacetone derivative compounds were designed and their theoretical activity predicted with the best QSAR equation. Compounds 29, 30, 31, 32, 33, 35, 36, 37, 38, 40, 41, 42, 44, 47, 48, 49 and 50 were found to have relatively high antimutagenic activity. Keywords: QSAR; antimutagenic activity; benzalacetone; atomic net charge
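Principal Components Regression as used here — PCA on the predictors, then least squares on the latent scores — can be sketched in numpy. The data below are synthetic, not the benzalacetone charges, and the function name is hypothetical.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal component regression: PCA on X, then least squares on the scores."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    T = Xc @ Vt[:n_components].T                    # latent scores (the Tx of the abstract)
    gamma, *_ = np.linalg.lstsq(T, y - y_mean, rcond=None)
    beta = Vt[:n_components].T @ gamma              # map back to original-variable coefficients
    intercept = y_mean - x_mean @ beta
    return beta, intercept

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.05 * rng.normal(size=100)
beta, intercept = pcr_fit(X, y, 4)
```

With all components retained, PCR reproduces ordinary least squares; the regularizing effect comes from truncating the score matrix to the leading components.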
Polat, Esra; Gunay, Suleyman
2013-10-01
One of the problems encountered in Multiple Linear Regression (MLR) is multicollinearity, which causes overestimation of the regression parameters and inflates their variances. Hence, when multicollinearity is present, biased estimation procedures such as classical Principal Component Regression (CPCR) and Partial Least Squares Regression (PLSR) are performed instead. The SIMPLS algorithm is the leading PLSR algorithm because of its speed and efficiency, and because its results are easier to interpret. However, both CPCR and SIMPLS yield very unreliable results when the data set contains outlying observations. Therefore, Hubert and Vanden Branden (2003) presented a robust PCR (RPCR) method and a robust PLSR (RPLSR) method called RSIMPLS. In RPCR, a robust Principal Component Analysis (PCA) method for high-dimensional data is first applied to the independent variables; the dependent variables are then regressed on the scores using a robust regression method. RSIMPLS is constructed from a robust covariance matrix for high-dimensional data and robust linear regression. The purpose of this study is to show the use of the RPCR and RSIMPLS methods on an econometric data set by comparing the two methods on an inflation model of Turkey. The methods are compared in terms of predictive ability and goodness of fit using a robust Root Mean Squared Error of Cross-Validation (R-RMSECV), a robust R² value and the Robust Component Selection (RCS) statistic.
Dong, Fengxia; Mitchell, Paul D; Colquhoun, Jed
2015-01-01
Measuring farm sustainability performance is a crucial component of improving agricultural sustainability. While extensive assessments and indicators exist to reflect the different facets of agricultural sustainability, because of the relatively large number of measures and the interactions among them, a composite indicator that integrates and aggregates over all variables is particularly useful. This paper describes and empirically evaluates a method for constructing a composite sustainability indicator that individually scores and ranks farm sustainability performance. The method first uses non-negative polychoric principal component analysis to reduce the number of variables, to remove correlation among variables and to transform categorical variables into continuous variables. Next the method applies common-weight data envelopment analysis to these principal components to score each farm individually. The method solves for the weights endogenously and identifies practices that are important in the sustainability evaluation. An empirical application to Wisconsin cranberry farms finds heterogeneity in sustainability practice adoption, implying that some farms could adopt relevant practices to improve the overall sustainability performance of the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Baloch, M.J.
2003-01-01
Nine upland cotton varieties/strains were tested over 36 environments in Pakistan to determine the stability of their yield performance. The regression coefficient (b) was used as a measure of adaptability, whereas parameters such as the coefficient of determination (r²) and the sum of squared deviations from regression (s²d) were used as measures of stability. Although the regression coefficients (b) of all varieties did not deviate significantly from the unit slope, the varieties CRIS-5A, BII-89, DNH-40 and Rehmani gave b values closer to unity, implying their better adaptation. The lower s²d and higher r² of CRIS-121 and DNH-40 suggest that both are fairly stable. The results indicate that adaptability and stability parameters are generally independent of each other, inasmuch as not all of the parameters simultaneously favoured one variety over another, except for the variety DNH-40, which was stable by the majority of the parameters. Principal component analysis revealed that the first two components (latent roots) account for about 91.4% of the total variation. The latent vectors of the first principal component (PCA1) were small and positive, which also suggests that most of the varieties were quite adaptive to all of the test environments. (author)
Ioele, Giuseppina; De Luca, Michele; Dinç, Erdal; Oliverio, Filomena; Ragno, Gaetano
2011-01-01
A chemometric approach based on the combined use of the principal component analysis (PCA) and artificial neural network (ANN) was developed for the multicomponent determination of caffeine (CAF), mepyramine (MEP), phenylpropanolamine (PPA) and pheniramine (PNA) in their pharmaceutical preparations without any chemical separation. The predictive ability of the ANN method was compared with the classical linear regression method Partial Least Squares 2 (PLS2). The UV spectral data between 220 and 300 nm of a training set of sixteen quaternary mixtures were processed by PCA to reduce the dimensions of input data and eliminate the noise coming from instrumentation. Several spectral ranges and different numbers of principal components (PCs) were tested to find the PCA-ANN and PLS2 models reaching the best determination results. A two layer ANN, using the first four PCs, was used with log-sigmoid transfer function in first hidden layer and linear transfer function in output layer. Standard error of prediction (SEP) was adopted to assess the predictive accuracy of the models when subjected to external validation. PCA-ANN showed better prediction ability in the determination of PPA and PNA in synthetic samples with added excipients and pharmaceutical formulations. Since both components are characterized by low absorptivity, the better performance of PCA-ANN was ascribed to the ability in considering all non-linear information from noise or interfering excipients.
Directory of Open Access Journals (Sweden)
Alaba Boluwade
2016-09-01
Accurate characterization of soil properties such as soil water content (SWC) and bulk density (BD) is vital for hydrologic processes, and it is therefore important to estimate θ (water content) and ρ (soil bulk density) among the other soil surface parameters involved in water retention and infiltration, runoff generation, water erosion, etc. Spatial estimates of these soil properties are important in guiding agricultural management decisions. These soil properties vary both in space and time and are correlated, so it is important to find an efficient and robust technique to simulate spatially correlated variables. Methods such as principal component analysis (PCA) and independent component analysis (ICA) can be used for the joint simulation of spatially correlated variables, but they are not without their flaws. This study applied a variant of PCA called independent principal component analysis (IPCA), which combines the strengths of both PCA and ICA, to the spatial simulation of SWC and BD using a soil data set from an 11 km² Castor watershed in southern Quebec, Canada. Diagnostic checks using the histograms and cumulative distribution functions (cdf) of both the raw and back-transformed simulations show good agreement. The results from this study therefore have potential for characterizing water content variability and bulk density variation for precision agriculture.
Competition analysis on the operating system market using principal component analysis
Directory of Open Access Journals (Sweden)
Brătucu, G.
2011-01-01
The operating system market has evolved greatly. The largest software producer in the world, Microsoft, dominates the operating systems segment. With three operating systems, Windows XP, Windows Vista and Windows 7, the company held a market share of 87.54% in January 2011. Over time, open source operating systems have begun to penetrate the market very strongly, affecting other manufacturers. Companies such as Apple Inc. and Google Inc. have also penetrated the operating system market. This paper aims to compare the best-selling operating systems on the market in terms of their defining characteristics. To this purpose the principal components analysis method was used.
Campos, Joaquin; Ferrero, Alejandro; Woolliams, Emma; Greenwell, Claire; Bialek, Agnieszka; Hernanz, Luisa; Pons, Alicia
2018-02-01
Determining reflectance factor and its variability across reference sites for Earth observation satellites is a problem involving large amounts of data and measurement time. Principal component analysis (PCA) may be used to simplify this problem by reducing the size of the data and by highlighting spectral features that could be related to physical phenomena. This work presents the results obtained in applying PCA to two reference sites for calibration and validation of Earth observation satellites located at La Crau (France) and Gobabeb (Namibia), respectively.
Research of Transforming Grassroots Government Function Based On Principal Component Analysis
Directory of Open Access Journals (Sweden)
Sun Kewei
2015-01-01
According to principal component analysis theory, the transformation of grassroots government functions can be improved, and a reasonable standard for the government network platform established, by drawing on the government functions of the network platform, the performance of organizational departments, and the soft power and information reflected by public participation in the construction of the government network platform. Based on four secondary indexes, 18 evaluation factors are developed as the main factors and targets influencing the transformation of town government functions, the weight proportion of each index is obtained, and the transformation of town government functions in the network era is analyzed.
Pinto da Costa, Joaquim
2015-01-01
This book examines in detail the correlation, more precisely the weighted correlation, and applications involving rankings. A general application is the evaluation of methods to predict rankings. Others involve rankings representing human preferences to infer user preferences; the use of weighted correlation with microarray data and those in the domain of time series. In this book we present new weighted correlation coefficients and new methods of weighted principal component analysis. We also introduce new methods of dimension reduction and clustering for time series data, and describe some theoretical results on the weighted correlation coefficients in separate sections.
Directory of Open Access Journals (Sweden)
Oliveira-Esquerre K.P.
2002-01-01
This work presents a way to predict the biochemical oxygen demand (BOD) of the output stream of the biological wastewater treatment plant at RIPASA S/A Celulose e Papel, one of the major pulp and paper plants in Brazil. The best prediction performance is achieved when the data are preprocessed using principal components analysis (PCA) before they are fed to a backpropagation neural network. The influence of the input variables is analyzed and satisfactory prediction results are obtained for an optimized situation.
Kernel Principal Component Analysis and its Applications in Face Recognition and Active Shape Models
Wang, Quan
2012-01-01
Principal component analysis (PCA) is a popular tool for linear dimensionality reduction and feature extraction. Kernel PCA is the nonlinear form of PCA, which better exploits the complicated spatial structure of high-dimensional features. In this paper, we first review the basic ideas of PCA and kernel PCA. Then we focus on the reconstruction of pre-images for kernel PCA. We also give an introduction on how PCA is used in active shape models (ASMs), and discuss how kernel PCA can be applied ...
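The nonlinear form of PCA reviewed above can be sketched directly in numpy for an RBF kernel, showing the feature-space centering step the method requires. This is an illustrative sketch on synthetic data (a library implementation such as scikit-learn's `KernelPCA` would normally be used), and the function name is hypothetical.

```python
import numpy as np

def rbf_kernel_pca(X, gamma, n_components):
    """Kernel PCA with an RBF kernel, implemented directly in numpy."""
    # Pairwise squared distances and the RBF Gram matrix
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    K = np.exp(-gamma * sq)
    # Center the kernel matrix in feature space: Kc = K - 1K - K1 + 1K1
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose and keep the leading components
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Projections of the training points onto the kernel principal components
    return eigvecs[:, idx] * np.sqrt(np.clip(eigvals[idx], 0, None))

rng = np.random.default_rng(8)
X = rng.normal(size=(40, 2))
Z = rbf_kernel_pca(X, gamma=0.5, n_components=2)
```

Because the kernel matrix is centered, the projections have zero mean; reconstructing pre-images from these projections, discussed in the paper, requires an additional approximation step not shown here.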
On the structure of dynamic principal component analysis used in statistical process monitoring
DEFF Research Database (Denmark)
Vanhatalo, Erik; Kulahci, Murat; Bergquist, Bjarne
2017-01-01
When principal component analysis (PCA) is used for statistical process monitoring it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time-dependent data. We propose a data-driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method [...]
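The core preprocessing step of DPCA — augmenting each observation with its lagged copies before applying PCA — can be sketched as follows. The data are a synthetic autocorrelated process, and the lag-selection method the abstract proposes is not reproduced here.

```python
import numpy as np

def lagged_matrix(X, lags):
    """Augment each observation with its `lags` previous observations for DPCA."""
    n, m = X.shape
    # Row t of the result is [x_t, x_{t-1}, ..., x_{t-lags}] flattened
    cols = [X[lags - l : n - l] for l in range(lags + 1)]
    return np.hstack(cols)

rng = np.random.default_rng(5)
x = np.zeros((200, 2))
for t in range(1, 200):                   # simple autocorrelated (AR(1)) process
    x[t] = 0.8 * x[t - 1] + rng.normal(size=2)

Xa = lagged_matrix(x, lags=2)
Z = (Xa - Xa.mean(axis=0)) / Xa.std(axis=0)
# Ordinary PCA on the lag-augmented, standardized matrix is DPCA
eigvals = np.linalg.eigvalsh(np.cov(Z, rowvar=False))[::-1]
```

The serial correlation shows up as large leading eigenvalues of the augmented covariance matrix, which is what the lag-selection method in the abstract exploits.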
Principal component analysis of bacteria using surface-enhanced Raman spectroscopy
Guicheteau, Jason; Christesen, Steven D.
2006-05-01
Surface-enhanced Raman scattering (SERS) provides rapid fingerprinting of biomaterial in a non-destructive manner. The problem of tissue fluorescence, which can overwhelm a normal Raman signal from biological samples, is largely overcome by treatment of biomaterials with colloidal silver. This work presents a study into the applicability of qualitative SER spectroscopy with principal component analysis (PCA) for the discrimination of four biological threat simulants: Bacillus globigii, Pantoea agglomerans, Brucella noetomae, and Yersinia rohdei. We also demonstrate differentiation of gram-negative and gram-positive species, as well as of spores and vegetative cells of Bacillus globigii.
Smilek, Jan; Hadas, Zdenek
2017-02-01
In this paper we propose the use of principal component analysis to process measured acceleration data in order to determine the direction of acceleration with the highest variance at a given frequency of interest. This method can be used to improve the power generated by inertial energy harvesters. Their power output is highly dependent on the excitation acceleration magnitude and frequency, but the axes of the acceleration measurements might not always be perfectly aligned with the directions of movement. The generated power output might therefore be severely underestimated in simulations, possibly leading to false conclusions about the feasibility of using an inertial energy harvester for the examined application.
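Finding the excitation direction with the highest variance, as proposed above, reduces to taking the leading eigenvector of the acceleration covariance matrix. A numpy sketch on synthetic three-axis sensor data (the vibration direction and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 10, 2000)
true_dir = np.array([1.0, 2.0, 3.0])
true_dir /= np.linalg.norm(true_dir)
# Vibration mostly along true_dir, measured in misaligned sensor axes, plus noise
accel = np.outer(np.sin(2 * np.pi * 5 * t), true_dir) + 0.02 * rng.normal(size=(2000, 3))

centered = accel - accel.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
dominant = eigvecs[:, np.argmax(eigvals)]   # direction of maximum variance
```

In practice the acceleration would first be band-pass filtered around the frequency of interest, so that the recovered direction corresponds to that frequency rather than to broadband motion.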
Research on application of principal component statistical analysis in the financial early-warning
Directory of Open Access Journals (Sweden)
Lan Yang
2017-06-01
Under the background of the market economy, the environment of enterprises is changing rapidly, so management urgently needs to know the financial situation in advance in order to take measures to resolve risks. Based on 25 domestic listed companies, this paper uses SPSS software and the statistical method of principal component analysis to establish a financial early warning model suitable for listed companies in China. Taking the Maotai Company as an example, the paper conducts a prediction analysis, concludes that the model offers some practical guidance, and proposes that combining qualitative and quantitative analysis can predict risks more comprehensively and accurately.
International Nuclear Information System (INIS)
Polikreti, Kyriaki; Argyropoulos, Vassilike; Charalambous, Demetres; Vossou, Aggelina; Perdikatsis, Vassilis; Apostolaki, Chryssa
2009-01-01
Although the corrosion of outdoor bronzes has been extensively studied in recent decades, there is no quantitative correlation of corrosion products with microclimatic factors. The present work aims to demonstrate how Principal Component Analysis (PCA) can serve this purpose. Thirty corrosion product samples were collected from the bronze monument of Theodoros Kolokotronis (Nafplio, Greece) and analysed using X-Ray Diffractometry (XRD). The quantitative XRD data, together with data on surface orientation and exposure to rain or wind, were treated by PCA and three distinct groups were found. Each group includes samples of similar composition and microclimate characteristics, showing that PCA may give useful information on corrosion mechanisms.
A principal components approach to parent-to-newborn body composition associations in South India
Directory of Open Access Journals (Sweden)
Hill Jacqueline C
2009-02-01
Background: Size at birth is influenced by environmental factors, like maternal nutrition and parity, and by genes. Birth weight is a composite measure, encompassing bone, fat and lean mass, which may have different determinants. The main purpose of this paper was to use anthropometry and principal components analysis (PCA) to describe maternal and newborn body composition, and the associations between them, in an Indian population. We also compared maternal and paternal measurements (body mass index (BMI) and height) as predictors of newborn body composition. Methods: Weight, height, head and mid-arm circumferences, skinfold thicknesses and external pelvic diameters were measured at 30 ± 2 weeks gestation in 571 pregnant women attending the antenatal clinic of the Holdsworth Memorial Hospital, Mysore, India. Paternal height and weight were also measured. At birth, detailed neonatal anthropometry was performed. Unrotated and varimax-rotated PCA was applied to the maternal and neonatal measurements. Results: Rotated PCA reduced the maternal measurements to 4 independent components (fat, pelvis, height and muscle) and the neonatal measurements to 3 components (trunk+head, fat, and leg length). An SD increase in maternal fat was associated with a 0.16 SD increase (β) in neonatal fat (p [...]). Conclusion: Principal components analysis is a useful method to describe neonatal body composition and its determinants. Newborn adiposity is related to maternal nutritional status and parity, while newborn length is genetically determined. Further research is needed to understand mechanisms linking maternal pelvic size to fetal growth, and the determinants and implications of the components (trunk v leg length) of fetal skeletal growth.
Corriveau, H; Arsenault, A B; Dutil, E; Lepage, Y
1992-01-01
An evaluation based on the Bobath approach to treatment has previously been developed and partially validated. The purpose of the present study was to verify the content validity of this evaluation with the use of a statistical approach known as principal components analysis. Thirty-eight hemiplegic subjects participated in the study. Analysis of the scores on each of six parameters (sensorium, active movements, muscle tone, reflex activity, postural reactions, and pain) was evaluated on three occasions across a 2-month period. Each time this produced three factors that contained 70% of the variation in the data set. The first component mainly reflected variations in mobility, the second mainly variations in muscle tone, and the third mainly variations in sensorium and pain. The results of such exploratory analysis highlight the fact that some of the parameters are not only important but also interrelated. These results seem to partially support the conceptual framework substantiating the Bobath approach to treatment.
A stable systemic risk ranking in China's banking sector: Based on principal component analysis
Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing
2018-02-01
In this paper, we compare five popular systemic risk rankings, and apply principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to factor loadings of the first component, PCA combined ranking is mainly based on fundamentals instead of market price data. We clearly find that price-based rankings are not as practical a method as fundamentals-based ones. This PCA combined ranking directly shows systemic risk contributions of each bank for banking supervision purpose and reminds banks to prevent and cope with the financial crisis in advance.
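Combining several risk measures through the first principal component, as this abstract describes, can be sketched as follows. The sixteen "banks" and five "measures" are synthetic, not the paper's data, and the sign convention for the component is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
systemic = np.arange(16.0)                     # latent "true" systemic risk of 16 banks
# Five noisy risk measures, all positively loaded on the latent factor
measures = systemic[:, None] + 0.1 * rng.normal(size=(16, 5))

# Standardize each measure, then take the first principal component
Z = (measures - measures.mean(axis=0)) / measures.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
pc1 = eigvecs[:, np.argmax(eigvals)]
if pc1.sum() < 0:                              # fix the arbitrary sign of the component
    pc1 = -pc1

combined_score = Z @ pc1                       # PCA-combined systemic risk score
ranking = np.argsort(-combined_score)          # riskiest bank first
```

The factor loadings in `pc1` play the role the abstract assigns to them: they reveal which individual measures dominate the combined ranking.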
Hirose, Misa; Toyota, Saori; Ojima, Nobutoshi; Ogawa-Ochiai, Keiko; Tsumura, Norimichi
2017-08-01
In this paper, principal component analysis is applied to the distribution of pigmentation, surface reflectance, and landmarks in whole facial images to obtain feature values. The relationship between the obtained feature vectors and the age of the face is then estimated by multiple regression analysis so that facial images can be modulated for women aged 10 to 70. In a previous study, we analyzed only the distribution of pigmentation, and the reproduced images appeared younger than the apparent age of the initial images. We believe that this happened because we did not modulate the facial structures and detailed surfaces, such as wrinkles. By considering landmarks and surface reflectance over the entire face, we were able to analyze the variation in the distributions of facial structures, fine asperity, and pigmentation. As a result, our method is able to appropriately modulate the appearance of a face so that it appears to be the correct age.
Directory of Open Access Journals (Sweden)
Hemant Pathak
2011-01-01
Full Text Available Groundwater is one of the major sources of drinking water in Sagar city (India). In this study, 15 sampling stations were selected for investigation of 14 chemical parameters. The work was carried out during different months of the pre-monsoon, monsoon and post-monsoon seasons from June 2009 to June 2010. Multivariate statistics, such as principal component and cluster analysis, were applied to the datasets to investigate seasonal variations in groundwater quality. Principal axis factoring was used to observe the mode of association of parameters and their interrelationships for evaluating water quality. Average values of BOD, COD, ammonia and iron were high during the entire study period. Elevated values of BOD and ammonia in the monsoon, slightly higher values of BOD in the post-monsoon, and of BOD, ammonia and iron in the pre-monsoon period reflected a temporal effect on groundwater. Results of principal component analysis showed that all the parameters contribute equally and significantly to groundwater quality variations. Factor 1 and factor 2 revealed that the DO value deteriorates due to organic load (BOD/ammonia) in different seasons. Hierarchical cluster analysis grouped the 15 stations into four clusters in the monsoon and five clusters each in the post-monsoon and pre-monsoon, with similar water quality features. In each season, one clustered station exhibited significant spatial variation in physicochemical composition, with anthropogenic nitrogenous species appearing as fallout from modernization activities. The study indicated that the groundwater is sufficiently well oxygenated and nutrient-rich at the study sites.
Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao
2015-01-01
Due to advances in sensor technology, growing volumes of large medical image data make it possible to visualize anatomical changes in biological tissues. As a consequence, medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes, and the characterization of disease progression. At the same time, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features that provide no additional information for clustering analysis. The widely used methods for removing irrelevant features are sparse clustering algorithms using a lasso-type penalty to select features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value, which are difficult to determine in practice. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms current sparse clustering algorithms in image cluster analysis. PMID:26196383
Using principal component analysis to understand the variability of PDS 456
Parker, M. L.; Reeves, J. N.; Matzeu, G. A.; Buisson, D. J. K.; Fabian, A. C.
2018-02-01
We present a spectral-variability analysis of the low-redshift quasar PDS 456 using principal component analysis. In the XMM-Newton data, we find a strong peak in the first principal component at the energy of the Fe absorption line from the highly blueshifted outflow. This indicates that the absorption feature is more variable than the continuum, and that it is responding to the continuum. We find qualitatively different behaviour in the Suzaku data, which is dominated by changes in the column density of neutral absorption. In this case, we find no evidence of the absorption produced by the highly ionized gas being correlated with this variability. Additionally, we perform simulations of the source variability, and demonstrate that PCA can trivially distinguish between outflow variability correlated, anticorrelated and uncorrelated with the continuum flux. Here, the observed anticorrelation between the absorption line equivalent width and the continuum flux may be due to the ionization of the wind responding to the continuum. Finally, we compare our results with those found in the narrow-line Seyfert 1 IRAS 13224-3809. We find that the Fe K UFO feature is sharper and more prominent in PDS 456, but that it lacks the lower energy features from lighter elements found in IRAS 13224-3809, presumably due to differences in ionization.
On using principal components to represent stations in empirical–statistical downscaling
Directory of Open Access Journals (Sweden)
Rasmus E. Benestad
2015-10-01
Full Text Available We test a strategy for downscaling seasonal mean temperature for many locations within a region, based on principal component analysis (PCA), and assess the potential benefits of this strategy, which include an enhancement of the signal-to-noise ratio, more efficient computations, and reduced sensitivity to the choice of predictor domain. These conditions are tested in case studies for parts of Europe (northern and central) and northern China. Results show that the downscaled results were not highly sensitive to whether a PCA basis or a more traditional strategy was used. However, the results based on a PCA were associated with marginally and systematically higher correlation scores as well as lower root-mean-squared errors. The results were also consistent with the notion that PCA emphasises the large-scale dependency in the station data and an enhancement of the signal-to-noise ratio. Furthermore, the computations were more efficient when the predictands were represented in terms of principal components.
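The strategy described above, representing a whole station network by its leading principal components and fitting one regression per component instead of one per station, can be sketched roughly as follows. The station network, sensitivities, and "regional" predictor below are invented stand-ins for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(7)
# Invented stand-in: 60 seasonal means at 12 stations, driven by one
# shared regional signal plus local noise.
nt, ns = 60, 12
regional = rng.normal(size=nt)                 # large-scale predictor
sensitivity = rng.uniform(0.5, 1.5, size=ns)   # per-station response
T = np.outer(regional, sensitivity) + 0.3 * rng.normal(size=(nt, ns))

# Represent the station network by its leading principal components.
mu = T.mean(axis=0)
U, s, Vt = np.linalg.svd(T - mu, full_matrices=False)
k = 2
scores = (T - mu) @ Vt[:k].T                   # PC score time series

# Downscale: one linear regression per PC instead of one per station.
Xp = np.column_stack([np.ones(nt), regional])
coef, *_ = np.linalg.lstsq(Xp, scores, rcond=None)

# Reconstruct every station from the predicted PC scores.
T_hat = mu + (Xp @ coef) @ Vt[:k]
rmse = float(np.sqrt(((T - T_hat) ** 2).mean()))
```

With the shared signal captured by the leading components, the residual of the reconstruction is close to the local noise level, which is the signal-to-noise enhancement the abstract refers to.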
Fault detection of flywheel system based on clustering and principal component analysis
Directory of Open Access Journals (Sweden)
Wang Rixin
2015-12-01
Full Text Available Considering the nonlinear, multifunctional properties of a double flywheel with closed-loop control, a two-step method combining clustering and principal component analysis is proposed to detect the two faults in the multifunctional flywheels. In the first step of the proposed algorithm, clustering is used for feature recognition to check the instructions of the "integrated power and attitude control" system, such as attitude control, energy storage or energy discharge. These commands ask the flywheel system to work in different operation modes; therefore, the relationships among parameters in the different operations define the cluster structure of the training data. Ordering points to identify the clustering structure (OPTICS) can automatically identify these clusters from the reachability plot, and the k-means algorithm then divides the training data into the corresponding operations according to the reachability plot. Finally, the last step of the proposed model defines the relationships among the parameters in each operation through principal component analysis (PCA). Compared with a plain PCA model, the proposed approach is capable of identifying new clusters and learning the new behavior of incoming data. Simulation results show that it can effectively detect faults in the multifunctional flywheel system.
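The two-step scheme described in this abstract (cluster the telemetry into operating modes, then fit a PCA model per mode) can be sketched as follows. This is a generic toy version with invented three-channel "flywheel" telemetry, using plain k-means in place of OPTICS:

```python
import numpy as np

rng = np.random.default_rng(6)

def make_mode(center, direction, n=100):
    """One operating mode: readings vary along one direction plus noise."""
    t = rng.uniform(-1.0, 1.0, size=(n, 1))
    return center + t * direction + 0.05 * rng.normal(size=(n, 3))

# Invented 3-channel telemetry with two operating modes.
A = make_mode(np.array([5.0, 0.0, 0.0]), np.array([1.0, 2.0, 0.0]))
B = make_mode(np.array([0.0, 5.0, 0.0]), np.array([0.0, 1.0, 2.0]))
X = np.vstack([A, B])

# Step 1: separate operating modes (plain k-means here, not OPTICS).
C = np.vstack([X[0], X[-1]])                 # one seed from each mode
for _ in range(20):
    lab = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
    C = np.array([X[lab == j].mean(axis=0) for j in (0, 1)])

# Step 2: one PCA model per mode; fault score = residual off the top PC.
models = []
for j in (0, 1):
    Z = X[lab == j]
    mu = Z.mean(axis=0)
    _, _, Vt = np.linalg.svd(Z - mu, full_matrices=False)
    models.append((mu, Vt[0]))

def fault_score(x):
    j = int(np.argmin(((x - C) ** 2).sum(-1)))   # nearest operating mode
    mu, pc = models[j]
    d = x - mu
    return float(np.linalg.norm(d - (d @ pc) * pc))

normal = A[0]
faulty = A[0] + np.array([0.0, 0.0, 1.0])    # deviation off the mode's PC
```

A reading consistent with its mode's parameter relationship gets a small residual; a reading that breaks the relationship scores high, which is the basic fault-detection logic, regardless of which mode the system is commanded into.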
International Nuclear Information System (INIS)
Christensen, J.H.; Hansen, A.B.; Andersen, O.
2005-01-01
Biomarkers such as steranes and terpanes are abundant in crude oils, particularly in heavy distillate petroleum products. They are useful for matching highly weathered oil samples when other groups of petroleum hydrocarbons fail to distinguish oil samples. In this study, time warping and principal component analysis (PCA) were applied for oil hydrocarbon fingerprinting based on relative amounts of terpane and sterane isomers analyzed by gas chromatography and mass spectrometry. The four principal components were boiling point range, clay content, marine or terrestrial organic matter, and maturity, based on differences in the terpane and sterane isomer patterns. This study is an extension of a previous fingerprinting study for identifying the sources of oil spill samples based only on the profiles of sterane isomers. Spill samples from the Baltic Carrier oil spill were correctly identified by inspection of score plots. The interpretation of the loading and score plots offered further chemical information about correlations between changes in the amounts of sterane and terpane isomers. It was concluded that this method is an objective procedure for analyzing chromatograms with more comprehensive data usage compared to other fingerprinting methods. 20 refs., 4 figs
DEFF Research Database (Denmark)
Rasmussen, Peter Mondrup; Abrahamsen, Trine Julie; Madsen, Kristoffer Hougaard
2012-01-01
We investigate the use of kernel principal component analysis (PCA) and the inverse problem known as pre-image estimation in neuroimaging: i) We explore kernel PCA and pre-image estimation as a means for image denoising as part of the image preprocessing pipeline. Evaluation of the denoising procedure is performed within a data-driven split-half evaluation framework. ii) We introduce manifold navigation for exploration of a nonlinear data manifold, and illustrate how pre-image estimation can be used to generate brain maps in the continuum between experimentally defined brain states/classes. We base these illustrations on two fMRI BOLD data sets: one from a simple finger tapping experiment and the other from an experiment on object recognition in the ventral temporal lobe.
Principal component analysis of PiB distribution in Parkinson and Alzheimer diseases.
Campbell, Meghan C; Markham, Joanne; Flores, Hubert; Hartlein, Johanna M; Goate, Alison M; Cairns, Nigel J; Videen, Tom O; Perlmutter, Joel S
2013-08-06
To use principal component analyses (PCA) of Pittsburgh compound B (PiB) PET imaging to determine whether the pattern of in vivo β-amyloid (Aβ) in Parkinson disease (PD) with cognitive impairment is similar to the pattern found in symptomatic Alzheimer disease (AD). PiB PET scans were obtained from participants with PD with cognitive impairment (n = 53), participants with symptomatic AD (n = 35), and age-matched controls (n = 67). All were assessed using the Clinical Dementia Rating and APOE genotype was determined in 137 participants. PCA was used to (1) determine the PiB binding pattern in AD, (2) determine a possible unique PD pattern, and (3) directly compare the PiB binding patterns in PD and AD groups. The first 2 principal components (PC1 and PC2) significantly separated the AD and control participants (p PiB binding pattern as participants with AD. These data suggest that Aβ deposition may play a different pathophysiologic role in the cognitive impairment of PD compared to that in AD.
Assessing the effect of oil price on world food prices. Application of principal component analysis
International Nuclear Information System (INIS)
Esmaeili, Abdoulkarim; Shokoohi, Zainab
2011-01-01
The objective of this paper is to investigate the co-movement of food prices and macroeconomic indexes, especially the oil price, by principal component analysis, to further understand the influence of macroeconomic indexes on food prices. We examined the prices of seven major food products: eggs, meat, milk, oilseeds, rice, sugar and wheat. The macroeconomic variables studied were crude oil prices, consumer price indexes, food production indexes and world GDP between 1961 and 2005. We used the scree test and the proportion-of-variance method to determine the optimal number of common factors. The correlation coefficient between the extracted principal component and the macroeconomic indexes varies between 0.87 for world GDP and 0.36 for the consumer price index. We find that the food production index has the greatest influence on the macroeconomic index and that the oil price index has an influence on the food production index. Consequently, crude oil prices have an indirect effect on food prices. (author)
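The proportion-of-variance rule mentioned above (retain the smallest number of components that reach a chosen cumulative share of variance) can be sketched in a few lines. The data here are a synthetic stand-in driven by two latent factors, not the study's price series:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 45 yearly observations of 7 correlated variables
# driven by 2 latent factors plus small noise (assumed, not the study data).
factors = rng.normal(size=(45, 2))
loadings = rng.normal(size=(2, 7))
X = factors @ loadings + 0.05 * rng.normal(size=(45, 7))

Xc = X - X.mean(axis=0)                     # centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)             # proportion of variance per PC

# Proportion-of-variance rule: smallest k whose cumulative share is >= 90%.
k = int(np.searchsorted(np.cumsum(explained), 0.90) + 1)
```

A scree plot is the graphical companion to this rule: plotting `explained` against component index and looking for the "elbow" where the curve flattens.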
Sigirli, Deniz; Ercan, Ilker
2015-09-01
Most studies in the medical and biological sciences examine the geometrical properties of an organ or organism. Growth and allometry studies are important for investigating the effects of diseases and environmental factors on the structure of the organ or organism. Thus, statistical shape analysis has recently become more important in the medical and biological sciences. Shape is all the geometrical information that remains when location, scale and rotational effects are removed from an object. Allometry, the relationship between size and shape, plays an important role in the development of statistical shape analysis. The aim of the present study was to compare two models for allometry that use, as dependent variables in multivariate regression analysis, either tangent coordinates or principal component scores of tangent coordinates. The results of the simulation study showed that, for all sample sizes, the model constructed with tangent coordinates as dependent variables is more appropriate than the model constructed with principal component scores of tangent coordinates as dependent variables.
Papadopoulos , Hélène; Ellis , Daniel P.W.
2014-01-01
Robust Principal Component Analysis (RPCA) is a technique to decompose signals into sparse and low-rank components, and has recently drawn the attention of the MIR field for the problem of separating leading vocals from accompaniment, with appealing results obtained on small excerpts of music. However, the performance of the method drops when processing entire music tracks. We present an adaptive formulation of RPCA that incorporates music content information to guide…
International Nuclear Information System (INIS)
Li, Xiaozhou; Yang, Tianyue; Wang, Deli; Li, Siqi; Song, Youtao; Zhang, Su
2016-01-01
This paper attempts to investigate the feasibility of using Raman spectroscopy for the diagnosis of colon cancer. Serum taken from 75 healthy volunteers, 65 colon cancer patients and 60 post-operation colon cancer patients was measured in this experiment. In the Raman spectra of all three groups, the Raman peaks at 750, 1083, 1165, 1321, 1629 and 1779 cm−1 assigned to nucleic acids, amino acids and chromophores were consistently observed. All of these six Raman peaks were observed to have statistically significant differences between groups. For quantitative analysis, the multivariate statistical techniques of principal component analysis (PCA) and k nearest neighbour analysis (KNN) were utilized to develop diagnostic algorithms for classification. In PCA, several peaks in the principal component (PC) loadings spectra were identified as the major contributors to the PC scores. Some of the peaks in the PC loadings spectra were also reported as characteristic peaks for colon tissues, which implies correlation between peaks in PC loadings spectra and those in the original Raman spectra. KNN was also performed on the obtained PCs, and a diagnostic accuracy of 91.0% and a specificity of 92.6% were achieved. (paper)
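A minimal sketch of the PCA-plus-KNN pipeline described above, classifying in the space of leading PC scores rather than raw channels. The "spectra", peak positions and group sizes below are invented assumptions for illustration, not the study's serum data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented stand-in spectra: "cancer" samples get an extra peak near
# channel 50 on top of a shared baseline.
n, p = 30, 120
base = np.sin(np.linspace(0, 4 * np.pi, p))
healthy = base + 0.15 * rng.normal(size=(n, p))
cancer = base + 0.15 * rng.normal(size=(n, p))
cancer[:, 48:53] += 0.8

X = np.vstack([healthy, cancer])
y = np.array([0] * n + [1] * n)

idx = rng.permutation(2 * n)                 # train/test split
tr, te = idx[:40], idx[40:]

# PCA fitted on the training spectra (SVD of the centred matrix).
mu = X[tr].mean(axis=0)
U, s, Vt = np.linalg.svd(X[tr] - mu, full_matrices=False)
W = Vt[:5].T                                 # first 5 PC loadings
Ztr, Zte = (X[tr] - mu) @ W, (X[te] - mu) @ W

def knn_predict(z, k=3):
    """Majority vote among the k nearest training PC-score vectors."""
    d = np.linalg.norm(Ztr - z, axis=1)
    votes = y[tr][np.argsort(d)[:k]]
    return int(votes.sum() * 2 > k)

pred = np.array([knn_predict(z) for z in Zte])
acc = float((pred == y[te]).mean())
```

Because the class difference concentrates in a few channels, it dominates a leading component, and a simple nearest-neighbour vote in the low-dimensional score space separates the groups well.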
McNabola, Aonghus; Broderick, Brian M; Gill, Laurence W
2009-10-01
Principal component analysis was used to examine personal air pollution exposure data for four urban commuter transport modes, exploring interrelationships between pollutants and relationships with traffic and meteorological data. Air quality samples of PM2.5 and VOCs were recorded during peak traffic congestion for the car, bus, cyclist and pedestrian between January 2005 and June 2006 on a busy route in Dublin, Ireland. In total, 200 personal exposure samples were recorded, each comprising 17 variables describing the personal exposure concentrations, meteorological conditions and traffic conditions. The data reduction technique of principal component analysis (PCA) was used to create weighted linear combinations of the data, which were then examined for interrelationships among the many variables recorded. The PCA found that personal exposure concentrations in non-motorised modes of transport were influenced to a greater degree by wind speed, whereas concentrations in motorised modes were influenced to a greater degree by traffic congestion. The findings show that the most effective mechanisms of personal exposure reduction differ between motorised and non-motorised modes of commuter transport.
Pavement macrotexture estimation using principal component analysis of tire/road noise
Zhang, Yiying; McDaniel, J. Gregory; Wang, Ming L.
2014-04-01
An investigation on the prediction of the pavement macrotexture Mean Texture Depth (MTD) from a moving vehicle is conducted. The MTD was predicted using the tire/road noise measured by a microphone mounted underneath a moving vehicle. Principal Component Analysis (PCA) is used to filter noise from the microphone data prior to estimating its energy over an optimally selected bandwidth. The energy obtained in this way is named the PCA energy; hence the developed method for MTD estimation is termed the PCA Energy Method. The acoustic energy is assumed to have a positive linear correlation with the MTD of the pavement. Moreover, PCA was used to separate important information about the road surface from noisy data while the vehicle is moving, yielding a set of principal component vectors representing the conditions of each road section. This principal component vector was used to compute the PCA energy for MTD prediction. The frequency band most relevant to pavement macrotexture was determined to be 140 to 700 Hz through theoretical and statistical analysis. An MTD prediction model was then built based on a Taylor series expansion in two variables: the PCA energy and the driving speed of the vehicle. The model parameters were obtained from an engineered track (interstate highway) with known MTD, and then applied to urban roads for a feasibility test. The accuracy of the model is 83.61% for the engineered track, which is 10% higher than previous energy-based methods without PCA treatment. Moreover, the applicability of the model is increased by the extended MTD prediction range of 0.2 to 3 mm, compared to the 0.4 to 1.5 mm range of the engineered track. In addition, the MTD could be predicted every 7.8 meters with good repeatability in the urban road test, which demonstrates the feasibility of the proposed approach. Therefore, the PCA Energy Method is a reliable, efficient, and cost-effective way to predict MTD for engineering applications.
A principal components approach to parent-to-newborn body composition associations in South India.
Veena, Sargoor R; Krishnaveni, Ghattu V; Wills, Andrew K; Hill, Jacqueline C; Fall, Caroline Hd
2009-02-24
Size at birth is influenced by environmental factors, like maternal nutrition and parity, and by genes. Birth weight is a composite measure, encompassing bone, fat and lean mass. These may have different determinants. The main purpose of this paper was to use anthropometry and principal components analysis (PCA) to describe maternal and newborn body composition, and associations between them, in an Indian population. We also compared maternal and paternal measurements (body mass index (BMI) and height) as predictors of newborn body composition. Weight, height, head and mid-arm circumferences, skinfold thicknesses and external pelvic diameters were measured at 30 +/- 2 weeks gestation in 571 pregnant women attending the antenatal clinic of the Holdsworth Memorial Hospital, Mysore, India. Paternal height and weight were also measured. At birth, detailed neonatal anthropometry was performed. Unrotated and varimax rotated PCA was applied to the maternal and neonatal measurements. Rotated PCA reduced maternal measurements to 4 independent components (fat, pelvis, height and muscle) and neonatal measurements to 3 components (trunk+head, fat, and leg length). An SD increase in maternal fat was associated with a 0.16 SD increase (beta) in neonatal fat (p maternal parity, newborn sex and socio-economic status). Maternal pelvis, height and (for male babies) muscle predicted neonatal trunk+head (beta = 0.09 SD; p = 0.017, beta = 0.12 SD; p = 0.006 and beta = 0.27 SD; p maternal BMI predicted neonatal fat (beta = 0.20 SD; p neonatal trunk+head (beta = 0.15 SD; p = 0.001). Both maternal (beta = 0.12 SD; p = 0.002) and paternal height (beta = 0.09 SD; p = 0.030) predicted neonatal trunk+head but the associations became weak and statistically non-significant in multivariate analysis. Only paternal height predicted neonatal leg length (beta = 0.15 SD; p = 0.003). Principal components analysis is a useful method to describe neonatal body composition and its determinants.
Chen, Shuming; Wang, Dengfeng; Liu, Bo
This paper investigates optimization of the thickness of the sound package of a passenger automobile. The major performance indexes selected to evaluate the process are the SPL of the exterior noise and the weight of the sound package; the corresponding design parameters are the thickness of the glass wool with aluminum foil for the first layer, the thickness of the glass fiber for the second layer, and the thickness of the PE foam for the third layer. Because the process involves multiple performance characteristics, grey relational analysis, which uses the grey relational grade as a performance index, is employed to determine the optimal combination of layer thicknesses for the designed sound package. Additionally, to evaluate the weighting values corresponding to the various performance characteristics, principal component analysis is used to reflect their relative importance properly and objectively. The results of the confirmation experiments show that grey relational analysis coupled with principal component analysis can successfully be applied to find the optimal combination of thicknesses for each layer of the sound package material. Therefore, the presented method can be an effective tool to improve vehicle exterior noise and lower the weight of the sound package. It will also be helpful for other applications in the automotive industry, such as the First Automobile Works in China and Changan Automobile in China.
FlashPCA2: principal component analysis of Biobank-scale genotype datasets.
Abraham, Gad; Qiu, Yixuan; Inouye, Michael
2017-09-01
Principal component analysis (PCA) is a crucial step in quality control of genomic data and a common approach for understanding population genetic structure. With the advent of large genotyping studies involving hundreds of thousands of individuals, standard approaches are no longer feasible. However, when the full decomposition is not required, substantial computational savings can be made. We present FlashPCA2, a tool that can perform partial PCA on 1 million individuals faster than competing approaches, while requiring substantially less memory. https://github.com/gabraham/flashpca . gad.abraham@unimelb.edu.au. Supplementary data are available at Bioinformatics online.
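When only the leading components are needed, the "partial PCA" savings mentioned above can come from a randomized range finder: project the data through a thin random matrix, then run a small SVD on the sketch. This is a generic textbook sketch of that idea in numpy, not FlashPCA2's actual algorithm or code:

```python
import numpy as np

def randomized_pca(X, k, oversample=10, seed=0):
    """Approximate the top-k principal components of X (n x p) without a
    full SVD, via a randomized range finder (generic textbook sketch)."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    # A random projection captures the dominant column space of Xc.
    Omega = rng.normal(size=(Xc.shape[1], k + oversample))
    Q, _ = np.linalg.qr(Xc @ Omega)
    # A small SVD of the sketch recovers the leading singular structure.
    Ub, s, Vt = np.linalg.svd(Q.T @ Xc, full_matrices=False)
    return s[:k], Vt[:k]

rng = np.random.default_rng(2)
# Low-rank-plus-noise test matrix (a stand-in for a tall genotype matrix).
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 50))
X += 0.01 * rng.normal(size=(500, 50))

s_approx, pcs = randomized_pca(X, k=3)
s_full = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)[:3]
```

The sketch only ever factorizes a `(k + oversample)`-row matrix, which is where the memory and time savings come from when n is in the hundreds of thousands.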
QIM blind video watermarking scheme based on Wavelet transform and principal component analysis
Directory of Open Access Journals (Sweden)
Nisreen I. Yassin
2014-12-01
Full Text Available In this paper, a blind scheme for digital video watermarking is proposed. The security of the scheme is established by using one secret key in the retrieval of the watermark. The Discrete Wavelet Transform (DWT) is applied to each video frame, decomposing it into a number of sub-bands. Maximum-entropy blocks are selected and transformed using Principal Component Analysis (PCA). Quantization Index Modulation (QIM) is used to quantize the maximum coefficient of the PCA blocks of each sub-band. Then, the watermark is embedded into the selected suitable quantizer values. The proposed scheme is tested using a number of video sequences. Experimental results show high imperceptibility; the computed average PSNR exceeds 45 dB. Finally, the scheme is applied to two medical videos. The proposed scheme shows high robustness against several attacks, such as JPEG coding, Gaussian noise addition, histogram equalization, gamma correction, and contrast adjustment, for both regular and medical videos.
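The QIM step described above embeds a bit by rounding a transform coefficient onto one of two interleaved quantizer lattices; extraction asks which lattice the received value is closer to. A minimal scalar sketch, assuming a step size delta (the coefficient values are invented stand-ins for the scheme's PCA-block coefficients):

```python
import numpy as np

def qim_embed(c, bit, delta=8.0):
    """Embed one bit by quantizing c onto one of two interleaved lattices
    (multiples of delta for bit 0, shifted by delta/2 for bit 1)."""
    offset = delta / 2.0 if bit else 0.0
    return delta * np.round((c - offset) / delta) + offset

def qim_extract(c, delta=8.0):
    """Recover the bit: which lattice is the received value closer to?"""
    return int(abs(c - qim_embed(c, 1, delta)) < abs(c - qim_embed(c, 0, delta)))

rng = np.random.default_rng(3)
coeffs = rng.normal(scale=50.0, size=16)     # stand-in transform coefficients
bits = rng.integers(0, 2, size=16)
marked = np.array([qim_embed(c, b) for c, b in zip(coeffs, bits)])

# Mild "attack": additive noise well below the decision margin delta/4.
noisy = marked + rng.normal(scale=0.3, size=16)
recovered = [qim_extract(c) for c in noisy]
```

The scheme is blind because extraction needs only delta (and, in the paper's setting, the secret key selecting the blocks), not the original video: any perturbation smaller than delta/4 still decodes to the embedded bit.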
Neural Network for Principal Component Analysis with Applications in Image Compression
Directory of Open Access Journals (Sweden)
Luminita State
2007-04-01
Full Text Available Classical feature extraction and data projection methods have been extensively investigated in the pattern recognition and exploratory data analysis literature. Feature extraction and multivariate data projection help avoid the "curse of dimensionality", improve the generalization ability of classifiers, and significantly reduce the computational requirements of pattern classifiers. During the past decade a large number of artificial neural networks and learning algorithms have been proposed for solving feature extraction problems, most of them adaptive in nature and well suited to the many real environments where an adaptive approach is required. Principal Component Analysis, also called the Karhunen-Loeve transform, is a well-known statistical method for feature extraction, data compression and multivariate data projection, and it has been broadly used in a wide range of signal and image processing, pattern recognition and data analysis applications.
Day-Ahead Electricity Price Forecasting Using a Hybrid Principal Component Analysis Network
Directory of Open Access Journals (Sweden)
Ching-Ping Wu
2012-11-01
Full Text Available Bidding competition is one of the main transaction approaches in a deregulated electricity market. Locational marginal prices (LMPs) resulting from bidding competition and system operation conditions indicate electricity values at a node or in an area. The LMP reveals important information for market participants in developing their bidding strategies. Moreover, the LMP is also a vital indicator for the Security Coordinator to perform market redispatch for congestion management. This paper presents a method using a principal component analysis (PCA) network cascaded with a multi-layer feedforward (MLF) network for forecasting LMPs in a day-ahead market. The PCA network extracts essential features from periodic information in the market. These features serve as inputs to the MLF network for forecasting LMPs. Historical LMPs from the PJM market are employed to test the proposed method. It is found that the proposed method is capable of forecasting day-ahead LMP values efficiently.
Directory of Open Access Journals (Sweden)
A. Bhushan
2015-07-01
Full Text Available In this paper, we address outliers in spatiotemporal data streams obtained from sensors placed across geographically distributed locations. Outliers may appear in such sensor data for various reasons, such as instrumental error and environmental change. Real-time detection of these outliers is essential to prevent the propagation of errors in subsequent analyses and results. Incremental Principal Component Analysis (IPCA) is one possible approach for detecting outliers in such spatiotemporal data streams. IPCA has been widely used in many real-time applications such as credit card fraud detection, pattern recognition, and image analysis. However, the suitability of applying IPCA for outlier detection in spatiotemporal data streams is unknown and needs to be investigated. To fill this research gap, this paper contributes two new IPCA-based outlier detection methods and a comparative analysis with existing IPCA-based outlier detection methods to assess their suitability for spatiotemporal sensor data streams.
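One simple IPCA-style outlier test keeps a running mean and scatter over the stream, and flags a new sample whose residual off the top-k principal subspace is large relative to the noise floor. This is an illustrative toy (a real incremental PCA updates the basis directly without storing the covariance), not one of the paper's proposed methods:

```python
import numpy as np

rng = np.random.default_rng(4)

class IPCAOutlierDetector:
    """Toy incremental-PCA outlier flagger for a streaming sensor vector."""
    def __init__(self, dim, k=2, threshold=3.0):
        self.n = 0
        self.mean = np.zeros(dim)
        self.cov = np.zeros((dim, dim))
        self.k, self.threshold = k, threshold

    def update(self, x):
        # Incremental mean / scatter update (Welford-style).
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.cov += np.outer(d, x - self.mean)

    def is_outlier(self, x):
        if self.n < 10:
            return False                        # not enough history yet
        w, V = np.linalg.eigh(self.cov / (self.n - 1))
        basis = V[:, -self.k:]                  # top-k eigenvectors
        resid = (x - self.mean) - basis @ basis.T @ (x - self.mean)
        # Compare the residual norm to the noise scale off the subspace.
        noise = np.sqrt(max(w[:-self.k].mean(), 1e-12) * len(x))
        return bool(np.linalg.norm(resid) > self.threshold * noise)

det = IPCAOutlierDetector(dim=5)
plane = rng.normal(size=(2, 5))                 # latent 2-D structure
for _ in range(200):                            # in-model stream
    det.update(rng.normal(size=2) @ plane + 0.05 * rng.normal(size=5))
inlier = rng.normal(size=2) @ plane + 0.05 * rng.normal(size=5)
outlier = inlier + 5.0 * rng.normal(size=5)     # off-model jump
```

In a deployment the eigendecomposition would be refreshed periodically rather than per sample; the point is only that the subspace model adapts as the stream evolves, so gradual environmental change is absorbed while abrupt instrumental errors are flagged.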
DEFF Research Database (Denmark)
Kotwa, Ewelina Katarzyna; Jørgensen, Bo Munk; Brockhoff, Per B.
2013-01-01
In this paper, we introduce a new method, based on spherical principal component analysis (S-PCA), for the identification of Rayleigh and Raman scatters in fluorescence excitation-emission data. These scatters should be found and eliminated as a prestep before fitting parallel factor analysis models to the data, in order to avoid model degeneracies. The work is inspired by and based on previous research, where scatter removal was automatic (based on a robust version of PCA called ROBPCA) and required no visual data inspection but appeared to be computationally intensive. To overcome this drawback, we implement the fast S-PCA in the scatter identification routine. Moreover, an additional pattern interpolation step that complements the method, based on robust regression, is applied. In this way, substantial time savings are gained, and the user's engagement is restricted to a minimum.
PRINCIPAL COMPONENTS IN MULTIVARIATE CONTROL CHARTS APPLIED TO DATA INSTRUMENTATION OF DAMS
Directory of Open Access Journals (Sweden)
Emerson Lazzarotto
2016-03-01
Hydroelectric plants are monitored by a large number of instruments that assess various quality characteristics of interest, each with inherent variability. The readings of these instruments generate time series that are, on many occasions, correlated, and each dam project has characteristics that make it unique. Faced with the need to establish statistical control limits for the instrumentation data, this article takes a multivariate statistical approach and proposes a model that uses principal component control charts to explain variability and to establish a monitoring method for controlling future observations. An application to section E of the Itaipu hydroelectric plant is performed to validate the model. The results show that the method is appropriate and can help identify the type of outliers, reduce false alarms, and reveal the instruments that contribute most to the variability.
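A minimal sketch of a principal-components control chart of the kind described: project readings onto a few retained components and monitor a Hotelling T² statistic against an upper control limit. The synthetic instrument data, the number of retained components, and the 99% chi-square limit are our illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

# Phase I: 200 in-control multivariate readings from 5 correlated instruments.
L = rng.normal(size=(5, 5))
cov = L @ L.T + np.eye(5)
base = rng.multivariate_normal(np.zeros(5), cov, size=200)

mu = base.mean(axis=0)
X = base - mu
_, s, Vt = np.linalg.svd(X, full_matrices=False)
var = s**2 / (len(X) - 1)                        # variance of each PC score
k = 2                                            # retained principal components

# Hotelling T^2 on the retained scores; 99% upper control limit approximated
# by the chi-square quantile chi2(0.99, df=2) ~ 9.21.
UCL = 9.21

def t2(x):
    scores = (x - mu) @ Vt[:k].T
    return float(np.sum(scores**2 / var[:k]))

in_control = np.array([t2(x) for x in base])
# A "future" reading shifted 6 sigma along the first principal direction
# should land far above the control limit:
shifted = mu + 6.0 * np.sqrt(var[0]) * Vt[0]
print(t2(shifted) > UCL, float(np.mean(in_control > UCL)))
```

In practice a complementary Q (squared prediction error) chart is usually monitored as well, to catch deviations outside the retained subspace.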
Elsawy, Amr S; Eldawlatly, Seif; Taher, Mohamed; Aly, Gamal M
2014-01-01
The current trend to use Brain-Computer Interfaces (BCIs) with mobile devices mandates the development of efficient EEG data processing methods. In this paper, we demonstrate the performance of a Principal Component Analysis (PCA) ensemble classifier for P300-based spellers. We recorded EEG data from multiple subjects using the Emotiv neuroheadset in the context of a classical oddball P300 speller paradigm. We compare the performance of the proposed ensemble classifier to the performance of traditional feature extraction and classifier methods. Our results demonstrate the capability of the PCA ensemble classifier to classify P300 data recorded using the Emotiv neuroheadset with an average accuracy of 86.29% on cross-validation data. In addition, offline testing of the recorded data reveals an average classification accuracy of 73.3% that is significantly higher than that achieved using traditional methods. Finally, we demonstrate the effect of the parameters of the P300 speller paradigm on the performance of the method.
He, A.; Quan, C.
2018-04-01
The combination of principal component analysis (PCA) and region matching is effective for fringe direction estimation. However, its mask construction algorithm for region matching fails in some circumstances, and the algorithm for converting orientation to direction in mask areas is computationally heavy and non-optimized. We propose an improved PCA-based region matching method for fringe direction estimation, which includes an improved and robust mask construction scheme and a fast, optimized orientation-to-direction conversion algorithm for the mask areas. Along with the estimated fringe direction map, the fringe pattern filtered by automatic selective reconstruction modification and enhanced fast empirical mode decomposition (ASRm-EFEMD) is used for the Hilbert spiral transform (HST) to demodulate the phase. Subsequently, the windowed Fourier ridge (WFR) method is used for phase refinement. The robustness and effectiveness of the proposed method are demonstrated on both simulated and experimental fringe patterns.
Choi, Wook-Jin; Choi, Tae-Sun
2009-08-01
Pulmonary nodule detection is a binary classification problem whose main objective is to distinguish nodules in lung computed tomography (CT) images. The intra-class variability is mainly due to grey-level variance, texture differences, and shape. The purpose of this study is to develop a novel nodule detection method based on two-dimensional principal component analysis (2DPCA). We extract features from nodule candidate images using 2DPCA, and nodule candidates are then classified using a threshold. The proposed method reduces the false positive (FP) rate. We tested the algorithm on the Lung Imaging Database Consortium (LIDC) database of the National Cancer Institute (NCI). The experimental results demonstrate the effectiveness and efficiency of the proposed method, which achieved an 85.11% detection rate with 1.13 FPs per scan.
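Unlike classical PCA, 2DPCA operates on image matrices directly rather than on vectorized images: it diagonalizes the image covariance G = E[(A − Ā)ᵀ(A − Ā)] and projects each image onto the leading eigenvectors. A small sketch with random matrices standing in for nodule candidate patches (patch size and number of retained axes are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(2)

# 40 toy "CT patch" images of size 16x16.
imgs = rng.normal(size=(40, 16, 16))
mean = imgs.mean(axis=0)

# Image covariance matrix G = mean of D^T D over centered images D.
G = np.zeros((16, 16))
for A in imgs:
    D = A - mean
    G += D.T @ D
G /= len(imgs)

evals, evecs = np.linalg.eigh(G)                 # eigenvalues ascending
X = evecs[:, ::-1][:, :4]                        # top-4 projection axes
features = np.stack([A @ X for A in imgs])       # each image -> 16x4 feature
print(features.shape)
```

Each image is thus compressed from 256 values to a 16×4 feature matrix, on which a threshold or any downstream classifier can operate.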
Abdullah, Nurul Azma; Saidi, Md. Jamri; Rahman, Nurul Hidayah Ab; Wen, Chuah Chai; Hamid, Isredza Rahmi A.
2017-10-01
In practice, identification of criminals in Malaysia is done through thumbprint identification. However, this type of identification is constrained, as most criminals nowadays are clever enough not to leave their thumbprints at the scene. With the advent of security technology, cameras, especially CCTV, have been installed in many public and private areas to provide surveillance, and the footage can be used to identify suspects at the scene. However, because little software has been developed to automatically detect the similarity between a photo in the footage and a recorded photo of a criminal, law enforcement still relies on thumbprint identification. In this paper, an automated facial recognition system for a criminal database is proposed using the well-known Principal Component Analysis approach. This system can detect and recognize faces automatically, helping law enforcement to identify suspects when no thumbprint is present at the scene. The results show that about 80% of input photos can be matched with the template data.
Directory of Open Access Journals (Sweden)
Wenjing Zhao
2018-01-01
The SGK (sequential generalization of K-means) dictionary learning denoising algorithm offers fast denoising speed and excellent denoising performance. However, the noise standard deviation must be known in advance when using the SGK algorithm to process an image. This paper presents a denoising algorithm that combines SGK dictionary learning with principal component analysis (PCA)-based noise estimation. First, the noise standard deviation of the image is estimated using the PCA noise estimation algorithm; this estimate is then supplied to the SGK dictionary learning algorithm. Experimental results show the following: (1) the SGK algorithm has the best denoising performance compared with the other three dictionary learning algorithms; (2) the SGK algorithm combined with PCA is superior to the SGK algorithm combined with other noise estimation algorithms; and (3) compared with the original SGK algorithm, the proposed algorithm achieves higher PSNR and better denoising performance.
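The principle behind PCA noise estimation can be sketched: patches of a (nearly) low-rank image span a small subspace, so the trailing eigenvalues of the patch covariance matrix estimate the noise variance. The smooth test image, the 4×4 patch size, and the signal/noise split below are illustrative assumptions, not the paper's exact estimator:

```python
import numpy as np

rng = np.random.default_rng(9)

# Smooth, low-rank test image (an outer product), plus Gaussian noise.
x = np.linspace(0, 1, 64)
clean = 100 * np.outer(x, x)
sigma_true = 5.0
noisy = clean + sigma_true * rng.normal(size=clean.shape)

# Collect all 4x4 patches as 16-dim vectors. Because x is linear, patches of
# the clean image span only a 4-dim subspace; the other 12 directions of the
# patch covariance carry noise alone.
patches = np.array([noisy[i:i + 4, j:j + 4].ravel()
                    for i in range(60) for j in range(60)])
patches = patches - patches.mean(axis=0)
eigvals = np.linalg.eigvalsh(patches.T @ patches / len(patches))  # ascending

sigma_est = float(np.sqrt(np.mean(eigvals[:12])))  # 12 non-signal directions
print(round(sigma_est, 1))
```

Practical estimators differ mainly in how they decide which eigenvalues belong to the noise floor for natural images, where the signal subspace dimension is unknown.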
DEFF Research Database (Denmark)
Sharifzadeh, Sara; Ghodsi, Ali; Clemmensen, Line H.
2017-01-01
Principal component analysis (PCA) is one of the main unsupervised pre-processing methods for dimension reduction. When training labels are available, it is worth using a supervised PCA strategy. In cases where both dimension reduction and variable selection are required, sparse PCA (SPCA) methods are preferred. In this paper, a sparse supervised PCA (SSPCA) method is proposed for pre-processing. This method is especially appropriate in problems where a high-dimensional input necessitates the use of a sparse method and a target label is also available to guide the variable selection… We compare the proposed method with PCA, PMD-based SPCA, and supervised PCA. In addition, SSPCA is compared with sparse partial least squares (SPLS), due to the similarity between the two objective functions. Experimental results on simulated as well as real data sets show that SSPCA…
3D Reconstruction of Coronal Loops by the Principal Component Analysis
Directory of Open Access Journals (Sweden)
Erwin Verwichte
2013-10-01
Knowing the three-dimensional structure of plasma filaments in the uppermost part of the solar atmosphere, known as coronal loops, and especially their length, is important for the wave-based diagnostics of this part of the Sun. The combination of observations of the Sun from different vantage points in space, thanks to the most recent missions, including the Solar Dynamics Observatory (SDO) and the Solar TErrestrial RElations Observatory (STEREO), allows us to infer information about the geometrical shape of coronal loops in 3D space. Here, we propose a new method to reconstruct the loop shape, starting from stereoscopically determined 3D points that sample the loop length, by principal component analysis. This method is shown to easily retrieve the main parameters that define the loop, e.g., the minor and major axes, the loop plane, and the azimuthal and inclination angles, for the special case of a coplanar loop.
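The geometric idea is straightforward to sketch: applying PCA (via SVD) to the centered 3D loop points yields two in-plane axes plus the plane normal, from which quantities such as the inclination angle follow. The synthetic semi-elliptical loop and its 35° tilt below are our own test case, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Points sampled along a semi-elliptical "loop" lying in a tilted plane.
t = np.linspace(0, np.pi, 30)
loop = np.stack([2.0 * np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
theta = np.deg2rad(35)                           # inclination of loop plane
R = np.array([[1, 0, 0],
              [0, np.cos(theta), -np.sin(theta)],
              [0, np.sin(theta),  np.cos(theta)]])
pts = loop @ R.T + 0.01 * rng.normal(size=loop.shape)

# PCA: the two leading right-singular vectors span the loop plane (major and
# minor axes); the third is the plane normal.
c = pts.mean(axis=0)
_, s, Vt = np.linalg.svd(pts - c, full_matrices=False)
normal = Vt[2]
inclination = np.degrees(np.arccos(abs(normal @ np.array([0.0, 0.0, 1.0]))))
print(round(inclination, 1))
```

The singular values `s` likewise give the major and minor extents of the loop in its own plane.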
Zia, Asif Iqbal
2015-06-01
The surface roughness of thin-film gold electrodes induces instability in impedance spectroscopy measurements of capacitive interdigital printable sensors. Post-fabrication thermodynamic annealing was carried out at temperatures ranging from 30 °C to 210 °C in a vacuum oven, and the variation in surface morphology of the thin-film gold electrodes was observed by scanning electron microscopy. Impedance spectra obtained at different temperatures were translated into equivalent circuit models by applying a complex nonlinear least-squares curve-fitting algorithm. Principal component analysis was applied to classify the parameters affected by the annealing process and to evaluate the performance stability using a mathematical model. The physics of the thermodynamic annealing is discussed on the basis of surface activation energies. Post-anneal testing of the sensors validated the achieved stability in impedance measurement.
Directory of Open Access Journals (Sweden)
Zhang Jingrong
2017-01-01
A Comprehensive Transportation Logistics Network (CTLN) acts as a crucial prop and fundamental carrier for regional economic and social development. First, an index system for evaluating the development of regional Comprehensive Transportation Logistics (CTL) nodes is established. Then, regional CTLN nodes are ranked according to their importance by principal component analysis (PCA), the main factors affecting the development of regional CTL nodes are identified by factor analysis, and regional CTL nodes are classified according to their feature similarities by cluster analysis; on this basis, a level structure for constructing the regional CTLN is proposed. Finally, combined with the geographic locations of the different nodes, a level layout model of the CTLN for the whole region is obtained. Taking Henan Province as an instance, a level layout model of a hub-and-spoke CTLN with Zhengzhou at its core is proposed, providing a reference basis for constructing the CTLN across the whole province scientifically and reasonably.
DEFF Research Database (Denmark)
Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan
2016-01-01
This work proposes a methodology utilizing functional unfold principal component regression (FUPCR) for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset containing 30 batches of a production process operating at Novozymes A/S. Following the FUPCR methodology, the final product concentration could be predicted with an average prediction error of 7.4%. Multiple iterations of preprocessing were applied by implementing the methodology to identify the best data handling methods for the model. It is shown that the application of functional data analysis and the choice of variance scaling method have the greatest impact on prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify…
Fernández-Arjona, María Del Mar; Grondona, Jesús M; Granados-Durán, Pablo; Fernández-Llebrez, Pedro; López-Ávalos, María D
2017-01-01
It is known that microglia morphology and function are closely related, but only a few studies have objectively described different morphological subtypes. To address this issue, morphological parameters of microglial cells were analyzed in a rat model of aseptic neuroinflammation. After the injection of a single dose of the enzyme neuraminidase (NA) within the lateral ventricle (LV), an acute inflammatory process occurs. Sections from NA-injected animals and sham controls were immunolabeled with the microglial marker IBA1, which highlights ramifications and features of the cell shape. Using images obtained by section scanning, individual microglial cells were sampled from various regions (septofimbrial nucleus, hippocampus and hypothalamus) at different times post-injection (2, 4 and 12 h). Each cell yielded a set of 15 morphological parameters by means of image analysis software. Five initial parameters (including fractal measures) were statistically different in cells from NA-injected rats (most of them IL-1β positive, i.e., M1-state) compared to those from control animals (none of them IL-1β positive, i.e., surveillant state). However, additional multimodal parameters proved more suitable for hierarchical cluster analysis (HCA), which pointed to a classification of the microglia population into four clusters. Furthermore, a linear discriminant analysis (LDA) suggested three specific parameters to objectively classify any microglial cell by a decision tree. In addition, a principal components analysis (PCA) revealed two extra valuable variables that allowed microglia to be further classified into a total of eight sub-clusters or types. The spatio-temporal distribution of these different morphotypes in our rat inflammation model made it possible to relate specific morphotypes to microglial activation status and brain location. An objective method for microglia classification based on morphological parameters is proposed. Main points: Microglia undergo a quantifiable…
Principal Component Analysis to Explore Climatic Variability and Dengue Outbreak in Lahore
Directory of Open Access Journals (Sweden)
Syed Afrozuddin Ahmed
2014-08-01
Various studies have reported that global warming causes an unstable climate and many serious impacts on the physical environment and public health. The increasing incidence of dengue fever is now a priority health issue and has become a health burden for Pakistan. In this study, we investigated whether spatial patterns in the environment contribute to the emergence or increasing rate of dengue fever incidence affecting the population and its health. Principal component analysis is performed to determine whether there is any general environmental factor or structure that could be involved in the emergence of dengue fever cases in the Pakistani climate. Principal components are extracted for three periods: 1980 to 2012, 1980 to 1995, and 1996 to 2012. The first three PCs for these periods (1980-2012, 1980-1995, 1996-2012) are almost the same and represent hot and windy weather. The PC1s of the dengue periods differ from each other. PC2 is the same for all periods and represents wetness in the weather. The PC3s differ and combine wetness and windy weather. PC4 for all periods shows humid but rainless weather. Among the climatic variables, only minimum temperature and maximum temperature are significantly correlated with daily dengue cases. PC1, PC3 and PC4 are highly significantly correlated with daily dengue cases.
Use of Principal Components Analysis to Explain Controls on Nutrient Fluxes to the Chesapeake Bay
Rice, K. C.; Mills, A. L.
2017-12-01
The Chesapeake Bay watershed, on the east coast of the United States, encompasses about 166,000 square kilometers (km²) of diverse land use, which includes a mixture of forested, agricultural, and developed land. The watershed is now managed under a Total Maximum Daily Load (TMDL), which requires implementation of management actions by 2025 sufficient to reduce nitrogen, phosphorus, and suspended-sediment fluxes to the Chesapeake Bay and restore the bay's water quality. We analyzed nutrient and sediment data along with land-use and climatic variables in nine subwatersheds to better understand the drivers of flux within the watershed and to provide relevant management implications. The nine subwatersheds range in area from 300 to 30,000 km², and the analysis period was 1985-2014. The 31 variables specific to each subwatershed were highly (statistically significantly) correlated, so Principal Components Analysis was used to reduce the dimensionality of the dataset. The analysis revealed that about 80% of the variability in the whole dataset can be explained by discharge, flux, and concentration of nutrients and sediment. The first two principal components (PCs) explained about 68% of the total variance: PC1 loaded strongly on discharge and flux, and PC2 on concentration. The PC scores of both PC1 and PC2 varied by season. Subsequent analysis of PC1 scores versus PC2 scores, broken out by subwatershed, revealed management implications. Some of the largest subwatersheds are largely driven by discharge, and consequently large fluxes. In contrast, some of the smaller subwatersheds vary more in nutrient concentration than in discharge and flux. Our results suggest that, given no change in discharge, a reduction of nutrient flux to the streams in the smaller watersheds could produce a proportionately larger decrease in the nutrient flux carried down the river to the bay than in the larger watersheds.
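The dimensionality-reduction step can be illustrated on a toy dataset in which many correlated variables are driven by two latent factors (standing in for "discharge/flux" and "concentration"); the first two components then absorb most of the variance, qualitatively as in the study. All numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Ten monitoring variables: six driven by a "discharge/flux" factor, four by
# a "concentration" factor, each with a little independent noise.
n = 300
discharge = rng.normal(size=n)
conc = rng.normal(size=n)
cols = [discharge + 0.1 * rng.normal(size=n) for _ in range(6)]
cols += [conc + 0.1 * rng.normal(size=n) for _ in range(4)]
X = np.column_stack(cols)

# Standardize (correlation-matrix PCA), then read off explained variance.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, s, _ = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)
two_pc = float(explained[:2].sum())
print(round(two_pc, 2))
```

With only two genuine drivers, the first two PCs capture nearly all the variance; with real, messier data the same calculation yields figures like the study's 68%.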
Oropesa, Ignacio; Escamirosa, Fernando Pérez; Sánchez-Margallo, Juan A; Enciso, Silvia; Rodríguez-Vila, Borja; Martínez, Arturo Minor; Sánchez-Margallo, Francisco M; Gómez, Enrique J; Sánchez-González, Patricia
2018-01-18
Motion analysis parameters (MAPs) have been extensively validated for the assessment of minimally invasive surgical skills. However, there are discrepancies about how specific MAPs, tasks, and skills match with each other, reflecting that motion analysis cannot be generalized independently of the learning outcomes of a task. Additionally, there is a lack of knowledge about the meaning of motion analysis in terms of surgical skills, making it difficult to provide meaningful, didactic feedback. In this study, new higher-significance MAPs (HSMAPs) are proposed, validated, and discussed for the assessment of technical skills in box trainers, based on principal component analysis (PCA). Motion analysis data were collected from 25 volunteers performing three box trainer tasks (peg grasping/PG, pattern cutting/PC, knot suturing/KS) using the EVA tracking system. PCA was applied to 10 MAPs for each task and hand. Principal components were trimmed to those accounting for an explained variance > 80% to define the HSMAPs. Individual contributions of MAPs to HSMAPs were obtained by loading analysis and varimax rotation. Construct validity of the new HSMAPs was assessed at two levels of experience based on number of surgeries. Three new HSMAPs per hand were defined for the PG and PC tasks, and two per hand for the KS task. PG presented validity for HSMAPs related to insecurity and economy of space; PC for HSMAPs related to cutting efficacy, peripheral unawareness, and confidence; and KS for HSMAPs related to economy of space and knotting security. PCA-defined HSMAPs can be used for technical skills assessment. Construct validation and expert knowledge can be combined to infer how competences are acquired in box trainer tasks. These findings can be exploited to provide residents with meaningful feedback on performance. Future work will compare the new HSMAPs with validated scoring systems such as GOALS.
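The component-trimming rule used above (retain principal components until the cumulative explained variance exceeds 80%) can be sketched as follows; the simulated MAPs driven by three latent "skill" factors are our own stand-in for the real motion data:

```python
import numpy as np

rng = np.random.default_rng(10)

# 10 motion-analysis parameters (MAPs) driven by 3 latent skill dimensions.
n = 60
latent = rng.normal(size=(n, 3))
loads = np.zeros((3, 10))
loads[0, :4] = 1.0                               # MAPs 0-3 <- factor 1
loads[1, 4:7] = 1.0                              # MAPs 4-6 <- factor 2
loads[2, 7:] = 1.0                               # MAPs 7-9 <- factor 3
maps = latent @ loads + 0.3 * rng.normal(size=(n, 10))

# Standardize, then keep the smallest k whose cumulative explained variance
# exceeds 80% -- the trimming rule used to define the HSMAPs.
Z = (maps - maps.mean(axis=0)) / maps.std(axis=0)
_, s, _ = np.linalg.svd(Z, full_matrices=False)
ratios = s**2 / np.sum(s**2)
k = int(np.searchsorted(np.cumsum(ratios), 0.80) + 1)
print(k)
```

Loading analysis and varimax rotation would then be applied to these `k` components to attribute individual MAPs to each HSMAP.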
Characterization of soil chemical properties of strawberry fields using principal component analysis
Directory of Open Access Journals (Sweden)
Gláucia Oliveira Islabão
2013-02-01
One of the largest strawberry-producing municipalities of Rio Grande do Sul (RS) is Turuçu, in the south of the state. The strawberry production system adopted by its farmers is similar to that used in other regions of Brazil and of the world; the main difference is related to soil management, which can change the soil chemical properties during the strawberry cycle. This study had the objective of assessing the spatial and temporal distribution of soil fertility parameters using principal component analysis (PCA). Soil sampling was based on topography, dividing each field into three thirds: upper, middle and lower. From each third, five soil samples were randomly collected in the 0-0.20 m layer to form a composite sample. Four samplings were taken during the strawberry cycle, and the following properties were determined: soil organic matter (OM), soil total nitrogen (N), available phosphorus (P) and potassium (K), exchangeable calcium (Ca) and magnesium (Mg), soil pH, cation exchange capacity (CEC) at pH 7.0, base saturation (V%) and aluminum saturation (m%). No spatial variation was observed for any of the studied soil fertility parameters in the strawberry fields, and temporal variation was only detected for available K. Phosphorus and K contents were always high or very high from the beginning of the strawberry cycle, while pH values ranged from very low to very high. Principal component analysis allowed the clustering of all strawberry fields based on variables related to soil acidity and organic matter content.
Principal components analysis based control of a multi-dof underactuated prosthetic hand
Directory of Open Access Journals (Sweden)
Magenes Giovanni
2010-04-01
Background: Functionality, controllability and cosmetics are the key issues to be addressed in order to accomplish a successful functional substitution of the human hand by means of a prosthesis. Not only should the prosthesis duplicate the human hand in shape, functionality, sensorization, perception and sense of body-belonging, but it should also be controlled as the natural one, in the most intuitive and undemanding way. At present, prosthetic hands are controlled by means of non-invasive interfaces based on electromyography (EMG). Driving a multi-degree-of-freedom (DoF) hand to achieve hand dexterity implies selectively modulating many different EMG signals in order to make each joint move independently, and this could require significant cognitive effort from the user. Methods: A Principal Components Analysis (PCA)-based algorithm is used to drive a 16-DoF underactuated prosthetic hand prototype (called CyberHand) with a two-dimensional control input, in order to perform the three prehensile forms mostly used in Activities of Daily Living (ADLs). The principal components set was derived directly from the artificial hand by collecting its sensory data while performing 50 different grasps, and subsequently used for control. Results: Trials have shown that two independent input signals can successfully control the posture of a real robotic hand and that correct grasps (in terms of involved fingers, stability and posture) can be achieved. Conclusions: This work demonstrates the effectiveness of a bio-inspired system successfully conjugating the advantages of an underactuated, anthropomorphic hand with a PCA-based control strategy, and opens up promising possibilities for the development of an intuitively controllable hand prosthesis.
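The control idea can be sketched: PCA over recorded grasp postures yields a low-dimensional "synergy" basis, so a two-dimensional command can be mapped back to a full joint configuration. The synthetic 16-DoF posture set, the synergy structure, and the helper `posture_from_input` below are our own illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# 50 recorded grasp postures of a 16-DoF hand, assumed to lie mostly on a
# 2-D "postural synergy" subspace.
synergies = rng.normal(size=(2, 16))
weights = rng.uniform(-1, 1, size=(50, 2))
postures = weights @ synergies + 0.02 * rng.normal(size=(50, 16))

mu = postures.mean(axis=0)
_, _, Vt = np.linalg.svd(postures - mu, full_matrices=False)
P = Vt[:2]                                       # two control axes

def posture_from_input(u):
    """Map a 2-D control input to a full 16-DoF joint configuration."""
    return mu + np.asarray(u) @ P

# A 2-D command reproduces one of the recorded grasps almost exactly.
target = postures[0]
u = (target - mu) @ P.T
err = float(np.linalg.norm(posture_from_input(u) - target))
print(err < 0.2)
```

In a prosthetic setting, `u` would come from two EMG channels, replacing the need to modulate one signal per joint.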
International Nuclear Information System (INIS)
Critto, Andrea; Carlon, Claudio; Marcomini, Antonio
2003-01-01
Information on soil and groundwater contamination was used to develop a site conceptual model and to identify exposure scenarios. - The characterization of a hydrologically complex contaminated site bordering the lagoon of Venice (Italy) was undertaken by investigating soils and groundwaters affected by chemical contaminants originating from wastes dumped into an illegal landfill. Statistical tools such as principal components analysis and geostatistical techniques were applied to obtain the spatial distribution of the chemical contaminants. Dissolved organic carbon (DOC), SO₄²⁻ and Cl⁻ were used to trace the migration of the contaminants from the top soil to the underlying groundwaters. The available chemical and hydrogeological information was assembled into a schematic conceptual model of the contaminated site capable of supporting the formulation of the major exposure scenarios, which are also provided.
International Nuclear Information System (INIS)
Reid, M.K.; Spencer, K.L.
2009-01-01
Principal components analysis (PCA) is a multivariate statistical technique capable of discerning patterns in large environmental datasets. Although widely used, there is disparity in the literature with respect to data pre-treatment prior to PCA. This research examines the influence of commonly reported data pre-treatment methods on PCA outputs, and hence data interpretation, using a typical environmental dataset comprising sediment geochemical data from an estuary in SE England. This study demonstrated that applying the routinely used log (x + 1) transformation skewed the data and masked important trends. Removing outlying samples and correcting for the influence of grain size had the most significant effect on PCA outputs and data interpretation. Reducing the influence of grain size using granulometric normalisation meant that other factors affecting metal variability, including mineralogy, anthropogenic sources and distance along the salinity transect could be identified and interpreted more clearly. - Data pre-treatment can have a significant influence on the outcome of PCA.
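The grain-size effect discussed here, and why granulometric normalisation helps before PCA, can be sketched numerically with synthetic sediment data (variable names and magnitudes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

# Sediment geochemistry toy data: metal content driven mainly by grain size.
mud = rng.uniform(10, 90, size=200)              # % fine-grained fraction
metal = 0.5 * mud + rng.normal(0, 1, size=200)   # metal, grain-size-driven

# Granulometric normalisation: express metal per unit fine fraction, so that
# PCA responds to sources and mineralogy rather than to grain size.
corr_raw = float(np.corrcoef(metal, mud)[0, 1])
corr_norm = float(np.corrcoef(metal / mud, mud)[0, 1])
print(round(corr_raw, 2), round(corr_norm, 2))
```

Before normalisation the metal is almost perfectly correlated with grain size, so a PCA would simply rediscover grain size as its first component; after normalisation that correlation largely vanishes, leaving other sources of variability visible.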
Unglert, K.; Radić, V.; Jellinek, A. M.
2016-06-01
Variations in the spectral content of volcano seismicity related to changes in volcanic activity are commonly identified manually in spectrograms. However, long time series of monitoring data at volcano observatories require tools to facilitate automated and rapid processing. Techniques such as self-organizing maps (SOM) and principal component analysis (PCA) can help to quickly and automatically identify important patterns related to impending eruptions. For the first time, we evaluate the performance of SOM and PCA on synthetic volcano seismic spectra constructed from observations during two well-studied eruptions at Kīlauea Volcano, Hawai'i, that include features observed in many volcanic settings. In particular, our objective is to test which of the techniques can best retrieve a set of three spectral patterns that we used to compose a synthetic spectrogram. We find that, without a priori knowledge of the given set of patterns, neither SOM nor PCA can directly recover the spectra. We thus test hierarchical clustering, a commonly used method, to investigate whether clustering in the space of the principal components and on the SOM, respectively, can retrieve the known patterns. Our clustering method applied to the SOM fails to detect the correct number and shape of the known input spectra. In contrast, clustering of the data reconstructed by the first three PCA modes reproduces these patterns and their occurrence in time more consistently. This result suggests that PCA in combination with hierarchical clustering is a powerful practical tool for automated identification of characteristic patterns in volcano seismic spectra. Our results indicate that, in contrast to PCA, common clustering algorithms may not be ideal to group patterns on the SOM and that it is crucial to evaluate the performance of these tools on a control dataset prior to their application to real data.
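The winning combination reported here, PCA reconstruction followed by hierarchical clustering, can be sketched on a synthetic spectrogram built from three known patterns. The peak locations, noise level, and Ward linkage are our assumptions, and `scipy` is assumed available:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(7)

# Synthetic "volcano seismic" spectra built from 3 known spectral patterns.
freqs = np.linspace(0, 10, 100)

def peak(f0, w):
    """Gaussian spectral peak centred at f0 with width w."""
    return np.exp(-((freqs - f0) / w) ** 2)

patterns = np.array([peak(2, 0.5), peak(5, 1.0), peak(8, 0.7)])
labels_true = rng.integers(0, 3, size=120)
spectra = patterns[labels_true] + 0.05 * rng.normal(size=(120, 100))

# Reconstruct with the first 3 PCA modes, then cluster hierarchically.
mu = spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(spectra - mu, full_matrices=False)
recon = mu + (U[:, :3] * s[:3]) @ Vt[:3]

Z = linkage(recon, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")
# Each recovered cluster should be pure with respect to the true pattern.
purity = float(np.mean([np.bincount(labels_true[labels == c]).max()
                        / np.sum(labels == c) for c in (1, 2, 3)]))
print(purity)
```

The PCA reconstruction acts as a denoising step, which is one reason clustering in the reconstructed space recovers the input patterns more reliably than clustering the raw spectra or the SOM nodes.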
Li, Jun; Song, Minghui; Peng, Yuanxi
2018-03-01
Current infrared and visible image fusion methods do not achieve adequate information extraction, i.e., they cannot extract the target information from infrared images while retaining the background information from visible images. Moreover, most of them have high complexity and are time-consuming. This paper proposes an efficient image fusion framework for infrared and visible images on the basis of robust principal component analysis (RPCA) and compressed sensing (CS). The novel framework consists of three phases. First, RPCA decomposition is applied to the infrared and visible images to obtain their sparse and low-rank components, which represent the salient features and background information of the images, respectively. Second, the sparse and low-rank coefficients are fused by different strategies. On the one hand, the measurements of the sparse coefficients are obtained by the random Gaussian matrix, and they are then fused by the standard deviation (SD) based fusion rule. Next, the fused sparse component is obtained by reconstructing the result of the fused measurement using the fast continuous linearized augmented Lagrangian algorithm (FCLALM). On the other hand, the low-rank coefficients are fused using the max-absolute rule. Subsequently, the fused image is obtained by superposing the fused sparse and low-rank components. For comparison, several popular fusion algorithms are tested experimentally. By comparing the fused results subjectively and objectively, we find that the proposed framework can extract the infrared targets while retaining the background information in the visible images. Thus, it exhibits state-of-the-art performance in terms of both fusion effects and timeliness.
Nanni, Arthur; Roisenberg, Ari; Fachel, Jandyra M G; Mesquita, Gilberto; Danieli, Cristiano
2008-12-01
Principal component analysis is applied to chemical data from 309 groundwater wells in the Serra Geral Aquifer System. Correlations among seven hydrochemical parameters are statistically examined. A four-component model is suggested that explains 81% of the total variance. Component 1 represents calcium-magnesium bicarbonate groundwaters with long residence times; Component 2 represents sulfate- and chloride-bearing calcium and sodium groundwaters; Component 3 represents sodium bicarbonate groundwaters; and Component 4 is characterized by sodium sulfate facies with high fluoride. The spatial distribution of the components shows high fluoride concentrations along the analyzed tectonic fault system, and aligned in a northeast direction in other areas, suggesting additional hydrogeological fault systems. Fluoride concentration increases with groundwater pumping depth. The principal component analysis reveals features of groundwater mixing and individualizes water facies. In this setting, hydrogeological blocks associated with the tectonic fault system introduced here can be delineated.
Aviyente, Selin; Bernat, Edward M; Malone, Stephen M; Iacono, William G
2010-01-01
Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely-used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
Directory of Open Access Journals (Sweden)
Shaker AS
2017-06-01
Full Text Available Chicken eggs represent an important source of protein to the growing human population and also supply repositories of unique genes that could be used worldwide. The inheritance of shank feathering trait is dominant upon non-feathering shank trait in chicken which is based on two factors: pti-1L and pti-1B that are located on Chromosomes 13, 15, and 24. Using 185 fertile eggs collected from two genetic lines (shank feathering and non-feathering shank of White Kurdish chicken, we found that egg weight highly (P < 0.01 correlated with yolk weight (r2=0.520, 0.704, respectively, albumen weight (r2=0.918, 0.835, and shell weight (r2=0.626, 0.225. The first two principal components explained the greatest variance in both the White with shank feathering (85.6% of total variance and non-feathering shank (76.5%. Therefore, differences in the component traits of the eggs between the two genetic lines may be influenced by the same gene actions as shank feathering trait. According to these results, the two genetic lines of Kurdish chicken yield significant differences in the internal traits of eggs.
Scullin, Michael K; Harrison, Tyler L; Factor, Stewart A; Bliwise, Donald L
2014-01-15
Sleep disturbances are common in many neurodegenerative diseases and may include altered sleep duration, fragmented sleep, nocturia, excessive daytime sleepiness, and vivid dreaming experiences, with occasional parasomnias. Although representing the "gold standard," polysomnography is not always cost-effective or available for measuring sleep disturbance, particularly for screening. Although numerous sleep-related questionnaires exist, many focus on a specific sleep disturbance (e.g., restless legs, REM behavior disorder) and do not efficiently capture the variety of sleep issues experienced by such patients. We administered the 12-item Neurodegenerative Disease Sleep Questionnaire (NDSQ) and the Epworth Sleepiness Scale to 145 idiopathic Parkinson's disease patients. Principal component analysis using eigenvalues greater than 1 suggested five separate components: sleep quality (e.g., sleep fragmentation), nocturia, vivid dreams/nightmares, restless legs symptoms, and sleep-disordered breathing. These results demonstrate the construct validity of our sleep questionnaire and suggest that the NDSQ may be a useful screening tool for sleep disturbances in at least some types of neurodegenerative disorders.
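The component-retention rule used in this study ("eigenvalues greater than 1", the Kaiser rule) can be sketched as follows; the questionnaire data here are simulated stand-ins, not the NDSQ responses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for 145 patients x 12 questionnaire items,
# generated from three underlying "sleep factors" plus item noise.
n, p = 145, 12
latent = rng.normal(size=(n, 3))
loadings = rng.normal(size=(3, p))
X = latent @ loadings + rng.normal(size=(n, p))

# Kaiser rule: on the correlation matrix, retain components whose
# eigenvalue exceeds 1, i.e. components explaining more variance
# than a single standardized item.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
n_components = int(np.sum(eigvals > 1.0))
```

Since the trace of a correlation matrix equals the number of items, the eigenvalues always sum to p, which is what makes "greater than 1" a natural cutoff.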
New Role of Thermal Mapping in Winter Maintenance with Principal Components Analysis
Directory of Open Access Journals (Sweden)
Mario Marchetti
2014-01-01
Full Text Available Thermal mapping uses IR thermometry to measure road pavement temperature at a high resolution to identify and to map sections of the road network prone to ice occurrence. However, measurements are time-consuming and ultimately only provide a snapshot of road conditions at the time of the survey. As such, there is a need for surveys to be restricted to a series of specific climatic conditions during winter. Typically, five to six surveys are used, but it is questionable whether the full range of atmospheric conditions is adequately covered. This work investigates the role of statistics in adding value to thermal mapping data. Principal components analysis is used to interpolate between individual thermal mapping surveys to build a thermal map (or even a road surface temperature forecast) for a wider range of climatic conditions than that permitted by traditional surveys. The results indicate that when this approach is used, fewer thermal mapping surveys are actually required. Furthermore, comparisons with numerical models indicate that this approach could yield a suitable verification method for the spatial component of road weather forecasts—a key issue currently in winter road maintenance.
Directory of Open Access Journals (Sweden)
A. Jayanegara
2014-10-01
Full Text Available This research aimed to explore the use of multivariate statistics, i.e. principal component analysis (PCA), in identifying and integrating variables related to forage quality and ruminal methane production, and in classifying forage species according to both characteristics. Seventeen plants were used as a database for the above-mentioned purposes. Plant samples were determined for their chemical composition, cumulative gas production (representing nutrient degradation) and methane production after 24 hours of fermentation using the Hohenheim gas test. The results showed that the PCA could clearly identify factors related to forage quality and methane production and separated them into different principal components (PC). The obtained PC1 was related to methane production and substantially influenced by crude protein, NDF and ADF (positive), and by total phenols, total tannins, condensed tannins and tannin activity (negative). On the other hand, the obtained PC2 was related to cumulative gas production (forage quality) and substantially influenced by crude protein (positive), and by NDF, ADF and condensed tannins (negative). Classification and screening of forages that have high quality and low methane production are possible using the PCA technique. Rheum undulatum, Peltiphyllum peltatum and Rhus typhina were found to have such desired characteristics.
Structured Sparse Principal Components Analysis With the TV-Elastic Net Penalty.
de Pierrefeu, Amicie; Lofstedt, Tommy; Hadj-Selem, Fouad; Dubois, Mathieu; Jardri, Renaud; Fovet, Thomas; Ciuciu, Philippe; Frouin, Vincent; Duchesnay, Edouard
2018-02-01
Principal component analysis (PCA) is an exploratory tool widely used in data analysis to uncover the dominant patterns of variability within a population. Despite its ability to represent a data set in a low-dimensional space, PCA's interpretability remains limited. Indeed, the components produced by PCA are often noisy or exhibit no visually meaningful patterns. Furthermore, the fact that the components are usually non-sparse may also impede interpretation, unless arbitrary thresholding is applied. However, in neuroimaging, it is essential to uncover clinically interpretable phenotypic markers that would account for the main variability in the brain images of a population. Recently, some alternatives to the standard PCA approach, such as sparse PCA (SPCA), have been proposed, their aim being to limit the density of the components. Nonetheless, sparsity alone does not entirely solve the interpretability problem in neuroimaging, since it may yield scattered and unstable components. We hypothesized that the incorporation of prior information regarding the structure of the data may lead to improved relevance and interpretability of brain patterns. We therefore present a simple extension of the popular PCA framework that adds structured sparsity penalties on the loading vectors in order to identify the few stable regions in the brain images that capture most of the variability. Such structured sparsity can be obtained by combining, e.g., ℓ1 and total variation (TV) penalties, where the TV regularization encodes information on the underlying structure of the data. This paper presents the structured SPCA (denoted SPCA-TV) optimization framework and its resolution. We demonstrate SPCA-TV's effectiveness and versatility on three different data sets. It can be applied to any kind of structured data, such as N-dimensional array images or meshes of cortical surfaces. The gains of SPCA-TV over unstructured approaches (such as SPCA and ElasticNet PCA) or structured approach
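A minimal sketch of the sparsity idea is shown below: ℓ1 soft-thresholding inside alternating power iterations, one known route to sparse loadings. It is not the SPCA-TV algorithm (the TV penalty and its specialized solver are omitted), and the data are synthetic stand-ins for brain images.

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise l1 proximal operator."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_pc1(X, lam, n_iter=200):
    """First sparse loading vector via alternating power iterations
    with l1 soft-thresholding on the loadings."""
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    v = vt[0].copy()                    # warm start: ordinary PC1 loading
    for _ in range(n_iter):
        u = X @ v
        u /= np.linalg.norm(u)
        v = soft_threshold(X.T @ u, lam)
        nv = np.linalg.norm(v)
        if nv == 0.0:
            break
        v /= nv
    return v

rng = np.random.default_rng(2)
# Hypothetical data: 50 samples, 20 variables; only the first 5 variables
# carry the dominant mode (one "stable region"), the rest are noise.
X = rng.normal(size=(50, 20)) * 0.1
X[:, :5] += rng.normal(size=(50, 1))
X -= X.mean(axis=0)

v = sparse_pc1(X, lam=1.0)
support = np.flatnonzero(np.abs(v) > 1e-8)   # variables retained by sparsity
```

With the penalty active, the recovered loading vector is supported only on the signal-carrying variables, which is the interpretability gain the abstract argues for.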
Warmenhoven, John; Cobley, Stephen; Draper, Conny; Harrison, Andrew; Bargary, Norma; Smith, Richard
2017-11-15
The proliferation of new biomechanical technology in laboratory and field settings facilitates the capture of data-sets consisting of complex time-series. An understanding of the appropriate statistical approaches for analysing and interpreting these data-sets is required, and the functional data analysis (FDA) family of statistical techniques has emerged in the biomechanical literature. Given that the use of FDA with biomechanical data is currently in its infancy, this paper forms the first of a two-part series aiming to address practical issues surrounding the application of FDA techniques in biomechanics. This work focuses on functional principal components analysis (fPCA), which is explored using existing literature and sample data from an on-water rowing database. In particular, methodological considerations for the implementation of fPCA, such as temporal normalisation of data, removal of unwanted forms of variation in a data-set, and documented methods for preserving the original temporal properties within a set of curves, are explored in detail as a part of this review. Limitations and strengths of the technique are outlined and recommendations are provided to encourage the appropriate use of fPCA within the field of applied sports biomechanics.
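When curves have been temporally normalized to a common grid, fPCA reduces to the SVD of the centred curve matrix: the right singular vectors are the fPC functions and the projections are subject-level scores. A minimal sketch with synthetic stand-ins for rowing-stroke curves (not the review's data or code):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in for 40 stroke curves, each temporally normalized
# to 101 points (0-100% of the stroke cycle).
t = np.linspace(0.0, 1.0, 101)
mean_curve = np.sin(np.pi * t)
scores_true = rng.normal(size=(40, 1))
mode = np.sin(2 * np.pi * t)               # one dominant mode of variation
curves = mean_curve + scores_true * mode + 0.05 * rng.normal(size=(40, 101))

# fPCA on a common grid: centre the curves, then take the SVD.
centred = curves - curves.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
fpc1 = vt[0]                               # first functional principal component
pc_scores = centred @ vt[:2].T             # subject-level fPC scores
var_explained = s**2 / np.sum(s**2)
```

Plotting the mean curve plus and minus a multiple of `fpc1` is the usual way of visualising what a functional component represents.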
Directory of Open Access Journals (Sweden)
Karacaören Burak
2011-05-01
Full Text Available Abstract Background: It has been shown that if genetic relationships among individuals are not taken into account in genome-wide association studies, this may lead to false positives. To address this problem, we used Genome-wide Rapid Association using Mixed Model and Regression (GRAMMAR) and principal component stratification analyses. To account for linkage disequilibrium among the significant markers, principal component loadings obtained from top markers can be included as covariates. Estimation of Bayesian networks may also be useful to investigate linkage disequilibrium among SNPs and their relation with environmental variables. For the quantitative trait we first estimated residuals while taking polygenic effects into account. We then used a single-SNP approach to detect the most significant SNPs based on the residuals and applied principal component regression to take linkage disequilibrium among these SNPs into account. For the categorical trait we used the principal component stratification methodology to account for background effects. For correction of linkage disequilibrium we used principal component logit regression. Bayesian networks were estimated to investigate relationships among SNPs. Results: Using the GRAMMAR and principal component stratification approach we detected around 100 significant SNPs for the quantitative trait. Conclusions: GRAMMAR could efficiently incorporate the information regarding random genetic effects. Principal component stratification should be used cautiously, with stringent multiple-hypothesis-testing correction, to correct for ancestral stratification in association analyses for binary traits when there are systematic genetic effects such as half-sib family structures. Bayesian networks are useful to investigate relationships among SNPs and environmental variables.
Directory of Open Access Journals (Sweden)
Peter Haščík
2017-01-01
Full Text Available The objective of the present study was to examine the effect of different dietary supplements (bee pollen, propolis, and probiotic) on the sensory quality of chicken breast muscle. The experiment was performed with 180 one-day-old Ross 308 broiler chicks of mixed sex. The dietary treatments were as follows: 1. basal diet with no supplementation as control (C); 2. basal diet plus 400 mg bee pollen extract per 1 kg of feed mixture (E1); 3. basal diet plus 400 mg propolis extract per 1 kg of feed mixture (E2); 4. basal diet plus 3.3 g probiotic preparation based on Lactobacillus fermentum added to drinking water (E3). Sensory properties of chicken breast muscle were assessed by a five-member panel that rated the meat for aroma, taste, juiciness, tenderness and overall acceptability. The ANOVA results for each attribute showed that at least one mean score for any group differed significantly (p ≤ 0.05). A subsequent Tukey's HSD test revealed that only the C group had a significantly higher mean score (p ≤ 0.05) for each attribute compared with the E2 group. As regards the E1 and E3 groups, there were no significant differences (p > 0.05) in aroma, taste and tenderness when compared to the C group, with the significantly lowest juiciness value (p ≤ 0.05) found in the E3 group and significantly lower values of overall acceptability in both groups (p ≤ 0.05). In addition, it is noteworthy that the control group received the highest rating scores for each sensory attribute, i.e. the supplements did not positively influence the sensory quality of chicken breast meat. Principal component analysis (PCA) of the sensory data showed that the first 3 principal components (PCs) explained 69.82% of the total variation in 5 variables. Visualisation of the extracted PCs showed that the groups were very well represented, with the E2 group clearly distinguished from the others.
Directory of Open Access Journals (Sweden)
Matrone Giulia C
2012-06-01
Full Text Available Abstract Background: In spite of the advances made in the design of dexterous anthropomorphic hand prostheses, these sophisticated devices still lack adequate control interfaces which could allow amputees to operate them in an intuitive and close-to-natural way. In this study, an anthropomorphic five-fingered robotic hand, actuated by six motors, was used as a prosthetic hand emulator to assess the feasibility of a control approach based on Principal Components Analysis (PCA), specifically conceived to address this problem. Since it was demonstrated elsewhere that the first two principal components (PCs) can describe the whole hand configuration space sufficiently well, the controller employed here reverted the PCA algorithm and allowed a multi-DoF hand to be driven by combining a two-differential-channel EMG input with these two PCs. Hence, the novelty of this approach stood in the application of PCA to the challenging problem of best mapping the EMG inputs into the degrees of freedom (DoFs) of the prosthesis. Methods: A clinically viable two-DoF myoelectric controller, exploiting two differential channels, was developed, and twelve able-bodied participants, divided in two groups, volunteered to control the hand in simple grasp trials, using forearm myoelectric signals. Task completion rates and times were measured. The first objective (assessed through one group of subjects) was to understand the effectiveness of the approach, i.e., whether it is possible to drive the hand in real time, with reasonable performance, in different grasps, also taking advantage of the direct visual feedback of the moving hand. The second objective (assessed through a different group) was to investigate the intuitiveness, and therefore to assess statistical differences in the performance throughout three consecutive days. Results: Subjects performed several grasp, transport and release trials with differently shaped objects, by operating the hand with the myoelectric
Use of principal component analysis to classify forages and predict their calculated energy content.
Gallo, A; Moschini, M; Cerioli, C; Masoero, F
2013-06-01
A set of 180 forages (47 alfalfa hays, 26 grass hays, 52 corn silages, 35 small grain silages and 20 sorghum silages) were randomly collected from different locations of the Po Valley (Northern Italy) from 2009 to 2010. The forages were characterised for chemical composition (11 parameters), NDF digestibility (five parameters) and net energy for lactation (NEL). The latter was calculated according to the two approaches adopted by the 2001 National Research Council, based on chemical parameters either alone (NEL3x-Lig) or in combination with 48-h NDF degradability in the rumen (NEL3x-48h). Thereafter, a principal component analysis (PCA) was used to define forage populations and limit the number of variables to those useful for obtaining a rapid forage quality evaluation on the basis of the calculated NEL content of forages. The PCA identified three forage populations: corn silage, alfalfa hay and a generic population of so-called 'grasses', consisting of grass hays, small grain and sorghum silages. This differentiation was also confirmed by a cluster analysis. The first three principal components (PC) together explained 79.9% of the total variation. PC1 was mainly associated with protein fractions, ether extract and lignin, PC2 with ash, starch, NDF and indigestible NDF (iNDF) and PC3 with NDF digestibility. Moreover, PC2 was highly correlated to both NEL3x-Lig (r = -0.84) and NEL3x-48h (r = -0.94). Subsequently, forage-based scores (FS) were calculated by multiplying the original standardised variables of ash, starch, NDF and iNDF with the scoring factors obtained from PCA (0.112, -0.141, 0.227 and 0.170, respectively). The FS showed a high determination coefficient for both NEL3x-Lig (R2 = 0.86) and NEL3x-48h (R2 = 0.73). These results indicate that PCA enables the distinction of different forage classes and appropriate prediction of the energy value on the basis of a reduced number of parameters. With respect to the rumen in situ parameters, iNDF was found
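The forage score (FS) described above is just a linear combination of standardized variables with the PCA-derived scoring factors. A sketch of the computation, using the scoring factors reported in the abstract but hypothetical raw values:

```python
import numpy as np

# Scoring factors from the abstract, in the order ash, starch, NDF, iNDF.
weights = np.array([0.112, -0.141, 0.227, 0.170])

# Hypothetical raw values for three forages (% of dry matter).
raw = np.array([
    [9.5,  1.2, 44.0, 18.0],   # alfalfa-hay-like
    [4.0, 32.0, 40.0, 10.0],   # corn-silage-like
    [8.0,  2.0, 60.0, 22.0],   # grass-like
])

# Standardize each variable across the sample, as in the paper,
# then combine with the scoring factors to get the forage score FS.
z = (raw - raw.mean(axis=0)) / raw.std(axis=0)
fs = z @ weights
```

Because each standardized column sums to zero, the scores are centred around zero and rank forages relative to the sample, which is how a score like this would feed the NEL regressions.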
Principal component analysis and the locus of the Fréchet mean in the space of phylogenetic trees.
Nye, Tom M W; Tang, Xiaoxian; Weyenberg, Grady; Yoshida, Ruriko
2017-12-01
Evolutionary relationships are represented by phylogenetic trees, and a phylogenetic analysis of gene sequences typically produces a collection of these trees, one for each gene in the analysis. Analysis of samples of trees is difficult due to the multi-dimensionality of the space of possible trees. In Euclidean spaces, principal component analysis is a popular method of reducing high-dimensional data to a low-dimensional representation that preserves much of the sample's structure. However, the space of all phylogenetic trees on a fixed set of species does not form a Euclidean vector space, and methods adapted to tree space are needed. Previous work introduced the notion of a principal geodesic in this space, analogous to the first principal component. Here we propose a geometric object for tree space similar to the kth principal component in Euclidean space: the locus of the weighted Fréchet mean of k + 1 vertex trees when the weights vary over the k-simplex. We establish some basic properties of these objects, in particular showing that they have dimension k, and propose algorithms for projection onto these surfaces and for finding the principal locus associated with a sample of trees. Simulation studies demonstrate that these algorithms perform well, and analyses of two datasets, containing Apicomplexa and African coelacanth genomes respectively, reveal important structure from the second principal components.
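The variational definition underlying the weighted Fréchet mean (the minimizer of the weighted sum of squared distances) can be illustrated in the Euclidean case, where it coincides with the weighted average; in tree space the same definition is used with geodesic distances, so the sketch below is only a Euclidean analogue, with hypothetical points standing in for vertex trees.

```python
import numpy as np

def frechet_mean(points, weights, lr=0.1, n_iter=500):
    """Weighted Frechet mean by gradient descent on the weighted sum of
    squared distances. In Euclidean space this recovers the weighted
    average; in tree space the same objective is minimized with
    geodesic distances instead."""
    x = points[0].copy()
    for _ in range(n_iter):
        grad = 2.0 * np.sum(weights[:, None] * (x - points), axis=0)
        x -= lr * grad
    return x

points = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])  # k + 1 = 3 "vertex trees"
weights = np.array([0.5, 0.25, 0.25])                    # a point in the 2-simplex

m = frechet_mean(points, weights)
# Sweeping the weights over the whole simplex traces out the k-dimensional
# locus described in the abstract (here k = 2).
```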
Using the principal component analysis for evaluating the quality of a tourist destination
Directory of Open Access Journals (Sweden)
Ida Vajčnerová
2012-01-01
Full Text Available The article deals with problems of evaluating the quality of a tourist destination. A tourist destination is a conjunction of products, services, natural resources, cultural resources, local people, artificially created attractions and information, thanks to which it is able to attract a number of visitors. Visitors' satisfaction with a destination depends on the quality of their overall experience, which is created through the cooperation of all actors in tourism in the given area: local inhabitants, service providers, public administration workers and destination management. The quality of services is a component of consumer satisfaction and so is evaluated according to the level of a customer's satisfaction. Sustainable development and the quality of the natural environment are parts of destination quality, too. When evaluating quality it is necessary to define a set of factors (variables) that can be quantified and then to determine the quality of a destination. The objective of the paper is to create a model for evaluating the quality of a destination on the basis of analysing the importance of individual factors (variables) concerning the quality of a destination. The importance of these factors was determined by relevant respondents during a questionnaire survey. To reduce the original number of twenty dependent variables, the multidimensional statistical method of principal component analysis was used. On the basis of similarities in evaluation, this method yielded clusters of factors, i.e. relative dimensions of destination quality. Subsequently, a methodology was formulated to evaluate the quality of a destination according to four newly defined dimensions of quality: Attractions, Services, Marketing management, and Sustainability and cooperation.
The aging profile of the Portuguese population: a principal component analysis.
Rodrigues, Vitor; Mota-Pinto, Anabela; de Sousa, Bruno; Botelho, Amália; Alves, Catarina; de Oliveira, Catarina Resende
2014-08-01
In the last 5 years the resident population of Portugal has increased by 2.3%, along with progressive ageing. This study aims to assess the social dependence and frailty, as well as the social and familial support needs, of the elderly. In an observational, cross-sectional, community-based study (the EPEPP study), a total of 2,672 people aged 55 or more were surveyed, and several variables were studied among three age groups: 55-64 years old (37%), 65-74 years old (37%) and ≥ 75 years old (26%), encompassing in total 57% women and 43% men. A questionnaire was applied including items such as physical autonomy, locomotion, falls, health/medical complaints, instrumental autonomy, physical activity, health self-evaluation and emotional status. The strong correlations among the studied scores allowed the identification of groups of people with common characteristics when principal component analysis was used: "autonomy" (scores of instrumental autonomy, locomotion and physical autonomy) and "perception of health and emotional status" (scores of health self-evaluation and emotional status) were present in the three age groups. The component analysis shows that good autonomy and a good perception of health and emotional status are determinant for a good quality of life in the elderly. Although health status and self-rated health have a propensity to deteriorate with aging, older Portuguese consider their state of health satisfactory and tend to underestimate their decline. Regarding gender at the same age, and in contrast to what has been reported, older women, like men, experience good mobility and health self-evaluation.
Air Pollution and Human Development in Europe: A New Index Using Principal Component Analysis
Directory of Open Access Journals (Sweden)
Ana-Maria Săndică
2018-01-01
Full Text Available This paper proposes a new index for EU countries to measure human development incorporating the ambient PM2.5 concentration effect. Using a principal component analysis, we extract the information for 2010 and 2015 using the Real GDP/capita, the life expectancy at birth, tertiary educational attainment, ambient PM2.5 concentration, and the death rate due to exposure to ambient PM2.5 concentration for 29 European countries. This paper has two main results: first, it gives an overview of the relationship between human development and ambient PM2.5 concentration, and second, it provides a new quantitative measure, the PHDI, which reshapes the concept of human development to include exposure to ambient PM2.5 concentration. Using rating classes, we defined thresholds for both HDI and PHDI values to group the countries into four categories. When comparing the migration matrix from 2010 to 2015 for HDI values, some countries improved the development indicator (Romania, Poland, Malta, Estonia, Cyprus), while no downgrades were observed. When comparing the transition matrix using the newly developed indicator, PHDI, upgrades were observed for Denmark and Estonia, while some countries such as Spain and Italy moved to a lower rating class due to ambient PM2.5 concentration.
Sartipi, Majid; Nedjat, Saharnaz; Mansournia, Mohammad Ali; Baigi, Vali; Fotouhi, Akbar
2016-11-01
Some variables, such as socioeconomic status (SES), cannot be directly measured; instead, such 'latent variables' are measured indirectly through tangible items. There are different methods for measuring latent variables, such as data reduction methods, e.g. Principal Components Analysis (PCA), and Latent Class Analysis (LCA). The purpose of our study was to measure an assets index, as a representative of SES, through the two methods of Non-Linear PCA (NLPCA) and LCA, and to compare them in order to choose the most appropriate model. This was a cross-sectional study in which 1,995 respondents in Tehran filled in questionnaires about their assets. The data were analyzed with SPSS 19 (CATPCA command) and SAS 9.2 (PROC LCA command) to estimate socioeconomic status, and the results were compared based on the Intra-class Correlation Coefficient (ICC). The 6 classes derived from LCA based on BIC were highly consistent with the 6 classes from CATPCA (Categorical PCA) (ICC = 0.87, 95% CI: 0.86-0.88). There is no gold standard for measuring SES, so it is not possible to say definitively that one method is better than another. LCA is a complicated method that presents detailed information about latent variables and requires one assumption (local independence), while NLPCA is a simple method which requires more assumptions. Overall, NLPCA seems to be an acceptable method of analysis because of its simplicity and high agreement with LCA.
Multi-point accelerometric detection and principal component analysis of heart sounds.
De Panfilis, S; Moroni, C; Peccianti, M; Chiru, O M; Vashkevich, V; Parisi, G; Cassone, R
2013-03-01
Heart sounds are a fundamental physiological variable that provide a unique insight into cardiac semiotics. However, a deterministic and unambiguous association between noises and cardiac dynamics is still far from being accomplished, owing to the many different overlapping events that contribute to the acoustic emission. Current computer-based capacities in terms of signal detection and processing allow one to move from standard cardiac auscultation, even in its improved forms such as electronic stethoscopes or hi-tech phonocardiography, to the extraction of previously unexplored information on cardiac activity. In this report, we present new equipment for the detection of heart sounds, based on a set of accelerometric sensors placed in contact with the chest skin over the precordial area, which are able to measure simultaneously the vibrations induced on the chest surface by the heart's mechanical activity. By utilizing advanced algorithms for data treatment, such as wavelet decomposition and principal component analysis, we are able to condense the spatially extended acoustic information and to provide a synthetic representation of heart activity. We applied our approach to 30 adults, mixed by gender, age and health status, and correlated our results with standard echocardiographic examinations. We obtained a 93% concordance rate with echocardiography in discriminating between healthy and unhealthy hearts, including minor abnormalities such as mitral valve prolapse.
Improved Principal Component Analysis for Anomaly Detection: Application to an Emergency Department
Harrou, Fouzi
2015-07-03
Monitoring of production systems, such as those in hospitals, is essential for ensuring the best management and maintaining the desired product quality. Detection of emergent abnormalities allows preemptive actions that can prevent more serious consequences. The principal component analysis (PCA)-based anomaly-detection approach has been used successfully for monitoring systems with highly correlated variables. However, conventional PCA-based detection indices, such as Hotelling's T2 and the Q statistics, are ill suited to detect small abnormalities because they use only information from the most recent observations. Other multivariate statistical metrics, such as the multivariate cumulative sum (MCUSUM) control scheme, are more suitable for detecting small anomalies. In this paper, a generic anomaly-detection scheme based on PCA is proposed to monitor demands to an emergency department. In this framework, the MCUSUM control chart is applied to the uncorrelated residuals obtained from the PCA model. The proposed PCA-based MCUSUM anomaly-detection strategy is successfully applied to practical data collected from the database of the pediatric emergency department in the Lille Regional Hospital Centre, France. The detection results show that the proposed method is more effective than conventional PCA-based anomaly-detection methods.
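The PCA-plus-CUSUM idea can be sketched as follows. This is a simplified one-sided CUSUM on the norm of the PCA residual, not the paper's multivariate MCUSUM; the demand data are simulated stand-ins, and the allowance k and threshold h are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical stand-in for daily demand counts in 6 correlated categories
# (the real study used pediatric emergency-department data).
train = rng.normal(size=(200, 6))
train[:, 1:] += 0.7 * train[:, :1]            # induce correlation

# PCA model fitted on in-control data; keep the first principal component.
mu, sd = train.mean(axis=0), train.std(axis=0)
Z = (train - mu) / sd
_, _, vt = np.linalg.svd(Z, full_matrices=False)
P = vt[:1].T                                   # retained loadings

def residual_norm(x):
    """Norm of the part of an observation unexplained by the PCA model."""
    z = (x - mu) / sd
    return np.linalg.norm(z - P @ (P.T @ z))

# One-sided CUSUM on the PCA residuals: small persistent shifts accumulate.
ref = np.mean([residual_norm(x) for x in train])   # in-control reference level
k, h, S, alarms = 0.5, 5.0, 0.0, []                # allowance and threshold
test_data = rng.normal(size=(60, 6))
test_data[30:, 3] += 4.0                           # persistent shift from t = 30
for t, x in enumerate(test_data):
    S = max(0.0, S + residual_norm(x) - ref - k)
    if S > h:
        alarms.append(t)
```

Because the shifted variable breaks the learned correlation structure, the shift lands mostly in the residual subspace and the CUSUM statistic accumulates until it crosses the threshold shortly after the change point.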
PRINCIPAL COMPONENT ANALYSIS OF FACTORS DETERMINING PHOSPHATE ROCK DISSOLUTION ON ACID SOILS
Directory of Open Access Journals (Sweden)
Yusdar Hilman
2016-10-01
Full Text Available Many of the agricultural soils in Indonesia are acidic and low in both total and available phosphorus, which severely limits their potential for crop production. These problems can be corrected by the application of chemical fertilizers. However, these fertilizers are expensive, and cheaper alternatives such as phosphate rock (PR) have been considered. Several soil factors may influence the dissolution of PR in soils, including both chemical and physical properties. The study aimed to identify PR dissolution factors and evaluate their relative magnitude. The experiment was conducted in the Soil Chemical Laboratory, Universiti Putra Malaysia, and the Indonesian Center for Agricultural Land Resources Research and Development from January to April 2002. Principal component analysis (PCA) was used to characterize acid soils in an incubation system into a number of factors that may affect PR dissolution. The three major factors selected were soil texture, soil acidity, and fertilization. Using the scores of the individual factors as independent variables, stepwise regression analysis was performed to derive a PR dissolution function. The factors influencing PR dissolution, in order of importance, were soil texture, soil acidity, and then fertilization. Soil texture factors, including clay content and organic C, and soil acidity factors, such as P retention capacity, interacted positively with P dissolution and promoted PR dissolution effectively. Soil texture factors such as sand and silt content, and soil acidity factors such as pH and exchangeable Ca, decreased PR dissolution.
The pls Package: Principal Component and Partial Least Squares Regression in R
Directory of Open Access Journals (Sweden)
Bjørn-Helge Mevik
2007-01-01
Full Text Available The pls package implements principal component regression (PCR) and partial least squares regression (PLSR) in R (R Development Core Team 2006b), and is freely available from the Comprehensive R Archive Network (CRAN), licensed under the GNU General Public License (GPL). The user interface is modelled after the traditional formula interface, as exemplified by lm. This was done so that people used to R would not have to learn yet another interface, and also because we believe the formula interface is a good way of working interactively with models. It thus has methods for generic functions like predict, update and coef. It also has more specialised functions like scores, loadings and RMSEP, and a flexible cross-validation system. Visual inspection and assessment is important in chemometrics, and the pls package has a number of plot functions for plotting scores, loadings, predictions, coefficients and RMSEP estimates. The package implements PCR and several algorithms for PLSR. The design is modular, so that it should be easy to use the underlying algorithms in other functions. It is our hope that the package will serve well both for interactive data analysis and as a building block for other functions or packages using PLSR or PCR. We will here describe the package and how it is used for data analysis, as well as how it can be used as a part of other packages. Also included is a section about formulas and data frames, for people not used to the R modelling idioms.
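The pls package itself is written in R; as a language-neutral illustration of what principal component regression computes, here is a hedged NumPy sketch (synthetic collinear data; `pcr_fit`/`pcr_predict` are illustrative names, not part of the pls API):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data with collinear predictors: the setting where PCR helps.
n, p = 100, 6
T = rng.normal(size=(n, 2))                 # two latent factors
X = T @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))
y = T[:, 0] - 2 * T[:, 1] + 0.1 * rng.normal(size=n)

def pcr_fit(X, y, k):
    """Principal component regression: regress y on the first k PC scores."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc = X - xm
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                            # p x k loadings
    scores = Xc @ P                         # n x k component scores
    gamma, *_ = np.linalg.lstsq(scores, y - ym, rcond=None)
    beta = P @ gamma                        # coefficients in predictor space
    return xm, ym, beta

def pcr_predict(model, Xnew):
    xm, ym, beta = model
    return ym + (Xnew - xm) @ beta

model = pcr_fit(X, y, k=2)
yhat = pcr_predict(model, X)
rmsep = np.sqrt(np.mean((y - yhat) ** 2))   # in-sample RMSEP analogue
```

In practice the number of components k would be chosen by cross-validation, which is what the package's RMSEP machinery automates.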
Determinants of Return on Assets in Romania: A Principal Component Analysis
Directory of Open Access Journals (Sweden)
Sorana Vatavu
2015-03-01
Full Text Available This paper examines the impact of capital structure, as well as its determinants, on the financial performance of Romanian companies listed on the Bucharest Stock Exchange. The analysis is based on cross-sectional regressions and factor analysis, and it refers to a ten-year period (2003-2012). Return on assets (ROA) is the performance proxy, while the capital structure indicator is the debt ratio. Regression results indicate that Romanian companies register higher returns when they operate with limited borrowings. Among the capital structure determinants, tangibility and business risk have a negative impact on ROA, but the level of taxation has a positive effect, showing that companies manage their assets more efficiently during times of higher fiscal pressure. Performance is sustained by sales turnover, but not significantly influenced by high levels of liquidity. Periods of unstable economic conditions, reflected by high inflation rates and the current financial crisis, have a strong negative impact on corporate performance. Based on the regression results, three factors were considered through the method of iterated principal component factors: the first incorporates debt and size, as an indicator of consumption; the second integrates the influence of tangibility and liquidity, marking the investment potential; and the third is an indicator of assessed risk, integrating the volatility of earnings with the level of taxation. ROA is significantly influenced by these three factors, regardless of the regression method used. The consumption factor has a negative impact on performance, while the investment and risk variables positively influence ROA.
Aerodynamic multi-objective integrated optimization based on principal component analysis
Directory of Open Access Journals (Sweden)
Jiangtao HUANG
2017-08-01
Full Text Available Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, which aims to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, the physical reasonableness and the reliability in dealing with redundant objectives of PCA are verified by the typical DTLZ5 test function and a multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into the aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and that the multi-point design requirements of the passenger aircraft are reached. The visualization level of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
Wu, Shuang; Wu, Hulin
2013-01-16
One of the fundamental problems in time course gene expression data analysis is to identify genes associated with a biological process or a particular stimulus of interest, like a treatment or virus infection. Most of the existing methods for this problem are designed for data with longitudinal replicates. But in reality, many time course gene experiments have no replicates or only have a small number of independent replicates. We focus on the case without replicates and propose a new method for identifying differentially expressed genes by incorporating the functional principal component analysis (FPCA) into a hypothesis testing framework. The data-driven eigenfunctions allow a flexible and parsimonious representation of time course gene expression trajectories, leaving more degrees of freedom for the inference compared to that using a prespecified basis. Moreover, the information of all genes is borrowed for individual gene inferences. The proposed approach turns out to be more powerful in identifying time course differentially expressed genes compared to the existing methods. The improved performance is demonstrated through simulation studies and a real data application to the Saccharomyces cerevisiae cell cycle data.
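On a common, regular time grid, FPCA reduces to an eigen-decomposition of the time-by-time covariance matrix; its eigenvectors play the role of the data-driven eigenfunctions mentioned above. A toy sketch of that reduction (synthetic trajectories, not the authors' testing procedure):

```python
import numpy as np

rng = np.random.default_rng(2)

# 50 trajectories observed on a common grid of 20 time points,
# generated from two smooth modes plus measurement noise.
t = np.linspace(0, 1, 20)
n = 50
scores_true = rng.normal(size=(n, 2)) * np.array([2.0, 0.8])
basis = np.vstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
curves = scores_true @ basis + 0.1 * rng.normal(size=(n, len(t)))

# FPCA on a regular grid: PCA of the time x time covariance matrix.
mean_curve = curves.mean(axis=0)
C = np.cov(curves - mean_curve, rowvar=False)
evals, evecs = np.linalg.eigh(C)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# Fraction of variance carried by the first two eigenfunctions.
fve = evals[:2].sum() / evals.sum()

# Parsimonious representation: each curve is reduced to two FPC scores.
fpc_scores = (curves - mean_curve) @ evecs[:, :2]
```

Two data-driven scores per curve suffice here, which is the parsimony the abstract exploits to leave more degrees of freedom for inference.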
Directory of Open Access Journals (Sweden)
Shaohui Foong
2016-08-01
Full Text Available In this paper, a novel magnetic field-based sensing system employing statistically optimized concurrent multiple sensor outputs for precise field-position association and localization is presented. This method capitalizes on the independence between simultaneous spatial field measurements at multiple locations to induce unique correspondences between field and position. This single-source multi-sensor configuration is able to achieve accurate and precise localization and tracking of translational motion without contact over large travel distances for feedback control. Principal component analysis (PCA) is used as a pseudo-linear filter to optimally reduce the dimensions of the multi-sensor output space for computationally efficient field-position mapping with artificial neural networks (ANNs). Numerical simulations are employed to investigate the effects of geometric parameters and Gaussian noise corruption on the PCA-assisted ANN mapping performance. Using a 9-sensor network, the sensing accuracy and closed-loop tracking performance of the proposed optimal field-based sensing system are experimentally evaluated on a linear actuator, with a significantly more expensive optical encoder as a comparison.
Sensory characterization of doda burfi (Indian milk cake) using Principal Component Analysis.
Chawla, Rekha; Patil, Girdhari Ramdas; Singh, Ashish Kumar
2014-03-01
Traditional sweetmeats of various countries hold great and promising scope for improvement, and in order to tap this potential, several companies and co-operative federations have started organized production. Doda burfi, a heat-desiccated and popular sweetmeat of northern India, is one of the region-specific, less familiar products of India. The typical sweetmeat is characterized by a caramelized and nutty flavour and a granular texture. The purpose of this study was to determine the close relationship among various sensory attributes of the product collected from renowned manufacturers located in four different cities and to characterize an overall acceptable product. Individuals from academia participated in a round-table discussion to generate descriptive terms related to colour and appearance, flavour and texture. Prior to sensory evaluation, the sensory panel was trained and briefed about the terminology used to judge the product, involving a descriptive intensity scale of 100 points for describing major sensory attributes. Results were analyzed using ANOVA and principal component analysis. The correlation table indicated a good degree of positive association between attributes such as glossy appearance, dark colour, caramelized and nutty flavour, and cohesive and chewy texture with the overall acceptability of the product.
State and group dynamics of world stock market by principal component analysis
Nobi, Ashadun; Lee, Jae Woo
2016-05-01
We study the dynamic interactions and structural changes by applying principal component analysis (PCA) to the cross-correlation coefficients of global financial indices in the years 1998-2012. The variances explained by the first PC increase with time and show a drastic change during the crisis. A sharp change in a PC coefficient implies a transition of market state, a situation which occurs frequently in the American and Asian indices. However, the European indices remain stable over time. Using the first two PC coefficients, we identify indices that are similar and more strongly correlated than the others. We observe that the European indices form a robust group over the observation period. The dynamics of the individual indices within the group increase in similarity with time, and the dynamics of indices are more similar during the crises. Furthermore, the group formation of indices changes position in two-dimensional space due to crises. Finally, after a financial crisis, the difference of PCs between the European and American indices narrows.
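The core computation described above — eigen-decomposition of the cross-correlation matrix of index returns, with the first PC measuring the strength of collective market motion — can be sketched as follows (synthetic returns with one common "market mode"; illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily returns: a common market mode plus idiosyncratic noise.
n_days, n_indices = 500, 10
market = rng.normal(size=(n_days, 1))
returns = 0.8 * market + 0.6 * rng.normal(size=(n_days, n_indices))

# Cross-correlation matrix and its eigen-decomposition.
corr = np.corrcoef(returns, rowvar=False)
evals, evecs = np.linalg.eigh(corr)
evals = evals[::-1]                        # descending order

# Variance explained by the first PC: the collective (market-wide) motion.
explained_first = evals[0] / n_indices
```

Tracking `explained_first` in a sliding time window is the kind of quantity whose jumps the abstract interprets as market-state transitions during crises.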
PRINCIPAL COMPONENT ANALYSIS AND CLUSTER ANALYSIS IN MULTIVARIATE ASSESSMENT OF WATER QUALITY
Directory of Open Access Journals (Sweden)
Elzbieta Radzka
2017-03-01
Full Text Available This paper deals with the use of multivariate methods in drinking water analysis. During a five-year project, from 2008 to 2012, selected chemical parameters in 11 water supply networks of the Siedlce County were studied. Throughout that period drinking water was of satisfactory quality, with only iron and manganese ions exceeding the limits (21 times and 12 times, respectively). In accordance with the results of cluster analysis, all water networks were put into three groups of different water quality. A high concentration of chlorides, sulphates, and manganese and a low concentration of copper and sodium was found in the water of Group 1 supply networks. The water in Group 2 had a high concentration of copper and sodium, and a low concentration of iron and sulphates. The water from Group 3 had a low concentration of chlorides and manganese, but a high concentration of fluorides. Using principal component analysis and cluster analysis, multivariate correlation between the studied parameters was determined, helping to put water supply networks into groups according to similar water quality.
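The two-step recipe above — PCA to summarize correlated chemistry parameters, then cluster analysis to group networks of similar quality — can be illustrated with a toy sketch (synthetic "chemistry" data for 11 networks; a plain Lloyd's k-means with farthest-point initialization stands in for whatever clustering implementation the authors used):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic water chemistry: 11 networks x 6 parameters, three quality regimes.
centers = np.array([[2, 1, 2, 0, 0, 1],
                    [0, 2, 0, 2, 1, 0],
                    [1, 0, 1, 1, 2, 2]], dtype=float)
labels_true = np.array([0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 2])
data = centers[labels_true] + 0.2 * rng.normal(size=(11, 6))

# Standardize and project onto the first two principal components.
Z = (data - data.mean(axis=0)) / data.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means; farthest-point init keeps the toy deterministic."""
    cent = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(-1) for c in cent], axis=0)
        cent.append(X[np.argmax(d)])
    cent = np.array(cent)
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - cent[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(lab == j):
                cent[j] = X[lab == j].mean(axis=0)
    return lab

groups = kmeans(scores, 3)
```

Clustering on PC scores rather than on the raw parameters removes the redundancy between correlated ions, which is the point of combining the two methods.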
CLASSIFICATION OF LIDAR DATA OVER BUILDING ROOFS USING K-MEANS AND PRINCIPAL COMPONENT ANALYSIS
Directory of Open Access Journals (Sweden)
Renato César dos Santos
Full Text Available The classification is an important step in the extraction of geometric primitives from LiDAR data. Normally, it is applied for the identification of points sampled on geometric primitives of interest. In the literature there are several studies that have explored the use of eigenvalues to classify LiDAR points into different classes or structures, such as corner, edge, and plane. However, in some works the classes are defined considering an ideal geometry, which can be affected by inadequate sampling and/or by the presence of noise when using real data. To overcome this limitation, this paper proposes the use of metrics based on eigenvalues and the k-means method to carry out the classification. The concept of principal component analysis is used to obtain the eigenvalues and the derived metrics, while k-means is applied to cluster the roof points into two classes: edge and non-edge. To evaluate the proposed method, four test areas with different levels of complexity were selected. From the qualitative and quantitative analyses, it could be concluded that the proposed classification procedure gave satisfactory results, with completeness and correctness above 92% for the non-edge class, and between 61% and 98% for the edge class.
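A minimal version of the eigenvalue-metric idea: run a local PCA over each point's neighborhood and cluster a derived metric into edge / non-edge. The sketch below (synthetic planar patch, the ratio λ2/λ1 of the local covariance as the single metric, and a 1-D 2-means split) is illustrative and not the paper's exact metric set:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy roof patch: a noisy planar 12 x 12 grid of LiDAR-like points.
xs, ys = np.meshgrid(np.arange(12), np.arange(12))
pts = np.column_stack([xs.ravel(), ys.ravel(),
                       0.02 * rng.normal(size=xs.size)]).astype(float)

def eigen_metric(points, i, radius=1.6):
    """lambda2/lambda1 of the local covariance: low on edges, near 1 inside."""
    nb = points[np.linalg.norm(points - points[i], axis=1) < radius]
    ev = np.sort(np.linalg.eigvalsh(np.cov(nb.T)))[::-1]
    return ev[1] / ev[0]

metric = np.array([eigen_metric(pts, i) for i in range(len(pts))])

# 1-D 2-means: iterate the threshold between the two cluster means.
thr = metric.mean()
for _ in range(20):
    lo, hi = metric[metric < thr], metric[metric >= thr]
    thr = 0.5 * (lo.mean() + hi.mean())
edge = metric < thr
```

Note one limitation visible even in this toy: the four grid corners have small but symmetric neighborhoods at this radius and mimic interior points, so only the non-corner border points are flagged — an example of the "ideal geometry vs. real sampling" issue the paper raises.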
Directory of Open Access Journals (Sweden)
Dongdong Song
2015-01-01
Full Text Available To predict the service life of polystyrene (PS) under an aggressive environment, the nondimensional expression Z was established from a data set of multiple properties of PS by principal component analysis (PCA). In this study, PS specimens were exposed to the tropical environment on the Xisha Islands in China for two years. Chromatic aberration, gloss, tensile strength, elongation at break, flexural strength, and impact strength were tested to evaluate the aging behavior of PS. Based on the different needs of industries, each of the multiple properties could be used to evaluate the service life of PS. However, selecting a single performance variation will inevitably hide some information about the entire aging process. Therefore, finding a comprehensive measure representing the overall aging performance of PS can be highly significant. Herein, PCA was applied to obtain a specific property (Z) which can represent all properties of PS. Z of PS degradation showed a slight decrease for the initial two months of exposure, after which it increased rapidly in the next eight months. Subsequently, a slower increase of the Z value was observed. From these variations of the Z value, three stages have been identified for the PS service life.
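Condensing several standardized aging properties into a single index via the first PC score is straightforward; a hedged sketch with synthetic exposure data (the six property columns and the exponential degradation curve are assumptions for illustration, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(6)

# Six measured properties over 24 months of exposure, all driven by one
# underlying degradation level plus measurement noise (toy data).
months = np.arange(24, dtype=float)
degradation = 1 - np.exp(-months / 8.0)
signs = np.array([1, -1, -1, 1, -1, 1], dtype=float)  # some properties fall
props = degradation[:, None] * signs + 0.05 * rng.normal(size=(24, 6))

# Standardize each property and take the first PC score as the index Z.
Zs = (props - props.mean(axis=0)) / props.std(axis=0)
_, S, Vt = np.linalg.svd(Zs, full_matrices=False)
Z_index = Zs @ Vt[0]

# Orient Z so that it increases with exposure time.
if np.corrcoef(Z_index, months)[0, 1] < 0:
    Z_index = -Z_index

explained = S[0] ** 2 / (S ** 2).sum()     # variance captured by the index
```

Because all properties share one degradation driver, the first component captures most of the variance, which is what justifies reading a single Z curve instead of six separate property curves.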
In-TFT-Array-Process Micro Defect Inspection Using Nonlinear Principal Component Analysis
Directory of Open Access Journals (Sweden)
Zhi-Hao Kang
2009-10-01
Full Text Available Defect inspection plays a critical role in thin film transistor liquid crystal display (TFT-LCD) manufacture, and has received much attention in the field of automatic optical inspection (AOI). Previously, most focus was put on the problems of macro-scale Mura-defect detection in the cell process, but it has recently been found that the defects which substantially influence the yield rate of LCD panels are actually those in the TFT array process, which is the first process in TFT-LCD manufacturing. Defect inspection in the TFT array process is therefore considered a difficult task. This paper presents a novel inspection scheme based on the kernel principal component analysis (KPCA) algorithm, which is a nonlinear version of the well-known PCA algorithm. The inspection scheme can not only detect the defects from the images captured from the surface of LCD panels, but also recognize the types of the detected defects automatically. Results, based on real images provided by an LCD manufacturer in Taiwan, indicate that the KPCA-based defect inspection scheme is able to achieve a defect detection rate of over 99% and a high defect classification rate of over 96% when the imbalanced support vector machine (ISVM) with 2-norm soft margin is employed as the classifier. More importantly, the inspection time is less than 1 s per input image.
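Kernel PCA replaces the covariance eigenproblem with an eigenproblem on a centered kernel matrix. A self-contained sketch on a classic linearly inseparable toy set — a central blob inside a ring, with an RBF kernel — rather than the paper's defect images:

```python
import numpy as np

rng = np.random.default_rng(7)

# A central blob surrounded by a ring: not separable by any linear projection.
n = 60
blob = 0.3 * rng.normal(size=(n, 2))
theta = rng.uniform(0, 2 * np.pi, n)
ring = np.column_stack([3 * np.cos(theta), 3 * np.sin(theta)])
ring += 0.05 * rng.normal(size=ring.shape)
X = np.vstack([blob, ring])
m = 2 * n

# RBF kernel matrix, centered in feature space.
gamma = 0.5
sq = ((X[:, None] - X[None]) ** 2).sum(-1)
K = np.exp(-gamma * sq)
J = np.ones((m, m)) / m
Kc = K - J @ K - K @ J + J @ K @ J

# Kernel principal components = eigenvectors of the centered kernel matrix.
evals, evecs = np.linalg.eigh(Kc)
pc1 = evecs[:, -1]                         # first kernel PC (largest eigenvalue)
```

The first kernel PC assigns the blob and the ring to opposite signs, whereas any linear PC projection of these points overlaps — the nonlinearity that KPCA contributes to the inspection scheme.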
Directory of Open Access Journals (Sweden)
Francisco Criado-Aldeanueva
2013-01-01
Full Text Available Two different paradigms of the Mediterranean Oscillation (MO) teleconnection index have been compared in this work: station-based definitions obtained by the difference of some climate variable between two selected points in the eastern and western basins (i.e., Algiers and Cairo, Gibraltar and Israel, Marseille and Jerusalem, or south France and the Levantine basin), and the principal component (PC) approach, in which the index is obtained as the time series of the first mode of normalised sea level pressure anomalies across the extended Mediterranean region. Interannual to interdecadal precipitation (P), evaporation (E), E-P, and net heat flux have been correlated with the different MO indices to compare their relative importance in the long-term variability of heat and freshwater budgets over the Mediterranean Sea. On an annual basis, the PC paradigm is the most effective tool to assess the effect of the large-scale atmospheric forcing in the Mediterranean Sea, because the station-based indices exhibit a very poor correlation with all climatic variables and only influence a reduced fraction of the basin. In winter, the station-based indices greatly improve their ability to represent the atmospheric forcing, and results are fairly independent of the paradigm used.
Directory of Open Access Journals (Sweden)
Ida Vajčnerová
2016-01-01
Full Text Available The objective of the paper is to explore possibilities of evaluating the quality of a tourist destination by means of principal components analysis (PCA) and cluster analysis. In the paper, both types of analysis are compared on the basis of the results they provide. The aim is to identify the advantages and limits of both methods and to provide methodological suggestions for their further use in tourism research. The analysis is based on primary data from a survey of customers' satisfaction with the key quality factors of a destination. The output of the two statistical methods is the creation of groups or clusters of quality factors that are similar in terms of respondents' evaluations, in order to facilitate the evaluation of the quality of tourist destinations. The results show that both tested methods can be used. The paper is elaborated in the frame of a wider research project aimed at developing a methodology for the quality evaluation of tourist destinations, especially in the context of customer satisfaction and loyalty.
Wang, Li; Liu, Hongzhi; Liu, Li; Wang, Qiang; Li, Shurong; Li, Qizhai
2017-03-01
Supervised principal component regression (SPCR) analysis was adopted to establish an evaluation model of peanut protein solubility. Sixty-six peanut varieties were analysed in the present study. Results showed there was an intimate correlation between protein solubility and the other indexes. At the 0.05 level, 11 indexes, namely crude fat, crude protein, total sugar, cystine, arginine, conarachin I, 37.5 kDa, 23.5 kDa, 15.5 kDa, protein extraction rate, and kernel ratio, were correlated with protein solubility and were extracted for establishing the SPCR model. At the 0.01 level, a simpler model was built between four indexes (crude protein, cystine, conarachin I, and 15.5 kDa) and protein solubility. Verification results showed that the coefficients between theoretical and experimental values were 0.815, indicating that the models could evaluate protein solubility effectively. The application of the models was more convenient and efficient than the traditional determination method. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hou, Ying; Jiang, Canping; Shukla, Abhinav A; Cramer, Steven M
2011-01-01
Protein A chromatography is widely employed for the capture and purification of antibodies and Fc-fusion proteins. Due to the high cost of protein A resins, there is a significant economic driving force for using these chromatographic materials for a large number of cycles. The maintenance of column performance over the resin lifetime is also a significant concern in large-scale manufacturing. In this work, several statistical methods are employed to develop a novel principal component analysis (PCA)-based tool for predicting protein A chromatographic column performance over time. A method is developed to carry out detection of column integrity failures before their occurrence without the need for a separate integrity test. In addition, analysis of various transitions in the chromatograms was also employed to develop PCA-based models to predict both subtle and general trends in real-time protein A column yield decay. The developed approach has significant potential for facilitating timely and improved decisions in large-scale chromatographic operations in line with the process analytical technology (PAT) guidance from the Food and Drug Administration (FDA). © 2010 Wiley Periodicals, Inc.
Multi-point accelerometric detection and principal component analysis of heart sounds
International Nuclear Information System (INIS)
De Panfilis, S; Peccianti, M; Chiru, O M; Moroni, C; Vashkevich, V; Parisi, G; Cassone, R
2013-01-01
Heart sounds are a fundamental physiological variable that provides a unique insight into cardiac semiotics. However, a deterministic and unambiguous association between sounds and events in cardiac dynamics is far from being accomplished yet, due to the many different overlapping events which contribute to the acoustic emission. Current computer-based capacities in terms of signal detection and processing allow one to move from standard cardiac auscultation, even in its improved forms like electronic stethoscopes or hi-tech phonocardiography, to the extraction of previously unexplored information on cardiac activity. In this report, we present new equipment for the detection of heart sounds, based on a set of accelerometric sensors placed in contact with the chest skin on the precordial area, which measure simultaneously the vibration induced on the chest surface by the heart's mechanical activity. By utilizing advanced algorithms for data treatment, such as wavelet decomposition and principal component analysis, we are able to condense the spatially extended acoustic information and to provide a synthetic representation of the heart activity. We applied our approach to 30 adults, mixed per gender, age and healthiness, and correlated our results with standard echocardiographic examinations. We obtained a 93% concordance rate with echocardiography between healthy and unhealthy hearts, including minor abnormalities such as mitral valve prolapse. (fast track communication)
A Principal Component Analysis/Fuzzy Comprehensive Evaluation for Rockburst Potential in Kimberlite
Pu, Yuanyuan; Apel, Derek; Xu, Huawei
2018-02-01
Kimberlite is an igneous rock which sometimes bears diamonds. Most of the diamonds mined in the world today are found in kimberlite ores. Burst potential in kimberlite has not been investigated, because kimberlite is mostly mined using open-pit methods, which pose very little threat of rock bursting. However, as the mining depth keeps increasing, mines convert to underground mining methods, which can pose a threat of rock bursting in kimberlite. This paper focuses on the burst potential of kimberlite at a diamond mine in northern Canada. A combined model using the methods of principal component analysis (PCA) and fuzzy comprehensive evaluation (FCE) is developed to process data from 12 different locations in kimberlite pipes. Based on the 12 calculated fuzzy evaluation vectors, 8 locations show a moderate burst potential, 2 locations show no burst potential, and 2 locations show strong and violent burst potential, respectively. Using statistical principles, a Mahalanobis distance is adopted to build a comprehensive fuzzy evaluation vector for the whole mine, and the final evaluation of burst potential is moderate, which is verified by a practical rockbursting situation at the mine site.
Directory of Open Access Journals (Sweden)
Tomáš Masák
2017-09-01
Full Text Available Principal component analysis (PCA) is a popular dimensionality reduction and data visualization method. Sparse PCA (SPCA) is its extensively studied and NP-hard-to-solve modification. In the past decade, many different algorithms were proposed to perform SPCA. We build upon the work of Zou et al. (2006), who recast the SPCA problem into the regression framework and proposed to induce sparsity with the l1 penalty. Instead, we propose to drop the l1 penalty and promote sparsity by re-weighting the l2-norm. Our algorithm thus consists mainly of solving weighted ridge regression problems. We show that the algorithm basically attempts to find a solution to a penalized least squares problem with a non-convex penalty that resembles the l0-norm more closely. We also apply the algorithm to analyze the voting records of the Chamber of Deputies of the Parliament of the Czech Republic. We show not only why SPCA is more appropriate to analyze this type of data, but we also discuss whether the variable selection property can be utilized as an additional piece of information, for example to create voting calculators automatically.
Trajectory modeling of gestational weight: A functional principal component analysis approach.
Directory of Open Access Journals (Sweden)
Menglu Che
Full Text Available Suboptimal gestational weight gain (GWG), which is linked to increased risk of adverse outcomes for a pregnant woman and her infant, is prevalent. In the study of a large cohort of Canadian pregnant women, our goals are to estimate the individual weight growth trajectory using sparsely collected bodyweight data, and to identify the factors affecting the weight change during pregnancy, such as prepregnancy body mass index (BMI), dietary intakes and physical activity. The first goal was achieved through functional principal component analysis (FPCA) by conditional expectation. For the second goal, we used linear regression with the total weight gain as the response variable. The trajectory modeling through FPCA had a significantly smaller root mean square error (RMSE) and improved adaptability compared with the classic nonlinear mixed-effect models, demonstrating a novel tool that can be used to facilitate real-time monitoring and interventions of GWG. Our regression analysis showed that prepregnancy BMI had a high predictive value for the weight changes during pregnancy, which agrees with the published weight gain guideline.
Dopico-García, M S; Fique, A; Guerra, L; Afonso, J M; Pereira, O; Valentão, P; Andrade, P B; Seabra, R M
2008-06-15
The phenolic profiles of 10 different varieties of red "Vinho Verde" grapes (Azal Tinto, Borraçal, Brancelho, Doçal, Espadeiro, Padeiro de Basto, Pedral, Rabo de ovelha, Verdelho and Vinhão), from Minho (Portugal), were studied. Nine flavonols, four phenolic acids, three flavan-3-ols, one stilbene and eight anthocyanins were determined. Malvidin-3-O-glucoside was the most abundant anthocyanin, while the main non-coloured compound was much more heterogeneous: catechin, epicatechin, myricetin-3-O-glucoside, quercetin-3-O-glucoside or syringetin-3-O-glucoside. Anthocyanin contents ranged from 42 to 97%. Principal component analysis (PCA) was applied to analyse the data and study the relations between the samples and their phenolic profiles. The anthocyanin profile proved to be a good marker to characterize the varieties, even considering different origins and harvests. "Vinhão" grapes showed anthocyanin levels up to twenty-four times higher than the rest of the samples, with 97% of these compounds.
International Nuclear Information System (INIS)
Araujo, Janeo Severino C. de; Dantas, Carlos Costa; Santos, Valdemir A. dos; Souza, Jose Edson G. de; Luna-Finkler, Christine L.
2009-01-01
The fluid dynamic behavior of the riser of a cold flow model of a Fluid Catalytic Cracking Unit (FCCU) was investigated. The experimental data were obtained by the nuclear technique of gamma transmission. A gamma source was placed diametrically opposite to a detector in a straight section of the riser. The gas-solid flow through the riser was monitored with a source of Americium-241, which allowed obtaining information on the axial solid concentration without flow disturbance and also identifying the dependence of this concentration profile on several independent variables. The MatLab® and Statistica® software were used. The Statistica tool employed was Principal Components Analysis (PCA), which consisted of organizing the data into two-dimensional matrices in order to extract relevant information about the importance of the independent variables on the axial solid concentration in a cold flow riser. The variables investigated were the mass flow rate of solid, the mass flow rate of gas, the pressure at the riser base and the relative height in the riser. The first two components reached about 98% of the accumulated percentage of explained variance. (author)
Szabo, J.K.; Fedriani, E.M.; Segovia-Gonzalez, M. M.; Astheimer, L.B.; Hooper, M.J.
2010-01-01
This paper introduces a new technique in ecology to analyze spatial and temporal variability in environmental variables. By using simple statistics, we explore the relations between abiotic and biotic variables that influence animal distributions. However, spatial and temporal variability in rainfall, a key variable in ecological studies, can cause difficulties for any basic model including time evolution. The study was of a landscape scale (three million square kilometers in eastern Australia), mainly over the period 1998-2004. We simultaneously considered qualitative spatial (soil and habitat types) and quantitative temporal (rainfall) variables in a Geographical Information System environment. In addition to some techniques commonly used in ecology, we applied a new method, Functional Principal Component Analysis, which proved to be very suitable for this case, as it explained more than 97% of the total variance of the rainfall data, providing us with substitute variables that are easier to manage and are even able to explain rainfall patterns. The main variable came from a habitat classification that showed strong correlations with rainfall values and soil types. © 2010 World Scientific Publishing Company.
Directory of Open Access Journals (Sweden)
Aline Gomes da Silva
2014-01-01
Full Text Available In the current context of climate change discussions, predictions of future scenarios of weather and climate are crucial for the generation of information of interest to the global community. Because the atmosphere is a chaotic system, errors in predictions of future scenarios are systematically observed. Therefore, numerous techniques have been tested in order to generate more reliable predictions, and two techniques have excelled in science: dynamic downscaling, through regional models, and ensemble prediction, combining different outputs of climate models through the arithmetic average, in other words, a postprocessing of the output data. Thus, this paper proposes a method for postprocessing the outputs of regional climate models. This method consists in using the statistical tool of multiple linear regression by principal components for combining different simulations obtained by dynamic downscaling with the regional climate model RegCM4. Tests for the Amazon and Northeast regions of Brazil (South America) showed that the method provided a more realistic prediction in terms of average daily rainfall for the analyzed period, when compared with the ensemble prediction made through the arithmetic average of the simulations. This method captured the extreme events (outliers) that the prediction by averaging missed. Data from the Tropical Rainfall Measuring Mission (TRMM) were used to evaluate the method.
Karmakar, Bibha; Kobyliansky, Eugene
2009-09-01
The objective of this study is to explore the nature of sex differences between two different sets of dermatoglyphic traits based on principal components in the Turkmenian population. Two categories of dermatoglyphic traits--22 usually studied quantitative traits and 42 variables of diversity and asymmetry--were analysed among 745 individuals (309 males and 436 females). Three principal components are very prominent in both sexes: the "digital pattern size factor" indicates the degree of universality, as found in earlier studies among different ethnic populations; the "intra-individual diversity factor" and the "bilateral asymmetry factor" are also similar to those in earlier studies, which suggests that genetic factors have more influence on these variables than environmental factors. These results strongly indicate that a common biological validity exists for the underlying principal component structures between two different sets of dermatoglyphic characters, and thus dermatoglyphic factors between two groups of variables can be used for sex discrimination in different populations.
Medina, José M.; Díaz, José A.
2013-05-01
We have applied principal component analysis to examine trial-to-trial variability of reflectances of automotive coatings that contain effect pigments. Reflectance databases were measured from different color batch productions using a multi-angle spectrophotometer. A method to classify the principal components was used based on the eigenvalue spectra. It was found that the eigenvalue spectra follow distinct power laws and depend on the detection angle. The scaling exponent provided an estimation of the correlation between reflectances and it was higher near specular reflection, suggesting a contribution from the deposition of effect pigments. Our findings indicate that principal component analysis can be a useful tool to classify different sources of spectral variability in color engineering.
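The eigenvalue-spectrum classification described above can be illustrated with synthetic data: a sketch assuming a covariance whose eigenvalues decay as a power law, with the scaling exponent recovered by a log-log fit. The dimensions and the exponent are hypothetical, not values from the reflectance study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a set of reflectance spectra: 2000 samples of
# a 40-dimensional signal whose covariance eigenvalues decay as a power
# law, lambda_i ~ i**(-2).
n_dim, n_obs, alpha_true = 40, 2000, 2.0
target = np.arange(1, n_dim + 1) ** -alpha_true
basis = np.linalg.qr(rng.normal(size=(n_dim, n_dim)))[0]   # random rotation
X = (rng.normal(size=(n_obs, n_dim)) * np.sqrt(target)) @ basis.T

# Eigenvalue spectrum of the sample covariance matrix (descending).
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]

# Estimate the scaling exponent from a log-log fit over the leading
# eigenvalues, where sampling noise is smallest.
idx = np.arange(1, 21)
slope, _ = np.polyfit(np.log(idx), np.log(eigvals[:20]), 1)
print("estimated scaling exponent:", round(-slope, 2))
```

In the study above, comparing such fitted exponents across detection angles is what links the spectral variability to the deposition of the effect pigments.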
Soares, Denise Paschoal; de Castro, Marcelo Peduzzi; Mendes, Emilia Assunção; Machado, Leandro
2016-12-01
The alterations in gait pattern of people with transfemoral amputation leave them more susceptible to musculoskeletal injury. Principal component analysis is a method that reduces the amount of gait data and allows analyzing the entire waveform. The objective was to use principal component analysis to compare the ground reaction force and center of pressure displacement waveforms obtained during gait between able-bodied subjects and both limbs of individuals with transfemoral amputation. This is a transversal study with a convenience sample. We used a force plate and a pressure plate to record the anterior-posterior, medial-lateral and vertical ground reaction forces, and the anterior-posterior and medial-lateral center of pressure positions, of 12 participants with transfemoral amputation and 20 able-bodied subjects during gait. Principal component analysis was performed to compare the gait waveforms between the participants with transfemoral amputation and the able-bodied individuals. The principal component analysis model explained between 74% and 93% of the data variance. Relevant portions were identified in all ground reaction force and center of pressure waveforms, and at least one principal component always presented scores that differed statistically (p < 0.05) between groups. The analysis was able to discriminate many portions of the stance phase between both lower limbs of people with transfemoral amputation and the able-bodied participants. Principal component analysis reduced the amount of data, allowed analyzing the whole waveform, and identified specific sub-phases of gait that differed between the groups. Therefore, this approach seems to be a powerful tool for gait evaluation and for following the rehabilitation status of people with transfemoral amputation. © The International Society for Prosthetics and Orthotics 2015.
2L-PCA: a two-level principal component analyzer for quantitative drug design and its applications.
Du, Qi-Shi; Wang, Shu-Qing; Xie, Neng-Zhong; Wang, Qing-Yan; Huang, Ri-Bo; Chou, Kuo-Chen
2017-09-19
A two-level principal component predictor (2L-PCA) was proposed based on the principal component analysis (PCA) approach. It can be used to quantitatively analyze various compounds and peptides with respect to their functions or their potential to become useful drugs. One level deals with the physicochemical properties of drug molecules, while the other deals with their structural fragments. The predictor has self-learning and feedback features that automatically improve its accuracy. It is anticipated that 2L-PCA will become a very useful tool for timely provision of useful clues during the process of drug development.
International Nuclear Information System (INIS)
Di Maria, Costanzo; Liu, Chengyu; Zheng, Dingchang; Murray, Alan; Langley, Philip
2014-01-01
This study presents a systematic comparison of different approaches to the automated selection of the principal components (PCs) which optimise the detection of maternal and fetal heart beats from non-invasive maternal abdominal recordings. A public database of 75 four-channel non-invasive maternal abdominal recordings was used for training the algorithm. Four methods were developed and assessed to determine the optimal PC: (1) power spectral distribution, (2) root mean square, (3) sample entropy, and (4) QRS template. The sensitivity of the algorithm's performance to large-amplitude noise removal (by wavelet de-noising) and to the maternal beat cancellation method was also assessed. The accuracy of maternal and fetal beat detection was assessed against reference annotations and quantified using the detection accuracy score F1 [2*PPV*Se / (PPV + Se)], sensitivity (Se), and positive predictive value (PPV). The best performing implementation was assessed on a test dataset of 100 recordings, and the agreement between the computed and the reference fetal heart rate (fHR) and fetal RR (fRR) time series was quantified. The best performance for detecting maternal beats (F1 99.3%, Se 99.0%, PPV 99.7%) was obtained when using the QRS template method to select the optimal maternal PC and applying wavelet de-noising. The best performance for detecting fetal beats (F1 89.8%, Se 89.3%, PPV 90.5%) was obtained when the optimal fetal PC was selected using the sample entropy method and a fixed-length time window was used for the cancellation of the maternal beats. The performance on the test dataset was 142.7 beats²/min² for fHR and 19.9 ms for fRR, ranking 14th and 17th, respectively (out of 29), when compared with the other algorithms presented at the PhysioNet Challenge 2013.
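The detection accuracy score used above is fully specified by the bracketed formula, the harmonic mean of sensitivity and positive predictive value. A one-line implementation reproduces the reported maternal-beat F1 from its Se and PPV:

```python
def f1_score(se: float, ppv: float) -> float:
    """Detection accuracy score: harmonic mean of Se and PPV."""
    return 2.0 * ppv * se / (ppv + se)

# Reported maternal-beat figures: Se 99.0%, PPV 99.7% -> F1 99.3%.
print(round(100 * f1_score(0.990, 0.997), 1))  # 99.3
```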
Nyaku, Seloame T; Kantety, Ramesh V; Cebert, Ernst; Lawrence, Kathy S; Honger, Joseph O; Sharma, Govind C
2016-04-01
U.S. cotton production is suffering from the yield loss caused by the reniform nematode (RN), Rotylenchulus reniformis. Management of this devastating pest is of utmost importance because no upland cotton cultivar exhibits adequate resistance to RN. Nine populations of RN from distinct regions in Alabama and one population from Mississippi were studied, and thirteen morphometric features were measured on 20 male and 20 female nematodes from each population. Highly positively correlated variables in female and male RN morphometric parameters were observed for body length (L) and distance of vulva from the lip region (V) (r = 0.7), and for tail length (TL) and c' (r = 0.8), respectively. The first and second principal components for the female and male populations showed distinct clustering into three groups. These results show a pattern of sub-groups within the RN populations in Alabama. A one-way ANOVA on female and male RN populations showed significant differences (p ≤ 0.05) among the variables. Multiple sequence alignment (MSA) of 421 18S rRNA sequences showed lengths of 653 bp. Sites within the aligned sequences were conserved (53%), parsimony-informative (17%), singletons (28%), and indels (2%), respectively. Neighbor-Joining analysis showed intra- and inter-nematodal variations within the populations, as clone sequences from different nematodes clustered together irrespective of the sex of the nematode isolate. Morphologically, the three groups (I, II and III) could not be distinctly associated with the molecular data from the 18S rRNA sequences. The three groups may be identified as being non-geographically contiguous.
Cole, Jacqueline M; Cheng, Xie; Payne, Michael C
2016-11-07
The use of principal component analysis (PCA) to statistically infer features of local structure from experimental pair distribution function (PDF) data is assessed on a case study of rare-earth phosphate glasses (REPGs). Such glasses, codoped with two rare-earth ions (R and R') of different sizes and optical properties, are of interest to the laser industry. The determination of structure-property relationships in these materials is an important aspect of their technological development. Yet, realizing the local structure of codoped REPGs presents significant challenges relative to their singly doped counterparts; specifically, R and R' are difficult to distinguish in terms of establishing relative material compositions, identifying atomic pairwise correlation profiles in a PDF that are associated with each ion, and resolving peak overlap of such profiles in PDFs. This study demonstrates that PCA can be employed to help overcome these structural complications, by statistically inferring trends in PDFs that exist for a restricted set of experimental data on REPGs, and using these as training data to predict material compositions and PDF profiles in unknown codoped REPGs. The application of these PCA methods to resolve individual atomic pairwise correlations in t(r) signatures is also presented. The training methods developed for these structural predictions are prevalidated by testing their ability to reproduce known physical phenomena, such as the lanthanide contraction, on PDF signatures of the structurally simpler singly doped REPGs. The intrinsic limitations of applying PCA to analyze PDFs relative to the quality control of source data, data processing, and sample definition, are also considered. While this case study is limited to lanthanide-doped REPGs, this type of statistical inference may easily be extended to other inorganic solid-state materials and be exploited in large-scale data-mining efforts that probe many t(r) functions.
Nomoto, Yohei; Yamashita, Kazuhiko; Ohya, Tetsuya; Koyama, Hironori; Kawasumi, Masashi
There is increasing societal concern with preventing falls among the aged. Improvements in aged people's lower-limb muscular strength, postural control and walking ability are important for quality of life and fall prevention. The aim of this study was to develop multiple evaluation methods in order to advise on the improvement and maintenance of lower-limb function in aged and young people. The subjects were 16 healthy young volunteers (mean ± S.D.: 19.9 ± 0.6 years) and 10 healthy aged volunteers (mean ± S.D.: 80.6 ± 6.1 years). Measurement items related to lower-limb function were selected from the items we have previously used: distance of extroversion of the toe, angle of flexion of the toe, maximum width of step, knee elevation, moving distance of the greater trochanter, walking balance, toe-gap force and rotation range of the ankle joint. The measurement items were summarized by principal component analysis into lower-limb ability evaluation methods covering walking ability, lower-limb muscle strength and ankle flexibility. The young group's assessment score was a factor of 1.6 greater than the aged group's for walking ability, a factor of 1.4 greater for lower-limb muscle strength, and a factor of 1.2 greater for ankle flexibility. The results suggest that it is possible to assess the lower-limb function of aged and young people numerically and to advise on their foot function.
Carcass and meat quality of light lambs using principal component analysis.
Cañeque, V; Pérez, C; Velasco, S; Díaz, M T; Lauzurica, S; Alvarez, I; Ruiz de Huidobro, F; Onega, E; De la Fuente, J
2004-08-01
Eighty-six male light lambs of the Manchego breed were used in this study. Principal component (PC) analysis was performed to study the relationships between carcass quality variables (n=22) and between meat quality measures (n=21). Carcass quality was assessed using objective and subjective measurements of conformation and fatness, besides the joint and tissue proportions of the leg. The measurements used to evaluate meat quality were pH in the longissimus dorsi and semitendinosus muscles, and colour, moisture, water holding capacity, cooking losses, texture and sensorial analysis on the longissimus dorsi. The first five PCs explained about 77% of the total variability for carcass measures, whereas the first eight PCs explained 74% of the total variability for meat quality. All the carcass measurements had similar weight in defining the first PC, whereas the muscle and bone proportions as well as the muscle:bone ratio of the leg were useful to define the second PC. The meat quality measures most effective in defining the first PC were the meat colour measurements, whereas the sensorial variables defined the second PC. The projection of the carcass quality data onto the first two PCs allowed a clear distinction between heavier carcasses (higher than 6.5 kg) and lighter carcasses (lower than 5.5 kg). The carcasses with a weight higher than 6.5 kg were on the left side of the figure, where the variables of conformation and fatness lie. The group of medium carcass weight was placed between the two previous groups. The projection of the meat quality data onto the first two PCs did not differ between hot carcass weights, although there was a trend: the lighter carcasses lay on the left side of the graph, which implies small differences in meat quality within this range of carcass weight.
Lee, Jiseon; Park, Junhee; Yang, Sejung; Kim, Hani; Choi, Yun Seo; Kim, Hyeon Jin; Lee, Hyang Woon; Lee, Byung-Uk
2017-01-01
The use of automatic electrical stimulation in response to early seizure detection has been introduced as a new treatment for intractable epilepsy. For the effective application of this method as a successful treatment, improving the accuracy of the early seizure detection is crucial. In this paper, we proposed the application of a frequency-based algorithm derived from principal component analysis (PCA), and demonstrated improved efficacy for early seizure detection in a pilocarpine-induced epilepsy rat model. A total of 100 ictal electroencephalographs (EEG) during spontaneous recurrent seizures from 11 epileptic rats were finally included for the analysis. PCA was applied to the covariance matrix of a conventional EEG frequency band signal. Two PCA results were compared: one from the initial segment of seizures (5 sec of seizure onset) and the other from the whole segment of seizures. In order to compare the accuracy, we obtained the specific threshold satisfying the target performance from the training set, and compared the False Positive (FP), False Negative (FN), and Latency (Lat) of the PCA based feature derived from the initial segment of seizures to the other six features in the testing set. The PCA based feature derived from the initial segment of seizures performed significantly better than other features with a 1.40% FP, zero FN, and 0.14 s Lat. These results demonstrated that the proposed frequency-based feature from PCA that captures the characteristics of the initial phase of seizure was effective for early detection of seizures. Experiments with rat ictal EEGs showed an improved early seizure detection rate with PCA applied to the covariance of the initial 5 s segment of visual seizure onset instead of using the whole seizure segment or other conventional frequency bands.
Aydin, T; Bayrak, N; Baran, E; Cakir, A
2017-08-01
The insecticidal effects of the dichloromethane, ethyl acetate, acetone, ethanol and methanol extracts of Humulus lupulus L. (hops) cones, and of its principal component, xanthohumol, were investigated on five stored-product pests: Sitophilus granarius (L.), Sitophilus oryzae (L.), Acanthoscelides obtectus (Say.), Tribolium castaneum (Herbst) and Lasioderma serricorne (F.). The mortality of adult insects treated with 2.5, 5, 10 and 20 mg ml⁻¹ concentrations of the extracts and xanthohumol was counted after 24, 48, 72, 96 and 120 h. To determine the toxic effects of the substances against all tested insects, the durations for 50% mortality of the adults and the LD50 values were also determined in the first 48 h by probit analysis. Our results showed that xanthohumol was more toxic against the pests than the extract applications; LD50 values for xanthohumol were low compared with those of the extracts. Xanthohumol was most toxic against S. granarius (L.), with an LD50 value of 6.8 µg. Among the extracts, the methanol extract was less effective than the other extracts against all tested insects, and the ethyl acetate extract of H. lupulus cones was the most effective against the tested pests. The amounts of xanthohumol in the extracts were quantified using high-performance liquid chromatography. The quantitative data indicated that the amount of xanthohumol in the extracts increased with the increase in polarity of the solvents used, from methanol to dichloromethane. The methanol extract contained the highest amount of xanthohumol, with 5.74 g/100 g extract (0.46 g/100 g plant sample).
Taguchi, Y-H
2016-01-01
The recently proposed principal component analysis (PCA) based unsupervised feature extraction (FE) has been applied successfully to various bioinformatics problems, ranging from biomarker identification to the screening of disease-causing genes, using gene expression/epigenetic profiles. However, the conditions required for its successful use, and the mechanisms by which it outperforms supervised methods, are unknown, because PCA based unsupervised FE has only been applied to challenging (i.e., not well known) problems. In this study, PCA based unsupervised FE was applied to an extensively studied organism, budding yeast. When applied to two gene expression profiles expected to be temporally periodic, the yeast metabolic cycle (YMC) and the yeast cell division cycle (YCDC), PCA based unsupervised FE outperformed a simple but powerful conventional method, sinusoidal fitting, in several respects: (i) feasible biological term enrichment without assuming periodicity for YMC; (ii) identification of periodic profiles whose period was half as long as the cell division cycle for YMC; and (iii) the identification of no more than 37 genes associated with the enrichment of biological terms related to the cell division cycle in the integrated analysis of seven YCDC profiles, for which sinusoidal fitting failed. The explanation for the differences between the methods, and the necessary conditions for success, were determined by comparing PCA based unsupervised FE with fittings to various periodic (artificial, thus pre-defined) profiles. Furthermore, four popular unsupervised clustering algorithms applied to YMC were not as successful as PCA based unsupervised FE. PCA based unsupervised FE is thus a useful and effective unsupervised method for investigating YMC and YCDC. This study identified why this unsupervised method, which requires no pre-judged criteria, outperformed supervised methods requiring human-defined criteria.
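The contrast drawn above, that PCA needs no assumed period while sinusoidal fitting does, can be illustrated on synthetic expression profiles. The gene counts, amplitude, and period below are hypothetical, not values from the yeast datasets.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical expression matrix: 1000 genes x 24 time points; 100 genes
# oscillate with a common period that PCA is never told about.
t = np.arange(24)
oscillation = np.sin(2 * np.pi * t / 8.0)
X = rng.normal(size=(1000, 24))
X[:100] += 3.0 * np.sign(rng.normal(size=(100, 1))) * oscillation

# PCA across genes: components live in time-point space.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]

# PC1 recovers the hidden oscillation, and gene scores along it single
# out the periodic genes -- no period or phase was assumed.
alignment = abs(np.corrcoef(pc1, oscillation)[0, 1])
top100 = np.argsort(np.abs(Xc @ pc1))[::-1][:100]
hit_rate = float(np.mean(top100 < 100))
print("corr(PC1, oscillation):", round(float(alignment), 2), " hit rate:", hit_rate)
```

A sinusoidal fit would need the period (here 8) as an input, whereas the leading component discovers the shared temporal pattern directly from the data.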
Shaffer, John R; Feingold, Eleanor; Wang, Xiaojing; Tcuenco, Karen T; Weeks, Daniel E; DeSensi, Rebecca S; Polk, Deborah E; Wendell, Steve; Weyant, Robert J; Crout, Richard; McNeil, Daniel W; Marazita, Mary L
2012-03-09
Dental caries is the result of a complex interplay among environmental, behavioral, and genetic factors, with distinct patterns of decay likely due to specific etiologies. Therefore, global measures of decay, such as the DMFS index, may not be optimal for identifying risk factors that manifest as specific decay patterns, especially if the risk factors such as genetic susceptibility loci have small individual effects. We used two methods to extract patterns of decay from surface-level caries data in order to generate novel phenotypes with which to explore the genetic regulation of caries. The 128 tooth surfaces of the permanent dentition were scored as carious or not by intra-oral examination for 1,068 participants aged 18 to 75 years from 664 biological families. Principal components analysis (PCA) and factor analysis (FA), two methods of identifying underlying patterns without a priori surface classifications, were applied to our data. The three strongest caries patterns identified by PCA recaptured variation represented by DMFS index (correlation, r = 0.97), pit and fissure surface caries (r = 0.95), and smooth surface caries (r = 0.89). However, together, these three patterns explained only 37% of the variability in the data, indicating that a priori caries measures are insufficient for fully quantifying caries variation. In comparison, the first pattern identified by FA was strongly correlated with pit and fissure surface caries (r = 0.81), but other identified patterns, including a second pattern representing caries of the maxillary incisors, were not representative of any previously defined caries indices. Some patterns identified by PCA and FA were heritable (h² = 30-65%, p = 0.043-0.006), whereas other patterns were not, indicating both genetic and non-genetic etiologies of individual decay patterns. This study demonstrates the use of decay patterns as novel phenotypes to assist in understanding the multifactorial nature of dental caries.
Directory of Open Access Journals (Sweden)
Badaruddoza
2015-09-01
Full Text Available The current study sought to determine significant cardiovascular risk factors through principal component factor analysis (PCFA) among three generations, on 1827 individuals: 911 males (378 from the offspring, 439 from the parental and 94 from the grandparental generation) and 916 females (261 from the offspring, 515 from the parental and 140 from the grandparental generation). The study performed PCFA with orthogonal rotation to reduce 12 inter-correlated variables into groups of independent factors. The factors identified numbered 2 for male grandparents, 3 each for male offspring, female parents and female grandparents, 4 for male parents and 5 for female offspring. This data reduction method identified factors that explained 72%, 84%, 79%, 69%, 70% and 73% of the variation in the original quantitative traits for male and female offspring, male and female parents, and male and female grandparents, respectively. Factor 1, accounting for the largest portion of the variation, was strongly loaded with traits related to obesity (body mass index (BMI), waist circumference (WC), waist to hip ratio (WHR), and skinfold thickness) in all generations and both sexes; obesity is known to be an independent predictor of cardiovascular morbidity and mortality. The next largest components, factor 2 and factor 3, reflected blood pressure phenotypes in almost all generations; in the male offspring generation, however, factor 2 was loaded with blood pressure phenotypes as well as obesity. This study not only confirmed but also extended prior work by developing a cumulative risk scale from factor scores. To date, such a cumulative and extensive scale has not been used in any Indian study of individuals across three generations. These findings highlight the importance of a global approach for assessing cardiovascular risk and the need for studies that elucidate how these different cardiovascular risk factors…
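A minimal sketch of PCFA as used above, assuming the common recipe of PCA loadings followed by orthogonal varimax rotation. The data, factor structure, and variable roles are synthetic stand-ins for the obesity and blood-pressure phenotypes, and the varimax routine is a generic implementation, not the authors' software.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Orthogonal varimax rotation of a p x k loading matrix (Kaiser)."""
    p, k = loadings.shape
    R = np.eye(k)
    crit = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        grad = loadings.T @ (L ** 3 - L @ np.diag((L ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt
        if s.sum() - crit < tol:
            break
        crit = s.sum()
    return loadings @ R

rng = np.random.default_rng(2)
n = 400
f1 = rng.normal(size=n)                                      # "obesity" factor
f2 = 0.3 * f1 + np.sqrt(1 - 0.3 ** 2) * rng.normal(size=n)   # "blood pressure" factor
noise = lambda: 0.3 * rng.normal(size=n)
X = np.column_stack([f1 + noise(), f1 + noise(), f1 + noise(),
                     f2 + noise(), f2 + noise()])  # BMI-, WC-, skinfold-, SBP-, DBP-like

# PCA loadings from the correlation matrix, then varimax rotation.
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(X, rowvar=False))
order = np.argsort(eigvals)[::-1][:2]
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])
rotated = varimax(loadings)

# After rotation each variable loads mainly on one factor (simple structure).
print(np.round(np.abs(rotated), 2))
```

The rotation leaves each variable's communality unchanged while concentrating its loading on a single factor, which is what makes the "obesity factor" and "blood pressure factor" labels readable.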
Processing of spectral X-ray data with principal components analysis
Butler, A P H; Cook, N J; Butzer, J; Schleich, N; Tlustos, L; Scott, N; Grasset, R; de Ruiter, N; Anderson, N G
2011-01-01
The goal of the work was to develop a general method for processing spectral X-ray image data. Principal component analysis (PCA) is a well-understood technique for multivariate data analysis and so was investigated. To assess the method, spectral (multi-energy) computed tomography (CT) data were obtained using a Medipix2 detector in a MARS-CT (Medipix All Resolution System). PCA was able to separate bone (calcium) from two elements with k-edges in the X-ray spectrum used (iodine and barium) within a mouse. This has potential clinical application in dual-energy CT systems and in future Medipix3-based spectral imaging, where up to eight energies can be recorded simultaneously with excellent energy resolution. (c) 2010 Elsevier B.V. All rights reserved.
Płonka, Agnieszka; Fichtner, Andreas
2017-04-01
Lateral density variations are the source of mass transport in the Earth at all scales, acting as drivers of convective motion. However, the density structure of the Earth remains largely unknown, since classic seismic observables and gravity provide only weak constraints with strong trade-offs. Current density models are therefore often based on velocity scaling, making strong assumptions about the origin of structural heterogeneities, which may not necessarily be correct. Our goal is to assess whether 3D density structure may be resolvable with emerging full-waveform inversion techniques. We have previously quantified the impact of regional-scale crustal density structure on seismic waveforms, concluding that reasonably sized density variations within the crust can leave a strong imprint on both travel times and amplitudes; while this can produce significant biases in velocity and Q estimates, seismic waveform inversion for density may become feasible. In this study we perform principal component analyses of sensitivity kernels for P velocity, S velocity, and density. This is intended to establish the extent to which these kernels are linearly independent, i.e. the extent to which the different parameters may be constrained independently. We apply the method to data from 81 events around the Iberian Peninsula, recorded in total by 492 stations. The objective is to find a principal kernel which would maximize the sensitivity to density, potentially allowing for density resolution that is as independent as possible. We find that surface (mostly Rayleigh) waves have significant sensitivity to density, and that the trade-off with velocity is negligible. We also show preliminary results of the inversion.
Directory of Open Access Journals (Sweden)
Mohammad Kashkoei Jahroomi
2016-07-01
Full Text Available Introduction: In remote sensing studies, especially those in which multi-spectral image data are used (i.e., Landsat-7 Enhanced Thematic Mapper), various statistical methods are often applied for image enhancement and feature extraction (Reddy, 2008). Principal component analysis is a multivariate statistical technique frequently used in multidimensional data analysis. The method attempts to extract and place the spectral information into a smaller set of new components that are more interpretable. However, the results obtained from this method are not so straightforward and require somewhat sophisticated techniques to interpret (Drury, 2001). In this paper we present a new approach for mapping hydrothermal alteration by analyzing and selecting the principal components extracted through processing of Landsat ETM+ images. The study area is located in a mountainous region of southern Kerman. Geologically, it lies in the volcanic belt of central Iran adjacent to the Gogher-Baft ophiolite zone. The region is highly altered, with sericitic and propylitic alteration well developed and argillic alteration limited (Jafari, 2009; Masumi and Ranjbar, 2011). Multispectral data of Landsat ETM+ (path 181, row 34) were used in this study. In these images the color composites of Band 7, Band 4 and Band 1 in RGB indicate the lithology outcropping in the study area. The principal component analysis (PCA) of image data is often implemented computationally in three steps: (1) calculation of the variance-covariance matrix or correlation matrix of the satellite sensor data; (2) computation of the eigenvalues and eigenvectors of the variance-covariance matrix or correlation matrix; and (3) linear transformation of the image data using the coefficients of the eigenvector matrix. Results: By applying PCA to the spectral data, 6 principal components were extracted from the data set according to the eigenvectors obtained. In the PCA matrix, the eigen…
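The three computational steps listed in the abstract can be sketched directly in code. The six-band image below is a synthetic stand-in for ETM+ data, and the band gains are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in for a 6-band multispectral image: 100 x 100 pixels
# whose bands share one strongly correlated component, as in TM/ETM+ data.
pixels = 100 * 100
base = rng.normal(size=(pixels, 1))
weights = np.linspace(0.5, 1.5, 6).reshape(1, 6)        # hypothetical band gains
bands = base @ weights + 0.2 * rng.normal(size=(pixels, 6))

# (1) Variance-covariance matrix of the band data.
cov = np.cov(bands, rowvar=False)

# (2) Eigenvalues and eigenvectors, sorted by decreasing variance.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# (3) Linear transformation of the band data with the eigenvector matrix.
pcs = (bands - bands.mean(axis=0)) @ eigvecs

print("variance explained by PC1:", round(float(eigvals[0] / eigvals.sum()), 3))
```

Because the bands are strongly inter-correlated, the first component absorbs the shared albedo/brightness signal, and alteration mapping then proceeds by inspecting the eigenvector coefficients of the later components.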
DEFF Research Database (Denmark)
Ruiz, Magda; Sin, Gürkan; Berjaga, Xavier
2011-01-01
The main idea of this paper is to develop a methodology for process monitoring, fault detection and predictive diagnosis of a WasteWater Treatment Plant (WWTP). To achieve this goal, a combination of Multiway Principal Component Analysis (MPCA) and Case-Based Reasoning (CBR) is proposed. First...
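A minimal sketch of the MPCA step, assuming the usual batch-wise unfolding of three-way data (batches × variables × time) followed by ordinary PCA. The plant data and dimensions are hypothetical, not the WWTP dataset.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical batch records: 30 batches x 5 process variables x 60 samples,
# sharing a common trajectory plus random variation.
data = rng.normal(size=(30, 5, 60)) + np.linspace(0.0, 1.0, 60)

# MPCA: batch-wise unfolding to (batches, variables * time), then PCA.
X = data.reshape(30, -1)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
scores = U[:, :k] * s[:k]              # per-batch scores (T2-type monitoring)
residual = Xc - scores @ Vt[:k]        # residual subspace (SPE/Q monitoring)
spe = (residual ** 2).sum(axis=1)

print("score matrix:", scores.shape, " mean SPE:", round(float(spe.mean()), 1))
```

A new batch would be unfolded the same way, projected with `Vt[:k]`, and flagged as faulty when its scores or SPE exceed control limits estimated from the training batches; the case-based reasoning stage then diagnoses the flagged batch.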
Improving Cross-Day EEG-Based Emotion Classification Using Robust Principal Component Analysis
Directory of Open Access Journals (Sweden)
Yuan-Pin Lin
2017-07-01
Full Text Available Constructing a robust emotion-aware analytical framework using non-invasively recorded electroencephalogram (EEG) signals has gained intensive attention nowadays. However, in moving from laboratory-oriented proof-of-concept studies toward real-world applications, researchers face an ecological challenge: the EEG patterns recorded in real life change substantially across days (i.e., day-to-day variability), arguably making a pre-defined predictive model vulnerable to the EEG signals of a separate day. The present work addressed how to mitigate the inter-day EEG variability of emotional responses in order to facilitate cross-day emotion classification, an issue little studied in the literature. This study proposed a robust principal component analysis (RPCA) based signal-filtering strategy and validated its neurophysiological validity and machine-learning practicability on a binary emotion classification task (happiness vs. sadness) using a five-day EEG dataset of 12 subjects who participated in a music-listening task. The empirical results showed that the RPCA-decomposed sparse signals (RPCA-S) enabled filtering off the background EEG activity that contributed most to the inter-day variability, and predominantly captured the EEG oscillations of emotional responses that behaved relatively consistently across days. Applying a realistic add-day-in classification validation scheme, the RPCA-S progressively exploited more informative features (from 12.67 ± 5.99 to 20.83 ± 7.18) and improved the cross-day binary emotion-classification accuracy (from 58.31 ± 12.33% to 64.03 ± 8.40%) as the EEG signals from one to four recording days were used for training and tested against one unseen subsequent day. The original EEG features (prior to RPCA processing) neither achieved cross-day classification (the accuracy was around chance level) nor replicated the encouraging improvement, owing to the inter-day EEG variability. This result…
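A sketch of the RPCA decomposition used above, assuming the standard principal component pursuit model (low-rank background plus sparse signal) solved by a fixed-penalty ADMM loop. This is an illustration on synthetic data, not the authors' EEG pipeline, and the penalty choice is the common default rather than a tuned value.

```python
import numpy as np

def shrink(X, tau):
    """Soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca(M, n_iter=500):
    """Split M into low-rank L plus sparse S (principal component pursuit)."""
    lam = 1.0 / np.sqrt(max(M.shape))
    mu = M.size / (4.0 * np.abs(M).sum())       # common penalty choice
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(n_iter):
        # Singular-value thresholding step for the low-rank part.
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * shrink(s, 1.0 / mu)) @ Vt
        # Entry-wise shrinkage for the sparse part, then dual update.
        S = shrink(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)
    return L, S

rng = np.random.default_rng(5)
A = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 60))   # low-rank "background"
spikes = 10.0 * np.sign(rng.normal(size=(60, 60)))
spikes *= rng.random((60, 60)) < 0.05                     # 5% sparse "events"
L, S = rpca(A + spikes)

err_L = np.linalg.norm(L - A) / np.linalg.norm(A)
print("relative error of low-rank recovery:", round(float(err_L), 3))
```

In the study's terms, the low-rank part plays the role of the slowly varying background activity and the sparse part (RPCA-S) carries the event-related oscillations kept for classification.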
Directory of Open Access Journals (Sweden)
Y-h Taguchi
Full Text Available The discovery and characterization of blood-based disease biomarkers are clinically important because blood collection is easy and involves relatively little stress for the patient. However, blood generally reflects not only targeted diseases, but also the whole-body status of patients. Thus, the selection of biomarkers may be difficult. In this study, we considered miRNAs as biomarker candidates for several reasons. First, since miRNAs were discovered relatively recently, they have not yet been tested extensively. Second, since the number of miRNAs is relatively limited, selection is expected to be easy. Third, since they are known to play critical roles in a wide range of biological processes, their expression may be disease specific. We applied a newly proposed method to select combinations of miRNAs that discriminate between healthy controls and each of 14 diseases, including 5 cancers. The new feature selection method is based on principal component analysis; notably, it does not require knowledge of whether each sample was derived from a disease patient or a healthy control. Using this method, we found that hsa-miR-425, hsa-miR-15b, hsa-miR-185, hsa-miR-92a, hsa-miR-140-3p, hsa-miR-320a, hsa-miR-486-5p, hsa-miR-16, hsa-miR-191, hsa-miR-106b, hsa-miR-19b, and hsa-miR-30d were potential biomarkers; combinations of 10 of these miRNAs allowed us to discriminate each disease included in this study from healthy controls. These 12 miRNAs are significantly up- or downregulated in most cancers and other diseases, albeit in a cancer- or disease-specific combinatory manner; accordingly, these 12 miRNAs have also previously been reported as cancer- and disease-related miRNAs. Many disease-specific KEGG pathways were also significantly enriched by target genes of up-/downregulated miRNAs within several combinations of 10 miRNAs among these 12 miRNAs. We also selected miRNAs that could discriminate one disease from another or from healthy controls.
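The unsupervised, PCA-based flavor of feature selection described above can be illustrated in a few lines of numpy: features are scored by their squared loadings on the leading principal components, with no class labels involved. This is a generic sketch on synthetic data, not the paper's exact method.

```python
import numpy as np

def pca_feature_scores(X, n_pc):
    """Score each feature by its total squared loading on the top n_pc
    principal components; no class labels are needed (unsupervised)."""
    Xc = X - X.mean(axis=0)                      # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_pc]                         # (n_pc, n_features)
    return (loadings ** 2).sum(axis=0)           # contribution per feature

# synthetic demo: 3 high-variance features among 7 near-constant ones
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(0.0, 3.0, size=(50, 3)),
               rng.normal(0.0, 0.1, size=(50, 7))])
scores = pca_feature_scores(X, n_pc=3)
top = np.argsort(scores)[::-1][:3]               # best candidate features
```

The features with the largest scores (here, the three informative columns) would correspond to the candidate biomarker miRNAs.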
Kernel Principal Component Analysis for dimensionality reduction in fMRI-based diagnosis of ADHD
Directory of Open Access Journals (Sweden)
Gagan S Sidhu
2012-11-01
Full Text Available This article explores various preprocessing tools that select/create features to help a learner produce a classifier that can use fMRI data to effectively discriminate Attention-Deficit Hyperactivity Disorder (ADHD) patients from healthy controls. We consider four different learning tasks: predicting either two classes (ADHD vs. control) or three classes (ADHD-1 vs. ADHD-3 vs. control), where each uses either the imaging data only, or the phenotypic and imaging data. After averaging, BOLD-signal normalization, and masking of the fMRI images, we considered applying the Fast Fourier Transform (FFT), possibly followed by some Principal Component Analysis (PCA) variant (over time: PCA-t; over space and time: PCA-st) or the kernelized variant, kPCA-st, to produce inputs to a learner, in order to determine which learned classifier performs best, or at least better than the baseline of 64.2%, which is the proportion of the majority class (here, controls). In the two-class setting, PCA-t and PCA-st did not perform statistically better than baseline, whereas FFT and kPCA-st did (FFT, 68.4%; kPCA-st, 70.3%); when combined with the phenotypic data, which by itself produces 72.9% accuracy, all methods performed statistically better than the baseline, but none did better than using the phenotypic data alone. In the three-class setting, neither the PCA variants nor the phenotypic data classifiers performed statistically better than the baseline. We next used the FFT output as input to the PCA variants. In the two-class setting, the PCA variants performed statistically better than the baseline using either the FFTed waveforms only (FFT+PCA-t, 69.6%; FFT+PCA-st, 69.3%; FFT+kPCA-st, 68.7%) or combining them with the phenotypic data (FFT+PCA-t, 70.6%; FFT+PCA-st, 70.6%; kPCA-st, 76%). In both settings, combining FFT+kPCA-st's features with the phenotypic data was better than using only the phenotypic data, with the result in the two-class setting being statistically better.
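A minimal numpy implementation of the kernelized PCA variant mentioned above (RBF kernel, with the required centering of the kernel matrix in feature space) might look as follows; the concentric-ring data are synthetic and purely illustrative.

```python
import numpy as np

def rbf_kernel_pca(X, n_components, gamma):
    """Kernel PCA with an RBF kernel in plain numpy: build the kernel
    matrix, center it in feature space, and project onto the leading
    eigenvectors."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. dists
    K = np.exp(-gamma * sq)
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one            # double centering
    vals, vecs = np.linalg.eigh(Kc)                       # ascending order
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                                    # projected samples

# synthetic demo: two concentric rings (not linearly separable)
rng = np.random.default_rng(1)
t = rng.uniform(0.0, 2.0 * np.pi, 100)
r = np.r_[np.full(50, 0.3), np.full(50, 1.0)]
X = np.c_[r * np.cos(t), r * np.sin(t)]
Z = rbf_kernel_pca(X, n_components=2, gamma=5.0)
```

The projected components are mutually orthogonal by construction; in the fMRI setting the rows of X would be (much higher-dimensional) voxel-time features rather than 2-D points.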
Xiangdong, Gao; Qian, Wen
2013-12-01
There exists plenty of welding quality information in the molten pool during high-power fiber laser welding. An approach for monitoring the high-power fiber laser welding status based on principal component analysis (PCA) of the molten pool configuration is investigated. An infrared-sensitive high-speed camera was used to capture the molten pool images during laser butt-joint welding of Type 304 austenitic stainless steel plates with a high-power (10 kW) continuous wave fiber laser. In order to study the relationship between the molten pool configuration and the welding status, a new method based on PCA is proposed to analyze welding stability by comparing the situation when the laser beam spot moves along the weld seam with the situation when it deviates from it. Image processing techniques were applied to process the molten pool images and extract five characteristic parameters. Moreover, the PCA method was used to extract a composite indicator, a linear combination of the five original characteristics, to analyze the different statuses during welding. Experimental results showed that the extracted composite indicator had a close relationship with the actual welding results and could be used to evaluate the status of high-power fiber laser welding, providing a theoretical basis for the monitoring of laser welding quality.
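A composite indicator formed as the first-principal-component combination of several extracted characteristics, as described above, can be sketched like this; the five correlated "molten pool" features are synthetic stand-ins driven by one common factor.

```python
import numpy as np

def first_pc_indicator(X):
    """Standardize the features and project them onto the first principal
    component, giving one composite indicator value per observation."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = np.cov(Z, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    w = vecs[:, np.argmax(vals)]        # PC1 loading vector
    return Z @ w, w

# synthetic demo: five correlated features sharing one underlying driver
rng = np.random.default_rng(2)
driver = rng.normal(size=200)
X = np.stack([driver + 0.3 * rng.normal(size=200) for _ in range(5)], axis=1)
indicator, w = first_pc_indicator(X)
```

The indicator tracks the shared variation of the five features (up to an arbitrary sign), which is what makes it usable as a single monitoring signal per frame.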
Paolucci, Enrico; Lunedei, Enrico; Albarello, Dario
2017-10-01
In this work, we propose a procedure based on principal component analysis of data sets consisting of many horizontal-to-vertical spectral ratio (HVSR or H/V) curves obtained from single-station ambient vibration acquisitions. This analysis aims at the seismic characterization of the investigated area by identifying sites characterized by similar HVSR curves. It also makes it possible to extract the typical HVSR patterns of the explored area and to establish their relative importance, providing an estimate of the level of heterogeneity from the seismic point of view. In this way, an automatic explorative seismic characterization of the area becomes possible by considering ambient vibration data only. This also implies that the relevant outcomes can be safely compared with other available information (geological data, borehole measurements, etc.) without any conceptual trade-off. The whole algorithm is remarkably fast: on a common personal computer, processing takes a few seconds for a data set including 100-200 HVSR measurements. The procedure has been tested in three study areas in Central-Northern Italy characterized by different geological settings. The outcomes demonstrate that this technique is effective and correlates well with the most significant seismostratigraphical heterogeneities present in each of the study areas.
International Nuclear Information System (INIS)
Yun Liang; Messer, Karen; Rose, Brent S.; Lewis, John H.; Jiang, Steve B.; Yashar, Catheryn M.; Mundt, Arno J.; Mell, Loren K.
2010-01-01
Purpose: To study the effects of increasing pelvic bone marrow (BM) radiation dose on acute hematologic toxicity in patients undergoing chemoradiotherapy, using a novel modeling approach that preserves the local spatial dose information. Methods and Materials: The study included 37 cervical cancer patients treated with concurrent weekly cisplatin and pelvic radiation therapy. The white blood cell count nadir during treatment was used as the indicator of acute hematologic toxicity. Pelvic BM radiation dose distributions were standardized across patients by registering the pelvic BM volumes to a common template, followed by dose remapping using deformable image registration, resulting in a dose array. Principal component (PC) analysis was applied to the dose array, and the significant eigenvectors were identified by linear regression on the PCs. The coefficients for PC regression and significant eigenvectors were represented in three dimensions to identify critical BM subregions where dose accumulation is associated with hematologic toxicity. Results: We identified five PCs associated with acute hematologic toxicity. The PC regression model explained a high proportion of the variation in acute hematologic toxicity (adjusted R², 0.49). Three-dimensional rendering of a linear combination of the significant eigenvectors revealed patterns consistent with anatomical distributions of hematopoietically active BM. Conclusions: We have developed a novel approach that preserves spatial dose information to model the effects of radiation dose on toxicity, which may be useful in optimizing radiation techniques to avoid critical subregions of normal tissues. Further validation of this approach in a large cohort is ongoing.
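The PC-regression idea above (regress the outcome on principal component scores, then map the coefficients back to the original variables) can be sketched as follows; the data are synthetic and the sketch omits the spatial registration steps.

```python
import numpy as np

def pc_regression(X, y, n_pc):
    """Regress y on the scores of the top n_pc principal components of X,
    then express the fit as coefficients in the original feature space."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - xm, y - ym
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_pc].T                     # (n_features, n_pc) eigenvectors
    T = Xc @ V                          # PC scores
    gamma, *_ = np.linalg.lstsq(T, yc, rcond=None)   # regression on scores
    beta = V @ gamma                    # coefficients back in feature space
    return beta, lambda Xnew: (Xnew - xm) @ beta + ym

# synthetic demo (a stand-in for regressing toxicity on a dose array)
rng = np.random.default_rng(3)
X = rng.normal(size=(120, 6))
true_beta = np.array([1.5, -2.0, 0.0, 0.5, 0.0, 0.0])
y = X @ true_beta + 0.05 * rng.normal(size=120)
beta, predict = pc_regression(X, y, n_pc=6)   # all PCs -> ordinary least squares
```

With fewer PCs than features, the same code gives the regularized fit used when the predictors (here, voxel doses) are many and highly correlated.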
Uğuz, Harun
2012-09-01
A transcranial Doppler (TCD) is a non-invasive, easy to apply, and reliable technique used in the diagnosis of various brain diseases by measuring the blood flow velocities in brain arteries. This study aimed to classify TCD signals; feature ranking (information gain, IG) and dimension reduction (principal component analysis, PCA) methods were used as a hybrid to improve classification efficiency and accuracy. In this context, each feature within the feature space was ranked depending on its importance for the classification using the IG method. Thus, the less important features were ignored and the highly important features were selected. Then, the PCA method was applied to the highly important features for dimension reduction. As a result, a hybrid feature reduction was achieved, combining the selection of the highly important features with the application of the PCA method to them. To evaluate the effectiveness of the proposed method, experiments were conducted using a support vector machine (SVM) classifier on TCD signals recorded from the temporal region of the brain in 82 patients, as well as 24 healthy people. The experimental results showed that using the IG and PCA methods as a hybrid improves classification efficiency and accuracy compared with individual usage. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Azad, Abul Kalam; Khan, Faisal Nadeem; Alarashi, Waled Hussein; Guo, Nan; Lau, Alan Pak Tao; Lu, Chao
2017-07-10
We propose and experimentally demonstrate the use of principal component analysis (PCA)-based pattern recognition to extract the temperature distribution from measured Brillouin gain spectra (BGSs) along the fiber under test (FUT) obtained by a Brillouin optical time domain analysis (BOTDA) system. The proposed scheme employs a reference database consisting of relevant ideal BGSs with known temperature attributes. PCA is then applied to the BGSs in the reference database as well as to the measured BGSs so as to reduce their size by extracting their most significant features. For each feature vector of a measured BGS, we then determine its best match in the reference database, which comprises numerous reduced-size feature vectors of the ideal BGSs. The known temperature attribute corresponding to the best-matched BGS in the reference database is then taken as the extracted temperature of the measured BGS. We analyzed the performance of the PCA-based pattern recognition algorithm in detail and compared it with that of the curve-fitting method. The experimental results validate that the proposed technique can provide better accuracy, faster processing speed, and larger noise tolerance for the measured BGSs. Therefore, the proposed PCA-based pattern recognition algorithm can be considered an attractive method for extracting temperature distributions along the fiber in BOTDA sensors.
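The database-matching scheme described above (PCA-compress the reference spectra, project a measurement onto the same basis, pick the nearest reference, and return its temperature attribute) can be sketched like this. The Lorentzian spectra and the ~1 MHz/°C shift are assumptions for illustration only.

```python
import numpy as np

def build_reference(ref_spectra, n_pc):
    """PCA-compress a database of ideal spectra with known attributes."""
    mean = ref_spectra.mean(axis=0)
    _, _, Vt = np.linalg.svd(ref_spectra - mean, full_matrices=False)
    basis = Vt[:n_pc]
    feats = (ref_spectra - mean) @ basis.T       # reduced-size feature vectors
    return mean, basis, feats

def match_temperature(measured, mean, basis, feats, temps):
    """Project a measured spectrum onto the PCA basis and return the
    temperature of its nearest reference feature vector."""
    f = (measured - mean) @ basis.T
    return temps[np.argmin(np.linalg.norm(feats - f, axis=1))]

# hypothetical Lorentzian Brillouin gain spectra; peak shifts with temperature
freq = np.linspace(10.6, 11.0, 200)              # GHz
temps = np.arange(20.0, 80.0, 1.0)               # known temperature attributes
centers = 10.7 + 0.001 * (temps - 20.0)          # assumed ~1 MHz/°C shift
ref = 1.0 / (1.0 + ((freq[None, :] - centers[:, None]) / 0.015) ** 2)

mean, basis, feats = build_reference(ref, n_pc=5)
rng = np.random.default_rng(4)
noisy = ref[35] + 0.02 * rng.normal(size=freq.size)   # "measured" BGS at 55 °C
t_hat = match_temperature(noisy, mean, basis, feats, temps)
```

Because matching happens in the low-dimensional PCA space, much of the measurement noise is averaged out, which is the source of the noise tolerance claimed above.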
Prata, Paloma S; Alexandrino, Guilherme L; Mogollón, Noroska Gabriela S; Augusto, Fabio
2016-11-11
The geochemical characterization of petroleum is an essential task in developing new strategies and technologies when analyzing the commercial potential of crude oils for exploitation. Due to the chemical complexity of these samples, the use of modern analytical techniques along with multivariate exploratory data analysis approaches is an interesting strategy for extracting relevant geochemical characteristics of the oils. In this work, important geochemical information about crude oils from different production basins was obtained by analyzing the maltene fraction of the oils by comprehensive two-dimensional gas chromatography coupled to quadrupole mass spectrometry (GC×GC-QMS) and performing multiway principal component analysis (MPCA) of the chromatographic data. The results showed that four MPCs explained 93.57% of the data variance, expressing mainly the differences in the profiles of the saturated hydrocarbon fraction of the oils (C13-C18 and C19-C30 n-alkanes and the pristane/phytane ratio). MPC1 grouped the severely biodegraded oil samples, while the type of depositional paleoenvironment of the oils and its oxidation conditions (as well as their thermal maturity) could be inferred by analysing other relevant MPCs. Additionally, considerations about the source of the oil samples were also possible based on the overall distribution of relevant biomarkers such as the phenanthrene derivatives and tri-, tetra-, and pentacyclic terpanes. Copyright © 2016 Elsevier B.V. All rights reserved.
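MPCA on chromatographic data is commonly implemented by unfolding each sample's two-dimensional chromatogram into a row vector and applying ordinary PCA to the resulting matrix; a minimal sketch on synthetic data (not the paper's data) follows.

```python
import numpy as np

def mpca_scores(tensor, n_pc):
    """Multiway PCA by unfolding: each sample's (dim1 x dim2) slab is
    flattened to a row, then ordinary PCA is applied to the matrix."""
    n = tensor.shape[0]
    X = tensor.reshape(n, -1)              # unfold: samples x (dim1*dim2)
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_pc] * s[:n_pc]        # sample scores on the MPCs
    explained = (s[:n_pc] ** 2).sum() / (s ** 2).sum()
    return scores, explained

# hypothetical stack of 12 chromatograms, each 30 x 20 (1st x 2nd dimension)
rng = np.random.default_rng(5)
pattern = rng.normal(size=(30, 20))
samples = np.stack([(i % 3) * pattern + 0.1 * rng.normal(size=(30, 20))
                    for i in range(12)])
scores, explained = mpca_scores(samples, n_pc=4)
```

Grouping samples by their score coordinates is then what allows, e.g., the biodegraded oils to cluster together on MPC1.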
Directory of Open Access Journals (Sweden)
Shuai Sun
2014-06-01
Full Text Available Due to the scarcity of resources of Ziziphi spinosae semen (ZSS), many inferior goods and even adulterants are generally found in medicine markets. To strengthen quality control, the HPLC fingerprint common pattern established in this paper showed three main bioactive compounds in one chromatogram simultaneously. Principal component analysis based on DAD signals could discriminate adulterants and inferiorities. Principal component analysis indicated that all samples could be mainly regrouped into two main clusters according to the first principal component (PC1, redefined as vicenin II) and the second principal component (PC2, redefined as zizyphusine). PC1 and PC2 could explain 91.42% of the variance. The content of zizyphusine fluctuated more greatly than that of spinosin, and this result was also confirmed by the HPTLC result. Samples with a low content of jujubosides and two common adulterants could not be used equivalently with authenticated ones in the clinic, while one reference standard extract could substitute for the crude drug in pharmaceutical production. Giving special consideration to the well-known bioactive saponins, which have a low response owing to end absorption, a fast and cheap HPTLC method for quality control of ZSS was developed, and the result obtained agreed well with that of the HPLC analysis. Samples having fingerprints similar to the HPTLC common pattern targeting saponins could be regarded as authenticated ones. This work provided a faster and cheaper way for quality control of ZSS and laid the foundation for establishing a more effective quality control method for ZSS. Keywords: Adulterant, Common pattern, Principal component analysis, Quality control, Ziziphi spinosae semen
Rapid cultivar identification of barley seeds through disjoint principal component modeling.
Whitehead, Iain; Munoz, Alicia; Becker, Thomas
2017-01-01
Classification of barley varieties is a crucial part of the control and assessment of barley seeds, especially for the malting and brewing industry. The correct classification of barley is essential in that a majority of decisions regarding process specifications, economic considerations, and the type of product produced with the cereal are made based on the barley variety itself. This fact, combined with the need to promptly assess the cereal as it is delivered to a malt house or production facility, creates the need for a technique to quickly identify a barley variety from a sample. This work explores the feasibility of differentiating between barley varieties based on the protein spectrum of barley seeds. In order to produce a rapid analysis of the protein composition of the barley seeds, lab-on-a-chip microfluidic technology is used to analyze the protein composition. Classification of the barley variety is then made using disjoint principal component models. This work included 19 different barley varieties, consisting of both winter and summer barley types. It is demonstrated that this system can identify the most likely barley variety with an accuracy of 95.9% based on cross-validation, and can screen summer barley with an accuracy of 95.2% and a false positive rate of 0.0% based on cross-validation. This demonstrates the feasibility of the method to provide a rapid and relatively inexpensive way to verify the heritage of barley seeds.
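Disjoint principal component modeling (as in SIMCA-style classification) fits a separate PCA model per class and assigns a sample to the class whose subspace reconstructs it with the smallest residual. A minimal sketch on two synthetic "varieties" follows; it is not the authors' calibrated model.

```python
import numpy as np

class DisjointPCA:
    """SIMCA-style classifier: one PCA model per class; new samples go to
    the class with the smallest reconstruction residual in its subspace."""
    def fit(self, X, y, n_pc=2):
        self.models = {}
        for c in np.unique(y):
            Xc = X[y == c]
            mean = Xc.mean(axis=0)
            _, _, Vt = np.linalg.svd(Xc - mean, full_matrices=False)
            self.models[c] = (mean, Vt[:n_pc])
        return self

    def predict(self, X):
        out = []
        for x in X:
            best, best_r = None, np.inf
            for c, (mean, V) in self.models.items():
                d = x - mean
                resid = d - (d @ V.T) @ V        # part off the class subspace
                r = float(np.linalg.norm(resid))
                if r < best_r:
                    best, best_r = c, r
            out.append(best)
        return np.array(out)

rng = np.random.default_rng(6)
# two hypothetical "varieties" with different low-dimensional structure
A = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 8))
B = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 8))
X = np.vstack([A, B]) + 0.05 * rng.normal(size=(120, 8))
y = np.array([0] * 60 + [1] * 60)
clf = DisjointPCA().fit(X, y, n_pc=2)
acc = float((clf.predict(X) == y).mean())
```

Because each class has its own model, a sample far from every subspace can also be flagged as "none of the above", which is useful for screening adulterant or unknown varieties.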
A Fault Prognosis Strategy Based on Time-Delayed Digraph Model and Principal Component Analysis
Directory of Open Access Journals (Sweden)
Ningyun Lu
2012-01-01
Full Text Available Because of the interlinking of process equipment in the process industry, event information may propagate through the plant and affect many downstream process variables. Specifying the causality and estimating the time delays among process variables are critically important for data-driven fault prognosis. They are not only helpful for finding the root cause when a plant-wide disturbance occurs, but also for revealing the evolution of an abnormal event propagating through the plant. This paper addresses the information flow directionality and time-delay estimation problems in the process industry and presents an information synchronization technique to assist fault prognosis. Time-delayed mutual information (TDMI) is used for both causality analysis and time-delay estimation. To represent the causality structure of high-dimensional process variables, a time-delayed signed digraph (TD-SDG) model is developed. Then, a general fault prognosis strategy is developed based on the TD-SDG model and principal component analysis (PCA). The proposed method is applied to an air separation unit and has achieved satisfying results in predicting the frequently occurring “nitrogen-block” fault.
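Time-delayed mutual information, used above for causality analysis and time-delay estimation, can be estimated from a 2-D histogram; scanning the lag and taking the argmax recovers the propagation delay between two variables. A minimal sketch on synthetic signals:

```python
import numpy as np

def tdmi(x, y, lag, bins=8):
    """Histogram estimate of the mutual information I(x(t); y(t+lag))."""
    a, b = (x[:-lag], y[lag:]) if lag > 0 else (x, y)
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    m = pxy > 0
    return float((pxy[m] * np.log(pxy[m] / (px @ py)[m])).sum())

# synthetic demo: y is x delayed by 5 samples, plus noise
rng = np.random.default_rng(7)
x = rng.normal(size=3000)
y = np.roll(x, 5) + 0.2 * rng.normal(size=3000)
lags = list(range(1, 11))
mi = [tdmi(x, y, L) for L in lags]
delay = lags[int(np.argmax(mi))]                 # estimated propagation delay
```

Repeating this for every pair of process variables yields the delays and directions needed to build the signed digraph.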
Wang, C.; Chen, Q.; Hussain, M.; Wu, S.; Chen, J.; Tang, Z.
2017-07-01
This study provides a new approach to the classification of textile fibers using principal component analysis (PCA) based on UV-Vis diffuse reflectance spectroscopy (UV-Vis DRS). Different natural and synthetic fibers, such as cotton, wool, silk, linen, viscose, and polyester, were used. The spectrum of each kind of fiber was scanned by a spectrometer equipped with an integrating sphere, and the characteristics of the UV-Vis diffuse reflectance spectra were analyzed. PCA revealed that the first three components represented 99.17% of the total variability in the ultraviolet region. The principal component score scatter plot (PC1 × PC2) of each fiber indicated the accuracy of this classification for these six varieties of fibers. It was therefore demonstrated that UV diffuse reflectance spectroscopy can be used as a novel approach to rapid, real-time fiber identification.
DEFF Research Database (Denmark)
Giesen, EB; Ding, Ming; Dalstra, M
2003-01-01
embalmed mandibular condyles; the angle of the first principal direction and the axis of the specimen, expressing the orientation of the trabeculae, ranged from 10 degrees to 87 degrees. Morphological parameters were determined by a method based on Archimedes' principle and by micro-CT scanning...
Wojciechowski, Adam
2017-04-01
In order to assess ecodiversity, understood as a comprehensive natural landscape factor (Jedicke 2001), it is necessary to apply research methods which treat the environment holistically. Principal component analysis may be considered one such method, as it makes it possible to distinguish the main factors determining landscape diversity on the one hand, and to discover regularities shaping the relationships between various elements of the environment under study on the other. The procedure adopted to assess ecodiversity with the use of principal component analysis involves: a) determining and selecting appropriate factors of the assessed environment qualities (hypsometric, geological, hydrographic, plant, and others); b) calculating the absolute value of the individual qualities for the basic areas under analysis (e.g. river length, forest area, altitude differences, etc.); c) principal component analysis and obtaining factor maps (maps of selected components); d) generating a resultant, detailed map and isolating several classes of ecodiversity. An assessment of ecodiversity with the use of principal component analysis was conducted in a test area of 299.67 km2 in Debnica Kaszubska commune. The whole commune is situated in the Weichselian glaciation area of high hypsometric and morphological diversity as well as high geo- and biodiversity. The analysis was based on topographical maps of the commune area at scale 1:25000 and maps of forest habitats. Nine factors reflecting basic environmental elements were calculated: maximum height (m), minimum height (m), average height (m), length of watercourses (km), area of water reservoirs (m2), total forest area (ha), coniferous forest habitat area (ha), deciduous forest habitat area (ha), and alder habitat area (ha). The values of the individual factors were analysed for 358 grid cells of 1 km2. Based on the principal component analysis, four major factors affecting commune ecodiversity
DEFF Research Database (Denmark)
Schreiber, Norman; Garcia, Emanuel; Kroon, Aart
2014-01-01
Principal Component Analysis (PCA) was performed on chemical data from two sediment cores from an urban fresh-water lake in Copenhagen, Denmark. X-ray fluorescence (XRF) core scanning provided the underlying datasets on 13 variables (Si, K, Ca, Ti, Cr, Mn, Fe, Ni, Cu, Zn, Rb, Cd, Pb). Principal...... Component Analysis helped to trace geochemical patterns and temporal trends in lake sedimentation. The PCA models explained more than 80 % of the original variation in the datasets using only 2 or 3 principal components. The first principal component (PC1) was mostly associated with geogenic elements (Si, K...
Passive RF component technology materials, techniques, and applications
Wang, Guoan
2012-01-01
Focusing on novel materials and techniques, this pioneering volume provides you with a solid understanding of the design and fabrication of smart RF passive components. You find comprehensive details on LCP, metal materials, ferrite materials, nano materials, high aspect ratio enabled materials, green materials for RFID, and silicon micromachining techniques. Moreover, this practical book offers expert guidance on how to apply these materials and techniques to design a wide range of cutting-edge RF passive components, from MEMS switch based tunable passives and 3D passives, to metamaterial-bas
Techniques for preventing damage to high power laser components
International Nuclear Information System (INIS)
Stowers, I.F.; Patton, H.G.; Jones, W.A.; Wentworth, D.E.
1977-09-01
Techniques for preventing damage to components of the LASL Shiva high power laser system were briefly presented. Optical element damage in the disk amplifier from the combined fluence of the primary laser beam and the Xenon flash lamps that pump the cavity was discussed. Assembly and cleaning techniques were described which have improved optical element life by minimizing particulate and optically absorbing film contamination on assembled amplifier structures. A Class-100 vertical-flow clean room used for assembly and inspection of laser components was also described. The life of a disk amplifier was extended from less than 50 shots to 500 shots through application of these assembly and cleaning techniques.
Rosen, C; Yuan, Z
2001-01-01
In this paper a methodology for integrated multivariate monitoring and control of biological wastewater treatment plants during extreme events is presented. To monitor the process, on-line dynamic principal component analysis (PCA) is performed on the process data to extract the principal components that represent the underlying mechanisms of the process. Fuzzy c-means (FCM) clustering is then used to classify the operational state. Performing clustering on scores from PCA solves computational problems as well as increasing robustness due to noise attenuation. The class-membership information from FCM is used to derive adequate control set points for the local control loops. The methodology is illustrated by a simulation study of a biological wastewater treatment plant on which disturbances of various types are imposed. The results show that the methodology can be used to determine and co-ordinate control actions in order to shift the control objective and improve effluent quality.
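The monitoring chain described above (PCA to compress the process data, then fuzzy c-means on the scores to classify the operational state) can be sketched as follows; the plain FCM and the two-state synthetic data are illustrative, and the dynamic (time-lagged) part of the PCA is omitted.

```python
import numpy as np

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means: returns the membership matrix U (samples x
    clusters) and the cluster centres; U gives degrees of belonging."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    p = 2.0 / (m - 1.0)
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
    return U, centres

def pca_scores(X, n_pc=2):
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_pc].T

# synthetic demo: two operating states of a 6-variable process
rng = np.random.default_rng(8)
state1 = rng.normal(0.0, 0.3, size=(80, 6))
state2 = rng.normal(3.0, 0.3, size=(80, 6))
T = pca_scores(np.vstack([state1, state2]), n_pc=2)
U, centres = fuzzy_cmeans(T, n_clusters=2)
labels = U.argmax(axis=1)
```

In the control scheme above, the continuous memberships in U (rather than the hard labels) would be used to blend set points between operational states.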
Segil, Jacob L; Weir, Richard F ff
2014-03-01
An ideal myoelectric prosthetic hand should have the ability to continuously morph between any postures, like an anatomical hand. This paper describes the design and validation of a morphing myoelectric hand controller based on principal component analysis of human grasping. The controller commands continuously morphing hand postures, including functional grasps, using between two and four surface electromyography (EMG) electrode pairs. Four unique maps were developed to transform the EMG control signals in the principal component domain. A preliminary validation experiment was performed by 10 nonamputee subjects to determine the map with the highest performance. The subjects used the myoelectric controller to morph a virtual hand between functional grasps in a series of randomized trials. The number of joints controlled accurately was evaluated to characterize the performance of each map. Additional metrics were studied, including completion rate, time to completion, and path efficiency. The highest-performing map controlled over 13 out of 15 joints accurately.
International Nuclear Information System (INIS)
Korkealaakso, J.; Vaittinen, T.; Pitkaenen, P.; Front, K.
1994-07-01
In Finland three crystalline rock sites (Romuvaara, Kivetty and Olkiluoto) have been selected for detailed site investigations of Teollisuuden Voima Oy (TVO) for the disposal of the high level waste. The objective of the work has been to study the possibilities of applying the principal component analysis (PCA) of single borehole data to the identification and classification of hydrogeologically important fracture zones. The fracture zone index (FZI) is defined as the first principal component of data from the hydraulic conductivity of 31 m tests, resistivity loggings, radar reflections, open and filled fracture observations and gamma-gamma density/P-wave velocity loggings. FZI anomalies have been compared against earlier conceptualizations based on the overall expert opinion of geological, geophysical and geohydrological information. (42 refs., 49 figs.)
Penttilä, Antti; Martikainen, Julia; Gritsevich, Maria; Muinonen, Karri
2018-02-01
Meteorite samples are measured with the University of Helsinki integrating-sphere UV-vis-NIR spectrometer. The resulting spectra of 30 meteorites are compared with selected spectra from the NASA Planetary Data System meteorite spectra database. The spectral measurements are transformed with the principal component analysis, and it is shown that different meteorite types can be distinguished from the transformed data. The motivation is to improve the link between asteroid spectral observations and meteorite spectral measurements.
Czech Academy of Sciences Publication Activity Database
Alán, Lukáš; Špaček, Tomáš; Ježek, Petr
2016-01-01
Vol. 45, No. 5 (2016), pp. 443-461 ISSN 0175-7571 R&D Projects: GA ČR(CZ) GA13-02033S; GA MŠk(CZ) ED1.1.00/02.0109 Institutional support: RVO:67985823 Keywords: 3D object segmentation * Delaunay algorithm * principal component analysis * 3D super-resolution microscopy * nucleoids * mitochondrial DNA replication Subject RIV: BO - Biophysics Impact factor: 1.472, year: 2016
Ren, Jing-Zheng; Tan, Shi-yu; Dong, Li-chun
2012-01-01
Hydrogen energy, which has been recognized as an alternative to fossil fuels, has developed rapidly in fuel cell vehicles. Different hydrogen energy systems have different performance in environmental, economic, and energy terms. A methodology for the quantitative evaluation and analysis of hydrogen systems is meaningful for decision makers in selecting the best scenario. Principal component analysis (PCA) has been used to evaluate the integrated performance of different hydroge...
Qi, Danyi; Roe, Brian E.
2016-01-01
We estimate models of consumer food waste awareness and attitudes using responses from a national survey of U.S. residents. Our models are interpreted through the lens of several theories that describe how pro-social behaviors relate to awareness, attitudes and opinions. Our analysis of patterns among respondents' food waste attitudes yields a model with three principal components: one that represents perceived practical benefits households may lose if food waste were reduced, one that repres...
Rincon-Charris, Amilcar; Quevedo Casín, Joseba Jokin
2013-01-01
Multiple fault detection and diagnosis is a challenging problem because the number of candidates grows exponentially in the number of faults. In addition, multiple faults in dynamic systems may be hard to detect, because they can mask or compensate for each other's effects. This paper presents a study of the detection and diagnosis of multiple faults in a SR-30 gas turbine using nonlinear principal component analysis as the detection method and structured residua...
International Nuclear Information System (INIS)
Spurr, R.; Natraj, V.; Lerot, C.; Van Roozendael, M.; Loyola, D.
2013-01-01
Principal Component Analysis (PCA) is a promising tool for enhancing radiative transfer (RT) performance. When applied to binned optical property data sets, PCA exploits redundancy in the optical data and restricts the number of full multiple-scatter calculations to those optical states corresponding to the most important principal components, while still maintaining high accuracy in the radiance approximations. We show that the entire PCA RT enhancement process is analytically differentiable with respect to any atmospheric or surface parameter, thus allowing for accurate and fast approximations of Jacobian matrices, in addition to radiances. This linearization greatly extends the power and scope of the PCA method to many remote sensing retrieval applications and sensitivity studies. In the first example, we examine accuracy for PCA-derived UV-backscatter radiance and Jacobian fields over a 290–340 nm window. In a second application, we show that performance for UV-based total ozone column retrieval is considerably improved without compromising the accuracy.
Highlights:
• Principal Component Analysis (PCA) of spectrally-binned atmospheric optical properties.
• PCA-based accelerated radiative transfer with a 2-stream model for fast multiple scatter.
• Atmospheric and surface property linearization of this PCA performance enhancement.
• Accuracy of PCA enhancement for radiances and bulk-property Jacobians, 290–340 nm.
• Application of PCA speed enhancement to UV backscatter total ozone retrievals.
Yuan, Yuan-Yuan; Zhou, Yu-Bi; Sun, Jing; Deng, Juan; Bai, Ying; Wang, Jie; Lu, Xue-Feng
2017-06-01
The contents of elements in fifteen different regions of Nitraria roborowskii samples were determined by inductively coupled plasma-atomic emission spectrometry (ICP-OES), and their elemental characteristics were analyzed by principal component analysis. The results indicated that 18 mineral elements were detected in N. roborowskii, while V could not be detected. Na, K, and Ca were present at high concentrations. Ti showed the greatest variance in content, while K showed the least. Four principal components were obtained from the original data. The cumulative variance contribution rate was 81.542%, and the variance contribution of the first principal component was 44.997%, indicating that Cr, Fe, P, and Ca were the characteristic elements of N. roborowskii. Thus, the established method is simple and precise and can be used for the determination of mineral elements in N. roborowskii Kom. fruits. The elemental distribution characteristics among N. roborowskii fruits are related to geographical origins, which were clearly revealed by PCA. All these results will provide a good basis for the comprehensive utilization of N. roborowskii. Copyright© by the Chinese Pharmaceutical Association.
Directory of Open Access Journals (Sweden)
S. Saravanan
2012-07-01
Full Text Available Power system planning starts with electric load (demand) forecasting. Accurate electricity load forecasting is one of the most important challenges in managing the supply and demand of electricity, since electricity demand is volatile in nature; it cannot be stored and has to be consumed instantly. This study deals with electricity consumption in India and forecasts the future projection of demand for a period of 19 years, from 2012 to 2030. The eleven input variables used are amount of CO2 emission, population, per capita GDP, per capita gross national income, gross domestic savings, industry, consumer price index, wholesale price index, imports, exports and per capita power consumption. A new methodology based on artificial neural networks (ANNs) using principal components is also used. Data from 29 years were used for training and data from 10 years for testing the ANNs. Comparisons were made with multiple linear regression (based on the original data and on the principal components) and with ANNs using the original data as input variables. The results show that the use of ANNs with principal components (PCs) is more effective.
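The combination described, principal components feeding a predictive model, is essentially principal component regression: decorrelate the eleven indicators first, then fit on the leading scores. A minimal sketch with synthetic data (the variable count and training size mirror the abstract, but the numbers and the choice of four components are invented):

```python
import numpy as np

# Hypothetical stand-in for the 29 training years x 11 economic indicators.
rng = np.random.default_rng(0)
X = rng.normal(size=(29, 11))
y = X @ rng.normal(size=11) + rng.normal(scale=0.1, size=29)

Xc = X - X.mean(axis=0)                  # center the predictors
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 4                                    # keep the leading components (illustrative)
scores = Xc @ Vt[:k].T                   # uncorrelated PC scores as model inputs

coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
pred = scores @ coef + y.mean()
```

The abstract feeds such scores into an ANN rather than the linear fit used here; the PCA step is the same either way, and the score columns are mutually orthogonal by construction.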
Directory of Open Access Journals (Sweden)
Ji Yi
2017-06-01
Full Text Available Discrete wavelet transform (WT) followed by principal component analysis (PCA) has been a powerful approach for the analysis of biomedical signals. Wavelet coefficients at various scales and channels are usually transformed into a one-dimensional array, causing issues such as the curse of dimensionality and the small-sample-size problem. In addition, the lack of time-shift invariance of WT coefficients can act as noise and degrade classifier performance. In this study, we present a stationary wavelet-based two-directional two-dimensional principal component analysis (SW2D2PCA) method for the efficient and effective extraction of essential feature information from signals. Time-invariant multi-scale matrices are constructed in the first step. Two-directional two-dimensional principal component analysis then operates on the multi-scale matrices to reduce the dimension, rather than on vectors as in conventional PCA. Results are presented from an experiment classifying eight hand motions using 4-channel electromyographic (EMG) signals recorded from healthy subjects and amputees, which illustrates the efficiency and effectiveness of the proposed method for biomedical signal analysis.
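The two-directional 2D PCA step can be sketched as follows: each sample stays a matrix, and row- and column-direction covariances supply two small projection bases (the (2D)²-PCA construction the method builds on). The data and dimensions here are invented:

```python
import numpy as np

# 50 synthetic samples of 16x12 multi-scale matrices (stand-ins for the
# time-invariant wavelet matrices described in the abstract).
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 16, 12))
D = A - A.mean(axis=0)

G_col = sum(d.T @ d for d in D) / len(D)   # 12x12 column-direction covariance
G_row = sum(d @ d.T for d in D) / len(D)   # 16x16 row-direction covariance

def top_eigvecs(G, k):
    w, V = np.linalg.eigh(G)               # eigenvalues in ascending order
    return V[:, ::-1][:, :k]               # leading k eigenvectors

X = top_eigvecs(G_row, 4)                  # left (row) projection basis
Y = top_eigvecs(G_col, 3)                  # right (column) projection basis

# Each sample is reduced to a small feature matrix X^T (A - mean) Y.
features = np.einsum('ri,nrc,cj->nij', X, D, Y)
```

The payoff over vectorized PCA is that each sample shrinks from 16×12 = 192 values to a 4×3 feature matrix without ever forming a 192×192 covariance.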
Directory of Open Access Journals (Sweden)
Kevin J Parsons
2009-11-01
Full Text Available Comparing patterns of divergence among separate lineages or groups has posed an especially difficult challenge for biologists. Recently a new, conceptually simple methodology called the "ordered-axis plot" approach was introduced for the purpose of comparing patterns of diversity in a common morphospace. This technique involves a combination of principal components analysis (PCA) and linear regression. Given the common use of these statistics, the potential for widespread use of the ordered-axis approach is high. However, there are a number of drawbacks to this approach, most notably that lineages with the greatest amount of variance will largely bias interpretations from analyses involving a common morphospace. Therefore, without meeting a set of a priori requirements regarding data structure, the ordered-axis plot approach will likely produce misleading results. Morphological data sets from cichlid fishes endemic to Lakes Tanganyika, Malawi, and Victoria were used to statistically demonstrate how separate groups can have differing contributions to a common morphospace produced by a PCA. Through a matrix superimposition of eigenvectors (scale-free trajectories of variation) identified by PCA, we show that some groups contribute more to the trajectories of variation identified in a common morphospace. Furthermore, through a set of randomization tests we show that a common morphospace model partitions variation differently than group-specific models. Finally, we demonstrate how these limitations may influence an ordered-axis plot approach by performing a comparison on data sets with known alterations in covariance structure. Using these results we provide a set of criteria that must be met before a common morphospace can be reliably used. Our results suggest that a common morphospace produced by PCA would not be useful for producing biologically meaningful results unless a restrictive set of criteria is met. We therefore suggest biologists be aware
International Nuclear Information System (INIS)
Silva, Joao Bosco P. da; Malvestiti, Ivani; Hallwass, Fernando; Ramos, Mozart N.; Leite, Lucia F.C. da Costa; Barreiro, Eliezer J.
2005-01-01
The 1H NMR data set of a series of 3-aryl (1,2,4)-oxadiazole-5-carbohydrazide benzylidene derivatives synthesized in our group was analyzed using the chemometric technique of principal component analysis (PCA). Using the original 1H NMR data, PCA allowed the identification of some misassignments of the aromatic proton chemical shifts. As a consequence of this multivariate analysis, nuclear Overhauser difference experiments were performed to investigate the ambiguity of other assignments of the ortho and meta aromatic hydrogens for the compound with the bromine substituent. The effect of the 1,2,4-oxadiazole group as an electron acceptor, mainly for hydrogens 12 and 13, has been highlighted. (author)
Faults and fractures detection in 2D seismic data based on principal component analysis
Directory of Open Access Journals (Sweden)
Poorandokht Soltani
2017-12-01
Full Text Available Various approaches have been introduced to extract as much information as possible from seismic images for any specific reservoir or geological study. Faults and fractures are among the most important targets for interpretation in geological studies of seismic images, and several strategies have been presented for this specific purpose. In this study, we present a modified application of principal component analysis to enhance faults and fractures in low-quality seismic images. In the first step, attributes relevant to imaging faults and fractures were selected based on an extensive review of previous successful applications of different attributes. Subsequently, the major informative components of each attribute were identified by performing principal component analysis. Since random noise exhibits no correlation in seismic data while true reflectors and diffraction events show high coherency, these objects are separated into different orthogonal components by principal component analysis. This makes it easy to remove information irrelevant to faults and fractures from the seismic image, and thus produces a higher-quality image when attribute sections are combined in principal component analysis. Afterwards, selected components were stacked to enhance the fault positions in the final image. However, since other geological objects may show correlation in other orthogonal components, a refinement step on the final image is needed so that only the favorable information is stacked. This approach was applied to a field land data example from northeast Iran. The results show that the proposed strategy images faults better than conventional image analysis for fault detection. The method was also capable of imaging the accurate position of the bodies of mud volcanoes present in the image, which could not be easily tracked by conventional seismic image analysis.
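The noise-separation premise above, that random noise is uncorrelated across sections while true events are coherent, can be sketched by rebuilding a set of attribute traces from only their leading principal component. The traces below are synthetic one-dimensional stand-ins for attribute sections:

```python
import numpy as np

# Five "attribute" traces sharing one coherent reflector pattern plus
# independent random noise (illustrative, not real seismic data).
rng = np.random.default_rng(2)
signal = np.outer(np.sin(np.linspace(0, 6, 200)), np.ones(5))
attrs = signal + 0.3 * rng.normal(size=(200, 5))

Xc = attrs - attrs.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 1  # the coherent event concentrates in the dominant component
denoised = (U[:, :k] * s[:k]) @ Vt[:k] + attrs.mean(axis=0)
```

Because the noise has no cross-attribute correlation, it spreads across the trailing components, so the rank-k reconstruction sits closer to the common signal than any raw trace does.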
Shi, Xiaowei; Zhang, Kai; Xue, Na; Su, Linfei; Ma, Gaixia; Qi, Jinlong; Wu, Yibing; Wang, Qiao; Shi, Qingwen
2013-12-15
The aim of this study was to investigate the chemical differences between genuine Inula britannica L. (I. britannica) and substitute specimens. A linear ion trap LC-MS/MS analytical method has been developed for the identification and quantification of 15 major components from I. britannica. Data acquisition was performed in multiple-reaction-monitoring transitions mode followed by an information-dependent acquisition using the enhanced product ion (EPI) scan in one run. The target compounds were further identified and confirmed using an EPI spectral library. The determination results of 45 batches of samples were then analysed and classified by principal component analysis (PCA). The content of 11 components could be used to distinguish the two official Flos Inulae species (I. britannica and Inula japonica) from unofficial species (Inula hupehensis), and the content of 3 components could be used to differentiate the two official species. Copyright © 2013 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Hesse Morten
2005-05-01
Full Text Available Abstract Background Personality disorders are common in substance abusers. Self-report questionnaires that can aid in the assessment of personality disorders are commonly used in assessment, but are rarely validated. Methods The Danish DIP-Q as a measure of co-morbid personality disorders in substance abusers was validated through principal components factor analysis and canonical correlation analysis. A 4-component structure was constructed based on 238 protocols, representing antagonism, neuroticism, introversion and conscientiousness. The structure was compared with (a) a 4-factor solution from the DIP-Q in a sample of Swedish drug and alcohol abusers (N = 133), and (b) a consensus 4-component solution based on a meta-analysis of published correlation matrices of dimensional personality disorder scales. Results It was found that the 4-factor model of personality was congruent across the Danish and Swedish samples, and showed good congruence with the consensus model. A canonical correlation analysis was conducted on a subset of the Danish sample with staff ratings of pathology. Three factors that correlated highly between the two variable sets were found. These were highly similar to the first three factors from the principal components analysis: antagonism, neuroticism and introversion. Conclusion The findings support the validity of the DIP-Q as a measure of DSM-IV personality disorders in substance abusers.
Werf, M.J. van der; Pieterse, B.; Luijk, N. van; Schuren, F.; Werff van der - Vat, B. van der; Overkamp, K.; Jellema, R.H.
2006-01-01
The value of the multivariate data analysis tools principal component analysis (PCA) and principal component discriminant analysis (PCDA) for prioritizing leads generated by microarrays was evaluated. To this end, Pseudomonas putida S12 was grown in independent triplicate fermentations on four
Hirose, Misa; Toyota, Saori; Ojima, Nobutoshi; Ogawa-Ochiai, Keiko; Tsumura, Norimichi
2015-03-01
In this paper, principal component analysis is applied to pigmentation distributions, surface reflectance components and facial landmarks in whole facial images to obtain feature values. Furthermore, the relationship between the obtained feature vectors and age is estimated by multiple regression analysis to modulate facial images of women aged 10 to 70. In our previous work, we analyzed only pigmentation distributions, and the reproduced images looked younger than the target age in subjective evaluation. We considered that this happened because we did not modulate the facial structures and detailed surfaces such as wrinkles. By analyzing landmarks representing facial structures together with surface reflectance components, we captured the variation of facial structures and fine asperity distributions as well as pigmentation distributions in the whole face. As a result, our method modulates the appearance of a face across ages more appropriately.
Marengo, Emilio; Robotti, Elisa; Righetti, Pier Giorgio; Antonucci, Francesca
2003-07-04
Two-dimensional (2D) electrophoresis is the most widespread technique for the separation of proteins in biological systems. This technique produces 2D maps of high complexity, which creates difficulties in the comparison of different samples. The method proposed in this paper for the comparison of different 2D maps can be summarised in four steps: (a) digitalisation of the image; (b) fuzzyfication of the digitalised map in order to account for the variability of the two-dimensional electrophoretic separation; (c) decoding of the previously obtained fuzzy maps by principal component analysis, in order to reduce the system dimensionality; (d) classification analysis (linear discriminant analysis), in order to separate the samples contained in the dataset according to the classes present in said dataset. This method was applied to a dataset constituted by eight samples: four belonging to healthy human lymph-nodes and four deriving from non-Hodgkin lymphomas. The amount of fuzzyfication of the original map is governed by the sigma parameter: the larger the value, the fuzzier the resulting transformed map. The effect of the fuzzyfication parameter was investigated, the optimal results being obtained for sigma = 1.75 and 2.25. Principal component analysis and linear discriminant analysis allowed the separation of the two classes of samples without any misclassification.
Nuclear analysis techniques as a component of thermoluminescence dating
Energy Technology Data Exchange (ETDEWEB)
Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)
1996-12-31
In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
Diffusion bonding as joining technique for fusion reactor components
Energy Technology Data Exchange (ETDEWEB)
Ceccotti, G.C.; Magnoli, L.
1992-11-01
The development of joining techniques for fusion reactor divertors has been undertaken at ENEA (Italian Agency for Energy, New Technologies and the Environment) IFEC Saluggia. Joints were made by the diffusion bonding technique between graphite composite material and DS copper, and between graphite composite material and molybdenum TZM alloy, respectively. The inter-layers, when necessary, were obtained by metallization with an electron gun. The same technique is employed in joining Be/SS, DS copper/Be, TZM/Be and graphite/Be for the first wall or plasma-facing components of fusion reactors. In this case, a suitable inter-layer material can avoid the problems occurring with the more traditional brazing processes.
Otsuka, Tomoko; Iwao, Yasunori; Miyagishima, Atsuo; Itai, Shigeru
2011-05-16
Principal component analysis was applied to effectively optimize the operational conditions of a fluidized bed granulator for preparing granules with excellent compaction and tablet physical properties. The crucial variables that affect the properties of the granules, their compactability and the resulting tablet properties were determined through analysis of a series of granulation and tabletting experiments. Granulation was performed while the flow rate and concentration of the binder were changed as independent operational variables, according to a two-factor central composite design. Thirteen physicochemical properties of granules and tablets were examined: powder properties (particle size, size distribution width, Carr's index, Hausner ratio and aspect ratio), compactability properties (pressure transmission ratio, die wall force and ejection force) and tablet properties (tensile strength, friability, disintegration time, weight variation and drug content uniformity). Principal component analysis showed that the pressure transmission ratio, die wall force and Carr's index were the most important variables in granule preparation. Multiple regression analysis also confirmed these results. Furthermore, optimized operational conditions obtained from the multiple regression analysis enabled the production of granules with desirable properties for tabletting. This study presents the first use of principal component analysis for identifying and successfully predicting the most important variables in the process of granulation and tabletting. Copyright © 2011 Elsevier B.V. All rights reserved.
Qi, Danyi; Roe, Brian E
2016-01-01
We estimate models of consumer food waste awareness and attitudes using responses from a national survey of U.S. residents. Our models are interpreted through the lens of several theories that describe how pro-social behaviors relate to awareness, attitudes and opinions. Our analysis of patterns among respondents' food waste attitudes yields a model with three principal components: one that represents perceived practical benefits households may lose if food waste were reduced, one that represents the guilt associated with food waste, and one that represents whether households feel they could be doing more to reduce food waste. We find our respondents express significant agreement that some perceived practical benefits are ascribed to throwing away uneaten food, e.g., nearly 70% of respondents agree that throwing away food after the package date has passed reduces the odds of foodborne illness, while nearly 60% agree that some food waste is necessary to ensure meals taste fresh. We identify that these attitudinal responses significantly load onto a single principal component that may represent a key attitudinal construct useful for policy guidance. Further, multivariate regression analysis reveals a significant positive association between the strength of this component and household income, suggesting that higher income households most strongly agree with statements that link throwing away uneaten food to perceived private benefits.
Doherty, B; Vagnini, M; Dufourmantelle, K; Sgamellotti, A; Brunetti, B; Miliani, C
2014-01-01
This contribution examines the utility of vibrational spectroscopy by bench and portable Raman/surface-enhanced Raman and infrared methods for the investigation of ten early triarylmethane dye powder references and dye solutions applied on paper. The complementary information afforded by the techniques is shown to play a key role in the identification of specific spectral marker ranges to distinguish early synthetic dyes of art-historical interest through the elaboration of an in-house database of modern organic dyes. Chemometric analysis has permitted a separation of data by the discrimination of di-phenyl-naphthalenes and triphenylmethanes (di-amino and tri-amino derivatives). This work serves as a prelude to the validation of a non-invasive working method for in situ characterization of these synthetic dyes through a careful comparison of the respective strengths and limitations of each portable technique. Copyright © 2013 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Antonis Koukourikos
2016-07-01
Full Text Available The process of effectively applying techniques for fostering creativity in educational settings is – by nature – multifaceted and not straightforward, as it pertains to several fields such as cognitive theory and psychology. Furthermore, the quantification of the impact of different activities on creativity is a challenging and not yet thoroughly investigated task. In this paper, we present the process of applying the Semantic Lateral Thinking technique for fostering creativity in Creative Stories, a digital storytelling game, via the introduction of the appropriate stimuli in the game’s flow. Furthermore, we present a formalization for a person’s creativity as a derivative of his/her creations within the game, by transitioning from traditional computational creativity metrics over the produced stories to a space that adheres to the core principles of creativity as perceived by humans.
Wavelets and principal component analysis method for vibration monitoring of rotating machinery
Bendjama, Hocine; S. Boucherit, Mohamad
2017-01-01
Fault diagnosis plays a crucial role in industrial systems today. To improve reliability, safety and efficiency, advanced monitoring methods have become increasingly important for many systems. The vibration analysis method is essential in improving condition monitoring and fault diagnosis of rotating machinery. Effective utilization of vibration signals depends upon the effectiveness of the applied signal processing techniques. In this paper, fault diagnosis is performed using a com...
Directory of Open Access Journals (Sweden)
RAJAPPAN MANAVALAN
2006-11-01
Full Text Available Simultaneous estimation of all drug components in a multicomponent analgesic dosage form with artificial neural network calibration models using UV spectrophotometry is reported as a simple alternative to using separate models for each component. A novel approach for calibration using a compound spectral dataset derived from three spectra of each component is described. The spectra of mefenamic acid and paracetamol were recorded at several concentrations within their linear range and used to compute a calibration mixture between the wavelengths 220 to 340 nm. Neural networks trained by a Levenberg–Marquardt algorithm were used for building and optimizing the calibration models using the MATLAB® Neural Network Toolbox and were compared with the principal component regression model. The calibration models were thoroughly evaluated at several concentration levels using 104 spectra obtained for 52 synthetic binary mixtures prepared using orthogonal designs. The optimized model showed sufficient robustness even when the calibration sets were constructed from a different set of pure spectra of the components. The simultaneous prediction of both components by a single neural network with the suggested calibration approach was successful. The model could accurately estimate the drugs, with satisfactory precision and accuracy, in tablet dosage forms with no interference from excipients, as indicated by the results of a recovery study.
DEFF Research Database (Denmark)
Hajati, S.; Walton, J.; Tougaard, S.
2013-01-01
In a previous article, we studied the influence of spectral noise on a new method for three-dimensional X-ray photoelectron spectroscopy (3D XPS) imaging, which is based on analysis of the XPS peak shape [Hajati, S., Tougaard, S., Walton, J. & Fairley, N. (2008). Surf Sci 602, 3064-3070]. Here, we study in more detail the influence of noise reduction by principal component analysis (PCA) on 3D XPS images of carbon contamination of a patterned oxidized silicon sample and on 3D XPS images of Ag covered by a nanoscale patterned octadiene layer. PCA is very efficient for noise reduction, and using...
Ueda, Yoshiaki; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji
2017-12-01
High performance of color quantization processing is very important for obtaining limited-color images with good quality. The median cut algorithm (MCA) is a typical color quantization method. Its computational cost is low owing to its simple algorithm, but the quality of output images is mediocre at best. In this paper, we describe a modification of MCA. In our method, we use a combination of principal component analysis (PCA) and linear discriminant analysis (LDA) to accomplish effective partitioning of color space. Concretely, PCA and LDA are used to calculate partitioning planes and their positions, respectively. We verify the effectiveness of our method through experiments using 24-bit full-color natural images.
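A minimal sketch of the partitioning idea described above: split a box of pixels by the plane orthogonal to its first principal axis. Here the split position is simply the median projection (the paper positions the plane with LDA instead, and the pixel data below are synthetic):

```python
import numpy as np

# Synthetic cloud of RGB pixels standing in for one median-cut box.
rng = np.random.default_rng(3)
pixels = rng.integers(0, 256, size=(1000, 3)).astype(float)

c = pixels - pixels.mean(axis=0)
cov = c.T @ c / len(c)
w, V = np.linalg.eigh(cov)            # ascending eigenvalues
axis = V[:, -1]                       # first principal axis of the color cloud

proj = c @ axis                       # project pixels onto that axis
split = np.median(proj)               # split position (LDA-derived in the paper)
box_a = pixels[proj <= split]         # the two sub-boxes for further cuts
box_b = pixels[proj > split]
```

Cutting along the principal axis, rather than along the longest RGB edge as in plain median cut, aligns each split with the direction of greatest color variance.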
Aouabdi, Salim; Taibi, Mahmoud; Bouras, Slimane; Boutasseta, Nadir
2017-06-01
This paper describes an approach for identifying localized gear tooth defects, such as pitting, using phase currents measured from an induction machine driving the gearbox. A new anomaly detection tool is introduced, based on the multi-scale entropy (MSE) algorithm SampEn, which allows correlations in signals to be identified over multiple time scales. Motor current signature analysis (MCSA) is used in conjunction with principal component analysis (PCA), comparing observed values with those predicted from a model built using nominally healthy data. Simulation results show that the proposed method is able to detect gear tooth pitting in current signals.
Directory of Open Access Journals (Sweden)
Piotr CZECH
2007-01-01
Full Text Available This paper presents the results of an experimental application of an artificial neural network as a classifier of the degree of cracking of a tooth root in a gear wheel. The neural classifier was based on an artificial neural network of the Probabilistic Neural Network type (PNN). The input data for the classifier were in the form of a matrix composed of statistical measures obtained from the fast Fourier transform (FFT) and principal component analysis (PCA). The identified model of a toothed gear transmission, operating in a circulating power system, served for generation of the teaching and testing sets used in the experiment.
Melquiades, F L; Andreoni, L F S; Thomaz, E L
2013-07-01
Differences in composition and chemical elemental concentration are important information for soil sample classification. The objective of this study is to present a direct methodology, non-destructive and without complex sample preparation, to discriminate different land-use types and soil degradation, employing energy dispersive X-ray fluorescence and multivariate analysis. Sample classification results from principal component analysis, utilizing spectral data and elemental concentration values, demonstrate that the methodology is efficient for discriminating different land-use types. Copyright © 2013 Elsevier Ltd. All rights reserved.
Dacakis, Georgia; Oates, Jennifer M; Douglas, Jacinta M
2017-03-01
The Transsexual Voice Questionnaire (TVQ MtF ) is a population-specific self-report tool designed to capture the perceptions of male-to-female transsexual women (MtF women) regarding their vocal functioning and the voice-related impact on their everyday life. The aim of this study was to further the psychometric evaluation of the TVQ MtF by examining its construct validity and confirming its reliability. This is a prospective validity and reliability study. One hundred fifty-one MtF women provided data for principal components analysis with oblimin rotation. Data from 133 of these participants were also analyzed to evaluate the internal consistency of the TVQ MtF . Principal components analysis identified a two-factor structure. The largest component (accounting for 51.99% of the variance) captured individuals' perceptions of their vocal functioning and included items related to the link between voice and gender identity. This component was labeled vocal functioning. The second component (5.82% of the variance) contained items that related to the impact of voice on the individual's participation in everyday life and was labeled social participation. Internal consistency was high (Cronbach's α = .97). The findings support the construct validity of the TVQ MtF . Additionally, the high internal consistency of the TVQ MtF found in the current study confirms that the content of the TVQ MtF reliably measures the self-perceptions of MtF women regarding their voice. The current findings also support the clinical utility of the TVQ MtF providing a means of organizing TVQ MtF responses to inform the voice training process. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
Sakurai, Yumiko; Hardy, Elaissa T; Ahn, Byungwook; Tran, Reginald; Fay, Meredith E; Ciciliano, Jordan C; Mannino, Robert G; Myers, David R; Qiu, Yongzhi; Carden, Marcus A; Baldwin, W Hunter; Meeks, Shannon L; Gilbert, Gary E; Jobe, Shawn M; Lam, Wilbur A
2018-02-06
Hemostasis encompasses an ensemble of interactions among platelets, coagulation factors, blood cells, endothelium, and hemodynamic forces, but current assays assess only isolated aspects of this complex process. Accordingly, here we develop a comprehensive in vitro mechanical injury bleeding model comprising an "endothelialized" microfluidic system coupled with a microengineered pneumatic valve that induces a vascular "injury". With perfusion of whole blood, hemostatic plug formation is visualized and "in vitro bleeding time" is measured. We investigate the interaction of different components of hemostasis, gaining insight into several unresolved hematologic issues. Specifically, we visualize and quantitatively demonstrate: the effect of anti-platelet agent on clot contraction and hemostatic plug formation, that von Willebrand factor is essential for hemostasis at high shear, that hemophilia A blood confers unstable hemostatic plug formation and altered fibrin architecture, and the importance of endothelial phosphatidylserine in hemostasis. These results establish the versatility and clinical utility of our microfluidic bleeding model.
Popescu, Carmen-Mihaela; Navi, Parviz; Placencia Peña, María Inés; Popescu, Maria-Cristina
2018-02-01
Spruce wood samples were subjected to different conditions of thermal and hydro-thermal treatment by varying the temperature, relative humidity and period of exposure. The treated samples were evaluated using near infrared spectroscopy (NIR), principal component analysis (PCA) and hierarchical cluster analysis (HCA) in order to evidence the structural changes which may occur under the applied treatment conditions. Modifications in all wood components were observed, which depended on the temperature, the relative humidity and the treatment time. Higher variations were evidenced for samples treated at higher temperatures and for longer periods. At the same time, an increase in the amount of water vapour in the medium induced a reduced rate of side-chain and condensation reactions in the wood structure. Further, PCA and HCA made it possible to discriminate the modifications in the wood samples according to treatment time and relative humidity.
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively determined. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, the proposed approach can be a useful tool to improve the process parameters in stereolithography, which is very useful information for machine designers as well as RP machine users.
Bodanese, Benito; Silveira, Fabrício Luiz; Zângaro, Renato Amaro; Pacheco, Marcos Tadeu T; Pasqualucci, Carlos Augusto; Silveira, Landulfo
2012-07-01
Raman spectroscopy has been employed to discriminate between malignant (basal cell carcinoma [BCC] and melanoma [MEL]) and normal (N) skin tissues in vitro, aimed at developing a method for cancer diagnosis. Raman spectroscopy is an analytical tool that could be used to diagnose skin cancer rapidly and noninvasively. Skin biopsy fragments of ≈ 2 mm(2) from excisional surgeries were scanned through a Raman spectrometer (830 nm excitation wavelength, 50 to 200 mW of power, and 20 sec exposure time) coupled to a fiber optic Raman probe. Principal component analysis (PCA) and Euclidean distance were employed to develop a discrimination model to classify samples according to histopathology. In this model, we used a set of 145 spectra from N (30 spectra), BCC (96 spectra), and MEL (19 spectra) skin tissues. We demonstrated that principal components (PCs) 1 to 4 accounted for 95.4% of all spectral variation. These PCs have been spectrally correlated to the biochemicals present in tissues, such as proteins, lipids, and melanin. The scores of PC2 and PC3 revealed statistically significant differences among N, BCC, and MEL (ANOVA, p<0.05) and were used in the discrimination model. A total of 28 out of 30 spectra were correctly diagnosed as N, 93 out of 96 as BCC, and 13 out of 19 as MEL, with an overall accuracy of 92.4%. This discrimination model based on PCA and Euclidean distance could differentiate N from malignant (BCC and MEL) with high sensitivity and specificity.
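The PCA-plus-Euclidean-distance discrimination model described above can be sketched on synthetic two-class "spectra". The data shapes, noise level, and nearest-centroid rule here are illustrative assumptions, not the authors' tissue spectra or exact model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": two classes with different mean profiles plus noise.
n_per_class, n_bands = 40, 200
mean_a = np.sin(np.linspace(0, 3 * np.pi, n_bands))
mean_b = np.cos(np.linspace(0, 3 * np.pi, n_bands))
X = np.vstack([mean_a + 0.3 * rng.standard_normal((n_per_class, n_bands)),
               mean_b + 0.3 * rng.standard_normal((n_per_class, n_bands))])
y = np.array([0] * n_per_class + [1] * n_per_class)

# PCA via SVD of the mean-centered data; keep the first 4 PCs,
# mirroring the abstract's use of PCs 1 to 4.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:4].T

# Euclidean-distance classifier: assign each sample to the nearest
# class centroid in PC-score space (cross-validation omitted for brevity).
centroids = np.array([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(
    np.linalg.norm(scores[:, None, :] - centroids[None, :, :], axis=2), axis=1)
accuracy = (pred == y).mean()
```

Working in PC-score space rather than on raw spectra is what keeps the distance computation low-dimensional and noise-robust.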
McLeod, Lianne; Bharadwaj, Lalita; Epp, Tasha; Waldner, Cheryl L
2017-09-15
Groundwater drinking water supply surveillance data were accessed to summarize water quality delivered as public and private water supplies in southern Saskatchewan as part of an exposure assessment for epidemiologic analyses of associations between water quality and type 2 diabetes or cardiovascular disease. Arsenic in drinking water has been linked to a variety of chronic diseases and previous studies have identified multiple wells with arsenic above the drinking water standard of 0.01 mg/L; therefore, arsenic concentrations were of specific interest. Principal components analysis was applied to obtain principal component (PC) scores to summarize mixtures of correlated parameters identified as health standards and those identified as aesthetic objectives in the Saskatchewan Drinking Water Quality Standards and Objectives. Ordinary, universal, and empirical Bayesian kriging were used to interpolate arsenic concentrations and PC scores in southern Saskatchewan, and the results were compared. Empirical Bayesian kriging performed best across all analyses, based on having the greatest number of variables for which the root mean square error was lowest. While all of the kriging methods appeared to underestimate high values of arsenic and PC scores, empirical Bayesian kriging was chosen to summarize large scale geographic trends in groundwater-sourced drinking water quality and assess exposure to mixtures of trace metals and ions.
Soares, Denise Paschoal; de Castro, Marcelo Peduzzi; Mendes, Emília; Machado, Leandro
2014-12-01
The elderly are susceptible to many disorders that alter the gait pattern and can lead to falls and reduced mobility. One of the most commonly applied therapeutic approaches to correct altered gait patterns is the insertion of insoles. Principal Component Analysis (PCA) is a powerful method for reducing redundant information that allows comparison of the complete waveform. The purpose of this study was to use PCA to verify the influence of wedges on lower-limb net joint moments and range of motion (ROM) during the gait of healthy elderly participants. In addition, discrete values of lower-limb peak net moment and ROM were also evaluated. Twenty subjects walked with no wedges (control condition) and wearing six different wedges. The variables analyzed were the principal components of the joint net moments and ROM in the sagittal plane at the ankle and knee, and of the joint net moments in the frontal plane at the knee. The discrete variables were the peak joint net moments and ROM in the sagittal plane at the knee and ankle. The results showed that the influence of the wedges was clearer when analyzed with PCA methods than with discrete parameters of the gait curves, where differences between conditions could be hidden. Copyright © 2014 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Das Lalita
2012-08-01
Full Text Available Abstract Background The chemotherapeutic agent paclitaxel arrests cell division by binding to the hetero-dimeric protein tubulin. Subtle differences in tubulin sequences, across eukaryotes and among β-tubulin isotypes, can have profound impact on paclitaxel-tubulin binding. To capture the experimentally observed paclitaxel-resistance of human βIII tubulin isotype and yeast β-tubulin, within a common theoretical framework, we have performed structural principal component analyses of β-tubulin sequences across eukaryotes. Results The paclitaxel-resistance of human βIII tubulin isotype and yeast β-tubulin uniquely mapped on to the lowest two principal components, defining the paclitaxel-binding site residues of β-tubulin. The molecular mechanisms behind paclitaxel-resistance, mediated through key residues, were identified from structural consequences of characteristic mutations that confer paclitaxel-resistance. Specifically, Ala277 in βIII isotype was shown to be crucial for paclitaxel-resistance. Conclusions The present analysis captures the origin of two apparently unrelated events, paclitaxel-insensitivity of yeast tubulin and human βIII tubulin isotype, through two common collective sequence vectors.
Directory of Open Access Journals (Sweden)
Yihang Yin
2015-08-01
Full Text Available Wireless sensor networks (WSNs) have been widely used to monitor the environment, and sensors in WSNs are usually power constrained. Because inter-node communication consumes most of the power, efficient data compression schemes are needed to reduce the data transmission and prolong the lifetime of WSNs. In this paper, we propose an efficient data compression model to aggregate data, based on spatial clustering and principal component analysis (PCA). First, sensors with a strong temporal-spatial correlation are grouped into one cluster for further processing, using a novel similarity measure metric. Next, sensor data in one cluster are aggregated at the cluster head sensor node, and an efficient adaptive strategy is proposed for the selection of the cluster head to conserve energy. Finally, the proposed model applies principal component analysis with an error bound guarantee to compress the data while retaining the specified variance. Computer simulations show that the proposed model can greatly reduce communication and achieve a lower mean square error than other PCA-based algorithms.
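The variance-bound compression step can be illustrated roughly as follows. The sensor readings, the 99% retention threshold, and the omission of the clustering and cluster-head stages are all assumptions of this sketch, not the paper's actual scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sensor readings: 50 correlated sensors, 500 time steps,
# all tracking one underlying signal plus independent noise.
t = np.linspace(0, 10, 500)
base = np.sin(t)[:, None]
X = base + 0.05 * rng.standard_normal((500, 50))

# Choose the smallest number of PCs whose cumulative variance meets a bound.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_ratio = np.cumsum(S**2) / np.sum(S**2)
k = int(np.searchsorted(var_ratio, 0.99) + 1)   # retain >= 99% of variance

# Compress: only k score columns, k loading vectors, and the mean
# need to be transmitted instead of the full 50-column matrix.
scores = Xc @ Vt[:k].T
X_rec = scores @ Vt[:k] + X.mean(axis=0)
mse = np.mean((X - X_rec) ** 2)
```

Because the sensors are strongly correlated, a single PC typically suffices, which is exactly the regime where PCA-based aggregation saves transmission energy.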
Chen, Yun-Hao; Jiang, Jin-Bao; Huang, Wen-Jiang; Wang, Yuan-Yuan
2009-08-01
The canopy reflectance of winter wheat infected by yellow rust at different severities was measured after artificial inoculation, and the disease index (DI) of the wheat corresponding to the spectra acquired in the field was obtained. Principal component analysis (PCA) was used to compute the first 5 principal components (PCs) of the canopy spectra in the 350-1350 nm range and the first 3 PCs of the first-order derivative in the blue edge (490-530 nm), yellow edge (550-582 nm) and red edge (630-673 nm), respectively. Step-wise regression was used to build models, and their results were compared with those of VI-empirical models; the model based on PCs of the first-order derivative proved particularly accurate, with an RMSE of 7.65 and a relative error of 15.59%. Comparison between the estimated and measured DI indicates that the model based on SDr'/SDg' is suitable for monitoring early disease, whereas the model based on PCs of the first-order derivative is suitable for monitoring more severe yellow rust in winter wheat. These conclusions have great practical and application value for acquiring and evaluating wheat disease severity using hyperspectral remote sensing, and are important for increasing crop yields and ensuring the security of food supplies.
Li, Miaoyun; Li, Yuanhui; Huang, Xianqing; Zhao, Gaiming; Tian, Wei
2014-06-01
The growth of Pseudomonas in pallet-packaged seasoned prepared chicken products under selected storage temperatures (5°C, 10°C, 15°C, 20°C and 25°C) was studied in this paper. The modified Gompertz, Baranyi and Huang models were used for data fitting. Statistical criteria such as residual sum of squares (RSS), mean square error, Akaike's information criterion and pseudo-R(2) were used to evaluate model performance. Results of principal component analysis (PCA) showed that the contribution rate of the RSS index accounted for more than 90% of the variability explained by the first principal components. The index values for Sichuan-style chicken skewers and chicken flesh and bones were about 94.85% and 93.345%, respectively; both exceeded the 85% standard. Therefore, RSS can be used as the main evaluation index for analyzing and comparing the differences among the three models. With the smallest average RSS values and the largest pseudo-R(2) at most temperatures, the Baranyi model was more suitable than the Gompertz and Huang models for fitting the Pseudomonas data obtained from the two prepared chicken products. Copyright © 2014 Elsevier Ltd. All rights reserved.
Gasson, Peter; Miller, Regis; Stekel, Dov J; Whinder, Frances; Zieminska, Kasia
2010-01-01
Dalbergia nigra is one of the most valuable timber species of its genus, having been traded for over 300 years. Due to over-exploitation it is facing extinction and trade has been banned under CITES Appendix I since 1992. Current methods, primarily comparative wood anatomy, are inadequate for conclusive species identification. This study aims to find a set of anatomical characters that distinguish the wood of D. nigra from other commercially important species of Dalbergia from Latin America. Qualitative and quantitative wood anatomy, principal components analysis and naïve Bayes classification were conducted on 43 specimens of Dalbergia, eight D. nigra and 35 from six other Latin American species. Dalbergia cearensis and D. miscolobium can be distinguished from D. nigra on the basis of vessel frequency for the former, and ray frequency for the latter. Principal components analysis was unable to provide any further basis for separating the species. Naïve Bayes classification using the four characters: minimum vessel diameter; frequency of solitary vessels; mean ray width; and frequency of axially fused rays, classified all eight D. nigra correctly with no false negatives, but there was a false positive rate of 36.36 %. Wood anatomy alone cannot distinguish D. nigra from all other commercially important Dalbergia species likely to be encountered by customs officials, but can be used to reduce the number of specimens that would need further study.
Directory of Open Access Journals (Sweden)
Benoit Parmentier
2014-12-01
Full Text Available Characterizing biophysical changes in land change areas over large regions with short and noisy multivariate time series and multiple temporal parameters remains a challenging task. Most studies focus on detection rather than characterization, i.e., the manner by which surface state variables are altered by the process of change. In this study, a procedure is presented to extract and characterize simultaneous temporal changes in MODIS multivariate time series from three surface state variables: the Normalized Difference Vegetation Index (NDVI), land surface temperature (LST) and albedo (ALB). The analysis involves conducting a seasonal trend analysis (STA) to extract three seasonal shape parameters (Amplitude 0, Amplitude 1 and Amplitude 2) and using principal component analysis (PCA) to contrast trends in change and no-change areas. We illustrate the method by characterizing trends in burned and unburned pixels in Alaska over the 2001–2009 time period. Findings show consistent and meaningful extraction of temporal patterns related to fire disturbances. The first principal component (PC1) is characterized by a decrease in mean NDVI (Amplitude 0) with a concurrent increase in albedo (both the mean and the annual amplitude) and an increase in LST annual variability (Amplitude 1). These results provide systematic empirical evidence of surface changes associated with one type of land change, fire disturbance, and suggest that STA with PCA may be used to characterize many other types of land transitions over large landscape areas using multivariate Earth observation time series.
Implementation of the geoethics principle to environmental technologies by Biogeosystem Technique
Batukaev, Abdulmalik; Kalinitchenko, Valery; Minkina, Tatiana; Mandzhieva, Saglara; Sushkova, Svetlana
2017-04-01
The uncertainty and degradation of the biosphere are the result of outdated industrial technologies. The incorrect principles of the natural-resource-use paradigm must be radically changed to correspond to the principles of Geoethics. This technological dead end is linked to the Philosophy of Technology. The organic protection and imitation of natural patterns have until now been the theoretical basis of technology. Technological and social determinism are presented as "inevitable" for humankind, and one is forced to believe that continuing the outdated path of technical development is the only possibility for humanity to survive. But rough imitation, the method of the outdated technological platform, is now fruitless, and survival under the practices of the industrial technology platform has become extremely dangerous. The challenge for humanity is to overcome the chain of environmental hazards of agronomy, irrigation, industry and other human activities in the biosphere, which awkwardly imitate natural processes: plowing leads to soil degradation and greenhouse gas emission; irrigation leads to excessive moistening and degradation of soil and landscape, greenhouse gas emission, and loss of freshwater, a global deficit; waste utilization leads to greenhouse gas emission, loss of oxygen and other ecological hazards. Fundamentally new technologies must be generated for the development of the biosphere and the renewal of food and resources. Aristotle said that technique can go beyond nature and implement "what nature can't bring to a finish." To overcome the fundamental shortcomings of industrial technologies and incorrect land use, we propose the Biogeosystem Technique (BGT*) for biosphere sustainability. The BGT* key point is a transcendent approach (not imitating natural processes): new technical solutions for the biosphere, including soil construction, control of the fluxes of energy, matter and water, and the biological productivity of terrestrial systems. Intra-soil milling which provides the
Singh, Dharmendra; Kumar, Harish
Earth observation satellites provide data covering different portions of the electromagnetic spectrum at different spatial and spectral resolutions. The increasing availability of information products generated from satellite images is extending our ability to understand the patterns and dynamics of earth resource systems at all scales of inquiry. One of the most important applications is the generation of land cover classifications from satellite images for understanding the actual status of various land cover classes. The prospect for the use of satellite images in land cover classification is extremely promising. The quality of satellite images available for land-use mapping is improving rapidly with the development of advanced sensor technology. Particularly noteworthy in this regard is the improved spatial and spectral resolution of the images captured by new satellite sensors such as MODIS, ASTER, Landsat 7, and SPOT 5. For the full exploitation of increasingly sophisticated multisource data, fusion techniques are being developed. Fused images may enhance interpretation capabilities. The images used for fusion have different temporal and spatial resolutions; therefore, the fused image provides a more complete view of the observed objects. One of the main aims of image fusion is to integrate different data in order to obtain more information than can be derived from each single-sensor data set alone. A good example of this is the fusion of images acquired by different sensors having different spatial and spectral resolutions. Researchers have applied fusion techniques for three decades and have proposed various useful methods. The importance of high-quality synthesis of spectral information is well suited to and implemented for land cover classification. More recently, an underlying multiresolution analysis employing the discrete wavelet transform has been used in image fusion. It was found
Duong, T. A.
2004-01-01
In this paper, we present a new, simple, and hardware-optimized sequential learning technique for adaptive Principal Component Analysis (PCA), which will help optimize the hardware implementation in VLSI and overcome the difficulties of traditional gradient descent in learning convergence and hardware implementation.
Todhunter, Fern
2015-07-01
To report on the relationship between competence and confidence in nursing students as users of information and communication technologies, using principal components analysis. In nurse education, learning about and learning with information and communication technologies is well established. Nursing students are one of the undergraduate populations in higher education required to use these resources for academic work and practice learning. Previous studies showing mixed experiences influenced the choice of an exploratory study to find out about information and communication technologies competence and confidence. A 48-item survey questionnaire was administered to a volunteer sample of first- and second-year nursing students between July 2008 and April 2009. The cohort (N = 375) represented 18·75% of first- and second-year undergraduates. A comparison between this work and subsequent studies reveals some similar ongoing issues and ways to address them. A principal components analysis (PCA) was carried out to determine the strength of the correlation between information and communication technologies competence and confidence. The aim was to show the presence of any underlying dimensions in the transformed data that would explain any variations in information and communication technologies competence and confidence. Cronbach's alpha values showed fair to good internal consistency. The five-component structure gave medium to high results and explained 44·7% of the variance in the original data. Confidence had a high representation. The findings emphasized the shift towards social learning approaches for information and communication technologies. Informal social collaboration found favour with nursing students. Learning through talking, watching and listening all play a crucial role in the development of computing skills.
Wellens, H.L.L.; Kuijpers-Jagtman, A.M.
2016-01-01
OBJECTIVE: The combination of generalized Procrustes superimposition (GPS) and principal component analysis (PCA) has been hypothesized to solve some of the problems plaguing traditional cephalometry. This study demonstrates how to establish the currently unclear relationship between the shape space
Ma, Mengli; Lei, En; Meng, Hengling; Wang, Tiantao; Xie, Linyan; Shen, Dong; Xianwang, Zhou; Lu, Bingyue
2017-08-01
Amomum tsao-ko is a commercial plant used for various purposes in the medicinal and food industries. For the present investigation, 44 germplasm samples were collected from Jinping County of Yunnan Province. Cluster analysis and two-dimensional principal component analysis (PCA) were used to represent the genetic relations among Amomum tsao-ko using simple sequence repeat (SSR) markers. Clustering analysis clearly distinguished the sample groups. Two major clusters were formed: the first (Cluster I) consisted of 34 individuals and the second (Cluster II) of 10 individuals, with Cluster I as the main group containing multiple sub-clusters. PCA also showed 2 groups: PCA Group 1 included 29 individuals and PCA Group 2 included 12 individuals, consistent with the results of the cluster analysis. The purpose of the present investigation was to provide information on the genetic relationships of Amomum tsao-ko germplasm resources in the main producing areas, and to provide a theoretical basis for the protection and utilization of Amomum tsao-ko resources.
Energy Technology Data Exchange (ETDEWEB)
Igwenagu, C.M. [Enugu State University of Science and Technology (Nigeria). Dept. of Industrial Mathematics, Applied Statistics and Demography
2011-07-01
This study has examined the position of Nigeria in relation to carbon dioxide (CO{sub 2}) emission in readiness for emission trading as proposed in the Kyoto protocol as a measure of reducing global warming. It was found that Nigeria emits only 0.4% of the world's total CO{sub 2}, indicating that it will be a possible seller of emissions as contemplated in the Kyoto protocol. Fifty countries were selected for the analysis and some possible correlates of CO{sub 2} were considered. Correlation analysis and principal component analysis revealed that gross domestic product and industrial output accounted for 93% of the total variation. This reflects the very low level of economic activity being experienced in the country.
Silveira, Landulfo, Jr.; Silveira, Fabrício L.; Bodanese, Benito; Pacheco, Marcos Tadeu T.; Zângaro, Renato A.
2012-02-01
This work demonstrated the discrimination between basal cell carcinoma (BCC) and normal human skin in vivo using near-infrared Raman spectroscopy. Spectra were obtained from the suspected lesion prior to resectional surgery. After tissue removal, biopsy fragments were submitted for histopathology. Spectra were also obtained from the adjacent, clinically normal skin. Raman spectra were measured using a Raman spectrometer (830 nm) with a fiber Raman probe. By comparing the mean spectra of BCC with those of normal skin, important differences were found in the 800-1000 cm-1 and 1250-1350 cm-1 regions (C-C and amide III vibrations, respectively, from lipids and proteins). A discrimination algorithm based on Principal Components Analysis and Mahalanobis distance (PCA/MD) could discriminate the spectra of both tissues with high sensitivity and specificity.
DEFF Research Database (Denmark)
Skov, V.; Thomassen, Mads; Kruse, T.A.
2012-01-01
The recent discovery of the Janus activating kinase 2 V617F mutation in most patients with polycythemia vera (PV) and half of those with essential thrombocythemia (ET) and primary myelofibrosis (PMF) has favored the hypothesis of a biological continuum from ET over PV to PMF. We performed gene...... expression profiling of whole blood from control subjects (n = 21) and patients with ET (n = 19), PV (n = 41), and PMF (n = 9) using DNA microarrays. Applying an unsupervised method, principal component analysis, to search for patterns in the data, we demonstrated a separation of the four groups...... with biological relevant overlaps between the different entities. Moreover, the analysis separates Janus activating kinase 2-negative ET patients from Janus activating kinase 2-positive ET patients. Functional annotation analysis demonstrates that clusters of gene ontology terms related to inflammation, immune...
Directory of Open Access Journals (Sweden)
Othman Nasri
2015-01-01
Full Text Available This paper presents a fault detection and isolation (FDI) approach to detect and isolate actuator (thruster and reaction wheel) faults of an autonomous spacecraft involved in the rendez-vous phase of the Mars Sample Return (MSR) mission. Principal component analysis (PCA) has been adopted to estimate the relationships between the various variables of the process. To ensure the feasibility of the proposed FDI approach, a set of data provided by the industrial "high-fidelity" simulator of the MSR, representing the openings of the spacecraft thrusters and the rotation rates of the reaction wheels, has been considered. The test results demonstrate that fault detection and isolation are successfully accomplished.
Šamec, Dunja; Maretić, Marina; Lugarić, Ivana; Mešić, Aleksandar; Salopek-Sondi, Branka; Duralija, Boris
2016-03-01
The worldwide established strawberry cultivar 'Albion' and three recently introduced cultivars in Europe: 'Monterey', 'Capri', and 'Murano', grown hydroponically, were studied to ascertain the influence of cultivar and harvesting date on the physical, chemical, antioxidant and phytochemical properties of their fruits. Interrelationships of investigated parameters and these cultivars were investigated by the statistical approach of principal component analysis (PCA). Results indicated that cultivar had a more significant effect on the analyzed parameters than harvesting date. Thus grouping of the variables in a PCA plot indicated that each cultivar has specific characteristics important for consumer or industrial use. Cultivar 'Monterey' was the richest in phytochemical contents and consequently in antioxidant activity, 'Albion' showed the highest contents of total soluble solids, titratable acidity content and ascorbic acid, 'Capri' had the highest value of firmness, while 'Murano' had lighter color in comparison to others. Potential use of these cultivars has been assessed according to these important measured attributes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Septa Firmansyah Putra
2017-01-01
Full Text Available This study aims to determine which attributes should be used for clustering the provinces of Indonesia based on disaster-preparedness factors. The data used consist of three groups: data on the number of disaster occurrences, comprising 19 sub-attributes; data on the number of health facilities, comprising 14 sub-attributes; and data on the number of health workers, comprising 11 sub-attributes. This study can serve as an illustration of how to clean and select data before using them in a clustering process. These data were cleaned and selected before later use in clustering. The cleaning and selection process was carried out with the help of PCA (Principal Component Analysis), after an initial manual cleaning. The study was divided into three experiments: the first yielded 31 sub-attributes ready for use, the second 29 sub-attributes, and the third 24 sub-attributes.
Directory of Open Access Journals (Sweden)
Shaukat S. Shahid
2016-06-01
Full Text Available In this study, we used bootstrap simulation of a real data set to investigate the impact of sample size (N = 20, 30, 40 and 50) on the eigenvalues and eigenvectors resulting from principal component analysis (PCA). For each sample size, 100 bootstrap samples were drawn from an environmental data matrix of water quality variables (p = 22) from a small data set comprising 55 samples (stations) from which water samples were collected. Because data sets in ecology and environmental sciences are invariably small owing to the high cost of collecting and analyzing samples, we restricted our study to relatively small sample sizes. We focused attention on comparison of the first 6 eigenvectors and the first 10 eigenvalues. Data sets were compared using agglomerative cluster analysis with Ward's method, which does not require any stringent distributional assumptions.
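The bootstrap procedure described above can be sketched roughly as follows, using fabricated stand-in data with the same 55 x 22 shape (the real water-quality matrix is not reproduced here, and only the top eigenvalue is tracked):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative stand-in for the water-quality matrix: 55 samples x 22
# correlated variables generated from 3 latent factors plus noise.
n, p = 55, 22
latent = rng.standard_normal((n, 3))
loadings = rng.standard_normal((3, p))
X = latent @ loadings + 0.5 * rng.standard_normal((n, p))

def pca_eigenvalues(data):
    """Eigenvalues of the correlation matrix (PCA on standardized data),
    sorted in descending order."""
    return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

def bootstrap_top_eigenvalue(data, sample_size, n_boot=100):
    """Draw n_boot bootstrap samples of the given size (with replacement)
    and collect the largest PCA eigenvalue from each."""
    tops = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(data), size=sample_size)
        tops.append(pca_eigenvalues(data[idx])[0])
    return np.array(tops)

tops_20 = bootstrap_top_eigenvalue(X, 20)   # small-sample behaviour
tops_50 = bootstrap_top_eigenvalue(X, 50)   # near-full-sample behaviour
```

Comparing the spread of `tops_20` against `tops_50` illustrates the study's core question: how unstable eigenvalue estimates become at the small sample sizes typical of environmental data.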
Schievano, Elisabetta; Pasini, Gabriella; Cozzi, Giulio; Mammi, Stefano
2008-08-27
In the present work, a rapid and simple NMR method to discriminate Asiago d'Allevo cheese samples from different production chains is described. A fast and reproducible extraction of the organic fraction was employed. By applying chemometric analysis to NMR data, it is possible to differentiate PDO Asiago cheese produced in alpine farms from that produced in lowland and mountain industrialized factories. PCA of both (1)H and (13)C NMR spectra showed a good separation of alpine farm products from the other ones, whereas the lowland and mountain industrialized cheeses are undistinguishable. The samples were differentiated on the basis of a higher content of unsaturated fatty acids, principally oleic, linoleic, linolenic, and conjugated linoleic acids for the alpine farm cheeses and a higher content of saturated fatty acids for the industrialized products. Conjugated linoleic acid and 1-pentene are also discriminating components.
Lima, Marcelo A.; Rudd, Timothy R.; de Farias, Eduardo H. C.; Ebner, Lyvia F.; Gesteira, Tarsis F.; de Souza, Lauro M.; Mendes, Aline; Córdula, Carolina R.; Martins, João R. M.; Hoppensteadt, Debra; Fareed, Jawed; Sassaki, Guilherme L.; Yates, Edwin A.; Tersariol, Ivarne L. S.; Nader, Helena B.
2011-01-01
The year 2007 was marked by widespread adverse clinical responses to heparin use, leading to a global recall of potentially affected heparin batches in 2008. Several analytical methods have since been developed to detect impurities in heparin preparations; however, many are costly and dependent on instrumentation with only limited accessibility. A method based on a simple UV-scanning assay, combined with principal component analysis (PCA), was developed to detect impurities, such as glycosaminoglycans, other complex polysaccharides and aromatic compounds, in heparin preparations. Results were confirmed by NMR spectroscopy. This approach provides an additional, sensitive tool to determine heparin purity and safety, even when NMR spectroscopy failed, requiring only standard laboratory equipment and computing facilities. PMID:21267460
Soeiro, Bruno T; Boen, Thaís R; Wagner, Roger; Lima-Pallone, Juliana A
2009-01-01
The aim of the present work was to determine parameters of the corn and wheat flour matrix, such as protein, lipid, moisture, ash and carbohydrate, folic acid and iron contents. Three principal components explained 91% of the total variance. Wheat flours were characterized by high protein and moisture contents, whereas the corn flours had higher carbohydrate, lipid and folic acid levels. The concentrations of folic acid in wheat flours were lower than the issued value, while corn flours presented extremely high values. The iron concentration was higher than that recommended in Brazilian legislation. Poor homogenization of folic acid and iron was observed in the enriched flours. This study could be useful to governmental authorities in the evaluation of enriched-food programs.
de Siqueira e Oliveira, Fernanda SantAna; Giana, Hector Enrique; Silveira, Landulfo
2012-10-01
A method, based on Raman spectroscopy, for identification of different microorganisms involved in bacterial urinary tract infections has been proposed. Spectra were collected from different bacterial colonies (Gram-negative: Escherichia coli, Klebsiella pneumoniae, Proteus mirabilis, Pseudomonas aeruginosa and Enterobacter cloacae, and Gram-positive: Staphylococcus aureus and Enterococcus spp.), grown on culture medium (agar), using a Raman spectrometer with a fiber Raman probe (830 nm). Colonies were scraped from the agar surface and placed on an aluminum foil for Raman measurements. After preprocessing, spectra were submitted to a principal component analysis and Mahalanobis distance (PCA/MD) discrimination algorithm. We found that the mean Raman spectra of different bacterial species show similar bands, and S. aureus was well characterized by strong bands related to carotenoids. PCA/MD could discriminate Gram-positive bacteria with sensitivity and specificity of 100% and Gram-negative bacteria with sensitivity ranging from 58 to 88% and specificity ranging from 87% to 99%.
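The PCA/Mahalanobis-distance (PCA/MD) discrimination idea used above can be sketched on toy two-class data. The spectral shapes, class sizes, and the choice of three PCs are illustrative assumptions, not the authors' bacterial spectra or exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy two-class "spectra": Gaussian bands at different positions plus noise.
n, d = 60, 100
grid = np.linspace(0, 1, d)
class_a = np.exp(-((grid - 0.3) ** 2) / 0.01) + 0.1 * rng.standard_normal((n, d))
class_b = np.exp(-((grid - 0.6) ** 2) / 0.01) + 0.1 * rng.standard_normal((n, d))
X = np.vstack([class_a, class_b])
y = np.repeat([0, 1], n)

# Project onto the leading PCs of the mean-centered data.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Xc @ Vt[:3].T

def mahalanobis_sq(points, ref):
    """Squared Mahalanobis distance of each point to the reference class,
    using that class's mean and covariance in PC space."""
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))
    diff = points - mu
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

# Assign each sample to the class with the smaller Mahalanobis distance.
d0 = mahalanobis_sq(T, T[y == 0])
d1 = mahalanobis_sq(T, T[y == 1])
pred = (d1 < d0).astype(int)
accuracy = (pred == y).mean()
```

Unlike a plain Euclidean rule, the Mahalanobis distance accounts for each class's own covariance in PC space, which is useful when class score clouds differ in shape.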
International Nuclear Information System (INIS)
Burns, W.A.; Mankiewicz, P.J.; Bence, A.E.; Page, D.S.; Parker, K.R.
1997-01-01
A method was developed to allocate polycyclic aromatic hydrocarbons (PAHs) in sediment samples to the PAH sources from which they came. The method uses principal-component analysis to identify possible sources and a least-squares model to find the source mix that gives the best fit of 36 PAH analytes in each sample. The method identified 18 possible PAH sources in a large set of field data collected in Prince William Sound, Alaska, USA, after the 1989 Exxon Valdez oil spill, including diesel oil, diesel soot, spilled crude oil in various weathering states, natural background, creosote, and combustion products from human activities and forest fires. Spill oil was generally found to be a small increment of the natural background in subtidal sediments, whereas combustion products were often the predominant sources for subtidal PAHs near sites of past or present human activity. The method appears to be applicable to other situations, including other spills
Fang, Leyuan; Wang, Chong; Li, Shutao; Yan, Jun; Chen, Xiangdong; Rabbani, Hossein
2017-11-01
We present an automatic method, termed the principal component analysis network with composite kernel (PCANet-CK), for the classification of three-dimensional (3-D) retinal optical coherence tomography (OCT) images. Specifically, the proposed PCANet-CK method first utilizes the PCANet to automatically learn features from each B-scan of the 3-D retinal OCT images. Then, multiple kernels are separately applied to a set of very important features of the B-scans and these kernels are fused together, which can jointly exploit the correlations among features of the 3-D OCT images. Finally, the fused (composite) kernel is incorporated into an extreme learning machine for the OCT image classification. We tested our proposed algorithm on two real 3-D spectral domain OCT (SD-OCT) datasets (of normal subjects and subjects with macular edema and age-related macular degeneration), which demonstrated its effectiveness. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Generalized principal component analysis
Vidal, René; Sastry, S S
2016-01-01
This book provides a comprehensive introduction to the latest advances in the mathematical theory and computational tools for modeling high-dimensional data drawn from one or multiple low-dimensional subspaces (or manifolds) and potentially corrupted by noise, gross errors, or outliers. This challenging task requires the development of new algebraic, geometric, statistical, and computational methods for efficient and robust estimation and segmentation of one or multiple subspaces. The book also presents interesting real-world applications of these new methods in image processing, image and video segmentation, face recognition and clustering, and hybrid system identification. This book is intended to serve as a textbook for graduate students and beginning researchers in data science, machine learning, computer vision, image and signal processing, and systems theory. It contains ample illustrations, examples, and exercises and is made largely self-contained with three Appendices which survey basic concepts ...
Directory of Open Access Journals (Sweden)
Tomáš Fekete
2016-05-01
The subject of this work was to examine differences in the chemical composition of sliced and whole-stick Nitran salamis purchased from various manufacturers. Nitran salamis are traditional dry fermented meat products of Slovak origin. Given variations in raw materials, production processes and potential adulteration, differences in chemical composition within one brand of salami from different manufacturers might be expected. Ten salamis were analysed for basic chemical composition attributes, and Principal Component Analysis was applied to the data matrix to identify anomalous ones. Six attributes, namely protein without collagen of total protein, total protein, total meat, total fat, collagen of total protein and NaCl, were the most important, as the first two Principal Components together explained 70.16% of the variance among the salamis. Nitran D was found to be the most anomalous salami, as it had the lowest values of protein without collagen of total protein (14.14% ±0.26%), total protein (17.42% ±0.44%) and total meat (120.29% ±0.98%), and the highest values of total fat (50.85% ±0.95%), collagen of total protein (18.83% ±0.50%) and NaCl (9.55% ±1.93%), when compared to its whole-stick variant Nitran C and the other samples. In addition, regarding collagen of total protein content, Nitran D together with Nitran A, F and H did not satisfy the legislatively determined criterion of ≤16%. This suggests that extra connective tissue, commonly added in the meat industry to increase the protein content or water-binding properties of meat products, was added to the intermediate products, which resulted in high variability and inferior quality of the final products.
Iqbal, Abdullah; Valous, Nektarios A; Sun, Da-Wen; Allen, Paul
2011-02-01
Lacunarity is about quantifying the degree of spatial heterogeneity in the visual texture of imagery through the identification of the relationships between patterns and their spatial configurations in a two-dimensional setting. The computed lacunarity data can designate a mathematical index of spatial heterogeneity, therefore the corresponding feature vectors should possess the necessary inter-class statistical properties that would enable them to be used for pattern recognition purposes. The objective of this study was to construct a supervised parsimonious classification model of binary lacunarity data, computed by Valous et al. (2009), from pork ham slice surface images, with the aid of kernel principal component analysis (KPCA) and artificial neural networks (ANNs), using a portion of informative salient features. At first, the dimension of the initial space (510 features) was reduced by 90% in order to avoid any noise effects in the subsequent classification. Then, using KPCA, the first nineteen kernel principal components (99.04% of total variance) were extracted from the reduced feature space and used as input to the ANN. An adaptive feedforward multilayer perceptron (MLP) classifier was employed to obtain a suitable mapping from the input dataset. The correct classification percentages for the training, test and validation sets were 86.7%, 86.7%, and 85.0%, respectively. The results confirm that the classification performance was satisfactory. The binary lacunarity spatial metric captured relevant information that provided a good level of differentiation among pork ham slice images. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
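As a rough sketch of the kernel-PCA stage of such a pipeline (not the paper's 510-feature lacunarity data; the two-ring toy set and the RBF bandwidth below are invented for illustration), kernel PCA can be implemented by eigendecomposing a double-centered Gram matrix:

```python
import numpy as np

def rbf_kernel_pca(X, gamma, n_components):
    """Kernel PCA: eigendecompose the double-centered RBF Gram matrix."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one    # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    # Projections of the training points onto the kernel principal components.
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 1e-12))

# Two concentric rings: not linearly separable in the input space, but an
# RBF kernel can unfold them so that a few kernel PCs expose the structure.
rng = np.random.default_rng(1)
t = rng.uniform(0.0, 2.0 * np.pi, 100)
radius = np.where(np.arange(100) < 50, 0.3, 1.5)
X = np.c_[radius * np.cos(t), radius * np.sin(t)]

Z = rbf_kernel_pca(X, gamma=2.0, n_components=2)
print(Z.shape)  # (100, 2)
```

The double centering subtracts the feature-space mean, so each extracted component is zero-mean over the training set; components are then ranked by the eigenvalues of the centered Gram matrix, just as ordinary PCA ranks by covariance eigenvalues.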
Hartmann, Susanne; Missimer, John H; Stoeckel, Cornelia; Abela, Eugenio; Shah, Jon; Seitz, Rüdiger J; Weder, Bruno J
2008-01-01
Tactile object discrimination is an essential human skill that relies on functional connectivity between the neural substrates of motor, somatosensory and supramodal areas. From a theoretical point of view, such distributed networks elude categorical analysis because subtraction methods are univariate. Thus, the aim of this study was to identify the neural networks involved in somatosensory object discrimination using a voxel-based principal component analysis (PCA) of event-related functional magnetic resonance images. Seven healthy, right-handed subjects aged between 22 and 44 years were required to discriminate with their dominant hand the length differences between otherwise identical parallelepipeds in a two-alternative forced-choice paradigm. Of the 34 principal components retained for analysis according to the 'bootstrapped' Kaiser-Guttman criterion, t-tests applied to the subject-condition expression coefficients showed significant mean differences between the object presentation and inter-stimulus phases in PC 1, 3, 26 and 32. Specifically, PC 1 reflected object exploration or manipulation, PC 3 somatosensory and short-term memory processes. PC 26 evinced the perception that certain parallelepipeds could not be distinguished, while PC 32 emerged in those choices when they could be. Among the cerebral regions evident in the PCs are the left posterior parietal lobe and premotor cortex in PC 1, the left superior parietal lobule (SPL) and the right cuneus in PC 3, the medial frontal and orbitofrontal cortex bilaterally in PC 26, and the right intraparietal sulcus, anterior SPL and dorsolateral prefrontal cortex in PC 32. The analysis provides evidence for the concerted action of large-scale cortico-subcortical networks mediating tactile object discrimination. Parallel to activity in nodes processing object-related impulses we found activity in key cerebral regions responsible for subjective assessment and validation.
Jung, Brian C; Choi, Soo I; Du, Annie X; Cuzzocreo, Jennifer L; Geng, Zhuo Z; Ying, Howard S; Perlman, Susan L; Toga, Arthur W; Prince, Jerry L; Ying, Sarah H
2012-12-01
Although "cerebellar ataxia" is often used in reference to a disease process, presumably there are different underlying pathogenetic mechanisms for different subtypes. Indeed, spinocerebellar ataxia (SCA) types 2 and 6 demonstrate complementary phenotypes, thus predicting a different anatomic pattern of degeneration. Here, we show that an unsupervised classification method, based on principal component analysis (PCA) of cerebellar shape characteristics, can be used to separate SCA2 and SCA6 into two classes, which may represent disease-specific archetypes. Patients with SCA2 (n=11) and SCA6 (n=7) were compared against controls (n=15) using PCA to classify cerebellar anatomic shape characteristics. Within the first three principal components, SCA2 and SCA6 differed from controls and from each other. In a secondary analysis, we studied five additional subjects and found that these patients were consistent with the previously defined archetypal clusters of clinical and anatomical characteristics. Secondary analysis of five subjects with related diagnoses showed that disease groups that were clinically and pathophysiologically similar also shared similar anatomic characteristics. Specifically, Archetype #1 consisted of SCA3 (n=1) and SCA2, suggesting that cerebellar syndromes accompanied by atrophy of the pons may be associated with a characteristic pattern of cerebellar neurodegeneration. In comparison, Archetype #2 was comprised of disease groups with pure cerebellar atrophy (episodic ataxia type 2 (n=1), idiopathic late-onset cerebellar ataxias (n=3), and SCA6). This suggests that cerebellar shape analysis could aid in discriminating between different pathologies. Our findings further suggest that magnetic resonance imaging is a promising imaging biomarker that could aid in the diagnosis and therapeutic management in patients with cerebellar syndromes.
Directory of Open Access Journals (Sweden)
Viktor KROTOV
2009-01-01
Application of the method of principal components to the analysis of the load-bearing ability of a car wheel pair. This article justifies the use of the method of principal components for analysing the calculated stress-strain state of the wheel pair of a freight car. The statistical estimation method yields mathematical models for studying the reliability parameters of the wheel pair under changing load parameters.
Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal
2017-07-01
To improve functionality while manufacturing products at low cost, several innovative additive manufacturing processes have been developed. Fused deposition modeling (FDM) is one of the most prominent additive manufacturing processes for producing plastic products. Regarding quality and functionality, many FDM process conditions contribute to the occurrence of poor quality and functionality in FDM-manufactured products. Therefore, the effect of the build parameters on the functionality of FDM-produced products needs to be examined. In this study, an attempt has been made to investigate the critical processing parameters affecting the dynamic mechanical response of plastic parts processed by FDM technology, using the IV-optimal response surface method coupled with principal component analysis. The results obtained from this study can serve as a guide for selecting appropriate process conditions before manufacturing takes place.
Handbook of microwave component measurements with advanced VNA techniques
Dunsmore, Joel P
2012-01-01
This book provides state-of-the-art coverage for making measurements on RF and Microwave Components, both active and passive. A perfect reference for R&D and Test Engineers, with topics ranging from the best practices for basic measurements, to an in-depth analysis of errors, correction methods, and uncertainty analysis, this book provides everything you need to understand microwave measurements. With primary focus on active and passive measurements using a Vector Network Analyzer, these techniques and analysis are equally applicable to measurements made with Spectrum Analyzers or Noise Figure
Otero, Federico; Norte, Federico; Araneo, Diego
2018-01-01
The aim of this work is to obtain an index for predicting the probability of occurrence of a zonda event at surface level from sounding data at Mendoza city, Argentina. To accomplish this goal, surface zonda wind events were first identified with an objective classification method (OCM) considering only the surface station values. Once the dates and onset time of each event were obtained, the closest prior sounding for each event was used in a principal component analysis (PCA) to identify the leading patterns of the vertical structure of the atmosphere preceding a zonda wind event. These components were used to construct the index model. For the PCA, an input matrix of temperature (T) and dew point temperature (Td) anomalies for the standard levels between 850 and 300 hPa was built. The analysis yielded six significant components explaining 94% of the variance, and the leading patterns of weather conditions favorable for the development of the phenomenon were obtained. A zonda/non-zonda indicator c can be estimated by logistic multiple regression on the PCA component loadings, determining a zonda probability index ĉ calculable from T and Td profiles and dependent on the climatological features of the region. The index showed 74.7% efficiency. The same analysis was performed by adding surface values of T and Td from the Mendoza Aero station, increasing the index efficiency to 87.8%. The results revealed four significantly correlated PCs, with a major improvement in differentiating zonda cases and a reduction of the uncertainty interval.
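The two-stage construction above (PCA on the anomaly profiles, then a logistic regression on the component scores to obtain a probability index) can be sketched on synthetic data. The latent-pattern setup and all numbers below are invented, not the Mendoza soundings:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-np.clip(v, -30.0, 30.0)))

rng = np.random.default_rng(2)
# Synthetic sounding anomalies: 200 profiles x 12 variables (T and Td at
# six pressure levels), driven by three latent weather patterns.
latent = rng.normal(size=(200, 3))
X = latent @ rng.normal(size=(3, 12)) + 0.1 * rng.normal(size=(200, 12))
# Hypothetical rule: "zonda" events follow the first two latent patterns.
y = (latent[:, 0] - latent[:, 1] + 0.2 * rng.normal(size=200) > 0).astype(float)

# Stage 1: PCA on the anomaly matrix; keep the three leading components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T

# Stage 2: logistic regression on the PC scores (plain gradient descent).
Z = np.c_[np.ones(len(scores)), scores]
w = np.zeros(Z.shape[1])
for _ in range(3000):
    w -= 0.1 * Z.T @ (sigmoid(Z @ w) - y) / len(y)

c_hat = sigmoid(Z @ w)                  # probability index for each profile
hit_rate = ((c_hat > 0.5) == (y > 0.5)).mean()
print("training hit rate:", round(float(hit_rate), 2))
```

Because the PCs are estimated from the predictor matrix alone, the regression stage only sees a handful of decorrelated scores, which is what keeps the index stable when the original anomaly variables are strongly correlated across pressure levels.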
Directory of Open Access Journals (Sweden)
İSTEM KÖYMEN KESER
2013-06-01
The objective of Functional Data Analysis techniques is to study data consisting of observed functions or curves evaluated at a finite subset of some real interval. Techniques in Functional Data Analysis can be used to study the variation in a random sample of real functions, xi(t), i=1, 2, …, N, and their derivatives. In practice, these functions are often the result of a preliminary smoothing process applied to discrete data, and in this work spline smoothing methods are used. As the number of functions and the number of observation points increase, it becomes difficult to handle the functions all together. To overcome this complexity, we utilize Functional and Regularized Functional Principal Component Analyses, in which a high percentage of the total variation can be accounted for with only a few component functions. Finally, an application to the daily closing share prices of the companies belonging to the ISE-30 index is given.
Directory of Open Access Journals (Sweden)
Adu-Sarkodie Yaw
2010-07-01
Abstract. Background: The socioeconomic and sociodemographic situation are important components for the design and assessment of malaria control measures. In malaria endemic areas, however, valid classification of socioeconomic factors is difficult due to the lack of standardized tax and income data. The objective of this study was to quantify household socioeconomic levels by applying principal component analysis (PCA) to a set of indicator variables and to use a classification scheme for the multivariate analysis of children under 15 years of age presenting with and without malaria to the outpatient department of a rural hospital. Methods: In total, 1,496 children presenting to the hospital were examined for malaria parasites and interviewed with a standardized questionnaire. The information from eleven indicators of the family's housing situation was reduced by PCA to a socioeconomic score, which was then classified into three socioeconomic statuses (poor, average and rich). Their influence on malaria occurrence was analysed together with malaria risk co-factors, such as sex, the parents' educational and ethnic background, the number of children living in a household, applied malaria protection measures, place of residence, and the age of the child and the mother. Results: The multivariate regression analysis demonstrated that the proportion of children with malaria decreased with increasing socioeconomic status as classified by PCA (p < 0.05). Conclusions: The socioeconomic situation is significantly associated with malaria even in holoendemic rural areas where economic differences are not pronounced. Valid classification of the socioeconomic level is crucial if it is to be considered as a confounder in intervention trials and in the planning of malaria control measures.
Directory of Open Access Journals (Sweden)
Golzarian Mahmood R
2011-09-01
Wheat is one of the most important crops in Australia, and the identification of young plants is an important step towards developing an automated system for monitoring crop establishment and for differentiating crop from weeds. In this paper, a framework to differentiate early narrow-leaf wheat from two common weeds based on their digital images is developed. A combination of colour, texture and shape features is used. These features are reduced to three descriptors using Principal Component Analysis. The three components provide an effective and significant means for distinguishing the three grasses. Further analysis enables threshold levels to be set for the discrimination of the plant species. The PCA model was evaluated on an independent data set of plants, and the results show accuracies of 88% and 85% in the differentiation of ryegrass and brome grass from wheat, respectively. The outcomes of this study can contribute new knowledge to the development of computer vision systems for automated weed management.
International Nuclear Information System (INIS)
Villalobos Chaves, Alberto E.
2006-01-01
Principal Components Analysis (PCA) was applied to determine the origin of vodka samples. The analytical parameters used were: the alcoholic degree; the difference between the experimental alcoholic degree and that declared on the label; the dried extract; the relative intensities of the atomic emission of calcium (peak area at 422.67 nm), sodium (sum of peak areas at 588.99 and 589.59 nm) and potassium (sum of peak areas at 766.49 and 769.89 nm); and the ultraviolet absorbance at 200 nm. K-means clustering was used. The working hypothesis was that the sample consisted of approximately two large natural groupings, namely national vodkas and foreign vodkas. Applying this procedure showed that the samples were indeed distinguishable by national or foreign origin into two groups whose 95% confidence ellipses did not overlap, even when the alcoholic degree and the alcoholic degree difference variables were eliminated. (author)
International Nuclear Information System (INIS)
Saveleva, E. I.; Koryagina, N. L.; Radilov, A. S.; Khlebnikova, N. S.; Khrustaleva, V. S.
2007-01-01
A package of chemical analytical procedures was developed for the detection of products indicative of the presence of dumped chemical weapons in the Baltic Sea. The principal requirements imposed upon the procedures were high sensitivity, reliable identification of target compounds, a wide range of components covered by survey analysis, and the absence of interference from sea salts. Thiodiglycol, a hydrolysis product of sulfur mustard reportedly always detected at the sites of chemical weapon dumping in the Baltic Sea, was considered the principal marker. We developed a high-sensitivity procedure for the determination of thiodiglycol in sea water, involving evaporation of samples to dryness in a vacuum concentrator, followed by tert-butyldimethylsilylation of the residue and GC-MS analysis in the SIM mode with meta-fluorobenzoic acid as internal reference. The detection limit for thiodiglycol was 0.001 mg/l, and the procedure throughput was up to 30 samples per day. The same procedure, but with BSTFA instead of MTBSTFA as the derivatizing agent, was used to prepare samples for survey analysis of nonvolatile components; in this case, full mass spectra were measured in the GC-MS analysis. The use of BSTFA was motivated by the fact that trimethylsilyl derivatives are much more widely represented in electronic mass spectral databases. The identification of sulfur mustard, volatile transformation products of sulfur mustard and lewisite, as well as chloroacetophenone in sea water was performed by GC-MS in combination with SPME. The survey GC-MS analysis focused on the identification of volatile and nonvolatile toxic chemicals whose mass spectra are included in the OPCW database (3,219 toxic chemicals, precursors and transformation products) with the use of AMDIS software (version 2.62). Using two GC-MS instruments, we could perform the survey analysis of volatile and nonvolatile components for up to 20 samples per day. Thus, the package of three procedures
Glass, plastic, and semiconductors: packaging techniques for miniature optoelectronic components
Pocha, Michael D.; Garrett, Henry E.; Patel, Rajesh R.; Jones, Leslie M., III; Larson, Michael C.; Emanuel, Mark A.; Bond, Steven W.; Deri, Robert J.; Drayton, R. F.; Petersen, Holly E.; Lowry, Mark E.
2000-03-01
At Lawrence Livermore National Laboratory, we have extensive experience with the design and development of miniature photonic systems which require novel packaging schemes. Over the years we have developed silicon micro-optical benches to serve as a stable platform for precision mounting of optical and electronic components. We have developed glass ball lenses that can be fabricated in-situ on the microbench substrate. We have modified commercially available molded plastic fiber ribbon connectors (MT) and added thin film multilayer semiconductor coatings to create potentially low-cost wavelength combiners and wavelength selective filters. We have fabricated both vertical-cavity and in-plane semiconductor lasers and amplifiers, and have packaged these and other components into several miniature photonics systems. For example, we have combined the silicon optical bench with standard electronic packaging techniques and our custom-made wavelength-selective filters to develop a four-wavelength wavelength-division-multiplexing transmitter module mounted in a standard 120-pin ceramic PGA package that couples light from several vertical-cavity-surface-emitting-laser arrays into one multimode fiber-ribbon array. The coupling loss can be as low as 2 dB, and the transmitters can be operated at over 1.25 GHz. While these systems were not designed for biomedical or environmental applications, the concepts and techniques are general and widely applicable.
Directory of Open Access Journals (Sweden)
Moradeyo Adebanjo Otitoju
2016-12-01
This study focused on the constraints to the use of climate variability/change adaptation strategies in South-west Nigeria. A multistage random sampling technique was employed to select the location and the respondents. Descriptive statistics and principal component analysis (PCA) were the analytical tools used in this study. Previous examinations of the constraints to climate variability and change adaptation used generalized factor analysis rather than PCA; hence the need to examine these constraints extensively using PCA. Uncovering the constraints to the use of climate variability/change adaptation strategies among crop farmers is important to give a realistic direction to the development of farmer-inclusive climate policies in Nigeria. The PCA result showed that the principal constraints farmers faced in climate change adaptation were: public, institutional and labour constraints; land, neighbourhood norms and religious belief constraints; high input cost, technological and information constraints; farm distance, access to climate information, off-farm job and credit constraints; and poor agricultural programmes and service delivery constraints. These findings point to the need for both government and non-governmental organizations to intensify efforts on institutional, technological and farmer-friendly land tenure and information systems as effective measures to guide inclusive climate change adaptation policies and development in South-west Nigeria.
Directory of Open Access Journals (Sweden)
Chen Shih-Wei
2011-11-01
Abstract. Background: The computer-aided identification of specific gait patterns is an important issue in the assessment of Parkinson's disease (PD). In this study, a computer vision-based gait analysis approach is developed to assist the clinical assessment of PD with kernel-based principal component analysis (KPCA). Method: Twelve PD patients and twelve healthy adults with no neurological history or motor disorders within the past six months were recruited and separated according to their "Non-PD", "Drug-On", and "Drug-Off" states. The participants were asked to wear light-colored clothing and perform three walking trials through a corridor decorated with a navy curtain at their natural pace. The participants' gait performance during the steady-state walking period was captured by a digital camera for gait analysis. The collected walking image frames were then transformed into binary silhouettes for noise reduction and compression. Using the developed KPCA-based method, features within the binary silhouettes can be extracted to quantitatively determine the gait cycle time, stride length, walking velocity, and cadence. Results and Discussion: The KPCA-based method uses a feature-extraction approach, which was verified to be more effective than traditional image-area and principal component analysis (PCA) approaches in classifying "Non-PD" controls and "Drug-Off/On" PD patients. Encouragingly, this method has a high accuracy rate, 80.51%, for recognizing different gaits. Quantitative gait parameters were obtained, and the power spectra of the patients' gaits were analysed. We show that the slow and irregular actions of PD patients during walking tend to transfer some of the power from the main lobe frequency to a lower frequency band. Our results indicate the feasibility of using gait performance to evaluate the motor function of patients with PD. Conclusion: This KPCA-based method requires only a digital camera and a decorated corridor setup
Rodrigue, Christine M.
2011-01-01
This paper presents a laboratory exercise used to teach principal components analysis (PCA) as a means of surface zonation. The lab was built around abundance data for 16 oxides and elements collected by the Mars Exploration Rover Spirit in Gusev Crater between Sol 14 and Sol 470. Students used PCA to reduce 15 of these into 3 components, which,…
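A classroom-sized sketch of that reduction step: when a few underlying factors drive many correlated variables, the first few PCs absorb most of the variance. The data below are a synthetic stand-in, not the Spirit abundance measurements; the three "geochemical factors" are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# Stand-in for the abundance table: 100 samples x 15 variables generated
# from 3 underlying factors plus a little measurement noise (hypothetical).
factors = rng.normal(size=(100, 3))
X = factors @ rng.normal(size=(3, 15)) + 0.05 * rng.normal(size=(100, 15))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)       # correlation-matrix PCA
eigvals = np.linalg.eigvalsh(np.cov(Xs, rowvar=False))[::-1]
explained = eigvals / eigvals.sum()
print(f"variance in first 3 PCs: {explained[:3].sum():.1%}")
```

Standardizing the variables first (correlation-matrix PCA) is the usual choice when the variables are on different scales, as oxide and trace-element abundances are; otherwise the highest-variance variable dominates the first component.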
Directory of Open Access Journals (Sweden)
Papadopoulos Efthemis A
2006-06-01
Abstract. Background: In recent years there has been growing interest in Greece concerning the measurement of the satisfaction of patients visiting the outpatient clinics of National Health System (NHS) general acute hospitals. The aim of this study is therefore to develop a patient satisfaction questionnaire and provide its preliminary validation. Methods: A questionnaire in Greek was developed through literature review, researchers' on-the-spot observation and interviews. Pretesting was followed by telephone surveys in two short-term general NHS hospitals in Macedonia, Greece. A proportional stratified random sample of 285 subjects and a second random sample of 100 outpatients, drawn in March 2004, were employed for the analysis. These resulted in scale creation via Principal Components Analysis and psychometric testing for internal consistency, test-retest and inter-rater reliability, as well as construct validity. Results: Four summated scales emerged regarding the pure outpatient component of the patients' visits, namely medical examination, hospital environment, comfort and appointment time. Cronbach's alpha coefficients and Pearson, Spearman and intraclass correlations indicate a high degree of scale reliability and validity. Two other scales, lab appointment time and lab experience, capture the apparently distinct yet complementary visitor experience related to radiographic and laboratory tests. Psychometric tests are equally promising; however, some discriminant validity differences lack statistical significance. Conclusion: The instrument appears to be reliable and valid regarding the pure outpatient experience, whereas more research employing larger samples is required in order to establish the apparent psychometric properties of the complementary radiographic and laboratory-testing process, which is relevant to only about 25% of the subjects analysed here.
Krefis, Anne Caroline; Schwarz, Norbert Georg; Nkrumah, Bernard; Acquah, Samuel; Loag, Wibke; Sarpong, Nimako; Adu-Sarkodie, Yaw; Ranft, Ulrich; May, Jürgen
2010-07-13
The socioeconomic and sociodemographic situation are important components for the design and assessment of malaria control measures. In malaria endemic areas, however, valid classification of socioeconomic factors is difficult due to the lack of standardized tax and income data. The objective of this study was to quantify household socioeconomic levels by applying principal component analysis (PCA) to a set of indicator variables and to use a classification scheme for the multivariate analysis of children <15 years of age presenting with and without malaria to an outpatient department of a rural hospital. In total, 1,496 children presenting to the hospital were examined for malaria parasites and interviewed with a standardized questionnaire. The information from eleven indicators of the family's housing situation was reduced by PCA to a socioeconomic score, which was then classified into three socioeconomic statuses (poor, average and rich). Their influence on malaria occurrence was analysed together with malaria risk co-factors, such as sex, the parents' educational and ethnic background, the number of children living in a household, applied malaria protection measures, place of residence, and the age of the child and the mother. The multivariate regression analysis demonstrated that the proportion of children with malaria decreased with increasing socioeconomic status as classified by PCA (p<0.05). Other independent factors for malaria risk were the use of malaria protection measures (p<0.05), the place of residence (p<0.05), and the age of the child (p<0.05). The socioeconomic situation is significantly associated with malaria even in holoendemic rural areas where economic differences are not pronounced. Valid classification of the socioeconomic level is crucial if it is to be considered as a confounder in intervention trials and in the planning of malaria control measures.
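The asset-index construction used here (PCA on housing indicators, first-PC score cut into three strata) can be sketched as follows. The eleven binary indicators below are simulated from a latent wealth level, not the study's questionnaire data, and the indicator names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
# Simulated households: a latent wealth level drives 11 binary housing
# indicators (e.g., electricity, roof material -- hypothetical examples).
wealth = rng.normal(size=300)
indicators = (wealth[:, None] + rng.normal(size=(300, 11)) > 0).astype(float)

# First principal component of the standardized indicators = asset score.
Z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
score = Z @ vecs[:, -1]           # eigh sorts ascending: last column is PC1

# The sign of a PC is arbitrary; orient it so richer households score higher
# (here we can check against the simulated latent wealth).
if np.corrcoef(score, wealth)[0, 1] < 0:
    score = -score

# Cut the score into terciles: 0 = poor, 1 = average, 2 = rich.
status = np.digitize(score, np.quantile(score, [1/3, 2/3]))
print(np.bincount(status))        # three roughly equal strata
```

In real survey work the orientation step is done by inspecting the loadings (assets associated with wealth should load positively) rather than against a known latent variable.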
Quan, J.
2016-12-01
Near surface air temperature (Ta) is one of the most critical variables in climatology, hydrology, epidemiology and environmental health. In-situ measurements are not efficient for characterizing spatially heterogeneous Ta, while remote sensing is a powerful tool to break this limitation. This study proposes a mapping framework for daily mean Ta using an enhanced empirical regression method based on remote sensing data. It differs from previous studies in three aspects. First, nighttime light data are introduced as a predictor (besides the seven most Ta-relevant variables, i.e., land surface temperature, normalized difference vegetation index, impervious surface area, black sky albedo, normalized difference water index, elevation, and duration of daylight) considering the urbanization-induced Ta increase over a large area. Second, independent components are extracted using principal component analysis considering the correlations among the above predictors. Third, a composite sinusoidal coefficient regression is developed considering the dynamic Ta-predictor relationship. The derived coefficients are then applied back to the spatially collocated predictors to reconstruct spatio-temporal Ta. The method is applied to data from 333 weather stations in China during the 2001-2012 period. Evaluation shows an overall mean error of -0.01 K, root mean square error (RMSE) of 2.53 K, coefficient of determination (R2) of 0.96, and average uncertainty of 0.21 K. Model inter-comparison shows that this method outperforms six additional empirical regressions that do not incorporate nighttime light data, account for multi-predictor correlations, or allow for coefficient dynamics (by 0.18-2.60 K in RMSE and 0.00-0.15 in R2).
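The decorrelation step described above, extracting principal components from correlated predictors before regressing Ta on them, can be sketched as follows. The synthetic predictors and the plain least-squares fit are illustrative assumptions; the paper's composite sinusoidal coefficient scheme is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for the eight predictors (LST, NDVI, ISA, albedo,
# NDWI, elevation, daylight duration, nighttime light) at n station-days.
n, p = 500, 8
X = rng.normal(size=(n, p))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * rng.normal(size=n)  # induce collinearity
true_w = rng.normal(size=p)
Ta = X @ true_w + 0.1 * rng.normal(size=n)          # synthetic target

# Decorrelate the predictors with PCA, then fit ordinary least squares on
# the component scores instead of the raw, collinear predictors.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                                   # uncorrelated components

design = np.column_stack([np.ones(n), scores])
beta, *_ = np.linalg.lstsq(design, Ta, rcond=None)
Ta_hat = design @ beta
rmse = np.sqrt(np.mean((Ta - Ta_hat) ** 2))
```

Because the component scores are mutually uncorrelated, the regression coefficients can be estimated stably even when the raw predictors are strongly collinear.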
Directory of Open Access Journals (Sweden)
Brian Becker
2009-08-01
Mapping species composition is a focus of the wetland science community as this information will substantially enhance assessment and monitoring abilities. Hyperspectral remote sensing has been utilized as a cost-efficient approach. While hyperspectral instruments can record hundreds of contiguous narrow bands, much of the data are redundant and/or provide no increase in utility for distinguishing objects. Knowledge of the optimal bands allows users to efficiently focus on bands that provide the most information, and several data reduction tools are available. The objective of this Communication was to evaluate Principal Components Analysis (PCA) for identifying optimal bands to discriminate wetland plant species. In-situ hyperspectral reflectance measurements were obtained for thirty-five species in two diverse Great Lakes wetlands. PCA was executed on a suite of categories based on botanical plant/substrate characteristics and spectral configuration schemes. Results showed that the data dependency of PCA makes it a poor stand-alone tool for selecting optimal wavelengths. PCA does not allow diagnostic comparison across sites, and wavelengths identified by PCA do not necessarily represent wavelengths that indicate biophysical attributes of interest. Further, narrow bands captured by hyperspectral sensors need to be substantially re-sampled and/or smoothed in order for PCA to identify useful information.
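One common way PCA is used for band selection, ranking bands by the magnitude of their loadings on the leading components, can be sketched as follows. The spectra are synthetic and the variance-weighted ranking heuristic is a generic assumption, not the Communication's exact procedure; the data dependency criticized above is visible here, since the ranking is determined entirely by the variance structure of the input spectra:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical reflectance spectra: n samples x b narrow bands, driven by
# a few latent spectral features plus small sensor noise.
n, b, k = 120, 60, 3
latent = rng.normal(size=(n, k))
mixing = rng.normal(size=(k, b))
R = latent @ mixing + 0.05 * rng.normal(size=(n, b))

# PCA via SVD of the mean-centered spectra.
Rc = R - R.mean(axis=0)
U, s, Vt = np.linalg.svd(Rc, full_matrices=False)

# Score each band by its absolute loadings on the first k components,
# weighted by the fraction of variance each component explains.
weights = (s[:k] ** 2) / np.sum(s ** 2)
band_score = np.abs(Vt[:k].T) @ weights
top_bands = np.argsort(band_score)[::-1][:10]  # ten highest-scoring bands
```

A different sample of spectra would produce different loadings and hence a different band ranking, which is exactly why the abstract argues against using PCA as a stand-alone selection tool.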
Li, Xuejian; Wang, Youqing
2016-12-01
Offline general-type models are widely used for patient monitoring in intensive care units (ICUs); they are developed from previously collected datasets consisting of thousands of patients. However, these models may fail to adapt to the changing states of ICU patients. Thus, to be more robust and effective, the monitoring models should be adaptable to individual patients. A novel combination of just-in-time learning (JITL) and principal component analysis (PCA), referred to as learning-type PCA (L-PCA), was proposed for adaptive online monitoring of patients in ICUs. JITL was used to gather the most relevant data samples for adaptive modeling of complex physiological processes. PCA was used to build an online individual-type model, calculate monitoring statistics, and then judge whether the patient's status is normal or not. The adaptability of L-PCA lies in the usage of individual data and the continuous updating of the training dataset. Twelve subjects were selected from PhysioBank's Multi-parameter Intelligent Monitoring for Intensive Care II (MIMIC II) database, and five vital signs of each subject were chosen. The proposed method was compared with traditional PCA and fast moving-window PCA (Fast MWPCA). The experimental results demonstrated that the fault detection rate increased by 20% and 47% compared with PCA and Fast MWPCA, respectively. L-PCA is introduced into ICU patient monitoring for the first time and achieves the best monitoring performance in terms of adaptability to changes in patient status and sensitivity in abnormality detection.
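The PCA monitoring core that L-PCA builds on, fitting a PCA model on normal data and flagging samples whose Hotelling's T2 statistic is large, can be sketched as follows. The vital-sign data are synthetic and the JITL neighbor-selection and model-updating steps are omitted:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training data: five vital signs under normal conditions.
n, p = 300, 5
X_train = rng.normal(size=(n, p))
mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
Z = (X_train - mu) / sd

# PCA model: keep enough components to explain ~90% of the variance.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
var = s ** 2 / (n - 1)
k = int(np.searchsorted(np.cumsum(var) / var.sum(), 0.90) + 1)
P = Vt[:k].T       # loading matrix, p x k
lam = var[:k]      # retained eigenvalues

def t2(x):
    """Hotelling's T^2 of one new sample under the PCA model."""
    t = ((x - mu) / sd) @ P          # scores of the new sample
    return float(np.sum(t ** 2 / lam))

normal_t2 = t2(rng.normal(size=p))         # sample from the normal regime
fault_t2 = t2(rng.normal(size=p) + 8.0)    # large shift: abnormal sample
```

A control limit (e.g., from an F-distribution quantile) would normally be placed on T2; samples exceeding it are flagged as abnormal, which is the detection step whose rate the abstract compares across methods.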
Energy Technology Data Exchange (ETDEWEB)
Pineda-Martinez, L.F.; Carbajal, N.; Medina-Roldan, E. [Instituto Potosino de Investigacion Cientifica y Tecnologica, A. C., San Luis Potosi (Mexico)]. E-mail: lpineda@ipicyt.edu.mx
2007-04-15
Applying principal component analysis (PCA), we determined climate zones in a topographic gradient in the central-northeastern part of Mexico. We employed nearly 30 years of monthly temperature and precipitation data at 173 meteorological stations. The climate classification was carried out applying the Koeppen system modified for the conditions of Mexico. PCA indicates a regionalization in agreement with topographic characteristics and vegetation. We describe the different bioclimatic zones, associated with typical vegetation, for each climate using geographical information systems (GIS).
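A minimal sketch of PCA-based regionalization of station climate data follows. The monthly normals are hypothetical, and a simple tercile split of the first component stands in for the actual Koeppen-based classification:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical climate normals: 173 stations x 24 variables
# (12 monthly mean temperatures + 12 monthly precipitation totals).
n_stations = 173
temp = 15 + 10 * rng.normal(size=(n_stations, 12))
prec = np.abs(60 + 40 * rng.normal(size=(n_stations, 12)))
X = np.hstack([temp, prec])

# Standardize (temperature and precipitation have different units),
# then project stations onto the leading principal components.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T  # station coordinates in PC1-PC2 space

# Crude regionalization: split PC1 into terciles as stand-in climate zones.
t1, t2 = np.quantile(scores[:, 0], [1 / 3, 2 / 3])
zone = np.digitize(scores[:, 0], [t1, t2])  # zone labels 0, 1, 2
```

With real data the PC scores would be mapped in a GIS and the groupings compared against topography and vegetation, as the abstract describes.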
Shiokawa, Yuka; Date, Yasuhiro; Kikuchi, Jun
2018-02-21
Computer-based technological innovation provides advancements in sophisticated and diverse analytical instruments, enabling massive amounts of data collection with relative ease. This is accompanied by a fast-growing demand for technological progress in data mining methods for the analysis of big data derived from chemical and biological systems. From this perspective, use of a general "linear" multivariate analysis alone limits interpretations due to "non-linear" variations in metabolic data from living organisms. Here we describe a kernel principal component analysis (KPCA)-incorporated analytical approach for extracting useful information from metabolic profiling data. To overcome the limitation of important variable (metabolite) determination, we incorporated a random forest conditional variable importance measure into our KPCA-based analytical approach to demonstrate the relative importance of metabolites. Using a market basket analysis, hippurate, the most important variable detected by the importance measure, was associated with high levels of some vitamins and minerals present in foods eaten the previous day, suggesting a relationship between increased hippurate and intake of a wide variety of vegetables and fruits. Therefore, the KPCA-incorporated analytical approach described herein enabled us to capture input-output responses, and should be useful not only for metabolic profiling but also for profiling in other areas of biological and environmental systems.
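A kernel PCA of the kind described can be sketched directly from the Gram matrix. The RBF kernel, the gamma value, and the two-ring toy data (a standard example of non-linear structure that linear PCA cannot separate along one component) are illustrative assumptions, not the study's metabolomics pipeline:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical non-linear data: two concentric rings (first 50 samples
# on the inner ring, last 50 on the outer ring), plus small noise.
n = 100
theta = rng.uniform(0, 2 * np.pi, n)
radius = np.repeat([1.0, 3.0], n // 2)
X = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])
X += 0.05 * rng.normal(size=X.shape)

# Kernel PCA with an RBF kernel: eigendecompose the centered Gram matrix.
gamma = 0.25
sq = np.sum(X ** 2, axis=1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                      # double-centering in feature space
eigval, eigvec = np.linalg.eigh(Kc)
order = np.argsort(eigval)[::-1]    # eigh returns ascending order

# First kernel principal component score for each sample.
kpc1 = eigvec[:, order[0]] * np.sqrt(np.maximum(eigval[order[0]], 0))
```

The non-linear (radial) structure that linear PCA would miss shows up along the first kernel component, which is the property the abstract relies on for metabolic data.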