WorldWideScience

Sample records for bivariate correlation analysis

  1. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez, E-mail: valter.costa@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]

    2011-07-01

    Monitoring process variables and diagnosing sensor faults in nuclear power plants and process industries is very important, because an early diagnosis allows the fault to be corrected, preventing production stoppages, improving operator safety and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, the set of input variables of an artificial neural network that monitors the greatest possible number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, the variables nuclear power, primary circuit flow rate, control/safety rod position and core differential pressure were selected for the network input set, because almost all of the monitored variables are related to these variables or can result from the interaction of two or more of them. Nuclear power is related to temperature increases and decreases as well as to the amount of radiation due to uranium fission; the rods control power and influence the amount of radiation and the temperature changes; the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 of them. (author)
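    The screening step described above (pick a few driver variables, then check which other monitored variables they can stand in for) can be sketched with plain Pearson correlations. The function name, the 0.7 cutoff and the simulated data below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def select_inputs_by_correlation(X, names, candidates, threshold=0.7):
    """Flag monitored variables that correlate strongly (|r| >= threshold)
    with at least one of the candidate input variables."""
    R = np.corrcoef(X, rowvar=False)                 # pairwise Pearson r
    idx = [names.index(c) for c in candidates]
    covered = []
    for j, name in enumerate(names):
        if name in candidates:
            continue
        if max(abs(R[j, i]) for i in idx) >= threshold:
            covered.append(name)
    return covered
```

    A variable flagged as "covered" is one the network's small input set can plausibly monitor; canonical correlation would extend the same idea from single pairs to linear combinations of the candidates.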

  2. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    International Nuclear Information System (INIS)

    Costa, Valter Magalhaes; Pereira, Iraci Martinez

    2011-01-01

    Monitoring process variables and diagnosing sensor faults in nuclear power plants and process industries is very important, because an early diagnosis allows the fault to be corrected, preventing production stoppages, improving operator safety and avoiding economic losses. The objective of this work is to build, using bivariate correlation and canonical correlation, the set of input variables of an artificial neural network that monitors the greatest possible number of variables. This methodology was applied to the IEA-R1 Research Reactor at IPEN. Initially, the variables nuclear power, primary circuit flow rate, control/safety rod position and core differential pressure were selected for the network input set, because almost all of the monitored variables are related to these variables or can result from the interaction of two or more of them. Nuclear power is related to temperature increases and decreases as well as to the amount of radiation due to uranium fission; the rods control power and influence the amount of radiation and the temperature changes; the primary circuit flow rate transports energy by removing heat from the core. An artificial neural network was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 of them. (author)

  3. An improved method for bivariate meta-analysis when within-study correlations are unknown.

    Science.gov (United States)

    Hong, Chuan; D Riley, Richard; Chen, Yong

    2018-03-01

    Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (ie, when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard variance estimator and the robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (eg, m≥50). When the

  4. Analysis of input variables of an artificial neural network using bivariate correlation and canonical correlation

    International Nuclear Information System (INIS)

    Costa, Valter Magalhaes

    2011-01-01

    was trained and the results were satisfactory: the IEA-R1 Data Acquisition System monitors 64 variables and, with a set of 9 input variables resulting from the correlation analysis, it was possible to monitor 51 variables using neural networks. (author)

  5. Covariate analysis of bivariate survival data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators, which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach, these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach, censored data points are also replaced by their conditional expected values, where the expected values are determined from a specified parametric distribution. The model estimation is based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey were analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. The covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models are compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  6. Bivariate Correlation Analysis of the Chemometric Profiles of Chinese Wild Salvia miltiorrhiza Based on UPLC-Qqq-MS and Antioxidant Activities

    Directory of Open Access Journals (Sweden)

    Xiaodan Zhang

    2018-02-01

    Full Text Available To better understand the mechanisms underlying the pharmacological actions of Salvia miltiorrhiza, the correlation between the chemical profiles and in vitro antioxidant activities of 50 batches of wild S. miltiorrhiza samples was analyzed. Our ultra-performance liquid chromatography–tandem mass spectrometry analysis detected twelve phenolic acids and five tanshinones and obtained various chemical profiles from different origins. In a principal component analysis (PCA) and cluster analysis, the tanshinones cryptotanshinone, tanshinone IIA and dihydrotanshinone I exhibited higher weights in PC1, whereas the phenolic acids danshensu, salvianolic acids A and B and lithospermic acid were highly loaded in PC2. All components could be optimized as markers of different locations and might be suitable for S. miltiorrhiza quality analyses. Additionally, the DPPH and ABTS assays used to comprehensively evaluate antioxidant activities indicated large variations, with mean DPPH and ABTS scavenging potencies of 32.24 and 23.39 μg/mL, respectively, among S. miltiorrhiza extract solutions. Notably, samples that exceeded the mean IC50 values had higher phenolic acid contents. A correlation analysis indicated a strong correlation between the antioxidant activities and phenolic acid contents. Caffeic acid, danshensu, rosmarinic acid, lithospermic acid and salvianolic acid B were major contributors to antioxidant activity. In conclusion, phenolic compounds were the predominant antioxidant components in the investigated plant species. These plants may be sources of potent natural antioxidants and beneficial chemopreventive agents.

  7. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)


    Copulas are applied to overcome the restriction of traditional bivariate frequency ... frequency analysis methods cannot describe the random variable properties that ... In order to overcome the limitation of multivariate distributions, a copula is a ..... The Mann-Kendall (M-K) test is a non-parametric statistical test which is used ...
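    A common first step in the kind of copula selection this record describes is a moment-style fit: invert the known relationship between Kendall's tau and the parameter of each candidate Archimedean family, then compare families with a formal criterion. The function below is a hypothetical sketch of that first step, not code from the paper:

```python
import numpy as np
from scipy.stats import kendalltau

def fit_archimedean_by_tau(u, v):
    """Moment-style estimates of Clayton and Gumbel copula parameters
    obtained by inverting their Kendall's-tau relationships."""
    tau, _ = kendalltau(u, v)
    theta_clayton = 2.0 * tau / (1.0 - tau)   # Clayton: tau = theta / (theta + 2)
    theta_gumbel = 1.0 / (1.0 - tau)          # Gumbel:  tau = 1 - 1 / theta
    return tau, theta_clayton, theta_gumbel
```

    Full selection would then evaluate each fitted copula's likelihood (or a tail-dependence diagnostic) on the pseudo-observations.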

  8. Two new bivariate zero-inflated generalized Poisson distributions with a flexible correlation structure

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2015-05-01

    Full Text Available To model correlated bivariate count data with extra zero observations, this paper proposes two new bivariate zero-inflated generalized Poisson (ZIGP) distributions by incorporating a multiplicative factor (or dependency parameter) λ, named Type I and Type II bivariate ZIGP distributions, respectively. The proposed distributions possess a flexible correlation structure and can be used to fit either positively or negatively correlated and either over- or under-dispersed count data, in contrast to existing models that can only fit positively correlated count data with over-dispersion. The two marginal distributions of the Type I bivariate ZIGP share a common zero-inflation parameter, while the two marginal distributions of the Type II bivariate ZIGP have their own zero-inflation parameters, resulting in a much wider range of applications. The important distributional properties are explored and some useful statistical inference methods, including maximum likelihood estimation of parameters, standard error estimation, bootstrap confidence intervals and related hypothesis tests, are developed for the two distributions. A real data set is thoroughly analyzed using the proposed distributions and statistical methods. Several simulation studies are conducted to evaluate the performance of the proposed methods.
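    A quick way to see how a shared component plus a common zero-inflation state produces correlated, zero-heavy counts (the general structure behind a Type-I-style construction) is to simulate it. This sketch uses ordinary Poisson parts rather than the paper's generalized Poisson, so it illustrates the idea only:

```python
import numpy as np

def rbizip(n, lam1, lam2, lam12, phi, rng):
    """Simulate n pairs from a bivariate zero-inflated Poisson:
    with probability phi both counts are forced to zero; otherwise
    Y1 = X1 + X12 and Y2 = X2 + X12, where the shared Poisson(lam12)
    component X12 induces positive correlation."""
    zero = rng.random(n) < phi
    x1 = rng.poisson(lam1, n)
    x2 = rng.poisson(lam2, n)
    x12 = rng.poisson(lam12, n)
    y1 = np.where(zero, 0, x1 + x12)
    y2 = np.where(zero, 0, x2 + x12)
    return y1, y2
```

    Negative correlation, which the paper's λ parameter also allows, cannot be produced by this shared-component device; that is precisely what motivates the more flexible ZIGP construction.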

  9. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the
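    For intuition, the bivariate normal distribution function that such approximations target can be evaluated numerically with SciPy and checked against the closed form P(X ≤ 0, Y ≤ 0) = 1/4 + arcsin(ρ)/(2π) at the origin. A generic sketch, not the authors' approximation:

```python
import numpy as np
from scipy.stats import multivariate_normal

def bvn_cdf(h, k, rho):
    """Bivariate standard normal CDF P(X <= h, Y <= k), correlation rho."""
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([h, k])

# Sanity check at the origin: P(X <= 0, Y <= 0) = 1/4 + arcsin(rho) / (2*pi),
# which holds for any rho, including the near-singular rho close to 1
# that motivates the approximation in this record.
```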

  10. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecification. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. Copyright © 2015 John Wiley & Sons, Ltd.
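    A univariate building block of this marginal approach, fitting a beta-binomial to per-study event counts by maximum likelihood, can be sketched as follows; the full method couples two such margins through a composite likelihood, which is not reproduced here. The function name and starting values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom

def fit_betabinom(k, n):
    """Maximum likelihood fit of a beta-binomial to event counts k out of
    n trials per study, optimizing (a, b) on the log scale for positivity."""
    def nll(log_ab):
        a, b = np.exp(log_ab)
        return -betabinom.logpmf(k, n, a, b).sum()
    res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
    a, b = np.exp(res.x)
    return a, b
```

    Working directly on the probability scale, as here, is the feature the abstract highlights: no logit transform or link function is needed.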

  11. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    Science.gov (United States)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is the key component of many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that convenient bivariate distributions are often unable to model rainfall-runoff variables, because they either constrain the range of the dependence or fix the form of the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The derived distribution can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as follows: (i) univariate analysis of the marginal distributions, which includes two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results of the entropy-based joint distribution indicate: (1) the derived joint distribution successfully preserves the dependence between rainfall and runoff, and (2) the K-S goodness-of-fit tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities which further

  12. Genetics of Obesity Traits: A Bivariate Genome-Wide Association Analysis

    DEFF Research Database (Denmark)

    Wu, Yili; Duan, Haiping; Tian, Xiaocao

    2018-01-01

    Previous genome-wide association studies on anthropometric measurements have identified more than 100 related loci, but only a small portion of the heritability of obesity was explained. Here we present a bivariate twin study to look for the genetic variants associated with body mass index and waist-hip ratio, and to explore the obesity-related pathways in Northern Han Chinese. A Cholesky decomposition model for 242 monozygotic and 140 dizygotic twin pairs indicated a moderate genetic correlation (r = 0.53, 95% CI: 0.42–0.64) between body mass index and waist-hip ratio. Bivariate genome-wide association.......05. Expression quantitative trait loci analysis identified rs2242044 as a significant cis-eQTL in both the normal adipose-subcutaneous (P = 1.7 × 10−9) and adipose-visceral (P = 4.4 × 10−15) tissue. These findings may provide an important entry point to unravel genetic pleiotropy in obesity traits.

  13. Bivariate Drought Analysis Using Streamflow Reconstruction with Tree Ring Indices in the Sacramento Basin, California, USA

    Directory of Open Access Journals (Sweden)

    Jaewon Kwak

    2016-03-01

    Full Text Available Long-term streamflow data are vital for analysis of hydrological droughts. Using an artificial neural network (ANN model and nine tree-ring indices, this study reconstructed the annual streamflow of the Sacramento River for the period from 1560 to 1871. Using the reconstructed streamflow data, the copula method was used for bivariate drought analysis, deriving a hydrological drought return period plot for the Sacramento River basin. Results showed strong correlation among drought characteristics, and the drought with a 20-year return period (17.2 million acre-feet (MAF per year in the Sacramento River basin could be considered a critical level of drought for water shortages.

  14. On the construction of bivariate exponential distributions with an arbitrary correlation coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    In this paper we use a concept of multivariate phase-type distributions to define a class of bivariate exponential distributions. This class has the following three appealing properties. Firstly, we may construct a pair of exponentially distributed random variables with any feasible correlation coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables follows a phase-type distribution. The latter property is potentially important for the development of hypothesis testing in linear models. Thirdly, it is very easy to simulate...

  15. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD, named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable for application to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of an analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of the IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators in identifying differences in standing posture between groups.

  16. Bivariate extreme value with application to PM10 concentration analysis

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-05-01

    This study focuses on bivariate extremes of renormalized componentwise maxima with the generalized extreme value distribution as marginal function. The limiting joint distributions of several parametric models are presented. Maximum likelihood estimation is employed for parameter estimation and the best model is selected based on the Akaike Information Criterion. The weekly and monthly componentwise maxima series are extracted from the original observations of daily maximum PM10 data for two air quality monitoring stations located in Pasir Gudang and Johor Bahru. Ten years of data, from 2001 to 2010, are considered for both stations. The asymmetric negative logistic model is found to be the best-fitting bivariate extreme model for both the weekly and monthly componentwise maxima series. However, the dependence parameters show that the variables of the weekly maxima series are more dependent on each other than those of the monthly maxima.
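    The marginal step of such an analysis, fitting a GEV to a componentwise maxima series and reading off a return level, can be sketched with SciPy (note that SciPy's shape parameter c equals minus the usual GEV shape ξ). This is a generic illustration, not the study's code:

```python
import numpy as np
from scipy.stats import genextreme

def gev_return_level(maxima, T):
    """Fit a GEV to block maxima and return the level exceeded on average
    once every T blocks (the 1 - 1/T quantile of the fitted GEV)."""
    c, loc, scale = genextreme.fit(maxima)
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
```

    The bivariate model in the record then joins two such fitted margins through a parametric dependence function such as the asymmetric negative logistic.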

  17. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....

  18. Genetic correlations between body condition scores and fertility in dairy cattle using bivariate random regression models.

    Science.gov (United States)

    De Haas, Y; Janss, L L G; Kadarmideen, H N

    2007-10-01

    Genetic correlations between body condition score (BCS) and fertility traits in dairy cattle were estimated using bivariate random regression models. BCS was recorded by the Swiss Holstein Association on 22,075 lactating heifers (primiparous cows) from 856 sires. Fertility data during first lactation were extracted for 40,736 cows. The fertility traits were days to first service (DFS), days between first and last insemination (DFLI), calving interval (CI), number of services per conception (NSPC) and conception rate to first insemination (CRFI). A bivariate model was used to estimate genetic correlations between BCS, as a longitudinal trait modeled by random regression components, and daughter fertility at the sire level, as a single lactation measurement. Heritability of BCS was 0.17, and heritabilities for the fertility traits were low (0.01-0.08). Genetic correlations between BCS and fertility over the lactation varied from -0.45 to -0.14 for DFS; from -0.75 to 0.03 for DFLI; from -0.59 to -0.02 for CI; from -0.47 to 0.33 for NSPC; and from 0.08 to 0.82 for CRFI. These results show (genetic) interactions between fat reserves and reproduction along the lactation trajectory of modern dairy cows, which can be useful in genetic selection as well as in management. Maximum genetic gain in fertility from indirect selection on BCS should be based on measurements taken in mid lactation, when the genetic variance for BCS is largest and the genetic correlations between BCS and fertility are strongest.

  19. Preparation and bivariate analysis of suspensions of human chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    van den Engh, G.J.; Trask, B.J.; Gray, J.W.; Langlois, R.G.; Yu, L.C.

    1985-01-01

    Chromosomes were isolated from a variety of human cell types using a HEPES-buffered hypotonic solution (pH 8.0) containing KCl, MgSO4, dithioerythritol, and RNase. The chromosomes isolated by this procedure could be stained with a variety of fluorescent stains including propidium iodide, chromomycin A3, and Hoechst 33258. Addition of sodium citrate to the stained chromosomes was found to improve the total fluorescence resolution. High-quality bivariate Hoechst vs. chromomycin fluorescence distributions were obtained for chromosomes isolated from a human fibroblast cell strain, a human colon carcinoma cell line, and human peripheral blood lymphocyte cultures. Good flow karyotypes were also obtained from primary amniotic cell cultures. The Hoechst vs. chromomycin flow karyotypes of a given cell line, made at different times and at dye concentrations varying over fourfold ranges, show little variation in the relative peak positions of the chromosomes. The size of the DNA in chromosomes isolated using this procedure ranges from 20 to 50 kilobases. The described isolation procedure is simple, it yields high-quality flow karyotypes, and it can be used to prepare chromosomes from clinical samples. 22 references, 7 figures, 1 table.

  20. Bivariate Cointegration Analysis of Energy-Economy Interactions in Iran

    Directory of Open Access Journals (Sweden)

    Ismail Oladimeji Soile

    2015-12-01

    Full Text Available Fixing the prices of energy products below their opportunity cost for welfare and redistribution purposes is common among the governments of many oil-producing developing countries. This has often resulted in huge energy consumption in developing countries, and the question that emerges is whether this increased energy consumption results in higher economic activity. Available statistics show that Iran's economic growth shrank for the first time in two decades from 2011, amidst the introduction of pricing reforms in 2010 and 2014, suggesting a relationship between energy use and economic growth. Accordingly, the study examined the causality and the likelihood of a long-term relationship between energy and economic growth in Iran. Unlike previous studies, which have focused on the effects and effectiveness of the reform, this paper investigates the rationale for the reform. The study applied a bivariate cointegration time series econometric approach. The results reveal a one-way causality running from economic growth to energy with no feedback, with evidence of a long-run connection. The implication is that energy conservation policy is not inimical to economic growth. This evidence lends further support to the ongoing subsidy reforms in Iran as a measure to check excessive and inefficient use of energy.
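    A minimal, dependency-free version of the causality step in such a bivariate analysis is a one-lag Granger F test: compare the residual sum of squares of an autoregression of y with and without the lagged other series. This sketch is illustrative only and omits the cointegration machinery:

```python
import numpy as np

def granger_f(y, x):
    """One-lag Granger F statistic: does x[t-1] improve the prediction
    of y[t] beyond y[t-1]?  Large values suggest x 'Granger-causes' y."""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    n = len(yt)
    Zr = np.column_stack([np.ones(n), ylag])          # restricted: own lag only
    Zf = np.column_stack([np.ones(n), ylag, xlag])    # full: add the other lag
    rss_r = np.sum((yt - Zr @ np.linalg.lstsq(Zr, yt, rcond=None)[0]) ** 2)
    rss_f = np.sum((yt - Zf @ np.linalg.lstsq(Zf, yt, rcond=None)[0]) ** 2)
    return (rss_r - rss_f) / (rss_f / (n - 3))
```

    Running the test in both directions, as the study does for energy and GDP, is what distinguishes one-way causality (large F one way, small the other) from feedback.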

  1. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    Science.gov (United States)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at a regional and probabilistic evaluation of bivariate drought characteristics to assess both past and future drought duration and severity in Bangladesh. The procedure involves applying (1) the standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) a copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow the same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while those to exceed either value are 40% smaller, than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. This suggests that, compared to the bivariate drought frequency analysis, the standard univariate frequency analysis under- or overestimates the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed on the west side of the country. Future drought trends based on four climate models and two scenarios showed the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
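    The copula-based joint return periods contrasted with the univariate ones above have a simple closed form once the marginal non-exceedance probabilities and the copula are in hand: the "AND" return period divides the mean inter-arrival time by the probability that both duration and severity are exceeded. A sketch with a Gumbel copula (an assumed family for illustration; θ = 1 reduces to independence):

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, theta = 1 gives independence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_periods(u, v, theta, mu=1.0):
    """'AND'/'OR' return periods for events with marginal non-exceedance
    probabilities u, v and mean inter-arrival time mu (e.g. in years)."""
    C = gumbel_copula(u, v, theta)
    T_and = mu / (1.0 - u - v + C)   # both duration and severity exceeded
    T_or = mu / (1.0 - C)            # at least one exceeded
    return T_and, T_or
```

    With positive dependence (θ > 1) the "AND" period shortens relative to independence, which is the effect behind the 45%/40% differences reported in the abstract.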

  2. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    Science.gov (United States)

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donations" and "number of blood deferrals": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an extra zero frequency for both of the above-mentioned variables. In this study, in order to model the correlation and to explain the excessive zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors, in the presence and absence of covariates. Estimation of the model parameters, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, the bivariate Poisson model, and the bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.

  3. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study, bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, as well as their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, EN QRS, are used as variables, and a bivariable index combining them is defined. The proposed technique allows evaluation of the risk of ventricular tachycardia in post-myocardial-infarction patients. The results show that the bivariable index discriminates between the population of patients with ventricular tachycardia and the subjects of the control group. It was also found that the bivariable technique performs well as a diagnostic test. It is concluded that, as a diagnostic test, the bivariable technique is superior to the time-domain method and the time-frequency technique evaluated individually.

  4. A comparison of bivariate, multivariate random-effects, and Poisson correlated gamma-frailty models to meta-analyze individual patient data of ordinal scale diagnostic tests.

    Science.gov (United States)

    Simoneau, Gabrielle; Levis, Brooke; Cuijpers, Pim; Ioannidis, John P A; Patten, Scott B; Shrier, Ian; Bombardier, Charles H; de Lima Osório, Flavia; Fann, Jesse R; Gjerdingen, Dwenda; Lamers, Femke; Lotrakul, Manote; Löwe, Bernd; Shaaban, Juwita; Stafford, Lesley; van Weert, Henk C P M; Whooley, Mary A; Wittkampf, Karin A; Yeung, Albert S; Thombs, Brett D; Benedetti, Andrea

    2017-11-01

    Individual patient data (IPD) meta-analyses are increasingly common in the literature. In the context of estimating the diagnostic accuracy of ordinal or semi-continuous scale tests, sensitivity and specificity are often reported for a given threshold or a small set of thresholds, and a meta-analysis is conducted via a bivariate approach to account for their correlation. When IPD are available, sensitivity and specificity can be pooled for every possible threshold. Our objective was to compare the bivariate approach, which can be applied separately at every threshold, to two multivariate methods: the ordinal multivariate random-effects model and the Poisson correlated gamma-frailty model. Our comparison was empirical, using IPD from 13 studies that evaluated the diagnostic accuracy of the 9-item Patient Health Questionnaire depression screening tool, and included simulations. The empirical comparison showed that the implementation of the two multivariate methods is more laborious in terms of computational time and sensitivity to user-supplied values compared to the bivariate approach. Simulations showed that ignoring the within-study correlation of sensitivity and specificity across thresholds did not worsen inferences with the bivariate approach compared to the Poisson model. The ordinal approach was not suitable for simulations because the model was highly sensitive to user-supplied starting values. We tentatively recommend the bivariate approach rather than more complex multivariate methods for IPD diagnostic accuracy meta-analyses of ordinal scale tests, although the limited type of diagnostic data considered in the simulation study restricts the generalization of our findings. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    Science.gov (United States)

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and greater flexibility for the correlation structure of sensitivity and specificity. In a simulation study comparing three copula models and two implementations of the standard model, the Plackett and the Gauss copulas rarely perform worse, and frequently perform better, than the standard model. For illustration, we use an example from a meta-analysis judging the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer. Copyright © 2013 John Wiley & Sons, Ltd.
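
    The core idea, linking two beta-binomial margins through a copula, can be sketched with a Gaussian copula (one of the copula families the paper considers); all parameter values here are illustrative, not fitted values from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def sample_tp_tn(n_studies, rho, n_dis, n_nondis, a1, b1, a2, b2):
    # Gaussian copula: correlated normals -> uniforms -> beta-binomial margins
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n_studies)
    u = stats.norm.cdf(z)
    tp = stats.betabinom.ppf(u[:, 0], n_dis, a1, b1)     # true positives
    tn = stats.betabinom.ppf(u[:, 1], n_nondis, a2, b2)  # true negatives
    return tp, tn

# a negative copula correlation mimics the sensitivity/specificity trade-off
tp, tn = sample_tp_tn(5000, -0.4, 100, 100, 4.0, 2.0, 5.0, 2.0)
r = float(np.corrcoef(tp, tn)[0, 1])
print(round(r, 2))
```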

  6. Using bivariate signal analysis to characterize the epileptic focus: the benefit of surrogates.

    Science.gov (United States)

    Andrzejak, R G; Chicharro, D; Lehnertz, K; Mormann, F

    2011-04-01

    The disease epilepsy is related to hypersynchronous activity of networks of neurons. While acute epileptic seizures are the most extreme manifestation of this hypersynchronous activity, an elevated level of interdependence of neuronal dynamics is thought to persist also during the seizure-free interval. In multichannel recordings from brain areas involved in the epileptic process, this interdependence can be reflected in an increased linear cross correlation but also in signal properties of higher order. Bivariate time series analysis comprises a variety of approaches, each with different degrees of sensitivity and specificity for interdependencies reflected in lower- or higher-order properties of pairs of simultaneously recorded signals. Here we investigate which approach is best suited to detect putatively elevated interdependence levels in signals recorded from brain areas involved in the epileptic process. For this purpose, we use the linear cross correlation that is sensitive to lower-order signatures of interdependence, a nonlinear interdependence measure that integrates both lower- and higher-order properties, and a surrogate-corrected nonlinear interdependence measure that aims to specifically characterize higher-order properties. We analyze intracranial electroencephalographic recordings of the seizure-free interval from 29 patients with an epileptic focus located in the medial temporal lobe. Our results show that all three approaches detect higher levels of interdependence for signals recorded from the brain hemisphere containing the epileptic focus as compared to signals recorded from the opposite hemisphere. For the linear cross correlation, however, these differences are not significant. For the nonlinear interdependence measure, results are significant but only of moderate accuracy with regard to the discriminative power for the focal and nonfocal hemispheres. The highest significance and accuracy is obtained for the surrogate-corrected nonlinear
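
    The surrogate idea can be illustrated with the linear cross correlation: compare the maximum cross correlation of the original signal pair against versions in which the temporal alignment is destroyed. This toy sketch on synthetic signals uses random circular shifts, a simpler surrogate scheme than the one used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_cross_corr(x, y, max_lag=20):
    # maximum absolute cross correlation over non-negative lags
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return max(abs(np.dot(x[:n - l], y[l:]) / (n - l)) for l in range(max_lag + 1))

# two coupled noisy signals: y lags x by 5 samples
x = rng.standard_normal(2000)
y = np.roll(x, 5) + 0.8 * rng.standard_normal(2000)

c_orig = max_cross_corr(x, y)
# surrogates: large random circular shifts destroy the alignment between the
# two signals while preserving each signal's own structure
c_surr = [max_cross_corr(x, np.roll(y, int(s)))
          for s in rng.integers(100, 1900, size=19)]
significant = c_orig > max(c_surr)
print(round(c_orig, 2), significant)
```

    With 19 surrogates, exceeding all of them corresponds to a one-sided test at the 5% level.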

  7. Effect of catchment properties and flood generation regime on copula selection for bivariate flood frequency analysis

    Science.gov (United States)

    Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald

    2018-02-01

    Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.
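
    Fisher's exact test on a contingency table of flood regime versus selected copula family is straightforward to reproduce; the 2 × 2 counts below are invented for illustration and are not the study's data.

```python
from scipy.stats import fisher_exact

# hypothetical 2x2 contingency table:
# rows: flood regime (rainfall vs snowmelt catchments)
# columns: selected copula family (two-parameter BB1/BB7 vs one-parameter)
table = [[9, 3],    # rainfall-regime catchments
         [2, 13]]   # snowmelt-regime catchments

odds_ratio, p_value = fisher_exact(table)
print(p_value < 0.05)  # evidence of association between regime and copula family
```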

  8. A comparison between multivariate and bivariate analysis used in marketing research

    Directory of Open Access Journals (Sweden)

    Constantin, C.

    2012-01-01

    Full Text Available This paper presents an instrumental study conducted in order to compare the information given by two multivariate data analysis methods with the information given by the usual bivariate analysis. The outcomes of the research reveal that the multivariate methods sometimes use more information from a certain variable, but sometimes use only the part of the information considered most important for certain associations. For this reason, a researcher should use both categories of data analysis in order to obtain fully useful information.

  9. Regression analysis for bivariate gap time with missing first gap time data.

    Science.gov (United States)

    Huang, Chia-Hui; Chen, Yi-Hau

    2017-01-01

    We consider ordered bivariate gap times for which data on the first gap time are unobservable. This study is motivated by an HIV infection and AIDS study, where the initial HIV contracting time is unavailable, but the diagnosis times for HIV and AIDS are available. We are interested in studying the risk factors for the gap time between initial HIV contraction and HIV diagnosis, and for the gap time between HIV and AIDS diagnoses. In addition, the association between the two gap times is of interest. Accordingly, in the data analysis we are faced with a two-fold complexity: data on the first gap time are completely missing, and the second gap time is subject to induced informative censoring due to dependence between the two gap times. We propose a modeling framework for regression analysis of bivariate gap times under this data complexity. The estimating equations for the covariate effects on, as well as the association between, the two gap times are derived through maximum likelihood and suitable counting processes. Large-sample properties of the resulting estimators are developed by martingale theory. Simulations are performed to examine the performance of the proposed analysis procedure. An application to data from the HIV and AIDS study mentioned above is reported for illustration.

  10. On bivariate geometric distribution

    Directory of Open Access Journals (Sweden)

    K. Jayakumar

    2013-05-01

    Full Text Available Characterizations of the bivariate geometric distribution using univariate and bivariate geometric compounding are obtained. Autoregressive models with marginals following a bivariate geometric distribution are developed. Various bivariate geometric distributions analogous to important bivariate exponential distributions, such as Marshall-Olkin's, Downton's and Hawkes' bivariate exponentials, are presented.
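
    A Marshall-Olkin-style bivariate geometric analogue can be simulated by compounding independent and common per-trial shocks; this is a sketch of one such construction, not code from the paper, and the shock probabilities are illustrative.

```python
import random

random.seed(7)

def bivariate_geometric(p_a, p_b, p_ab):
    # per-trial shocks: A hits only X, B hits only Y, AB hits both;
    # X and Y record the first trials at which each component is hit,
    # and the common shock AB makes them dependent
    x = y = 0
    t = 0
    while x == 0 or y == 0:
        t += 1
        a = random.random() < p_a
        b = random.random() < p_b
        ab = random.random() < p_ab
        if x == 0 and (a or ab):
            x = t
        if y == 0 and (b or ab):
            y = t
    return x, y

pairs = [bivariate_geometric(0.10, 0.15, 0.05) for _ in range(20000)]
xs, ys = zip(*pairs)
n = len(pairs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((u - mx) * (v - my) for u, v in pairs) / n
sx = (sum((u - mx) ** 2 for u in xs) / n) ** 0.5
sy = (sum((v - my) ** 2 for v in ys) / n) ** 0.5
corr = cov / (sx * sy)
# marginally, X is geometric with success probability 1-(1-p_a)(1-p_ab) = 0.145
print(round(mx, 1), corr > 0)
```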

  11. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    KAUST Repository

    Philbin, R.

    2015-05-22

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matérn spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.
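
    The ingredients of a parsimonious bivariate Matérn model can be sketched directly: each field has its own smoothness, both share a spatial scale, and the cross-covariance uses the average smoothness scaled by a cross-correlation coefficient. The parameter values below are purely illustrative.

```python
import numpy as np
from scipy.special import gamma, kv

def matern(h, nu, a):
    # Matern correlation function with smoothness nu and inverse range a
    h = np.asarray(h, dtype=float)
    out = np.ones_like(h)
    nz = h > 0
    t = a * h[nz]
    out[nz] = (2.0 ** (1.0 - nu) / gamma(nu)) * (t ** nu) * kv(nu, t)
    return out

# parsimonious bivariate Matern: shared scale a, cross-correlation smoothness
# is the average of the two marginal smoothnesses, scaled by rho
nu_temp, nu_precip, a, rho = 1.5, 0.5, 1.0, 0.6
h = np.array([0.0, 0.5, 1.0, 2.0])
c_temp = matern(h, nu_temp, a)
c_precip = matern(h, nu_precip, a)  # nu = 0.5 reduces to exp(-a*h)
c_cross = rho * matern(h, (nu_temp + nu_precip) / 2.0, a)
print(np.round(c_temp, 3), np.round(c_precip, 3), np.round(c_cross, 3))
```

    The smoother field (larger ν) keeps higher correlation at short distances, which is the behaviour the abstract contrasts between temperature and precipitation fields.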

  13. Obtaining DDF Curves of Extreme Rainfall Data Using Bivariate Copula and Frequency Analysis

    DEFF Research Database (Denmark)

    Sadri, Sara; Madsen, Henrik; Mikkelsen, Peter Steen

    2009-01-01

    Bivariate copula and frequency analysis are used to associate rainfall depth with duration for a given return period, and the resulting curves are named DDF (depth-duration-frequency) curves. The copula approach does not assume the rainfall variables are independent or jointly normally distributed. Rainfall series are extracted in three ways: (1) by maximum mean intensity; (2) by depth and duration; and (3) ... situated near Copenhagen in Denmark. For rainfall extracted using method 2, the marginal distribution of depth was found to fit the Generalized Pareto distribution while duration was found to fit the Gamma distribution, using the method of L-moments. For rainfall extracted using method 3, the volume was fit with a generalized Pareto distribution and the duration was fit with a Pearson type III distribution. The Clayton copula was found to be appropriate for bivariate analysis of rainfall depth and duration for both methods 2 and 3. DDF curves derived using the Clayton copula for depth and duration ...

  14. The Role of Wealth and Health in Insurance Choice: Bivariate Probit Analysis in China

    Directory of Open Access Journals (Sweden)

    Yiding Yue

    2014-01-01

    Full Text Available This paper captures the correlation between the choices of health insurance and pension insurance using a bivariate probit model and then studies the effect of wealth and health on insurance choice. Our empirical evidence shows that people who participate in a health care program are more likely to participate in a pension plan at the same time, while wealth and health have different effects on the choices of the health care program and the pension program. Generally, the higher an individual's wealth level is, the more likely he is to participate in a health care program, but wealth has no effect on pension participation. Health status has opposite effects on the choices of health care programs and pension plans: the poorer an individual's health is, the more likely he is to participate in health care programs, while the better his health is, the more likely he is to participate in pension plans. When the investigation narrows down to commercial insurance, only health status has a significant effect on commercial health insurance. The commercial insurance choices and the insurance choices of the agricultural population are more complicated.
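
    A bivariate probit model is usually motivated through two correlated latent utilities. The sketch below simulates such data with illustrative coefficients whose signs merely mimic the qualitative findings reported above; it is not the authors' estimated model.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50000

wealth = rng.standard_normal(n)  # standardized wealth
health = rng.standard_normal(n)  # standardized health status

# correlated latent errors link the two insurance decisions
rho = 0.5
e1 = rng.standard_normal(n)
e2 = rho * e1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)

# latent utilities; coefficient signs mimic the abstract's findings:
# wealth and poor health raise health-care participation,
# while good health raises pension participation
buy_health_ins = (0.5 * wealth - 0.4 * health + e1 > 0).astype(int)
buy_pension = (0.4 * health + e2 > 0).astype(int)

r = float(np.corrcoef(buy_health_ins, buy_pension)[0, 1])
print(r > 0)  # the two binary participation decisions are positively correlated
```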

  15. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event-based bivariate frequency analysis approach to determine design rainfalls, in which the number, intensity and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. The design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from accounting for the occurrence probability of each rainstorm event and from taking a different angle on rainfall frequency analysis; the approach offers an alternative way of describing extreme rainfall properties and could potentially help improve the hydrologic design of stormwater management facilities in urban areas.

  16. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and current shortage of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using the observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
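
    A copula-based joint ("AND") return period can be computed from the marginal non-exceedance probabilities and the copula itself. The sketch below uses a Gumbel copula with an illustrative dependence parameter; only the event count (79 events over 1990-2008) is taken from the abstract.

```python
import math

def gumbel_copula(u, v, theta):
    # Gumbel (extreme-value) copula, theta >= 1
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_and_return_period(u, v, theta, mu):
    # "AND" return period: both wind speed AND duration exceed their quantiles
    p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_exceed_both

mu = 19 / 79   # mean interarrival time in years (79 events in 19 years)
theta = 2.0    # illustrative Gumbel dependence parameter
u = v = 0.95   # 95th-percentile wind speed and duration

t_joint = joint_and_return_period(u, v, theta, mu)
t_uni = mu / (1.0 - u)                     # univariate return period
print(round(t_uni, 1), round(t_joint, 1))  # the joint "AND" period is longer
```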

  17. GIS-based bivariate statistical techniques for groundwater potential analysis (an example of Iran)

    Science.gov (United States)

    Haghizadeh, Ali; Moghaddam, Davoud Davoudi; Pourghasemi, Hamid Reza

    2017-12-01

    Groundwater potential analysis provides a better comprehension of the hydrological settings of different regions. This study demonstrates the capability of two GIS-based, data-driven bivariate techniques, namely the statistical index (SI) and Dempster-Shafer theory (DST), to analyze groundwater potential in the Broujerd region of Iran. The research was done using 11 groundwater conditioning factors and 496 spring positions. Based on the groundwater potential maps (GPMs) of the SI and DST methods, 24.22% and 23.74% of the study area is covered by poor zones of groundwater potential, and 43.93% and 36.3% of the Broujerd region is covered by good and very good potential zones, respectively. The validation of the outcomes displayed that the areas under the curve (AUC) of the SI and DST techniques are 81.23% and 79.41%, respectively, which shows that the SI method has a slightly better performance than the DST technique. Therefore, the SI and DST methods are advantageous for analyzing groundwater capacity and for scrutinizing the complicated relation between groundwater occurrence and groundwater conditioning factors, and they permit investigation of both systemic and stochastic uncertainty. Finally, these techniques can be very beneficial for groundwater potential analysis and can be practical for water-resource management experts.
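
    The statistical index (SI) is typically computed as the log ratio of the spring density within a factor class to the overall spring density. The class counts and areas below are invented for illustration (only the total of 496 springs matches the study).

```python
import math

# hypothetical class breakdown of one conditioning factor (e.g. slope classes);
# "cells" stands for the class area in raster cells
classes = {
    "gentle": {"springs": 300, "cells": 40000},
    "moderate": {"springs": 150, "cells": 30000},
    "steep": {"springs": 46, "cells": 30000},
}
total_springs = sum(c["springs"] for c in classes.values())
total_cells = sum(c["cells"] for c in classes.values())
overall_density = total_springs / total_cells

# SI_i = ln( class spring density / overall spring density )
si = {name: math.log((c["springs"] / c["cells"]) / overall_density)
      for name, c in classes.items()}
for name, value in si.items():
    print(name, round(value, 2))  # positive SI marks classes favourable to springs
```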

  18. Which global stock indices trigger stronger contagion risk in the Vietnamese stock market? Evidence using a bivariate analysis

    OpenAIRE

    Wang Kuan-Min; Lai Hung-Cheng

    2013-01-01

    This paper extends recent investigations into risk contagion effects on stock markets to the Vietnamese stock market. Daily data spanning October 9, 2006 to May 3, 2012 are sourced to empirically validate the contagion effects between stock markets in Vietnam, and China, Japan, Singapore, and the US. To facilitate the validation of contagion effects with market-related coefficients, this paper constructs a bivariate EGARCH model of dynamic conditional correlation coefficients. Using the...

  19. A power study of bivariate LOD score analysis of a complex trait and fear/discomfort with strangers.

    Science.gov (United States)

    Ji, Fei; Lee, Dayoung; Mendell, Nancy Role

    2005-12-30

    Complex diseases are often reported along with disease-related traits (DRT). Sometimes investigators consider both disease and DRT phenotypes separately and sometimes they consider individuals as affected if they have either the disease or the DRT, or both. We propose instead to consider the joint distribution of the disease and the DRT and do a linkage analysis assuming a pleiotropic model. We evaluated our results through analysis of the simulated datasets provided by Genetic Analysis Workshop 14. We first conducted univariate linkage analysis of the simulated disease, Kofendrerd Personality Disorder and one of its simulated associated traits, phenotype b (fear/discomfort with strangers). Subsequently, we considered the bivariate phenotype, which combined the information on Kofendrerd Personality Disorder and fear/discomfort with strangers. We developed a program to perform bivariate linkage analysis using an extension to the Elston-Stewart peeling method of likelihood calculation. Using this program we considered the microsatellites within 30 cM of the gene pleiotropic for this simulated disease and DRT. Based on 100 simulations of 300 families we observed excellent power to detect linkage within 10 cM of the disease locus using the DRT and the bivariate trait.

  20. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To quantify the predictability of water availability more accurately, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a narrower interquartile range under the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing a joint distribution that takes into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.

  1. Ordinal Bivariate Inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    2016-01-01

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2 × 2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  2. Bivariate genome-wide association meta-analysis of pediatric musculoskeletal traits reveals pleiotropic effects at the SREBF1/TOM1L2 locus

    DEFF Research Database (Denmark)

    Medina-Gomez, Carolina; Kemp, John P; Dimou, Niki L

    2017-01-01

    Bone mineral density is known to be a heritable, polygenic trait, whereas genetic variants contributing to lean mass variation remain largely unknown. We estimated the shared SNP heritability and performed a bivariate GWAS meta-analysis of total-body lean mass (TB-LM) and total-body less head bone mineral density (TBLH-BMD)... as in human muscle tissue. This is the first bivariate GWAS meta-analysis to demonstrate genetic factors with pleiotropic effects on bone mineral density and lean mass. Bone mineral density and lean skeletal mass are heritable traits. Here, Medina-Gomez and colleagues perform bivariate GWAS analyses of total...

  3. Using bivariate latent basis growth curve analysis to better understand treatment outcome in youth with anorexia nervosa.

    Science.gov (United States)

    Byrne, Catherine E; Wonderlich, Joseph A; Curby, Timothy; Fischer, Sarah; Lock, James; Le Grange, Daniel

    2018-04-25

    This study explored the relation between eating-related obsessionality and weight restoration utilizing bivariate latent basis growth curve modelling. Eating-related obsessionality is a moderator of treatment outcome for adolescents with anorexia nervosa (AN). This study examined the degree to which the rate of change in eating-related obsessionality was associated with the rate of change in weight over time in family-based treatment (FBT) and individual therapy for AN. Data were drawn from a 2-site randomized controlled trial that compared FBT and adolescent focused therapy for AN. Bivariate latent basis growth curves were used to examine the relations between trajectories of body weight and symptoms associated with eating and weight obsessionality. In the FBT group, the slope of eating-related obsessionality scores and the slope of weight were significantly (negatively) correlated. This finding indicates that a decrease in overall eating-related obsessionality is significantly associated with an increase in weight for individuals who received FBT. However, there was no relation between change in obsessionality scores and change in weight in the adolescent focused therapy group. Results suggest that FBT has a specific impact on both weight gain and obsessive compulsive behaviour that is distinct from individual therapy. Copyright © 2018 John Wiley & Sons, Ltd and Eating Disorders Association.

  4. A bivariate contaminated binormal model for robust fitting of proper ROC curves to a pair of correlated, possibly degenerate, ROC datasets.

    Science.gov (United States)

    Zhai, Xuetong; Chakraborty, Dev P

    2017-06-01

    The objective was to design and implement a bivariate extension to the contaminated binormal model (CBM) to fit paired receiver operating characteristic (ROC) datasets, possibly degenerate ones, with proper ROC curves. Paired datasets yield two correlated ratings per case. Degenerate datasets have no interior operating points, and proper ROC curves do not inappropriately cross the chance diagonal. The existing method, developed more than three decades ago, utilizes a bivariate extension to the binormal model, implemented in the CORROC2 software, which yields improper ROC curves and cannot fit degenerate datasets. CBM can fit proper ROC curves to unpaired (i.e., yielding one rating per case) and degenerate datasets, and there is a clear scientific need to extend it to handle paired datasets. In CBM, nondiseased cases are modeled by a probability density function (pdf) consisting of a unit-variance peak centered at zero. Diseased cases are modeled with a mixture distribution whose pdf consists of two unit-variance peaks: one centered at positive μ with integrated probability α, the mixing-fraction parameter, corresponding to the fraction of diseased cases where the disease was visible to the radiologist, and one centered at zero, with integrated probability (1-α), corresponding to disease that was not visible. It is shown that: (a) for nondiseased cases the bivariate extension is a unit-variance bivariate normal distribution centered at (0,0) with a specified correlation ρ1; (b) for diseased cases the bivariate extension is a mixture distribution with four peaks, corresponding to disease not visible in either condition, disease visible in only one condition (contributing two peaks), and disease visible in both conditions. An expression for the likelihood function is derived. A maximum likelihood estimation (MLE) algorithm, CORCBM, was implemented in the R programming language that yields parameter estimates and the covariance matrix of the parameters, and other statistics
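
    The four-peak diseased-case mixture described above can be made concrete by simulation. The sketch below assumes, for simplicity, that visibility in the two conditions is independent with a common mixing fraction α, which may differ from the exact CORCBM parameterization; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

def corr_pair(rho, size):
    # unit-variance bivariate normal pair with correlation rho
    z1 = rng.standard_normal(size)
    z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(size)
    return z1, z2

n_cases, mu, alpha, rho1, rho2 = 5000, 2.5, 0.7, 0.3, 0.3

# nondiseased cases: bivariate normal centered at (0, 0), correlation rho1
x1_nd, x2_nd = corr_pair(rho1, n_cases)

# diseased cases: disease visible in each condition with probability alpha
# (assumed independent here), giving the four-peak mixture
vis1 = rng.random(n_cases) < alpha
vis2 = rng.random(n_cases) < alpha
z1, z2 = corr_pair(rho2, n_cases)
x1_d = z1 + mu * vis1
x2_d = z2 + mu * vis2

r_nd = float(np.corrcoef(x1_nd, x2_nd)[0, 1])
print(round(r_nd, 2), x1_d.mean() > x1_nd.mean())
```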

  5. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times, which are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework, depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.

  6. Bivariate analysis of basal serum anti-Mullerian hormone measurements and human blastocyst development after IVF

    LENUS (Irish Health Repository)

    Sills, E Scott

    2011-12-02

    Abstract Background To report on relationships among baseline serum anti-Müllerian hormone (AMH) measurements, blastocyst development and other selected embryology parameters observed in non-donor oocyte IVF cycles. Methods Pre-treatment AMH was measured in patients undergoing IVF (n = 79) and retrospectively correlated to in vitro embryo development noted during culture. Results Mean (+/- SD) age for patients in this study group was 36.3 ± 4.0 (range = 28-45) yrs, and mean (+/- SD) terminal serum estradiol during IVF was 5929 +/- 4056 pmol/l. A moderate positive correlation (0.49; 95% CI 0.31 to 0.65) was noted between basal serum AMH and number of MII oocytes retrieved. Similarly, a moderate positive correlation (0.44; 95% CI 0.24 to 0.61) was observed between serum AMH and number of early cleavage-stage embryos, suggesting a relationship between serum AMH and embryo development in IVF. Of note, baseline serum AMH levels were significantly different for patients who did and did not undergo blastocyst transfer (15.6 vs. 10.9 pmol/l; p = 0.029). Conclusions While serum AMH has found increasing application as a predictor of ovarian reserve prior to IVF, its roles in estimating in vitro embryo morphology and the potential to advance to the blastocyst stage have not been extensively investigated. These data suggest that baseline serum AMH determinations can help forecast blastocyst development during IVF. Serum AMH measured before treatment may assist patients, clinicians and embryologists as scheduling of embryo transfer is outlined. Additional studies are needed to confirm these correlations and to better define the role of baseline serum AMH level in the prediction of blastocyst formation.
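
Confidence intervals of the kind reported for these correlations can be approximated with the standard Fisher z-transformation. A quick sketch using the reported r = 0.49 and n = 79 (an illustration of the method, not the authors' computation):

```python
import math

def pearson_ci(r, n, z_crit=1.96):
    # Fisher z-transform confidence interval for a Pearson correlation
    z = math.atanh(r)                 # transform to approximately normal scale
    se = 1.0 / math.sqrt(n - 3)       # standard error on the z scale
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)   # back-transform to correlation scale

lo, hi = pearson_ci(0.49, 79)
print(round(lo, 2), round(hi, 2))  # roughly the reported 95% CI of 0.31 to 0.65
```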

  7. Which global stock indices trigger stronger contagion risk in the Vietnamese stock market? Evidence using a bivariate analysis

    Directory of Open Access Journals (Sweden)

    Wang Kuan-Min

    2013-01-01

    This paper extends recent investigations of risk contagion effects on stock markets to the Vietnamese stock market. Daily data spanning October 9, 2006 to May 3, 2012 are used to empirically validate the contagion effects between the stock market in Vietnam and those of China, Japan, Singapore, and the US. To facilitate the validation of contagion effects with market-related coefficients, this paper constructs a bivariate EGARCH model of dynamic conditional correlation coefficients. Using the correlation contagion test and Dungey et al.'s (2005) contagion test, we find contagion effects between the Vietnamese and four other stock markets, namely Japan, Singapore, China, and the US. Second, we show that the Japanese stock market causes stronger contagion risk in the Vietnamese stock market compared to the stock markets of China, Singapore, and the US. Finally, we show that the Chinese and US stock markets cause weaker contagion effects in the Vietnamese stock market because of stronger interdependence effects between the former two markets.

  8. Application of bivariate mapping for hydrological classification and analysis of temporal change and scale effects in Switzerland

    NARCIS (Netherlands)

    Speich, Matthias J.R.; Bernhard, Luzi; Teuling, Ryan; Zappa, Massimiliano

    2015-01-01

    Hydrological classification schemes are important tools for assessing the impacts of a changing climate on the hydrology of a region. In this paper, we present bivariate mapping as a simple means of classifying hydrological data for a quantitative and qualitative assessment of temporal change.

  9. Quasi-bivariate variational mode decomposition as a tool of scale analysis in wall-bounded turbulence

    Science.gov (United States)

    Wang, Wenkang; Pan, Chong; Wang, Jinjun

    2018-01-01

    The identification and separation of multi-scale coherent structures is a critical task for the study of scale interaction in wall-bounded turbulence. Here, we propose a quasi-bivariate variational mode decomposition (QB-VMD) method to extract structures with various scales from an instantaneous two-dimensional (2D) velocity field which has only one primary dimension. This method is developed from the one-dimensional VMD algorithm proposed by Dragomiretskiy and Zosso (IEEE Trans Signal Process 62:531-544, 2014) to cope with a quasi-2D scenario. It imposes a length-scale bandwidth constraint along the decomposed dimension, together with central-frequency re-balancing along the non-decomposed dimension. The feasibility of this method is tested on both a synthetic flow field and a turbulent boundary layer at moderate Reynolds number (Re_{τ} = 3458) measured by 2D particle image velocimetry (PIV). Some other popular scale separation tools, including pseudo-bi-dimensional empirical mode decomposition (PB-EMD), bi-dimensional EMD (B-EMD) and proper orthogonal decomposition (POD), are also tested for comparison. Among all these methods, QB-VMD shows advantages in both scale characterization and energy recovery. More importantly, the mode mixing problem, which degrades the performance of EMD-based methods, is avoided or minimized in QB-VMD. Finally, QB-VMD analysis of the wall-parallel plane in the log layer (at y/δ = 0.12) of the studied turbulent boundary layer shows the coexistence of large- or very large-scale motions (LSMs or VLSMs) and inner-scaled structures, which can be fully decomposed in both physical and spectral domains.

  10. Bivariate Kumaraswamy Models via Modified FGM Copulas: Properties and Applications

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2017-11-01

    A copula is a useful tool for constructing bivariate and/or multivariate distributions. In this article, we consider a new modified class of FGM (Farlie-Gumbel-Morgenstern) bivariate copulas for constructing several different bivariate Kumaraswamy-type copulas and discuss their structural properties, including dependence structures. It is established that construction of bivariate distributions by this method allows for greater flexibility in the values of Spearman's correlation coefficient ρ and Kendall's τ.
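
The classical FGM copula that this modified class builds on is C(u, v) = uv[1 + θ(1 − u)(1 − v)], for which Spearman's ρ equals θ/3, which is exactly the limited dependence range the modification addresses. A small Monte Carlo sketch (conditional-inversion sampler; θ and sample size are arbitrary) illustrates the relation:

```python
import math
import random

def fgm_sample(n, theta, seed=1):
    """Sample (U, V) from the FGM copula by inverting the conditional CDF
    C(v|u) = v + a*v*(1-v), a = theta*(1-2u), i.e. solving a*v^2-(1+a)*v+w=0."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, w = rng.random(), rng.random()
        a = theta * (1.0 - 2.0 * u)
        if abs(a) < 1e-12:
            v = w  # conditional CDF degenerates to the identity
        else:
            v = ((1 + a) - math.sqrt((1 + a) ** 2 - 4 * a * w)) / (2 * a)
        out.append((u, v))
    return out

def spearman(pairs):
    # Pearson correlation of the ranks (no ties for continuous data)
    n = len(pairs)
    def ranks(vals):
        order = sorted(range(n), key=lambda i: vals[i])
        r = [0] * n
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    ru = ranks([p[0] for p in pairs])
    rv = ranks([p[1] for p in pairs])
    mu = (n - 1) / 2.0
    num = sum((a - mu) * (b - mu) for a, b in zip(ru, rv))
    den = sum((a - mu) ** 2 for a in ru)
    return num / den

pairs = fgm_sample(50000, theta=1.0)
rho = spearman(pairs)
print(round(rho, 2))  # near the theoretical theta/3 ≈ 0.33
```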

  11. Cost-offsets of prescription drug expenditures: data analysis via a copula-based bivariate dynamic hurdle model.

    Science.gov (United States)

    Deb, Partha; Trivedi, Pravin K; Zimmer, David M

    2014-10-01

    In this paper, we estimate a copula-based bivariate dynamic hurdle model of prescription drug and nondrug expenditures to test the cost-offset hypothesis, which posits that increased expenditures on prescription drugs are offset by reductions in other nondrug expenditures. We apply the proposed methodology to data from the Medical Expenditure Panel Survey, which have the following features: (i) the observed bivariate outcomes are a mixture of zeros and continuously measured positives; (ii) both the zero and positive outcomes show state dependence and inter-temporal interdependence; and (iii) the zeros and the positives display contemporaneous association. The point mass at zero is accommodated using a hurdle or a two-part approach. The copula-based approach to generating joint distributions is appealing because the contemporaneous association involves asymmetric dependence. The paper studies samples categorized by four health conditions: arthritis, diabetes, heart disease, and mental illness. There is evidence of greater than dollar-for-dollar cost-offsets of expenditures on prescribed drugs for relatively low levels of spending on drugs and less than dollar-for-dollar cost-offsets at higher levels of drug expenditures.

  12. Ordinal bivariate inequality

    DEFF Research Database (Denmark)

    Sonne-Schmidt, Christoffer Scavenius; Tarp, Finn; Østerdal, Lars Peter Raahave

    This paper introduces a concept of inequality comparisons with ordinal bivariate categorical data. In our model, one population is more unequal than another when they have common arithmetic median outcomes and the first can be obtained from the second by correlation-increasing switches and/or median-preserving spreads. For the canonical 2x2 case (with two binary indicators), we derive a simple operational procedure for checking ordinal inequality relations in practice. As an illustration, we apply the model to childhood deprivation in Mozambique.

  13. Effectiveness of enforcement levels of speed limit and drink driving laws and associated factors – Exploratory empirical analysis using a bivariate ordered probit model

    Directory of Open Access Journals (Sweden)

    Behram Wali

    2017-06-01

    The contemporary traffic safety research comprises little information on quantifying the simultaneous association between drink driving and speeding among fatally injured drivers. The potential correlation between drivers' drink driving and speeding behavior poses a substantial methodological concern which needs investigation. This study therefore focused on investigating the simultaneous impact of socioeconomic factors, fatalities, vehicle ownership, health services and highway agency road safety policies on enforcement levels of speed limit and drink driving laws. The effectiveness of enforcement levels of speed limit and drink driving laws has been investigated through development of a bivariate ordered probit model using data extracted from WHO's global status report on road safety in 2013. The consistent and intuitive parameter estimates, along with a statistically significant correlation between response outcomes, support the choice of the bivariate ordered probit model. The results revealed that fatalities per thousand registered vehicles, hospital beds per hundred thousand population and road safety policies are associated with a likely medium or high effectiveness of enforcement levels of speed limit and drink driving laws, respectively. Also, the model encapsulates the effect of several other agency-related variables and socio-economic status on the response outcomes. Marginal effects are reported for analyzing the impact of such factors on intermediate categories of response outcomes. The results of this study are expected to provide necessary insights for enforcement programs. Also, marginal effects of explanatory variables may provide useful directions for formulating effective policy countermeasures for overcoming drivers' speeding and drink driving behavior.

  14. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    Science.gov (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return prices of gold, West Texas Intermediate and Brent crude oil, and foreign exchange rate data, over a period of 18 years. The cross correlation has been measured quantitatively from the Hurst scaling exponents and the singularity spectrum. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross correlation between gold and oil prices possesses uncorrelated behavior, while the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q < 0, and for one bivariate series the cross-correlation exponent is greater than the GHE for all q values.
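
The cross-correlation strength that MF-DCCA-type methods quantify can be illustrated at a single scale with the (monofractal) DCCA cross-correlation coefficient, the detrended covariance divided by the two detrended variances. A self-contained sketch on synthetic data (not the authors' code; the series construction is arbitrary):

```python
import random

def profile(x):
    # cumulative sum of the mean-removed series
    m = sum(x) / len(x)
    out, s = [], 0.0
    for v in x:
        s += v - m
        out.append(s)
    return out

def detrend_residuals(seg):
    # residuals after a least-squares linear fit within one box
    n = len(seg)
    tm = (n - 1) / 2.0
    ym = sum(seg) / n
    stt = sum((t - tm) ** 2 for t in range(n))
    b = sum((t - tm) * (y - ym) for t, y in enumerate(seg)) / stt
    a = ym - b * tm
    return [y - (a + b * t) for t, y in enumerate(seg)]

def dcca_coefficient(x, y, s):
    # rho_DCCA at scale s: detrended covariance over detrended variances
    X, Y = profile(x), profile(y)
    fxy = fxx = fyy = 0.0
    for start in range(0, len(X) - s + 1, s):
        rx = detrend_residuals(X[start:start + s])
        ry = detrend_residuals(Y[start:start + s])
        fxy += sum(a * b for a, b in zip(rx, ry))
        fxx += sum(a * a for a in rx)
        fyy += sum(b * b for b in ry)
    return fxy / (fxx * fyy) ** 0.5

rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(2000)]
y = [xi + 0.5 * rng.gauss(0, 1) for xi in x]   # strongly cross-correlated pair
print(round(dcca_coefficient(x, y, 20), 2))    # high positive value
```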

  15. Spectral analysis by correlation

    International Nuclear Information System (INIS)

    Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G.

    1969-01-01

    The spectral density of a signal, which represents its power distribution along the frequency axis, is a function of great importance, finding many uses in all fields concerned with signal processing (process identification, vibrational analysis, etc.). Amongst all the possible methods for calculating this function, the correlation method (correlation function calculation followed by Fourier transformation) is the most promising, mainly because of its simplicity and of the results it yields. The study carried out here leads to the construction of an apparatus which, coupled with a correlator, constitutes a set of equipment for spectral analysis in real time covering the frequency range 0 to 5 MHz.
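
The correlation method rests on the Wiener-Khinchin theorem: the power spectral density is the Fourier transform of the autocorrelation function. A tiny numeric sketch of the principle (toy sine signal and a direct DFT; purely illustrative, not the instrument described):

```python
import math

def circular_autocorr(x):
    # biased circular autocorrelation estimate
    n = len(x)
    return [sum(x[i] * x[(i + k) % n] for i in range(n)) / n for k in range(n)]

def dft_mag(r):
    # magnitude of the discrete Fourier transform (O(n^2), fine for a demo)
    n = len(r)
    mags = []
    for k in range(n):
        re = sum(r[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(r[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append((re * re + im * im) ** 0.5)
    return mags

n, f = 64, 8
signal = [math.sin(2 * math.pi * f * t / n) for t in range(n)]
psd = dft_mag(circular_autocorr(signal))       # spectrum via correlation
peak = max(range(1, n // 2), key=lambda k: psd[k])
print(peak)  # 8: the spectral peak sits at the sine frequency
```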

  16. Multifractal detrended cross-correlation analysis in the MENA area

    Science.gov (United States)

    El Alaoui, Marwane; Benbachir, Saâd

    2013-12-01

    In this paper, we investigated multifractal cross-correlations qualitatively and quantitatively using a cross-correlation test and the Multifractal detrended cross-correlation analysis method (MF-DCCA) for markets in the MENA area. We used cross-correlation coefficients to measure the level of this correlation. The analysis concerns four stock market indices of Morocco, Tunisia, Egypt and Jordan. The countries chosen are signatory of the Agadir agreement concerning the establishment of a free trade area comprising Arab Mediterranean countries. We computed the bivariate generalized Hurst exponent, Rényi exponent and spectrum of singularity for each pair of indices to measure quantitatively the cross-correlations. By analyzing the results, we found the existence of multifractal cross-correlations between all of these markets. We compared the spectrum width of these indices; we also found which pair of indices has a strong multifractal cross-correlation.

  17. Bivariate analysis of basal serum anti-Müllerian hormone measurements and human blastocyst development after IVF

    Directory of Open Access Journals (Sweden)

    Sills E Scott

    2011-12-01

    Abstract Background To report on relationships among baseline serum anti-Müllerian hormone (AMH) measurements, blastocyst development and other selected embryology parameters observed in non-donor oocyte IVF cycles. Methods Pre-treatment AMH was measured in patients undergoing IVF (n = 79) and retrospectively correlated to in vitro embryo development noted during culture. Results Mean (+/- SD) age for patients in this study group was 36.3 ± 4.0 (range = 28-45) yrs, and mean (+/- SD) terminal serum estradiol during IVF was 5929 +/- 4056 pmol/l. A moderate positive correlation (0.49; 95% CI 0.31 to 0.65) was noted between basal serum AMH and number of MII oocytes retrieved. Similarly, a moderate positive correlation (0.44; 95% CI 0.24 to 0.61) was observed between serum AMH and number of early cleavage-stage embryos, suggesting a relationship between serum AMH and embryo development in IVF. Of note, baseline serum AMH levels were significantly different for patients who did and did not undergo blastocyst transfer (15.6 vs. 10.9 pmol/l; p = 0.029). Conclusions While serum AMH has found increasing application as a predictor of ovarian reserve prior to IVF, its roles in estimating in vitro embryo morphology and the potential to advance to the blastocyst stage have not been extensively investigated. These data suggest that baseline serum AMH determinations can help forecast blastocyst development during IVF. Serum AMH measured before treatment may assist patients, clinicians and embryologists as scheduling of embryo transfer is outlined. Additional studies are needed to confirm these correlations and to better define the role of baseline serum AMH level in the prediction of blastocyst formation.

  18. Investigating the relationship between costs and outcomes for English mental health providers: a bi-variate multi-level regression analysis.

    Science.gov (United States)

    Moran, Valerie; Jacobs, Rowena

    2018-06-01

    Provider payment systems for mental health care that incentivize cost control and quality improvement have been a policy focus in a number of countries. In England, a new prospective provider payment system is being introduced to mental health that should encourage providers to control costs and improve outcomes. The aim of this research is to investigate the relationship between costs and outcomes to ascertain whether there is a trade-off between controlling costs and improving outcomes. The main data source is the Mental Health Minimum Data Set (MHMDS) for the years 2011/12 and 2012/13. Costs are calculated using NHS reference cost data while outcomes are measured using the Health of the Nation Outcome Scales (HoNOS). We estimate a bivariate multi-level model with costs and outcomes simultaneously. We calculate the correlation and plot the pairwise relationship between residual costs and outcomes at the provider level. After controlling for a range of demographic, need, social, and treatment variables, residual variation in costs and outcomes remains at the provider level. The correlation between residual costs and outcomes is negative, but very small, suggesting that cost-containment efforts by providers should not undermine outcome-improving efforts under the new payment system.

  19. Bivariate Left-Censored Bayesian Model for Predicting Exposure: Preliminary Analysis of Worker Exposure during the Deepwater Horizon Oil Spill.

    Science.gov (United States)

    Groth, Caroline; Banerjee, Sudipto; Ramachandran, Gurumurthy; Stenzel, Mark R; Sandler, Dale P; Blair, Aaron; Engel, Lawrence S; Kwok, Richard K; Stewart, Patricia A

    2017-01-01

    In April 2010, the Deepwater Horizon oil rig caught fire and exploded, releasing almost 5 million barrels of oil into the Gulf of Mexico over the ensuing 3 months. Thousands of oil spill workers participated in the spill response and clean-up efforts. The GuLF STUDY being conducted by the National Institute of Environmental Health Sciences is an epidemiological study to investigate potential adverse health effects among these oil spill clean-up workers. Many volatile chemicals were released from the oil into the air, including total hydrocarbons (THC), which is a composite of the volatile components of oil including benzene, toluene, ethylbenzene, xylene, and hexane (BTEXH). Our goal is to estimate exposure levels to these toxic chemicals for groups of oil spill workers in the study (hereafter called exposure groups, EGs) with likely comparable exposure distributions. A large number of air measurements were collected, but many EGs are characterized by datasets with a large percentage of censored measurements (below the analytic methods' limits of detection) and/or a limited number of measurements. We use THC for which there was less censoring to develop predictive linear models for specific BTEXH air exposures with higher degrees of censoring. We present a novel Bayesian hierarchical linear model that allows us to predict, for different EGs simultaneously, exposure levels of a second chemical while accounting for censoring in both THC and the chemical of interest. We illustrate the methodology by estimating exposure levels for several EGs on the Development Driller III, a rig vessel charged with drilling one of the relief wells. The model provided credible estimates in this example for geometric means, arithmetic means, variances, correlations, and regression coefficients for each group. This approach should be considered when estimating exposures in situations when multiple chemicals are correlated and have varying degrees of censoring.

  20. Comparison between two bivariate Poisson distributions through the ...

    African Journals Online (AJOL)

    These two models are specified by their probability mass functions. ... To remedy this problem, Berkhout and Plug proposed a bivariate Poisson distribution that allows the correlation to be negative, zero, or positive.

  1. Diagnostic value of sTREM-1 in bronchoalveolar lavage fluid in ICU patients with bacterial lung infections: a bivariate meta-analysis.

    Science.gov (United States)

    Shi, Jia-Xin; Li, Jia-Shu; Hu, Rong; Li, Chun-Hua; Wen, Yan; Zheng, Hong; Zhang, Feng; Li, Qin

    2013-01-01

    The serum soluble triggering receptor expressed on myeloid cells-1 (sTREM-1) is a useful biomarker in differentiating bacterial infections from others. However, the diagnostic value of sTREM-1 in bronchoalveolar lavage fluid (BALF) in lung infections has not been well established. We performed a meta-analysis to assess the accuracy of sTREM-1 in BALF for diagnosis of bacterial lung infections in intensive care unit (ICU) patients. We searched PUBMED, EMBASE and Web of Knowledge (from January 1966 to October 2012) databases for relevant studies that reported diagnostic accuracy data of BALF sTREM-1 in the diagnosis of bacterial lung infections in ICU patients. Pooled sensitivity, specificity, and positive and negative likelihood ratios were calculated by a bivariate regression analysis. Measures of accuracy and Q point value (Q*) were calculated using summary receiver operating characteristic (SROC) curve. The potential between-studies heterogeneity was explored by subgroup analysis. Nine studies were included in the present meta-analysis. Overall, the prevalence was 50.6%; the sensitivity was 0.87 (95% confidence interval (CI), 0.72-0.95); the specificity was 0.79 (95% CI, 0.56-0.92); the positive likelihood ratio (PLR) was 4.18 (95% CI, 1.78-9.86); the negative likelihood ratio (NLR) was 0.16 (95% CI, 0.07-0.36), and the diagnostic odds ratio (DOR) was 25.60 (95% CI, 7.28-89.93). The area under the SROC curve was 0.91 (95% CI, 0.88-0.93), with a Q* of 0.83. Subgroup analysis showed that the assay method and cutoff value influenced the diagnostic accuracy of sTREM-1. BALF sTREM-1 is a useful biomarker of bacterial lung infections in ICU patients. Further studies are needed to confirm the optimized cutoff value.
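
All of the pooled quantities reported above derive from the basic 2 × 2 diagnostic table. A minimal sketch with hypothetical counts (chosen only to mirror the pooled sensitivity and specificity; these are not the study data):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    sens = tp / (tp + fn)        # sensitivity
    spec = tn / (tn + fp)        # specificity
    plr = sens / (1 - spec)      # positive likelihood ratio
    nlr = (1 - sens) / spec      # negative likelihood ratio
    dor = plr / nlr              # diagnostic odds ratio = (tp*tn)/(fp*fn)
    return sens, spec, plr, nlr, dor

# hypothetical counts giving sensitivity 0.87 and specificity 0.79
sens, spec, plr, nlr, dor = diagnostic_accuracy(tp=87, fp=21, fn=13, tn=79)
print(round(sens, 2), round(spec, 2), round(plr, 1), round(nlr, 2), round(dor, 1))
```

Note that the bivariate meta-analysis pools sensitivity and specificity jointly across studies, accounting for their correlation; the sketch above only shows how the summary measures relate within a single 2 × 2 table.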

  2. Regional Analysis of Precipitation by Means of Bivariate Distribution Adjusted by Maximum Entropy; Analisis regional de precipitacion con base en una distribucion bivariada ajustada por maxima entropia

    Energy Technology Data Exchange (ETDEWEB)

    Escalante Sandoval, Carlos A.; Dominguez Esquivel, Jose Y. [Universidad Nacional Autonoma de Mexico (Mexico)

    2001-09-01

    The principle of maximum entropy (POME) is used to derive an alternative method of parameter estimation for the bivariate Gumbel distribution. A simple algorithm for this parameter estimation technique is presented. The method is applied to analyze the precipitation in a region of Mexico. Design events are compared with those obtained by the maximum likelihood procedure. According to the results, the proposed technique is a suitable option to be considered when performing frequency analysis of precipitation with small samples.
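
POME estimation itself is beyond a short sketch, but the marginal Gumbel fitting step can be illustrated with the simpler method-of-moments estimators (scale = s·√6/π, location = mean − γ·scale, with γ Euler's constant). Synthetic data and arbitrary parameters, not the authors' procedure:

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def gumbel_fit_moments(data):
    # method-of-moments estimators for the Gumbel (EV1) distribution
    n = len(data)
    mean = sum(data) / n
    var = sum((x - mean) ** 2 for x in data) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi      # scale estimate
    mu = mean - EULER_GAMMA * beta           # location estimate
    return mu, beta

# simulate Gumbel(mu=10, beta=2) via inverse-CDF sampling
rng = random.Random(42)
sample = [10.0 - 2.0 * math.log(-math.log(rng.random())) for _ in range(20000)]
mu_hat, beta_hat = gumbel_fit_moments(sample)
print(round(mu_hat, 2), round(beta_hat, 2))  # near the true values 10 and 2
```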

  3. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis.

    Science.gov (United States)

    Shields, Katherine F; Bain, Robert E S; Cronk, Ryan; Wright, Jim A; Bartram, Jamie

    2015-12-01

    Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source. Water from piped supplies had significantly lower odds of contamination than non-piped water both at the source (OR = 0.2; 95% CI: 0.1, 0.5) and in HSW (OR = 0.3; 95% CI: 0.2, 0.8), potentially due to residual chlorine. Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water.

  4. A comparison of bivariate and univariate QTL mapping in livestock populations

    Directory of Open Access Journals (Sweden)

    Sorensen Daniel

    2003-11-01

    Abstract This study presents a multivariate, variance component-based QTL mapping model implemented via restricted maximum likelihood (REML). The method was applied to investigate bivariate and univariate QTL mapping analyses, using simulated data. Specifically, we report results on the statistical power to detect a QTL and on the precision of parameter estimates using univariate and bivariate approaches. The model and methodology were also applied to study the effectiveness of partitioning the overall genetic correlation between two traits into a component due to many genes of small effect, and one due to the QTL. It is shown that when the QTL has a pleiotropic effect on two traits, a bivariate analysis leads to a higher statistical power of detecting the QTL and to a more precise estimate of the QTL's map position, in particular when the QTL has a small effect on the trait. The increase in power is most marked in cases where the contributions of the QTL and of the polygenic components to the genetic correlation have opposite signs. The bivariate REML analysis can successfully partition the two components contributing to the genetic correlation between traits.

  5. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  6. Bivariate value-at-risk

    Directory of Open Access Journals (Sweden)

    Giuseppe Arbia

    2007-10-01

    In this paper we extend the concept of value-at-risk (VaR) to bivariate return distributions in order to obtain measures of the market risk of an asset that take into account additional features linked to downside risk exposure. We first present a general definition of risk as the probability of an adverse event over a random distribution, and we then introduce a measure of market risk (b-VaR) that admits the traditional β of an asset in portfolio management as a special case when asset returns are normally distributed. Empirical evidence is provided using Italian stock market data.
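
The idea of computing VaR from a bivariate return distribution can be illustrated with a Monte Carlo sketch on correlated normal returns (a generic stand-in, not the paper's b-VaR construction; all parameter values are arbitrary):

```python
import math
import random

def bivariate_normal_sample(n, rho, seed=7):
    # correlated standard normals via a 2x2 Cholesky factor
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        out.append((z1, rho * z1 + math.sqrt(1 - rho * rho) * z2))
    return out

def empirical_var(returns, level=0.05):
    # VaR at the given level: negated empirical quantile of the return distribution
    s = sorted(returns)
    return -s[int(level * len(s))]

pairs = bivariate_normal_sample(100000, rho=0.6)
portfolio = [0.5 * x + 0.5 * y for x, y in pairs]   # equally weighted two-asset portfolio
var95 = empirical_var(portfolio)
print(round(var95, 2))  # close to the theoretical 1.645 * sqrt(0.8) ≈ 1.47
```

The portfolio variance here is 0.25 + 0.25 + 2(0.25)(0.6) = 0.8, so the 95% VaR of the joint position is below the sum of the two marginal VaRs, the usual diversification effect that a bivariate treatment makes explicit.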

  7. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    Science.gov (United States)

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of the bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but can also handle under- or over-dispersed data sets. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs.
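
The bivariate Poisson distribution underlying this model is commonly constructed by "trivariate reduction": X1 = Y1 + Y3 and X2 = Y2 + Y3 with independent Poisson components Y_k, so the shared rate λ3 is exactly the covariance between the two counts. A quick simulation sketch (arbitrary rates; not the authors' estimation code):

```python
import math
import random

def poisson(rng, lam):
    # Knuth's multiplicative Poisson sampler
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def bivariate_poisson_sample(n, lam1, lam2, lam3, seed=3):
    # trivariate reduction: the shared Y3 induces covariance lam3
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        y3 = poisson(rng, lam3)
        pairs.append((poisson(rng, lam1) + y3, poisson(rng, lam2) + y3))
    return pairs

pairs = bivariate_poisson_sample(50000, lam1=1.0, lam2=2.0, lam3=0.5)
n = len(pairs)
m1 = sum(a for a, _ in pairs) / n
m2 = sum(b for _, b in pairs) / n
cov = sum((a - m1) * (b - m2) for a, b in pairs) / n
print(round(m1, 2), round(m2, 2), round(cov, 2))  # means near 1.5 and 2.5, covariance near 0.5
```

The diagonal inflation in the paper additionally adds probability mass on the diagonal X1 = X2; the sketch shows only the base bivariate Poisson construction.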

  8. A New Measure Of Bivariate Asymmetry And Its Evaluation

    International Nuclear Information System (INIS)

    Ferreira, Flavio Henn; Kolev, Nikolai Valtchev

    2008-01-01

    In this paper we propose a new measure of bivariate asymmetry, based on conditional correlation coefficients. A decomposition of the Pearson correlation coefficient in terms of its conditional versions is studied and an example of application of the proposed measure is given.
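
Conditional correlation coefficients of the kind used here can be computed by restricting the sample on one margin. A toy sketch (ad hoc asymmetric bivariate data; this illustrates conditioning generally, not the authors' specific measure or decomposition):

```python
import random

def corr(xs, ys):
    # plain Pearson correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

rng = random.Random(5)
# Y = X^2 + noise: overall correlation is weak, but the conditional
# correlations on either side of X = 0 have opposite signs
data = [(x, x * x + 0.5 * rng.gauss(0, 1))
        for x in (rng.gauss(0, 1) for _ in range(20000))]
left = [(x, y) for x, y in data if x <= 0]
right = [(x, y) for x, y in data if x > 0]
c_left = corr([x for x, _ in left], [y for _, y in left])
c_right = corr([x for x, _ in right], [y for _, y in right])
print(round(c_left, 2), round(c_right, 2))  # strongly negative vs. strongly positive
```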

  9. Intermittency analysis of correlated data

    International Nuclear Information System (INIS)

    Wosiek, B.

    1992-01-01

    We describe the method of the analysis of the dependence of the factorial moments on the bin size in which the correlations between the moments computed for different bin sizes are taken into account. For large multiplicity nucleus-nucleus data inclusion of the correlations does not change the values of the slope parameter, but gives errors significantly reduced as compared to the case of fits with no correlations. (author)

  10. Evidence for bivariate linkage of obesity and HDL-C levels in the Framingham Heart Study.

    Science.gov (United States)

    Arya, Rector; Lehman, Donna; Hunt, Kelly J; Schneider, Jennifer; Almasy, Laura; Blangero, John; Stern, Michael P; Duggirala, Ravindranath

    2003-12-31

    Epidemiological studies have indicated that obesity and low high-density lipoprotein (HDL) levels are strong cardiovascular risk factors, and that these traits are inversely correlated. Despite the belief that these traits are correlated in part due to pleiotropy, knowledge on specific genes commonly affecting obesity and dyslipidemia is very limited. To address this issue, we first conducted univariate multipoint linkage analysis for body mass index (BMI) and HDL-C to identify loci influencing variation in these phenotypes using Framingham Heart Study data relating to 1702 subjects distributed across 330 pedigrees. Subsequently, we performed bivariate multipoint linkage analysis to detect common loci influencing covariation between these two traits. We scanned the genome and identified a major locus near marker D6S1009 influencing variation in BMI (LOD = 3.9) using the program SOLAR. We also identified a major locus for HDL-C near marker D2S1334 on chromosome 2 (LOD = 3.5) and another region near marker D6S1009 on chromosome 6 with suggestive evidence for linkage (LOD = 2.7). Since these two phenotypes were independently mapped to the same region on chromosome 6q, we applied the bivariate multipoint linkage approach in SOLAR. The bivariate linkage analysis of BMI and HDL-C implicated the genetic region near marker D6S1009 as harboring a major gene commonly influencing these phenotypes (bivariate LOD = 6.2; LODeq = 5.5), and this approach appears to improve the power to precisely map the correlated traits to a region. We found substantial evidence for a quantitative trait locus with pleiotropic effects, which appears to influence both the BMI and HDL-C phenotypes in the Framingham data.

  11. Accuracy of serum uric acid as a predictive test for maternal complications in pre-eclampsia: Bivariate meta-analysis and decision analysis

    NARCIS (Netherlands)

    Koopmans, Corine M.; van Pampus, Maria G.; Groen, Henk; Aarnoudse, Jan G.; van den Berg, Paul P.; Mol, Ben W. J.

    2009-01-01

    The aim of this study is to determine the accuracy and clinical value of serum uric acid in predicting maternal complications in women with pre-eclampsia. An existing meta-analysis on the subject was updated. The accuracy of serum uric acid for the prediction of maternal complications was assessed

  12. Accuracy of serum uric acid as a predictive test for maternal complications in pre-eclampsia : Bivariate meta-analysis and decision analysis

    NARCIS (Netherlands)

    Koopmans, C.M.; van Pampus, Maria; Groen, H.; Aarnoudse, J.G.; van den Berg, P.P.; Mol, B.W.J.

    The aim of this study is to determine the accuracy and clinical value of serum uric acid in predicting maternal complications in women with pre-eclampsia. An existing meta-analysis on the subject was updated. The accuracy of serum uric acid for the prediction of maternal complications was assessed

  13. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    Science.gov (United States)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
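    The core idea of partial cross-correlation, removing a common external force before correlating two signals, can be sketched without the detrending machinery, using ordinary regression residuals (a simplification of DPXA; the synthetic data and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Common external force z drives both x and y; their intrinsic parts are independent.
z = rng.standard_normal(n)
x = 0.8 * z + rng.standard_normal(n)
y = 0.8 * z + rng.standard_normal(n)

def partial_corr(a, b, c):
    """Correlation of a and b after regressing out the common driver c."""
    c1 = np.column_stack([np.ones_like(c), c])
    ra = a - c1 @ np.linalg.lstsq(c1, a, rcond=None)[0]
    rb = b - c1 @ np.linalg.lstsq(c1, b, rcond=None)[0]
    return np.corrcoef(ra, rb)[0, 1]

plain = np.corrcoef(x, y)[0, 1]   # inflated by the common force
partial = partial_corr(x, y, z)   # close to the intrinsic value (~0 here)
print(f"plain r = {plain:.2f}, partial r = {partial:.2f}")
```

    DPXA applies the same residual idea scale by scale to detrended profiles, which is what makes it robust to nonstationarity.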

  14. Spectrum-based estimators of the bivariate Hurst exponent

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2014-01-01

    Roč. 90, č. 6 (2014), art. 062802 ISSN 1539-3755 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: bivariate Hurst exponent * power-law cross-correlations * estimation Subject RIV: AH - Economics Impact factor: 2.288, year: 2014 http://library.utia.cas.cz/separaty/2014/E/kristoufek-0436818.pdf

  15. Generalized canonical correlation analysis with missing values

    NARCIS (Netherlands)

    M. van de Velden (Michel); Y. Takane

    2012-01-01

    textabstractGeneralized canonical correlation analysis is a versatile technique that allows the joint analysis of several sets of data matrices. The generalized canonical correlation analysis solution can be obtained through an eigenequation and distributional assumptions are not required. When

  16. Functional Multiple-Set Canonical Correlation Analysis

    Science.gov (United States)

    Hwang, Heungsun; Jung, Kwanghee; Takane, Yoshio; Woodward, Todd S.

    2012-01-01

    We propose functional multiple-set canonical correlation analysis for exploring associations among multiple sets of functions. The proposed method includes functional canonical correlation analysis as a special case when only two sets of functions are considered. As in classical multiple-set canonical correlation analysis, computationally, the…

  17. Bivariate analysis of the genetic variability among some accessions of African Yam Bean (Sphenostylis stenocarpa (Hochst ex A. Rich) Harms)

    Directory of Open Access Journals (Sweden)

    Solomon Tayo AKINYOSOYE

    2017-12-01

    Full Text Available Variability is an important factor to consider in crop improvement programmes. This study was conducted over two years to assess genetic variability and determine the relationship between seed yield, its components and tuber production characters among twelve accessions of African yam bean. Data collected were subjected to combined analysis of variance (ANOVA), principal component analysis (PCA), and hierarchical and K-means clustering analyses. Results revealed that the genotype by year (G × Y) interaction had significant effects on some of the variables measured (days to first flowering, days to 50 % flowering, number of pods per plant, pod length, seed yield and tuber yield per plant). The first five principal components (PCs) with eigenvalues greater than 1.0 accounted for about 66.70 % of the total variation, where PC1 and PC2 accounted for 39.48 % of the variation and were associated with seed and tuber yield variables. Three heterotic groups were clearly delineated among the genotypes, with accessions AY03 and AY10 identified for high seed yield and high tuber yield, respectively. Given the non-significant relationship between tuber and seed yield per plant, these accessions were recommended for further testing in various agro-ecologies to assess their suitability and adaptability and to exploit heterosis for further improvement.
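    The PCA step reported above, principal components ranked by eigenvalue with a cumulative share of total variation, can be sketched on synthetic stand-in data; the accession counts and trait structure below are hypothetical, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in: 12 "accessions" x 6 traits, with two latent
# factors (think seed-yield and tuber-yield axes) driving the variation.
latent = rng.standard_normal((12, 2))
loadings = rng.standard_normal((2, 6))
data = latent @ loadings + 0.3 * rng.standard_normal((12, 6))

# Standardize each trait, then PCA via SVD of the centred matrix.
zdata = (data - data.mean(axis=0)) / data.std(axis=0)
_, svals, _ = np.linalg.svd(zdata, full_matrices=False)
explained = svals**2 / np.sum(svals**2)

for i, frac in enumerate(explained, start=1):
    print(f"PC{i}: {100 * frac:.1f} % of total variation")
print(f"PC1+PC2 together: {100 * explained[:2].sum():.1f} %")
```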

  18. Chain Plot: A Tool for Exploiting Bivariate Temporal Structures

    OpenAIRE

    Taylor, CC; Zempeni, A

    2004-01-01

    In this paper we present a graphical tool useful for visualizing the cyclic behaviour of bivariate time series. We investigate its properties and link it to the asymmetry of the two variables concerned. We also suggest adding approximate confidence bounds to the points on the plot and investigate the effect of lagging on the chain plot. We conclude with some standard Fourier analysis, relating and comparing it to the chain plot.

  19. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  20. A Vehicle for Bivariate Data Analysis

    Science.gov (United States)

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  1. Bivariate copula in fitting rainfall data

    Science.gov (United States)

    Yee, Kong Ching; Suhaila, Jamaludin; Yusof, Fadhilah; Mean, Foo Hui

    2014-07-01

    The usage of copulas to determine the joint distribution between two variables is widespread in various areas. The joint distribution of rainfall characteristics obtained using a copula model is more ideal than standard bivariate modelling, as the copula is believed to overcome some of its limitations. Six copula models are applied to obtain the most suitable bivariate distribution between two rain gauge stations. The copula models are Ali-Mikhail-Haq (AMH), Clayton, Frank, Galambos, Gumbel-Hougaard (GH) and Plackett. The rainfall data used in the study are selected from rain gauge stations located in the southern part of Peninsular Malaysia, during the period from 1980 to 2011. The goodness-of-fit test in this study is based on the Akaike information criterion (AIC).
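    A common first step when fitting Archimedean copulas such as Clayton or Gumbel-Hougaard is inversion of Kendall's τ to a copula parameter. A minimal sketch on synthetic, rainfall-like data (the full AIC-based model comparison used in the study is omitted):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical rainfall-like pair with positive dependence; a bivariate
# normal is used here only to generate example data.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=2000)
x, y = np.exp(z[:, 0]), np.exp(z[:, 1])   # skewed, like rainfall amounts

tau, _ = stats.kendalltau(x, y)

# Method-of-moments (tau inversion) estimates of the copula parameters:
theta_gumbel = 1.0 / (1.0 - tau)          # Gumbel-Hougaard: tau = 1 - 1/theta
theta_clayton = 2.0 * tau / (1.0 - tau)   # Clayton: tau = theta / (theta + 2)

print(f"Kendall tau    = {tau:.3f}")
print(f"Gumbel theta   = {theta_gumbel:.3f}")
print(f"Clayton theta  = {theta_clayton:.3f}")
```

    Because Kendall's τ is rank-based, the monotone exp() transform does not change it, which is exactly why τ inversion is popular for skewed hydrological data.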

  2. Reliability for some bivariate beta distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models there has been a large amount of work on estimation of the reliability R = Pr(X < Y). Here we consider the case where (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate beta. The calculations involve the use of special functions.
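    When explicit expressions are not to hand, Pr(X < Y) under a dependent bivariate beta can be checked by Monte Carlo. A sketch using a shared-gamma construction, one simple way to build dependent beta marginals (not necessarily the paper's parameterization):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# A simple dependent bivariate beta: X = G1/(G1+G3), Y = G2/(G2+G3),
# with independent gammas G1, G2, G3; the shared component G3 induces
# dependence, and the marginals are Beta(a1, a3) and Beta(a2, a3).
a1, a2, a3 = 2.0, 3.0, 2.0
g1 = rng.gamma(a1, size=n)
g2 = rng.gamma(a2, size=n)
g3 = rng.gamma(a3, size=n)
x = g1 / (g1 + g3)
y = g2 / (g2 + g3)

reliability = np.mean(x < y)   # Monte Carlo estimate of R = Pr(X < Y)
print(f"R = Pr(X < Y) ~= {reliability:.3f}")
```

    In this construction X < Y reduces to G1 < G2, so R can be cross-checked exactly as the Beta(2, 3) CDF at 1/2, i.e. 11/16 = 0.6875.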

  3. Reliability for some bivariate gamma distributions

    Directory of Open Access Journals (Sweden)

    Nadarajah Saralees

    2005-01-01

    Full Text Available In the area of stress-strength models, there has been a large amount of work on estimation of the reliability R = Pr(X < Y). Here we consider the case where (X, Y) follows a bivariate distribution with dependence between X and Y. In particular, we derive explicit expressions for R when the joint distribution is bivariate gamma. The calculations involve the use of special functions.

  4. Bivariate functional data clustering: grouping streams based on a varying coefficient model of the stream water and air temperature relationship

    Science.gov (United States)

    H. Li; X. Deng; Andy Dolloff; E. P. Smith

    2015-01-01

    A novel clustering method for bivariate functional data is proposed to group streams based on their water–air temperature relationship. A distance measure is developed for bivariate curves by using a time-varying coefficient model and a weighting scheme. This distance is also adjusted by spatial correlation of streams via the variogram. Therefore, the proposed...

  5. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
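    The Gaussian-kernel estimator favoured above weights all observation pairs by how close their time difference is to the desired lag, so no interpolation is needed. A minimal sketch on synthetic, irregularly sampled series (bandwidth and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Two signals sharing a slow common component, observed at irregular times.
t_x = np.sort(rng.uniform(0, 100, 300))
t_y = np.sort(rng.uniform(0, 100, 300))
common = lambda t: np.sin(2 * np.pi * t / 25)
x = common(t_x) + 0.3 * rng.standard_normal(t_x.size)
y = common(t_y) + 0.3 * rng.standard_normal(t_y.size)

def kernel_ccf(tx, x, ty, y, lag, h):
    """Gaussian-kernel estimate of the cross-correlation at a given lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = np.subtract.outer(ty, tx) - lag      # all pairwise time differences
    w = np.exp(-dt**2 / (2 * h**2))           # weight pairs near the lag
    return np.sum(w * np.outer(y, x)) / np.sum(w)

r0 = kernel_ccf(t_x, x, t_y, y, lag=0.0, h=1.0)
print(f"kernel CCF at lag 0: {r0:.2f}")
```

    At half the common period (lag 12.5 here) the same estimator returns a strongly negative value, as expected for an anti-phase alignment.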

  6. Spectral density regression for bivariate extremes

    KAUST Repository

    Castro Camilo, Daniela

    2016-05-11

    We introduce a density regression model for the spectral density of a bivariate extreme value distribution, that allows us to assess how extremal dependence can change over a covariate. Inference is performed through a double kernel estimator, which can be seen as an extension of the Nadaraya–Watson estimator where the usual scalar responses are replaced by mean constrained densities on the unit interval. Numerical experiments with the methods illustrate their resilience in a variety of contexts of practical interest. An extreme temperature dataset is used to illustrate our methods. © 2016 Springer-Verlag Berlin Heidelberg

  7. Recurrent major depression and right hippocampal volume: A bivariate linkage and association study.

    Science.gov (United States)

    Mathias, Samuel R; Knowles, Emma E M; Kent, Jack W; McKay, D Reese; Curran, Joanne E; de Almeida, Marcio A A; Dyer, Thomas D; Göring, Harald H H; Olvera, Rene L; Duggirala, Ravi; Fox, Peter T; Almasy, Laura; Blangero, John; Glahn, David C

    2016-01-01

    Previous work has shown that the hippocampus is smaller in the brains of individuals suffering from major depressive disorder (MDD) than in those of healthy controls. Moreover, right hippocampal volume specifically has been found to predict the probability of subsequent depressive episodes. This study explored the utility of right hippocampal volume as an endophenotype of recurrent MDD (rMDD). We observed a significant genetic correlation between the two traits in a large sample of Mexican American individuals from extended pedigrees (ρg = -0.34, p = 0.013). A bivariate linkage scan revealed a significant pleiotropic quantitative trait locus on chromosome 18p11.31-32 (LOD = 3.61). Bivariate association analysis conducted under the linkage peak revealed a variant (rs574972) within an intron of the gene SMCHD1 meeting the corrected significance level (χ² = 19.0, p = 7.4 × 10⁻⁵). Univariate association analyses of each phenotype separately revealed that the same variant was significant for right hippocampal volume alone, and also revealed a suggestively significant variant (rs12455524) within the gene DLGAP1 for rMDD alone. The results implicate right-hemisphere hippocampal volume as a possible endophenotype of rMDD, and in so doing highlight a potential gene of interest for rMDD risk. © 2015 Wiley Periodicals, Inc.

  8. Personality Traits as Predictors of Shopping Motivations and Behaviors: A Canonical Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Ali Gohary

    2014-10-01

    Full Text Available This study examines the relationship between the Big Five personality traits and shopping motivation variables consisting of compulsive and impulsive buying and hedonic and utilitarian shopping values. Two hundred and forty-seven college students were recruited to participate in this research. Bivariate correlation demonstrated overlap between the personality traits; consequently, canonical correlation was performed to account for this. The results of multiple regression analysis suggested conscientiousness, neuroticism and openness as predictors of compulsive buying, impulsive buying and utilitarian shopping values. In addition, the results showed significant differences between males and females on conscientiousness, neuroticism, openness, compulsive buying and hedonic shopping value. Finally, using hierarchical regression analysis, we examined sex as a moderator between the Big Five personality traits and the shopping variables, but did not find sufficient evidence for moderation.
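    Classical canonical correlation analysis finds linear combinations of two variable sets with maximal correlation; one standard computation takes the singular values of the product of orthonormal bases (QR) of the two centred sets. A sketch on synthetic stand-in data, not the study's survey data:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500

# Hypothetical stand-in: 3 "personality" scores and 2 "shopping" scores
# sharing a single latent factor.
latent = rng.standard_normal(n)
X = np.column_stack([latent + rng.standard_normal(n) for _ in range(3)])
Y = np.column_stack([latent + rng.standard_normal(n) for _ in range(2)])

def canonical_correlations(X, Y):
    """Classical CCA: singular values of Qx' Qy for orthonormal bases Qx, Qy."""
    qx, _ = np.linalg.qr(X - X.mean(axis=0))
    qy, _ = np.linalg.qr(Y - Y.mean(axis=0))
    return np.linalg.svd(qx.T @ qy, compute_uv=False)

rho = canonical_correlations(X, Y)
print("canonical correlations:", np.round(rho, 2))
```

    The first canonical correlation picks up the shared latent factor; the second is near zero, reflecting the absence of a second common dimension.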

  9. Structural Analysis of Covariance and Correlation Matrices.

    Science.gov (United States)

    Joreskog, Karl G.

    1978-01-01

    A general approach to analysis of covariance structures is considered, in which the variances and covariances or correlations of the observed variables are directly expressed in terms of the parameters of interest. The statistical problems of identification, estimation and testing of such covariance or correlation structures are discussed.…

  10. Correlation analysis in chemistry: recent advances

    National Research Council Canada - National Science Library

    Shorter, John; Chapman, Norman Bellamy

    1978-01-01

    ..., and applications of LFER to polycyclic arenes, heterocyclic compounds, and olefinic systems. Of particular interest is the extensive critical compilation of substituent constants and the numerous applications of correlation analysis to spectroscopy...

  11. Bivariate Rayleigh Distribution and its Properties

    Directory of Open Access Journals (Sweden)

    Ahmad Saeed Akhter

    2007-01-01

    Full Text Available Rayleigh (1880) observed that sea waves follow no law because of the complexities of the sea, but it has been seen that the probability distributions of wave heights, wave lengths, and wave-induced pitch and heave motions of ships follow the Rayleigh distribution. At present, several different quantities are in use for describing the state of the sea; for example, the mean height of the waves, the root mean square height, the height of the “significant waves” (the mean height of the highest one-third of all the waves), the maximum height over a given interval of time, and so on. At present, the shipbuilding industry knows less than any other construction industry about the service conditions under which it must operate. Only small efforts have been made to establish the stresses and motions and to incorporate the results of such studies into design. This is due to the complexity of the problem, caused by the extensive variability of the sea and the corresponding response of ships. Although the problem appears complex, it is possible to predict service conditions for ships in an orderly and relatively simple manner. Rayleigh (1880) derived the distribution from the amplitude of sound resulting from many independent sources. This distribution is also connected with one or two dimensions and is sometimes referred to as the “random walk” frequency distribution. The Rayleigh distribution can be derived from the bivariate normal distribution when the variates are independent and random with equal variances. We construct a bivariate Rayleigh distribution with marginal Rayleigh distribution functions and discuss its fundamental properties.
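    The derivation mentioned in the abstract, the Rayleigh distribution as the amplitude of two independent, equal-variance normal components, is easy to verify numerically (the scale parameter and sample size are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 100_000
sigma = 2.0

# Amplitude of two independent, equal-variance normal components
# follows a Rayleigh distribution with scale sigma.
xn = rng.normal(0, sigma, n)
yn = rng.normal(0, sigma, n)
r = np.hypot(xn, yn)

# Compare with the known Rayleigh mean, sigma * sqrt(pi/2), and run a
# KS test against scipy's Rayleigh CDF.
print(f"sample mean : {r.mean():.3f}")
print(f"theory mean : {sigma * np.sqrt(np.pi / 2):.3f}")
ks = stats.kstest(r, stats.rayleigh(scale=sigma).cdf)
print(f"KS statistic: {ks.statistic:.4f}")
```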

  12. General correlation and partial correlation analysis in finding interactions: with Spearman rank correlation and proportion correlation as correlation measures

    OpenAIRE

    WenJun Zhang; Xin Li

    2015-01-01

    Between-taxon interactions can be detected by calculating the sampling data of taxon sample type. In the present study, Spearman rank correlation and proportion correlation are chosen as the general correlation measures, and their partial correlations are calculated and compared. The results show that for the Spearman rank correlation measure, in all predicted candidate direct interactions by partial correlation, about 16.77% (range 0-45.4%) of them are not successfully detected by Spearman rank correla...

  13. Dissecting the correlation structure of a bivariate phenotype ...

    Indian Academy of Sciences (India)

    Unknown

    We use Monte-Carlo simulations to evaluate the performance of the proposed test under different trait parameters and quantitative trait distributions. An application of the method is illustrated using data on two alcohol-related phenotypes from a project on the collaborative study on the genetics of alcoholism. [Ghosh S 2005 ...

  14. Generalized canonical correlation analysis with missing values

    NARCIS (Netherlands)

    M. van de Velden (Michel); Y. Takane

    2009-01-01

    textabstractTwo new methods for dealing with missing values in generalized canonical correlation analysis are introduced. The first approach, which does not require iterations, is a generalization of the Test Equating method available for principal component analysis. In the second approach,

  15. Detrended cross-correlation analysis of electroencephalogram

    International Nuclear Information System (INIS)

    Wang Jun; Zhao Da-Qing

    2012-01-01

    In this paper we use detrended cross-correlation analysis (DCCA) to study the electroencephalograms of healthy young subjects and healthy old subjects. It is found that the cross-correlation between different leads of a healthy young subject is larger than that of a healthy old subject. It is shown that the cross-correlation decreases with the aging process, and this phenomenon can help to diagnose whether a subject's brain function is healthy or not. (interdisciplinary physics and related areas of science and technology)

  16. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    Science.gov (United States)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for the purposes of water management and flood prevention. The IDF curves available in Malaysia are those obtained from the univariate analysis approach, which only considers the intensity of rainfall at fixed time intervals. As several rainfall variables are correlated with each other, such as intensity and duration, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by utilizing the relationship between storm intensities and durations using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are attained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's Information Criterion (AIC) for testing goodness-of-fit, both the Frank and Gaussian copulas are found to be suitable to represent the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves yielded by the typical IDF empirical formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.
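    Of the four copulas considered, the FGM copula has a particularly simple closed form, C(u, v) = uv(1 + θ(1 − u)(1 − v)), with Kendall's τ = 2θ/9, so it accommodates the negative intensity-duration correlation. A sketch of sampling it by conditional inversion and recovering τ (θ is an arbitrary illustrative value, not a fitted one):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n = 20_000
theta = -0.8   # FGM parameter; negative, as for storm intensity vs. duration

# Sample from the Farlie-Gumbel-Morgenstern copula by conditional inversion:
# C(u, v) = u v (1 + theta (1 - u)(1 - v)), valid for |theta| <= 1.
u = rng.uniform(size=n)
w = rng.uniform(size=n)
a = theta * (1.0 - 2.0 * u)
# The conditional CDF gives the quadratic a*v^2 - (1 + a)*v + w = 0 for
# v in (0, 1); as a -> 0 it reduces to v = w.
v = np.where(
    np.abs(a) < 1e-12,
    w,
    ((1.0 + a) - np.sqrt((1.0 + a) ** 2 - 4.0 * a * w)) / np.where(np.abs(a) < 1e-12, 1.0, 2.0 * a),
)

tau, _ = stats.kendalltau(u, v)
print(f"sample Kendall tau: {tau:.3f}")
print(f"theory 2*theta/9  : {2 * theta / 9:.3f}")
```

    The guarded denominator avoids a division warning in the branch that is discarded by np.where; the recovered τ matches the closed-form 2θ/9.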

  17. Multicollinearity in canonical correlation analysis in maize.

    Science.gov (United States)

    Alves, B M; Cargnelutti Filho, A; Burin, C

    2017-03-30

    The objective of this study was to evaluate the effects of multicollinearity under two methods of canonical correlation analysis (with and without elimination of variables) in maize (Zea mays L.) crop. Seventy-six maize genotypes were evaluated in three experiments, conducted in a randomized block design with three replications, during the 2009/2010 crop season. Eleven agronomic variables (number of days from sowing until female flowering, number of days from sowing until male flowering, plant height, ear insertion height, ear placement, number of plants, number of ears, ear index, ear weight, grain yield, and one thousand grain weight), 12 protein-nutritional variables (crude protein, lysine, methionine, cysteine, threonine, tryptophan, valine, isoleucine, leucine, phenylalanine, histidine, and arginine), and 6 energetic-nutritional variables (apparent metabolizable energy, apparent metabolizable energy corrected for nitrogen, ether extract, crude fiber, starch, and amylose) were measured. A phenotypic correlation matrix was first generated among the 29 variables for each of the experiments. A multicollinearity diagnosis was later performed within each group of variables using methodologies such as variance inflation factor and condition number. Canonical correlation analysis was then performed, with and without the elimination of variables, among groups of agronomic and protein-nutritional, and agronomic and energetic-nutritional variables. The canonical correlation analysis in the presence of multicollinearity (without elimination of variables) overestimates the variability of canonical coefficients. The elimination of variables is an efficient method to circumvent multicollinearity in canonical correlation analysis.
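    The two multicollinearity diagnostics used in the study, the variance inflation factor and the condition number, can both be read off the correlation matrix. A sketch on synthetic traits with a built-in near-linear dependency (the variables are hypothetical, not the maize data):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 200

# Three traits, with t3 nearly a linear combination of t1 and t2 -- the
# kind of near-dependency that inflates canonical coefficients.
t1 = rng.standard_normal(n)
t2 = rng.standard_normal(n)
t3 = 0.7 * t1 + 0.7 * t2 + 0.05 * rng.standard_normal(n)
X = np.column_stack([t1, t2, t3])

# Condition number = largest / smallest eigenvalue of the correlation
# matrix; VIF_j = 1 / (1 - R_j^2) = diagonal of its inverse.
R = np.corrcoef(X, rowvar=False)
eig = np.linalg.eigvalsh(R)              # ascending order
condition_number = eig[-1] / eig[0]
vif = np.diag(np.linalg.inv(R))

print(f"condition number: {condition_number:.1f}")
print("VIF:", np.round(vif, 1))
```

    A rule of thumb: VIF above 10 or a condition number above 100 signals multicollinearity severe enough to warrant eliminating variables, as the study does.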

  18. Bayesian Correlation Analysis for Sequence Count Data.

    Directory of Open Access Journals (Sweden)

    Daniel Sánchez-Taltavull

    Full Text Available Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities' measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low, especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities' signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset.

  19. An Affine Invariant Bivariate Version of the Sign Test.

    Science.gov (United States)

    1987-06-01

    Key words: affine invariance, bivariate quantile, bivariate symmetry model, generalized median, influence function, permutation test, normal efficiency... We calculate a bivariate version of the influence function, and the resulting form is bounded, as is the case for the univariate sign test, and shows the... terms of a bivariate analogue of Hampel's (1974) influence function. The latter, though usually defined as a von Mises derivative of certain...

  20. Graphology and personality: a correlational analysis

    OpenAIRE

    2008-01-01

    M.A. The title of this dissertation reads as follows: Graphology and Personality: A Correlational Analysis. The aim of this dissertation is to introduce a different projective technique (as yet not widely used) into the psychological arena of assessment. Graphology is a projective technique that allows the analyst to delve into the personality of the individual. Briefly, graphology can be defined as the assessment or analysis of a person's handwriting. When a child first attem...

  1. Detrended fluctuation analysis made flexible to detect range of cross-correlated fluctuations

    Science.gov (United States)

    Kwapień, Jarosław; Oświęcimka, Paweł; Drożdż, Stanisław

    2015-11-01

    The detrended cross-correlation coefficient ρDCCA has recently been proposed to quantify the strength of cross-correlations on different temporal scales in bivariate, nonstationary time series. It is based on the detrended cross-correlation and detrended fluctuation analyses (DCCA and DFA, respectively) and can be viewed as an analog of the Pearson coefficient in the case of the fluctuation analysis. The coefficient ρDCCA works well in many practical situations but by construction its applicability is limited to detection of whether two signals are generally cross-correlated, without the possibility to obtain information on the amplitude of fluctuations that are responsible for those cross-correlations. In order to introduce some related flexibility, here we propose an extension of ρDCCA that exploits the multifractal versions of DFA and DCCA: multifractal detrended fluctuation analysis and multifractal detrended cross-correlation analysis, respectively. The resulting new coefficient ρq not only is able to quantify the strength of correlations but also allows one to identify the range of detrended fluctuation amplitudes that are correlated in two signals under study. We show how the coefficient ρq works in practical situations by applying it to stochastic time series representing processes with long memory: autoregressive and multiplicative ones. Such processes are often used to model signals recorded from complex systems and complex physical phenomena like turbulence, so we are convinced that this new measure can successfully be applied in time-series analysis. In particular, we present an example of such application to highly complex empirical data from financial markets. The present formulation can straightforwardly be extended to multivariate data in terms of the q-dependent counterpart of the correlation matrices and then to the network representation.
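    A plain (non-multifractal) ρDCCA at a single scale s can be sketched in a few lines: integrate both series, detrend them linearly in boxes of length s, and take the ratio of the detrended covariance to the detrended fluctuations (box handling is simplified for illustration):

```python
import numpy as np

def rho_dcca(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA at scale s."""
    X = np.cumsum(x - np.mean(x))          # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    n_box = len(X) // s
    t = np.arange(s, dtype=float)
    f2x = f2y = f2xy = 0.0
    for i in range(n_box):
        xs = X[i * s:(i + 1) * s]
        ys = Y[i * s:(i + 1) * s]
        # Remove a linear trend from each box.
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += np.mean(rx * rx)
        f2y += np.mean(ry * ry)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)

rng = np.random.default_rng(10)
common = rng.standard_normal(4000)
x = common + 0.5 * rng.standard_normal(4000)
y = common + 0.5 * rng.standard_normal(4000)
print(f"rho_DCCA(s=32) = {rho_dcca(x, y, 32):.2f}")
```

    The q-dependent coefficient of the paper replaces the plain box averages with q-weighted ones, so that fluctuations of a chosen amplitude range dominate the estimate.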

  2. A bivariate model for analyzing recurrent multi-type automobile failures

    Science.gov (United States)

    Sunethra, A. A.; Sooriyarachchi, M. R.

    2017-09-01

    The failure mechanism in an automobile can be described as a system of multi-type recurrent failures, where failures can occur due to various failure modes and are repetitive, such that more than one failure can occur in each failure mode. In analysing such automobile failures, both the time and the type of failure serve as response variables. However, these two response variables are highly correlated with each other, since the timing of failures is associated with the mode of failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to fitting separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent, because failure instances of a single automobile are correlated with each other, while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models, for joining the responses and for representing the correlated data, respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions found suitable for the two responses of time to failure and type of failure were the Weibull distribution and the multinomial distribution, respectively. The proposed bivariate model was programmed in the SAS procedure Proc NLMIXED by user programming of appropriate likelihood functions.
The performance of the bivariate model was compared with separate univariate models fitted for the two responses, and it was identified that better performance is secured by the bivariate model.
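The shared-parameter construction described in this abstract can be illustrated with a small data-generating sketch. All parameter values, and the way the vehicle-level random effect loads onto the type logits, are hypothetical choices for illustration, not taken from the paper or its SAS code:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_vehicle(n_failures, sigma_b=0.5, shape=1.5, scale=100.0,
                     base_logits=(0.0, -0.3, -0.8), loadings=(1.0, 0.5, 0.0)):
    """Simulate recurrent failures for one automobile: a shared vehicle-level
    random effect b enters both the Weibull time-to-failure scale and the
    multinomial failure-type logits, inducing correlation between responses."""
    b = rng.normal(0.0, sigma_b)                       # shared random effect
    # Weibull failure times, scale inflated/deflated by the random effect
    times = scale * np.exp(b) * rng.weibull(shape, size=n_failures)
    # multinomial failure type, with the same b shifting the logits
    logits = np.asarray(base_logits) + b * np.asarray(loadings)
    probs = np.exp(logits) / np.exp(logits).sum()
    types = rng.choice(len(probs), size=n_failures, p=probs)
    return times, types

times, types = simulate_vehicle(5)   # up to five recurrences, as in the paper
```

Fitting would then maximize the joint likelihood of `times` and `types` over the random-effect distribution, which is what PROC NLMIXED does with a user-supplied likelihood.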

  3. A Bivariate return period for levee failure monitoring

    Science.gov (United States)

    Isola, M.; Caporali, E.

    2017-12-01

Levee breaches are strongly linked with the interaction processes among water, soil and structure, so many factors affect breach development. One of the main factors is the hydraulic load, characterized by intensity and duration, i.e. by the flood event hydrograph. Levee design is generally based on the magnitude of the hydraulic load, without considering fatigue failure due to load duration. Moreover, in many cases levee breaches are caused by floods of lower magnitude than the design flood. In order to improve flood risk management strategies, we built a procedure based on a multivariate statistical analysis of flood peak and volume, together with the analysis of past levee failure events. In particular, to define the probability of occurrence of the hydraulic load on a levee, a bivariate copula model is used to obtain the bivariate joint distribution of flood peak and volume. The flood peak expresses the load magnitude, while the volume expresses the stress over time. We consider the annual flood peak and the corresponding volume, given by the hydrograph area between the beginning and the end of the event. The beginning of the event is identified as an abrupt rise of the discharge by more than 20%; the end is identified as the point from which the receding limb is characterized by baseflow, using a nonlinear reservoir algorithm as the baseflow separation technique. On this basis, with the aim of defining warning thresholds, we consider past levee failure events and their bivariate return period (BTr), compared with the estimate from a traditional univariate model. The discharge data of 30 hydrometric stations on the Arno River in Tuscany, Italy, for the period 1995-2016 are analysed. A database of levee failure events, recording for each event the location as well as the failure mode, is also created. The events were registered in the period 2000-2014 by EEA
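The AND-type bivariate return period built from a copula of peak and volume can be sketched as follows. The Gumbel copula family, its parameter `theta`, and the mean interarrival time `mu` are illustrative assumptions; the abstract does not specify which copula the authors fitted:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def bivariate_return_period(u, v, theta, mu=1.0):
    """AND-type joint return period: mean interarrival time mu divided by
    P(peak > x, volume > y) = 1 - u - v + C(u, v), where u = F(x), v = G(y)."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_and

# peak and volume both at their 100-year univariate quantiles (u = v = 0.99)
btr = bivariate_return_period(0.99, 0.99, theta=2.0)
```

Because joint exceedance is rarer than either marginal exceedance, the AND return period is always at least as large as the larger univariate return period (here, larger than 100 years).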

  4. Unadjusted Bivariate Two-Group Comparisons: When Simpler is Better.

    Science.gov (United States)

    Vetter, Thomas R; Mascha, Edward J

    2018-01-01

Hypothesis testing involves posing both a null hypothesis and an alternative hypothesis. This basic statistical tutorial discusses the appropriate use, including their so-called assumptions, of the common unadjusted bivariate tests for hypothesis testing and thus comparing study sample data for a difference or association. The appropriate choice of a statistical test is predicated on the type of data being analyzed and compared. The unpaired or independent samples t test is used to test the null hypothesis that the 2 population means are equal, against the alternative hypothesis that the 2 population means are not equal. The unpaired t test is intended for comparing independent continuous (interval or ratio) data from 2 study groups. A common mistake is to apply several unpaired t tests when comparing data from 3 or more study groups. In this situation, an analysis of variance with post hoc (posttest) pairwise intergroup comparisons should instead be applied. Another common mistake is to apply a series of unpaired t tests when comparing sequentially collected data from 2 study groups. In this situation, a repeated-measures analysis of variance, with tests for group-by-time interaction, and post hoc comparisons, as appropriate, should instead be applied in analyzing data from sequential collection points. The paired t test is used to assess the difference in the means of 2 study groups when the sample observations have been obtained in pairs, often before and after an intervention in each study subject. The Pearson chi-square test is widely used to test the null hypothesis that 2 unpaired categorical variables, each with 2 or more nominal levels (values), are independent of each other. When the null hypothesis is rejected, one concludes that there is a probable association between the 2 unpaired categorical variables. When comparing 2 groups on an ordinal or nonnormally distributed continuous outcome variable, the 2-sample t test is usually not appropriate; a nonparametric test such as the Mann-Whitney U (Wilcoxon rank-sum) test should instead be applied.
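As a minimal illustration of two of the tests discussed, the unpaired t statistic and the Pearson chi-square statistic can be computed from first principles (the data are toy values; p-values would additionally require the t and chi-square reference distributions):

```python
import math

def unpaired_t(x, y):
    """Pooled two-sample (unpaired) t statistic for two independent groups."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)  # pooled variance
    return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))

def chi2_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table of counts."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    return sum((table[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
               for i in range(2) for j in range(2))

t = unpaired_t([4, 5, 6, 7], [6, 7, 8, 9])    # two independent continuous samples
chi2 = chi2_2x2([[30, 10], [22, 18]])         # two unpaired categorical variables
```

In practice one would use a statistics library rather than hand-rolled formulas; the point here is only to make the test statistics concrete.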

  5. The relative performance of bivariate causality tests in small samples

    NARCIS (Netherlands)

Bult, J.R.; Leeflang, P.S.H.; Wittink, D.R.

    1997-01-01

Causality tests have been applied to establish directional effects and to reduce the set of potential predictors. For the latter type of application only bivariate tests can be used. In this study we compare bivariate causality tests. Although the problem addressed is general and could benefit

  6. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) is obtained when X1 and X2 follow a general bivariate distribution. Such distributions include the bivariate compound Weibull, bivariate compound Gompertz and bivariate compound Pareto, among others. In the parametric case, the maximum likelihood estimates of the parameters and of the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulation is performed, based on different sample sizes, to study the performance of the estimates.
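A Monte Carlo sketch of R = P(X1 < X2) for dependent stress and strength: here dependence is induced by a Gaussian copula with Weibull margins, an illustrative stand-in for the paper's bivariate compound Weibull distribution, and all parameter values are arbitrary:

```python
import math
import numpy as np

rng = np.random.default_rng(42)

def stress_strength_mc(rho, n=200_000, k=1.5, scale1=1.0, scale2=2.0):
    """Estimate R = P(X1 < X2) where stress X1 and strength X2 have
    Weibull(k, scale) margins joined by a Gaussian copula with correlation rho."""
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
    norm_cdf = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
    u1, u2 = norm_cdf(z1), norm_cdf(z2)        # normals -> uniforms
    # probability-integral transform: uniform -> Weibull margins
    x1 = scale1 * (-np.log1p(-u1)) ** (1.0 / k)
    x2 = scale2 * (-np.log1p(-u2)) ** (1.0 / k)
    return float(np.mean(x1 < x2))

r_indep = stress_strength_mc(rho=0.0)   # check against the independent case
r_dep = stress_strength_mc(rho=0.6)     # dependent stress and strength
```

At rho = 0 the estimate can be checked against the closed form for independent Weibulls with common shape k, R = 1 / (1 + (scale1/scale2)**k).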

  7. A dynamic bivariate Poisson model for analysing and forecasting match results in the English Premier League

    NARCIS (Netherlands)

    Koopman, S.J.; Lit, R.

    2015-01-01

    Summary: We develop a statistical model for the analysis and forecasting of football match results which assumes a bivariate Poisson distribution with intensity coefficients that change stochastically over time. The dynamic model is a novelty in the statistical time series analysis of match results

  8. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  9. Can the bivariate Hurst exponent be higher than an average of the separate Hurst exponents?

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    2015-01-01

Roč. 431, č. 1 (2015), s. 124-127 ISSN 0378-4371 R&D Projects: GA ČR(CZ) GP14-11402P Institutional support: RVO:67985556 Keywords: Correlations * Power-law cross-correlations * Bivariate Hurst exponent * Spectrum coherence Subject RIV: AH - Economics Impact factor: 1.785, year: 2015 http://library.utia.cas.cz/separaty/2015/E/kristoufek-0452314.pdf

  10. International Space Station Future Correlation Analysis Improvements

    Science.gov (United States)

    Laible, Michael R.; Pinnamaneni, Murthy; Sugavanam, Sujatha; Grygier, Michael

    2018-01-01

    Ongoing modal analyses and model correlation are performed on different configurations of the International Space Station (ISS). These analyses utilize on-orbit dynamic measurements collected using four main ISS instrumentation systems: External Wireless Instrumentation System (EWIS), Internal Wireless Instrumentation System (IWIS), Space Acceleration Measurement System (SAMS), and Structural Dynamic Measurement System (SDMS). Remote Sensor Units (RSUs) are network relay stations that acquire flight data from sensors. Measured data is stored in the Remote Sensor Unit (RSU) until it receives a command to download data via RF to the Network Control Unit (NCU). Since each RSU has its own clock, it is necessary to synchronize measurements before analysis. Imprecise synchronization impacts analysis results. A study was performed to evaluate three different synchronization techniques: (i) measurements visually aligned to analytical time-response data using model comparison, (ii) Frequency Domain Decomposition (FDD), and (iii) lag from cross-correlation to align measurements. This paper presents the results of this study.
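Synchronization technique (iii), aligning two records by the lag that maximizes their cross-correlation, can be sketched as follows (the signal and the 17-sample delay are synthetic, not ISS data):

```python
import numpy as np

def estimate_delay(a, b):
    """Return the number of samples by which record b lags record a,
    taken as the argmax of their full (mean-removed) cross-correlation."""
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    xc = np.correlate(a, b, mode="full")
    return (len(b) - 1) - int(np.argmax(xc))

rng = np.random.default_rng(7)
a = rng.standard_normal(400)                    # reference channel
b = np.concatenate([np.zeros(17), a])[:400]     # same signal, delayed 17 samples
lag = estimate_delay(a, b)                      # recovers the 17-sample delay
```

Once the lag is known, one record is shifted by that many samples before modal analysis, which is the essence of aligning independently clocked RSU measurements.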

  11. STUDI PERBANDINGAN ANTARA ALGORITMA BIVARIATE MARGINAL DISTRIBUTION DENGAN ALGORITMA GENETIKA

    Directory of Open Access Journals (Sweden)

    Chastine Fatichah

    2006-01-01

Full Text Available The Bivariate Marginal Distribution Algorithm is an extension of the Estimation of Distribution Algorithm. This heuristic algorithm proposes a new approach to recombination for generating new individuals, without the crossover and mutation processes of a genetic algorithm. The Bivariate Marginal Distribution Algorithm uses the connectivity between pairs of variables for recombination to generate new individuals; this connectivity between variables is discovered during the optimization process. In this research, the performance of a genetic algorithm with one-point crossover is compared with the performance of the Bivariate Marginal Distribution Algorithm on the Onemax problem, the De Jong F2 function, and the Traveling Salesman Problem. The experimental results show that the performance of both algorithms depends on their respective parameters and on the population size used. For the Onemax case with small problem sizes, the genetic algorithm performs better, needing fewer iterations and less time to reach the optimum. However, the Bivariate Marginal Distribution Algorithm yields better optimization results for Onemax with large problem sizes. For the De Jong F2 function, the genetic algorithm outperforms the Bivariate Marginal Distribution Algorithm in number of iterations and time. For the Traveling Salesman Problem, the Bivariate Marginal Distribution Algorithm obtains better optimization results than the genetic algorithm.

  12. Bivariate discrete beta Kernel graduation of mortality data.

    Science.gov (United States)

    Mazza, Angelo; Punzo, Antonio

    2015-07-01

Various parametric/nonparametric techniques have been proposed in the literature to graduate mortality data as a function of age. Nonparametric approaches, such as kernel smoothing regression, are often preferred because they do not assume any particular mortality law. Among the existing kernel smoothing approaches, the recently proposed (univariate) discrete beta kernel smoother has been shown to provide some benefits. Bivariate graduation, over age and calendar years or durations, is common practice in demography and actuarial sciences. In this paper, we generalize the discrete beta kernel smoother to the bivariate case, and we introduce an adaptive bandwidth variant that may provide additional benefits when data on exposures to the risk of death are available; furthermore, we outline a cross-validation procedure for bandwidth selection. Using simulation studies, we compare the bivariate approach proposed here with its corresponding univariate formulation and with two popular nonparametric bivariate graduation techniques, based on Epanechnikov kernels and on P-splines. To make the simulations realistic, a bivariate dataset, based on probabilities of dying recorded for US males, is used. The simulations confirm the gain in performance of the new bivariate approach with respect to both the univariate and the bivariate competitors.

  13. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. Discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.

  14. International Space Station Model Correlation Analysis

    Science.gov (United States)

    Laible, Michael R.; Fitzpatrick, Kristin; Hodge, Jennifer; Grygier, Michael

    2018-01-01

    This paper summarizes the on-orbit structural dynamic data and the related modal analysis, model validation and correlation performed for the International Space Station (ISS) configuration ISS Stage ULF7, 2015 Dedicated Thruster Firing (DTF). The objective of this analysis is to validate and correlate the analytical models used to calculate the ISS internal dynamic loads and compare the 2015 DTF with previous tests. During the ISS configurations under consideration, on-orbit dynamic measurements were collected using the three main ISS instrumentation systems; Internal Wireless Instrumentation System (IWIS), External Wireless Instrumentation System (EWIS) and the Structural Dynamic Measurement System (SDMS). The measurements were recorded during several nominal on-orbit DTF tests on August 18, 2015. Experimental modal analyses were performed on the measured data to extract modal parameters including frequency, damping, and mode shape information. Correlation and comparisons between test and analytical frequencies and mode shapes were performed to assess the accuracy of the analytical models for the configurations under consideration. These mode shapes were also compared to earlier tests. Based on the frequency comparisons, the accuracy of the mathematical models is assessed and model refinement recommendations are given. In particular, results of the first fundamental mode will be discussed, nonlinear results will be shown, and accelerometer placement will be assessed.

  15. System Reliability Analysis Considering Correlation of Performances

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Saekyeol; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of); Lim, Woochul [Mando Corporation, Seongnam (Korea, Republic of)

    2017-04-15

    Reliability analysis of a mechanical system has been developed in order to consider the uncertainties in the product design that may occur from the tolerance of design variables, uncertainties of noise, environmental factors, and material properties. In most of the previous studies, the reliability was calculated independently for each performance of the system. However, the conventional methods cannot consider the correlation between the performances of the system that may lead to a difference between the reliability of the entire system and the reliability of the individual performance. In this paper, the joint probability density function (PDF) of the performances is modeled using a copula which takes into account the correlation between performances of the system. The system reliability is proposed as the integral of joint PDF of performances and is compared with the individual reliability of each performance by mathematical examples and two-bar truss example.
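The gap between treating performances independently and integrating their joint PDF can be sketched by Monte Carlo. Standard-normal performance margins with a Gaussian dependence structure are an illustrative choice (the paper uses copulas on general performances); for this special case an exact orthant probability is available for comparison:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def system_reliability(rho, n=400_000):
    """P(both performance margins > 0) for two correlated standard-normal
    margins, estimated by Monte Carlo integration of the joint PDF."""
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
    return float(np.mean((z1 > 0) & (z2 > 0)))

rho = 0.5
mc = system_reliability(rho)
exact = 0.25 + math.asin(rho) / (2.0 * math.pi)  # bivariate normal orthant prob.
```

With rho = 0.5 the joint reliability is 1/3, whereas multiplying the two individual reliabilities (0.5 each) gives 0.25: ignoring positive correlation here understates system reliability, which is the paper's point.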

  16. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2010-01-01

The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.

  17. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2009-01-01

The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  18. System Reliability Analysis Considering Correlation of Performances

    International Nuclear Information System (INIS)

    Kim, Saekyeol; Lee, Tae Hee; Lim, Woochul

    2017-01-01

    Reliability analysis of a mechanical system has been developed in order to consider the uncertainties in the product design that may occur from the tolerance of design variables, uncertainties of noise, environmental factors, and material properties. In most of the previous studies, the reliability was calculated independently for each performance of the system. However, the conventional methods cannot consider the correlation between the performances of the system that may lead to a difference between the reliability of the entire system and the reliability of the individual performance. In this paper, the joint probability density function (PDF) of the performances is modeled using a copula which takes into account the correlation between performances of the system. The system reliability is proposed as the integral of joint PDF of performances and is compared with the individual reliability of each performance by mathematical examples and two-bar truss example.

  19. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting

    2011-03-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  20. Approximation of bivariate copulas by patched bivariate Fréchet copulas

    KAUST Repository

    Zheng, Yanting; Yang, Jingping; Huang, Jianhua Z.

    2011-01-01

    Bivariate Fréchet (BF) copulas characterize dependence as a mixture of three simple structures: comonotonicity, independence and countermonotonicity. They are easily interpretable but have limitations when used as approximations to general dependence structures. To improve the approximation property of the BF copulas and keep the advantage of easy interpretation, we develop a new copula approximation scheme by using BF copulas locally and patching the local pieces together. Error bounds and a probabilistic interpretation of this approximation scheme are developed. The new approximation scheme is compared with several existing copula approximations, including shuffle of min, checkmin, checkerboard and Bernstein approximations and exhibits better performance, especially in characterizing the local dependence. The utility of the new approximation scheme in insurance and finance is illustrated in the computation of the rainbow option prices and stop-loss premiums. © 2010 Elsevier B.V.

  1. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

The Poisson distribution is a discrete distribution with count data as the random variable and one parameter that defines both mean and variance. Poisson regression assumes that mean and variance are the same (equidispersion). Nonetheless, in some cases count data do not satisfy this assumption, because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion causes underestimated standard errors and, in turn, incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution. If there is over-dispersion, simple bivariate Poisson regression is not sufficient for modeling paired count data. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion. The BPIGR model produces a global model for all locations. On the other hand, each location has different geographic, social, cultural and economic conditions, so Geographically Weighted Regression (GWR) is needed. The weighting function of each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimates of the GWBPIGR model are obtained by the Maximum Likelihood Estimation (MLE) method. Meanwhile, hypothesis testing of the GWBPIGR model is carried out by the Maximum Likelihood Ratio Test (MLRT) method.

  2. Bivariational calculations for radiation transfer in an inhomogeneous participating media

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Machali, H.M.; Haggag, M.H.; Attia, M.T.

    1986-07-01

    Equations for radiation transfer are obtained for dispersive media with space dependent albedo. Bivariational bound principle is used to calculate the reflection and transmission coefficients for such media. Numerical results are given and compared. (author)

  3. Interpreting canonical correlation analysis through biplots of structure correlations and weights

    NARCIS (Netherlands)

    Braak, ter C.J.F.

    1990-01-01

This paper extends the biplot technique to canonical correlation analysis and redundancy analysis. The plot of structure correlations is shown to be optimal for displaying the pairwise correlations between the variables of one set and those of the second. The link between multivariate

  4. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, the type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem for univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  5. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
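The global spatial cross-correlation coefficient described above can be sketched as a spatial quadratic form, following the Moran's-index analogy in the abstract. The normalization conventions here (population standardization, weight matrix scaled to unit total) are assumptions for illustration, not the paper's exact definitions:

```python
import numpy as np

def global_cross_corr(x, y, W):
    """Global spatial cross-correlation of attributes x and y over a spatial
    weight matrix W: Rc = zx' W zy with standardized variables and W
    normalized to unit total. With y = x this reduces to a Moran-type index."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    Wn = W / W.sum()
    return float(zx @ Wn @ zy)

# a chain of 6 locations with neighbors i-1 and i+1
n = 6
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
x = np.arange(n, dtype=float)     # attribute rising along the chain
y = 2.0 * x + 1.0                 # linearly co-varying attribute
rc = global_cross_corr(x, y, W)   # positive: neighboring x values track y
```

Because both attributes rise smoothly along the chain, neighboring values co-vary and the coefficient comes out positive, the spatial analogue of a positive Pearson correlation.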

  6. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120

  7. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    Science.gov (United States)

    Campbell, C. W.

    1983-01-01

An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact, and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm, and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
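The construction described can be sketched in a few lines: two independent standard normal draws are combined so that the resulting pair has the selected means, standard deviations, and correlation coefficient (the stdlib random module stands in here for the original FORTRAN routine):

```python
import math
import random

def bivariate_normal_pair(mu1, mu2, s1, s2, rho, rng=random):
    """One (x, y) draw from a bivariate normal with means mu1, mu2, standard
    deviations s1, s2, and correlation rho, built from two independent
    N(0, 1) variates via the standard conditional construction."""
    z1 = rng.gauss(0.0, 1.0)
    z2 = rng.gauss(0.0, 1.0)
    x = mu1 + s1 * z1
    y = mu2 + s2 * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
    return x, y

random.seed(2024)
pairs = [bivariate_normal_pair(10.0, -3.0, 2.0, 0.5, 0.6) for _ in range(100_000)]
```

The empirical correlation of a large sample of such pairs converges to the requested rho, which mirrors the accuracy check the abstract describes.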

  8. Correlation analysis of fracture arrangement in space

    Science.gov (United States)

    Marrett, Randall; Gale, Julia F. W.; Gómez, Leonel A.; Laubach, Stephen E.

    2018-03-01

We present new techniques that overcome limitations of standard approaches to documenting spatial arrangement. The new techniques directly quantify spatial arrangement by normalizing to expected values for randomly arranged fractures. The techniques differ in terms of computational intensity, robustness of results, ability to detect anti-correlation, and use of fracture size data. Variation of spatial arrangement across a broad range of length scales facilitates distinguishing clustered and periodic arrangements (opposite forms of organization) from random arrangements. Moreover, self-organized arrangements can be distinguished from arrangements due to extrinsic organization. Traditional techniques for analysis of fracture spacing are hamstrung because they account neither for the sequence of fracture spacings nor for possible coordination between fracture size and position, attributes accounted for by our methods. All of the new techniques reveal fractal clustering in a test case of veins, or cement-filled opening-mode fractures, in Pennsylvanian Marble Falls Limestone. The observed arrangement is readily distinguishable from random and periodic arrangements. Comparison of results that account for fracture size with results that ignore fracture size demonstrates that spatial arrangement is dominated by the sequence of fracture spacings, rather than coordination of fracture size with position. Fracture size and position are not completely independent in this example, however, because large fractures are more clustered than small fractures. Both spatial and size organization of veins here probably emerged from fracture interaction during growth. The new approaches described here, along with freely available software to implement the techniques, can be applied with effect to a wide range of structures, or indeed many other phenomena such as drilling response, where spatial heterogeneity is an issue.

  9. Analysis of Baryon Angular Correlations with Pythia

    CERN Document Server

    Mccune, Amara

    2017-01-01

Our current understanding of baryon production is encompassed in the framework of the Lund String Fragmentation Model, which is encoded in the Monte Carlo event generator Pythia. In proton-proton collisions, daughter particles of the same baryon number produce an anti-correlation in $\Delta\eta\Delta\varphi$ space in ALICE data, while Pythia predicts a correlation. To understand this unusual effect, where it comes from, and where our models of baryon production go wrong, correlation functions were systematically generated with Pythia. Effects of energy scaling, color reconnection, and popcorn parameters were investigated.

  10. Optimizing an objective function under a bivariate probability model

    NARCIS (Netherlands)

    X. Brusset; N.M. Temme (Nico)

    2007-01-01

The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be

  11. GIS-Based bivariate statistical techniques for groundwater potential ...

    Indian Academy of Sciences (India)

    24

This study shows the potency of two GIS-based data-driven bivariate techniques, namely ... In view of these weaknesses, there is a strong requirement for reassessment of ... West Bengal (India) using remote sensing, geographical information system and multi-

  12. Building Bivariate Tables: The compareGroups Package for R

    Directory of Open Access Journals (Sweden)

    Isaac Subirana

    2014-05-01

The R package compareGroups provides functions meant to facilitate the construction of bivariate tables (descriptives of several variables for comparison between groups) and generates reports in several formats (LaTeX, HTML or plain text CSV). Moreover, bivariate tables can be viewed directly on the R console in a nice format. A graphical user interface (GUI) has been implemented to build the bivariate tables more easily for those users who are not familiar with the R software. Some new functions and methods have been incorporated in the newest version of the compareGroups package (version 1.x) to deal with time-to-event variables, stratifying tables, merging several tables, and revising the statistical methods used. The GUI interface also has been improved, making it much easier and more intuitive to set the inputs for building the bivariate tables. The first version (version 0.x) and this version were presented at the 2010 useR! conference (Sanz, Subirana, and Vila 2010) and the 2011 useR! conference (Sanz, Subirana, and Vila 2011), respectively. Package compareGroups is available from the Comprehensive R Archive Network at http://CRAN.R-project.org/package=compareGroups.

  13. About some properties of bivariate splines with shape parameters

    Science.gov (United States)

    Caliò, F.; Marchetti, E.

    2017-07-01

    The paper presents and proves geometrical properties of a particular bivariate function spline, built and algorithmically implemented in previous papers. The properties typical of this family of splines impact the field of computer graphics in particular that of the reverse engineering.

  14. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization.

    Directory of Open Access Journals (Sweden)

    Margaretha A Vink

Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused e.g. by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females) carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination.

  15. Statistical analysis of angular correlation measurements

    International Nuclear Information System (INIS)

    Oliveira, R.A.A.M. de.

    1986-01-01

Obtaining the multipole mixing ratio, δ, of γ transitions in angular correlation measurements is a statistical problem characterized by the small number of angles at which the observation is made and by the limited counting statistics, α. The nonexistence of a sufficient statistic for the estimator of δ is shown. Three different estimators for δ were constructed and their properties of consistency, bias and efficiency were tested. Tests were also performed on experimental results obtained in γ-γ directional correlation measurements. (Author) [pt

  16. Uncertainty analysis with statistically correlated failure data

    International Nuclear Information System (INIS)

    Modarres, M.; Dezfuli, H.; Roush, M.L.

    1987-01-01

Likelihood of occurrence of the top event of a fault tree or of sequences of an event tree is estimated from the failure probability of components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top event failure probability of fault trees, or the frequency of occurrence of event tree sequences, when event failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment matching technique is used to obtain the probability distribution function of the top event through fitting the Johnson S_B distribution. The computer program, CORRELATE, was developed to perform the calculations necessary for the implementation of the method developed. (author)

  17. A hybrid correlation analysis with application to imaging genetics

    Science.gov (United States)

    Hu, Wenxing; Fang, Jian; Calhoun, Vince D.; Wang, Yu-Ping

    2018-03-01

    Investigating the association between brain regions and genes continues to be a challenging topic in imaging genetics. Current brain region of interest (ROI)-gene association studies normally reduce data dimension by averaging the value of voxels in each ROI. This averaging may lead to a loss of information due to the existence of functional sub-regions. Pearson correlation is widely used for association analysis. However, it only detects linear correlation whereas nonlinear correlation may exist among ROIs. In this work, we introduced distance correlation to ROI-gene association analysis, which can detect both linear and nonlinear correlations and overcome the limitation of averaging operations by taking advantage of the information at each voxel. Nevertheless, distance correlation usually has a much lower value than Pearson correlation. To address this problem, we proposed a hybrid correlation analysis approach, by applying canonical correlation analysis (CCA) to the distance covariance matrix instead of directly computing distance correlation. Incorporating CCA into distance correlation approach may be more suitable for complex disease study because it can detect highly associated pairs of ROI and gene groups, and may improve the distance correlation level and statistical power. In addition, we developed a novel nonlinear CCA, called distance kernel CCA, which seeks the optimal combination of features with the most significant dependence. This approach was applied to imaging genetic data from the Philadelphia Neurodevelopmental Cohort (PNC). Experiments showed that our hybrid approach produced more consistent results than conventional CCA across resampling and both the correlation and statistical significance were increased compared to distance correlation analysis. Further gene enrichment analysis and region of interest (ROI) analysis confirmed the associations of the identified genes with brain ROIs. 
Therefore, our approach provides a powerful tool for finding
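The sample distance correlation at the core of this record can be computed from double-centered pairwise distance matrices. The sketch below is a minimal plain-NumPy illustration of that statistic only; the hybrid CCA step, the distance kernel CCA, and the PNC data are not reproduced, and the synthetic quadratic example is an assumption chosen to show the contrast with Pearson correlation.

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation between two 1-D samples (Szekely-style)."""
    x = np.asarray(x, float)[:, None]
    y = np.asarray(y, float)[:, None]
    a = np.abs(x - x.T)                       # pairwise distance matrices
    b = np.abs(y - y.T)
    # double-center each distance matrix
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    if dvar_x * dvar_y == 0:
        return 0.0
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 500)
print(distance_correlation(t, t**2))    # clearly positive: nonlinear dependence detected
print(abs(np.corrcoef(t, t**2)[0, 1]))  # Pearson is near 0 here
```

This illustrates the point made in the abstract: a purely nonlinear (here quadratic) relation is invisible to Pearson correlation but not to distance correlation.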

  18. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Science.gov (United States)

    Zscheischler, Jakob; Orth, Rene; Seneviratne, Sonia I.

    2017-07-01

    Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature-precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate-crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
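A bivariate return period for a compound condition such as "hot and dry" can be estimated empirically as the inverse of the joint exceedance probability. The sketch below is only an illustrative empirical estimator on synthetic, negatively correlated temperature-precipitation data; it is not the authors' method, which builds return periods along temperature-precipitation gradients from observed climate and crop data.

```python
import numpy as np

def joint_return_period(temp, precip, t_thresh, p_thresh, block_years=1.0):
    """Empirical return period (years) of 'hot and dry': T > t_thresh and P < p_thresh.
    block_years: how many years each (temp, precip) sample represents."""
    temp = np.asarray(temp, float)
    precip = np.asarray(precip, float)
    p_joint = np.mean((temp > t_thresh) & (precip < p_thresh))
    if p_joint == 0:
        return np.inf
    return block_years / p_joint

# synthetic summer means with negatively correlated temperature and precipitation
rng = np.random.default_rng(1)
z = rng.multivariate_normal([20.0, 200.0], [[4.0, -15.0], [-15.0, 2500.0]], size=2000)
T, P = z[:, 0], z[:, 1]
rp = joint_return_period(T, P, np.quantile(T, 0.9), np.quantile(P, 0.1))
print(rp)
```

Because hot and dry conditions co-occur more often than independence would suggest, the joint return period here is shorter than the roughly 100 years implied by multiplying the two 10% marginal probabilities.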

  19. Thematic mapper studies band correlation analysis

    Science.gov (United States)

    Ungar, S. G.; Kiang, R.

    1976-01-01

Spectral data representative of thematic mapper candidate bands 1 and 3 to 7 were obtained by selecting appropriate combinations of bands from the JSC 24-channel multispectral scanner. Of all the bands assigned, only candidate bands 4 (0.74 μm to 0.80 μm) and 5 (0.80 μm to 0.91 μm) showed consistently high intercorrelation from region to region and time to time. This extremely high correlation persisted when looking at the composite data set in a multitemporal, multilocation domain. The GISS investigations lend positive confirmation to the hypothesis that TM bands 4 and 5 are redundant.

  20. Handwriting: Feature Correlation Analysis for Biometric Hashes

    Science.gov (United States)

    Vielhauer, Claus; Steinmetz, Ralf

    2004-12-01

    In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  1. Handwriting: Feature Correlation Analysis for Biometric Hashes

    Directory of Open Access Journals (Sweden)

    Ralf Steinmetz

    2004-04-01

In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  2. Semiclassical analysis of spectral correlations in mesoscopic systems

    International Nuclear Information System (INIS)

    Argaman, N.; Imry, Y.; Smilansky, U.

    1991-07-01

We consider the recently developed semiclassical analysis of the quantum mechanical spectral form factor, which may be expressed in terms of classically definable properties. When applied to electrons whose classical behaviour is diffusive, the results of earlier quantum mechanical perturbative derivations, which were developed under a different set of assumptions, are reproduced. The comparison between the two derivations shows that the results depend not on their specific details, but to a large extent on the principle of quantum coherent superposition, and on the generality of the notion of diffusion. The connection with classical properties facilitates application to many physical situations. (author)

  3. Computational approach to Thornley's problem by bivariate operational calculus

    Science.gov (United States)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for a linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  4. Two-dimensional multifractal cross-correlation analysis

    International Nuclear Information System (INIS)

    Xi, Caiping; Zhang, Shuning; Xiong, Gang; Zhao, Huichang; Yang, Yonghong

    2017-01-01

Highlights: • We study the mathematical models of 2D-MFXPF, 2D-MFXDFA and 2D-MFXDMA. • Present the definition of the two-dimensional N²-partitioned multiplicative cascading process. • Do the comparative analysis of 2D-MC by 2D-MFXPF, 2D-MFXDFA and 2D-MFXDMA. • Provide a reference on the choice and parameter settings of these methods in practice. - Abstract: There are a number of situations in which several signals are simultaneously recorded in complex systems, which exhibit long-term power-law cross-correlations. This paper presents two-dimensional multifractal cross-correlation analysis based on the partition function (2D-MFXPF), two-dimensional multifractal cross-correlation analysis based on the detrended fluctuation analysis (2D-MFXDFA) and two-dimensional multifractal cross-correlation analysis based on the detrended moving average analysis (2D-MFXDMA). We apply these methods to pairs of two-dimensional multiplicative cascades (2D-MC) to do a comparative study. Then, we apply the two-dimensional multifractal cross-correlation analysis based on the detrended fluctuation analysis (2D-MFXDFA) to real images and unveil intriguing multifractality in the cross correlations of the material structures. At last, we give the main conclusions and provide a valuable reference on how to choose the multifractal algorithms in the potential applications in the field of SAR image classification and detection.

  5. Psychobiological Correlates of Vaginismus: An Exploratory Analysis.

    Science.gov (United States)

    Maseroli, Elisa; Scavello, Irene; Cipriani, Sarah; Palma, Manuela; Fambrini, Massimiliano; Corona, Giovanni; Mannucci, Edoardo; Maggi, Mario; Vignozzi, Linda

    2017-11-01

Evidence concerning the determinants of vaginismus (V), in particular medical conditions, is inconclusive. To investigate, in a cohort of subjects consulting for female sexual dysfunction, whether there is a difference in medical and psychosocial parameters between women with V and women with other sexual complaints. A series of 255 women attending our clinic for female sexual dysfunction was consecutively recruited. V was diagnosed according to Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision criteria. Lifelong and acquired V cases were included. Patients underwent a structured interview and physical, gynecologic, laboratory, and clitoral ultrasound examinations; they completed the Female Sexual Function Index (FSFI), the Middlesex Hospital Questionnaire, the Female Sexual Distress Scale-Revised (FSDS), and the Body Uneasiness Test. V was diagnosed in 20 patients (7.8%). Women with V were significantly younger than the rest of the sample (P ...). Psychobiological Correlates of Vaginismus: An Exploratory Analysis. J Sex Med 2017;14:1392-1402. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.

  6. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    Directory of Open Access Journals (Sweden)

    Rupert Faltermeier

    2015-01-01

Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real time application of this method at an ICU it is inevitable to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we will introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.

  7. Nonlinear canonical correlation analysis with k sets of variables

    NARCIS (Netherlands)

    van der Burg, Eeke; de Leeuw, Jan

    1987-01-01

    The multivariate technique OVERALS is introduced as a non-linear generalization of canonical correlation analysis (CCA). First, two sets CCA is introduced. Two sets CCA is a technique that computes linear combinations of sets of variables that correlate in an optimal way. Two sets CCA is then

  8. Ten Years Trend Analysis of Malaria Prevalence and its Correlation ...

    African Journals Online (AJOL)

    The data were analyzed using SPSS software package 16.0. Pearson's correlation analysis was conducted to see the correlation between plasmodium species and climatic variables. Within the last decade (2004–2013) a total of 30,070 blood films were examined for malaria in Sire health center and of this 6036 (20.07%) ...

  9. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    Science.gov (United States)

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real time application of this method at an ICU it is inevitable to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we will introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.

  10. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated inputs often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
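The standard first-order (Taylor) propagation behind this kind of analysis reduces to a quadratic form in the gradient and the input covariance matrix, Var(f) ≈ gᵀ C g. The sketch below is a minimal illustration of that formula on a toy sum f(x, y) = x + y, not the authors' full analytic method; the numbers are assumed for the example.

```python
import numpy as np

def propagate_variance(grad, cov):
    """First-order (Taylor) variance of f(x) given the gradient at the mean
    and the covariance matrix of the inputs: Var(f) ~= g^T C g."""
    g = np.asarray(grad, float)
    C = np.asarray(cov, float)
    return g @ C @ g

# f(x, y) = x + y with correlated inputs: Var = Vx + Vy + 2*Cov(x, y)
Vx, Vy, rho = 4.0, 9.0, 0.5
C = np.array([[Vx, rho * np.sqrt(Vx * Vy)],
              [rho * np.sqrt(Vx * Vy), Vy]])
var_corr = propagate_variance([1.0, 1.0], C)
var_indep = Vx + Vy
print(var_corr, var_indep)  # 19.0 13.0: ignoring the correlation understates the variance
```

The gap between the two numbers is exactly the 2·Cov term, which is what the independence assumption throws away.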

  11. GIS and correlation analysis of geo-environmental variables ...

    African Journals Online (AJOL)

    Key words: Correlation, GIS, malaria geography, malaria incidence ... problems, as it has created the possibility for geocoding, extracting and spatial analysis of health ...... Bulletin of the World Health Organization, 78(12), 1438–1444. Carter ...

  12. Bivariate generalized Pareto distribution for extreme atmospheric particulate matter

    Science.gov (United States)

    Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma; Aris, Ahmad Zaharin

    2015-02-01

High particulate matter (PM10) levels are a prominent issue, causing various impacts on human health and seriously affecting the economy. The asymptotic theory of extreme values is applied to analyze the relation between extreme PM10 data from two nearby air quality monitoring stations. The series of daily maxima of PM10 for the Johor Bahru and Pasir Gudang stations are considered for the 2001 to 2010 databases. The 85% and 95% marginal quantiles are applied to determine the threshold values and hence construct the series of exceedances over the chosen threshold. The logistic, asymmetric logistic, negative logistic and asymmetric negative logistic models are considered as the dependence function of the joint distribution of a bivariate observation. Maximum likelihood estimation is employed for parameter estimation. The best fitted model is chosen based on the Akaike Information Criterion and the quantile plots. It is found that the asymmetric logistic model gives the best fit for the bivariate extreme PM10 data and shows weak dependence between the two stations.
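The marginal building block of this approach, fitting a generalized Pareto distribution to exceedances over the 95% quantile, can be sketched with SciPy. The data below are synthetic gamma-distributed stand-ins, not the Johor Bahru/Pasir Gudang series, and the bivariate logistic dependence model is not reproduced here.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
pm10 = rng.gamma(shape=3.0, scale=20.0, size=3650)  # stand-in for daily max PM10

u = np.quantile(pm10, 0.95)       # 95% marginal quantile as threshold
exceed = pm10[pm10 > u] - u       # exceedances over the threshold

# fit a GPD to the exceedances (location fixed at 0, peaks-over-threshold style)
xi, loc, sigma = genpareto.fit(exceed, floc=0)
print(f"threshold={u:.1f}, shape={xi:.3f}, scale={sigma:.1f}")

# e.g. conditional probability that PM10 exceeds u + 50 given it exceeds u:
print(genpareto.sf(50, xi, loc=0, scale=sigma))
```

In the bivariate setting, each station's exceedance series would be modeled this way marginally before fitting one of the logistic-family dependence functions to the joint tail.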

  13. Multiscale Detrended Cross-Correlation Analysis of STOCK Markets

    Science.gov (United States)

    Yin, Yi; Shang, Pengjian

    2014-06-01

In this paper, we employ detrended cross-correlation analysis (DCCA) to investigate the cross-correlations between different stock markets. We report the results of cross-correlated behaviors in US, Chinese and European stock markets in the period 1997-2012 by using the DCCA method. The DCCA shows the cross-correlated behaviors of intra-regional and inter-regional stock markets in the short and long term, which display the similarities and differences of cross-correlated behaviors simply and roughly, and the persistence of cross-correlated behaviors of fluctuations. Then, because of the limitation and inapplicability of the DCCA method, we propose the multiscale detrended cross-correlation analysis (MSDCCA) method to avoid "a priori" selecting the ranges of scales over which two coefficients of the classical DCCA method are identified, and employ MSDCCA to reanalyze these cross-correlations to exhibit some important details, such as the existence and position of minimum, maximum and bimodal distribution, which are lost if the scale structure is described by two coefficients only, and essential differences and similarities in the scale structures of cross-correlation of intra-regional and inter-regional markets. More statistical characteristics of cross-correlation obtained by the MSDCCA method help us to understand how two different stock markets influence each other and to analyze the influence from the two inter-regional markets on the cross-correlation in detail; thus we get a richer and more detailed knowledge of the complex evolution of the dynamics of the cross-correlations between stock markets. The application of MSDCCA is important to promote our understanding of the internal mechanisms and structures of financial markets and helps to forecast stock indices based on our current results demonstrating the cross-correlations between stock indices. We also discuss the MSDCCA methods of secant rolling window with different sizes and, lastly, provide some relevant implications and
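The basic DCCA step underlying both methods in this record, integrating the two series, detrending within windows, and averaging the detrended covariance, yields the detrended cross-correlation coefficient at a given scale. The sketch below is a simplified single-scale version on synthetic data with a shared component (Zebende-style coefficient, non-overlapping windows, linear detrending); the multiscale MSDCCA machinery is not reproduced.

```python
import numpy as np

def dcca_coefficient(x, y, n):
    """Detrended cross-correlation coefficient rho_DCCA at window size n."""
    x = np.cumsum(np.asarray(x, float) - np.mean(x))  # integrated profiles
    y = np.cumsum(np.asarray(y, float) - np.mean(y))
    N = len(x)
    fxy = fxx = fyy = 0.0
    count = 0
    for start in range(0, N - n + 1, n):              # non-overlapping windows
        t = np.arange(n)
        seg_x, seg_y = x[start:start + n], y[start:start + n]
        # linear detrend within the window
        rx = seg_x - np.polyval(np.polyfit(t, seg_x, 1), t)
        ry = seg_y - np.polyval(np.polyfit(t, seg_y, 1), t)
        fxy += np.mean(rx * ry)
        fxx += np.mean(rx ** 2)
        fyy += np.mean(ry ** 2)
        count += 1
    return (fxy / count) / np.sqrt((fxx / count) * (fyy / count))

rng = np.random.default_rng(3)
common = rng.standard_normal(4096)                    # shared market factor
a = common + 0.5 * rng.standard_normal(4096)
b = common + 0.5 * rng.standard_normal(4096)
print(dcca_coefficient(a, b, 32))  # close to the shared-signal correlation (~0.8)
```

Sweeping the window size n and plotting the coefficient against scale gives the kind of scale structure that MSDCCA then characterizes in full rather than summarizing with two coefficients.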

  14. REGRES: A FORTRAN-77 program to calculate nonparametric and ``structural'' parametric solutions to bivariate regression equations

    Science.gov (United States)

    Rock, N. M. S.; Duffy, T. R.

REGRES allows a range of regression equations to be calculated for paired sets of data values in which both variables are subject to error (i.e. neither is the "independent" variable). Nonparametric regressions, based on medians of all possible pairwise slopes and intercepts, are treated in detail. Estimated slopes and intercepts are output, along with confidence limits, Spearman and Kendall rank correlation coefficients. Outliers can be rejected with user-determined stringency. Parametric regressions can be calculated for any value of λ (the ratio of the variances of the random errors for y and x), including: (1) major axis (λ = 1); (2) reduced major axis (λ = variance of y/variance of x); (3) Y on X (λ = ∞); or (4) X on Y (λ = 0) solutions. Pearson linear correlation coefficients also are output. REGRES provides an alternative to conventional isochron assessment techniques where bivariate normal errors cannot be assumed, or weighting methods are inappropriate.
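The nonparametric regression REGRES treats in detail, taking the median of all pairwise slopes and a median-based intercept, is the Theil-Sen estimator. A minimal NumPy sketch (not the FORTRAN-77 program itself, and without its confidence limits or outlier-rejection options) shows the idea and its robustness to a gross outlier:

```python
import numpy as np

def median_slope_regression(x, y):
    """Nonparametric fit: slope = median of all pairwise slopes (Theil-Sen style),
    intercept = median of y - slope*x."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    dx = x[:, None] - x[None, :]
    dy = y[:, None] - y[None, :]
    mask = np.triu(dx != 0, k=1)      # each pair once, skip vertical pairs
    slope = np.median(dy[mask] / dx[mask])
    intercept = np.median(y - slope * x)
    return slope, intercept

rng = np.random.default_rng(4)
x = np.linspace(0, 10, 60)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, 60)
y[5] += 40.0                          # gross outlier barely moves the fit
m, c = median_slope_regression(x, y)
print(m, c)                           # near 2.0 and 1.0 despite the outlier
```

An ordinary least-squares fit on the same data would be pulled visibly toward the outlier, which is exactly why median-based regressions suit isochron data with non-normal errors.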

  15. Meta-Analysis of Correlations Among Usability Measures

    DEFF Research Database (Denmark)

    Hornbæk, Kasper Anders Søren; Effie Lai Chong, Law

    2007-01-01

Understanding the relation between usability measures seems crucial to deepen our conception of usability and to select the right measures for usability studies. We present a meta-analysis of correlations among usability measures calculated from the raw data of 73 studies. Correlations are generally low: effectiveness measures (e.g., errors) and efficiency measures (e.g., time) have a correlation of .247 ± .059 (Pearson's product-moment correlation with 95% confidence interval), efficiency and satisfaction (e.g., preference) one of .196 ± .064, and effectiveness and satisfaction one of .164 ± .062. Changes in task complexity do not influence these correlations, but use of more complex measures attenuates them. Standard questionnaires for measuring satisfaction appear more reliable than homegrown ones. Measures of users' perceptions of phenomena are generally not correlated with objective
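One standard way to pool per-study correlations in a meta-analysis like this (not necessarily the authors' exact procedure) is the fixed-effect Fisher z approach: transform each r with atanh, average with weights n − 3, and back-transform. A minimal sketch on assumed example values:

```python
import numpy as np

def pool_correlations(rs, ns):
    """Fixed-effect pooled correlation via the Fisher z transform.
    Each study i contributes z_i = atanh(r_i) with weight n_i - 3."""
    rs = np.asarray(rs, float)
    ns = np.asarray(ns, float)
    z = np.arctanh(rs)
    w = ns - 3.0
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))                 # standard error on the z scale
    lo, hi = np.tanh(z_bar - 1.96 * se), np.tanh(z_bar + 1.96 * se)
    return np.tanh(z_bar), (lo, hi)

# hypothetical per-study correlations and sample sizes
r, ci = pool_correlations([0.30, 0.22, 0.18, 0.35], [40, 60, 25, 80])
print(f"pooled r = {r:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

The ± figures quoted in the abstract are exactly this kind of back-transformed 95% interval around a pooled correlation.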

  16. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account for animal models. The study data is extracted from ... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by the animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally, the bivariate blending method was, on the other hand, lighter than the single-step method.

  17. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    Science.gov (United States)

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve the accuracy of classification with a small amount of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed an analysis method that automatically selects the characteristic parameters based on correlation coefficient analysis. Using the five sample data sets of dataset IVa from the 2005 BCI Competition, we utilized short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the primitive electroencephalogram data, then introduced feature extraction based on common spatial patterns (CSP) and classified by linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy could be improved by using the correlation coefficient feature selection method compared with not using this algorithm. Compared with the support vector machine (SVM) feature optimization algorithm, correlation coefficient analysis can select better parameters and improve the accuracy of classification.
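The correlation-based ranking step this abstract describes can be sketched in Python; the toy data, the helper names, and the choice of keeping the top-k features are illustrative, not from the paper:

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def select_features(features, labels, k):
    """Rank feature vectors by |correlation| with the class labels, keep the top k."""
    scored = [(abs(pearson(col, labels)), i) for i, col in enumerate(features)]
    scored.sort(reverse=True)
    return [i for _, i in scored[:k]]

# toy example: feature 0 tracks the labels, feature 1 is noise-like
labels = [1, 1, 1, 0, 0, 0]
features = [
    [0.9, 0.8, 1.0, 0.1, 0.2, 0.0],   # strongly correlated with the labels
    [0.5, 0.1, 0.9, 0.6, 0.2, 0.8],   # weakly correlated
]
print(select_features(features, labels, 1))  # -> [0]
```

In the paper the ranked quantities are STFT-derived channel features rather than raw samples, but the selection logic is the same.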

  18. WGCNA: an R package for weighted correlation network analysis.

    Science.gov (United States)

    Langfelder, Peter; Horvath, Steve

    2008-12-29

    Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. 
The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
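The core construction WGCNA implements in R, a soft-thresholded correlation network with node connectivity, can be illustrated with a minimal Python sketch; the expression values and the power beta = 6 below are invented for illustration:

```python
import math

def soft_threshold_adjacency(expr, beta=6):
    """Weighted adjacency a_ij = |cor(x_i, x_j)|**beta between gene profiles.

    expr: list of per-gene expression vectors (one list per gene).
    beta: soft-thresholding power; larger beta suppresses weak correlations.
    """
    def cor(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                        sum((b - my) ** 2 for b in y))
        return num / den
    g = len(expr)
    return [[abs(cor(expr[i], expr[j])) ** beta if i != j else 1.0
             for j in range(g)] for i in range(g)]

def connectivity(adj):
    """Whole-network connectivity: row sum of the adjacency minus the self-edge."""
    return [sum(row) - 1.0 for row in adj]

expr = [
    [1.0, 2.0, 3.0, 4.0],   # gene A
    [2.1, 3.9, 6.2, 7.9],   # gene B, tracks gene A
    [5.0, 1.0, 4.0, 2.0],   # gene C, unrelated
]
adj = soft_threshold_adjacency(expr, beta=6)
# genes A and B form a tight "module": adj[0][1] is close to 1, adj[0][2] near 0
```

Module detection in the package proper then clusters genes on a topological-overlap transform of this adjacency; that step is omitted here.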

  19. Correlation analysis of respiratory signals by using parallel coordinate plots.

    Science.gov (United States)

    Saatci, Esra

    2018-01-01

    The understanding of the bonds and relationships between the respiratory signals, i.e. the airflow, the mouth pressure, the relative temperature and the relative humidity during breathing, may improve the measurement methods of respiratory mechanics and sensor designs, or open several possible applications in the analysis of respiratory disorders. Therefore, the main objective of this study was to propose a new combination of methods to determine the relationship between respiratory signals as multidimensional data. In order to reveal the coupling between the processes, two very different methods were used: well-known statistical correlation analysis (i.e. Pearson's correlation and cross-correlation coefficient) and parallel coordinate plots (PCPs). Curve bundling with the number of intersections for the correlation analysis, a Least Mean Square Time Delay Estimator (LMS-TDE) for point delay detection, and visual metrics for the recognition of visual structures were proposed and utilized in the PCP. The number of intersections increased when the correlation coefficient changed from high positive to high negative correlation between the respiratory signals, especially if the whole breath was processed. LMS-TDE coefficients plotted in the PCP matched the point delay findings of the correlation analysis well. Visual inspection of the PCP by visual metrics showed range and dispersion comparisons, entropy comparisons, and linear and sinusoidal-like relationships between the respiratory signals. It is demonstrated that basic correlation analysis together with parallel coordinate plots perceptually motivates the visual metrics in the display and thus can be considered an aid to user analysis by providing meaningful views of the data. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. SNPMClust: Bivariate Gaussian Genotype Clustering and Calling for Illumina Microarrays

    Directory of Open Access Journals (Sweden)

    Stephen W. Erickson

    2016-07-01

    SNPMClust is an R package for genotype clustering and calling with Illumina microarrays. It was originally developed for studies using the GoldenGate custom genotyping platform but can be used with other Illumina platforms, including Infinium BeadChip. The algorithm first rescales the fluorescent signal intensity data and adds empirically derived pseudo-data to minor allele genotype clusters, then uses the package mclust for bivariate Gaussian model fitting. We compared the accuracy and sensitivity of SNPMClust to that of GenCall, Illumina's proprietary algorithm, on a data set of 94 whole-genome amplified buccal (cheek swab) DNA samples. These samples were genotyped on a custom panel which included 1064 SNPs for which the true genotype was known with high confidence. SNPMClust produced uniformly lower false call rates over a wide range of overall call rates.

  1. Efficient estimation of semiparametric copula models for bivariate survival data

    KAUST Repository

    Cheng, Guang

    2014-01-01

    A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.
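The paper treats general parametric copulas; as a concrete (assumed) instance, a Clayton copula coupling two marginal survival functions can be written down directly:

```python
import math

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) = (u**-theta + v**-theta - 1)**(-1/theta), theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_survival(sx, sy, theta):
    """Bivariate survival function S(x, y) = C(S1(x), S2(y)), built by applying
    the (survival) copula to the marginal survival probabilities sx, sy."""
    return clayton_copula(sx, sy, theta)

# exponential margins S(t) = exp(-t), with an assumed dependence theta = 2
s = joint_survival(math.exp(-1.0), math.exp(-0.5), theta=2.0)
```

In the semiparametric setting of the paper the margins would be estimated nonparametrically (here they are fixed exponentials purely for illustration), and theta would be estimated by the spline-based maximum likelihood procedure.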

  2. Selection effects in the bivariate brightness distribution for spiral galaxies

    International Nuclear Information System (INIS)

    Phillipps, S.; Disney, M.

    1986-01-01

    The joint distribution of total luminosity and characteristic surface brightness (the bivariate brightness distribution) is investigated for a complete sample of spiral galaxies in the Virgo cluster. The influence of selection and physical limits of various kinds on the apparent distribution are detailed. While the distribution of surface brightness for bright galaxies may be genuinely fairly narrow, faint galaxies exist right across the (quite small) range of accessible surface brightnesses so no statement can be made about the true extent of the distribution. The lack of high surface brightness bright galaxies in the Virgo sample relative to an overall RC2 sample (mostly field galaxies) supports the contention that the star-formation rate is reduced in the inner region of the cluster for environmental reasons. (author)

  3. Correlation analysis of the Taurus molecular cloud complex

    International Nuclear Information System (INIS)

    Kleiner, S.C.

    1985-01-01

    Autocorrelation and power spectrum methods were applied to the analysis of the density and velocity structure of the Taurus Complex and Heiles Cloud 2 as traced out by ¹³CO J = 1 → 0 molecular line observations obtained with the 14 m antenna of the Five College Radio Astronomy Observatory. Statistically significant correlations in the spacing of density fluctuations within the Taurus Complex and Heiles 2 were uncovered. The length scales of the observed correlations correspond in magnitude to the Jeans wavelengths characterizing gravitational instabilities with (i) interstellar atomic hydrogen gas for the case of the Taurus complex, and (ii) molecular hydrogen for Heiles 2. The observed correlations may be the signatures of past and current gravitational instabilities frozen into the structure of the molecular gas. The appendices provide a comprehensive description of the analytical and numerical methods developed for the correlation analysis of molecular clouds.

  4. A non-stationary cost-benefit based bivariate extreme flood estimation approach

    Science.gov (United States)

    Qi, Wei; Liu, Junguo

    2018-02-01

    Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation rests on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities, in both the dependence of flood variables and the marginal distributions, on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-varying dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probabilities of exceedance calculated from copula functions and from marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool for the BSA technique, BSM (bivariate statistical modeler), is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are implemented in the newly proposed ArcMAP tool. The tool is programmed in Python and exposes a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
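Of the three BSA techniques the tool automates, the frequency ratio reduces to a one-line computation per factor class; the class counts below are hypothetical:

```python
def frequency_ratio(hazard_in_class, hazard_total, pixels_in_class, pixels_total):
    """Frequency ratio of one factor class:
    (share of hazard events in the class) / (share of map area in the class).
    FR > 1 means the class is positively associated with hazard occurrence."""
    return (hazard_in_class / hazard_total) / (pixels_in_class / pixels_total)

# hypothetical slope-class inventory: 40 of 100 landslides fall on
# a class that covers only 20 of 200 map pixels
fr = frequency_ratio(40, 100, 20, 200)
print(fr)  # -> 4.0
```

The tool's contribution is wiring such per-class computations to GIS raster layers automatically, rather than via spreadsheet bookkeeping.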

  6. Canonical correlation analysis of course and teacher evaluation

    DEFF Research Database (Denmark)

    Sliusarenko, Tamara; Ersbøll, Bjarne Kjær

    2010-01-01

    At the Technical University of Denmark course evaluations are performed by the students on a questionnaire. On one form the students are asked specific questions regarding the course. On a second form they are asked specific questions about the teacher. This study investigates the extent to which information obtained from the course evaluation form overlaps with information obtained from the teacher evaluation form. Employing canonical correlation analysis, it was found that course and teacher evaluations are correlated. However, the structure of the canonical correlation is subject to change…

  7. Analysis of charge-dependent azimuthal correlations with HADES

    Energy Technology Data Exchange (ETDEWEB)

    Kornas, Frederic [TU Darmstadt (Germany); Selyuzhenkov, Ilya [GSI (Germany); Galatyuk, Tetyana [TU Darmstadt (Germany); GSI (Germany); Collaboration: HADES-Collaboration

    2016-07-01

    Charge-dependent azimuthal correlations relative to the reaction plane have been proposed as a probe in the search for the chiral magnetic effect in relativistic heavy-ion collisions. These types of correlations have been measured at the RHIC BES by STAR and at the LHC by ALICE. This contribution discusses two-charged-particle correlations with respect to the reaction plane, measured with a high-statistics sample of Au+Au collisions at 1.23 AGeV collected by HADES. The forward wall detector allows the reaction plane to be reconstructed using the spectator fragments. The status of the analysis with protons and charged pions will be presented.

  8. Data analytics using canonical correlation analysis and Monte Carlo simulation

    Science.gov (United States)

    Rickman, Jeffrey M.; Wang, Yan; Rollett, Anthony D.; Harmer, Martin P.; Compson, Charles

    2017-07-01

    A canonical correlation analysis is a generic parametric model used in the statistical analysis of data involving interrelated or interdependent input and output variables. It is especially useful in data analytics as a dimensional reduction strategy that simplifies a complex, multidimensional parameter space by identifying a relatively few combinations of variables that are maximally correlated. One shortcoming of the canonical correlation analysis, however, is that it provides only a linear combination of variables that maximizes these correlations. With this in mind, we describe here a versatile, Monte-Carlo based methodology that is useful in identifying non-linear functions of the variables that lead to strong input/output correlations. We demonstrate that our approach leads to a substantial enhancement of correlations, as illustrated by two experimental applications of substantial interest to the materials science community, namely: (1) determining the interdependence of processing and microstructural variables associated with doped polycrystalline aluminas, and (2) relating microstructural descriptors to the electrical and optoelectronic properties of thin-film solar cells based on CuInSe2 absorbers. Finally, we describe how this approach facilitates experimental planning and process control.
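The linear CCA the authors take as their starting point can be sketched with a standard QR/SVD formulation (a generic implementation, not the authors' code; the data below are synthetic):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between column sets X (n x p) and Y (n x q).

    Center both blocks, take thin QR factorizations, and read the canonical
    correlations off the singular values of Qx^T Qy (a standard, numerically
    stable formulation of CCA)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
Y = X @ np.array([[1.0, 0.5], [0.2, 1.0]])  # Y is an invertible linear map of X
rho = canonical_correlations(X, Y)
# both canonical correlations are ~1.0, since Y lies in the column span of X
```

The Monte Carlo extension described in the abstract would then search over non-linear transformations of the columns of X and Y, rerunning this linear step to score each candidate.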

  9. Probabilistic leak-before-break analysis with correlated input parameters

    International Nuclear Information System (INIS)

    Qian Guian; Niffenegger, Markus; Karanki, Durga Rao; Li Shuxin

    2013-01-01

    Highlights: ► The correlation of crack growth rates has the most significant impact on LBB behavior. ► The correlation impact increases with the correlation coefficients. ► The correlation impact increases with the number of cracks. ► The independence assumption may lead to nonconservative results. - Abstract: This paper presents a probabilistic methodology that considers the correlations between the input variables in the analysis of leak-before-break (LBB) behavior of a pressure tube. A computer program based on Monte Carlo (MC) simulation with the Nataf transformation has been developed to allow the proposed methodology to calculate both the time from first leakage to unstable fracture and the time from leak detection to unstable fracture. The results show that the correlation of the crack growth rates between different cracks has the most significant impact on the LBB behavior of the pressure tube. The impact of parameter correlations on LBB behavior increases with the number of cracks. If the correlations between different parameters for an individual crack are not considered, the predicted results are nonconservative when the cumulative probability is below 50% and conservative when it is above 50%.
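The heart of a correlated Monte Carlo scheme of this kind is drawing correlated random inputs; for two standard normal variables this reduces to a Cholesky construction (the correlation value and sample size below are illustrative):

```python
import math
import random

def correlated_normals(n, rho, seed=42):
    """Draw n pairs of standard normals with correlation rho, using the
    2x2 Cholesky factor of the correlation matrix [[1, rho], [rho, 1]]."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = z1
        x2 = rho * z1 + math.sqrt(1.0 - rho * rho) * z2
        pairs.append((x1, x2))
    return pairs

pairs = correlated_normals(20000, rho=0.8)
# the sample correlation of the pairs is close to the target 0.8
```

The Nataf transformation used in the paper generalizes this idea: it maps correlated standard normals through the inverse marginal CDFs to obtain correlated non-normal inputs.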

  10. A bivariate measurement error model for semicontinuous and continuous variables: Application to nutritional epidemiology.

    Science.gov (United States)

    Kipnis, Victor; Freedman, Laurence S; Carroll, Raymond J; Midthune, Douglas

    2016-03-01

    Semicontinuous data in the form of a mixture of a large portion of zero values and continuously distributed positive values frequently arise in many areas of biostatistics. This article is motivated by the analysis of relationships between disease outcomes and intakes of episodically consumed dietary components. An important aspect of studies in nutritional epidemiology is that true diet is unobservable and commonly evaluated by food frequency questionnaires with substantial measurement error. Following the regression calibration approach for measurement error correction, unknown individual intakes in the risk model are replaced by their conditional expectations given mismeasured intakes and other model covariates. Those regression calibration predictors are estimated using short-term unbiased reference measurements in a calibration substudy. Since dietary intakes are often "energy-adjusted," e.g., by using ratios of the intake of interest to total energy intake, the correct estimation of the regression calibration predictor for each energy-adjusted episodically consumed dietary component requires modeling short-term reference measurements of the component (a semicontinuous variable), and energy (a continuous variable) simultaneously in a bivariate model. In this article, we develop such a bivariate model, together with its application to regression calibration. We illustrate the new methodology using data from the NIH-AARP Diet and Health Study (Schatzkin et al., 2001, American Journal of Epidemiology 154, 1119-1125), and also evaluate its performance in a simulation study. © 2015, The International Biometric Society.

  11. A method of moments to estimate bivariate survival functions: the copula approach

    Directory of Open Access Journals (Sweden)

    Silvia Angela Osmetti

    2013-05-01

    In this paper we discuss the problem of parametric and nonparametric estimation of the distributions generated by the Marshall-Olkin copula. This copula comes from the Marshall-Olkin bivariate exponential distribution used in reliability analysis. We generalize this model through the copula and different marginal distributions to construct several bivariate survival functions. The cumulative distribution functions are not absolutely continuous, and their unknown parameters often cannot be obtained in explicit form. In order to estimate the parameters we propose an easy procedure based on the moments. This method consists of two steps: in the first step we estimate only the parameters of the marginal distributions, and in the second step we estimate only the copula parameter. This procedure can be used to estimate the parameters of complex survival functions for which it is difficult to find an explicit expression of the mixed moments. Moreover, it is preferred to the maximum likelihood approach for its simpler mathematical form, in particular for distributions whose maximum likelihood parameter estimators cannot be obtained in explicit form.
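A minimal sketch of the underlying Marshall-Olkin bivariate exponential model, with the moment identities E[X] = 1/(λ1 + λ12) and E[Y] = 1/(λ2 + λ12) that a first method-of-moments step would match (rate values are illustrative):

```python
import math
import random

def mo_survival(x, y, lam1, lam2, lam12):
    """Marshall-Olkin bivariate exponential survival function:
    S(x, y) = exp(-lam1*x - lam2*y - lam12*max(x, y))."""
    return math.exp(-lam1 * x - lam2 * y - lam12 * max(x, y))

def mo_sample(lam1, lam2, lam12, rng):
    """Sample (X, Y) via the shock construction: independent exponential
    shocks Z1, Z2, Z12 give X = min(Z1, Z12) and Y = min(Z2, Z12)."""
    z1 = rng.expovariate(lam1)
    z2 = rng.expovariate(lam2)
    z12 = rng.expovariate(lam12)
    return min(z1, z12), min(z2, z12)

rng = random.Random(1)
xs, ys = zip(*(mo_sample(1.0, 2.0, 0.5, rng) for _ in range(50000)))
# moment check: E[X] = 1/(1.0 + 0.5) = 2/3 and E[Y] = 1/(2.0 + 0.5) = 0.4
```

Matching sample means of X and Y to these expressions recovers the marginal rates; the paper's second step then estimates the copula parameter separately.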

  12. Comparative analysis of heat transfer correlations for forced convection boiling

    International Nuclear Information System (INIS)

    Guglielmini, G.; Nannei, E.; Pisoni, C.

    1978-01-01

    A critical survey was conducted of the most relevant correlations for boiling heat transfer in forced convection flow. Most of the investigations carried out on partial nucleate boiling and fully developed nucleate boiling have led to the formulation of correlations that are not able to cover a wide range of operating conditions, due to the empirical approach taken to the problem. A comparative analysis is therefore required in order to delineate the relative accuracy of the proposed correlations on the basis of the experimental data presently available. The survey performed allows the evaluation of the accuracy of the different calculation procedures; the results obtained, moreover, indicate the most reliable heat transfer correlations for the different operating conditions investigated. The survey covers five pressure ranges (up to 180 bar) and both saturated and subcooled boiling conditions.

  13. The Neural Correlates of Moral Thinking: A Meta-Analysis

    OpenAIRE

    Douglas J. Bryant; Wang F; Kelley Deardeuff; Emily Zoccoli; Chang S. Nam

    2016-01-01

    We conducted a meta-analysis to evaluate current research that aims to map the neural correlates of two typical conditions of moral judgment: right-wrong moral judgments and decision-making in moral dilemmas. Utilizing the activation likelihood estimation (ALE) method, we conducted a meta-analysis using neuroimaging data obtained from twenty-one previous studies that measured responses in one or the other of these conditions. We found that across the studies (n = 400), distinct neural circuit...

  14. GIS and correlation analysis of geo-environmental variables ...

    African Journals Online (AJOL)

    GIS and correlation analysis of geo-environmental variables influencing malaria prevalence in the Saboba district of Northern Ghana. ... The study also applied spline interpolation technique to map malaria prevalence in the district using standardised malaria incidence. The result indicates that distance to marshy areas is ...

  15. Variability, correlation and path coefficient analysis of seedling traits ...

    African Journals Online (AJOL)

    Indirect selection is a useful means for improving yield in cotton crop. The objective of the present study was to determine the genetic variability, broad sense heritability, genetic advance and correlation among the six seedling traits and their direct and indirect effects on cotton yield by using path coefficient analysis.

  16. Use of fuel failure correlations in accident analysis

    International Nuclear Information System (INIS)

    O'Dell, L.D.; Baars, R.E.; Waltar, A.E.

    1975-05-01

    The MELT-III code for analysis of a Transient Overpower (TOP) accident in an LMFBR is briefly described, including failure criteria currently applied in the code. Preliminary results of calculations exploring failure patterns in time and space in the reactor core are reported and compared for the two empirical fuel failure correlations employed in the code. (U.S.)

  17. Model-independent analysis with BPM correlation matrices

    International Nuclear Information System (INIS)

    Irwin, J.; Wang, C.X.; Yan, Y.T.; Bane, K.; Cai, Y.; Decker, F.; Minty, M.; Stupakov, G.; Zimmermann, F.

    1998-06-01

    The authors discuss techniques for Model-Independent Analysis (MIA) of a beamline using correlation matrices of physical variables and Singular Value Decomposition (SVD) of a beamline BPM matrix. The beamline matrix is formed from BPM readings for a large number of pulses. The method has been applied to the Linear Accelerator of the SLAC Linear Collider (SLC)

  18. Registration of prone and supine CT colonography scans using correlation optimized warping and canonical correlation analysis

    International Nuclear Information System (INIS)

    Wang Shijun; Yao Jianhua; Liu Jiamin; Petrick, Nicholas; Van Uitert, Robert L.; Periaswamy, Senthil; Summers, Ronald M.

    2009-01-01

    Purpose: In computed tomographic colonography (CTC), a patient is scanned twice, once supine and once prone, to improve the sensitivity for polyp detection. To assist radiologists in CTC reading, in this paper we propose an automated method for colon registration from supine and prone CTC scans. Methods: We propose a new colon centerline registration method for prone and supine CTC scans using correlation optimized warping (COW) and canonical correlation analysis (CCA) based on the anatomical structure of the colon. Four anatomically salient points on the colon are first automatically distinguished. Then correlation optimized warping is applied to the segments defined by the anatomical landmarks to improve the global registration based on the local correlation of segments. The COW method was modified by embedding canonical correlation analysis to allow multiple features along the colon centerline to be used in our implementation. Results: We tested the COW algorithm on a CTC data set of 39 patients with 39 polyps (19 training and 20 test cases) to verify the effectiveness of the proposed COW registration method. Experimental results on the test set show that the COW method significantly reduces the average estimation error in polyp location between supine and prone scans by 67.6%, from 46.27 ± 52.97 mm to 14.98 ± 11.41 mm, compared to the normalized distance along the colon centerline algorithm (p<0.01). Conclusions: The proposed COW algorithm is more accurate for colon centerline registration than the normalized distance along the colon centerline method and the dynamic time warping method. Comparison results showed that the feature combination of z-coordinate and curvature achieved the lowest registration error compared to the other feature combinations used by COW. The proposed method is tolerant of centerline errors because anatomical landmarks help prevent the propagation of errors across the entire colon centerline.
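The dynamic time warping baseline that COW is compared against can be sketched as follows (generic textbook DTW on 1-D sequences, not the authors' implementation):

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences,
    with absolute difference as the local cost and the standard
    (match / insert / delete) recurrence."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# a locally stretched copy of a profile aligns with zero cost under DTW
a = [0.0, 1.0, 2.0, 1.0, 0.0]
b = [0.0, 0.0, 1.0, 2.0, 2.0, 1.0, 0.0]
print(dtw_distance(a, b))  # -> 0.0
```

COW differs from DTW in warping whole landmark-bounded segments by optimizing local correlation, rather than warping sample-by-sample on a pointwise cost.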

  19. Bivariate Genomic Footprinting Detects Changes in Transcription Factor Activity

    Directory of Open Access Journals (Sweden)

    Songjoon Baek

    2017-05-01

    In response to activating signals, transcription factors (TFs) bind DNA and regulate gene expression. TF binding can be measured by protection of the bound sequence from DNase digestion (i.e., a footprint). Here, we report that 80% of TF binding motifs do not show a measurable footprint, partly because of a variable cleavage pattern within the motif sequence. To more faithfully portray the effect of TFs on chromatin, we developed an algorithm that captures two TF-dependent effects on chromatin accessibility: footprinting and motif-flanking accessibility. The algorithm, termed bivariate genomic footprinting (BaGFoot), efficiently detects TF activity. BaGFoot is robust to different accessibility assays (DNase-seq, ATAC-seq), all examined peak-calling programs, and a variety of cut bias correction approaches. BaGFoot reliably predicts TF binding and provides valuable information regarding the TFs affecting chromatin accessibility in various biological systems and following various biological events, including in cases where an absolute footprint cannot be determined.

  20. Epileptic seizure prediction based on a bivariate spectral power methodology.

    Science.gov (United States)

    Bandarabadi, Mojtaba; Teixeira, Cesar A; Direito, Bruno; Dourado, Antonio

    2012-01-01

    The spectral power of five frequently considered frequency bands (Alpha, Beta, Gamma, Theta and Delta) for 6 EEG channels is computed, and all possible pairwise combinations among the resulting set of 30 features are then used to create a 435-dimensional feature space. Two new feature selection methods are introduced to choose the best candidate features and to reduce the dimensionality of this feature space. The selected features are then fed to Support Vector Machines (SVMs) that classify the cerebral state into preictal and non-preictal classes. The outputs of the SVM are regularized using a method that accounts for the classification dynamics of the preictal class, also known as the "Firing Power" method. The results obtained using our feature selection approaches are compared with the ones obtained using the minimum Redundancy Maximum Relevance (mRMR) feature selection method. The results in a group of 12 patients of the EPILEPSIAE database, containing 46 seizures and 787 hours of multichannel recording for out-of-sample data, indicate the efficiency of the bivariate approach as well as of the two new feature selection methods. The best results presented a sensitivity of 76.09% (35 of 46 seizures predicted) and a false prediction rate of 0.15 h⁻¹.
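The band-power features that feed the SVM can be sketched with a naive DFT; the sampling rate, window length, and band edges below are illustrative (a real pipeline would use an FFT over sliding STFT windows):

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Spectral power of `signal` in the band [f_lo, f_hi] Hz via a naive DFT
    (O(n^2); fine for short windows, for illustration only)."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power

fs, n = 64, 64
alpha_wave = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]  # 10 Hz tone
alpha = band_power(alpha_wave, fs, 8, 13)    # Alpha band: dominant
delta = band_power(alpha_wave, fs, 0.5, 4)   # Delta band: near zero
```

The 435-dimensional space of the paper is then built from all pairwise combinations of such per-band, per-channel power features (C(30, 2) = 435).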

  1. A bivariate optimal replacement policy for a multistate repairable system

    International Nuclear Information System (INIS)

    Zhang Yuanlin; Yam, Richard C.M.; Zuo, Ming J.

    2007-01-01

    In this paper, a deteriorating simple repairable system with k+1 states, including k failure states and one working state, is studied. It is assumed that the system after repair is not 'as good as new' and the deterioration of the system is stochastic. We consider a bivariate replacement policy, denoted by (T,N), in which the system is replaced when its working age has reached T or the number of failures it has experienced has reached N, whichever occurs first. The objective is to determine the optimal replacement policy (T,N)* such that the long-run expected profit per unit time is maximized. The explicit expression of the long-run expected profit per unit time is derived and the corresponding optimal replacement policy can be determined analytically or numerically. We prove that the optimal policy (T,N)* is better than the optimal policy N* for a multistate simple repairable system. We also show that a general monotone process model for a multistate simple repairable system is equivalent to a geometric process model for a two-state simple repairable system in the sense that they have the same structure for the long-run expected profit (or cost) per unit time and the same optimal policy. Finally, a numerical example is given to illustrate the theoretical results

  2. Sparse canonical correlation analysis: new formulation and algorithm.

    Science.gov (United States)

    Chu, Delin; Liao, Li-Zhi; Ng, Michael K; Zhang, Xiaowei

    2013-12-01

    In this paper, we study canonical correlation analysis (CCA), which is a powerful tool in multivariate data analysis for finding the correlation between two sets of multidimensional variables. The main contributions of the paper are: 1) to reveal the equivalent relationship between a recursive formula and a trace formula for the multiple CCA problem, 2) to obtain the explicit characterization for all solutions of the multiple CCA problem even when the corresponding covariance matrices are singular, 3) to develop a new sparse CCA algorithm, and 4) to establish the equivalent relationship between the uncorrelated linear discriminant analysis and the CCA problem. We test several simulated and real-world datasets in gene classification and cross-language document retrieval to demonstrate the effectiveness of the proposed algorithm. The performance of the proposed method is competitive with the state-of-the-art sparse CCA algorithms.
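
    A minimal dense-CCA sketch (whiten each block, then take the SVD of the cross-covariance) helps fix ideas before the sparse variants; the ridge term `reg` below is an ad hoc guard for the singular-covariance case that the paper characterises exactly.

```python
import numpy as np

def cca(X, Y, reg=1e-8):
    """Canonical correlations between two column-centred data blocks.

    Whitens each block with its (regularised) covariance, then the
    singular values of the whitened cross-covariance are the canonical
    correlations."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    s = np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy), compute_uv=False)
    return np.clip(s, 0.0, 1.0)

rng = np.random.default_rng(1)
z = rng.normal(size=(500, 1))            # shared latent signal
X = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
Y = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
print(cca(X, Y))   # leading canonical correlation close to 1
```

    The sparse formulations studied in the paper additionally constrain the canonical weight vectors, which this dense sketch leaves unpenalised.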

  3. Information-Pooling Bias in Collaborative Security Incident Correlation Analysis.

    Science.gov (United States)

    Rajivan, Prashanth; Cooke, Nancy J

    2018-03-01

    Incident correlation is a vital step in the cybersecurity threat detection process. This article presents research on the effect of group-level information-pooling bias on collaborative incident correlation analysis in a synthetic task environment. Past research has shown that uneven information distribution biases people to share information that is known to most team members and prevents them from sharing any unique information available to them. The effect of such biases on security team collaborations is largely unknown. Thirty 3-person teams performed two threat detection missions involving information sharing and correlating security incidents. Incidents were predistributed to each person in the team based on the hidden profile paradigm. Participant teams, randomly assigned to three experimental groups, used different collaboration aids during Mission 2. Communication analysis revealed that participant teams were 3 times more likely to discuss security incidents commonly known to the majority. Unaided team collaboration was inefficient in finding associations between security incidents uniquely available to each member of the team. Visualizations that augment perceptual processing and recognition memory were found to mitigate the bias. The data suggest that (a) security analyst teams, when conducting collaborative correlation analysis, can be inefficient in pooling unique information from their peers; (b) employing off-the-shelf collaboration tools in cybersecurity defense environments is inadequate; and (c) collaborative security visualization tools developed with the cognitive limitations of security analysts in mind are necessary. Potential applications of this research include the development of team training procedures and collaboration tools for security analysts.

  4. Analysis of the Correlation between GDP and the Final Consumption

    Directory of Open Access Journals (Sweden)

    Constantin ANGHELACHE

    2011-09-01

    Full Text Available This paper presents the results of the research performed by the author regarding the evolution of Gross Domestic Product. One of the main aspects of GDP analysis is its correlation with final consumption, an important macroeconomic indicator. The evolution of the Gross Domestic Product is highly influenced by the evolution of final consumption. To analyze the correlation, the paper proposes the use of the linear regression model, as one of the most appropriate instruments for such a scientific approach. The regression model described in the article uses GDP as the resultant variable and final consumption as the factorial variable.
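
    The regression described above can be sketched in a few lines; the consumption and GDP figures below are invented for illustration, not actual national accounts data.

```python
import numpy as np

# Illustrative figures only (not real national accounts): final
# consumption (factorial variable) and GDP (resultant variable).
consumption = np.array([61.0, 64.5, 68.2, 71.0, 74.9, 79.3])
gdp = np.array([80.1, 84.0, 89.5, 93.2, 98.8, 104.6])

# Least-squares fit of the linear model gdp = a + b * consumption
b, a = np.polyfit(consumption, gdp, 1)
r = np.corrcoef(consumption, gdp)[0, 1]

print(f"gdp ~ {a:.2f} + {b:.2f} * consumption, r = {r:.3f}")
```

    The Pearson correlation coefficient r quantifies how tightly GDP tracks final consumption, while the slope b is the estimated marginal effect of one unit of consumption on GDP.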

  5. Linear and Nonlinear Multiset Canonical Correlation Analysis (invited talk)

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Larsen, Rasmus

    2002-01-01

    This paper deals with the decomposition of multiset data. Friedman's alternating conditional expectations (ACE) algorithm is extended to handle multiple sets of variables of different mixtures. The new algorithm finds estimates of the optimal transformations of the involved variables that maximize...... the sum of the pair-wise correlations over all sets. The new algorithm is termed multi-set ACE (MACE) and can find multiple orthogonal eigensolutions. MACE is a generalization of linear multiset canonical correlation analysis (MCCA). It handles multivariate multisets of arbitrary mixtures of both continuous...

  6. A multimodal stress monitoring system with canonical correlation analysis.

    Science.gov (United States)

    Unsoo Ha; Changhyeon Kim; Yongsu Lee; Hyunki Kim; Taehwan Roh; Hoi-Jun Yoo

    2015-08-01

    The multimodal stress monitoring headband is proposed for a mobile stress management system. It is composed of a headband and earplugs. Electroencephalography (EEG), hemoencephalography (HEG) and heart-rate variability (HRV) can be acquired simultaneously in the proposed system for user status estimation. With the canonical correlation analysis (CCA) and temporal-kernel CCA (tkCCA) algorithms, these different signals can be combined for maximum correlation. Thanks to the proposed combination algorithm, the accuracy of the proposed system increased by up to 19 percentage points over a unimodal monitoring system in an n-back task.

  7. Cross-correlation analysis of Ge(Li) spectra

    International Nuclear Information System (INIS)

    MacDonald, R.; Robertson, A.; Kennett, T.J.; Prestwich, W.V.

    1974-01-01

    A sensitive technique is proposed for activation analysis using cross-correlation and improved spectral orthogonality achieved through use of a rectangular zero-area digital filter. To test the accuracy and reliability of the cross-correlation procedure, five spectra obtained with a Ge(Li) detector were combined in different proportions. Gaussian distributed statistics were then added to the composite spectra by means of a pseudo-random number generator. The basis spectra used were 76As, 82Br, 72Ga, 77Ge, and room background. In general, when the basis spectra were combined in roughly comparable proportions the accuracy of the technique proved to be excellent (>1%). However, of primary importance was the ability of the correlation technique to identify low intensity components in the presence of high intensity components. It was found that the detection threshold for Ge, for example, was not reached until the Ge content in the unfiltered spectrum was <0.16%. (T.G.)
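
    The rectangular zero-area filter and the cross-correlation step can be sketched on a synthetic spectrum; the kernel widths and the Gaussian "photopeak" below are illustrative, not the paper's settings.

```python
import numpy as np

def zero_area_filter(width, wing):
    """Rectangular zero-area kernel: a +1 centre of `width` channels
    flanked by negative wings sized so the kernel sums to zero, which
    suppresses smooth baselines while passing peaks of similar width."""
    side = -np.ones(wing) * width / (2 * wing)
    return np.concatenate([side, np.ones(width), side])

def correlate_spectra(unknown, basis, kernel):
    """Normalised cross-correlation of the two filtered spectra."""
    fu = np.convolve(unknown, kernel, mode="valid")
    fb = np.convolve(basis, kernel, mode="valid")
    return fu @ fb / np.sqrt((fu @ fu) * (fb @ fb))

chan = np.arange(256.0)
peak = np.exp(-0.5 * ((chan - 100) / 3.0) ** 2)     # photopeak at ch. 100
other = np.exp(-0.5 * ((chan - 180) / 3.0) ** 2)    # unrelated peak
baseline = 0.02 * chan                              # smooth linear ramp
kernel = zero_area_filter(width=7, wing=4)

print(correlate_spectra(peak + baseline, peak, kernel))    # near 1
print(correlate_spectra(peak + baseline, other, kernel))   # near 0
```

    Because the kernel has zero net area, a linear baseline is filtered to essentially nothing, so the correlation responds to the peak structure rather than to the slowly varying continuum.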

  8. Bivariate Gaussian bridges: directional factorization of diffusion in Brownian bridge models.

    Science.gov (United States)

    Kranstauber, Bart; Safi, Kamran; Bartumeus, Frederic

    2014-01-01

    In recent years high resolution animal tracking data has become the standard in movement ecology. The Brownian Bridge Movement Model (BBMM) is a widely adopted approach to describe animal space use from such high resolution tracks. One of the underlying assumptions of the BBMM is isotropic diffusive motion between consecutive locations, i.e. invariant with respect to the direction. Here we propose to relax this often unrealistic assumption by separating the Brownian motion variance into two directional components, one parallel and one orthogonal to the direction of the motion. Our new model, the Bivariate Gaussian bridge (BGB), tracks movement heterogeneity across time. Using the BGB and identifying directed and non-directed movement within a trajectory resulted in more accurate utilisation distributions compared to dynamic Brownian bridges, especially for trajectories with a non-isotropic diffusion, such as directed movement or Lévy like movements. We evaluated our model with simulated trajectories and observed tracks, demonstrating that the improvement of our model scales with the directional correlation of a correlated random walk. We find that many of the animal trajectories do not adhere to the assumptions of the BBMM. The proposed model improves accuracy when describing the space use both in simulated correlated random walks as well as observed animal tracks. Our novel approach is implemented and available within the "move" package for R.
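
    The directional factorization at the heart of the BGB can be illustrated by projecting bridge deviations onto the local direction of motion. This is a simplified sketch of the idea, not the implementation in the "move" package.

```python
import numpy as np

def directional_variances(track):
    """Split bridge deviations into components parallel and orthogonal
    to the direction of motion, the core idea behind the bivariate
    Gaussian bridge. For each triple of fixes the middle fix is
    compared with the midpoint of its neighbours, and the deviation is
    projected onto the step direction. `track` is an (n, 2) array."""
    track = np.asarray(track, float)
    start, mid, end = track[:-2], track[1:-1], track[2:]
    step = end - start
    u = step / np.linalg.norm(step, axis=1, keepdims=True)  # unit direction
    dev = mid - (start + end) / 2          # deviation from the bridge mean
    par = np.einsum("ij,ij->i", dev, u)                # parallel component
    orth = dev[:, 1] * u[:, 0] - dev[:, 0] * u[:, 1]   # orthogonal (signed)
    return par.var(), orth.var()

# Directed movement along x with large speed jitter, little side-stepping
rng = np.random.default_rng(5)
n = 400
x = np.cumsum(1.0 + 0.4 * rng.normal(size=n))
y = 0.05 * rng.normal(size=n)
par_var, orth_var = directional_variances(np.column_stack([x, y]))
print(par_var, orth_var)   # parallel variance clearly dominates
```

    For directed movement like this synthetic track, the parallel variance dominates, which is exactly the anisotropy the isotropic BBMM cannot represent.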

  9. Asymptotics of bivariate generating functions with algebraic singularities

    Science.gov (United States)

    Greenwood, Torin

    Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After overviewing these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients described the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle in a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple, we compute the explicit locations of peak amplitudes. In a scaling window of size the square root of n near the peaks, each amplitude is asymptotic to an Airy function.

  10. Windowed Multitaper Correlation Analysis of Multimodal Brain Monitoring Parameters

    Directory of Open Access Journals (Sweden)

    Rupert Faltermeier

    2015-01-01

    Full Text Available Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates the cerebral perfusion and oxygen supply in case of a severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally it could be shown that the occurrence of a distinct correlation parameter, called scp, is a high-quality predictor of patient outcome.

  11. Windowed multitaper correlation analysis of multimodal brain monitoring parameters.

    Science.gov (United States)

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Although multimodal monitoring sets the standard in daily practice of neurocritical care, problem-oriented analysis tools to interpret the huge amount of data are lacking. Recently a mathematical model was presented that simulates the cerebral perfusion and oxygen supply in case of a severe head trauma, predicting the appearance of distinct correlations between arterial blood pressure and intracranial pressure. In this study we present a set of mathematical tools that reliably detect the predicted correlations in data recorded at a neurocritical care unit. The time-resolved correlations are identified by a windowing technique combined with Fourier-based coherence calculations. The phasing of the data is detected by means of the Hilbert phase difference within the above-mentioned windows. A statistical testing method is introduced that allows tuning the parameters of the windowing method in such a way that a predefined accuracy is reached. With this method the data of fifteen patients were examined, and the predicted correlation was found in each patient. Additionally it could be shown that the occurrence of a distinct correlation parameter, called scp, is a high-quality predictor of patient outcome.
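
    The windowed coherence and Hilbert-phase machinery described in the two records above can be sketched with SciPy on synthetic ABP/ICP-like signals. The sampling rate, window length, and the injected 0.02 Hz slow oscillation are illustrative assumptions, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import coherence, hilbert

fs = 10.0                        # assumed 10 Hz sampling of ABP and ICP
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(2)

# Synthetic slow oscillation shared by ABP and ICP, with ICP lagging
# ABP by a quarter cycle, plus independent measurement noise.
f0 = 0.02                        # 0.02 Hz slow wave
abp = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)
icp = np.sin(2 * np.pi * f0 * t - np.pi / 2) + 0.3 * rng.normal(size=t.size)

# Windowed (Welch-averaged) coherence: a high value near f0 flags the
# predicted ABP-ICP correlation.
f, Cxy = coherence(abp, icp, fs=fs, nperseg=1024)
i = np.argmax(Cxy[1:]) + 1       # skip the DC bin
peak = f[i]

# Hilbert phase difference gives the phasing of the correlated pair;
# the circular mean summarises it robustly.
dphi = np.angle(hilbert(abp)) - np.angle(hilbert(icp))
mean_phase = np.angle(np.mean(np.exp(1j * dphi)))

print(f"coherence peak at {peak:.4f} Hz, mean phase {mean_phase:.2f} rad")
```

    With these settings the coherence peaks at the injected 0.02 Hz component and the circular mean phase recovers the quarter-cycle (about pi/2) lag of ICP behind ABP.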

  12. Message Correlation Analysis Tool for NOvA

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the DAQ of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  13. Message Correlation Analysis Tool for NOvA

    International Nuclear Information System (INIS)

    Lu Qiming; Biery, Kurt A; Kowalkowski, James B

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  14. Message correlation analysis tool for NOvA

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Qiming [Fermilab; Biery, Kurt A. [Fermilab; Kowalkowski, James B. [Fermilab

    2012-01-01

    A complex running system, such as the NOvA online data acquisition, consists of a large number of distributed but closely interacting components. This paper describes a generic real-time correlation analysis and event identification engine, named Message Analyzer. Its purpose is to capture run time abnormalities and recognize system failures based on log messages from participating components. The initial design of the analysis engine is driven by the data acquisition (DAQ) of the NOvA experiment. The Message Analyzer performs filtering and pattern recognition on the log messages and reacts to system failures identified by associated triggering rules. The tool helps the system maintain a healthy running state and minimize data corruption. This paper also describes a domain specific language that allows the recognition patterns and correlation rules to be specified in a clear and flexible way. In addition, the engine provides a plugin mechanism for users to implement specialized patterns or rules in generic languages such as C++.

  15. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis, examined based on correlation techniques. The usage of mobile phones is clearly almost unavoidable these days, and as such the authors have made a systematic survey through a well-prepared questionnaire on making use of mobile phones to the maximum extent. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  16. Signal correlations in biomass combustion. An information theoretic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruusunen, M.

    2013-09-01

    Increasing environmental and economic awareness are driving the development of combustion technologies to efficient biomass use and clean burning. To accomplish these goals, quantitative information about combustion variables is needed. However, for small-scale combustion units the existing monitoring methods are often expensive or complex. This study aimed to quantify correlations between flue gas temperatures and combustion variables, namely typical emission components, heat output, and efficiency. For this, data acquired from four small-scale combustion units and a large circulating fluidised bed boiler was studied. The fuel range varied from wood logs, wood chips, and wood pellets to biomass residue. Original signals and a defined set of their mathematical transformations were applied to data analysis. In order to evaluate the strength of the correlations, a multivariate distance measure based on information theory was derived. The analysis further assessed time-varying signal correlations and relative time delays. Ranking of the analysis results was based on the distance measure. The uniformity of the correlations in the different data sets was studied by comparing the 10-quantiles of the measured signal. The method was validated with two benchmark data sets. The flue gas temperatures and the combustion variables measured carried similar information. The strongest correlations were mainly linear with the transformed signal combinations and explicable by the combustion theory. Remarkably, the results showed uniformity of the correlations across the data sets with several signal transformations. This was also indicated by simulations using a linear model with constant structure to monitor carbon dioxide in flue gas. Acceptable performance was observed according to three validation criteria used to quantify modelling error in each data set. In general, the findings demonstrate that the presented signal transformations enable real-time approximation of the studied

  17. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi aggregate samples. The effect of the orientation of the dimer with respect to the polarization state of the laser light and the effect of the particle gap size on the Raman signal intensity is observed. Additionally, calculations are performed to simulate the electric near field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  18. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are implemented in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
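
    Of the three BSA techniques, the frequency ratio model is the simplest to sketch; the toy slope raster and hazard locations below are invented for illustration and are not related to the paper's Malaysian test area.

```python
import numpy as np

def frequency_ratio(factor_class, hazard):
    """Frequency ratio per class of one conditioning factor.

    FR(c) = (hazard pixels in class c / all hazard pixels)
          / (pixels in class c / all pixels).
    FR > 1 marks classes over-represented among hazard locations."""
    factor_class = np.asarray(factor_class)
    hazard = np.asarray(hazard, bool)
    out = {}
    for c in np.unique(factor_class):
        in_c = factor_class == c
        pct_hazard = hazard[in_c].sum() / hazard.sum()
        pct_area = in_c.sum() / in_c.size
        out[c] = pct_hazard / pct_area
    return out

# Toy flattened raster: slope classes 0-2, hazard concentrated in class 2
slope = np.array([0] * 50 + [1] * 30 + [2] * 20)
hazard = np.zeros(100, bool)
hazard[[55, 60, 85, 90, 95, 99]] = True   # 2 events in class 1, 4 in class 2

print(frequency_ratio(slope, hazard))
```

    Summing the FR values of each pixel's classes across all conditioning factors yields the susceptibility index that tools like the one described automate inside the GIS.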

  19. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  20. Correlation analysis between ceramic insulator pollution and acoustic emissions

    Directory of Open Access Journals (Sweden)

    Benjamín Álvarez-Nasrallah

    2015-01-01

    Full Text Available Most of the studies related to insulator pollution are normally performed based on individual analysis among leakage current, relative humidity and equivalent salt deposit density (ESDD). This paper presents a correlation analysis between the leakage current and the acoustic emissions measured in a 230 kV electrical substation in the city of Barranquilla, Colombia. Furthermore, atmospheric variables were considered to develop a characterization model of the insulator contamination process. This model was used to demonstrate that noise emission levels are a reliable indicator to detect and characterize pollution on high voltage insulators. The correlation found among the atmospheric, electrical and sound variables allowed us to determine the relations for the maintenance of ceramic insulators in highly polluted areas. In this article, results on the behavior of the leakage current in ceramic insulators and the sound produced under different atmospheric conditions are shown, which allow evaluating the best time to clean the insulators at the substation. Furthermore, by experimentation on site and using statistical models, the correlation between ambient variables and the leakage current of insulators in an electrical substation was obtained. Some of the problems caused by external noise were overcome using multiple microphones and specialized software that enabled properly filtering the sound and better measuring the variables.

  1. Bivariate return periods of temperature and precipitation explain a large fraction of European crop yields

    Directory of Open Access Journals (Sweden)

    J. Zscheischler

    2017-07-01

    Full Text Available Crops are vital for human society. Crop yields vary with climate and it is important to understand how climate and crop yields are linked to ensure future food security. Temperature and precipitation are among the key driving factors of crop yield variability. Previous studies have investigated mostly linear relationships between temperature and precipitation and crop yield variability. Other research has highlighted the adverse impacts of climate extremes, such as drought and heat waves, on crop yields. Impacts are, however, often non-linearly related to multivariate climate conditions. Here we derive bivariate return periods of climate conditions as indicators for climate variability along different temperature–precipitation gradients. We show that in Europe, linear models based on bivariate return periods of specific climate conditions explain on average significantly more crop yield variability (42 %) than models relying directly on temperature and precipitation as predictors (36 %). Our results demonstrate that most often crop yields increase along a gradient from hot and dry to cold and wet conditions, with lower yields associated with hot and dry periods. The majority of crops are most sensitive to climate conditions in summer and to maximum temperatures. The use of bivariate return periods allows the integration of non-linear impacts into climate–crop yield analysis. This offers new avenues to study the link between climate and crop yield variability and suggests that they are possibly more strongly related than what is inferred from conventional linear models.
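
    An "AND" bivariate return period (hot and dry) can be sketched as the reciprocal of the joint exceedance frequency. The synthetic temperature-precipitation coupling and thresholds below are purely illustrative; the paper derives return periods along temperature-precipitation gradients rather than from a single empirical threshold pair.

```python
import numpy as np

def and_return_period(temp, precip, t_thresh, p_thresh):
    """Empirical 'hot and dry' return period: average number of years
    between years with temperature above t_thresh AND precipitation
    below p_thresh (one seasonal value per year assumed)."""
    hit = (np.asarray(temp) > t_thresh) & (np.asarray(precip) < p_thresh)
    p = hit.mean()
    return np.inf if p == 0 else 1.0 / p

rng = np.random.default_rng(3)
years = 1000
temp = rng.normal(20, 2, years)
# negative temperature-precipitation coupling, as in many summer climates
precip = 300 - 15 * (temp - 20) + rng.normal(0, 30, years)

rp = and_return_period(temp, precip, t_thresh=22, p_thresh=270)
print(f"hot-and-dry return period ~ {rp:.1f} years")
```

    Because both conditions must hold at once, the joint return period is always at least as long as the marginal one, and the negative coupling makes the compound event far more frequent than independence would suggest.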

  2. Group sparse canonical correlation analysis for genomic data integration.

    Science.gov (United States)

    Lin, Dongdong; Zhang, Jigang; Li, Jingyao; Calhoun, Vince D; Deng, Hong-Wen; Wang, Yu-Ping

    2013-08-12

    The emergence of high-throughput genomic datasets from different sources and platforms (e.g., gene expression, single nucleotide polymorphisms (SNP), and copy number variation (CNV)) has greatly enhanced our understanding of the interplay of these genomic factors as well as their influence on complex diseases. It is challenging to explore the relationship between these different types of genomic data sets. In this paper, we focus on a multivariate statistical method, the canonical correlation analysis (CCA) method, for this problem. The conventional CCA method does not work effectively if the number of data samples is significantly less than that of biomarkers, which is a typical case for genomic data (e.g., SNPs). Sparse CCA (sCCA) methods were introduced to overcome this difficulty, mostly using penalizations with the l-1 norm (CCA-l1) or the combination of the l-1 and l-2 norms (CCA-elastic net). However, they overlook the structural or group effects within genomic data in the analysis, which often exist and are important (e.g., SNPs spanning a gene interact and work together as a group). We propose a new group sparse CCA method (CCA-sparse group) along with an effective numerical algorithm to study the mutual relationship between two different types of genomic data (i.e., SNP and gene expression). We then extend the model to a more general formulation that can include the existing sCCA models. We apply the model to feature/variable selection from two data sets and compare our group sparse CCA method with existing sCCA methods on both simulation and two real datasets (human gliomas data and NCI60 data). We use a graphical representation of the samples with a pair of canonical variates to demonstrate the discriminating characteristic of the selected features. Pathway analysis is further performed for biological interpretation of those features. The CCA-sparse group method incorporates group effects of features into the correlation analysis while performing individual feature

  3. Correlation analysis of the physiological factors controlling fundamental voice frequency.

    Science.gov (United States)

    Atkinson, J E

    1978-01-01

    A technique has been developed to obtain a quantitative measure of correlation between electromyographic (EMG) activity of various laryngeal muscles, subglottal air pressure, and the fundamental frequency of vibration of the vocal folds (Fo). Data were collected and analyzed on one subject, a native speaker of American English. The results show that an analysis of this type can provide a useful measure of correlation between the physiological and acoustical events in speech and, furthermore, can yield detailed insights into the organization and nature of the speech production process. In particular, based on these results, a model is suggested of Fo control involving laryngeal state functions that seems to agree with present knowledge of laryngeal control and experimental evidence.

  4. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques...... the methods in two: those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix, which also estimates the correlations between...... variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature

  5. Long-range correlation analysis of urban traffic data

    International Nuclear Information System (INIS)

    Peng, Sheng; Jun-Feng, Wang; Shu-Long, Zhao; Tie-Qiao, Tang

    2010-01-01

    This paper investigates urban traffic data by analysing the long-range correlation with detrended fluctuation analysis. Through a large number of real data collected by the travel time detection system in Beijing, the variation of flow in different time periods and intersections is studied. Based on the long-range correlation at different time scales, the paper mainly discusses the effects of intersection location in the road network, residents' activity patterns and special traffic controls on urban traffic flow. As demonstrated by the obtained results, urban traffic flow exhibits three-phase characteristics similar to highway traffic. Moreover, a comparison of the two groups of data obtained before and after enforcement of the special traffic restrictions (vehicles with certain numbered plates may only run on designated workdays) indicates that the rules not only reduce the flow but also suppress irregular fluctuation. (general)
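Detrended fluctuation analysis itself is compact enough to sketch. The following is an assumed textbook implementation, not the authors' code; for uncorrelated noise the scaling exponent comes out near 0.5, while persistent long-range correlation would push it above 0.5:

```python
import numpy as np

# Minimal DFA sketch: integrate the series, detrend per window, and fit the
# scaling of the RMS fluctuation F(n) against the window size n.
def dfa_alpha(x, scales=(16, 32, 64, 128)):
    y = np.cumsum(x - np.mean(x))          # integrated profile
    fluct = []
    for n in scales:
        m = len(y) // n
        segs = y[:m * n].reshape(m, n)
        t = np.arange(n)
        ms_res = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)   # linear detrending per window
            ms_res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(ms_res)))
    # slope of log F(n) vs log n is the scaling exponent alpha
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(1)
alpha = dfa_alpha(rng.standard_normal(4096))
print(round(alpha, 2))   # white noise: alpha near 0.5 (no long-range correlation)
```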

  6. Protein structure similarity from principle component correlation analysis

    Directory of Open Access Journals (Sweden)

    Chou James

    2006-01-01

    Full Text Available Abstract Background Owing to rapid expansion of protein structure databases in recent years, methods of structure comparison are becoming increasingly effective and important in revealing novel information on functional properties of proteins and their roles in the grand scheme of evolutionary biology. Currently, the structural similarity between two proteins is measured by the root-mean-square-deviation (RMSD in their best-superimposed atomic coordinates. RMSD is the golden rule of measuring structural similarity when the structures are nearly identical; it, however, fails to detect the higher order topological similarities in proteins evolved into different shapes. We propose new algorithms for extracting geometrical invariants of proteins that can be effectively used to identify homologous protein structures or topologies in order to quantify both close and remote structural similarities. Results We measure structural similarity between proteins by correlating the principle components of their secondary structure interaction matrix. In our approach, the Principle Component Correlation (PCC analysis, a symmetric interaction matrix for a protein structure is constructed with relationship parameters between secondary elements that can take the form of distance, orientation, or other relevant structural invariants. When using a distance-based construction in the presence or absence of encoded N to C terminal sense, there are strong correlations between the principle components of interaction matrices of structurally or topologically similar proteins. Conclusion The PCC method is extensively tested for protein structures that belong to the same topological class but are significantly different by RMSD measure. The PCC analysis can also differentiate proteins having similar shapes but different topological arrangements. Additionally, we demonstrate that when using two independently defined interaction matrices, comparison of their maximum
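The core PCC step, correlating principal components of symmetric interaction matrices, can be sketched as follows. The toy 8×8 matrices below stand in for real secondary-structure interaction matrices and are purely illustrative:

```python
import numpy as np

# Leading principal component of a symmetric interaction matrix, with a sign
# convention so that eigenvector signs are comparable across structures.
def leading_pc(mat):
    vals, vecs = np.linalg.eigh(mat)
    v = vecs[:, -1]                      # eigenvector of the largest eigenvalue
    return v if v[np.argmax(np.abs(v))] > 0 else -v

rng = np.random.default_rng(2)
base = rng.random((8, 8))
A = (base + base.T) / 2                  # symmetric "interaction matrix"
pert = rng.random((8, 8))
B = A + 0.01 * (pert + pert.T) / 2       # slightly perturbed "similar structure"

# Near-identical structures give strongly correlated leading PCs.
r = abs(np.corrcoef(leading_pc(A), leading_pc(B))[0, 1])
print(round(r, 2))
```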

  7. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between road network and healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the outside and inside areas of the Ming city wall. The results of network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and pharmacy stores tend to cluster around hospitals along the road network. After computing the correlation analysis between different categorized hospitals and street centrality, we find that the distribution of these hospitals correlates highly with the street centralities, and that the correlations are higher with private and small hospitals than with public and large hospitals. The comprehensive analysis results could help examine the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.

  8. Partial correlation analysis method in ultrarelativistic heavy-ion collisions

    Science.gov (United States)

    Olszewski, Adam; Broniowski, Wojciech

    2017-11-01

    We argue that statistical data analysis of two-particle longitudinal correlations in ultrarelativistic heavy-ion collisions may be efficiently carried out with the technique of partial covariance. In this method, the spurious event-by-event fluctuations due to imprecise centrality determination are eliminated via projecting out the component of the covariance influenced by the centrality fluctuations. We bring up the relationship of the partial covariance to the conditional covariance. Importantly, in the superposition approach, where hadrons are produced independently from a collection of sources, the framework allows us to impose centrality constraints on the number of sources rather than hadrons, that way unfolding of the trivial fluctuations from statistical hadronization and focusing better on the initial-state physics. We show, using simulated data from hydrodynamics followed with statistical hadronization, that the technique is practical and very simple to use, giving insight into the correlations generated in the initial stage. We also discuss the issues related to separation of the short- and long-range components of the correlation functions and show that in our example the short-range component from the resonance decays is largely reduced by considering pions of the same sign. We demonstrate the method explicitly on the cases where centrality is determined with a single central control bin or with two peripheral control bins.
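A highly simplified stand-in for the partial-covariance idea is the textbook partial correlation, which projects a common fluctuation (here playing the role of imprecise centrality) out of two multiplicities. The simulated data and parameters below are invented for illustration:

```python
import numpy as np

# Partial correlation of x and y given a confounder z, via the standard formula.
def partial_corr(x, y, z):
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

rng = np.random.default_rng(3)
c = rng.standard_normal(5000)             # event-by-event "centrality" fluctuation
x = c + 0.1 * rng.standard_normal(5000)   # multiplicity in rapidity bin 1
y = c + 0.1 * rng.standard_normal(5000)   # multiplicity in rapidity bin 2

raw = np.corrcoef(x, y)[0, 1]             # spuriously high: both bins track c
part = partial_corr(x, y, c)              # near zero once c is projected out
print(round(raw, 2), round(part, 2))
```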

  9. Automated modal parameter estimation using correlation analysis and bootstrap sampling

    Science.gov (United States)

    Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.

    2018-02-01

    The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences with the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement, dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
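One widely used correlation metric for pairing and clustering mode-shape estimates is the Modal Assurance Criterion (MAC). The paper's own metrics additionally address eigenvector aliasing and coalescent modes, which this minimal sketch does not attempt:

```python
import numpy as np

# MAC between two mode-shape vectors: 1 for identical shapes (up to scaling),
# near 0 for unrelated shapes.
def mac(phi1, phi2):
    num = abs(np.vdot(phi1, phi2)) ** 2
    return num / (np.vdot(phi1, phi1).real * np.vdot(phi2, phi2).real)

v1 = np.array([1.0, 2.0, 3.0])
v2 = 2.5 * v1                  # same mode shape, different scaling
v3 = np.array([3.0, -1.0, 0.5])

print(round(mac(v1, v2), 2))   # → 1.0: identical shapes cluster together
print(round(mac(v1, v3), 2))   # → 0.04: dissimilar shapes do not
```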

  10. Nanoscale protein diffusion by STED-based pair correlation analysis.

    Directory of Open Access Journals (Sweden)

    Paolo Bianchini

    Full Text Available We describe for the first time the combination between cross-pair correlation function analysis (pair correlation analysis or pCF) and stimulated emission depletion (STED) to obtain diffusion maps at spatial resolution below the optical diffraction limit (super-resolution). Our approach was tested in systems characterized by high and low signal-to-noise ratio, i.e. Capsid Like Particles (CLPs) bearing several (>100) active fluorescent proteins and monomeric fluorescent proteins transiently expressed in living Chinese Hamster Ovary cells, respectively. The latter system represents the usual condition encountered in living cell studies on fluorescent protein chimeras. The spatial resolution of STED-pCF was found to be about 110 nm, with a more than twofold improvement over conventional confocal acquisition. We successfully applied our method to highlight how proximity to the nuclear envelope affects the mobility features of proteins actively imported into the nucleus in living cells. Remarkably, STED-pCF unveiled the existence of local barriers to diffusion as well as the presence of a slow component at distances up to 500-700 nm from either side of the nuclear envelope. The mobility of this component is similar to that previously described for transport complexes. Notably, all these features were invisible in conventional confocal mode.

  11. Scalable and Flexible Multiview MAX-VAR Canonical Correlation Analysis

    Science.gov (United States)

    Fu, Xiao; Huang, Kejun; Hong, Mingyi; Sidiropoulos, Nicholas D.; So, Anthony Man-Cho

    2017-08-01

    Generalized canonical correlation analysis (GCCA) aims at finding latent low-dimensional common structure from multiple views (feature vectors in different domains) of the same entities. Unlike principal component analysis (PCA) that handles a single view, (G)CCA is able to integrate information from different feature spaces. Here we focus on MAX-VAR GCCA, a popular formulation which has recently gained renewed interest in multilingual processing and speech modeling. The classic MAX-VAR GCCA problem can be solved optimally via eigen-decomposition of a matrix that compounds the (whitened) correlation matrices of the views; but this solution has serious scalability issues, and is not directly amenable to incorporating pertinent structural constraints such as non-negativity and sparsity on the canonical components. We posit regularized MAX-VAR GCCA as a non-convex optimization problem and propose an alternating optimization (AO)-based algorithm to handle it. Our algorithm alternates between inexact solutions of a regularized least squares subproblem and a manifold-constrained non-convex subproblem, thereby achieving substantial memory and computational savings. An important benefit of our design is that it can easily handle structure-promoting regularization. We show that the algorithm globally converges to a critical point at a sublinear rate, and approaches a global optimal solution at a linear rate when no regularization is considered. Judiciously designed simulations and large-scale word embedding tasks are employed to showcase the effectiveness of the proposed algorithm.
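The classic (unregularized) MAX-VAR solution mentioned in the abstract can be sketched directly: the common representation G collects the top eigenvectors of the sum of the views' projection matrices. The synthetic views below are invented, and this sketch deliberately ignores the paper's AO algorithm and regularization:

```python
import numpy as np

# Classic MAX-VAR GCCA via eigen-decomposition (small-scale sketch).
def maxvar_gcca(views, k):
    n = views[0].shape[0]
    M = np.zeros((n, n))
    for X in views:
        # projector onto the column space of each view
        M += X @ np.linalg.pinv(X.T @ X) @ X.T
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, -k:]            # top-k eigenvectors = common structure G

rng = np.random.default_rng(4)
Z = rng.standard_normal((100, 2))  # shared latent structure across views
views = [Z @ rng.standard_normal((2, 5)) + 0.01 * rng.standard_normal((100, 5))
         for _ in range(3)]

G = maxvar_gcca(views, k=2)
print(G.shape)   # → (100, 2); columns are orthonormal by construction
```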

  12. On the Construction of Bivariate Exponential Distributions with an Arbitrary Correlation Coefficient

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis

    2010-01-01

    coefficient (also negative). Secondly, the class satisfies that any linear combination (projection) of the marginal random variables is a phase-type distribution. The latter property is particularly important for the development of hypothesis testing in linear models. Finally, it is easy to simulate...

  13. Powerful bivariate genome-wide association analyses suggest the SOX6 gene influencing both obesity and osteoporosis phenotypes in males.

    Directory of Open Access Journals (Sweden)

    Yao-Zhong Liu

    2009-08-01

    Full Text Available Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits. Hence it is difficult for this approach to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning approximately 380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82×10⁻⁷ and 1.47×10⁻⁶, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the approximately 380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis.

  14. Powerful Bivariate Genome-Wide Association Analyses Suggest the SOX6 Gene Influencing Both Obesity and Osteoporosis Phenotypes in Males

    Science.gov (United States)

    Liu, Yao-Zhong; Pei, Yu-Fang; Liu, Jian-Feng; Yang, Fang; Guo, Yan; Zhang, Lei; Liu, Xiao-Gang; Yan, Han; Wang, Liang; Zhang, Yin-Ping; Levy, Shawn; Recker, Robert R.; Deng, Hong-Wen

    2009-01-01

    Background Current genome-wide association studies (GWAS) are normally implemented in a univariate framework and analyze different phenotypes in isolation. This univariate approach ignores the potential genetic correlation between important disease traits. Hence it is difficult for this approach to detect pleiotropic genes, which may exist for obesity and osteoporosis, two common diseases of major public health importance that are closely correlated genetically. Principal Findings To identify such pleiotropic genes and the key mechanistic links between the two diseases, we here performed the first bivariate GWAS of obesity and osteoporosis. We searched for genes underlying co-variation of the obesity phenotype, body mass index (BMI), with the osteoporosis risk phenotype, hip bone mineral density (BMD), scanning ∼380,000 SNPs in 1,000 unrelated homogeneous Caucasians, including 499 males and 501 females. We identified in the male subjects two SNPs in intron 1 of the SOX6 (SRY-box 6) gene, rs297325 and rs4756846, which were bivariately associated with both BMI and hip BMD, achieving p values of 6.82×10−7 and 1.47×10−6, respectively. The two SNPs ranked at the top in significance for bivariate association with BMI and hip BMD in the male subjects among all the ∼380,000 SNPs examined genome-wide. The two SNPs were replicated in a Framingham Heart Study (FHS) cohort containing 3,355 Caucasians (1,370 males and 1,985 females) from 975 families. In the FHS male subjects, the two SNPs achieved p values of 0.03 and 0.02, respectively, for bivariate association with BMI and femoral neck BMD. Interestingly, SOX6 was previously found to be essential to both cartilage formation/chondrogenesis and obesity-related insulin resistance, suggesting the gene's dual role in both bone and fat. Conclusions Our findings, together with the prior biological evidence, suggest the SOX6 gene's importance in co-regulation of obesity and osteoporosis. PMID:19714249

  15. Linearized spectrum correlation analysis for line emission measurements.

    Science.gov (United States)

    Nishizawa, T; Nornberg, M D; Den Hartog, D J; Sarff, J S

    2017-08-01

    A new spectral analysis method, Linearized Spectrum Correlation Analysis (LSCA), for charge exchange and passive ion Doppler spectroscopy is introduced to provide a means of measuring fast spectral line shape changes associated with ion-scale micro-instabilities. This analysis method is designed to resolve the fluctuations in the emission line shape from a stationary ion-scale wave. The method linearizes the fluctuations around a time-averaged line shape (e.g., Gaussian) and subdivides the spectral output channels into two sets to reduce contributions from uncorrelated fluctuations without averaging over the fast time dynamics. In principle, small fluctuations in the parameters used for a line shape model can be measured by evaluating the cross spectrum between different channel groupings to isolate a particular fluctuating quantity. High-frequency ion velocity measurements (100-200 kHz) were made by using this method. We also conducted simulations to compare LSCA with a moment analysis technique under a low photon count condition. Both experimental and synthetic measurements demonstrate the effectiveness of LSCA.

  16. Principal Component Analysis Based Two-Dimensional (PCA-2D) Correlation Spectroscopy: PCA Denoising for 2D Correlation Spectroscopy

    International Nuclear Information System (INIS)

    Jung, Young Mee

    2003-01-01

    Principal component analysis based two-dimensional (PCA-2D) correlation analysis is applied to FTIR spectra of a polystyrene/methyl ethyl ketone/toluene solution mixture during solvent evaporation. A substantial amount of artificial noise was added to the experimental data to demonstrate the practical noise-suppressing benefit of the PCA-2D technique. 2D correlation analysis of the reconstructed data matrix from PCA loading vectors and scores successfully extracted only the most important features of synchronicity and asynchronicity without interference from noise or insignificant minor components. 2D correlation spectra constructed with only one principal component yield a strictly synchronous response with no discernible asynchronous features, while those involving at least two or more principal components generated meaningful asynchronous 2D correlation spectra. Deliberate manipulation of the rank of the reconstructed data matrix, by choosing the appropriate number and type of PCs, yields potentially more refined 2D correlation spectra.

  17. Improving runoff risk estimates: Formulating runoff as a bivariate process using the SCS curve number method

    Science.gov (United States)

    Shaw, Stephen B.; Walter, M. Todd

    2009-03-01

    The Soil Conservation Service curve number (SCS-CN) method is widely used to predict storm runoff for hydraulic design purposes, such as sizing culverts and detention basins. As traditionally used, the probability of calculated runoff is equated to the probability of the causative rainfall event, an assumption that fails to account for the influence of variations in soil moisture on runoff generation. We propose a modification to the SCS-CN method that explicitly incorporates rainfall return periods and the frequency of different soil moisture states to quantify storm runoff risks. Soil moisture status is assumed to be correlated to stream base flow. Fundamentally, this approach treats runoff as the outcome of a bivariate process instead of dictating a 1:1 relationship between causative rainfall and resulting runoff volumes. Using data from the Fall Creek watershed in western New York and the headwaters of the French Broad River in the mountains of North Carolina, we show that our modified SCS-CN method improves frequency discharge predictions in medium-sized watersheds in the eastern United States in comparison to the traditional application of the method.
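For reference, the standard SCS-CN runoff equation that the proposed modification builds on is easy to state in code. The curve numbers and rainfall depths below are illustrative, not values from the paper:

```python
# Standard SCS curve number runoff equation (depths in inches):
#   S  = 1000/CN - 10        potential maximum retention
#   Ia = 0.2 * S             initial abstraction
#   Q  = (P - Ia)^2 / (P + 0.8 * S)   for P > Ia, else 0
def scs_runoff(p_inches, cn):
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s
    if p_inches <= ia:
        return 0.0
    return (p_inches - ia) ** 2 / (p_inches + 0.8 * s)

# Wetter antecedent conditions (a higher effective CN) give more runoff
# for the same 3-inch storm, which is the variability the paper exploits.
print(round(scs_runoff(3.0, 70), 2))   # → 0.71
print(round(scs_runoff(3.0, 85), 2))   # → 1.59
```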

  18. A generalized right truncated bivariate Poisson regression model with applications to health data.

    Science.gov (United States)

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or under-dispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.
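The marginal-conditional factorization underlying such models, P(y1, y2) = P(y1) P(y2 | y1), can be illustrated with a toy simulation. The rates and sample size are invented, and this is a data-generating sketch, not the paper's estimation procedure:

```python
import numpy as np

# Toy bivariate Poisson data via the marginal-conditional construction:
# y1 is marginal Poisson; y2 is Poisson with a rate that grows with y1.
rng = np.random.default_rng(5)
n = 20000
y1 = rng.poisson(2.0, size=n)          # e.g. number of health conditions
y2 = rng.poisson(1.0 + 0.5 * y1)       # e.g. number of services utilized

print(round(np.mean(y1), 1))           # close to the marginal rate 2.0
print(round(np.mean(y2), 1))           # close to 1.0 + 0.5 * 2.0 = 2.0
print(np.corrcoef(y1, y2)[0, 1] > 0)   # the construction induces positive correlation
```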

  19. On the matched pairs sign test using bivariate ranked set sampling ...

    African Journals Online (AJOL)

    BVRSS) is introduced and investigated. We show that this test is asymptotically more efficient than its counterpart sign test based on a bivariate simple random sample (BVSRS). The asymptotic null distribution and the efficiency of the test are derived.

  20. Meconium microbiome analysis identifies bacteria correlated with premature birth.

    Directory of Open Access Journals (Sweden)

    Alexandria N Ardissone

    Full Text Available Preterm birth is the second leading cause of death in children under the age of five years worldwide, but the etiology of many cases remains enigmatic. The dogma that the fetus resides in a sterile environment is being challenged by recent findings and the question has arisen whether microbes that colonize the fetus may be related to preterm birth. It has been posited that meconium reflects the in-utero microbial environment. In this study, correlations between fetal intestinal bacteria from meconium and gestational age were examined in order to suggest underlying mechanisms that may contribute to preterm birth. Meconium from 52 infants ranging in gestational age from 23 to 41 weeks was collected, the DNA extracted, and 16S rRNA analysis performed. Resulting taxa of microbes were correlated to clinical variables and also compared to previous studies of amniotic fluid and other human microbiome niches. Increased detection of bacterial 16S rRNA in meconium of infants of <33 weeks gestational age was observed. Approximately 61.1% of reads sequenced were classified to genera that have been reported in amniotic fluid. Gestational age had the largest influence on microbial community structure (R = 0.161; p = 0.029), while mode of delivery (C-section versus vaginal delivery) had an effect as well (R = 0.100; p = 0.044). Enterobacter, Enterococcus, Lactobacillus, Photorhabdus, and Tannerella, were negatively correlated with gestational age and have been reported to incite inflammatory responses, suggesting a causative role in premature birth. This provides the first evidence to support the hypothesis that the fetal intestinal microbiome derived from swallowed amniotic fluid may be involved in the inflammatory response that leads to premature birth.

  1. Analysis of correlations between sites in models of protein sequences

    International Nuclear Information System (INIS)

    Giraud, B.G.; Lapedes, A.; Liu, L.C.

    1998-01-01

    A criterion based on conditional probabilities, related to the concept of algorithmic distance, is used to detect correlated mutations at noncontiguous sites on sequences. We apply this criterion to the problem of analyzing correlations between sites in protein sequences; however, the analysis applies generally to networks of interacting sites with discrete states at each site. Elementary models, where explicit results can be derived easily, are introduced. The number of states per site considered ranges from 2, illustrating the relation to familiar classical spin systems, to 20 states, suitable for representing amino acids. Numerical simulations show that the criterion remains valid even when the genetic history of the data samples (e.g., protein sequences), as represented by a phylogenetic tree, introduces nonindependence between samples. Statistical fluctuations due to finite sampling are also investigated and do not invalidate the criterion. A subsidiary result is found: The more homogeneous a population, the more easily its average properties can drift from the properties of its ancestor. copyright 1998 The American Physical Society
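A common conditional-probability-based measure of correlation between sites is the mutual information between alignment columns. The sketch below (toy two-state sites, a simplified stand-in for the paper's criterion) contrasts a perfectly correlated pair of sites with an independent one:

```python
import numpy as np
from collections import Counter

# Mutual information (in bits) between two site columns, from the joint
# distribution versus the product of the marginals.
def mutual_info(col_a, col_b):
    n = len(col_a)
    pa, pb = Counter(col_a), Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    mi = 0.0
    for (a, b), c in pab.items():
        mi += (c / n) * np.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
    return mi

rng = np.random.default_rng(6)
site1 = rng.integers(0, 2, 2000)   # two-state site, like a classical spin
site2 = site1.copy()               # perfectly correlated site
site3 = rng.integers(0, 2, 2000)   # independent site

print(round(mutual_info(site1, site2), 2))   # ~1 bit for two equiprobable states
print(round(mutual_info(site1, site3), 2))   # near zero for independent sites
```

With 20 states per site (amino acids), the same function applies unchanged; only the alphabet grows.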

  2. Forecast Correlation Coefficient Matrix of Stock Returns in Portfolio Analysis

    OpenAIRE

    Zhao, Feng

    2013-01-01

    In Modern Portfolio Theory, the correlation coefficients determine the risk of a set of stocks in the portfolio. Understanding the correlation coefficients between returns of stocks is therefore a challenge, but it is very important for portfolio management. Usually, stocks with small or even negative correlation coefficients are preferred. One can calculate the correlation coefficients of stock returns based on the historical stock data. However, in order to control the ...
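Estimating the historical correlation matrix of returns is a one-liner with numpy. The returns below are simulated under an assumed single common market factor, purely for illustration:

```python
import numpy as np

# Simulated daily returns for three stocks: A and B load on a common market
# factor, C is independent of it (assumed structure for this sketch).
rng = np.random.default_rng(7)
market = rng.standard_normal(250)
returns = np.column_stack([
    0.8 * market + 0.2 * rng.standard_normal(250),   # stock A
    0.8 * market + 0.2 * rng.standard_normal(250),   # stock B
    rng.standard_normal(250),                        # stock C
])

corr = np.corrcoef(returns, rowvar=False)            # 3x3 correlation matrix
print(corr.shape)
print(corr[0, 1] > corr[0, 2])   # A and B co-move far more strongly than A and C
```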

  3. A flexible time recording and time correlation analysis system

    International Nuclear Information System (INIS)

    Shenhav, N.J.; Leiferman, G.; Segal, Y.; Notea, A.

    1983-01-01

    A system was developed to digitize and record the time intervals between detection event pulses fed to its input channels from a detection device. The accumulated data are transferred continuously in real time to a disc through a PDP 11/34 minicomputer. Even though the system was designed for a specific purpose, i.e., the comparative study of passive neutron nondestructive assay methods, its features characterize it as a general-purpose time-series recorder. The time correlation analysis is performed by software after completion of the data accumulation. The digitizing clock period is selectable and any value larger than a minimum of 100 ns may be selected. Bursts of up to 128 events with a frequency up to 10 MHz may be recorded. With the present recorder-minicomputer combination, the maximal average recording frequency is 40 kHz. (orig.)

  4. Brillouin optical correlation domain analysis in composite material beams

    DEFF Research Database (Denmark)

    Stern, Yonatan; London, Yosef; Preter, Eyal

    2017-01-01

    Structural health monitoring is a critical requirement in many composites. Numerous monitoring strategies rely on measurements of temperature or strain (or both), however these are often restricted to point-sensing or to the coverage of small areas. Spatially-continuous data can be obtained...... with optical fiber sensors. In this work, we report high-resolution distributed Brillouin sensing over standard fibers that are embedded in composite structures. A phase-coded, Brillouin optical correlation domain analysis (B-OCDA) protocol was employed, with spatial resolution of 2 cm and sensitivity of 1 °K...... or 20 micro-strain. A portable measurement setup was designed and assembled on the premises of a composite structures manufacturer. The setup was successfully utilized in several structural health monitoring scenarios: (a) monitoring the production and curing of a composite beam over 60 h; (b...

  5. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel coordinates based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  6. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    Science.gov (United States)

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
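The note's starting point, that the arithmetic mean is the coefficient of an intercept-only least-squares regression, can be verified in a few lines:

```python
import numpy as np

# Regressing y on a constant alone: the fitted coefficient is the sample mean.
y = np.array([2.0, 4.0, 6.0, 8.0])
X = np.ones((len(y), 1))                 # regressor: intercept only
b0, *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(float(b0[0]), 6))            # → 5.0, identical to y.mean()
```

Weighted averages follow the same pattern with weighted least squares.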

  7. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using Fisher's linear discriminant analysis, the support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide a total classification accuracy of 86.67% and an area (Az) of 0.9096 under the receiver operating characteristics curve, which were superior to the results obtained by either Fisher's linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for the analysis of knee joint VAG signals.
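The maximal posterior probability rule over kernel density estimates can be sketched as follows, with a hand-rolled Gaussian KDE and invented two-class bivariate data; the paper's actual features, bandwidths, and priors differ:

```python
import numpy as np

rng = np.random.default_rng(8)
normal = rng.normal(0.0, 1.0, size=(300, 2))     # class 0: "normal" feature vectors
abnormal = rng.normal(3.0, 1.0, size=(300, 2))   # class 1: "abnormal" feature vectors

def kde(data, point, h=0.5):
    # isotropic Gaussian kernel density estimate at `point` (2D)
    d2 = np.sum((data - np.asarray(point)) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2 * h * h))) / (2 * np.pi * h * h)

def classify(point, prior0=0.5, prior1=0.5):
    # maximal posterior probability decision between the two classes
    p0 = kde(normal, point) * prior0
    p1 = kde(abnormal, point) * prior1
    return 0 if p0 >= p1 else 1

print(classify([0.1, -0.2]))   # → 0: near the normal-class density peak
print(classify([2.9, 3.2]))    # → 1: near the abnormal-class density peak
```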

  8. Brillouin Optical Correlation Domain Analysis in Composite Material Beams

    Directory of Open Access Journals (Sweden)

    Yonatan Stern

    2017-10-01

    Structural health monitoring is a critical requirement in many composites. Numerous monitoring strategies rely on measurements of temperature or strain (or both); however, these are often restricted to point-sensing or to the coverage of small areas. Spatially-continuous data can be obtained with optical fiber sensors. In this work, we report high-resolution distributed Brillouin sensing over standard fibers that are embedded in composite structures. A phase-coded, Brillouin optical correlation domain analysis (B-OCDA) protocol was employed, with spatial resolution of 2 cm and sensitivity of 1 °K or 20 micro-strain. A portable measurement setup was designed and assembled on the premises of a composite structures manufacturer. The setup was successfully utilized in several structural health monitoring scenarios: (a) monitoring the production and curing of a composite beam over 60 h; (b) estimating the stiffness and Young's modulus of a composite beam; and (c) distributed strain measurements across the surfaces of a model wing of an unmanned aerial vehicle. The measurements are supported by the predictions of structural analysis calculations. The results illustrate the potential added values of high-resolution, distributed Brillouin sensing in the structural health monitoring of composites.

  9. Brillouin Optical Correlation Domain Analysis in Composite Material Beams.

    Science.gov (United States)

    Stern, Yonatan; London, Yosef; Preter, Eyal; Antman, Yair; Diamandi, Hilel Hagai; Silbiger, Maayan; Adler, Gadi; Levenberg, Eyal; Shalev, Doron; Zadok, Avi

    2017-10-02

    Structural health monitoring is a critical requirement in many composites. Numerous monitoring strategies rely on measurements of temperature or strain (or both), however these are often restricted to point-sensing or to the coverage of small areas. Spatially-continuous data can be obtained with optical fiber sensors. In this work, we report high-resolution distributed Brillouin sensing over standard fibers that are embedded in composite structures. A phase-coded, Brillouin optical correlation domain analysis (B-OCDA) protocol was employed, with spatial resolution of 2 cm and sensitivity of 1 °K or 20 micro-strain. A portable measurement setup was designed and assembled on the premises of a composite structures manufacturer. The setup was successfully utilized in several structural health monitoring scenarios: (a) monitoring the production and curing of a composite beam over 60 h; (b) estimating the stiffness and Young's modulus of a composite beam; and (c) distributed strain measurements across the surfaces of a model wing of an unmanned aerial vehicle. The measurements are supported by the predictions of structural analysis calculations. The results illustrate the potential added values of high-resolution, distributed Brillouin sensing in the structural health monitoring of composites.

  10. Interactive Correlation Analysis and Visualization of Climate Data

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)

    2016-09-21

    The relationship between our ability to analyze and extract insights from visualizations of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, the number of different simulations performed with a climate model, or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models. While traditional climate images will continue to be useful, there is a need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.

  11. Multifractal Detrended Cross-Correlation Analysis of agricultural futures markets

    International Nuclear Information System (INIS)

    He Lingyun; Chen Shupeng

    2011-01-01

    Highlights: → We investigated cross-correlations between China's and US agricultural futures markets. → Power-law cross-correlations are found between the geographically distant but correlated markets. → Multifractal features are significant in all the markets. → The cross-correlation exponent is less than the averaged GHE when q < 0. - Abstract: We investigated the geographically distant but temporally correlated Chinese and US agricultural futures markets. We found that there exists a power-law cross-correlation between them, and that multifractal features are significant in all the markets. It is very interesting that the geographically distant markets show strong cross-correlations and share much of their multifractal structure. Furthermore, we found that for all the agricultural futures markets in our study, the cross-correlation exponent is less than the averaged generalized Hurst exponents (GHE) when q < 0.
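
    The cross-correlation exponent discussed above comes from multifractal detrended cross-correlation analysis. A simplified MF-DXA-style estimator, assuming linear detrending and a single order q (a sketch, not the authors' implementation; with identical inputs it reduces to ordinary detrended fluctuation analysis):

```python
import numpy as np

def mfdxa_exponent(x, y, q=2, scales=(16, 32, 64, 128, 256)):
    """Estimate the q-th order cross-correlation scaling exponent h_xy(q)
    from the slope of log F_q(s) versus log s."""
    X = np.cumsum(x - np.mean(x))  # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    logF, logS = [], []
    for s in scales:
        n_seg = len(X) // s
        f2 = []
        t = np.arange(s)
        for v in range(n_seg):
            seg = slice(v * s, (v + 1) * s)
            # remove the local linear trend from each profile segment
            rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
            ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
            f2.append(np.mean(rx * ry))
        f2 = np.abs(np.array(f2))
        Fq = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
        logF.append(np.log(Fq))
        logS.append(np.log(s))
    return np.polyfit(logS, logF, 1)[0]

rng = np.random.default_rng(1)
series = rng.standard_normal(4096)
h = mfdxa_exponent(series, series)  # identical series: plain DFA, ~0.5 for white noise
```

    Scanning q over a range of positive and negative orders and watching h_xy(q) vary is what reveals multifractality.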

  12. Correlation and network analysis of global financial indices.

    Science.gov (United States)

    Kumar, Sunil; Deo, Nivedita

    2012-08-01

    Random matrix theory (RMT) and network methods are applied to investigate the correlation and network properties of 20 financial indices. The results are compared before and during the financial crisis of 2008. In the RMT method, the components of eigenvectors corresponding to the second largest eigenvalue form two clusters of indices in the positive and negative directions. The components of these two clusters switch in opposite directions during the crisis. The network analysis uses the Fruchterman-Reingold layout to find clusters in the network of indices at different thresholds. At a threshold of 0.6, before the crisis, financial indices corresponding to the Americas, Europe, and Asia-Pacific form separate clusters. On the other hand, during the crisis at the same threshold, the American and European indices combine together to form a strongly linked cluster while the Asia-Pacific indices form a separate weakly linked cluster. If the value of the threshold is further increased to 0.9 then the European indices (France, Germany, and the United Kingdom) are found to be the most tightly linked indices. The structure of the minimum spanning tree of financial indices is more starlike before the crisis and it changes to become more chainlike during the crisis. The average linkage hierarchical clustering algorithm is used to find a clearer cluster structure in the network of financial indices. The cophenetic correlation coefficients are calculated and found to increase significantly, which indicates that the hierarchy increases during the financial crisis. These results show that there is substantial change in the structure of the organization of financial indices during a financial crisis.
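
    The minimum spanning tree construction mentioned above is conventionally built from the distance d_ij = sqrt(2(1 - rho_ij)), so strongly correlated indices sit close together. A minimal sketch using Prim's algorithm on synthetic stand-ins for index return series (the data are assumptions for illustration):

```python
import numpy as np

def correlation_mst(returns):
    """Prim's algorithm on the correlation distance d_ij = sqrt(2 (1 - rho_ij)).

    `returns` is a (n_series, n_obs) array; returns a list of n-1 tree edges."""
    rho = np.corrcoef(returns)
    d = np.sqrt(2.0 * (1.0 - rho))
    n = d.shape[0]
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or d[i, j] < best[2]):
                    best = (i, j, d[i, j])
        edges.append((best[0], best[1]))
        in_tree.add(best[1])
    return edges

rng = np.random.default_rng(2)
base = rng.standard_normal(500)
data = np.vstack([
    base,
    base + 0.1 * rng.standard_normal(500),  # tightly linked to series 0
    rng.standard_normal(500),
    rng.standard_normal(500),
])
tree = correlation_mst(data)
```

    Comparing the tree's shape before and during a crisis (star-like versus chain-like, as the abstract reports) is then a matter of inspecting node degrees.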

  13. Stoichiometric Correlation Analysis: Principles of Metabolic Functionality from Metabolomics Data

    Directory of Open Access Journals (Sweden)

    Kevin Schwahn

    2017-12-01

    Recent advances in metabolomics technologies have resulted in high-quality (time-resolved) metabolic profiles with an increasing coverage of metabolic pathways. These data profiles represent read-outs from often non-linear dynamics of metabolic networks. Yet, metabolic profiles have largely been explored with regression-based approaches that only capture linear relationships, rendering it difficult to determine the extent to which the data reflect the underlying reaction rates and their couplings. Here we propose an approach termed Stoichiometric Correlation Analysis (SCA) based on correlation between positive linear combinations of log-transformed metabolic profiles. The log-transformation reflects the evidence that metabolic networks can be modeled by the mass action law and kinetics derived from it. Unlike existing approaches, which establish a relation between pairs of metabolites, SCA facilitates the discovery of higher-order dependence between more than two metabolites. By using a paradigmatic model of the tricarboxylic acid cycle we show that the higher-order dependence reflects the coupling of concentrations of reactant complexes, capturing the subtle difference between the employed enzyme kinetics. Using time-resolved metabolic profiles from Arabidopsis thaliana and Escherichia coli, we show that SCA can be used to quantify the difference in coupling of reactant complexes, and hence reaction rates, underlying the stringent response in these model organisms. By using SCA with data from natural variation of wild and domesticated wheat and tomato accessions, we demonstrate that domestication is accompanied by loss of such couplings in these species. Therefore, application of SCA to metabolomics data from natural variation in wild and domesticated populations provides a mechanistic way of understanding domestication and its relation to metabolic networks.
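
    The core idea of SCA, that mass-action kinetics couples sums of log-concentrations (i.e., products of concentrations) rather than single metabolites, can be illustrated on synthetic profiles. In the sketch below the profiles and the reaction A + B → C are made-up assumptions; the point is that the higher-order combination log(a) + log(b) correlates with log(c) far better than any single metabolite does:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic profiles: under mass action, the rate of A + B -> C couples the
# *product* of the reactant concentrations (the "reactant complex") to C.
a = rng.lognormal(0.0, 0.5, 200)
b = rng.lognormal(0.0, 0.5, 200)
c = a * b * rng.lognormal(0.0, 0.05, 200)  # c tracks the complex a*b, plus noise

la, lb, lc = np.log(a), np.log(b), np.log(c)

def corr(u, v):
    return float(np.corrcoef(u, v)[0, 1])

pairwise = corr(la, lc)            # ordinary pairwise correlation
higher_order = corr(la + lb, lc)   # positive linear combination of log-profiles
```

    A full SCA would search over such positive integer combinations systematically; this sketch only demonstrates the contrast for one known combination.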

  14. Canonical correlation analysis for gene-based pleiotropy discovery.

    Directory of Open Access Journals (Sweden)

    Jose A Seoane

    2014-10-01

    Genome-wide association studies have identified a wealth of genetic variants involved in complex traits and multifactorial diseases. There is now considerable interest in testing variants for association with multiple phenotypes (pleiotropy) and in testing multiple variants for association with a single phenotype (gene-based association tests). Such approaches can increase statistical power by combining evidence for association over multiple phenotypes or genetic variants, respectively. Canonical Correlation Analysis (CCA) measures the correlation between two sets of multidimensional variables, and thus offers the potential to combine these two approaches. To apply CCA, we must restrict the number of attributes relative to the number of samples. Hence we consider modules of genetic variation that can comprise a gene, a pathway or another biologically relevant grouping, and/or a set of phenotypes. In order to do this, we use an attribute selection strategy based on a binary genetic algorithm. Applied to a UK-based prospective cohort study of 4286 women (the British Women's Heart and Health Study), we find improved statistical power in the detection of previously reported genetic associations, and identify a number of novel pleiotropic associations between genetic variants and phenotypes. New discoveries include gene-based association of NSF with triglyceride levels and several genes (ACSM3, ERI2, IL18RAP, IL23RAP and NRG1) with left ventricular hypertrophy phenotypes. In multiple-phenotype analyses we find association of NRG1 with left ventricular hypertrophy phenotypes, fibrinogen and urea, and pleiotropic relationships of F7 and F10 with Factor VII, Factor IX and cholesterol levels.
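
    CCA itself can be sketched via an SVD of the whitened cross-covariance matrix. The genotype and phenotype matrices below are synthetic placeholders, and the small ridge term is a numerical-stability assumption rather than part of classical CCA:

```python
import numpy as np

def cca(X, Y, reg=1e-8):
    """Canonical correlations between data sets X (n x p) and Y (n x q),
    via SVD of Sxx^{-1/2} Sxy Syy^{-1/2}."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / (n - 1)
    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    # singular values of M are the canonical correlations, in decreasing order
    return np.clip(np.linalg.svd(M, compute_uv=False), 0.0, 1.0)

rng = np.random.default_rng(4)
G = rng.standard_normal((300, 5))  # stand-in for variants in a gene module
# two phenotypes, both driven by the combination G[:,0] + G[:,1]
P = np.outer(G[:, 0] + G[:, 1], [1.0, 0.5]) + 0.3 * rng.standard_normal((300, 2))
rhos = cca(G, P)
```

    The attribute-selection step in the paper (a binary genetic algorithm) sits on top of this: it chooses which columns of each module enter X and Y so that n stays comfortably larger than p + q.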

  15. Prognostic value of correlation analysis of perinatal anamnesis

    Directory of Open Access Journals (Sweden)

    V. V. Sofronov

    2017-01-01

    Objective: to establish the prognostic value of the analysis of correlative relationships between qualitative indicators of the perinatal history. Correlative groups of interactions of the investigated qualitative indicators in the antenatal, intranatal, and postnatal periods were constructed. It was shown that in the antenatal history for newborns of 22–37 weeks' gestation (group 1) the most important parameters are «gestational age», «chronic respiratory diseases in the mother», «premature birth in the anamnesis», and «exacerbation of chronic infections during pregnancy»; for newborns of 38–41 weeks' gestation (group 2) they are «cervical erosion», «ovarian cyst», «fibromyoma», and «colpitis». In the intranatal history the most important parameters are «anhydrous period» and «prolonged labor» for children of group 1, and only «prolonged labor» for children of group 2. In the postnatal history the two most important parameters for group 1 are «gestational age» and «zonal elevation of brain echogenicity», while for group 2 only «degree of asphyxia» is as important. The obtained results confirm the main known interrelationships of parameters of the perinatal history. At the same time, nontrivial connections between parameters of the perinatal history emerged: «allergic diseases in the mother» – «threatened miscarriage» – «ovarian cyst»; «chronic respiratory diseases in the mother» – «allergic diseases of the mother» – «diseases of the digestive system in the father».

  16. Modal Analysis and Model Correlation of the Mir Space Station

    Science.gov (United States)

    Kim, Hyoung M.; Kaouk, Mohamed

    2000-01-01

    This paper will discuss on-orbit dynamic tests, modal analysis, and model refinement studies performed as part of the Mir Structural Dynamics Experiment (MiSDE). Mir is the Russian permanently manned Space Station whose construction first started in 1986. The MiSDE was sponsored by the NASA International Space Station (ISS) Phase 1 Office and was part of the Shuttle-Mir Risk Mitigation Experiment (RME). One of the main objectives for MiSDE is to demonstrate the feasibility of performing on-orbit modal testing on large space structures to extract modal parameters that will be used to correlate mathematical models. The experiment was performed over a one-year span on the Mir-alone and Mir with a Shuttle docked. A total of 45 test sessions were performed including: Shuttle and Mir thruster firings, Shuttle-Mir and Progress-Mir dockings, crew exercise and pushoffs, and ambient noise during night-to-day and day-to-night orbital transitions. Test data were recorded with a variety of existing and new instrumentation systems that included: the MiSDE Mir Auxiliary Sensor Unit (MASU), the Space Acceleration Measurement System (SAMS), the Russian Mir Structural Dynamic Measurement System (SDMS), the Mir and Shuttle Inertial Measurement Units (IMUs), and the Shuttle payload bay video cameras. Modal analysis was performed on the collected test data to extract modal parameters, i.e. frequencies, damping factors, and mode shapes. A special time-domain modal identification procedure was used on free-decay structural responses. The results from this study show that modal testing and analysis of large space structures is feasible within operational constraints. Model refinements were performed on both the Mir alone and the Shuttle-Mir mated configurations. The design sensitivity approach was used for refinement, which adjusts structural properties in order to match analytical and test modal parameters. To verify the refinement results, the analytical responses calculated using
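
    A time-domain flavour of the modal identification used in such experiments can be illustrated on a simulated single-mode free-decay record: the frequency comes from the FFT peak and the damping factor from the logarithmic decrement of successive peaks. This is a minimal sketch under a single-mode assumption; the MiSDE procedure itself handled multiple modes and real sensor data:

```python
import numpy as np

def free_decay_modal_id(y, dt):
    """Estimate frequency (Hz) and damping ratio from a single-mode free decay."""
    n = len(y)
    spec = np.abs(np.fft.rfft(y * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, dt)
    f = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
    # positive local maxima give the decaying peak envelope
    peaks = [i for i in range(1, n - 1)
             if y[i] > y[i - 1] and y[i] > y[i + 1] and y[i] > 0]
    ratios = [y[peaks[k]] / y[peaks[k + 1]] for k in range(len(peaks) - 1)]
    delta = float(np.mean(np.log(ratios)))          # logarithmic decrement
    zeta = delta / np.sqrt(4.0 * np.pi ** 2 + delta ** 2)
    return f, zeta

dt = 0.01
t = np.arange(0, 20, dt)
true_f, true_zeta = 1.5, 0.02
y = np.exp(-true_zeta * 2 * np.pi * true_f * t) * np.cos(2 * np.pi * true_f * t)
f_hat, zeta_hat = free_decay_modal_id(y, dt)
```

    Model refinement then adjusts structural properties until the analytical modal parameters match such test-derived estimates.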

  17. Asymmetric correlation matrices: an analysis of financial data

    Science.gov (United States)

    Livan, G.; Rebecchi, L.

    2012-06-01

    We analyse the spectral properties of correlation matrices between distinct statistical systems. Such matrices are intrinsically non-symmetric, and allow the spectral analyses usually performed on standard Pearson correlation matrices to be extended to the realm of complex eigenvalues. We employ some recent random matrix theory results on the average eigenvalue density of this type of matrix to distinguish between noise and non-trivial correlation structures, and we focus on financial data as a case study. Namely, we employ daily prices of stocks belonging to the American and British stock exchanges, and look for the emergence of correlations between the two markets in the eigenvalue spectrum of their non-symmetric correlation matrix. We find several non-trivial results when considering time-lagged correlations over short lags, and we corroborate our findings by additionally studying the asymmetric correlation matrix of the principal components of our datasets.

  18. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    Science.gov (United States)

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals such that they might affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subject has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5% with high consistency, using a kNN classifier in 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We concluded that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation with other classifiers and with variation in ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.

  19. Estimating twin concordance for bivariate competing risks twin data

    DEFF Research Database (Denmark)

    Scheike, Thomas; Holst, Klaus K.; Hjelmborg, Jacob B.

    2014-01-01

    For twin time-to-event data, we consider different concordance probabilities, such as the casewise concordance, that are routinely computed as a measure of the lifetime dependence/correlation for specific diseases. The concordance probability here is the probability that both twins have experienced the event of interest. Under the assumption that both twins are censored at the same time, we show how to estimate this probability in the presence of right censoring, and, as a consequence, we can then estimate the casewise twin concordance. In addition, we can model the magnitude of within-pair dependence over time, and covariates may be further influential on the marginal risk and dependence structure. We establish the estimators' large-sample properties and suggest various tests, for example, for inferring familial influence. The method is demonstrated and motivated by specific twin data on cancer events with the competing risk of death. We thus aim to quantify the degree of dependence through the casewise concordance function and show a significant genetic component.
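
    Ignoring censoring and competing risks, the casewise concordance is simply the probability that the co-twin is affected given that a twin is affected. A naive empirical estimator on synthetic 0/1 affection-status pairs (the paper's estimator additionally handles right censoring and the competing risk of death):

```python
import numpy as np

def casewise_concordance(status):
    """Empirical casewise concordance for twin pairs, ignoring censoring:
    P(co-twin affected | a twin is affected). `status` is an (n, 2) 0/1 array."""
    status = np.asarray(status)
    both = np.sum((status[:, 0] == 1) & (status[:, 1] == 1))
    affected = np.sum(status[:, 0] == 1) + np.sum(status[:, 1] == 1)
    # each doubly-affected pair is counted once from each twin's perspective
    return 2.0 * both / affected

pairs = np.array([[1, 1], [1, 0], [0, 1], [1, 1], [0, 0], [1, 0]])
cc = casewise_concordance(pairs)  # 2*2 / 7 = 4/7
```

    Comparing this quantity between monozygotic and dizygotic pairs is what supports inference about a genetic component.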

  20. Multifractal detrended Cross Correlation Analysis of Foreign Exchange and SENSEX fluctuation in Indian perspective

    Science.gov (United States)

    Dutta, Srimonti; Ghosh, Dipak; Chatterjee, Sucharita

    2016-12-01

    The manuscript studies the autocorrelation and cross-correlation of SENSEX fluctuations and the foreign exchange rate with respect to the Indian scenario. Multifractal detrended fluctuation analysis (MFDFA) and multifractal detrended cross-correlation analysis (MFDXA) were employed to study the correlation between the two series. It was observed that the two series are strongly cross-correlated. The change in the degree of cross-correlation with time was studied, and the results are interpreted qualitatively.

  1. Provider attributes correlation analysis to their referral frequency and awards.

    Science.gov (United States)

    Wiley, Matthew T; Rivas, Ryan L; Hristidis, Vagelis

    2016-03-14

    There has been a recent growth in health provider search portals, where patients specify filters, such as specialty or insurance, and providers are ranked by patient ratings or other attributes. Previous work has identified attributes associated with a provider's quality through user surveys. Other work supports that intuitive quality-indicating attributes are associated with a provider's quality. We adopt a data-driven approach to study how quality indicators of providers are associated with a rich set of attributes including medical school, graduation year, procedures, fellowships, patient reviews, location, and technology usage. In this work, we only consider providers as individuals (e.g., general practitioners) and not organizations (e.g., hospitals). As quality indicators, we consider the referral frequency of a provider and a peer-nominated quality designation. We combined data from the Centers for Medicare and Medicaid Services (CMS) and several provider rating web sites to perform our analysis. Our data-driven analysis identified several attributes that correlate with, and are discriminative of, referral volume and peer-nominated awards. In particular, our results consistently demonstrate that these attributes vary by locality and that the frequency of an attribute is more important than its value (e.g., the number of patient reviews or hospital affiliations is more important than the average review rating or the ranking of the hospital affiliations, respectively). We demonstrate that it is possible to build accurate classifiers for referral frequency and quality designation, with accuracies over 85%. Our findings show that a one-size-fits-all approach to ranking providers is inadequate and that provider search portals should calibrate their ranking function based on location and specialty. Further, traditional filters of provider search portals should be reconsidered, and patients should be aware of existing pitfalls with these filters and educated on local

  2. Correlation Between Posttraumatic Growth and Posttraumatic Stress Disorder Symptoms Based on Pearson Correlation Coefficient: A Meta-Analysis.

    Science.gov (United States)

    Liu, An-Nuo; Wang, Lu-Lu; Li, Hui-Ping; Gong, Juan; Liu, Xiao-Hong

    2017-05-01

    The literature on posttraumatic growth (PTG) is burgeoning, and inconsistencies in the reported relationship between PTG and posttraumatic stress disorder (PTSD) symptoms have become a focal point of attention. Thus, this meta-analysis explores the relationship between PTG and PTSD symptoms through the Pearson correlation coefficient. A systematic search of the literature from January 1996 to November 2015 was completed. We retrieved reports on 63 studies that involved 26,951 patients. The weighted correlation coefficient revealed an effect size of 0.22 with a 95% confidence interval of 0.18 to 0.25. The meta-analysis provides evidence that PTG may be positively correlated with PTSD symptoms and that this correlation may be modified by age, trauma type, and time since trauma. Accordingly, people with high levels of PTG should not be ignored; rather, they should continue to receive help to alleviate their PTSD symptoms.
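
    A pooled correlation of this kind is typically obtained via Fisher's z transform with fixed-effect weights n - 3. A minimal sketch with made-up study values (the actual meta-analysis pooled 63 studies and examined moderators):

```python
import math

def pool_correlations(rs, ns):
    """Fixed-effect pooling of Pearson correlations via Fisher's z transform,
    weighting each study by n - 3; returns the pooled r with a 95% CI."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]   # Fisher z
    ws = [n - 3 for n in ns]                                # inverse-variance weights
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = 1.0 / math.sqrt(sum(ws))
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    back = lambda z: (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)  # inverse transform
    return back(z_bar), (back(lo), back(hi))

# Hypothetical study correlations and sample sizes (illustrative only).
r_pooled, (ci_lo, ci_hi) = pool_correlations([0.18, 0.22, 0.30], [120, 450, 200])
```

    Moderator effects (age, trauma type, time since trauma) would then be examined by pooling within subgroups or by meta-regression on the z scale.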

  3. Engineering Properties and Correlation Analysis of Fiber Cementitious Materials

    Directory of Open Access Journals (Sweden)

    Wei-Ting Lin

    2014-11-01

    This study focuses on the effect of the amount of silica fume added and the volume fraction of steel fiber on the engineering properties of cementitious materials. Test variables include the dosage of silica fume (5% and 10%), water/cement ratio (0.35 and 0.55), and steel fiber dosage (0.5%, 1.0%, and 2.0%). The experimental results, including compressive strength, direct tensile strength, splitting tensile strength, surface abrasion, and drop-weight tests, were collected for an analysis of variance to determine the relevance and significance of the material parameters to those mechanical properties. Test results illustrate that the splitting tensile strength, direct tensile strength, strain capacity, and crack-arresting ability increase with increasing steel fiber and silica fume dosages, and that the optimum mixture of the fiber cementitious materials is 5% silica fume replacement with 2% fiber dosage. In addition, the Pearson correlation coefficient was computed to evaluate the influence of the material variables and corresponds well with the experimental results.

  4. Applications of temporal kernel canonical correlation analysis in adherence studies.

    Science.gov (United States)

    John, Majnu; Lencz, Todd; Ferbinteanu, Janina; Gallego, Juan A; Robinson, Delbert G

    2017-10-01

    Adherence to medication is often measured as a continuous outcome but analyzed as a dichotomous outcome due to lack of appropriate tools. In this paper, we illustrate the use of the temporal kernel canonical correlation analysis (tkCCA) as a method to analyze adherence measurements and symptom levels on a continuous scale. The tkCCA is a novel method developed for studying the relationship between neural signals and hemodynamic response detected by functional MRI during spontaneous activity. Although the tkCCA is a powerful tool, it has not been utilized outside the application that it was originally developed for. In this paper, we simulate time series of symptoms and adherence levels for patients with a hypothetical brain disorder and show how the tkCCA can be used to understand the relationship between them. We also examine, via simulations, the behavior of the tkCCA under various missing value mechanisms and imputation methods. Finally, we apply the tkCCA to a real data example of psychotic symptoms and adherence levels obtained from a study based on subjects with a first episode of schizophrenia, schizophreniform or schizoaffective disorder.

  5. An analysis of correlation between occlusion classification and skeletal pattern

    International Nuclear Information System (INIS)

    Lu Xinhua; Cai Bin; Wang Dawei; Wu Liping

    2003-01-01

    Objective: To study the correlation between the dental relationship and the skeletal pattern of individuals. Methods: 194 cases were selected and classified by Angle classification, incisor relationship, and skeletal pattern, respectively. The correlations of Angle classification and incisor relationship with skeletal pattern were analyzed with SPSS 10.0. Results: The values of the correlation index (kappa) were 0.379 and 0.494, respectively. Conclusion: The incisor relationship is more consistent with the skeletal pattern than the Angle classification.
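
    The correlation index reported here is Cohen's kappa, which corrects observed agreement between two classifications for the agreement expected by chance. A minimal computation from a hypothetical 2 x 2 agreement table (the counts are illustrative, not the study's data):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table: table[i][j] counts cases
    placed in category i by one scheme and category j by the other."""
    total = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(len(table))) / total       # observed agreement
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    pe = sum(row[i] * col[i] for i in range(len(table))) / total ** 2  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical table: rows = skeletal pattern class, columns = incisor class.
k = cohens_kappa([[20, 5], [10, 15]])  # po = 0.7, pe = 0.5, kappa = 0.4
```

    On this scale, the study's values of 0.379 and 0.494 correspond to fair-to-moderate agreement.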

  6. Asset correlations and credit portfolio risk: an empirical analysis

    OpenAIRE

    Düllmann, Klaus; Scheicher, Martin; Schmieder, Christian

    2007-01-01

    In credit risk modelling, the correlation of unobservable asset returns is a crucial component for the measurement of portfolio risk. In this paper, we estimate asset correlations from monthly time series of Moody's KMV asset values for around 2,000 European firms from 1996 to 2004. We compare correlation and value-at-risk (VaR) estimates in a one-factor or market model and a multi-factor or sector model. Our main finding is a complex interaction of credit risk correlations and default probabi...

  7. Alarm reduction with correlation analysis; Larmsanering genom korrelationsanalys

    Energy Technology Data Exchange (ETDEWEB)

    Bergquist, Tord; Ahnlund, Jonas; Johansson, Bjoern; Gaardman, Lennart; Raaberg, Martin [Lund Univ. (Sweden). Dept. of Information Technology

    2004-09-01

    This project's main interest is to improve the overall alarm situation in control rooms. By doing so, the operators' working environment is less overstrained, which simplifies decision-making. According to a study of the British refinery industry, operators make wrong decisions in four cases out of ten due to badly tuned alarm systems, with heavy expenses as a result. Furthermore, more efficient alarm handling is estimated to decrease production losses by between three and eight percent. By Swedish standards this may sound a bit extreme, but there is no doubt about the benefits of a well-tuned alarm system. This project can be seen as an extension of 'General Methods for Alarm Reduction' (VARMEFORSK--835), where the process improvements were the result of suggestions tailored for every signal. Here, causal dependences in the process are examined instead. A method for this, specially designed to fit process signals, has been developed. It is called MLPC (Multiple Local Property Correlation) and can be seen as an unprejudiced way of increasing the information value in the process. There are a number of ways to make use of the additional process understanding that a correlation analysis provides. Some are mentioned in the report, foremost those aiming to improve the alarm situation for operators. Signals from two heating plants have been analyzed with MLPC. In simulations based on the results of these analyses, a large number of alarms have been successfully suppressed. The results have been studied by personnel with process knowledge, who are very positive about the use of MLPC and point to many benefits from the clarification of process relations. It was established in 'General Methods for Alarm Reduction' that low-pass filters are superior to mean-value filters and time delays when trying to suppress alarms. As a result, a module for signal processing has been developed. The main purpose is
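
    The MLPC method itself is not specified in this summary; a much simpler correlation-based screen for redundant (suppressible) alarms can be sketched on synthetic binary alarm series. The threshold and the data below are assumptions for illustration, not part of MLPC:

```python
import numpy as np

def redundant_alarm_pairs(alarms, threshold=0.8):
    """Flag alarm pairs whose binary activation series are strongly correlated;
    such pairs are candidates for suppression (keep one, hide the other)."""
    rho = np.corrcoef(alarms)
    n = rho.shape[0]
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if rho[i, j] > threshold]

rng = np.random.default_rng(5)
primary = (rng.random(1000) < 0.05).astype(float)  # primary alarm, ~5% of samples
alarms = np.vstack([
    primary,
    primary,                                   # consequential alarm: fires with the primary
    (rng.random(1000) < 0.05).astype(float),   # unrelated alarm
])
pairs = redundant_alarm_pairs(alarms)
```

    A causal method like MLPC goes further by localizing the dependence in time and operating regime, but even this pairwise screen conveys how correlation analysis can thin an alarm flood.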

  8. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui

    2013-01-01

    We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and the LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained.
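
    The LZC half of the comparison can be sketched with an LZ76-style parser on a binarized series, normalized as C = c(n)·log2(n)/n (the DFA exponent α is estimated analogously from the slope of log F(s) versus log s). This is a simplified parsing variant, assuming the series has already been binarized:

```python
import numpy as np

def lempel_ziv_complexity(bits):
    """Normalized Lempel-Ziv complexity C of a 0/1 sequence (LZ76-style parsing):
    count the phrases c(n) in an exhaustive parse, then normalize by n / log2(n)."""
    s = ''.join(str(int(b)) for b in bits)
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the current phrase while it already occurs in the preceding text
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c * np.log2(n) / n

rng = np.random.default_rng(6)
noise = (rng.random(1000) > 0.5).astype(int)  # random bits: high complexity
periodic = np.tile([0, 1], 500)               # periodic bits: low complexity
c_noise = lempel_ziv_complexity(noise)
c_periodic = lempel_ziv_complexity(periodic)
```

    As the abstract reports, adding a random component drives both α and C upward, with C reacting more sharply to nonlinear fluctuation than α.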

  9. Linear analysis of degree correlations in complex networks

    Indian Academy of Sciences (India)

    Many real-world networks such as the protein–protein interaction networks and metabolic networks often display nontrivial correlations between degrees of vertices connected by edges. Here, we analyse the statistical methods usually used to describe the degree correlation in the networks, and analytically give linear ...

  10. Mutational analysis and clinical correlation of metastatic colorectal cancer.

    Science.gov (United States)

    Russo, Andrea L; Borger, Darrell R; Szymonifka, Jackie; Ryan, David P; Wo, Jennifer Y; Blaszkowsky, Lawrence S; Kwak, Eunice L; Allen, Jill N; Wadlow, Raymond C; Zhu, Andrew X; Murphy, Janet E; Faris, Jason E; Dias-Santagata, Dora; Haigis, Kevin M; Ellisen, Leif W; Iafrate, Anthony J; Hong, Theodore S

    2014-05-15

    Early identification of mutations may guide patients with metastatic colorectal cancer toward targeted therapies that may be life prolonging. The authors assessed tumor genotype correlations with clinical characteristics to determine whether mutational profiling can account for clinical similarities, differences, and outcomes. Under Institutional Review Board approval, 222 patients with metastatic colon adenocarcinoma (n = 158) and rectal adenocarcinoma (n = 64) who underwent clinical tumor genotyping were reviewed. Multiplexed tumor genotyping screened for >150 mutations across 15 commonly mutated cancer genes. The chi-square test was used to assess genotype frequency by tumor site and additional clinical characteristics. Cox multivariate analysis was used to assess the impact of genotype on overall survival. Broad-based tumor genotyping revealed clinical and anatomic differences that could be linked to gene mutations. NRAS mutations were associated with rectal cancer versus colon cancer (12.5% vs 0.6%; P colon cancer (13% vs 3%; P = .024) and older age (15.8% vs 4.6%; P = .006). TP53 mutations were associated with rectal cancer (30% vs 18%; P = .048), younger age (14% vs 28.7%; P = .007), and men (26.4% vs 14%; P = .03). Lung metastases were associated with PIK3CA mutations (23% vs 8.7%; P = .004). Only mutations in BRAF were independently associated with decreased overall survival (hazard ratio, 2.4; 95% confidence interval, 1.09-5.27; P = .029). The current study suggests that underlying molecular profiles can differ between colon and rectal cancers. Further investigation is warranted to assess whether the differences identified are important in determining the optimal treatment course for these patients. © 2014 American Cancer Society.

  11. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    Science.gov (United States)

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    Perhaps no other pair of variables in ecology has generated as much discussion as species richness and ecosystem productivity, as illustrated by the reactions by Pierce (2013) and others to Adler et al.'s (2011) report that empirical patterns are weak and inconsistent. Adler et al. (2011) argued we need to move beyond a focus on simplistic bivariate relationships and test mechanistic, multivariate causal hypotheses. We feel the continuing debate over productivity–richness relationships (PRRs) provides a focused context for illustrating the fundamental difficulties of using bivariate relationships to gain scientific understanding.

  12. Random matrix theory analysis of cross-correlations in the US stock market: Evidence from Pearson’s correlation coefficient and detrended cross-correlation coefficient

    Science.gov (United States)

    Wang, Gang-Jin; Xie, Chi; Chen, Shou; Yang, Jiao-Jiao; Yang, Ming-Yan

    2013-09-01

In this study, we first build two empirical cross-correlation matrices in the US stock market by two different methods, namely Pearson’s correlation coefficient and the detrended cross-correlation coefficient (DCCA coefficient). Then, combining the two matrices with the method of random matrix theory (RMT), we mainly investigate the statistical properties of cross-correlations in the US stock market. We choose the daily closing prices of 462 constituent stocks of the S&P 500 index as the research objects and select the sample data from January 3, 2005 to August 31, 2012. In the empirical analysis, we examine the statistical properties of cross-correlation coefficients, the distribution of eigenvalues, the distribution of eigenvector components, and the inverse participation ratio. From the two methods, we find some new results on the cross-correlations in the US stock market that differ from the conclusions of previous studies. The empirical cross-correlation matrices constructed by the DCCA coefficient show several interesting properties at different time scales in the US stock market, which are useful for risk management and optimal portfolio selection, especially for diversifying the asset portfolio. Finding the theoretical eigenvalue distribution of a completely random matrix R for the DCCA coefficient remains an interesting and meaningful problem, because it does not obey the Marčenko-Pastur distribution.
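
The DCCA coefficient used to build the second matrix is commonly defined (following Zebende, 2011) as the detrended covariance normalized by the two DFA fluctuation functions. A minimal single-scale sketch, not the authors' implementation:

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA at window size s:
    detrended covariance of the two integrated profiles, normalized by
    their individual DFA fluctuation functions."""
    X = np.cumsum(np.asarray(x, float) - np.mean(x))
    Y = np.cumsum(np.asarray(y, float) - np.mean(y))
    t = np.arange(s)
    nseg = len(X) // s
    f2x = f2y = f2xy = 0.0
    for i in range(nseg):
        xs, ys = X[i * s:(i + 1) * s], Y[i * s:(i + 1) * s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # local linear detrend
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += np.mean(rx * rx)
        f2y += np.mean(ry * ry)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)
```

By construction the coefficient is 1 for identical series and fluctuates around 0 for independent ones; evaluating it at several window sizes s gives the scale dependence the abstract refers to.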

  13. Correlation Function Analysis of Fiber Networks: Implications for Thermal Conductivity

    Science.gov (United States)

    Martinez-Garcia, Jorge; Braginsky, Leonid; Shklover, Valery; Lawson, John W.

    2011-01-01

The heat transport in highly porous fiber structures is investigated. The fibers are assumed to be thin but long, so that the number of inter-fiber connections along each fiber is large. We show that the effective conductivity of such structures can be found from the correlation length of the two-point correlation function of the local conductivities. Estimation of the parameters determining the conductivity from 2D images of the structures is also analyzed.

  14. Approximation generation for correlations in thermal-hydraulic analysis codes

    International Nuclear Information System (INIS)

    Pereira, Luiz C.M.; Carmo, Eduardo G.D. do

    1997-01-01

A fast and precise evaluation of fluid thermodynamic and transport properties is needed for the efficient simulation of the mass, energy and momentum transport phenomena related to nuclear plant power generation. A fully automatic code capable of generating suitable approximations for correlations with one or two independent variables is presented. Comparison with the original correlations currently in use, in terms of access speed and precision, shows the adequacy of the approximations obtained. (author). 4 refs., 8 figs., 1 tab

  15. Comprehensive analysis of electron correlations in three-electron atoms

    International Nuclear Information System (INIS)

    Morishita, T.; Lin, C.D.

    1999-01-01

We study the electron correlations in singly, doubly, and triply excited states of a three-electron atom. While electron correlation in general is weak for singly excited states, correlation plays major roles in determining the characteristics of doubly and triply excited states. Using the adiabatic approximation in hyperspherical coordinates, we show that the distinction between singly, doubly, and triply excited states is determined by the radial correlations, while finer distinctions within doubly or triply excited states lie in the angular correlations. Partial projections of the body-fixed frame wave functions are used to demonstrate the characteristic nodal surfaces which provide clues to the energy ordering of the states. We show that doubly excited states of a three-electron atom exhibit correlations that are similar to those of the doubly excited states of a two-electron atom. For the triply excited states, we show that the motion of the three electrons resembles approximately that of a symmetric top. copyright 1999 The American Physical Society

  16. Bivariate least squares linear regression: Towards a unified analytic formalism. I. Functional models

    Science.gov (United States)

    Caimmi, R.

    2011-08-01

    Concerning bivariate least squares linear regression, the classical approach pursued for functional models in earlier attempts ( York, 1966, 1969) is reviewed using a new formalism in terms of deviation (matrix) traces which, for unweighted data, reduce to usual quantities leaving aside an unessential (but dimensional) multiplicative factor. Within the framework of classical error models, the dependent variable relates to the independent variable according to the usual additive model. The classes of linear models considered are regression lines in the general case of correlated errors in X and in Y for weighted data, and in the opposite limiting situations of (i) uncorrelated errors in X and in Y, and (ii) completely correlated errors in X and in Y. The special case of (C) generalized orthogonal regression is considered in detail together with well known subcases, namely: (Y) errors in X negligible (ideally null) with respect to errors in Y; (X) errors in Y negligible (ideally null) with respect to errors in X; (O) genuine orthogonal regression; (R) reduced major-axis regression. In the limit of unweighted data, the results determined for functional models are compared with their counterparts related to extreme structural models i.e. the instrumental scatter is negligible (ideally null) with respect to the intrinsic scatter ( Isobe et al., 1990; Feigelson and Babu, 1992). While regression line slope and intercept estimators for functional and structural models necessarily coincide, the contrary holds for related variance estimators even if the residuals obey a Gaussian distribution, with the exception of Y models. An example of astronomical application is considered, concerning the [O/H]-[Fe/H] empirical relations deduced from five samples related to different stars and/or different methods of oxygen abundance determination. For selected samples and assigned methods, different regression models yield consistent results within the errors (∓ σ) for both
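
For the functional models surveyed in this record, the generalized orthogonal (Deming) regression slope has a closed form in terms of the sample (co)variances. The sketch below is a minimal unweighted version, not Caimmi's trace formalism; the error-variance ratio `delta` is an assumed input, with `delta = 1` giving genuine orthogonal regression.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Errors-in-variables straight-line fit.  delta is the assumed ratio
    var(eps_y)/var(eps_x) of the error variances: delta = 1 gives genuine
    orthogonal regression, delta -> infinity recovers OLS of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.mean((x - x.mean()) ** 2)
    syy = np.mean((y - y.mean()) ** 2)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    slope = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
             ) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept
```

The reduced major-axis subcase (R) mentioned above corresponds instead to slope = sign(sxy) * sqrt(syy/sxx), i.e. the geometric mean of the two ordinary regression slopes.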

  17. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    Science.gov (United States)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.

  18. Analysis of factors correlating with medical radiological examination frequencies

    International Nuclear Information System (INIS)

    Jahnen, A.; Jaervinen, H.; Bly, R.; Olerud, H.; Vassilieva, J.; Vogiatzi, S.; Shannoun, F.

    2015-01-01

The European Commission (EC) funded project Dose Datamed 2 (DDM2) had two objectives: to collect available data on patient doses from radiodiagnostic procedures (X-ray and nuclear medicine) in Europe, and to facilitate the implementation of the Radiation Protection 154 Guidelines (RP154). Besides the collection of frequency and dose data, two questionnaires were issued to gather information about medical radiological imaging. This article analyses possible correlations between the collected frequency data, selected variables from the results of the detailed questionnaire and national economic data. Based on a dataset of 35 countries, there is no correlation between the gross domestic product (GDP) and the total number of X-ray examinations in a country. However, there is a significant correlation (p < 0.01) between the GDP and the overall CT examination frequency. High-income countries perform more CT examinations per inhabitant, which suggests that planar X-ray examinations are being replaced by CT examinations. (authors)

  19. Thermodynamic correlations for the accident analysis of HTR's

    International Nuclear Information System (INIS)

    Rehm, W.; Jahn, W.; Finken, R.

    1976-12-01

The thermal properties of helium and, for the case of a depressurized primary circuit, of various mixtures of primary cooling gas were taken into consideration. The temperature dependence of the correlations for the thermal properties of the graphite components in the core and of the structural materials in the primary circuit is extrapolated beyond normal operating conditions. Furthermore, the correlations for the effective thermal conductivity, the heat transfer and the pressure drop are described for pebble-bed HTRs. In addition, some important heat transfer data of the steam generator are included. With these correlations, accident sequences with failure of the afterheat removal systems, for example, are discussed for pebble-bed HTRs. It is concluded that the transient temperature behaviour demonstrates the inherent safety features of the HTR in extreme accidents. (orig.) [de

  20. Semi-automated detection of aberrant chromosomes in bivariate flow karyotypes

    NARCIS (Netherlands)

    Boschman, G. A.; Manders, E. M.; Rens, W.; Slater, R.; Aten, J. A.

    1992-01-01

    A method is described that is designed to compare, in a standardized procedure, bivariate flow karyotypes of Hoechst 33258 (HO)/Chromomycin A3 (CA) stained human chromosomes from cells with aberrations with a reference flow karyotype of normal chromosomes. In addition to uniform normalization of

  1. Carbon and oxygen isotopic ratio bi-variate distribution for marble artifacts quarry assignment

    International Nuclear Information System (INIS)

    Pentia, M.

    1995-01-01

A statistical description of the ¹³C/¹²C and ¹⁸O/¹⁶O isotopic ratios in ancient marble quarries by a Gaussian bi-variate probability distribution is given, and a new method for obtaining confidence-level quarry assignments for marble artifacts is presented. (author) 8 figs., 3 tabs., 4 refs
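
One standard way to realize such a confidence-level assignment (a generic sketch, not the author's exact procedure) is to fit a bivariate Gaussian to each quarry's isotopic data and score an artifact by its squared Mahalanobis distance, which for two degrees of freedom yields the closed-form tail probability exp(-d²/2):

```python
import math
import numpy as np

def quarry_confidence(sample, quarry_points):
    """Squared Mahalanobis distance of an artifact's (delta13C, delta18O)
    pair from a quarry's fitted bivariate Gaussian, plus the confidence
    level P(chi2_2 > d2) = exp(-d2 / 2) for 2 degrees of freedom."""
    quarry_points = np.asarray(quarry_points, float)
    mu = quarry_points.mean(axis=0)
    cov = np.cov(quarry_points, rowvar=False)
    diff = np.asarray(sample, float) - mu
    d2 = float(diff @ np.linalg.inv(cov) @ diff)
    return d2, math.exp(-d2 / 2.0)
```

An artifact can then be tentatively assigned to the quarry giving the highest confidence level, with low values for all quarries flagging an unknown provenance.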

  2. Technical note: Towards a continuous classification of climate using bivariate colour mapping

    NARCIS (Netherlands)

    Teuling, A.J.

    2011-01-01

Climate is often defined in terms of discrete classes. Here I use bivariate colour mapping to show that the global distribution of Köppen-Geiger climate classes can largely be reproduced by combining the simple means of two key states of the climate system (i.e., air temperature and relative

  3. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    Science.gov (United States)

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K; Schad, Lothar R; Zöllner, Frank Gerrit

    2015-01-01

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.
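
The deconvolution step described above can be sketched generically with the Beer-Lambert model: stain concentrations are recovered by least squares in optical-density space and re-rendered with new color vectors. This is a minimal illustration, not the authors' pipeline; the stain vectors are the commonly quoted Ruifrok-Johnston values and should be treated as assumptions.

```python
import numpy as np

# Optical-density stain vectors (commonly quoted Ruifrok-Johnston values,
# used here purely for illustration).
HEMATOXYLIN = np.array([0.650, 0.704, 0.286])
DAB = np.array([0.268, 0.570, 0.776])

def deconvolve(rgb, stains):
    """Estimate per-stain concentrations of one RGB pixel by least squares
    in optical-density space (Beer-Lambert law)."""
    od = -np.log10(np.maximum(np.asarray(rgb, float), 1.0) / 255.0)
    M = np.vstack([s / np.linalg.norm(s) for s in stains])
    conc, *_ = np.linalg.lstsq(M.T, od, rcond=None)
    return conc

def restain(conc, new_colors):
    """Re-render the concentrations with a new, more distinguishable
    bivariate color map (one OD vector per stain)."""
    od = conc @ np.vstack([c / np.linalg.norm(c) for c in new_colors])
    return 255.0 * 10.0 ** (-od)
```

Digitally re-staining then amounts to `restain(deconvolve(pixel, old_stains), new_colors)` per pixel, with the new color pair chosen to maximize perceptual contrast.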

  5. A view on coupled cluster perturbation theory using a bivariational Lagrangian formulation.

    Science.gov (United States)

    Kristensen, Kasper; Eriksen, Janus J; Matthews, Devin A; Olsen, Jeppe; Jørgensen, Poul

    2016-02-14

    We consider two distinct coupled cluster (CC) perturbation series that both expand the difference between the energies of the CCSD (CC with single and double excitations) and CCSDT (CC with single, double, and triple excitations) models in orders of the Møller-Plesset fluctuation potential. We initially introduce the E-CCSD(T-n) series, in which the CCSD amplitude equations are satisfied at the expansion point, and compare it to the recently developed CCSD(T-n) series [J. J. Eriksen et al., J. Chem. Phys. 140, 064108 (2014)], in which not only the CCSD amplitude, but also the CCSD multiplier equations are satisfied at the expansion point. The computational scaling is similar for the two series, and both are term-wise size extensive with a formal convergence towards the CCSDT target energy. However, the two series are different, and the CCSD(T-n) series is found to exhibit a more rapid convergence up through the series, which we trace back to the fact that more information at the expansion point is utilized than for the E-CCSD(T-n) series. The present analysis can be generalized to any perturbation expansion representing the difference between a parent CC model and a higher-level target CC model. In general, we demonstrate that, whenever the parent parameters depend upon the perturbation operator, a perturbation expansion of the CC energy (where only parent amplitudes are used) differs from a perturbation expansion of the CC Lagrangian (where both parent amplitudes and parent multipliers are used). For the latter case, the bivariational Lagrangian formulation becomes more than a convenient mathematical tool, since it facilitates a different and faster convergent perturbation series than the simpler energy-based expansion.

  6. Variability, correlation and path coefficient analysis of seedling traits ...

    African Journals Online (AJOL)

    2011-12-12

    Dec 12, 2011 ... Indirect selection is a useful means for improving yield in cotton crop. The objective of the present study was to determine the genetic variability, broad sense heritability, genetic advance and correlation among the six seedling traits and their direct and indirect effects on cotton yield by using path coefficient ...

  7. Correlation of energy balance method to dynamic pipe rupture analysis

    International Nuclear Information System (INIS)

    Kuo, H.H.; Durkee, M.

    1983-01-01

When using an energy balance approach in the design of pipe rupture restraints for nuclear power plants, the NRC specifies in its Standard Review Plan 3.6.2 that the input energy to the system must be multiplied by a factor of 1.1 unless a lower value can be justified. Since the energy balance method is already quite conservative, an across-the-board use of 1.1 to amplify the energy input appears unnecessary. The paper's purpose is to show that this 'correlation factor' could be substantially less than unity if certain design parameters are met. In this paper, results of nonlinear dynamic analyses were compared to the results of the corresponding analyses based on the energy balance method, which assumes constant blowdown forces and rigid-plastic material properties. The appropriate correlation factors required to match the energy balance results with the dynamic analysis results were correlated to design parameters such as restraint location from the break, yield strength of the energy-absorbing component, and the restraint gap. It is shown that the correlation factor is related to a single nondimensional design parameter and can be limited to a value below unity if appropriate design parameters are chosen. It is also shown that the deformation of the restraints can be related to dimensionless system parameters. This, therefore, allows the maximum restraint deformation to be evaluated directly for design purposes. (orig.)

  8. Correlation Analysis of some Growth, Yield, Yield Components and ...

    African Journals Online (AJOL)

three critical growth stages which was imposed by withholding water (at ... November, 5th December, 19th December and 2nd January) laid out in a split ... Simple correlation coefficient (r) of different crop parameters and grain yield ... The husk bran and germ are rich sources of ..... heat in 2009/2010 dry season at Fadama ...

  9. Linear analysis of degree correlations in complex networks

    Indian Academy of Sciences (India)

    2016-11-02

    Nov 2, 2016 ... 4College of Science, Qi Lu University of Technology, Jinan 250353, Shandong, China ... cal methods used usually to describe the degree correlation in the ... Most social networks show assorta- .... a clear but only qualitative description of the degree ... is difficult to give quantitative relation between DCC.

  10. Correlational Analysis of Servant Leadership and School Climate

    Science.gov (United States)

    Black, Glenda Lee

    2010-01-01

    The purpose of this mixed-method research study was to determine the extent that servant leadership was correlated with perceptions of school climate to identify whether there was a relationship between principals' and teachers' perceived practice of servant leadership and of school climate. The study employed a mixed-method approach by first…

  11. Analysis of Current HT9 Creep Correlations and Modification

    International Nuclear Information System (INIS)

    Lee, Cheol Min; Sohn, Dongseong; Cheon, Jin Sik

    2014-01-01

HT9 has high thermal conductivity, high mechanical strength and low irradiation-induced swelling. However, high-temperature creep of HT9 has always been a life-limiting factor. Above 600 °C, the dislocation density in HT9 decreases and the M23C6 precipitates coarsen; these processes are accelerated under irradiation. Ultimately, microstructural changes at high temperature lead to lower creep strength and large creep strain. For HT9 to be used as a future cladding, its creep behavior should be predicted accurately based on a physical understanding of the creep phenomenon. Most creep correlations are composed of irradiation creep and thermal creep terms. However, it is certain that in-pile thermal creep and out-of-pile thermal creep differ because of the microstructural changes induced by neutron irradiation. To describe creep behavior more accurately, thermal creep contributions should be discriminated from those of neutron irradiation in a creep correlation. To perform this work, existing HT9 creep correlations are analyzed, and the results are used to develop a more accurate thermal creep correlation. Then, the differences between in-pile thermal creep and out-of-pile thermal creep are examined

  12. Approximate models for the analysis of laser velocimetry correlation functions

    International Nuclear Information System (INIS)

    Robinson, D.P.

    1981-01-01

    Velocity distributions in the subchannels of an eleven pin test section representing a slice through a Fast Reactor sub-assembly were measured with a dual beam laser velocimeter system using a Malvern K 7023 digital photon correlator for signal processing. Two techniques were used for data reduction of the correlation function to obtain velocity and turbulence values. Whilst both techniques were in excellent agreement on the velocity, marked discrepancies were apparent in the turbulence levels. As a consequence of this the turbulence data were not reported. Subsequent investigation has shown that the approximate technique used as the basis of Malvern's Data Processor 7023V is restricted in its range of application. In this note alternative approximate models are described and evaluated. The objective of this investigation was to develop an approximate model which could be used for on-line determination of the turbulence level. (author)

  13. Groundwater travel time uncertainty analysis. Sensitivity of results to model geometry, and correlations and cross correlations among input parameters

    International Nuclear Information System (INIS)

    Clifton, P.M.

    1985-03-01

    This study examines the sensitivity of the travel time distribution predicted by a reference case model to (1) scale of representation of the model parameters, (2) size of the model domain, (3) correlation range of log-transmissivity, and (4) cross correlations between transmissivity and effective thickness. The basis for the reference model is the preliminary stochastic travel time model previously documented by the Basalt Waste Isolation Project. Results of this study show the following. The variability of the predicted travel times can be adequately represented when the ratio between the size of the zones used to represent the model parameters and the log-transmissivity correlation range is less than about one-fifth. The size of the model domain and the types of boundary conditions can have a strong impact on the distribution of travel times. Longer log-transmissivity correlation ranges cause larger variability in the predicted travel times. Positive cross correlation between transmissivity and effective thickness causes a decrease in the travel time variability. These results demonstrate the need for a sound conceptual model prior to conducting a stochastic travel time analysis
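
The study's stochastic treatment rests on spatially correlated log-transmissivity fields. One standard way to draw such fields (a generic sketch, not the BWIP code; the exponential covariance model and 1-D geometry are assumptions here) is through a Cholesky factor of the covariance matrix:

```python
import numpy as np

def log_transmissivity_fields(n_cells, dx, corr_range, sigma, n_real, seed=0):
    """Realizations of a 1-D Gaussian log-transmissivity field with an
    exponential covariance C(h) = sigma^2 * exp(-|h| / corr_range),
    drawn through a Cholesky factor of the covariance matrix."""
    h = np.abs(np.subtract.outer(np.arange(n_cells), np.arange(n_cells))) * dx
    cov = sigma ** 2 * np.exp(-h / corr_range)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells))  # jitter for stability
    rng = np.random.default_rng(seed)
    return rng.standard_normal((n_real, n_cells)) @ L.T    # one field per row
```

Longer `corr_range` values make neighbouring cells more alike and, as the abstract reports, increase the variability of downstream quantities such as predicted travel times.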

  14. Dynamics of market correlations: taxonomy and portfolio analysis.

    Science.gov (United States)

    Onnela, J-P; Chakraborti, A; Kaski, K; Kertész, J; Kanto, A

    2003-11-01

    The time dependence of the recently introduced minimum spanning tree description of correlations between stocks, called the "asset tree" has been studied in order to reflect the financial market taxonomy. The nodes of the tree are identified with stocks and the distance between them is a unique function of the corresponding element of the correlation matrix. By using the concept of a central vertex, chosen as the most strongly connected node of the tree, an important characteristic is defined by the mean occupation layer. During crashes, due to the strong global correlation in the market, the tree shrinks topologically, and this is shown by a low value of the mean occupation layer. The tree seems to have a scale-free structure where the scaling exponent of the degree distribution is different for "business as usual" and "crash" periods. The basic structure of the tree topology is very robust with respect to time. We also point out that the diversification aspect of portfolio optimization results in the fact that the assets of the classic Markowitz portfolio are always located on the outer leaves of the tree. Technical aspects such as the window size dependence of the investigated quantities are also discussed.
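
The asset tree described here can be sketched generically: correlations are mapped to distances with Mantegna's metric d = sqrt(2(1 - ρ)), a unique function of the correlation coefficient as the abstract notes, and the minimum spanning tree is extracted. A minimal Prim's-algorithm version, not the authors' code:

```python
import numpy as np

def asset_tree_edges(corr):
    """Minimum spanning tree of the asset-tree distance matrix
    d_ij = sqrt(2 * (1 - rho_ij)), built with Prim's algorithm.
    Returns a list of (i, j) edges."""
    d = np.sqrt(2.0 * (1.0 - np.asarray(corr, float)))
    n = len(d)
    in_tree = [0]
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:                       # scan tree-to-outside edges
            for j in range(n):
                if j not in in_tree and (
                        best is None or d[i, j] < d[best[0], best[1]]):
                    best = (i, j)
        edges.append(best)
        in_tree.append(best[1])
    return edges
```

Repeating this on a sliding correlation window gives the time-dependent tree whose mean occupation layer shrinks during crashes.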

  15. Comparison and Correlation Analysis of Different Swine Breeds Meat Quality

    Directory of Open Access Journals (Sweden)

    Y. X. Li

    2013-07-01

Full Text Available This study was performed to determine the influence of pig breed and gender on the ultimate pH and physicochemical properties of pork. The correlations between pH and pork quality traits directly related to carcass grade and consumer preference were also evaluated. The pH and meat grading scores for cold carcasses of 215 purebred pigs (Duroc, Landrace, and Yorkshire) from four different farms were obtained. Meat quality parameters of the pork loin were analyzed. Duroc and female animals were more affected compared to other breeds and male pigs. Duroc animals had the highest ultimate pH, carcass back fat thickness, marbling scores, yellowness, and fat content (p<0.05). Landrace pigs had the highest color lightness and cooking loss values (p<0.05). Among all trait parameters, marbling scores showed the highest significant differences when evaluating the impact of breed and gender on meat quality characteristics (p<0.001). Ultimate pH was positively correlated with carcass weight (0.20), back fat thickness (0.19), marbling score (0.17), and color score (0.16), while negatively correlated with cooking loss (−0.24) and shear force (−0.20). Therefore, pork samples with lower ultimate pH had lower cooking loss, higher lightness, and higher shear force values irrespective of breed.

  17. Correlation analysis on alpha attenuation and nasal skin temperature

    International Nuclear Information System (INIS)

    Nozawa, Akio; Tacano, Munecazu

    2009-01-01

    Some serious accidents caused by declines in arousal level, such as traffic accidents and mechanical control mistakes, have become issues of social concern. The physiological index obtained by human body measurement is expected to offer a leading tool for evaluating arousal level as an objective indicator. In this study, declines in temporal arousal levels were evaluated by nasal skin temperature. As arousal level declines, sympathetic nervous activity is decreased and blood flow in peripheral vessels is increased. Since peripheral vessels exist just under the skin on the fingers and nose, the psychophysiological state can be judged from the displacement of skin temperature caused by changing blood flow volume. Declining arousal level is expected to be observable as a temperature rise in peripheral parts of the body. The objective of this experiment was to obtain assessment criteria for judging declines in arousal level by nasal skin temperature using the alpha attenuation coefficient (AAC) of electroencephalography (EEG) as a reference benchmark. Furthermore, a psychophysical index of sleepiness was also measured using a visual analogue scale (VAS). Correlations between nasal skin temperature index and EEG index were analyzed. AAC and maximum displacement of nasal skin temperature displayed a clear negative correlation, with a correlation coefficient of −0.55.

  18. Feynman-α correlation analysis by prompt-photon detection

    International Nuclear Information System (INIS)

    Hashimoto, Kengo; Yamada, Sumasu; Hasegawa, Yasuhiro; Horiguchi, Tetsuo

    1998-01-01

    Two-detector Feynman-α measurements were carried out using the UTR-KINKI reactor, a light-water-moderated and graphite-reflected reactor, by detecting high-energy, prompt gamma rays. For comparison, the conventional measurements by detecting neutrons were also performed. These measurements were carried out in the subcriticality range from 0 to $1.8. The gate-time dependence of the variance- and covariance-to-mean ratios measured by gamma-ray detection was nearly identical to that obtained using standard neutron-detection techniques. Consequently, the prompt-neutron decay constants inferred from the gamma-ray correlation data agreed with those from the neutron data. Furthermore, the correlated-to-uncorrelated amplitude ratios obtained by gamma-ray detection significantly depended on the low-energy discriminator level of the single-channel analyzer. The discriminator level was determined as optimum for obtaining a maximum value of the amplitude ratio. The maximum amplitude ratio was much larger than that obtained by neutron detection. The subcriticality dependence of the decay constant obtained by gamma-ray detection was consistent with that obtained by neutron detection and followed the linear relation based on the one-point kinetic model in the vicinity of delayed critical. These experimental results suggest that the gamma-ray correlation technique can be applied to measure reactor kinetic parameters more efficiently.
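
    The Feynman-α (variance-to-mean) statistic underlying these measurements can be illustrated on a synthetic pulse train. For a pure Poisson source there is no correlated component, so Y(T) = Var/Mean − 1 stays near zero at every gate width; in a subcritical assembly the correlated fission chains make Y(T) grow with T. The count rate and duration below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical detector pulse train: Poisson arrivals carry no correlated
# component, so the Feynman-Y should be near zero for all gate widths.
t_total = 1000.0          # measurement time, s
rate = 50.0               # mean count rate, counts/s
n = rng.poisson(rate * t_total)
arrivals = np.sort(rng.uniform(0.0, t_total, n))

def feynman_y(times, gate):
    """Variance-to-mean ratio minus one for counts binned into gates of width `gate`."""
    counts, _ = np.histogram(times, bins=np.arange(0.0, t_total + gate, gate))
    return counts.var() / counts.mean() - 1.0

for gate in (0.01, 0.1, 1.0):
    print(gate, feynman_y(arrivals, gate))
```

    In an actual measurement the curve Y(T) is fitted to the one-point-kinetics form to extract the prompt-neutron decay constant.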

  19. Strong anticipation and long-range cross-correlation: Application of detrended cross-correlation analysis to human behavioral data

    Science.gov (United States)

    Delignières, Didier; Marmelat, Vivien

    2014-01-01

    In this paper, we analyze empirical data, accounting for coordination processes between complex systems (bimanual coordination, interpersonal coordination, and synchronization with a fractal metronome), by using a recently proposed method: detrended cross-correlation analysis (DCCA). This work is motivated by the strong anticipation hypothesis, which supposes that coordination between complex systems is not achieved on the basis of local adaptations (i.e., correction, predictions), but results from a more global matching of complexity properties. Indeed, recent experiments have evidenced a very close correlation between the scaling properties of the series produced by two coordinated systems, despite a quite weak local synchronization. We hypothesized that strong anticipation should result in the presence of long-range cross-correlations between the series produced by the two systems. Results allow a detailed analysis of the effects of coordination on the fluctuations of the series produced by the two systems. In the long term, series tend to present similar scaling properties, with clear evidence of long-range cross-correlation. Short-term results strongly depend on the nature of the task. Simulation studies allow disentangling the respective effects of noise and short-term coupling processes on DCCA results, and suggest that the matching of long-term fluctuations could be the result of short-term coupling processes.
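
    A minimal sketch of the DCCA fluctuation function referenced above: integrate both series, detrend linearly within non-overlapping boxes, and average the covariance of the residuals. The coupled series, coupling strength, and box sizes are synthetic and illustrative only.

```python
import numpy as np

def dcca_fluctuation(x, y, scale):
    """Detrended cross-covariance fluctuation for one box size `scale`."""
    X = np.cumsum(x - x.mean())            # integrated profiles
    Y = np.cumsum(y - y.mean())
    n_boxes = len(x) // scale
    t = np.arange(scale)
    cov = []
    for i in range(n_boxes):
        xs = X[i * scale:(i + 1) * scale]
        ys = Y[i * scale:(i + 1) * scale]
        # least-squares linear detrending within the box
        xr = xs - np.polyval(np.polyfit(t, xs, 1), t)
        yr = ys - np.polyval(np.polyfit(t, ys, 1), t)
        cov.append(np.mean(xr * yr))
    return np.mean(cov)

rng = np.random.default_rng(2)
common = rng.normal(size=4096)             # shared fluctuation source
x = common + 0.5 * rng.normal(size=4096)
y = common + 0.5 * rng.normal(size=4096)
for n in (16, 64, 256):
    print(n, dcca_fluctuation(x, y, n))
```

    Plotting the fluctuation against the box size on log-log axes gives the cross-correlation scaling exponent discussed in the paper.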

  20. application of multilinear regression analysis in modeling of soil

    African Journals Online (AJOL)

    Windows User

    Accordingly [1, 3] in their work, they applied linear regression ... (MLRA) is a statistical technique that uses several explanatory ... order to check this, they adopted bivariate correlation analysis .... groups, namely A-1 through A-7, based on their relative expected ..... Multivariate Regression in Gorgan Province North of Iran” ...

  1. Low Carbon-Oriented Optimal Reliability Design with Interval Product Failure Analysis and Grey Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Yixiong Feng

    2017-03-01

    Full Text Available The problem of large amounts of carbon emissions causes wide concern across the world, and it has become a serious threat to the sustainable development of the manufacturing industry. The intensive research into technologies and methodologies for green product design has significant theoretical meaning and practical value in reducing the emissions of the manufacturing industry. Therefore, a low carbon-oriented product reliability optimal design model is proposed in this paper: (1) The related expert evaluation information was prepared in interval numbers; (2) An improved product failure analysis considering the uncertain carbon emissions of the subsystem was performed to obtain the subsystem weight taking the carbon emissions into consideration. The interval grey correlation analysis was conducted to obtain the subsystem weight taking the uncertain correlations inside the product into consideration. Using the above two kinds of subsystem weights and different caution indicators of the decision maker, a series of product reliability design schemes is available; (3) The interval-valued intuitionistic fuzzy sets (IVIFSs) were employed to select the optimal reliability and optimal design scheme based on three attributes, namely, low carbon, correlation and functions, and economic cost. The case study of a vertical CNC lathe proves the superiority and rationality of the proposed method.

  2. Statistical analysis of latent generalized correlation matrix estimation in transelliptical distribution

    OpenAIRE

    Han, Fang; Liu, Han

    2016-01-01

    Correlation matrices play a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions. As a robust alternative, Han and Liu [J. Am. Stat. Assoc. 109 (2015) 275-2...

  3. S-matrix analysis of the baryon electric charge correlation

    Science.gov (United States)

    Lo, Pok Man; Friman, Bengt; Redlich, Krzysztof; Sasaki, Chihiro

    2018-03-01

    We compute the correlation of the net baryon number with the electric charge (χBQ) for an interacting hadron gas using the S-matrix formulation of statistical mechanics. The observable χBQ is particularly sensitive to the details of the pion-nucleon interaction, which are consistently incorporated in the current scheme via the empirical scattering phase shifts. Comparing to the recent lattice QCD studies in the (2 + 1)-flavor system, we find that the natural implementation of interactions and the proper treatment of resonances in the S-matrix approach lead to an improved description of the lattice data over that obtained in the hadron resonance gas model.

  4. The Asian crisis contagion: A dynamic correlation approach analysis

    Directory of Open Access Journals (Sweden)

    Essaadi Essahbi

    2009-01-01

    Full Text Available In this paper we are testing for contagion caused by the Thai baht collapse of July 1997. In line with earlier work, shift-contagion is defined as a structural change within the international propagation mechanisms of financial shocks. We adopt Bai and Perron's (1998) structural break approach in order to detect the endogenous break points of the pair-wise time-varying correlations between Thailand and seven Asian stock market returns. Our approach enables us to solve the misspecification problem of the crisis window. Our results illustrate the existence of shift-contagion in the Asian crisis caused by the crisis in Thailand.

  5. Re-analysis of correlations among four impulsivity scales.

    Science.gov (United States)

    Gallardo-Pujol, David; Andrés-Pueyo, Antonio

    2006-08-01

    Impulsivity plays a key role in normal and pathological behavior. Although there is some consensus about its conceptualization, there have been many attempts to build a multidimensional tool due to the lack of agreement in how to measure it. A recent study claimed support for a three-dimensional structure of impulsivity, however with weak empirical support. By re-analysing those data, a four-factor structure was found to describe the correlation matrix much better. The debate remains open and further research is needed to clarify the factor structure. The desirability of constructing new measures, perhaps analogously to the Wechsler Intelligence Scale, is emphasized.

  6. Correlation Analysis between Nominal and Real Convergence. The Romanian Case

    Directory of Open Access Journals (Sweden)

    Marius-Corneliu Marinas

    2006-05-01

    Full Text Available This study aims to analyze the sources of the correlation between nominal and real convergence, as well as the impact of macroeconomic policies on it. The prospect of euro adoption will impose stricter management of monetary and budgetary policies, which will negatively affect the process of catching up with more advanced economies unless the economy becomes more flexible. Greater flexibility enables a more rapid adjustment of the economy to persistent shocks through policies that expand aggregate supply.

  7. Correlation Analysis between Nominal and Real Convergence. The Romanian Case

    Directory of Open Access Journals (Sweden)

    Marius-Corneliu Marinas

    2006-03-01

    Full Text Available This study aims to analyze the sources of the correlation between nominal and real convergence, as well as the impact of macroeconomic policies on it. The prospect of euro adoption will impose stricter management of monetary and budgetary policies, which will negatively affect the process of catching up with more advanced economies unless the economy becomes more flexible. Greater flexibility enables a more rapid adjustment of the economy to persistent shocks through policies that expand aggregate supply.

  8. Correlation, Regression, and Cointegration of Nonstationary Economic Time Series

    DEFF Research Database (Denmark)

    Johansen, Søren

    Yule (1926) introduced the concept of spurious or nonsense correlation, and showed by simulation that, for some nonstationary processes, the empirical correlations seem not to converge in probability even if the processes are independent. This was later discussed by Granger and Newbold (1974), and Phillips (1986) found the limit distributions. We propose to distinguish between empirical and population correlation coefficients and show, in a bivariate autoregressive model for nonstationary variables, that the empirical correlation and regression coefficients do not converge to the relevant population values, due to the trending nature of the data. We conclude by giving a simple cointegration analysis of two interest rates. The analysis illustrates that much more insight can be gained about the dynamic behavior of nonstationary variables than by simply calculating a correlation coefficient.
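
    The spurious-correlation phenomenon is easy to reproduce numerically: the empirical correlation between two independent random walks does not concentrate around zero as the sample length grows. The simulation below is a simple illustration of that non-convergence.

```python
import numpy as np

rng = np.random.default_rng(3)

def corr_of_independent_walks(T, n_pairs=500):
    """Empirical correlations between n_pairs of independent random walks of length T."""
    x = np.cumsum(rng.normal(size=(n_pairs, T)), axis=1)
    y = np.cumsum(rng.normal(size=(n_pairs, T)), axis=1)
    xc = x - x.mean(axis=1, keepdims=True)
    yc = y - y.mean(axis=1, keepdims=True)
    return (xc * yc).sum(axis=1) / np.sqrt((xc**2).sum(axis=1) * (yc**2).sum(axis=1))

for T in (100, 1000, 10000):
    r = corr_of_independent_walks(T)
    # The spread of the empirical correlations does not shrink as T grows,
    # even though the walks are independent by construction.
    print(T, float(np.std(r)))
```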

  9. [Determination and correlation analysis of trace elements in Boletus tomentipes].

    Science.gov (United States)

    Li, Tao; Wang, Yuan-zhong; Zhang, Ji; Zhao, Yan-li; Liu, Hong-gao

    2011-07-01

    The contents of eleven trace elements in Boletus tomentipes were determined by inductively coupled plasma atomic emission spectroscopy (ICP-AES). The results showed that the fruiting bodies of B. tomentipes were very rich in Mg and Fe (>100 mg/kg) and rich in Mn, Zn and Cu (>10 mg/kg). Cr, Pb, Ni, Cd, and As were relatively minor contents (0.1-10.0 mg/kg) of this species, while Hg occurred at the smallest content (<0.1 mg/kg). Among the determined 11 trace elements, Zn-Cu had significantly positive correlation (r = 0.659, P < 0.05), whereas Hg-As, Ni-Fe, and Zn-Mg had significantly negative correlation (r = -0.672, -0.610, -0.617, P < 0.05). This paper presented the trace element properties of B. tomentipes, and is expected to be useful for exploitation and quality evaluation of this species.
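
    Element-pair correlations with significance levels of this kind come from a standard Pearson test. The sketch below uses synthetic stand-in values, not the paper's measurements; the sample size and concentrations are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical Zn and Cu contents (mg/kg) for 12 samples; values are illustrative only
zn = rng.normal(90.0, 10.0, 12)
cu = 0.3 * zn + rng.normal(0.0, 3.0, 12)

# Pearson correlation coefficient and two-sided p-value
r, p = stats.pearsonr(zn, cu)
print(f"r = {r:.3f}, p = {p:.4f}")
```

    A p-value below 0.05 would correspond to the "significantly positive correlation" wording used in the abstract.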

  10. Partitioning Water Vapor and Carbon Dioxide Fluxes using Correlation Analysis

    Science.gov (United States)

    Scanlon, T. M.

    2008-12-01

    A variety of methods are currently available to partition water vapor fluxes (into components of transpiration and direct evaporation) and carbon dioxide fluxes (into components of photosynthesis and respiration), using chambers, isotopes, and regression modeling approaches. Here, a methodology is presented that accounts for correlations between high-frequency measurements of water vapor (q) and carbon dioxide (c) concentrations being influenced by their non-identical source-sink distributions and the relative magnitude of their constituent fluxes. Flux-variance similarity assumptions are applied separately to the stomatal and the non-stomatal exchange, and the flux components are identified by considering the q-c correlation. Water use efficiency for the vegetation, and how it varies with respect to vapor pressure deficit, is the only input needed for this approach that uses standard eddy covariance measurements. The method is demonstrated using data collected over a corn field throughout a growing season. In particular, the research focuses on the partitioning of the water flux with the aim of improving how direct evaporation is handled in soil-vegetation-atmosphere transfer models over the course of wetting and dry-down cycles.

  11. Auto-correlation analysis of wave heights in the Bay of Bengal

    Indian Academy of Sciences (India)

    Time series observations of significant wave heights in the Bay of Bengal were subjected to auto-correlation analysis to determine temporal variability scale. The analysis indicates an exponential fall of auto-correlation in the first few hours with a decorrelation time scale of about six hours. A similar figure was found earlier ...
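
    A decorrelation time like the roughly six hours reported above can be estimated by locating the lag at which the sample autocorrelation first drops below 1/e. The hourly series below is synthetic, an AR(1) process constructed with a built-in 6 h memory, not observed wave data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic hourly series with ~6 h memory: AR(1) with phi = exp(-1/6)
phi = np.exp(-1.0 / 6.0)
n = 20000
h = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + eps[t]

def autocorr(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Decorrelation time: first lag where autocorrelation falls below 1/e
acf = [autocorr(h, k) for k in range(1, 25)]
tau = next(k for k, r in enumerate(acf, start=1) if r < 1.0 / np.e)
print("decorrelation time approx.", tau, "h")
```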

  12. correlation studies and path coefficient analysis for seed yield

    African Journals Online (AJOL)

    Prof. Adipala Ekwamu

    African Crop Science Journal, Vol. 21, No. 1, pp. 51 - 59 ... Yield being a quantitative trait has complex inheritance, which is ... Analysis for seed yield and yield components in Ethiopian coriander. 53 ..... The financial assistance of Canadian.

  13. CORRELATION ANALYSIS OF THE AUDIT COMMITTEE AND STRUCTURAL INDICATORS

    Directory of Open Access Journals (Sweden)

    FÜLÖP MELINDA TIMEA

    2014-02-01

    Full Text Available The main role of corporate governance is to restore market confidence, and the audit committee plays an important role in this process. The purpose of this case study is to analyze the correlations between the audit committee and structural indicators. To achieve the objectives proposed in this research, we adopt a deductive approach, moving from general to particular aspects and combining quantitative and qualitative studies. Theoretical knowledge is used for a better understanding of the phenomenon rather than for making assumptions. To carry out our study, we selected 25 companies listed on the Berlin Stock Exchange. Following this study, we concluded that the role of the audit committee is crucial.

  14. CORRELATION ANALYSIS OF THE AUDIT COMMITTEE AND PROFITABILITY INDICATORS

    Directory of Open Access Journals (Sweden)

    MELINDA TIMEA FÜLÖP

    2013-10-01

    Full Text Available The main role of corporate governance is to restore market confidence, and the audit committee plays an important role in this process. The purpose of this case study is to analyze the correlations between the audit committee and profitability indicators. To achieve the objectives proposed in this research, we adopt a deductive approach, moving from general to particular aspects and combining quantitative and qualitative studies. Theoretical knowledge is used for a better understanding of the phenomenon rather than for making assumptions. To carry out our study, we selected 25 companies listed on the Berlin Stock Exchange. Following this study, we concluded that the role of the audit committee is crucial.

  15. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    Science.gov (United States)

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, for both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do so, we propose a score-based modification of the Kendall's tau statistic for bivariate interval-censored data. Our modification defines the Kendall's tau statistic with expected numbers of concordant and discordant pairs of data. The performance of the modified approach is illustrated by simulation studies and application to the AIDS study. We compare our method to alternative approaches such as the two-stage estimation method by Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method by Betensky and Finkelstein (Statistics in Medicine, 1999b). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
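
    The score-based modification for interval censoring is not reproduced here; the sketch below shows only the plain concordant/discordant pair counting that defines Kendall's tau for fully observed data, the quantity the paper generalizes by replacing the counts with their expected values.

```python
import numpy as np

def kendall_tau(x, y):
    """Plain Kendall's tau from concordant (c) and discordant (d) pair counts."""
    n = len(x)
    c = d = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
            if s > 0:
                c += 1        # concordant pair: same ordering in x and y
            elif s < 0:
                d += 1        # discordant pair: opposite ordering
    return (c - d) / (n * (n - 1) / 2)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.5, 2.1, 2.9, 4.2, 4.8])
print(kendall_tau(x, y))   # → 1.0 (perfectly concordant)
```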

  16. The effects of common risk factors on stock returns: A detrended cross-correlation analysis

    Science.gov (United States)

    Ruan, Qingsong; Yang, Bingchan

    2017-10-01

    In this paper, we investigate the cross-correlations between the Fama and French three factors and the returns of American industries on the basis of a cross-correlation statistic test and multifractal detrended cross-correlation analysis (MF-DCCA). Qualitatively, we find that the return series of the Fama and French three factors and of American industries are, overall, significantly cross-correlated according to the statistic test. Quantitatively, we find that the cross-correlations between the three factors and the returns of American industries are strongly multifractal, and applying MF-DCCA we also investigate the cross-correlation of industry returns and residuals. We find that multifractality exists in both industry returns and residuals. From the correlation coefficients, we verify that factors other than the Fama and French three factors also influence industry returns.

  17. On minimum divergence adaptation of discrete bivariate distributions to given marginals

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor; van der Meulen, E. C.

    2005-01-01

    Roč. 51, č. 1 (2005), s. 313-320 ISSN 0018-9448 R&D Projects: GA ČR GA201/02/1391; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : approximation of contingency tables * bivariate discrete distributions * minimization of divergences Subject RIV: BD - Theory of Information Impact factor: 2.183, year: 2005

  18. The bivariate probit model of uncomplicated control of tumor: a heuristic exposition of the methodology

    International Nuclear Information System (INIS)

    Herbert, Donald

    1997-01-01

    Purpose: To describe the concept, models, and methods for the construction of estimates of joint probability of uncomplicated control of tumors in radiation oncology. Interpolations using this model can lead to the identification of more efficient treatment regimens for an individual patient. The requirement to find the treatment regimen that will maximize the joint probability of uncomplicated control of tumors suggests a new class of evolutionary experimental designs--Response Surface Methods--for clinical trials in radiation oncology. Methods and Materials: The software developed by Lesaffre and Molenberghs is used to construct bivariate probit models of the joint probability of uncomplicated control of cancer of the oropharynx from a set of 45 patients for each of whom the presence/absence of recurrent tumor (the binary event Ē₁/E₁) and the presence/absence of necrosis (the binary event E₂/Ē₂) of the normal tissues of the target volume is recorded, together with the treatment variables dose, time, and fractionation. Results: The bivariate probit model can be used to select a treatment regime that will give a specified probability, say P(S) = 0.60, of uncomplicated control of tumor by interpolation within a set of treatment regimes with known outcomes of recurrence and necrosis. The bivariate probit model can be used to guide a sequence of clinical trials to find the maximum probability of uncomplicated control of tumor for patients in a given prognostic stratum using Response Surface methods by extrapolation from an initial set of treatment regimens. Conclusions: The design of treatments for individual patients and the design of clinical trials might be improved by use of a bivariate probit model and Response Surface Methods
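
    A minimal sketch of the linkage a bivariate probit expresses: given the marginal probability of tumor control, the marginal probability of no necrosis, and a latent correlation ρ between the two underlying normal variables, the joint probability of uncomplicated control is a bivariate normal CDF. The probabilities and ρ below are illustrative values, not fitted estimates from the 45-patient data set.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def p_uncomplicated_control(p_control, p_no_necrosis, rho):
    """Joint probability of tumor control AND no necrosis under a
    bivariate probit model with latent correlation rho."""
    z1 = norm.ppf(p_control)          # probit thresholds for each marginal
    z2 = norm.ppf(p_no_necrosis)
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z1, z2])

# rho = 0 recovers independence: P(S) = p1 * p2
print(p_uncomplicated_control(0.8, 0.75, 0.0))
# positive latent correlation raises the joint probability P(S)
print(p_uncomplicated_control(0.8, 0.75, 0.4))
```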

  19. Comparison of Six Methods for the Detection of Causality in a Bivariate Time Series

    Czech Academy of Sciences Publication Activity Database

    Krakovská, A.; Jakubík, J.; Chvosteková, M.; Coufal, David; Jajcay, Nikola; Paluš, Milan

    2018-01-01

    Roč. 97, č. 4 (2018), č. článku 042207. ISSN 2470-0045 R&D Projects: GA MZd(CZ) NV15-33250A Institutional support: RVO:67985807 Keywords : comparative study * causality detection * bivariate models * Granger causality * transfer entropy * convergent cross mappings Impact factor: 2.366, year: 2016 https://journals.aps.org/pre/abstract/10.1103/PhysRevE.97.042207

  20. Correlation analysis of milk production traits across three ...

    African Journals Online (AJOL)

    The relationship between milk production traits over whole lactations was evaluated across three generations of Simmental cows (between daughters, dams and granddams) by a correlation analysis with whole lactation traits in the daughter generation being used as the dependent variables (x1), and those in ...

  1. Dominating clasp of the financial sector revealed by partial correlation analysis of the stock market.

    Science.gov (United States)

    Kenett, Dror Y; Tumminello, Michele; Madi, Asaf; Gur-Gershgoren, Gitit; Mantegna, Rosario N; Ben-Jacob, Eshel

    2010-12-20

    What are the dominant stocks which drive the correlations present among stocks traded in a stock market? Can a correlation analysis provide an answer to this question? In the past, correlation based networks have been proposed as a tool to uncover the underlying backbone of the market. Correlation based networks represent the stocks and their relationships, which are then investigated using different network theory methodologies. Here we introduce a new concept to tackle the above question--the partial correlation network. Partial correlation is a measure of how the correlation between two variables, e.g., stock returns, is affected by a third variable. By using it we define a proxy of stock influence, which is then used to construct partial correlation networks. The empirical part of this study is performed on a specific financial system, namely the set of 300 highly capitalized stocks traded at the New York Stock Exchange, in the time period 2001-2003. By constructing the partial correlation network, unlike the case of standard correlation based networks, we find that stocks belonging to the financial sector and, in particular, to the investment services sub-sector, are the most influential stocks affecting the correlation profile of the system. Using a moving window analysis, we find that the strong influence of the financial stocks is conserved across time for the investigated trading period. Our findings shed a new light on the underlying mechanisms and driving forces controlling the correlation profile observed in a financial market.
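
    The partial-correlation idea above can be sketched with the classical closed form ρ(i,j|k) = (ρ_ij − ρ_ik ρ_jk) / sqrt((1 − ρ_ik²)(1 − ρ_jk²)), and the drop from raw to partial correlation serves as an influence proxy in the spirit of the paper. The three synthetic "stocks" below, where one drives the other two, are purely illustrative.

```python
import numpy as np

def partial_corr(rho, i, j, k):
    """Correlation of variables i and j after removing the linear effect of k."""
    num = rho[i, j] - rho[i, k] * rho[j, k]
    den = np.sqrt((1 - rho[i, k] ** 2) * (1 - rho[j, k] ** 2))
    return num / den

rng = np.random.default_rng(6)
# Three synthetic return series: the third drives the other two, so their
# raw correlation largely disappears once its influence is removed.
driver = rng.normal(size=2000)
r0 = 0.8 * driver + rng.normal(size=2000)
r1 = 0.8 * driver + rng.normal(size=2000)
rho = np.corrcoef([r0, r1, driver])

raw = rho[0, 1]
part = partial_corr(rho, 0, 1, 2)
influence = raw - part     # how much of the raw correlation the driver explains
print(raw, part, influence)
```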

  2. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying

    2017-01-18

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.

  3. Robust bivariate error detection in skewed data with application to historical radiosonde winds

    KAUST Repository

    Sun, Ying; Hering, Amanda S.; Browning, Joshua M.

    2017-01-01

    The global historical radiosonde archives date back to the 1920s and contain the only directly observed measurements of temperature, wind, and moisture in the upper atmosphere, but they contain many random errors. Most of the focus on cleaning these large datasets has been on temperatures, but winds are important inputs to climate models and in studies of wind climatology. The bivariate distribution of the wind vector does not have elliptical contours but is skewed and heavy-tailed, so we develop two methods for outlier detection based on the bivariate skew-t (BST) distribution, using either distance-based or contour-based approaches to flag observations as potential outliers. We develop a framework to robustly estimate the parameters of the BST and then show how the tuning parameter to get these estimates is chosen. In simulation, we compare our methods with one based on a bivariate normal distribution and a nonparametric approach based on the bagplot. We then apply all four methods to the winds observed for over 35,000 radiosonde launches at a single station and demonstrate differences in the number of observations flagged across eight pressure levels and through time. In this pilot study, the method based on the BST contours performs very well.
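
    The bivariate skew-t machinery of the paper is beyond a short sketch; shown instead is the simpler distance-based flagging under a bivariate normal assumption, the kind of baseline the authors compare against. The wind vectors and injected gross errors below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic wind vectors (u, v) with a few gross errors injected
winds = rng.multivariate_normal([5.0, 1.0], [[4.0, 1.5], [1.5, 3.0]], size=1000)
winds[:5] += 40.0                        # five artificial outliers

mu = winds.mean(axis=0)
cov = np.cov(winds, rowvar=False)
inv = np.linalg.inv(cov)
diff = winds - mu
# Squared Mahalanobis distance of each observation from the center
d2 = np.einsum('ij,jk,ik->i', diff, inv, diff)

cutoff = stats.chi2.ppf(0.999, df=2)     # flag beyond the 99.9% chi-square contour
flags = d2 > cutoff
print(flags.sum(), "observations flagged")
```

    A robust variant would replace the sample mean and covariance with estimates that downweight the outliers themselves, which is one motivation for the robust skew-t fitting framework in the paper.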

  4. Groundwater travel time uncertainty analysis: Sensitivity of results to model geometry, and correlations and cross correlations among input parameters

    International Nuclear Information System (INIS)

    Clifton, P.M.

    1984-12-01

    The deep basalt formations beneath the Hanford Site are being investigated for the Department of Energy (DOE) to assess their suitability as a host medium for a high level nuclear waste repository. Predicted performance of the proposed repository is an important part of the investigation. One of the performance measures being used to gauge the suitability of the host medium is pre-waste-emplacement groundwater travel times to the accessible environment. Many deterministic analyses of groundwater travel times have been completed by Rockwell and other independent organizations. Recently, Rockwell has completed a preliminary stochastic analysis of groundwater travel times. This document presents analyses that show the sensitivity of the results from the previous stochastic travel time study to: (1) scale of representation of model parameters, (2) size of the model domain, (3) correlation range of log-transmissivity, and (4) cross-correlation between transmissivity and effective thickness. 40 refs., 29 figs., 6 tabs

  5. Data analysis of backscattering LIDAR system correlated with meteorological data

    International Nuclear Information System (INIS)

    Uehara, Sandro Toshio

    2009-01-01

    In recent years, interest in monitoring the effects of human activity on the atmosphere and the climate of the planet has increased. Remote sensing techniques have been used in many studies, including those related to global change. A backscattering LIDAR system, the first of its kind in Brazil, has been used to provide the vertical profile of the aerosol backscatter coefficient at 532 nm up to an altitude of 4-6 km above sea level. In this study, data were collected during the year 2005. These data were correlated with data from a CIMEL solar photometer and also with meteorological data. The main results indicated a pattern relating the behavior of these meteorological data to the vertical distribution of the extinction coefficient obtained with the LIDAR. In periods favorable to atmospheric dispersion, that is, rising air temperature associated with falling relative humidity, increasing atmospheric pressure, and a low ventilation rate, it was possible to determine with good precision the height of the Planetary Boundary Layer (PBL), both through the vertical profile of the extinction coefficient and through the vertical profile of the potential temperature. The LIDAR technique proved to be an important tool for determining the thermodynamic structure of the atmosphere, helping to characterize the evolution of the PBL throughout the day thanks to its good spatial and temporal resolution. (author)

  6. Analysis of three particle correlations with the INDRA detector

    International Nuclear Information System (INIS)

    Rahmani, A.; Eudes, Ph.; Lautridou, P.; Lebrun, C.; Reposeur, T.

    1997-01-01

    In the framework of the study of light particle production with the INDRA detector, we have analysed the invariant mass distribution of three particles produced in Xe + Sn collisions at 50 A.MeV, making use of an original interferometric method which offers the possibility of accessing the intrinsic parameters of intermediate 'resonances' created during nuclear collisions. By analyzing the (α,α,α) correlations it was possible to bring out a signal equivalent to that from 12C. The study of this signal allows: - to estimate the production rate of α particles coming from the 12C* decay; - accordingly, to introduce a correction to the α multiplicity measured by INDRA; - to extract the temperature of the emitting fragment (12C*); - to establish the sequential or direct decay mode of the emitting fragments (12C* → α + 8Be → α + α + α or 12C* → α + α + α). Thus, the measured signal is an apparent consequence of the occurrence of intermediate fragments excited into a metastable state from which the particles are emitted. The emission rate of the α particles coming from the decay of these fragments is estimated at several percent (< 10%)

  7. Correlation analysis for forced vibration test of the Hualien large scale seismic test (LSST) program

    International Nuclear Information System (INIS)

    Sugawara, Y.; Sugiyama, T.; Kobayashi, T.; Yamaya, H.; Kitamura, E.

    1995-01-01

    The correlation analysis for a forced vibration test of a 1/4-scale containment SSI test model constructed in Hualien, Taiwan was carried out for the condition after backfilling. Prior to this correlation analysis, the structural properties were revised to adjust the calculated fundamental frequency in the fixed-base condition to that derived from the test results. A correlation analysis was then carried out using the Lattice Model, which is able to estimate soil-structure effects with embedment. The analysis results coincide well with the test results, and it is concluded that the mathematical soil-structure interaction model established by the correlation analysis is effective in estimating the dynamic soil-structure interaction effect with embedment. This mathematical model will be applied as a basic model for simulation analysis of earthquake observation records. (author). 3 refs., 12 figs., 2 tabs

  8. NDVI and Panchromatic Image Correlation Using Texture Analysis

    Science.gov (United States)

    2010-03-01

    Fragment of the full text: Figure 5, 'Spectral reflectance of vegetation and soil from 0.4 to 1.1 µm' (from Perry & Lautenschlager), should help the classification methods classify kelp. Cited reference: (1988). Image processing software for imaging spectrometry analysis. Remote Sensing of Environment, 24: 201–210.

  9. Generalization of proposed tendon friction correlation and its application to PCCV structural analysis

    International Nuclear Information System (INIS)

    Kashiwase, Takako; Nagasaka, Hideo

    2000-01-01

    The present paper deals with the extension of the tendon friction coefficient correlation, proposed in a former paper, as a function of loading-end load and circumferential angle. The extended correlation further includes the effects of the number of strands in contact with the sheath, tendon diameter, politicization of tendon, and tendon local curvature. The validity of the correlation was confirmed against several published measured data. A structural analysis of the middle cylinder part of a 1/4-scale PCCV (Prestressed Concrete Containment Vessel) model was conducted using the present friction coefficient correlation. The results were compared with an analysis using a constant friction coefficient, focusing on the tendon tension force distribution. (author)

  10. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    Science.gov (United States)

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
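The power computation G*Power performs for a bivariate correlation test can be approximated in a few lines. The sketch below is a hypothetical Python illustration using only the standard library, not G*Power's own (exact-distribution) routine: it applies the Fisher z transformation to estimate the power of a two-sided test of H0: ρ = 0; the function name is invented.

```python
import math
from statistics import NormalDist

def correlation_power(r, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 for a true
    correlation r and sample size n, via the Fisher z transformation."""
    nd = NormalDist()
    z_r = 0.5 * math.log((1 + r) / (1 - r))   # Fisher z of the effect size
    se = 1.0 / math.sqrt(n - 3)               # standard error of Fisher z
    z_crit = nd.inv_cdf(1 - alpha / 2)        # two-sided critical value
    # Power: probability of landing outside +/- z_crit under the alternative
    return (1 - nd.cdf(z_crit - z_r / se)) + nd.cdf(-z_crit - z_r / se)

# For r = 0.3, roughly n = 84 gives ~80% power at alpha = .05
print(round(correlation_power(0.3, 84), 2))  # → 0.8
```

The normal approximation is close to the exact result for moderate n; dedicated software such as G*Power refines it.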

  11. Analysis of the influences of thermal correlations on neutronic–thermohydraulic coupling calculation of SCWR

    International Nuclear Information System (INIS)

    Xu, Weifeng; Cai, Jiejin; Liu, Shichang; Tang, Qi

    2015-01-01

    Highlights: • Different thermal correlations for supercritical water are summarized. • Influences of thermal correlations on the neutronic–thermohydraulic coupling calculation are analyzed. • Sensitivity analysis has been done for the thermal correlations. - Abstract: The neutronic–thermohydraulic coupling (N–T coupling) calculation is important for the core design, safety and stability analysis of the supercritical water-cooled reactor (SCWR), and a suitable thermal correlation is necessary for the N–T coupling calculation. In this paper, the U.S. SCWR design scheme and the N–T coupling procedure are first introduced, together with several different thermal correlations. Then, based on the N–T coupling system ARNT, the U.S. SCWR design is simulated to analyze the influence of the thermal correlations on the N–T coupling calculation and to determine which correlation is most suitable. The results show that all the thermal correlations are usable; however, different correlations lead to large differences in the calculated safety margin of the SCWR. The Bishop and Jackson correlations are the more suitable and conservative choices, whereas the Griem correlation is less precise. The buoyancy-lift effect has little influence on the calculated heat transfer of the SCWR. This research is also of significance for further study of the N–T coupling of the SCWR

  12. The use of bivariate spatial modeling of questionnaire and parasitology data to predict the distribution of Schistosoma haematobium in Coastal Kenya.

    Directory of Open Access Journals (Sweden)

    Hugh J W Sturrock

    Full Text Available Questionnaires of reported blood in urine (BIU) distributed through the existing school system provide a rapid and reliable method to classify schools according to the prevalence of Schistosoma haematobium, thereby helping to target schistosomiasis control. However, not all schools return questionnaires, and it is unclear whether treatment is warranted in such schools. This study investigates the use of bivariate spatial modelling of available and multiple data sources to predict the prevalence of S. haematobium at every school along the Kenyan coast. Data from a questionnaire survey conducted by the Kenya Ministry of Education in Coast Province in 2009 were combined with available parasitological and environmental data in a Bayesian bivariate spatial model. This modelled the relationship between BIU data and environmental covariates, as well as the relationship between BIU and S. haematobium infection prevalence, to predict S. haematobium infection prevalence at all schools in the study region. Validation procedures were implemented to assess the predictive accuracy of endemicity classification. The prevalence of BIU was negatively correlated with distance to the nearest river, and there was considerable residual spatial correlation at small (~15 km) spatial scales. There was a predictable relationship between the prevalence of reported BIU and S. haematobium infection. The final model exhibited excellent sensitivity (0.94) but moderate specificity (0.69) in identifying low (<10% prevalence) schools, and had poor performance in differentiating between moderate and high prevalence schools (sensitivity 0.5, specificity 1.0). Schistosomiasis is highly focal and there is a need to target treatment on a school-by-school basis. The use of bivariate spatial modelling can supplement questionnaire data to identify schools requiring mass treatment, but it is unable to distinguish between moderate and high prevalence schools.

  13. Using beta coefficients to impute missing correlations in meta-analysis research: Reasons for caution.

    Science.gov (United States)

    Roth, Philip L; Le, Huy; Oh, In-Sue; Van Iddekinge, Chad H; Bobko, Philip

    2018-06-01

    Meta-analysis has become a well-accepted method for synthesizing empirical research about a given phenomenon. Many meta-analyses focus on synthesizing correlations across primary studies, but some primary studies do not report correlations. Peterson and Brown (2005) suggested that researchers could use standardized regression weights (i.e., beta coefficients) to impute missing correlations. Indeed, their beta estimation procedures (BEPs) have been used in meta-analyses in a wide variety of fields. In this study, the authors evaluated the accuracy of BEPs in meta-analysis. We first examined how use of BEPs might affect results from a published meta-analysis. We then developed a series of Monte Carlo simulations that systematically compared the use of existing correlations (that were not missing) to data sets that incorporated BEPs (that impute missing correlations from corresponding beta coefficients). These simulations estimated ρ̄ (mean population correlation) and SDρ (true standard deviation) across a variety of meta-analytic conditions. Results from both the existing meta-analysis and the Monte Carlo simulations revealed that BEPs were associated with potentially large biases when estimating ρ̄ and even larger biases when estimating SDρ. Using only existing correlations often substantially outperformed use of BEPs and virtually never performed worse than BEPs. Overall, the authors urge a return to the standard practice of using only existing correlations in meta-analysis. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
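The core problem with substituting beta coefficients for correlations can be seen in a small simulation. The sketch below is a hypothetical Python/NumPy illustration (not the authors' Monte Carlo code; the coefficients and sample size are invented): it compares the zero-order correlation of a predictor with the outcome to that predictor's standardized beta from a multiple regression. The two diverge whenever predictors overlap, which is why imputing one from the other biases meta-analytic estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Two correlated predictors and an outcome
x1 = rng.standard_normal(n)
x2 = 0.6 * x1 + 0.8 * rng.standard_normal(n)   # corr(x1, x2) ≈ 0.6
y = 0.4 * x1 + 0.3 * x2 + rng.standard_normal(n)

# Zero-order correlation of x1 with y
r = np.corrcoef(x1, y)[0, 1]

# Standardized beta of x1 from the multiple regression y ~ x1 + x2
X = np.column_stack([x1, x2])
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
beta = np.linalg.lstsq(Xs, ys, rcond=None)[0][0]

# beta (~0.34) underestimates r (~0.49) because x2 absorbs shared variance
print(f"r = {r:.2f}, beta = {beta:.2f}")
```

Here the beta coefficient partials out x2, so it is systematically smaller than the zero-order correlation a meta-analysis would need.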

  14. DNA microarray data and contextual analysis of correlation graphs

    Directory of Open Access Journals (Sweden)

    Hingamp Pascal

    2003-04-01

    Full Text Available Abstract Background DNA microarrays are used to produce large sets of expression measurements from which specific biological information is sought. Their analysis requires efficient and reliable algorithms for dimensional reduction, classification and annotation. Results We study networks of co-expressed genes obtained from DNA microarray experiments. The mathematical concept of curvature on graphs is used to group genes or samples into clusters to which relevant gene or sample annotations are automatically assigned. Application to publicly available yeast and human lymphoma data demonstrates the reliability of the method in spite of its simplicity, especially with respect to the small number of parameters involved. Conclusions We provide a method for automatically determining relevant gene clusters among the many genes monitored with microarrays. The automatic annotations and the graphical interface improve the readability of the data. A C++ implementation, called Trixy, is available from http://tagc.univ-mrs.fr/bioinformatics/trixy.html.

  15. Multidimensional correlation among plan complexity, quality and deliverability parameters for volumetric-modulated arc therapy using canonical correlation analysis.

    Science.gov (United States)

    Shen, Lanxiao; Chen, Shan; Zhu, Xiaoyang; Han, Ce; Zheng, Xiaomin; Deng, Zhenxiang; Zhou, Yongqiang; Gong, Changfei; Xie, Congying; Jin, Xiance

    2018-03-01

    A multidimensional exploratory statistical method, canonical correlation analysis (CCA), was applied to evaluate the impact of complexity parameters on the plan quality and deliverability of volumetric-modulated arc therapy (VMAT) and to determine parameters for the generation of an ideal VMAT plan. Canonical correlations among complexity, quality and deliverability parameters of VMAT, as well as the contribution weights of different parameters, were investigated with 71 two-arc VMAT nasopharyngeal cancer (NPC) patients, and further verified with 28 one-arc VMAT prostate cancer patients. The average MU and MU per control point (MU/CP) for two-arc VMAT plans were 702.6 ± 55.7 and 3.9 ± 0.3, versus 504.6 ± 99.2 and 5.6 ± 1.1 for one-arc VMAT plans, respectively. The individual volume-based 3D gamma passing rates of the clinical target volume (γCTV) and planning target volume (γPTV) for NPC and prostate cancer patients were 85.7% ± 9.0% vs 92.6% ± 7.8%, and 88.0% ± 7.6% vs 91.2% ± 7.7%, respectively. Plan complexity parameters of NPC patients were correlated with plan quality (P = 0.047) and individual volume-based 3D gamma indices γ(IV) (P = 0.01), in which MU/CP and segment area (SA) per control point (SA/CP) were weighted highly in correlation with γ(IV), and SA/CP, percentage of CPs with SA plan quality with coefficients of 0.98, 0.68 and -0.99, respectively. Further verification with one-arc VMAT plans demonstrated similar results. In conclusion, MU, SA-related parameters and PTV volume were found to have strong effects on the plan quality and deliverability.
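Canonical correlation analysis itself is compact to implement. The sketch below is a generic Python/NumPy illustration (not the authors' analysis; the toy data and variable names are invented): the canonical correlations between two blocks of variables are the singular values of the product of orthonormal bases of the centered blocks, the same quantities CCA maximizes over linear combinations of, e.g., "complexity" and "quality" parameters.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two column blocks X and Y,
    computed as singular values of Qx' Qy, where Qx and Qy are
    orthonormal bases of the centered blocks."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    qx, _ = np.linalg.qr(X)
    qy, _ = np.linalg.qr(Y)
    s = np.linalg.svd(qx.T @ qy, compute_uv=False)
    return s[:min(X.shape[1], Y.shape[1])]

# Toy data: one latent factor shared between the two blocks
rng = np.random.default_rng(1)
n = 500
shared = rng.standard_normal(n)
X = np.column_stack([shared + 0.5 * rng.standard_normal(n),
                     rng.standard_normal(n)])
Y = np.column_stack([shared + 0.5 * rng.standard_normal(n),
                     rng.standard_normal(n)])
print(np.round(canonical_correlations(X, Y), 2))  # first ≈ 0.8, second ≈ 0
```

The first canonical correlation recovers the shared factor; the second, relating only noise dimensions, is near zero.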

  16. Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient.

    Science.gov (United States)

    Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J

    2008-06-18

    Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely-used correlations as the similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. In this study, we describe a novel correlation coefficient, shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with two other correlation coefficients that are currently the most widely-used (Pearson correlation coefficient and SD-weighted correlation coefficient) using statistical measures on both synthetic expression data as well as real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering, were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. This study shows that SCC is an alternative to the Pearson correlation coefficient and the SD-weighted correlation coefficient for clustering replicated microarray data.

  17. Authentication of reprocessing plant safeguards data through correlation analysis

    International Nuclear Information System (INIS)

    Burr, T.L.; Wangen, L.E.; Mullen, M.F.

    1995-04-01

    This report investigates the feasibility and benefits of two new approaches to the analysis of safeguards data from reprocessing plants. Both approaches involve some level of plant modeling. All models involve some form of mass balance, either applied in the usual way that leads to material balances for individual process vessels at discrete times or applied by accounting for pipe flow rates that leads to material balances for individual process vessels at continuous times. In the first case, material balances are computed after each tank-to-tank transfer. In the second case, material balances can be computed at any desired time. The two approaches can be described as follows. The first approach considers the application of a new multivariate sequential test. The test statistic is a scalar, but the monitored residual is a vector. The second approach considers the application of recent nonlinear time series methods for the purpose of empirically building a model for the expected magnitude of a material balance or other scalar variable. Although the report restricts attention to monitoring scalar time series, the methodology can be extended to vector time series

  18. Correlation between videogame mechanics and executive functions through EEG analysis.

    Science.gov (United States)

    Mondéjar, Tania; Hervás, Ramón; Johnson, Esperanza; Gutierrez, Carlos; Latorre, José Miguel

    2016-10-01

    This paper addresses a different point of view of videogames, specifically serious games for health, contributing a multidisciplinary perspective focused on neuroscience and computation. The experimental population consisted of pre-adolescents between the ages of 8 and 12 without any cognitive issues. The experiment consisted of users playing videogames as well as performing traditional psychological assessments; during these tasks the frontal brain activity was evaluated. The main goal was to analyse how the frontal lobe of the brain (executive function) works in terms of prominent cognitive skills during five types of game mechanics widely used in commercial videogames. The analysis was made by collecting brain signals during the two phases of the experiment, where the signals were recorded with an electroencephalogram neuroheadset. The hypotheses tested were whether videogames can develop executive functioning and whether it is possible to identify which kinds of cognitive skills are developed by each typical videogame mechanic. The results contribute to the design of serious games for health purposes on a conceptual level, particularly in support of the diagnosis and treatment of cognitive-related pathologies. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Time-correlated neutron analysis of a multiplying HEU source

    International Nuclear Information System (INIS)

    Miller, E.C.; Kalter, J.M.; Lavelle, C.M.; Watson, S.M.; Kinlaw, M.T.; Chichester, D.L.; Noonan, W.A.

    2015-01-01

    The ability to quickly identify and characterize special nuclear material remains a national security challenge. In counter-proliferation applications, identifying the neutron multiplication of a sample can be a good indication of the level of threat. Currently neutron multiplicity measurements are performed with moderated 3He proportional counters. These systems rely on the detection of thermalized neutrons, a process which obscures both energy and time information from the source. Fast neutron detectors, such as liquid scintillators, have the ability to detect events on nanosecond time scales, providing more information on the temporal structure of the arriving signal, and provide an alternative method for extracting information from the source. To explore this possibility, a series of measurements were performed on the Idaho National Laboratory's MARVEL assembly, a configurable HEU source. The source assembly was measured in a variety of different HEU configurations and with different reflectors, covering a range of neutron multiplications from 2 to 8. The data were collected with liquid scintillator detectors and digitized for offline analysis. A gap-based approach was used for identifying the bursts of detected neutrons associated with the same fission chain. Using this approach, we are able to study various statistical properties of individual fission chains. One of these properties is the distribution of neutron arrival times within a given burst. We have observed two interesting empirical trends. First, this distribution exhibits a weak, but definite, dependence on source multiplication. Second, there are distinctive differences in the distribution depending on the presence and type of reflector. Both of these phenomena might prove to be useful when assessing an unknown source. The physical origins of these phenomena can be illuminated with the help of MCNPX-PoliMi simulations.
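The gap-based burst identification described above can be sketched in a few lines. In this hypothetical Python illustration (the gap threshold and event times are invented, not INL data), a new burst starts whenever the waiting time to the next detection exceeds a fixed gap:

```python
import numpy as np

def find_bursts(times, max_gap):
    """Group sorted detection times into bursts: a new burst starts
    whenever the gap to the previous event exceeds max_gap."""
    times = np.sort(np.asarray(times, dtype=float))
    bursts, current = [], [times[0]]
    for t in times[1:]:
        if t - current[-1] > max_gap:
            bursts.append(current)
            current = [t]
        else:
            current.append(t)
    bursts.append(current)
    return bursts

# Toy event list (arbitrary time units): two tight bursts and one isolated count
events = [0.0, 0.05, 0.11, 5.0, 5.02, 5.09, 5.2, 40.0]
bursts = find_bursts(events, max_gap=1.0)
print([len(b) for b in bursts])  # → [3, 4, 1]
```

Per-burst statistics (multiplicity, arrival-time spread) can then be accumulated across the run, which is the kind of quantity the measurements above examine.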

  2. Structural Analysis of Correlated Factors: Lessons from the Verbal-Performance Dichotomy of the Wechsler Scales.

    Science.gov (United States)

    Macmann, Gregg M.; Barnett, David W.

    1994-01-01

    Describes exploratory and confirmatory analyses of verbal-performance procedures to illustrate concepts and procedures for analysis of correlated factors. Argues that, based on convergent and discriminant validity criteria, factors should have higher correlations with variables that they purport to measure than with other variables. Discusses…

  3. L2 Reading Comprehension and Its Correlates: A Meta-Analysis

    Science.gov (United States)

    Jeon, Eun Hee; Yamashita, Junko

    2014-01-01

    The present meta-analysis examined the overall average correlation (weighted for sample size and corrected for measurement error) between passage-level second language (L2) reading comprehension and 10 key reading component variables investigated in the research domain. Four high-evidence correlates (with 18 or more accumulated effect sizes: L2…

  4. Structure-constrained sparse canonical correlation analysis with an application to microbiome data analysis.

    Science.gov (United States)

    Chen, Jun; Bushman, Frederic D; Lewis, James D; Wu, Gary D; Li, Hongzhe

    2013-04-01

    Motivated by studying the association between nutrient intake and human gut microbiome composition, we developed a method for structure-constrained sparse canonical correlation analysis (ssCCA) in a high-dimensional setting. ssCCA takes into account the phylogenetic relationships among bacteria, which provides important prior knowledge on evolutionary relationships among bacterial taxa. Our ssCCA formulation utilizes a phylogenetic structure-constrained penalty function to impose certain smoothness on the linear coefficients according to the phylogenetic relationships among the taxa. An efficient coordinate descent algorithm is developed for optimization. A human gut microbiome data set is used to illustrate this method. Both simulations and real data applications show that ssCCA performs better than the standard sparse CCA in identifying meaningful variables when there are structures in the data.

  5. Climate Prediction Center (CPC)Ensemble Canonical Correlation Analysis 90-Day Seasonal Forecast of Precipitation

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ensemble Canonical Correlation Analysis (ECCA) precipitation forecast is a 90-day (seasonal) outlook of US surface precipitation anomalies. The ECCA uses...

  6. Sparse canonical correlation analysis for identifying, connecting and completing gene-expression networks

    NARCIS (Netherlands)

    Waaijenborg, S.; Zwinderman, A.H.

    2009-01-01

    ABSTRACT: BACKGROUND: We generalized penalized canonical correlation analysis for analyzing microarray gene-expression measurements for checking completeness of known metabolic pathways and identifying candidate genes for incorporation in the pathway. We used Wold's method for calculation of the

  7. Study of relationship between MUF correlation and detection sensitivity of statistical analysis

    International Nuclear Information System (INIS)

    Tamura, Toshiaki; Ihara, Hitoshi; Yamamoto, Yoichi; Ikawa, Koji

    1989-11-01

    Various kinds of statistical analysis have been proposed for NRTA (Near Real Time Materials Accountancy), which was devised to satisfy the timeliness goal among the detection goals of the IAEA. It is presumed that statistical analysis results will differ between the case of rigorous error propagation (with MUF correlation) and the case of simplified error propagation (without MUF correlation). Therefore, measurement simulation and decision analysis were performed using a flow simulation of an 800 MTHM/Y model reprocessing plant, and the relationship between MUF correlation and the detection sensitivity and false-alarm rate of the statistical analysis was studied. The specific character of materials accountancy for the 800 MTHM/Y model reprocessing plant was grasped through this simulation. It also became clear that MUF correlation decreases not only the false-alarm rate but also the detection probability for protracted loss in the case of the CUMUF test and Page's test applied to NRTA. (author)
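Page's test, one of the NRTA statistics mentioned here, is a one-sided CUSUM applied to the standardized MUF sequence. The following is a minimal sketch in plain Python (the reference value k and threshold h are illustrative choices, not the parameters used in the study):

```python
def page_test(muf_sequence, k=0.5, h=4.0):
    """One-sided Page (CUSUM) test. Returns the index of the first
    alarm, or None if no alarm. k is the reference value and h the
    decision threshold, both in units of the measurement-error sigma."""
    s = 0.0
    for i, x in enumerate(muf_sequence):
        s = max(0.0, s + x - k)   # accumulate evidence of a positive shift
        if s > h:
            return i
    return None

# No alarm on in-control balances; quick alarm on a sustained 2-sigma loss
print(page_test([0.0] * 30))              # → None
print(page_test([0.0] * 5 + [2.0] * 10))  # → 7
```

Because the statistic accumulates small persistent deviations, the test is sensitive to protracted loss, which is exactly the scenario where the MUF correlation structure matters for both detection probability and false alarms.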

  8. Climate Prediction Center(CPC)Ensemble Canonical Correlation Analysis Forecast of Temperature

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ensemble Canonical Correlation Analysis (ECCA) temperature forecast is a 90-day (seasonal) outlook of US surface temperature anomalies. The ECCA uses Canonical...

  9. Serum adiponectin levels are inversely correlated with leukemia: A meta-analysis

    Directory of Open Access Journals (Sweden)

    Jun-Jie Ma

    2016-01-01

    Conclusion: Our meta-analysis suggested that serum ADPN levels may be inversely correlated with leukemia, and ADPN levels can be used as an effective biologic marker in early diagnosis and therapeutic monitoring of leukemia.

  10. Within-Subject Correlation Analysis to Detect Functional Areas Associated With Response Inhibition

    Directory of Open Access Journals (Sweden)

    Tomoko Yamasaki

    2018-05-01

    Full Text Available Functional areas in fMRI studies are often detected by brain-behavior correlation, calculating the across-subject correlation between a behavioral index and the brain activity related to a function of interest. Within-subject correlation analysis is also employed at the single-subject level, exploiting cognitive fluctuations over a shorter time period by correlating the behavioral index with the brain activity across trials. In the present study, the within-subject analysis was applied to the stop-signal task, a standard task to probe response inhibition, in which the efficiency of response inhibition can be evaluated by the stop-signal reaction time (SSRT). Since the SSRT is estimated, by definition, not on a trial basis but from pooled trials, the correlation across runs was calculated between the SSRT and the brain activity related to response inhibition. The within-subject correlation revealed negative correlations in the anterior cingulate cortex and the cerebellum. Moreover, a dissociation pattern was observed in the within-subject analysis when earlier vs. later parts of the runs were analyzed: negative correlation was dominant in earlier runs, whereas positive correlation was dominant in later runs. Region-of-interest analyses revealed that the negative correlation in the anterior cingulate cortex, but not in the cerebellum, was dominant in earlier runs, suggesting multiple mechanisms associated with inhibitory processes that fluctuate on a run-by-run basis. These results indicate that the within-subject analysis complements the across-subject analysis by highlighting different aspects of cognitive/affective processes related to response inhibition.

  11. Econometric analysis of realised covariation: high frequency covariance, regression and correlation in financial economics

    OpenAIRE

    Ole E. Barndorff-Nielsen; Neil Shephard

    2002-01-01

    This paper analyses multivariate high frequency financial data using realised covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis and covariance. It will be based on a fixed interval of time (e.g. a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions and covariances change through time. In particular w...

  12. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Science.gov (United States)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
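
As a generic sketch of the kind of gradient-based local sensitivity index mentioned above, applied to a single-species Langmuir isotherm (the competitive adsorption model and the correlated-parameter machinery of the paper are not reproduced here; the operating point is made up):

```python
import math

def langmuir_coverage(P, K):
    """Single-species Langmuir isotherm: theta = K*P / (1 + K*P)."""
    return K * P / (1 + K * P)

def log_sensitivity(f, params, i, rel_step=1e-6):
    """Local sensitivity index d ln f / d ln p_i by central differences."""
    up, dn = list(params), list(params)
    up[i] *= 1 + rel_step
    dn[i] *= 1 - rel_step
    return (math.log(f(*up)) - math.log(f(*dn))) / (2 * rel_step)

# Sensitivity of coverage to the equilibrium constant K at P = 1, K = 2.
# Analytically, d ln(theta)/d ln(K) = 1/(1 + K*P) = 1/3.
s_K = log_sensitivity(langmuir_coverage, (1.0, 2.0), 1)
```

As the abstract stresses, such per-parameter indices can be misleading when parameters co-vary: ranking influential parameters changes once correlations between parameters are taken into account.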

  13. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    Directory of Open Access Journals (Sweden)

    Jinchao Feng

    2018-03-01

Full Text Available We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  14. Non-linear canonical correlation for joint analysis of MEG signals from two subjects

    Directory of Open Access Journals (Sweden)

    Cristina eCampi

    2013-06-01

    Full Text Available We consider the problem of analysing magnetoencephalography (MEG data measured from two persons undergoing the same experiment, and we propose a method that searches for sources with maximally correlated energies. Our method is based on canonical correlation analysis (CCA, which provides linear transformations, one for each subject, such that the correlation between the transformed MEG signals is maximized. Here, we present a nonlinear version of CCA which measures the correlation of energies. Furthermore, we introduce a delay parameter in the modelto analyse, e.g., leader-follower changes in experiments where the two subjects are engaged in social interaction.
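
Classical linear CCA, the starting point of the method described above, can be sketched via the singular values of the product of the two whitened data blocks (this is the textbook linear version, not the authors' nonlinear energy-based variant; the two "subjects" below are simulated with one shared latent source):

```python
import numpy as np

def cca_correlations(X, Y):
    """Canonical correlations of two data sets (classical linear CCA).
    X, Y: (n_samples, n_features) arrays with the same number of rows."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Orthonormal bases for the column space of each centred block.
    Ux, _, _ = np.linalg.svd(X, full_matrices=False)
    Uy, _, _ = np.linalg.svd(Y, full_matrices=False)
    # Singular values of Ux^T Uy are the canonical correlations.
    return np.clip(np.linalg.svd(Ux.T @ Uy, compute_uv=False), 0.0, 1.0)

# Two "subjects" sharing one latent source z, plus independent noise.
rng = np.random.default_rng(0)
z = rng.normal(size=500)
X = np.column_stack([z + 0.5 * rng.normal(size=500), rng.normal(size=500)])
Y = np.column_stack([rng.normal(size=500), z + 0.5 * rng.normal(size=500)])
corrs = cca_correlations(X, Y)
```

The leading canonical correlation is driven by the shared source z; the paper extends this idea to correlations between source energies and adds a delay parameter.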

  15. Assessing characteristics related to the use of seatbelts and cell phones by drivers: application of a bivariate probit model.

    Science.gov (United States)

    Russo, Brendan J; Kay, Jonathan J; Savolainen, Peter T; Gates, Timothy J

    2014-06-01

    The effects of cell phone use and safety belt use have been an important focus of research related to driver safety. Cell phone use has been shown to be a significant source of driver distraction contributing to substantial degradations in driver performance, while safety belts have been demonstrated to play a vital role in mitigating injuries to crash-involved occupants. This study examines the prevalence of cell phone use and safety belt non-use among the driving population through direct observation surveys. A bivariate probit model is developed to simultaneously examine the factors that affect cell phone and safety belt use among motor vehicle drivers. The results show that several factors may influence drivers' decision to use cell phones and safety belts, and that these decisions are correlated. Understanding the factors that affect both cell phone use and safety belt non-use is essential to targeting policy and programs that reduce such behavior. Copyright © 2014 Elsevier Ltd. All rights reserved.
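
A bivariate probit links two binary choices through correlated latent normal errors. A minimal simulation sketch of that structure (the coefficients, covariate, and latent correlation are invented for illustration, not estimates from this study):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
rho = 0.4                      # hypothetical latent error correlation
errs = rng.multivariate_normal([0.0, 0.0],
                               [[1.0, rho], [rho, 1.0]], size=n)

x = rng.normal(size=n)         # a shared (standardised) driver covariate
# Observed binary outcomes: each is 1 when its latent index crosses zero.
phone_use = (0.3 * x + errs[:, 0] > 0).astype(int)
belt_nonuse = (-0.5 * x + errs[:, 1] > 0).astype(int)

# Correlated latent errors induce association between the observed choices.
phi = np.corrcoef(phone_use, belt_nonuse)[0, 1]
```

Estimating rho jointly with the two sets of coefficients, rather than fitting two separate probits, is what lets the bivariate model capture the correlation between the decisions that the study reports.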

  16. Spatio-chromatic adaptation via higher-order canonical correlation analysis of natural images.

    Science.gov (United States)

    Gutmann, Michael U; Laparra, Valero; Hyvärinen, Aapo; Malo, Jesús

    2014-01-01

Independent component and canonical correlation analysis are two general-purpose statistical methods with wide applicability. In neuroscience, independent component analysis of chromatic natural images explains the spatio-chromatic structure of primary cortical receptive fields in terms of properties of the visual environment. Canonical correlation analysis similarly explains chromatic adaptation to different illuminations. But, as we show in this paper, neither of the two methods generalizes well to explain both spatio-chromatic processing and adaptation at the same time. We propose a statistical method which combines the desirable properties of independent component and canonical correlation analysis: It finds independent components in each data set which, across the two data sets, are related to each other via linear or higher-order correlations. The new method is as widely applicable as canonical correlation analysis, and also to more than two data sets. We call it higher-order canonical correlation analysis. When applied to chromatic natural images, we found that it provides a single (unified) statistical framework which accounts for both spatio-chromatic processing and adaptation. Filters with spatio-chromatic tuning properties as in the primary visual cortex emerged and corresponding-colors psychophysics was reproduced reasonably well. We used the new method to make a theory-driven testable prediction on how the neural response to colored patterns should change when the illumination changes. We predict shifts in the responses which are comparable to the shifts reported for chromatic contrast habituation.

  17. A COMPARISON OF SOME ROBUST BIVARIATE CONTROL CHARTS FOR INDIVIDUAL OBSERVATIONS

    Directory of Open Access Journals (Sweden)

    Moustafa Omar Ahmed Abu - Shawiesh

    2014-06-01

Full Text Available This paper proposes and considers bivariate control charts to monitor individual observations in statistical process control. The usual control charts, which use mean and variance-covariance estimators, are sensitive to outliers. We consider the following robust alternatives to the classical Hotelling's T2: T2MedMAD, T2MCD, and T2MVE. A simulation study was conducted to compare the performance of these control charts. Two real-life data sets are analyzed to illustrate the application of these robust alternatives.
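
The classical Hotelling T² for individual observations, together with a robust median/MAD-style variant, can be sketched as follows (the robust version is a plausible reading of a T2MedMAD-type chart, using a Spearman correlation structure; it is not necessarily the authors' exact estimator):

```python
import numpy as np

def t2_values(X, loc, cov):
    """Hotelling-type T^2 statistic for each individual observation (row)."""
    d = X - loc
    return np.einsum('ij,jk,ik->i', d, np.linalg.inv(cov), d)

def t2_classical(X):
    return t2_values(X, X.mean(axis=0), np.cov(X, rowvar=False))

def t2_med_mad(X):
    """Robust variant: median location, MAD scales (normal-consistent
    factor 1.4826), Spearman rank correlation structure."""
    med = np.median(X, axis=0)
    mad = 1.4826 * np.median(np.abs(X - med), axis=0)
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    spearman = np.corrcoef(ranks, rowvar=False)
    return t2_values(X, med, spearman * np.outer(mad, mad))

rng = np.random.default_rng(2)
X = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=100)
X[0] = [6.0, -6.0]   # an outlier that breaks the correlation pattern
```

Because the median, MAD, and rank correlation are barely moved by the contaminated row, the robust statistic keeps flagging it even when outliers would inflate the classical mean/covariance estimates.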

  18. [Correlation analysis of major agronomic characters and the polysaccharide contents in Dendrobium officinale].

    Science.gov (United States)

    Zhang, Lei; Zheng, Xi-Long; Qiu, Dao-Shou; Cai, Shi-Ke; Luo, Huan-Ming; Deng, Rui-Yun; Liu, Xiao-Jin

    2013-10-01

In order to provide a theoretical and technological basis for germplasm innovation and variety breeding in Dendrobium officinale, a study of the correlation between polysaccharide content and agronomic characters was conducted. Based on polysaccharide content determination and investigation of the agronomic characters of 30 accessions (110 individual plants) of Dendrobium officinale germplasm resources, the correlation between polysaccharide content and agronomic characters was analyzed via path and correlation analysis. Correlation analysis showed a significant negative correlation between average spacing and polysaccharide content, with a correlation coefficient of -0.695. Blade thickness was positively correlated with polysaccharide content, but the correlation was not significant. Path analysis showed that stem length had the greatest influence on polysaccharide content, and its effect was positive, with a direct path coefficient of 1.568. According to these results, the polysaccharide content can be easily and intuitively estimated from agronomic character data in germplasm resource screening and variety breeding. This therefore offers visual and practical technical guidance for quality variety breeding of Dendrobium officinale.

  19. Detrended cross-correlation analysis on RMB exchange rate and Hang Seng China Enterprises Index

    Science.gov (United States)

    Ruan, Qingsong; Yang, Bingchan; Ma, Guofeng

    2017-02-01

    In this paper, we investigate the cross-correlations between the Hang Seng China Enterprises Index and RMB exchange markets on the basis of a cross-correlation statistic test and multifractal detrended cross-correlation analysis (MF-DCCA). MF-DCCA has, at best, serious limitations for most of the signals describing complex natural processes and often indicates multifractal cross-correlations when there are none. In order to prevent these false multifractal cross-correlations, we apply MFCCA to verify the cross-correlations. Qualitatively, we find that the return series of the Hang Seng China Enterprises Index and RMB exchange markets were, overall, significantly cross-correlated based on the statistical analysis. Quantitatively, we find that the cross-correlations between the stock index and RMB exchange markets were strongly multifractal, and the multifractal degree of the onshore RMB exchange markets was somewhat larger than the offshore RMB exchange markets. Moreover, we use the absolute return series to investigate and confirm the fact of multifractality. The results from the rolling windows show that the short-term cross-correlations between volatility series remain high.

  20. Irregular Liesegang-type patterns in gas phase revisited. II. Statistical correlation analysis

    Science.gov (United States)

    Torres-Guzmán, José C.; Martínez-Mekler, Gustavo; Müller, Markus F.

    2016-05-01

We present a statistical analysis of Liesegang-type patterns formed in a gaseous HCl-NH3 system by ammonium chloride precipitation along glass tubes, as described in Paper I [J. C. Torres-Guzmán et al., J. Chem. Phys. 144, 174701 (2016)] of this work. We focus on the detection and characterization of short- and long-range correlations within the non-stationary sequence of apparently irregular precipitation bands. To this end we applied several techniques to estimate spatial correlations stemming from different fields, namely, linear auto-correlation via the power spectral density, detrended fluctuation analysis (DFA), and methods developed in the context of random matrix theory (RMT). In particular, RMT methods disclose well-pronounced long-range correlations over at least 40 bands in terms of both band positions and intensity values. By using a variant of the DFA we furnish proof of the nonlinear nature of the detected long-range correlations.

  1. Statistical analysis of latent generalized correlation matrix estimation in transelliptical distribution.

    Science.gov (United States)

    Han, Fang; Liu, Han

    2017-02-01

The correlation matrix plays a key role in many multivariate methods (e.g., graphical model estimation and factor analysis). The current state-of-the-art in estimating large correlation matrices focuses on the use of Pearson's sample correlation matrix. Although Pearson's sample correlation matrix enjoys various good properties under Gaussian models, it is not an effective estimator when facing heavy-tailed distributions with possible outliers. As a robust alternative, Han and Liu (2013b) advocated the use of a transformed version of the Kendall's tau sample correlation matrix in estimating the high-dimensional latent generalized correlation matrix under the transelliptical distribution family (or elliptical copula). The transelliptical family assumes that after unspecified marginal monotone transformations, the data follow an elliptical distribution. In this paper, we study the theoretical properties of the Kendall's tau sample correlation matrix and its transformed version proposed in Han and Liu (2013b) for estimating the population Kendall's tau correlation matrix and the latent Pearson's correlation matrix under both spectral and restricted spectral norms. With regard to the spectral norm, we highlight the role of "effective rank" in quantifying the rate of convergence. With regard to the restricted spectral norm, we for the first time present a "sign subgaussian condition" which is sufficient to guarantee that the rank-based correlation matrix estimator attains the optimal rate of convergence. In both cases, we do not need any moment condition.
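
The transformed Kendall's tau estimator advocated in Han and Liu (2013b) has a compact form: estimate pairwise taus and apply the map sin(pi/2 · tau) entrywise. A small sketch using `scipy.stats.kendalltau` on simulated transelliptical data (Gaussian copula with monotonically transformed margins; all numbers illustrative):

```python
import numpy as np
from scipy.stats import kendalltau

def latent_correlation(X):
    """Rank-based estimator of the latent Pearson correlation matrix:
    R_hat[j, k] = sin(pi/2 * tau_jk)."""
    p = X.shape[1]
    R = np.eye(p)
    for j in range(p):
        for k in range(j + 1, p):
            tau, _ = kendalltau(X[:, j], X[:, k])
            R[j, k] = R[k, j] = np.sin(np.pi / 2.0 * tau)
    return R

rng = np.random.default_rng(3)
Z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=2000)
# Monotone marginal transformations: the latent correlation stays 0.6.
X = np.column_stack([np.exp(Z[:, 0]), Z[:, 1] ** 3])
R_hat = latent_correlation(X)
```

Because Kendall's tau depends only on ranks, the estimate is invariant to the marginal transformations, whereas the Pearson correlation of the transformed data would be biased away from 0.6.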

  2. CORRELATIONS BETWEEN FINDINGS OF OCCLUSAL AND MANUAL ANALYSIS IN TMD-PATIENTS

    Directory of Open Access Journals (Sweden)

    Mariana Dimova

    2016-08-01

Full Text Available The aim of this study was to investigate and analyze possible correlations between findings obtained by manual functional analysis and clinical occlusal analysis in TMD patients. Material and methods: The material of this study comprises 111 TMD patients selected after visual diagnostics, a brief functional review according to Ahlers and Jakstatt, intraoral examination, and recording of periodontal status. In the period September 2014 - March 2016 all patients were subjected to manual functional analysis and clinical occlusal analysis. 17 people (10 women and 7 men) underwent imaging with cone-beam computed tomography. Results: Many statistically significant correlations were found between tests of the structural analysis, indicating relationships between findings. Conclusion: The presence of statistically significant correlations between occlusal relationships, freedom in centric, and the condition of the muscle complex of the masticatory system and TMJ confirms the relationship between the state of occlusal components and TMD.

  3. Cross-correlation time-of-flight analysis of molecular beam scattering

    International Nuclear Information System (INIS)

    Nowikow, C.V.; Grice, R.

    1979-01-01

    The theory of the cross-correlation method of time-of-flight analysis is presented in a form which highlights its formal similarity to the conventional method. A time-of-flight system for the analysis of crossed molecular beam scattering is described, which is based on a minicomputer interface and can operate in both the cross-correlation and conventional modes. The interface maintains the synchronisation of chopper disc rotation and channel advance indefinitely in the cross-correlation method and can acquire data in phase with the beam modulation in both methods. The shutter function of the cross-correlation method is determined and the deconvolution analysis of the data is discussed. (author)

  4. ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.

    Science.gov (United States)

    Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey

    2018-06-21

Diffusion-weighted imaging (DWI) is able to reflect histopathology architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic and average nucleic areas were estimated using ImageJ. Additionally, the Ki67-index was calculated. DWI was obtained on a 1.5T scanner by using b values of 0 and 1000 s/mm2. Histogram analysis was performed as a whole-lesion measurement by using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (ρ = -0.76, P = 0.03) as well as with ADCp75 (ρ = -0.79, P = 0.02). Kurtosis and entropy correlated with average nucleic area (ρ = -0.81, P = 0.02 and ρ = 0.88, P = 0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki67-index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.

  5. Non-Normality and Testing that a Correlation Equals Zero

    Science.gov (United States)

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)
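
The test examined in this record uses the standard statistic t = r·sqrt(n-2)/sqrt(1-r²), referred to a t distribution with n-2 degrees of freedom; a minimal sketch:

```python
import math

def corr_test_t(r, n):
    """t statistic for H0: rho = 0 under bivariate normality (df = n - 2)."""
    return r * math.sqrt(n - 2) / math.sqrt(1.0 - r * r)

# Example: r = 0.30 observed from n = 50 pairs.
t_stat = corr_test_t(0.30, 50)
# Two-sided 5% critical value for df = 48 is roughly 2.01, so this r
# would be judged significantly different from zero at the 5% level.
```

The abstract's point is that the level of this test is fairly insensitive to moderate violations of the bivariate normality assumption.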

  6. Geovisualization of land use and land cover using bivariate maps and Sankey flow diagrams

    Science.gov (United States)

    Strode, Georgianna; Mesev, Victor; Thornton, Benjamin; Jerez, Marjorie; Tricarico, Thomas; McAlear, Tyler

    2018-05-01

The terms 'land use' and 'land cover' typically describe categories that convey information about the landscape. Despite the major difference of land use implying some degree of anthropogenic disturbance, the two terms are commonly used interchangeably, especially when anthropogenic disturbance is ambiguous, say managed forestland or abandoned agricultural fields. Cartographically, land use and land cover are also sometimes represented interchangeably within common legends, giving the impression that the landscape is a seamless continuum of land use parcels spatially adjacent to land cover tracts. We believe this is misleading, and feel we need to reiterate the well-established symbiosis of land uses as amalgams of land covers; in other words, land covers are subsets of land use. Our paper addresses this spatially complex, and frequently ambiguous, relationship, and posits that bivariate cartographic techniques are an ideal vehicle for representing both land use and land cover simultaneously. In more specific terms, we explore the use of nested symbology to graphically represent land use and land cover, where land covers are circles nested within land use squares. We also investigate bivariate legends for representing statistical covariance as a means of visualizing the combinations of land use and cover. Lastly, we apply Sankey flow diagrams to further illustrate the complex, multifaceted relationships between land use and land cover. Our work is demonstrated on land use and land cover data for the US state of Florida.

  7. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    Science.gov (United States)

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    On the basis of three empirical studies Fitts' Law was refined for bivariate pointing tasks on large touch screens. In the first study different target width parameters were investigated. The second study considered the effect of the motion angle. Based on the results of the two studies a refined model for movement time in human-computer interaction was formulated. A third study, which is described here in detail, concerns the validation of the refined model. For the validation study 20 subjects had to execute a bivariate pointing task on a large touch screen. In the experimental task 250 rectangular target objects were displayed at a randomly chosen position on the screen covering a broad range of ID values (ID= [1.01; 4.88]). Compared to existing refinements of Fitts' Law, the new model shows highest predictive validity. A promising field of application of the model is the ergonomic design and evaluation of project management software. By using the refined model, software designers can calculate a priori the appropriate angular position and the size of buttons, menus or icons.
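
For reference, the Shannon formulation of Fitts' Law underlying the ID range quoted above can be sketched as follows (the intercept and slope are hypothetical placeholders, and the paper's refined bivariate model additionally accounts for target height and motion angle, which this univariate sketch omits):

```python
import math

def shannon_id(distance, width):
    """Index of difficulty (bits), Shannon formulation: log2(D/W + 1)."""
    return math.log2(distance / width + 1.0)

def movement_time(distance, width, a=0.10, b=0.15):
    """Fitts' Law MT = a + b * ID. The intercept a (seconds) and slope b
    (seconds/bit) are hypothetical, not the study's fitted coefficients."""
    return a + b * shannon_id(distance, width)

# A target 512 px away and 64 px wide has ID = log2(9) ≈ 3.17 bits,
# within the ID range [1.01, 4.88] used in the validation study.
mt = movement_time(512.0, 64.0)
```

Once a and b are fitted for a device, the inverse use of the model is what the abstract suggests: choosing button sizes and positions so that predicted movement times stay acceptable.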

  8. Probabilistic modeling using bivariate normal distributions for identification of flow and displacement intervals in longwall overburden

    Energy Technology Data Exchange (ETDEWEB)

    Karacan, C.O.; Goodman, G.V.R. [NIOSH, Pittsburgh, PA (United States). Off Mine Safety & Health Research

    2011-01-15

    Gob gas ventholes (GGV) are used to control methane emissions in longwall mines by capturing it within the overlying fractured strata before it enters the work environment. In order for GGVs to effectively capture more methane and less mine air, the length of the slotted sections and their proximity to top of the coal bed should be designed based on the potential gas sources and their locations, as well as the displacements in the overburden that will create potential flow paths for the gas. In this paper, an approach to determine the conditional probabilities of depth-displacement, depth-flow percentage, depth-formation and depth-gas content of the formations was developed using bivariate normal distributions. The flow percentage, displacement and formation data as a function of distance from coal bed used in this study were obtained from a series of borehole experiments contracted by the former US Bureau of Mines as part of a research project. Each of these parameters was tested for normality and was modeled using bivariate normal distributions to determine all tail probabilities. In addition, the probability of coal bed gas content as a function of depth was determined using the same techniques. The tail probabilities at various depths were used to calculate conditional probabilities for each of the parameters. The conditional probabilities predicted for various values of the critical parameters can be used with the measurements of flow and methane percentage at gob gas ventholes to optimize their performance.
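
The conditional probabilities described above follow from the standard bivariate normal conditioning formula, Y | X = x ~ N(mu_y + rho·(sigma_y/sigma_x)(x - mu_x), sigma_y²(1 - rho²)). A sketch of a conditional tail probability, with made-up depth/displacement parameters rather than the borehole data:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_tail_prob(x, mu, sigma, rho, threshold):
    """P(Y > threshold | X = x) under a bivariate normal with means mu,
    standard deviations sigma, and correlation rho."""
    mu_x, mu_y = mu
    s_x, s_y = sigma
    cond_mean = mu_y + rho * (s_y / s_x) * (x - mu_x)
    cond_sd = s_y * math.sqrt(1.0 - rho ** 2)
    return 1.0 - norm_cdf((threshold - cond_mean) / cond_sd)

# E.g. probability that (standardised) displacement exceeds a threshold
# given a (standardised) depth value; all numbers hypothetical.
p = conditional_tail_prob(x=1.0, mu=(0.0, 0.0), sigma=(1.0, 1.0),
                          rho=0.8, threshold=0.0)
```

Evaluating such tail probabilities at various depths is, in spirit, how the paper turns fitted bivariate normal models into conditional probabilities usable for slotted-section design.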

  9. A bivariate space-time downscaler under space and time misalignment.

    Science.gov (United States)

    Berrocal, Veronica J; Gelfand, Alan E; Holland, David M

    2010-12-01

Ozone and particulate matter PM(2.5) are co-pollutants that have long been associated with increased public health risks. Information on concentration levels for both pollutants comes from two sources: monitoring sites and output from complex numerical models that produce concentration surfaces over large spatial regions. In this paper, we offer a fully model-based approach for fusing these two sources of information for the pair of co-pollutants which is computationally feasible over large spatial regions and long periods of time. Due to the association between concentration levels of the two environmental contaminants, it is expected that information regarding one will help to improve prediction of the other. Misalignment is an obvious issue since the monitoring networks for the two contaminants only partly intersect and because the collection rate for PM(2.5) is typically less frequent than that for ozone. Extending previous work in Berrocal et al. (2009), we introduce a bivariate downscaler that provides a flexible class of bivariate space-time assimilation models. We discuss computational issues for model fitting and analyze a dataset for ozone and PM(2.5) for the ozone season during year 2002. We show a modest improvement in predictive performance, not surprising in a setting where we can anticipate only a small gain.

  10. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    Energy Technology Data Exchange (ETDEWEB)

    McCabe, E.R.B.; Towbin, J.A. (Baylor College of Medicine, Houston, TX (United States)); Engh, G. van den; Trask, B.J. (Lawrence Livermore National Lab., CA (United States))

    1992-12-01

Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  11. Cross-Correlations between Energy and Emissions Markets: New Evidence from Fractal and Multifractal Analysis

    Directory of Open Access Journals (Sweden)

    Gang-Jin Wang

    2014-01-01

Full Text Available We supply a new perspective to describe and understand the behavior of cross-correlations between energy and emissions markets. Namely, we investigate cross-correlations between oil and gas (Oil-Gas), oil and CO2 (Oil-CO2), and gas and CO2 (Gas-CO2) based on fractal and multifractal analysis. We focus our study on returns of the oil, gas, and CO2 during the period of April 22, 2005–April 30, 2013. In the empirical analysis, by using the detrended cross-correlation analysis (DCCA) method, we find that cross-correlations for Oil-Gas, Oil-CO2, and Gas-CO2 obey a power law and are weakly persistent. Then, we adopt the DCCA cross-correlation coefficient to quantify cross-correlations between energy and emissions markets. The results show that their cross-correlations are diverse at different time scales. Next, based on the multifractal DCCA method, we find that the cross-correlated markets have a nonlinear and multifractal nature and that the multifractality strength for the three cross-correlated markets is arranged in the order Gas-CO2 > Oil-Gas > Oil-CO2. Finally, by employing the rolling windows method, which can be used to investigate time-varying cross-correlation scaling exponents, we analyze short-term and long-term market dynamics and find that the recent global financial crisis has had a notable influence on short-term and long-term market dynamics.
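
The DCCA cross-correlation coefficient used above normalises the detrended covariance of two series by their detrended (DFA) fluctuations at each scale, giving a value in [-1, 1]. A minimal sketch on simulated correlated noise (not the oil/gas/CO2 data):

```python
import numpy as np

def _box_residuals(profile, scale):
    """Residuals after linear detrending in non-overlapping boxes."""
    n = len(profile) // scale * scale
    t = np.arange(scale)
    resid = []
    for start in range(0, n, scale):
        seg = profile[start:start + scale]
        coeffs = np.polyfit(t, seg, 1)
        resid.append(seg - np.polyval(coeffs, t))
    return np.concatenate(resid)

def dcca_coefficient(x, y, scale):
    """rho_DCCA(s): detrended covariance over the product of the two
    detrended fluctuations, computed from the integrated profiles."""
    rx = _box_residuals(np.cumsum(x - np.mean(x)), scale)
    ry = _box_residuals(np.cumsum(y - np.mean(y)), scale)
    return np.mean(rx * ry) / np.sqrt(np.mean(rx ** 2) * np.mean(ry ** 2))

rng = np.random.default_rng(4)
common = rng.normal(size=2000)
x = common + 0.7 * rng.normal(size=2000)
y = common + 0.7 * rng.normal(size=2000)
rho_20 = dcca_coefficient(x, y, scale=20)
```

Evaluating the coefficient across a range of scales is what exposes the scale-dependent cross-correlations the abstract reports.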

  12. Relationship between climatic variables and the variation in bulk tank milk composition using canonical correlation analysis.

    Science.gov (United States)

    Stürmer, Morgana; Busanello, Marcos; Velho, João Pedro; Heck, Vanessa Isabel; Haygert-Velho, Ione Maria Pereira

    2018-06-04

    A number of studies have addressed the relations between climatic variables and milk composition, but these works used univariate statistical approaches. In our study, we used a multivariate approach (canonical correlation) to study the impact of climatic variables on milk composition, price, and monthly milk production at a dairy farm using bulk tank milk data. Data on milk composition, price, and monthly milk production were obtained from a dairy company that purchased the milk from the farm, while climatic variable data were obtained from the National Institute of Meteorology (INMET). The data are from January 2014 to December 2016. Univariate correlation analysis and canonical correlation analysis were performed. Few correlations between the climatic variables and milk composition were found using a univariate approach. However, using canonical correlation analysis, we found a strong and significant correlation (r c  = 0.95, p value = 0.0029). Lactose, ambient temperature measures (mean, minimum, and maximum), and temperature-humidity index (THI) were found to be the most important variables for the canonical correlation. Our study indicated that 10.2% of the variation in milk composition, pricing, and monthly milk production can be explained by climatic variables. Ambient temperature variables, together with THI, seem to have the most influence on variation in milk composition.

  13. Correlations between MRI and Information Processing Speed in MS: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    S. M. Rao

    2014-01-01

Full Text Available Objectives. To examine relationships between conventional MRI measures and the paced auditory serial addition test (PASAT) and symbol digit modalities test (SDMT). Methods. A systematic literature review was conducted. Included studies had ≥30 multiple sclerosis (MS) patients, administered the SDMT or PASAT, and measured T2LV or brain atrophy. Meta-analysis of MRI/information processing speed (IPS) correlations, analysis of MRI/IPS significance tests to account for reporting bias, and binomial testing to detect trends when comparing correlation strengths of SDMT versus PASAT and T2LV versus atrophy were conducted. Results. The 39 studies identified frequently reported only significant correlations, suggesting reporting bias. Direct meta-analysis was only feasible for correlations between SDMT and T2LV (r = -0.45, P < 0.001) and atrophy in patients with mixed MS subtypes (r = -0.54, P < 0.001). Familywise Holm-Bonferroni testing found that selective reporting was not the source of at least half of the significant results reported. Binomial tests (P = 0.006) favored SDMT over PASAT in strength of MRI correlations. Conclusions. A moderate-to-strong correlation exists between impaired IPS and MRI in mixed MS populations. Correlations with MRI were stronger for SDMT than for PASAT. Neither heterogeneity among populations nor reporting bias appeared to be responsible for these findings.
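
Correlation meta-analyses of this kind typically pool study-level r values on Fisher's z scale with inverse-variance weights n−3; a fixed-effect sketch (the input correlations and sample sizes below are illustrative, not the studies in this review):

```python
import math

def pooled_correlation(results):
    """Fixed-effect pooling of Pearson correlations via Fisher's z.
    results: iterable of (r, n) pairs; weight of each study is n - 3,
    the inverse variance of its Fisher z transform."""
    num = den = 0.0
    for r, n in results:
        z = math.atanh(r)          # Fisher r-to-z
        w = n - 3
        num += w * z
        den += w
    return math.tanh(num / den)    # back-transform the weighted mean z

# Three hypothetical MRI/SDMT studies.
pooled = pooled_correlation([(-0.40, 60), (-0.50, 120), (-0.45, 80)])
```

A random-effects version would add a between-study variance component to the weights, which matters when populations are as heterogeneous as the abstract describes.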

  14. Performance of Modified Test Statistics in Covariance and Correlation Structure Analysis under Conditions of Multivariate Nonnormality.

    Science.gov (United States)

    Fouladi, Rachel T.

    2000-01-01

    Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…

  15. Research of diagnosis sensors fault based on correlation analysis of the bridge structural health monitoring system

    Science.gov (United States)

    Hu, Shunren; Chen, Weimin; Liu, Lin; Gao, Xiaoxia

    2010-03-01

A bridge structural health monitoring system is a typical multi-sensor measurement system, since multiple parameters of the bridge structure are collected at monitoring sites on river-spanning bridges. The bridge structure monitored by multiple sensors is a single entity; when subjected to external action, different structural parameters respond differently. Therefore, the data acquired by the sensors exhibit numerous correlation relations, whose complexity is determined by the complexity of the bridge structure. Traditionally, correlation among monitoring sites has been considered mainly in terms of physical location; unfortunately, this approach is too simple to describe the correlation in detail. This paper analyzes the correlation among bridge monitoring sites based on bridge structural data, defines the correlation of bridge monitoring sites and describes its several forms, and then integrates correlation theory from data mining and signal systems to establish a correlation model that describes the correlation among bridge monitoring sites quantitatively. Finally, the Chongqing Mashangxi Yangtze River bridge health monitoring system is taken as the research object for diagnosing sensor faults, and simulation results verify the effectiveness of the designed method and theoretical discussions.

  16. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    International Nuclear Information System (INIS)

    Munoz-Diosdado, A

    2005-01-01

We analyzed databases of gait time series from healthy adults and persons with Parkinson's disease, Huntington's disease and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series show no clear tendencies; we surmise that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and obtained their widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, giving one time series for the left foot and another for the right. First, we analyzed these time series separately; then we compared the results, both directly and with a cross-correlation analysis. We sought differences between the two series that could serve as indicators of equilibrium problems
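
The cross-correlation step between the left-foot and right-foot series can be illustrated with a minimal Python sketch on simulated stride-interval series. The data model (a shared cadence oscillation plus independent sensor noise) and all parameters are illustrative assumptions, not the study's data.

```python
import numpy as np

n = 1000
rng = np.random.default_rng(1)

# Hypothetical stride-interval series: both feet of one walker share a
# slow cadence oscillation, so their fluctuations are cross-correlated.
rhythm = 0.1 * np.sin(np.linspace(0, 20 * np.pi, n))
left = 1.0 + rhythm + 0.02 * rng.normal(size=n)
right = 1.0 + rhythm + 0.02 * rng.normal(size=n)

def cross_corr(x, y, lag=0):
    """Normalized cross-correlation of two series at a given non-negative lag."""
    x = x - x.mean()
    y = y - y.mean()
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

r0 = cross_corr(left, right)
print(r0)  # strong zero-lag correlation between the two feet
```

In the equilibrium-analysis setting, one would compare such cross-correlation profiles (over a range of lags) between healthy and ill subjects.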

  17. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)

    2005-01-01

We analyzed databases of gait time series from healthy adults and persons with Parkinson's disease, Huntington's disease and amyotrophic lateral sclerosis (ALS). We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series show no clear tendencies; we surmise that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and obtained their widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, giving one time series for the left foot and another for the right. First, we analyzed these time series separately; then we compared the results, both directly and with a cross-correlation analysis. We sought differences between the two series that could serve as indicators of equilibrium problems.

  18. Analysis method of high-order collective-flow correlations based on the concept of correlative degree

    International Nuclear Information System (INIS)

    Zhang Weigang

    2000-01-01

Based on the concept of correlative degree, a new method of high-order collective-flow measurement is constructed, with which azimuthal correlations, correlations of the final-state transverse-momentum magnitude, and transverse correlations can be inspected separately. Using the new method, the contributions of the azimuthal correlations of the particle distribution and of the transverse-momentum-magnitude correlations of final-state particles to high-order collective-flow correlations are analyzed with 4π experimental events for 1.2 A GeV Ar + BaI2 collisions at the Bevalac streamer chamber. Compared with the correlations of transverse-momentum magnitude, the azimuthal correlations of the final-state particle distribution dominate the high-order collective-flow correlations in the experimental samples. The contributions of the transverse-momentum-magnitude correlations of final-state particles not only enhance the strength of the high-order correlations of the particle group, but also provide important information for measuring the collectivity of collective flow within the more constrained region

  19. Level and correlates of physical activity and sedentary behavior in patients with type 2 diabetes: A cross-sectional analysis of the Italian Diabetes and Exercise Study_2.

    Directory of Open Access Journals (Sweden)

    Stefano Balducci

Full Text Available Patients with type 2 diabetes usually show reduced physical activity (PA) and increased sedentary (SED) time, though to a varying extent, especially for low-intensity PA (LPA), a major determinant of daily energy expenditure that is not accurately captured by questionnaires. This study assessed the level and correlates of PA and SED-time in patients from the Italian Diabetes and Exercise Study_2 (IDES_2). Three hundred physically inactive and sedentary patients with type 2 diabetes were enrolled in the IDES_2 to be randomized to an intervention group, receiving theoretical and practical exercise counseling, and a control group, receiving standard care. At baseline, LPA, moderate-to-vigorous-intensity PA (MVPA), and SED-time were measured by accelerometer. Physical fitness and cardiovascular risk factors and scores were also assessed. LPA was 3.93±1.35 hours∙day-1, MVPA was 12.4±4.6 min∙day-1, and SED-time was 11.6±1.2 hours∙day-1, with a large range of values (0.89-7.11 hours∙day-1, 0.6-21.0 min∙day-1, and 9.14-15.28 hours∙day-1, respectively). At bivariate analysis, LPA and MVPA correlated with a better cardiovascular risk profile and fitness parameters, whereas the opposite was observed for SED-time. Likewise, values of LPA, MVPA, and SED-time falling in the best tertile were associated with optimal or acceptable levels of cardiovascular risk factors and scores. At multivariate analysis, age, female gender, HbA1c, BMI or waist circumference, and high-sensitivity C-reactive protein (for LPA and SED-time only) were negatively associated with LPA and MVPA and positively associated with SED-time in an independent manner. Physically inactive and sedentary patients with type 2 diabetes from the IDES_2 show a low level of PA, though values of LPA, MVPA, and SED-time vary largely. Furthermore, there is a strong correlation of these measures with glycemic control, adiposity and inflammation, thus suggesting that even small improvements in LPA, MVPA

  20. On minimizing the influence of the noise tail of correlation functions in operational modal analysis

    DEFF Research Database (Denmark)

    Tarpø, Marius; Olsen, Peter; Amador, Sandro

    2017-01-01

In operational modal analysis (OMA), correlation functions are used by all classical time-domain modal identification techniques that use the impulse response function (free decays) as primary data. However, the main difference between the impulse response and the correlation functions estimated from the operational responses is that the latter present a higher noise level. This is due to statistical errors in the estimation of the correlation function, which cause random noise at the end of the function, called the noise tail. This noise might have a significant influence on the identification results (random errors) when the noise tail is included in the identification. On the other hand, if the correlation function is truncated too much, then important information is lost. In order to minimize this error, a suitable truncation based on manual inspection of the correlation function...

  1. METHODS OF DISTANCE MEASUREMENT’S ACCURACY INCREASING BASED ON THE CORRELATION ANALYSIS OF STEREO IMAGES

    Directory of Open Access Journals (Sweden)

    V. L. Kozlov

    2018-01-01

Full Text Available To solve the problem of increasing the accuracy of restoring a three-dimensional picture of space from two-dimensional digital images, it is necessary to use new, effective techniques and algorithms for processing and correlation analysis of digital images. Tools are actively being developed that reduce the time cost of processing stereo images, improve the quality of depth-map construction and automate it. The aim of this work is to investigate the possibilities of using various digital image processing techniques to improve the measurement accuracy of a rangefinder based on correlation analysis of a stereo image. Results are presented on the influence of color-channel mixing techniques on distance-measurement accuracy for various functions realizing the correlation processing of images. Studies of the possibility of using an integral representation of images to reduce the time cost of constructing a depth map are proposed, as are studies of the possibility of prefiltering images before correlation processing when measuring distance by stereo imaging. It is found that uniform mixing of channels minimizes the total number of measurement errors, whereas brightness extraction according to the sRGB standard increases the number of errors for all of the considered correlation processing techniques. The integral representation of the image makes it possible to accelerate the correlation processing, but this method is useful for depth-map calculation only in images of no more than 0.5 megapixels. Filtering the image before correlation processing can provide, depending on the filter parameters, either an increase in the correlation function value, which is useful for analyzing noisy images, or compression of the correlation function.
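
The pipeline described above, uniform channel mixing followed by correlation-based block matching along a scanline, can be sketched in a few lines of Python. This is a toy illustration on synthetic images with a known horizontal shift, not the paper's implementation; the image sizes, block position and search range are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

h, w, shift = 32, 96, 7
left_rgb = rng.random((h, w, 3))
right_rgb = np.roll(left_rgb, -shift, axis=1)   # simulated horizontal disparity

def mix_uniform(img):
    """Uniform mixing of the three color channels into one intensity plane."""
    return img.mean(axis=2)

def ncc(a, b):
    """Normalized cross-correlation between two equal-sized blocks."""
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

left, right = mix_uniform(left_rgb), mix_uniform(right_rgb)
patch = left[8:24, 40:56]                       # reference block in left image
scores = [ncc(patch, right[8:24, 40 - d:56 - d]) for d in range(16)]
disparity = int(np.argmax(scores))
print(disparity)  # the best NCC match recovers the simulated shift of 7
```

Given the disparity, the range follows from the usual triangulation relation distance = focal_length * baseline / disparity for a calibrated rig.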

  2. An Econometric Analysis of Modulated Realised Covariance, Regression and Correlation in Noisy Diffusion Models

    DEFF Research Database (Denmark)

    Kinnebrock, Silja; Podolskij, Mark

    This paper introduces a new estimator to measure the ex-post covariation between high-frequency financial time series under market microstructure noise. We provide an asymptotic limit theory (including feasible central limit theorems) for standard methods such as regression, correlation analysis...... process can be relaxed and how our method can be applied to non-synchronous observations. We also present an empirical study of how high-frequency correlations, regressions and covariances change through time....

  3. Co-occurrence correlations of heavy metals in sediments revealed using network analysis.

    Science.gov (United States)

    Liu, Lili; Wang, Zhiping; Ju, Feng; Zhang, Tong

    2015-01-01

In this study, correlation-based analysis was used to identify co-occurrence correlations among metals in the marine sediment of Hong Kong, based on long-term (1991 to 2011) temporal and spatial monitoring data. 14 of the 45 marine sediment monitoring stations were selected from three representative areas: Deep Bay, Victoria Harbour and Mirs Bay. First, Spearman's rank-correlation-based network analysis was conducted to identify the co-occurrence correlations of metals from the raw metadata, and then for further analysis using the normalized metadata. The correlation patterns obtained by the network were consistent with those obtained by other statistical normalization methods, including annual ratios, the R-squared coefficient and the Pearson correlation coefficient. Both Deep Bay and Victoria Harbour have been polluted by heavy metals, especially Pb and Cu, which showed strong co-occurrence with other heavy metals (e.g. Cr, Ni and Zn) and little correlation with the reference parameters (Fe or Al). For Mirs Bay, which has better marine sediment quality than Deep Bay and Victoria Harbour, the co-occurrence patterns revealed by network analysis indicated that the metals in the sediment predominantly follow natural geographic processes. Beyond its wide applications in biology, sociology and informatics, this is the first time network analysis has been applied to research on environmental pollution. This study demonstrated its powerful application for revealing the co-occurrence correlations among heavy metals in marine sediments, which could be further applied to other pollutants in various environmental systems. Copyright © 2014 Elsevier Ltd. All rights reserved.
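
A Spearman-correlation co-occurrence network of the kind used above can be sketched as follows: compute pairwise rank correlations and draw an edge where the absolute correlation exceeds a threshold. The simulated concentrations (Pb, Cu, Zn sharing an anthropogenic source; Fe as an independent reference element) and the 0.7 threshold are illustrative assumptions, not the study's data or parameters.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n = 60  # hypothetical sediment samples at one station

source = rng.lognormal(size=n)  # shared anthropogenic input
metals = {
    "Pb": source * rng.lognormal(sigma=0.1, size=n),
    "Cu": source * rng.lognormal(sigma=0.1, size=n),
    "Zn": source * rng.lognormal(sigma=0.1, size=n),
    "Fe": rng.lognormal(size=n),  # reference element, independent of the source
}

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Co-occurrence network: an edge wherever |rho| exceeds a chosen threshold
edges = [(a, b) for a, b in combinations(metals, 2)
         if abs(spearman(metals[a], metals[b])) > 0.7]
print(edges)  # the three anthropogenic metals form a clique; Fe stays isolated
```

On real monitoring data one would additionally normalize for grain size or reference elements, as the abstract notes, before building the network.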

  4. Phenomenological analysis of quantum level correlations and classical repulsion effects in SU(3) model

    International Nuclear Information System (INIS)

    Fujiwara, Shigeyasu; Sakata, Fumihiko

    2003-01-01

The quantum level fluctuation in various systems has been shown to be characterized by random matrix theory, and to be related to a regular-to-chaos transition in the classical system. We present a new qualitative analysis of quantum and classical fluctuation properties by exploiting correlation coefficients and variances. It is shown that the correlation coefficient of the quantum level density is inversely proportional to the variance of consecutive phase-space point spacings on the Poincare section plane. (author)

  5. Quantum diffraction and interference of spatially correlated photon pairs and its Fourier-optical analysis

    International Nuclear Information System (INIS)

    Shimizu, Ryosuke; Edamatsu, Keiichi; Itoh, Tadashi

    2006-01-01

    We present one- and two-photon diffraction and interference experiments involving parametric down-converted photon pairs. By controlling the divergence of the pump beam in parametric down-conversion, the diffraction-interference pattern produced by an object changes from a quantum (perfectly correlated) case to a classical (uncorrelated) one. The observed diffraction and interference patterns are accurately reproduced by Fourier-optical analysis taking into account the quantum spatial correlation. We show that the relation between the spatial correlation and the object size plays a crucial role in the formation of both one- and two-photon diffraction-interference patterns

  6. Application of the Gini correlation coefficient to infer regulatory relationships in transcriptome analysis.

    Science.gov (United States)

    Ma, Chuang; Wang, Xiangfeng

    2012-09-01

    One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey's biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses.
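
One common form of the Gini correlation coefficient (covariance of one variable with the ranks of the other, normalized by the covariance of that variable with its own ranks) can be sketched in Python as below. Note this is a generic textbook-style definition, not necessarily the exact estimator implemented in the rsgcc package; the simulated regulator/target pair is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(4)

def gini_corr(x, y):
    """Gini correlation of x with y: cov(x, rank(y)) / cov(x, rank(x)).
    Note the measure is asymmetric: gini_corr(x, y) != gini_corr(y, x) in general."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.cov(x, ry)[0, 1] / np.cov(x, rx)[0, 1]

# Hypothetical regulator/target pair with a monotone but nonlinear relation,
# mimicking skewed RNA-Seq read counts.
tf = rng.lognormal(size=200)                       # transcription factor expression
target = np.log(tf) + 0.01 * rng.normal(size=200)  # nonlinearly related target

g_gini = gini_corr(tf, target)
g_pearson = np.corrcoef(tf, target)[0, 1]
print(g_gini, g_pearson)  # Gini stays near 1; Pearson is dragged down by nonlinearity
```

This illustrates the point of the abstract: for skewed, nonlinearly related expression profiles, a rank-aware measure can recover a strong monotone regulatory relationship that the Pearson correlation understates.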

  7. Application of the Gini Correlation Coefficient to Infer Regulatory Relationships in Transcriptome Analysis[W][OA]

    Science.gov (United States)

    Ma, Chuang; Wang, Xiangfeng

    2012-01-01

    One of the computational challenges in plant systems biology is to accurately infer transcriptional regulation relationships based on correlation analyses of gene expression patterns. Despite several correlation methods that are applied in biology to analyze microarray data, concerns regarding the compatibility of these methods with the gene expression data profiled by high-throughput RNA transcriptome sequencing (RNA-Seq) technology have been raised. These concerns are mainly due to the fact that the distribution of read counts in RNA-Seq experiments is different from that of fluorescence intensities in microarray experiments. Therefore, a comprehensive evaluation of the existing correlation methods and, if necessary, introduction of novel methods into biology is appropriate. In this study, we compared four existing correlation methods used in microarray analysis and one novel method called the Gini correlation coefficient on previously published microarray-based and sequencing-based gene expression data in Arabidopsis (Arabidopsis thaliana) and maize (Zea mays). The comparisons were performed on more than 11,000 regulatory relationships in Arabidopsis, including 8,929 pairs of transcription factors and target genes. Our analyses pinpointed the strengths and weaknesses of each method and indicated that the Gini correlation can compensate for the shortcomings of the Pearson correlation, the Spearman correlation, the Kendall correlation, and the Tukey’s biweight correlation. The Gini correlation method, with the other four evaluated methods in this study, was implemented as an R package named rsgcc that can be utilized as an alternative option for biologists to perform clustering analyses of gene expression patterns or transcriptional network analyses. PMID:22797655

  8. Estimation of the biserial correlation and its sampling variance for use in meta-analysis.

    Science.gov (United States)

    Jacobs, Perke; Viechtbauer, Wolfgang

    2017-06-01

Meta-analyses are often used to synthesize the findings of studies examining the correlational relationship between two continuous variables. When only dichotomous measurements are available for one of the two variables, the biserial correlation coefficient can be used to estimate the product-moment correlation between the two underlying continuous variables. Unlike the point-biserial correlation coefficient, biserial correlation coefficients can therefore be integrated with product-moment correlation coefficients in the same meta-analysis. The present article describes the estimation of the biserial correlation coefficient for meta-analytic purposes and reports simulation results comparing different methods for estimating the coefficient's sampling variance. The findings indicate that commonly employed methods yield inconsistent estimates of the sampling variance across a broad range of research situations. In contrast, consistent estimates can be obtained using two methods that appear to be unknown in the meta-analytic literature. A variance-stabilizing transformation for the biserial correlation coefficient is described that allows for the construction of confidence intervals for individual coefficients with close to nominal coverage probabilities in most of the examined conditions. Copyright © 2016 John Wiley & Sons, Ltd.
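
The classical conversion behind the abstract, recovering the biserial correlation from the point-biserial coefficient when the dichotomy arises by thresholding a latent normal variable, can be sketched as follows. The simulation parameters (rho = 0.6, threshold at 0) are illustrative assumptions; the formula r_b = r_pb * sqrt(p*q) / phi(t) is the standard textbook relation, not the article's variance estimators.

```python
import math

import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(5)
n, rho = 20000, 0.6

# Latent bivariate normal pair; the second variable is observed only as a
# dichotomy (thresholded at 0), as in the meta-analytic setting described.
z1 = rng.normal(size=n)
z2 = rho * z1 + math.sqrt(1 - rho**2) * rng.normal(size=n)
d = (z2 > 0).astype(float)

r_pb = np.corrcoef(z1, d)[0, 1]          # point-biserial correlation
p = d.mean()                             # proportion in the 'high' group
t = NormalDist().inv_cdf(1 - p)          # implied latent threshold
r_b = r_pb * math.sqrt(p * (1 - p)) / NormalDist().pdf(t)
print(r_pb, r_b)  # r_b recovers the latent rho of about 0.6
```

Since sqrt(p*q)/phi(t) > 1, the biserial coefficient always exceeds the point-biserial one in magnitude, which is why the two must not be mixed naively in the same meta-analysis.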

  9. Correlation dimension based nonlinear analysis of network traffics with different application protocols

    International Nuclear Information System (INIS)

    Wang Jun-Song; Yuan Jing; Li Qiang; Yuan Rui-Xi

    2011-01-01

This paper uses a correlation dimension based nonlinear analysis approach to analyse the dynamics of network traffic with three different application protocols: HTTP, FTP and SMTP. First, the phase space is reconstructed and the embedding parameters are obtained by the mutual information method. Secondly, the correlation dimensions of the three different traffics are calculated, and the results demonstrate that the dynamics of the three application-protocol traffics differ from each other in nature: HTTP and FTP traffic are chaotic, with the former more complex than the latter, whereas SMTP traffic is stochastic. It is shown that the correlation dimension approach is an efficient method for understanding and characterizing the nonlinear dynamics of HTTP, FTP and SMTP protocol network traffic. This analysis provides insight into, and a more accurate understanding of, the nonlinear dynamics of internet traffic, which is a complex mixture of chaotic and stochastic components. (general)
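
The correlation dimension used above is typically estimated via the Grassberger-Procaccia correlation sum: the fraction of point pairs closer than r scales as r^D. A minimal Python sketch, using points on a circle (a set of known dimension 1) as a stand-in for a reconstructed phase-space trajectory:

```python
import numpy as np

rng = np.random.default_rng(6)

def correlation_sum(points, r):
    """Grassberger-Procaccia correlation sum C(r): the fraction of
    distinct point pairs whose distance is below r."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n = len(points)
    return (np.sum(d < r) - n) / (n * (n - 1))  # subtract the n self-pairs

# Points on the unit circle: correlation dimension 1.
theta = rng.uniform(0, 2 * np.pi, 600)
pts = np.column_stack([np.cos(theta), np.sin(theta)])

# The slope of log C(r) versus log r estimates the correlation dimension.
r1, r2 = 0.05, 0.2
dim = np.log(correlation_sum(pts, r2) / correlation_sum(pts, r1)) / np.log(r2 / r1)
print(dim)  # close to 1 for this one-dimensional set
```

For measured traffic one would first embed the scalar series into a phase space (delay embedding, with the delay chosen by the mutual information method, as the abstract describes) and then fit the slope over a scaling range of r.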

  10. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    Energy Technology Data Exchange (ETDEWEB)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com [Wageningen University, P.O. Box 338, Wageningen 6700 AH (Netherlands); Heijungs, R. [Vrije Universiteit Amsterdam, De Boelelaan 1105, Amsterdam 1081 HV (Netherlands); Leiden University, Einsteinweg 2, Leiden 2333 CC (Netherlands)

    2017-01-15

Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease the output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
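
The effect the abstract warns about can be demonstrated with a toy sampling-approach example: propagate two positively correlated inputs through an additive model, once with and once without the correlation. The model (impact = a + b) and all parameter values are illustrative assumptions, not the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

# Hypothetical LCA-style model: impact = a + b, with uncertain,
# positively correlated input parameters (rho = 0.8).
mean = np.array([1.0, 2.0])
sd = np.array([0.3, 0.4])
rho = 0.8
cov = np.array([[sd[0] ** 2, rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1] ** 2]])

a, b = rng.multivariate_normal(mean, cov, size=n).T
var_with = np.var(a + b)                     # correlation included

a0 = rng.normal(mean[0], sd[0], n)           # same marginals,
b0 = rng.normal(mean[1], sd[1], n)           # correlation ignored
var_without = np.var(a0 + b0)

# Analytically: var = sd_a^2 + sd_b^2 + 2*rho*sd_a*sd_b = 0.442 vs 0.25,
# so ignoring the positive correlation underestimates the output variance.
print(var_with, var_without)
```

With a negative rho, the inequality reverses, matching the paper's point that ignoring correlation can either under- or overestimate output variance.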

  11. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    Directory of Open Access Journals (Sweden)

    S. S. Motsa

    2014-01-01

    Full Text Available This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs. The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables were generated to present the order of accuracy of the method; convergence graphs to verify convergence of the method and error graphs are presented to show the excellent agreement between the results from this study and the known results from literature.
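
The core ingredient of the Chebyshev spectral collocation step above is the Chebyshev differentiation matrix on Gauss-Lobatto points, which turns differentiation into a matrix product. A minimal Python sketch of the standard construction (cf. Trefethen's well-known `cheb` routine), verified on a smooth test function; it is a generic building block, not the authors' full quasilinearization solver:

```python
import numpy as np

def cheb(n):
    """Chebyshev differentiation matrix D and Gauss-Lobatto points x on [-1, 1]."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)                      # collocation points
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))               # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                                   # rows sum to zero
    return D, x

D, x = cheb(16)
err = np.max(np.abs(D @ np.sin(x) - np.cos(x)))  # spectral accuracy on a smooth function
print(err)
```

In a quasilinearization scheme, each linearized PDE is then reduced to a linear algebraic system by replacing spatial derivatives with powers of D at the collocation points.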

  12. A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.

    Science.gov (United States)

    Motsa, S S; Magagula, V M; Sibanda, P

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables were generated to present the order of accuracy of the method; convergence graphs to verify convergence of the method and error graphs are presented to show the excellent agreement between the results from this study and the known results from literature.

  13. Econometric analysis of realized covariation: high frequency based covariance, regression, and correlation in financial economics

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2004-01-01

This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.
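
The realized covariation estimator itself is just sums of products of intraday returns. A minimal Python sketch on simulated high-frequency returns (the price model, correlation of 0.5, and one-second sampling are illustrative assumptions, and microstructure noise is deliberately ignored here):

```python
import numpy as np

rng = np.random.default_rng(8)
m = 23_400  # e.g. one-second returns over one trading day

# Two hypothetical efficient-price processes with correlated increments.
rho = 0.5
e1 = rng.normal(size=m)
e2 = rho * e1 + np.sqrt(1 - rho**2) * rng.normal(size=m)
scale = 0.01 / np.sqrt(m)                 # roughly 1% daily volatility
r1, r2 = scale * e1, scale * e2

# Realized (co)variation: sums of squared and cross products of returns.
rcov = np.sum(r1 * r2)
rvar1, rvar2 = np.sum(r1 ** 2), np.sum(r2 ** 2)
rcorr = rcov / np.sqrt(rvar1 * rvar2)
print(rcorr)  # consistent for the true correlation (0.5 here) as m grows
```

The asymptotics in the paper are taken over a fixed interval with m going to infinity, which is exactly the regime this sketch mimics; the companion record above (Kinnebrock and Podolskij) modifies the estimator to cope with microstructure noise.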

  14. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data.

    Science.gov (United States)

    Tan, Qihua; Thomassen, Mads; Burton, Mark; Mose, Kristian Fredløv; Andersen, Klaus Ejner; Hjelmborg, Jacob; Kruse, Torben

    2017-06-06

Modeling complex time-course patterns is a challenging issue in microarray studies due to the complex gene expression patterns that arise in response to the time-course experiment. We introduce the generalized correlation coefficient and propose a combinatory approach for detecting, testing and clustering heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis makes it a useful and efficient tool for analyzing microarray time-course data and for exploring complex relationships in omics data for studying their association with disease and health.

  15. Similarity analysis between chromosomes of Homo sapiens and monkeys with correlation coefficient, rank correlation coefficient and cosine similarity measures

    OpenAIRE

    Someswara Rao, Chinta; Viswanadha Raju, S.

    2016-01-01

    In this paper, we consider correlation coefficient, rank correlation coefficient and cosine similarity measures for evaluating similarity between Homo sapiens and monkeys. We used DNA chromosomes of genome wide genes to determine the correlation between the chromosomal content and evolutionary relationship. The similarity among the H. sapiens and monkeys is measured for a total of 210 chromosomes related to 10 species. The similarity measures of these different species show the relationship b...

  16. Correlation Factor Analysis of Retinal Microvascular Changes in Patients With Essential Hypertension

    Institute of Scientific and Technical Information of China (English)

    Huang Duru; Huang Zhongning

    2006-01-01

Objectives To investigate the correlation between retinal microvascular signs and essential hypertension classification. Methods The retinal microvascular signs in patients with essential hypertension were assessed with the indirect biomicroscopy lens; the direct and indirect ophthalmoscopes were used to determine hypertensive retinopathy grades and retinal arteriosclerosis grades. Rank correlation analysis was used to analyze the correlation of these grades with the risk factors associated with hypertension. Results Of 72 cases with essential hypertension, 28 were complicated with coronary disease, 20 with diabetes, 41 with stroke and 17 with renal malfunction. Retinal arteriosclerosis of varying extent was found in 71 cases; there was 1 case with retinal hemorrhage, 2 cases with retinal edema, 4 cases with retinal hard exudate, 5 cases with retinal hemorrhage complicated by hard exudate, 2 cases with retinal hemorrhage complicated by hard exudate and cotton-wool spots, 1 case with retinal hemorrhage complicated by hard exudate and microaneurysms, 1 case with retinal edema and hard exudate, 1 case with retinal microaneurysms, and 1 case with branch retinal vein occlusion. Rank correlation analysis showed that both hypertensive retinopathy grade and retinal arteriosclerosis grade were correlated with the risk-factor stratification of hypertension (r=0.25 or 0.31, P<0.05); other correlated factors included age and blood high-density lipoprotein for both grades, but the other parameters, namely systolic or diastolic pressure, total cholesterol, triglyceride, low-density lipoprotein cholesterol, fasting blood glucose, blood urea nitrogen and blood creatinine, were not confirmed in this correlation analysis (P > 0.05). Conclusions Both hypertensive retinopathy grade and retinal arteriosclerosis grade are closely related to the hypertension risk-factor stratification, suggesting that fundus examination of patients with

  17. Comparison between different uncertainty propagation methods in multivariate analysis: An application in the bivariate case

    International Nuclear Information System (INIS)

    Mullor, R.; Sanchez, A.; Martorell, S.; Martinez-Alzamora, N.

    2011-01-01

    Safety related systems performance optimization is classically based on quantifying the effects that testing and maintenance activities have on reliability and cost (R+C). However, R+C quantification is often incomplete in the sense that important uncertainties may not be considered. An important number of studies have been published in the last decade in the field of R+C based optimization considering uncertainties. They have demonstrated that inclusion of uncertainties in the optimization brings the decision maker insights concerning how uncertain the R+C results are and how this uncertainty does matter as it can result in differences in the outcome of the decision making process. Several methods of uncertainty propagation based on the theory of tolerance regions have been proposed in the literature depending on the particular characteristics of the variables in the output and their relations. In this context, the objective of this paper focuses on the application of non-parametric and parametric methods to analyze uncertainty propagation, which will be implemented on a multi-objective optimization problem where reliability and cost act as decision criteria and maintenance intervals act as decision variables. Finally, a comparison of results of these applications and the conclusions obtained are presented.

  18. Comparison between different uncertainty propagation methods in multivariate analysis: An application in the bivariate case

    Energy Technology Data Exchange (ETDEWEB)

    Mullor, R. [Dpto. Estadistica e Investigacion Operativa, Universidad Alicante (Spain); Sanchez, A., E-mail: aisanche@eio.upv.e [Dpto. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica Valencia, Camino de Vera s/n 46022 (Spain); Martorell, S. [Dpto. Ingenieria Quimica y Nuclear, Universidad Politecnica Valencia (Spain); Martinez-Alzamora, N. [Dpto. Estadistica e Investigacion Operativa Aplicadas y Calidad, Universidad Politecnica Valencia, Camino de Vera s/n 46022 (Spain)

    2011-06-15

    Safety-related systems performance optimization is classically based on quantifying the effects that testing and maintenance activities have on reliability and cost (R+C). However, R+C quantification is often incomplete in the sense that important uncertainties may not be considered. A considerable number of studies published in the last decade in the field of R+C-based optimization have considered uncertainties. They have demonstrated that including uncertainties in the optimization gives the decision maker insight into how uncertain the R+C results are, and that this uncertainty matters, as it can change the outcome of the decision-making process. Several methods of uncertainty propagation based on the theory of tolerance regions have been proposed in the literature, depending on the particular characteristics of the output variables and their relations. In this context, this paper focuses on the application of non-parametric and parametric methods for analyzing uncertainty propagation, implemented on a multi-objective optimization problem in which reliability and cost act as decision criteria and maintenance intervals act as decision variables. Finally, a comparison of the results of these applications and the conclusions obtained are presented.

  19. Bleed-through correction for rendering and correlation analysis in multi-colour localization microscopy

    International Nuclear Information System (INIS)

    Kim, Dahan; Curthoys, Nikki M; Parent, Matthew T; Hess, Samuel T

    2013-01-01

    Multi-colour localization microscopy has enabled sub-diffraction studies of colocalization between multiple biological species and quantification of their correlation at length scales previously inaccessible with conventional fluorescence microscopy. However, bleed-through, or misidentification of probe species, creates false colocalization and artificially increases certain types of correlation between two imaged species, affecting the reliability of information provided by colocalization and quantified correlation. Despite the potential risk of these bleed-through artefacts, neither the effect of bleed-through on correlation nor methods for its correction in correlation analyses have been systematically studied at the rates of bleed-through typically reported to affect multi-colour imaging. Here, we present a reliable method of bleed-through correction applicable to image rendering and correlation analysis of multi-colour localization microscopy. Application of our bleed-through correction shows that our method accurately corrects the artificial increase in both types of correlation studied (Pearson coefficient and pair correlation) at all rates of bleed-through tested. In particular, anti-correlation could not be quantified without our bleed-through correction, even at rates of bleed-through as low as 2%. While it is demonstrated with dichroic-based multi-colour FPALM here, our method of bleed-through correction can be applied to all types of localization microscopy (PALM, STORM, dSTORM, GSDIM, etc.), including both simultaneous and sequential multi-colour modalities, provided the rate of bleed-through can be reliably determined. (special issue article)

  20. Importance analysis for models with correlated variables and its sparse grid solution

    International Nuclear Information System (INIS)

    Li, Luyi; Lu, Zhenzhou

    2013-01-01

    For structural models involving correlated input variables, a novel interpretation of variance-based importance measures is proposed, based on the contribution of the correlated input variables to the variance of the model output. After the novel interpretation of the variance-based importance measures is compared with existing ones, two solutions for the variance-based importance measures of the correlated input variables are built on sparse grid numerical integration (SGI): the double-loop nested sparse grid integration (DSGI) method and the single-loop sparse grid integration (SSGI) method. The DSGI method solves the importance measure by decreasing the dimensionality of the input variables procedurally, while the SSGI method performs importance analysis by extending the dimensionality of the inputs. Both make full use of the advantages of SGI and are well tailored to different situations. By analyzing the results of several numerical and engineering examples, it is found that the proposed interpretation of the importance measures of the correlated input variables is reasonable, and that the proposed methods for solving the importance measures are efficient and accurate. -- Highlights: •The contribution of correlated variables to the variance of the output is analyzed. •A novel interpretation for variance-based indices of correlated variables is proposed. •Two solutions for variance-based importance measures of correlated variables are built

  1. Dysregulated Pathway Identification of Alzheimer's Disease Based on Internal Correlation Analysis of Genes and Pathways.

    Science.gov (United States)

    Kong, Wei; Mou, Xiaoyang; Di, Benteng; Deng, Jin; Zhong, Ruxing; Wang, Shuaiqun

    2017-11-20

    Dysregulated pathway identification is an important task that can give insight into the underlying biological processes of disease. Current pathway-identification methods focus on sets of co-expressed genes and single pathways, and ignore the correlation between genes and pathways. The method proposed in this study takes into account the internal correlations not only between genes but also between pathways to identify dysregulated pathways related to Alzheimer's disease (AD), the most common form of dementia. In order to find the significantly differential genes for AD, mutual information (MI) is used to measure interdependencies between genes rather than expression values. Then, by integrating topology information from KEGG, the significant pathways involving the feature genes are identified. Next, distance correlation (DC) is applied to measure the pairwise pathway crosstalk, since DC has the advantage of detecting nonlinear correlations when compared to the Pearson correlation. Finally, the pathway pairs with significantly different correlations between normal and AD samples are identified as dysregulated pathways. The molecular biology analysis demonstrated that many dysregulated pathways related to AD pathogenesis were discovered successfully by the internal correlation detection. Furthermore, insight into the dysregulated pathways in the development and deterioration of AD will help to find new effective target genes and provide important theoretical guidance for drug design.
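    Distance correlation, the measure chosen above for pathway crosstalk, can be computed directly from its definition via double-centred pairwise distance matrices. The sketch below (synthetic data, not the study's) shows the property the abstract relies on: DC detects a purely nonlinear dependence that the Pearson coefficient misses:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation (Szekely-style, biased estimator) for 1-D samples."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)

    def doubly_centered(z):
        d = np.abs(z - z.T)  # pairwise distance matrix
        return d - d.mean(axis=0) - d.mean(axis=1, keepdims=True) + d.mean()

    a, b = doubly_centered(x), doubly_centered(y)
    dcov2 = (a * b).mean()
    return np.sqrt(dcov2 / np.sqrt((a * a).mean() * (b * b).mean()))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
dcor_val = distance_correlation(x, x**2)          # clearly positive: dependence detected
pearson_val = np.corrcoef(x, x**2)[0, 1]          # near zero: Pearson misses y = x^2
print(dcor_val, pearson_val)
```

    Because `x` is symmetric about zero, the linear correlation of `x` with `x**2` vanishes while the distance correlation stays well away from zero.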

  2. Time Correlations of Lightning Flash Sequences in Thunderstorms Revealed by Fractal Analysis

    Science.gov (United States)

    Gou, Xueqiang; Chen, Mingli; Zhang, Guangshu

    2018-01-01

    By using data from the lightning detection and ranging system at the Kennedy Space Center, the temporal fractal and correlation of interevent time series of lightning flash sequences in thunderstorms have been investigated with the Allan factor (AF), Fano factor (FF), and detrended fluctuation analysis (DFA) methods. The AF, FF, and DFA methods are powerful tools to detect time-scaling structures and correlations in point processes. In total, 40 thunderstorms with the distinguishing features of a single-cell storm and an apparent increase and decrease in the total flash rate were selected for the analysis. It is found that the time-scaling exponents for the AF (αAF) and FF (αFF) analyses are 1.62 and 0.95 on average, respectively, indicating a strong time correlation of the lightning flash sequences. The DFA analysis shows that there is a crossover phenomenon: a crossover timescale (τc) ranging from 54 to 195 s, with an average of 114 s. The occurrence of a lightning flash in a thunderstorm behaves randomly at timescales < τc but shows strong time correlation at scales > τc. Physically, this may imply that the establishment of an extensive strong electric field necessary for the occurrence of a lightning flash needs a timescale > τc, over which the process is strongly time correlated, whereas the initiation of a lightning flash within a well-established extensive strong electric field may involve heterogeneities of the electric field at timescales < τc, which behave randomly.
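    The DFA procedure used above has a compact numerical core: integrate the series, detrend it piecewise, and read the scaling exponent off a log-log fit of fluctuation against window size. A minimal sketch, applied to synthetic white noise (for which the exponent should be near 0.5), not to the lightning data:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Detrended fluctuation analysis: slope of log F(s) versus log s."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segments = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        mse = []
        for seg in segments:                   # linear detrend in each window
            coef = np.polyfit(t, seg, 1)
            mse.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(mse)))    # RMS fluctuation at scale s
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(1)
white = rng.standard_normal(4096)
alpha = dfa_exponent(white, [16, 32, 64, 128, 256])
print(f"DFA exponent for white noise: {alpha:.2f}")
```

    A crossover like the one reported in the abstract would show up as two distinct slopes when the fit is done separately below and above τc.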

  3. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Chen-Hua, E-mail: shenandchen01@163.com [College of Geographical Science, Nanjing Normal University, Nanjing 210046 (China); Jiangsu Center for Collaborative Innovation in Geographical Information Resource, Nanjing 210046 (China); Key Laboratory of Virtual Geographic Environment of Ministry of Education, Nanjing 210046 (China)

    2015-12-04

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Building on both detrended cross-correlation analysis and a DFA-based multivariate linear regression (DMLR), the method adds a semipartial correlation technique, which indicates the unique contribution of an explanatory variable to the multiple correlation coefficient. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate its utility in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and that the explanatory ability of meteorological factors for API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for studying environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross-correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors for API gradually strengthens with increasing time scales.

  4. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    International Nuclear Information System (INIS)

    Shen, Chen-Hua

    2015-01-01

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Building on both detrended cross-correlation analysis and a DFA-based multivariate linear regression (DMLR), the method adds a semipartial correlation technique, which indicates the unique contribution of an explanatory variable to the multiple correlation coefficient. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate its utility in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and that the explanatory ability of meteorological factors for API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for studying environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross-correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors for API gradually strengthens with increasing time scales.

  5. The effects of observational correlated noises on multifractal detrended fluctuation analysis

    Science.gov (United States)

    Gulich, Damián; Zunino, Luciano

    2012-08-01

    We have numerically investigated the effects that observational correlated noises have on the generalized Hurst exponents, h(q), estimated by using the multifractal generalization of detrended fluctuation analysis (MF-DFA). More precisely, artificially generated stochastic binomial multifractals with increased amounts of colored noise were analyzed via MF-DFA. It has been recently shown that for moderate additions of white noise, the generalized Hurst exponents are significantly underestimated for q < 2 [On spurious and corrupted multifractality: the effects of additive noise, short-term memory and periodic trends, Physica A 390 (2011) 2480-2490]. In this paper, we have found that h(q) with q ≥ 2 are also affected when correlated noises are considered. This is due to the fact that the spurious correlations influence the scaling behaviors associated with large fluctuations. The results obtained are significant for practical situations, where noises with different correlations are inherently present.

  6. Statistical analysis of solid waste composition data: Arithmetic mean, standard deviation and correlation coefficients

    DEFF Research Database (Denmark)

    Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte

    2017-01-01

    -derived food waste amounted to 2.21 ± 3.12% with a confidence interval of (−4.03; 8.45), which highlights the problem of the biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste...... and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data......, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing mean, standard deviation and correlation coefficients....
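    The closure problem described above (percentages forced to sum to 100 inducing spurious negative correlations) is commonly handled with a log-ratio transform before any statistical analysis. A minimal sketch of the centred log-ratio (CLR) transform on hypothetical waste-fraction percentages (the fraction values below are invented for illustration):

```python
import numpy as np

def clr(compositions):
    """Centred log-ratio transform for compositional (percentage) data."""
    comp = np.asarray(compositions, dtype=float)
    comp = comp / comp.sum(axis=1, keepdims=True)   # close each row to proportions
    log_comp = np.log(comp)
    # subtract the row-wise geometric-mean log so coordinates are unconstrained
    return log_comp - log_comp.mean(axis=1, keepdims=True)

# Hypothetical waste-fraction percentages for three samples (rows sum to 100)
waste = np.array([
    [40.0, 35.0, 25.0],
    [55.0, 20.0, 25.0],
    [30.0, 50.0, 20.0],
])
z = clr(waste)
print(z.sum(axis=1))  # each row of CLR coordinates sums to ~0
```

    Correlations, means and standard deviations computed on the CLR coordinates are free of the unit-sum constraint that distorts the raw percentages.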

  7. A study on the effect of the CHF correlations to the LOCA analysis

    International Nuclear Information System (INIS)

    Kim, Ho Kee

    1998-02-01

    The critical heat flux (CHF) is a major parameter which determines cooling performance, and therefore the prediction of CHF is important for design and safety analysis in boiling systems such as nuclear reactors, conventional boilers, and other two-phase flow systems. Many CHF correlations have been developed to date, and for an actual design a correlation is selected in consideration of its characteristics. For the analysis of a Loss of Coolant Accident (LOCA) in a nuclear power plant, which exhibits drastic parameter changes during the system transient, a correlation with a reasonable degree of accuracy over a wide range is preferred over one that is accurate only for a specific range. Tangible insight into the effects of the CHF correlation on LOCA analysis is required for computer code development and nuclear regulation, and further related research is recommended. The purpose of this research is to obtain such insight and to evaluate the selected CHF correlations. To achieve these purposes, a LOCA is analysed for the ULJIN 3 and 4 nuclear power plant, the Korea Standard Type Nuclear Power Plant, and the Loss-of-Fluid Test (LOFT) L2-5 experiment is simulated using the RELAP5/MOD3.1 computer code for each selected CHF correlation. The selected correlations are the AECL-UO lookup table, adopted in the RELAP5 code; the KIT0 CHF correlation, developed by KAERI; and the original W-3 CHF correlation, developed by L.S. Tong. LOFT is also simulated using the AECL-UO lookup table with CHF multiplication factors of 0.5 and 1.5, and then compared with the results of the original lookup table and the experiment. In the LOCA analysis, the CHF correlations affect the magnitude of the peak cladding temperatures but do not seriously affect the times at which they occur.
The effect of each CHF correlation on the fuel cladding temperature behavior becomes apparent at the end of

  8. Diagrammatic analysis of correlations in polymer fluids: Cluster diagrams via Edwards' field theory

    International Nuclear Information System (INIS)

    Morse, David C.

    2006-01-01

    Edwards' functional integral approach to the statistical mechanics of polymer liquids is amenable to a diagrammatic analysis in which free energies and correlation functions are expanded as infinite sums of Feynman diagrams. This analysis is shown to lead naturally to a perturbative cluster expansion that is closely related to the Mayer cluster expansion developed for molecular liquids by Chandler and co-workers. Expansion of the functional integral representation of the grand-canonical partition function yields a perturbation theory in which all quantities of interest are expressed as functionals of a monomer-monomer pair potential, as functionals of intramolecular correlation functions of non-interacting molecules, and as functions of molecular activities. In different variants of the theory, the pair potential may be either a bare or a screened potential. A series of topological reductions yields a renormalized diagrammatic expansion in which collective correlation functions are instead expressed diagrammatically as functionals of the true single-molecule correlation functions in the interacting fluid, and as functions of molecular number density. Similar renormalized expansions are also obtained for a collective Ornstein-Zernicke direct correlation function, and for intramolecular correlation functions. A concise discussion is given of the corresponding Mayer cluster expansion, and of the relationship between the Mayer and perturbative cluster expansions for liquids of flexible molecules. The application of the perturbative cluster expansion to coarse-grained models of dense multi-component polymer liquids is discussed, and a justification is given for the use of a loop expansion. As an example, the formalism is used to derive a new expression for the wave-number dependent direct correlation function and recover known expressions for the intramolecular two-point correlation function to first-order in a renormalized loop expansion for coarse-grained models of

  9. Prospects of Frequency-Time Correlation Analysis for Detecting Pipeline Leaks by Acoustic Emission Method

    International Nuclear Information System (INIS)

    Faerman, V A; Cheremnov, A G; Avramchuk, V V; Luneva, E E

    2014-01-01

    In the current work, the relevance of developing nondestructive test methods for pipeline leak detection is considered. It is shown that acoustic emission testing is currently one of the most widespread leak detection methods. The main disadvantage of this method is that it cannot be applied to monitoring long pipeline sections, which in turn complicates and slows down the inspection of the line pipe sections of main pipelines. The prospects of developing alternative techniques and methods based on the spectral analysis of signals are considered, and their possible application to leak detection on the basis of the correlation method is outlined. As an alternative, the calculation of the time-frequency correlation function is proposed. This function represents the correlation between the spectral components of the analyzed signals. In this work, the technique for calculating the time-frequency correlation function is described. Experimental data are presented that demonstrate a clear advantage of the time-frequency correlation function over the simple correlation function: it is more effective in suppressing noise components in the frequency range of the useful signal, which makes the maximum of the function more pronounced. The main drawback of applying time-frequency correlation analysis to leak detection problems is the great number of calculations required, which may increase pipeline inspection time. However, this drawback can be partially mitigated by the development and implementation of efficient (including parallel) algorithms for computing the fast Fourier transform using the computer's central processing unit and graphics processing unit
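    The baseline correlation method the abstract compares against locates a leak from the arrival-time difference of the leak noise at two sensors. A minimal sketch of that baseline (not the paper's time-frequency function), using FFT-based cross-correlation on simulated signals with an assumed delay of 37 samples:

```python
import numpy as np

def estimate_delay(sig_a, sig_b):
    """Estimate the lag of sig_b relative to sig_a via FFT cross-correlation."""
    n = len(sig_a) + len(sig_b) - 1
    nfft = 1 << (n - 1).bit_length()                  # next power of two
    spec = np.fft.rfft(sig_a, nfft) * np.conj(np.fft.rfft(sig_b, nfft))
    xcorr = np.fft.irfft(spec, nfft)
    lags = np.arange(nfft)
    lags[lags > nfft // 2] -= nfft                    # wrap indices to signed lags
    return lags[np.argmax(xcorr)]

rng = np.random.default_rng(2)
source = rng.standard_normal(2000)                    # simulated broadband leak noise
delay = 37                                            # sample offset between sensors
sensor_a = source
sensor_b = np.roll(source, delay)                     # delayed copy at second sensor
est = estimate_delay(sensor_b, sensor_a)
print(est)
```

    Given the propagation speed in the pipe wall and the sensor spacing, the recovered lag converts directly into a leak position; the time-frequency variant described above refines this by correlating spectral components rather than raw samples.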

  10. T2 relaxation time analysis in patients with multiple sclerosis: correlation with magnetization transfer ratio

    International Nuclear Information System (INIS)

    Papanikolaou, Nickolas; Papadaki, Eufrosini; Karampekios, Spyros; Maris, Thomas; Prassopoulos, Panos; Gourtsoyiannis, Nicholas; Spilioti, Martha

    2004-01-01

    The aim of the current study was to perform T2 relaxation time measurements in multiple sclerosis (MS) patients and correlate them with magnetization transfer ratio (MTR) measurements, in order to investigate in more detail the various histopathological changes that occur in lesions and normal-appearing white matter (NAWM). A total of 291 measurements of MTR and T2 relaxation times were performed in 13 MS patients and 10 age-matched healthy volunteers. Measurements concerned MS plaques (105), NAWM (80), and "dirty" white matter (DWM; 30), evenly divided between the MS patients, and normal white matter (NWM; 76) in the healthy volunteers. Biexponential T2 relaxation-time analysis was performed, and possible linearity between MTR and mean T2 relaxation times was evaluated using linear regression analysis in all subgroups. Biexponential relaxation was more pronounced in "black-hole" lesions (16.6%) and homogeneously enhancing plaques (10%), whereas DWM, NAWM, and mildly hypointense lesions presented biexponential behavior with a lower frequency (6.6, 5, and 3.1%, respectively). Non-enhancing isointense lesions and normal white matter did not reveal any biexponential behavior. Linear regression analysis between monoexponential T2 relaxation time and MTR measurements demonstrated excellent correlation for DWM (r=-0.78, p<0.0001), very good correlation for black-hole lesions (r=-0.71, p=0.002), good correlation for isointense lesions (r=-0.60, p=0.005), moderate correlation for mildly hypointense lesions (r=-0.34, p=0.007), and non-significant correlation for homogeneously enhancing plaques, NAWM, and NWM. Biexponential T2 relaxation-time behavior is seen in only very few lesions (mainly in plaques with a high degree of demyelination and axonal loss). A strong correlation between MTR and monoexponential T2 values was found in regions where either inflammation or demyelination predominates; however, when both pathological conditions coexist, this linear

  11. Error analysis of supercritical water correlations using ATHLET system code under DHT conditions

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, J., E-mail: jeffrey.samuel@uoit.ca [Univ. of Ontario Inst. of Tech., Oshawa, ON (Canada)

    2014-07-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is used for the analysis of anticipated and abnormal plant transients, including the safety analysis of Light Water Reactors (LWRs) and Russian Graphite-Moderated High Power Channel-type Reactors (RBMKs). The range of applicability of ATHLET has been extended to supercritical water by updating the fluid- and transport-properties packages, thus enabling the code to be used in the analysis of SuperCritical Water-cooled Reactors (SCWRs). Several well-known heat-transfer correlations for supercritical fluids were added to the ATHLET code, and a numerical model was created to represent an experimental test section. In this work, the error in the Heat Transfer Coefficient (HTC) calculation by the ATHLET model is studied, along with the ability of the various correlations to predict different heat transfer regimes. (author)

  12. Multiset Canonical Correlations Analysis and Multispectral, Truly Multitemporal Remote Sensing Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2002-01-01

    This paper describes two- and multiset canonical correlations analysis (CCA) for data fusion, multi-source, multiset or multi-temporal exploratory data analysis. These techniques transform multivariate multiset data into new orthogonal variables called canonical variates (CVs) which when applied...... in remote sensing exhibit ever decreasing similarity (as expressed by correlation measures) over sets consisting of 1) spectral variables at fixed points in time (R-mode analysis), or 2) temporal variables with fixed wavelengths (T-mode analysis). The CVs are invariant to linear and affine transformations...... of the original variables within sets which means, for example, that the R-mode CVs are insensitive to changes over time in offset and gain in a measuring device. In a case study CVs are calculated from Landsat TM data with six spectral bands over six consecutive years. Both R- and T-mode CVs clearly exhibit...

  13. Research on criticality analysis method of CNC machine tools components under fault rate correlation

    Science.gov (United States)

    Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han

    2018-02-01

    In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. Then, the fault structure relations are arranged hierarchically by using the interpretive structural model (ISM). Assuming that the propagation of faults obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rate under time correlation, a comprehensive fault rate can be obtained. Based on the fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined, and the key components are identified, providing a sound basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.

  14. Correlation analysis of trace elemental data obtained from blood sera of ovarian cancer patients using PIXE

    International Nuclear Information System (INIS)

    Naidu, B.G.; Sarita, P.; Naga Raju, G.J.

    2017-01-01

    The proton induced X-ray emission (PIXE) technique is used for the analysis of trace elements present in the blood sera of ovarian cancer patients and healthy controls. This work is also intended to establish the role played by trace elements in the carcinogenic process. It is observed that the concentrations of the elements Ti, V, Cr, Mn, Fe, Ni, Rb and Sr are lower, and the concentration of Cu is higher, in the cancer patients when compared to controls. However, no change in concentration is found for the elements Co, Zn, As, Se and Br. Correlation analysis of the data using SPSS 16.0 revealed strong positive correlations between Ti-V, Ni-Co, Cu-Fe, As-Ti, Br-Ti, Br-V and Sr-Fe, while strong negative correlations are observed for Cu-Ti, As-Cu and Br-Cu. Changes in trace elemental content are probably associated with ovarian carcinogenesis. (author)

  15. Personality disorders in substance abusers: Validation of the DIP-Q through principal components factor analysis and canonical correlation analysis

    Directory of Open Access Journals (Sweden)

    Hesse Morten

    2005-05-01

    Background Personality disorders are common in substance abusers. Self-report questionnaires that can aid in the assessment of personality disorders are commonly used, but are rarely validated. Methods The Danish DIP-Q as a measure of co-morbid personality disorders in substance abusers was validated through principal components factor analysis and canonical correlation analysis. A 4-component structure was constructed based on 238 protocols, representing antagonism, neuroticism, introversion and conscientiousness. The structure was compared with (a) a 4-factor solution from the DIP-Q in a sample of Swedish drug and alcohol abusers (N = 133), and (b) a consensus 4-component solution based on a meta-analysis of published correlation matrices of dimensional personality disorder scales. Results It was found that the 4-factor model of personality was congruent across the Danish and Swedish samples, and showed good congruence with the consensus model. A canonical correlation analysis was conducted on a subset of the Danish sample with staff ratings of pathology. Three factors that correlated highly between the two variable sets were found. These variables were highly similar to the first three factors from the principal components analysis: antagonism, neuroticism and introversion. Conclusion The findings support the validity of the DIP-Q as a measure of DSM-IV personality disorders in substance abusers.

  16. Pyrcca: regularized kernel canonical correlation analysis in Python and its applications to neuroimaging

    OpenAIRE

    Natalia Y Bilenko; Jack L Gallant; Jack L Gallant

    2016-01-01

    In this article we introduce Pyrcca, an open-source Python package for performing canonical correlation analysis (CCA). CCA is a multivariate analysis method for identifying relationships between sets of variables. Pyrcca supports CCA with or without regularization, and with or without linear, polynomial, or Gaussian kernelization. We first use an abstract example to describe Pyrcca functionality. We then demonstrate how Pyrcca can be used to analyze neuroimaging data. Specifically, we use Py...
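    The core computation behind packages like Pyrcca can be sketched from scratch: whiten each variable set, then the singular values of the whitened cross-covariance are the canonical correlations. This is a minimal unregularized sketch (with a tiny ridge for numerical stability), not Pyrcca's API, on synthetic data sharing one latent signal:

```python
import numpy as np

def first_canonical_correlation(X, Y, reg=1e-8):
    """First canonical correlation between two data sets (rows = observations)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])   # within-set covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n                              # between-set covariance
    # Whitening transforms from Cholesky factors; the singular values of the
    # whitened cross-covariance are the canonical correlations.
    Wx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Cyy))
    return np.linalg.svd(Wx @ Cxy @ Wy.T, compute_uv=False)[0]

rng = np.random.default_rng(3)
latent = rng.standard_normal((300, 1))             # shared signal across both sets
X = np.hstack([latent + 0.1 * rng.standard_normal((300, 1)) for _ in range(3)])
Y = np.hstack([-latent + 0.1 * rng.standard_normal((300, 1)) for _ in range(2)])
cc = first_canonical_correlation(X, Y)
print(cc)  # close to 1: CCA recovers the shared latent signal despite the sign flip
```

    Regularized and kernelized variants, as in Pyrcca, replace the plain covariances with ridge-penalized or kernel Gram matrices but keep this same eigenstructure.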

  17. A study on association and correlation of lip and finger print pattern analysis for gender identification

    Directory of Open Access Journals (Sweden)

    Surapaneni Ratheesh Kumar Nandan

    2015-01-01

    Conclusion: Lip print analysis is a challenging area in personal identification during forensic dentistry examination. The study revealed a weak correlation, approaching significance, between lip and fingerprint patterns in gender identification. Future studies should be encouraged in the direction of software-based identification for lip and fingerprint analysis in gender identification; such studies may make use of this study pattern in a more accurate way.

  18. Generalized Correlation Coefficient for Non-Parametric Analysis of Microarray Time-Course Data

    DEFF Research Database (Denmark)

    Tan, Qihua; Thomassen, Mads; Burton, Mark

    2017-01-01

    the heterogeneous time-course gene expression patterns. Application of the method identified nonlinear time-course patterns in high agreement with parametric analysis. We conclude that the non-parametric nature of the generalized correlation analysis could be a useful and efficient tool for analyzing microarray...... time-course data and for exploring the complex relationships in omics data for studying their association with disease and health....

  19. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    Science.gov (United States)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series that relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of phase synchronization based MSTs with cross correlation based MSTs along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across the two methods: they show similar trends, upward or downward, when comparing selected network measures. Though both methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
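    A minimal sketch of the pipeline described above, under the assumption that phase synchronization is measured by the Hilbert-transform phase-locking value (the abstract does not fix a specific estimator); the helper names are hypothetical.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def phase_locking_value(x, y):
    """Phase synchronization strength between two series: the
    phase-locking value |<exp(i(phi_x - phi_y))>| of their Hilbert phases."""
    phi_x = np.angle(hilbert(x - x.mean()))
    phi_y = np.angle(hilbert(y - y.mean()))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

def ps_mst(series):
    """Minimum spanning tree over a set of time series, with edge weight
    1 - PLV so that strongly synchronized pairs get short edges."""
    k = len(series)
    dist = np.zeros((k, k))
    for i in range(k):
        for j in range(i + 1, k):
            dist[i, j] = 1.0 - phase_locking_value(series[i], series[j])
    return minimum_spanning_tree(csr_matrix(dist))  # k - 1 edges
```

    A cross-correlation-based MST would use the same tree construction with a distance such as sqrt(2(1 - r)) in place of 1 - PLV; the PLV variant also registers nonlinear phase coupling.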

  20. The discriminatory analysis about factors correlative with the early hypothyroidism after 131I therapy for hyperthyroidism

    International Nuclear Information System (INIS)

    Xiong Lingjing; Liang Changhua; Deng Haoyu; Li Xinhui; Hu Shuo

    2002-01-01

    Objective: To explore the factors correlated with early hypothyroidism after ¹³¹I therapy for Graves' hyperthyroidism, so as to treat it, decrease the occurrence of early hypothyroidism and prevent it from becoming irreversible hypothyroidism. Methods: Logistic regression discriminatory analysis with multiple factors from group data and forward stepwise selection was conducted on 11 independent variables from the clinical data of 240 hyperthyroidism patients and 1 dependent variable from follow-up data after ¹³¹I therapy. Univariate analysis of each observed factor was performed, too. Results: (1) Multivariate analysis showed that the age of patients, the weight of the thyroid, the suffering situation, the ¹³¹I absorption rate curve and the ¹³¹I dose given per gram of thyroid tissue were correlated with early hypothyroidism. Univariate analysis showed that the weight of the thyroid, the highest ¹³¹I absorption and the total ¹³¹I treatment dose were correlated with early hypothyroidism. (2) The logistic regression equation was statistically significant. (3) The positive and negative predicting accuracy for the occurrence of early hypothyroidism was 64.08% and 78.83%, respectively; the overall predicting accuracy was 72.50%. Conclusions: The ¹³¹I dose for treatment of hyperthyroidism is the key factor among the five correlative factors related to early hypothyroidism in the discriminatory classification. Enhanced follow-up and timely supplementation of thyroid hormone are important measures for preventing early hypothyroidism from becoming irreversible hypothyroidism.

  1. Denial-of-service attack detection based on multivariate correlation analysis

    NARCIS (Netherlands)

    Tan, Zhiyuan; Jamdagni, Aruna; He, Xiangjian; Nanda, Priyadarsi; Liu, Ren Ping; Lu, Bao-Liang; Zhang, Liqing; Kwok, James

    2011-01-01

    The reliability and availability of network services are being threatened by the growing number of Denial-of-Service (DoS) attacks. Effective mechanisms for DoS attack detection are demanded. Therefore, we propose a multivariate correlation analysis approach to investigate and extract second-order

  2. Digital image correlation in analysis of stiffness in local zones of welded joints

    Czech Academy of Sciences Publication Activity Database

    Milosevic, M.; Milosevic, N.J.; Sedmak, S.; Tatic, U.; Mitrovic, N.; Hloch, Sergej; Jovicic, R.

    2016-01-01

    Vol. 23, No. 1 (2016), pp. 19-24, ISSN 1330-3651. Institutional support: RVO:68145535. Keywords: Aramis software * digital image correlation * strain analysis * stiffness * welded joints. Subject RIV: JQ - Machines; Tools. Impact factor: 0.723, year: 2016. http://hrcak.srce.hr/file/225545

  3. Generalized canonical analysis based on optimizing matrix correlations and a relation with IDIOSCAL

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Cléroux, R.; Ten Berge, Jos M.F.

    1994-01-01

    Carroll's method for generalized canonical analysis of two or more sets of variables is shown to optimize the sum of squared inner-product matrix correlations between a consensus matrix and matrices with canonical variates for each set of variables. In addition, the method that analogously optimizes

  4. Generalized canonical correlation analysis of matrices with missing rows : A simulation study

    NARCIS (Netherlands)

    van de Velden, Michel; Bijmolt, Tammo H. A.

    A method is presented for generalized canonical correlation analysis of two or more matrices with missing rows. The method is a combination of Carroll's (1968) method and the missing data approach of the OVERALS technique (Van der Burg, 1988). In a simulation study we assess the performance of the

  5. Similarity analysis between chromosomes of Homo sapiens and monkeys with correlation coefficient, rank correlation coefficient and cosine similarity measures.

    Science.gov (United States)

    Someswara Rao, Chinta; Viswanadha Raju, S

    2016-03-01

    In this paper, we consider correlation coefficient, rank correlation coefficient and cosine similarity measures for evaluating similarity between Homo sapiens and monkeys. We used DNA chromosomes of genome wide genes to determine the correlation between the chromosomal content and evolutionary relationship. The similarity among the H. sapiens and monkeys is measured for a total of 210 chromosomes related to 10 species. The similarity measures of these different species show the relationship between the H. sapiens and monkey. This similarity will be helpful in theft identification, maternity identification, disease identification, etc.
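    The three similarity measures the study compares can be written in a few lines of NumPy (an illustrative sketch; the rank transform below ignores ties, unlike a full Spearman implementation):

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient of two 1-D arrays."""
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

def spearman(x, y):
    """Rank correlation: Pearson correlation of the rank-transformed
    data. This simple argsort-of-argsort ranking ignores ties."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return pearson(rx, ry)

def cosine(x, y):
    """Cosine similarity: the angle-based measure, not mean-centered."""
    return (x @ y) / (np.sqrt(x @ x) * np.sqrt(y @ y))
```

    Note the difference in invariances: Pearson and Spearman are unaffected by shifting a series by a constant, while cosine similarity is not.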

  6. Contributory fault and level of personal injury to drivers involved in head-on collisions: Application of copula-based bivariate ordinal models.

    Science.gov (United States)

    Wali, Behram; Khattak, Asad J; Xu, Jingjing

    2018-01-01

    The main objective of this study is to simultaneously investigate the degree of injury severity sustained by drivers involved in head-on collisions with respect to fault status designation. This is complicated to answer due to many issues, one of which is the potential presence of correlation between injury outcomes of drivers involved in the same head-on collision. To address this concern, we present seemingly unrelated bivariate ordered response models by analyzing the joint injury severity probability distribution of at-fault and not-at-fault drivers. Moreover, the assumption of bivariate normality of residuals and the linear form of stochastic dependence implied by such models may be unduly restrictive. To test this, Archimedean copula structures and normal mixture marginals are integrated into the joint estimation framework, which can characterize complex forms of stochastic dependencies and non-normality in residual terms. The models are estimated using 2013 Virginia police reported two-vehicle head-on collision data, where exactly one driver is at-fault. The results suggest that both at-fault and not-at-fault drivers sustained serious/fatal injuries in 8% of crashes, whereas, in 4% of the cases, the not-at-fault driver sustained a serious/fatal injury with no injury to the at-fault driver at all. Furthermore, if the at-fault driver is fatigued, apparently asleep, or has been drinking, the not-at-fault driver is more likely to sustain a severe/fatal injury, controlling for other factors and potential correlations between the injury outcomes. While not-at-fault vehicle speed affects injury severity of the at-fault driver, the effect is smaller than the effect of at-fault vehicle speed on the at-fault injury outcome. Contrarily, and importantly, the effect of at-fault vehicle speed on injury severity of the not-at-fault driver is almost equal to the effect of not-at-fault vehicle speed on the injury outcome of the not-at-fault driver. Compared to traditional ordered probability

  7. A new method to detect transitory signatures and local time/space variability structures in the climate system: the scale-dependent correlation analysis

    Science.gov (United States)

    Rodó, Xavier; Rodríguez-Arias, Miquel-Àngel

    2006-10-01

    The study of transitory signals and local variability structures in time and/or space, and their role as sources of climatic memory, is an important but often neglected topic in climate research, despite its obvious importance and extensive coverage in the literature. Transitory signals arise from non-linearities in the climate system, transitory atmosphere-ocean couplings, and other processes in the climate system evolving after a critical threshold is crossed. These temporary interactions that, though intense, may not last long, can be responsible for a large amount of unexplained variability, but are normally considered of limited relevance and often discarded. With most of the current techniques at hand, this typology of signatures is difficult to isolate because of the low signal-to-noise ratio in midlatitudes and the limited recurrence of the transitory signals during a customary interval of data considered. Also, there is often a serious problem arising from the smoothing of local or transitory processes if statistical techniques are applied that consider the full length of data available, rather than taking into account the size of the specific variability structure under investigation. Scale-dependent correlation (SDC) analysis is a new statistical method capable of highlighting the presence of transitory processes, the former being understood as temporary significant lag-dependent autocovariance in a single series, or covariance structures between two series. This approach, therefore, complements other approaches such as those resulting from the families of wavelet analysis, singular-spectrum analysis and recurrence plots. A main feature of SDC is its high performance for short time series and its ability to characterize phase relationships and thresholds in the bivariate domain. Ultimately, SDC helps track short-lagged relationships among processes that locally or temporarily couple and uncouple. The use of SDC is illustrated in the present

  8. Bivariate tensor product (p, q)-analogue of Kantorovich-type Bernstein-Stancu-Schurer operators

    Directory of Open Access Journals (Sweden)

    Qing-Bo Cai

    2017-11-01

    In this paper, we construct a bivariate tensor product generalization of Kantorovich-type Bernstein-Stancu-Schurer operators based on the concept of (p, q)-integers. We obtain moments and central moments of these operators, give the rate of convergence by using the complete modulus of continuity for the bivariate case and establish a convergence theorem for Lipschitz continuous functions. We also give some graphs and numerical examples to illustrate the convergence properties of these operators to certain functions.

  9. Tutorial on Biostatistics: Linear Regression Analysis of Continuous Correlated Eye Data.

    Science.gov (United States)

    Ying, Gui-Shuang; Maguire, Maureen G; Glynn, Robert; Rosner, Bernard

    2017-04-01

    To describe and demonstrate appropriate linear regression methods for analyzing correlated continuous eye data. We describe several approaches to regression analysis involving both eyes, including mixed effects and marginal models under various covariance structures to account for inter-eye correlation. We demonstrate, with SAS statistical software, applications in a study comparing baseline refractive error between one eye with choroidal neovascularization (CNV) and the unaffected fellow eye, and in a study determining factors associated with visual field in the elderly. When refractive error from both eyes was analyzed with standard linear regression without accounting for inter-eye correlation (adjusting for demographic and ocular covariates), the difference between eyes with CNV and fellow eyes was 0.15 diopters (D; 95% confidence interval, CI -0.03 to 0.32D, p = 0.10). Using a mixed effects model or a marginal model, the estimated difference was the same but with narrower 95% CI (0.01 to 0.28D, p = 0.03). Standard regression for visual field data from both eyes provided biased estimates of standard error (generally underestimated) and smaller p-values, while analysis of the worse eye provided larger p-values than mixed effects models and marginal models. In research involving both eyes, ignoring inter-eye correlation can lead to invalid inferences. Analysis using only right or left eyes is valid, but decreases power. Worse-eye analysis can provide less power and biased estimates of effect. Mixed effects or marginal models using the eye as the unit of analysis should be used to appropriately account for inter-eye correlation and maximize power and precision.

  10. The cross-correlation analysis of multi property of stock markets based on MM-DFA

    Science.gov (United States)

    Yang, Yujun; Li, Jianping; Yang, Yimei

    2017-09-01

    In this paper, we propose a new method called DH-MXA based on distribution histograms of Hurst surface and multiscale multifractal detrended fluctuation analysis. The method allows us to investigate the cross-correlation characteristics among multiple properties of different stock time series. It may provide a new way of measuring the nonlinearity of several signals. It also can provide a more stable and faithful description of cross-correlation of multiple properties of stocks. The DH-MXA helps us to present much richer information than multifractal detrended cross-correlation analysis and allows us to assess many universal and subtle cross-correlation characteristics of stock markets. We show DH-MXA by selecting four artificial data sets and five properties of four stock time series from different countries. The results show that our proposed method can be adapted to investigate the cross-correlation of stock markets. In general, the American stock markets are more mature and less volatile than the Chinese stock markets.

  11. Spatial correlation analysis of urban traffic state under a perspective of community detection

    Science.gov (United States)

    Yang, Yanfang; Cao, Jiandong; Qin, Yong; Jia, Limin; Dong, Honghui; Zhang, Aomuhan

    2018-05-01

    Understanding the spatial correlation of urban traffic state is essential for identifying the evolution patterns of urban traffic state. However, the distribution of traffic state always has characteristics of large spatial span and heterogeneity. This paper adapts the concept of community detection to the correlation network of urban traffic state and proposes a new perspective to identify the spatial correlation patterns of traffic state. In the proposed urban traffic network, the nodes represent road segments, and an edge between a pair of nodes is added depending on the result of significance test for the corresponding correlation of traffic state. Further, the process of community detection in the urban traffic network (named GWPA-K-means) is applied to analyze the spatial dependency of traffic state. The proposed method extends the traditional K-means algorithm in two steps: (i) redefines the initial cluster centers by two properties of nodes (the GWPA value and the minimum shortest path length); (ii) utilizes the weight signal propagation process to transfer the topological information of the urban traffic network into a node similarity matrix. Finally, numerical experiments are conducted on a simple network and a real urban road network in Beijing. The results show that GWPA-K-means algorithm is valid in spatial correlation analysis of traffic state. The network science and community structure analysis perform well in describing the spatial heterogeneity of traffic state on a large spatial scale.

  12. Multifractal temporally weighted detrended cross-correlation analysis to quantify power-law cross-correlation and its application to stock markets

    Science.gov (United States)

    Wei, Yun-Lan; Yu, Zu-Guo; Zou, Hai-Long; Anh, Vo

    2017-06-01

    A new method—multifractal temporally weighted detrended cross-correlation analysis (MF-TWXDFA)—is proposed to investigate multifractal cross-correlations in this paper. This new method is based on multifractal temporally weighted detrended fluctuation analysis and multifractal cross-correlation analysis (MFCCA). An innovation of the method is applying geographically weighted regression to estimate local trends in the nonstationary time series. We also take into consideration the sign of the fluctuations in computing the corresponding detrended cross-covariance function. To test the performance of the MF-TWXDFA algorithm, we apply it and the MFCCA method on simulated and actual series. Numerical tests on artificially simulated series demonstrate that our method can accurately detect long-range cross-correlations for two simultaneously recorded series. To further show the utility of MF-TWXDFA, we apply it on time series from stock markets and find that power-law cross-correlation between stock returns is significantly multifractal. A new coefficient, MF-TWXDFA cross-correlation coefficient, is also defined to quantify the levels of cross-correlation between two time series.
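    The full MF-TWXDFA procedure is involved, but its basic building block, a detrended cross-correlation coefficient at a single scale, can be sketched as follows, with ordinary unweighted linear detrending standing in for the paper's geographically weighted regression (so this is a simplified relative, not the authors' method):

```python
import numpy as np

def _detrended_residuals(profile, scale):
    """Split an integrated series into non-overlapping boxes of length
    `scale`, remove an ordinary linear fit from each box, and return the
    concatenated residuals."""
    n_boxes = len(profile) // scale
    t = np.arange(scale)
    resid = []
    for b in range(n_boxes):
        seg = profile[b * scale:(b + 1) * scale]
        coef = np.polyfit(t, seg, 1)       # linear trend within the box
        resid.append(seg - np.polyval(coef, t))
    return np.concatenate(resid)

def dcca_coefficient(x, y, scale):
    """Detrended cross-correlation coefficient at one window size:
    ratio of the detrended covariance to the detrended variances."""
    X = np.cumsum(x - np.mean(x))          # integrate (profile)
    Y = np.cumsum(y - np.mean(y))
    rx = _detrended_residuals(X, scale)
    ry = _detrended_residuals(Y, scale)
    return np.mean(rx * ry) / np.sqrt(np.mean(rx ** 2) * np.mean(ry ** 2))
```

    By Cauchy-Schwarz the coefficient lies in [-1, 1]; the multifractal extensions generalize the second moment here to a continuum of moment orders q.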

  13. Correlation analysis of motor current and chatter vibration in grinding using complex continuous wavelet coherence

    International Nuclear Information System (INIS)

    Liu, Yao; Wang, Xiufeng; Lin, Jing; Zhao, Wei

    2016-01-01

    Motor current is an emerging and popular signal which can be used to detect machining chatter with its multiple advantages. To achieve accurate and reliable chatter detection using motor current, it is important to make clear the quantitative relationship between motor current and chatter vibration, which has not yet been studied clearly. In this study, complex continuous wavelet coherence, including cross wavelet transform and wavelet coherence, is applied to the correlation analysis of motor current and chatter vibration in grinding. Experimental results show that complex continuous wavelet coherence performs very well in demonstrating and quantifying the intense correlation between these two signals in frequency, amplitude and phase. When chatter occurs, clear correlations in frequency and amplitude in the chatter frequency band appear and the phase difference of the current signal to the vibration signal turns from random to stable. The phase lead of the most correlated chatter frequency is the largest. With the further development of chatter, the correlation grows in intensity and expands to the higher order chatter frequency band. The analyzing results confirm that there is a consistent correlation between motor current and vibration signals in the grinding chatter process. However, to achieve accurate and reliable chatter detection using motor current, the frequency response bandwidth of the current loop of the feed drive system must be wide enough to respond to chatter effectively. (paper)

  14. Inflatable penile prosthesis implant length with baseline characteristic correlations: preliminary analysis of the PROPPER study.

    Science.gov (United States)

    Bennett, Nelson; Henry, Gerard; Karpman, Edward; Brant, William; Jones, LeRoy; Khera, Mohit; Kohler, Tobias; Christine, Brian; Rhee, Eugene; Kansas, Bryan; Bella, Anthony J

    2017-12-01

    "Prospective Registry of Outcomes with Penile Prosthesis for Erectile Restoration" (PROPPER) is a large, multi-institutional, prospective clinical study to collect, analyze, and report real-world outcomes for men implanted with penile prosthetic devices. We prospectively correlated co-morbid conditions and demographic data with implanted penile prosthesis size to enable clinicians to better predict implanted penis size following penile implantation. We present many new data points for the first time in the literature and postulate that radical prostatectomy (RP) is negatively correlated with penile corporal length. Patient demographics, medical history, baseline characteristics and surgical details were compiled prospectively. Pearson correlation coefficients were generated for the correlations between demographics, etiology of ED, duration of ED, co-morbid conditions, pre-operative penile length (flaccid and stretched) and length of implanted penile prosthesis. Multivariate analysis was performed to define predictors of implanted prosthesis length. From June 2011 to June 2017, 1,135 men underwent primary implantation of a penile prosthesis at a total of 11 study sites. Malleable (Spectra), 2-piece Ambicor, and 3-piece AMS 700 CX/LGX devices were included in the analysis. The most common patient comorbidities were CV disease (26.1%), DM (11.1%), and PD (12.4%). Primary etiology of ED: RP (27.4%), DM (20.3%), CVD (18.0%), PD (10.3%), priapism (1.4%), others (22.6%). Mean duration of ED is 6.2 ± 4.1 years. Implant length was weakly negatively correlated with White/Caucasian ethnicity (r = -0.18; P …). Prosthesis length is negatively correlated with some ethnic groups, prostatectomy, and incontinence. Positive correlates include CV disease, preoperative stretched penile length, and flaccid penile length.

  15. Correlation function analysis of the COBE differential microwave radiometer sky maps

    Energy Technology Data Exchange (ETDEWEB)

    Lineweaver, Charles Howe [Univ. of California, Berkeley, CA (United States). Space Sciences Lab.

    1994-08-01

    The Differential Microwave Radiometer (DMR) aboard the COBE satellite has detected anisotropies in the cosmic microwave background (CMB) radiation. A two-point correlation function analysis which helped lead to this discovery is presented in detail. The results of a correlation function analysis of the two-year DMR data set are presented. The first and second year data sets are compared and found to be reasonably consistent. The positive correlation for separation angles less than ~20° is robust to Galactic latitude cuts and is very stable from year to year. The Galactic latitude cut independence of the correlation function is strong evidence that the signal is not Galactic in origin. The statistical significance of the structure seen in the correlation function of the first, second and two year maps is respectively > 9σ, > 10σ and > 18σ above the noise. The noise in the DMR sky maps is correlated at a low level. The structure of the pixel temperature covariance matrix is given. The noise covariance matrix of a DMR sky map is diagonal to an accuracy of better than 1%. For a given sky pixel, the dominant noise covariance occurs with the ring of pixels at an angular separation of 60° due to the 60° separation of the DMR horns. The mean covariance at 60° is 0.45% (+0.18/-0.14) of the mean variance. The noise properties of the DMR maps are thus well approximated by the noise properties of maps made by a single-beam experiment. Previously published DMR results are not significantly affected by correlated noise.

  16. Correlation Between Minimum Apparent Diffusion Coefficient (ADCmin) and Tumor Cellularity: A Meta-analysis.

    Science.gov (United States)

    Surov, Alexey; Meyer, Hans Jonas; Wienke, Andreas

    2017-07-01

    Diffusion-weighted imaging (DWI) is a magnetic resonance imaging (MRI) technique based on the measurement of water diffusion that can provide information about tissue microstructure, especially about cell count. An increase in cell density restricts water diffusion and decreases the apparent diffusion coefficient (ADC). ADC can be divided into three sub-parameters: minimum ADC (ADCmin), mean ADC (ADCmean) and maximum ADC (ADCmax). Some studies have suggested that ADCmin shows stronger correlations with cell count in comparison to other ADC fractions and may be used as a parameter for estimation of tumor cellularity. The aim of the present meta-analysis was to summarize correlation coefficients between ADCmin and cellularity in different tumors based on large patient data. For this analysis, the MEDLINE database was screened for associations between ADC and cell count in different tumors up to September 2016. For this work, only data regarding ADCmin were included. Overall, 12 publications with 317 patients were identified. Spearman's correlation coefficient was used to analyze associations between ADCmin and cellularity. The Pearson correlation coefficients reported in some publications were converted into Spearman correlation coefficients. The pooled correlation coefficient for all included studies was ρ = -0.59 (95% confidence interval (CI) = -0.72 to -0.45), heterogeneity Tau² = 0.04 (p …). ADCmin correlated moderately with tumor cellularity. The calculated correlation coefficient is not stronger in comparison to the reported coefficient for ADCmean and, therefore, ADCmin does not represent a better means to reflect cellularity. Copyright © 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
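    The pooling step used in such meta-analyses, converting correlations with Fisher's z-transform and inverse-variance weighting, can be sketched as follows; this is a simplified fixed-effect stand-in, not the exact model used in the study:

```python
import numpy as np

def pool_correlations(r, n):
    """Fixed-effect pooling of correlation coefficients via Fisher's
    z-transform, weighting each study by n_i - 3 (since var(z) = 1/(n-3)).
    Returns the pooled correlation and its 95% confidence interval."""
    r = np.asarray(r, dtype=float)
    n = np.asarray(n, dtype=float)
    z = np.arctanh(r)                       # Fisher z-transform
    w = n - 3.0                             # inverse-variance weights
    z_pooled = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
    return np.tanh(z_pooled), ci            # back-transform to r scale
```

    A random-effects model, as commonly used when heterogeneity (Tau²) is non-zero, would additionally add a between-study variance component to each weight.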

  17. Effective and Efficient Correlation Analysis with Application to Market Basket Analysis and Network Community Detection

    Science.gov (United States)

    Duan, Lian

    2012-01-01

    Finding the most interesting correlations among items is essential for problems in many commercial, medical, and scientific domains. For example, what kinds of items should be recommended with regard to what has been purchased by a customer? How to arrange the store shelf in order to increase sales? How to partition the whole social network into…

  18. Correlation, path analysis and heritability estimation for agronomic traits contribute to yield on soybean

    Science.gov (United States)

    Sulistyo, A.; Purwantoro; Sari, K. P.

    2018-01-01

    Selection is a routine activity in plant breeding programs that must be done by plant breeders in obtaining superior plant genotypes. The use of appropriate selection criteria will determine the effectiveness of selection activities. The purpose of this study was to analyze the heritable agronomic traits that contribute to soybean yield. A total of 91 soybean lines were planted in Muneng Experimental Station, Probolinggo District, East Java Province, Indonesia in 2016. All soybean lines were arranged in a randomized complete block design with two replicates. Correlation analysis, path analysis and heritability estimation were performed on days to flowering, days to maturity, plant height, number of branches, number of fertile nodes, number of filled pods, weight of 100 seeds, and yield to determine selection criteria for the soybean breeding program. The results showed that the heritability values of almost all observed agronomic traits are high, except for the number of fertile nodes, which has low heritability. Correlation analysis shows that days to flowering, plant height and number of fertile nodes have positive correlations with seed yield per plot (0.056, 0.444, and 0.100, respectively). In addition, path analysis showed that plant height and number of fertile nodes have the highest positive direct effects on soybean yield. Based on these results, plant height can be selected as one of the selection criteria in a soybean breeding program to obtain high-yielding soybean varieties.
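    The path-analysis decomposition behind such studies can be sketched from a correlation matrix alone: the direct effects are the standardized partial regression coefficients, and each trait-yield correlation splits into a direct effect plus indirect effects through correlated traits. The inter-trait correlation used in the usage example below is a hypothetical value for illustration; only the trait-yield correlations (0.444 for plant height, 0.100 for fertile nodes) come from the abstract.

```python
import numpy as np

def path_coefficients(Rxx, rxy):
    """Path analysis on standardized variables: direct effects are
    b = Rxx^{-1} rxy, and each correlation decomposes as
    r_iy = b_i + sum_{j != i} r_ij * b_j (direct + indirect effects)."""
    b = np.linalg.solve(Rxx, rxy)   # direct effects
    indirect = Rxx @ b - b          # indirect effects via correlated traits
    return b, indirect
```

    For example, with a hypothetical inter-trait correlation of 0.3 between plant height and number of fertile nodes, `path_coefficients(np.array([[1.0, 0.3], [0.3, 1.0]]), np.array([0.444, 0.100]))` shows that most of the height-yield correlation is a direct effect, while the fertile-node correlation is largely indirect.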

  19. Neutron Activation Analysis and Moessbauer Correlations of Archaeological Pottery from Amazon Basin for Classification Studies

    International Nuclear Information System (INIS)

    Bellido, A. V. B.; Latini, R. M.; Nicoli, I.; Scorzelli, R. B.; Solorzano, P. M.

    2011-01-01

    The aim of the present work was to investigate the correlation between data obtained by means of two analytical methods, instrumental neutron activation analysis (INAA) and Moessbauer spectroscopy, on pottery samples, combined with multivariate statistical analysis in order to optimize quantitative analysis in classification studies. The ceramics were recently discovered at archaeological earth circular structure sites in Acre state, Brazil. 199 samples were analyzed by INAA, allowing simultaneous determination of the chemical concentrations of twenty elements, and 44 samples by Moessbauer spectroscopy, allowing the determination of fourteen hyperfine parameters. For the correlation study, data were treated by two multivariate statistical methods: cluster analysis for the classification and principal component analysis for the data correlations. INAA data show that some of the REE (rare earth elements) were the discriminating variables for this technique. Moessbauer parameters that exhibit the same behavior are being investigated; a remarkable improvement can be seen for the combined REE and Moessbauer variables, showing good results considering the limited number of samples. This data matrix is being used to aid the classification and provenance studies of prehistoric ceramics of the Amazon basin.

  20. Network analysis reveals that bacteria and fungi form modules that correlate independently with soil parameters.

    Science.gov (United States)

    de Menezes, Alexandre B; Prendergast-Miller, Miranda T; Richardson, Alan E; Toscas, Peter; Farrell, Mark; Macdonald, Lynne M; Baker, Geoff; Wark, Tim; Thrall, Peter H

    2015-08-01

    Network and multivariate statistical analyses were performed to determine interactions between bacterial and fungal community terminal restriction length polymorphisms as well as soil properties in paired woodland and pasture sites. Canonical correspondence analysis (CCA) revealed that shifts in woodland community composition correlated with soil dissolved organic carbon, while changes in pasture community composition correlated with moisture, nitrogen and phosphorus. Weighted correlation network analysis detected two distinct microbial modules per land use. Bacterial and fungal ribotypes did not group separately; rather, all modules comprised both bacterial and fungal ribotypes. Woodland modules had a similar fungal : bacterial ribotype ratio, while in the pasture, one module was fungal dominated. There was no correspondence between pasture and woodland modules in their ribotype composition. The modules had different relationships to soil variables, and these contrasts were not detected without the use of network analysis. This study demonstrated that fungi and bacteria, components of the soil microbial communities usually treated as separate functional groups as in a CCA approach, were co-correlated and formed distinct associations in these adjacent habitats. Understanding these distinct modular associations may shed more light on their niche space in the soil environment, and allow a more realistic description of soil microbial ecology and function. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.

  1. Robustness analysis of geodetic networks in the case of correlated observations

    Directory of Open Access Journals (Sweden)

    Mevlut Yetkin

Full Text Available GPS (or GNSS) networks are invaluable tools for monitoring natural hazards such as earthquakes. However, blunders in GPS observations may be mistakenly interpreted as deformation. Therefore, robust networks are needed when monitoring deformation with GPS networks. Robustness analysis is a natural merger of reliability and strain, and is defined as the ability to resist deformations caused by the maximum undetectable errors as determined from internal reliability analysis. However, to obtain rigorously correct results, the correlations among the observations must be considered when computing maximum undetectable errors. Therefore, we propose to use normalized reliability numbers instead of redundancy numbers (Baarda's approach) in the robustness analysis of a GPS network. A simple mathematical relation is derived showing the ratio between the uncorrelated and correlated cases for the maximum undetectable error. The same ratio is also valid for the displacements. Numerical results show that if correlations among observations are ignored, dramatically different displacements can be obtained, depending on the size of the multiple correlation coefficients. Furthermore, when normalized reliability numbers are small, displacements become large, i.e., observations with low reliability numbers cause bigger displacements than observations with high reliability numbers.
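The reliability quantities behind this analysis can be sketched for the simple uncorrelated case. The example below uses an invented design matrix and weight matrix (not from the paper) to compute Baarda's redundancy numbers and the corresponding minimal detectable blunders; the normalized reliability numbers proposed in the paper additionally account for correlations through the full observation covariance matrix.

```python
import numpy as np

# Toy network adjustment: 4 observations, 2 unknowns (invented example).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, 1.0],
              [1.0, 1.0]])
P = np.diag([1.0, 1.0, 2.0, 0.5])  # weight matrix, uncorrelated case

# Redundancy matrix R = I - A (A^T P A)^{-1} A^T P;
# its diagonal holds Baarda's redundancy numbers r_i.
N = A.T @ P @ A
R = np.eye(4) - A @ np.linalg.solve(N, A.T @ P)
r = np.diag(R)

# Minimal detectable blunder per observation (internal reliability):
# MDB_i = delta0 * sigma_i / sqrt(r_i), with sigma_i = 1/sqrt(p_i) and
# delta0 = 4.13 (Baarda's non-centrality parameter for alpha0=0.1%, beta=80%).
delta0 = 4.13
mdb = delta0 / np.sqrt(np.diag(P) * r)

print("redundancy numbers:", np.round(r, 3), "sum:", round(r.sum(), 3))
print("minimal detectable blunders:", np.round(mdb, 2))
```

A useful check: the redundancy numbers always lie in [0, 1] and sum to the total redundancy n - u (here 4 - 2 = 2).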

  2. Neural Network-Based Coronary Heart Disease Risk Prediction Using Feature Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jae Kwon Kim

    2017-01-01

Full Text Available Background. Of the machine learning techniques used in predicting coronary heart disease (CHD), neural networks (NN) are popular for improving performance accuracy. Objective. Even though NN-based systems provide meaningful results based on clinical experiments, medical experts are not satisfied with their predictive performance because an NN is trained in a “black-box” style. Method. We devised an NN-based prediction of CHD risk using feature correlation analysis (NN-FCA) in two stages. First, in the feature selection stage, features are ranked according to their importance in predicting CHD risk; second, in the feature correlation analysis stage, the correlations between feature relations and the output of each NN predictor are determined. Result. Of the 4146 individuals in the Korean dataset evaluated, 3031 had low CHD risk and 1115 had high CHD risk. The area under the receiver operating characteristic (ROC) curve of the proposed model (0.749 ± 0.010) was larger than that of the Framingham risk score (FRS) (0.393 ± 0.010). Conclusions. The proposed NN-FCA, which utilizes feature correlation analysis, was found to be better than the FRS in terms of CHD risk prediction. Furthermore, the proposed model resulted in a larger ROC curve and more accurate predictions of CHD risk in the Korean population than the FRS.
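The area-under-the-ROC-curve comparison used above can be reproduced in outline. The sketch below computes AUC via the Mann-Whitney rank statistic for two hypothetical risk scores on a synthetic cohort (not the Korean dataset): an informative score should clearly outrank a random one.

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, labels):
    """AUC as the probability that a randomly chosen positive case
    outranks a randomly chosen negative case (Mann-Whitney U)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()  # ties count one half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Synthetic cohort: label 1 = high CHD risk, 0 = low risk (invented data).
labels = (rng.random(2000) < 0.27).astype(int)
informative = labels + rng.normal(0, 1.2, size=2000)  # score tracking the label
uninformative = rng.normal(0, 1.0, size=2000)         # score ignoring the label

auc_inf = auc(informative, labels)
auc_rand = auc(uninformative, labels)
print("informative AUC:", round(auc_inf, 3))
print("random-score AUC:", round(auc_rand, 3))
```

The random score hovers near the chance level of 0.5, while the informative score lands well above it, mirroring the FRS-versus-NN-FCA comparison in structure only.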

  3. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    Science.gov (United States)

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  4. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    Directory of Open Access Journals (Sweden)

    Martin A. Proescholdt

    2017-01-01

Full Text Available Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  5. Bottomside sinusoidal irregularities in the equatorial F region. II - Cross-correlation and spectral analysis

    Science.gov (United States)

    Cragin, B. L.; Hanson, W. B.; Mcclure, J. P.; Valladares, C. E.

    1985-01-01

    Equatorial bottomside sinusoidal (BSS) irregularities have been studied by applying techniques of cross-correlation and spectral analysis to the Atmosphere Explorer data set. The phase of the cross-correlations of the plasma number density is discussed and the two drift velocity components observed using the retarding potential analyzer and ion drift meter on the satellite are discussed. Morphology is addressed, presenting the geographical distributions of the occurrence of BSS events for the equinoxes and solstices. Physical processes including the ion Larmor flux, interhemispheric plasma flows, and variations in the lower F region Pedersen conductivity are invoked to explain the findings.

  6. Correlation analysis of quantum fluctuations and repulsion effects of classical dynamics in SU(3) model

    International Nuclear Information System (INIS)

    Fujiwara, Shigeyasu; Sakata, Fumihiko

    2003-01-01

In many quantum systems, random matrix theory has been used to characterize quantum level fluctuations, which are known to be the quantum correspondent of a regular-to-chaos transition in classical systems. We present a new qualitative analysis of quantum and classical fluctuation properties by exploiting correlation coefficients and variances. It is shown that the correlation coefficient of the quantum level density is roughly inversely proportional to the variance of consecutive phase-space point spacings on the Poincare section plane. (author)

  7. Detection of non-stationary leak signals at NPP primary circuit by cross-correlation analysis

    International Nuclear Information System (INIS)

    Shimanskij, S.B.

    2007-01-01

A leak-detection system employing high-temperature microphones has been developed for the RBMK and ATR (Japan) reactors. Further improvement of the system focused on using cross-correlation analysis of the spectral components of the signal to detect a small leak at an early stage of development. Since envelope processes are less affected by distortions than are wave processes, they give a higher degree of correlation and can be used to detect leaks with lower signal-to-noise ratios. Many simulation tests performed at nuclear power plants have shown that the proposed methods can be used to detect and locate a small leak.
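The envelope cross-correlation idea can be sketched with synthetic signals: a delayed, noisy copy of a band-limited burst stands in for the signal at a second microphone, envelopes are taken from the analytic signal, and the lag of the cross-correlation peak estimates the propagation delay (and hence the leak position). All signals and parameters below are invented for illustration.

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(X * h))

rng = np.random.default_rng(1)
fs = 10_000                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# Band-limited acoustic burst (a stand-in for leak noise).
burst = np.exp(-((t - 0.3) / 0.02) ** 2) * np.sin(2 * np.pi * 800 * t)

true_delay = 250                               # samples between the two sensors
s1 = burst + 0.3 * rng.normal(size=len(t))
s2 = np.roll(burst, true_delay) + 0.3 * rng.normal(size=len(t))

# Cross-correlate the demeaned envelopes; the peak lag estimates the delay.
e1 = envelope(s1) - envelope(s1).mean()
e2 = envelope(s2) - envelope(s2).mean()
xc = np.correlate(e2, e1, mode="full")
lag = np.argmax(xc) - (len(t) - 1)
print("estimated delay (samples):", lag)
```

With known sensor spacing and sound speed, this lag converts directly to a leak location estimate.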

  8. Bivariate threshold models for genetic evaluation of susceptibility to and ability to recover from mastitis in Danish Holstein cows.

    Science.gov (United States)

    Welderufael, B G; Janss, L L G; de Koning, D J; Sørensen, L P; Løvendahl, P; Fikse, W F

    2017-06-01

Mastitis in dairy cows is an unavoidable problem, and genetic variation in recovery from mastitis, in addition to susceptibility, is therefore of interest. Genetic parameters for susceptibility to and recovery from mastitis were estimated for Danish Holstein-Friesian cows using data from automatic milking systems equipped with online somatic cell count measuring units. The somatic cell count measurements were converted to elevated mastitis risk, a continuous variable (on a 0-1 scale) indicating the risk of mastitis. Risk values >0.6 were assumed to indicate that a cow had mastitis. For each cow and lactation, the sequence of health states (mastitic or healthy) was converted to a weekly transition: 0 if the cow stayed within the same state and 1 if the cow changed state. The result was 2 series of transitions: one for healthy to diseased (HD, to model mastitis susceptibility) and the other for diseased to healthy (DH, to model recovery ability). The 2 series of transitions were analyzed with bivariate threshold models, including several systematic effects and a function of time. The model included effects of herd, parity, herd-test-week and permanent environment (to account for the repetitive nature of transition records from a cow), plus two time-varying effects (lactation stage and time within episode). In early lactation there was an increased risk of getting mastitis, but the risk remained stable afterwards. Mean recovery rate was 45% per lactation. Heritabilities were 0.07 [posterior mean of standard deviations (PSD) = 0.03] for HD and 0.08 (PSD = 0.03) for DH. The genetic correlation between HD and DH had a posterior mean of -0.83 (PSD = 0.13). Although susceptibility to and recovery from mastitis are strongly negatively correlated, recovery can be considered as a new trait for selection. © The Authors. Published by the Federation of Animal Science Societies and Elsevier Inc. on behalf of the American Dairy Science Association®.
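The data preparation described above, converting a weekly health-state sequence into the HD (susceptibility) and DH (recovery) transition series, can be sketched directly. The 0.6 risk threshold follows the abstract; the weekly risk values are invented.

```python
# Weekly elevated-mastitis-risk values for one cow and lactation (invented).
risk = [0.1, 0.2, 0.7, 0.8, 0.4, 0.3, 0.9, 0.5]
states = ["D" if r > 0.6 else "H" for r in risk]  # D = mastitic, H = healthy

hd = []  # healthy -> diseased transitions (models susceptibility)
dh = []  # diseased -> healthy transitions (models recovery ability)
for prev, cur in zip(states, states[1:]):
    if prev == "H":
        hd.append(1 if cur == "D" else 0)  # cow was healthy: did she fall ill?
    else:
        dh.append(1 if cur == "H" else 0)  # cow was diseased: did she recover?

print("states:", "".join(states))
print("HD series:", hd)
print("DH series:", dh)
```

Each cow-lactation thus contributes one HD record per healthy week and one DH record per diseased week, which is what the bivariate threshold model then analyzes.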

  9. Uncertainty evaluation in correlated quantities: application to elemental analysis of atmospheric aerosols

    International Nuclear Information System (INIS)

    Espinosa, A.; Miranda, J.; Pineda, J. C.

    2010-01-01

One of the aspects that is frequently overlooked in the evaluation of uncertainty in experimental data is the possibility that the quantities involved are correlated among themselves, due to different causes. An example is the elemental analysis of atmospheric aerosols using techniques like X-ray fluorescence (XRF) or particle-induced X-ray emission (PIXE). In these cases, the measured elemental concentrations are highly correlated, and are then used to obtain information about other variables, such as the contributions from emitting sources related to soil, sulfate, non-soil potassium or organic matter. This work describes, as an example, the method required to evaluate the uncertainty in variables determined from correlated quantities for a set of atmospheric aerosol samples collected in the Metropolitan Area of the Mexico Valley and analyzed with PIXE. The work is based on the recommendations of the Guide to the Expression of Uncertainty in Measurement published by the International Organization for Standardization. (Author)
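The GUM-style propagation applied here can be illustrated for a quantity derived from correlated inputs: for a function f of the measured concentrations, the combined variance is u² = J Σ Jᵀ, where J is the Jacobian and Σ the covariance matrix of the inputs. The concentrations, uncertainties and correlation coefficient below are invented, and f is taken as a simple sum standing in for a source-contribution proxy.

```python
import numpy as np

# Invented concentrations of two correlated elemental markers,
# their standard uncertainties, and their correlation coefficient.
c = np.array([10.0, 4.0])
u = np.array([0.5, 0.3])
rho = 0.8

Sigma = np.array([[u[0] ** 2,          rho * u[0] * u[1]],
                  [rho * u[0] * u[1],  u[1] ** 2        ]])

# Derived quantity f = c1 + c2, so the Jacobian is (1, 1).
J = np.array([1.0, 1.0])
var_corr = J @ Sigma @ J        # propagation with the covariance term
var_naive = (u ** 2).sum()      # what ignoring the correlation would give

print("u(f) with correlation:", round(float(np.sqrt(var_corr)), 4))
print("u(f) ignoring it:     ", round(float(np.sqrt(var_naive)), 4))
```

With positively correlated inputs the naive quadrature sum underestimates the combined uncertainty, which is precisely the pitfall the abstract warns about.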

  10. Multifractal analysis of the long-range correlations in the cardiac dynamics of Drosophila melanogaster

    International Nuclear Information System (INIS)

    Vitanov, Nikolay K.; Yankulova, Elka D.

    2006-01-01

By means of multifractal detrended fluctuation analysis (MFDFA) we investigate long-range correlations in the interbeat time series of heart activity of Drosophila melanogaster, the classical object of research in genetics. Our main investigation tools are the fractal spectra f(α) and h(q), by means of which we trace the correlation properties of Drosophila heartbeat dynamics for three consecutive generations of species. We observe that, in contrast to the case of humans, the time series of the heartbeat activity of healthy Drosophila do not have scaling properties. Time series from species with genetic defects can be long-range correlated. Different kinds of genetic heart defects lead to different shapes of the fractal spectra. The fractal heartbeat dynamics of Drosophila is transferred from generation to generation.
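The detrended fluctuation analysis underlying MFDFA can be sketched in its monofractal form (the q = 2 special case): integrate the demeaned series, detrend it in windows of size s, and read the scaling exponent off the slope of log F(s) versus log s. The sketch below uses synthetic noise rather than interbeat data; for white noise the exponent should be near 0.5, and for a random walk near 1.5.

```python
import numpy as np

def dfa(x, scales):
    """Monofractal DFA with linear detrending; returns the scaling exponent
    (slope of log F(s) vs log s), the q = 2 special case of MFDFA."""
    y = np.cumsum(x - np.mean(x))          # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        f2 = []
        t = np.arange(s)
        for i in range(n):
            seg = y[i * s:(i + 1) * s]
            coef = np.polyfit(t, seg, 1)   # per-window linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(2)
scales = np.array([16, 32, 64, 128, 256])
alpha_noise = dfa(rng.normal(size=10_000), scales)            # uncorrelated
alpha_walk = dfa(np.cumsum(rng.normal(size=10_000)), scales)  # strongly persistent
print(round(alpha_noise, 2), round(alpha_walk, 2))
```

MFDFA generalizes this by raising the per-window fluctuations to a power q before averaging, which is what yields the spectra f(α) and h(q) discussed in the abstract.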

  11. Solar activity and terrestrial climate: an analysis of some purported correlations

    DEFF Research Database (Denmark)

    Laut, Peter

    2003-01-01

The last decade has seen a revival of various hypotheses claiming a strong correlation between solar activity and a number of terrestrial climate parameters: links between cosmic rays and cloud cover, first total cloud cover and then only low clouds, and between solar cycle lengths and Northern... claimed to support solar hypotheses. My analyses show that the apparent strong correlations displayed on these graphs have been obtained by an incorrect handling of the physical data. Since the graphs are still widely referred to in the literature and their misleading character has not yet been generally... the existence of important links between solar activity and terrestrial climate. Such links have over the years been demonstrated by many authors. The sole objective of the present analysis is to draw attention to the fact that some of the widely publicized, apparent correlations do not properly reflect...

  12. Analysis of angiography findings in cerebral arteriovenous malformations: Correlation with hemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Hyoung; Kim, Hyung Jin; Jung, Jin Myung; Ha, Choong Kun; Chung, Sung Hoon [Gyeongsang National University College of Medicine, Chinju (Korea, Republic of)

    1993-07-15

Intracerebral hemorrhage is the most serious complication of cerebral arteriovenous malformations (AVM). To identify angiographic characteristics of AVM which correlate with a history of hemorrhage, we retrospectively analyzed the angiographic findings of 25 patients with AVM. Nine characteristics were evaluated: nidus size, location, arterial aneurysm, intranidal aneurysm, angiomatous change, venous drainage pattern, venous stenosis, delayed drainage and venous ectasia. The characteristics were correlated with hemorrhage, which was seen in 18 (72%) patients on CT or MR images. Venous stenosis (P<0.5) and delayed venous drainage (P<0.5) correlated well with a history of hemorrhage. Arterial aneurysm and intranidal aneurysm also showed a tendency toward hemorrhage, although they did not prove to be statistically significant. Detailed analysis of the angiographic findings of AVM is important for recognition of the characteristics related to hemorrhage, and may contribute to establishing a prognosis and planning treatment.

  13. Characteristic analysis on UAV-MIMO channel based on normalized correlation matrix.

    Science.gov (United States)

    Gao, Xi jun; Chen, Zi li; Hu, Yong Jiang

    2014-01-01

Based on the three-dimensional GBSBCM (geometrically based double bounce cylinder model) channel model of MIMO for unmanned aerial vehicles (UAV), a simple form of the UAV space-time-frequency channel correlation function which includes the LOS, SPE, and DIF components is presented. By the methods of channel matrix decomposition and coefficient normalization, the analytic formula of the UAV-MIMO normalized correlation matrix is deduced. This formula can be used directly to analyze the condition number of the UAV-MIMO channel matrix, the channel capacity, and other characteristic parameters. The simulation results show that this channel correlation matrix can be applied to comprehensively describe the changes of UAV-MIMO channel characteristics under different parameter settings. This analysis method provides a theoretical basis for improving the transmission performance of the UAV-MIMO channel, and shows the practical application value of MIMO technology in the field of UAV communication.

  14. Design, demonstration and analysis of a modified wavelength-correlating receiver for incoherent OCDMA system.

    Science.gov (United States)

    Zhou, Heng; Qiu, Kun; Wang, Leyang

    2011-03-28

A novel wavelength-correlating receiver for incoherent Optical Code Division Multiple Access (OCDMA) systems is proposed and demonstrated in this paper. Enabled by the wavelength-conversion-based scheme, the proposed receiver can support various code types, including one-dimensional optical codes and time-spreading/wavelength-hopping two-dimensional codes. Also, a synchronous detection scheme with time-to-wavelength based code acquisition is proposed, by which code acquisition time can be substantially reduced. Moreover, a novel data-validation methodology based on all-optical pulse-width monitoring is introduced for the wavelength-correlating receiver. An experimental demonstration of the newly proposed receiver is presented, and low bit-error-rate data reception is achieved without optical hard limiting or electronic power thresholding. For the first time, a detailed theoretical performance analysis specialized for the wavelength-correlating receiver is presented. Numerical results show that the overall performance of the proposed receiver surpasses that of conventional OCDMA receivers.

  15. Nonlinear Analysis on Cross-Correlation of Financial Time Series by Continuum Percolation System

    Science.gov (United States)

    Niu, Hongli; Wang, Jun

We establish a financial price process by a continuum percolation system, in which we attribute price fluctuations to the investors' attitudes towards the financial market, and treat the clusters in continuum percolation as groups of investors sharing the same investment opinion. We investigate the cross-correlations in two return time series and analyze the multifractal behaviors in this relationship. Further, we study the corresponding behaviors for the real stock indexes of SSE and HSI as well as the liquid stock pair of SPD and PAB by comparison. To quantify the multifractality in the cross-correlation relationship, we employ the multifractal detrended cross-correlation analysis method to perform an empirical study on the simulation data and the real market data.

  16. A canonical correlation analysis based EMG classification algorithm for eliminating electrode shift effect.

    Science.gov (United States)

    Zhe Fan; Zhong Wang; Guanglin Li; Ruomei Wang

    2016-08-01

Motion classification systems based on surface electromyography (sEMG) pattern recognition have achieved good results under experimental conditions, but clinical implementation and practical application remain a challenge. Many factors contribute to the difficulty of clinical use of EMG-based dexterous control; the most obvious and important is the noise in the EMG signal caused by electrode shift, muscle fatigue, motion artifacts, the inherent instability of the signal, and biological signals such as the electrocardiogram. In this paper, a novel method based on Canonical Correlation Analysis (CCA) was developed to eliminate the reduction in classification accuracy caused by electrode shift. The average classification accuracy of our method was above 95% for the healthy subjects. In the process, we validated the influence of electrode shift on motion classification accuracy and found a strong correlation, with a correlation coefficient above 0.9, between shifted-position data and normal-position data.
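Canonical Correlation Analysis itself is compact enough to sketch: the canonical correlations are the singular values of the whitened cross-covariance matrix Sxx^(-1/2) Sxy Syy^(-1/2). The data below are synthetic stand-ins for two feature sets sharing common sources (loosely analogous to shifted- and normal-position EMG features), not recordings from the paper.

```python
import numpy as np

def canonical_correlations(X, Y, eps=1e-10):
    """Canonical correlations between column sets X and Y:
    singular values of Sxx^{-1/2} Sxy Syy^{-1/2}."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = len(X)
    Sxx = Xc.T @ Xc / n + eps * np.eye(X.shape[1])  # ridge for stability
    Syy = Yc.T @ Yc / n + eps * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / n

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)

rng = np.random.default_rng(3)
latent = rng.normal(size=(500, 2))                   # two shared sources
X = latent @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(500, 4))
Y = latent @ rng.normal(size=(2, 3)) + 0.1 * rng.normal(size=(500, 3))
rho = canonical_correlations(X, Y)
print("canonical correlations:", np.round(rho, 3))
```

Two canonical correlations come out near 1 (the two shared sources) and the remainder near 0, which is the kind of shared-structure detection the paper exploits across electrode positions.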

  17. Correlation among the spectral parameters for qualitative analysis of Alpha Liquid Scintillation Spectra

    International Nuclear Information System (INIS)

    Bhade, Sonali P.D.; Reddy, P.J.; Kolekar, R.V.; Singh, Rajvir; Pradeepkumar, K.S.

    2014-01-01

The potential of the alpha LSC technique is nowadays widely recognized. However, the energy resolution for α particles is poor with liquid scintillators. Moreover, α peak positions are influenced by the level of quenching in the samples. To overcome this problem, a thorough study of all the parameters that affect the spectral information was carried out. Parameters such as peak centroid, quenching, % resolution and α-particle energy were investigated, and the correlations between them were evaluated. In the present work, a qualitative analysis of the α spectrum was carried out. Correlations between the α-particle energy and the various parameters affecting the peaks of the collected spectra with respect to quenching were established. These correlations will be useful for deconvolution studies of composite samples containing different alpha radionuclides.

  18. Outage Performance Analysis of Cooperative Diversity with MRC and SC in Correlated Lognormal Channels

    Directory of Open Access Journals (Sweden)

    Skraparlis D

    2009-01-01

Full Text Available The study of relaying systems has found renewed interest in the context of cooperative diversity for communication channels suffering from fading. This paper provides analytical expressions for the end-to-end SNR and outage probability of cooperative diversity in correlated lognormal channels, typically found in indoor and specific outdoor environments. The system under consideration utilizes decode-and-forward relaying and Selection Combining or Maximum Ratio Combining at the destination node. The provided expressions are used to evaluate the gains of cooperative diversity compared to noncooperation in correlated lognormal channels, taking into account the spectral and energy efficiency of the protocols and the half-duplex or full-duplex capability of the relay. Our analysis demonstrates that correlation and lognormal variances play a significant role in the performance gain of cooperative diversity over noncooperation.

  19. fCCAC: functional canonical correlation analysis to evaluate covariance between nucleic acid sequencing datasets.

    Science.gov (United States)

    Madrigal, Pedro

    2017-03-01

    Computational evaluation of variability across DNA or RNA sequencing datasets is a crucial step in genomic science, as it allows both to evaluate reproducibility of biological or technical replicates, and to compare different datasets to identify their potential correlations. Here we present fCCAC, an application of functional canonical correlation analysis to assess covariance of nucleic acid sequencing datasets such as chromatin immunoprecipitation followed by deep sequencing (ChIP-seq). We show how this method differs from other measures of correlation, and exemplify how it can reveal shared covariance between histone modifications and DNA binding proteins, such as the relationship between the H3K4me3 chromatin mark and its epigenetic writers and readers. An R/Bioconductor package is available at http://bioconductor.org/packages/fCCAC/ . pmb59@cam.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  20. Analyzing the Cross-Correlation Between Onshore and Offshore RMB Exchange Rates Based on Multifractal Detrended Cross-Correlation Analysis (MF-DCCA)

    Science.gov (United States)

    Xie, Chi; Zhou, Yingying; Wang, Gangjin; Yan, Xinguo

We use the multifractal detrended cross-correlation analysis (MF-DCCA) method to explore the multifractal behavior of the cross-correlation between the exchange rates of onshore RMB (CNY) and offshore RMB (CNH) against the US dollar (USD). The empirical data are daily prices of CNY/USD and CNH/USD from May 1, 2012 to February 29, 2016. The results demonstrate that: (i) the cross-correlation between CNY/USD and CNH/USD is persistent, and its fluctuation is smaller when the order of the fluctuation function is negative than when it is positive; (ii) the multifractal behavior of the cross-correlation between CNY/USD and CNH/USD is significant during the sample period; (iii) the dynamic Hurst exponents obtained by rolling-window analysis show that the cross-correlation is stable when the global economic situation is good and volatile when it is bad; and (iv) the non-normal distribution of the original data has a greater effect on the multifractality of the cross-correlation between CNY/USD and CNH/USD than the temporal correlation.
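A minimal, non-multifractal version of detrended cross-correlation analysis conveys the core of MF-DCCA: with the order fixed at q = 2 it reduces to a single DCCA cross-correlation coefficient per scale, the detrended covariance normalized by the two detrended variances. Synthetic series with a shared driver stand in for the CNY/USD and CNH/USD rates below.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """DCCA cross-correlation coefficient at window size s (q = 2 case)."""
    X = np.cumsum(x - x.mean())            # integrated profiles
    Y = np.cumsum(y - y.mean())
    n = len(X) // s
    t = np.arange(s)
    f2x = f2y = f2xy = 0.0
    for i in range(n):
        xs = X[i * s:(i + 1) * s]
        ys = Y[i * s:(i + 1) * s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # detrend per window
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += np.mean(rx * rx)
        f2y += np.mean(ry * ry)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)

rng = np.random.default_rng(4)
common = rng.normal(size=4000)             # shared driver, like a common USD factor
x = common + 0.5 * rng.normal(size=4000)
y = common + 0.5 * rng.normal(size=4000)
z = rng.normal(size=4000)                  # unrelated series for contrast

rho_xy = dcca_coefficient(x, y, 50)
rho_xz = dcca_coefficient(x, z, 50)
print("coupled:  ", round(rho_xy, 2))
print("unrelated:", round(rho_xz, 2))
```

MF-DCCA extends this by raising the windowed fluctuations to varying orders q, and the dependence of the resulting exponents on q is what signals multifractality.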

  1. Canonical correlation analysis of synchronous neural interactions and cognitive deficits in Alzheimer's dementia

    Science.gov (United States)

    Karageorgiou, Elissaios; Lewis, Scott M.; Riley McCarten, J.; Leuthold, Arthur C.; Hemmy, Laura S.; McPherson, Susan E.; Rottunda, Susan J.; Rubins, David M.; Georgopoulos, Apostolos P.

    2012-10-01

    In previous work (Georgopoulos et al 2007 J. Neural Eng. 4 349-55) we reported on the use of magnetoencephalographic (MEG) synchronous neural interactions (SNI) as a functional biomarker in Alzheimer's dementia (AD) diagnosis. Here we report on the application of canonical correlation analysis to investigate the relations between SNI and cognitive neuropsychological (NP) domains in AD patients. First, we performed individual correlations between each SNI and each NP, which provided an initial link between SNI and specific cognitive tests. Next, we performed factor analysis on each set, followed by a canonical correlation analysis between the derived SNI and NP factors. This last analysis optimally associated the entire MEG signal with cognitive function. The results revealed that SNI as a whole were mostly associated with memory and language, and, slightly less, executive function, processing speed and visuospatial abilities, thus differentiating functions subserved by the frontoparietal and the temporal cortices. These findings provide a direct interpretation of the information carried by the SNI and set the basis for identifying specific neural disease phenotypes according to cognitive deficits.

  2. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
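The pooled-covariance Mahalanobis step can be sketched as follows: a pooled sample covariance is formed from artery and vein ROI features, and a feature vector is assigned to the class with the smaller Mahalanobis distance. The two features (arrival time, peak enhancement) and all numbers are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented 2-D features (contrast arrival time, peak enhancement) per ROI class.
artery = rng.normal([2.0, 9.0], [0.5, 0.8], size=(60, 2))
vein = rng.normal([5.0, 6.0], [0.5, 0.8], size=(60, 2))

# Pooled sample covariance matrix of the two classes.
Sa = np.cov(artery, rowvar=False)
Sv = np.cov(vein, rowvar=False)
S_pooled = ((len(artery) - 1) * Sa + (len(vein) - 1) * Sv) \
           / (len(artery) + len(vein) - 2)
S_inv = np.linalg.inv(S_pooled)

def mahalanobis2(x, mu):
    """Squared Mahalanobis distance of x from class mean mu."""
    d = x - mu
    return d @ S_inv @ d

voxel = np.array([2.2, 8.5])   # an unclassified voxel's feature vector
d_art = mahalanobis2(voxel, artery.mean(axis=0))
d_vein = mahalanobis2(voxel, vein.mean(axis=0))
label = "artery" if d_art < d_vein else "vein"
print(label, round(float(d_art), 2), round(float(d_vein), 2))
```

Because the pooled covariance accounts for the spread and correlation of the features, this distance weighs an early-arriving, strongly enhancing voxel toward the arterial class even when the raw Euclidean distances are similar.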

  3. Robust and sparse correlation matrix estimation for the analysis of high-dimensional genomics data.

    Science.gov (United States)

    Serra, Angela; Coretto, Pietro; Fratello, Michele; Tagliaferri, Roberto; Stegle, Oliver

    2018-02-15

Microarray technology can be used to study the expression of thousands of genes across a number of different experimental conditions, usually hundreds. The underlying principle is that genes sharing similar expression patterns across different samples can be part of the same co-expression system, or may share the same biological functions. Groups of genes are usually identified based on cluster analysis. Clustering methods rely on the similarity matrix between genes. A common choice to measure similarity is to compute the sample correlation matrix. Dimensionality reduction is another popular data analysis task which is also based on covariance/correlation matrix estimates. Unfortunately, covariance/correlation matrix estimation suffers from the intrinsic noise present in high-dimensional data. Sources of noise are: sampling variations, the presence of outlying sample units, and the fact that in most cases the number of genes is much larger than the number of units. In this paper, we propose a robust correlation matrix estimator that is regularized based on adaptive thresholding. The resulting method jointly tames the effects of high-dimensionality and data contamination. Computations are easy to implement and do not require hand tuning. Both simulated and real data are analyzed. A Monte Carlo experiment shows that the proposed method is capable of remarkable performance. Our correlation metric is more robust to outliers than the existing alternatives in two gene expression datasets. It is also shown how the regularization allows spurious correlations to be automatically detected and filtered. The same regularization is also extended to other less robust correlation measures. Finally, we apply the ARACNE algorithm to the SynTReN gene expression data. Sensitivity and specificity of the reconstructed network are compared with the gold standard. We show that ARACNE performs better when it takes the proposed correlation matrix estimator as input. The R
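The thresholding idea behind this kind of regularization can be sketched in its simplest, hard-threshold form (the paper's estimator is adaptive and robust, which this sketch is not): off-diagonal entries of the sample correlation matrix below a noise-level cutoff are zeroed, filtering correlations consistent with sampling noise. The data are synthetic, with a planted block of five co-expressed "genes".

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 40, 100                 # few samples, many variables: noisy estimates

# Planted structure: the first 5 genes share a common expression driver.
base = rng.normal(size=(n, 1))
X = rng.normal(size=(n, p))
X[:, :5] += 2.0 * base

R = np.corrcoef(X, rowvar=False)

# Hard-threshold regularization: zero small off-diagonal entries.
# 2/sqrt(n) is a rough two-sigma level for a null sample correlation.
tau = 2.0 / np.sqrt(n)
R_reg = np.where(np.abs(R) >= tau, R, 0.0)
np.fill_diagonal(R_reg, 1.0)

kept = int((np.abs(R_reg) > 0).sum()) - p      # surviving off-diagonal entries
print("off-diagonal entries kept:", kept, "of", p * (p - 1))
```

The planted block survives intact while the bulk of the spurious entries are set to zero; an adaptive scheme would additionally let the cutoff vary per entry.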

  4. Correlates of Unwanted Births in Bangladesh: A Study through Path Analysis.

    Science.gov (United States)

    Roy, Tapan Kumar; Singh, Brijesh P

    2016-01-01

Unwanted birth is an important public health concern due to its negative association with adverse outcomes for mothers and children as well as the socioeconomic development of a country. Although a number of studies have investigated the determinants of unwanted births through logistic regression analysis, an extensive assessment using a path model is lacking. In the current study, we applied path analysis to identify the important covariates of unwanted births in Bangladesh. The study used data extracted from the Bangladesh Demographic and Health Survey (BDHS) 2011. It considered a sub-sample of 7,972 women who had given their most recent birth in the five years preceding the date of interview or who were pregnant at the time of the survey. Correlation analysis was used to find the significant associations with unwanted births. This study identified the factors affecting unwanted births in Bangladesh. The path model was used to determine the direct, indirect and total effects of socio-demographic factors on unwanted births. The results showed that more than one-tenth of recent births in Bangladesh were unwanted. The differentials of unwanted births were women's age, education, age at marriage, religion, socioeconomic status, exposure to mass media and use of family planning. The correlation analysis showed that unwanted births were positively and significantly correlated with women's age and place of residence. On the contrary, unwanted births were significantly inversely correlated with education and social status. The total effects of endogenous variables such as women's age, place of residence and use of family planning methods had a favorable effect on unwanted births.
Policymakers and program planners need to design programs and services carefully to reduce unwanted births in Bangladesh; in particular, services should focus on helping those groups of women identified in the analysis as being at increased risk of unwanted births: older women

  5. Correlates of Unwanted Births in Bangladesh: A Study through Path Analysis.

    Directory of Open Access Journals (Sweden)

    Tapan Kumar Roy

    Full Text Available Unwanted birth is an important public health concern due to its negative association with adverse outcomes for mothers and children as well as the socioeconomic development of a country. Although a number of studies have investigated the determinants of unwanted births through logistic regression analysis, an extensive assessment using a path model is lacking. In the current study, we applied path analysis to identify the important covariates of unwanted births in Bangladesh. The study used data extracted from the Bangladesh Demographic and Health Survey (BDHS) 2011. It considered a sub-sample of 7,972 women who had given their most recent births in the five years preceding the date of interview or who were currently pregnant at the survey time. Correlation analysis was used to find significant associations with unwanted births. This study identified the factors affecting unwanted births in Bangladesh. The path model was used to determine the direct, indirect and total effects of socio-demographic factors on unwanted births. The results showed that more than one-tenth of recent births were unwanted in Bangladesh. The differentials of unwanted births were women's age, education, age at marriage, religion, socioeconomic status, exposure to mass media and use of family planning. The correlation analysis showed that unwanted births were positively and significantly correlated with women's age and place of residence. On the contrary, unwanted births were significantly inversely correlated with education and social status. The total effects of endogenous variables such as women's age, place of residence and use of family planning methods had a favorable effect on unwanted births. Policymakers and program planners need to design programs and services carefully to reduce unwanted births in Bangladesh; in particular, services should focus on helping those groups of women who were identified in the analysis as being at increased risk of unwanted

  6. Correlation of quantitative histopathological morphology and quantitative radiological analysis during aseptic loosening of hip endoprostheses.

    Science.gov (United States)

    Bertz, S; Kriegsmann, J; Eckardt, A; Delank, K-S; Drees, P; Hansen, T; Otto, M

    2006-01-01

    Aseptic hip prosthesis loosening is the most important long-term complication in total hip arthroplasty. Polyethylene (PE) wear is the dominant etiologic factor in aseptic loosening, which together with other factors induces mechanisms resulting in bone loss, and finally in implant loosening. The single-shot radiograph analysis (EBRA, abbreviation for the German term "Einzel-Bild-Röntgenanalyse") is a computerized method for early radiological prediction of aseptic loosening. In this study, EBRA parameters were correlated with histomorphological parameters of the periprosthetic membrane. Periprosthetic membranes obtained from 19 patients during revision surgery of loosened ABG I-type total hip prostheses were analyzed histologically and morphometrically. The pre-existing EBRA parameters, the thickness of the PE debris layer and the dimension of inclination and anteversion, were compared with the density of macrophages and giant cells. Additionally, the semiquantitatively determined density of lymphocytes, plasma cells, giant cells and the size of the necrotic areas were correlated with the EBRA results. All periprosthetic membranes were classified as debris-induced type membranes. We found a positive correlation between the number of giant cells and the thickness of the PE debris layer. There was no significant correlation between the number of macrophages or all semiquantitative parameters and EBRA parameters. The number of giant cells decreased with implant duration. The morphometrically measured number of foreign body giant cells more closely reflects the results of the EBRA. The semiquantitative estimation of giant cell density could not substitute for the morphometrical analysis. The density of macrophages, lymphocytes, plasma cells and the size of necrotic areas did not correlate with the EBRA parameters, indicating that there is no correlation with aseptic loosening.

  7. Improving risk estimates of runoff producing areas: formulating variable source areas as a bivariate process.

    Science.gov (United States)

    Cheng, Xiaoya; Shaw, Stephen B; Marjerison, Rebecca D; Yearick, Christopher D; DeGloria, Stephen D; Walter, M Todd

    2014-05-01

    Predicting runoff producing areas and their corresponding risks of generating storm runoff is important for developing watershed management strategies to mitigate non-point source pollution. However, few methods for making these predictions have been proposed, especially operational approaches that would be useful in areas where variable source area (VSA) hydrology dominates storm runoff. The objective of this study is to develop a simple approach to estimate spatially-distributed risks of runoff production. By considering the development of overland flow as a bivariate process, we incorporated both rainfall and antecedent soil moisture conditions into a method for predicting VSAs based on the Natural Resource Conservation Service-Curve Number equation. We used base-flow immediately preceding storm events as an index of antecedent soil wetness status. Using nine sub-basins of the Upper Susquehanna River Basin, we demonstrated that our estimated runoff volumes and extent of VSAs agreed with observations. We further demonstrated a method for mapping these areas in a Geographic Information System using a Soil Topographic Index. The proposed methodology provides a new tool for watershed planners for quantifying runoff risks across watersheds, which can be used to target water quality protection strategies. Copyright © 2014 Elsevier Ltd. All rights reserved.
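The Curve Number machinery underlying this approach can be sketched as follows. This is only the standard NRCS-CN runoff equation; the paper's bivariate rainfall/antecedent-wetness formulation and the Soil Topographic Index mapping are not reproduced here:

```python
def scs_runoff(P, CN):
    """Storm runoff depth (inches) from the standard NRCS Curve Number
    equation, with the conventional initial abstraction Ia = 0.2 * S.

    P:  storm rainfall depth in inches
    CN: curve number (0 < CN <= 100); a higher CN reflects wetter
        antecedent conditions or less pervious cover, hence more runoff
    """
    S = 1000.0 / CN - 10.0   # potential maximum retention (inches)
    Ia = 0.2 * S             # initial abstraction before runoff begins
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P + 0.8 * S)

# Wetter antecedent conditions (higher CN) produce more runoff
# for the same 3-inch storm:
dry, wet = scs_runoff(3.0, 70), scs_runoff(3.0, 90)
```

Treating rainfall depth and an antecedent-wetness index (the paper uses pre-storm base flow) as a joint, bivariate input then amounts to evaluating runoff risk over a grid of (rainfall, wetness) pairs rather than for a single design storm.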

  8. A Basic Bivariate Structure of Personality Attributes Evident Across Nine Languages.

    Science.gov (United States)

    Saucier, Gerard; Thalmayer, Amber Gayle; Payne, Doris L; Carlson, Robert; Sanogo, Lamine; Ole-Kotikash, Leonard; Church, A Timothy; Katigbak, Marcia S; Somer, Oya; Szarota, Piotr; Szirmák, Zsofia; Zhou, Xinyue

    2014-02-01

    Here, two studies seek to characterize a parsimonious common-denominator personality structure with optimal cross-cultural replicability. Personality differences are observed in all human populations and cultures, but lexicons for personality attributes contain so many distinctions that parsimony is lacking. Models stipulating the most important attributes have been formulated by experts or by empirical studies drawing on experience in a very limited range of cultures. Factor analyses of personality lexicons of nine languages of diverse provenance (Chinese, Korean, Filipino, Turkish, Greek, Polish, Hungarian, Maasai, and Senoufo) were examined, and their common structure was compared to that of several prominent models in psychology. A parsimonious bivariate model showed evidence of substantial convergence and ubiquity across cultures. Analyses involving key markers of these dimensions in English indicate that they are broad dimensions involving the overlapping content of the interpersonal circumplex, models of communion and agency, and morality/warmth and competence. These "Big Two" dimensions, Social Self-Regulation and Dynamism, provide a common-denominator model involving the two most crucial axes of personality variation, ubiquitous across cultures. The Big Two might serve as an umbrella model linking diverse theoretical models and associated research literatures. © 2013 Wiley Periodicals, Inc.

  9. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    Science.gov (United States)

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.
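A zero-inflated Poisson mixes a point mass at zero with a Poisson count. The sketch below is a deliberately simplified stand-in for the model in this record: the hierarchical Bayes machinery and semiparametric covariate structure are omitted, and the bivariate dependence is reduced to a shared structural zero (e.g. a site unsuitable for both species). All names here are illustrative:

```python
import math
import random

def zip_pmf(k, lam, pi):
    """P(Y = k) for a zero-inflated Poisson: with probability pi the
    count is a structural zero, otherwise it is Poisson(lam)."""
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi + (1.0 - pi) * pois
    return (1.0 - pi) * pois

def poisson_draw(rng, lam):
    """Knuth's multiplication sampler (adequate for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_bivariate_zip(n, lam1, lam2, pi, seed=1):
    """Draw n count pairs: a shared Bernoulli(pi) structural zero zeroes
    both counts at once, otherwise the counts are independent Poissons."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        if rng.random() < pi:
            pairs.append((0, 0))
        else:
            pairs.append((poisson_draw(rng, lam1), poisson_draw(rng, lam2)))
    return pairs

pairs = simulate_bivariate_zip(5000, lam1=2.0, lam2=3.0, pi=0.3)
```

The excess of (0, 0) pairs relative to two independent Poissons is exactly the zero inflation that standard GLMs fail to capture.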

  10. Applying Emax model and bivariate thin plate splines to assess drug interactions.

    Science.gov (United States)

    Kong, Maiying; Lee, J Jack

    2010-01-01

    We review the semiparametric approach previously proposed by Kong and Lee and extend it to a case in which the dose-effect curves follow the Emax model instead of the median effect equation. When the maximum effects for the investigated drugs are different, we provide a procedure to obtain the additive effect based on the Loewe additivity model. Then, we apply a bivariate thin plate spline approach to estimate the effect beyond additivity along with its 95 per cent point-wise confidence interval as well as its 95 per cent simultaneous confidence interval for any combination dose. Thus, synergy, additivity, and antagonism can be identified. The advantages of the method are that it provides an overall assessment of the combination effect on the entire two-dimensional dose space spanned by the experimental doses, and it enables us to identify complex patterns of drug interaction in combination studies. In addition, this approach is robust to outliers. To illustrate this procedure, we analyzed data from two case studies.
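The Emax side of the approach is easy to sketch. The thin plate spline surface itself is not reproduced here; the snippet below only shows the Emax (Hill) curve, its inverse, and the Loewe interaction index that the additivity baseline is built on (function and parameter names are illustrative):

```python
def emax_effect(d, e0, emax, ed50, h=1.0):
    """Emax (Hill) dose-effect model: the effect rises from e0 toward
    e0 + emax, reaching the halfway point at dose ed50."""
    return e0 + emax * d ** h / (ed50 ** h + d ** h)

def emax_inverse(y, e0, emax, ed50, h=1.0):
    """Dose producing effect y (requires e0 < y < e0 + emax)."""
    f = (y - e0) / (emax - (y - e0))
    return ed50 * f ** (1.0 / h)

def loewe_index(d1, d2, y, params1, params2):
    """Loewe interaction index for a combination (d1, d2) observed to
    produce effect y: index < 1 suggests synergy, = 1 additivity,
    > 1 antagonism."""
    D1 = emax_inverse(y, *params1)  # dose of drug 1 alone giving effect y
    D2 = emax_inverse(y, *params2)  # dose of drug 2 alone giving effect y
    return d1 / D1 + d2 / D2
```

The semiparametric part of the method then models the departure from this additive surface with a bivariate thin plate spline, so synergy and antagonism can vary across the dose space.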

  11. Modeling both of the number of pausibacillary and multibacillary leprosy patients by using bivariate poisson regression

    Science.gov (United States)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with a contribution of 18,994 people (8.7% of the world total). This places Indonesia as the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: pausibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of pausibacillary leprosy. This paper discusses modeling the numbers of both multibacillary and pausibacillary leprosy patients as response variables. These responses are count variables, so modeling is conducted using the bivariate Poisson regression method. The experimental units are in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the result indicates that all predictors have a significant influence.
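The bivariate Poisson distribution behind such a regression is commonly built by trivariate reduction: a shared Poisson component induces positive correlation between the two counts. A small illustrative simulation (not the paper's fitted model):

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's multiplication sampler (adequate for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def bivariate_poisson_draw(rng, lam1, lam2, lam0):
    """Trivariate reduction: Y1 = X1 + X0 and Y2 = X2 + X0 with
    independent Poisson components, so Cov(Y1, Y2) = lam0."""
    x0 = poisson_draw(rng, lam0)
    return poisson_draw(rng, lam1) + x0, poisson_draw(rng, lam2) + x0

rng = random.Random(42)
pairs = [bivariate_poisson_draw(rng, 2.0, 3.0, 1.0) for _ in range(20000)]
m1 = sum(a for a, _ in pairs) / len(pairs)   # near lam1 + lam0 = 3
m2 = sum(b for _, b in pairs) / len(pairs)   # near lam2 + lam0 = 4
cov = sum((a - m1) * (b - m2) for a, b in pairs) / len(pairs)  # near lam0
```

In a bivariate Poisson regression each rate is linked to covariates, e.g. log lam1 = x'beta1 and log lam2 = x'beta2, so the pausibacillary and multibacillary counts are modeled jointly rather than in two separate Poisson regressions.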

  12. THE BIVARIATE SIZE-LUMINOSITY RELATIONS FOR LYMAN BREAK GALAXIES AT z {approx} 4-5

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Kuang-Han; Su, Jian [Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Ferguson, Henry C. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Ravindranath, Swara, E-mail: kuanghan@pha.jhu.edu [The Inter-University Center for Astronomy and Astrophysics, Pune University Campus, Pune 411007, Maharashtra (India)

    2013-03-01

    We study the bivariate size-luminosity distribution of Lyman break galaxies (LBGs) selected at redshifts around 4 and 5 in the GOODS and HUDF fields. We model the size-luminosity distribution as a combination of a log-normal distribution (in size) and a Schechter function (in luminosity), which enables a more detailed study of the selection effects. We perform extensive simulations to quantify the dropout-selection completenesses and the measurement biases and uncertainties in two-dimensional size and magnitude bins, and transform the theoretical size-luminosity distribution into the expected distribution for the observed data. Using a maximum-likelihood estimator, we find that the Schechter function parameters for B435-dropouts are consistent with the values in the literature, but the size distributions are wider than expected from the angular momentum distribution of the underlying dark matter halos. The slope of the size-luminosity (RL) relation is similar to those found for local disk galaxies, but considerably shallower than for local early-type galaxies.

  13. THE BIVARIATE SIZE-LUMINOSITY RELATIONS FOR LYMAN BREAK GALAXIES AT z ∼ 4-5

    International Nuclear Information System (INIS)

    Huang, Kuang-Han; Su, Jian; Ferguson, Henry C.; Ravindranath, Swara

    2013-01-01

    We study the bivariate size-luminosity distribution of Lyman break galaxies (LBGs) selected at redshifts around 4 and 5 in the GOODS and HUDF fields. We model the size-luminosity distribution as a combination of a log-normal distribution (in size) and a Schechter function (in luminosity), which enables a more detailed study of the selection effects. We perform extensive simulations to quantify the dropout-selection completenesses and the measurement biases and uncertainties in two-dimensional size and magnitude bins, and transform the theoretical size-luminosity distribution into the expected distribution for the observed data. Using a maximum-likelihood estimator, we find that the Schechter function parameters for B435-dropouts are consistent with the values in the literature, but the size distributions are wider than expected from the angular momentum distribution of the underlying dark matter halos. The slope of the size-luminosity (RL) relation is similar to those found for local disk galaxies, but considerably shallower than for local early-type galaxies.
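The bivariate model described in these two records factors into a Schechter function in luminosity and a log-normal in size at fixed luminosity. A minimal sketch of that density (the parameter names and the power-law size-luminosity scaling r0 * (L/L*)**beta are illustrative, not the fitted values from the paper):

```python
import math

def schechter(L, phi_star, L_star, alpha):
    """Schechter luminosity function, per unit luminosity."""
    x = L / L_star
    return (phi_star / L_star) * x ** alpha * math.exp(-x)

def size_luminosity_density(r, L, phi_star, L_star, alpha, r0, beta, sigma):
    """Bivariate density: Schechter in luminosity L, times a log-normal
    in size r whose median scales with luminosity as r0 * (L/L_star)**beta
    (log-normal width sigma, in ln size)."""
    mu = math.log(r0) + beta * math.log(L / L_star)
    lognormal = math.exp(-(math.log(r) - mu) ** 2 / (2.0 * sigma ** 2)) / (
        r * sigma * math.sqrt(2.0 * math.pi))
    return schechter(L, phi_star, L_star, alpha) * lognormal
```

Selection effects enter by multiplying this density with a completeness function estimated per (size, magnitude) bin before maximizing the likelihood over the observed sample.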

  14. Correlates of perceived stigma for people living with epilepsy: A meta-analysis.

    Science.gov (United States)

    Shi, Ying; Wang, Shouqi; Ying, Jie; Zhang, Meiling; Liu, Pengcheng; Zhang, Huanhuan; Sun, Jiao

    2017-05-01

    Epilepsy, one of the most common, serious chronic neurological diseases, is accompanied by different levels of perceived stigma that affect people in almost all age groups. This stigma can negatively impact the physical and mental health of people living with epilepsy (PLWE). Good knowledge of perceived stigma for PLWE is important. In this study, we conducted a meta-analysis to identify the correlates of perceived stigma for PLWE. Studies on factors associated with perceived stigma for PLWE, including sociodemographic, psychosocial, and disease-related variables, were searched in PubMed, PsychINFO, EMBASE, and Web of Science. Nineteen variables (k>1) were included in the meta-analysis. For sociodemographic characteristics, findings revealed that the significant weighted mean correlations (R) for "residence" and "poor financial status" were 0.177 and 0.286, respectively. For disease-related characteristics, all variables of significance, including "seizure severity," "seizure frequency," "number of medicines," and "adverse event" (R ranging from 0.190 to 0.362), were positively correlated with perceived stigma. For psychosocial characteristics, "depression" and "anxiety," with R values of 0.414 and 0.369, were significantly associated with perceived stigma. In addition, "social support," "quality of life (QOLIE-31,89)," "knowledge," and "attitude" had R values ranging from -0.444 to -0.200, indicating negative correlations with perceived stigma. The current meta-analysis evaluated the correlates of perceived stigma for PLWE. The results can serve as a basis for policymakers and healthcare professionals in formulating health promotion and prevention strategies. Copyright © 2017 Elsevier Inc. All rights reserved.
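Weighted mean correlations like those reported here are conventionally pooled on Fisher's z scale, since atanh(r) is approximately normal with variance 1/(n - 3). A minimal fixed-effect pooling sketch (illustrative; not the meta-analytic software the authors used, and a random-effects model would add a between-study variance term):

```python
import math

def pooled_correlation(rs, ns):
    """Fixed-effect pooled correlation across studies: transform each
    study's r with Fisher's z = atanh(r), weight by n - 3, average,
    and back-transform with tanh."""
    num = sum((n - 3) * math.atanh(r) for r, n in zip(rs, ns))
    den = sum(n - 3 for n in ns)
    return math.tanh(num / den)
```

For example, two studies both reporting r = 0.3 pool to 0.3 regardless of their sample sizes, while studies with different r values pool to a sample-size-weighted compromise between them.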

  15. A Simple Geotracer Compositional Correlation Analysis Reveals Oil Charge and Migration Pathways

    Science.gov (United States)

    Yang, Yunlai; Arouri, Khaled

    2016-03-01

    A novel approach based on geotracer compositional correlation analysis is reported, which reveals the oil charge sequence and migration pathways for five oil fields in Saudi Arabia. The geotracers utilised are carbazoles, a family of neutral pyrrolic nitrogen compounds known to occur naturally in crude oils. The approach is based on the concept that closely related fields, with respect to filling sequence, will show a higher carbazole compositional correlation than fields that are less related. That is, carbazole compositional correlation coefficients can quantify the charge and filling relationships among different fields. Consequently, oil migration pathways can be defined based on the established filling relationships. The compositional correlation coefficients of the isomers of C1 and C2 carbazoles, and benzo[a]carbazole, for all pairwise combinations of the five fields were found to vary widely (0.28 to 0.94). This wide range of compositional correlation coefficients allows adequate differentiation of separate filling relationships. Based on the established filling relationships, three distinct migration pathways were inferred, each apparently being charged from a different part of a common source kitchen. The recognition of these charge and migration pathways will greatly aid the search for new accumulations.

  16. Multivariate analysis of correlation between electrophysiological and hemodynamic responses during cognitive processing

    Science.gov (United States)

    Kujala, Jan; Sudre, Gustavo; Vartiainen, Johanna; Liljeström, Mia; Mitchell, Tom; Salmelin, Riitta

    2014-01-01

    Animal and human studies have frequently shown that in primary sensory and motor regions the BOLD signal correlates positively with high-frequency and negatively with low-frequency neuronal activity. However, recent evidence suggests that this relationship may also vary across cortical areas. Detailed knowledge of the possible spectral diversity between electrophysiological and hemodynamic responses across the human cortex would be essential for neural-level interpretation of fMRI data and for informative multimodal combination of electromagnetic and hemodynamic imaging data, especially in cognitive tasks. We applied multivariate partial least squares correlation analysis to MEG–fMRI data recorded in a reading paradigm to determine the correlation patterns between the data types, at once, across the cortex. Our results revealed heterogeneous patterns of high-frequency correlation between MEG and fMRI responses, with marked dissociation between lower and higher order cortical regions. The low-frequency range showed substantial variance, with negative and positive correlations manifesting at different frequencies across cortical regions. These findings demonstrate the complexity of the neurophysiological counterparts of hemodynamic fluctuations in cognitive processing. PMID:24518260

  17. Structured sparse canonical correlation analysis for brain imaging genetics: an improved GraphNet method.

    Science.gov (United States)

    Du, Lei; Huang, Heng; Yan, Jingwen; Kim, Sungeun; Risacher, Shannon L; Inlow, Mark; Moore, Jason H; Saykin, Andrew J; Shen, Li

    2016-05-15

    Structured sparse canonical correlation analysis (SCCA) models have been used to identify imaging genetic associations. These models use either the group lasso or a graph-guided fused lasso to conduct feature selection and feature grouping simultaneously. The group lasso based methods require prior knowledge to define the groups, which limits their capability when prior knowledge is incomplete or unavailable. The graph-guided methods overcome this drawback by using the sample correlation to define the constraint. However, they are sensitive to the sign of the sample correlation, which could introduce undesirable bias if the sign is wrongly estimated. We introduce a novel SCCA model with a new penalty, and develop an efficient optimization algorithm. Our method has a strong upper bound on the grouping effect for both positively and negatively correlated features. We show that our method performs as well as or better than three competing SCCA models on both synthetic and real data. In particular, our method identifies stronger canonical correlations and better canonical loading patterns, showing its promise for revealing interesting imaging genetic associations. The Matlab code and sample data are freely available at http://www.iu.edu/∼shenlab/tools/angscca/. Contact: shenli@iu.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. A Simple Geotracer Compositional Correlation Analysis Reveals Oil Charge and Migration Pathways.

    Science.gov (United States)

    Yang, Yunlai; Arouri, Khaled

    2016-03-11

    A novel approach based on geotracer compositional correlation analysis is reported, which reveals the oil charge sequence and migration pathways for five oil fields in Saudi Arabia. The geotracers utilised are carbazoles, a family of neutral pyrrolic nitrogen compounds known to occur naturally in crude oils. The approach is based on the concept that closely related fields, with respect to filling sequence, will show a higher carbazole compositional correlation than fields that are less related. That is, carbazole compositional correlation coefficients can quantify the charge and filling relationships among different fields. Consequently, oil migration pathways can be defined based on the established filling relationships. The compositional correlation coefficients of the isomers of C1 and C2 carbazoles, and benzo[a]carbazole, for all pairwise combinations of the five fields were found to vary widely (0.28 to 0.94). This wide range of compositional correlation coefficients allows adequate differentiation of separate filling relationships. Based on the established filling relationships, three distinct migration pathways were inferred, each apparently being charged from a different part of a common source kitchen. The recognition of these charge and migration pathways will greatly aid the search for new accumulations.

  19. An efficient sensitivity analysis method for modified geometry of Macpherson suspension based on Pearson correlation coefficient

    Science.gov (United States)

    Shojaeefard, Mohammad Hasan; Khalkhali, Abolfazl; Yarmohammadisatri, Sadegh

    2017-06-01

    The main purpose of this paper is to propose a new method for designing the Macpherson suspension, based on Sobol indices expressed through Pearson correlation, which determines the importance of each member for the behaviour of the vehicle suspension. The formulation of the dynamic analysis of the Macpherson suspension system is developed using the suspension members as modified links in order to achieve the desired kinematic behaviour. The mechanical system is replaced with equivalent constrained links, and kinematic laws are then utilised to obtain a new modified geometry of the Macpherson suspension. The equivalent mechanism of the Macpherson suspension increases the speed of the analysis and reduces its complexity. The ADAMS/CAR software is utilised to simulate a full vehicle, the Renault Logan, in order to analyse the accuracy of the modified geometry model. An experimental 4-poster test rig is used to validate both the ADAMS/CAR simulation and the analytical geometry model. The Pearson correlation coefficient is applied to analyse the sensitivity of each suspension member with respect to vehicle objective functions such as sprung mass acceleration. In addition, the estimation of the Pearson correlation coefficient between variables is analysed. The Pearson correlation coefficient proves to be an efficient method for analysing the vehicle suspension, leading to a better design of the Macpherson suspension system.

  20. Minimizing the trend effect on detrended cross-correlation analysis with empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhao Xiaojun; Shang Pengjian; Zhao Chuang; Wang Jing; Tao Rui

    2012-01-01

    Highlights: ► Investigate the effects of linear, exponential and periodic trends on DCCA. ► Apply empirical mode decomposition to extract the trend term. ► Strong and monotonic trends are successfully eliminated. ► Get the cross-correlation exponent in a persistent behavior without crossover. - Abstract: Detrended cross-correlation analysis (DCCA) is a scaling method commonly used to estimate long-range power-law cross-correlation in non-stationary signals. However, the susceptibility of DCCA to trends makes the scaling results difficult to analyze due to spurious crossovers. We artificially generate long-range cross-correlated signals and systematically investigate the effect of linear, exponential and periodic trends. To address the crossovers caused by trends, we apply the empirical mode decomposition method, which decomposes the underlying signals into several intrinsic mode functions (IMF) and a residual trend. After the removal of the residual term, strong and monotonic trends such as linear and exponential trends are successfully eliminated. A periodic trend, however, cannot be separated out by the IMF criterion; it can instead be eliminated by a Fourier transform. As a special case of DCCA, detrended fluctuation analysis presents similar results.
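A bare-bones DCCA sketch (without the EMD detrending step this paper adds) looks like this; for long-range cross-correlated signals the fluctuation function F(n) scales as a power law in the box size n:

```python
import math
import random

def _detrend_residuals(seg):
    """Residuals of an ordinary least-squares line fitted to seg."""
    n = len(seg)
    mx = (n - 1) / 2.0
    my = sum(seg) / n
    sxx = sum((i - mx) ** 2 for i in range(n))
    slope = sum((i - mx) * (v - my) for i, v in enumerate(seg)) / sxx
    return [v - (my + slope * (i - mx)) for i, v in enumerate(seg)]

def dcca(x, y, scales):
    """Detrended cross-correlation fluctuation F(n) per box size n:
    integrate both series, split into non-overlapping boxes, linearly
    detrend each box, and average the residual covariances."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    X, Y = [], []
    cx = cy = 0.0
    for a, b in zip(x, y):
        cx += a - mx
        cy += b - my
        X.append(cx)
        Y.append(cy)
    result = []
    for n in scales:
        covs = []
        for start in range(0, len(X) - n + 1, n):
            rx = _detrend_residuals(X[start:start + n])
            ry = _detrend_residuals(Y[start:start + n])
            covs.append(sum(a * b for a, b in zip(rx, ry)) / n)
        result.append((n, math.sqrt(abs(sum(covs) / len(covs)))))
    return result

rng = random.Random(7)
x = [rng.gauss(0, 1) for _ in range(4096)]
y = [a + rng.gauss(0, 1) for a in x]   # cross-correlated white noise
F = dcca(x, y, [4, 8, 16, 32, 64])
```

For white noise the slope of log F versus log n is about 0.5; adding a strong trend to the series distorts F(n) at large scales, producing the spurious crossovers that the EMD preprocessing in the paper is designed to remove.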

  1. Correlation between weather and incidence of selected ophthalmological diagnoses: a database analysis

    Directory of Open Access Journals (Sweden)

    Kern C

    2016-08-01

    Full Text Available Christoph Kern, Karsten Kortüm, Michael Müller, Florian Raabe, Wolfgang Johann Mayer, Siegfried Priglinger, Thomas Christian Kreutzer University Eye Hospital Munich, Faculty of Medicine, Ludwig-Maximilians-Universität München, Munich, Germany Purpose: Our aim was to correlate the overall patient volume and the incidence of several ophthalmological diseases in our emergency department with weather data. Patients and methods: For data analysis, we used our clinical data warehouse and weather data. We investigated the weekly overall patient volume and the average weekly incidence of all encoded diagnoses of “conjunctivitis”, “foreign body”, “acute iridocyclitis”, and “corneal abrasion”. A Spearman’s correlation was performed to link these data with the weekly average sunshine duration, temperature, and wind speed. Results: We noticed increased patient volume in correlation with increasing sunshine duration and higher temperature. Moreover, we found a positive correlation between the weekly incidences of conjunctivitis and of foreign body and weather data. Conclusion: The results of this data analysis reveal the possible influence of external conditions on the health of a population and can be used for weather-dependent resource allocation. Keywords: corneal injury, trauma, uveitis, conjunctivitis, weather
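Spearman's correlation, used in this study, is the Pearson correlation computed on ranks (with midranks for ties), which makes it robust to outliers and to monotone nonlinearity, both common in weather-incidence data. A self-contained sketch:

```python
import math

def _midranks(values):
    """Ranks 1..n, with tied values receiving the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j over the run of values tied with values[order[i]]
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the midranks."""
    rx, ry = _midranks(x), _midranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den
```

Because only the ordering matters, any monotone relationship between, say, weekly sunshine hours and patient volume yields a coefficient of exactly 1, even when the relationship is far from linear.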

  2. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    Science.gov (United States)

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike train (MST) data, defines its functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish the common input to, and the direct functional connection between, two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), in which both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, a visualization technique is used to represent the topology of the functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying the functional connectivity of multiple spike trains. This method can accurately identify all direct connections and can distinguish common-source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
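The pairwise building block of the ACG, a lagged cross-correlation whose peak height and delay are both informative, can be sketched for binned spike counts as follows (a plain Pearson CCF; the ACG's significance testing and connection-classification layers are not reproduced here):

```python
import math

def ccf(x, y, max_lag):
    """Pearson cross-correlation between binned spike counts x and y at
    integer lags -max_lag..max_lag (positive lag: y follows x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += (x[i] - mx) * (y[j] - my)
        out[lag] = s / (sx * sy)
    return out

# Toy example: y is x delayed by two bins, so the CCF peaks at lag +2,
# suggesting a directed functional connection from x to y.
x = [0, 1, 0, 0, 2, 0, 1, 0, 0, 1, 0, 0] * 5
y = [0, 0] + x[:-2]
c = ccf(x, y, 4)
peak_lag = max(c, key=c.get)
```

In the ACG, the significant peak value and its delay are stored for every pair of trains; a pair of delays that add up (x leads y by 2, y leads z by 3, x leads z by 5) is the signature of an indirect connection.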

  3. Statistical analysis of solid waste composition data: Arithmetic mean, standard deviation and correlation coefficients.

    Science.gov (United States)

    Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2017-11-01

    Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for non-constrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12% with a confidence interval of (-4.03; 8.45), which highlights the problem of biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing means, standard deviations and correlation coefficients. Copyright © 2017 Elsevier Ltd. All rights reserved.
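The standard remedy in compositional data analysis is a log-ratio transform before computing means, standard deviations, or correlations. A minimal centred log-ratio (clr) sketch (illustrative; the study's exact transformation choice is not reproduced here):

```python
import math

def clr(parts):
    """Centred log-ratio transform of one composition: the log of each
    part minus the mean log (i.e. the log of the geometric mean).
    Parts must be strictly positive; zeros need imputation first."""
    logs = [math.log(p) for p in parts]
    g = sum(logs) / len(logs)
    return [v - g for v in logs]
```

clr coordinates sum to zero and are scale-invariant, so percentages and raw masses of the same composition give identical coordinates, and statistics computed on them avoid the spurious negative correlations that the unit-sum constraint induces in raw percentage data.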

  4. FEM correlation and shock analysis of a VNC MEMS mirror segment

    Science.gov (United States)

    Aguayo, Eduardo J.; Lyon, Richard; Helmbrecht, Michael; Khomusi, Sausan

    2014-08-01

    Microelectromechanical systems (MEMS) are becoming more prevalent in today's advanced space technologies. The Visible Nulling Coronagraph (VNC) instrument, being developed at the NASA Goddard Space Flight Center, uses a MEMS mirror to correct wavefront errors. This MEMS mirror, the Multiple Mirror Array (MMA), is a key component that will enable the VNC instrument to detect Jupiter-sized and ultimately Earth-sized exoplanets. Like other MEMS devices, the MMA faces several challenges associated with spaceflight. Therefore, Finite Element Analysis (FEA) is being used to predict the behavior of a single MMA segment under different spaceflight-related environments. Finite Element Analysis results are used to guide the MMA design and ensure its survival during launch and mission operations. A Finite Element Model (FEM) of the MMA has been developed using COMSOL. This model has been correlated to static loading on test specimens. The correlation was performed in several steps: simple beam models were correlated initially, followed by increasingly complex and higher fidelity models of the MMA mirror segment. Subsequently, the model has been used to predict the dynamic behavior and stresses of the MMA segment in a representative spaceflight mechanical shock environment. The results of the correlation and the stresses associated with a shock event are presented herein.

  5. Statistical analysis of correlated experimental data and neutron cross section evaluation

    International Nuclear Information System (INIS)

    Badikov, S.A.

    1998-01-01

    The technique for the evaluation of neutron cross sections on the basis of a statistical analysis of correlated experimental data is presented. The most important stages of the evaluation, from the compilation of the correlation matrix of measurement uncertainties to the representation of the analysis results in the ENDF-6 format, are described in detail. Special attention is paid to the physically motivated restriction (positive definiteness) on the covariance matrix of the approximated parameters' uncertainties generated by the least-squares fit. The requirements on the source experimental data that ensure this restriction is satisfied are formulated; in particular, the correlation matrices of the measurement uncertainties should themselves be positive definite. Variants of modelling positive definite correlation matrices of measurement uncertainties, for situations where their direct calculation on the basis of experimental information is impossible, are discussed. The technique described is used for creating a new generation of evaluations of dosimetric reaction cross sections for the first version of the Russian dosimetric file (including nontrivial covariance information)
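
    The positive-definiteness requirement on correlation matrices can be checked, and violations repaired, with a simple eigenvalue-clipping step. This is a generic sketch, not the modelling approach of the paper:

```python
import numpy as np

def nearest_psd(corr, eps=1e-10):
    """Clip negative eigenvalues of a symmetric correlation matrix and
    rescale the diagonal back to 1, yielding a positive semi-definite
    matrix (a simple repair, not a minimum-distance solution)."""
    vals, vecs = np.linalg.eigh(corr)
    fixed = vecs @ np.diag(np.clip(vals, eps, None)) @ vecs.T
    d = np.sqrt(np.diag(fixed))
    return fixed / np.outer(d, d)

# An inconsistent "correlation" matrix that is not positive definite:
c = np.array([[ 1.0,  0.9, -0.9],
              [ 0.9,  1.0,  0.9],
              [-0.9,  0.9,  1.0]])
fixed = nearest_psd(c)
print(np.linalg.eigvalsh(c).min() < 0)           # True: invalid as given
print(np.linalg.eigvalsh(fixed).min() >= -1e-9)  # True: repaired
```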

  6. The correlation between apparent diffusion coefficient and tumor cellularity in patients: a meta-analysis.

    Science.gov (United States)

    Chen, Lihua; Liu, Min; Bao, Jing; Xia, Yunbao; Zhang, Jiuquan; Zhang, Lin; Huang, Xuequan; Wang, Jian

    2013-01-01

    To perform a meta-analysis exploring the correlation between the apparent diffusion coefficient (ADC) and tumor cellularity in patients. We searched medical and scientific literature databases for studies discussing the correlation between the ADC and tumor cellularity in patients. Only studies that were published in English or Chinese prior to November 2012 were considered for inclusion. Summary correlation coefficient (r) values were extracted from each study, and 95% confidence intervals (CIs) were calculated. Sensitivity and subgroup analyses were performed to investigate potential heterogeneity. Of 189 studies, 28 were included in the meta-analysis, comprising 729 patients. The pooled r for all studies was -0.57 (95% CI: -0.62, -0.52), with notable heterogeneity. Subgroup analysis indicated a stronger correlation between the ADC and cellularity for brain tumors. There was no notable evidence of publication bias. There is a strong negative correlation between the ADC and tumor cellularity in patients, particularly in the brain. However, larger, prospective studies are warranted to validate these findings in other cancer types.
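
    Pooling per-study correlation coefficients in such a meta-analysis is commonly done with Fisher's z transform. The sketch below is a generic fixed-effect pooling with hypothetical study values, not the data or exact model of this meta-analysis:

```python
import numpy as np

def pool_correlations(r, n):
    """Fixed-effect pooling of correlation coefficients via Fisher's z
    transform, with a 95% CI on the pooled r."""
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)            # Fisher z per study
    w = n - 3.0                  # inverse-variance weights
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    return np.tanh(z_bar), (np.tanh(lo), np.tanh(hi))

# Three hypothetical ADC-vs-cellularity studies (r, sample size):
r_pooled, ci = pool_correlations([-0.60, -0.55, -0.50], [30, 25, 20])
print(round(r_pooled, 2))  # -0.56
```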

  7. Dynamical Analysis of Stock Market Instability by Cross-correlation Matrix

    Science.gov (United States)

    Takaishi, Tetsuya

    2016-08-01

    We study stock market instability by using cross-correlations constructed from the return time series of 366 stocks traded on the Tokyo Stock Exchange from January 5, 1998 to December 30, 2013. To investigate the dynamical evolution of the cross-correlations, cross-correlation matrices are calculated with a rolling window of 400 days. To quantify the volatile market stages where the potential risk is high, we apply principal component analysis and measure the cumulative risk fraction (CRF), which is the system variance associated with the first few principal components. From the CRF, we detect three volatile market stages in the study period, corresponding to the bankruptcy of Lehman Brothers, the 2011 Tohoku Region Pacific Coast Earthquake, and the observation of the FRB QE3 reduction. We further apply random matrix theory for the risk analysis and find that the first eigenvector is more equally de-localized when the market is volatile.
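
    The CRF step can be sketched as follows, on synthetic return windows rather than the Tokyo Stock Exchange data: it is the share of total variance carried by the first few principal components of the window's correlation matrix, and it rises when a common factor dominates the market:

```python
import numpy as np

def cumulative_risk_fraction(returns, n_components=2):
    """Fraction of total variance captured by the first few principal
    components of the correlation matrix of a return window
    (rows = days, columns = stocks)."""
    corr = np.corrcoef(returns, rowvar=False)
    vals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending eigenvalues
    return vals[:n_components].sum() / vals.sum()

rng = np.random.default_rng(1)
calm = rng.normal(size=(400, 20))                      # nearly independent stocks
common = rng.normal(size=(400, 1))                     # one market-wide factor
volatile = 0.3 * rng.normal(size=(400, 20)) + common   # factor dominates
print(cumulative_risk_fraction(calm) < cumulative_risk_fraction(volatile))  # True
```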

  8. Dynamical Analysis of Stock Market Instability by Cross-correlation Matrix

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2016-01-01

    We study stock market instability by using cross-correlations constructed from the return time series of 366 stocks traded on the Tokyo Stock Exchange from January 5, 1998 to December 30, 2013. To investigate the dynamical evolution of the cross-correlations, cross-correlation matrices are calculated with a rolling window of 400 days. To quantify the volatile market stages where the potential risk is high, we apply principal component analysis and measure the cumulative risk fraction (CRF), which is the system variance associated with the first few principal components. From the CRF, we detect three volatile market stages in the study period, corresponding to the bankruptcy of Lehman Brothers, the 2011 Tohoku Region Pacific Coast Earthquake, and the observation of the FRB QE3 reduction. We further apply random matrix theory for the risk analysis and find that the first eigenvector is more equally de-localized when the market is volatile. (paper)

  9. Topic Correlation Analysis for Bearing Fault Diagnosis Under Variable Operating Conditions

    Science.gov (United States)

    Chen, Chao; Shen, Fei; Yan, Ruqiang

    2017-05-01

    This paper presents a Topic Correlation Analysis (TCA) based approach for bearing fault diagnosis. In TCA, a Joint Mixture Model (JMM), a model which adapts Probabilistic Latent Semantic Analysis (PLSA), is constructed first. Then, the JMM models the shared and domain-specific topics using a “fault vocabulary”. After that, the correlations between the two kinds of topics are computed and used to build a mapping matrix. Furthermore, a new shared space spanned by the shared and mapped domain-specific topics is set up, in which the distribution gap between different domains is reduced. Finally, a classifier is trained with mapped features which follow a different distribution, and the trained classifier is tested on target bearing data. Experimental results demonstrate the superiority of the proposed approach over the state-of-the-art baselines, showing that it can diagnose bearing faults efficiently and effectively under variable operating conditions.

  10. Quantifying NMR relaxation correlation and exchange in articular cartilage with time domain analysis

    Science.gov (United States)

    Mailhiot, Sarah E.; Zong, Fangrong; Maneval, James E.; June, Ronald K.; Galvosas, Petrik; Seymour, Joseph D.

    2018-02-01

    Measured nuclear magnetic resonance (NMR) transverse relaxation data in articular cartilage have been shown to be multi-exponential and correlated with the health of the tissue. The observed relaxation rates depend on experimental parameters such as the solvent, data acquisition methods, data analysis methods, and alignment with the magnetic field. In this study, we show that diffusive exchange occurs in porcine articular cartilage and impacts the observed relaxation rates in T1-T2 correlation experiments. By using time domain analysis of T2-T2 exchange spectroscopy, the diffusive exchange time can be quantified by measurements that use a single mixing time. The measured characteristic exchange times are commensurate with T1 in this material and so impact the observed T1 behavior. The approach used here allows reliable quantification of NMR relaxation behavior in cartilage in the presence of diffusive fluid exchange between two environments.

  11. A learning algorithm for adaptive canonical correlation analysis of several data sets.

    Science.gov (United States)

    Vía, Javier; Santamaría, Ignacio; Pérez, Jesús

    2007-01-01

    Canonical correlation analysis (CCA) is a classical tool in statistical analysis for finding the projections that maximize the correlation between two data sets. In this work we propose a generalization of CCA to several data sets, which is shown to be equivalent to the classical maximum variance (MAXVAR) generalization proposed by Kettenring. The reformulation of this generalization as a set of coupled least squares regression problems is exploited to develop a neural structure for CCA. In particular, the proposed CCA model is a two-layer feedforward neural network with lateral connections in the output layer to achieve the simultaneous extraction of all the CCA eigenvectors through deflation. The CCA neural model is trained using a recursive least squares (RLS) algorithm. Finally, the convergence of the proposed learning rule is proved by means of stochastic approximation techniques, and its performance is analyzed through simulations.
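
    A batch reference computation of the canonical correlations of two data sets, via QR factorizations and an SVD, can be sketched as follows. This is what an adaptive CCA algorithm should converge to for two data sets, not the paper's neural/RLS implementation itself:

```python
import numpy as np

def cca(X, Y):
    """Canonical correlations of two data sets (rows = samples):
    center, orthonormalize each block with QR, then take the singular
    values of the cross-product of the orthonormal bases."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)   # descending canonical correlations

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))
Y = X @ rng.normal(size=(4, 3)) + 0.05 * rng.normal(size=(500, 3))
rho = cca(X, Y)
print(rho[0] > 0.95)  # True: X and Y share a strong linear relation
```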

  12. Assessment of SIP Buildings for Sustainable Development in Rural China Using AHP-Grey Correlation Analysis.

    Science.gov (United States)

    Bai, Libiao; Wang, Hailing; Shi, Chunming; Du, Qiang; Li, Yi

    2017-10-25

    Traditional rural residential construction suffers from high energy consumption and severe pollution. In line with sustainable development in the construction industry, rural residential construction should aim for low energy consumption and low carbon emissions. To help achieve this objective, in this paper we evaluated four different possible building structures using AHP-Grey Correlation Analysis, which combines the Analytic Hierarchy Process (AHP) with Grey Correlation Analysis. The four structures included the traditional and currently widely used brick-and-concrete structure, as well as structural insulated panels (SIPs). Comparing economic benefit and carbon emission performance, we conclude that SIPs have the best overall performance, providing a reference to help builders choose the most appropriate building structure in rural China.
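
    The Grey Correlation (grey relational) Analysis step can be sketched as follows. The criteria, scores, and equal criterion weights below are hypothetical; in the paper the weights come from the AHP:

```python
import numpy as np

def grey_relational_grades(alternatives, reference, xi=0.5):
    """Grey relational grade of each alternative against a reference
    (ideal) sequence, with criteria already normalized to [0, 1].
    `xi` is the conventional distinguishing coefficient."""
    A = np.asarray(alternatives, float)
    d = np.abs(A - np.asarray(reference, float))      # deviation sequences
    coeff = (d.min() + xi * d.max()) / (d + xi * d.max())
    return coeff.mean(axis=1)                         # equal criterion weights

# Hypothetical normalized scores (cost, energy, CO2, comfort) per structure:
alts = np.array([[0.6, 0.5, 0.4, 0.7],    # brick and concrete
                 [0.9, 0.8, 0.9, 0.8]])   # SIP
ideal = np.array([1.0, 1.0, 1.0, 1.0])
g = grey_relational_grades(alts, ideal)
print(g[1] > g[0])  # True: SIP ranks closer to the ideal
```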

  13. Potential ligand-binding residues in rat olfactory receptors identified by correlated mutation analysis

    Science.gov (United States)

    Singer, M. S.; Oliveira, L.; Vriend, G.; Shepherd, G. M.

    1995-01-01

    A family of G-protein-coupled receptors is believed to mediate the recognition of odor molecules. In order to identify potential ligand-binding residues, we have applied correlated mutation analysis to receptor sequences from the rat. This method identifies pairs of sequence positions where residues remain conserved or mutate in tandem, thereby suggesting structural or functional importance. The analysis supported molecular modeling studies in suggesting several residues in positions that were consistent with ligand-binding function. Two of these positions, dominated by histidine residues, may play important roles in ligand binding and could confer broad specificity to mammalian odor receptors. The presence of positive (overdominant) selection at some of the identified positions provides additional evidence for roles in ligand binding. Higher-order groups of correlated residues were also observed. Each group may interact with an individual ligand determinant, and combinations of these groups may provide a multi-dimensional mechanism for receptor diversity.

  14. Genetic and Environmental Basis in Phenotype Correlation Between Physical Function and Cognition in Aging Chinese Twins

    DEFF Research Database (Denmark)

    Xu, Chunsheng; Zhang, Dongfeng; Tian, Xiaocao

    2017-01-01

    Although the correlation between cognition and physical function has been well studied in the general population, the genetic and environmental nature of the correlation has rarely been investigated. We conducted a classical twin analysis on cognitive and physical function, including forced… and cognitive function. Bivariate analysis showed mildly positive genetic correlations between cognition and FEV1, rG = 0.23 [95% CI: 0.03, 0.62], as well as FVC, rG = 0.35 [95% CI: 0.06, 1.00]. We found that FTSST and cognition presented a very high common environmental correlation, rC = -1.00 [95% CI: -1… …for cognition with handgrip strength, FTSST, near visual acuity, and number of teeth lost. Cognitive function was genetically related to pulmonary function. The FTSST and cognition shared almost the same common environmental factors but only part of the unique environmental factors, both with negative…

  15. Detecting long-range correlation with detrended fluctuation analysis: Application to BWR stability

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa-Paredes, Gilberto [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Apartado Postal 55-534, Mexico, DF 09340 (Mexico)]. E-mail: gepe@xanum.uam.mx; Alvarez-Ramirez, Jose [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Apartado Postal 55-534, Mexico, DF 09340 (Mexico); Vazquez, Alejandro [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Apartado Postal 55-534, Mexico, DF 09340 (Mexico)

    2006-11-15

    The aim of this paper is to explore the application of detrended fluctuation analysis (DFA) to the study of boiling water reactor stability. DFA is a scaling method commonly used for detecting long-range correlations in non-stationary time series. The method, which is based on random walk theory, was applied to the neutronic power signal of the Forsmark stability benchmark. Our results show that the scaling properties break down during unstable oscillations.
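
    A minimal DFA sketch, applied to white noise rather than the Forsmark neutronic signals: the scaling exponent is the slope of log F(s) versus log s, and values near 0.5 indicate uncorrelated fluctuations (long-range correlation shows up as alpha well above 0.5):

```python
import numpy as np

def dfa(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: returns the scaling exponent
    alpha from a log-log fit of the fluctuation F(s) vs window size s."""
    y = np.cumsum(x - np.mean(x))          # integrated profile (random walk)
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        res = []
        for seg in segs:                   # linear detrend per window
            a, b = np.polyfit(t, seg, 1)
            res.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(3)
alpha = dfa(rng.normal(size=4096))
print(abs(alpha - 0.5) < 0.15)  # True: white noise gives alpha near 0.5
```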

  16. Detecting long-range correlation with detrended fluctuation analysis: Application to BWR stability

    International Nuclear Information System (INIS)

    Espinosa-Paredes, Gilberto; Alvarez-Ramirez, Jose; Vazquez, Alejandro

    2006-01-01

    The aim of this paper is to explore the application of detrended fluctuation analysis (DFA) to the study of boiling water reactor stability. DFA is a scaling method commonly used for detecting long-range correlations in non-stationary time series. The method, which is based on random walk theory, was applied to the neutronic power signal of the Forsmark stability benchmark. Our results show that the scaling properties break down during unstable oscillations

  17. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

    Full Text Available Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, as the failure probability can be easily obtained by integration over the failure domain. However, efficiently estimating the PDF is still an urgent problem to be solved. The existing fractional-moment-based maximum entropy method provides a very advanced approach to PDF estimation; its main shortcoming is that it limits the reliability analysis to structures with independent inputs. In fact, structures with correlated inputs are common in engineering, so this paper improves the maximum entropy method by applying the Unscented Transformation (UT) technique, a very efficient moment estimation method for models with any inputs, to compute the fractional moments of the performance function for structures with correlations. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Moreover, the number of function evaluations required in the reliability analysis, which is determined by the UT, is small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
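
    The Unscented Transformation step can be sketched as follows. This is a generic symmetric-sigma-point UT applied to ordinary moments of a toy linear performance function, not the paper's fractional-moment implementation; for a linear function the UT reproduces the mean and variance exactly:

```python
import numpy as np

def unscented_moments(g, mean, cov, kappa=1.0):
    """Approximate the mean and variance of g(X), X ~ (mean, cov),
    using the 2n+1 symmetric sigma points of the unscented transform."""
    mean = np.asarray(mean, float)
    n = mean.size
    L = np.linalg.cholesky((n + kappa) * np.asarray(cov, float))
    pts = [mean] + [mean + L[:, i] for i in range(n)] \
                 + [mean - L[:, i] for i in range(n)]
    w = np.r_[kappa / (n + kappa), np.full(2 * n, 0.5 / (n + kappa))]
    vals = np.array([g(p) for p in pts])
    m = np.sum(w * vals)
    v = np.sum(w * (vals - m) ** 2)
    return m, v

# Toy linear performance function g(x) = 2*x0 + 3*x1:
mean, cov = np.array([1.0, 2.0]), np.array([[1.0, 0.3], [0.3, 2.0]])
m, v = unscented_moments(lambda x: 2 * x[0] + 3 * x[1], mean, cov)
print(round(m, 6))  # 8.0, exact for a linear function
```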

  18. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, so most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.

  19. NMR-based metabonomics and correlation analysis reveal potential biomarkers associated with chronic atrophic gastritis.

    Science.gov (United States)

    Cui, Jiajia; Liu, Yuetao; Hu, Yinghuan; Tong, Jiayu; Li, Aiping; Qu, Tingli; Qin, Xuemei; Du, Guanhua

    2017-01-05

    Chronic atrophic gastritis (CAG) is one of the most important pre-cancerous states, with a high prevalence. Exploring its underlying mechanism and potential biomarkers is of significant importance. In the present work, 1H NMR-based metabonomics with correlation analysis was performed to analyze the metabolic features of CAG. 19 plasma metabolites and 18 urine metabolites were selected to construct the circulatory and excretory metabolome of CAG, reflecting alterations in energy metabolism, inflammation, immune dysfunction, and oxidative stress. 7 plasma biomarkers and 7 urine biomarkers were screened to elucidate the pathogenesis of CAG based on further correlation analysis with biochemical indexes. Finally, 3 plasma biomarkers (arginine, succinate and 3-hydroxybutyrate) and 2 urine biomarkers (α-ketoglutarate and valine) showed the potential to indicate risks of CAG by virtue of their correlation with pepsin activity and ROC analysis. Our results pave the way for elucidating the underlying mechanisms in the development of CAG, provide new avenues for its diagnosis, and present potential drug targets for its treatment. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Correlation of Descriptive Analysis and Instrumental Puncture Testing of Watermelon Cultivars.

    Science.gov (United States)

    Shiu, J W; Slaughter, D C; Boyden, L E; Barrett, D M

    2016-06-01

    The textural properties of 5 seedless watermelon cultivars were assessed by descriptive analysis and the standard puncture test using a hollow probe with increased shearing properties. Descriptive analysis methodology was an effective means of quantifying watermelon sensory texture profiles for characterizing specific cultivars' characteristics. Of the 10 cultivars screened, 71% of the variation in the sensory attributes was captured by the first two principal components. Pairwise correlation of the hollow puncture probe and sensory parameters determined that the initial slope, maximum force, and work-after-maximum-force measurements all correlated well with the sensory attributes crisp and firm. These findings confirm that maximum force correlates well not only with firmness in watermelon, but with crispness as well. The initial slope parameter also captures the sensory crispness of watermelon, but is not as practical to measure in the field as maximum force. The work-after-maximum-force parameter is thought to reflect cellular arrangement and membrane integrity, which in turn impact sensory firmness and crispness. Watermelon cultivar types were correctly predicted by puncture test measurements in heart tissue 87% of the time, whereas descriptive analysis was correct 54% of the time. © 2016 Institute of Food Technologists®

  1. X-ray texture analysis of paper coating pigments and the correlation with chemical composition analysis

    Science.gov (United States)

    Roine, J.; Tenho, M.; Murtomaa, M.; Lehto, V.-P.; Kansanaho, R.

    2007-10-01

    The present research examines the applicability of x-ray texture analysis to investigating the properties of paper coatings. The preferred orientations of the kaolin, talc, ground calcium carbonate, and precipitated calcium carbonate particles used in four different paper coatings were determined qualitatively from the measured crystal orientation data. The extent of the orientation, namely the degree of texture of each pigment, was characterized quantitatively using a single parameter. As a result, the effect of paper calendering is clearly seen as an increase in the degree of texture of the coating pigments. The effect of calendering on the preferred orientation of kaolin was also evident in an independent energy dispersive spectrometer analysis on the micrometer scale and in electron spectroscopy for chemical analysis on the nanometer scale. Thus, the present work shows x-ray texture analysis to be a potential research tool for characterizing the properties of paper coating layers.

  2. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The precision of reliability worth can be affected greatly by the customer interruption cost model used. The choice of cost model can change system and load point reliability indices… In this study, a cascade correlation neural network is adopted to further develop two cost models, comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects…

  3. Identifying compromised systems through correlation of suspicious traffic from malware behavioral analysis

    Science.gov (United States)

    Camilo, Ana E. F.; Grégio, André; Santos, Rafael D. C.

    2016-05-01

    Malware detection may be accomplished through the analysis of their infection behavior. To do so, dynamic analysis systems run malware samples and extract their operating system activities and network traffic. This traffic may represent malware accessing external systems, either to steal sensitive data from victims or to fetch other malicious artifacts (configuration files, additional modules, commands). In this work, we propose the use of visualization as a tool to identify compromised systems based on correlating malware communications in the form of graphs and finding isomorphisms between them. We produced graphs from over 6 thousand distinct network traffic files captured during malware execution and analyzed the existing relationships among malware samples and IP addresses.

  4. CONTIN XPCS: Software for Inverse Transform Analysis of X-Ray Photon Correlation Spectroscopy Dynamics.

    Science.gov (United States)

    Andrews, Ross N; Narayanan, Suresh; Zhang, Fan; Kuzmenko, Ivan; Ilavsky, Jan

    2018-02-01

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) both reveal dynamics using coherent scattering, but X-rays permit the investigation of dynamics in a much more diverse array of materials. Heterogeneous dynamics occur in many such materials, and we show how classic tools employed in the analysis of heterogeneous DLS dynamics extend to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. This work presents the software implementation of inverse transform analysis of XPCS data, called CONTIN XPCS, an extension of traditional CONTIN that accommodates dynamics encountered in equilibrium XPCS measurements.

  5. Principal component and spatial correlation analysis of spectroscopic-imaging data in scanning probe microscopy

    International Nuclear Information System (INIS)

    Jesse, Stephen; Kalinin, Sergei V

    2009-01-01

    An approach for the analysis of multi-dimensional, spectroscopic-imaging data based on principal component analysis (PCA) is explored. PCA selects and ranks relevant response components based on variance within the data. It is shown that for examples with small relative variations between spectra, the first few PCA components closely coincide with results obtained using model fitting, and this is achieved at rates approximately four orders of magnitude faster. For cases with strong response variations, PCA allows an effective approach to rapidly process, de-noise, and compress data. The prospects for PCA combined with correlation function analysis of component maps as a universal tool for data analysis and representation in microscopy are discussed.
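
    The PCA workflow described above can be sketched on synthetic spectroscopic-imaging data; the component count and noise level are illustrative, and the SVD route shown is a standard way to compute PCA, not the authors' specific implementation:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic spectroscopic-imaging data: 500 pixels x 64 spectral points,
# generated from 3 underlying response components plus noise.
components = rng.normal(size=(3, 64))
weights = rng.normal(size=(500, 3))
data = weights @ components + 0.1 * rng.normal(size=(500, 64))

X = data - data.mean(axis=0)              # center before PCA
U, s, Vt = np.linalg.svd(X, full_matrices=False)
var_explained = (s ** 2) / np.sum(s ** 2)
print(var_explained[:3].sum() > 0.9)      # True: 3 components dominate

# Rank-3 reconstruction: de-noised, compressed representation.
denoised = (U[:, :3] * s[:3]) @ Vt[:3]
```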

  6. Analysis of Correlation between an Accelerometer-Based Algorithm for Detecting Parkinsonian Gait and UPDRS Subscales

    Directory of Open Access Journals (Sweden)

    Alejandro Rodríguez-Molinero

    2017-09-01

    Full Text Available Background: Our group earlier developed a small monitoring device, which uses accelerometer measurements to accurately detect motor fluctuations in patients with Parkinson’s (On and Off state) based on an algorithm that characterizes gait through the frequency content of strides. To further validate the algorithm, we studied the correlation of its outputs with the motor section of the Unified Parkinson’s Disease Rating Scale part-III (UPDRS-III). Method: Seventy-five patients suffering from Parkinson’s disease were asked to walk both in the Off and the On state while wearing the inertial sensor on the waist. Additionally, all patients were administered the motor section of the UPDRS in both motor phases. Tests were conducted at the patient’s home. Convergence between the algorithm and the scale was evaluated by using the Spearman’s correlation coefficient. Results: Correlation with the UPDRS-III was moderate (rho −0.56; p < 0.001). Correlation between the algorithm outputs and the gait item in the UPDRS-III was good (rho −0.73; p < 0.001). The factorial analysis of the UPDRS-III has repeatedly shown that several of its items can be clustered under the so-called Factor 1: “axial function, balance, and gait.” The correlation between the algorithm outputs and this factor of the UPDRS-III was −0.67 (p < 0.01). Conclusion: The correlation achieved by the algorithm with the UPDRS-III scale suggests that this algorithm might be a useful tool for monitoring patients with Parkinson’s disease and motor fluctuations.
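
    The Spearman coefficient used to evaluate convergence can be computed directly from ranks. The algorithm outputs and UPDRS-III scores below are hypothetical, chosen so the ranking is perfectly inverse:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks.
    This simple double-argsort ranking assumes no tied values."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical data: a higher algorithm output accompanies a lower
# clinical (UPDRS-III) score in every patient.
algo = np.array([0.9, 0.7, 0.8, 0.4, 0.2, 0.5])
updrs = np.array([12.0, 25.0, 18.0, 40.0, 55.0, 33.0])
rho = spearman_rho(algo, updrs)
print(round(rho, 6))  # -1.0: perfectly inverse ranking
```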

  7. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi; Gao, Xin; Huang, Jianhua Z.

    2012-01-01

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.
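
    The idea of examining a bivariate lag-distribution with an SVD can be sketched in a toy form: build a 2D histogram of an angle series against its lagged copy and factor it. This is only an illustration of the general approach, not the LagSVD implementation; for a sequentially independent series the joint table is close to rank one, so the first singular value dominates:

```python
import numpy as np

def lag_svd_spectrum(angles, lag, bins=18):
    """Singular values of a bivariate lag-distribution: a 2D histogram of
    (angle_t, angle_{t+lag}) pairs, normalized to a joint probability
    table and factored by SVD. A spectrum dominated by the first singular
    value indicates near-independence across the lag; slower decay
    indicates sequential dependency."""
    a = np.asarray(angles, float)
    H, _, _ = np.histogram2d(a[:-lag], a[lag:], bins=bins,
                             range=[[-180, 180], [-180, 180]])
    H = H / H.sum()
    return np.linalg.svd(H, compute_uv=False)

rng = np.random.default_rng(5)
iid = rng.uniform(-180, 180, 5000)     # no sequential dependency
s = lag_svd_spectrum(iid, lag=1)
print(s[0] ** 2 / np.sum(s ** 2) > 0.8)  # True: first mode dominates
```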

  8. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    KAUST Repository

    Maadooliat, Mehdi

    2012-08-27

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence-structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/~madoliat/LagSVD) that can be used to produce informative animations. © The Author 2012. Published by Oxford University Press.

  9. Correlation Analysis of Personality Characteristics of Children with TIC Disorder with Family Factors

    Institute of Scientific and Technical Information of China (English)

    LI Rui; WANG Liqun; MA Chunxia; MA Lixian

    2016-01-01

    Objective To explore the personality characteristics of children with tic disorders and their relationship with family factors. Methods Sixty children with tic disorders diagnosed in our hospital were selected as the case group and 65 normal children as the control group. The children of both groups were assessed using the Eysenck Personality Questionnaire (EPQ), the Family Environment Scale (FES-CV) and a general situation questionnaire of family (GSQ), respectively. EPQ personality scores, FES-CV scores and GSQ scores were compared between the two groups. Pearson correlation analysis was used to analyze the correlation between the personality scores of children in the case group and family environment factors. Results The general situation questionnaire showed statistically significant differences in parenting style, parental education level and family type between the case and control groups (P < 0.05). EPQ results showed that neuroticism and psychoticism scores of children in the case group were significantly higher than those in the control group (P < 0.05), while lie-scale scores in the control group were significantly higher than those in the case group (P < 0.05). FES-CV results showed that family cohesion scores of the case group were significantly lower than those of the control group (P < 0.05), and family conflict scores in the case group were significantly higher than those in the control group (P < 0.05). Pearson correlation analysis indicated that the psychoticism score was negatively correlated with family cohesion (P < 0.05) and positively correlated with family conflict (P < 0.05), while the neuroticism score was positively correlated with family conflict (P < 0.05). Conclusion Children with tic disorders show significant personality deviation compared with normal children, and the personality deviation degree is
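
    The Pearson product-moment correlation used in the study can be computed directly with scipy; the scores below are synthetic, hypothetical values constructed for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
# Hypothetical scores for 60 children: family cohesion vs. psychoticism,
# constructed so that higher cohesion goes with lower psychoticism.
cohesion = rng.normal(50, 10, size=60)
psychoticism = 70 - 0.4 * cohesion + rng.normal(0, 5, size=60)

r, p = pearsonr(psychoticism, cohesion)
# A negative r with p < 0.05 would mirror the reported negative
# correlation between psychoticism and family cohesion.
```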

  10. [Correlation analysis between residual displacement and hip function after reconstruction of acetabular fractures].

    Science.gov (United States)

    Ma, Kunlong; Fang, Yue; Luan, Fujun; Tu, Chongqi; Yang, Tianfu

    2012-03-01

    To investigate the relationships between residual displacement of weight-bearing and non weight-bearing zones (gap displacement and step displacement) and hip function by analyzing CT images after reconstruction of acetabular fractures. The CT measurements and clinical outcomes were retrospectively analyzed for 48 patients with displaced acetabular fracture between June 2004 and June 2009. All patients were treated by open reduction and internal fixation, and were followed up for 24 to 72 months (mean, 36 months); all fractures healed after operation. The residual displacement involved the weight-bearing zone in 30 cases (weight-bearing group) and the non weight-bearing zone in 18 cases (non weight-bearing group). The clinical outcomes were evaluated by the Merle d'Aubigné-Postel criteria, and the reduction of the articular surface was evaluated on CT images, including the maximums of two indexes (gap displacement and step displacement). All data were analyzed using Spearman rank correlation analysis. There was a strong negative correlation between hip function and the residual displacement values in the weight-bearing group (r(s) = -0.722, P = 0.001), but no correlation between hip function and the residual displacement values in the non weight-bearing group (r(s) = 0.481, P = 0.059). The results of clinical follow-up were similar to the correlation analysis results. In the weight-bearing group, hip function had a strong negative correlation with step displacement (r(s) = 0.825, P = 0.002), but no correlation with gap displacement (r(s) = 0.577, P = 0.134). In patients with acetabular fracture, hip function correlates not only with the extent of the residual displacement but also with its location, so residual displacement of the weight-bearing zone is a key factor affecting hip function. In patients with residual displacement in the weight-bearing zone, the bigger the step displacement is, the
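
    The Spearman rank correlation used here replaces raw values by their ranks and so captures monotonic association without assuming linearity. A minimal illustration with hypothetical displacement and hip-score values (not the patients' data):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
# Hypothetical data for 30 patients in a weight-bearing group: residual
# step displacement (mm) vs. a hip-function score (higher = better).
step_mm = rng.uniform(0, 5, size=30)
hip_score = 18 - 2 * step_mm + rng.normal(0, 1, size=30)

rs, p = spearmanr(step_mm, hip_score)
# A strongly negative rs with small p mirrors the reported finding that
# larger residual displacement in the weight-bearing zone means worse function.
```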

  11. Multifractal detrended cross-correlation analysis for epileptic patient in seizure and seizure free status

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Dutta, Srimonti; Chakraborty, Sayantan

    2014-01-01

    Highlights: • We analyze EEG of patients during seizure and in seizure free interval. • Data from different sections of the brain and seizure activity was analyzed. • Assessment of cross-correlation in seizure and seizure free interval using MF-DXA technique. - Abstract: This paper reports a study of EEG data of epileptic patients in terms of multifractal detrended cross-correlation analysis (MF-DXA). The EEG clinical data were obtained from the EEG Database available with the Clinic of Epileptology of the University Hospital of Bonn, Germany. The data sets (C, D, and E) were taken from five epileptic patients undergoing presurgical evaluations. The data sets consist of intracranial EEG recordings during seizure-free intervals (interictal periods) from within the epileptogenic zone (D) and from the hippocampal formation of the opposite hemisphere of the epileptic patients’ brain, respectively (C). The data set (E) was recorded during seizure activity (ictal periods). MF-DXA is a very rigorous and robust tool for assessment of cross-correlation between two nonlinear time series. The study reveals that the degree of cross-correlation between seizure and seizure-free intervals is higher in the epileptogenic zone. These data are very significant for the diagnosis, onset and prognosis of epileptic patients.
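
    MF-DXA generalizes detrended cross-correlation analysis (DCCA) to a range of moment orders q. The q = 2 special case can be sketched as follows: integrate both series into profiles, detrend each within windows of size s, and average the cross-covariance of the residuals to get a fluctuation function F(s). This is a minimal textbook-style sketch on synthetic coupled series (standing in for two EEG channels), not the authors' implementation, and it omits the multifractal (variable-q) part.

```python
import numpy as np

def dcca_fluctuation(x, y, scale):
    """Detrended cross-covariance fluctuation F(s) at one window size
    (the q = 2 special case of MF-DXA)."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n = (len(X) // scale) * scale
    t = np.arange(scale)
    f2 = []
    for i in range(0, n, scale):
        xs, ys = X[i:i + scale], Y[i:i + scale]
        # Remove a linear trend from each window before correlating residuals
        xr = xs - np.polyval(np.polyfit(t, xs, 1), t)
        yr = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2.append(np.mean(xr * yr))
    return np.sqrt(np.abs(np.mean(f2)))

rng = np.random.default_rng(3)
common = rng.normal(size=4000)
x = common + 0.5 * rng.normal(size=4000)   # two series sharing a common
y = common + 0.5 * rng.normal(size=4000)   # component, like coupled channels
F = [dcca_fluctuation(x, y, s) for s in (16, 32, 64, 128)]
# The slope of log F(s) vs log s estimates the cross-correlation scaling exponent.
```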

  12. Motivational Basis of Personality Traits: A Meta-Analysis of Value-Personality Correlations.

    Science.gov (United States)

    Fischer, Ronald; Boer, Diana

    2015-10-01

    We investigated the relationships between personality traits and basic value dimensions. Furthermore, we developed novel country-level hypotheses predicting that contextual threat moderates value-personality trait relationships. We conducted a three-level v-known meta-analysis of correlations between Big Five traits and Schwartz's (1992) 10 values involving 9,935 participants from 14 countries. Variations in contextual threat (measured as resource threat, ecological threat, and restrictive social institutions) were used as country-level moderator variables. We found systematic relationships between Big Five traits and human values that varied across contexts. Overall, correlations between Openness traits and the Conservation value dimension and Agreeableness traits and the Transcendence value dimension were strongest across all samples. Correlations between values and all personality traits (except Extraversion) were weaker in contexts with greater financial, ecological, and social threats. In contrast, stronger personality-value links are typically found in contexts with low financial and ecological threats and more democratic institutions and permissive social context. These effects explained on average more than 10% of the variability in value-personality correlations. Our results provide strong support for systematic linkages between personality and broad value dimensions, but they also point out that these relations are shaped by contextual factors. © 2014 Wiley Periodicals, Inc.
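
    A meta-analysis of correlations like this one pools per-sample effect sizes weighted by their sampling precision. The simplest fixed-effect version uses the Fisher z-transform, whose sampling variance is 1/(n - 3), giving inverse-variance weights of n - 3. The numbers below are illustrative, not taken from the paper:

```python
import numpy as np

# Hypothetical per-sample correlations between a trait and a value
# dimension, with their sample sizes (invented for illustration).
r = np.array([0.25, 0.31, 0.18, 0.40, 0.22])
n = np.array([120, 200, 90, 150, 300])

# Fisher z-transform, inverse-variance (n - 3) weighting, back-transform
z = np.arctanh(r)
z_bar = np.average(z, weights=n - 3)
r_pooled = np.tanh(z_bar)
# r_pooled is the fixed-effect pooled correlation across the five samples.
```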

  13. Structure function analysis of long-range correlations in plasma turbulence

    International Nuclear Information System (INIS)

    Yu, C.X.; Gilmore, M.; Peebles, W.A.; Rhodes, T.L.

    2003-01-01

    Long-range correlations (temporal and spatial) have been predicted in a number of different turbulence models, both analytical and numerical. These long-range correlations are thought to significantly affect cross-field turbulent transport in magnetically confined plasmas. The Hurst exponent, H - one of a number of methods to identify the existence of long-range correlations in experimental data - can be used to quantify self-similarity scalings and correlations in the mesoscale temporal range. The Hurst exponent can be calculated by several different algorithms, each of which has particular advantages and disadvantages. One method for calculating H is via structure functions (SFs). The SF method is a robust technique for determining H with several inherent advantages that has not yet been widely used in plasma turbulence research. In this article, the SF method and its advantages are discussed in detail, using both simulated and measured fluctuation data from the DIII-D tokamak [J. L. Luxon and L. G. Davis, Fusion Technol. 8, 441 (1985)]. In addition, it is shown that SFs used in conjunction with rescaled range analysis (another method for calculating H) can be used to mitigate the effects of coherent modes in some cases
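
    The structure-function route to H can be sketched directly: the order-q structure function S_q(tau) = <|x(t + tau) - x(t)|^q> scales as tau^(qH) for a self-similar signal, so H is the log-log slope divided by q. A minimal sketch on synthetic Brownian motion, for which H = 0.5, rather than plasma fluctuation data:

```python
import numpy as np

def hurst_sf(x, lags, q=2):
    """Estimate the Hurst exponent H from the order-q structure function
    S_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^(q*H)."""
    S = [np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags]
    slope = np.polyfit(np.log(lags), np.log(S), 1)[0]
    return slope / q

rng = np.random.default_rng(4)
# Brownian motion (cumulative sum of white noise) has H = 0.5
bm = np.cumsum(rng.normal(size=20000))
H = hurst_sf(bm, lags=[2, 4, 8, 16, 32, 64])
```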

  14. Multivariate correlation analysis of eye cyclotorsion degree in corneal refractive surgery

    Directory of Open Access Journals (Sweden)

    Xiao-Guang Niu

    2015-06-01

    Full Text Available AIM: To explore the correlation between eye cyclotorsion degrees and patients' age, gender, diopter and other factors in corneal refractive surgery. METHODS: A total of 762 wavefront-guided LASIK patients (1524 eyes) treated from January 2010 to December 2013 in our hospital were retrospectively analyzed. Iris recognition was accomplished successfully and eye cyclotorsion degrees were recorded intraoperatively for all patients. The correlations between eye cyclotorsion degrees and age, gender, eye laterality, diopter and eye dominance were statistically analyzed: correlation analysis was used for age and diopter, while gender, eye laterality and eye dominance were analyzed using the t-test. RESULTS: Eye cyclotorsion ranged from 0 to 9.7 degrees, with an average of 3.08±2.22 degrees. The average cyclotorsion of the 444 men (888 eyes) was 3.05±2.26 degrees and that of the 318 women (636 eyes) was 3.12±2.15 degrees, with no significant difference (t=1.905, P=0.168). The average age of the patients was 22.6±5.4y. No significant correlation was found between cyclotorsion degrees and age (r=-0.012, P=0.748). The mean spherical equivalent was -4.76±1.77D, with no significant correlation with cyclotorsion degrees (r=0.017, P=0.633). The mean cylinder was -0.60±0.64D, with no significant correlation with cyclotorsion degrees (r=-0.004, P=0.910). The cyclotorsion of the dominant eyes was 3.0±2.17 degrees and that of the non-dominant eyes was 3.11±2.12 degrees, with no significant difference (t=-0.521, P=0.603). CONCLUSION: The eye cyclotorsion occurring in LASIK surgery had no correlation with age, gender, eye laterality, diopter or eye dominance.
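
    The gender comparison in this study is an independent-samples t-test. The sketch below resamples hypothetical cyclotorsion values near the reported group means and standard deviations; these are simulated draws, not the study's measurements (a real analysis would use the recorded, nonnegative degrees).

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(8)
# Hypothetical cyclotorsion degrees for men (888 eyes) and women (636 eyes),
# drawn near the means and SDs reported in the abstract.
men = rng.normal(3.05, 2.26, size=888)
women = rng.normal(3.12, 2.15, size=636)

t, p = ttest_ind(men, women)
# A p-value above 0.05 would match the reported lack of a gender difference.
```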

  15. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
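
    Mutual information between two uncertain quantities can be estimated from their joint histogram with the plug-in estimator I(X;Y) = sum p(x,y) log2[p(x,y)/(p(x)p(y))]. The sketch below uses synthetic "depth" samples standing in for model realisations at different locations; the variable names are invented for illustration:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """I(X;Y) in bits from a joint histogram (plug-in estimator)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                    # skip empty bins (0 * log 0 = 0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz]))

rng = np.random.default_rng(5)
# Simulated layer depths at three locations: a and b share information
# (their uncertainties are correlated), c is independent of a.
depth_a = rng.normal(size=5000)
depth_b = depth_a + 0.3 * rng.normal(size=5000)
depth_c = rng.normal(size=5000)
# Knowing depth_a reduces uncertainty about depth_b far more than about depth_c.
```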

  16. Multiset canonical correlations analysis and multispectral, truly multitemporal remote sensing data.

    Science.gov (United States)

    Nielsen, Allan Aasbjerg

    2002-01-01

    This paper describes two- and multiset canonical correlations analysis (CCA) for data fusion, multisource, multiset, or multitemporal exploratory data analysis. These techniques transform multivariate multiset data into new orthogonal variables called canonical variates (CVs) which, when applied in remote sensing, exhibit ever-decreasing similarity (as expressed by correlation measures) over sets consisting of 1) spectral variables at fixed points in time (R-mode analysis), or 2) temporal variables with fixed wavelengths (T-mode analysis). The CVs are invariant to linear and affine transformations of the original variables within sets which means, for example, that the R-mode CVs are insensitive to changes over time in offset and gain in a measuring device. In a case study, CVs are calculated from Landsat Thematic Mapper (TM) data with six spectral bands over six consecutive years. Both R- and T-mode CVs clearly exhibit the desired characteristic: they show maximum similarity for the low-order canonical variates and minimum similarity for the high-order canonical variates. These characteristics are seen both visually and in objective measures. The results from the multiset CCA R- and T-mode analyses are very different. This difference is ascribed to the noise structure in the data. The CCA methods are related to partial least squares (PLS) methods. This paper very briefly describes multiset CCA-based multiset PLS. Also, the CCA methods can be applied as multivariate extensions to empirical orthogonal functions (EOF) techniques. Multiset CCA is well-suited for inclusion in geographical information systems (GIS).
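
    Two-set CCA can be computed as the SVD of the whitened cross-covariance matrix; the singular values are the canonical correlations, which decrease in order just as the CVs' similarity decreases in the abstract. A textbook-style sketch on synthetic data sets sharing one common component (not the Landsat TM data):

```python
import numpy as np

def cca(X, Y):
    """Two-set canonical correlations via SVD of the whitened
    cross-covariance (a standard construction, not the paper's code)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = len(X)
    Sxx, Syy = Xc.T @ Xc / n, Yc.T @ Yc / n
    Sxy = Xc.T @ Yc / n

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(K, compute_uv=False)   # canonical correlations

rng = np.random.default_rng(6)
shared = rng.normal(size=(500, 1))
X = np.hstack([shared + 0.3 * rng.normal(size=(500, 1)) for _ in range(3)])
Y = np.hstack([shared + 0.3 * rng.normal(size=(500, 1)) for _ in range(3)])
rho = cca(X, Y)   # decreasing canonical correlations, strongest first
```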

  17. Parent heparin and daughter LMW heparin correlation analysis using LC-MS and NMR

    International Nuclear Information System (INIS)

    Liu, Xinyue; St Ange, Kalib; Wang, Xiaohua; Lin, Lei; Zhang, Fuming

    2017-01-01

    Heparin is a structurally complex polysaccharide anticoagulant derived from livestock, primarily porcine intestinal tissues. Low molecular weight (LMW) heparins are derived through the controlled partial depolymerization of heparin. Increased manufacturing and regulatory concerns have provided the motivation for the development of more sophisticated analytical methods for determining both their structure and pedigree. A strategy for the comprehensive comparison of parent heparins and their LMW heparin daughters is described that relies on the analysis of monosaccharide composition, disaccharide composition, and oligosaccharide composition. Liquid chromatography-mass spectrometry is rapid, robust, and amenable to automated processing and interpretation of both top-down and bottom-up analyses. Nuclear magnetic resonance spectroscopy provides complementary top-down information on the chirality of the uronic acid residues and glucosamine substitution. Principal component analysis (PCA) was applied to the normalized abundance of oligosaccharides, calculated in the bottom-up analysis, to show parent and daughter correlation in oligosaccharide composition. Using these approaches, six pairs of parent heparins and their daughter generic enoxaparins from two different manufacturers were comprehensively analyzed. Enoxaparin is the most widely used LMW heparin and is prepared through controlled chemical β-eliminative cleavage of porcine intestinal heparin. Lovenox®, the innovator version of enoxaparin marketed in the US, was analyzed as a reference for the daughter LMW heparins. The results show similarity of the LMW heparins from the two manufacturers to Lovenox®, excellent lot-to-lot consistency of products from each manufacturer, and a correlation between each parent heparin and its daughter LMW heparin. - Highlights: • Low molecular weight heparins prepared from different heparin parents were analyzed. • An integrated analytical approach relied

  18. Parent heparin and daughter LMW heparin correlation analysis using LC-MS and NMR

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xinyue, E-mail: liux22@rpi.edu [National Glycoengineering Research Center, Shandong Provincial Key Laboratory of Carbohydrate Chemistry and Glycobiology, State Key Laboratory of Microbial Technology, Shandong University, Jinan, Shandong, 250100 (China); Department of Chemistry and Chemical Biology, Department of Chemical and Biological Engineering, Department of Biology, Department of Biomedical Engineering, Center for Biotechnology and Interdisciplinary Studies, Rensselaer Polytechnic Institute, Troy, NY, 12180 (United States); St Ange, Kalib, E-mail: stangk2@rpi.edu [Department of Chemistry and Chemical Biology, Department of Chemical and Biological Engineering, Department of Biology, Department of Biomedical Engineering, Center for Biotechnology and Interdisciplinary Studies, Rensselaer Polytechnic Institute, Troy, NY, 12180 (United States); Wang, Xiaohua, E-mail: wangx35@rpi.edu [Department of Chemistry and Chemical Biology, Department of Chemical and Biological Engineering, Department of Biology, Department of Biomedical Engineering, Center for Biotechnology and Interdisciplinary Studies, Rensselaer Polytechnic Institute, Troy, NY, 12180 (United States); School of Computer and Information, Hefei University of Technology, Hefei (China); Lin, Lei, E-mail: Linl5@rpi.edu [Department of Chemistry and Chemical Biology, Department of Chemical and Biological Engineering, Department of Biology, Department of Biomedical Engineering, Center for Biotechnology and Interdisciplinary Studies, Rensselaer Polytechnic Institute, Troy, NY, 12180 (United States); Zhang, Fuming, E-mail: zhangf2@rpi.edu [Department of Chemistry and Chemical Biology, Department of Chemical and Biological Engineering, Department of Biology, Department of Biomedical Engineering, Center for Biotechnology and Interdisciplinary Studies, Rensselaer Polytechnic Institute, Troy, NY, 12180 (United States); and others

    2017-04-08

    Heparin is a structurally complex polysaccharide anticoagulant derived from livestock, primarily porcine intestinal tissues. Low molecular weight (LMW) heparins are derived through the controlled partial depolymerization of heparin. Increased manufacturing and regulatory concerns have provided the motivation for the development of more sophisticated analytical methods for determining both their structure and pedigree. A strategy for the comprehensive comparison of parent heparins and their LMW heparin daughters is described that relies on the analysis of monosaccharide composition, disaccharide composition, and oligosaccharide composition. Liquid chromatography-mass spectrometry is rapid, robust, and amenable to automated processing and interpretation of both top-down and bottom-up analyses. Nuclear magnetic resonance spectroscopy provides complementary top-down information on the chirality of the uronic acid residues and glucosamine substitution. Principal component analysis (PCA) was applied to the normalized abundance of oligosaccharides, calculated in the bottom-up analysis, to show parent and daughter correlation in oligosaccharide composition. Using these approaches, six pairs of parent heparins and their daughter generic enoxaparins from two different manufacturers were comprehensively analyzed. Enoxaparin is the most widely used LMW heparin and is prepared through controlled chemical β-eliminative cleavage of porcine intestinal heparin. Lovenox®, the innovator version of enoxaparin marketed in the US, was analyzed as a reference for the daughter LMW heparins. The results show similarity of the LMW heparins from the two manufacturers to Lovenox®, excellent lot-to-lot consistency of products from each manufacturer, and a correlation between each parent heparin and its daughter LMW heparin. - Highlights: • Low molecular weight heparins prepared from different heparin parents were analyzed. • An integrated analytical
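
    The PCA step, applied to normalized oligosaccharide abundances, reduces each sample to a few scores in which parent-daughter pairs from the same manufacturer should cluster together. A sketch on hypothetical composition profiles; the manufacturer groupings and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical oligosaccharide abundance profiles over 20 species:
# six samples per manufacturer, clustered around a manufacturer mean.
base_a, base_b = rng.random(20), rng.random(20)
profiles = np.vstack([base_a + 0.02 * rng.normal(size=20) for _ in range(6)] +
                     [base_b + 0.02 * rng.normal(size=20) for _ in range(6)])
profiles /= profiles.sum(axis=1, keepdims=True)   # normalize each profile

# PCA via SVD of the centered data matrix; scores[:, :2] would be plotted
centered = profiles - profiles.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * s
# With tight lot-to-lot consistency, the first component dominates and
# separates the two manufacturers' clusters.
```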

  19. Pet Bottle Design, Correlation Analysis Of Pet Bottle Characteristics Subjective Judgment

    Directory of Open Access Journals (Sweden)

    Darko Avramović

    2012-06-01

    Full Text Available Ability to predict consumers' reactions to a particular design solution of a product is very important. Gathering and analysis of subjective judgments of particular characteristics, on which the aesthetics of the product are judged, is one way of predicting consumers' reactions in the future. Knowledge gathered in this manner can serve as a reference for further studies of the determining factors for aesthetic results and design quality. There are two opposed opinions regarding the prediction of aesthetic impression. One opinion is that the taste of an individual cannot be discussed because it is extremely variable, and rejects the possibility of meaningful analysis of aesthetic impression. The other opinion states that there is a consistent preference for certain aesthetic characteristics despite individual and group differences. The main goal of this paper is to examine the correlation between subjective judgments of certain PET bottle characteristics. The analysis showed meaningful correlation between some of the PET bottle characteristics, while other characteristics showed less correlation. It can be concluded that not all of the characteristics have the same influence on the aesthetics and design quality of the PET bottle form. Emphasizing the characteristics relevant to the aesthetics of the product can produce better market results, taking into account that consumers buy the product they consider to be more attractive if other parameters of the product are similar.

  20. Analysis on correlation between overall classification on color doppler ultrasound and clinical stages of atherosclerosis obliterans

    International Nuclear Information System (INIS)

    Zhang Dongmei; Liu Meihan; Shi Weidong; Chen Enqi; Li Xinying; Lin Yu

    2010-01-01

    Objective: To investigate the correlation between the overall classification on color Doppler ultrasound and the clinical stages of atherosclerosis obliterans (ASO), its clinical significance, and to evaluate the extent of arterial lesions comprehensively. Methods: 125 ASO patients were divided into mild, moderate and severe groups by color Doppler ultrasound according to differences in occlusion, quantity, degree of stenosis and number of collaterals; these groups were compared with the clinical stages, and their association was studied with Spearman rank analysis. Results: The clinical manifestations of the mild, moderate and severe groups classified by color Doppler ultrasound became progressively more severe, and correlated positively with clinical stages I, II and III. Spearman rank analysis showed a correlation coefficient (rs) of 0.7972 between the two gradings (P<0.01); there was good consistency between the overall classification on color Doppler ultrasound and the clinical stages of ASO. Conclusion: The overall classification of ASO on color Doppler ultrasound takes into account the impact of many factors on the clinical symptoms, such as the degree of local narrowing, the extent of narrowing, occluded segments and collateral arteries; it grades the lesions more objectively and shows good consistency with the clinical stages. (authors)