WorldWideScience

Sample records for mahalanobis distance metric

  1. KM-FCM: A fuzzy clustering optimization algorithm based on Mahalanobis distance

    Directory of Open Access Journals (Sweden)

    Zhiwen ZU

    2018-04-01

    Full Text Available The traditional fuzzy clustering algorithm uses Euclidean distance as the similarity criterion, which is disadvantageous for multidimensional data processing. To address this, the Mahalanobis distance is used instead of the traditional Euclidean distance, and the optimization of fuzzy clustering based on the Mahalanobis distance is studied to enhance the clustering effect and capability. With initialization performed by a heuristic search algorithm combined with the k-means algorithm, and with a validity function that automatically adjusts the optimal number of clusters, an optimization algorithm KM-FCM is proposed. The new algorithm is compared with the FCM, FCM-M and M-FCM algorithms on three standard data sets. The experimental results show that the KM-FCM algorithm is effective: it has higher clustering accuracy than FCM, FCM-M and M-FCM and recognizes high-dimensional data clusters well; it has a global optimization effect; and the number of clusters does not need to be set in advance. The new algorithm provides a reference for the optimization of fuzzy clustering algorithms based on the Mahalanobis distance.
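
    The record above describes replacing the Euclidean distance in fuzzy c-means with the Mahalanobis distance. The following is a minimal sketch of that substitution (not the authors' KM-FCM implementation); the per-cluster covariances, parameter names and the toy initialization are illustrative assumptions.

    ```python
    # A minimal sketch: standard fuzzy c-means memberships computed from
    # Mahalanobis rather than Euclidean distances, with per-cluster covariances.
    import numpy as np

    def mahalanobis_sq(X, center, cov):
        """Squared Mahalanobis distance of each row of X to one cluster center."""
        VI = np.linalg.inv(cov)
        diff = X - center
        return np.einsum('ij,jk,ik->i', diff, VI, diff)

    def fuzzy_memberships(X, centers, covs, m=2.0):
        """Standard FCM membership update with Mahalanobis distances plugged in."""
        d2 = np.stack([mahalanobis_sq(X, c, S) for c, S in zip(centers, covs)], axis=1)
        d2 = np.maximum(d2, 1e-12)                   # guard against zero distances
        ratio = d2[:, :, None] / d2[:, None, :]      # d2_ik / d2_il for every l
        return 1.0 / np.sum(ratio ** (1.0 / (m - 1.0)), axis=2)

    # toy usage with a k-means-style initialization, as the abstract suggests
    X = np.random.randn(200, 4)
    centers = [X[:100].mean(axis=0), X[100:].mean(axis=0)]
    covs = [np.cov(X[:100], rowvar=False), np.cov(X[100:], rowvar=False)]
    U = fuzzy_memberships(X, centers, covs)          # shape (200, 2), rows sum to 1
    ```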

  2. Mahalanobis Distance Based Iterative Closest Point

    DEFF Research Database (Denmark)

    Hansen, Mads Fogtmann; Blas, Morten Rufus; Larsen, Rasmus

    2007-01-01

    the notion of a Mahalanobis distance map upon a point set with associated covariance matrices which in addition to providing correlation-weighted distances implicitly provides a method for assigning correspondence during alignment. This distance map provides an easy formulation of the ICP problem that permits...... a fast optimization. Initially, the covariance matrices are set to the identity matrix, and all shapes are aligned to a randomly selected shape (equivalent to standard ICP). From this point the algorithm iterates between the steps: (a) obtain mean shape and new estimates of the covariance matrices from...... the aligned shapes, (b) align shapes to the mean shape. Three different methods for estimating the mean shape with associated covariance matrices are explored in the paper. The proposed methods are validated experimentally on two separate datasets (IMM face dataset and femur-bones). The superiority of ICP...

  3. Mahalanobis distance and variable selection to optimize dose response

    International Nuclear Information System (INIS)

    Moore, D.H. II; Bennett, D.E.; Wyrobek, A.J.; Kranzler, D.

    1979-01-01

    A battery of statistical techniques is combined to improve detection of low-level dose response. First, Mahalanobis distances are used to classify objects as normal or abnormal. Then the proportion classified abnormal is regressed on dose. Finally, a subset of regressor variables is selected which maximizes the slope of the dose-response line. Use of the techniques is illustrated by application to mouse sperm damaged by low doses of X-rays.
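
    A rough sketch of the first two steps described above (classification by Mahalanobis distance, then regression of the abnormal proportion on dose), under the assumption of a chi-square cutoff for "abnormal"; the variable-subset selection step and all data structures are illustrative, not the authors' code.

    ```python
    # Flag objects as abnormal when their squared Mahalanobis distance to the
    # normal group exceeds a chi-square cutoff, then regress the abnormal
    # proportion on dose.
    import numpy as np
    from scipy import stats

    def dose_response_slope(features_by_dose, normal_mean, normal_cov, alpha=0.05):
        """features_by_dose: dict mapping dose -> (n_objects, n_features) array."""
        VI = np.linalg.inv(normal_cov)
        cutoff = stats.chi2.ppf(1 - alpha, df=len(normal_mean))   # threshold on D^2
        doses, proportions = [], []
        for dose, X in sorted(features_by_dose.items()):
            diff = X - normal_mean
            d2 = np.sum(diff @ VI * diff, axis=1)                 # squared distances
            doses.append(dose)
            proportions.append(np.mean(d2 > cutoff))
        slope, intercept, r, p, stderr = stats.linregress(doses, proportions)
        return slope, p
    ```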

  4. Limitations to mapping habitat-use areas in changing landscapes using the Mahalanobis distance statistic

    Science.gov (United States)

    Knick, Steven T.; Rotenberry, J.T.

    1998-01-01

    We tested the potential of a GIS mapping technique, using a resource selection model developed for black-tailed jackrabbits (Lepus californicus) and based on the Mahalanobis distance statistic, to track changes in shrubsteppe habitats in southwestern Idaho. If successful, the technique could be used to predict animal use areas, or those undergoing change, in different regions from the same selection function and variables without additional sampling. We determined the multivariate mean vector of 7 GIS variables that described habitats used by jackrabbits. We then ranked the similarity of all cells in the GIS coverage by their Mahalanobis distance to the mean habitat vector. The resulting map accurately depicted areas where we sighted jackrabbits on verification surveys. We then simulated an increase in shrublands (which are important habitats). Contrary to expectation, the new configurations were classified as lower similarity relative to the original mean habitat vector. Because the selection function is based on a unimodal mean, any deviation, even if biologically positive, creates larger Mahalanobis distances and lower similarity values. We recommend the Mahalanobis distance technique for mapping animal use areas when animals are distributed optimally, the landscape is well sampled to determine the mean habitat vector, and the distributions of the habitat variables do not change.
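
    A minimal sketch (variable names assumed) of the mapping step described above: every raster cell is ranked by its Mahalanobis distance to the mean habitat vector estimated from cells where the animals were observed.

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    def habitat_similarity(cells, used_cells):
        """cells: (n_cells, n_vars) GIS variables for every map cell;
        used_cells: (n_used, n_vars) variables for cells with animal observations."""
        mu = used_cells.mean(axis=0)                          # mean habitat vector
        VI = np.linalg.inv(np.cov(used_cells, rowvar=False))
        d = cdist(cells, mu[None, :], metric='mahalanobis', VI=VI).ravel()
        return d    # smaller distance = more similar to used habitat
    ```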

  5. Outlier detection by robust Mahalanobis distance in geological data obtained by INAA to provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Jose O. dos, E-mail: osmansantos@ig.com.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Sergipe (IFS), Lagarto, SE (Brazil); Munita, Casimiro S., E-mail: camunita@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Soares, Emilio A.A., E-mail: easoares@ufan.edu.br [Universidade Federal do Amazonas (UFAM), Manaus, AM (Brazil). Dept. de Geociencias

    2013-07-01

    The detection of outliers in geochemical studies is one of the main difficulties in the interpretation of a dataset, because outliers can disturb the statistical method. The search for outliers in geochemical studies is usually based on the Mahalanobis distance (MD), since points in multivariate space that lie farther than some predetermined distance from the center of the data are considered outliers. However, the MD is very sensitive to the presence of discrepant samples. Many robust estimators of location and covariance have been introduced in the literature, such as the Minimum Covariance Determinant (MCD) estimator. Using MCD estimators to calculate the MD leads to the so-called Robust Mahalanobis Distance (RD). In this context, RD was used in this work to detect outliers in a geological study of samples collected from the confluence of the Negro and Solimoes rivers. The purpose was to study the contributions of the sediments deposited by the Solimoes and Negro rivers to the filling of the tectonic depressions at Parana do Ariau. For that, 113 samples were analyzed by Instrumental Neutron Activation Analysis (INAA), in which the concentrations of As, Ba, Ce, Co, Cr, Cs, Eu, Fe, Hf, K, La, Lu, Na, Nd, Rb, Sb, Sc, Sm, U, Yb, Ta, Tb, Th and Zn were determined. From the dataset it was possible to construct the tolerance ellipse corresponding to the robust Mahalanobis distance for each group of samples. The samples falling outside the tolerance ellipse were considered outliers. The results showed that the Robust Mahalanobis Distance was more appropriate for the identification of the outliers, since it is a more restrictive method. (author)
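
    A sketch of the screening idea using scikit-learn's MCD estimator; the chi-square tolerance ellipse used as the cutoff is an assumption, not a detail taken from the paper.

    ```python
    import numpy as np
    from scipy import stats
    from sklearn.covariance import MinCovDet

    def robust_outliers(X, quantile=0.975):
        """Flag rows of X whose robust (MCD-based) squared Mahalanobis distance
        falls outside the chi-square tolerance ellipse."""
        mcd = MinCovDet().fit(X)
        rd2 = mcd.mahalanobis(X)                        # squared robust distances
        cutoff = stats.chi2.ppf(quantile, df=X.shape[1])
        return rd2 > cutoff

    # usage: X holds one row per sediment sample, one column per element concentration
    ```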

  6. Outlier detection by robust Mahalanobis distance in geological data obtained by INAA to provenance studies

    International Nuclear Information System (INIS)

    Santos, Jose O. dos; Munita, Casimiro S.; Soares, Emilio A.A.

    2013-01-01

    The detection of outliers in geochemical studies is one of the main difficulties in the interpretation of a dataset, because outliers can disturb the statistical method. The search for outliers in geochemical studies is usually based on the Mahalanobis distance (MD), since points in multivariate space that lie farther than some predetermined distance from the center of the data are considered outliers. However, the MD is very sensitive to the presence of discrepant samples. Many robust estimators of location and covariance have been introduced in the literature, such as the Minimum Covariance Determinant (MCD) estimator. Using MCD estimators to calculate the MD leads to the so-called Robust Mahalanobis Distance (RD). In this context, RD was used in this work to detect outliers in a geological study of samples collected from the confluence of the Negro and Solimoes rivers. The purpose was to study the contributions of the sediments deposited by the Solimoes and Negro rivers to the filling of the tectonic depressions at Parana do Ariau. For that, 113 samples were analyzed by Instrumental Neutron Activation Analysis (INAA), in which the concentrations of As, Ba, Ce, Co, Cr, Cs, Eu, Fe, Hf, K, La, Lu, Na, Nd, Rb, Sb, Sc, Sm, U, Yb, Ta, Tb, Th and Zn were determined. From the dataset it was possible to construct the tolerance ellipse corresponding to the robust Mahalanobis distance for each group of samples. The samples falling outside the tolerance ellipse were considered outliers. The results showed that the Robust Mahalanobis Distance was more appropriate for the identification of the outliers, since it is a more restrictive method. (author)

  7. Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.

    Science.gov (United States)

    Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si

    2017-07-01

    Water quality assessment is crucial for assessment of marine eutrophication, prediction of harmful algal blooms, and environment protection. Previous studies have developed many numeric modeling methods and data-driven approaches for water quality assessment. Cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have always been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment with hierarchical cluster analysis based on the Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and North Yellow Sea of China, and apply the clustering results to evaluate water quality. To evaluate the validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, as widely adopted by previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, it is the first attempt to apply the Mahalanobis distance to coastal water quality assessment.
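
    A minimal sketch of hierarchical clustering on Mahalanobis distances; the linkage method and number of clusters below are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    def cluster_stations(X, n_clusters=4):
        """X: (n_stations, n_variables) matrix of water-quality measurements."""
        VI = np.linalg.inv(np.cov(X, rowvar=False))    # captures variable correlations
        D = pdist(X, metric='mahalanobis', VI=VI)      # condensed distance matrix
        Z = linkage(D, method='average')
        return fcluster(Z, t=n_clusters, criterion='maxclust')
    ```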

  8. Active Metric Learning for Supervised Classification

    OpenAIRE

    Kumaran, Krishnan; Papageorgiou, Dimitri; Chang, Yutong; Li, Minhan; Takáč, Martin

    2018-01-01

    Clustering and classification critically rely on distance metrics that provide meaningful comparisons between data points. We present mixed-integer optimization approaches to find optimal distance metrics that generalize the Mahalanobis metric extensively studied in the literature. Additionally, we generalize and improve upon leading methods by removing reliance on pre-designated "target neighbors," "triplets," and "similarity pairs." Another salient feature of our method is its ability to en...

  9. Protein-protein interaction site predictions with minimum covariance determinant and Mahalanobis distance.

    Science.gov (United States)

    Qiu, Zhijun; Zhou, Bo; Yuan, Jiangfeng

    2017-11-21

    Protein-protein interaction site (PPIS) prediction must deal with the diversity of interaction sites that limits prediction accuracy. Use of proteins with unknown or unidentified interactions can also lead to missing interfaces, and such data errors are often brought into the training dataset. In response to these two problems, we used the minimum covariance determinant (MCD) method, with its ability to remove outliers, to refine the training data and build a predictor with better performance. In order to predict test data in practice, a method based on the Mahalanobis distance was devised to select proper test data as input for the predictor. With leave-one-out validation and an independent test, after the Mahalanobis distance screening, our method achieved higher performance according to the Matthews correlation coefficient (MCC), although only a part of the test data could be predicted. These results indicate that data refinement is an efficient approach to improve protein-protein interaction site prediction. By further optimizing our method, we hope to develop predictors with better performance and a wide range of application. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.

  11. Alignment-free genome tree inference by learning group-specific distance metrics.

    Science.gov (United States)

    Patil, Kaustubh R; McHardy, Alice C

    2013-01-01

    Understanding the evolutionary relationships between organisms is vital for their in-depth study. Gene-based methods are often used to infer such relationships, which are not without drawbacks. One can now attempt to use genome-scale information, because of the ever increasing number of genomes available. This opportunity also presents a challenge in terms of computational efficiency. Two fundamentally different methods are often employed for sequence comparisons, namely alignment-based and alignment-free methods. Alignment-free methods rely on the genome signature concept and provide a computationally efficient way that is also applicable to nonhomologous sequences. The genome signature contains evolutionary signal as it is more similar for closely related organisms than for distantly related ones. We used genome-scale sequence information to infer taxonomic distances between organisms without additional information such as gene annotations. We propose a method to improve genome tree inference by learning specific distance metrics over the genome signature for groups of organisms with similar phylogenetic, genomic, or ecological properties. Specifically, our method learns a Mahalanobis metric for a set of genomes and a reference taxonomy to guide the learning process. By applying this method to more than a thousand prokaryotic genomes, we showed that, indeed, better distance metrics could be learned for most of the 18 groups of organisms tested here. Once a group-specific metric is available, it can be used to estimate the taxonomic distances for other sequenced organisms from the group. This study also presents a large scale comparison between 10 methods--9 alignment-free and 1 alignment-based.

  12. Distance Metric Tracking

    Science.gov (United States)

    2016-03-02

    ... where B_ψ is any Bregman divergence and η_t is the learning rate parameter. From (Hall & Willett, 2015) we have Theorem 1, stated in terms of G_ℓ = max_{θ∈Θ, ℓ∈L} ‖∇f(θ)‖ and φ_max ... the Kullback-Leibler divergence between an initial guess of the matrix that parameterizes the Mahalanobis distance and a solution that satisfies a set of constraints ... M̂_0 and μ̂_0 are initialized to some initial value. In [18] a closed-form algorithm for solving the update is given.

  13. [Mahalanobis distance based hyperspectral characteristic discrimination of leaves of different desert tree species].

    Science.gov (United States)

    Lin, Hai-jun; Zhang, Hui-fang; Gao, Ya-qi; Li, Xia; Yang, Fan; Zhou, Yan-fei

    2014-12-01

    The hyperspectral reflectance of Populus euphratica, Tamarix hispida, Haloxylon ammodendron and Calligonum mongolicum in the lower reaches of the Tarim River and the Turpan Desert Botanical Garden was measured using the HR-768 field-portable spectroradiometer. Continuum removal, first-derivative reflectance and second-derivative reflectance were used to process the original spectral data of the four tree species. The Mahalanobis distance was used to select the bands with significant differences in the original and transformed spectral data for identifying the different tree species, and progressive discriminant analyses were used to test the selected bands. The results showed that the Mahalanobis distance method is effective for feature band extraction, and that the bands for identifying the different tree species were mostly in the near-infrared. The recognition accuracy of the four methods was 85%, 93.8%, 92.4% and 95.5%, respectively; spectral transformation could improve recognition accuracy, although accuracy differed among research objects and transformation methods. The research provides a basis for desert tree species classification, biodiversity monitoring and area analysis in deserts using large-scale remote sensing.

  14. A method for assigning species into groups based on generalized Mahalanobis distance between habitat model coefficients

    Science.gov (United States)

    Williams, C.J.; Heglund, P.J.

    2009-01-01

    Habitat association models are commonly developed for individual animal species using generalized linear modeling methods such as logistic regression. We considered the issue of grouping species based on their habitat use so that management decisions can be based on sets of species rather than individual species. This research was motivated by a study of western landbirds in northern Idaho forests. The method we examined was to separately fit models to each species and to use a generalized Mahalanobis distance between coefficient vectors to create a distance matrix among species. Clustering methods were used to group species from the distance matrix, and multidimensional scaling methods were used to visualize the relations among species groups. Methods were also discussed for evaluating the sensitivity of the conclusions because of outliers or influential data points. We illustrate these methods with data from the landbird study conducted in northern Idaho. Simulation results are presented to compare the success of this method to alternative methods using Euclidean distance between coefficient vectors and to methods that do not use habitat association models. These simulations demonstrate that our Mahalanobis-distance-based method was nearly always better than Euclidean-distance-based methods or methods not based on habitat association models. The methods used to develop candidate species groups are easily explained to other scientists and resource managers since they mainly rely on classical multivariate statistical methods. © 2008 Springer Science+Business Media, LLC.
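
    A hedged sketch of grouping species by a generalized Mahalanobis distance between fitted habitat-model coefficient vectors. Using the sum of the two coefficient covariance matrices is one reasonable reading of such a distance, not necessarily the authors' exact formulation, and all names are illustrative.

    ```python
    import numpy as np
    from scipy.spatial.distance import squareform
    from scipy.cluster.hierarchy import linkage

    def species_distance_matrix(coefs, covs):
        """coefs: list of coefficient vectors (one per species);
        covs: list of their estimated covariance matrices."""
        k = len(coefs)
        D = np.zeros((k, k))
        for i in range(k):
            for j in range(i + 1, k):
                diff = coefs[i] - coefs[j]
                VI = np.linalg.inv(covs[i] + covs[j])   # covariance of the difference
                D[i, j] = D[j, i] = np.sqrt(diff @ VI @ diff)
        return D

    # candidate species groups: Z = linkage(squareform(D), method='average')
    ```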

  15. Is flood risk capitalized into real estate market values? : a Mahalanobis-metric matching approach to housing market in Busan, South Korea

    Science.gov (United States)

    Jung, E.; Yoon, H.

    2016-12-01

    Natural disasters are a substantial source of social and economic damage around the globe. The amount of damage is larger when such catastrophic events happen in urbanized areas where wealth is concentrated. Disasters cause losses in real estate assets, incurring additional costs for repair and maintenance of the properties. For this reason, natural hazard risk such as flooding and landslide is regarded as one of the important determinants of homebuyers' choice and preference. In this research, we aim to reveal whether past records of flooding affect real estate market values in Busan, Korea in 2014, under the hypothesis that homebuyers' perception of natural hazard is reflected in housing values, using the Mahalanobis-metric matching method. Unlike the hedonic pricing model conventionally used to estimate the capitalization of flood risk into property sales prices, the analytical method adopted here enables inferring causal effects by efficiently controlling for observed/unobserved omitted variable bias. This matching approach pairs each inundated property (treatment) with the non-inundated property (control) closest to it in Mahalanobis distance, and compares their effects on residential property sales price (outcome). As a result, we expect price discounts for inundated properties larger than those for comparable non-inundated properties. This research will be valuable for establishing mitigation policies for future climate change that relieve possible negative economic consequences of disasters, by estimating how people perceive and respond to natural hazards. This work was supported by the Korea Environmental Industry and Technology Institute (KEITI) under Grant (No. 2014-001-310007).
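
    A minimal sketch of Mahalanobis-metric matching as described above: each inundated (treated) property is paired with the nearest non-inundated (control) property in covariate space. Variable names are illustrative, not taken from the study.

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    def mahalanobis_match(X_treated, X_control):
        """Return, for each treated unit, the index of its closest control unit."""
        X_all = np.vstack([X_treated, X_control])
        VI = np.linalg.inv(np.cov(X_all, rowvar=False))
        D = cdist(X_treated, X_control, metric='mahalanobis', VI=VI)
        return D.argmin(axis=1)

    # effect estimate on matched pairs:
    # idx = mahalanobis_match(X_t, X_c); effect = np.mean(price_t - price_c[idx])
    ```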

  16. Modified Mahalanobis Taguchi System for Imbalance Data Classification

    Directory of Open Access Journals (Sweden)

    Mahmoud El-Banna

    2017-01-01

    Full Text Available The Mahalanobis Taguchi System (MTS) is considered one of the most promising binary classification algorithms to handle imbalance data. Unfortunately, MTS lacks a method for determining an efficient threshold for the binary classification. In this paper, a nonlinear optimization model is formulated based on minimizing the distance between the MTS Receiver Operating Characteristics (ROC) curve and the theoretical optimal point, named the Modified Mahalanobis Taguchi System (MMTS). To validate the MMTS classification efficacy, it has been benchmarked with Support Vector Machines (SVMs), Naive Bayes (NB), Probabilistic Mahalanobis Taguchi Systems (PTM), Synthetic Minority Oversampling Technique (SMOTE), Adaptive Conformal Transformation (ACT), Kernel Boundary Alignment (KBA), Hidden Naive Bayes (HNB), and other improved Naive Bayes algorithms. MMTS outperforms the benchmarked algorithms especially when the imbalance ratio is greater than 400. A real life case study on manufacturing sector is used to demonstrate the applicability of the proposed model and to compare its performance with Mahalanobis Genetic Algorithm (MGA).

  17. Modified Mahalanobis Taguchi System for Imbalance Data Classification

    Science.gov (United States)

    2017-01-01

    The Mahalanobis Taguchi System (MTS) is considered one of the most promising binary classification algorithms to handle imbalance data. Unfortunately, MTS lacks a method for determining an efficient threshold for the binary classification. In this paper, a nonlinear optimization model is formulated based on minimizing the distance between MTS Receiver Operating Characteristics (ROC) curve and the theoretical optimal point named Modified Mahalanobis Taguchi System (MMTS). To validate the MMTS classification efficacy, it has been benchmarked with Support Vector Machines (SVMs), Naive Bayes (NB), Probabilistic Mahalanobis Taguchi Systems (PTM), Synthetic Minority Oversampling Technique (SMOTE), Adaptive Conformal Transformation (ACT), Kernel Boundary Alignment (KBA), Hidden Naive Bayes (HNB), and other improved Naive Bayes algorithms. MMTS outperforms the benchmarked algorithms especially when the imbalance ratio is greater than 400. A real life case study on manufacturing sector is used to demonstrate the applicability of the proposed model and to compare its performance with Mahalanobis Genetic Algorithm (MGA). PMID:28811820
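
    A hedged sketch of the threshold-selection objective described in the two records above (not the full MMTS model): choose the Mahalanobis-distance cutoff whose ROC point lies closest to the theoretical optimum of zero false positive rate and unit true positive rate.

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve

    def mmts_like_threshold(md_scores, labels):
        """md_scores: Mahalanobis distances (higher = more abnormal); labels: 0/1."""
        fpr, tpr, thresholds = roc_curve(labels, md_scores)
        dist_to_optimal = np.sqrt(fpr ** 2 + (1.0 - tpr) ** 2)
        return thresholds[np.argmin(dist_to_optimal)]
    ```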

  18. A robust metric for screening outliers from analogue product manufacturing tests responses

    NARCIS (Netherlands)

    Krishnan, S.; Kerkhoff, H.G.

    2011-01-01

    Mahalanobis distance is one of the commonly used multivariate metrics for finely segregating defective devices from non-defective ones. An associated problem with this approach is the estimation of a robust mean and a covariance matrix. In the absence of such robust estimates, especially in the

  19. A Robust Metric for Screening Outliers from Analogue Product Manufacturing Tests Responses

    NARCIS (Netherlands)

    Krishnan, Shaji; Kerkhoff, Hans G.

    2011-01-01

    Mahalanobis distance is one of the commonly used multivariate metrics for finely segregating defective devices from non-defective ones. An associated problem with this approach is the estimation of a robust mean and a covariance matrix. In the absence of such robust estimates, especially in the

  20. Assessing Gait Impairments Based on Auto-Encoded Patterns of Mahalanobis Distances from Consecutive Steps.

    Science.gov (United States)

    Muñoz-Organero, Mario; Davies, Richard; Mawson, Sue

    2017-01-01

    Insole pressure sensors capture the force distribution patterns during the stance phase while walking. By comparing patterns obtained from healthy individuals to patients suffering different medical conditions based on a given similarity measure, automatic impairment indexes can be computed in order to help in applications such as rehabilitation. This paper uses the data sensed from insole pressure sensors for a group of healthy controls to train an auto-encoder using patterns of stochastic distances in series of consecutive steps while walking at normal speeds. Two experiment groups are compared to the healthy control group: a group of patients suffering knee pain and a group of post-stroke survivors. The Mahalanobis distance is computed for every single step by each participant compared to the entire dataset sensed from healthy controls. The computed distances for consecutive steps are fed into the previously trained autoencoder and the average error is used to assess how close the walking segment is to the autogenerated model from healthy controls. The results show that automatic distortion indexes can be used to assess each participant as compared to normal patterns computed from healthy controls. The stochastic distances observed for the group of stroke survivors are bigger than those for the people with knee pain.

  1. Using Mahalanobis Distance Scores for Matched Pairing of Schools in a Randomized Controlled Trial Study of Leadership and Assistance for Science Education Reform (LASER)

    Science.gov (United States)

    Zoblotsky, Todd; Ransford-Kaldon, Carolyn; Morrison, Donald M.

    2011-01-01

    The present paper describes the recruitment and site selection process that has been underway since January 2011, with particular emphasis on the use of Mahalanobis distance score to determine matched pairs of sites prior to randomization to treatment and control groups. Through a systematic winnowing process, the authors found that they could…

  2. New neural network classifier of fall-risk based on the Mahalanobis distance and kinematic parameters assessed by a wearable device

    International Nuclear Information System (INIS)

    Giansanti, Daniele; Macellari, Velio; Maccioni, Giovanni

    2008-01-01

    Fall prevention lacks easy, quantitative and wearable methods for the classification of fall-risk (FR). Efforts must be thus devoted to the choice of an ad hoc classifier both to reduce the size of the sample used to train the classifier and to improve performances. A new methodology that uses a neural network (NN) and a wearable device are hereby proposed for this purpose. The NN uses kinematic parameters assessed by a wearable device with accelerometers and rate gyroscopes during a posturography protocol. The training of the NN was based on the Mahalanobis distance and was carried out on two groups of 30 elderly subjects with varying fall-risk Tinetti scores. The validation was done on two groups of 100 subjects with different fall-risk Tinetti scores and showed that, both in terms of specificity and sensitivity, the NN performed better than other classifiers (naive Bayes, Bayes net, multilayer perceptron, support vector machines, statistical classifiers). In particular, (i) the proposed NN methodology improved the specificity and sensitivity by a mean of 3% when compared to the statistical classifier based on the Mahalanobis distance (SCMD) described in Giansanti (2006 Physiol. Meas. 27 1081–90); (ii) the assessed specificity was 97%, the assessed sensitivity was 98% and the area under receiver operator characteristics was 0.965. (note)

  3. Precise Positioning Method for Logistics Tracking Systems Using Personal Handy-Phone System Based on Mahalanobis Distance

    Science.gov (United States)

    Yokoi, Naoaki; Kawahara, Yasuhiro; Hosaka, Hiroshi; Sakata, Kenji

    Focusing on the Personal Handy-phone System (PHS) positioning service used in physical distribution logistics, a positioning error offset method for improving positioning accuracy is invented. A disadvantage of PHS positioning is that measurement errors caused by the fluctuation of radio waves due to buildings around the terminal are large, ranging from several tens to several hundreds of meters. In this study, an error offset method is developed, which learns patterns of positioning results (latitude and longitude) containing errors and the highest signal strength at major logistic points in advance, and matches them with new data measured in actual distribution processes according to the Mahalanobis distance. Then the matching resolution is improved to 1/40 that of the conventional error offset method.

  4. Semiparametric Allelic Tests for Mapping Multiple Phenotypes: Binomial Regression and Mahalanobis Distance.

    Science.gov (United States)

    Majumdar, Arunabha; Witte, John S; Ghosh, Saurabh

    2015-12-01

    Binary phenotypes commonly arise due to multiple underlying quantitative precursors and genetic variants may impact multiple traits in a pleiotropic manner. Hence, simultaneously analyzing such correlated traits may be more powerful than analyzing individual traits. Various genotype-level methods, e.g., MultiPhen (O'Reilly et al. []), have been developed to identify genetic factors underlying a multivariate phenotype. For univariate phenotypes, the usefulness and applicability of allele-level tests have been investigated. The test of allele frequency difference among cases and controls is commonly used for mapping case-control association. However, allelic methods for multivariate association mapping have not been studied much. In this article, we explore two allelic tests of multivariate association: one using a Binomial regression model based on inverted regression of genotype on phenotype (Binomial regression-based Association of Multivariate Phenotypes [BAMP]), and the other employing the Mahalanobis distance between two sample means of the multivariate phenotype vector for two alleles at a single-nucleotide polymorphism (Distance-based Association of Multivariate Phenotypes [DAMP]). These methods can incorporate both discrete and continuous phenotypes. Some theoretical properties for BAMP are studied. Using simulations, the power of the methods for detecting multivariate association is compared with the genotype-level test MultiPhen's. The allelic tests yield marginally higher power than MultiPhen for multivariate phenotypes. For one/two binary traits under recessive mode of inheritance, allelic tests are found to be substantially more powerful. All three tests are applied to two different real data sets, and the results offer some support for the simulation study. We propose a hybrid approach for testing multivariate association that implements MultiPhen when Hardy-Weinberg Equilibrium (HWE) is violated and BAMP otherwise, because the allelic approaches assume HWE.

  5. Learning Global-Local Distance Metrics for Signature-Based Biometric Cryptosystems

    Directory of Open Access Journals (Sweden)

    George S. Eskander Ekladious

    2017-11-01

    Full Text Available Biometric traits, such as fingerprints, faces and signatures have been employed in bio-cryptosystems to secure cryptographic keys within digital security schemes. Reliable implementations of these systems employ error correction codes formulated as simple distance thresholds, although they may not effectively model the complex variability of behavioral biometrics like signatures. In this paper, a Global-Local Distance Metric (GLDM) framework is proposed to learn cost-effective distance metrics, which reduce within-class variability and augment between-class variability, so that simple error correction thresholds of bio-cryptosystems provide high classification accuracy. First, a large number of samples from a development dataset are used to train a global distance metric that differentiates within-class from between-class samples of the population. Then, once user-specific samples are available for enrollment, the global metric is tuned to a local user-specific one. Proof-of-concept experiments on two reference offline signature databases confirm the viability of the proposed approach. Distance metrics are produced based on concise signature representations consisting of about 20 features and a single prototype. A signature-based bio-cryptosystem is designed using the produced metrics and has shown average classification error rates of about 7% and 17% for the PUCPR and the GPDS-300 databases, respectively. This level of performance is comparable to that obtained with complex state-of-the-art classifiers.

  6. Metrics for measuring distances in configuration spaces

    International Nuclear Information System (INIS)

    Sadeghi, Ali; Ghasemi, S. Alireza; Schaefer, Bastian; Mohr, Stephan; Goedecker, Stefan; Lill, Markus A.

    2013-01-01

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices

  7. Research on cardiovascular disease prediction based on distance metric learning

    Science.gov (United States)

    Ni, Zhuang; Liu, Kui; Kang, Guixia

    2018-04-01

    Distance metric learning algorithms have been widely applied to medical diagnosis and have exhibited strengths in classification problems. The k-nearest neighbour (KNN) method is efficient but treats each feature equally. Large margin nearest neighbour classification (LMNN) improves the accuracy of KNN by learning a global distance metric, but it does not consider the locality of data distributions. In this paper, we propose a new distance metric algorithm adopting a cosine metric and LMNN, named COS-SUBLMNN, which pays more attention to local features of the data to overcome the shortcomings of LMNN and improve classification accuracy. The proposed methodology is verified on CVD patient vectors derived from real-world medical data. The experimental results show that our method provides higher accuracy than KNN and LMNN, which demonstrates the effectiveness of the risk predictive model of CVDs based on COS-SUBLMNN.
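
    A minimal sketch of the relationship such methods rely on: k-NN under a Mahalanobis-type metric M is equivalent to ordinary Euclidean k-NN after a linear transform of the features. The matrix M here is an illustrative assumption; it is not the COS-SUBLMNN metric learned in the paper.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def knn_with_metric(X_train, y_train, M, k=3):
        """M: positive-definite matrix defining d(x, y)^2 = (x - y)^T M (x - y)."""
        L = np.linalg.cholesky(M)        # M = L L^T, so x -> L^T x whitens the metric
        clf = KNeighborsClassifier(n_neighbors=k).fit(X_train @ L, y_train)
        return clf, L

    # prediction on new patients: clf.predict(X_test @ L)
    ```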

  8. Metric distances derived from cosine similarity and Pearson and Spearman correlations

    OpenAIRE

    van Dongen, Stijn; Enright, Anton J.

    2012-01-01

    We investigate two classes of transformations of cosine similarity and Pearson and Spearman correlations into metric distances, utilising the simple tool of metric-preserving functions. The first class puts anti-correlated objects maximally far apart. Previously known transforms fall within this class. The second class collates correlated and anti-correlated objects. An example of such a transformation that yields a metric distance is the sine function when applied to centered data.
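
    A small illustration consistent with the abstract (assumed formulas, not the paper's full treatment): two ways of turning a Pearson correlation r into a distance, one per class of transformation.

    ```python
    import numpy as np

    def dist_anticorrelated_far(r):
        """Class 1: anti-correlated objects end up maximally far apart; this is the
        Euclidean distance between standardized, unit-norm vectors."""
        return np.sqrt(2.0 * (1.0 - np.asarray(r)))

    def dist_collate_anticorrelated(r):
        """Class 2: correlated and anti-correlated objects are collated; for
        centered data this is the sine of the angle between the two vectors."""
        return np.sqrt(1.0 - np.asarray(r) ** 2)
    ```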

  9. Experimental Study on Damage Detection in Timber Specimens Based on an Electromechanical Impedance Technique and RMSD-Based Mahalanobis Distance

    Directory of Open Access Journals (Sweden)

    Dansheng Wang

    2016-10-01

    Full Text Available In the electromechanical impedance (EMI) method, the PZT patch performs the functions of both sensor and exciter. Due to the high-frequency actuation and non-model-based characteristics, the EMI method can be utilized to detect incipient structural damage. In recent years EMI techniques have been widely applied to monitor the health status of concrete and steel materials; however, studies on its application to timber are limited. This paper explores the feasibility of using the EMI technique for damage detection in timber specimens. In addition, the conventional damage index, namely root mean square deviation (RMSD), is employed to evaluate the level of damage. On that basis, a new damage index, the Mahalanobis distance based on RMSD, is proposed to evaluate the damage severity of timber specimens. Experimental studies are implemented to detect notch and hole damage in the timber specimens. The experimental results verify the effectiveness and robustness of the proposed damage index and its superiority over the RMSD indexes.
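
    A hedged sketch of the conventional RMSD damage index used as the baseline in this study; the proposed index builds a Mahalanobis distance on top of such RMSD values, which is not reproduced here.

    ```python
    import numpy as np

    def rmsd_index(Z_damaged, Z_healthy):
        """Root mean square deviation between the real parts of two EMI signatures
        measured over the same frequency sweep."""
        num = np.sum((Z_damaged.real - Z_healthy.real) ** 2)
        den = np.sum(Z_healthy.real ** 2)
        return np.sqrt(num / den)
    ```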

  10. Reconstruction of hit time and hit position of annihilation quanta in the J-PET detector using the Mahalanobis distance

    Directory of Open Access Journals (Sweden)

    Sharma Neha Gupta

    2015-12-01

    Full Text Available The J-PET detector being developed at the Jagiellonian University is a positron emission tomograph composed of long strips of polymer scintillators. At the same time, it is a detector system that will be used for studies of the decays of positronium atoms. The shape of the photomultiplier signals depends on the hit time and hit position of the gamma quantum. In order to take advantage of this fact, dedicated sampling front-end electronics that enables sampling signals in the voltage domain with a time precision of about 20 ps, and a novel reconstruction method based on the comparison of the examined signal with model signals stored in a library, have been developed. As a measure of the similarity, we use the Mahalanobis distance. The achievable position and time resolution depend on the number and values of the threshold levels at which the signal is sampled. The reconstruction method as well as preliminary results are presented and discussed.

  11. Mahalanobis' Contributions to Sample Surveys

    Indian Academy of Sciences (India)

    Sample Survey started its operations in October 1950 under the ... and adopted random cuts for estimating the acreage under jute ... demographic factors relating to indebtedness, unemployment, ... traffic surveys, demand for currency coins and average life of .... Mahalanobis derived the optimum allocation in stratified.

  12. A study of metrics of distance and correlation between ranked lists for compositionality detection

    DEFF Research Database (Denmark)

    Lioma, Christina; Hansen, Niels Dalum

    2017-01-01

    affects the measurement of semantic similarity. We propose a new compositionality detection method that represents phrases as ranked lists of term weights. Our method approximates the semantic similarity between two ranked list representations using a range of well-known distance and correlation metrics...... of compositionality using any of the distance and correlation metrics considered....

  13. Deep Multimodal Distance Metric Learning Using Click Constraints for Image Ranking.

    Science.gov (United States)

    Yu, Jun; Yang, Xiaokang; Gao, Fei; Tao, Dacheng

    2017-12-01

    How do we retrieve images accurately? Also, how do we rank a group of images precisely and efficiently for specific queries? These problems are critical for researchers and engineers developing a novel image search engine. First, it is important to obtain an appropriate description that effectively represents the images. In this paper, multimodal features are considered for describing images. The images' unique properties are reflected by visual features, which are correlated with each other. However, semantic gaps always exist between images' visual features and semantics. Therefore, we utilize click features to reduce the semantic gap. The second key issue is learning an appropriate distance metric to combine these multimodal features. This paper develops a novel deep multimodal distance metric learning (Deep-MDML) method. A structured ranking model is adopted to utilize both visual and click features in distance metric learning (DML). Specifically, images and their related ranking results are first collected to form the training set. Multimodal features, including click and visual features, are collected with these images. Next, a group of autoencoders is applied to obtain initial distance metrics in the different visual spaces, and an MDML method is used to assign optimal weights for the different modalities. We then conduct alternating optimization to train the ranking model, which is used for ranking new queries with click features. Compared with existing image ranking methods, the proposed method adopts a new ranking model that uses multimodal features, including click features and visual features, in DML. Experiments on two benchmark data sets validate the effectiveness of the proposed Deep-MDML.

  14. Sequence of maximal distance codes in graphs or other metric spaces

    Directory of Open Access Journals (Sweden)

    Charles Delorme

    2013-11-01

    Full Text Available Given a subset C of a metric space E, its successor is the subset s(C) of points at maximum distance from C in E. We study some properties of the sequence obtained by iterating this operation. Graphs with their usual distance already provide typical examples.

  15. Measuring distance “as the horse runs”: Cross-scale comparison of terrain-based metrics

    Science.gov (United States)

    Buttenfield, Barbara P.; Ghandehari, M; Leyk, S; Stanislawski, Larry V.; Brantley, M E; Qiang, Yi

    2016-01-01

    Distance metrics play significant roles in spatial modeling tasks, such as flood inundation (Tucker and Hancock 2010), stream extraction (Stanislawski et al. 2015), power line routing (Kiessling et al. 2003) and analysis of surface pollutants such as nitrogen (Harms et al. 2009). Avalanche risk is based on slope, aspect, and curvature, all directly computed from distance metrics (Gutiérrez 2012). Distance metrics anchor variogram analysis, kernel estimation, and spatial interpolation (Cressie 1993). Several approaches are employed to measure distance. Planar metrics measure straight-line distance between two points (“as the crow flies”) and are simple and intuitive, but suffer from uncertainties. Planar metrics assume that Digital Elevation Model (DEM) pixels are rigid and flat, as tiny facets of ceramic tile approximating a continuous terrain surface. In truth, terrain can bend, twist and undulate within each pixel. Working with Light Detection and Ranging (lidar) data or High Resolution Topography to achieve precise measurements presents challenges, as filtering can eliminate or distort significant features (Passalacqua et al. 2015). The current availability of lidar data is far from comprehensive in developed nations, and non-existent in many rural and undeveloped regions. Notwithstanding computational advances, distance estimation on DEMs has never been systematically assessed, due to assumptions that improvements are so small that surface adjustment is unwarranted. For individual pixels inaccuracies may be small, but additive effects can propagate dramatically, especially in regional models (e.g., disaster evacuation) or global models (e.g., sea level rise) where pixels span dozens to hundreds of kilometers (Usery et al. 2003). Such models are increasingly common, lending compelling reasons to understand shortcomings in the use of planar distance metrics. Researchers have studied curvature-based terrain modeling. Jenny et al. (2011) use curvature to generate

  16. Two fixed point theorems on quasi-metric spaces via mw-distances

    Energy Technology Data Exchange (ETDEWEB)

    Alegre, C.

    2017-07-01

    In this paper we prove a Banach-type fixed point theorem and a Kannan-type theorem in the setting of quasi-metric spaces using the notion of mw-distance. These theorems generalize some results that have recently appeared in the literature. (Author)

  17. A boosting framework for visuality-preserving distance metric learning and its application to medical image retrieval.

    Science.gov (United States)

    Yang, Liu; Jin, Rong; Mummert, Lily; Sukthankar, Rahul; Goode, Adam; Zheng, Bin; Hoi, Steven C H; Satyanarayanan, Mahadev

    2010-01-01

    Similarity measurement is a critical component in content-based image retrieval systems, and learning a good distance metric can significantly improve retrieval performance. However, despite extensive study, there are several major shortcomings with the existing approaches for distance metric learning that can significantly affect their application to medical image retrieval. In particular, "similarity" can mean very different things in image retrieval: resemblance in visual appearance (e.g., two images that look like one another) or similarity in semantic annotation (e.g., two images of tumors that look quite different yet are both malignant). Current approaches for distance metric learning typically address only one goal without consideration of the other. This is problematic for medical image retrieval where the goal is to assist doctors in decision making. In these applications, given a query image, the goal is to retrieve similar images from a reference library whose semantic annotations could provide the medical professional with greater insight into the possible interpretations of the query image. If the system were to retrieve images that did not look like the query, then users would be less likely to trust the system; on the other hand, retrieving images that appear superficially similar to the query but are semantically unrelated is undesirable because that could lead users toward an incorrect diagnosis. Hence, learning a distance metric that preserves both visual resemblance and semantic similarity is important. We emphasize that, although our study is focused on medical image retrieval, the problem addressed in this work is critical to many image retrieval systems. We present a boosting framework for distance metric learning that aims to preserve both visual and semantic similarities. The boosting framework first learns a binary representation using side information, in the form of labeled pairs, and then computes the distance as a weighted Hamming

  18. Mahalanobis Distance

    Indian Academy of Sciences (India)

    defined by Δ² = (μ₁ − μ₂)ᵀ L⁻¹ (μ₁ − μ₂) (1), where the superfix T denotes matrix transpose and L denotes the common (nonsingular) covariance matrix of X in each group G1 and G2. It can be seen ... standard deviation. The quadratic form (1) has the effect of transforming the variables to uncorrelated standardised variables Y and computing the (squared) ...

  19. Mahalanobis Distance

    Indian Academy of Sciences (India)

    McLachlan's research interests have ... Craniometric and anthropological studies are the first field in which the ... applied and have since attracted the attention of many workers ... shall label as G1 and G2. For example, in some community, G1 ...

  20. Using Generalized Entropies and OC-SVM with Mahalanobis Kernel for Detection and Classification of Anomalies in Network Traffic

    Directory of Open Access Journals (Sweden)

    Jayro Santiago-Paz

    2015-09-01

    Full Text Available Network anomaly detection and classification is an important open issue in network security. Several approaches and systems based on different mathematical tools have been studied and developed, among them, the Anomaly-Network Intrusion Detection System (A-NIDS), which monitors network traffic and compares it against an established baseline of a “normal” traffic profile. Then, it is necessary to characterize the “normal” Internet traffic. This paper presents an approach for anomaly detection and classification based on Shannon, Rényi and Tsallis entropies of selected features, and the construction of regions from entropy data employing the Mahalanobis distance (MD), and a One Class Support Vector Machine (OC-SVM) with different kernels (Radial Basis Function (RBF) and Mahalanobis Kernel (MK)) for “normal” and abnormal traffic. Regular and non-regular regions built from “normal” traffic profiles allow anomaly detection, while the classification is performed under the assumption that regions corresponding to the attack classes have been previously characterized. Although this approach allows the use of as many features as required, only four well-known significant features were selected in our case. In order to evaluate our approach, two different data sets were used: one set of real traffic obtained from an Academic Local Area Network (LAN), and the other a subset of the 1998 MIT-DARPA set. For these data sets, a True positive rate up to 99.35%, a True negative rate up to 99.83% and a False negative rate at about 0.16% were yielded. Experimental results show that certain q-values of the generalized entropies and the use of OC-SVM with RBF kernel improve the detection rate in the detection stage, while the novel inclusion of MK kernel in OC-SVM and k-temporal nearest neighbors improve accuracy in classification. In addition, the results show that using the Box-Cox transformation, the Mahalanobis distance yielded high detection rates with
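
    A minimal sketch of a One-Class SVM with a Mahalanobis kernel through scikit-learn's precomputed-kernel interface; the kernel form and the nu value are assumptions, not the paper's exact configuration.

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    def mahalanobis_kernel(A, B, VI):
        """K(x, y) = exp(-(x - y)^T VI (x - y))."""
        K = np.empty((A.shape[0], B.shape[0]))
        for i, a in enumerate(A):
            diff = B - a
            K[i] = np.exp(-np.sum(diff @ VI * diff, axis=1))
        return K

    def fit_normal_traffic_model(X_normal, nu=0.05):
        """X_normal: entropy features computed from 'normal' traffic windows."""
        VI = np.linalg.inv(np.cov(X_normal, rowvar=False))
        K = mahalanobis_kernel(X_normal, X_normal, VI)
        model = OneClassSVM(kernel='precomputed', nu=nu).fit(K)
        return model, VI

    # scoring new traffic: model.predict(mahalanobis_kernel(X_new, X_normal, VI))
    ```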

  1. An uncertainty importance measure using a distance metric for the change in a cumulative distribution function

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Han, Seok-Jung; Tak, Nam-IL

    2000-01-01

    A simple measure of uncertainty importance using the entire change of cumulative distribution functions (CDFs) has been developed for use in probabilistic safety assessments (PSAs). The entire change of CDFs is quantified in terms of the metric distance between two CDFs. The metric distance measure developed in this study reflects the relative impact of distributional changes of inputs on the change of an output distribution, while most of the existing uncertainty importance measures reflect the magnitude of relative contribution of input uncertainties to the output uncertainty. The present measure has been evaluated analytically for various analytical distributions to examine its characteristics. To illustrate the applicability and strength of the present measure, two examples are provided. The first example is an application of the present measure to a typical problem of a system fault tree analysis and the second one is for a hypothetical non-linear model. Comparisons of the present result with those obtained by existing uncertainty importance measures show that the metric distance measure is a useful tool to express the measure of uncertainty importance in terms of the relative impact of distributional changes of inputs on the change of an output distribution.
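
    A minimal illustration, not the paper's exact measure: one simple way to quantify the change between the base-case output CDF and the output CDF obtained after changing one input's distribution is the area between the two empirical CDFs, i.e. the 1-D Wasserstein distance.

    ```python
    import numpy as np
    from scipy.stats import wasserstein_distance

    def cdf_shift(output_base, output_perturbed):
        """Both arguments are Monte Carlo samples of the model output."""
        return wasserstein_distance(output_base, output_perturbed)

    # inputs can then be ranked by the CDF shift each one produces when perturbed
    ```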

  2. Content-based retrieval of brain tumor in contrast-enhanced MRI images using tumor margin information and learned distance metric.

    Science.gov (United States)

    Yang, Wei; Feng, Qianjin; Yu, Mei; Lu, Zhentai; Gao, Yang; Xu, Yikai; Chen, Wufan

    2012-11-01

    A content-based image retrieval (CBIR) method for T1-weighted contrast-enhanced MRI (CE-MRI) images of brain tumors is presented for diagnosis aid. The method is thoroughly evaluated on a large image dataset. Using the tumor region as a query, the authors' CBIR system attempts to retrieve tumors of the same pathological category. Aside from commonly used features such as intensity, texture, and shape features, the authors use a margin information descriptor (MID), which is capable of describing the characteristics of tissue surrounding a tumor, for representing image contents. In addition, the authors designed a distance metric learning algorithm called Maximum mean average Precision Projection (MPP) to maximize the smooth approximated mean average precision (mAP) to optimize retrieval performance. The effectiveness of MID and MPP algorithms was evaluated using a brain CE-MRI dataset consisting of 3108 2D scans acquired from 235 patients with three categories of brain tumors (meningioma, glioma, and pituitary tumor). By combining MID and other features, the mAP of retrieval increased by more than 6% with the learned distance metrics. The distance metric learned by MPP significantly outperformed the other two existing distance metric learning methods in terms of mAP. The CBIR system using the proposed strategies achieved a mAP of 87.3% and a precision of 89.3% when top 10 images were returned by the system. Compared with scale-invariant feature transform, the MID, which uses the intensity profile as descriptor, achieves better retrieval performance. Incorporating tumor margin information represented by MID with the distance metric learned by the MPP algorithm can substantially improve the retrieval performance for brain tumors in CE-MRI.

  3. Evaluating Outlier Identification Tests: Mahalanobis "D" Squared and Comrey "Dk."

    Science.gov (United States)

    Rasmussen, Jeffrey Lee

    1988-01-01

    A Monte Carlo simulation was used to compare the Mahalanobis "D" Squared and the Comrey "Dk" methods of detecting outliers in data sets. Under the conditions investigated, the "D" Squared technique was preferable as an outlier removal statistic. (SLD)

  4. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2016-01-01

    This 4th edition of the leading reference volume on distance metrics is characterized by updated and rewritten sections on some items suggested by experts and readers, as well as a general streamlining of content and the addition of essential new topics. Though the structure remains unchanged, the new edition also explores recent advances in the use of distances and metrics for e.g. generalized distances, probability theory, graph theory, coding theory, data analysis. New topics in the purely mathematical sections include e.g. the Vitanyi multiset-metric, algebraic point-conic distance, triangular ratio metric, Rossi-Hamming metric, Taneja distance, spectral semimetric between graphs, channel metrization, and Maryland bridge distance. The multidisciplinary sections have also been supplemented with new topics, including: dynamic time warping distance, memory distance, allometry, atmospheric depth, elliptic orbit distance, VLBI distance measurements, the astronomical system of units, and walkability distance. Lea...

  5. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    Science.gov (United States)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes and the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pair-wise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances with each other. Analysis and experiments show that our approaches only need to compute O(logn) objects in order to find the closest object, where n is the total number of objects in the database.

  6. A robust new metric of phenotypic distance to estimate and compare multiple trait differences among populations

    Directory of Open Access Journals (Sweden)

    Rebecca SAFRAN, Samuel FLAXMAN, Michael KOPP, Darren E. IRWIN, Derek BRIGGS, Matthew R. EVANS, W. Chris FUNK, David A. GRAY, Eileen A. HEBE

    2012-06-01

    Full Text Available Whereas a rich literature exists for estimating population genetic divergence, metrics of phenotypic trait divergence are lacking, particularly for comparing multiple traits among three or more populations. Here, we review and analyze via simulation Hedges' g, a widely used parametric estimate of effect size. Our analyses indicate that g is sensitive to a combination of unequal trait variances and unequal sample sizes among populations and to changes in the scale of measurement. We then derive and explain a new, non-parametric distance measure, "Δp", which is calculated from a joint cumulative distribution function (CDF) of all populations under study. More precisely, distances are measured in terms of the percentiles in this CDF at which each population's median lies. Δp combines many desirable features of other distance metrics into a single metric; namely, compared to other metrics, Δp is relatively insensitive to unequal variances and sample sizes among the populations sampled. Furthermore, a key feature of Δp—and our main motivation for developing it—is that it easily accommodates simultaneous comparisons of any number of traits across any number of populations. To exemplify its utility, we employ Δp to address a question related to the role of sexual selection in speciation: are sexual signals more divergent than ecological traits in closely related taxa? Using traits of known function in closely related populations, we show that traits predictive of reproductive performance are, indeed, more divergent and more sexually dimorphic than traits related to ecological adaptation [Current Zoology 58 (3): 423-436, 2012].
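
    The Δp idea described above can be sketched for a single trait and two populations: pool the observations, read off the percentile of the joint empirical CDF at which each population's median lies, and take the absolute difference of those percentiles. The multi-trait, multi-population machinery of the paper is not reproduced; the synthetic samples below are assumptions for illustration.

```python
import numpy as np

def delta_p(pop_a, pop_b):
    """Single-trait sketch of the percentile-based distance described above:
    pool the samples, locate each population's median as a percentile of the
    joint empirical CDF, and return the absolute difference of percentiles."""
    pooled = np.sort(np.concatenate([pop_a, pop_b]))

    def percentile_of(value):
        # fraction of pooled observations at or below `value`
        return np.searchsorted(pooled, value, side='right') / len(pooled)

    return abs(percentile_of(np.median(pop_a)) - percentile_of(np.median(pop_b)))

# Two populations with unequal variances and sample sizes.
a = np.random.normal(0.0, 1.0, 300)
b = np.random.normal(0.8, 2.5, 120)
print(delta_p(a, b))
```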

  7. Series distance – an intuitive metric to quantify hydrograph similarity in terms of occurrence, amplitude and timing of hydrological events

    Directory of Open Access Journals (Sweden)

    U. Ehret

    2011-03-01

    Full Text Available Applying metrics to quantify the similarity or dissimilarity of hydrographs is a central task in hydrological modelling, used both in model calibration and in the evaluation of simulations or forecasts. Motivated by the shortcomings of standard objective metrics such as the Root Mean Square Error (RMSE) or the Mean Absolute Peak Time Error (MAPTE), and by the advantages of visual inspection as a powerful tool for simultaneous, case-specific and multi-criteria (yet subjective) evaluation, we propose a new objective metric termed Series Distance, which is in close accordance with visual evaluation. The Series Distance quantifies the similarity of two hydrographs neither in a time-aggregated nor in a point-by-point manner, but on the scale of hydrological events. It consists of three parts, namely a Threat Score, which evaluates the overall agreement of event occurrence, and the overall distances of matching observed and simulated events with respect to amplitude and timing. The novelty of the latter two is the way in which matching point pairs on the observed and simulated hydrographs are identified: not by equality in time (as is the case with the RMSE), but by the same relative position in matching segments (rise or recession) of the event, indicating the same underlying hydrological process. Thus, amplitude and timing errors are calculated simultaneously but separately, from point pairs that also match visually, considering complete events rather than only individual points (as is the case with MAPTE). Relative weights can freely be assigned to each component of the Series Distance, which allows (subjective) customization of the metric to various fields of application, but in a traceable way. Each of the three components of the Series Distance can be used in an aggregated or non-aggregated way, which makes the Series Distance a suitable tool for differentiated, process-based model diagnostics.

    After discussing the applicability of established time series

  8. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2014-01-01

    This updated and revised third edition of the leading reference volume on distance metrics includes new items from very active research areas in the use of distances and metrics such as geometry, graph theory, probability theory and analysis. Among the new topics included are, for example, polyhedral metric space, nearness matrix problems, distances between belief assignments, distance-related animal settings, diamond-cutting distances, natural units of length, Heidegger’s de-severance distance, and brain distances. The publication of this volume coincides with intensifying research efforts into metric spaces and especially distance design for applications. Accurate metrics have become a crucial goal in computational biology, image analysis, speech recognition and information retrieval. Leaving aside the practical questions that arise during the selection of a ‘good’ distance function, this work focuses on providing the research community with an invaluable comprehensive listing of the main available di...

  9. Characterization of Diffusion Metric Map Similarity in Data From a Clinical Data Repository Using Histogram Distances

    Science.gov (United States)

    Warner, Graham C.; Helmer, Karl G.

    2018-01-01

    As the sharing of data is mandated by funding agencies and journals, reuse of data has become more prevalent. It becomes imperative, therefore, to develop methods to characterize the similarity of data. While users can group data based on the acquisition parameters stored in the file headers, these give no indication of whether a file can be combined with other data without increasing the variance in the data set. Methods have been implemented that characterize the signal-to-noise ratio or identify signal drop-outs in the raw image files, but potential users of data often have access only to calculated metric maps, and these are more difficult to characterize and compare. Here we describe a histogram-distance-based method applied to diffusion metric maps of fractional anisotropy and mean diffusivity that were generated using data extracted from a repository of clinically-acquired MRI data. We describe the generation of the data set, the pitfalls specific to diffusion MRI data, and the results of the histogram distance analysis. We find that, in general, data from GE scanners are less similar than are data from Siemens scanners. We also find that the distribution of distance metric values is not Gaussian at any selection of the acquisition parameters considered here (field strength, number of gradient directions, b-value, and vendor). PMID:29568257
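
    The specific histogram distance used in the study is not named in this record; the sketch below, which is only illustrative, compares two metric maps with the Bhattacharyya distance between their normalized histograms. The FA value range, bin count, and synthetic maps are assumptions.

```python
import numpy as np

def histogram_distance(map_a, map_b, bins=128, value_range=(0.0, 1.0)):
    """Bhattacharyya distance between normalized histograms of two metric maps
    (e.g., FA, which lies in [0, 1]); the distance choice and bin settings are
    illustrative assumptions, not the study's protocol."""
    h_a, _ = np.histogram(map_a.ravel(), bins=bins, range=value_range)
    h_b, _ = np.histogram(map_b.ravel(), bins=bins, range=value_range)
    p = h_a / h_a.sum()
    q = h_b / h_b.sum()
    bc = np.sum(np.sqrt(p * q))          # Bhattacharyya coefficient
    return float(-np.log(max(bc, 1e-12)))

# Synthetic stand-ins for FA maps from two scans.
fa_scan1 = np.clip(np.random.normal(0.45, 0.15, 10000), 0.0, 1.0)
fa_scan2 = np.clip(np.random.normal(0.50, 0.18, 10000), 0.0, 1.0)
print(histogram_distance(fa_scan1, fa_scan2))
```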

  10. Novel Mahalanobis-based feature selection improves one-class classification of early hepatocellular carcinoma.

    Science.gov (United States)

    Thomaz, Ricardo de Lima; Carneiro, Pedro Cunha; Bonin, João Eliton; Macedo, Túlio Augusto Alves; Patrocinio, Ana Claudia; Soares, Alcimar Barbosa

    2018-05-01

    Detection of early hepatocellular carcinoma (HCC) is responsible for increasing survival rates by up to 40%. One-class classifiers can be used for modeling early HCC in multidetector computed tomography (MDCT), but they demand specific knowledge of the set of features that best describes the target class. Although the literature outlines several features for characterizing liver lesions, it is unclear which are most relevant for describing early HCC. In this paper, we introduce an unconstrained genetic algorithm (GA) feature selection method based on a multi-objective Mahalanobis fitness function to improve the classification performance for early HCC. We compared our approach to a constrained Mahalanobis function and two other unconstrained functions using Welch's t-test and Gaussian Data Descriptors. The performance of each fitness function was evaluated by cross-validating a one-class SVM. The results show that the proposed multi-objective Mahalanobis fitness function is capable of significantly reducing data dimensionality (96.4%) and improving one-class classification of early HCC (0.84 AUC). Furthermore, the results provide strong evidence that intensity features extracted at the arterial-to-portal and arterial-to-equilibrium phases are important for classifying early HCC.

  11. Fixed point results for contractions involving generalized altering distances in ordered metric spaces

    Directory of Open Access Journals (Sweden)

    Samet Bessem

    2011-01-01

    Full Text Available Abstract In this article, we establish coincidence point and common fixed point theorems for mappings satisfying a contractive inequality which involves two generalized altering distance functions in ordered complete metric spaces. As application, we study the existence of a common solution to a system of integral equations. 2000 Mathematics subject classification. Primary 47H10, Secondary 54H25

  12. Development Planning & Policies under Mahalanobis Strategy: A Tale of India’s Dilemma

    Directory of Open Access Journals (Sweden)

    Dr. Asim K. Karmakar

    2013-07-01

    Against the above backdrop, the present paper gives a short review of the Mahalanobis strategy of development planning in the context of India's dilemma at the time: dynamic industrialization and static agriculture.

  13. Differentiation and detection of microorganisms using Fourier transform infrared photoacoustic spectroscopy

    Science.gov (United States)

    Irudayaraj, Joseph; Yang, Hong; Sakhamuri, Sivakesava

    2002-03-01

    Fourier transform infrared photoacoustic spectroscopy (FTIR-PAS) was used to differentiate and identify microorganisms on a food (apple) surface. Microorganisms considered include bacteria (Lactobacillus casei, Bacillus cereus, and Escherichia coli), yeast (Saccharomyces cerevisiae), and fungi (Aspergillus niger and Fusarium verticilliodes). Discriminant analysis was used to differentiate apples contaminated with the different microorganisms from uncontaminated apples. Mahalanobis distances were calculated to quantify the differences: the higher the value of the Mahalanobis distance metric between different microorganisms, the greater their difference. Additionally, pathogenic (O157:H7) E. coli was successfully differentiated from non-pathogenic strains. The results demonstrate that FTIR-PAS spectroscopy has the potential to become a non-destructive analysis tool in food safety related research.
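
    The kind of group separation quantified above can be illustrated with a pooled-covariance Mahalanobis distance between two groups of spectra. This is a generic discriminant-analysis sketch, not the authors' exact procedure, and the spectra below are synthetic placeholders.

```python
import numpy as np

def group_mahalanobis(spectra_a, spectra_b):
    """Mahalanobis distance between the mean spectra of two groups, using a
    pooled within-group covariance (a common discriminant-analysis convention;
    the original study's exact pooling scheme is not given in this record)."""
    mu_a, mu_b = spectra_a.mean(axis=0), spectra_b.mean(axis=0)
    cov_a = np.cov(spectra_a, rowvar=False)
    cov_b = np.cov(spectra_b, rowvar=False)
    n_a, n_b = len(spectra_a), len(spectra_b)
    pooled = ((n_a - 1) * cov_a + (n_b - 1) * cov_b) / (n_a + n_b - 2)
    diff = mu_a - mu_b
    return float(np.sqrt(diff @ np.linalg.pinv(pooled) @ diff))

# Hypothetical example: 30 spectra per group over 10 spectral bands.
group_1 = np.random.randn(30, 10) + 0.5
group_2 = np.random.randn(30, 10)
print(group_mahalanobis(group_1, group_2))
```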

  14. Early Identification of Ineffective Cooperative Learning Teams

    Science.gov (United States)

    Hsiung, C .M.; Luo, L. F.; Chung, H. C.

    2014-01-01

    Cooperative learning has many pedagogical benefits. However, if the cooperative learning teams become ineffective, these benefits are lost. Accordingly, this study developed a computer-aided assessment method for identifying ineffective teams at their early stage of dysfunction by using the Mahalanobis distance metric to examine the difference…

  15. WE-E-213CD-11: A New Automatically Generated Metric for Evaluating the Spatial Precision of Deformable Image Registrations: The Distance Discordance Metric.

    Science.gov (United States)

    Saleh, Z; Apte, A; Sharp, G; Deasy, J

    2012-06-01

    We propose a new metric called Distance Discordance (DD), defined as the distance between two anatomic points that are co-located on some reference image but come from two different moving images, when both are deformed onto another reference image. To demonstrate the concept of DD, we created a reference software phantom containing two objects. The first object (1) consists of a hollow box with a fixed-size core and variable wall thickness. The second object (2) consists of a solid box of fixed size and arbitrary location. Seven different variations of the phantom were created. Each phantom was deformed onto every other phantom using two B-spline DIR algorithms available in Elastix and Plastimatch. Voxels sampled from the reference phantom [1] were deformed via the moving phantoms [2…6], and the differences in their corresponding locations on phantom [7] were computed. Each voxel thus yields a distribution of DD values, which we call the distance discordance histogram (DDH). We also demonstrate this concept in 8 Head & Neck patients. The two image registration algorithms produced two different DD results for the same phantom image set. The mean values of the DDH were slightly lower for Elastix (0-1.28 cm) compared with the values produced by Plastimatch (0-1.43 cm). The combined DDH for the H&N patients followed a lognormal distribution with a mean of 0.45 cm and a standard deviation of 0.42 cm. The proposed distance discordance (DD) metric is an easily interpretable, quantitative tool that can be used to evaluate the effect of inter-patient variability on the goodness of the registration in different parts of the patient anatomy. Therefore, it can be utilized to exclude certain images based on their DDH characteristics. In addition, this metric does not rely on 'ground truth' or the presence of contoured structures. Partially supported by NIH grant R01 CA85181. © 2012 American Association of Physicists in Medicine.

  16. Fault diagnosis of rolling bearings based on multifractal detrended fluctuation analysis and Mahalanobis distance criterion

    Science.gov (United States)

    Lin, Jinshan; Chen, Qian

    2013-07-01

    Vibration data of faulty rolling bearings are usually nonstationary and nonlinear, and contain fairly weak fault features. As a result, feature extraction of rolling bearing fault data is always an intractable problem and has attracted considerable attention for a long time. This paper introduces multifractal detrended fluctuation analysis (MF-DFA) to analyze bearing vibration data and proposes a novel method for fault diagnosis of rolling bearings based on MF-DFA and Mahalanobis distance criterion (MDC). MF-DFA, an extension of monofractal DFA, is a powerful tool for uncovering the nonlinear dynamical characteristics buried in nonstationary time series and can capture minor changes of complex system conditions. To begin with, by MF-DFA, multifractality of bearing fault data was quantified with the generalized Hurst exponent, the scaling exponent and the multifractal spectrum. Consequently, controlled by essentially different dynamical mechanisms, the multifractality of four heterogeneous bearing fault data is significantly different; by contrast, controlled by slightly different dynamical mechanisms, the multifractality of homogeneous bearing fault data with different fault diameters is significantly or slightly different depending on different types of bearing faults. Therefore, the multifractal spectrum, as a set of parameters describing multifractality of time series, can be employed to characterize different types and severity of bearing faults. Subsequently, five characteristic parameters sensitive to changes of bearing fault conditions were extracted from the multifractal spectrum and utilized to construct fault features of bearing fault data. Moreover, Hilbert transform based envelope analysis, empirical mode decomposition (EMD) and wavelet transform (WT) were utilized to study the same bearing fault data. Also, the kurtosis and the peak levels of the EMD or the WT component corresponding to the bearing tones in the frequency domain were carefully checked

  17. Correlation of spatial climate/weather maps and the advantages of using the Mahalanobis metric in predictions

    Science.gov (United States)

    Stephenson, D. B.

    1997-10-01

    The skill in predicting spatially varying weather/climate maps depends on the definition of the measure of similarity between the maps. Under the justifiable approximation that the anomaly maps are distributed multinormally, it is shown analytically that the choice of weighting metric, used in defining the anomaly correlation between spatial maps, can change the resulting probability distribution of the correlation coefficient. The estimate of the number of degrees of freedom based on the variance of the correlation distribution can vary from unity up to the number of grid points depending on the choice of weighting metric. The (pseudo-)inverse of the sample covariance matrix acts as a special choice for the metric in that it gives a correlation distribution which has minimal kurtosis and maximum dimension. Minimal kurtosis suggests that the average predictive skill might be improved due to the rarer occurrence of troublesome outlier patterns far from the mean state. Maximum dimension has a disadvantage for analogue prediction schemes in that it gives the minimum number of analogue states. This metric also has an advantage in that it allows one to powerfully test the null hypothesis of multinormality by examining the second and third moments of the correlation coefficient, which were introduced by Mardia as invariant measures of multivariate kurtosis and skewness. For these reasons, it is suggested that this metric could be usefully employed in the prediction of weather/climate and in fingerprinting anthropogenic climate change. The ideas are illustrated using the bivariate example of the observed monthly mean sea-level pressures at Darwin and Tahiti from 1866 to 1995.
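
    The key point, that the anomaly correlation depends on the chosen weighting metric and that the (pseudo-)inverse sample covariance is a special choice, can be sketched as follows. The toy two-station example is an assumption loosely echoing the Darwin/Tahiti case, not the paper's data.

```python
import numpy as np

def weighted_anomaly_correlation(x, y, W):
    """Pattern correlation of two anomaly maps under a weighting metric W;
    W = identity gives the usual anomaly correlation, W = pseudo-inverse of
    the sample covariance gives the Mahalanobis-weighted version."""
    return float(x @ W @ y / np.sqrt((x @ W @ x) * (y @ W @ y)))

# Toy bivariate example (two stations) with a synthetic anomaly history.
rng = np.random.default_rng(0)
history = rng.multivariate_normal([0, 0], [[1.0, -0.7], [-0.7, 1.0]], size=500)
W_mahal = np.linalg.pinv(np.cov(history, rowvar=False))

x, y = history[0], history[1]
print(weighted_anomaly_correlation(x, y, np.eye(2)))   # Euclidean weighting
print(weighted_anomaly_correlation(x, y, W_mahal))     # Mahalanobis weighting
```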

  18. The External Performance Appraisal of China Energy Regulation: An Empirical Study Using a TOPSIS Method Based on Entropy Weight and Mahalanobis Distance.

    Science.gov (United States)

    Wang, Zheng-Xin; Li, Dan-Dan; Zheng, Hong-Hao

    2018-01-30

    In China's industrialization process, the effective regulation of energy and environment can promote the positive externality of energy consumption while reducing the negative externality, which is an important means for realizing the sustainable development of an economic society. The study puts forward an improved technique for order preference by similarity to an ideal solution based on entropy weight and Mahalanobis distance (briefly referred to as E-M-TOPSIS). The performance of the approach was verified to be satisfactory. Using the traditional and improved TOPSIS methods separately, the study carried out empirical appraisals of the external performance of China's energy regulation during 1999-2015. The results show that the correlation between the performance indexes causes a significant difference between the appraisal results of E-M-TOPSIS and traditional TOPSIS. The E-M-TOPSIS takes the correlation between indexes into account and generally softens the closeness degree compared with traditional TOPSIS. Moreover, it makes the relative closeness degree fluctuate within a small amplitude. The results conform to the practical condition of China's energy regulation, and the E-M-TOPSIS is therefore favorably applicable for the external performance appraisal of energy regulation. Additionally, the external economic performance and the social responsibility performance (including environmental and energy safety performances) based on the E-M-TOPSIS exhibit significantly different fluctuation trends. The external economic performance fluctuates dramatically with a larger amplitude, while the social responsibility performance exhibits a relatively stable interval fluctuation. This indicates that, compared to the social responsibility performance, the fluctuation of the external economic performance is more sensitive to energy regulation.
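
    A rough sketch of the E-M-TOPSIS idea, entropy weights combined with Mahalanobis distances to the ideal and anti-ideal alternatives, is given below. How the weights are folded into the metric, the benefit-criteria assumption, and the random decision matrix are all illustrative assumptions; the paper's exact formulation may differ.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for a decision matrix X (alternatives x criteria)."""
    P = X / X.sum(axis=0)
    eps = 1e-12
    e = -(P * np.log(P + eps)).sum(axis=0) / np.log(len(X))
    d = 1.0 - e
    return d / d.sum()

def em_topsis_closeness(X):
    """Closeness coefficients of a Mahalanobis-distance TOPSIS variant:
    distances to the ideal and anti-ideal alternatives are measured with an
    entropy-weighted inverse covariance of the criteria (sketch only)."""
    w = entropy_weights(X)
    S_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    M = np.diag(np.sqrt(w)) @ S_inv @ np.diag(np.sqrt(w))
    ideal, anti = X.max(axis=0), X.min(axis=0)   # assumes all criteria are benefits

    def dist(a, b):
        d = a - b
        return np.sqrt(d @ M @ d)

    d_plus = np.array([dist(row, ideal) for row in X])
    d_minus = np.array([dist(row, anti) for row in X])
    return d_minus / (d_plus + d_minus)

X = np.random.rand(17, 5) + 0.1   # e.g., 17 years x 5 performance indexes
print(em_topsis_closeness(X))
```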

  19. Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance

    Science.gov (United States)

    Ruan, Yue; Xue, Xiling; Liu, Heng; Tan, Jianing; Li, Xi

    2017-11-01

    K-nearest neighbors (KNN) is a common classification algorithm and also a sub-routine in various more complicated machine learning tasks. In this paper, we present a quantum algorithm (QKNN) for implementing this algorithm based on the metric of Hamming distance. We put forward a quantum circuit for computing the Hamming distance between the test sample and each feature vector in the training set. Taking advantage of this method, we realize a good analogue of the classical KNN algorithm by setting a distance threshold value t to select the k nearest neighbors. As a result, QKNN achieves O(n^3) performance, which depends only on the dimension of the feature vectors, and high classification accuracy, outperforming Lloyd's algorithm (Lloyd et al. 2013) and Wiebe's algorithm (Wiebe et al. 2014).
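
    The quantum circuit is not reproduced here, but the classical scheme that QKNN mimics, Hamming-distance KNN with an optional distance threshold t, can be sketched as follows; the binary training data are synthetic placeholders.

```python
import numpy as np
from collections import Counter

def hamming(a, b):
    """Hamming distance between two equal-length binary vectors."""
    return int(np.sum(a != b))

def knn_hamming(train_X, train_y, query, k=3, t=None):
    """Classical analogue of the scheme described above: compute Hamming
    distances to every training vector and, optionally, keep only neighbours
    within threshold t before majority voting."""
    dists = np.array([hamming(x, query) for x in train_X])
    keep = np.arange(len(train_X)) if t is None else np.where(dists <= t)[0]
    if keep.size == 0:                       # fall back if nothing is within t
        keep = np.arange(len(train_X))
    nearest = keep[np.argsort(dists[keep])[:k]]
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]

train_X = np.random.randint(0, 2, size=(50, 16))
train_y = np.random.randint(0, 2, size=50)
query = np.random.randint(0, 2, size=16)
print(knn_hamming(train_X, train_y, query, k=5, t=8))
```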

  20. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied, and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about the metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along foliation with at least one compact leaf on the two dimensions.

  1. Semantic metrics

    OpenAIRE

    Hu, Bo; Kalfoglou, Yannis; Dupplaw, David; Alani, Harith; Lewis, Paul; Shadbolt, Nigel

    2006-01-01

    In the context of the Semantic Web, many ontology-related operations, e.g. ontology ranking, segmentation, alignment, articulation, reuse, evaluation, can be boiled down to one fundamental operation: computing the similarity and/or dissimilarity among ontological entities, and in some cases among ontologies themselves. In this paper, we review standard metrics for computing distance measures and we propose a series of semantic metrics. We give a formal account of semantic metrics drawn from a...

  2. Multi-site Study of Diffusion Metric Variability: Characterizing the Effects of Site, Vendor, Field Strength, and Echo Time using the Histogram Distance

    Science.gov (United States)

    Helmer, K. G.; Chou, M-C.; Preciado, R. I.; Gimi, B.; Rollins, N. K.; Song, A.; Turner, J.; Mori, S.

    2016-01-01

    MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, whether an externally-generated list of gradient directions can be used, and restrictions on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD), and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the mean of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole brain histograms of FA and MD to investigate within- and between-site effects. We conclude that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials, and that the histogram distance is a sensitive metric for each of these variables. PMID:27350723

  3. Multi-site Study of Diffusion Metric Variability: Characterizing the Effects of Site, Vendor, Field Strength, and Echo Time using the Histogram Distance.

    Science.gov (United States)

    Helmer, K G; Chou, M-C; Preciado, R I; Gimi, B; Rollins, N K; Song, A; Turner, J; Mori, S

    2016-02-27

    MRI-based multi-site trials now routinely include some form of diffusion-weighted imaging (DWI) in their protocol. These studies can include data originating from scanners built by different vendors, each with their own set of unique protocol restrictions, including restrictions on the number of available gradient directions, whether an externally-generated list of gradient directions can be used, and restrictions on the echo time (TE). One challenge of multi-site studies is to create a common imaging protocol that will result in a reliable and accurate set of diffusion metrics. The present study describes the effect of site, scanner vendor, field strength, and TE on two common metrics: the first moment of the diffusion tensor field (mean diffusivity, MD), and the fractional anisotropy (FA). We have shown in earlier work that ROI metrics and the mean of MD and FA histograms are not sufficiently sensitive for use in site characterization. Here we use the distance between whole brain histograms of FA and MD to investigate within- and between-site effects. We conclude that the variability of DTI metrics due to site, vendor, field strength, and echo time could influence the results in multi-center trials, and that the histogram distance is a sensitive metric for each of these variables.

  4. Drift correction for single-molecule imaging by molecular constraint field, a distance minimum metric

    International Nuclear Information System (INIS)

    Han, Renmin; Wang, Liansan; Xu, Fan; Zhang, Yongdeng; Zhang, Mingshu; Liu, Zhiyong; Ren, Fei; Zhang, Fa

    2015-01-01

    The recent developments of far-field optical microscopy (single-molecule imaging techniques) have overcome the diffraction barrier of light and improved image resolution by a factor of ten compared with conventional light microscopy. These techniques utilize the stochastic switching of probe molecules to overcome the diffraction limit and determine the precise localizations of molecules, which often requires a long image acquisition time. However, long acquisition times increase the risk of sample drift, and in high-resolution microscopy sample drift decreases the image resolution. In this paper, we propose a novel metric based on the distance between molecules to solve the drift correction problem. The proposed metric directly uses the position information of molecules to estimate the frame drift. We also designed an algorithm to implement the metric for the general application of drift correction. There are two advantages of our method: first, because it does not require spatial binning of the molecule positions but operates directly on the positions, it is more natural for single-molecule imaging techniques; second, it can estimate drift with a small number of positions in each temporal bin, which may extend its potential applications. The effectiveness of our method has been demonstrated with both simulated data and experiments on single-molecule images.

  5. Kerr metric in cosmological background

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, P C [Gujarat Univ., Ahmedabad (India). Dept. of Mathematics

    1977-06-01

    A metric satisfying Einstein's equation is given which in the vicinity of the source reduces to the well-known Kerr metric and which at large distances reduces to the Robertson-Walker metric of a homogeneous cosmological model. The radius of the event horizon of the Kerr black hole in the cosmological background is found.

  6. Correlation of spatial climate/weather maps and the advantages of using the Mahalanobis metric in predictions

    OpenAIRE

    Stephenson, D. B.

    2011-01-01

    The skill in predicting spatially varying weather/climate maps depends on the definition of the measure of similarity between the maps. Under the justifiable approximation that the anomaly maps are distributed multinormally, it is shown analytically that the choice of weighting metric, used in defining the anomaly correlation between spatial maps, can change the resulting probability distribution of the correlation coefficient. The estimate of the number of degrees of freedom based on the var...

  7. Product Differentiation and Brand Competition in the Italian Breakfast Cereal Market: a Distance Metric Approach

    Directory of Open Access Journals (Sweden)

    Paolo Sckokai

    2013-03-01

    Full Text Available This article employs a nation-wide sample of supermarket scanner data to study product and brand competition in the Italian breakfast cereal market. A modified Almost Ideal Demand System (AIDS), which includes Distance Metrics (DMs) as proposed by Pinkse, Slade and Brett (2002), is estimated to study demand responses, substitution patterns, and own-price and cross-price elasticities. Estimation results provide evidence of some degree of brand loyalty, while consumers do not seem loyal to the product type. Elasticity estimates point to the presence of patterns of substitution within products sharing the same brand and similar nutritional characteristics.

  8. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  9. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learnin

  10. Heterosis and genetic distance in rapeseed (Brasica napus L.). Use of different indicators of genetic divergence in 7x7 diallel

    OpenAIRE

    Lefort-Buson, Marianne; Guillot-Lemoine, Brigitte; Dattée, Yvette

    1986-01-01

    The paper deals with a comparison of different indicators of genetic divergence between rapeseed parental lines: the relationship coefficient defined by MALÈCOT, the generalized distance D2 of Mahalanobis, and a new G2 parameter close to HANSON & CASAS' R2. The purpose of the authors is to discuss the advantages of their simultaneous use in the prediction of both heterosis values and F1 performances of hybrids from parental lines. Relationships between heterosis values and genetic distanc...

  11. Active Metric Learning from Relative Comparisons

    OpenAIRE

    Xiong, Sicheng; Rosales, Rómer; Pei, Yuanli; Fern, Xiaoli Z.

    2014-01-01

    This work focuses on active learning of distance metrics from relative comparison information. A relative comparison specifies, for a data point triplet $(x_i,x_j,x_k)$, that instance $x_i$ is more similar to $x_j$ than to $x_k$. Such constraints, when available, have been shown to be useful for defining appropriate distance metrics. In real-world applications, acquiring constraints often requires considerable human effort. This motivates us to study how to select and query the most useful ...
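
    The active selection and querying strategy of the paper is not reproduced here, but the underlying idea, that triplet constraints of the form "x_i is closer to x_j than to x_k" shape a distance metric, can be illustrated with a simple diagonal Mahalanobis-type metric fitted by hinge-loss gradient steps; the data and triplets below are synthetic assumptions.

```python
import numpy as np

def learn_diagonal_metric(X, triplets, lr=0.05, epochs=200, margin=1.0):
    """Minimal sketch: fit a diagonal Mahalanobis-type metric from relative
    comparisons (i, j, k) meaning 'x_i is more similar to x_j than to x_k',
    by gradient descent on a hinge loss. This illustrates how such constraints
    shape a metric; it is not the paper's active-learning method."""
    w = np.ones(X.shape[1])
    for _ in range(epochs):
        for i, j, k in triplets:
            d_ij = (X[i] - X[j]) ** 2
            d_ik = (X[i] - X[k]) ** 2
            # hinge constraint: want w.d_ik - w.d_ij >= margin
            if w @ d_ik - w @ d_ij < margin:
                w -= lr * (d_ij - d_ik)
        w = np.clip(w, 0.0, None)   # keep the metric positive semi-definite
    return w

X = np.random.randn(30, 4)
X[:, 0] *= 5.0                      # a noisy, uninformative dimension
triplets = [(i, (i + 1) % 30, (i + 15) % 30) for i in range(30)]
print(learn_diagonal_metric(X, triplets))
```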

  12. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today's global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of a spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.

  13. On the Metric-Based Approximate Minimization of Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2017-01-01

    We address the behavioral metric-based approximate minimization problem of Markov Chains (MCs), i.e., given a finite MC and a positive integer k, we are interested in finding a k-state MC of minimal distance to the original. By considering as metric the bisimilarity distance of Desharnais et al...

  14. Encyclopedia of distances

    CERN Document Server

    Deza, Michel Marie

    2009-01-01

    Distance metrics and distances have become an essential tool in many areas of pure and applied Mathematics. This title offers both independent introductions and definitions, while at the same time making cross-referencing easy through hyperlink-like boldfaced references to original definitions.

  15. Metric space construction for the boundary of space-time

    International Nuclear Information System (INIS)

    Meyer, D.A.

    1986-01-01

    A distance function between points in space-time is defined and used to consider the manifold as a topological metric space. The properties of the distance function are investigated: conditions under which the metric and manifold topologies agree, the relationship with the causal structure of the space-time and with the maximum lifetime function of Wald and Yip, and in terms of the space of causal curves. The space-time is then completed as a topological metric space; the resultant boundary is compared with the causal boundary and is also calculated for some pertinent examples

  16. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. Four different metrics are estimated: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also leads to Principal Component Analysis (PCA) being performed on it, which results in the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
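
    The BIM-based and error-optimized metrics of the paper are not reproduced here, but the general effect of swapping the metric in k-NN can be sketched with scikit-learn, contrasting Euclidean distance with a covariance-based (Mahalanobis) metric. The public breast-cancer dataset is used only as a stand-in for the medical images in the study.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Public dataset as a stand-in for the imaging features in the paper.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Empirical-covariance (Mahalanobis) metric versus plain Euclidean.
VI = np.linalg.pinv(np.cov(X_train, rowvar=False))
knn_euclid = KNeighborsClassifier(n_neighbors=5)
knn_adapt = KNeighborsClassifier(n_neighbors=5, algorithm='brute',
                                 metric='mahalanobis', metric_params={'VI': VI})

for name, clf in [('euclidean', knn_euclid), ('covariance-based', knn_adapt)]:
    clf.fit(X_train, y_train)
    print(name, clf.score(X_test, y_test))
```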

  17. A methodology for quantitatively managing the bug fixing process using Mahalanobis Taguchi system

    Directory of Open Access Journals (Sweden)

    Boby John

    2015-12-01

    Full Text Available The controlling of the bug fixing process during the system testing phase of the software development life cycle is very important for fixing all the detected bugs within the scheduled time. The presence of open bugs often delays the release of the software or results in releasing the software with compromised functionalities. These can lead to customer dissatisfaction, cost overruns and eventually the loss of market share. In this paper, the authors propose a methodology to quantitatively manage the bug fixing process during system testing. The proposed methodology identifies the critical milestones in the system testing phase which differentiate the successful projects from the unsuccessful ones using the Mahalanobis Taguchi system. Then a model is developed to predict whether a project is successful or not, with the bug fix progress at the critical milestones as control factors. Finally, the model is used to control the bug fixing process. It is found that the performance of the proposed methodology using the Mahalanobis Taguchi system is superior to that of models developed using other multi-dimensional pattern recognition techniques. The proposed methodology also reduces the number of control points, providing managers with more options and flexibility to utilize the bug fixing resources across the system testing phase. Moreover, the methodology allows managers to carry out mid-course corrections to bring the bug fixing process back on track so that all the detected bugs can be fixed on time. The methodology is validated with eight new projects and the results are very encouraging.

  18. Finite Metric Spaces of Strictly Negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type, then its transfinite diameter is uniquely realized by an infinite extent ("load vector"). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance...

  19. Comparison of efficiency of distance measurement methodologies in mango (Mangifera indica) progenies based on physicochemical descriptors.

    Science.gov (United States)

    Alves, E O S; Cerqueira-Silva, C B M; Souza, A M; Santos, C A F; Lima Neto, F P; Corrêa, R X

    2012-03-14

    We investigated seven distance measures in a set of observations of physicochemical variables of mango (Mangifera indica) submitted to multivariate analyses (distance, projection and grouping). To estimate the distance measurements, five mango progeny (total of 25 genotypes) were analyzed, using six fruit physicochemical descriptors (fruit weight, equatorial diameter, longitudinal diameter, total soluble solids in °Brix, total titratable acidity, and pH). The distance measurements were compared by the Spearman correlation test, projection in two-dimensional space and grouping efficiency. The Spearman correlation coefficients between the seven distance measurements were, except for the Mahalanobis' generalized distance (0.41 ≤ rs ≤ 0.63), high and significant (rs ≥ 0.91; P < 0.001). Regardless of the origin of the distance matrix, the unweighted pair group method with arithmetic mean grouping method proved to be the most adequate. The various distance measurements and grouping methods gave different values for distortion (-116.5 ≤ D ≤ 74.5), cophenetic correlation (0.26 ≤ rc ≤ 0.76) and stress (-1.9 ≤ S ≤ 58.9). Choice of distance measurement and analysis methods influence the.

  20. Distance walked and run as improved metrics over time-based energy estimation in epidemiological studies and prevention; evidence from medication use.

    Directory of Open Access Journals (Sweden)

    Paul T Williams

    Full Text Available The guideline physical activity levels are prescribed in terms of time, frequency, and intensity (e.g., 30 minutes of brisk walking, five days a week) or their energy equivalence, and assume that different activities may be combined to meet targeted goals (the exchangeability premise). Habitual runners and walkers may quantify exercise in terms of distance (km/day), and for them, the relationship between activity dose and health benefits may be better assessed in terms of distance rather than time. Analyses were therefore performed to test: 1) whether time-based or distance-based estimates of energy expenditure provide the better metric for relating running and walking to hypertension, high cholesterol, and diabetes medication use (conditions known to be diminished by exercise), and 2) the exchangeability premise. Logistic regression analyses of medication use (dependent variable) vs. metabolic equivalent hours per day (METhr/d) of running, walking and other exercise (independent variables) were performed using cross-sectional data from the National Runners' (17,201 male, 16,173 female) and Walkers' (3,434 male, 12,384 female) Health Studies. Estimated METhr/d of running and walking activity were 38% and 31% greater, respectively, when calculated from self-reported time than from distance in men, and 43% and 37% greater in women, respectively. Percent reductions in the odds for hypertension and high cholesterol medication use per METhr/d run or walked were at least two-fold greater when estimated from reported distance (km/wk) than from time (hr/wk). The per-METhr/d odds reduction was significantly greater for the distance-based than for the time-based estimate for hypertension (runners: P<10^-5 for males and P=0.003 for females; walkers: P=0.03 for males and P<10^-4 for females), for high cholesterol medication use in runners (P<10^-4 for males and P=0.02 for females) and walkers (P=0.01 for males and P=0.08 for females), and for diabetes medication use in male runners (P<10^-3). Although causality...

  1. Computing Best and Worst Shortcuts of Graphs Embedded in Metric Spaces

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian; Luo, Jun

    2008-01-01

    Given a graph embedded in a metric space, its dilation is the maximum over all distinct pairs of vertices of the ratio between their distance in the graph and the metric distance between them. Given such a graph G with n vertices and m edges and consisting of at most two connected components, we ...

  2. Characterizing the round sphere by mean distance

    DEFF Research Database (Denmark)

    Kokkendorff, Simon Lyngby

    2008-01-01

    We discuss the measure theoretic metric invariants extent, rendezvous number and mean distance of a general compact metric space X and relate these to classical metric invariants such as diameter and radius. In the final section we focus attention to the category of Riemannian manifolds. The main...

  3. Nonlinear Semi-Supervised Metric Learning Via Multiple Kernels and Local Topology.

    Science.gov (United States)

    Li, Xin; Bai, Yanqin; Peng, Yaxin; Du, Shaoyi; Ying, Shihui

    2018-03-01

    Changing the metric on the data may change the data distribution, hence a good distance metric can promote the performance of a learning algorithm. In this paper, we address the semi-supervised distance metric learning (ML) problem to obtain the best nonlinear metric for the data. First, we describe the nonlinear metric by a multiple kernel representation. By this approach, we project the data into a high dimensional space, where the data can be well represented by linear ML. Then, we reformulate the linear ML as a minimization problem on the positive definite matrix group. Finally, we develop a two-step algorithm for solving this model and design an intrinsic steepest descent algorithm to learn the positive definite metric matrix. Experimental results validate that our proposed method is effective and outperforms several state-of-the-art ML methods.

  4. A Kerr-NUT metric

    International Nuclear Information System (INIS)

    Vaidya, P.C.; Patel, L.K.; Bhatt, P.V.

    1976-01-01

    Using Galilean time and retarded distance as coordinates, the usual Kerr metric is expressed in a form similar to the Newman-Unti-Tamburino (NUT) metric. The combined Kerr-NUT metric is then investigated. In addition to the Kerr and NUT solutions of Einstein's equations, three other types of solutions are derived. These are (i) the radiating Kerr solution, (ii) the radiating NUT solution satisfying $R_{ik} = \sigma \xi_i \xi_k$, $\xi_i \xi^i = 0$, and (iii) the associated Kerr solution satisfying $R_{ik} = 0$. Solution (i) is distinct from and simpler than the one reported earlier by Vaidya and Patel (Phys. Rev. D 7:3590 (1973)). Solutions (ii) and (iii) give line elements which have the axis of symmetry as a singular line. (author)

  5. SU-E-J-159: Intra-Patient Deformable Image Registration Uncertainties Quantified Using the Distance Discordance Metric

    International Nuclear Information System (INIS)

    Saleh, Z; Thor, M; Apte, A; Deasy, J; Sharp, G; Muren, L

    2014-01-01

    Purpose: The quantitative evaluation of deformable image registration (DIR) is currently challenging due to the lack of a ground truth. In this study we test a new method, proposed for quantifying multiple-image based DIR-related uncertainties, for DIR of pelvic images. Methods: 19 patients who had previously received radiotherapy for prostate cancer were analyzed, each with 6 CT scans. Manually delineated structures for the rectum and bladder, which served as ground truth structures, were delineated on the planning CT and on each subsequent scan. For each patient, voxel-by-voxel DIR-related uncertainties were evaluated, following B-spline based DIR, by applying a previously developed metric, the distance discordance metric (DDM; Saleh et al., PMB (2014) 59:733). The DDM map was superimposed on the first acquired CT scan and DDM statistics were assessed, also relative to two metrics estimating the agreement between the propagated and the manually delineated structures. Results: The highest DDM values, which correspond to the greatest spatial uncertainties, were observed near the body surface and in the bowel due to the presence of gas. The mean rectal and bladder DDM values ranged from 1.1-11.1 mm and 1.5-12.7 mm, respectively. There was a strong correlation in the DDMs between the rectum and bladder (Pearson R = 0.68 for the max DDM). For both structures, DDM was correlated with the ratio between the DIR-propagated and manually delineated volumes (R = 0.74 for the max rectal DDM). The maximum rectal DDM was negatively correlated with the Dice Similarity Coefficient between the propagated and the manually delineated volumes (R = -0.52). Conclusion: The multiple-image based DDM map quantified considerable DIR variability across different structures and among patients. Besides using the DDM for quantifying DIR-related uncertainties, it could potentially be used to adjust for uncertainties in DIR-based accumulated dose distributions.

  6. Algorithms for Planar Graphs and Graphs in Metric Spaces

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    structural properties that can be exploited. For instance, a road network or a wire layout on a microchip is typically (near-)planar and distances in the network are often defined w.r.t. the Euclidean or the rectilinear metric. Specialized algorithms that take advantage of such properties are often orders of magnitude faster than the corresponding algorithms for general graphs. The first and main part of this thesis focuses on the development of efficient planar graph algorithms. The most important contributions include a faster single-source shortest path algorithm, a distance oracle with subquadratic... for geometric graphs and graphs embedded in metric spaces. Roughly speaking, the stretch factor is a real value expressing how well a (geo-)metric graph approximates the underlying complete graph w.r.t. distances. We give improved algorithms for computing the stretch factor of a given graph and for augmenting...

  7. Discriminatory Data Mapping by Matrix-Based Supervised Learning Metrics

    NARCIS (Netherlands)

    Strickert, M.; Schneider, P.; Keilwagen, J.; Villmann, T.; Biehl, M.; Hammer, B.

    2008-01-01

    Supervised attribute relevance detection using cross-comparisons (SARDUX), a recently proposed method for data-driven metric learning, is extended from dimension-weighted Minkowski distances to metrics induced by a data transformation matrix Ω for modeling mutual attribute dependence. Given class

  8. Continuity Properties of Distances for Markov Processes

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Mao, Hua; Larsen, Kim Guldstrand

    2014-01-01

    In this paper we investigate distance functions on finite state Markov processes that measure the behavioural similarity of non-bisimilar processes. We consider both probabilistic bisimilarity metrics, and trace-based distances derived from standard Lp and Kullback-Leibler distances. Two desirable...

  9. Distance between Behaviors and Rational Representations

    NARCIS (Netherlands)

    Trentelman, H.L.; Gottimukkala, S.V.

    2013-01-01

    In this paper we study notions of distance between behaviors of linear differential systems. We introduce four metrics on the space of all controllable behaviors which generalize existing metrics on the space of input-output systems represented by transfer matrices. Three of these are defined in

  10. Contextual Distance Refining for Image Retrieval

    KAUST Repository

    Islam, Almasri

    2014-01-01

    Recently, a number of methods have been proposed to improve image retrieval accuracy by capturing context information. These methods try to compensate for the fact that a visually less similar image might be more relevant because it depicts the same object. We propose a new, quick method for refining any pairwise distance metric: it works by iteratively discovering the object in the image from the most similar images, and then refining the distance metric accordingly. Tests show that our technique improves over the state of the art in terms of accuracy on the MPEG7 dataset.

  11. Contextual Distance Refining for Image Retrieval

    KAUST Repository

    Islam, Almasri

    2014-09-16

    Recently, a number of methods have been proposed to improve image retrieval accuracy by capturing context information. These methods try to compensate for the fact that a visually less similar image might be more relevant because it depicts the same object. We propose a new, quick method for refining any pairwise distance metric: it works by iteratively discovering the object in the image from the most similar images, and then refining the distance metric accordingly. Tests show that our technique improves over the state of the art in terms of accuracy on the MPEG7 dataset.

  12. Metrics in Keplerian orbits quotient spaces

    Science.gov (United States)

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies on a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, asteroid family identification, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbits space H. Their elements are identified irrespective of the values of the pericentre arguments or node longitudes. We prove that the distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy the metric space axioms, and we discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and into a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find applications in a problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close orbits identification. Finally, the invariance of the equivalence relations in H under coordinate changes is discussed.

  13. Improved nonlinear fault detection strategy based on the Hellinger distance metric: Plug flow reactor monitoring

    KAUST Repository

    Harrou, Fouzi

    2017-03-18

    Fault detection has a vital role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. This paper proposes an innovative multivariate fault detection method that can be used for monitoring nonlinear processes. The proposed method merges the advantages of nonlinear projection to latent structures (NLPLS) modeling with those of the Hellinger distance (HD) metric to identify abnormal changes in highly correlated multivariate data. Specifically, the HD is used to quantify the dissimilarity between the current NLPLS-based residual distribution and a reference probability distribution obtained using fault-free data. Furthermore, to further enhance robustness to measurement noise and reduce false alarms due to modeling errors, wavelet-based multiscale filtering of the residuals is applied before the HD-based monitoring scheme. The performance of the developed NLPLS-HD fault detection technique is illustrated using simulated plug flow reactor data. The results show that the proposed method provides favorable fault detection performance compared to the conventional NLPLS method.
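
    The NLPLS modeling and multiscale filtering steps are not reproduced here, but the Hellinger distance between a current residual histogram and a fault-free reference, the core monitoring statistic described above, can be sketched as follows; the residual distributions are synthetic assumptions.

```python
import numpy as np

def hellinger_distance(p, q):
    """Hellinger distance between two discrete distributions p and q
    (e.g., histograms of current residuals vs. fault-free residuals)."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

bins = np.linspace(-5, 5, 41)
reference, _ = np.histogram(np.random.normal(0.0, 1, 5000), bins=bins)  # fault-free residuals
faulty, _ = np.histogram(np.random.normal(0.8, 1, 5000), bins=bins)     # shifted residuals
print(hellinger_distance(reference, faulty))   # a larger value flags a potential fault
```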

  14. Converging from Branching to Linear Metrics on Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim Guldstrand

    2015-01-01

    time in the size of the MC. The upper-approximants are Kantorovich-like pseudometrics, i.e. branching-time distances, that converge point-wise to the linear-time metrics. This convergence is interesting in itself, since it reveals a nontrivial relation between branching and linear-time metric...

  15. Speaker Recognition from Emotional Speech Using I-vector Approach

    Directory of Open Access Journals (Sweden)

    MACKOVÁ Lenka

    2014-05-01

    Full Text Available In recent years the concept of i-vectors has become very popular and successful in the field of speaker verification. The basic principle of i-vectors is that each utterance is represented by a fixed-length, low-dimensional feature vector. In the literature, recordings obtained from telephones or microphones have typically been used for speaker verification. The aim of this experiment was to perform speaker verification using a speaker model trained with emotional recordings on an i-vector basis. The Mel Frequency Cepstral Coefficients (MFCC), log energy, and their delta and acceleration coefficients were used in the feature extraction process. As classification methods of the verification system, the Mahalanobis distance metric in combination with Eigen Factor Radial normalization was used in the first approach, and the Cosine Distance Scoring (CSS) metric with Within-Class Covariance Normalization as channel compensation was employed in the second. The verification system used emotional recordings of male subjects from the freely available German emotional database (Emo-DB).

  16. On the Metric-based Approximate Minimization of Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2018-01-01

    In this paper we address the approximate minimization problem of Markov Chains (MCs) from a behavioral metric-based perspective. Specifically, given a finite MC and a positive integer k, we are looking for an MC with at most k states having minimal distance to the original. The metric considered...

  17. Normalized compression distance of multisets with applications

    NARCIS (Netherlands)

    Cohen, A.R.; Vitányi, P.M.B.

    Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free, similarity metric based on compression. We propose an NCD of multisets that is also metric. Previously, attempts to obtain such an NCD failed. For classification purposes it is superior to the pairwise

  18. Assessment of six dissimilarity metrics for climate analogues

    Science.gov (United States)

    Grenier, Patrick; Parent, Annie-Claude; Huard, David; Anctil, François; Chaumont, Diane

    2013-04-01

    Spatial analogue techniques consist in identifying locations whose recent-past climate is similar in some aspects to the future climate anticipated at a reference location. When identifying analogues, one key step is the quantification of the dissimilarity between two climates separated in time and space, which involves the choice of a metric. In this communication, spatial analogues and their usefulness are briefly discussed. Next, six metrics are presented (the standardized Euclidean distance, the Kolmogorov-Smirnov statistic, the nearest-neighbor distance, the Zech-Aslan energy statistic, the Friedman-Rafsky runs statistic and the Kullback-Leibler divergence), along with a set of criteria used for their assessment. The related case study involves the use of numerical simulations performed with the Canadian Regional Climate Model (CRCM-v4.2.3), from which three annual indicators (total precipitation, heating degree-days and cooling degree-days) are calculated over 30-year periods (1971-2000 and 2041-2070). Results indicate that the six metrics identify comparable analogue regions at a relatively large scale, but best analogues may differ substantially. For best analogues, it is also shown that the uncertainty stemming from the metric choice does generally not exceed that stemming from the simulation or model choice. A synthesis of the advantages and drawbacks of each metric is finally presented, in which the Zech-Aslan energy statistic stands out as the most recommended metric for analogue studies, whereas the Friedman-Rafsky runs statistic is the least recommended, based on this case study.
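
    As an illustration of the simplest of the six metrics, here is a minimal sketch of a standardized Euclidean distance between two climates described by the three annual indicators named above; the indicator values and scaling factors are hypothetical, and the actual study may standardize differently:

```python
# Minimal sketch: standardized Euclidean distance between a reference climate and a
# candidate analogue, each described by a few annual indicators.
import numpy as np

def standardized_euclidean(x_ref, x_cand, scale):
    """Euclidean distance after dividing each indicator by a scale (e.g., its std. dev.)."""
    x_ref, x_cand, scale = map(np.asarray, (x_ref, x_cand, scale))
    return float(np.sqrt(np.sum(((x_ref - x_cand) / scale) ** 2)))

future_ref = [950.0, 4200.0, 250.0]    # hypothetical precipitation, HDD, CDD at the reference
recent_cand = [900.0, 4600.0, 180.0]   # hypothetical indicators at a candidate analogue
spread = [80.0, 300.0, 60.0]           # assumed interannual standard deviations
print(standardized_euclidean(future_ref, recent_cand, spread))
```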

  19. INFORMATIVE ENERGY METRIC FOR SIMILARITY MEASURE IN REPRODUCING KERNEL HILBERT SPACES

    Directory of Open Access Journals (Sweden)

    Songhua Liu

    2012-02-01

    Full Text Available In this paper, an information energy metric (IEM) is obtained by similarity computing for high-dimensional samples in a reproducing kernel Hilbert space (RKHS). Firstly, similar/dissimilar subsets and their corresponding informative energy functions are defined. Secondly, the IEM is proposed for similarity measurement of those subsets, which converts the non-metric distances into metric ones. Finally, applications of this metric, such as classification problems, are introduced. Experimental results validate the effectiveness of the proposed method.

  20. Wireless sensor network performance metrics for building applications

    Energy Technology Data Exchange (ETDEWEB)

    Jang, W.S. (Department of Civil Engineering Yeungnam University 214-1 Dae-Dong, Gyeongsan-Si Gyeongsangbuk-Do 712-749 South Korea); Healy, W.M. [Building and Fire Research Laboratory, 100 Bureau Drive, Gaithersburg, MD 20899-8632 (United States)

    2010-06-15

    Metrics are investigated to help assess the performance of wireless sensors in buildings. Wireless sensor networks present tremendous opportunities for energy savings and improvement in occupant comfort in buildings by making data about conditions and equipment more readily available. A key barrier to their adoption, however, is the uncertainty among users regarding the reliability of the wireless links through building construction. Tests were carried out that examined three performance metrics as a function of transmitter-receiver separation distance, transmitter power level, and obstruction type. These tests demonstrated, via the packet delivery rate, a clear transition from reliable to unreliable communications at different separation distances. While the packet delivery rate is difficult to measure in actual applications, the received signal strength indication correlated well with the drop in packet delivery rate in the relatively noise-free environment used in these tests. The concept of an equivalent distance was introduced to translate the range of reliability in open field operation to that seen in a typical building, thereby providing wireless system designers a rough estimate of the necessary spacing between sensor nodes in building applications. It is anticipated that the availability of straightforward metrics on the range of wireless sensors in buildings will enable more widespread sensing in buildings for improved control and fault detection. (author)

  1. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in the last years which do not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but which also enable the interpretability of these operations by assigning explicit relevance weights...

  2. A metric for the Radial Basis Function Network - Application on Real Radar Data

    NARCIS (Netherlands)

    Heiden, R. van der; Groen, F.C.A.

    1996-01-01

    A Radial Basis Functions (RBF) network for pattern recognition is considered. Classification with such a network is based on distances between patterns, so a metric is always present. Using real radar data, the Euclidean metric is shown to perform poorly - a metric based on the so called Box-Cox

  3. A guide to phylogenetic metrics for conservation, community ecology and macroecology

    Science.gov (United States)

    Cadotte, Marc W.; Carvalho, Silvia B.; Davies, T. Jonathan; Ferrier, Simon; Fritz, Susanne A.; Grenyer, Rich; Helmus, Matthew R.; Jin, Lanna S.; Mooers, Arne O.; Pavoine, Sandrine; Purschke, Oliver; Redding, David W.; Rosauer, Dan F.; Winter, Marten; Mazel, Florent

    2016-01-01

    ABSTRACT The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub‐disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub‐disciplines hampers potential meta‐analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo‐diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information. Generally, questions about phylogenetic relationships within or between assemblages tend to ask three types of question: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo‐diversity metrics based on their mathematical form within these three dimensions and identify ‘anchor’ representatives: for α‐diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and using simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices. PMID:26785932
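
    As a small illustration of one of the 'anchor' metrics named above, the following sketch computes MPD (mean pairwise distance) from a toy patristic distance matrix; the values are hypothetical, and real analyses would typically use a dedicated phylogenetics package:

```python
# Minimal sketch: mean pairwise distance (MPD) among the species in one assemblage,
# computed from a patristic (phylogenetic) distance matrix.
import numpy as np

def mean_pairwise_distance(dist_matrix):
    """Mean of the pairwise phylogenetic distances over all distinct species pairs."""
    d = np.asarray(dist_matrix, dtype=float)
    iu = np.triu_indices_from(d, k=1)   # each unordered pair counted once
    return float(d[iu].mean())

# Toy 4-species patristic distance matrix (values hypothetical)
D = np.array([[0, 2, 6, 6],
              [2, 0, 6, 6],
              [6, 6, 0, 3],
              [6, 6, 3, 0]])
print(mean_pairwise_distance(D))
```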

  4. A Metric on Phylogenetic Tree Shapes.

    Science.gov (United States)

    Colijn, C; Plazzotta, G

    2018-01-01

    The shapes of evolutionary trees are influenced by the nature of the evolutionary process but comparisons of trees from different processes are hindered by the challenge of completely describing tree shape. We present a full characterization of the shapes of rooted branching trees in a form that lends itself to natural tree comparisons. We use this characterization to define a metric, in the sense of a true distance function, on tree shapes. The metric distinguishes trees from random models known to produce different tree shapes. It separates trees derived from tropical versus USA influenza A sequences, which reflect the differing epidemiology of tropical and seasonal flu. We describe several metrics based on the same core characterization, and illustrate how to extend the metric to incorporate trees' branch lengths or other features such as overall imbalance. Our approach allows us to construct addition and multiplication on trees, and to create a convex metric on tree shapes which formally allows computation of average tree shapes. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  5. Assessment of the Log-Euclidean Metric Performance in Diffusion Tensor Image Segmentation

    Directory of Open Access Journals (Sweden)

    Mostafa Charmi

    2010-06-01

    Full Text Available Introduction: Appropriate definition of the distance measure between diffusion tensors has a deep impact on Diffusion Tensor Image (DTI) segmentation results. The geodesic metric is the best distance measure since it yields high-quality segmentation results. However, the important problem with the geodesic metric is the high computational cost of the algorithms based on it. The main goal of this paper is to assess the possible substitution of the geodesic metric with the Log-Euclidean one to reduce the computational cost of a statistical surface evolution algorithm. Materials and Methods: We incorporated the Log-Euclidean metric in the statistical surface evolution algorithm framework. To achieve this goal, the statistics and gradients of diffusion tensor images were defined using the Log-Euclidean metric. Numerical implementation of the segmentation algorithm was performed in the MATLAB software using finite difference techniques. Results: In the statistical surface evolution framework, the Log-Euclidean metric was able to discriminate the torus and helix patterns in synthetic datasets and rat spinal cords in biological phantom datasets from the background better than the Euclidean and J-divergence metrics. In addition, similar results were obtained with the geodesic metric. However, the main advantage of the Log-Euclidean metric over the geodesic metric was the dramatic reduction of the computational cost of the segmentation algorithm, by at least 70 times. Discussion and Conclusion: The qualitative and quantitative results have shown that the Log-Euclidean metric is a good substitute for the geodesic metric when using a statistical surface evolution algorithm in DTI segmentation.
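
    The paper's implementation is in MATLAB; purely to illustrate the metric itself, here is a minimal Python sketch of the Log-Euclidean distance d(A, B) = ||logm(A) - logm(B)||_F between two toy symmetric positive-definite tensors (the tensor values are hypothetical):

```python
# Minimal sketch: Log-Euclidean distance between two SPD diffusion tensors.
import numpy as np
from scipy.linalg import logm

def log_euclidean_distance(tensor_a, tensor_b):
    """Frobenius norm of the difference of matrix logarithms of two SPD tensors."""
    return float(np.linalg.norm(logm(tensor_a) - logm(tensor_b), ord='fro'))

A = np.diag([1.0, 0.5, 0.2])              # toy diagonal tensor
B = np.array([[1.2, 0.1, 0.0],
              [0.1, 0.6, 0.0],
              [0.0, 0.0, 0.3]])            # toy SPD tensor
print(log_euclidean_distance(A, B))
```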

  6. A Non-Destructive Optical Method for the DP Measurement of Paper Insulation Based on the Free Fibers in Transformer Oil

    Directory of Open Access Journals (Sweden)

    Lei Peng

    2018-03-01

    Full Text Available In order to explore a non-destructive method for measuring the polymerization degree (DP) of paper insulation in transformers, a new method based on the optical properties of free fiber particles in transformer oil was studied. The chromatic dispersion images of fibers with different aging degrees were obtained by polarizing microscope, and the eigenvalues (r, b, and Mahalanobis distance) of the images were extracted by the RGB (red, green, and blue) tricolor analysis method. Then, the correlations between the three eigenvalues and the DP of paper insulation were fitted respectively. The results showed that the color of the images changed gradually from blue-purple to orange-yellow with increasing aging degree. Of the three eigenvalues, the relationship between Mahalanobis distance and DP had the best goodness of fit (R2 = 0.98), higher than that of r (0.94) and b (0.94). The mean square error of the relationship between Mahalanobis distance and DP (52.17) was also significantly lower than that of r and b (97.58 and 98.05). Therefore, the DP of unknown paper insulation could be calculated from the fitted relationship between Mahalanobis distance and DP.

  7. Classification in medical image analysis using adaptive metric k-NN

    DEFF Research Database (Denmark)

    Chen, Chen; Chernoff, Konstantin; Karemore, Gopal

    2010-01-01

    The performance of the k-nearest neighbor (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier...

  8. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
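
    The paper's own segmentation pipeline is not reproduced here; as a hedged sketch of the general idea (an LDA-derived linear transform used as a learned distance metric), with entirely synthetic "spectra" and labels standing in for real CRISM data:

```python
# Minimal sketch: learn a linear transform with multiclass LDA from labeled spectra,
# then measure spectral similarity as Euclidean distance in the transformed space.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))           # toy "spectra": 300 pixels, 50 bands
y = rng.integers(0, 3, size=300)         # toy class labels (e.g., mineralogical classes)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

def learned_distance(spec_a, spec_b, model=lda):
    """Distance between two spectra after projection by the learned LDA transform."""
    a, b = model.transform(np.vstack([spec_a, spec_b]))
    return float(np.linalg.norm(a - b))

print(learned_distance(X[0], X[1]))
```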

  9. Networks and centroid metrics for understanding football

    African Journals Online (AJOL)

    Gonçalo Dias

    games. However, it seems that the centroid metric, supported only by the position of players in the field ...... the strategy adopted by the coach (Gama et al., 2014). ... centroid distance as measures of team's tactical performance in youth football.

  10. The Metric of Colour Space

    DEFF Research Database (Denmark)

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching and found the MacAdam ellipses which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates correspond to human perception. Using cubic splines to represent...

  11. Una Modificación a la Distancia de Cook || A Modification of Cook's Distance

    Directory of Open Access Journals (Sweden)

    José Antonio Díaz García

    2012-02-01

    Full Text Available A modification of the classical Cook's distance is proposed in this paper, based on the generalized Mahalanobis distance, in the context of the multivariate linear regression model with normal distribution. Furthermore, the exact distribution of a pivotal-type statistic based on this generalized Mahalanobis distance is established, which provides critical points for identifying outliers in a data set. The procedure is illustrated with an example in the case of multivariate multiple linear regression.

  12. Some Metric Properties of Planar Gaussian Free Field

    Science.gov (United States)

    Goswami, Subhajit

    In this thesis we study the properties of some metrics arising from the two-dimensional Gaussian free field (GFF), namely the Liouville first-passage percolation (Liouville FPP), the Liouville graph distance and an effective resistance metric. In Chapter 1, we define these metrics as well as discuss the motivations for studying them. Roughly speaking, Liouville FPP is the shortest path metric in a planar domain D where the length of a path P is given by $\int_P e^{\gamma h(z)}\,|dz|$, where h is the GFF on D and $\gamma > 0$. In Chapter 2, we present an upper bound on the expected Liouville FPP distance between two typical points for small values of $\gamma$ (the near-Euclidean regime). A similar upper bound is derived in Chapter 3 for the Liouville graph distance which is, roughly, the minimal number of Euclidean balls with comparable Liouville quantum gravity (LQG) measure whose union contains a continuous path between two endpoints. Our bounds seem to be in disagreement with Watabiki's prediction (1993) on the random metric of Liouville quantum gravity in this regime. The contents of these two chapters are based on a joint work with Jian Ding. In Chapter 4, we derive some asymptotic estimates for effective resistances on a random network which is defined as follows. Given any $\gamma > 0$ and for $\eta = \{\eta_v\}_{v \in \mathbb{Z}^2}$ denoting a sample of the two-dimensional discrete Gaussian free field on $\mathbb{Z}^2$ pinned at the origin, we equip the edge $(u, v)$ with conductance $e^{\gamma(\eta_u + \eta_v)}$. The metric structure of effective resistance plays a crucial role in our proof of the main result in Chapter 4. The primary motivation behind this metric is to understand the random walk on $\mathbb{Z}^2$ where the edge $(u, v)$ has weight $e^{\gamma(\eta_u + \eta_v)}$. Using the estimates from Chapter 4 we show in Chapter 5 that for almost every $\eta$, this random walk is recurrent and that, with probability tending to 1 as $T \to \infty$, the return probability at time $2T$ decays as $T^{-1+o(1)}$. In addition, we prove a version of subdiffusive...

  13. Generalized Distance Transforms and Skeletons in Graphics Hardware

    NARCIS (Netherlands)

    Strzodka, R.; Telea, A.

    2004-01-01

    We present a framework for computing generalized distance transforms and skeletons of two-dimensional objects using graphics hardware. Our method is based on the concept of footprint splatting. Combining different splats produces weighted distance transforms for different metrics, as well as the

  14. Emergence of the scale-invariant proportion in a flock from the metric-topological interaction.

    Science.gov (United States)

    Niizato, Takayuki; Murakami, Hisashi; Gunji, Yukio-Pegio

    2014-05-01

    Recently, it has become possible to more precisely analyze flocking behavior. Such research has prompted a reconsideration of the notion of neighborhoods in the theoretical model. Flocking based on topological distance is one such result. In a topological flocking model, a bird does not interact with its neighbors on the basis of a fixed-size neighborhood (i.e., on the basis of metric distance), but instead interacts with its nearest seven neighbors. Cavagna et al., moreover, found a new phenomenon in flocks that can be explained by neither metric distance nor topological distance: they found that correlated domains in a flock were larger than the metric and topological distance and that these domains were proportional to the total flock size. However, the role of scale-free correlation is still unclear. In a previous study, we constructed a metric-topological interaction model on three-dimensional spaces and showed that this model exhibited scale-free correlation. In this study, we found that scale-free correlation in a two-dimensional flock was more robust than in a three-dimensional flock for the threshold parameter. Furthermore, we also found a qualitative difference in behavior from using the fluctuation coherence, which we observed on three-dimensional flocking behavior. Our study suggests that two-dimensional flocks try to maintain a balance between the flock size and flock mobility by breaking into several smaller flocks. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Distance-Based Phylogenetic Methods Around a Polytomy.

    Science.gov (United States)

    Davidson, Ruth; Sullivant, Seth

    2014-01-01

    Distance-based phylogenetic algorithms attempt to solve the NP-hard least-squares phylogeny problem by mapping an arbitrary dissimilarity map representing biological data to a tree metric. The set of all dissimilarity maps is a Euclidean space properly containing the space of all tree metrics as a polyhedral fan. Outputs of distance-based tree reconstruction algorithms such as UPGMA and neighbor-joining are points in the maximal cones in the fan. Tree metrics with polytomies lie at the intersections of maximal cones. A phylogenetic algorithm divides the space of all dissimilarity maps into regions based upon which combinatorial tree is reconstructed by the algorithm. Comparison of phylogenetic methods can be done by comparing the geometry of these regions. We use polyhedral geometry to compare the local nature of the subdivisions induced by least-squares phylogeny, UPGMA, and neighbor-joining when the true tree has a single polytomy with exactly four neighbors. Our results suggest that in some circumstances, UPGMA and neighbor-joining poorly match least-squares phylogeny.

  16. Metric interpretation of gauge fields in noncommutative geometry

    International Nuclear Information System (INIS)

    Martinetti, P.

    2007-01-01

    We shall give an overview of the metric interpretation of gauge fields in noncommutative geometry, via Connes distance formula. Especially we shall focus on the Higgs fields in the standard model, and gauge fields in various models of fiber bundle. (author)

  17. Local adjacency metric dimension of sun graph and stacked book graph

    Science.gov (United States)

    Yulisda Badri, Alifiah; Darmaji

    2018-03-01

    A graph is a mathematical system consisting of a non-empty set of vertices and a (possibly empty) set of edges. One of the topics studied in graph theory is the metric dimension. An application of the metric dimension is robot navigation along a path: a robot moves from one vertex to another in a field while minimizing the errors that occur in translating the instructions (codes) obtained from the vertices at its location. For the robot to move efficiently, it must quickly translate the codes of the vertices it passes, so each location vertex should be resolvable at minimum distance. However, if the robot must operate on a very large field, it cannot resolve vertices whose distance is too far [6]. In this case, the robot can determine its position by utilizing location vertices based on adjacency. The problem is to find the minimum cardinality of the required location vertices, and where to place them, so that the robot can determine its location. The solution to this problem is the adjacency metric dimension and the adjacency metric basis. Rodríguez-Velázquez and Fernau combined the adjacency metric dimension with the local metric dimension, obtaining the local adjacency metric dimension. In the local adjacency metric dimension, each vertex in the graph may have the same adjacency representation as its adjacent vertices. To obtain the local adjacency metric dimension values for the sun graph and the stacked book graph, a construction method is used that considers the representation of each adjacent vertex of the graph.

  18. Comparing Phylogenetic Trees by Matching Nodes Using the Transfer Distance Between Partitions.

    Science.gov (United States)

    Bogdanowicz, Damian; Giaro, Krzysztof

    2017-05-01

    Ability to quantify dissimilarity of different phylogenetic trees describing the relationship between the same group of taxa is required in various types of phylogenetic studies. For example, such metrics are used to assess the quality of phylogeny construction methods, to define optimization criteria in supertree building algorithms, or to find horizontal gene transfer (HGT) events. Among the set of metrics described so far in the literature, the most commonly used seems to be the Robinson-Foulds distance. In this article, we define a new metric for rooted trees-the Matching Pair (MP) distance. The MP metric uses the concept of the minimum-weight perfect matching in a complete bipartite graph constructed from partitions of all pairs of leaves of the compared phylogenetic trees. We analyze the properties of the MP metric and present computational experiments showing its potential applicability in tasks related to finding the HGT events.

  19. Are contemporary tourists consuming distance?

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    2012

    Background: The background for this research, which explores how tourists represent distance and whether or not distance can be said to be consumed by contemporary tourists, is the increasing leisure mobility of people. Travelling for the purpose of visiting friends and relatives is increasing...... of understanding mobility at a conceptual level, and distance matters to people's manifest mobility: how they travel and how far they travel are central elements of their movements. Therefore leisure mobility (indeed all mobility) is the activity of relating across distance, either through actual corporeal...... metric representation. These representations are the focus for this research. Research Aim and Questions: The aim of this research is thus to explore how distance is being represented within the context of leisure mobility. Further the aim is to explore how or whether distance is being consumed...

  20. Model assessment using a multi-metric ranking technique

    Science.gov (United States)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics and identifying adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Variables include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, multiplicative gross error, and root mean squared differences. Other metrics, such as root mean square difference and rank correlation, were also explored, but removed when their information was discovered to be generally duplicative of other metrics. While equal weights are applied, the weights could be altered depending on preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context and will be briefly reported.

  1. The transposition distance for phylogenetic trees

    OpenAIRE

    Rossello, Francesc; Valiente, Gabriel

    2006-01-01

    The search for similarity and dissimilarity measures on phylogenetic trees has been motivated by the computation of consensus trees, the search by similarity in phylogenetic databases, and the assessment of clustering results in bioinformatics. The transposition distance for fully resolved phylogenetic trees is a recent addition to the extensive collection of available metrics for comparing phylogenetic trees. In this paper, we generalize the transposition distance from fully resolved to arbi...

  2. A Lorentzian Gromov-Hausdorff notion of distance

    International Nuclear Information System (INIS)

    Noldus, Johan

    2004-01-01

    This paper is the first of three in which I study the moduli space of isometry classes of (compact) globally hyperbolic spacetimes (with boundary). I introduce a notion of Gromov-Hausdorff distance which makes this moduli space into a metric space. Further properties of this metric space are studied in the next two papers. The importance of the work is in fields such as cosmology, quantum gravity and - for the mathematicians - global Lorentzian geometry

  3. Metrics for Performance Evaluation of Patient Exercises during Physical Therapy.

    Science.gov (United States)

    Vakanski, Aleksandar; Ferguson, Jake M; Lee, Stephen

    2017-06-01

    The article proposes a set of metrics for evaluation of patient performance in physical therapy exercises. A taxonomy is employed that classifies the metrics into quantitative and qualitative categories, based on the level of abstraction of the captured motion sequences. Further, the quantitative metrics are classified into model-less and model-based metrics, according to whether the evaluation employs the raw measurements of patient-performed motions or is based on a mathematical model of the motions. The reviewed metrics include root-mean-square distance, Kullback-Leibler divergence, log-likelihood, heuristic consistency, the Fugl-Meyer Assessment, and similar. The metrics are evaluated for a set of five human motions captured with a Kinect sensor. The metrics can potentially be integrated into a system that employs machine learning for modelling and assessment of the consistency of patient performance in a home-based therapy setting. Automated performance evaluation can overcome the inherent subjectivity in human-performed therapy assessment, increase adherence to prescribed therapy plans, and reduce healthcare costs.

  4. Hacia una visión holística de la distancia en los negocios internacionales: el caso colombiano || Towards Holistic Version of Distance in International Business: The Colombian Case

    Directory of Open Access Journals (Sweden)

    Caicedo Marulanda, Carolina

    2017-06-01

    Full Text Available This article presents new estimates of distance in international business between Colombia and 57 countries in Europe, America, Asia, Africa and Oceania. The concept of distance used here transcends the institutional level and incorporates both institutional aspects and aspects that have not yet been considered in the literature, such as the labor market, business practice and innovation. In this way, we build a holistic distance index and discuss the results for the specific case of Colombia. From the construction of a synthetic distance index, we calculate the distances between Colombia and the countries of the sample using the Mahalanobis distance over the dimensions analyzed here. Our results show that, although globalization has advanced international business between Colombia and the countries of the sample, there are still restrictions faced by Colombian companies interested in entering these markets in order to benefit from free trade agreements.

  5. Video Analytics Evaluation: Survey of Datasets, Performance Metrics and Approaches

    Science.gov (United States)

    2014-09-01

    people with different ethnicity and gender. Currently we have four subjects, but more can be added in the future. • Lighting Variations. We consider... is however not a proper distance as the triangular inequality condition is not met. For this reason, the next metric should be preferred. • the... and Alan F. Smeaton and Georges Quenot, An Overview of the Goals, Tasks, Data, Evaluation Mechanisms and Metrics, Proceedings of TRECVID 2011, NIST, USA

  6. Predicting speech release from masking through spatial separation in distance

    DEFF Research Database (Denmark)

    Chabot-Leclerc, Alexandre; Dau, Torsten

    2014-01-01

    of spatial release from masking (SRM) where the masker is moved, on-axis, away from the target. Two binaural models, which use the conventional audio signal-to-noise ratio (SNR) in the decision metric, and two monaural models, using a decision metric based on the SNR in the envelope domain (SNRenv), were considered. The predictions were compared to data from Westermann et al. [2013, POMA, 19, 050156] in conditions where the target was located 0.5 m in front of the listener and the masker was presented at a distance of 0.5, 2, 5 or 10 m in front of the listener. The data showed an SRM of 10 dB when moving the masker from a distance of 0.5 m to a distance of 10 m. The long-term monaural model based on the SNRenv metric was able to account for most of the SRM data, whereas the models that used the audio SNR did not predict any SRM, even when they included an equalization-cancellation-like process. The short

  7. Clustering of financial time series

    Science.gov (United States)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, needs the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.

  8. H-Metric: Characterizing Image Datasets via Homogenization Based on KNN-Queries

    Directory of Open Access Journals (Sweden)

    Welington M da Silva

    2012-01-01

    Full Text Available Precision-Recall is one of the main metrics for evaluating content-based image retrieval techniques. However, it does not provide an ample perception of the properties of an image dataset immersed in a metric space. In this work, we describe an alternative metric named H-Metric, which is determined along a sequence of controlled modifications in the image dataset. The process is named homogenization and works by altering the homogeneity characteristics of the classes of the images. The result is a process that measures how hard it is to deal with a set of images with respect to content-based retrieval, offering support in the task of analyzing configurations of distance functions and feature extractors.

  9. Rapid detection of foodborne microorganisms on food surface using Fourier transform Raman spectroscopy

    Science.gov (United States)

    Yang, Hong; Irudayaraj, Joseph

    2003-02-01

    Fourier transform (FT) Raman spectroscopy was used for non-destructive characterization and differentiation of six different microorganisms, including the pathogen Escherichia coli O157:H7, on whole apples. The Mahalanobis distance metric was used to evaluate and quantify the statistical differences between the spectra of the six microorganisms. The same procedure was extended to discriminate six different strains of E. coli. The FT-Raman procedure not only successfully discriminated the different E. coli strains but also accurately differentiated the pathogen from non-pathogens. Results demonstrate that FT-Raman spectroscopy can be an excellent tool for rapid examination of food surfaces for microorganism contamination and for the classification of microbial cultures.
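
    The spectral preprocessing used in the study is not reproduced here; as a hedged sketch of the underlying idea (the Mahalanobis distance of a new observation from a reference class), with toy score vectors standing in for processed Raman spectra:

```python
# Minimal sketch: Mahalanobis distance of a new spectrum (reduced to a few scores,
# e.g., principal components) from the mean of a reference class of spectra.
import numpy as np

def mahalanobis_to_class(x, class_samples):
    """Mahalanobis distance from observation x to the class described by class_samples."""
    class_samples = np.asarray(class_samples, dtype=float)
    mu = class_samples.mean(axis=0)
    cov = np.cov(class_samples, rowvar=False)
    diff = np.asarray(x, dtype=float) - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

rng = np.random.default_rng(1)
reference_scores = rng.normal(size=(40, 3))    # toy reference-class score vectors
unknown = np.array([0.5, -1.0, 2.0])           # toy scores of an unknown spectrum
print(mahalanobis_to_class(unknown, reference_scores))
```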

  10. A PEG Construction of LDPC Codes Based on the Betweenness Centrality Metric

    Directory of Open Access Journals (Sweden)

    BHURTAH-SEEWOOSUNGKUR, I.

    2016-05-01

    Full Text Available Progressive Edge Growth (PEG) constructions are usually based on optimizing the distance metric by using various methods. In this work, however, the distance metric is replaced by a different one, namely the betweenness centrality metric, which was shown to enhance routing performance in wireless mesh networks. A new type of PEG construction for Low-Density Parity-Check (LDPC) codes is introduced based on the betweenness centrality metric borrowed from social networks terminology, given that the bipartite graph describing the LDPC code is analogous to a network of nodes. The algorithm is very efficient in filling edges on the bipartite graph by adding its connections in an edge-by-edge manner. The smallest graph size the new code could construct surpasses those obtained from a modified PEG algorithm - the RandPEG algorithm. To the best of the authors' knowledge, this paper produces the best regular LDPC column-weight-two graphs. In addition, the technique proves to be competitive in terms of error-correcting performance. When compared to MacKay, PEG and other recent modified-PEG codes, the algorithm gives better performance at high SNR due to its particular edge and local graph properties.

  11. A Feeling for Numbers: Shared Metric for Symbolic and Tactile Numerosities

    Directory of Open Access Journals (Sweden)

    Florian eKrause

    2013-01-01

    Full Text Available Evidence for an approximate analogue system of numbers has been provided by the finding that the comparison of two numerals takes longer and is more error prone if the semantic distance between the numbers becomes smaller (the so-called numerical distance effect). Recent embodied theories suggest that analogue number representations are based on previous sensory experiences and therefore constitute a common magnitude metric shared by multiple domains. Here we demonstrate the existence of a cross-modal semantic distance effect between symbolic and tactile numerosities. Participants received tactile stimulation of different numbers of fingers while reading Arabic digits and indicated verbally whether the number of stimulated fingers differed from the simultaneously presented digit or not. The larger the semantic distance between the two numerosities, the faster and more accurate the participants' judgements were. This cross-modal numerosity distance effect suggests a direct connection between tactile sensations and the concept of numerical magnitude. A second experiment replicated the interaction between symbolic and tactile numerosities and showed that this effect is not modulated by the participants' finger counting habits. Taken together, our data provide novel evidence for a shared metric for symbolic and tactile numerosities as an instance of an embodied representation of numbers.

  12. Clustering by Partitioning around Medoids using Distance-Based ...

    African Journals Online (AJOL)

    OLUWASOGO

    outperforms both the Euclidean and Manhattan distance metrics in certain situations. KEYWORDS: PAM ... version of a dataset, compare the quality of clusters obtained from the Euclidean .... B. Theoretical Framework and Methodology.

  13. Performance evaluation of a distance learning program.

    OpenAIRE

    Dailey, D. J.; Eno, K. R.; Brinkley, J. F.

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macint...

  14. A Simple Metric for Determining Resolution in Optical, Ion, and Electron Microscope Images.

    Science.gov (United States)

    Curtin, Alexandra E; Skinner, Ryan; Sanders, Aric W

    2015-06-01

    A resolution metric intended for resolution analysis of arbitrary spatially calibrated images is presented. By fitting a simple sigmoidal function to pixel intensities across slices of an image taken perpendicular to light-dark edges, the mean distance over which the light-dark transition occurs can be determined. A fixed multiple of this characteristic distance is then reported as the image resolution. The prefactor is determined by analysis of scanning transmission electron microscope high-angle annular dark field images of Si. This metric has been applied to optical, scanning electron microscope, and helium ion microscope images. This method provides quantitative feedback about image resolution, independent of the tool on which the data were collected. In addition, our analysis provides a nonarbitrary and self-consistent framework that any end user can utilize to evaluate the resolution of multiple microscopes from any vendor using the same metric.
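
    The calibrated prefactor and the exact fitting procedure are described in the paper and are not reproduced here; as a hedged sketch of the general idea (fitting a sigmoid across a light-dark edge and reporting a multiple of the fitted transition width), with synthetic data and an arbitrary placeholder prefactor:

```python
# Minimal sketch: estimate image resolution by fitting a sigmoid to an edge profile.
import numpy as np
from scipy.optimize import curve_fit

def edge_sigmoid(x, low, high, x0, width):
    """Logistic edge profile: 'width' characterizes the light-dark transition distance."""
    return low + (high - low) / (1.0 + np.exp(-(x - x0) / width))

# Toy edge profile: calibrated positions (arbitrary length units) and noisy intensities
x = np.linspace(0, 100, 101)
clean = edge_sigmoid(x, 10, 200, 50, 4.0)
intensity = clean + np.random.default_rng(0).normal(scale=2.0, size=x.size)

popt, _ = curve_fit(edge_sigmoid, x, intensity,
                    p0=[intensity.min(), intensity.max(), 50, 5])
prefactor = 3.0   # placeholder multiple; the paper calibrates this on STEM HAADF images
print("estimated resolution:", prefactor * abs(popt[3]))
```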

  15. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the number of influence indexes of collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering). We employ post-earthquake collapse data relating to construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back-substitution estimation method, showing high accuracy and no errors, and matching the prediction results of the uncertainty measure method. Results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.

  16. Distance metric learning for complex networks: Towards size-independent comparison of network structures

    Science.gov (United States)

    Aliakbary, Sadegh; Motallebi, Sadegh; Rashidian, Sina; Habibi, Jafar; Movaghar, Ali

    2015-02-01

    Real networks show nontrivial topological properties such as community structure and long-tail degree distribution. Moreover, many network analysis applications are based on topological comparison of complex networks. Classification and clustering of networks, model selection, and anomaly detection are just some applications of network comparison. In these applications, an effective similarity metric is needed which, given two complex networks of possibly different sizes, evaluates the amount of similarity between the structural features of the two networks. Traditional graph comparison approaches, such as isomorphism-based methods, are not only too time consuming but also inappropriate to compare networks with different sizes. In this paper, we propose an intelligent method based on the genetic algorithms for integrating, selecting, and weighting the network features in order to develop an effective similarity measure for complex networks. The proposed similarity metric outperforms state of the art methods with respect to different evaluation criteria.

  17. Natural metrics and least-committed priors for articulated tracking

    DEFF Research Database (Denmark)

    Hauberg, Søren; Sommer, Stefan Horst; Pedersen, Kim Steenstrup

    2012-01-01

    of joint positions, which is embedded in a high dimensional Euclidean space. This Riemannian manifold inherits the metric from the embedding space, such that distances are measured as the combined physical length that joints travel during movements. We then develop a least-committed Brownian motion model...

  18. Converging from branching to linear metrics on Markov chains

    DEFF Research Database (Denmark)

    Bacci, Giorgio; Bacci, Giovanni; Larsen, Kim G.

    2017-01-01

    -approximant is computable in polynomial time in the size of the MC. The upper-approximants are bisimilarity-like pseudometrics (hence, branching-time distances) that converge point-wise to the linear-time metrics. This convergence is interesting in itself, because it reveals a nontrivial relation between branching...

  19. A Single Conjunction Risk Assessment Metric: the F-Value

    Science.gov (United States)

    Frigm, Ryan Clayton; Newman, Lauri K.

    2009-01-01

    The Conjunction Assessment Team at NASA Goddard Space Flight Center provides conjunction risk assessment for many NASA robotic missions. These risk assessments are based on several figures of merit, such as miss distance, probability of collision, and orbit determination solution quality. However, these individual metrics do not singly capture the overall risk associated with a conjunction, making it difficult for someone without this complete understanding to take action, such as an avoidance maneuver. The goal of this analysis is to introduce a single risk index metric that can easily convey the level of risk without all of the technical details. The proposed index is called the conjunction "F-value." This paper presents the concept of the F-value and the tuning of the metric for use in routine Conjunction Assessment operations.

  20. Prioritizing Urban Habitats for Connectivity Conservation: Integrating Centrality and Ecological Metrics.

    Science.gov (United States)

    Poodat, Fatemeh; Arrowsmith, Colin; Fraser, David; Gordon, Ascelin

    2015-09-01

    Connectivity among fragmented areas of habitat has long been acknowledged as important for the viability of biological conservation, especially within highly modified landscapes. Identifying important habitat patches for ecological connectivity is a priority for many conservation strategies, and the application of graph theory has been shown to provide useful information on connectivity. Despite the large number of connectivity metrics derived from graph theory, only a small number have been compared in terms of the importance they assign to nodes in a network. This paper presents a study that aims to define a new set of metrics and compares these with traditional graph-based metrics used in the prioritization of habitat patches for ecological connectivity. The metrics measured consist of topological metrics, ecological metrics, and integrated metrics; integrated metrics are a combination of topological and ecological metrics. Eight metrics were applied to the habitat network for the fat-tailed dunnart within Greater Melbourne, Australia. A non-directional network was developed in which nodes were linked to adjacent nodes. These links were then weighted by the effective distance between patches. By applying each of the eight metrics to the study network, nodes were ranked according to their contribution to the overall network connectivity. The structured comparison revealed the similarities and differences in the way habitat for the fat-tailed dunnart was ranked based on different classes of metrics. Due to the differences in the way the metrics operate, a suitable metric should be chosen that best meets the objectives established by the decision maker.
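
    The paper's own metric set is not reproduced here; as a hedged sketch of the general approach (ranking patches in a distance-weighted patch network by standard graph metrics), using the networkx library and an entirely hypothetical toy network:

```python
# Minimal sketch: rank habitat patches by degree and betweenness centrality on a
# patch network whose edges are weighted by effective distance.
import networkx as nx

G = nx.Graph()
# Toy patch network: (patch_a, patch_b, effective distance in km) -- values hypothetical
edges = [("A", "B", 1.2), ("B", "C", 0.8), ("C", "D", 2.5), ("B", "D", 3.0), ("D", "E", 0.6)]
G.add_weighted_edges_from(edges, weight="distance")

degree_rank = sorted(G.degree(), key=lambda kv: kv[1], reverse=True)
betweenness = nx.betweenness_centrality(G, weight="distance")
betweenness_rank = sorted(betweenness.items(), key=lambda kv: kv[1], reverse=True)

print("degree ranking:", degree_rank)
print("betweenness ranking:", betweenness_rank)
```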

  1. Quasi-metrics, midpoints and applications

    Energy Technology Data Exchange (ETDEWEB)

    Valero, O.

    2017-07-01

    In applied sciences, the scientific community simultaneously uses different kinds of information coming from several sources in order to infer a conclusion or working decision. In the literature there are many techniques for merging the information and providing, hence, meaningful fused data. In most practical cases such fusion methods are based on aggregation operators on some numerical values, i.e. the aim of the fusion process is to obtain a representative number from a finite sequence of numerical data. In the aforementioned cases, the input data presents some kind of imprecision and for this reason it is represented as fuzzy sets. Moreover, in such problems comparisons between the numerical values that represent the information described by the fuzzy sets become necessary. These comparisons are made by means of a distance defined on fuzzy sets. Thus, numerical operators aggregating distances between fuzzy sets as incoming data play a central role in applied problems. Recently, J.J. Nieto and A. Torres gave some applications of the aggregation of distances on fuzzy sets to the study of real medical data [Nieto]. These applications are based on the notion of a segment joining two given fuzzy sets and on the notion of the set of midpoints between fuzzy sets. A few results obtained by Nieto and Torres have in turn been generalized by Casasnovas and Rosselló [Casas, Casas2]. Nowadays, quasi-metrics provide efficient tools in some fields of computer science and in bioinformatics. Motivated by these facts, a study of segments joining two fuzzy sets and of midpoints between fuzzy sets, when the measure used for comparisons is a quasi-metric, has been made in [Casas3, SebVal2013, TiradoValero]. (Author)

  2. Improved Iris Recognition through Fusion of Hamming Distance and Fragile Bit Distance.

    Science.gov (United States)

    Hollingsworth, Karen P; Bowyer, Kevin W; Flynn, Patrick J

    2011-12-01

    The most common iris biometric algorithm represents the texture of an iris using a binary iris code. Not all bits in an iris code are equally consistent. A bit is deemed fragile if its value changes across iris codes created from different images of the same iris. Previous research has shown that iris recognition performance can be improved by masking these fragile bits. Rather than ignoring fragile bits completely, we consider what beneficial information can be obtained from the fragile bits. We find that the locations of fragile bits tend to be consistent across different iris codes of the same eye. We present a metric, called the fragile bit distance, which quantitatively measures the coincidence of the fragile bit patterns in two iris codes. We find that score fusion of fragile bit distance and Hamming distance works better for recognition than Hamming distance alone. To our knowledge, this is the first and only work to use the coincidence of fragile bit locations to improve the accuracy of matches.
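
    The fusion rule actually used by the authors is not reproduced here; as a hedged sketch of the two distances being combined, on toy iris codes and fragile-bit masks, with illustrative (not published) fusion weights:

```python
# Minimal sketch: Hamming distance plus a fragile-bit-coincidence distance for two iris
# codes, combined by a simple weighted score fusion.
import numpy as np

def hamming_distance(code_a, code_b, mask):
    """Fraction of unmasked bit positions in which the two codes disagree."""
    valid = mask.astype(bool)
    return float(np.count_nonzero(code_a[valid] != code_b[valid]) / np.count_nonzero(valid))

def fragile_bit_distance(fragile_a, fragile_b):
    """One minus the fraction of positions whose fragile/consistent status coincides."""
    return float(np.count_nonzero(fragile_a != fragile_b) / fragile_a.size)

rng = np.random.default_rng(0)
a, b = rng.integers(0, 2, 2048), rng.integers(0, 2, 2048)      # toy iris codes
fa, fb = rng.integers(0, 2, 2048), rng.integers(0, 2, 2048)    # toy fragile-bit masks
mask = np.ones(2048, dtype=int)                                # toy occlusion mask (all valid)

fused = 0.7 * hamming_distance(a, b, mask) + 0.3 * fragile_bit_distance(fa, fb)
print(fused)
```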

  3. Defining functional distances over Gene Ontology

    Directory of Open Access Journals (Sweden)

    del Pozo Angela

    2008-01-01

    Full Text Available Background: A fundamental problem when trying to define the functional relationships between proteins is the difficulty in quantifying functional similarities, even when well-structured ontologies exist regarding the activity of proteins (i.e., the Gene Ontology, GO). However, functional metrics can overcome the problems in comparing and evaluating functional assignments and predictions. As a reference of proximity, previous approaches to compare GO terms considered linkage in terms of ontology weighted by a probability distribution that balances the non-uniform 'richness' of different parts of the Directed Acyclic Graph. Here, we have followed a different approach to quantify functional similarities between GO terms. Results: We propose a new method to derive 'functional distances' between GO terms that is based on the simultaneous occurrence of terms in the same set of InterPro entries, instead of relying on the structure of the GO. The coincidence of GO terms reveals natural biological links between the GO functions and defines a distance model Df which fulfils the properties of a metric space. The distances obtained in this way can be represented as a hierarchical 'Functional Tree'. Conclusion: The method proposed provides a new definition of distance that enables the similarity between GO terms to be quantified. Additionally, the 'Functional Tree' defines groups with biological meaning, enhancing its utility for protein function comparison and prediction. Finally, this approach could be used for function-based protein searches in databases, and for analysing the gene clusters produced by DNA array experiments.

  4. Mahalanobis distance screening of Arabidopsis mutants with chlorophyll fluorescence

    Czech Academy of Sciences Publication Activity Database

    Codrea, C. C.; Hakala-Yatkin, M.; Karlund-Marttila, A.; Nedbal, Ladislav; Aittokallio, T.; Nevalainen, O. S.; Tyystjärvi, E.

    2010-01-01

    Roč. 105, č. 3 (2010), s. 273-283 ISSN 0166-8595 Institutional research plan: CEZ:AV0Z60870520 Keywords : arabidopsis thaliana * chlorophyll fluorescence * fluorescence imaging * mutant detection * outlier detection Subject RIV: EH - Ecology, Behaviour Impact factor: 2.410, year: 2010 http://www.springerlink.com/content/x3586512462pn006/

  5. Phylo_dCor: distance correlation as a novel metric for phylogenetic profiling.

    Science.gov (United States)

    Sferra, Gabriella; Fratini, Federica; Ponzi, Marta; Pizzi, Elisabetta

    2017-09-05

    Elaboration of powerful methods to predict functional and/or physical protein-protein interactions from genome sequence is one of the main tasks in the post-genomic era. Phylogenetic profiling allows the prediction of protein-protein interactions at a whole-genome level in both Prokaryotes and Eukaryotes, and for this reason it is considered one of the most promising methods. Here, we propose an improvement of phylogenetic profiling that enables the handling of large genomic datasets and the inference of global protein-protein interactions. This method uses the distance correlation as a new measure of phylogenetic profile similarity. We constructed robust reference sets and developed Phylo-dCor, a parallelized version of the algorithm for calculating the distance correlation that makes it applicable to large genomic data. Using Saccharomyces cerevisiae and Escherichia coli genome datasets, we showed that Phylo-dCor outperforms phylogenetic profiling methods previously described that are based on mutual information and Pearson's correlation as measures of profile similarity. In this work, we constructed and assessed robust reference sets and propose the distance correlation as a measure for comparing phylogenetic profiles. To make it applicable to large genomic data, we developed Phylo-dCor, a parallelized version of the algorithm for calculating the distance correlation. Two R scripts that can be run on a wide range of machines are available upon request.
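
    Phylo-dCor itself is distributed as R scripts; purely to illustrate the underlying measure, here is a minimal Python sketch of the sample distance correlation between two toy phylogenetic profiles (the profiles are hypothetical, and the paper's parallelized implementation is not reproduced):

```python
# Minimal sketch: sample distance correlation between two 1-D phylogenetic profiles
# (vectors of presence/absence or scores across a set of genomes).
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation between two 1-D profiles of equal length."""
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[:, None]
    a = np.abs(x - x.T)                                   # pairwise distance matrices
    b = np.abs(y - y.T)
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()     # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    return float(np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y)))

profile_1 = [1, 0, 1, 1, 0, 1, 0, 1]   # toy presence/absence profiles across 8 genomes
profile_2 = [1, 0, 1, 0, 0, 1, 0, 1]
print(distance_correlation(profile_1, profile_2))
```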

  6. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover’s distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However

  7. Is Time the Best Metric to Measure Carbon-Related Climate Change Potential and Tune the Economy Toward Reduced Fossil Carbon Extraction?

    Science.gov (United States)

    DeGroff, F. A.

    2016-12-01

    Anthropogenic changes to non-anthropogenic carbon fluxes are a primary driver of climate change. There currently exists no comprehensive metric to measure and value anthropogenic changes in carbon flux between all states of carbon. Focusing on atmospheric carbon emissions as a measure of anthropogenic activity on the environment ignores the fungible characteristics of carbon that are crucial in both the biosphere and the worldwide economy. Focusing on a single form of inorganic carbon as a proxy metric for the plethora of anthropogenic activity and carbon compounds will prove inadequate, convoluted, and unmanageable. A broader, more basic metric is needed to capture the entirety of carbon activity, particularly in an economic, profit-driven environment. We propose a new metric to measure changes in the temporal distance of any form or state of carbon from one state to another. Such a metric would be especially useful to measure the temporal distance of carbon from sinks such as the atmosphere or oceans. The effect of changes in carbon flux as a result of any human activity can be measured by the difference between the anthropogenic and non-anthropogenic temporal distance. The change in the temporal distance is a measure of the climate change potential much like voltage is a measure of electrical potential. The integral of the climate change potential is proportional to the anthropogenic climate change. We also propose a logarithmic vector scale for carbon quality, cq, as a measure of anthropogenic changes in carbon flux. The distance between the cq vector starting and ending temporal distances represents the change in cq. A base-10 logarithmic scale would allow the addition and subtraction of exponents to calculate changes in cq. As anthropogenic activity changes the temporal distance of carbon, the change in cq is measured as: cq = β ( log10 [mean carbon temporal distance] ) where β represents the carbon price coefficient for a particular country. For any

  8. SU-G-BRB-16: Vulnerabilities in the Gamma Metric

    International Nuclear Information System (INIS)

    Neal, B; Siebers, J

    2016-01-01

    Purpose: To explore vulnerabilities in the gamma index metric that undermine its wide use as a radiation therapy quality assurance tool. Methods: 2D test field pairs (images) are created specifically to achieve high gamma passing rates, but to also include gross errors by exploiting the distance-to-agreement and percent-passing components of the metric. The first set has no requirement of clinical practicality, but is intended to expose vulnerabilities. The second set exposes clinically realistic vulnerabilities. To circumvent limitations inherent to user-specific tuning of prediction algorithms to match measurements, digital test cases are manually constructed, thereby mimicking high-quality image prediction. Results: With a 3 mm distance-to-agreement metric, changing field size by ±6 mm results in a gamma passing rate over 99%. For a uniform field, a lattice of passing points spaced 5 mm apart results in a passing rate of 100%. Exploiting the percent-passing component, a 10×10 cm² field can have a 95% passing rate when an 8 cm² = 2.8×2.8 cm² highly out-of-tolerance (e.g. zero dose) square is missing from the comparison image. For clinically realistic vulnerabilities, an arc plan for which a 2D image is created can have a >95% passing rate solely due to agreement in the lateral spillage, with the failing 5% in the critical target region. A field with an integrated boost (e.g. whole brain plus small metastases) could neglect the metastases entirely, yet still pass with a 95% threshold. All the failure modes described would be visually apparent on a gamma-map image. Conclusion: The %gamma<1 metric has significant vulnerabilities. High passing rates can obscure critical faults in hypothetical and delivered radiation doses. Great caution should be used with gamma as a QA metric; users should inspect the gamma-map. Visual analysis of gamma-maps may be impractical for cine acquisition.
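
    The sketch below computes a brute-force global 2-D gamma passing rate (3%/3 mm by default) and reproduces the kind of blind spot described above: a small zero-dose square in an otherwise uniform evaluated field can still pass, because nearby in-tolerance points satisfy the distance-to-agreement search. Array size, pixel spacing, and criteria are illustrative assumptions.

```python
import numpy as np

def gamma_passing_rate(ref, evl, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
    """Brute-force global 2-D gamma (dose-difference dd, distance-to-agreement dta)."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    dmax = ref.max()
    gamma = np.empty_like(ref, dtype=float)
    for i in range(ny):
        for j in range(nx):
            dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
            dose2 = ((evl - ref[i, j]) / (dd * dmax)) ** 2
            gamma[i, j] = np.sqrt((dist2 / dta_mm ** 2 + dose2).min())
    return (gamma <= 1.0).mean()

# Uniform reference field vs. an evaluated field missing a small zero-dose square:
ref = np.ones((40, 40))
evl = ref.copy()
evl[15:20, 15:20] = 0.0          # gross local error, still largely "passes"
print(f"passing rate: {gamma_passing_rate(ref, evl):.3f}")
```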

  9. SU-G-BRB-16: Vulnerabilities in the Gamma Metric

    Energy Technology Data Exchange (ETDEWEB)

    Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To explore vulnerabilities in the gamma index metric that undermine its wide use as a radiation therapy quality assurance tool. Methods: 2D test field pairs (images) are created specifically to achieve high gamma passing rates, but to also include gross errors by exploiting the distance-to-agreement and percent-passing components of the metric. The first set has no requirement of clinical practicality, but is intended to expose vulnerabilities. The second set exposes clinically realistic vulnerabilities. To circumvent limitations inherent to user-specific tuning of prediction algorithms to match measurements, digital test cases are manually constructed, thereby mimicking high-quality image prediction. Results: With a 3 mm distance-to-agreement metric, changing field size by ±6 mm results in a gamma passing rate over 99%. For a uniform field, a lattice of passing points spaced 5 mm apart results in a passing rate of 100%. Exploiting the percent-passing component, a 10×10 cm² field can have a 95% passing rate when an 8 cm² = 2.8×2.8 cm² highly out-of-tolerance (e.g. zero dose) square is missing from the comparison image. For clinically realistic vulnerabilities, an arc plan for which a 2D image is created can have a >95% passing rate solely due to agreement in the lateral spillage, with the failing 5% in the critical target region. A field with an integrated boost (e.g. whole brain plus small metastases) could neglect the metastases entirely, yet still pass with a 95% threshold. All the failure modes described would be visually apparent on a gamma-map image. Conclusion: The %gamma<1 metric has significant vulnerabilities. High passing rates can obscure critical faults in hypothetical and delivered radiation doses. Great caution should be used with gamma as a QA metric; users should inspect the gamma-map. Visual analysis of gamma-maps may be impractical for cine acquisition.

  10. Permutation-invariant distance between atomic configurations

    Science.gov (United States)

    Ferré, Grégoire; Maillet, Jean-Bernard; Stoltz, Gabriel

    2015-09-01

    We present a permutation-invariant distance between atomic configurations, defined through a functional representation of atomic positions. This distance enables us to directly compare different atomic environments with an arbitrary number of particles, without going through a space of reduced dimensionality (i.e., fingerprints) as an intermediate step. Moreover, this distance is naturally invariant through permutations of atoms, avoiding the time consuming associated minimization required by other common criteria (like the root mean square distance). Finally, the invariance through global rotations is accounted for by a minimization procedure in the space of rotations solved by Monte Carlo simulated annealing. A formal framework is also introduced, showing that the distance we propose verifies the property of a metric on the space of atomic configurations. Two examples of applications are proposed. The first one consists in evaluating faithfulness of some fingerprints (or descriptors), i.e., their capacity to represent the structural information of a configuration. The second application concerns structural analysis, where our distance proves to be efficient in discriminating different local structures and even classifying their degree of similarity.
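
    A toy version of the idea (not the authors' exact functional representation): each configuration is mapped to a sum-of-Gaussians density sampled on a grid, and configurations are compared by the L2 distance between densities, which is invariant to atom permutations by construction; the rotation minimization discussed above is omitted. The grid, box size and Gaussian width are assumptions.

```python
import numpy as np

def density_field(positions, grid, sigma=0.5):
    """Sum-of-Gaussians representation of a configuration; invariant to atom order."""
    diff = grid[:, None, :] - positions[None, :, :]          # (G, N, 3)
    return np.exp(-np.sum(diff ** 2, axis=-1) / (2 * sigma ** 2)).sum(axis=1)

def configuration_distance(pos_a, pos_b, grid, sigma=0.5):
    """L2 distance between the two density fields (global rotations not handled here)."""
    return np.linalg.norm(density_field(pos_a, grid, sigma) - density_field(pos_b, grid, sigma))

rng = np.random.default_rng(8)
grid = np.stack(np.meshgrid(*[np.linspace(0, 4, 9)] * 3, indexing="ij"), -1).reshape(-1, 3)
a = rng.uniform(0, 4, size=(5, 3))
b = a[rng.permutation(5)] + rng.normal(scale=0.05, size=(5, 3))  # permuted + jittered copy
print(configuration_distance(a, a[rng.permutation(5)], grid))    # ~0: permutation invariant
print(configuration_distance(a, b, grid))                        # small but nonzero
```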

  11. Permutation-invariant distance between atomic configurations

    International Nuclear Information System (INIS)

    Ferré, Grégoire; Maillet, Jean-Bernard; Stoltz, Gabriel

    2015-01-01

    We present a permutation-invariant distance between atomic configurations, defined through a functional representation of atomic positions. This distance enables us to directly compare different atomic environments with an arbitrary number of particles, without going through a space of reduced dimensionality (i.e., fingerprints) as an intermediate step. Moreover, this distance is naturally invariant through permutations of atoms, avoiding the time consuming associated minimization required by other common criteria (like the root mean square distance). Finally, the invariance through global rotations is accounted for by a minimization procedure in the space of rotations solved by Monte Carlo simulated annealing. A formal framework is also introduced, showing that the distance we propose verifies the property of a metric on the space of atomic configurations. Two examples of applications are proposed. The first one consists in evaluating faithfulness of some fingerprints (or descriptors), i.e., their capacity to represent the structural information of a configuration. The second application concerns structural analysis, where our distance proves to be efficient in discriminating different local structures and even classifying their degree of similarity

  12. A comparison of linear approaches to filter out environmental effects in structural health monitoring

    Science.gov (United States)

    Deraemaeker, A.; Worden, K.

    2018-05-01

    This paper discusses the possibility of using the Mahalanobis squared-distance to perform robust novelty detection in the presence of important environmental variability in a multivariate feature vector. By performing an eigenvalue decomposition of the covariance matrix used to compute that distance, it is shown that the Mahalanobis squared-distance can be written as the sum of independent terms which result from a transformation from the feature vector space to a space of independent variables. In general, especially when the size of the features vector is large, there are dominant eigenvalues and eigenvectors associated with the covariance matrix, so that a set of principal components can be defined. Because the associated eigenvalues are high, their contribution to the Mahalanobis squared-distance is low, while the contribution of the other components is high due to the low value of the associated eigenvalues. This analysis shows that the Mahalanobis distance naturally filters out the variability in the training data. This property can be used to remove the effect of the environment in damage detection, in much the same way as two other established techniques, principal component analysis and factor analysis. The three techniques are compared here using real experimental data from a wooden bridge for which the feature vector consists in eigenfrequencies and modeshapes collected under changing environmental conditions, as well as damaged conditions simulated with an added mass. The results confirm the similarity between the three techniques and the ability to filter out environmental effects, while keeping a high sensitivity to structural changes. The results also show that even after filtering out the environmental effects, the normality assumption cannot be made for the residual feature vector. An alternative is demonstrated here based on extreme value statistics which results in a much better threshold which avoids false positives in the training data, while
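
    A minimal sketch of the decomposition described above, assuming synthetic training data: the Mahalanobis squared-distance is written as a sum of independent components obtained from the eigen-decomposition of the training covariance, and a residual index that drops the dominant (environment-like) eigen-directions is computed alongside it. The data, dimensions, and the choice of two dominant components are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training features: 200 baseline observations of a 6-D feature vector whose first
# two latent directions carry strong "environmental" variability.
n, d = 200, 6
latent = rng.normal(size=(n, 2)) @ rng.normal(size=(2, d)) * 3.0
X = latent + rng.normal(scale=0.3, size=(n, d))

mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)
evals, evecs = np.linalg.eigh(cov)          # eigenvalues in ascending order

def mahalanobis_sq(x):
    """Mahalanobis squared distance as a sum of independent, scaled components."""
    z = evecs.T @ (x - mu)
    return np.sum(z ** 2 / evals)

def residual_index(x, n_dominant=2):
    """Contribution of the low-variance components only (environment filtered out)."""
    z = evecs.T @ (x - mu)
    keep = slice(0, len(evals) - n_dominant)   # drop the dominant eigen-directions
    return np.sum(z[keep] ** 2 / evals[keep])

x_new = X[0] + np.array([0, 0, 0, 0.5, 0, 0])  # small structural-like change
print(mahalanobis_sq(x_new), residual_index(x_new))
```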

  13. Costing improvement of remanufacturing crankshaft by integrating Mahalanobis-Taguchi System and Activity based Costing

    Science.gov (United States)

    Abu, M. Y.; Nor, E. E. Mohd; Rahman, M. S. Abd

    2018-04-01

    Integration between the quality and costing systems is crucial in order to achieve an accurate product cost and profit. In current practice, most remanufacturers still lack optimization during the remanufacturing process, which leads to the wrong variables being fed into the costing system. Meanwhile, the traditional cost accounting being practised distorts the unit cost, which leads to an inaccurate product cost. The aim of this work is to identify the critical and non-critical variables during the remanufacturing process using the Mahalanobis-Taguchi System and simultaneously estimate the cost using the Activity Based Costing method. An orthogonal array was applied to indicate the contribution of the variables in the factorial effect graph, and the critical variables were considered together with the overhead costs of the activities they actually demand. This work improves quality inspection together with the costing system to produce accurate profitability information. As a result, the cost per unit of a remanufactured crankshaft for the MAN engine model with 5 critical crankpins is MYR609.50, while for the Detroit engine model with 4 critical crankpins it is MYR1254.80. The significance of the output is demonstrated through promoting green practice by reducing the re-melting of damaged parts, ensuring a consistent benefit from returned cores.

  14. Monoparametric family of metrics derived from classical Jensen-Shannon divergence

    Science.gov (United States)

    Osán, Tristán M.; Bussandri, Diego G.; Lamberti, Pedro W.

    2018-04-01

    Jensen-Shannon divergence is a well known multi-purpose measure of dissimilarity between probability distributions. It has been proven that the square root of this quantity is a true metric in the sense that, in addition to the basic properties of a distance, it also satisfies the triangle inequality. In this work we extend this last result to prove that in fact it is possible to derive a monoparametric family of metrics from the classical Jensen-Shannon divergence. Motivated by our results, an application into the field of symbolic sequences segmentation is explored. Additionally, we analyze the possibility to extend this result into the quantum realm.
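
    A small sketch of the baseline fact the paper extends: the square root of the classical Jensen-Shannon divergence behaves as a metric, illustrated here with a numerical triangle-inequality check on made-up distributions (the monoparametric family itself is not implemented).

```python
import numpy as np

def jensen_shannon_distance(p, q, base=2):
    """Square root of the Jensen-Shannon divergence (a true metric on distributions)."""
    p = np.asarray(p, float); p = p / p.sum()
    q = np.asarray(q, float); q = q / q.sum()
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask])) / np.log(base)
    jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)
    return np.sqrt(jsd)

p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]
r = [0.6, 0.2, 0.2]
# Triangle inequality check for the metric property:
print(jensen_shannon_distance(p, r) <= jensen_shannon_distance(p, q) + jensen_shannon_distance(q, r))
```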

  15. Validation of network communicability metrics for the analysis of brain structural networks.

    Directory of Open Access Journals (Sweden)

    Jennifer Andreotti

    Full Text Available Computational network analysis provides new methods to analyze the brain's structural organization based on diffusion imaging tractography data. Networks are characterized by global and local metrics that have recently given promising insights into diagnosis and the further understanding of psychiatric and neurologic disorders. Most of these metrics are based on the idea that information in a network flows along the shortest paths. In contrast to this notion, communicability is a broader measure of connectivity which assumes that information could flow along all possible paths between two nodes. In our work, the features of network metrics related to communicability were explored for the first time in the healthy structural brain network. In addition, the sensitivity of such metrics was analysed using simulated lesions to specific nodes and network connections. Results showed advantages of communicability over conventional metrics in detecting densely connected nodes as well as subsets of nodes vulnerable to lesions. In addition, communicability centrality was shown to be widely affected by the lesions and the changes were negatively correlated with the distance from the lesion site. In summary, our analysis suggests that communicability metrics may provide an insight into the integrative properties of the structural brain network and that these metrics may be useful for the analysis of brain networks in the presence of lesions. Nevertheless, the interpretation of communicability is not straightforward; hence these metrics should be used as a supplement to the more standard connectivity network metrics.

  16. Estimation and Mapping Forest Attributes Using “k Nearest Neighbor” Method on IRS-P6 LISS III Satellite Image Data

    Directory of Open Access Journals (Sweden)

    Amir Eslam Bonyad

    2015-06-01

    Full Text Available In this study, we explored the utility of the k Nearest Neighbor (kNN) algorithm to integrate IRS-P6 LISS III satellite imagery data and ground inventory data for forest attribute (DBH, tree height, volume, basal area, density) and forest cover type estimation and mapping. The ground inventory data were based on a systematic-random sampling grid, with 408 circular sampling plots in a plantation in Guilan province, north of Iran. We concluded that the kNN method was a useful tool for mapping, with accuracies between 80% and 93.94%. Values of k between 5 and 8 seemed appropriate. The best distance metrics were found to be Euclidean, Fuzzy and Mahalanobis. Results showed that kNN was accurate enough to be practically applicable for mapping forest areas.
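
    A rough sketch of kNN imputation with a Mahalanobis distance in spectral space, in the spirit of the study above; the band values, plot count and target attribute below are simulated stand-ins, not the IRS-P6 data.

```python
import numpy as np

def knn_estimate(spectra, targets, query, k=5, metric="mahalanobis"):
    """Estimate a forest attribute at a query pixel as the mean of its k nearest plots."""
    if metric == "mahalanobis":
        vi = np.linalg.inv(np.cov(spectra, rowvar=False))
        diff = spectra - query
        d = np.sqrt(np.einsum("ij,jk,ik->i", diff, vi, diff))
    else:  # Euclidean
        d = np.linalg.norm(spectra - query, axis=1)
    nearest = np.argsort(d)[:k]
    return targets[nearest].mean()

rng = np.random.default_rng(1)
plots = rng.normal(size=(408, 3))                               # 408 plots, 3 spectral bands (toy values)
volume = 100 + 20 * plots[:, 0] + rng.normal(scale=5, size=408)  # stand volume per plot
pixel = rng.normal(size=3)
print(knn_estimate(plots, volume, pixel, k=5))
```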

  17. The estimation of genetic distance and discriminant variables on breed of duck (Alabio, Bali, Khaki Campbell, Mojosari and Pegagan by morphological analysis

    Directory of Open Access Journals (Sweden)

    B Brahmantiyo

    2003-03-01

    Full Text Available A study on the morphological body conformation of Alabio, Bali, Khaki Campbell, Mojosari and Pegagan ducks was carried out to determine the genetic distance and discriminant variables. This research was conducted at the Research Institute for Animal Production, Ciawi, Bogor using 65 Alabio ducks, 40 Bali ducks, 36 Khaki Campbell ducks, 60 Mojosari ducks and 30 Pegagan ducks. Seven different body parts were measured: the length of the femur, tibia and tarsometatarsus, the circumference of the tarsometatarsus, and the length of the third digit, wing and maxilla. General Linear Models and simple discriminant analysis were used in this study (SAS package program). Male and female Pegagan ducks were morphologically larger than Alabio, Bali, Khaki Campbell and Mojosari ducks. Khaki Campbell ducks were mixed with Bali ducks (47.22%), and Pegagan ducks from an isolated location in South Sumatera were slightly mixed with Alabio and Bali. The Mahalanobis genetic distance showed that Bali and Khaki Campbell ducks, and also Alabio and Mojosari ducks, were similar, with genetic distances of 1.420 and 1.548, respectively. Results from the canonical analysis showed that the most discriminant variables were the lengths of the femur, tibia and third digit.
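
    A minimal sketch of a squared Mahalanobis distance between two breed means using a pooled covariance matrix, as typically done in this kind of morphometric study; the measurements below are simulated placeholders, not the published duck data.

```python
import numpy as np

def mahalanobis_between_groups(a, b):
    """Squared Mahalanobis distance between two group means, pooled covariance."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled = ((na - 1) * np.cov(a, rowvar=False) + (nb - 1) * np.cov(b, rowvar=False)) / (na + nb - 2)
    diff = a.mean(axis=0) - b.mean(axis=0)
    return float(diff @ np.linalg.inv(pooled) @ diff)

rng = np.random.default_rng(2)
# Toy morphometric measurements (femur, tibia, wing length) for two duck breeds.
breed_a = rng.normal([8.0, 11.5, 25.0], 0.4, size=(40, 3))
breed_b = rng.normal([8.3, 11.9, 25.6], 0.4, size=(36, 3))
print(mahalanobis_between_groups(breed_a, breed_b))
```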

  18. Open Problem: Kernel methods on manifolds and metric spaces

    DEFF Research Database (Denmark)

    Feragen, Aasa; Hauberg, Søren

    2016-01-01

    Radial kernels are well-suited for machine learning over general geodesic metric spaces, where pairwise distances are often the only computable quantity available. We have recently shown that geodesic exponential kernels are only positive definite for all bandwidths when the input space has strong...... linear properties. This negative result hints that radial kernel are perhaps not suitable over geodesic metric spaces after all. Here, however, we present evidence that large intervals of bandwidths exist where geodesic exponential kernels have high probability of being positive definite over finite...... datasets, while still having significant predictive power. From this we formulate conjectures on the probability of a positive definite kernel matrix for a finite random sample, depending on the geometry of the data space and the spread of the sample....

  19. Quantum metric spaces as a model for pregeometry

    International Nuclear Information System (INIS)

    Alvarez, E.; Cespedes, J.; Verdaguer, E.

    1992-01-01

    A new arena for the dynamics of spacetime is proposed, in which the basic quantum variable is the two-point distance on a metric space. The scaling dimension (that is, the Kolmogorov capacity) in the neighborhood of each point then defines in a natural way a local concept of dimension. We study our model in the region of parameter space in which the resulting spacetime is not too different from a smooth manifold

  20. Integration of Mahalanobis-Taguchi system and traditional cost accounting for remanufacturing crankshaft

    Science.gov (United States)

    Abu, M. Y.; Norizan, N. S.; Rahman, M. S. Abd

    2018-04-01

    Remanufacturing is a strategic sustainability approach that transforms an end-of-life product to as-new performance, with a warranty equal to or better than that of the original product. In order to quantify the advantages of this strategy, all processes must be optimized to reach the ultimate goal and reduce the waste generated. The aim of this work is to evaluate the criticality of parameters of the end-of-life crankshaft based on Taguchi’s orthogonal array, and then to estimate the cost using traditional cost accounting considering the critical parameters. By implementing the optimization, the remanufacturer produces lower cost and less waste during production, with higher potential to gain profit. The Mahalanobis-Taguchi System was proven to be a powerful optimization method that reveals the criticality of parameters. When the method was applied to the MAN engine model, 5 out of 6 crankpins were critical and needed the grinding process, while no changes occurred for the Caterpillar engine model. Accordingly, the cost per unit for the MAN engine model changed from MYR1401.29 to MYR1251.29, while the Caterpillar engine model showed no change because the criticality of its parameters did not change. Therefore, by integrating optimization and costing throughout the remanufacturing process, a better decision can be reached after observing the potential profit to be gained. The significance of the output is demonstrated through promoting sustainability by reducing the re-melting of damaged parts, ensuring a consistent benefit from returned cores.

  1. Statistical distance and the approach to KNO scaling

    International Nuclear Information System (INIS)

    Diosi, L.; Hegyi, S.; Krasznovszky, S.

    1990-05-01

    A new method is proposed for characterizing the approach to KNO scaling. The essence of our method lies in the concept of statistical distance between nearby KNO distributions which reflects their distinguishability in spite of multiplicity fluctuations. It is shown that the geometry induced by the distance function defines a natural metric on the parameter space of a certain family of KNO distributions. Some examples are given in which the energy dependences of distinguishability of neighbouring KNO distributions are compared in nondiffractive hadron-hadron collisions and electron-positron annihilation. (author) 19 refs.; 4 figs

  2. [Applicability of traditional landscape metrics in evaluating urban heat island effect].

    Science.gov (United States)

    Chen, Ai-Lian; Sun, Ran-Hao; Chen, Li-Ding

    2012-08-01

    By using 24 landscape metrics, this paper evaluated the urban heat island effect in parts of Beijing downtown area. QuickBird (QB) images were used to extract the landscape type information, and the thermal bands from Landsat Enhanced Thematic Mapper Plus (ETM+) images were used to extract the land surface temperature (LST) in four seasons of the same year. The 24 landscape pattern metrics were calculated at landscape and class levels in a fixed window of 120 m × 120 m in size, with the applicability of these traditional landscape metrics in evaluating the urban heat island effect examined. Among the 24 landscape metrics, only the percentage composition of landscape (PLAND), patch density (PD), largest patch index (LPI), coefficient of Euclidean nearest-neighbor distance variance (ENN_CV), and landscape division index (DIVISION) at landscape level were significantly correlated with the LST in March, May, and November, and the PLAND, LPI, DIVISION, percentage of like adjacencies, and interspersion and juxtaposition index at class level showed significant correlations with the LST in March, May, July, and December, especially in July. Some metrics such as PD, edge density, clumpiness index, patch cohesion index, effective mesh size, splitting index, aggregation index, and normalized landscape shape index showed varying correlations with the LST at different class levels. The traditional landscape metrics may not be appropriate in evaluating the effects of rivers on LST, while some of the metrics could be useful in characterizing urban LST and analyzing the urban heat island effect, but screening and examination of the metrics is required.

  3. Performance evaluation of a distance learning program.

    Science.gov (United States)

    Dailey, D J; Eno, K R; Brinkley, J F

    1994-01-01

    This paper presents a performance metric which uses a single number to characterize the response time for a non-deterministic client-server application operating over the Internet. When applied to a Macintosh-based distance learning application called the Digital Anatomist Browser, the metric allowed us to observe that "A typical student doing a typical mix of Browser commands on a typical data set will experience the same delay if they use a slow Macintosh on a local network or a fast Macintosh on the other side of the country accessing the data over the Internet." The methodology presented is applicable to other client-server applications that are rapidly appearing on the Internet.

  4. Sharp metric obstructions for quasi-Einstein metrics

    Science.gov (United States)

    Case, Jeffrey S.

    2013-02-01

    Using the tractor calculus to study smooth metric measure spaces, we adapt results of Gover and Nurowski to give sharp metric obstructions to the existence of quasi-Einstein metrics on suitably generic manifolds. We do this by introducing an analogue of the Weyl tractor W to the setting of smooth metric measure spaces. The obstructions we obtain can be realized as tensorial invariants which are polynomial in the Riemann curvature tensor and its divergence. By taking suitable limits of their tensorial forms, we then find obstructions to the existence of static potentials, generalizing to higher dimensions a result of Bartnik and Tod, and to the existence of potentials for gradient Ricci solitons.

  5. Analysis of transitions at two-fold redundant sites in mammalian genomes. Transition redundant approach-to-equilibrium (TREx distance metrics

    Directory of Open Access Journals (Sweden)

    Liberles David A

    2006-03-01

    sites within two-fold redundant coding systems were examined in the mouse, rat, and human genomes. The key metric (f2), the fraction of those sites that hold the same nucleotide, was measured for putative ortholog pairs. A transition redundant exchange (TREx) distance was calculated from f2 for these pairs. Pyrimidine-pyrimidine transitions at these sites occur approximately 14% faster than purine-purine transitions in various lineages. Transition rate constants were similar in different genes within the same lineages; within a set of orthologs, the f2 distribution is only modestly overdispersed. No correlation between disparity and overdispersion is observed. In rodents, evidence was found for greater conservation of TREx sites in genes on the X chromosome, accounting for a small part of the overdispersion, however. Conclusion The TREx metric is useful to analyze the history of transition rate constants within these mammals over the past 100 million years. The TREx metric estimates the extent to which silent nucleotide substitutions accumulate in different genes, on different chromosomes, with different compositions, in different lineages, and at different times.

  6. $\eta$-metric structures

    OpenAIRE

    Gaba, Yaé Ulrich

    2017-01-01

    In this paper, we discuss recent results about generalized metric spaces and fixed point theory. We introduce the notion of $\eta$-cone metric spaces, give some topological properties and prove some fixed point theorems for contractive type maps on these spaces. In particular we show that these $\eta$-cone metric spaces are natural generalizations of both cone metric spaces and metric type spaces.

  7. An image processing approach to computing distances between RNA secondary structures dot plots

    Directory of Open Access Journals (Sweden)

    Sapiro Guillermo

    2009-02-01

    Full Text Available Abstract Background Computing the distance between two RNA secondary structures can contribute in understanding the functional relationship between them. When used repeatedly, such a procedure may lead to finding a query RNA structure of interest in a database of structures. Several methods are available for computing distances between RNAs represented as strings or graphs, but none utilize the RNA representation with dot plots. Since dot plots are essentially digital images, there is a clear motivation to devise an algorithm for computing the distance between dot plots based on image processing methods. Results We have developed a new metric dubbed 'DoPloCompare', which compares two RNA structures. The method is based on comparing dot plot diagrams that represent the secondary structures. When analyzing two diagrams and motivated by image processing, the distance is based on a combination of histogram correlations and a geometrical distance measure. We introduce, describe, and illustrate the procedure by two applications that utilize this metric on RNA sequences. The first application is the RNA design problem, where the goal is to find the nucleotide sequence for a given secondary structure. Examples where our proposed distance measure outperforms others are given. The second application locates peculiar point mutations that induce significant structural alternations relative to the wild type predicted secondary structure. The approach reported in the past to solve this problem was tested on several RNA sequences with known secondary structures to affirm their prediction, as well as on a data set of ribosomal pieces. These pieces were computationally cut from a ribosome for which an experimentally derived secondary structure is available, and on each piece the prediction conveys similarity to the experimental result. Our newly proposed distance measure shows benefit in this problem as well when compared to standard methods used for assessing

  8. Analysis and Comparison of Information Theory-based Distances for Genomic Strings

    Science.gov (United States)

    Balzano, Walter; Cicalese, Ferdinando; Del Sorbo, Maria Rosaria; Vaccaro, Ugo

    2008-07-01

    Genomic string comparison via alignment is widely applied for mining and retrieval of information in biological databases. In some situations, the effectiveness of such alignment-based comparison is still unclear, e.g., for sequences with non-uniform length and with significant shuffling of identical substrings. An alternative approach is one based on information theory distances. Biological information content is stored in very long strings of only four characters. In the last ten years, several entropic measures have been proposed for genomic string analysis. Notwithstanding their individual merit and experimental validation, to the best of our knowledge, there is no direct comparison of these different metrics. We present four of the most representative alignment-free distance measures, based on mutual information. Each one has a different origin and expression. Our comparison involves reducing the different concepts to a unique formalism, so that it has been possible to construct a phylogenetic tree for each of them. The trees produced via these metrics are compared to the ones widely accepted as biologically validated. In general, the results provided more evidence of the reliability of the alignment-free distance models. Also, we observe that one of the metrics appeared to be more robust than the other three. We believe that this result can be the object of further research and observations. Many of the experimental results, the graphics and the table are available at the following URL: http://people.na.infn.it/˜wbalzano/BIO

  9. Simulation of devices mobility to estimate wireless channel quality metrics in 5G networks

    Science.gov (United States)

    Orlov, Yu.; Fedorov, S.; Samuylov, A.; Gaidamaka, Yu.; Molchanov, D.

    2017-07-01

    The problem of channel quality estimation for devices in a wireless 5G network is formulated. As the performance metric of interest we choose the signal-to-interference-plus-noise ratio, which depends essentially on the distance between the communicating devices. A model with a plurality of moving devices in a bounded three-dimensional space and a simulation algorithm to determine the distances between the devices for a given motion model are devised.
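
    A toy sketch of the kind of simulation described: devices perform a bounded random walk, and the signal-to-interference-plus-noise ratio for one link is computed from the resulting distances with a simple power-law path-loss model. The path-loss exponent, powers, volume and step size are assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

def sinr_db(d_serving, d_interferers, tx_power=1.0, alpha=3.5, noise=1e-9):
    """SINR with a simple power-law path-loss model, distances in metres."""
    signal = tx_power * d_serving ** (-alpha)
    interference = np.sum(tx_power * d_interferers ** (-alpha))
    return 10 * np.log10(signal / (interference + noise))

# Devices moving by a random walk in a bounded 3-D volume (toy mobility model).
n_devices, steps = 20, 100
pos = rng.uniform(0, 100, size=(n_devices, 3))
for _ in range(steps):
    pos = np.clip(pos + rng.normal(scale=1.0, size=pos.shape), 0, 100)

receiver, transmitter = pos[0], pos[1]
others = pos[2:]
d_srv = np.linalg.norm(receiver - transmitter)
d_int = np.linalg.norm(others - receiver, axis=1)
print(f"serving distance {d_srv:.1f} m, SINR {sinr_db(d_srv, d_int):.1f} dB")
```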

  10. Distances on Cosmological Scales with VLTI

    OpenAIRE

    Karovska, Margarita; Elvis, Martin; Marengo, Massimo

    2003-01-01

    We present here a new method using interferometric measurements of quasars that allows the determination of direct geometrical distances on cosmic scales. Quasar Broad Emission Line Region sizes provide a "meter rule" with which to measure the metric of the Universe. This method is less dependent on model assumptions, and even on variations in the fundamental constants (other than c). We discuss the spectral and spatial requirements on the VLTI observations needed to carry out these measure...

  11. Effect of Image Linearization on Normalized Compression Distance

    Science.gov (United States)

    Mortensen, Jonathan; Wu, Jia Jie; Furst, Jacob; Rogers, John; Raicu, Daniela

    Normalized Information Distance, based on Kolmogorov complexity, is an emerging metric for image similarity. It is approximated by the Normalized Compression Distance (NCD) which generates the relative distance between two strings by using standard compression algorithms to compare linear strings of information. This relative distance quantifies the degree of similarity between the two objects. NCD has been shown to measure similarity effectively on information which is already a string: genomic string comparisons have created accurate phylogeny trees and NCD has also been used to classify music. Currently, to find a similarity measure using NCD for images, the images must first be linearized into a string, and then compared. To understand how linearization of a 2D image affects the similarity measure, we perform four types of linearization on a subset of the Corel image database and compare each for a variety of image transformations. Our experiment shows that different linearization techniques produce statistically significant differences in NCD for identical spatial transformations.
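
    A small sketch of the experiment's core computation, assuming zlib as the compressor: NCD is computed between two linearizations (row-major and column-major scans) of a toy image and a spatially shifted copy. The image content and the shift are made-up stand-ins for the Corel subset and transformations used in the study.

```python
import zlib
import numpy as np

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance approximated with zlib."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def linearize(img: np.ndarray, order: str) -> bytes:
    """Turn a 2-D uint8 image into a byte string: row-major or column-major scan."""
    return (img if order == "row" else img.T).astype(np.uint8).tobytes()

rng = np.random.default_rng(4)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
shifted = np.roll(img, 3, axis=1)           # a simple spatial transformation

for order in ("row", "column"):
    print(order, round(ncd(linearize(img, order), linearize(shifted, order)), 3))
```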

  12. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  13. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets, to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers (Lawrence, 2007).” Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  14. Sensitivity study of a semiautomatic supervised classifier applied to minerals from x-ray mapping images

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Nielsen, Allan Aasbjerg; Flesche, Harald

    1999-01-01

    to a small area in order to allow for the estimation of a variance-covariance matrix. This expansion is controlled by upper limits for the spatial and Euclidean spectral distances from the seed point. Second, after this initial expansion the growing of the training set is controlled by an upper limit...... is obtained by excluding observations that have high Mahalanobis distances to the training class mean. Spatial closeness is obtained by requiring connectivity. The marginal effects of changes in the parameters that are input to the seed growing algorithm are evaluated. Initially, the seed is expanded...... for the Mahalanobis distance to the current estimate of the class centre. Also, the estimates of class centres and covariance matrices may be continuously updated or the initial estimates may be used. Finally, the effect of the operator's choice of seed among a number of potential seeding points is evaluated. After...
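
    A simplified sketch of the seed-growing step described above: starting from a seed pixel, connected 4-neighbours are admitted to the training set while their Mahalanobis distance to the current class centre stays below an upper limit, with the centre and covariance re-estimated as the set grows. The synthetic image, threshold and neighbourhood rule are assumptions; the initial spatial/spectral Euclidean expansion stage is omitted.

```python
import numpy as np
from collections import deque

def grow_training_set(image, seed, md_limit=4.0):
    """Grow a connected training region from a seed pixel, admitting 4-neighbours
    whose Mahalanobis distance to the current class centre is below md_limit."""
    h, w, _ = image.shape
    selected = {seed}
    frontier = deque([seed])
    while frontier:
        y, x = frontier.popleft()
        pixels = np.array([image[p] for p in selected])
        mu = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False) if len(pixels) > 3 else np.eye(image.shape[2])
        vi = np.linalg.pinv(cov)
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in selected:
                d = image[ny, nx] - mu
                if np.sqrt(d @ vi @ d) < md_limit:
                    selected.add((ny, nx))
                    frontier.append((ny, nx))
    return selected

rng = np.random.default_rng(5)
img = rng.normal(0, 0.5, size=(30, 30, 4))
img[10:20, 10:20] += 5.0                     # a homogeneous "mineral phase" block
print(len(grow_training_set(img, seed=(15, 15))))
```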

  15. Semi-automatic supervised classification of minerals from x-ray mapping images

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Flesche, Harald; Larsen, Rasmus

    1998-01-01

    to a small area in order to allow for the estimation of a variance-covariance matrix. This expansion is controlled by upper limits for the spatial and Euclidean spectral distances from the seed point. Second, after this initial expansion the growing of the training set is controlled by an upper limit...... is obtained by excluding observations that have high Mahalanobis distances to the training class mean. Spatial closeness is obtained by requiring connectivity. The marginal effects of changes in the parameters that are input to the seed growing algorithm are evaluated. Initially, the seed is expanded...... for the Mahalanobis distance to the current estimate of the class centre. Also, the estimates of class centres and covariance matrices may be continuously updated or the initial estimates may be used. Finally, the effect of the operator's choice of seed among a number of potential seeding points is evaluated. After...

  16. Distinguishability notion based on Wootters statistical distance: Application to discrete maps

    Science.gov (United States)

    Gomez, Ignacio S.; Portesi, M.; Lamberti, P. W.

    2017-08-01

    We study the distinguishability notion given by Wootters for states represented by probability density functions. This presents the particularity that it can also be used for defining a statistical distance in chaotic unidimensional maps. Based on that definition, we provide a metric d̄ for an arbitrary discrete map. Moreover, from d̄ we associate a metric space with each invariant density of a given map, which turns out to be the set of all distinguished points when the number of iterations of the map tends to infinity. Also, we give a characterization of the wandering set of a map in terms of the metric d̄, which allows us to identify the dissipative regions in the phase space. We illustrate the results in the case of the logistic and the circle maps numerically and analytically, and we obtain d̄ and the wandering set for some characteristic values of their parameters. Finally, an extension of the metric space associated for arbitrary probability distributions (not necessarily invariant densities) is given along with some consequences. The statistical properties of distributions given by histograms are characterized in terms of the cardinal of the associated metric space. For two conjugate variables, the uncertainty principle is expressed in terms of the diameters of the associated metric space with those variables.

  17. Stochastic Learning of Multi-Instance Dictionary for Earth Mover's Distance based Histogram Comparison

    OpenAIRE

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there is no existing multi-instance dictionary learning methods designed for EMD based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using stochastic optimization method. In the stoc...

  18. Identificação de duplicidades de acessos de feijão por meio de técnicas multivariadas Identification of repetitive bean samples using multivariate analysis

    Directory of Open Access Journals (Sweden)

    Jaime Roberto Fonseca

    1999-03-01

    Full Text Available This work aimed to test multivariate analysis techniques and the measure of genetic divergence represented by the Mahalanobis generalized distance for selecting descriptors and identifying duplicate accessions of common bean (Phaseolus vulgaris L.). Fifty accessions from the Active Germplasm Bank (BAG-Feijão) of Embrapa - Centro Nacional de Pesquisa de Arroz e Feijão (CNPAF) were evaluated in June 1993, using a randomized block design with two replications. Ten descriptors with quantitative and phenological characteristics were analyzed by means of canonical variables and the Mahalanobis distance. All characters were important in describing the germplasm. The clustering technique based on the Mahalanobis generalized distance proved viable and effective in identifying duplicate bean accessions, and can be used routinely in the Germplasm Bank. Fifty bean (Phaseolus vulgaris L.) samples from the Active Germplasm Bank of Embrapa - Centro Nacional de Pesquisa de Arroz e Feijão (CNPAF), Goiânia, GO, Brazil, were studied using multivariate techniques to screen descriptors for characterization, to measure genetic diversity and to group samples, looking for repetitive accession identification. Germplasm evaluation was carried out in June 1993, using the randomized block design with two replications. Ten quantitative and phenologic descriptors were used for characterization and analysed by canonical variables and Mahalanobis distance techniques. There were no redundant characters and all descriptors were important for sample description. The grouping technique using the genetic diversity measures through the Mahalanobis generalized distance proved to be efficient and viable for the identification of repetitive samples, being an appropriate procedure to be used in genebanks.

  19. Combination of evidence in recommendation systems characterized by distance functions

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, L. M. (Luis Mateus)

    2002-01-01

    Recommendation systems for different Document Networks (DN) such as the World Wide Web (WWW), Digital Libraries, or Scientific Databases often make use of distance functions extracted from relationships among documents and between documents and semantic tags. For instance, documents in the WWW are related via a hyperlink network, while documents in bibliographic databases are related by citation and collaboration networks. Furthermore, documents can be related to semantic tags such as keywords used to describe their content. The distance functions computed from these relations establish associative networks among items of the DN, and allow recommendation systems to identify relevant associations for individual users. The process of recommendation can be improved by integrating associative data from different sources. Thus we are presented with a problem of combining evidence (about associations between items) from different sources characterized by distance functions. In this paper we summarize our work on (1) inferring associations from semi-metric distance functions and (2) combining evidence from different (distance) associative DN.

  20. Correlations between chemical composition and provenance of Justino site ceramics by INAA

    International Nuclear Information System (INIS)

    Santos, J.O.; Munita, C.S.; Valerio, M.E.G.; Oliveira, P.M.S.

    2008-01-01

    Instrumental neutron activation analysis (INAA) has been used to define compositional groups of pottery from the Justino site, Brazil, according to the chemical similarities of the ceramic paste. Outliers were identified by means of the robust Mahalanobis distance. The temper effect in the ceramic paste was studied by means of a modified Mahalanobis filter. The results were interpreted by means of cluster, principal components, and discriminant analyses. This work contributes to the reconstruction of the prehistory of the lower (baixo) Sao Francisco region, and to the general reconstitution of the ceramist populations of the Brazilian Northeast. (author)
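
    A minimal sketch of robust Mahalanobis outlier screening as commonly implemented, assuming scikit-learn's Minimum Covariance Determinant estimator and a chi-square cutoff; the element concentrations below are simulated, not the Justino site measurements.

```python
import numpy as np
from sklearn.covariance import MinCovDet
from scipy.stats import chi2

rng = np.random.default_rng(6)
# Toy elemental concentrations (ppm of three elements) for 60 sherds,
# with three compositional outliers mixed in.
core = rng.multivariate_normal([50, 12, 8], np.diag([4.0, 1.0, 0.5]), size=57)
outliers = rng.multivariate_normal([70, 20, 15], np.diag([4.0, 1.0, 0.5]), size=3)
X = np.vstack([core, outliers])

mcd = MinCovDet(random_state=0).fit(X)        # robust location and scatter
md2 = mcd.mahalanobis(X)                      # squared robust Mahalanobis distances
cutoff = chi2.ppf(0.975, df=X.shape[1])
print("flagged observations:", np.where(md2 > cutoff)[0])
```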

  1. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)

  2. On the Averaging of Cardiac Diffusion Tensor MRI Data: The Effect of Distance Function Selection

    Science.gov (United States)

    Giannakidis, Archontis; Melkus, Gerd; Yang, Guang; Gullberg, Grant T.

    2016-01-01

    Diffusion tensor magnetic resonance imaging (DT-MRI) allows a unique insight into the microstructure of highly-directional tissues. The selection of the most proper distance function for the space of diffusion tensors is crucial in enhancing the clinical application of this imaging modality. Both linear and nonlinear metrics have been proposed in the literature over the years. The debate on the most appropriate DT-MRI distance function is still ongoing. In this paper, we presented a framework to compare the Euclidean, affine-invariant Riemannian and log-Euclidean metrics using actual high-resolution DT-MRI rat heart data. We employed temporal averaging at the diffusion tensor level of three consecutive and identically-acquired DT-MRI datasets from each of five rat hearts as a means to rectify the background noise-induced loss of myocyte directional regularity. This procedure is applied here for the first time in the context of tensor distance function selection. When compared with previous studies that used a different concrete application to juxtapose the various DT-MRI distance functions, this work is unique in that it combined the following: (i) Metrics were judged by quantitative –rather than qualitative– criteria, (ii) the comparison tools were non-biased, (iii) a longitudinal comparison operation was used on a same-voxel basis. The statistical analyses of the comparison showed that the three DT-MRI distance functions tend to provide equivalent results. Hence, we came to the conclusion that the tensor manifold for cardiac DT-MRI studies is a curved space of almost zero curvature. The signal to noise ratio dependence of the operations was investigated through simulations. Finally, the “swelling effect” occurrence following Euclidean averaging was found to be too unimportant to be worth consideration. PMID:27754986
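
    A compact sketch of the three distance functions compared in the paper, evaluated on two toy diffusion tensors (symmetric positive-definite 3×3 matrices); the tensor values are arbitrary illustrations, not the rat-heart data.

```python
import numpy as np
from scipy.linalg import logm, sqrtm, inv

def euclidean(d1, d2):
    """Frobenius (Euclidean) distance between tensors."""
    return np.linalg.norm(d1 - d2, "fro")

def log_euclidean(d1, d2):
    """Frobenius distance between matrix logarithms."""
    return np.linalg.norm(logm(d1) - logm(d2), "fro")

def affine_invariant(d1, d2):
    """Affine-invariant Riemannian distance on the SPD manifold."""
    s = inv(sqrtm(d1))
    return np.linalg.norm(logm(s @ d2 @ s), "fro")

# Two toy diffusion tensors (scaled units).
d1 = np.diag([1.7, 0.3, 0.2])
d2 = np.array([[1.5, 0.1, 0.0],
               [0.1, 0.4, 0.0],
               [0.0, 0.0, 0.25]])

for name, f in [("Euclidean", euclidean), ("log-Euclidean", log_euclidean),
                ("affine-invariant", affine_invariant)]:
    print(f"{name:17s} {f(d1, d2):.4f}")
```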

  3. On the averaging of cardiac diffusion tensor MRI data: the effect of distance function selection

    Science.gov (United States)

    Giannakidis, Archontis; Melkus, Gerd; Yang, Guang; Gullberg, Grant T.

    2016-11-01

    Diffusion tensor magnetic resonance imaging (DT-MRI) allows a unique insight into the microstructure of highly-directional tissues. The selection of the most proper distance function for the space of diffusion tensors is crucial in enhancing the clinical application of this imaging modality. Both linear and nonlinear metrics have been proposed in the literature over the years. The debate on the most appropriate DT-MRI distance function is still ongoing. In this paper, we presented a framework to compare the Euclidean, affine-invariant Riemannian and log-Euclidean metrics using actual high-resolution DT-MRI rat heart data. We employed temporal averaging at the diffusion tensor level of three consecutive and identically-acquired DT-MRI datasets from each of five rat hearts as a means to rectify the background noise-induced loss of myocyte directional regularity. This procedure is applied here for the first time in the context of tensor distance function selection. When compared with previous studies that used a different concrete application to juxtapose the various DT-MRI distance functions, this work is unique in that it combined the following: (i) metrics were judged by quantitative—rather than qualitative—criteria, (ii) the comparison tools were non-biased, (iii) a longitudinal comparison operation was used on a same-voxel basis. The statistical analyses of the comparison showed that the three DT-MRI distance functions tend to provide equivalent results. Hence, we came to the conclusion that the tensor manifold for cardiac DT-MRI studies is a curved space of almost zero curvature. The signal to noise ratio dependence of the operations was investigated through simulations. Finally, the ‘swelling effect’ occurrence following Euclidean averaging was found to be too unimportant to be worth consideration.

  4. Pythagoras's theorem on a two-dimensional lattice from a `natural' Dirac operator and Connes's distance formula

    Science.gov (United States)

    Dai, Jian; Song, Xing-Chang

    2001-07-01

    One of the key ingredients of Connes's noncommutative geometry is a generalized Dirac operator which induces a metric (Connes's distance) on the pure state space. We generalize such a Dirac operator devised by Dimakis et al., whose Connes distance recovers the linear distance on a one-dimensional lattice, to the two-dimensional case. This Dirac operator has the local eigenvalue property and induces a Euclidean distance on this two-dimensional lattice, which is referred to as `natural'. This kind of Dirac operator can be easily generalized to higher-dimensional lattices.

  5. PERMANOVA-S: association test for microbial community composition that accommodates confounders and multiple distances

    OpenAIRE

    Tang, Zheng-Zheng; Chen, Guanhua; Alekseyenko, Alexander V.

    2016-01-01

    Motivation: Recent advances in sequencing technology have made it possible to obtain high-throughput data on the composition of microbial communities and to study the effects of dysbiosis on the human host. Analysis of pairwise intersample distances quantifies the association between the microbiome diversity and covariates of interest (e.g. environmental factors, clinical outcomes, treatment groups). In the design of these analyses, multiple choices for distance metrics are available. Most di...

  6. Fluorescence-based classification of Caribbean coral reef organisms and substrates

    Science.gov (United States)

    Zawada, David G.; Mazel, Charles H.

    2014-01-01

    A diverse group of coral reef organisms, representing several phyla, possess fluorescent pigments. We investigated the potential of using the characteristic fluorescence emission spectra of these pigments to enable unsupervised, optical classification of coral reef habitats. We compiled a library of characteristic fluorescence spectra through in situ and laboratory measurements from a variety of specimens throughout the Caribbean. Because fluorescent pigments are not species-specific, the spectral library is organized in terms of 15 functional groups. We investigated the spectral separability of the functional groups in terms of the number of wavebands required to distinguish between them, using the similarity measures Spectral Angle Mapper (SAM), Spectral Information Divergence (SID), SID-SAM mixed measure, and Mahalanobis distance. This set of measures represents geometric, stochastic, joint geometric-stochastic, and statistical approaches to classifying spectra. Our hyperspectral fluorescence data were used to generate sets of 4-, 6-, and 8-waveband spectra, including random variations in relative signal amplitude, spectral peak shifts, and water-column attenuation. Each set consisted of 2 different band definitions: ‘optimally-picked’ and ‘evenly-spaced.’ The optimally-picked wavebands were chosen to coincide with as many peaks as possible in the functional group spectra. Reference libraries were formed from half of the spectra in each set and used for training purposes. Average classification accuracies ranged from 76.3% for SAM with 4 evenly-spaced wavebands to 93.8% for Mahalanobis distance with 8 evenly-spaced wavebands. The Mahalanobis distance consistently outperformed the other measures. In a second test, empirically-measured spectra were classified using the same reference libraries and the Mahalanobis distance for just the 8 evenly-spaced waveband case. Average classification accuracies were 84% and 87%, corresponding to the extremes in modeled
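
    A rough sketch of two of the similarity measures compared above, applied to toy multi-waveband spectra: the Spectral Angle Mapper and a per-class Mahalanobis distance classifier. The reference spectra, covariances and band count are made-up placeholders for the fluorescence library.

```python
import numpy as np

def spectral_angle(s, r):
    """Spectral Angle Mapper: angle (radians) between a spectrum and a reference."""
    return np.arccos(np.clip(np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r)), -1, 1))

def classify(spectrum, references, covariances=None):
    """Assign a spectrum to the closest reference, by SAM or Mahalanobis distance."""
    if covariances is None:
        scores = [spectral_angle(spectrum, r) for r in references]
    else:
        scores = [np.sqrt((spectrum - r) @ np.linalg.inv(c) @ (spectrum - r))
                  for r, c in zip(references, covariances)]
    return int(np.argmin(scores))

rng = np.random.default_rng(7)
# Toy 8-waveband mean fluorescence spectra for three functional groups.
refs = rng.uniform(0.1, 1.0, size=(3, 8))
covs = [np.diag(rng.uniform(0.001, 0.01, size=8)) for _ in range(3)]
sample = refs[1] + rng.normal(scale=0.03, size=8)
print(classify(sample, refs), classify(sample, refs, covs))
```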

  7. Properties of C-metric spaces

    Science.gov (United States)

    Croitoru, Anca; Apreutesei, Gabriela; Mastorakis, Nikos E.

    2017-09-01

    The subject of this paper belongs to the theory of approximate metrics [23]. An approximate metric on X is a real application defined on X × X that satisfies only a part of the metric axioms. In a recent paper [23], we introduced a new type of approximate metric, named C-metric, that is an application which satisfies only two metric axioms: symmetry and triangular inequality. The remarkable fact in a C-metric space is that a topological structure induced by the C-metric can be defined. The innovative idea of this paper is that we obtain some convergence properties of a C-metric space in the absence of a metric. In this paper we investigate C-metric spaces. The paper is divided into four sections. Section 1 is for Introduction. In Section 2 we recall some concepts and preliminary results. In Section 3 we present some properties of C-metric spaces, such as convergence properties, a canonical decomposition and a C-fixed point theorem. Finally, in Section 4 some conclusions are highlighted.

  8. Learning Low-Dimensional Metrics

    OpenAIRE

    Jain, Lalit; Mason, Blake; Nowak, Robert

    2017-01-01

    This paper investigates the theoretical foundations of metric learning, focused on three key questions that are not fully addressed in prior work: 1) we consider learning general low-dimensional (low-rank) metrics as well as sparse metrics; 2) we develop upper and lower (minimax) bounds on the generalization error; 3) we quantify the sample complexity of metric learning in terms of the dimension of the feature space and the dimension/rank of the underlying metric; 4) we also bound the accuracy ...

  9. Gravitomagnetic bending angle of light with finite-distance corrections in stationary axisymmetric spacetimes

    Science.gov (United States)

    Ono, Toshiaki; Ishihara, Asahi; Asada, Hideki

    2017-11-01

    By using the Gauss-Bonnet theorem, the bending angle of light in a static, spherically symmetric and asymptotically flat spacetime has been recently discussed, especially by taking account of the finite distance from a lens object to a light source and a receiver [Ishihara, Suzuki, Ono, Asada, Phys. Rev. D 95, 044017 (2017), 10.1103/PhysRevD.95.044017]. We discuss a possible extension of the method of calculating the bending angle of light to stationary, axisymmetric and asymptotically flat spacetimes. For this purpose, we consider the light rays on the equatorial plane in the axisymmetric spacetime. We introduce a spatial metric to define the bending angle of light in the finite-distance situation. We show that the proposed bending angle of light is coordinate-invariant by using the Gauss-Bonnet theorem. The nonvanishing geodesic curvature of the photon orbit with the spatial metric is caused in gravitomagnetism, even though the light ray in the four-dimensional spacetime follows the null geodesic. Finally, we consider Kerr spacetime as an example in order to examine how the bending angle of light is computed by the present method. The finite-distance correction to the gravitomagnetic deflection angle due to the Sun's spin is around a pico-arcsecond level. The finite-distance corrections for Sgr A* also are estimated to be very small. Therefore, the gravitomagnetic finite-distance corrections for these objects are unlikely to be observed with present technology.

  10. Lipschitz Metrics for a Class of Nonlinear Wave Equations

    Science.gov (United States)

    Bressan, Alberto; Chen, Geng

    2017-12-01

    The nonlinear wave equation u_{tt} - c(u)(c(u)u_x)_x = 0 determines a flow of conservative solutions taking values in the space H^1(R). However, this flow is not continuous with respect to the natural H^1 distance. The aim of this paper is to construct a new metric which renders the flow uniformly Lipschitz continuous on bounded subsets of H^1(R). For this purpose, H^1 is given the structure of a Finsler manifold, where the norm of tangent vectors is defined in terms of an optimal transportation problem. For paths of piecewise smooth solutions, one can carefully estimate how the weighted length grows in time. By the generic regularity result proved in [7], these piecewise regular paths are dense and can be used to construct a geodesic distance with the desired Lipschitz property.

  11. A tensor theory of gravitation in a curved metric on a flat background

    International Nuclear Information System (INIS)

    Drummond, J.E.

    1979-01-01

    A theory of gravity is proposed using a tensor potential for the field on a flat metric. This potential cannot be isolated by local observations, but some details can be deduced from measurements at a distance. The requirement that the field equations for the tensor potential shall be deducible from an action integral, that the action and field equations are gauge invariant, and, conversely, that the Lagrangian in the action integral can be integrated from the field equations leads to Einstein's field equations. The requirement that the field energy-momentum tensor exists leads to a constraint on the tensor potential. If the constraint is a differential gauge condition, then it can only be the Hilbert condition giving a unique background tensor, metric tensor and tensor potential. For a continuous field inside a solid sphere the metric must be homogeneous in the spatial coordinates, and the associated field energy-momentum tensor has properties consistent with Newtonian dynamics. (author)

  12. Scalar-metric and scalar-metric-torsion gravitational theories

    International Nuclear Information System (INIS)

    Aldersley, S.J.

    1977-01-01

    The techniques of dimensional analysis and of the theory of tensorial concomitants are employed to study field equations in gravitational theories which incorporate scalar fields of the Brans-Dicke type. Within the context of scalar-metric gravitational theories, a uniqueness theorem for the geometric (or gravitational) part of the field equations is proven and a Lagrangian is determined which is uniquely specified by dimensional analysis. Within the context of scalar-metric-torsion gravitational theories a uniqueness theorem for field Lagrangians is presented and the corresponding Euler-Lagrange equations are given. Finally, an example of a scalar-metric-torsion theory is presented which is similar in many respects to the Brans-Dicke theory and the Einstein-Cartan theory

  13. Metrics of quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Chen Jingling

    2011-01-01

    In this work we study metrics of quantum states, which are natural generalizations of the usual trace metric and Bures metric. Some useful properties of the metrics are proved, such as the joint convexity and contractivity under quantum operations. Our result has a potential application in studying the geometry of quantum states as well as the entanglement detection.

  14. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    Science.gov (United States)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution while, in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres in a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane and generates a peak ground motion. The appropriate ground motion prediction equations (GMPE) for PSHA can then be applied. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with a constant distribution of the centroid at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. There is currently a trend of using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.

  15. Morphometric differentiation of Carex ligerica Gay in Poland

    Directory of Open Access Journals (Sweden)

    Lech Urbaniak

    2014-01-01

    Full Text Available The experimental material involved 7 Carex ligerica Gay populations which were cultured in standardised conditions in a greenhouse before their spikes were collected for morphological studies. Four characters reflecting the size of male and female glumes, selected from particular spikes, were examined. Mahalanobis distances for each pair of populations were calculated and their significance was estimated using Hotelling's T2 statistic. A dendrite was constructed on the basis of the shortest Mahalanobis distances, while Euclidean distances provided grounds for hierarchical grouping. The results obtained from the multivariate analysis indicated definite interpopulation variability within the species. All of the examined populations were found to differ significantly on the grounds of Mahalanobis distances. The dendrograms manifested the distinct character of the populations originating from regions around the lower course of the Vistula river - 5 (Toruń-Wrzosy), 3 (Tychnowy), 2 (Piaski) and 6 (Kadyny) - not noted before. Moreover, the similarity of two geographically distant populations, population 1 (Złotoria) from central Poland and population 4 (Szumiłowo) from the western part of the country, attracted attention, as did the individual character of population 7 (Kopanica), originating from the southernmost location. From the point of view of the historical geography of plants, the obtained differentiation pattern may represent sequelae of migration in the postglacial period, which crossed the area of Poland along multiple distinct pathways. The obtained results point to the importance of culturing plants in uniform conditions of a greenhouse, which permits description of genetic variability unbiased by the modifying effects of the environment.
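
    As a rough illustration of the multivariate workflow described above (pairwise Mahalanobis distances between population means, with significance assessed by Hotelling's T2), the following sketch uses made-up glume measurements and a pooled covariance matrix; it is not the authors' analysis.

```python
import numpy as np
from scipy import stats

def mahalanobis_between_groups(x, y):
    """Mahalanobis distance between two sample means, using the pooled covariance."""
    n1, n2 = len(x), len(y)
    diff = x.mean(axis=0) - y.mean(axis=0)
    pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
              (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    d2 = float(diff @ np.linalg.inv(pooled) @ diff)
    return np.sqrt(d2), d2

def hotelling_t2(x, y):
    """Two-sample Hotelling's T2 test; returns T2, the F statistic, and the p-value."""
    n1, n2, p = len(x), len(y), x.shape[1]
    _, d2 = mahalanobis_between_groups(x, y)
    t2 = n1 * n2 / (n1 + n2) * d2
    f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    return t2, f, stats.f.sf(f, p, n1 + n2 - p - 1)

# Hypothetical populations described by four glume measurements each.
rng = np.random.default_rng(1)
pop_a = rng.normal([3.1, 1.2, 2.8, 1.0], 0.2, size=(25, 4))
pop_b = rng.normal([3.4, 1.3, 2.6, 1.1], 0.2, size=(25, 4))
d, _ = mahalanobis_between_groups(pop_a, pop_b)
print("Mahalanobis distance:", round(d, 3))
print("Hotelling T2, F, p:", hotelling_t2(pop_a, pop_b))
```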

  16. METRIC context unit architecture

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, R.O.

    1988-01-01

    METRIC is an architecture for a simple but powerful Reduced Instruction Set Computer (RISC). Its speed comes from the simultaneous processing of several instruction streams, with instructions from the various streams being dispatched into METRIC's execution pipeline as they become available for execution. The pipeline is thus kept full, with a mix of instructions for several contexts in execution at the same time. True parallel programming is supported within a single execution unit, the METRIC Context Unit. METRIC's architecture provides for expansion through the addition of multiple Context Units and of specialized Functional Units. The architecture thus spans a range of size and performance from a single-chip microcomputer up through large and powerful multiprocessors. This research concentrates on the specification of the METRIC Context Unit at the architectural level. Performance tradeoffs made during METRIC's design are discussed, and projections of METRIC's performance are made based on simulation studies.

  17. MyLibrary@LANL: proximity and semi-metric networks for a collaborative and recommender web service

    Energy Technology Data Exchange (ETDEWEB)

    Rocha, L. M. [Indiana Univ., Bloomington, IN (United States). School of Informatics and Cognitive Science Program; Simas, T. [Indiana Univ., Bloomington, IN (United States). School of Informatics and Cognitive Science Program; Rechtsteiner, A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); DiGiacomo, M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Research Library; Luce, R. E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Research Library

    2005-09-01

    We describe a network approach to building recommendation systems for a WWW service. We employ two different types of weighted graphs in our analysis and development: Proximity graphs, a type of Fuzzy Graphs based on a co-occurrence probability, and semi-metric distance graphs, which do not observe the triangle inequality of Euclidean distances. Both types of graphs are used to develop intelligent recommendation and collaboration systems for the MyLibrary@LANL web service, a user-centered front-end to the Los Alamos National Laboratory's (LANL) digital library collections and WWW resources.

  18. Integrated ancillary and remote sensing data for land use ...

    African Journals Online (AJOL)

    Full Name

    The application of GMM to remote sensing image classification ... A . The boundary that has a Mahalanobis distance to the centre ... yields Bayes' theorem: ..... bands were extracted using the layer properties tool and visualised in MATLAB ...

  19. Approximate Bayesian computation for forward modeling in cosmology

    International Nuclear Information System (INIS)

    Akeret, Joël; Refregier, Alexandre; Amara, Adam; Seehars, Sebastian; Hasner, Caspar

    2015-01-01

    Bayesian inference is often used in cosmology and astrophysics to derive constraints on model parameters from observations. This approach relies on the ability to compute the likelihood of the data given a choice of model parameters. In many practical situations, the likelihood function may however be unavailable or intractable due to non-Gaussian errors, non-linear measurement processes, or complex data formats such as catalogs and maps. In these cases, the simulation of mock data sets can often be made through forward modeling. We discuss how Approximate Bayesian Computation (ABC) can be used in these cases to derive an approximation to the posterior constraints using simulated data sets. This technique relies on the sampling of the parameter set, a distance metric to quantify the difference between the observation and the simulations, and summary statistics to compress the information in the data. We first review the principles of ABC and discuss its implementation using a Population Monte-Carlo (PMC) algorithm and the Mahalanobis distance metric. We test the performance of the implementation using a Gaussian toy model. We then apply the ABC technique to the practical case of the calibration of image simulations for wide field cosmological surveys. We find that the ABC analysis is able to provide reliable parameter constraints for this problem and is therefore a promising technique for other applications in cosmology and astrophysics. Our implementation of the ABC PMC method is made available via a public code release.
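
    A minimal sketch of the ABC idea described above, using a plain rejection sampler rather than the paper's Population Monte-Carlo algorithm, with a Mahalanobis distance between observed and simulated summary statistics. The Gaussian toy model, priors, and tolerance below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(mu, sigma, n=200):
    """Forward model: draw a mock data set for parameters (mu, sigma)."""
    return rng.normal(mu, sigma, size=n)

def summaries(data):
    """Summary statistics compressing the data set (mean and standard deviation)."""
    return np.array([data.mean(), data.std()])

# 'Observed' data generated from unknown true parameters.
s_obs = summaries(simulate(mu=1.0, sigma=2.0))

# Covariance of the summaries under the prior predictive, defining the Mahalanobis metric.
prior_pred = np.array([summaries(simulate(rng.uniform(-5, 5), rng.uniform(0.5, 5)))
                       for _ in range(500)])
cov_inv = np.linalg.inv(np.cov(prior_pred, rowvar=False))

def mahalanobis(a, b):
    d = a - b
    return float(np.sqrt(d @ cov_inv @ d))

# Rejection ABC: keep parameter draws whose simulated summaries lie within a tolerance.
accepted = []
for _ in range(20000):
    mu, sigma = rng.uniform(-5, 5), rng.uniform(0.5, 5)
    if mahalanobis(summaries(simulate(mu, sigma)), s_obs) < 0.5:
        accepted.append((mu, sigma))

print("accepted draws:", len(accepted))
print("approximate posterior mean:", np.mean(accepted, axis=0))
```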

  20. Observable traces of non-metricity: New constraints on metric-affine gravity

    Science.gov (United States)

    Delhom-Latorre, Adrià; Olmo, Gonzalo J.; Ronco, Michele

    2018-05-01

    Relaxing the Riemannian condition to incorporate geometric quantities such as torsion and non-metricity may allow one to explore new physics associated with defects in a hypothetical space-time microstructure. Here we show that non-metricity produces observable effects in quantum fields in the form of 4-fermion contact interactions, thereby allowing us to constrain the scale of non-metricity to be greater than 1 TeV by using results on Bhabha scattering. Our analysis is carried out in the framework of a wide class of theories of gravity in the metric-affine approach. The bound obtained represents an improvement of several orders of magnitude over previous experimental constraints.

  1. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    Science.gov (United States)

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments.
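
    The general idea of mapping a DNA sequence to a numeric signal and comparing signals with a DSP-style distance can be sketched as follows. The doublet-value assignment and the spectrum-based distance used here are illustrative stand-ins, not the GAFD definitions from the paper.

```python
import numpy as np

# Illustrative doublet mapping: each overlapping base pair gets a distinct amplitude
# (16 possible values); this stands in for the paper's doublet-value function.
BASES = "ACGT"
DOUBLET_VALUE = {a + b: i + 1.0 for i, (a, b) in
                 enumerate((a, b) for a in BASES for b in BASES)}

def dna_to_signal(seq):
    """Map a DNA string to a numeric signal using overlapping doublets."""
    return np.array([DOUBLET_VALUE[seq[i:i + 2]] for i in range(len(seq) - 1)])

def spectral_distance(x, y, n_bins=32):
    """Alignment-free distance: Euclidean distance between normalized magnitude spectra."""
    def spectrum(s):
        mag = np.abs(np.fft.rfft(s, n=2 * n_bins))[:n_bins]
        return mag / (np.linalg.norm(mag) + 1e-12)
    return float(np.linalg.norm(spectrum(x) - spectrum(y)))

seq_a = "ACGTACGTACGGTTACGTAC"
seq_b = "ACGTACCTACGGTTACGAAC"
print(spectral_distance(dna_to_signal(seq_a), dna_to_signal(seq_b)))
```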

  2. Fulltext PDF

    Indian Academy of Sciences (India)


  3. A Complete Quantitative Deduction System for the Bisimilarity Distance on Markov Chains

    DEFF Research Database (Denmark)

    Bacci, Giovanni; Bacci, Giorgio; Larsen, Kim Guldstrand

    2017-01-01

    In this paper we propose a complete axiomatization of the bisimilarity distance of Desharnais et al. for the class of finite labelled Markov chains. Our axiomatization is given in the style of a quantitative extension of equational logic recently proposed by Mardare, Panangaden, and Plotkin (LICS...... an axiom for dealing with the Kantorovich distance between probability distributions. The axiomatization is then used to propose a metric extension of a Kleene's style representation theorem for finite labelled Markov chains, that was proposed (in a more general coalgebraic fashion) by Silva et al. (Inf...

  4. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., the Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., the reference product against itself) and reject an inequivalent product (e.g., the reference product against a negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product against a reference product when both possess multimodal PSDs.
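
    For one-dimensional size distributions the earth mover's distance reduces to the area between the two cumulative distributions, which makes the metric straightforward to compute. The sketch below uses hypothetical bimodal test and reference PSD profiles, not the cyclosporine data.

```python
import numpy as np

def emd_1d(sizes, p_ref, p_test):
    """Earth mover's distance between two PSDs defined on a common size grid.

    For one-dimensional distributions the EMD equals the integral of |CDF_ref - CDF_test|."""
    p_ref = np.asarray(p_ref, dtype=float) / np.sum(p_ref)
    p_test = np.asarray(p_test, dtype=float) / np.sum(p_test)
    cdf_gap = np.abs(np.cumsum(p_ref) - np.cumsum(p_test))
    widths = np.diff(sizes, append=sizes[-1])      # spacing of the size grid
    return float(np.sum(cdf_gap * widths))

# Hypothetical bimodal reference and test profiles on a micrometre size grid.
sizes = np.linspace(0.1, 10.0, 100)
reference = np.exp(-(sizes - 2.0) ** 2 / 0.5) + 0.6 * np.exp(-(sizes - 6.0) ** 2 / 1.0)
test = np.exp(-(sizes - 2.2) ** 2 / 0.5) + 0.5 * np.exp(-(sizes - 6.3) ** 2 / 1.0)
print("EMD between test and reference PSD:", round(emd_1d(sizes, reference, test), 4))
```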

  5. Image retrieval by information fusion based on scalable vocabulary tree and robust Hausdorff distance

    Science.gov (United States)

    Che, Chang; Yu, Xiaoyang; Sun, Xiaoming; Yu, Boyang

    2017-12-01

    In recent years, Scalable Vocabulary Tree (SVT) has been shown to be effective in image retrieval. However, for general images where the foreground is the object to be recognized while the background is cluttered, the performance of the current SVT framework is restricted. In this paper, a new image retrieval framework that incorporates a robust distance metric and information fusion is proposed, which improves the retrieval performance relative to the baseline SVT approach. First, the visual words that represent the background are diminished by using a robust Hausdorff distance between different images. Second, image matching results based on three image signature representations are fused, which enhances the retrieval precision. We conducted intensive experiments on small-scale to large-scale image datasets: Corel-9, Corel-48, and PKU-198, where the proposed Hausdorff metric and information fusion outperforms the state-of-the-art methods by about 13, 15, and 15%, respectively.
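
    The paper's robust Hausdorff formulation is not reproduced here; a common robust variant replaces the maximum in the classical directed Hausdorff distance with a rank (quantile) statistic, as in the partial Hausdorff distance sketched below on made-up feature point sets.

```python
import numpy as np

def directed_partial_hausdorff(a, b, q=0.9):
    """Directed partial Hausdorff distance from point set a to point set b.

    Replaces the maximum of nearest-neighbour distances by their q-th quantile,
    which makes the measure less sensitive to outliers and background clutter."""
    pairwise = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))
    nearest = pairwise.min(axis=1)        # closest point in b for each point in a
    return float(np.quantile(nearest, q))

def partial_hausdorff(a, b, q=0.9):
    """Symmetric robust Hausdorff distance: max of the two directed distances."""
    return max(directed_partial_hausdorff(a, b, q), directed_partial_hausdorff(b, a, q))

rng = np.random.default_rng(3)
features_img1 = rng.normal(size=(50, 2))
features_img2 = features_img1 + rng.normal(scale=0.05, size=(50, 2))
features_img2[:5] += 5.0   # simulated clutter from a different background
print("classical (q=1.0):", round(partial_hausdorff(features_img1, features_img2, 1.0), 3))
print("robust    (q=0.9):", round(partial_hausdorff(features_img1, features_img2, 0.9), 3))
```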

  6. On using multiple routing metrics with destination sequenced distance vector protocol for MultiHop wireless ad hoc networks

    Science.gov (United States)

    Mehic, M.; Fazio, P.; Voznak, M.; Partila, P.; Komosny, D.; Tovarek, J.; Chmelikova, Z.

    2016-05-01

    A mobile ad hoc network is a collection of mobile nodes which communicate without a fixed backbone or centralized infrastructure. Due to the frequent mobility of nodes, routes connecting two distant nodes may change. Therefore, it is not possible to establish a priori fixed paths for message delivery through the network. Because of its importance, routing is the most studied problem in mobile ad hoc networks. In addition, if Quality of Service (QoS) is demanded, one must guarantee the QoS not only over a single hop but over an entire wireless multi-hop path, which may not be a trivial task. In turn, this requires the propagation of QoS information within the network. The key to the support of QoS reporting is QoS routing, which provides path QoS information at each source. To support QoS for real-time traffic one needs to know not only the minimum delay on the path to the destination but also the bandwidth available on it. Therefore, throughput, end-to-end delay, and routing overhead are traditional performance metrics used to evaluate the performance of a routing protocol. To obtain additional information about a link, most quality-link metrics are based on calculating link loss probabilities by broadcasting probe packets. In this paper, we address the problem of including multiple routing metrics in existing routing packets that are broadcast through the network. We evaluate the efficiency of such an approach with a modified version of the DSDV routing protocol in the ns-3 simulator.

  7. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  8. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    Full Text Available We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space; we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.

  9. Metrics with vanishing quantum corrections

    International Nuclear Information System (INIS)

    Coley, A A; Hervik, S; Gibbons, G W; Pope, C N

    2008-01-01

    We investigate solutions of the classical Einstein or supergravity equations that solve any set of quantum corrected Einstein equations in which the Einstein tensor plus a multiple of the metric is equated to a symmetric conserved tensor T_{μν}(g_{αβ}, ∂_τ g_{αβ}, ∂_τ ∂_σ g_{αβ}, ...) constructed from sums of terms involving contractions of the metric and powers of arbitrary covariant derivatives of the curvature tensor. A classical solution, such as an Einstein metric, is called universal if, when evaluated on that Einstein metric, T_{μν} is a multiple of the metric. A Ricci flat classical solution is called strongly universal if, when evaluated on that Ricci flat metric, T_{μν} vanishes. It is well known that pp-waves in four spacetime dimensions are strongly universal. We focus attention on a natural generalization: Einstein metrics with holonomy Sim(n - 2) in which all scalar invariants are zero or constant. In four dimensions we demonstrate that the generalized Ghanam-Thompson metric is weakly universal and that the Goldberg-Kerr metric is strongly universal; indeed, we show that universality extends to all four-dimensional Sim(2) Einstein metrics. We also discuss generalizations to higher dimensions.

  10. A Distance Bounding Protocol for Location-Cloaked Applications.

    Science.gov (United States)

    Molina-Martínez, Cristián; Galdames, Patricio; Duran-Faundez, Cristian

    2018-04-26

    Location-based services (LBSs) assume that users are willing to release trustworthy and useful details about their whereabouts. However, many location privacy concerns have arisen. For location privacy protection, several algorithms build a cloaking region to hide a user’s location. However, many applications may not operate adequately on cloaked locations. For example, a traditional distance bounding protocol (DBP)—which is run by two nodes called the prover and the verifier—may conclude an untight and useless distance between these two entities. An LBS (verifier) may use this distance as a metric of usefulness and trustworthiness of the location claimed by the user (prover). However, we show that if a tight distance is desired, traditional DBP can refine a user’s cloaked location and compromise its location privacy. To find a proper balance, we propose a location-privacy-aware DBP protocol. Our solution consists of adding some small delays before submitting any user’s response. We show that several issues arise when a certain delay is chosen, and we propose some solutions. The effectiveness of our techniques in balancing location refinement and utility is demonstrated through simulation.

  11. A Distance Bounding Protocol for Location-Cloaked Applications

    Directory of Open Access Journals (Sweden)

    Cristián Molina-Martínez

    2018-04-01

    Full Text Available Location-based services (LBSs assume that users are willing to release trustworthy and useful details about their whereabouts. However, many location privacy concerns have arisen. For location privacy protection, several algorithms build a cloaking region to hide a user’s location. However, many applications may not operate adequately on cloaked locations. For example, a traditional distance bounding protocol (DBP—which is run by two nodes called the prover and the verifier—may conclude an untight and useless distance between these two entities. An LBS (verifier may use this distance as a metric of usefulness and trustworthiness of the location claimed by the user (prover. However, we show that if a tight distance is desired, traditional DBP can refine a user’s cloaked location and compromise its location privacy. To find a proper balance, we propose a location-privacy-aware DBP protocol. Our solution consists of adding some small delays before submitting any user’s response. We show that several issues arise when a certain delay is chosen, and we propose some solutions. The effectiveness of our techniques in balancing location refinement and utility is demonstrated through simulation.

  12. Use of different exposure metrics for understanding multi-modal travel injury risk

    Directory of Open Access Journals (Sweden)

    S. Ilgin Guler

    2016-08-01

    Full Text Available The objective of this work is to identify characteristics of different metrics of exposure for quantifying multi-modal travel injury risk. First, a discussion on the use of time-based and trip-based metrics for road user exposure to injury risk, considering multiple travel modes, is presented. The main difference between a time-based and a trip-based metric is argued to be that a time-based metric reflects the actual duration of time spent on the road exposed to the travel risks. This is important when considering multiple modes, since different modes typically have different speeds and average travel distances. Next, the use of the total number of trips, total time traveled, and mode share (time-based or trip-based) is considered to compare the injury risk of a given mode at different locations. It is argued that using mode share generalizes the safety concept, which traditionally focuses on absolute numbers. Quantitative results are also obtained from combining travel survey data with police collision reports for ten counties in California. The data are aggregated for five modes: (i) cars, (ii) SUVs, (iii) transit riders, (iv) bicyclists, and (v) pedestrians. These aggregated data are used to compare the travel risk of different modes with time-based or trip-based exposure metrics. The quantitative results confirm the initial qualitative discussion. As the penetration of mobile probes for transportation data collection increases, the insights of this study can provide guidance on how to best utilize the added value of such data to better quantify travel injury risk and improve safety.

  13. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results, as shown in an application.

  14. SPATIAL CLUSTER AND OUTLIER IDENTIFICATION OF GEOCHEMICAL ASSOCIATION OF ELEMENTS: A CASE STUDY IN JUIRUI COPPER MINING AREA

    Directory of Open Access Journals (Sweden)

    Tien Thanh NGUYEN

    2016-12-01

    Full Text Available Spatial clusters and spatial outliers play an important role in the study of the spatial distribution patterns of geochemical data. They characterize the fundamental properties of mineralization processes, the spatial distribution of mineral deposits, and ore element concentrations in mineral districts. In this study, a new method for the study of spatial distribution patterns of multivariate data is proposed based on a combination of the robust Mahalanobis distance and local Moran's Ii. In order to construct the spatial weights matrix, the Moran's I spatial correlogram was first used to determine the range. The robust Mahalanobis distances were then computed for an association of elements. Finally, the local Moran's Ii statistic was used to measure the degree of spatial association and discover the spatial distribution patterns of associations of the Cu, Au, Mo, Ag, Pb, Zn, As, and Sb elements, including spatial clusters and spatial outliers. Spatial patterns were analyzed at six different spatial scales (2 km, 4 km, 6 km, 8 km, 10 km and 12 km) for both the raw data and Box-Cox transformed data. The results show that the spatial cluster and spatial outlier areas identified using local Moran's Ii and the robust Mahalanobis distance accord with the objective reality and conform well with known deposits in the study area.
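
    The combination of a robust Mahalanobis distance with a local spatial statistic can be sketched as follows. The Minimum Covariance Determinant estimator is used here as one common robust choice, and the sample coordinates, element concentrations, neighbourhood range, and flagging threshold are purely illustrative.

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(7)

# Hypothetical geochemical survey: sample coordinates and concentrations of 4 elements.
n = 200
coords = rng.uniform(0, 10, size=(n, 2))
elements = rng.lognormal(mean=0.0, sigma=0.4, size=(n, 4))
elements[:10] *= 3.0   # a few anomalous (mineralized) samples

# Robust Mahalanobis distances of the multi-element association (MCD estimator).
mcd = MinCovDet(random_state=0).fit(elements)
md = np.sqrt(mcd.mahalanobis(elements))   # sklearn returns squared distances

# Binary spatial weights within a fixed range, row-standardized.
search_range = 2.0
dist = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
w = ((dist > 0) & (dist <= search_range)).astype(float)
w /= np.maximum(w.sum(axis=1, keepdims=True), 1.0)

# Local Moran's I_i computed on the standardized robust distances.
z = (md - md.mean()) / md.std()
local_i = z * (w @ z)
flagged = np.abs(local_i) > 1.0           # illustrative threshold, not a significance test
print("samples flagged as spatial clusters/outliers:", int(flagged.sum()))
```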

  15. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results of weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew informations for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 ... of (unbounded) metric-adjusted skew information.

  16. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  17. A p-Adic Metric for Particle Mass Scale Organization with Genetic Divisors

    International Nuclear Information System (INIS)

    Dai, Yang; Borisov, Alexey B.; Boyer, Keith; Rhodes, Charles K.

    2001-01-01

    The concept of genetic divisors can be given a quantitative measure with a non-Archimedean p-adic metric that is both computationally convenient and physically motivated. For two particles possessing distinct mass parameters x and y, the metric distance D(x, y) is expressed on the field of rational numbers Q as the inverse of the greatest common divisor [gcd(x, y)]. As a measure of genetic similarity, this metric can be applied to (1) the mass numbers of particle states and (2) the corresponding subgroup orders of these systems. The use of the Bezout identity in the form of a congruence for the expression of the gcd(x, y) corresponding to the ν_e and ν_μ neutrinos (a) connects the genetic divisor concept to the cosmic seesaw congruence, (b) provides support for the δ-conjecture concerning the subgroup structure of particle states, and (c) quantitatively strengthens the interlocking relationships joining the values of the prospectively derived (i) electron neutrino (ν_e) mass (0.808 meV), (ii) muon neutrino (ν_μ) mass (27.68 meV), and (iii) unified strong-electroweak coupling constant (α*^-1 = 34.26).
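
    The divisor-based distance quoted above is simple to state computationally; a minimal sketch of D(x, y) = 1/gcd(x, y) on integer mass parameters is shown below (the example integers are arbitrary, and the paper's full non-Archimedean construction is not reproduced).

```python
from math import gcd
from fractions import Fraction

def divisor_distance(x, y):
    """D(x, y) = 1 / gcd(x, y): mass parameters sharing a large common divisor are 'close'."""
    return Fraction(1, gcd(x, y))

# Arbitrary integer mass parameters: a large shared divisor gives a small distance.
print(divisor_distance(840, 360))   # gcd = 120 -> 1/120
print(divisor_distance(840, 143))   # gcd = 1   -> 1
```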

  18. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  19. The metric system: An introduction

    Energy Technology Data Exchange (ETDEWEB)

    Lumley, S.M.

    1995-05-01

    On July 13, 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on July 25, 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first they examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  20. Attack-Resistant Trust Metrics

    Science.gov (United States)

    Levien, Raph

    The Internet is an amazingly powerful tool for connecting people together, unmatched in human history. Yet, with that power comes great potential for spam and abuse. Trust metrics are an attempt to compute the set of which people are trustworthy and which are likely attackers. This chapter presents two specific trust metrics developed and deployed on the Advogato Website, which is a community blog for free software developers. This real-world experience demonstrates that the trust metrics fulfilled their goals, but that for good results, it is important to match the assumptions of the abstract trust metric computation to the real-world implementation.

  1. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g_{μν} and a Killing tensor K_{μν} is studied. Conditions were found under which the symmetries of the metric g_{μν} and the dual metric K_{μν} are the same. The dual spinning space was constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric.

  2. The Graph, Geometry and Symmetries of the Genetic Code with Hamming Metric

    Directory of Open Access Journals (Sweden)

    Reijer Lenstra

    2015-07-01

    Full Text Available The similarity patterns of the genetic code result from similar codons encoding similar messages. We develop a new mathematical model to analyze these patterns. The physicochemical characteristics of amino acids objectively quantify their differences and similarities; the Hamming metric does the same for the 64 codons of the codon set. (Hamming distances equal the number of different codon positions: AAA and AAC are at 1-distance; codons are maximally at 3-distance.) The CodonPolytope, a 9-dimensional geometric object, is spanned by 64 vertices that represent the codons, and the Euclidean distances between these vertices correspond one-to-one with intercodon Hamming distances. The CodonGraph represents the vertices and edges of the polytope; each edge equals a Hamming 1-distance. The mirror reflection symmetry group of the polytope is isomorphic to the largest permutation symmetry group of the codon set that preserves Hamming distances. These groups contain 82,944 symmetries. Many polytope symmetries coincide with the degeneracy and similarity patterns of the genetic code. These code symmetries are strongly related to the face structure of the polytope, with smaller faces displaying stronger code symmetries. Splitting the polytope stepwise into smaller faces models an early evolution of the code that generates this hierarchy of code symmetries. The canonical code represents a class of 41,472 codes with equivalent symmetries; a single class among an astronomical number of symmetry classes comprising all possible codes.
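
    The Hamming metric on codons described above is easy to compute directly; the sketch below counts differing positions and enumerates the 1-distance neighbours of a codon (three alternative bases at each of three positions, hence nine neighbours per codon and 288 edges in the CodonGraph).

```python
from itertools import product

BASES = "ACGU"   # RNA codon alphabet; use "ACGT" for DNA
CODONS = ["".join(c) for c in product(BASES, repeat=3)]   # all 64 codons

def hamming(c1, c2):
    """Number of positions at which two codons differ (0 to 3)."""
    return sum(a != b for a, b in zip(c1, c2))

def neighbours(codon):
    """All codons at Hamming distance 1 (the edges of the CodonGraph)."""
    return [c for c in CODONS if hamming(codon, c) == 1]

print(hamming("AAA", "AAC"))                            # 1
print(len(neighbours("AAA")))                           # 9 neighbours per codon
print(sum(len(neighbours(c)) for c in CODONS) // 2)     # 288 edges in the CodonGraph
```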

  3. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems that these metrics have are pointed out. We should be cautious to rely on those quantitative measures too much when we evaluate journals or researchers.

  4. The variability of Scots pine from Piekielna Góra as expressed by morphological and anatomical traits of needles

    Directory of Open Access Journals (Sweden)

    Maria A. Bobowicz

    2014-01-01

    Full Text Available Two-year-old needles were collected from 30 standing Scots pine trees on Piekielna Góra. These needles were analysed with respect to 13 morphological and anatomical traits. The data so obtained were subjected to a whole range of multi-trait analytical methods in an attempt to determine the variability among the randomly chosen trees. Multivariate analysis of variance and canonical analysis were performed, Mahalanobis distances between each pair of trees were calculated, and their significance was tested with Hotelling's T2 statistic. A minimum spanning tree was constructed on the basis of the shortest Mahalanobis distances, while a dendrogram (cluster analysis) was compiled on the basis of Euclidean distances. It was found that, although the studied population sample of pines did not form internal, significantly differentiated groups, the variability among particular trees was large and depended on the given trait. The number of resin canals best differentiated the studied trees, while the Marcet coefficient did not significantly differentiate any pair of trees.

  5. Holographic Spherically Symmetric Metrics

    Science.gov (United States)

    Petri, Michael

    The holographic principle (HP) conjectures, that the maximum number of degrees of freedom of any realistic physical system is proportional to the system's boundary area. The HP has its roots in the study of black holes. It has recently been applied to cosmological solutions. In this article we apply the HP to spherically symmetric static space-times. We find that any regular spherically symmetric object saturating the HP is subject to tight constraints on the (interior) metric, energy-density, temperature and entropy-density. Whenever gravity can be described by a metric theory, gravity is macroscopically scale invariant and the laws of thermodynamics hold locally and globally, the (interior) metric of a regular holographic object is uniquely determined up to a constant factor and the interior matter-state must follow well defined scaling relations. When the metric theory of gravity is general relativity, the interior matter has an overall string equation of state (EOS) and a unique total energy-density. Thus the holographic metric derived in this article can serve as simple interior 4D realization of Mathur's string fuzzball proposal. Some properties of the holographic metric and its possible experimental verification are discussed. The geodesics of the holographic metric describe an isotropically expanding (or contracting) universe with a nearly homogeneous matter-distribution within the local Hubble volume. Due to the overall string EOS the active gravitational mass-density is zero, resulting in a coasting expansion with Ht = 1, which is compatible with the recent GRB-data.

  6. High Speed Railway Environment Safety Evaluation Based on Measurement Attribute Recognition Model

    Directory of Open Access Journals (Sweden)

    Qizhou Hu

    2014-01-01

    Full Text Available In order to rationally evaluate the operation safety level of high speed railways, an environmental safety evaluation index system for high speed railways should be established by analyzing the impact mechanisms of severe weather such as rain, thunder, lightning, earthquakes, wind, and snow. Attribute recognition is then used to determine the similarity between samples and their corresponding attribute classes in the multidimensional space, on the basis of a Mahalanobis distance measurement function, which has the advantage of being insensitive to correlations between variables and to differences in scale. On this basis, the environmental safety situation of China's high speed railways is evaluated with the suggested method. The results of the detailed analysis show that the evaluation basically matches the actual situation and could lay a scientific foundation for high speed railway operation safety.

  7. A spatial model of white sturgeon rearing habitat in the lower Columbia River, USA

    Science.gov (United States)

    Hatten, J.R.; Parsley, M.J.

    2009-01-01

    Concerns over the potential effects of in-water placement of dredged materials prompted us to develop a GIS-based model that characterizes in a spatially explicit manner white sturgeon Acipenser transmontanus rearing habitat in the lower Columbia River, USA. The spatial model was developed using water depth, riverbed slope and roughness, fish positions collected in 2002, and Mahalanobis distance (D2). We created a habitat suitability map by identifying a Mahalanobis distance under which >50% of white sturgeon locations occurred in 2002 (i.e., high-probability habitat). White sturgeon preferred relatively moderate to high water depths, and low to moderate riverbed slope and roughness values. The eigenvectors indicated that riverbed slope and roughness were slightly more important than water depth, but all three variables were important. We estimated the impacts that fill might have on sturgeon habitat by simulating the addition of fill to the thalweg, in 3-m increments, and recomputing Mahalanobis distances. Channel filling simulations revealed that up to 9 m of fill would have little impact on high-probability habitat, but 12 and 15 m of fill resulted in habitat declines of approximately 12% and 45%, respectively. This is the first spatially explicit predictive model of white sturgeon rearing habitat in the lower Columbia River, and the first to quantitatively predict the impacts of dredging operations on sturgeon habitat. Future research should consider whether water velocity improves the accuracy and specificity of the model, and to assess its applicability to other areas in the Columbia River.
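
    A minimal sketch of the mapping step described above: compute the Mahalanobis distance of every raster cell from the mean habitat vector at fish locations, then flag cells whose distance falls below the value containing 50% of the fish observations. The habitat variables, raster, and fish positions below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical raster of habitat variables: depth, slope, roughness (rows x cols x 3).
grid = np.stack([
    rng.uniform(1, 30, size=(100, 100)),   # water depth (m)
    rng.uniform(0, 10, size=(100, 100)),   # riverbed slope
    rng.uniform(0, 5, size=(100, 100)),    # riverbed roughness
], axis=-1)

# The same habitat variables sampled at (hypothetical) sturgeon positions.
fish = np.column_stack([rng.normal(15, 3, 300),
                        rng.normal(2, 0.8, 300),
                        rng.normal(1.5, 0.5, 300)])

mean_vec = fish.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(fish, rowvar=False))

# Squared Mahalanobis distance (D2) of every raster cell from the mean habitat vector.
cell_diff = grid.reshape(-1, 3) - mean_vec
cell_d2 = np.einsum("ij,jk,ik->i", cell_diff, cov_inv, cell_diff).reshape(grid.shape[:2])

# Threshold: the D2 value containing 50% of fish observations marks 'high-probability' habitat.
fish_diff = fish - mean_vec
fish_d2 = np.einsum("ij,jk,ik->i", fish_diff, cov_inv, fish_diff)
high_probability = cell_d2 <= np.median(fish_d2)
print("fraction of cells mapped as high-probability habitat:",
      round(float(high_probability.mean()), 3))
```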

  8. Metrics to describe the effects of landscape pattern on hydrology in a lotic peatland

    Science.gov (United States)

    Yuan, J.; Cohen, M. J.; Kaplan, D. A.; Acharya, S.; Larsen, L.; Nungesser, M.

    2013-12-01

    Strong reciprocal interactions exist between landscape patterns and ecological processes. Hydrology is the dominant abiotic driver of ecological processes in wetlands, particularly flowing wetlands, but it both controls and is controlled by the geometry of vegetation patterning. Landscape metrics are widely used to quantitatively link pattern and process. Our goal here was to use several candidate spatial pattern metrics to predict the effects of wetland vegetation pattern on hydrologic regime, specifically hydroperiod, in the ridge-slough patterned landscape of the Everglades. The metrics focus on the capacity for longitudinally connected flow, and thus the ability of this low-gradient patterned landscape to route water from upstream. We first explored flow friction cost (FFC), a weighted spatial distance procedure wherein ridges have a higher flow cost than sloughs by virtue of their elevation and vegetation structure, to evaluate water movement through different landscape configurations. We also investigated existing published flow metrics, specifically the Directional Connectivity Index (DCI) and Landscape Discharge Competence (LDC), that seek to quantify connectivity, one of the sentinel targets of ecological restoration. Hydroperiod was estimated using a numerical hydrologic model (SWIFT 2D) in real and synthetic landscapes with varying vegetation properties (patch anisotropy, ridge density). Synthetic landscapes were constrained by the geostatistical properties of the best-conserved patterned landscape, and contained five anisotropy levels and seven ridge density levels. These were used to construct the relationship between landscape metrics and hydroperiod. Then, using historical images from 1940 to 2004, we applied the metrics to back-cast hydroperiod. Current vegetation maps were used to test scale dependency for each metric. Our results suggest that both FFC and DCI are good predictors of hydroperiod under free flowing conditions, and that they can be used

  9. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  10. Comparison between Fisherian and Bayesian approach to ...

    African Journals Online (AJOL)

    ... of its simplicity and optimality properties is normally used for two-group cases. However, the Bayesian approach is found to be better than Fisher's approach because of its lower misclassification error rate. Keywords: variance-covariance matrices, centroids, prior probability, Mahalanobis distance, probability of misclassification ...

  11. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  12. Long-distance quantum communication over noisy networks without long-time quantum memory

    Science.gov (United States)

    Mazurek, Paweł; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Łodyga, Justyna; Pankowski, Łukasz; PrzysieŻna, Anna

    2014-12-01

    The problem of sharing entanglement over large distances is crucial for implementations of quantum cryptography. A possible scheme for long-distance entanglement sharing and quantum communication exploits networks whose nodes share Einstein-Podolsky-Rosen (EPR) pairs. In Perseguers et al. [Phys. Rev. A 78, 062324 (2008), 10.1103/PhysRevA.78.062324] the authors put forward an important isomorphism between storing quantum information in a dimension D and transmission of quantum information in a D +1 -dimensional network. We show that it is possible to obtain long-distance entanglement in a noisy two-dimensional (2D) network, even when taking into account that encoding and decoding of a state is exposed to an error. For 3D networks we propose a simple encoding and decoding scheme based solely on syndrome measurements on 2D Kitaev topological quantum memory. Our procedure constitutes an alternative scheme of state injection that can be used for universal quantum computation on 2D Kitaev code. It is shown that the encoding scheme is equivalent to teleporting the state, from a specific node into a whole two-dimensional network, through some virtual EPR pair existing within the rest of network qubits. We present an analytic lower bound on fidelity of the encoding and decoding procedure, using as our main tool a modified metric on space-time lattice, deviating from a taxicab metric at the first and the last time slices.

  13. DLA Energy Biofuel Feedstock Metrics Study

    Science.gov (United States)

    2012-12-11

    Table excerpt: Metric 1, state invasiveness ranking (moderately/highly invasive); Metric 2, genetically modified organism (GMO) hazard (Yes/No and hazard category); Metric 3, species hybridization. These hazard metrics may apply where GMO microbial or microalgae species are used across the applicable biofuel life-cycle stages 1-3; later stages include stage 4 (biofuel distribution) and stage 5 (biofuel use). The following consequence Metrics 4-6 then ...

  14. On the importance of the distance measures used to train and test knowledge-based potentials for proteins

    DEFF Research Database (Denmark)

    Carlsen, Martin; Koehl, Patrice; Røgen, Peter

    2014-01-01

    (PPD), while the other had the set of all distances filtered to reflect consistency in an ensemble of decoys (PPE). We tested four types of metric to characterize the distance between the decoy and the native structure, two based on extrinsic geometry (RMSD and GTD-TS*), and two based on intrinsic...... geometry (Q* and MT). The corresponding eight potentials were tested on a large collection of decoy sets. We found that it is usually better to train a potential using an intrinsic distance measure. We also found that PPE outperforms PPD, emphasizing the benefits of capturing consistent information...

  15. Biomechanical CT Metrics Are Associated With Patient Outcomes in COPD

    Science.gov (United States)

    Bodduluri, Sandeep; Bhatt, Surya P; Hoffman, Eric A.; Newell, John D.; Martinez, Carlos H.; Dransfield, Mark T.; Han, Meilan K.; Reinhardt, Joseph M.

    2017-01-01

    Background: Traditional metrics of lung disease such as those derived from spirometry and static single-volume CT images are used to explain respiratory morbidity in patients with chronic obstructive pulmonary disease (COPD), but are insufficient. We hypothesized that the mean Jacobian determinant, a measure of local lung expansion and contraction with respiration, would contribute independently to clinically relevant functional outcomes. Methods: We applied image registration techniques to paired inspiratory-expiratory CT scans and derived the Jacobian determinant of the deformation field between the two lung volumes to map local volume change with respiration. We analyzed 490 participants with COPD with multivariable regression models to assess the strengths of association between traditional CT metrics of disease and the Jacobian determinant with respiratory morbidity including dyspnea (mMRC), St George's Respiratory Questionnaire (SGRQ) score, six-minute walk distance (6MWD), and the BODE index, as well as all-cause mortality. Results: The Jacobian determinant was significantly associated with SGRQ (adjusted regression coefficient β = −11.75, 95% CI −21.6 to −1.7; p = 0.020), and with 6MWD (β = 321.15, 95% CI 134.1 to 508.1; p < 0.001), independent of age, sex, race, body mass index, FEV1, smoking pack-years, CT emphysema, CT gas trapping, airway wall thickness, and CT scanner protocol. The mean Jacobian determinant was also independently associated with the BODE index (β = −0.41, 95% CI −0.80 to −0.02; p = 0.039), and with mortality on follow-up (adjusted hazards ratio = 4.26, 95% CI 0.93 to 19.23; p = 0.064). Conclusion: Biomechanical metrics representing local lung expansion and contraction improve prediction of respiratory morbidity and mortality and offer additional prognostic information beyond traditional measures of lung function and static single-volume CT metrics. PMID:28044005

  16. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  17. Computational Analysis of Distance Operators for the Iterative Closest Point Algorithm.

    Directory of Open Access Journals (Sweden)

    Higinio Mora

    Full Text Available The Iterative Closest Point (ICP) algorithm is currently one of the most popular methods for rigid registration, so much so that it has become the standard in the Robotics and Computer Vision communities. Many applications take advantage of it to align 2D/3D surfaces due to its popularity and simplicity. Nevertheless, some of its phases present a high computational cost, rendering some of its applications impossible. In this work, an efficient approach for the matching phase of the Iterative Closest Point algorithm is proposed. This stage is the main bottleneck of the method, so any efficiency improvement has a great positive impact on the performance of the algorithm. The proposal consists in using low computational cost point-to-point distance metrics instead of the classic Euclidean one. The candidates analysed are the Chebyshev and Manhattan distance metrics due to their simpler formulation. The experiments carried out have validated the performance, robustness and quality of the proposal. Different experimental cases and configurations have been set up, including a heterogeneous set of 3D figures and several scenarios with partial data and random noise. The results prove that an average speed-up of 14% can be obtained while preserving the convergence properties of the algorithm and the quality of the final results.
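
    The matching phase compared in the paper pairs each source point with its closest target point under a chosen point-to-point metric. The sketch below times a brute-force version of that step for the Euclidean, Manhattan, and Chebyshev metrics; the data, the timing harness, and the vectorized implementation are illustrative and need not reproduce the speed-ups reported in the paper.

```python
import time
import numpy as np

def match_closest(source, target, metric="euclidean"):
    """Matching phase of ICP: index of the closest target point for each source point."""
    diff = source[:, None, :] - target[None, :, :]
    if metric == "euclidean":
        d = np.sqrt((diff ** 2).sum(-1))
    elif metric == "manhattan":
        d = np.abs(diff).sum(-1)
    elif metric == "chebyshev":
        d = np.abs(diff).max(-1)
    else:
        raise ValueError(metric)
    return d.argmin(axis=1)

rng = np.random.default_rng(11)
target = rng.uniform(-1, 1, size=(1000, 3))
source = target + rng.normal(scale=0.01, size=target.shape)   # slightly perturbed copy

for metric in ("euclidean", "manhattan", "chebyshev"):
    t0 = time.perf_counter()
    idx = match_closest(source, target, metric)
    elapsed = time.perf_counter() - t0
    agreement = float((idx == np.arange(len(source))).mean())
    print(f"{metric:10s} time = {elapsed:.4f} s, correct-match rate = {agreement:.3f}")
```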

  18. EDUCATEE'S THESAURUS AS AN OBJECT OF MEASURING LEARNED MATERIAL OF THE DISTANCE LEARNING COURSE

    Directory of Open Access Journals (Sweden)

    Alexander Aleksandrovich RYBANOV

    2013-10-01

    Full Text Available Monitoring and control of the process of studying a distance learning course are based on assigning the educatee an adequate integral mark, derived from testing results, for mastering the entire study course. It is suggested to use the degree of correspondence between the educatee's thesaurus and the study course thesaurus as an integral mark for the degree of mastering the distance learning course. The study course thesaurus is a set of the course objects with the relations between them specified. The article considers metrics of study course thesaurus complexity built on the basis of graph theory and information theory. It is suggested to use the amount of information contained in the study course thesaurus graph as the metric of thesaurus complexity. The educatee's thesaurus is considered as an object for measuring educational material learned at the semantic level, and is assessed on the basis of the amount of information contained in its graph, taking into account the degree to which the thesaurus objects have been learned.

  19. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2018-01-01

    The file attached to this record is the author's final peer reviewed version The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, instead of using existing metrics, n...

  20. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
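
    For orientation, a minimal sketch of the widely used bits-per-spike spatial information measure attributed to Skaggs et al.; the occupancy and rate maps are toy values, and real analyses typically add smoothing and the surrogate-based normalisation discussed above.

    ```python
    import numpy as np

    def skaggs_information(occupancy, rate_map):
        """Spatial information in bits/spike: sum_i p_i (r_i / r_mean) log2(r_i / r_mean)."""
        p = occupancy / occupancy.sum()       # occupancy probability per spatial bin
        mean_rate = np.sum(p * rate_map)      # overall mean firing rate
        ratio = rate_map / mean_rate
        valid = ratio > 0                     # treat 0 * log(0) as 0
        return np.sum(p[valid] * ratio[valid] * np.log2(ratio[valid]))

    # idealised place cell firing in one of four equally visited bins
    occupancy = np.array([1.0, 1.0, 1.0, 1.0])
    rates = np.array([8.0, 0.0, 0.0, 0.0])
    print(skaggs_information(occupancy, rates))  # 2.0 bits/spike
    ```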

  1. Generalized Painleve-Gullstrand metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lin Chunyu [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: l2891112@mail.ncku.edu.tw; Soo Chopin [Department of Physics, National Cheng Kung University, Tainan 70101, Taiwan (China)], E-mail: cpsoo@mail.ncku.edu.tw

    2009-02-02

    An obstruction to the implementation of spatially flat Painleve-Gullstrand (PG) slicings is demonstrated, and explicitly discussed for Reissner-Nordstroem and Schwarzschild-anti-deSitter spacetimes. Generalizations of PG slicings which are not spatially flat but which remain regular at the horizons are introduced. These metrics can be obtained from standard spherically symmetric metrics by physical Lorentz boosts. With these generalized PG metrics, problematic contributions to the imaginary part of the action in the Parikh-Wilczek derivation of Hawking radiation due to the obstruction can be avoided.

  2. Kerr metric in the deSitter background

    International Nuclear Information System (INIS)

    Vaidya, P.C.

    1984-01-01

    In addition to the Kerr metric with cosmological constant Λ, several other metrics are presented giving a Kerr-like solution of Einstein's equations in the background of the deSitter universe. A new metric, of what may be termed rotating deSitter space-time, devoid of matter but containing null fluid with twisting null rays, has been presented. This metric reduces to the standard deSitter metric when the twist in the rays vanishes. The Kerr metric in this background is the immediate generalization of Schwarzschild's exterior metric with cosmological constant. (author)

  3. Shape anisotropy: tensor distance to anisotropy measure

    Science.gov (United States)

    Weldeselassie, Yonas T.; El-Hilo, Saba; Atkins, M. S.

    2011-03-01

    Fractional anisotropy, defined as the distance of a diffusion tensor from its closest isotropic tensor, has been extensively studied as a quantitative anisotropy measure for diffusion tensor magnetic resonance images (DT-MRI). It has been used to reveal the white matter profile of brain images, as a guiding feature for seeding and stopping in fiber tractography, and for the diagnosis and assessment of degenerative brain diseases. Despite its extensive use in the DT-MRI community, however, not much attention has been given to the mathematical correctness of its derivation from diffusion tensors, which is achieved using the Euclidean dot product in 9D space. Recent progress in DT-MRI has shown that the space of diffusion tensors does not form a Euclidean vector space, and thus the Euclidean dot product is not appropriate for tensors. In this paper, we propose a novel and robust rotationally invariant diffusion anisotropy measure derived using the recently proposed Log-Euclidean and J-divergence tensor distance measures. An interesting finding of our work is that, given a diffusion tensor, its closest isotropic tensor is different for different tensor distance metrics used. We demonstrate qualitatively that our new anisotropy measure reveals a superior white matter profile of DT-MR brain images and show analytically that it has a higher signal to noise ratio than fractional anisotropy.
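
    A small sketch of two quantities mentioned above: classical fractional anisotropy from the tensor eigenvalues, and the Log-Euclidean distance between two diffusion tensors. The toy tensors are illustrative, and this is not the paper's proposed anisotropy measure itself.

    ```python
    import numpy as np
    from scipy.linalg import logm

    def fractional_anisotropy(D):
        """Classical FA of a 3x3 diffusion tensor, computed from its eigenvalues."""
        lam = np.linalg.eigvalsh(D)
        return np.sqrt(1.5 * np.sum((lam - lam.mean()) ** 2) / np.sum(lam ** 2))

    def log_euclidean_distance(D1, D2):
        """Log-Euclidean tensor distance: Frobenius norm of the difference of matrix logs."""
        return np.linalg.norm(logm(D1) - logm(D2), ord="fro")

    iso = 0.7e-3 * np.eye(3)                        # isotropic tensor
    prolate = np.diag([1.7e-3, 0.3e-3, 0.3e-3])     # fibre-like (prolate) tensor
    print(fractional_anisotropy(prolate))           # high anisotropy
    print(log_euclidean_distance(iso, prolate))     # dissimilarity between the two tensors
    ```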

  4. Genetic Divergence in Eucalyptus camaldulensis Progenies in the Savanna Biome in Mato Grosso, Brazil.

    Directory of Open Access Journals (Sweden)

    Reginaldo Brito da Costa

    Full Text Available Assessing the parental genetic differences and their subsequent prediction of progeny performance is an important first step to assure the efficiency of any breeding program. In this study, we estimate the genetic divergence in Eucalyptus camaldulensis based on the morphological traits of 132 progenies grown in a savanna biome. Thus, a field experiment was performed using a randomized block design and five replications to compare divergences in total height, commercial height, diameter at breast height, stem form and survival rate at 48 months. Tocher's clustering method was performed using the Mahalanobis and Euclidean distances. The Mahalanobis distance seemed more reliable for the assessed parameters and clustered all of the progenies into fourteen major groups. The most similar progenies (86 accessions were clustered into Group I, while the most dissimilar (1 progeny represented Group XIV. The divergence analysis indicated that promising crosses could be made between progenies allocated in different groups for high genetic divergence and for favorable morphological traits.
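
    To make the distance underlying the clustering concrete, here is a minimal sketch of the generalized Mahalanobis D² between progeny trait means using a pooled covariance matrix; the trait values and covariance below are hypothetical, not data from the study.

    ```python
    import numpy as np

    def mahalanobis_d2(mean_a, mean_b, pooled_cov):
        """Generalized Mahalanobis distance D^2 between two group mean vectors."""
        diff = np.asarray(mean_a) - np.asarray(mean_b)
        return float(diff @ np.linalg.inv(pooled_cov) @ diff)

    # hypothetical trait means (total height, DBH, stem form) for three progenies
    means = np.array([[12.1, 9.3, 2.4],
                      [11.4, 8.7, 2.1],
                      [14.0, 11.2, 2.9]])
    pooled_cov = np.array([[1.10, 0.40, 0.05],
                           [0.40, 0.90, 0.04],
                           [0.05, 0.04, 0.10]])

    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            print(i, j, round(mahalanobis_d2(means[i], means[j], pooled_cov), 2))
    ```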

  5. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, is properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied, which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  6. Learning a Novel Detection Metric for the Detection of O’Connell Effect Eclipsing Binaries

    Science.gov (United States)

    Johnston, Kyle; Haber, Rana; Knote, Matthew; Caballero-Nieves, Saida Maria; Peter, Adrian; Petit, Véronique

    2018-01-01

    With the advent of digital astronomy, new benefits and new challenges have been presented to the modern day astronomer. No longer can the astronomer rely on manual processing; instead, the profession as a whole has begun to adopt more advanced computational means. Here we focus on the construction and application of a novel time-domain signature extraction methodology and the development of a supporting supervised pattern detection algorithm for the targeted identification of eclipsing binaries which demonstrate a feature known as the O’Connell Effect. A methodology for the reduction of stellar variable observations (time-domain data) into Distribution Fields (DF) is presented. Push-Pull metric learning, a variant of LMNN learning, is used to generate a learned distance metric for the specific detection problem proposed. The metric will be trained on a set of labelled Kepler eclipsing binary data, in particular systems showing the O’Connell effect. Performance estimates will be presented, as well as the results of the detector applied to an unlabeled Kepler EB data set; this work is a crucial step in the upcoming era of big data from the next generation of big telescopes, such as LSST.

  7. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  8. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then, several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu’s theorem. (Author)

  9. Representing distance, consuming distance

    DEFF Research Database (Denmark)

    Larsen, Gunvor Riber

    Title: Representing Distance, Consuming Distance Abstract: Distance is a condition for corporeal and virtual mobilities, for desired and actual travel, but yet it has received relatively little attention as a theoretical entity in its own right. Understandings of and assumptions about distance...... are being consumed in the contemporary society, in the same way as places, media, cultures and status are being consumed (Urry 1995, Featherstone 2007). An exploration of distance and its representations through contemporary consumption theory could expose what role distance plays in forming...

  10. Mapping growing stock volume and forest live biomass: a case study of the Polissya region of Ukraine

    Science.gov (United States)

    Bilous, Andrii; Myroniuk, Viktor; Holiaka, Dmytrii; Bilous, Svitlana; See, Linda; Schepaschenko, Dmitry

    2017-10-01

    Forest inventory and biomass mapping are important tasks that require inputs from multiple data sources. In this paper we implement two methods for the Ukrainian region of Polissya: random forest (RF) for tree species prediction and k-nearest neighbors (k-NN) for growing stock volume and biomass mapping. We examined the suitability of the five-band RapidEye satellite image to predict the distribution of six tree species. The accuracy of RF is quite high: ~99% for forest/non-forest mask and 89% for tree species prediction. Our results demonstrate that inclusion of elevation as a predictor variable in the RF model improved the performance of tree species classification. We evaluated different distance metrics for the k-NN method, including Euclidean or Mahalanobis distance, most similar neighbor (MSN), gradient nearest neighbor, and independent component analysis. The MSN with the four nearest neighbors (k = 4) is the most precise (according to the root-mean-square deviation) for predicting forest attributes across the study area. The k-NN method allowed us to estimate growing stock volume with an accuracy of 3 m³ ha⁻¹ and live biomass with an accuracy of about 2 t ha⁻¹ over the study area.
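
    A minimal sketch of k-NN imputation with a Mahalanobis distance in feature space, in the spirit of the comparison above (k = 4 mirrors the abstract); the reference plots and spectral features are simulated, and the most-similar-neighbour weighting used in the study is not reproduced.

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    def knn_predict(X_ref, y_ref, X_new, k=4):
        """Predict a forest attribute for new feature vectors by averaging the k
        reference plots that are nearest in Mahalanobis distance."""
        VI = np.linalg.inv(np.cov(X_ref, rowvar=False))          # inverse covariance
        d = cdist(X_new, X_ref, metric="mahalanobis", VI=VI)
        nn = np.argsort(d, axis=1)[:, :k]                        # k nearest references
        return y_ref[nn].mean(axis=1)                            # unweighted k-NN estimate

    rng = np.random.default_rng(1)
    X_ref = rng.normal(size=(200, 5))                                 # e.g. five spectral bands
    y_ref = 150 + 40 * X_ref[:, 0] + rng.normal(scale=10, size=200)   # growing stock, m3/ha
    X_new = rng.normal(size=(10, 5))
    print(knn_predict(X_ref, y_ref, X_new, k=4))
    ```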

  11. White-headed woodpecker nesting ecology after wildfire

    Science.gov (United States)

    Catherine S. Wightman; Victoria A. Saab; Chris Forristal; Kim Mellen-Mclean; Amy Markus

    2010-01-01

    Within forests susceptible to wildfire and insect infestations, land managers need to balance dead tree removal and habitat requirements for wildlife species associated with snags. We used Mahalanobis distance methods to develop predictive models of white-headed woodpecker (Picoides albolarvatus) nesting habitat in postfire ponderosa pine (Pinus ponderosa)-dominated...

  12. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases, including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. These lessons learned may provide a starting point for other large engineering organizations seeking to institute a performance measurement system. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  13. Brand metrics that matter

    NARCIS (Netherlands)

    Muntinga, D.; Bernritter, S.

    2017-01-01

    The brand is becoming ever more central to the organization. It is therefore essential to measure the brand's health, performance and development. Selecting the right brand metrics is, however, a challenge. An enormous number of metrics competes for brand managers' attention. But which

  14. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) for the relation between a privacy protector and an information gatherer. The aims of such metrics are: to allow the assessment and comparison of different user scenarios and their differences; for

  15. Taylor expansion of luminosity distance in Szekeres cosmological models: effects of local structures evolution on cosmographic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Villani, Mattia, E-mail: villani@fi.infn.it [Sezione INFN di Firenze, Polo Scientifico Via Sansone 1, 50019, Sesto Fiorentino (Italy)

    2014-06-01

    We consider the Goode-Wainwright representation of the Szekeres cosmological models and calculate the Taylor expansion of the luminosity distance in order to study the effects of the inhomogeneities on cosmographic parameters. Without making a particular choice for the arbitrary functions defining the metric, we Taylor expand up to the second order in redshift for Family I and up to the third order for Family II Szekeres metrics under the hypothesis, based on observation, that local structure formation is over. In a conservative fashion, we also allow for the existence of a non-null cosmological constant.

  16. Pythagoras's theorem on a two-dimensional lattice from a 'natural' Dirac operator and Connes's distance formula

    Energy Technology Data Exchange (ETDEWEB)

    Dai Jian [Theory Group, Department of Physics, Peking University, Beijing (China)]. E-mail: jdai@mail.phy.pku.edu.cn; Song Xingchang [Theory Group, Department of Physics, Peking University, Beijing (China)]. E-mail: songxc@ibm320h.phy.pku.edu.cn

    2001-07-13

    One of the key ingredients of Connes's noncommutative geometry is a generalized Dirac operator which induces a metric (Connes's distance) on the pure state space. We generalize such a Dirac operator devised by Dimakis et al, whose Connes distance recovers the linear distance on a one-dimensional lattice, to the two-dimensional case. This Dirac operator has the local eigenvalue property and induces a Euclidean distance on this two-dimensional lattice, which is referred to as 'natural'. This kind of Dirac operator can be easily generalized to any higher-dimensional lattice. (author)

  17. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  18. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  19. Use of Linear Discriminant Function Analysis in Five Yield Sub ...

    African Journals Online (AJOL)

    K-means cluster analysis grouped the 134 accessions into four distinct groups. Pairwise Mahalanobis distance (D²) among some of the groups was highly significant. From the study the yield sub-characters pod length, pod width, peduncle length and 100-seed weight contributed most to group separation in the cowpea ...

  20. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.

  1. Energy functionals for Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Headrick, M; Nassar, A

    2013-01-01

    We identify a set of "energy" functionals on the space of metrics in a given Kähler class on a Calabi-Yau manifold, which are bounded below and minimized uniquely on the Ricci-flat metric in that class. Using these functionals, we recast the problem of numerically solving the Einstein equation as an optimization problem. We apply this strategy, using the "algebraic" metrics (metrics for which the Kähler potential is given in terms of a polynomial in the projective coordinates), to the Fermat quartic and to a one-parameter family of quintics that includes the Fermat and conifold quintics. We show that this method yields approximations to the Ricci-flat metric that are exponentially accurate in the degree of the polynomial (except at the conifold point, where the convergence is polynomial), and therefore orders of magnitude more accurate than the balanced metrics, previously studied as approximations to the Ricci-flat metric. The method is relatively fast and easy to implement. On the theoretical side, we also show that the functionals can be used to give a heuristic proof of Yau's theorem.

  2. Clustering by Partitioning around Medoids using Distance-Based Similarity Measures on Interval-Scaled Variables

    Directory of Open Access Journals (Sweden)

    D. L. Nkweteyim

    2018-03-01

    Full Text Available This paper reports the results of a study of the partitioning around medoids (PAM) clustering algorithm applied to four datasets, both standardized and not, and of varying sizes and numbers of clusters. The angular distance proximity measure, in addition to the two more traditional proximity measures, namely the Euclidean and Manhattan distances, was used to compute object-object similarity. The data used in the study comprise three widely available datasets, and one that was constructed from publicly available climate data. Results replicate some of the well-known facts about the PAM algorithm, namely that the quality of the clusters generated tends to be much better for small datasets, that the silhouette value is a good, even if not perfect, guide to the optimal number of clusters to generate, and that human intervention is required to interpret the generated clusters. Additionally, the results also indicate that the angular distance measure, which traditionally has not been widely used in clustering, outperforms both the Euclidean and Manhattan distance metrics in certain situations.
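
    The sketch below shows how the angular distance used in the study can be derived from cosine similarity and fed, as a precomputed matrix, into a simple k-medoids (Voronoi-style) approximation of PAM; the data are synthetic and this is not the authors' implementation.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    def angular_distance_matrix(X):
        """Pairwise angular distance: arccos of cosine similarity, scaled to [0, 1]."""
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        cos = np.clip(Xn @ Xn.T, -1.0, 1.0)
        return np.arccos(cos) / np.pi

    def k_medoids(D, k, n_iter=50, seed=0):
        """Small k-medoids loop on a precomputed distance matrix (PAM-like)."""
        rng = np.random.default_rng(seed)
        medoids = rng.choice(len(D), size=k, replace=False)
        for _ in range(n_iter):
            labels = np.argmin(D[:, medoids], axis=1)
            new_medoids = []
            for c in range(k):
                members = np.flatnonzero(labels == c)
                within = D[np.ix_(members, members)].sum(axis=1)
                new_medoids.append(members[np.argmin(within)])   # most central member
            new_medoids = np.array(new_medoids)
            if np.array_equal(np.sort(new_medoids), np.sort(medoids)):
                break
            medoids = new_medoids
        return labels, medoids

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(5, 1, (30, 4))])  # two synthetic clusters
    for name, D in (("euclidean", squareform(pdist(X))),
                    ("manhattan", squareform(pdist(X, "cityblock"))),
                    ("angular", angular_distance_matrix(X))):
        labels, _ = k_medoids(D, k=2)
        print(name, np.bincount(labels))
    ```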

  3. New Sensors for Cultural Heritage Metric Survey: The ToF Cameras

    Directory of Open Access Journals (Sweden)

    Filiberto Chiabrando

    2011-12-01

    Full Text Available ToF cameras are new instruments based on CCD/CMOS sensors which measure distances instead of radiometry. The resulting point clouds show the same properties (both in terms of accuracy and resolution) as the point clouds acquired by means of traditional LiDAR devices. ToF cameras are cheap instruments (less than 10,000 €) based on video real-time distance measurements and can represent an interesting alternative to the more expensive LiDAR instruments. In addition, the limited weight and dimensions of ToF cameras allow a reduction of some practical problems such as transportation and on-site management. Most of the commercial ToF cameras use the phase-shift method to measure distances. Due to the use of only one wavelength, most of them have a limited range of application (usually about 5 or 10 m). After a brief description of the main characteristics of these instruments, this paper explains and comments on the results of the first experimental applications of ToF cameras in Cultural Heritage 3D metric survey. The possibility to acquire more than 30 frames/s and future developments of these devices in terms of the use of more than one wavelength to overcome the ambiguity problem allow us to foresee new interesting applications.

  4. Detection of image structures using the Fisher information and the Rao metric.

    Science.gov (United States)

    Maybank, Stephen J

    2004-12-01

    In many detection problems, the structures to be detected are parameterized by the points of a parameter space. If the conditional probability density function for the measurements is known, then detection can be achieved by sampling the parameter space at a finite number of points and checking each point to see if the corresponding structure is supported by the data. The number of samples and the distances between neighboring samples are calculated using the Rao metric on the parameter space. The Rao metric is obtained from the Fisher information which is, in turn, obtained from the conditional probability density function. An upper bound is obtained for the probability of a false detection. The calculations are simplified in the low noise case by making an asymptotic approximation to the Fisher information. An application to line detection is described. Expressions are obtained for the asymptotic approximation to the Fisher information, the volume of the parameter space, and the number of samples. The time complexity for line detection is estimated. An experimental comparison is made with a Hough transform-based method for detecting lines.

  5. Regge calculus from discontinuous metrics

    International Nuclear Information System (INIS)

    Khatsymovsky, V.M.

    2003-01-01

    Regge calculus is considered as a particular case of the more general system where the link lengths of any two neighbouring 4-tetrahedra do not necessarily coincide on their common face. This system is treated as the one described by a metric that is discontinuous on the faces. In the superspace of all discontinuous metrics the Regge calculus metrics form some hypersurface defined by continuity conditions. Quantum theory of the discontinuous metric system is assumed to be fixed somehow in the form of a quantum measure on (the space of functionals on) the superspace. The problem of reducing this measure to the Regge hypersurface is addressed. The quantum Regge calculus measure is defined from a discontinuous metric measure by inserting the δ-function-like phase factor. The requirement that continuity conditions be imposed in a 'face-independent' way fixes this factor uniquely. The term 'face-independent' means that this factor depends only on the (hyper)plane spanned by the face, not on its form and size. This requirement seems to be natural from the viewpoint of the existence of a well-defined continuum limit maximally free of lattice artefacts.

  6. Numerical Calabi-Yau metrics

    International Nuclear Information System (INIS)

    Douglas, Michael R.; Karp, Robert L.; Lukic, Sergio; Reinbacher, Rene

    2008-01-01

    We develop numerical methods for approximating Ricci flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one parameter family of quintics. We also suggest several ways to extend our results

  7. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    Full Text Available This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector, corresponding to the identified principal directions, of the validation system is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.

  8. Comparação de métodos de agrupamento para o estudo da divergência genética em cultivares de feijão Comparison of cluster methods for the study of genetic diversity in common bean cultivars

    Directory of Open Access Journals (Sweden)

    Alberto Cargnelutti Filho

    2008-11-01

    plant, number of seeds per pod, weight of 100 grains, final population of plants, number of days from emergence to flowering, number of days from emergence to harvest, height of first pod insertion and height of the final pod insertion. Clusters based on the standardized average Euclidean distance are distinct from those formed on the basis of the Mahalanobis generalized distance. Tocher's method and the hierarchical methods of single linkage, Ward, complete linkage, median, average linkage within groups and average linkage between groups, based on the Mahalanobis generalized distance, form clusters that agree with one another. The cultivar 'Iraí' presents behavior distinct from the other cultivars.

  9. Metrics for Evaluation of Student Models

    Science.gov (United States)

    Pelanek, Radek

    2015-01-01

    Researchers use many different metrics for evaluation of performance of student models. The aim of this paper is to provide an overview of commonly used metrics, to discuss properties, advantages, and disadvantages of different metrics, to summarize current practice in educational data mining, and to provide guidance for evaluation of student…

  10. Riemannian metric optimization on surfaces (RMOS) for intrinsic brain mapping in the Laplace-Beltrami embedding space.

    Science.gov (United States)

    Gahm, Jin Kyu; Shi, Yonggang

    2018-05-01

    Surface mapping methods play an important role in various brain imaging studies from tracking the maturation of adolescent brains to mapping gray matter atrophy patterns in Alzheimer's disease. Popular surface mapping approaches based on spherical registration, however, have inherent numerical limitations when severe metric distortions are present during the spherical parameterization step. In this paper, we propose a novel computational framework for intrinsic surface mapping in the Laplace-Beltrami (LB) embedding space based on Riemannian metric optimization on surfaces (RMOS). Given a diffeomorphism between two surfaces, an isometry can be defined using the pullback metric, which in turn results in identical LB embeddings from the two surfaces. The proposed RMOS approach builds upon this mathematical foundation and achieves general feature-driven surface mapping in the LB embedding space by iteratively optimizing the Riemannian metric defined on the edges of triangular meshes. At the core of our framework is an optimization engine that converts an energy function for surface mapping into a distance measure in the LB embedding space, which can be effectively optimized using gradients of the LB eigen-system with respect to the Riemannian metrics. In the experimental results, we compare the RMOS algorithm with spherical registration using large-scale brain imaging data, and show that RMOS achieves superior performance in the prediction of hippocampal subfields and cortical gyral labels, and the holistic mapping of striatal surfaces for the construction of a striatal connectivity atlas from substantia nigra. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Development of a perceptually calibrated objective metric of noise

    Science.gov (United States)

    Keelan, Brian W.; Jin, Elaine W.; Prokushkin, Sergey

    2011-01-01

    A system simulation model was used to create scene-dependent noise masks that reflect current performance of mobile phone cameras. Stimuli with different overall magnitudes of noise and with varying mixtures of red, green, blue, and luminance noises were included in the study. Eleven treatments in each of ten pictorial scenes were evaluated by twenty observers using the softcopy ruler method. In addition to determining the quality loss function in just noticeable differences (JNDs) for the average observer and scene, transformations for different combinations of observer sensitivity and scene susceptibility were derived. The psychophysical results were used to optimize an objective metric of isotropic noise based on system noise power spectra (NPS), which were integrated over a visual frequency weighting function to yield perceptually relevant variances and covariances in CIE L*a*b* space. Because the frequency weighting function is expressed in terms of cycles per degree at the retina, it accounts for display pixel size and viewing distance effects, so application-specific predictions can be made. Excellent results were obtained using only L* and a* variances and L*a* covariance, with relative weights of 100, 5, and 12, respectively. The positive a* weight suggests that the luminance (photopic) weighting is slightly narrow on the long wavelength side for predicting perceived noisiness. The L*a* covariance term, which is normally negative, reflects masking between L* and a* noise, as confirmed in informal evaluations. Test targets in linear sRGB and rendered L*a*b* spaces for each treatment are available at http://www.aptina.com/ImArch/ to enable other researchers to test metrics of their own design and calibrate them to JNDs of quality loss without performing additional observer experiments. Such JND-calibrated noise metrics are particularly valuable for comparing the impact of noise and other attributes, and for computing overall image quality.
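
    A minimal sketch of only the final combination step reported above (relative weights 100, 5 and 12 on the L* variance, a* variance and L*a* covariance); the visual-frequency weighting of the noise power spectra and the JND calibration are omitted, and the noise images are synthetic.

    ```python
    import numpy as np

    def objective_noise_metric(L_noise, a_noise, w_L=100.0, w_a=5.0, w_La=12.0):
        """Weighted combination of L*, a* noise statistics (weights as reported)."""
        var_L = np.var(L_noise)
        var_a = np.var(a_noise)
        cov_La = np.cov(L_noise.ravel(), a_noise.ravel())[0, 1]
        return w_L * var_L + w_a * var_a + w_La * cov_La

    rng = np.random.default_rng(3)
    L_noise = rng.normal(scale=1.0, size=(64, 64))
    a_noise = 0.5 * L_noise + rng.normal(scale=0.5, size=(64, 64))  # partially correlated chroma noise
    print(objective_noise_metric(L_noise, a_noise))
    ```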

  12. A remodelling metric for angular fibre distributions and its application to diseased carotid bifurcations.

    LENUS (Irish Health Repository)

    Creane, Arthur

    2012-07-01

    Many soft biological tissues contain collagen fibres, which act as major load bearing constituents. The orientation and the dispersion of these fibres influence the macroscopic mechanical properties of the tissue and are therefore of importance in several areas of research including constitutive model development, tissue engineering and mechanobiology. Qualitative comparisons between these fibre architectures can be made using vector plots of mean orientations and contour plots of fibre dispersion, but quantitative comparison cannot be achieved using these methods. We propose a 'remodelling metric' between two angular fibre distributions, which represents the mean rotational effort required to transform one into the other. It is an adaptation of the earth mover's distance, a similarity measure between two histograms/signatures used in image analysis, which represents the minimal cost of transforming one distribution into the other by moving distribution mass around. In this paper, its utility is demonstrated by considering the change in fibre architecture during a period of plaque growth in finite element models of the carotid bifurcation. The fibre architecture is predicted using a strain-based remodelling algorithm. We investigate the remodelling metric's potential as a clinical indicator of plaque vulnerability by comparing results between symptomatic and asymptomatic carotid bifurcations. Fibre remodelling was found to occur at regions of plaque burden. As plaque thickness increased, so did the remodelling metric. A measure of the total predicted fibre remodelling during plaque growth, TRM, was found to be higher in the symptomatic group than in the asymptomatic group. Furthermore, a measure of the total fibre remodelling per plaque size, TRM/TPB, was found to be significantly higher in the symptomatic vessels. The remodelling metric may prove to be a useful tool in other soft tissues and engineered scaffolds where fibre adaptation is also present.
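
    As a rough illustration of the underlying idea, the sketch below computes a one-dimensional earth mover's distance between two binned angular fibre distributions with SciPy; the published remodelling metric additionally handles the circular nature of fibre angles and element-wise weighting, which this toy example ignores.

    ```python
    import numpy as np
    from scipy.stats import wasserstein_distance

    angles = np.linspace(-90, 90, 37)                       # bin centres in degrees

    # two hypothetical angular fibre distributions (normalised histograms)
    before = np.exp(-0.5 * (angles / 15.0) ** 2)
    after = np.exp(-0.5 * ((angles - 30.0) / 15.0) ** 2)    # mean orientation rotated by 30 deg
    before /= before.sum()
    after /= after.sum()

    # mean rotational effort (degrees) needed to morph one distribution into the other
    emd = wasserstein_distance(angles, angles, u_weights=before, v_weights=after)
    print(emd)  # close to 30 for this pure shift
    ```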

  13. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
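
    A tiny illustration of the point at issue: with hypothetical per-query times, the arithmetic and geometric means can rank two systems differently, which is exactly why the choice of summary statistic shaped the benchmark.

    ```python
    import numpy as np

    # hypothetical per-query times (seconds) for two systems on a five-query stream
    system_a = np.array([1.0, 1.0, 1.0, 1.0, 100.0])   # one very slow query
    system_b = np.array([8.0, 8.0, 8.0, 8.0, 8.0])     # uniformly mediocre

    for name, t in (("A", system_a), ("B", system_b)):
        print(name, "arithmetic:", t.mean(), "geometric:", np.exp(np.log(t).mean()))
    # arithmetic mean favours B (8.0 vs 20.8); geometric mean favours A (~2.5 vs 8.0)
    ```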

  14. DISTANCES TO DARK CLOUDS: COMPARING EXTINCTION DISTANCES TO MASER PARALLAX DISTANCES

    International Nuclear Information System (INIS)

    Foster, Jonathan B.; Jackson, James M.; Stead, Joseph J.; Hoare, Melvin G.; Benjamin, Robert A.

    2012-01-01

    We test two different methods of using near-infrared extinction to estimate distances to dark clouds in the first quadrant of the Galaxy using large near-infrared (Two Micron All Sky Survey and UKIRT Infrared Deep Sky Survey) surveys. Very long baseline interferometry parallax measurements of masers around massive young stars provide the most direct and bias-free measurement of the distance to these dark clouds. We compare the extinction distance estimates to these maser parallax distances. We also compare these distances to kinematic distances, including recent re-calibrations of the Galactic rotation curve. The extinction distance methods agree with the maser parallax distances (within the errors) between 66% and 100% of the time (depending on method and input survey) and between 85% and 100% of the time outside of the crowded Galactic center. Although the sample size is small, extinction distance methods reproduce maser parallax distances better than kinematic distances; furthermore, extinction distance methods do not suffer from the kinematic distance ambiguity. This validation gives us confidence that these extinction methods may be extended to additional dark clouds where maser parallaxes are not available.

  15. Genomic validation of the differential preservation of population history in modern human cranial anatomy.

    Science.gov (United States)

    Reyes-Centeno, Hugo; Ghirotto, Silvia; Harvati, Katerina

    2017-01-01

    In modern humans, the significant correlation between neutral genetic loci and cranial anatomy suggests that the cranium preserves a population history signature. However, there is disagreement on whether certain parts of the cranium preserve this signature to a greater degree than other parts. It is also unclear how different quantitative measures of phenotype affect the association of genetic variation and anatomy. Here, we revisit these matters by testing the correlation of genetic distances and various phenotypic distances for ten modern human populations. Geometric morphometric shape data from the crania of adult individuals (n = 224) are used to calculate phenotypic P_ST, Procrustes, and Mahalanobis distances. We calculate their correlation to neutral genetic distances, F_ST, derived from single nucleotide polymorphisms (SNPs). We subset the cranial data into landmark configurations that include the neurocranium, the face, and the temporal bone in order to evaluate whether these cranial regions are differentially correlated to neutral genetic variation. Our results show that P_ST, Mahalanobis, and Procrustes distances are correlated with F_ST distances to varying degrees. They indicate that overall cranial shape is significantly correlated with neutral genetic variation. Of the component parts examined, P_ST distances for both the temporal bone and the face have a stronger association with F_ST distances than the neurocranium. When controlling for population divergence time, only the whole cranium and the temporal bone have a statistically significant association with F_ST distances. Our results confirm that the cranium, as a whole, and the temporal bone can be used to reconstruct modern human population history. © 2016 Wiley Periodicals, Inc.
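
    A minimal sketch of the kind of matrix association used in such studies: Pearson correlation of the upper triangles of a phenotypic and a genetic distance matrix. The matrices below are random placeholders, and a full analysis would add Mantel-style permutation testing.

    ```python
    import numpy as np

    def distance_matrix_correlation(D1, D2):
        """Pearson correlation of the upper triangles of two square distance matrices."""
        iu = np.triu_indices_from(D1, k=1)
        return np.corrcoef(D1[iu], D2[iu])[0, 1]

    rng = np.random.default_rng(4)
    n = 10                                                   # e.g. ten populations
    pheno = rng.random((n, n))
    pheno = (pheno + pheno.T) / 2.0                          # symmetric "cranial" distances
    np.fill_diagonal(pheno, 0.0)
    gene = 0.7 * pheno + 0.3 * rng.random((n, n))
    gene = (gene + gene.T) / 2.0                             # correlated "F_ST-like" distances
    np.fill_diagonal(gene, 0.0)
    print(distance_matrix_correlation(pheno, gene))
    ```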

  16. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH4 and CO2, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH4 with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets

  17. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  18. Partial rectangular metric spaces and fixed point theorems.

    Science.gov (United States)

    Shukla, Satish

    2014-01-01

    The purpose of this paper is to introduce the concept of partial rectangular metric spaces as a generalization of rectangular metric and partial metric spaces. Some properties of partial rectangular metric spaces and some fixed point results for quasitype contraction in partial rectangular metric spaces are proved. Some examples are given to illustrate the observed results.

  19. The Edit Distance as a Measure of Perceived Rhythmic Similarity

    Directory of Open Access Journals (Sweden)

    Olaf Post

    2012-07-01

    Full Text Available The ‘edit distance’ (or ‘Levenshtein distance’) measure of distance between two data sets is defined as the minimum number of editing operations – insertions, deletions, and substitutions – that are required to transform one data set into the other (Orpen and Huron, 1992). This measure of distance has been applied frequently and successfully in music information retrieval, but rarely in predicting human perception of distance. In this study, we investigate the effectiveness of the edit distance as a predictor of perceived rhythmic dissimilarity under simple rhythmic alterations. Approaching rhythms as a set of pulses that are either onsets or silences, we study two types of alterations. The first experiment is designed to test the model’s accuracy for rhythms that are relatively similar; whether rhythmic variations with the same edit distance to a source rhythm are also perceived as relatively similar by human subjects. In addition, we observe whether the salience of an edit operation is affected by its metric placement in the rhythm. Instead of using a rhythm that regularly subdivides a 4/4 meter, our source rhythm is a syncopated 16-pulse rhythm, the son. Results show a high correlation between the predictions by the edit distance model and human similarity judgments (r = 0.87), a higher correlation than for the well-known generative theory of tonal music (r = 0.64). In the second experiment, we seek to assess the accuracy of the edit distance model in predicting relatively dissimilar rhythms. The stimuli used are random permutations of the son’s inter-onset intervals: 3-3-4-2-4. The results again indicate that the edit distance correlates well with the perceived rhythmic dissimilarity judgments of the subjects (r = 0.76). To gain insight into the relationships between the individual rhythms, the results are also presented by means of graphic phylogenetic trees.
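
    A minimal Levenshtein-distance sketch applied to rhythms written as strings of onsets ('x') and silences ('.'); the son pattern below uses the standard 16-pulse box notation with inter-onset intervals 3-3-4-2-4, while the comparison rhythms are illustrative and not the paper's stimuli.

    ```python
    def edit_distance(a, b):
        """Levenshtein distance: minimum number of insertions, deletions and
        substitutions needed to transform string a into string b."""
        m, n = len(a), len(b)
        dp = list(range(n + 1))
        for i in range(1, m + 1):
            prev, dp[0] = dp[0], i
            for j in range(1, n + 1):
                cur = min(dp[j] + 1,                       # deletion
                          dp[j - 1] + 1,                   # insertion
                          prev + (a[i - 1] != b[j - 1]))   # substitution (or match)
                prev, dp[j] = dp[j], cur
        return dp[n]

    # 16-pulse box notation: 'x' = onset, '.' = silence
    son      = "x..x..x...x.x..."   # inter-onset intervals 3-3-4-2-4
    shifted  = "x..x..x..x..x..."   # one onset moved by a single pulse
    permuted = "x..x...x..x.x..."   # same intervals in a different order (3-4-3-2-4)

    print(edit_distance(son, shifted), edit_distance(son, permuted))
    ```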

  20. Nonlinear partial least squares with Hellinger distance for nonlinear process monitoring

    KAUST Repository

    Harrou, Fouzi; Madakyaru, Muddu; Sun, Ying

    2017-01-01

    This paper proposes an efficient data-based anomaly detection method that can be used for monitoring nonlinear processes. The proposed method merges advantages of nonlinear projection to latent structures (NLPLS) modeling and those of Hellinger distance (HD) metric to identify abnormal changes in highly correlated multivariate data. Specifically, the HD is used to quantify the dissimilarity between current NLPLS-based residual and reference probability distributions. The performances of the developed anomaly detection using NLPLS-based HD technique is illustrated using simulated plug flow reactor data.
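
    A minimal sketch of the Hellinger distance between two discrete (binned) distributions, the dissimilarity measure applied to the model residuals above; the NLPLS modelling itself is not reproduced and the residual samples are simulated.

    ```python
    import numpy as np

    def hellinger(p, q):
        """Hellinger distance between two discrete probability distributions."""
        p = np.asarray(p, dtype=float) / np.sum(p)
        q = np.asarray(q, dtype=float) / np.sum(q)
        return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

    bins = np.linspace(-4, 4, 41)
    reference = np.histogram(np.random.default_rng(5).normal(0.0, 1.0, 5000), bins)[0]
    shifted   = np.histogram(np.random.default_rng(6).normal(1.0, 1.0, 5000), bins)[0]

    print(hellinger(reference, reference))  # 0.0: identical residual behaviour
    print(hellinger(reference, shifted))    # clearly larger: flagged as an abnormal change
    ```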

  1. Nonlinear partial least squares with Hellinger distance for nonlinear process monitoring

    KAUST Repository

    Harrou, Fouzi

    2017-02-16

    This paper proposes an efficient data-based anomaly detection method that can be used for monitoring nonlinear processes. The proposed method merges advantages of nonlinear projection to latent structures (NLPLS) modeling and those of Hellinger distance (HD) metric to identify abnormal changes in highly correlated multivariate data. Specifically, the HD is used to quantify the dissimilarity between current NLPLS-based residual and reference probability distributions. The performances of the developed anomaly detection using NLPLS-based HD technique is illustrated using simulated plug flow reactor data.

  2. Background metric in supergravity theories

    International Nuclear Information System (INIS)

    Yoneya, T.

    1978-01-01

    In supergravity theories, we investigate the conformal anomaly of the path-integral determinant and the problem of fermion zero modes in the presence of a nontrivial background metric. Except in SO(3) -invariant supergravity, there are nonvanishing conformal anomalies. As a consequence, amplitudes around the nontrivial background metric contain unpredictable arbitrariness. The fermion zero modes which are explicitly constructed for the Euclidean Schwarzschild metric are interpreted as an indication of the supersymmetric multiplet structure of a black hole. The degree of degeneracy of a black hole is 2/sup 4n/ in SO(n) supergravity

  3. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  4. INVESTIGATION AND EVALUATION OF SPATIAL PATTERNS IN TABRIZ PARKS USING LANDSCAPE METRICS

    Directory of Open Access Journals (Sweden)

    Ali Majnouni Toutakhane

    2016-01-01

    Full Text Available Nowadays, the green spaces in cities, and especially in metropolises, have adopted a variety of functions. In addition to improving environmental conditions, they are suitable places for spending free time and relieving the nervous pressures of mechanized life, depending on their distribution and dispersion in the cities. In this research, in order to study the spatial distribution and composition of the parks and green spaces in the Tabriz metropolis, the map of parks was prepared using the digital atlas of Tabriz parks and the ArcMap and IDRISI software packages. Quantitative information on the spatial patterns of Tabriz parks was then obtained using the Fragstats software and a selection of landscape metrics including: class area, patch density, percentage of landscape, average patch size, average patch area, largest patch index, landscape shape index, average Euclidean nearest-neighbor distance and average patch shape index. The spatial distribution, composition, extent and continuity of the parks were then evaluated. Overall, only 8.5 percent of the landscape is assigned to parks, and they are studied in three classes: neighborhood, district and regional parks. Neighborhood parks and green spaces have a better spatial distribution pattern compared to the other classes, and the studied metrics showed better results for this class. In contrast, the quantitative results of the metrics calculated for regional parks showed the most unfavorable spatial status among the three classes studied in Tabriz city.

  5. Test of the FLRW Metric and Curvature with Strong Lens Time Delays

    International Nuclear Information System (INIS)

    Liao, Kai; Li, Zhengxiang; Wang, Guo-Jian; Fan, Xi-Long

    2017-01-01

    We present a new model-independent strategy for testing the Friedmann–Lemaître–Robertson–Walker (FLRW) metric and constraining cosmic curvature, based on future time-delay measurements of strongly lensed quasar-elliptical galaxy systems from the Large Synoptic Survey Telescope and supernova observations from the Dark Energy Survey. The test only relies on geometric optics. It is independent of the energy contents of the universe and the validity of the Einstein equation on cosmological scales. The study comprises two levels: testing the FLRW metric through the distance sum rule (DSR) and determining/constraining cosmic curvature. We propose an effective and efficient (redshift) evolution model for performing the former test, which allows us to concretely specify the violation criterion for the FLRW DSR. If the FLRW metric is consistent with the observations, then on the second level the cosmic curvature parameter will be constrained to ∼0.057 or ∼0.041 (1 σ ), depending on the availability of high-redshift supernovae, which is much more stringent than current model-independent techniques. We also show that the bias in the time-delay method might be well controlled, leading to robust results. The proposed method is a new independent tool for both testing the fundamental assumptions of homogeneity and isotropy in cosmology and for determining cosmic curvature. It is complementary to cosmic microwave background plus baryon acoustic oscillation analyses, which normally assume a cosmological model with dark energy domination in the late-time universe.
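
    For orientation, the distance sum rule referred to above is commonly written as follows, using dimensionless comoving distances d ≡ H0 D / c between observer, lens (l) and source (s); this is the standard textbook form quoted here as an assumption, not an equation reproduced from the paper.

    ```latex
    % Distance sum rule for an FLRW metric:
    d_{ls} \;=\; d_{s}\,\sqrt{1+\Omega_{k}\,d_{l}^{2}} \;-\; d_{l}\,\sqrt{1+\Omega_{k}\,d_{s}^{2}}
    % A measured violation of this relation signals a departure from FLRW, while its
    % validity allows \Omega_k to be constrained from the triplet (d_l, d_s, d_{ls}).
    ```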

  6. Test of the FLRW Metric and Curvature with Strong Lens Time Delays

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Kai [School of Science, Wuhan University of Technology, Wuhan 430070 (China); Li, Zhengxiang; Wang, Guo-Jian [Department of Astronomy, Beijing Normal University, Beijing 100875 (China); Fan, Xi-Long, E-mail: liaokai@whut.edu.cn, E-mail: xilong.fan@glasgow.ac.uk [Department of Physics and Mechanical and Electrical Engineering, Hubei University of Education, Wuhan 430205 (China)

    2017-04-20

    We present a new model-independent strategy for testing the Friedmann–Lemaître–Robertson–Walker (FLRW) metric and constraining cosmic curvature, based on future time-delay measurements of strongly lensed quasar-elliptical galaxy systems from the Large Synoptic Survey Telescope and supernova observations from the Dark Energy Survey. The test only relies on geometric optics. It is independent of the energy contents of the universe and the validity of the Einstein equation on cosmological scales. The study comprises two levels: testing the FLRW metric through the distance sum rule (DSR) and determining/constraining cosmic curvature. We propose an effective and efficient (redshift) evolution model for performing the former test, which allows us to concretely specify the violation criterion for the FLRW DSR. If the FLRW metric is consistent with the observations, then on the second level the cosmic curvature parameter will be constrained to ∼0.057 or ∼0.041 (1 σ ), depending on the availability of high-redshift supernovae, which is much more stringent than current model-independent techniques. We also show that the bias in the time-delay method might be well controlled, leading to robust results. The proposed method is a new independent tool for both testing the fundamental assumptions of homogeneity and isotropy in cosmology and for determining cosmic curvature. It is complementary to cosmic microwave background plus baryon acoustic oscillation analyses, which normally assume a cosmological model with dark energy domination in the late-time universe.

  7. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy forms the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles heel in today's energy and technology relationship: namely, a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  8. Balanced metrics for vector bundles and polarised manifolds

    DEFF Research Database (Denmark)

    Garcia Fernandez, Mario; Ross, Julius

    2012-01-01

    We consider a notion of balanced metrics for triples (X, L, E) which depend on a parameter α, where X is a smooth complex manifold with an ample line bundle L and E is a holomorphic vector bundle over X. For generic choice of α, we prove that the limit of a convergent sequence of balanced metrics leads to a Hermitian-Einstein metric on E and a constant scalar curvature Kähler metric in c_1(L). For special values of α, limits of balanced metrics are solutions of a system of coupled equations relating a Hermitian-Einstein metric on E and a Kähler metric in c_1(L). For this, we compute the top two...

  9. Auto-associative Kernel Regression Model with Weighted Distance Metric for Instrument Drift Monitoring

    International Nuclear Information System (INIS)

    Shin, Ho Cheol; Park, Moon Ghu; You, Skin

    2006-01-01

    Recently, many on-line approaches to instrument channel surveillance (drift monitoring and fault detection) have been reported worldwide. An on-line monitoring (OLM) method evaluates instrument channel performance by assessing its consistency with other plant indications through parametric or non-parametric models. The heart of an OLM system is the model giving an estimate of the true process parameter value against individual measurements. This model gives a process parameter estimate calculated as a function of other plant measurements, which can be used to identify small sensor drifts that would require the sensor to be manually calibrated or replaced. This paper describes an improvement of auto-associative kernel regression (AAKR) by introducing a correlation coefficient weighting on kernel distances. The prediction performance of the developed method is compared with conventional auto-associative kernel regression
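
    The following Python sketch illustrates the basic AAKR estimate described above, with the paper's correlation-coefficient weighting represented only by a generic per-channel weight vector; the Gaussian kernel, bandwidth, and weighting form are assumptions rather than the authors' exact formulation.

```python
import numpy as np

def aakr_estimate(X_mem, x_query, h=1.0, channel_weights=None):
    """Auto-associative kernel regression (AAKR) estimate of the fault-free signal.

    X_mem           : (m, p) memory matrix of historical, fault-free observations
    x_query         : (p,) current measurement vector
    h               : Gaussian kernel bandwidth (assumed, not from the paper)
    channel_weights : optional per-channel weights, e.g. correlation coefficients
                      between channels, standing in for the paper's weighting idea
    """
    X_mem = np.asarray(X_mem, float)
    x_query = np.asarray(x_query, float)
    if channel_weights is None:
        channel_weights = np.ones(X_mem.shape[1])
    # Weighted Euclidean distance between the query and every memory vector
    diff = (X_mem - x_query) * np.sqrt(channel_weights)
    d = np.sqrt((diff ** 2).sum(axis=1))
    w = np.exp(-d ** 2 / (2.0 * h ** 2))      # Gaussian kernel weights
    w /= w.sum() + 1e-12
    return w @ X_mem                           # kernel-weighted estimate of all channels
```

    The drift residual for a channel is then its measurement minus the corresponding component of this estimate; a persistent bias flags a sensor for manual calibration or replacement.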

  10. The method of multispectral image processing of phytoplankton for environmental control of water pollution

    Science.gov (United States)

    Petruk, Vasil; Kvaternyuk, Sergii; Yasynska, Victoria; Kozachuk, Anastasia; Kotyra, Andrzej; Romaniuk, Ryszard S.; Askarova, Nursanat

    2015-12-01

    The paper presents an improvement of a method for environmental monitoring of water bodies based on bioindication by phytoplankton, in which phytoplankton particles are identified by comparing arrays of multispectral images using a Bayesian classifier whose decision function is based on the Mahalanobis distance. This allows complex anthropogenic and technogenic impacts on aquatic ecosystems to be evaluated objectively.
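
    As a rough illustration of a Mahalanobis-distance decision rule of the kind mentioned above, the sketch below assigns a multispectral signature to the class whose mean is nearest in squared Mahalanobis distance; with equal priors and a common covariance this coincides with the Gaussian Bayesian classifier, but it is only a simplified stand-in for the paper's actual decision function.

```python
import numpy as np

def mahalanobis_classify(train_X, train_y, query):
    """Assign a multispectral signature to the class with the smallest
    squared Mahalanobis distance to the class mean (illustrative only)."""
    best_label, best_d2 = None, np.inf
    for label in np.unique(train_y):
        Xc = train_X[train_y == label]
        mu = Xc.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(Xc, rowvar=False))  # pseudo-inverse for stability
        diff = query - mu
        d2 = float(diff @ cov_inv @ diff)
        if d2 < best_d2:
            best_label, best_d2 = label, d2
    return best_label
```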

  11. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  12. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  13. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and their attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and their metrics were celebrated as innovations that would increase advertising campaigns' overall efficiency. By 2011, this perception had changed: the profusion of metrics offered by new media, once seen as an advantage, had started to compromise their ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics

  14. Measuring Information Security: Guidelines to Build Metrics

    Science.gov (United States)

    von Faber, Eberhard

    Measuring information security is a genuine interest of security managers. With metrics they can develop their security organization's visibility and standing within the enterprise or public authority as a whole. Organizations using information technology need to use security metrics. Despite the clear demands and advantages, security metrics are often poorly developed, or ineffective parameters are collected and analysed. This paper describes best practices for the development of security metrics. First, attention is drawn to motivation, showing both requirements and benefits. The main body of this paper lists things which need to be observed (characteristics of metrics), things which can be measured (how measurements can be conducted) and steps for the development and implementation of metrics (procedures and planning). Analysis and communication are also key when using security metrics. Examples are also given in order to develop a better understanding. The author wants to resume, continue and develop the discussion about a topic which is, or increasingly will be, a critical success factor for security managers in larger organizations.

  15. Genetic divergence in the common bean (Phaseolus vulgaris L.) in the Cerrado-Pantanal ecotone.

    Science.gov (United States)

    da Silva, F A; Corrêa, A M; Teodoro, P E; Lopes, K V; Corrêa, C C G

    2017-03-30

    Evaluating genetic diversity among genotypes is important for providing parameters for the identification of superior genotypes, because the choice of parents that form segregating populations is crucial. Our objectives were to i) evaluate agronomic performance; ii) compare clustering methods; iii) ascertain the relative contributions of the variables evaluated; and iv) identify the most promising hybrids to produce superior segregating populations. The trial was conducted in 2015 at the State University of Mato Grosso do Sul, Brazil. We used a randomized block design with three replications, and recorded the days to emergence, days to flowering, days to maturity, plant height, number of branches, number of pods, number of seeds per pod, weight of 100 grains, and productivity. The genetic diversity of the genotypes was determined by cluster analysis using two dissimilarity measures, the Euclidean distance and the standardized mean Mahalanobis distance, with the Ward hierarchical method. The genotypes 'CNFC 10762', 'IAC Alvorada', and 'BRS Style' had the highest grain yields, and clusters based on the Euclidean distance differed from those based on the Mahalanobis distance, the latter being more precise. The grain yield trait made the greatest relative contribution to the divergence. Hybrids with a high heterotic effect can be obtained by crossing 'IAC Alvorada' with 'CNFC 10762', 'IAC Alvorada' with 'CNFC 10764', and 'BRS Style' with 'IAC Alvorada'.
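
    A minimal Python/SciPy sketch of the kind of comparison described (Ward clustering under Euclidean versus Mahalanobis dissimilarities) follows; the covariance estimate and any prior trait standardization are assumptions, not the authors' exact protocol.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

def ward_clusters(X, n_clusters=3, use_mahalanobis=False):
    """Ward hierarchical clustering of genotype trait means under either the
    Euclidean or the Mahalanobis dissimilarity (X: genotypes x traits)."""
    if use_mahalanobis:
        VI = np.linalg.pinv(np.cov(X, rowvar=False))   # inverse trait covariance
        D = pdist(X, metric="mahalanobis", VI=VI)
    else:
        D = pdist(X, metric="euclidean")
    # Note: Ward's method formally assumes Euclidean distances; feeding it a
    # Mahalanobis dissimilarity matrix simply mirrors the comparison in the abstract.
    Z = linkage(D, method="ward")
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```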

  16. How accurately can the peak skin dose in fluoroscopy be determined using indirect dose metrics?

    International Nuclear Information System (INIS)

    Jones, A. Kyle; Ensor, Joe E.; Pasciak, Alexander S.

    2014-01-01

    Purpose: Skin dosimetry is important for fluoroscopically-guided interventions, as peak skin doses (PSD) that result in skin reactions can be reached during these procedures. There is no consensus as to whether or not indirect skin dosimetry is sufficiently accurate for fluoroscopically-guided interventions. However, measuring PSD with film is difficult and the decision to do so must be made a priori. The purpose of this study was to assess the accuracy of different types of indirect dose estimates and to determine if PSD can be calculated within ±50% using indirect dose metrics for embolization procedures. Methods: PSD were measured directly using radiochromic film for 41 consecutive embolization procedures at two sites. Indirect dose metrics from the procedures were collected, including reference air kerma. Four different estimates of PSD were calculated from the indirect dose metrics and compared along with reference air kerma to the measured PSD for each case. The four indirect estimates included a standard calculation method, the use of detailed information from the radiation dose structured report, and two simplified calculation methods based on the standard method. Indirect dosimetry results were compared with direct measurements, including an analysis of uncertainty associated with film dosimetry. Factors affecting the accuracy of the different indirect estimates were examined. Results: When using the standard calculation method, calculated PSD were within ±35% for all 41 procedures studied. Calculated PSD were within ±50% for a simplified method using a single source-to-patient distance for all calculations. Reference air kerma was within ±50% for all but one procedure. Cases for which reference air kerma or calculated PSD exhibited large (±35%) differences from the measured PSD were analyzed, and two main causative factors were identified: unusually small or large source-to-patient distances and large contributions to reference air kerma from cone
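
    The abstract does not spell out the standard calculation method; the sketch below shows only the generic structure of an indirect PSD estimate from reference air kerma (an inverse-square distance correction times multiplicative conversion factors), with every factor value an illustrative placeholder rather than a value from the study.

```python
def estimate_psd(k_ar, d_ref=0.60, d_skin=0.60, bsf=1.35, f_tissue=1.06, t_table=0.80):
    """Generic indirect peak-skin-dose estimate from reference air kerma k_ar (Gy).

    All arguments other than k_ar are illustrative placeholders (NOT values from
    the study): d_ref is the focal-spot-to-reference-point distance, d_skin the
    actual source-to-skin distance (both in metres), bsf a backscatter factor,
    f_tissue an air-kerma-to-tissue-dose conversion factor, and t_table a
    table/pad transmission factor.
    """
    inverse_square = (d_ref / d_skin) ** 2
    return k_ar * inverse_square * bsf * f_tissue * t_table
```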

  17. Multimetric indices: How many metrics?

    Science.gov (United States)

    Multimetric indices (MMIs) often include 5 to 15 metrics, each representing a different attribute of assemblage condition, such as species diversity, tolerant taxa, and nonnative taxa. Is there an optimal number of metrics for MMIs? To explore this question, I created 1000 9-met...

  18. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
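
    As a concrete example of the segment-based evaluation discussed above, here is a hedged Python sketch of a micro-averaged segment-based F-score for polyphonic output; the one-second segment length and event-tuple format are assumptions, and the authors' toolbox should be preferred for actual evaluation.

```python
import numpy as np

def segment_based_f1(ref_events, sys_events, total_time, seg_len=1.0):
    """Micro-averaged segment-based F-score for polyphonic sound event detection.

    Events are (onset, offset, label) tuples in seconds; the timeline is cut into
    fixed-length segments and, per segment, the set of active classes in the
    reference is compared with the set in the system output.
    """
    classes = sorted({e[2] for e in ref_events} | {e[2] for e in sys_events})
    n_seg = int(np.ceil(total_time / seg_len))

    def activity(events):
        act = np.zeros((n_seg, len(classes)), dtype=bool)
        for onset, offset, label in events:
            k = classes.index(label)
            act[int(onset // seg_len):int(np.ceil(offset / seg_len)), k] = True
        return act

    R, S = activity(ref_events), activity(sys_events)
    tp, fp, fn = np.sum(R & S), np.sum(~R & S), np.sum(R & ~S)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
```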

  19. Moving from gamma passing rates to patient DVH-based QA metrics in pretreatment dose QA

    Energy Technology Data Exchange (ETDEWEB)

    Zhen, Heming; Nelms, Benjamin E.; Tome, Wolfgang A. [Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 and Canis Lupus LLC, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 and Department of Human Oncology, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-10-15

    Purpose: The purpose of this work is to explore the usefulness of the gamma passing rate metric for per-patient, pretreatment dose QA and to validate a novel patient-dose/DVH-based method and its accuracy and correlation. Specifically, correlations between: (1) gamma passing rates for three 3D dosimeter detector geometries vs clinically relevant patient DVH-based metrics; (2) gamma passing rates of whole patient dose grids vs DVH-based metrics; (3) gamma passing rates filtered by region of interest (ROI) vs DVH-based metrics; and (4) the capability of a novel software algorithm that estimates corrected patient Dose-DVH based on conventional phantom QA data are analyzed. Methods: Ninety six unique "imperfect" step-and-shoot IMRT plans were generated by applying four different types of errors on 24 clinical Head/Neck patients. The 3D patient doses as well as the dose to a cylindrical QA phantom were then recalculated using an error-free beam model to serve as a simulated measurement for comparison. Resulting deviations to the planned vs simulated measured DVH-based metrics were generated, as were gamma passing rates for a variety of difference/distance criteria covering: dose-in-phantom comparisons and dose-in-patient comparisons, with the in-patient results calculated both over the whole grid and per-ROI volume. Finally, patient dose and DVH were predicted using the conventional per-beam planar data as input into a commercial "planned dose perturbation" (PDP) algorithm, and the results of these predicted DVH-based metrics were compared to the known values. Results: A range of weak to moderate correlations were found between clinically relevant patient DVH metrics (CTV-D95, parotid D_mean, spinal cord D1cc, and larynx D_mean) and both 3D detector and 3D patient gamma passing rate (3%/3 mm, 2%/2 mm) for dose-in-phantom along with dose-in-patient for both whole patient volume and filtered per-ROI. There was

  20. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics...

  1. Common Metrics for Human-Robot Interaction

    Science.gov (United States)

    Steinfeld, Aaron; Lewis, Michael; Fong, Terrence; Scholtz, Jean; Schultz, Alan; Kaber, David; Goodrich, Michael

    2006-01-01

    This paper describes an effort to identify common metrics for task-oriented human-robot interaction (HRI). We begin by discussing the need for a toolkit of HRI metrics. We then describe the framework of our work and identify important biasing factors that must be taken into consideration. Finally, we present suggested common metrics for standardization and a case study. Preparation of a larger, more detailed toolkit is in progress.

  2. Narrowing the Gap Between QoS Metrics and Web QoE Using Above-the-fold Metrics

    OpenAIRE

    da Hora, Diego Neves; Asrese, Alemnew; Christophides, Vassilis; Teixeira, Renata; Rossi, Dario

    2018-01-01

    Page load time (PLT) is still the most common application Quality of Service (QoS) metric to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground tr...

  3. Are Current Physical Match Performance Metrics in Elite Soccer Fit for Purpose or is the Adoption of an Integrated Approach Needed?

    Science.gov (United States)

    Bradley, Paul S; Ade, Jack D

    2018-01-18

    Time-motion analysis is a valuable data-collection technique used to quantify the physical match performance of elite soccer players. For over 40 years researchers have adopted a 'traditional' approach when evaluating match demands by simply reporting the distance covered or time spent along a motion continuum of walking through to sprinting. This methodology quantifies physical metrics in isolation without integrating other factors and this ultimately leads to a one-dimensional insight into match performance. Thus, this commentary proposes a novel 'integrated' approach that focuses on a sensitive physical metric such as high-intensity running but contextualizes this in relation to key tactical activities for each position and collectively for the team. In the example presented, the 'integrated' model clearly unveils the unique high-intensity profile that exists due to distinct tactical roles, rather than one-dimensional 'blind' distances produced by 'traditional' models. Intuitively this innovative concept may aid the coaches' understanding of physical performance in relation to the tactical roles and instructions given to the players. Additionally, it will enable practitioners to more effectively translate match metrics into training and testing protocols. This innovative model may well aid advances in other team sports that incorporate similar intermittent movements with tactical purpose. Evidence of the merits and application of this new concept is needed before the scientific community accepts this model, as it may well add complexity to an area that conceivably needs simplicity.

  4. Factor structure of the Tomimatsu-Sato metrics

    International Nuclear Information System (INIS)

    Perjes, Z.

    1989-02-01

    Based on an earlier result stating that δ = 3 Tomimatsu-Sato (TS) metrics can be factored over the field of integers, an analogous representation for higher TS metrics was sought. It is shown that the factoring property of TS metrics follows from the structure of special Hankel determinants. A set of linear algebraic equations determining the factors was defined, and the factors of the first five TS metrics were tabulated, together with their primitive factors. (R.P.) 4 refs.; 2 tabs

  5. ST-intuitionistic fuzzy metric space with properties

    Science.gov (United States)

    Arora, Sahil; Kumar, Tanuj

    2017-07-01

    In this paper, we define the ST-intuitionistic fuzzy metric space and study the notions of convergence and completeness of Cauchy sequences in it. Further, we prove some properties of ST-intuitionistic fuzzy metric spaces. Finally, we introduce the concept of a symmetric ST-intuitionistic fuzzy metric space.

  6. Analysis of Camera Parameters Value in Various Object Distances Calibration

    International Nuclear Information System (INIS)

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

    In photogrammetric applications, good camera parameters are needed for mapping purposes, for example with an Unmanned Aerial Vehicle (UAV) equipped with a non-metric camera. Simple camera calibration is a common laboratory procedure for obtaining the camera parameter values. In aerial mapping, interior camera parameter values from close-range camera calibration are used to correct image errors. However, the causes and effects of the calibration steps used to obtain accurate mapping need to be analyzed. Therefore, this research contributes an analysis of camera parameters obtained with a portable calibration frame of 1.5 × 1 m. Object distances of two, three, four, five, and six meters are the research focus. Results are analyzed to find out the changes in image and camera parameter values. Hence, the calibration parameters of a camera are considered to differ depending on the type of calibration parameters and the object distance

  7. Measuring the distance from saddle points and driving to locate them over quantum control landscapes

    International Nuclear Information System (INIS)

    Sun, Qiuyang; Riviello, Gregory; Rabitz, Herschel; Wu, Re-Bing

    2015-01-01

    Optimal control of quantum phenomena involves the introduction of a cost functional J to characterize the degree of achieving a physical objective by a chosen shaped electromagnetic field. The cost functional dependence upon the control forms a control landscape. Two theoretically important canonical cases are the landscapes associated with seeking to achieve either a physical observable or a unitary transformation. Upon satisfaction of particular assumptions, both landscapes are analytically known to be trap-free, yet possess saddle points at precise suboptimal J values. The presence of saddles on the landscapes can influence the effort needed to find an optimal field. As a foundation for future algorithm development and analyses, we define metrics that identify the 'distance' from a given saddle based on the sufficient and necessary conditions for the existence of the saddles. Algorithms are introduced utilizing the metrics to find a control such that the dynamics arrive at a targeted saddle. The saddle distance metric and saddle-seeking methodology are tested numerically in several model systems. (paper)

  8. Prototypic Development and Evaluation of a Medium Format Metric Camera

    Science.gov (United States)

    Hastedt, H.; Rofallski, R.; Luhmann, T.; Rosenbauer, R.; Ochsner, D.; Rieke-Zapp, D.

    2018-05-01

    Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2-3 m in each direction) and large volumes (around 20 x 20 x 1-10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1-0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, focussing on large volume applications, the availability of a metric camera would have different advantages for several reasons: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables for an a priori camera calibration, 3) a higher resulting precision can be expected. With this article the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric, will be presented. Its general accuracy potential is tested against calibrated lengths in a small volume test environment based on the German Guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved with different scenarios having been tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements for the deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm-0.4 mm is reached for a length of 28 m (given by a distance from a lasertracker network measurement). All analyses have proven high stabilities of the interior orientation of the camera and indicate the applicability for a priori camera calibration for subsequent 3D measurements.

  9. PROTOTYPIC DEVELOPMENT AND EVALUATION OF A MEDIUM FORMAT METRIC CAMERA

    Directory of Open Access Journals (Sweden)

    H. Hastedt

    2018-05-01

    Full Text Available Engineering applications require high-precision 3D measurement techniques for object sizes that vary between small volumes (2–3 m in each direction) and large volumes (around 20 x 20 x 1–10 m). The requested precision in object space (1σ RMS) is defined to be within 0.1–0.2 mm for large volumes and less than 0.01 mm for small volumes. In particular, focussing on large volume applications, the availability of a metric camera would have different advantages for several reasons: 1) high-quality optical components and stabilisations allow for a stable interior geometry of the camera itself, 2) a stable geometry leads to a stable interior orientation that enables for an a priori camera calibration, 3) a higher resulting precision can be expected. With this article the development and accuracy evaluation of a new metric camera, the ALPA 12 FPS add|metric, will be presented. Its general accuracy potential is tested against calibrated lengths in a small volume test environment based on the German Guideline VDI/VDE 2634.1 (2002). Maximum length measurement errors of less than 0.025 mm are achieved with different scenarios having been tested. The accuracy potential for large volumes is estimated within a feasibility study on the application of photogrammetric measurements for the deformation estimation on a large wooden shipwreck in the German Maritime Museum. An accuracy of 0.2 mm–0.4 mm is reached for a length of 28 m (given by a distance from a lasertracker network measurement). All analyses have proven high stabilities of the interior orientation of the camera and indicate the applicability for a priori camera calibration for subsequent 3D measurements.

  10. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  11. Defining a Progress Metric for CERT RMM Improvement

    Science.gov (United States)

    2017-09-14

    Defining a Progress Metric for CERT-RMM Improvement, Gregory Crabb, Nader Mehravari, David Tobar, September 2017, Technical Report. ...fendable resource allocation decisions. Technical metrics measure aspects of controls implemented through technology (systems, software, hardware). ...implementation metric would be the percentage of users who have received anti-phishing training. • Effectiveness/efficiency metrics measure whether

  12. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for a variety of causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be used, historical data has to be collected, and metrics and indicators have to be computed and compared with past projects to keep failure from happening. This paper presents some metrics that can be used for IT project management.

  13. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17 % of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer when assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable more efficient transition to mass customization.

  14. Metrical Phonology: German Sound System.

    Science.gov (United States)

    Tice, Bradley S.

    Metrical phonology, a linguistic process of phonological stress assessment and diagrammatic simplification of sentence and word stress, is discussed as it is found in the English and German languages. The objective is to promote use of metrical phonology as a tool for enhancing instruction in stress patterns in words and sentences, particularly in…

  15. Construction of Einstein-Sasaki metrics in D≥7

    International Nuclear Information System (INIS)

    Lue, H.; Pope, C. N.; Vazquez-Poritz, J. F.

    2007-01-01

    We construct explicit Einstein-Kähler metrics in all even dimensions D=2n+4≥6, in terms of a 2n-dimensional Einstein-Kähler base metric. These are cohomogeneity 2 metrics which have the new feature of including a NUT-type parameter, or gravomagnetic charge, in addition to mass and rotation parameters. Using a canonical construction, these metrics all yield Einstein-Sasaki metrics in dimensions D=2n+5≥7. As is commonly the case in this type of construction, for suitable choices of the free parameters the Einstein-Sasaki metrics can extend smoothly onto complete and nonsingular manifolds, even though the underlying Einstein-Kähler metric has conical singularities. We discuss some explicit examples in the case of seven-dimensional Einstein-Sasaki spaces. These new spaces can provide supersymmetric backgrounds in M theory, which play a role in the AdS_4/CFT_3 correspondence

  16. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    Full Text Available William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008 proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.

  17. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  18. Managing distance and covariate information with point-based clustering

    Directory of Open Access Journals (Sweden)

    Peter A. Whigham

    2016-09-01

    Full Text Available Abstract Background Geographic perspectives of disease and the human condition often involve point-based observations and questions of clustering or dispersion within a spatial context. These problems involve a finite set of point observations and are constrained by a larger, but finite, set of locations where the observations could occur. Developing a rigorous method for pattern analysis in this context requires handling spatial covariates, a method for constrained finite spatial clustering, and addressing bias in geographic distance measures. An approach, based on Ripley's K and applied to the problem of clustering with deliberate self-harm (DSH), is presented. Methods Point-based Monte-Carlo simulation of Ripley's K, accounting for socio-economic deprivation and sources of distance measurement bias, was developed to estimate clustering of DSH at a range of spatial scales. A rotated Minkowski L1 distance metric allowed variation in physical distance and clustering to be assessed. Self-harm data was derived from an audit of 2 years' emergency hospital presentations (n = 136) in a New Zealand town (population ~50,000). Study area was defined by residential (housing) land parcels representing a finite set of possible point addresses. Results Area-based deprivation was spatially correlated. Accounting for deprivation and distance bias showed evidence for clustering of DSH for spatial scales up to 500 m with a one-sided 95 % CI, suggesting that social contagion may be present for this urban cohort. Conclusions Many problems involve finite locations in geographic space that require estimates of distance-based clustering at many scales. A Monte-Carlo approach to Ripley's K, incorporating covariates and models for distance bias, is crucial when assessing health-related clustering. The case study showed that social network structure defined at the neighbourhood level may account for aspects of neighbourhood clustering of DSH. Accounting for
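
    A simplified Python sketch of the constrained Monte-Carlo idea described above: cases are re-sampled from the finite set of candidate residential parcels, optionally weighted by a deprivation covariate, and Ripley's K is recomputed under a plain L1 (city-block) metric. The rotation of the Minkowski metric, edge correction, and the one-sided envelope used in the paper are omitted; all details here are assumptions.

```python
import numpy as np

def ripley_k_l1(pts, radii, area):
    """Uncorrected Ripley's K under the L1 (city-block) metric."""
    n = len(pts)
    D = np.abs(pts[:, None, :] - pts[None, :, :]).sum(-1)
    np.fill_diagonal(D, np.inf)
    return np.array([area * np.sum(D <= r) / (n * (n - 1)) for r in radii])

def k_envelope(case_idx, parcels, radii, area, weights=None, n_sim=999, seed=0):
    """Monte-Carlo envelope for K(r): cases are re-sampled from the finite set of
    candidate parcel locations, optionally weighted by a covariate such as
    area-level deprivation."""
    rng = np.random.default_rng(seed)
    k_obs = ripley_k_l1(parcels[case_idx], radii, area)
    p = None if weights is None else np.asarray(weights, float) / np.sum(weights)
    sims = np.empty((n_sim, len(radii)))
    for s in range(n_sim):
        idx = rng.choice(len(parcels), size=len(case_idx), replace=False, p=p)
        sims[s] = ripley_k_l1(parcels[idx], radii, area)
    return k_obs, np.percentile(sims, [2.5, 97.5], axis=0)
```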

  19. A newly developed dispersal metric indicates the succession of benthic invertebrates in restored rivers.

    Science.gov (United States)

    Li, Fengqing; Sundermann, Andrea; Stoll, Stefan; Haase, Peter

    2016-11-01

    Dispersal capacity plays a fundamental role in the riverine benthic invertebrate colonization of new habitats that emerges following flash floods or restoration. However, an appropriate measure of dispersal capacity for benthic invertebrates is still lacking. The dispersal of benthic invertebrates occurs mainly during the aquatic (larval) and aerial (adult) life stages, and the dispersal of each stage can be further subdivided into active and passive modes. Based on these four possible dispersal modes, we first developed a metric (which is very similar to the well-known and widely used saprobic index) to estimate the dispersal capacity for 802 benthic invertebrate taxa by incorporating a weight for each mode. Second, we tested this metric using benthic invertebrate community data from a) 23 large restored river sites with substantial improvements of river bottom habitats dating back 1 to 10 years, b) 23 unrestored sites very close to the restored sites, and c) 298 adjacent surrounding sites (mean±standard deviation: 13.0±9.5 per site) within a distance of up to 5 km for each restored site in the low mountain and lowland areas of Germany. We hypothesize that our metric will reflect the temporal succession process of benthic invertebrate communities colonizing the restored sites, whereas no temporal changes are expected in the unrestored and surrounding sites. By applying our metric to these three river treatment categories, we found that the average dispersal capacity of benthic invertebrate communities in the restored sites significantly decreased in the early years following restoration, whereas there were no changes in either the unrestored or the surrounding sites. After all taxa had been divided into quartiles representing weak to strong dispersers, this pattern became even more obvious; strong dispersers colonized the restored sites during the first year after restoration and then significantly decreased over time, whereas weak dispersers continued to increase
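
    The abstract describes the metric as a weighted combination over the four dispersal modes, in the same spirit as the saprobic index; the sketch below shows that general form with placeholder scores and weights (the paper's calibrated per-taxon values are not reproduced here).

```python
import numpy as np

def taxon_dispersal_capacity(mode_scores, mode_weights):
    """Dispersal capacity of one taxon as a weighted mean over the four modes
    (aquatic active, aquatic passive, aerial active, aerial passive);
    scores and weights here are illustrative placeholders."""
    s = np.asarray(mode_scores, float)
    w = np.asarray(mode_weights, float)
    return float((s * w).sum() / w.sum())

def community_dispersal_capacity(abundances, taxon_dc):
    """Abundance-weighted community mean of taxon dispersal capacities,
    analogous in form to the saprobic index."""
    a = np.asarray(abundances, float)
    dc = np.asarray(taxon_dc, float)
    return float((a * dc).sum() / a.sum())
```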

  20. Degraded visual environment image/video quality metrics

    Science.gov (United States)

    Baumgartner, Dustin D.; Brown, Jeremy B.; Jacobs, Eddie L.; Schachter, Bruce J.

    2014-06-01

    A number of image quality metrics (IQMs) and video quality metrics (VQMs) have been proposed in the literature for evaluating techniques and systems for mitigating degraded visual environments. Some require both pristine and corrupted imagery. Others require patterned target boards in the scene. None of these metrics relates well to the task of landing a helicopter in conditions such as a brownout dust cloud. We have developed and used a variety of IQMs and VQMs related to the pilot's ability to detect hazards in the scene and to maintain situational awareness. Some of these metrics can be made agnostic to sensor type. Not only are the metrics suitable for evaluating algorithm and sensor variation, they are also suitable for choosing the most cost effective solution to improve operating conditions in degraded visual environments.

  1. On the Newtonian limit of emergent NC gravity and long-distance corrections

    International Nuclear Information System (INIS)

    Steinacker, Harold

    2009-01-01

    We show how Newtonian gravity emerges on 4-dimensional non-commutative spacetime branes in Yang-Mills matrix models. Large matter clusters such as galaxies are embedded in large-scale harmonic deformations of the space-time brane, which screen gravity for long distances. On shorter scales, the local matter distribution reproduces Newtonian gravity via local deformations of the brane and its metric. The harmonic 'gravity bag' acts as a halo with effective positive energy density. This leads in particular to a significant enhancement of the orbital velocities around galaxies at large distances compared with the Newtonian case, before dropping to zero as the geometry merges with a Milne-like cosmology. Besides these 'harmonic' solutions, there is another class of solutions which is more similar to Einstein gravity. Thus the IKKT model provides an accessible candidate for a quantum theory of gravity.

  2. The Jacobi metric for timelike geodesics in static spacetimes

    Science.gov (United States)

    Gibbons, G. W.

    2016-01-01

    It is shown that the free motion of massive particles moving in static spacetimes is given by the geodesics of an energy-dependent Riemannian metric on the spatial sections, analogous to Jacobi's metric in classical dynamics. In the massless limit Jacobi's metric coincides with the energy-independent Fermat or optical metric. For stationary metrics, it is known that the motion of massless particles is given by the geodesics of an energy-independent Finslerian metric of Randers type. The motion of massive particles is governed by neither a Riemannian nor a Finslerian metric. The properties of the Jacobi metric for massive particles moving outside the horizon of a Schwarzschild black hole are described. By contrast with the massless case, the Gaussian curvature of the equatorial sections is not always negative.
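
    As a hedged reminder of the construction being summarized (and not a substitute for the paper's derivation), writing the static metric as ds² = −V²(x) dt² + g_ij(x) dx^i dx^j, the energy-dependent Jacobi-type metric for a particle of mass m and conserved energy E takes the form sketched below.

```latex
% Sketch: energy-dependent Jacobi-type metric on the spatial sections of a
% static spacetime ds^2 = -V^2(x)\,dt^2 + g_{ij}(x)\,dx^i dx^j
ds_J^2 \;=\; \bigl(E^2 - m^2 V^2\bigr)\,\frac{g_{ij}}{V^2}\,dx^i\,dx^j ,
% which, up to the constant factor E^2, reduces to the Fermat/optical metric
% g_{ij}/V^2 in the massless limit m -> 0.
```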

  3. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation [Trillas]. In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors that introduced a few techniques to generate metrics from indistinguishability operators and vice-versa (see, for instance, [BaetsMesiar, BaetsMesiar2]). In the last years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence [BKMatthews, Ma]. The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)

  4. Metrical connection in space-time, Newton's and Hubble's laws

    International Nuclear Information System (INIS)

    Maeder, A.

    1978-01-01

    The theory of gravitation in general relativity is not scale invariant. Here, we follow Dirac's proposition of a scale invariant theory of gravitation (i.e. a theory in which the equations keep their form when a transformation of scale is made). We examine some concepts of Weyl's geometry, like the metrical connection, the scale transformations and invariance, and we discuss their consequences for the equation of the geodetic motion and for its Newtonian limit. Under general conditions, we show that the only non-vanishing component of the coefficient of metrical connection may be identified with Hubble's constant. In this framework, the equivalent to the Newtonian approximation for the equation of motion contains an additional acceleration term H dr/dt, which produces an expansion of gravitational systems. The velocity of this expansion is shown to increase linearly with the distance between interacting objects. The relative importance of this new expansion term to the Newtonian one varies like (2ρ_c/ρ)^(1/2), where ρ_c is the critical density of the Einstein-de Sitter model and ρ is the mean density of the considered gravitational configuration. Thus, this 'generalized expansion' is important essentially for systems of mean density not too much above the critical density. Finally, our main conclusion is that in the integrable Weyl geometry, Hubble's law - like Newton's law - would appear as an intrinsic property of gravitation, being only the most visible manifestation of a general effect characterizing the gravitational interaction. (orig.)
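
    Read literally, the modified Newtonian limit described in the abstract can be transcribed as follows (an illustrative rendering, not the paper's exact equations).

```latex
% Illustrative transcription: the usual gravitational acceleration plus an
% expansion term proportional to the velocity, with the stated relative size.
\frac{d^2\vec{r}}{dt^2} \;=\; -\,\frac{GM}{r^2}\,\hat{r} \;+\; H\,\frac{d\vec{r}}{dt},
\qquad
\frac{|\,\text{expansion term}\,|}{|\,\text{Newtonian term}\,|}
  \;\sim\; \left(\frac{2\rho_c}{\rho}\right)^{1/2}.
```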

  5. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues with 87% coverage. Based

  6. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Vol. 2008, No. 37 (2008), p. 58-58. ISSN 0301-0066. [European Conference on Visual Perception, 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676. Institutional research plan: CEZ:AV0Z70250504. Keywords: visual space perception * metric and non-metric perceptual judgments * ecological validity. Subject RIV: AN - Psychology

  7. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

    Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of Earth observation systems able to provide military and government users with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  8. Deep Correlated Holistic Metric Learning for Sketch-Based 3D Shape Retrieval.

    Science.gov (United States)

    Dai, Guoxian; Xie, Jin; Fang, Yi

    2018-07-01

    How to effectively retrieve desired 3D models with simple queries is a long-standing problem in the computer vision community. The model-based approach is quite straightforward but nontrivial, since people do not always have the desired 3D query model at hand. Recently, large numbers of wide-screen electronic devices have become prevalent in our daily lives, which makes sketch-based 3D shape retrieval a promising candidate due to its simplicity and efficiency. The main challenge of the sketch-based approach is the huge modality gap between sketch and 3D shape. In this paper, we proposed a novel deep correlated holistic metric learning (DCHML) method to mitigate the discrepancy between the sketch and 3D shape domains. The proposed DCHML trains two distinct deep neural networks (one for each domain) jointly, which learns two deep nonlinear transformations to map features from both domains into a new feature space. The proposed loss, including a discriminative loss and a correlation loss, aims to increase the discrimination of features within each domain as well as the correlation between different domains. In the new feature space, the discriminative loss minimizes the intra-class distance of the deep transformed features and maximizes the inter-class distance of the deep transformed features to a large margin within each domain, while the correlation loss focuses on mitigating the distribution discrepancy across different domains. Different from existing deep metric learning methods with loss only at the output layer, our proposed DCHML is trained with loss at both the hidden layer and the output layer to further improve performance by encouraging features in the hidden layer to also have the desired properties. Our proposed method is evaluated on three benchmarks, including the 3D Shape Retrieval Contest 2013, 2014, and 2016 benchmarks, and the experimental results demonstrate the superiority of our proposed method over state-of-the-art methods.

  9. Computing the Stretch Factor of Paths, Trees, and Cycles in Weighted Fixed Orientation Metrics

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    2008-01-01

    Let G be a graph embedded in the L_1-plane. The stretch factor of G is the maximum over all pairs of distinct vertices p and q of G of the ratio L_1^G(p,q)/L_1(p,q), where L_1^G(p,q) is the L_1-distance in G between p and q. We show how to compute the stretch factor of an n-vertex path in O(n (log n)^2) worst-case time and O(n) space and we mention generalizations to trees and cycles, to general weighted fixed orientation metrics, and to higher dimensions.
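
    For reference, a brute-force implementation of the definition (not the paper's O(n (log n)^2) algorithm) looks like the following Python sketch.

```python
def l1(p, q):
    """L1 (city-block) distance between two points in the plane."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def path_stretch_factor(vertices):
    """Brute-force O(n^2) stretch factor of a path embedded in the L1-plane:
    the maximum over vertex pairs of (path L1-distance) / (direct L1-distance)."""
    n = len(vertices)
    prefix = [0.0] * n                       # prefix[i] = path length from vertex 0 to i
    for i in range(1, n):
        prefix[i] = prefix[i - 1] + l1(vertices[i - 1], vertices[i])
    stretch = 1.0
    for i in range(n):
        for j in range(i + 1, n):
            direct = l1(vertices[i], vertices[j])
            if direct > 0:
                stretch = max(stretch, (prefix[j] - prefix[i]) / direct)
    return stretch
```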

  10. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  11. Implications of Metric Choice for Common Applications of Readmission Metrics

    OpenAIRE

    Davies, Sheryl; Saynina, Olga; Schultz, Ellen; McDonald, Kathryn M; Baker, Laurence C

    2013-01-01

    Objective. To quantify the differential impact on hospital performance of three readmission metrics: all-cause readmission (ACR), 3M Potential Preventable Readmission (PPR), and Centers for Medicare and Medicaid 30-day readmission (CMS).

  12. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  13. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows the importance of different dimensions to be adjusted automatically. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
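
    A rough Python sketch of the idea follows: Nadaraya-Watson smoothing with per-dimension length scales tuned by leave-one-out cross-validation. The Gaussian kernel, the diagonal metric parameterization, and the optimizer are assumptions rather than the authors' exact algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def nw_predict(X_train, y_train, X_query, log_scales):
    """Nadaraya-Watson regression with a diagonal (per-dimension) input metric."""
    scales = np.exp(log_scales)                                  # positive length scales
    d2 = (((X_query[:, None, :] - X_train[None, :, :]) / scales) ** 2).sum(-1)
    W = np.exp(-0.5 * d2)
    W /= W.sum(axis=1, keepdims=True) + 1e-12
    return W @ y_train

def loo_error(log_scales, X, y):
    """Leave-one-out squared error, used as the cross-validation criterion."""
    err = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        pred = nw_predict(X[mask], y[mask], X[i:i + 1], log_scales)[0]
        err += (pred - y[i]) ** 2
    return err / len(y)

def fit_adaptive_metric(X, y):
    """Adapt the per-dimension scales by minimising the LOO error; a large
    learned scale means the corresponding input dimension is effectively ignored."""
    res = minimize(loo_error, x0=np.zeros(X.shape[1]), args=(X, y), method="Nelder-Mead")
    return np.exp(res.x)
```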

  14. Computations of Wall Distances Based on Differential Equations

    Science.gov (United States)

    Tucker, Paul G.; Rumsey, Chris L.; Spalart, Philippe R.; Bartels, Robert E.; Biedron, Robert T.

    2004-01-01

    The use of differential equations such as Eikonal, Hamilton-Jacobi and Poisson for the economical calculation of the nearest wall distance d, which is needed by some turbulence models, is explored. Modifications that could palliate some turbulence-modeling anomalies are also discussed. Economy is of especial value for deforming/adaptive grid problems. For these, ideally, d is repeatedly computed. It is shown that the Eikonal and Hamilton-Jacobi equations can be easy to implement when written in implicit (or iterated) advection and advection-diffusion equation analogous forms, respectively. These, like the Poisson Laplacian term, are commonly occurring in CFD solvers, allowing the re-use of efficient algorithms and code components. The use of the NASA CFL3D CFD program to solve the implicit Eikonal and Hamilton-Jacobi equations is explored. The re-formulated d equations are easy to implement, and are found to have robust convergence. For accurate Eikonal solutions, upwind metric differences are required. The Poisson approach is also found effective, and easiest to implement. Modified distances are not found to affect global outputs such as lift and drag significantly, at least in common situations such as airfoil flows.
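
    For orientation, the differential-equation formulations mentioned above can be summarized as follows; the Poisson-based reconstruction formula is one commonly used form and should be read as a sketch rather than the exact expressions used in the paper.

```latex
% Exact nearest-wall distance d from the Eikonal equation (d = 0 on walls):
|\nabla d| = 1 .
% Poisson surrogate: solve for \phi with \phi = 0 on walls,
\nabla^2 \phi = -1 ,
% then recover an approximate wall distance from \phi and its gradient:
d \;\approx\; \sqrt{\,|\nabla\phi|^{2} + 2\phi\,} \;-\; |\nabla\phi| .
```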

  15. Viscous shear in the Kerr metric

    International Nuclear Information System (INIS)

    Anderson, M.R.; Lemos, J.P.S.

    1988-01-01

    Models of viscous flows on to black holes commonly assume a zero-torque boundary condition at the radius of the last stable Keplerian orbit. It is here shown that this condition is wrong. The viscous torque is generally non-zero at both the last stable orbit and the horizon itself. The existence of a non-zero viscous torque at the horizon does not require the transfer of energy or angular momentum across any spacelike distance, and so does not violate causality. Further, in comparison with the viscous torque in the distant, Newtonian regime, the viscous torque on the horizon is often reversed, so that angular momentum is viscously advected inwards rather than outwards. This phenomenon is first suggested by an analysis of the quasi-stationary case, and then demonstrated explicitly for a series of cold, dynamical flows which fall freely from the last stable orbit in the Schwarzschild and Kerr metrics. In the steady flows constructed here, the net torque on the hole is always directed in the usual sense; any reversal in the viscous torque is offset by an increase in the convected flux of angular momentum. (author)

  16. PERMANOVA-S: association test for microbial community composition that accommodates confounders and multiple distances.

    Science.gov (United States)

    Tang, Zheng-Zheng; Chen, Guanhua; Alekseyenko, Alexander V

    2016-09-01

    Recent advances in sequencing technology have made it possible to obtain high-throughput data on the composition of microbial communities and to study the effects of dysbiosis on the human host. Analysis of pairwise intersample distances quantifies the association between the microbiome diversity and covariates of interest (e.g. environmental factors, clinical outcomes, treatment groups). In the design of these analyses, multiple choices for distance metrics are available. Most distance-based methods, however, use a single distance and are underpowered if the distance is poorly chosen. In addition, distance-based tests cannot flexibly handle confounding variables, which can result in excessive false-positive findings. We derive presence-weighted UniFrac to complement the existing UniFrac distances for more powerful detection of the variation in species richness. We develop PERMANOVA-S, a new distance-based method that tests the association of microbiome composition with any covariates of interest. PERMANOVA-S improves the commonly-used Permutation Multivariate Analysis of Variance (PERMANOVA) test by allowing flexible confounder adjustments and ensembling multiple distances. We conducted extensive simulation studies to evaluate the performance of different distances under various patterns of association. Our simulation studies demonstrate that the power of the test relies on how well the selected distance captures the nature of the association. The PERMANOVA-S unified test combines multiple distances and achieves good power regardless of the patterns of the underlying association. We demonstrate the usefulness of our approach by reanalyzing several real microbiome datasets. The miProfile software is freely available at https://medschool.vanderbilt.edu/tang-lab/software/miProfile. Contact: z.tang@vanderbilt.edu or g.chen@vanderbilt.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
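
    As a minimal sketch of the machinery behind such tests (a single distance matrix, no confounder adjustment or multi-distance ensembling, so not PERMANOVA-S itself), the pseudo-F statistic and its permutation p-value can be computed as follows; the inputs are placeholders.

        import numpy as np

        def permanova_pseudo_f(D, labels):
            """Pseudo-F statistic for a square distance matrix D and group labels."""
            labels = np.asarray(labels)
            D2 = D ** 2
            n = len(labels)
            groups = np.unique(labels)
            a = len(groups)
            # total sum of squares from all pairwise distances
            ss_total = D2[np.triu_indices(n, k=1)].sum() / n
            # within-group sum of squares
            ss_within = 0.0
            for g in groups:
                idx = np.where(labels == g)[0]
                sub = D2[np.ix_(idx, idx)]
                ss_within += sub[np.triu_indices(len(idx), k=1)].sum() / len(idx)
            ss_among = ss_total - ss_within
            return (ss_among / (a - 1)) / (ss_within / (n - a))

        def permanova_test(D, labels, n_perm=999, seed=0):
            """Permutation p-value for the association between D and the grouping."""
            rng = np.random.default_rng(seed)
            f_obs = permanova_pseudo_f(D, labels)
            count = sum(permanova_pseudo_f(D, rng.permutation(labels)) >= f_obs
                        for _ in range(n_perm))
            return f_obs, (count + 1) / (n_perm + 1)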

  17. Energy-Based Metrics for Arthroscopic Skills Assessment.

    Science.gov (United States)

    Poursartip, Behnaz; LeBel, Marie-Eve; McCracken, Laura C; Escoto, Abelardo; Patel, Rajni V; Naish, Michael D; Trejos, Ana Luisa

    2017-08-05

    Minimally invasive skills assessment methods are essential in developing efficient surgical simulators and implementing consistent skills evaluation. Although numerous methods have been investigated in the literature, there is still a need to further improve the accuracy of surgical skills assessment. Energy expenditure can be an indication of motor skills proficiency. The goals of this study are to develop objective metrics based on energy expenditure, normalize these metrics, and investigate classifying trainees using these metrics. To this end, different forms of energy consisting of mechanical energy and work were considered and their values were divided by the related value of an ideal performance to develop normalized metrics. These metrics were used as inputs for various machine learning algorithms including support vector machines (SVM) and neural networks (NNs) for classification. The accuracy of the combination of the normalized energy-based metrics with these classifiers was evaluated through a leave-one-subject-out cross-validation. The proposed method was validated using 26 subjects at two experience levels (novices and experts) in three arthroscopic tasks. The results showed that there are statistically significant differences between novices and experts for almost all of the normalized energy-based metrics. The accuracy of classification using SVM and NN methods was between 70% and 95% for the various tasks. The results show that the normalized energy-based metrics and their combination with SVM and NN classifiers are capable of providing accurate classification of trainees. The assessment method proposed in this study can enhance surgical training by providing appropriate feedback to trainees about their level of expertise and can be used in the evaluation of proficiency.
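
    A minimal sketch of the classification step described above, assuming the normalized energy-based metrics are already arranged as a feature matrix and that subject identifiers drive a leave-one-subject-out split. The feature values, labels and group sizes below are synthetic placeholders.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(52, 4))            # rows: trials, cols: normalized energy-based metrics
        y = np.repeat([0, 1], 26)               # 0 = novice, 1 = expert (synthetic labels)
        groups = np.repeat(np.arange(26), 2)    # two trials per subject

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
        print("leave-one-subject-out accuracy: %.2f" % scores.mean())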

  18. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_{ik} ≠ 0) only in the presence of matter (T_{ik} ≠ 0). This selection principle (principle of space existence, in the Markov terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric experiences the evolution into the Friedmann metric

  19. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  20. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of the space-time) may be dependent on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  1. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  2. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics are proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  3. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

    Full Text Available Emerging article-level metrics do not exclude traditional metrics based on citations to the journal, but complement them. Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article-level metrics in publishing scientific papers has been described. Article-Level Metrics (ALMs) are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic citation-based indicators, academic social networks (Mendeley, CiteULike, Delicious) and social media (Facebook, Twitter, blogs, and YouTube). The most popular tools used to apply these new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. The Journal Impact Factor (JIF) does not consider impact or influence beyond citation counts, as these counts are reflected only through Thomson Reuters’ Web of Science® database. The JIF provides an indicator related to the journal, but not to an individual published paper. Thus, altmetrics are now becoming an alternative set of metrics for performance assessment of individual scientists and their contributed scholarly publications. Macedonian scholarly publishers have to work on implementing article-level metrics in their e-journals. It is the way to increase their visibility and impact in the world of science.

  4. Noncommutative Geometry of the Moyal Plane: Translation Isometries, Connes' Distance on Coherent States, Pythagoras Equality

    Science.gov (United States)

    Martinetti, Pierre; Tomassini, Luca

    2013-10-01

    We study the metric aspect of the Moyal plane from Connes' noncommutative geometry point of view. First, we compute Connes' spectral distance associated with the natural isometric action of ℝ² on the algebra of the Moyal plane. We show that the distance between any state of the Moyal algebra and any of its translated states is precisely the amplitude of the translation. As a consequence, we obtain the spectral distance between coherent states of the quantum harmonic oscillator as the Euclidean distance on the plane. We investigate the classical limit, showing that the set of coherent states equipped with Connes' spectral distance tends towards the Euclidean plane as the parameter of deformation goes to zero. The extension of these results to the action of the symplectic group is also discussed, with particular emphasis on the orbits of coherent states under rotations. Second, we compute the spectral distance in the double Moyal plane, intended as the product of (the minimal unitization of) the Moyal algebra by a second factor. We show that on the set of states obtained by translation of an arbitrary state of the Moyal algebra, this distance is given by the Pythagoras theorem. On the way, we prove some Pythagoras inequalities for the product of arbitrary unital and non-degenerate spectral triples. Applied to the Doplicher-Fredenhagen-Roberts model of quantum spacetime [DFR], these two theorems show that Connes' spectral distance and the DFR quantum length coincide on the set of states of optimal localization.
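
    For orientation, the spectral distance referred to above is the standard Connes distance between two states phi and psi of an algebra A in a spectral triple (A, H, D):

        d(\varphi, \psi) = \sup_{a \in \mathcal{A}} \{\, |\varphi(a) - \psi(a)| \;:\; \|[D, a]\| \le 1 \,\}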

  5. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has only significance on a population level, and can not be used as a predictor for individual risk. (author)

  6. 77 FR 12832 - Non-RTO/ISO Performance Metrics; Commission Staff Request Comments on Performance Metrics for...

    Science.gov (United States)

    2012-03-02

    ... Performance Metrics; Commission Staff Request Comments on Performance Metrics for Regions Outside of RTOs and... performance communicate about the benefits of RTOs and, where appropriate, (2) changes that need to be made to... common set of performance measures for markets both within and outside of ISOs/RTOs. As recommended by...

  7. Regional Sustainability: The San Luis Basin Metrics Project

    Science.gov (United States)

    There are a number of established, scientifically supported metrics of sustainability. Many of the metrics are data intensive and require extensive effort to collect data and compute. Moreover, individual metrics may not capture all aspects of a system that are relevant to sust...

  8. Fidelity induced distance measures for quantum states

    International Nuclear Information System (INIS)

    Ma Zhihao; Zhang Fulin; Chen Jingling

    2009-01-01

    Fidelity plays an important role in quantum information theory. In this Letter, we introduce a new metric of quantum states induced by fidelity, and connect it with the well-known trace metric, sine metric and Bures metric for the qubit case. The metric character is also presented for the qudit (i.e., d-dimensional system) case. The CPT contractive property and joint convexity of the metric are also studied.
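
    For orientation, the quantities involved are usually written as follows (standard definitions; conventions for the fidelity differ by a square root between authors):

        F(\rho, \sigma) = \left( \mathrm{Tr}\, \sqrt{\sqrt{\rho}\, \sigma\, \sqrt{\rho}} \right)^{2}

        d_{\mathrm{sine}}(\rho, \sigma) = \sqrt{1 - F(\rho, \sigma)}, \qquad d_{\mathrm{Bures}}(\rho, \sigma) = \sqrt{2\left(1 - \sqrt{F(\rho, \sigma)}\right)}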

  9. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  10. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions called the Kerr-Tomimatsu-Sato class are reviewed about its derivation and properties. For a further study, an almost complete list of the papers worked on the Tomimatsu-Sato metrics is given. (Auth.)

  11. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  12. Generalized tolerance sensitivity and DEA metric sensitivity

    OpenAIRE

    Neralić, Luka; E. Wendell, Richard

    2015-01-01

    This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  13. On Nakhleh's metric for reduced phylogenetic networks

    OpenAIRE

    Cardona, Gabriel; Llabrés, Mercè; Rosselló, Francesc; Valiente Feruglio, Gabriel Alejandro

    2009-01-01

    We prove that Nakhleh’s metric for reduced phylogenetic networks is also a metric on the classes of tree-child phylogenetic networks, semibinary tree-sibling time consistent phylogenetic networks, and multilabeled phylogenetic trees. We also prove that it separates distinguishable phylogenetic networks. In this way, it becomes the strongest dissimilarity measure for phylogenetic networks available so far. Furthermore, we propose a generalization of that metric that separates arbitrary phyl...

  14. Generalized tolerance sensitivity and DEA metric sensitivity

    Directory of Open Access Journals (Sweden)

    Luka Neralić

    2015-03-01

    Full Text Available This paper considers the relationship between Tolerance sensitivity analysis in optimization and metric sensitivity analysis in Data Envelopment Analysis (DEA). Herein, we extend the results on the generalized Tolerance framework proposed by Wendell and Chen and show how this framework includes DEA metric sensitivity as a special case. Further, we note how recent results in Tolerance sensitivity suggest some possible extensions of the results in DEA metric sensitivity.

  15. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Full Text Available Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used and are most important for marketing experts in Latvia and can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the most frequently mentioned social media metrics in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which social media metrics are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media grouping framework is proposed. Conclusions: The main conclusion is that the importance and the usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  16. A comparison theorem of the Kobayashi metric and the Bergman metric on a class of Reinhardt domains

    International Nuclear Information System (INIS)

    Weiping Yin.

    1990-03-01

    A comparison theorem for the Kobayashi and Bergman metrics is given on a class of Reinhardt domains in Cⁿ. In the meantime, we obtain a class of complete invariant Kaehler metrics for these domains in the special cases. (author). 5 refs

  17. Using Activity Metrics for DEVS Simulation Profiling

    Directory of Open Access Journals (Sweden)

    Muzy A.

    2014-01-01

    Full Text Available Activity metrics can be used to profile DEVS models before and during the simulation. It is critical to get good activity metrics of models before and during their simulation. Having a means to compute a-priori activity of components (analytic activity) may be worthwhile when simulating a model (or parts of it) for the first time. Afterwards, during the simulation, the analytic activity can be corrected using the dynamic one. In this paper, we introduce the McCabe cyclomatic complexity metric (MCA) to compute analytic activity. Both static and simulation activity metrics have been implemented through a plug-in of the DEVSimPy (DEVS Simulator in Python language) environment and applied to DEVS models.

  18. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  19. Metrication: An economic wake-up call for US industry

    Science.gov (United States)

    Carver, G. P.

    1993-03-01

    As the international standard of measurement, the metric system is one key to success in the global marketplace. International standards have become an important factor in international economic competition. Non-metric products are becoming increasingly unacceptable in world markets that favor metric products. Procurement is the primary federal tool for encouraging and helping U.S. industry to convert voluntarily to the metric system. Besides the perceived unwillingness of the customer, certain regulatory language, and certain legal definitions in some states, there are no major impediments to conversion of the remaining non-metric industries to metric usage. Instead, there are good reasons for changing, including an opportunity to rethink many industry standards and to take advantage of size standardization. Also, when the remaining industries adopt the metric system, they will come into conformance with federal agencies engaged in similar activities.

  20. Conformal and related changes of metric on the product of two almost contact metric manifolds.

    OpenAIRE

    Blair, D. E.

    1990-01-01

    This paper studies conformal and related changes of the product metric on the product of two almost contact metric manifolds. It is shown that if one factor is Sasakian, the other is not, but that locally the second factor is of the type studied by Kenmotsu. The results are more general and given in terms of trans-Sasakian, α-Sasakian and β-Kenmotsu structures.

  1. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    International Nuclear Information System (INIS)

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ > 0) and the anti-de Sitter (AdS) C metric (Λ < 0), covering the cases Λ > 0, Λ = 0, and Λ < 0. In the extremal limits the geometry acquires a product form (dS₂ × S̃²): to each point in the deformed two-sphere S̃² there corresponds a dS₂ spacetime, except for one point which corresponds to a dS₂ spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these solutions mediates the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces

  2. Graev metrics on free products and HNN extensions

    DEFF Research Database (Denmark)

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to the Graev's construction of metrics on free groups over pointed...

  3. Independent component analysis classification of laser induced breakdown spectroscopy spectra

    International Nuclear Information System (INIS)

    Forni, Olivier; Maurice, Sylvestre; Gasnault, Olivier; Wiens, Roger C.; Cousin, Agnès; Clegg, Samuel M.; Sirven, Jean-Baptiste; Lasue, Jérémie

    2013-01-01

    The ChemCam instrument on board the Mars Science Laboratory (MSL) rover uses the laser-induced breakdown spectroscopy (LIBS) technique to remotely analyze Martian rocks. It retrieves spectra up to a distance of seven meters in order to quantitatively analyze the sampled rocks. Like any field application, on-site measurements by LIBS are altered by diverse matrix effects which induce signal variations that are specific to the nature of the sample. Qualitative aspects remain to be studied, particularly LIBS sample identification to determine which samples are of interest for further analysis by ChemCam and other rover instruments. This can be performed with the help of different chemometric methods that model the spectral variance in order to identify the rock from its spectrum. In this paper we test independent component analysis (ICA) rock classification by remote LIBS. We show that using measures of distance in ICA space, namely the Manhattan and the Mahalanobis distance, we can efficiently classify spectra of an unknown rock. The Mahalanobis distance gives better performance overall and is easier to manage than the Manhattan distance, for which the determination of the cut-off distance is not easy. However, these two techniques are complementary and their analytical performances will improve with time during MSL operations as the quantity of available Martian spectra grows. The analysis accuracy and performance will benefit from a combination of the two approaches. - Highlights: • We use a novel independent component analysis method to classify LIBS spectra. • We demonstrate the usefulness of ICA. • We report the performances of the ICA classification. • We compare it to other classical classification schemes
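
    A minimal sketch of the Mahalanobis-distance classification step described above, assuming each candidate rock class is summarized by the mean and covariance of the ICA scores of its training spectra; class names, score dimensions and data layout are placeholders.

        import numpy as np

        def fit_class_models(scores, labels):
            """Per-class mean and (pseudo-)inverse covariance of ICA scores."""
            labels = np.asarray(labels)
            models = {}
            for c in np.unique(labels):
                S = scores[labels == c]
                models[c] = (S.mean(axis=0), np.linalg.pinv(np.cov(S, rowvar=False)))
            return models

        def classify_mahalanobis(x, models):
            """Assign an unknown spectrum's ICA scores to the nearest class."""
            best, best_d2 = None, np.inf
            for c, (mu, inv_cov) in models.items():
                diff = x - mu
                d2 = diff @ inv_cov @ diff          # squared Mahalanobis distance
                if d2 < best_d2:
                    best, best_d2 = c, d2
            return best, best_d2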

  4. Validation of Metrics for Collaborative Systems

    OpenAIRE

    Ion IVAN; Cristian CIUREA

    2008-01-01

    This paper describes the new concepts of collaborative systems metrics validation. It defines the quality characteristics of collaborative systems, proposes a metric to estimate the quality level of collaborative systems, and reports measurements of collaborative systems quality performed using specially designed software.

  5. g-Weak Contraction in Ordered Cone Rectangular Metric Spaces

    Directory of Open Access Journals (Sweden)

    S. K. Malhotra

    2013-01-01

    Full Text Available We prove some common fixed-point theorems for ordered g-weak contractions in cone rectangular metric spaces without assuming the normality of the cone. Our results generalize some recent results from cone metric and cone rectangular metric spaces to ordered cone rectangular metric spaces. Examples are provided which illustrate the results.

  6. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories is explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories is explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy

  7. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer and are encouraged to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

  8. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  9. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  10. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper describes the new concepts of collaborative systems metrics validation. It defines the quality characteristics of collaborative systems, proposes a metric to estimate the quality level of collaborative systems, and reports measurements of collaborative systems quality performed using specially designed software.

  11. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  12. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom, otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.

  13. Explanation of Rotation Curves in Galaxies and Clusters of them, by Generalization of Schwarzschild Metric and Combination with MOND, eliminating Dark Matter

    Science.gov (United States)

    Vossos, Spyridon; Vossos, Elias

    2017-12-01

    The Schwarzschild metric is the first and most important solution of the Einstein vacuum field equations. It is associated with the Lorentz metric of flat spacetime and produces the relativistic potential (Φ) and the field strength (g) outside a spherically symmetric mass or a non-rotating black hole. It has many applications such as the gravitational red shift, the precession of Mercury’s orbit, the Shapiro time delay, etc. However, it is unable to explain the rotation curves in large galaxies and clusters of them, which creates the necessity for dark matter. On the other hand, Modified Newtonian Dynamics (MOND) has already explained these rotation curves in many cases, using a suitable interpolating function (μ) in Milgrom’s law. In this presentation, we initially produce a Generalized Schwarzschild potential and the corresponding metric of spacetime, in order to be in accordance with any isotropic metric of flat spacetime (including the Galilean metric of spacetime, which is associated with the Galilean transformation of spacetime). From this Generalized Schwarzschild potential (Φ), we calculate the corresponding field strength (g), which is associated with the interpolating function (μ). In this way, a new relativistic potential is obtained (let us call it the 2nd Generalized Schwarzschild potential) which describes the gravitational interaction at any distance and for any metric of flat spacetime. Thus, not only is the necessity for Dark Matter eliminated, but MOND also becomes a pure Relativistic Theory of Gravitational Interaction. Then, we pass to the case of flat spacetime with the Lorentz metric (Minkowski space), because the experimental data have been extracted using the Relativistic Doppler Shift and the gravitational red shift of Classic Relativity (CR). Thus, we explain the rotation curves in galaxies (e.g. NGC 3198) and clusters of them as well as the Solar system, eliminating Dark Matter. This relativistic potential and the corresponding metric of spacetime have been obtained
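
    For orientation, Milgrom's law with an interpolating function μ is commonly written as (a standard form; the specific interpolating function adopted by the authors may differ):

        \mu\!\left(\frac{g}{a_0}\right) g = g_N, \qquad \mu(x) \to 1 \ (x \gg 1), \qquad \mu(x) \to x \ (x \ll 1),

    with a frequently used choice being \mu(x) = x / \sqrt{1 + x^2}.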

  14. Construction of self-dual codes in the Rosenbloom-Tsfasman metric

    Science.gov (United States)

    Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin

    2017-12-01

    A linear code is a very basic code and very useful in coding theory. Generally, a linear code is a code over a finite field in the Hamming metric. Among the most interesting families of codes, the family of self-dual codes is a very important one, because it is among the best-known families of error-correcting codes. The concept of the Hamming metric has been developed into the Rosenbloom-Tsfasman metric (RT-metric). The inner product in the RT-metric is different from the Euclidean inner product that is used to define duality in the Hamming metric. Most of the codes which are self-dual in the Hamming metric are not so in the RT-metric. Moreover, the generator matrix is very important for constructing a code because it contains a basis of the code. Therefore, in this paper, we give some theorems and methods to construct self-dual codes in the RT-metric by considering properties of the inner product and of the generator matrix. We also illustrate some examples for each kind of construction.
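
    For orientation, the Rosenbloom-Tsfasman weight is usually defined as follows (standard definition, stated here for a single block of length s; the paper's conventions may differ in detail):

        w_{\rho}(x) = \max\{\, i : x_i \neq 0 \,\} \ \text{for } x = (x_1, \dots, x_s) \neq 0, \qquad w_{\rho}(0) = 0, \qquad d_{\rho}(x, y) = w_{\rho}(x - y),

    and it extends additively over the n blocks of a code of length ns.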

  15. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  16. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  17. In vitro selection of bacteria with potential for use as probiotics in marine shrimp culture

    Directory of Open Access Journals (Sweden)

    Felipe do Nascimento Vieira

    2013-08-01

    Full Text Available The objective of this work was to isolate strains of lactic acid bacteria with probiotic potential from the digestive tract of marine shrimp (Litopenaeus vannamei), and to carry out in vitro selection based on multiple characters. The ideotype (ideal proposed strain) was defined by the highest averages for the traits maximum growth velocity, final count of viable cells, and inhibition halo against nine freshwater and marine pathogens, and by the lowest averages for the traits duplication time and resistance of strains to NaCl (1.5 and 3%), pH (6, 8, and 9), and biliary salts (5%). Mahalanobis distance (D²) was estimated among the evaluated strains, and the best ones were those with the shortest distances to the ideotype. Ten bacterial strains were isolated and biochemically identified as Lactobacillus plantarum (3), L. brevis (3), Weissella confusa (2), Lactococcus lactis (1), and L. delbrueckii (1). Lactobacillus plantarum strains showed a wide spectrum of action and the largest inhibition halos against pathogens, both Gram-positive and Gram-negative, high growth rate, and tolerance to all evaluated parameters. In relation to the ideotype, L. plantarum showed the lowest Mahalanobis distance (D²), followed by the strains of W. confusa, L. brevis, L. lactis, and L. delbrueckii. Among the analyzed bacterial strains, those of Lactobacillus plantarum have the greatest potential for use as a probiotic for marine shrimp.
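
    A minimal sketch of the distance computation described above, ranking strains by their Mahalanobis distance D² to an ideotype vector; the trait names, values and number of strains are placeholders.

        import numpy as np

        # rows: strains, columns: standardized trait values (growth velocity,
        # viable-cell count, inhibition halo, duplication time, ...)
        traits = np.array([
            [1.8, 2.1, 1.5, 0.6],    # hypothetical L. plantarum-like strain
            [1.1, 1.4, 0.9, 1.0],
            [0.7, 0.8, 0.5, 1.4],
        ])
        ideotype = np.array([2.0, 2.2, 1.6, 0.5])    # "ideal strain" trait vector

        inv_cov = np.linalg.pinv(np.cov(traits, rowvar=False))
        d2 = np.array([(t - ideotype) @ inv_cov @ (t - ideotype) for t in traits])
        ranking = np.argsort(d2)     # shortest distance = closest to the ideotype
        print("strain ranking (best first):", ranking, "D2:", np.round(d2, 2))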

  18. MEDOF - MINIMUM EUCLIDEAN DISTANCE OPTIMAL FILTER

    Science.gov (United States)

    Barton, R. S.

    1994-01-01

    The Minimum Euclidean Distance Optimal Filter program, MEDOF, generates filters for use in optical correlators. The algorithm implemented in MEDOF follows theory put forth by Richard D. Juday of NASA/JSC. This program analytically optimizes filters on arbitrary spatial light modulators such as coupled, binary, full complex, and fractional 2pi phase. MEDOF optimizes these modulators on a number of metrics including: correlation peak intensity at the origin for the centered appearance of the reference image in the input plane, signal to noise ratio including the correlation detector noise as well as the colored additive input noise, peak to correlation energy defined as the fraction of the signal energy passed by the filter that shows up in the correlation spot, and the peak to total energy which is a generalization of PCE that adds the passed colored input noise to the input image's passed energy. The user of MEDOF supplies the functions that describe the following quantities: 1) the reference signal, 2) the realizable complex encodings of both the input and filter SLM, 3) the noise model, possibly colored, as it adds at the reference image and at the correlation detection plane, and 4) the metric to analyze, here taken to be one of the analytical ones like SNR (signal to noise ratio) or PCE (peak to correlation energy) rather than peak to secondary ratio. MEDOF calculates filters for arbitrary modulators and a wide range of metrics as described above. MEDOF examines the statistics of the encoded input image's noise (if SNR or PCE is selected) and the filter SLM's (Spatial Light Modulator) available values. These statistics are used as the basis of a range for searching for the magnitude and phase of k, a pragmatically based complex constant for computing the filter transmittance from the electric field. The filter is produced for the mesh points in those ranges and the value of the metric that results from these points is computed. When the search is concluded, the

  19. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  20. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    Science.gov (United States)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostics tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression, with inherent assumptions about the data, and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. The findings of this
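
    A minimal sketch of the analytical class of diagnostics mentioned above, computing Cook's distance for an ordinary least-squares fit with statsmodels; the data are synthetic and one deliberately influential point is injected.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 10, size=50)
        y = 2.0 + 0.5 * x + rng.normal(scale=0.5, size=50)
        y[10] += 5.0                     # inject one highly influential observation

        X = sm.add_constant(x)
        fit = sm.OLS(y, X).fit()
        cooks_d, _ = fit.get_influence().cooks_distance
        print("most influential observation:", int(np.argmax(cooks_d)),
              "Cook's D = %.2f" % cooks_d.max())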

  1. The relationship between anogenital distance, fatherhood, and fertility in adult men.

    Science.gov (United States)

    Eisenberg, Michael L; Hsieh, Michael H; Walters, Rustin Chanc; Krasnow, Ross; Lipshultz, Larry I

    2011-05-11

    Anogenital distance (AGD), a sexually dimorphic measure of genital development, is a marker for endocrine disruption in animal studies and may be shorter in infant males with genital anomalies. Given the correlation between anogenital distance and genital development, we sought to determine if anogenital distance varied in fertile compared to infertile adult men. A cross-sectional study of consecutive men being evaluated for infertility and men with proven fertility was recruited from an andrology clinic. Anogenital distance (the distance from the posterior aspect of the scrotum to the anal verge) and penile length (PL) were measured using digital calipers. ANOVA and linear regression were used to determine correlations between AGD, fatherhood status, and semen analysis parameters (sperm density, motility, and total motile sperm count). A total of 117 infertile men (mean age: 35.3±17.4) and 56 fertile men (mean age: 44.8±9.7) were recruited. The infertile men possessed significantly shorter mean AGD and PL compared to the fertile controls (AGD: 31.8 vs 44.6 mm, PL: 107.1 vs 119.5 mm). In addition to fatherhood, on both unadjusted and adjusted linear regression, AGD was significantly correlated with sperm density and total motile sperm count. After adjusting for demographic and reproductive variables, for each 1 cm increase in a man's AGD, the sperm density increases by 4.3 million sperm per mL (95% CI 0.53, 8.09, p = 0.03) and the total motile sperm count increases by 6.0 million sperm (95% CI 1.34, 10.58, p = 0.01). On adjusted analyses, no correlation was seen between penile length and semen parameters. A longer anogenital distance is associated with fatherhood and may predict normal male reproductive potential. Thus, AGD may provide a novel metric to assess reproductive potential in men.

  2. Two-dimensional manifolds with metrics of revolution

    International Nuclear Information System (INIS)

    Sabitov, I Kh

    2000-01-01

    This is a study of the topological and metric structure of two-dimensional manifolds with a metric that is locally a metric of revolution. In the case of compact manifolds this problem can be thoroughly investigated, and in particular it is explained why there are no closed analytic surfaces of revolution in R³ other than a sphere and a torus (moreover, in the smoothness class C^∞ such surfaces, understood in a certain generalized sense, exist in any topological class)

  3. Gravitational lensing in metric theories of gravity

    International Nuclear Information System (INIS)

    Sereno, Mauro

    2003-01-01

    Gravitational lensing in metric theories of gravity is discussed. I introduce a generalized approximate metric element, inclusive of both post-post-Newtonian contributions and a gravitomagnetic field. Following Fermat's principle and standard hypotheses, I derive the time delay function and deflection angle caused by an isolated mass distribution. Several astrophysical systems are considered. In most of the cases, the gravitomagnetic correction offers the best perspectives for an observational detection. Actual measurements distinguish only marginally different metric theories from each other

  4. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Vol. 69, No. 4 (2017), pp. 879-896. ISSN 0020-3157. Institutional support: RVO:67985840. Keywords: Chentsov's theorem * mixed topology * monotonicity of the Fisher metric. Subject RIV: BA - General Mathematics. OBOR OECD: Pure mathematics. Impact factor: 1.049, year: 2016. https://link.springer.com/article/10.1007%2Fs10463-016-0562-0
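
    For orientation, the Fisher metric on a parametric family p(x; θ) is the standard information metric

        g_{ij}(\theta) = \mathbb{E}_{\theta}\!\left[ \frac{\partial \log p(x;\theta)}{\partial \theta^{i}}\, \frac{\partial \log p(x;\theta)}{\partial \theta^{j}} \right],

    whose uniqueness under natural monotonicity and invariance requirements is the subject of Chentsov's theorem referenced in the record's keywords.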

  5. Hybrid metric-Palatini stars

    Science.gov (United States)

    Danilǎ, Bogdan; Harko, Tiberiu; Lobo, Francisco S. N.; Mak, M. K.

    2017-02-01

    We consider the internal structure and the physical properties of specific classes of neutron, quark and Bose-Einstein condensate stars in the recently proposed hybrid metric-Palatini gravity theory, which is a combination of the metric and Palatini f(R) formalisms. It turns out that the theory is very successful in accounting for the observed phenomenology, since it unifies local constraints at the Solar System level and the late-time cosmic acceleration, even if the scalar field is very light. In this paper, we derive the equilibrium equations for a spherically symmetric configuration (mass continuity and Tolman-Oppenheimer-Volkoff) in the framework of the scalar-tensor representation of the hybrid metric-Palatini theory, and we investigate their solutions numerically for different equations of state of neutron and quark matter, by adopting for the scalar field potential a Higgs-type form. It turns out that the scalar-tensor definition of the potential can be represented as a Clairaut differential equation, and provides an explicit form for f(R) given by f(R) ∼ R + Λeff, where Λeff is an effective cosmological constant. Furthermore, stellar models, described by the stiff fluid, radiation-like, bag model and the Bose-Einstein condensate equations of state are explicitly constructed in both general relativity and hybrid metric-Palatini gravity, thus allowing an in-depth comparison between the predictions of these two gravitational theories. As a general result it turns out that for all the considered equations of state, hybrid gravity stars are more massive than their general relativistic counterparts. Furthermore, two classes of stellar models corresponding to two particular choices of the functional form of the scalar field (constant value, and logarithmic form, respectively) are also investigated. Interestingly enough, in the case of a constant scalar field the equation of state of the matter takes the form of the bag model equation of state describing

  6. The universal connection and metrics on moduli spaces

    International Nuclear Information System (INIS)

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c₂ = 1 SU(2) moduli space of instantons on R⁴ for various universal matrices. (author)

  7. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC=0.86), global efficiency (ICC=0.83), path length (ICC=0.79), and local efficiency (ICC=0.75); the ICC score for degree was found to be low (ICC=0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.

  8. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  9. Sustainability Metrics: The San Luis Basin Project

    Science.gov (United States)

    Sustainability is about promoting humanly desirable dynamic regimes of the environment. Metrics: ecological footprint, net regional product, exergy, emergy, and Fisher Information. Adaptive management: (1) metrics assess problem, (2) specific problem identified, and (3) managemen...

  10. Goedel-type metrics in various dimensions

    International Nuclear Information System (INIS)

    Guerses, Metin; Karasu, Atalay; Sarioglu, Oezguer

    2005-01-01

    Goedel-type metrics are introduced and used in producing charged dust solutions in various dimensions. The key ingredient is a (D - 1)-dimensional Riemannian geometry which is then employed in constructing solutions to the Einstein-Maxwell field equations with a dust distribution in D dimensions. The only essential field equation in the procedure turns out to be the source-free Maxwell's equation in the relevant background. Similarly the geodesics of this type of metric are described by the Lorentz force equation for a charged particle in the lower dimensional geometry. It is explicitly shown with several examples that Goedel-type metrics can be used in obtaining exact solutions to various supergravity theories and in constructing spacetimes that contain both closed timelike and closed null curves and that contain neither of these. Among the solutions that can be established using non-flat backgrounds, such as the Tangherlini metrics in (D - 1)-dimensions, there exists a class which can be interpreted as describing black-hole-type objects in a Goedel-like universe

  11. Standardised metrics for global surgical surveillance.

    Science.gov (United States)

    Weiser, Thomas G; Makary, Martin A; Haynes, Alex B; Dziekan, Gerald; Berry, William R; Gawande, Atul A

    2009-09-26

    Public health surveillance relies on standardised metrics to evaluate disease burden and health system performance. Such metrics have not been developed for surgical services despite increasing volume, substantial cost, and high rates of death and disability associated with surgery. The Safe Surgery Saves Lives initiative of WHO's Patient Safety Programme has developed standardised public health metrics for surgical care that are applicable worldwide. We assembled an international panel of experts to develop and define metrics for measuring the magnitude and effect of surgical care in a population, while taking into account economic feasibility and practicability. This panel recommended six measures for assessing surgical services at a national level: number of operating rooms, number of operations, number of accredited surgeons, number of accredited anaesthesia professionals, day-of-surgery death ratio, and postoperative in-hospital death ratio. We assessed the feasibility of gathering such statistics at eight diverse hospitals in eight countries and incorporated them into the WHO Guidelines for Safe Surgery, in which methods for data collection, analysis, and reporting are outlined.

  12. Developing a Security Metrics Scorecard for Healthcare Organizations.

    Science.gov (United States)

    Elrefaey, Heba; Borycki, Elizabeth; Kushniruk, Andrea

    2015-01-01

    In healthcare, information security is a key aspect of protecting a patient's privacy and ensuring systems availability to support patient care. Security managers need to measure the performance of security systems and this can be achieved by using evidence-based metrics. In this paper, we describe the development of an evidence-based security metrics scorecard specific to healthcare organizations. Study participants were asked to comment on the usability and usefulness of a prototype of a security metrics scorecard that was developed based on current research in the area of general security metrics. Study findings revealed that scorecards need to be customized for the healthcare setting in order for the security information to be useful and usable in healthcare organizations. The study findings resulted in the development of a security metrics scorecard that matches the healthcare security experts' information requirements.

  13. Landscape pattern metrics and regional assessment

    Science.gov (United States)

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  14. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  15. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    Full Text Available There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  16. Predicting class testability using object-oriented metrics

    OpenAIRE

    Bruntink, Magiel; Deursen, Arie

    2004-01-01

    textabstractIn this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated by means of two case studies of large Java systems for which JUnit test cases exist. The goal of this paper is to define and evaluate a set of metrics that can be used to assess the testability of t...

  17. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant

  18. Hermitian-Einstein metrics on parabolic stable bundles

    International Nuclear Information System (INIS)

    Li Jiayu; Narasimhan, M.S.

    1995-12-01

    Let M-bar be a compact complex manifold of complex dimension two with a smooth Kaehler metric and D a smooth divisor on M-bar. If E is a rank 2 holomorphic vector bundle on M-bar with a stable parabolic structure along D, we prove the existence of a metric on E' = E restricted to M-bar \ D (compatible with the parabolic structure) which is Hermitian-Einstein with respect to the restriction of the Kaehler metric to M-bar \ D. A converse is also proved. (author). 24 refs

  19. Habitat models to assist plant protection efforts in Shenandoah National Park, Virginia, USA

    Science.gov (United States)

    Van Manen, F.T.; Young, J.A.; Thatcher, C.A.; Cass, W.B.; Ulrey, C.

    2005-01-01

    During 2002, the National Park Service initiated a demonstration project to develop science-based law enforcement strategies for the protection of at-risk natural resources, including American ginseng (Panax quinquefolius L.), bloodroot (Sanguinaria canadensis L.), and black cohosh (Cimicifuga racemosa (L.) Nutt. [syn. Actaea racemosa L.]). Harvest pressure on these species is increasing because of the growing herbal remedy market. We developed habitat models for Shenandoah National Park and the northern portion of the Blue Ridge Parkway to determine the distribution of favorable habitats of these three plant species and to demonstrate the use of that information to support plant protection activities. We compiled locations for the three plant species to delineate favorable habitats with a geographic information system (GIS). We mapped potential habitat quality for each species by calculating a multivariate statistic, Mahalanobis distance, based on GIS layers that characterized the topography, land cover, and geology of the plant locations (10-m resolution). We tested model performance with an independent dataset of plant locations, which indicated a significant relationship between Mahalanobis distance values and species occurrence. We also generated null models by examining the distribution of the Mahalanobis distance values had plants been distributed randomly. For all species, the habitat models performed markedly better than their respective null models. We used our models to direct field searches to the most favorable habitats, resulting in a sizeable number of new plant locations (82 ginseng, 73 bloodroot, and 139 black cohosh locations). The odds of finding new plant locations based on the habitat models were 4.5 (black cohosh) to 12.3 (American ginseng) times greater than random searches; thus, the habitat models can be used to improve the efficiency of plant protection efforts (e.g., marking of plants, law enforcement activities). The field searches also
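
    The mapping approach described above ranks every grid cell by its Mahalanobis distance to the mean habitat vector of known plant locations. The following Python sketch illustrates that general computation on hypothetical environmental layers; the variable names, layer choices, and chi-square-based favorability score are illustrative assumptions, not the authors' GIS workflow.

```python
import numpy as np
from scipy.stats import chi2

def mahalanobis_habitat_map(known_sites, candidate_cells):
    """Rank candidate grid cells by Mahalanobis distance to the mean
    habitat vector of known locations (smaller = more favorable).

    known_sites     : (n, p) array of environmental variables at known locations
    candidate_cells : (m, p) array of the same variables for every grid cell
    """
    mu = known_sites.mean(axis=0)
    cov = np.cov(known_sites, rowvar=False)
    cov_inv = np.linalg.inv(cov)

    diff = candidate_cells - mu
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)  # squared Mahalanobis distance

    # Optional favorability score: under multivariate normality, d2 ~ chi-square(p),
    # so 1 - CDF gives a 0-1 "similarity to used habitat" value.
    p = known_sites.shape[1]
    favorability = 1.0 - chi2.cdf(d2, df=p)
    return d2, favorability

# Hypothetical example: 3 environmental layers (elevation, slope, canopy cover)
rng = np.random.default_rng(0)
known = rng.normal([900.0, 12.0, 0.8], [60.0, 3.0, 0.05], size=(50, 3))
cells = rng.normal([850.0, 15.0, 0.6], [150.0, 8.0, 0.2], size=(1000, 3))
d2, fav = mahalanobis_habitat_map(known, cells)
print("most favorable cell index:", int(np.argmin(d2)))
```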

  20. Investigation of in-vehicle speech intelligibility metrics for normal hearing and hearing impaired listeners

    Science.gov (United States)

    Samardzic, Nikolina

    The effectiveness of in-vehicle speech communication can be a good indicator of the perception of the overall vehicle quality and customer satisfaction. Currently available speech intelligibility metrics do not account in their procedures for essential parameters needed for a complete and accurate evaluation of in-vehicle speech intelligibility. These include the directivity and the distance of the talker with respect to the listener, binaural listening, hearing profile of the listener, vocal effort, and multisensory hearing. In the first part of this research the effectiveness of in-vehicle application of these metrics is investigated in a series of studies to reveal their shortcomings, including a wide range of scores resulting from each of the metrics for a given measurement configuration and vehicle operating condition. In addition, the nature of a possible correlation between the scores obtained from each metric is unknown. The metrics and the subjective perception of speech intelligibility using, for example, the same speech material have not been compared in the literature. As a result, in the second part of this research, an alternative method for speech intelligibility evaluation is proposed for use in the automotive industry by utilizing a virtual reality driving environment for ultimately setting targets, including the associated statistical variability, for future in-vehicle speech intelligibility evaluation. The Speech Intelligibility Index (SII) was evaluated at the sentence Speech Reception Threshold (sSRT) for various listening situations and hearing profiles using acoustic perception jury testing and a variety of talker and listener configurations and background noise. In addition, the effect of individual sources and transfer paths of sound in an operating vehicle on the vehicle interior sound, specifically their effect on speech intelligibility, was quantified in the framework of the newly developed speech intelligibility evaluation method. Lastly

  1. Coverage Metrics for Model Checking

    Science.gov (United States)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  2. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, the Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts involving shared financial risk. © 2016 American Heart Association, Inc.

  3. Reference-free ground truth metric for metal artifact evaluation in CT images

    International Nuclear Information System (INIS)

    Kratz, Baerbel; Ens, Svitlana; Mueller, Jan; Buzug, Thorsten M.

    2011-01-01

    Purpose: In computed tomography (CT), metal objects in the region of interest introduce data inconsistencies during acquisition. Reconstructing these data results in an image with star-shaped artifacts induced by the metal inconsistencies. To enhance image quality, the influence of the metal objects can be reduced by different metal artifact reduction (MAR) strategies. For an adequate evaluation of new MAR approaches, a ground truth reference data set is needed. In technical evaluations, where phantoms can be measured with and without metal inserts, ground truth data can easily be obtained by a second reference data acquisition. Obviously, this is not possible for clinical data. Here, an alternative evaluation method is presented without the need of an additionally acquired reference data set. Methods: The proposed metric is based on an inherent ground truth for evaluating metal artifacts as well as for comparing MAR methods, where no reference information in terms of a second acquisition is needed. The method is based on the forward projection of a reconstructed image, which is compared to the actually measured projection data. Results: The new evaluation technique is performed on phantom and on clinical CT data with and without MAR. The metric results are then compared with methods using a reference data set as well as an expert-based classification. It is shown that the new approach is an adequate quantification technique for artifact strength in reconstructed metal or MAR CT images. Conclusions: The presented method works solely on the original projection data itself, which yields some advantages compared to distance measures in image domain using two data sets. Besides this, no parameters have to be manually chosen. The new metric is a useful evaluation alternative when no reference data are available.
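
    The record describes comparing the forward projection of a reconstructed image against the measured projection data. The Python sketch below illustrates that idea on a simulated phantom using scikit-image's radon/iradon transforms; the normalized projection-domain residual used here is an illustrative stand-in, not the exact metric defined in the paper.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

# Simulated "measured" projection data
phantom = shepp_logan_phantom()
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
measured_sinogram = radon(phantom, theta=theta)

# Reconstruct (stand-in for an image with or without metal-artifact reduction),
# then forward-project the reconstruction and compare it to the measured data.
reconstruction = iradon(measured_sinogram, theta=theta)
reprojected = radon(reconstruction, theta=theta)

residual = np.sqrt(np.mean((reprojected - measured_sinogram) ** 2))
residual /= np.sqrt(np.mean(measured_sinogram ** 2))
print(f"normalized projection-domain residual: {residual:.4f}")
```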

  4. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main task such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.

  5. A Deep Similarity Metric Learning Model for Matching Text Chunks to Spatial Entities

    Science.gov (United States)

    Ma, K.; Wu, L.; Tao, L.; Li, W.; Xie, Z.

    2017-12-01

    The matching of spatial entities with related text is a long-standing research topic that has received considerable attention over the years. The task aims to enrich the contents of a spatial entity and to attach spatial location information to a text chunk. In the data fusion field, matching spatial entities with the text chunks that describe them is of wide significance. However, most traditional matching methods rely entirely on manually designed, task-specific linguistic features. This work proposes a Deep Similarity Metric Learning Model (DSMLM), based on a Siamese neural network, that learns a similarity metric directly from the textual attributes of spatial entities and text chunks. Low-dimensional feature representations of the spatial entity and the text chunk are learned separately. By employing the cosine distance to measure the matching degree between the vectors, the model pulls matching pairs of vectors as close together as possible while, through supervised learning, pushing mismatched pairs as far apart as possible. In addition, extensive experiments and analysis on geological survey data sets show that the DSMLM model can effectively capture the matching characteristics between text chunks and spatial entities, and achieves state-of-the-art performance.
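
    The abstract describes scoring text-entity pairs with the cosine distance between learned embeddings. The sketch below shows only that matching step on hypothetical, randomly generated embeddings; it does not implement or train the Siamese network itself.

```python
import numpy as np

def cosine_similarity_matrix(text_vecs, entity_vecs):
    """Cosine similarity between every text-chunk embedding and every
    spatial-entity embedding (rows: text chunks, columns: entities)."""
    t = text_vecs / np.linalg.norm(text_vecs, axis=1, keepdims=True)
    e = entity_vecs / np.linalg.norm(entity_vecs, axis=1, keepdims=True)
    return t @ e.T

# Hypothetical embeddings produced by the two branches of a Siamese model
rng = np.random.default_rng(1)
text_embeddings = rng.normal(size=(5, 64))     # 5 text chunks
entity_embeddings = rng.normal(size=(8, 64))   # 8 spatial entities

sim = cosine_similarity_matrix(text_embeddings, entity_embeddings)
best_match = sim.argmax(axis=1)                # most similar entity per text chunk
print("matched entity index per text chunk:", best_match)
```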

  6. Comparison of luminance based metrics in different lighting conditions

    DEFF Research Database (Denmark)

    Wienold, J.; Kuhn, T.E.; Christoffersen, J.

    In this study, we evaluate established and newly developed metrics for predicting glare using data from three different research studies. The evaluation covers two different targets: 1. How well the user’s perception of glare magnitude correlates to the prediction of the glare metrics? 2. How well...... do the glare metrics describe the subjects’ disturbance by glare? We applied Spearman correlations, logistic regressions and an accuracy evaluation, based on an ROC-analysis. The results show that five of the twelve investigated metrics are failing at least one of the statistical tests. The other...... seven metrics CGI, modified DGI, DGP, Ev, average Luminance of the image Lavg, UGP and UGR are passing all statistical tests. DGP, CGI, DGI_mod and UGP have largest AUC and might be slightly more robust. The accuracy of the predictions of afore mentioned seven metrics for the disturbance by glare lies...

  7. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  8. Comparative Study of Trace Metrics between Bibliometrics and Patentometrics

    Directory of Open Access Journals (Sweden)

    Fred Y. Ye

    2016-06-01

    Full Text Available Purpose: To comprehensively evaluate the overall performance of a group or an individual in both bibliometrics and patentometrics. Design/methodology/approach: Trace metrics were applied to the top 30 universities in computer sciences in the 2014 Academic Ranking of World Universities (ARWU), the top 30 ESI highly cited papers in the computer sciences field in 2014, as well as the top 30 assignees and the top 30 most cited patents in the National Bureau of Economic Research (NBER) computer hardware and software category. Findings: We found that, by applying trace metrics, the research or marketing impact efficiency, at both group and individual levels, was clearly observed. Furthermore, trace metrics were more sensitive to the different publication-citation distributions than the average citation and h-index were. Research limitations: Trace metrics considered publications with zero citations as negative contributions. One should clarify how he/she evaluates a zero-citation paper or patent before applying trace metrics. Practical implications: Decision makers could regularly examine the performance of their university/company by applying trace metrics and adjust their policies accordingly. Originality/value: Trace metrics could be applied both in bibliometrics and patentometrics and provide a comprehensive view. Moreover, the high sensitivity and unique impact efficiency view provided by trace metrics can facilitate decision makers in examining and adjusting their policies.

  9. Similarity measures for face recognition

    CERN Document Server

    Vezzetti, Enrico

    2015-01-01

    Face recognition has several applications, including security (authentication and identification of device users and criminal suspects) and medicine (corrective surgery and diagnosis). Facial recognition programs rely on algorithms that can compare and compute the similarity between two sets of images. This eBook explains some of the similarity measures used in facial recognition systems in a single volume. Readers will learn about various measures, including Minkowski distances, Mahalanobis distances, Hausdorff distances, and cosine-based distances, among other methods. The book also summarizes errors that may occur in face recognition methods. Computer scientists "facing face" and looking to select and test different methods of computing similarities will benefit from this book. The book is also a useful tool for students undertaking computer vision courses.

  10. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected...... for the application, based on WCET analysis we can indicate how critical a code fragment is, in relation to the worst-case bound. Computing such a metric on top of static analysis, incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach...... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which...

  11. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  12. Indefinite metric fields and the renormalization group

    International Nuclear Information System (INIS)

    Sherry, T.N.

    1976-11-01

    The renormalization group equations are derived for the Green functions of an indefinite metric field theory. In these equations one retains the mass dependence of the coefficient functions, since in the indefinite metric theories the masses cannot be neglected. The behavior of the effective coupling constant in the asymptotic and infrared limits is analyzed. The analysis is illustrated by means of a simple model incorporating indefinite metric fields. The model scales at first order, and at this order also the effective coupling constant has both ultra-violet and infra-red fixed points, the former being the bare coupling constant

  13. Kerr-Newman metric in deSitter background

    International Nuclear Information System (INIS)

    Patel, L.K.; Koppar, S.S.; Bhatt, P.V.

    1987-01-01

    In addition to the Kerr-Newman metric with cosmological constant, several other metrics are presented giving Kerr-Newman type solutions of the Einstein-Maxwell field equations in the background of a deSitter universe. The electromagnetic field in all the solutions is assumed to be source-free. A new metric of what may be termed an electrovac rotating deSitter space-time (a space-time devoid of matter but containing a source-free electromagnetic field and a null fluid with twisting rays) has been presented. In the absence of the electromagnetic field, these solutions reduce to those discussed by Vaidya (1984). 8 refs. (author)

  14. The independence of software metrics taken at different life-cycle stages

    Science.gov (United States)

    Kafura, D.; Canning, J.; Reddy, G.

    1984-01-01

    Over the past few years a large number of software metrics have been proposed and, to varying degrees, a number of these metrics have been subjected to empirical validation demonstrating their utility in the software development process. This study attempts to classify these metrics and to determine whether the metrics in the different classes measure distinct attributes of the software product. Statistical analysis is used to determine the degree of relationship among the metrics.

  15. Rapporteur Report: Sources and Exposure Metrics for RF Epidemiology (Part 1) (invited paper)

    Energy Technology Data Exchange (ETDEWEB)

    Allen, S

    1999-07-01

    A variety of sources was considered illustrating the predominantly uniform exposure at a distance and highly non-uniform exposures close to sources. The measurement of both electric and magnetic fields was considered for near field situations but where possible the use of induced body currents was judged to be the preferred metric. The salient features affecting exposure to fields from mobile telephones and their base stations were discussed for both existing and third generation systems. As an aid to future cancer studies, high resolution numerical modelling was used to illustrate left/right exposure discrimination of bilateral organs in the head. Factors influencing both numerical and experimental dosimetry were discussed and studies to investigate the ability to appropriately rank exposure were considered important areas for research. (author)

  16. Rapporteur Report: Sources and Exposure Metrics for RF Epidemiology (Part 1) (invited paper)

    International Nuclear Information System (INIS)

    Allen, S.

    1999-01-01

    A variety of sources was considered illustrating the predominantly uniform exposure at a distance and highly non-uniform exposures close to sources. The measurement of both electric and magnetic fields was considered for near field situations but where possible the use of induced body currents was judged to be the preferred metric. The salient features affecting exposure to fields from mobile telephones and their base stations were discussed for both existing and third generation systems. As an aid to future cancer studies, high resolution numerical modelling was used to illustrate left/right exposure discrimination of bilateral organs in the head. Factors influencing both numerical and experimental dosimetry were discussed and studies to investigate the ability to appropriately rank exposure were considered important areas for research. (author)

  17. Thermodynamic metrics and optimal paths.

    Science.gov (United States)

    Sivak, David A; Crooks, Gavin E

    2012-05-11

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.

  18. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  19. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide...... a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation...... metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves....

  20. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
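
    The chapter summary describes fitting a logistic regression that predicts error probability from model metrics. A minimal sketch of that kind of fit is shown below on synthetic data; the metric choices, coefficients, and labels are invented for illustration and are not the EPC sample analysed in the chapter.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical metric values for a sample of process models
rng = np.random.default_rng(42)
n_models = 300
X = np.column_stack([
    rng.poisson(30, n_models),    # e.g. number of nodes
    rng.poisson(40, n_models),    # e.g. number of arcs
    rng.uniform(0, 1, n_models),  # e.g. a structuredness score
])
# Synthetic "has error" labels loosely increasing with model size
logits = 0.08 * X[:, 0] + 0.03 * X[:, 1] - 2.0 * X[:, 2] - 2.5
y = (rng.uniform(size=n_models) < 1 / (1 + np.exp(-logits))).astype(int)

clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
clf.fit(X, y)
print("error probability for a new model:", clf.predict_proba([[35, 50, 0.4]])[0, 1])
```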

  1. Predicting class testability using object-oriented metrics

    NARCIS (Netherlands)

    M. Bruntink (Magiel); A. van Deursen (Arie)

    2004-01-01

    textabstractIn this paper we investigate factors of the testability of object-oriented software systems. The starting point is given by a study of the literature to obtain both an initial model of testability and existing OO metrics related to testability. Subsequently, these metrics are evaluated

  2. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtain the application functions point count. Our result shows that the proposed metric is computable, consistent in its use of unit, and is programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  3. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  4. Performance metrics for the evaluation of hyperspectral chemical identification systems

    Science.gov (United States)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.
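
    The identification metric proposed in the abstract is based on the Dice index. A minimal sketch of the unweighted Dice index between the set of identified chemicals and the set of chemicals actually present is shown below; the confusion-matrix partitioning and weighting described in the paper are not reproduced.

```python
def dice_index(identified, truth):
    """Dice index between the set of identified chemicals and the set of
    chemicals actually present: 2|A intersect B| / (|A| + |B|)."""
    identified, truth = set(identified), set(truth)
    if not identified and not truth:
        return 1.0
    return 2.0 * len(identified & truth) / (len(identified) + len(truth))

# Hypothetical library entries identified by the system vs. ground truth
print(dice_index({"SF6", "NH3", "TEP"}, {"SF6", "NH3"}))  # 0.8
```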

  5. Curvature properties of four-dimensional Walker metrics

    International Nuclear Information System (INIS)

    Chaichi, M; Garcia-Rio, E; Matsushita, Y

    2005-01-01

    A Walker n-manifold is a semi-Riemannian manifold, which admits a field of parallel null r-planes, r ≤ n/2. In the present paper we study curvature properties of a Walker 4-manifold (M, g) which admits a field of parallel null 2-planes. The metric g is necessarily of neutral signature (+ + - -). Such a Walker 4-manifold is the lowest dimensional example not of Lorentz type. There are three functions of coordinates which define a Walker metric. Some recent work shows that a Walker 4-manifold of restricted type whose metric is characterized by two functions exhibits a large variety of symplectic structures, Hermitian structures, Kaehler structures, etc. For such a restricted Walker 4-manifold, we shall study mainly curvature properties, e.g., conditions for a Walker metric to be Einstein, Osserman, or locally conformally flat, etc. One of our main results is the exact solutions to the Einstein equations for a restricted Walker 4-manifold

  6. Integration of multi-source data in mineral exploration

    DEFF Research Database (Denmark)

    Conradsen, Knut; Ersbøll, Bjarne Kjær; Nielsen, Allan Aasbjerg

    1991-01-01

    This paper describes several multivariate statistical analysis applications of geochemical, geophysical, and spectral variables in mineral exploration. Mahalanobis' distance is described in some detail and based on four multisource variables this measure is applied to produce a map that gives an ...... of automatically generated linear features based on Landsat TM data. The results indicate among other things a not previously recognized subsurface continuation of an already mapped lineament....

  7. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.

  8. Common fixed point theorems in intuitionistic fuzzy metric spaces and L-fuzzy metric spaces with nonlinear contractive condition

    International Nuclear Information System (INIS)

    Jesic, Sinisa N.; Babacev, Natasa A.

    2008-01-01

    The purpose of this paper is to prove some common fixed point theorems for a pair of R-weakly commuting mappings defined on intuitionistic fuzzy metric spaces [Park JH. Intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2004;22:1039-46] and L-fuzzy metric spaces [Saadati R, Razani A, Adibi H. A common fixed point theorem in L-fuzzy metric spaces. Chaos, Solitons and Fractals, doi:10.1016/j.chaos.2006.01.023], with nonlinear contractive condition, defined with function, first observed by Boyd and Wong [Boyd DW, Wong JSW. On nonlinear contractions. Proc Am Math Soc 1969;20:458-64]. Following Pant [Pant RP. Common fixed points of noncommuting mappings. J Math Anal Appl 1994;188:436-40] we define R-weak commutativity for a pair of mappings and then prove the main results. These results generalize some known results due to Saadati et al., and Jungck [Jungck G. Commuting maps and fixed points. Am Math Mon 1976;83:261-3]. Some examples and comments according to the preceding results are given

  9. 43 CFR 12.915 - Metric system of measurement.

    Science.gov (United States)

    2010-10-01

    ... procurements, grants, and other business-related activities. Metric implementation may take longer where the... recipient, such as when foreign competitors are producing competing products in non-metric units. (End of...

  10. Cophenetic metrics for phylogenetic trees, after Sokal and Rohlf.

    Science.gov (United States)

    Cardona, Gabriel; Mir, Arnau; Rosselló, Francesc; Rotger, Lucía; Sánchez, David

    2013-01-16

    Phylogenetic tree comparison metrics are an important tool in the study of evolution, and hence the definition of such metrics is an interesting problem in phylogenetics. In a paper in Taxon fifty years ago, Sokal and Rohlf proposed to measure quantitatively the difference between a pair of phylogenetic trees by first encoding them by means of their half-matrices of cophenetic values, and then comparing these matrices. This idea has been used several times since then to define dissimilarity measures between phylogenetic trees but, to our knowledge, no proper metric on weighted phylogenetic trees with nested taxa based on this idea has been formally defined and studied yet. Actually, the cophenetic values of pairs of different taxa alone are not enough to single out phylogenetic trees with weighted arcs or nested taxa. For every (rooted) phylogenetic tree T, let its cophenetic vector φ(T) consist of all cophenetic values between pairs of taxa in T and all depths of taxa in T. It turns out that these cophenetic vectors single out weighted phylogenetic trees with nested taxa. We then define a family of cophenetic metrics d_φ,p by comparing these cophenetic vectors by means of L^p norms, and we study, either analytically or numerically, some of their basic properties: neighbors, diameter, distribution, and their rank correlation with each other and with other metrics. The cophenetic metrics can be safely used on weighted phylogenetic trees with nested taxa and no restriction on degrees, and they can be computed in O(n^2) time, where n stands for the number of taxa. The metrics d_φ,1 and d_φ,2 have positively skewed distributions, and they show a low rank correlation with the Robinson-Foulds metric and the nodal metrics, and a very high correlation with each other and with the splitted nodal metrics. The diameter of d_φ,p, for p ≥ 1, is in O(n^((p+2)/p)), and thus for low p they are more discriminative, having a wider range of values.
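
    A minimal sketch of the cophenetic vector and the resulting L^p metric is given below for small weighted rooted trees encoded with parent pointers and branch lengths. It covers only this simple encoding (no nested taxa) and is meant to illustrate the definition, not the authors' implementation.

```python
from itertools import combinations

def node_depths(parent, blen):
    """Depth (sum of branch lengths from the root) of every node."""
    depth = {}
    def d(v):
        if v not in depth:
            depth[v] = 0.0 if parent[v] is None else d(parent[v]) + blen[v]
        return depth[v]
    for v in parent:
        d(v)
    return depth

def cophenetic_value(u, v, parent, depth):
    """Depth of the lowest common ancestor of taxa u and v."""
    ancestors = set()
    while u is not None:
        ancestors.add(u)
        u = parent[u]
    while v not in ancestors:
        v = parent[v]
    return depth[v]

def cophenetic_vector(parent, blen, taxa):
    """Cophenetic values of all pairs of taxa plus the depths of the taxa."""
    depth = node_depths(parent, blen)
    taxa = sorted(taxa)
    vec = [cophenetic_value(u, v, parent, depth) for u, v in combinations(taxa, 2)]
    vec += [depth[t] for t in taxa]
    return vec

def d_phi_p(v1, v2, p=2):
    """L^p distance between the cophenetic vectors of two trees on the same taxa."""
    return sum(abs(a - b) ** p for a, b in zip(v1, v2)) ** (1.0 / p)

# Two small weighted trees on taxa {a, b, c}: parent pointers and branch lengths
parent1 = {"r": None, "x": "r", "a": "x", "b": "x", "c": "r"}
blen1 = {"r": 0.0, "x": 1.0, "a": 2.0, "b": 2.0, "c": 3.0}
parent2 = {"r": None, "y": "r", "a": "r", "b": "y", "c": "y"}
blen2 = {"r": 0.0, "y": 2.0, "a": 3.0, "b": 1.0, "c": 1.0}

v1 = cophenetic_vector(parent1, blen1, ["a", "b", "c"])
v2 = cophenetic_vector(parent2, blen2, ["a", "b", "c"])
print(d_phi_p(v1, v2, p=1), d_phi_p(v1, v2, p=2))
```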

  11. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  12. Exact solutions of strong gravity in generalized metrics

    International Nuclear Information System (INIS)

    Hojman, R.; Smailagic, A.

    1981-05-01

    We consider classical solutions for the strong gravity theory of Salam and Strathdee in a wider class of metrics with positive, zero and negative curvature. It turns out that such solutions exist and their relevance for quark confinement is explored. Only metrics with positive curvature (spherical symmetry) give a confining potential in a simple picture of the scalar hadron. This supports the idea of describing the hadron as a closed microuniverse of the strong metric. (author)

  13. Development of quality metrics for ambulatory pediatric cardiology: Infection prevention.

    Science.gov (United States)

    Johnson, Jonathan N; Barrett, Cindy S; Franklin, Wayne H; Graham, Eric M; Halnon, Nancy J; Hattendorf, Brandy A; Krawczeski, Catherine D; McGovern, James J; O'Connor, Matthew J; Schultz, Amy H; Vinocur, Jeffrey M; Chowdhury, Devyani; Anderson, Jeffrey B

    2017-12-01

    In 2012, the American College of Cardiology's (ACC) Adult Congenital and Pediatric Cardiology Council established a program to develop quality metrics to guide ambulatory practices for pediatric cardiology. The council chose five areas on which to focus their efforts; chest pain, Kawasaki Disease, tetralogy of Fallot, transposition of the great arteries after arterial switch, and infection prevention. Here, we sought to describe the process, evaluation, and results of the Infection Prevention Committee's metric design process. The infection prevention metrics team consisted of 12 members from 11 institutions in North America. The group agreed to work on specific infection prevention topics including antibiotic prophylaxis for endocarditis, rheumatic fever, and asplenia/hyposplenism; influenza vaccination and respiratory syncytial virus prophylaxis (palivizumab); preoperative methods to reduce intraoperative infections; vaccinations after cardiopulmonary bypass; hand hygiene; and testing to identify splenic function in patients with heterotaxy. An extensive literature review was performed. When available, previously published guidelines were used fully in determining metrics. The committee chose eight metrics to submit to the ACC Quality Metric Expert Panel for review. Ultimately, metrics regarding hand hygiene and influenza vaccination recommendation for patients did not pass the RAND analysis. Both endocarditis prophylaxis metrics and the RSV/palivizumab metric passed the RAND analysis but fell out during the open comment period. Three metrics passed all analyses, including those for antibiotic prophylaxis in patients with heterotaxy/asplenia, for influenza vaccination compliance in healthcare personnel, and for adherence to recommended regimens of secondary prevention of rheumatic fever. The lack of convincing data to guide quality improvement initiatives in pediatric cardiology is widespread, particularly in infection prevention. Despite this, three metrics were

  14. The challenge of global water access monitoring: evaluating straight-line distance versus self-reported travel time among rural households in Mozambique.

    Science.gov (United States)

    Ho, Jeff C; Russel, Kory C; Davis, Jennifer

    2014-03-01

    Support is growing for the incorporation of fetching time and/or distance considerations in the definition of access to improved water supply used for global monitoring. Current efforts typically rely on self-reported distance and/or travel time data that have been shown to be unreliable. To date, however, there has been no head-to-head comparison of such indicators with other possible distance/time metrics. This study provides such a comparison. We examine the association of both straight-line distance and self-reported one-way travel time with measured route distances to water sources for 1,103 households in Nampula province, Mozambique. We find straight-line, or Euclidean, distance to be a good proxy for route distance (R^2 = 0.98), while self-reported travel time is a poor proxy (R^2 = 0.12). We also apply a variety of time- and distance-based indicators proposed in the literature to our sample data, finding that the share of households classified as having versus lacking access would differ by more than 70 percentage points depending on the particular indicator employed. This work highlights the importance of the ongoing debate regarding valid, reliable, and feasible strategies for monitoring progress in the provision of improved water supply services.
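
    The comparison described above amounts to regressing measured route distances on each candidate proxy and examining R^2. The sketch below illustrates that calculation on synthetic coordinates, route distances, and reported times; the numbers are invented and are not the Mozambique survey data.

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a simple linear regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    y_hat = slope * x + intercept
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical projected coordinates (metres) of households and their water sources
rng = np.random.default_rng(7)
households = rng.uniform(0, 5000, size=(200, 2))
sources = households + rng.normal(0, 400, size=(200, 2))

straight_line = np.linalg.norm(households - sources, axis=1)
# Simulated route distances: somewhat longer than straight-line, plus noise
route = straight_line * rng.uniform(1.1, 1.4, size=200) + rng.normal(0, 20, size=200)
# Simulated self-reported travel times (minutes): only weakly tied to distance
reported_time = route / 80.0 + rng.normal(0, 15, size=200)

print("R^2, straight-line vs route:", round(r_squared(straight_line, route), 3))
print("R^2, reported time vs route:", round(r_squared(reported_time, route), 3))
```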

  15. A new form of the rotating C-metric

    International Nuclear Information System (INIS)

    Hong, Kenneth; Teo, Edward

    2005-01-01

    In a previous paper, we showed that the traditional form of the charged C-metric can be transformed, by a change of coordinates, into one with an explicitly factorizable structure function. This new form of the C-metric has the advantage that its properties become much simpler to analyse. In this paper, we propose an analogous new form for the rotating charged C-metric, with structure function G(ξ) = (1 - ξ^2)(1 + r_+ Aξ)(1 + r_- Aξ), where r_± are the usual locations of the horizons in the Kerr-Newman black hole. Unlike the non-rotating case, this new form is not related to the traditional one by a coordinate transformation. We show that the physical distinction between these two forms of the rotating C-metric lies in the nature of the conical singularities causing the black holes to accelerate apart: the new form is free of torsion singularities and therefore does not contain any closed timelike curves. We claim that this new form should be considered the natural generalization of the C-metric with rotation

  16. [Clinical trial data management and quality metrics system].

    Science.gov (United States)

    Chen, Zhao-hua; Huang, Qin; Deng, Ya-zhong; Zhang, Yue; Xu, Yu; Yu, Hao; Liu, Zong-fan

    2015-11-01

    A data quality management system is essential to ensure accurate, complete, consistent, and reliable data collection in clinical research. This paper is devoted to various choices of data quality metrics. They are categorized by study status, e.g. study start-up, conduct, and close-out. In each category, metrics for different purposes are listed according to ALCOA+ principles such as completeness, accuracy, timeliness, traceability, etc. Some general quality metrics frequently used are also introduced. This paper provides as much detail as possible for each metric by providing definition, purpose, evaluation, referenced benchmark, and recommended targets in favor of real practice. It is important that sponsors and data management service providers establish a robust integrated clinical trial data quality management system to ensure sustainable high quality of clinical trial deliverables. It will also support enterprise-level data evaluation and benchmarking of data quality across projects, sponsors, and data management service providers by using objective metrics from real clinical trials. We hope this will be a significant input to accelerate the improvement of clinical trial data quality in the industry.

  17. Sigma Routing Metric for RPL Protocol

    Directory of Open Access Journals (Sweden)

    Paul Sanmartin

    2018-04-01

    Full Text Available This paper presents the adaptation of a specific metric for the RPL protocol in the objective function MRHOF. Among the functions standardized by IETF, we find OF0, which is based on the minimum hop count, as well as MRHOF, which is based on the Expected Transmission Count (ETX). However, when the network becomes denser or the number of nodes increases, both OF0 and MRHOF introduce long hops, which can generate a bottleneck that restricts the network. The adaptation is proposed to optimize both OFs through a new routing metric. To solve the above problem, the metrics of the minimum number of hops and the ETX are combined by designing a new routing metric called SIGMA-ETX, in which the best route is calculated using the standard deviation of ETX values between each node, as opposed to working with the ETX average along the route. This method ensures a better routing performance in dense sensor networks. The simulations are done through the Cooja simulator, based on the Contiki operating system. The simulations showed that the proposed optimization outperforms both OF0 and MRHOF by a wide margin, in terms of network latency, packet delivery ratio, lifetime, and power consumption.
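
    The abstract describes replacing the average per-hop ETX with the standard deviation of per-hop ETX values as the route cost. The sketch below contrasts the two costs on two hypothetical candidate routes; the exact SIGMA-ETX formulation in the paper may differ from this simplified version.

```python
import statistics

def route_cost_mean_etx(etx_per_hop):
    """Conventional MRHOF-style cost: average ETX along the route."""
    return statistics.mean(etx_per_hop)

def route_cost_sigma_etx(etx_per_hop):
    """SIGMA-ETX-style cost: standard deviation of per-hop ETX values,
    penalising routes that mix short hops with one very lossy hop."""
    return statistics.pstdev(etx_per_hop)

# Two hypothetical candidate routes with the same mean ETX
route_a = [1.2, 1.3, 1.2, 1.3]   # evenly balanced hops
route_b = [1.0, 1.0, 1.0, 2.0]   # one much weaker hop

for name, route in [("A", route_a), ("B", route_b)]:
    print(name,
          "mean ETX:", round(route_cost_mean_etx(route), 3),
          "sigma ETX:", round(route_cost_sigma_etx(route), 3))
```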

  18. Socio-Technical Security Metrics (Dagstuhl Seminar 14491)

    NARCIS (Netherlands)

    Gollmann, Dieter; Herley, Cormac; Koenig, Vincent; Pieters, Wolter; Sasse, Martina Angela

    2015-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14491 "Socio-Technical Security Metrics". In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that the dikes should be high enough to

  19. Landscape metrics for three-dimension urban pattern recognition

    Science.gov (United States)

    Liu, M.; Hu, Y.; Zhang, W.; Li, C.

    2017-12-01

    Understanding how landscape pattern determines population or ecosystem dynamics is crucial for managing our landscapes. Urban areas are becoming increasingly dominant social-ecological systems, so it is important to understand patterns of urbanization. Most studies of urban landscape pattern examine land-use maps in two dimensions because the acquisition of 3-dimensional information is difficult. We used Brista software based on Quickbird images and aerial photos to interpret the height of buildings, thus incorporating a 3-dimensional approach. We estimated the feasibility and accuracy of this approach. A total of 164,345 buildings in the Liaoning central urban agglomeration of China, which included seven cities, were measured. Twelve landscape metrics were proposed or chosen to describe the urban landscape patterns in 2- and 3-dimensional scales. The ecological and social meaning of landscape metrics were analyzed with multiple correlation analysis. The results showed that classification accuracy compared with field surveys was 87.6%, which means this method for interpreting building height was acceptable. The metrics effectively reflected the urban architecture in relation to number of buildings, area, height, 3-D shape and diversity aspects. We were able to describe the urban characteristics of each city with these metrics. The metrics also captured ecological and social meanings. The proposed landscape metrics provided a new method for urban landscape analysis in three dimensions.

  20. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    Full Text Available We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  1. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on graph model, as well as social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
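
    The flow centrality metrics named above can be approximated with standard network-analysis tooling. The sketch below uses networkx's current-flow betweenness and current-flow closeness centralities together with eccentricity on a toy graph; treating these as stand-ins for the paper's metrics, and the toy graph for the Puebla semantic network, are assumptions.

```python
import networkx as nx

# Toy stand-in for the user-venue-service semantic network
G = nx.karate_club_graph()

flow_betweenness = nx.current_flow_betweenness_centrality(G)
flow_closeness = nx.current_flow_closeness_centrality(G)
eccentricity = nx.eccentricity(G)

# Nodes scoring highest on the flow metrics can be read as candidate suggestions
top_by_betweenness = sorted(flow_betweenness, key=flow_betweenness.get, reverse=True)[:5]
top_by_closeness = sorted(flow_closeness, key=flow_closeness.get, reverse=True)[:5]
print("top nodes by flow betweenness:", top_by_betweenness)
print("top nodes by flow closeness:  ", top_by_closeness)
print("eccentricity of node 0:", eccentricity[0])
```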

  2. Metric Sex Determination of the Human Coxal Bone on a Virtual Sample using Decision Trees.

    Science.gov (United States)

    Savall, Frédéric; Faruch-Bilfeld, Marie; Dedouit, Fabrice; Sans, Nicolas; Rousseau, Hervé; Rougé, Daniel; Telmon, Norbert

    2015-11-01

    Decision trees provide an alternative to multivariate discriminant analysis, which remains the most commonly used method in anthropometric studies. Our study analyzed the metric characterization of a recent virtual sample of 113 coxal bones using decision trees for sex determination. From 17 osteometric type I landmarks, a dataset was built with five classic distances traditionally reported in the literature and six new distances selected using the two-step ratio method. A ten-fold cross-validation was performed, and a decision tree was established on two subsamples (training and test sets). The decision tree established on the training set included three nodes, and its application to the test set correctly classified 92% of individuals. This percentage is similar to rates reported in the literature. The usefulness of decision trees has been demonstrated in numerous fields. They have already been used in sex determination, body mass prediction, and ancestry estimation. This study shows another use of decision trees enabling simple and accurate sex determination. © 2015 American Academy of Forensic Sciences.
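
    A minimal sketch of the kind of workflow described above (a decision tree classifier with ten-fold cross-validation on inter-landmark distances) is shown below. The distance variables and sex labels are synthetic and the landmark names are hypothetical; only the modelling steps mirror the record.

```python
import numpy as np
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical inter-landmark distances (mm) for coxal bones; 0 = male, 1 = female
rng = np.random.default_rng(3)
n = 113
sex = rng.integers(0, 2, n)
X = np.column_stack([
    rng.normal(np.where(sex, 165, 155), 6),   # e.g. a pubo-ischial distance
    rng.normal(np.where(sex, 52, 58), 4),     # e.g. greater sciatic notch width
    rng.normal(np.where(sex, 220, 225), 8),   # e.g. overall coxal height
])

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
print("10-fold CV accuracy:", round(cross_val_score(tree, X, sex, cv=10).mean(), 3))

X_train, X_test, y_train, y_test = train_test_split(X, sex, random_state=0)
tree.fit(X_train, y_train)
print("test-set accuracy:", round(tree.score(X_test, y_test), 3))
```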

  3. Reproducibility of graph metrics of human brain functional networks.

    Science.gov (United States)

    Deuker, Lorena; Bullmore, Edward T; Smith, Marie; Christensen, Soren; Nathan, Pradeep J; Rockstroh, Brigitte; Bassett, Danielle S

    2009-10-01

    Graph theory provides many metrics of complex network organization that can be applied to analysis of brain networks derived from neuroimaging data. Here we investigated the test-retest reliability of graph metrics of functional networks derived from magnetoencephalography (MEG) data recorded in two sessions from 16 healthy volunteers who were studied at rest and during performance of the n-back working memory task in each session. For each subject's data at each session, we used a wavelet filter to estimate the mutual information (MI) between each pair of MEG sensors in each of the classical frequency intervals from gamma to low delta in the overall range 1-60 Hz. Undirected binary graphs were generated by thresholding the MI matrix and 8 global network metrics were estimated: the clustering coefficient, path length, small-worldness, efficiency, cost-efficiency, assortativity, hierarchy, and synchronizability. Reliability of each graph metric was assessed using the intraclass correlation (ICC). Good reliability was demonstrated for most metrics applied to the n-back data (mean ICC=0.62). Reliability was greater for metrics in lower frequency networks. Higher frequency gamma- and beta-band networks were less reliable at a global level but demonstrated high reliability of nodal metrics in frontal and parietal regions. Performance of the n-back task was associated with greater reliability than measurements on resting state data. Task practice was also associated with greater reliability. Collectively these results suggest that graph metrics are sufficiently reliable to be considered for future longitudinal studies of functional brain network changes.
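
    Test-retest reliability in the record is quantified with the intraclass correlation. The sketch below computes a consistency ICC, assuming the ICC(3,1) variant since the abstract does not specify one, for a single graph metric measured in two sessions; the session values are synthetic.

```python
import numpy as np

def icc_consistency(scores):
    """ICC(3,1): two-way mixed-effects, single-measure, consistency ICC for an
    (n_subjects, n_sessions) array of one graph metric (e.g. clustering coefficient)."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * np.sum((scores.mean(axis=1) - grand) ** 2)    # between subjects
    ss_cols = n * np.sum((scores.mean(axis=0) - grand) ** 2)    # between sessions
    ss_err = np.sum((scores - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical clustering-coefficient values for 16 subjects in two sessions
rng = np.random.default_rng(5)
session1 = rng.normal(0.45, 0.05, 16)
session2 = session1 + rng.normal(0.0, 0.03, 16)   # test-retest with some noise
print("ICC(3,1):", round(icc_consistency(np.column_stack([session1, session2])), 3))
```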

  4. The extra mile: Ungulate migration distance alters the use of seasonal range and exposure to anthropogenic risk

    Science.gov (United States)

    Sawyer, Hall; Middleton, Arthur D.; Hayes, Matthew M.; Kauffman, Matthew J.; Monteith, Kevin L.

    2016-01-01

    Partial migration occurs across a variety of taxa and has important ecological and evolutionary consequences. Among ungulates, studies of partially migratory populations have allowed researchers to compare and contrast performance metrics of migrants versus residents and examine how environmental factors influence the relative abundance of each. Such studies tend to characterize animals discretely as either migratory or resident, but we suggest that variable migration distances within migratory herds are an important and overlooked form of population structure, with potential consequences for animal fitness. We examined whether the variation in individual migration distances (20–264 km) within a single wintering population of mule deer (Odocoileus hemionus) was associated with several critical behavioral attributes of migration, including timing of migration, time allocation to seasonal ranges, and exposure to anthropogenic mortality risks. Both the timing of migration and the amount of time animals allocated to seasonal ranges varied with migration distance. Animals migrating long distances (150–250 km) initiated spring migration more than three weeks earlier than those migrating moderate (50–150 km) or short distances (…) forage and effectively increase carrying capacity. Clear differences in winter residency, migration duration, and risk of anthropogenic mortality among short-, moderate-, and long-distance migrants suggest fitness trade-offs may exist among migratory segments of the population. Future studies of partial migration may benefit from expanding comparisons of residents and migrants, to consider how variable migration distances of migrants may influence the costs and benefits of migration.

  5. Evaluation metrics for biostatistical and epidemiological collaborations.

    Science.gov (United States)

    Rubio, Doris McGartland; Del Junco, Deborah J; Bhore, Rafia; Lindsell, Christopher J; Oster, Robert A; Wittkowski, Knut M; Welty, Leah J; Li, Yi-Ju; Demets, Dave

    2011-10-15

    Increasing demands for evidence-based medicine and for the translation of biomedical research into individual and public health benefit have been accompanied by the proliferation of special units that offer expertise in biostatistics, epidemiology, and research design (BERD) within academic health centers. Objective metrics that can be used to evaluate, track, and improve the performance of these BERD units are critical to their successful establishment and sustainable future. To develop a set of reliable but versatile metrics that can be adapted easily to different environments and evolving needs, we consulted with members of BERD units from the consortium of academic health centers funded by the Clinical and Translational Science Award Program of the National Institutes of Health. Through a systematic process of consensus building and document drafting, we formulated metrics that covered the three identified domains of BERD practices: the development and maintenance of collaborations with clinical and translational science investigators, the application of BERD-related methods to clinical and translational research, and the discovery of novel BERD-related methodologies. In this article, we describe the set of metrics and advocate their use for evaluating BERD practices. The routine application, comparison of findings across diverse BERD units, and ongoing refinement of the metrics will identify trends, facilitate meaningful changes, and ultimately enhance the contribution of BERD activities to biomedical research. Copyright © 2011 John Wiley & Sons, Ltd.

  6. Generalization of Vaidya's radiation metric

    Energy Technology Data Exchange (ETDEWEB)

    Gleiser, R J; Kozameh, C N [Universidad Nacional de Cordoba (Argentina). Instituto de Matematica, Astronomia y Fisica

    1981-11-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations.
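
    For orientation only, the original (outgoing) Vaidya line element that this record generalizes is usually written as below; this is the standard textbook form (with G = c = 1), not the generalized family constructed in the paper.

```latex
% Outgoing Vaidya radiation metric with mass function m(u) of retarded time u
ds^2 = -\Bigl(1 - \frac{2m(u)}{r}\Bigr)\,du^2 - 2\,du\,dr
       + r^2\bigl(d\theta^2 + \sin^2\theta\,d\varphi^2\bigr)
```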

  7. Marketing communication metrics for social media

    OpenAIRE

    Töllinen, Aarne; Karjaluoto, Heikki

    2011-01-01

    The objective of this paper is to develop a conceptual framework for measuring the effectiveness of social media marketing communications. Specifically, we study whether the existing marketing communications performance metrics are still valid in the changing digitalised communications landscape, or whether it is time to rethink them, or even to devise entirely new metrics. Recent advances in information technology and marketing bring a need to re-examine measurement models. We combine two im...

  8. Accuracy and precision in the calculation of phenology metrics

    DEFF Research Database (Denmark)

    Ferreira, Ana Sofia; Visser, Andre; MacKenzie, Brian

    2014-01-01

    Phytoplankton phenology (the timing of seasonal events) is a commonly used indicator for evaluating responses of marine ecosystems to climate change. However, phenological metrics are vulnerable to observation-related (bloom amplitude, missing data, and observational noise) and analysis-related (temporal… … a phenology metric is first determined from a noise- and gap-free time series, and again once it has been modified. We show that precision is a greater concern than accuracy for many of these metrics, an important point that has been hereto overlooked in the literature. The variability in precision between… phenology metrics is substantial, but it can be improved by the use of preprocessing techniques (e.g., gap-filling or smoothing). Furthermore, there are important differences in the inherent variability of the metrics that may be crucial in the interpretation of studies based upon them. Of the considered…

  9. Quantum anomalies for generalized Euclidean Taub-NUT metrics

    International Nuclear Information System (INIS)

    Cotaescu, Ion I; Moroianu, Sergiu; Visinescu, Mihai

    2005-01-01

    The generalized Taub-NUT metrics exhibit in general gravitational anomalies. This is in contrast with the fact that the original Taub-NUT metric does not exhibit gravitational anomalies, which is a consequence of the fact that it admits Killing-Yano tensors forming Staeckel-Killing tensors as products. We have found that for axial anomalies, interpreted as the index of the Dirac operator, the presence of Killing-Yano tensors is irrelevant. In order to evaluate the axial anomalies, we compute the index of the Dirac operator with the APS boundary condition on balls and on annular domains. The result is an explicit number-theoretic quantity depending on the radii of the domain. This quantity is 0 for metrics close to the original Taub-NUT metric but it does not vanish in general

  10. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  11. Massless and massive quanta resulting from a mediumlike metric tensor

    International Nuclear Information System (INIS)

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  12. MODELLING THE POTENTIAL DISTRIBUTION OF TREE SPECIES ON A NATIONAL SCALE IN COLOMBIA: APPLICATION TO PALICOUREA ANGUSTIFOLIA KUNTH AND PALICOUREA GUIANENSIS AUBL.

    Directory of Open Access Journals (Sweden)

    Armenteras Dolors

    2010-12-01

    Full Text Available The results in this study illustrate the methods of using existing species' presence records and environmental data to produce a niche-based model based on Mahalanobis distances, and also to predict the distribution of a number of tree species in order to apply it on a national scale to a tropical country such as Colombia. The technique applied is based on the Mahalanobis distance, a generalised squared distance statistic. We used environmental data integrated into a GIS, and a georeferenced collection of localities of Palicourea angustifolia and Palicourea guianensis to produce and test the predictive models. We used record data for Warszewiczia coccinea to validate the model. The two Palicourea species show largely complementary potential ranges. P. angustifolia shows a clear Andean distribution with a presence in lower and upper mountain areas but not in the highlands or in the inter-Andean valleys. P. guianensis was predicted throughout most of the lowland areas of Colombia including lowland Amazonian forests, and most of the tropical savannas of Orinoquia. The model predicted an overlapping distribution of the two species of 93.9 km². The Mahalanobian approach contributes to the development of biogeographically oriented modelling that makes the best use of the available data in data-scarce regions (such as most of the tropics). The technique provides key information about the environmental niche of the species being modelled, and allows comparisons between the species. The prediction achieved for the two species was considered satisfactory.
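
    A compact sketch of the Mahalanobis-distance niche model described above: the mean vector and covariance of the environmental variables are estimated at presence localities, and every map cell is scored by its (squared) Mahalanobis distance to that mean. The data here are random placeholders, not the Palicourea records or the Colombian GIS layers.

```python
# Mahalanobis-distance habitat-suitability sketch (smaller distance = closer
# to the species' environmental niche centroid).
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(42)
presence_env = rng.normal(size=(60, 4))   # env. variables at presence sites (placeholder)
cell_env = rng.normal(size=(1000, 4))     # env. variables for candidate map cells

mu = presence_env.mean(axis=0)
vi = np.linalg.inv(np.cov(presence_env, rowvar=False))   # inverse covariance

d2 = np.array([mahalanobis(c, mu, vi) ** 2 for c in cell_env])
suitable = d2 <= np.quantile(d2, 0.10)    # flag the 10% most similar cells
print(suitable.sum(), "cells flagged as potentially suitable")
```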

  13. The relationship between anogenital distance, fatherhood, and fertility in adult men.

    Directory of Open Access Journals (Sweden)

    Michael L Eisenberg

    Full Text Available BACKGROUND: Anogenital distance (AGD), a sexually dimorphic measure of genital development, is a marker for endocrine disruption in animal studies and may be shorter in infant males with genital anomalies. Given the correlation between anogenital distance and genital development, we sought to determine if anogenital distance varied in fertile compared to infertile adult men. METHODS: A cross sectional study of consecutive men being evaluated for infertility and men with proven fertility was recruited from an andrology clinic. Anogenital distance (the distance from the posterior aspect of the scrotum to the anal verge) and penile length (PL) were measured using digital calipers. ANOVA and linear regression were used to determine correlations between AGD, fatherhood status, and semen analysis parameters (sperm density, motility, and total motile sperm count). FINDINGS: A total of 117 infertile men (mean age: 35.3±17.4) and 56 fertile men (mean age: 44.8±9.7) were recruited. The infertile men possessed significantly shorter mean AGD and PL compared to the fertile controls (AGD: 31.8 vs 44.6 mm; PL: 107.1 vs 119.5 mm; p<0.01). The difference in AGD persisted even after accounting for ethnic and anthropomorphic differences. In addition to fatherhood, on both unadjusted and adjusted linear regression, AGD was significantly correlated with sperm density and total motile sperm count. After adjusting for demographic and reproductive variables, for each 1 cm increase in a man's AGD, the sperm density increases by 4.3 million sperm per mL (95% CI 0.53, 8.09; p = 0.03) and the total motile sperm count increases by 6.0 million sperm (95% CI 1.34, 10.58; p = 0.01). On adjusted analyses, no correlation was seen between penile length and semen parameters. CONCLUSION: A longer anogenital distance is associated with fatherhood and may predict normal male reproductive potential. Thus, AGD may provide a novel metric to assess reproductive potential in men.

  14. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be desirable. In this paper, we propose function-oriented cohesion metrics based on the analysis of live variables, live span and the visualization of a processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. Therefore, a well-defined, well-normalized, well-visualized and experimentally validated set of cohesion metrics is proposed to indicate, and thus help enhance, software cohesion strength. Furthermore, these cohesion metrics can be easily incorporated into software CASE tools to help software engineers improve software quality.

  15. MESUR: USAGE-BASED METRICS OF SCHOLARLY IMPACT

    Energy Technology Data Exchange (ETDEWEB)

    BOLLEN, JOHAN [Los Alamos National Laboratory; RODRIGUEZ, MARKO A. [Los Alamos National Laboratory; VAN DE SOMPEL, HERBERT [Los Alamos National Laboratory

    2007-01-30

    The evaluation of scholarly communication items is now largely a matter of expert opinion or metrics derived from citation data. Both approaches can fail to take into account the myriad of factors that shape scholarly impact. Usage data has emerged as a promising complement to existing methods of assessment, but the formal groundwork to reliably and validly apply usage-based metrics of scholarly impact is lacking. The Andrew W. Mellon Foundation-funded MESUR project constitutes a systematic effort to define, validate and cross-validate a range of usage-based metrics of scholarly impact by creating a semantic model of the scholarly communication process. The constructed model will serve as the basis for creating a large-scale semantic network that seamlessly relates citation, bibliographic and usage data from a variety of sources. A subsequent program that uses the established semantic network as a reference data set will determine the characteristics and semantics of a variety of usage-based metrics of scholarly impact. This paper outlines the architecture and methodology adopted by the MESUR project and its future direction.

  16. Hawking radiation from acoustic black holes, short distance and back reaction effects

    International Nuclear Information System (INIS)

    Balbinot, R.; Fabbri, A.; Parentani, R.

    2004-01-01

    Using the action principle we first review how linear density perturbations (sound waves) in an Eulerian fluid obey a relativistic equation: the d'Alembert equation. This analogy between propagation of sound and that of a massless scalar field in a Lorentzian metric also applies to non-homogeneous flows. In these cases, sound waves effectively propagate in a curved four-dimensional acoustic metric whose properties are determined by the flow. Using this analogy, we consider regular flows which become supersonic, and show that the acoustic metric behaves like that of a black hole. The analogy is so good that, when considering quantum mechanics, acoustic black holes should produce a thermal flux of Hawking phonons. We then focus on two interesting questions related to Hawking radiation which are not fully understood in the context of gravitational black holes due to the lack of a theory of quantum gravity. The first concerns the calculation of the modifications of Hawking radiation which are induced by dispersive effects at short distances, approaching the atomic scale when considering sound. We generalize existing treatments and calculate the modifications caused by the propagation near the black-hole horizon. The second question concerns back reaction effects. We return to the Eulerian action, compute second-order effects, and show that the back reaction of sound waves on the fluid's flow can be expressed in terms of their stress-energy tensor. Using this result in the context of Hawking radiation, we compute the secular effect on the background flow

  17. Phi index: a new metric to test the flush early and avoid the rush hypothesis.

    Science.gov (United States)

    Samia, Diogo S M; Blumstein, Daniel T

    2014-01-01

    Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD), and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are both inadequate to analyze relationships constrained by an envelope (such as that in the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and it controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight as stated by the FEAR hypothesis.

  18. Phi index: a new metric to test the flush early and avoid the rush hypothesis.

    Directory of Open Access Journals (Sweden)

    Diogo S M Samia

    Full Text Available Optimal escape theory states that animals should counterbalance the costs and benefits of flight when escaping from a potential predator. However, in apparent contradiction with this well-established optimality model, birds and mammals generally initiate escape soon after beginning to monitor an approaching threat, a phenomenon codified as the "Flush Early and Avoid the Rush" (FEAR) hypothesis. Typically, the FEAR hypothesis is tested using correlational statistics and is supported when there is a strong relationship between the distance at which an individual first responds behaviorally to an approaching predator (alert distance, AD), and its flight initiation distance (the distance at which it flees the approaching predator, FID). However, such correlational statistics are both inadequate to analyze relationships constrained by an envelope (such as that in the AD-FID relationship) and are sensitive to outliers with high leverage, which can lead one to erroneous conclusions. To overcome these statistical concerns we develop the phi index (Φ), a distribution-free metric to evaluate the goodness of fit of a 1:1 relationship in a constraint envelope (the prediction of the FEAR hypothesis). Using both simulation and empirical data, we conclude that Φ is superior to traditional correlational analyses because it explicitly tests the FEAR prediction, is robust to outliers, and it controls for the disproportionate influence of observations from large predictor values (caused by the constrained envelope in the AD-FID relationship). Importantly, by analyzing the empirical data we corroborate the strong effect that alertness has on flight as stated by the FEAR hypothesis.

  19. Classroom reconstruction of the Schwarzschild metric

    OpenAIRE

    Kassner, Klaus

    2015-01-01

    A promising way to introduce general relativity in the classroom is to study the physical implications of certain given metrics, such as the Schwarzschild one. This involves lower mathematical expenditure than an approach focusing on differential geometry in its full glory and permits emphasizing physical aspects before attacking the field equations. Even so, in terms of motivation, a lack of justification for the metric employed may pose an obstacle. The paper discusses how to establish the we...

  20. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  1. Datasets related to in-land water for limnology and remote sensing applications: distance-to-land, distance-to-water, water-body identifier and lake-centre co-ordinates.

    Science.gov (United States)

    Carrea, Laura; Embury, Owen; Merchant, Christopher J

    2015-11-01

    Datasets containing information to locate and identify water bodies have been generated from data locating static-water-bodies with resolution of about 300 m (1/360°) recently released by the Land Cover Climate Change Initiative (LC CCI) of the European Space Agency. The LC CCI water-bodies dataset has been obtained from multi-temporal metrics based on time series of the backscattered intensity recorded by ASAR on Envisat between 2005 and 2010. The new derived datasets provide coherently: distance to land, distance to water, water-body identifiers and lake-centre locations. The water-body identifier dataset locates the water bodies assigning the identifiers of the Global Lakes and Wetlands Database (GLWD), and lake centres are defined for in-land waters for which GLWD IDs were determined. The new datasets therefore link recent lake/reservoir/wetlands extent to the GLWD, together with a set of coordinates which locates unambiguously the water bodies in the database. Information on distance-to-land for each water cell and the distance-to-water for each land cell has many potential applications in remote sensing, where the applicability of geophysical retrieval algorithms may be affected by the presence of water or land within a satellite field of view (image pixel). During the generation and validation of the datasets some limitations of the GLWD database and of the LC CCI water-bodies mask have been found. Some examples of the inaccuracies/limitations are presented and discussed. Temporal change in water-body extent is common. Future versions of the LC CCI dataset are planned to represent temporal variation, and this will permit these derived datasets to be updated.

  2. Mahalanobis Distance-Based Classifiers are Able to Recognize EEG Patterns by Using Few EEG Electrodes

    Science.gov (United States)

    2001-10-25

    No abstract was recovered for this record; the source contains only report-form fragments: a partial author list (… Mouriño, Angela Cattini, Serenella Salinari, Maria Grazia Marciani and Febo Cincotti) and the performing organization, Dip. Fisiologia Umana e Farmacologia, Università "La Sapienza", Rome, Italy.

  3. COMPARISON OF EUCLIDEAN DISTANCE AND CANBERRA DISTANCE IN FACE RECOGNITION

    Directory of Open Access Journals (Sweden)

    Sendhy Rachmat Wurdianarto

    2014-08-01

    Full Text Available Knowledge in the world of computing is developing very rapidly. One sign of this is that computer science has extended into the field of biometrics. Biometrics refers to human characteristics that can be used to distinguish one person from another. One such characteristic/organ of the human body that can be used for identification (recognition) is the face. Starting from this, the paper explores a Matlab application for face recognition using the Euclidean Distance and Canberra Distance methods. The application was developed using the waterfall model, which comprises a sequence of process activities covering requirements analysis, design using UML (Unified Modeling Language), and processing of the input face images using Euclidean Distance and Canberra Distance. The conclusion is that, in the face recognition application, the Euclidean Distance and Canberra Distance methods each have their own strengths and weaknesses. In future work, the application can be extended to use video or other kinds of input objects.   Keywords: Euclidean Distance, Face Recognition, Biometrics, Canberra Distance

  4. Use of metrics in an effective ALARA program

    International Nuclear Information System (INIS)

    Bates, B.B. Jr.

    1996-01-01

    ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific ''indicators''. To choose the site-specific indicators that will be tracked and trended requires careful review. Justification is needed to defend the indicators selected, and maybe even stronger justification is needed for those indicators that are available but not chosen as a metric. Historically, the many different sources of information resided in a plethora of locations. Even the same type of metric had data located in different areas and could not be easily totaled for the entire Site. This required the end user to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that a customer can have all their questions addressed quickly and correctly. The database was developed in the beginning to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, ''user friendly'' software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics. These include quarterly performance indicator reports, monthly radiological incident reports, monthly external dose history and goals tracking reports, and the future use of performance indexing.

  5. Two projects in theoretical neuroscience: A convolution-based metric for neural membrane potentials and a combinatorial connectionist semantic network method

    Science.gov (United States)

    Evans, Garrett Nolan

    In this work, I present two projects that both contribute to the aim of discovering how intelligence manifests in the brain. The first project is a method for analyzing recorded neural signals, which takes the form of a convolution-based metric on neural membrane potential recordings. Relying only on integral and algebraic operations, the metric compares the timing and number of spikes within recordings as well as the recordings' subthreshold features: summarizing differences in these with a single "distance" between the recordings. Like van Rossum's (2001) metric for spike trains, the metric is based on a convolution operation that it performs on the input data. The kernel used for the convolution is carefully chosen such that it produces a desirable frequency space response and, unlike van Rossum's kernel, causes the metric to be first order both in differences between nearby spike times and in differences between same-time membrane potential values: an important trait. The second project is a combinatorial syntax method for connectionist semantic network encoding. Combinatorial syntax has been a point on which those who support a symbol-processing view of intelligent processing and those who favor a connectionist view have had difficulty seeing eye-to-eye. Symbol-processing theorists have persuasively argued that combinatorial syntax is necessary for certain intelligent mental operations, such as reasoning by analogy. Connectionists have focused on the versatility and adaptability offered by self-organizing networks of simple processing units. With this project, I show that there is a way to reconcile the two perspectives and to ascribe a combinatorial syntax to a connectionist network. The critical principle is to interpret nodes, or units, in the connectionist network as bound integrations of the interpretations for nodes that they share links with. Nodes need not correspond exactly to neurons and may correspond instead to distributed sets, or assemblies, of
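
    A stripped-down sketch of the convolution-based idea in the first project, in the spirit of van Rossum (2001): convolve each recording with a causal kernel and take an L2 norm of the difference. The exponential kernel and time constant below are assumptions for illustration; the thesis's carefully chosen kernel and its first-order properties are not reproduced.

```python
# Convolution-based "distance" between two recordings (spike trains here,
# but any sampled membrane-potential trace works the same way).
import numpy as np

def convolution_distance(x, y, dt=1e-4, tau=10e-3):
    t = np.arange(0, 5 * tau, dt)
    kernel = np.exp(-t / tau)                     # causal exponential kernel (assumed)
    fx = np.convolve(x, kernel)[: len(x)]
    fy = np.convolve(y, kernel)[: len(y)]
    return np.sqrt(np.sum((fx - fy) ** 2) * dt)   # single summary distance

a = np.zeros(1000); a[[100, 400, 800]] = 1        # toy spike train, 0.1 ms bins
b = np.zeros(1000); b[[120, 400, 900]] = 1
print(convolution_distance(a, b))
```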

  6. Comprehensive Metric Education Project: Implementing Metrics at a District Level Administrative Guide.

    Science.gov (United States)

    Borelli, Michael L.

    This document details the administrative issues associated with guiding a school district through its metrication efforts. Issues regarding staff development, curriculum development, and the acquisition of instructional resources are considered. Alternative solutions are offered. Finally, an overall implementation strategy is discussed with…

  7. Effective use of metrics in an ALARA program

    International Nuclear Information System (INIS)

    Bates, B.B. Jr.

    1996-01-01

    ALARA radiological protection programs require metrics to meet their objectives. Sources of metrics include: external dosimetry; internal dosimetry; radiological occurrences from the occurrence reporting and processing system (ORPS); and radiological incident reports (RIR). The sources themselves contain an abundance of specific ''indicators''. To choose the site-specific indicators that will be tracked and trended requires careful review. Historically, the many different sources of information resided in different locations, which required the end users to expend valuable time and effort to locate the data they needed. To address this problem, a central metrics database has been developed so that customers can have all their questions addressed quickly and correctly. The database was developed in the beginning to answer some of the customers' most frequently asked questions. It is now also a tool to communicate the status of the radiation protection program to facility managers. Finally, it also addresses requirements contained in the Rad Con manual and the 10CFR835 implementation guides. The database uses currently available, ''user friendly'' software and contains information from RIRs, ORPS, and external dosimetry records specific to ALARA performance indicators. The database is expandable to allow new metrics input. Specific reports have been developed to assist customers in their tracking and trending of ALARA metrics.

  8. A family of metric gravities

    Science.gov (United States)

    Shuler, Robert

    2018-04-01

    The goal of this paper is to take a completely fresh approach to metric gravity, in which the metric principle is strictly adhered to but its properties in local space-time are derived from conservation principles, not inferred from a global field equation. The global field strength variation then gains some flexibility, but only in the regime of very strong fields (2nd-order terms) whose measurement is now being contemplated. So doing provides a family of similar gravities, differing only in strong fields, which could be developed into meaningful verification targets for strong fields after the manner in which far-field variations were used in the 20th century. General Relativity (GR) is shown to be a member of the family and this is demonstrated by deriving the Schwarzschild metric exactly from a suitable field strength assumption. The method of doing so is interesting in itself because it involves only one differential equation rather than the usual four. Exact static symmetric field solutions are also given for one pedagogical alternative based on potential, and one theoretical alternative based on inertia, and the prospects of experimentally differentiating these are analyzed. Whether the method overturns the conventional wisdom that GR is the only metric theory of gravity and that alternatives must introduce additional interactions and fields is somewhat semantical, depending on whether one views the field strength assumption as a field and whether the assumption that produces GR is considered unique in some way. It is of course possible to have other fields, and the local space-time principle can be applied to field gravities which usually are weak-field approximations having only time dilation, giving them the spatial factor and promoting them to full metric theories. Though usually pedagogical, some of them are interesting from a quantum gravity perspective. Cases are noted where mass measurement errors, or distributions of dark matter, can cause one

  9. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  10. An accurate metric for the spacetime around rotating neutron stars

    Science.gov (United States)

    Pappas, George

    2017-04-01

    The problem of having an accurate description of the spacetime around rotating neutron stars is of great astrophysical interest. For astrophysical applications, one needs to have a metric that captures all the properties of the spacetime around a rotating neutron star. Furthermore, an accurate, appropriately parametrized metric, i.e., a metric that is given in terms of parameters that are directly related to the physical structure of the neutron star, could be used to solve the inverse problem, which is to infer the properties of the structure of a neutron star from astrophysical observations. In this work, we present such an approximate stationary and axisymmetric metric for the exterior of rotating neutron stars, which is constructed using the Ernst formalism and is parametrized by the relativistic multipole moments of the central object. This metric is given in terms of an expansion on the Weyl-Papapetrou coordinates with the multipole moments as free parameters and is shown to be extremely accurate in capturing the physical properties of a neutron star spacetime as they are calculated numerically in general relativity. Because the metric is given in terms of an expansion, the expressions are much simpler and easier to implement, in contrast to previous approaches. For the parametrization of the metric in general relativity, the recently discovered universal 3-hair relations are used to produce a three-parameter metric. Finally, a straightforward extension of this metric is given for scalar-tensor theories with a massless scalar field, which also admit a formulation in terms of an Ernst potential.

  11. Metric reconstruction from Weyl scalars

    Energy Technology Data Exchange (ETDEWEB)

    Whiting, Bernard F; Price, Larry R [Department of Physics, PO Box 118440, University of Florida, Gainesville, FL 32611 (United States)

    2005-08-07

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations.

  12. Metric reconstruction from Weyl scalars

    International Nuclear Information System (INIS)

    Whiting, Bernard F; Price, Larry R

    2005-01-01

    The Kerr geometry has remained an elusive world in which to explore physics and delve into the more esoteric implications of general relativity. Following the discovery, by Kerr in 1963, of the metric for a rotating black hole, the most major advance has been an understanding of its Weyl curvature perturbations based on Teukolsky's discovery of separable wave equations some ten years later. In the current research climate, where experiments across the globe are preparing for the first detection of gravitational waves, a more complete understanding than concerns just the Weyl curvature is now called for. To understand precisely how comparatively small masses move in response to the gravitational waves they emit, a formalism has been developed based on a description of the whole spacetime metric perturbation in the neighbourhood of the emission region. Presently, such a description is not available for the Kerr geometry. While there does exist a prescription for obtaining metric perturbations once curvature perturbations are known, it has become apparent that there are gaps in that formalism which are still waiting to be filled. The most serious gaps include gauge inflexibility, the inability to include sources-which are essential when the emitting masses are considered-and the failure to describe the l = 0 and 1 perturbation properties. Among these latter properties of the perturbed spacetime, arising from a point mass in orbit, are the perturbed mass and axial component of angular momentum, as well as the very elusive Carter constant for non-axial angular momentum. A status report is given on recent work which begins to repair these deficiencies in our current incomplete description of Kerr metric perturbations

  13. Using Publication Metrics to Highlight Academic Productivity and Research Impact

    Science.gov (United States)

    Carpenter, Christopher R.; Cone, David C.; Sarli, Cathy C.

    2016-01-01

    This article provides a broad overview of widely available measures of academic productivity and impact using publication data and highlights uses of these metrics for various purposes. Metrics based on publication data include measures such as number of publications, number of citations, the journal impact factor score, and the h-index, as well as emerging metrics based on document-level metrics. Publication metrics can be used for a variety of purposes for tenure and promotion, grant applications and renewal reports, benchmarking, recruiting efforts, and administrative purposes for departmental or university performance reports. The authors also highlight practical applications of measuring and reporting academic productivity and impact to emphasize and promote individual investigators, grant applications, or department output. PMID:25308141
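
    Of the measures named here, the h-index has a particularly simple definition: the largest h such that h of an author's papers have at least h citations each. A minimal sketch of that computation (the citation counts below are invented):

```python
# h-index: the largest h such that h of the papers have at least h citations.
def h_index(citations):
    cited = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(cited, start=1) if c >= rank)

print(h_index([10, 8, 5, 4, 3]))   # -> 4
print(h_index([25, 8, 5, 3, 3]))   # -> 3
```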

  14. Comparison of routing metrics for wireless mesh networks

    CSIR Research Space (South Africa)

    Nxumalo, SL

    2011-09-01

    Full Text Available in each and every relay node so as to find the next hop for the packet. A routing metric is simply a measure used for selecting the best path, used by a routing protocol. Figure 2 shows the relationship between a routing protocol and the routing... on its QoS-awareness level. The routing metrics that considered QoS the most were selected from each group. This section discusses the four routing metrics that were compared in this paper, which are: hop count (HOP), expected transmission count (ETX...
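
    One of the metrics named in this record, the expected transmission count (ETX), is conventionally defined as ETX = 1/(df × dr), where df and dr are the measured forward and reverse delivery ratios of a link, and a route's metric is the sum over its links. The sketch below uses that standard definition, which is an assumption here since the paper's exact formulation is not shown in the excerpt.

```python
# Standard ETX link metric and its additive route metric (assumed definitions).
def link_etx(delivery_forward: float, delivery_reverse: float) -> float:
    return 1.0 / (delivery_forward * delivery_reverse)

def route_etx(links) -> float:
    return sum(link_etx(df, dr) for df, dr in links)

# Hop count alone would prefer the single lossy link; ETX prefers two good links.
print(route_etx([(0.9, 0.95), (0.9, 0.9)]))   # ~2.4
print(route_etx([(0.5, 0.6)]))                # ~3.3
```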

  15. The metrics and correlates of physician migration from Africa

    Directory of Open Access Journals (Sweden)

    Arah Onyebuchi A

    2007-05-01

    Full Text Available Abstract Background Physician migration from poor to rich countries is considered an important contributor to the growing health workforce crisis in the developing world. This is particularly true for Africa. The perceived magnitude of such migration for each source country might, however, depend on the choice of metrics used in the analysis. This study examined the influence of choice of migration metrics on the rankings of African countries that suffered the most physician migration, and investigated the correlates of physician migration. Methods Ranking and correlational analyses were conducted on African physician migration data adjusted for bilateral net flows, and supplemented with developmental, economic and health system data. The setting was the 53 African birth countries of African-born physicians working in nine wealthier destination countries. Three metrics of physician migration were used: total number of physician émigrés; emigration fraction defined as the proportion of the potential physician pool working in destination countries; and physician migration density defined as the number of physician émigrés per 1000 population of the African source country. Results Rankings based on any of the migration metrics differed substantially from those based on the other two metrics. Although the emigration fraction and physician migration density metrics gave proportionality to the migration crisis, only the latter was consistently associated with source countries' workforce capacity, health, health spending, economic and development characteristics. As such, higher physician migration density was seen among African countries with relatively higher health workforce capacity (0.401 ≤ r ≤ 0.694, p ≤ 0.011), health status, health spending, and development. Conclusion The perceived magnitude of physician migration is sensitive to the choice of metrics. Complementing the emigration fraction, the physician migration density is a metric

  16. Description of the Sandia National Laboratories science, technology & engineering metrics process.

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter

    2010-04-01

    There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.

  17. Relevance of motion-related assessment metrics in laparoscopic surgery.

    Science.gov (United States)

    Oropesa, Ignacio; Chmarra, Magdalena K; Sánchez-González, Patricia; Lamata, Pablo; Rodrigues, Sharon P; Enciso, Silvia; Sánchez-Margallo, Francisco M; Jansen, Frank-Willem; Dankelman, Jenny; Gómez, Enrique J

    2013-06-01

    Motion metrics have become an important source of information when addressing the assessment of surgical expertise. However, their direct relationship with the different surgical skills has not been fully explored. The purpose of this study is to investigate the relevance of motion-related metrics in the evaluation processes of basic psychomotor laparoscopic skills and their correlation with the different abilities sought to measure. A framework for task definition and metric analysis is proposed. An explorative survey was first conducted with a board of experts to identify metrics to assess basic psychomotor skills. Based on the output of that survey, 3 novel tasks for surgical assessment were designed. Face and construct validation was performed, with focus on motion-related metrics. Tasks were performed by 42 participants (16 novices, 22 residents, and 4 experts). Movements of the laparoscopic instruments were registered with the TrEndo tracking system and analyzed. Time, path length, and depth showed construct validity for all 3 tasks. Motion smoothness and idle time also showed validity for tasks involving bimanual coordination and tasks requiring a more tactical approach, respectively. Additionally, motion smoothness and average speed showed a high internal consistency, proving them to be the most task-independent of all the metrics analyzed. Motion metrics are complementary and valid for assessing basic psychomotor skills, and their relevance depends on the skill being evaluated. A larger clinical implementation, combined with quality performance information, will give more insight on the relevance of the results shown in this study.

  18. Left-invariant Einstein metrics on S3 ×S3

    Science.gov (United States)

    Belgun, Florin; Cortés, Vicente; Haupt, Alexander S.; Lindemann, David

    2018-06-01

    The classification of homogeneous compact Einstein manifolds in dimension six is an open problem. We consider the remaining open case, namely left-invariant Einstein metrics g on G = SU(2) × SU(2) = S3 × S3. Einstein metrics are critical points of the total scalar curvature functional for fixed volume. The scalar curvature S of a left-invariant metric g is constant and can be expressed as a rational function in the parameters determining the metric. The critical points of S, subject to the volume constraint, are given by the zero locus of a system of polynomials in the parameters. In general, however, the determination of the zero locus is apparently out of reach. Instead, we consider the case where the isotropy group K of g in the group of motions is non-trivial. When K ≇ Z2 we prove that the Einstein metrics on G are given by (up to homothety) either the standard metric or the nearly Kähler metric, based on representation-theoretic arguments and computer algebra. For the remaining case K ≅ Z2 we present partial results.

  19. A practical approach to determine dose metrics for nanomaterials.

    Science.gov (United States)

    Delmaar, Christiaan J E; Peijnenburg, Willie J G M; Oomen, Agnes G; Chen, Jingwen; de Jong, Wim H; Sips, Adriënne J A M; Wang, Zhuang; Park, Margriet V D Z

    2015-05-01

    Traditionally, administered mass is used to describe doses of conventional chemical substances in toxicity studies. For deriving toxic doses of nanomaterials, mass and chemical composition alone may not adequately describe the dose, because particles with the same chemical composition can have completely different toxic mass doses depending on properties such as particle size. Other dose metrics such as particle number, volume, or surface area have been suggested, but consensus is lacking. The discussion regarding the most adequate dose metric for nanomaterials clearly needs a systematic, unbiased approach to determine the most appropriate dose metric for nanomaterials. In the present study, the authors propose such an approach and apply it to results from in vitro and in vivo experiments with silver and silica nanomaterials. The proposed approach is shown to provide a convenient tool to systematically investigate and interpret dose metrics of nanomaterials. Recommendations for study designs aimed at investigating dose metrics are provided. © 2015 SETAC.
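
    As a numerical illustration of why the choice of dose metric matters, the sketch below converts one administered mass dose into particle-number and surface-area doses for monodisperse spherical particles of two different sizes; the formulae are elementary geometry and the numbers are illustrative, not values from the silver or silica experiments in the abstract.

```python
# Mass dose -> particle-number and surface-area doses for monodisperse spheres.
import math

def alternative_dose_metrics(mass_ug, diameter_nm, density_g_cm3):
    r_cm = diameter_nm * 1e-7 / 2                                    # nm -> cm
    particle_mass_ug = density_g_cm3 * (4 / 3) * math.pi * r_cm**3 * 1e6
    n_particles = mass_ug / particle_mass_ug
    surface_cm2 = n_particles * 4 * math.pi * r_cm**2
    return n_particles, surface_cm2

for d_nm in (20, 100):                       # same 10 ug dose, two particle sizes
    n, s = alternative_dose_metrics(10, d_nm, 10.5)   # density roughly that of silver
    print(f"{d_nm} nm: {n:.2e} particles, {s:.2f} cm^2 total surface")
```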

  20. Fisher information metrics for binary classifier evaluation and training

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Different evaluation metrics for binary classifiers are appropriate to different scientific domains and even to different problems within the same domain. This presentation focuses on the optimisation of event selection to minimise statistical errors in HEP parameter estimation, a problem that is best analysed in terms of the maximisation of Fisher information about the measured parameters. After describing a general formalism to derive evaluation metrics based on Fisher information, three more specific metrics are introduced for the measurements of signal cross sections in counting experiments (FIP1) or distribution fits (FIP2) and for the measurements of other parameters from distribution fits (FIP3). The FIP2 metric is particularly interesting because it can be derived from any ROC curve, provided that prevalence is also known. In addition to its relation to measurement errors when used as an evaluation criterion (which makes it more interesting than the ROC AUC), a further advantage of the FIP2 metric is ...

  1. Combining multivariate analysis and monosaccharide composition modeling to identify plant cell wall variations by Fourier Transform Near Infrared spectroscopy

    Directory of Open Access Journals (Sweden)

    Smith-Moritz Andreia M

    2011-08-01

    Full Text Available Abstract We outline a high throughput procedure that improves outlier detection in cell wall screens using FT-NIR spectroscopy of plant leaves. The improvement relies on generating a calibration set from a subset of a mutant population by taking advantage of the Mahalanobis distance outlier scheme to construct a monosaccharide range predictive model using PLS regression. This model was then used to identify specific monosaccharide outliers from the mutant population.
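
    A rough sketch of the two-step procedure summarized above: spectra that are "typical" by Mahalanobis distance (computed here on PCA scores) form the calibration set, and a PLS regression trained on that set predicts monosaccharide content for the whole population so that outliers can be flagged. Spectra, reference values and cut-offs are placeholders, not the screen's actual data or thresholds.

```python
# Mahalanobis-screened calibration set + PLS regression for outlier detection.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
spectra = rng.normal(size=(200, 500))      # FT-NIR spectra (placeholder)
monosaccharide = rng.normal(size=200)      # reference monosaccharide values (placeholder)

scores = PCA(n_components=5).fit_transform(spectra)          # centred PCA scores
vi = np.linalg.inv(np.cov(scores, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", scores, vi, scores)            # squared Mahalanobis distance
calibration = d2 < np.quantile(d2, 0.8)                      # keep the most "typical" 80%

pls = PLSRegression(n_components=5).fit(spectra[calibration], monosaccharide[calibration])
residual = pls.predict(spectra).ravel() - monosaccharide
outliers = np.abs(residual) > 2 * residual.std()             # flag candidate outliers
print(outliers.sum(), "candidate cell-wall outliers")
```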

  2. Classification of Pulse Waveforms Using Edit Distance with Real Penalty

    Directory of Open Access Journals (Sweden)

    Zhang Dongyu

    2010-01-01

    Full Text Available Abstract Advances in sensor and signal processing techniques have provided effective tools for quantitative research in traditional Chinese pulse diagnosis (TCPD). Because of the inevitable intraclass variation of pulse patterns, the automatic classification of pulse waveforms has remained a difficult problem. In this paper, by referring to the edit distance with real penalty (ERP) and the recent progress in K-nearest neighbors (KNN) classifiers, we propose two novel ERP-based KNN classifiers. Taking advantage of the metric property of ERP, we first develop an ERP-induced inner product and a Gaussian ERP kernel, then embed them into difference-weighted KNN classifiers, and finally develop two novel classifiers for pulse waveform classification. The experimental results show that the proposed classifiers are effective for accurate classification of pulse waveforms.
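
    For reference, the ERP distance itself admits a short dynamic-programming implementation with a constant gap element g (commonly 0); the sketch below implements that textbook recurrence and is not the paper's ERP-induced inner product, Gaussian ERP kernel or difference-weighted KNN classifiers.

```python
# Edit distance with Real Penalty (ERP) between two real-valued sequences.
import numpy as np

def erp_distance(x, y, g=0.0):
    n, m = len(x), len(y)
    D = np.zeros((n + 1, m + 1))
    D[1:, 0] = np.cumsum([abs(v - g) for v in x])   # cost of gapping all of x
    D[0, 1:] = np.cumsum([abs(v - g) for v in y])   # cost of gapping all of y
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = min(
                D[i - 1, j - 1] + abs(x[i - 1] - y[j - 1]),  # match
                D[i - 1, j] + abs(x[i - 1] - g),             # gap opposite x[i-1]
                D[i, j - 1] + abs(y[j - 1] - g),             # gap opposite y[j-1]
            )
    return D[n, m]

print(erp_distance([0.0, 1.0, 2.0], [0.0, 2.0]))   # -> 1.0
```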

  3. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  4. Distance Learning

    National Research Council Canada - National Science Library

    Braddock, Joseph

    1997-01-01

    A study reviewing the existing Army Distance Learning Plan (ADLP) and current Distance Learning practices, with a focus on the Army's training and educational challenges and the benefits of applying Distance Learning techniques...

  5. Important LiDAR metrics for discriminating forest tree species in Central Europe

    Science.gov (United States)

    Shi, Yifang; Wang, Tiejun; Skidmore, Andrew K.; Heurich, Marco

    2018-03-01

    Numerous airborne LiDAR-derived metrics have been proposed for classifying tree species. Yet an in-depth ecological and biological understanding of the significance of these metrics for tree species mapping remains largely unexplored. In this paper, we evaluated the performance of 37 frequently used LiDAR metrics derived under leaf-on and leaf-off conditions, respectively, for discriminating six different tree species in a natural forest in Germany. We firstly assessed the correlation between these metrics. Then we applied a Random Forest algorithm to classify the tree species and evaluated the importance of the LiDAR metrics. Finally, we identified the most important LiDAR metrics and tested their robustness and transferability. Our results indicated that about 60% of LiDAR metrics were highly correlated to each other (|r| > 0.7). There was no statistically significant difference in tree species mapping accuracy between the use of leaf-on and leaf-off LiDAR metrics. However, combining leaf-on and leaf-off LiDAR metrics significantly increased the overall accuracy from 58.2% (leaf-on) and 62.0% (leaf-off) to 66.5% as well as the kappa coefficient from 0.47 (leaf-on) and 0.51 (leaf-off) to 0.58. Radiometric features, especially intensity related metrics, provided more consistent and significant contributions than geometric features for tree species discrimination. Specifically, the mean intensity of first-or-single returns as well as the mean value of echo width were identified as the most robust LiDAR metrics for tree species discrimination. These results indicate that metrics derived from airborne LiDAR data, especially radiometric metrics, can aid in discriminating tree species in a mixed temperate forest, and represent candidate metrics for tree species classification and monitoring in Central Europe.

  6. Human Performance Optimization Metrics: Consensus Findings, Gaps, and Recommendations for Future Research.

    Science.gov (United States)

    Nindl, Bradley C; Jaffin, Dianna P; Dretsch, Michael N; Cheuvront, Samuel N; Wesensten, Nancy J; Kent, Michael L; Grunberg, Neil E; Pierce, Joseph R; Barry, Erin S; Scott, Jonathan M; Young, Andrew J; OʼConnor, Francis G; Deuster, Patricia A

    2015-11-01

    Human performance optimization (HPO) is defined as "the process of applying knowledge, skills and emerging technologies to improve and preserve the capabilities of military members, and organizations to execute essential tasks." The lack of consensus for operationally relevant and standardized metrics that meet joint military requirements has been identified as the single most important gap for research and application of HPO. In 2013, the Consortium for Health and Military Performance hosted a meeting to develop a toolkit of standardized HPO metrics for use in military and civilian research, and potentially for field applications by commanders, units, and organizations. Performance was considered from a holistic perspective as being influenced by various behaviors and barriers. To accomplish the goal of developing a standardized toolkit, key metrics were identified and evaluated across a spectrum of domains that contribute to HPO: physical performance, nutritional status, psychological status, cognitive performance, environmental challenges, sleep, and pain. These domains were chosen based on relevant data with regard to performance enhancers and degraders. The specific objectives at this meeting were to (a) identify and evaluate current metrics for assessing human performance within selected domains; (b) prioritize metrics within each domain to establish a human performance assessment toolkit; and (c) identify scientific gaps and the needed research to more effectively assess human performance across domains. This article provides a summary of 150 total HPO metrics across multiple domains that can be used as a starting point, the beginning of an HPO toolkit: physical fitness (29 metrics), nutrition (24 metrics), psychological status (36 metrics), cognitive performance (35 metrics), environment (12 metrics), sleep (9 metrics), and pain (5 metrics). These metrics can be particularly valuable as the military emphasizes a renewed interest in Human Dimension efforts

  7. Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact

    Science.gov (United States)

    2011-01-01

    Background Citations in peer-reviewed articles and the impact factor are generally accepted measures of scientific impact. Web 2.0 tools such as Twitter, blogs or social bookmarking tools provide the possibility to construct innovative article-level or journal-level metrics to gauge impact and influence. However, the relationship of these new metrics to traditional metrics such as citations is not known. Objective (1) To explore the feasibility of measuring social impact of and public attention to scholarly articles by analyzing buzz in social media, (2) to explore the dynamics, content, and timing of tweets relative to the publication of a scholarly article, and (3) to explore whether these metrics are sensitive and specific enough to predict highly cited articles. Methods Between July 2008 and November 2011, all tweets containing links to articles in the Journal of Medical Internet Research (JMIR) were mined. For a subset of 1573 tweets about 55 articles published between issues 3/2009 and 2/2010, different metrics of social media impact were calculated and compared against subsequent citation data from Scopus and Google Scholar 17 to 29 months later. A heuristic to predict the top-cited articles in each issue through tweet metrics was validated. Results A total of 4208 tweets cited 286 distinct JMIR articles. The distribution of tweets over the first 30 days after article publication followed a power law (Zipf, Bradford, or Pareto distribution), with most tweets sent on the day when an article was published (1458/3318, 43.94% of all tweets in a 60-day period) or on the following day (528/3318, 15.9%), followed by a rapid decay. The Pearson correlations between tweetations and citations were moderate and statistically significant, with correlation coefficients ranging from .42 to .72 for the log-transformed Google Scholar citations, but were less clear for Scopus citations and rank correlations. A linear multivariate model with time and tweets as significant
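
    The correlation analysis described above is straightforward to reproduce in outline. The sketch below uses made-up per-article counts (the JMIR data are not reproduced here) and shows the log transformation of citation counts before computing a Pearson correlation, alongside a rank correlation for comparison.

    import numpy as np
    from scipy import stats

    # Hypothetical per-article tweet and citation counts.
    tweets    = np.array([55, 12, 3, 80, 7, 20, 1, 40, 9, 15])
    citations = np.array([60, 14, 5, 90, 10, 25, 2, 35, 8, 18])

    # Log-transform citations (log1p handles zero counts) before the Pearson correlation,
    # since citation distributions are heavily skewed.
    r, p = stats.pearsonr(tweets, np.log1p(citations))
    print(f"Pearson r = {r:.2f} (p = {p:.3f})")

    # A rank correlation is less sensitive to skew and outliers.
    rho, p_rho = stats.spearmanr(tweets, citations)
    print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")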

  8. The Integration of Internal and External Training Load Metrics in Hurling

    Directory of Open Access Journals (Sweden)

    Malone Shane

    2016-12-01

    Full Text Available The current study aimed to assess the relationship between the hurling player’s fitness profile and integrated training load (TL) metrics. Twenty-five hurling players performed treadmill testing for VO2max, the speed at blood lactate concentrations of 2 mmol·L-1 (vLT) and 4 mmol·L-1 (vOBLA), and the heart rate-blood lactate profile for calculation of individual training impulse (iTRIMP). The total distance (TD; m), high speed distance (HSD; m) and sprint distance (SD; m) covered were measured using GPS technology (4-Hz, VX Sport, Lower Hutt, New Zealand), which allowed for the measurement of the external TL. The external TL was divided by the internal TL to form integration ratios. Pearson correlation analyses were used to assess the relationships between fitness measures and the integration ratios during simulated match play. External measures of the TL alone showed limited correlations with fitness measures. Integrated TL ratios showed significant relationships with fitness measures in players. TD:iTRIMP was correlated with the aerobic fitness measures VO2max (r = 0.524; p = 0.006; 95% CI: 0.224 to 0.754; large) and vOBLA (r = 0.559; p = 0.003; 95% CI: 0.254 to 0.854; large). HSD:iTRIMP also correlated with aerobic markers of fitness: vLT (r = 0.502; p = 0.009; 95% CI: 0.204 to 0.801; large) and vOBLA (r = 0.407; p = 0.039; 95% CI: 0.024 to 0.644; moderate). Interestingly, SD:iTRIMP also showed a significant correlation with vLT (r = 0.611; p = 0.001; 95% CI: 0.324 to 0.754; large). The current study showed that TL ratios can provide practitioners with a measure of fitness, as external performance alone showed limited relationships with aerobic fitness measures.
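
    A minimal sketch of how the integration ratios are formed and related to fitness, assuming hypothetical values for eight players (the study's data are not reproduced here): external load (TD and HSD from GPS) is divided by internal load (iTRIMP), and each ratio is then correlated with VO2max.

    import numpy as np
    from scipy import stats

    # Hypothetical per-player values.
    td     = np.array([6800., 7200., 6500., 7100., 6900., 7400., 6600., 7000.])  # total distance (m)
    hsd    = np.array([ 620.,  710.,  540.,  690.,  650.,  760.,  570.,  640.])  # high speed distance (m)
    itrimp = np.array([ 310.,  280.,  340.,  290.,  305.,  270.,  330.,  300.])  # iTRIMP (AU)
    vo2max = np.array([  52.,   56.,   49.,   55.,   53.,   58.,   50.,   54.])  # ml·kg-1·min-1

    # Integration ratios: external TL divided by internal TL.
    ratios = {"TD:iTRIMP": td / itrimp, "HSD:iTRIMP": hsd / itrimp}

    for name, ratio in ratios.items():
        r, p = stats.pearsonr(ratio, vo2max)
        print(f"{name} vs VO2max: r = {r:.2f}, p = {p:.3f}")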

  9. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    International Nuclear Information System (INIS)

    Lobo, Iarley P.; Loret, Niccolo; Nettel, Francisco

    2017-01-01

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)
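
    For orientation, a commonly used rainbow-metric ansatz (in the spirit of Magueijo and Smolin) can be written as follows; this is a generic illustration of a momentum-dependent line element and its associated modified dispersion relation, not the four-velocity-dependent construction proposed in the paper.

    \[
      ds^{2} = -\frac{dt^{2}}{f_{1}^{2}(E/E_{\mathrm{P}})}
               + \frac{dx_{i}\,dx^{i}}{f_{2}^{2}(E/E_{\mathrm{P}})},
      \qquad
      E^{2} f_{1}^{2}(E/E_{\mathrm{P}}) - p^{2} f_{2}^{2}(E/E_{\mathrm{P}}) = m^{2},
    \]

    with the rainbow functions satisfying f_1, f_2 → 1 as E/E_P → 0, so that the ordinary Minkowski metric is recovered at energies far below the Planck scale E_P.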

  10. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    Science.gov (United States)

    Lobo, Iarley P.; Loret, Niccoló; Nettel, Francisco

    2017-07-01

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations.

  11. Rainbows without unicorns: metric structures in theories with modified dispersion relations

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Iarley P. [Universita 'La Sapienza', Dipartimento di Fisica, Rome (Italy); ICRANet, Pescara (Italy); CAPES Foundation, Ministry of Education of Brazil, Brasilia (Brazil); Universidade Federal da Paraiba, Departamento de Fisica, Joao Pessoa, PB (Brazil); INFN Sezione Roma 1 (Italy)]; Loret, Niccolo [Ruder Boskovic Institute, Division of Theoretical Physics, Zagreb (Croatia)]; Nettel, Francisco [Universita 'La Sapienza', Dipartimento di Fisica, Rome (Italy); Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares, Mexico (Mexico); INFN Sezione Roma 1 (Italy)]

    2017-07-15

    Rainbow metrics are a widely used approach to the metric formalism for theories with modified dispersion relations. They have had a huge success in the quantum gravity phenomenology literature, since they allow one to introduce momentum-dependent space-time metrics into the description of systems with a modified dispersion relation. In this paper, we introduce the reader to some realizations of this general idea: the original rainbow metrics proposal, the momentum-space-inspired metric and a Finsler geometry approach. As the main result of this work we also present an alternative definition of a four-velocity dependent metric which allows one to handle the massless limit. This paper aims to highlight some of their properties and how to properly describe their relativistic realizations. (orig.)

  12. Temporal variability of daily personal magnetic field exposure metrics in pregnant women.

    Science.gov (United States)

    Lewis, Ryan C; Evenson, Kelly R; Savitz, David A; Meeker, John D

    2015-01-01

    Recent epidemiology studies of power-frequency magnetic fields and reproductive health have characterized exposures using data collected from personal exposure monitors over a single day, possibly resulting in exposure misclassification due to temporal variability in daily personal magnetic field exposure metrics, but relevant data in adults are limited. We assessed the temporal variability of daily central tendency (time-weighted average, median) and peak (upper percentiles, maximum) personal magnetic field exposure metrics over 7 consecutive days in 100 pregnant women. When exposure was modeled as a continuous variable, central tendency metrics had substantial reliability, whereas peak metrics had fair (maximum) to moderate (upper percentiles) reliability. The predictive ability of a single-day metric to accurately classify participants into exposure categories based on a weeklong metric depended on the selected exposure threshold, with sensitivity decreasing with increasing exposure threshold. Consistent with the continuous measures analysis, sensitivity was higher for central tendency metrics than for peak metrics. If there is interest in peak metrics, more than 1 day of measurement is needed over the window of disease susceptibility to minimize measurement error, but 1 day may be sufficient for central tendency metrics.
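
    The daily metrics discussed above are simple to compute once a personal-monitor trace is available. The Python sketch below uses a synthetic one-day trace sampled every 10 seconds (all values are made up) and derives the central-tendency and peak metrics named in the abstract; with a constant sampling interval the time-weighted average reduces to the arithmetic mean.

    import numpy as np

    # Synthetic one-day trace: magnetic flux density (µT) sampled every 10 s (8640 samples).
    rng = np.random.default_rng(1)
    b_field = rng.lognormal(mean=-2.0, sigma=0.8, size=8640)

    # Central-tendency metrics.
    twa    = b_field.mean()          # time-weighted average (constant interval -> plain mean)
    median = np.median(b_field)

    # Peak metrics.
    p90, p95, p99 = np.percentile(b_field, [90, 95, 99])
    peak = b_field.max()

    print(f"TWA={twa:.3f} µT  median={median:.3f} µT  p95={p95:.3f} µT  max={peak:.3f} µT")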

  13. Analysis of Molecular Variance Inferred from Metric Distances among DNA Haplotypes: Application to Human Mitochondrial DNA Restriction Data

    OpenAIRE

    Excoffier, L.; Smouse, P. E.; Quattro, J. M.

    1992-01-01

    We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared-distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as φ-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivisi...
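
    In the standard presentation of this framework, the total sum of squares is obtained directly from the matrix of squared inter-haplotype distances and is then partitioned into hierarchical variance components, from which the φ-statistics are formed. The equations below follow that usual presentation rather than quoting the paper.

    \[
      SS_{\mathrm{total}} = \frac{1}{2N}\sum_{i=1}^{N}\sum_{j=1}^{N}\delta_{ij}^{2},
      \qquad
      \sigma_{T}^{2} = \sigma_{a}^{2} + \sigma_{b}^{2} + \sigma_{c}^{2},
    \]
    \[
      \Phi_{CT} = \frac{\sigma_{a}^{2}}{\sigma_{T}^{2}}, \qquad
      \Phi_{SC} = \frac{\sigma_{b}^{2}}{\sigma_{b}^{2} + \sigma_{c}^{2}}, \qquad
      \Phi_{ST} = \frac{\sigma_{a}^{2} + \sigma_{b}^{2}}{\sigma_{T}^{2}},
    \]

    where δ²ij is the squared distance between haplotypes i and j, and σ²a, σ²b and σ²c are the among-group, among-population-within-group and within-population variance components, respectively.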

  14. Development of soil quality metrics using mycorrhizal fungi

    Energy Technology Data Exchange (ETDEWEB)

    Baar, J.

    2010-07-01

    Based on the Treaty on Biological Diversity of Rio de Janeiro in 1992 for maintaining and increasing biodiversity, several countries have started programmes monitoring soil quality and the above- and below-ground biodiversity. Within the European Union, policy makers are working on legislation for soil protection and management. Therefore, indicators are needed to monitor the status of soils, and these indicators, which reflect soil quality, can be integrated into working standards or soil quality metrics. Soil micro-organisms, particularly arbuscular mycorrhizal fungi (AMF), are indicative of soil changes. These soil fungi live in symbiosis with the great majority of plants and are sensitive to changes in the physico-chemical conditions of the soil. The aim of this study was to investigate whether AMF are reliable and sensitive indicators of disturbances in soils and can be used for the development of soil quality metrics. It was also studied whether soil quality metrics based on AMF meet the applicability requirements of users and policy makers. Ecological criteria were set for the development of soil quality metrics for different soils. Multiple root samples containing AMF from various locations in The Netherlands were analyzed. The results of the analyses were related to the defined criteria. This resulted in two soil quality metrics, one for sandy soils and a second one for clay soils, with six different categories ranging from very bad to very good. These soil quality metrics meet the majority of requirements for applicability and are potentially useful for the development of legislation for the protection of soil quality. (Author) 23 refs.

  15. Measures of agreement between computation and experiment:validation metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew Franklin; Oberkampf, William Louis

    2005-08-01

    With the increasing role of computational modeling in engineering design, performance estimation, and safety assessment, improved methods are needed for comparing computational results and experimental measurements. Traditional methods of graphically comparing computational and experimental results, though valuable, are essentially qualitative. Computable measures are needed that can quantitatively compare computational and experimental results over a range of input, or control, variables and sharpen assessment of computational accuracy. This type of measure has been recently referred to as a validation metric. We discuss various features that we believe should be incorporated in a validation metric and also features that should be excluded. We develop a new validation metric that is based on the statistical concept of confidence intervals. Using this fundamental concept, we construct two specific metrics: one that requires interpolation of experimental data and one that requires regression (curve fitting) of experimental data. We apply the metrics to three example problems: thermal decomposition of a polyurethane foam, a turbulent buoyant plume of helium, and compressibility effects on the growth rate of a turbulent free-shear layer. We discuss how the present metrics are easily interpretable for assessing computational model accuracy, as well as the impact of experimental measurement uncertainty on the accuracy assessment.
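
    A minimal numerical sketch of a confidence-interval-based comparison at a single setting of the control variable, with made-up replicate measurements and model output (not the paper's examples): the estimated model error is bracketed by an interval derived from the Student t distribution of the experimental mean.

    import numpy as np
    from scipy import stats

    # Hypothetical replicate experiments and one model prediction at the same condition.
    exp_runs = np.array([312.4, 308.9, 315.2, 310.7, 309.8])  # measured values
    y_model  = 305.0                                           # simulation result

    n, y_bar = exp_runs.size, exp_runs.mean()
    s = exp_runs.std(ddof=1)

    error   = y_model - y_bar                       # estimated model error
    t_crit  = stats.t.ppf(0.95, df=n - 1)           # two-sided 90% interval
    half_ci = t_crit * s / np.sqrt(n)

    print(f"estimated error = {error:+.2f}; true error within "
          f"[{error - half_ci:.2f}, {error + half_ci:.2f}] with 90% confidence")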

  16. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Science.gov (United States)

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  17. Term Based Comparison Metrics for Controlled and Uncontrolled Indexing Languages

    Science.gov (United States)

    Good, B. M.; Tennis, J. T.

    2009-01-01

    Introduction: We define a collection of metrics for describing and comparing sets of terms in controlled and uncontrolled indexing languages and then show how these metrics can be used to characterize a set of languages spanning folksonomies, ontologies and thesauri. Method: Metrics for term set characterization and comparison were identified and…
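
    The abstract does not list the metrics themselves, but set-overlap measures of this kind are typically built from intersections and unions of term sets. The sketch below shows two generic examples (Jaccard and overlap coefficients) on hypothetical folksonomy and thesaurus term sets; it is illustrative only and not taken from the paper.

    def jaccard(a: set, b: set) -> float:
        """Jaccard similarity: shared terms relative to all terms."""
        return len(a & b) / len(a | b) if (a or b) else 1.0

    def overlap(a: set, b: set) -> float:
        """Overlap coefficient: shared terms relative to the smaller set."""
        return len(a & b) / min(len(a), len(b)) if (a and b) else 0.0

    # Hypothetical term sets from an uncontrolled and a controlled indexing language.
    folksonomy = {"cat", "kitten", "feline", "pet"}
    thesaurus  = {"cat", "feline", "mammal", "carnivore"}

    print(f"Jaccard = {jaccard(folksonomy, thesaurus):.2f}, "
          f"overlap = {overlap(folksonomy, thesaurus):.2f}")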

  18. Software metrics: The key to quality software on the NCC project

    Science.gov (United States)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  19. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... with security, operations, economic, technical, logistical, training and safety requirements. (3) The... total cost of the retrofit, including redesign costs, exceeds $50,000; (ii) Metric is not the accepted... office with an explanation for the disapproval. (7) The in-house operating metric costs shall be...

  20. Climate and the complexity of migratory phenology: sexes, migratory distance, and arrival distributions

    Science.gov (United States)

    Macmynowski, Dena P.; Root, Terry L.

    2007-05-01

    The intra- and inter-season complexity of bird migration has received limited attention in climatic change research. Our phenological analysis of 22 species collected in Chicago, USA (1979–2002), evaluates the relationship between multi-scalar climate variables and differences (1) in arrival timing between sexes, (2) in arrival distributions among species, and (3) between spring and fall migration. The early migratory period for earliest arriving species (i.e., short-distance migrants) and earliest arriving individuals of a species (i.e., males) most frequently correlate with climate variables. Compared with long-distance migrant species, four times as many short-distance migrants correlate with spring temperature, while the arrival of 8 of 11 (73%) long-distance migrant species is correlated with the North Atlantic Oscillation (NAO). While migratory phenology has been correlated with NAO in Europe, we believe that this is the first documentation of a significant association in North America. Geographically proximate conditions apparently influence migratory timing for short-distance migrants while continental-scale climate (e.g., NAO) seemingly influences the phenology of Neotropical migrants. The preponderance of climate correlations is with the early migratory period, not the median of arrival, suggesting that early spring conditions constrain the onset or rate of migration for some species. The seasonal arrival distribution provides considerable information about migratory passage beyond what is apparent from statistical analyses of phenology. A relationship between climate and fall phenology is not detected at this location. Analysis of the within-season complexity of migration, including multiple metrics of arrival, is essential to detect species’ responses to changing climate as well as evaluate the underlying biological mechanisms.