WorldWideScience

Sample records for adaptive metric knn

  1. Classification in medical images using adaptive metric k-NN

    Science.gov (United States)

    Chen, C.; Chernoff, K.; Karemore, G.; Lo, P.; Nielsen, M.; Lauze, F.

    2010-03-01

    The performance of the k-nearest neighbors (k-NN) classifier is highly dependent on the distance metric used to identify the k nearest neighbors of the query points. The standard Euclidean distance is commonly used in practice. This paper investigates the performance of the k-NN classifier with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. We investigate four different metrics: a theoretical metric based on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure of the empirical covariance also lends itself to Principal Component Analysis (PCA), which yields the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aorta calcification detection; and mammograms, where we use k-NN for breast cancer risk assessment. The results show that an appropriate choice of metric can improve classification.
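
    A minimal sketch (not the authors' code) of how such metrics can be plugged into a k-NN classifier: the variance-normalized, empirical-covariance (Mahalanobis) and PCA-subspace metrics described above are compared against plain Euclidean distance; the synthetic data, scikit-learn usage and n_neighbors value are assumptions standing in for the medical imaging setup.

```python
# Illustrative sketch (not the authors' code): k-NN under several of the metrics
# described above -- Euclidean, variance-normalized, empirical-covariance
# (Mahalanobis), and a PCA subspace metric -- on synthetic data standing in for
# the medical imaging features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 20))
y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=600) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def accuracy(clf, Xtr, Xte):
    return clf.fit(Xtr, y_tr).score(Xte, y_te)

# Standard Euclidean metric.
acc_eucl = accuracy(KNeighborsClassifier(n_neighbors=5), X_tr, X_te)

# Normalized metric: rescale each dimension by its standard deviation
# (estimated from the data, i.e. unsupervised).
std = X_tr.std(axis=0)
acc_norm = accuracy(KNeighborsClassifier(n_neighbors=5), X_tr / std, X_te / std)

# Empirical metric: Mahalanobis distance from the empirical covariance matrix.
VI = np.linalg.inv(np.cov(X_tr, rowvar=False))
knn_maha = KNeighborsClassifier(n_neighbors=5, algorithm="brute",
                                metric="mahalanobis", metric_params={"VI": VI})
acc_maha = accuracy(knn_maha, X_tr, X_te)

# Subspace metric: Euclidean distance after projection onto leading principal components.
pca = PCA(n_components=10).fit(X_tr)
acc_pca = accuracy(KNeighborsClassifier(n_neighbors=5),
                   pca.transform(X_tr), pca.transform(X_te))

print(acc_eucl, acc_norm, acc_maha, acc_pca)
```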

  2. Classification in medical image analysis using adaptive metric k-NN

    DEFF Research Database (Denmark)

    Chen, Chen; Chernoff, Konstantin; Karemore, Gopal

    2010-01-01

    with respect to different adaptive metrics in the context of medical imaging. We propose using adaptive metrics such that the structure of the data is better described, introducing some unsupervised learning knowledge into k-NN. We investigate four different metrics: a theoretical metric based...... on the assumption that images are drawn from the Brownian Image Model (BIM), a normalized metric based on the variance of the data, an empirical metric based on the empirical covariance matrix of the unlabeled data, and an optimized metric obtained by minimizing the classification error. The spectral structure...... of the empirical covariance also lends itself to Principal Component Analysis (PCA), which yields the subspace metrics. The metrics are evaluated on two data sets: lateral X-rays of the lumbar aortic/spine region, where we use k-NN for abdominal aorta calcification detection; and mammograms...

  3. Adaptive Metric Dimensionality Reduction

    OpenAIRE

    Gottlieb, Lee-Ad; Kontorovich, Aryeh; Krauthgamer, Robert

    2013-01-01

    We study adaptive data-dependent dimensionality reduction in the context of supervised learning in general metric spaces. Our main statistical contribution is a generalization bound for Lipschitz functions in metric spaces that are doubling, or nearly doubling. On the algorithmic front, we describe an analogue of PCA for metric spaces: namely an efficient procedure that approximates the data's intrinsic dimension, which is often much lower than the ambient dimension. Our approach thus leverag...

  4. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...
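
    An illustrative sketch of the adaptive-metric idea described above, assuming a Gaussian (Nadaraya-Watson) kernel and per-dimension length scales optimised against a leave-one-out cross-validation error; this is not the authors' implementation, and the toy regression task is invented for the example.

```python
# Illustrative sketch (not the authors' implementation): Nadaraya-Watson kernel
# regression whose per-dimension length scales -- the "input metric" -- are adapted
# by minimising a leave-one-out cross-validation estimate of squared error.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)   # only dimension 0 is informative

def loo_error(log_weights):
    w = np.exp(log_weights)                              # positive per-dimension weights
    D2 = (((X[:, None, :] - X[None, :, :]) * w) ** 2).sum(axis=-1)
    K = np.exp(-0.5 * D2)                                # Gaussian kernel matrix
    np.fill_diagonal(K, 0.0)                             # leave each point out of its own fit
    y_hat = K @ y / (K.sum(axis=1) + 1e-12)
    return np.mean((y - y_hat) ** 2)

res = minimize(loo_error, x0=np.zeros(X.shape[1]), method="Nelder-Mead")
# The adapted metric typically keeps weight on the informative dimension and
# shrinks the irrelevant ones, i.e. it performs a soft kind of variable selection.
print("adapted per-dimension weights:", np.exp(res.x))
```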

  5. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...

  6. Improving GPU-accelerated adaptive IDW interpolation algorithm using fast kNN search.

    Science.gov (United States)

    Mei, Gang; Xu, Nengxiong; Xu, Liangliang

    2016-01-01

    This paper presents an efficient parallel Adaptive Inverse Distance Weighting (AIDW) interpolation algorithm on a modern Graphics Processing Unit (GPU). The presented algorithm is an improvement of our previous GPU-accelerated AIDW algorithm, achieved by adopting fast k-nearest neighbors (kNN) search. In AIDW, several nearest neighboring data points must be found for each interpolated point to adaptively determine the power parameter; the desired prediction value of the interpolated point is then obtained by weighted interpolation using that power parameter. In this work, we develop a fast kNN search approach based on a space-partitioning data structure, the even grid, to improve the previous GPU-accelerated AIDW algorithm. The improved algorithm is composed of the stages of kNN search and weighted interpolation. To evaluate the performance of the improved algorithm, we perform five groups of experimental tests. The experimental results indicate that: (1) the improved algorithm can achieve a speedup of up to 1017 over the corresponding serial algorithm; (2) the improved algorithm is at least two times faster than our previous GPU-accelerated AIDW algorithm; and (3) the utilization of fast kNN search can significantly improve the computational efficiency of the entire GPU-accelerated AIDW algorithm.
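
    A CPU-side sketch of the AIDW idea, assuming scipy's cKDTree for the kNN search and an illustrative mapping from local kNN density to the power parameter; the paper's actual implementation is a CUDA/GPU kernel and is not reproduced here.

```python
# CPU sketch of the adaptive IDW idea (the paper's implementation is CUDA/GPU):
# the k nearest data points of each interpolated point are used both to adapt the
# power parameter from the local point density and to compute the weighted prediction.
# The density-to-power mapping below is illustrative only.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
pts = rng.uniform(0, 100, size=(2000, 2))                 # known data points
vals = np.sin(pts[:, 0] / 10) + np.cos(pts[:, 1] / 10)    # known values
query = rng.uniform(0, 100, size=(500, 2))                # interpolation points

k = 15
tree = cKDTree(pts)
dist, idx = tree.query(query, k=k)                        # fast kNN search

# Adaptive power: denser neighbourhoods (small mean kNN distance) get a smaller
# exponent, sparser neighbourhoods a larger one.
mean_d = dist.mean(axis=1)
norm = (mean_d - mean_d.min()) / (np.ptp(mean_d) + 1e-12)
power = 1.0 + 3.0 * norm                                  # exponent in [1, 4]

w = 1.0 / np.maximum(dist, 1e-12) ** power[:, None]       # IDW weights per query point
pred = (w * vals[idx]).sum(axis=1) / w.sum(axis=1)
print(pred[:5])
```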

  7. Multiclass Boosting with Adaptive Group-Based kNN and Its Application in Text Categorization

    Directory of Open Access Journals (Sweden)

    Lei La

    2012-01-01

    Full Text Available AdaBoost is an excellent committee-based tool for classification. However, its effectiveness and efficiency in multiclass categorization face challenges from methods based on support vector machines (SVM), neural networks (NN), naïve Bayes, and k-nearest neighbor (kNN). This paper uses a novel multi-class AdaBoost algorithm to avoid reducing the multi-class classification problem to multiple two-class classification problems. This novel method is more effective and keeps the accuracy advantage of existing AdaBoost. An adaptive group-based kNN method is proposed in this paper to build more accurate weak classifiers and, in this way, to keep the number of base classifiers in an acceptable range. To further enhance performance, the weak classifiers are combined into a strong classifier through a double iterative weighting scheme, constructing an adaptive group-based kNN boosting algorithm (AGkNN-AdaBoost). We implement AGkNN-AdaBoost in a Chinese text categorization system. Experimental results showed that the classification algorithm proposed in this paper has better performance in both precision and recall than many other text categorization methods, including traditional AdaBoost. In addition, the processing speed is significantly higher than that of the original AdaBoost and many other classic categorization algorithms.

  8. H-Metric: Characterizing Image Datasets via Homogenization Based on KNN-Queries

    Directory of Open Access Journals (Sweden)

    Welington M da Silva

    2012-01-01

    Full Text Available Precision-Recall is one of the main metrics for evaluating content-based image retrieval techniques. However, it does not provide an ample perception of the properties of an image dataset immersed in a metric space. In this work, we describe an alternative metric named H-Metric, which is determined along a sequence of controlled modifications in the image dataset. The process is named homogenization and works by altering the homogeneity characteristics of the classes of the images. The result is a process that measures how hard it is to deal with a set of images with respect to content-based retrieval, offering support in the task of analyzing configurations of distance functions and feature extractors.

  9. Adaptive Optics Metrics & QC Scheme

    Science.gov (United States)

    Girard, Julien H.

    2017-09-01

    "There are many Adaptive Optics (AO) fed instruments on Paranal and more to come. To monitor their performances and assess the quality of the scientific data, we have developed a scheme and a set of tools and metrics adapted to each flavour of AO and each data product. Our decisions to repeat observations or not depends heavily on this immediate quality control "zero" (QC0). Atmospheric parameters monitoring can also help predict performances . At the end of the chain, the user must be able to find the data that correspond to his/her needs. In Particular, we address the special case of SPHERE."

  10. Learning-Based Adaptive Imputation Method with kNN Algorithm for Missing Power Data

    Directory of Open Access Journals (Sweden)

    Minkyung Kim

    2017-10-01

    Full Text Available This paper proposes a learning-based adaptive imputation method (LAI) for imputing missing power data in an energy system. This method estimates the missing power data by using the pattern that appears in the collected data. Here, in order to capture the patterns from past power data, we newly model a feature vector by using past data and its variations. The proposed LAI then learns the optimal length of the feature vector and the optimal historical length, which are significant hyperparameters of the proposed method, by utilizing intentional missing data. Based on a weighted distance between feature vectors representing a missing situation and past situations, missing power data are estimated by referring to the k most similar past situations in the optimal historical length. We further extend the proposed LAI to alleviate the effect of unexpected variation in power data and refer to this new approach as the extended LAI method (eLAI). The eLAI selects a method between linear interpolation (LI) and the proposed LAI to improve accuracy under unexpected variations. Finally, from a simulation under various energy consumption profiles, we verify that the proposed eLAI achieves about a 74% reduction of the average imputation error in an energy system, compared to the existing imputation methods.
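
    A hedged sketch of the kNN-style imputation idea described above, not the authors' LAI code: the missing value is estimated from the k past windows whose feature vectors are closest under a weighted distance; the window length, weights and synthetic series are all assumptions.

```python
# Hedged sketch of the kNN imputation idea (not the authors' LAI code): the missing
# value is estimated by averaging the values that followed the k past windows whose
# preceding values are closest under a weighted distance.
import numpy as np

rng = np.random.default_rng(3)
series = np.sin(np.arange(2000) * 2 * np.pi / 96) + 0.1 * rng.normal(size=2000)

def impute(series, t_missing, window=8, k=5):
    """Estimate series[t_missing] from the k most similar past situations."""
    target = series[t_missing - window:t_missing]          # feature vector of the gap
    weights = np.linspace(0.5, 1.5, window)                # more recent lags weigh more
    cands, dists = [], []
    for t in range(window, t_missing):                     # past situations only
        feat = series[t - window:t]
        dists.append(np.sqrt(np.sum(weights * (feat - target) ** 2)))
        cands.append(series[t])                            # value that followed the window
    nearest = np.argsort(dists)[:k]
    return float(np.mean(np.asarray(cands)[nearest]))

# Pretend the value at index 1500 is missing and compare against the true value.
print(impute(series, 1500), "vs true", series[1500])
```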

  11. Multimedia content classification metrics for content adaptation

    OpenAIRE

    Fernandes, Rui; Andrade, M.T.

    2015-01-01

    Multimedia content consumption is very popular nowadays. However, not every content can be consumed in its original format: the combination of content, transport and access networks, consumption device and usage environment characteristics may all pose restrictions to that purpose. One way to provide the best possible quality to the user is to adapt the content according to these restrictions as well as user preferences. This adaptation stage can be best executed if knowledge about the conten...

  12. Multimedia content classification metrics for content adaptation

    OpenAIRE

    Fernandes, Rui; Andrade, M.T.

    2016-01-01

    Multimedia content consumption is very popular nowadays. However, not every content can be consumed in its original format: the combination of content, transport and access networks, consumption device and usage environment characteristics may all pose restrictions to that purpose. One way to provide the best possible quality to the user is to adapt the content according to these restrictions as well as user preferences. This adaptation stage can be best executed if knowledge about the conten...

  13. Applicability of Existing Objective Metrics of Perceptual Quality for Adaptive Video Streaming

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Krasula, Lukás; Shahid, Muhammad

    2016-01-01

    Objective video quality metrics are designed to estimate the quality of experience of the end user. However, these objective metrics are usually validated with video streams degraded under common distortion types. In the presented work, we analyze the performance of published and known full-reference and no-reference quality metrics in estimating the perceived quality of adaptive bit-rate video streams, knowingly out of their scope. Experimental results indicate, not surprisingly, that state-of-the-art objective quality metrics overlook the perceived degradations in the adaptive video streams and perform poorly...

  14. Flight Validation of a Metrics Driven L(sub 1) Adaptive Control

    Science.gov (United States)

    Dobrokhodov, Vladimir; Kitsios, Ioannis; Kaminer, Isaac; Jones, Kevin D.; Xargay, Enric; Hovakimyan, Naira; Cao, Chengyu; Lizarraga, Mariano I.; Gregory, Irene M.

    2008-01-01

    The paper addresses the initial steps involved in the development and flight implementation of a new metrics-driven L1 adaptive flight control system. The work concentrates on (i) definition of appropriate control-driven metrics that account for control surface failures; (ii) tailoring the recently developed L1 adaptive controller to the design of adaptive flight control systems that explicitly address these metrics in the presence of control surface failures and dynamic changes under adverse flight conditions; (iii) development of a flight control system for implementation of the resulting algorithms onboard a small UAV; and (iv) conducting a comprehensive flight test program that demonstrates the performance of the developed adaptive control algorithms in the presence of failures. As the initial milestone, the paper concentrates on the adaptive flight system setup and initial efforts addressing the ability of a commercial off-the-shelf AP, with and without adaptive augmentation, to recover from control surface failures.

  15. Quality Assessment of Adaptive Bitrate Videos using Image Metrics and Machine Learning

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Brunnström, Kjell

    2015-01-01

    Adaptive bitrate (ABR) streaming is widely used for distribution of videos over the internet. In this work, we investigate how well we can predict the quality of such videos using well-known image metrics, information about the bitrate levels, and a relatively simple machine learning method...

  16. Hyperspectral Imagery Super-Resolution by Adaptive POCS and Blur Metric

    Directory of Open Access Journals (Sweden)

    Shaoxing Hu

    2017-01-01

    Full Text Available The spatial resolution of a hyperspectral image is often coarse owing to limitations of the imaging hardware. A novel super-resolution reconstruction algorithm for hyperspectral imagery (HSI) via adaptive projection onto convex sets and an image blur metric (APOCS-BM) is proposed in this paper to address this problem. Firstly, a no-reference image blur metric assessment method based on the Gabor wavelet transform is utilized to obtain the blur metric of the low-resolution (LR) image. Then, the bound used in the APOCS is automatically calculated via the LR image blur metric. Finally, the high-resolution (HR) image is reconstructed by the APOCS method. With the contribution of APOCS and the image blur metric, the fixed-bound problem in POCS is solved, and the image blur information is utilized during the reconstruction of the HR image, which effectively enhances the spatial-spectral information and improves the reconstruction accuracy. The experimental results for the PaviaU, PaviaC and Jinyin Tan datasets indicate that the proposed method not only enhances the spatial resolution, but also preserves HSI spectral information well.

  17. Adaptive metric learning with deep neural networks for video-based facial expression recognition

    Science.gov (United States)

    Liu, Xiaofeng; Ge, Yubin; Yang, Chao; Jia, Ping

    2018-01-01

    Video-based facial expression recognition has become increasingly important for plenty of applications in the real world. Although numerous efforts have been made for single sequences, how to balance the complex distribution of intra- and interclass variations well between sequences has remained a great difficulty in this area. We propose the adaptive (N+M)-tuplet clusters loss function and optimize it together with the softmax loss in the training phase. The variations introduced by personal attributes are alleviated using similarity measurements of multiple samples in the feature space, with far fewer comparisons than conventional deep metric learning approaches, which enables metric calculations for large data applications (e.g., videos). Both the spatial and temporal relations are well explored by a unified framework that consists of an Inception-ResNet network with long short-term memory and a two-branch fully connected layer structure. Our proposed method has been evaluated on three well-known databases, and the experimental results show that our method outperforms many state-of-the-art approaches.

  18. Probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty from maximum temperature metric selection.

    Science.gov (United States)

    DeWeber, J Tyrell; Wagner, Tyler

    2018-02-22

    Predictions of the projected changes in species distribution models and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) that is known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern U.S. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions

  19. Supervised Classification of Agricultural Land Cover Using a Modified k-NN Technique (MNN) and Landsat Remote Sensing Imagery

    Directory of Open Access Journals (Sweden)

    Karsten Schulz

    2009-11-01

    Full Text Available Nearest neighbor techniques are commonly used in remote sensing, pattern recognition and statistics to classify objects into a predefined number of categories based on a given set of predictors. These techniques are especially useful for highly nonlinear relationships between the variables. In most studies the distance measure is adopted a priori. In contrast, we propose a general procedure to find an adaptive metric that combines a local variance-reducing technique and a linear embedding of the observation space into an appropriate Euclidean space. To illustrate the application of this technique, two agricultural land cover classifications using mono-temporal and multi-temporal Landsat scenes are presented. The results of the study, compared with standard approaches used in remote sensing such as maximum likelihood (ML) or k-Nearest Neighbor (k-NN), indicate substantial improvement with regard to the overall accuracy and the cardinality of the calibration data set. Also, using MNN in a soft/fuzzy classification framework proved to be a very useful tool for deriving critical areas that need further attention and investment concerning additional calibration data.

  20. A kNN method that uses a non-natural evolutionary algorithm for ...

    African Journals Online (AJOL)

    We used this algorithm for component selection of a kNN (k Nearest Neighbor) method for breast cancer prognosis. Results with the UCI prognosis data set show that we can find components that help improve the accuracy of kNN by almost 3%, raising it above 79%. Keywords: kNN; classification; evolutionary algorithm; ...

  1. An adaptive weighted Lp metric with application to optical remote sensing classification problems

    Science.gov (United States)

    Pratiher, Sawon; Krishnamoorthy, Vigneshram; Bhattacharya, Paritosh

    2017-06-01

    In this contribution, a novel metric learning framework is presented that jointly optimizes the feature-space structural coherence, manifested by the cosine similarity measure, and the error contribution induced by the Minkowski metric, with a loss function involving a Mahalanobis distance measure governing outlier robustness for maximal inter-sample and minimal intra-sample separation of the feature-space vectors. The outlier robustness and scale-variation sensitivity of the proposed measure are addressed by exploiting the prior statistical entropy of the correlated feature components to weight the different feature dimensions according to their degree of cohesion within the data clusters; the conceptual architecture for the optimality criterion, in terms of the optimal Minkowski exponent 'p_optimal' obtained through semi-definite convex optimization, together with lower and upper bounds of the proposed distance function, is also discussed. Classification results involving special cases of the proposed distance measure on publicly available datasets validate the adequacy of the proposed methodology for remote sensing problems.
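
    A small illustrative sketch of the basic ingredients named above (per-feature weights and a Minkowski exponent, alongside a cosine term); the weights, the exponent p and the numeric values are placeholders, and the paper's optimisation of the exponent via semi-definite programming is not reproduced.

```python
# Illustrative sketch: a weighted Minkowski (Lp) distance with per-feature weights,
# alongside a cosine similarity term. All values are placeholders.
import numpy as np

def weighted_lp(x, y, w, p):
    """Weighted Lp (Minkowski) distance with per-feature weights w."""
    return np.sum(w * np.abs(x - y) ** p) ** (1.0 / p)

def cosine_similarity(x, y):
    return np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 1.5, 3.5])
w = np.array([0.5, 0.2, 0.3])   # e.g. derived from per-feature entropy/cohesion
print(weighted_lp(x, y, w, p=1.5), cosine_similarity(x, y))
```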

  2. Self-organised manifold learning and heuristic charting via adaptive metrics

    Science.gov (United States)

    Horvath, Denis; Ulicny, Jozef; Brutovsky, Branislav

    2016-01-01

    Classical metric and non-metric multidimensional scaling (MDS) variants represent the well-known manifold learning (ML) methods which enable construction of low-dimensional representation (projections) of high-dimensional data inputs. However, their use is limited to the cases when data are inherently reducible to low dimensionality. In general, drawbacks and limitations of these, as well as pure, MDS variants become more apparent when the exploration (learning) is exposed to the structured data of high intrinsic dimension. As we demonstrate on artificial as well as real-world datasets, the over-determination problem can be solved by means of the hybrid and multi-component discrete-continuous multi-modal optimisation heuristics. A remarkable feature of the approach is that projections onto 2D are constructed simultaneously with the data categorisation compensating in part for the loss of original input information. We observed that the optimisation module integrated with ML modelling, metric learning and categorisation leads to a nontrivial mechanism resulting in heuristic charting of data.

  3. Short-term Power Load Forecasting Based on Balanced KNN

    Science.gov (United States)

    Lv, Xianlong; Cheng, Xingong; YanShuang; Tang, Yan-mei

    2018-03-01

    To improve the accuracy of load forecasting, a short-term load forecasting model based on the balanced KNN algorithm is proposed. According to the load characteristics, massive historical power load data are divided into scenes by the K-means algorithm. In view of unbalanced load scenes, the balanced KNN algorithm is proposed to classify the scenes accurately. The locally weighted linear regression algorithm is used to fit and predict the load. Adopting the Apache Hadoop programming framework for cloud computing, the proposed model is parallelized and improved to enhance its ability to deal with massive, high-dimensional data. The analysis of household electricity consumption data for a residential district is carried out on a 23-node cloud computing cluster, and experimental results show that the load forecasting accuracy and execution time of the proposed model are better than those of traditional forecasting algorithms.
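
    A compressed, illustrative sketch of the pipeline described above (K-means scenes, kNN scene classification, locally weighted linear regression), assuming scikit-learn and synthetic daily load profiles; the scene-balancing step and the Hadoop parallelisation are omitted.

```python
# Compressed sketch: K-means load scenes, kNN scene classification, and a locally
# weighted linear regression prediction on synthetic daily profiles.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)
days, t = 300, np.arange(24)
base = 1.0 + 0.5 * np.sin((t - 7) * np.pi / 12)
profiles = base * rng.uniform(0.8, 1.6, size=(days, 1)) + 0.05 * rng.normal(size=(days, 24))

# 1) Divide historical days into load scenes with K-means.
scenes = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)

# 2) Classify the scene of a new day from its morning hours with kNN.
morning = profiles[:, :8]
scene_clf = KNeighborsClassifier(n_neighbors=7).fit(morning, scenes)
new_day = profiles[-1]
scene = scene_clf.predict(new_day[:8].reshape(1, -1))[0]

# 3) Predict the hour-18 load by weighted linear regression over days of that scene,
#    weighting days by their similarity to the new day's morning profile.
same = profiles[scenes == scene]
Xd, yd = same[:, :8], same[:, 18]
ws = np.sqrt(1.0 / (1.0 + np.linalg.norm(Xd - new_day[:8], axis=1)))
A = np.column_stack([np.ones(len(Xd)), Xd])
beta = np.linalg.lstsq(A * ws[:, None], yd * ws, rcond=None)[0]
print("predicted hour-18 load:", np.array([1.0, *new_day[:8]]) @ beta)
```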

  4. Integration of Global and Local Metrics for Domain Adaptation Learning Via Dimensionality Reduction.

    Science.gov (United States)

    Jiang, Min; Huang, Wenzhen; Huang, Zhongqiang; Yen, Gary G

    2017-01-01

    Domain adaptation learning (DAL) investigates how to perform a task across different domains. In this paper, we present a kernelized local-global approach to solve domain adaptation problems. The basic idea of the proposed method is to consider the global and local information regarding the domains (e.g., maximum mean discrepancy and intraclass distance) and to convert the domain adaptation problem into a bi-object optimization problem via the kernel method. A solution for the optimization problem will help us identify a latent space in which the distributions of the different domains will be close to each other in the global sense, and the local properties of the labeled source samples will be preserved. Therefore, classic classification algorithms can be used to recognize unlabeled target domain data, which has a significant difference on the source samples. Based on the analysis, we validate the proposed algorithm using four different sources of data: synthetic, textual, object, and facial image. The experimental results indicate that the proposed method provides a reasonable means to improve DAL algorithms.

  5. Preserved Network Metrics across Translated Texts

    Science.gov (United States)

    Cabatbat, Josephine Jill T.; Monsanto, Jica P.; Tapang, Giovanni A.

    2014-09-01

    Co-occurrence language networks based on Bible translations and the Universal Declaration of Human Rights (UDHR) translations in different languages were constructed and compared with random text networks. Among the considered network metrics, the network size, N, the normalized betweenness centrality (BC), and the average k-nearest neighbors, knn, were found to be the most preserved across translations. Moreover, similar frequency distributions of co-occurring network motifs were observed for translated texts networks.
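
    A short sketch of computing the preserved metrics named above (network size N, normalized betweenness centrality, and the average nearest-neighbour degree knn), assuming networkx and a toy random graph in place of a real co-occurrence text network.

```python
# Small sketch of computing N, normalized betweenness centrality (BC), and the
# average nearest-neighbour degree k_nn for a graph; a toy random graph stands in
# for a real co-occurrence text network.
import networkx as nx

G = nx.erdos_renyi_graph(n=200, p=0.05, seed=0)

N = G.number_of_nodes()
bc = nx.betweenness_centrality(G, normalized=True)    # normalized BC per node
knn = nx.average_degree_connectivity(G)               # k_nn as a function of node degree

print("N =", N)
print("mean normalized BC =", sum(bc.values()) / N)
print("k_nn for degree-10 nodes:", knn.get(10))
```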

  6. Efficient and Flexible KNN Query Processing in Real-Life Road Networks

    DEFF Research Database (Denmark)

    Lu, Yang; Bui, Bin; Zhao, Jiakui

    2008-01-01

    Along with the developments of mobile services, effectively modeling road networks and efficiently indexing and querying network constrained objects has become a challenging problem. In this paper, we first introduce a road network model which captures real-life road networks better than previous...... models. Then, based on the proposed model, we propose a novel index named the RNG (Road Network Grid) index for accelerating KNN queries and continuous KNN queries over road network constrained data points. In contrast to conventional methods, speed limitations and blocking information of roads...... are included into the RNG index, which enables the index to support both distance-based and time-based KNN queries and continuous KNN queries. Our work extends previous ones by taking into account more practical scenarios, such as complexities in real-life road networks and time-based KNN queries. Extensive...

  7. Adapting observationally based metrics of biogeophysical feedbacks from land cover/land use change to climate modeling

    International Nuclear Information System (INIS)

    Chen, Liang; Dirmeyer, Paul A

    2016-01-01

    To assess the biogeophysical impacts of land cover/land use change (LCLUC) on surface temperature, two observation-based metrics and their applicability in climate modeling were explored in this study. Both metrics were developed based on the surface energy balance, and provided insight into the contribution of different aspects of land surface change (such as albedo, surface roughness, net radiation and surface heat fluxes) to changing climate. A revision of the first metric, the intrinsic biophysical mechanism, can be used to distinguish the direct and indirect effects of LCLUC on surface temperature. The other, a decomposed temperature metric, gives a straightforward depiction of separate contributions of all components of the surface energy balance. These two metrics well capture observed and model simulated surface temperature changes in response to LCLUC. Results from paired FLUXNET sites and land surface model sensitivity experiments indicate that surface roughness effects usually dominate the direct biogeophysical feedback of LCLUC, while other effects play a secondary role. However, coupled climate model experiments show that these direct effects can be attenuated by large scale atmospheric changes (indirect feedbacks). When applied to real-time transient LCLUC experiments, the metrics also demonstrate usefulness for assessing the performance of climate models and quantifying land–atmosphere interactions in response to LCLUC. (letter)

  8. Comparative Analysis of KNN and SVM for Classification of Diabetic Retinopathy Based on Exudate and Microaneurysm Images

    Directory of Open Access Journals (Sweden)

    SUCI AULIA

    2015-01-01

    Full Text Available Research on classifying the severity of diabetic retinopathy using image processing is still actively discussed; the images commonly used to detect this disease are optic disc, microaneurysm, exudate, and hemorrhage images derived from fundus images. In this study, the SVM and KNN algorithms were compared for the classification of diabetic retinopathy (mild, moderate, severe) based on exudate and microaneurysm images. For feature extraction, the wavelet method was used with each of the two classifiers. The study used 160 test data, 40 images each for the normal, mild, moderate, and severe classes. The accuracy obtained with the KNN method was higher than with SVM, at 65% and 62%, respectively. Classification with the KNN algorithm achieved its best results with the parameters K = 9 and cityblock distance, while classification with the SVM method achieved its best results with the One Against All parameter. Keywords: Diabetic Retinopathy, KNN, SVM, Wavelet.

  9. Statistical analysis for validating ACO-KNN algorithm as feature selection in sentiment analysis

    Science.gov (United States)

    Ahmad, Siti Rohaidah; Yusop, Nurhafizah Moziyana Mohd; Bakar, Azuraliza Abu; Yaakub, Mohd Ridzwan

    2017-10-01

    This research paper aims to propose a hybrid of ant colony optimization (ACO) and k-nearest neighbor (KNN) algorithms as a feature selection method for selecting and choosing relevant features from customer review datasets. Information gain (IG), genetic algorithm (GA), and rough set attribute reduction (RSAR) were used as baseline algorithms in a performance comparison with the proposed algorithm. This paper also discusses the significance test, which was used to evaluate the performance differences between the ACO-KNN, IG-GA, and IG-RSAR algorithms. This study evaluated the performance of the ACO-KNN algorithm using precision, recall, and F-score, which were validated using parametric statistical significance tests. The evaluation process has statistically proven that the ACO-KNN algorithm is significantly improved compared to the baseline algorithms. In addition, the experimental results have proven that ACO-KNN can be used as a feature selection technique in sentiment analysis to obtain a quality, optimal feature subset that can represent the actual data in customer review data.

  10. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Directory of Open Access Journals (Sweden)

    Ahmed Shamsul Arefin

    Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data are craving for extreme computational power and special computing facilities (i.e., supercomputers). An inexpensive solution, such as General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device internal memory can pose a new problem of scalability. An efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Being very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour) for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with the CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.

  11. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Science.gov (United States)

    Arefin, Ahmed Shamsul; Riveros, Carlos; Berretta, Regina; Moscato, Pablo

    2012-01-01

    The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data are craving for extreme computational power and special computing facilities (i.e., supercomputers). An inexpensive solution, such as General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device internal memory can pose a new problem of scalability. An efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Being very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour) for CUDA-enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with the CPU implementation on a well-known breast microarray study and its associated data sets. Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under the GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
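
    A CPU-side sketch of the chunked, brute-force idea behind such tools: queries are processed in blocks so that no full distance matrix has to fit in (device) memory at once. This is plain NumPy for illustration; the actual GPU-FS-kNN tool is CUDA-based and its partitioning scheme is more elaborate.

```python
# CPU sketch of partitioned brute-force kNN: no distance matrix larger than
# chunk_size x n_data is ever held in memory at once.
import numpy as np

def chunked_knn(data, queries, k, chunk_size=256):
    """Return the indices of the k nearest data points for every query point."""
    out = np.empty((len(queries), k), dtype=np.int64)
    sq_data = (data ** 2).sum(axis=1)
    for start in range(0, len(queries), chunk_size):
        q = queries[start:start + chunk_size]
        # Squared Euclidean distances for this chunk only.
        d2 = (q ** 2).sum(axis=1)[:, None] - 2.0 * q @ data.T + sq_data[None, :]
        out[start:start + len(q)] = np.argsort(d2, axis=1)[:, :k]
    return out

rng = np.random.default_rng(5)
data = rng.normal(size=(10000, 16))
queries = rng.normal(size=(1000, 16))
print(chunked_knn(data, queries, k=5).shape)   # (1000, 5)
```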

  12. Secure kNN Computation and Integrity Assurance of Data Outsourcing in the Cloud

    Directory of Open Access Journals (Sweden)

    Jun Hong

    2017-01-01

    Full Text Available As cloud computing has been popularized massively and rapidly, individuals and enterprises prefer outsourcing their databases to the cloud service provider (CSP) to save the expenditure for managing and maintaining the data. The outsourced databases are hosted, and query services are offered to clients by the CSP, whereas the CSP is not fully trusted. Consequently, the security may be violated by multiple factors. Data privacy and query integrity are perceived as two major factors obstructing enterprises from outsourcing their databases. A novel scheme is proposed in this paper to effectuate k-nearest neighbors (kNN) query and kNN query authentication on an encrypted outsourced spatial database. An asymmetric scalar-product-preserving encryption scheme is elucidated, in which data points and query points are encrypted with diverse encryption keys, and the CSP can determine the distance relation between encrypted data points and query points. Furthermore, the similarity search tree is extended to build a novel verifiable SS-tree that supports efficient kNN query and kNN query verification. It is indicated from the security analysis and experiment results that our scheme not only maintains the confidentiality of outsourced confidential data and query points but also has a lower kNN query processing and verification overhead than the MR-tree.

  13. Structural, dielectric and ferroelectric study of (1-ϕ)(NBT-KNN)-ϕSBexT ceramics

    Science.gov (United States)

    Swain, Sridevi; Kumar, Pawan

    2016-11-01

    (1-ϕ)(0.93 Na0.5Bi0.5TiO3-0.07K0.5Na0.5NbO3)-ϕSr0.8Bi2.15Ta2O9/(1-ϕ)(NBT-KNN)-ϕSBexT (ϕ=0, 2, 4, 8, 12, 16 wt%) ceramic samples were synthesized by conventional solid state reaction route. Secondary phases started developing for higher SBexT content in the (1-ϕ)(NBT-KNN)-ϕSBexT ceramic samples. Decrease of transition temperature (Tm) with the increase of SBexT content in (1-ϕ)(NBT-KNN)-ϕSBexT ceramics was attributed to the increase of internal stress. Remnant polarization (Pr), leakage current density and polarization degradation values reduced with the increase of SBexT content in (1-ϕ)(NBT-KNN)-ϕSBexT ceramic samples. Retention of good ferroelectric properties and enhancement of fatigue-free behavior with the incorporation of SBexT phase in (1-ϕ)(NBT-KNN)-ϕSBexT ceramic samples suggested their usefulness for ferroelectric memory applications.

  14. Metric Tensor Vs. Metric Extensor

    OpenAIRE

    Fernández, V. V.; Moya, A. M.; Rodrigues Jr, Waldyr A.

    2002-01-01

    In this paper we give a comparison between the formulation of the concept of metric for a real vector space of finite dimension in terms of tensors and extensors. A nice property of metric extensors is that they have inverses which are also themselves metric extensors. This property is not shared by metric tensors because tensors do not have inverses. We relate the definition of determinant of a metric extensor with the classical determinant of the corresponding matrix as...

  15. A Robust Sparse Adaptive Filtering Algorithm with a Correntropy Induced Metric Constraint for Broadband Multi-Path Channel Estimation

    Directory of Open Access Journals (Sweden)

    Yingsong Li

    2016-10-01

    Full Text Available A robust sparse least-mean mixture-norm (LMMN) algorithm is proposed, and its performance is appraised in the context of estimating a broadband multi-path wireless channel. The proposed algorithm is implemented by integrating a correntropy-induced metric (CIM) penalty into the conventional LMMN algorithm to modify the basic cost function, and is denoted as the CIM-based LMMN (CIM-LMMN) algorithm. The proposed CIM-LMMN algorithm is derived in detail within the kernel framework. The updating equation of CIM-LMMN provides a zero attractor to attract the non-dominant channel coefficients to zeros, and it also gives a tradeoff between the sparsity and the estimation misalignment. Moreover, the channel estimation behavior is investigated over a broadband sparse multi-path wireless channel, and the simulation results are compared with the least mean square/fourth (LMS/F), least mean square (LMS), least mean fourth (LMF) and the recently developed sparse channel estimation algorithms. The channel estimation performance obtained for the designated sparse channel demonstrates that the CIM-LMMN algorithm outperforms the recently developed sparse LMMN algorithms and the relevant sparse channel estimation algorithms. From the results, we can see that our CIM-LMMN algorithm is robust and is superior to these mentioned algorithms in terms of both convergence speed and channel estimation misalignment for estimating a sparse channel.

  16. Key landscape ecology metrics for assessing climate change adaptation options: Rate of change and patchiness of impacts

    Science.gov (United States)

    López-Hoffman, Laura; Breshears, David D.; Allen, Craig D.; Miller, Marc L.

    2013-01-01

    Under a changing climate, devising strategies to help stakeholders adapt to alterations to ecosystems and their services is of utmost importance. In western North America, diminished snowpack and river flows are causing relatively gradual, homogeneous (system-wide) changes in ecosystems and services. In addition, increased climate variability is also accelerating the incidence of abrupt and patchy disturbances such as fires, floods and droughts. This paper posits that two key variables often considered in landscape ecology—the rate of change and the degree of patchiness of change—can aid in developing climate change adaptation strategies. We use two examples from the “borderland” region of the southwestern United States and northwestern Mexico. In piñon-juniper woodland die-offs that occurred in the southwestern United States during the 2000s, ecosystem services suddenly crashed in some parts of the system while remaining unaffected in other locations. The precise timing and location of die-offs was uncertain. On the other hand, slower, homogeneous change, such as the expected declines in water supply to the Colorado River delta, will likely impact the entire ecosystem, with ecosystem services everywhere in the delta subject to alteration, and all users likely exposed. The rapidity and spatial heterogeneity of faster, patchy climate change exemplified by tree die-off suggests that decision-makers and local stakeholders would be wise to operate under a Rawlsian “veil of ignorance,” and implement adaptation strategies that allow ecosystem service users to equitably share the risk of sudden loss of ecosystem services before actual ecosystem changes occur. On the other hand, in the case of slower, homogeneous, system-wide impacts to ecosystem services as exemplified by the Colorado River delta, adaptation strategies can be implemented after the changes begin, but will require a fundamental rethinking of how ecosystems and services are used and valued. In

  17. A Quantum Hybrid PSO Combined with Fuzzy k-NN Approach to Feature Selection and Cell Classification in Cervical Cancer Detection

    Directory of Open Access Journals (Sweden)

    Abdullah M. Iliyasu

    2017-12-01

    Full Text Available A quantum hybrid (QH) intelligent approach that blends the adaptive search capability of the quantum-behaved particle swarm optimisation (QPSO) method with the intuitionistic rationality of the traditional fuzzy k-nearest neighbours (Fuzzy k-NN) algorithm (known simply as the Q-Fuzzy approach) is proposed for efficient feature selection and classification of cells in cervical smear (CS) images. From an initial multitude of 17 features describing the geometry, colour, and texture of the CS images, the QPSO stage of our proposed technique is used to select the best subset of features (i.e., global best particles), which represents a pruned-down collection of seven features. Using a dataset of almost 1000 images, performance evaluation of our proposed Q-Fuzzy approach assesses the impact of our feature selection on classification accuracy by way of three experimental scenarios that are compared alongside two other approaches: the All-features approach (i.e., classification without prior feature selection) and another hybrid technique combining the standard PSO algorithm with the Fuzzy k-NN technique (the P-Fuzzy approach). In the first and second scenarios, we further divided the assessment criteria in terms of classification accuracy based on the choice of best features and in terms of the different categories of the cervical cells. In the third scenario, we introduced new QH hybrid techniques, i.e., QPSO combined with other supervised learning methods, and compared the classification accuracy alongside our proposed Q-Fuzzy approach. Furthermore, we employed statistical approaches to establish qualitative agreement with regard to the feature selection in experimental scenarios 1 and 3. The synergy between QPSO and Fuzzy k-NN in the proposed Q-Fuzzy approach improves classification accuracy, as manifested in the reduction in the number of cell features, which is crucial for effective cervical cancer detection and diagnosis.

  18. Adaptive training using an artificial neural network and EEG metrics for within- and cross-task workload classification.

    Science.gov (United States)

    Baldwin, Carryl L; Penaranda, B N

    2012-01-02

    Adaptive training using neurophysiological measures requires efficient classification of mental workload in real time as a learner encounters new and increasingly difficult levels of tasks. Previous investigations have shown that artificial neural networks (ANNs) can accurately classify workload, but only when trained on neurophysiological exemplars from experienced operators on specific tasks. The present study examined classification accuracies for ANNs trained on electroencephalographic (EEG) activity recorded while participants performed the same (within task) and different (cross) tasks for short periods of time with little or no prior exposure to the tasks. Participants performed three working memory tasks at two difficulty levels with order of task and difficulty level counterbalanced. Within-task classification accuracies were high when ANNs were trained on exemplars from the same task or a set containing the to-be-classified task, (M=87.1% and 85.3%, respectively). Cross-task classification accuracies were significantly lower (average 44.8%) indicating consistent systematic misclassification for certain tasks in some individuals. Results are discussed in terms of their implications for developing neurophysiologically driven adaptive training platforms. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Processing and characterizations of BNT-KNN ceramics for actuator applications

    Directory of Open Access Journals (Sweden)

    Mallam Chandrasekhar

    2016-06-01

    Full Text Available BNT-KNN powder (with composition 0.93Bi0.5Na0.5TiO3–0.07K0.5Na0.5NbO3) was synthesized as a single perovskite phase by the conventional solid state reaction route, and dense ceramics were obtained by sintering of powder compacts at 1100 °C for 4 h. The dielectric study confirmed relaxor behaviour, whereas the microstructure study showed sharp-cornered cube-like grains with an average grain size of ∼1.15 µm. The saturated polarization vs. electric field (P-E) hysteresis loops confirmed the ferroelectric (FE) nature, while the butterfly-shaped strain vs. electric field (S-E) loops suggested the piezoelectric nature of the BNT-KNN ceramic samples. A maximum electric-field-induced strain of ∼0.62% suggested the usefulness of this system for actuator applications.

  20. Multi-site Stochastic Simulation of Daily Streamflow with Markov Chain and KNN Algorithm

    Science.gov (United States)

    Mathai, J.; Mujumdar, P.

    2017-12-01

    A key focus of this study is to develop a method which is physically consistent with the hydrologic processes and can capture short-term characteristics of the daily hydrograph as well as the correlation of streamflow in the temporal and spatial domains. In complex water resource systems, flow fluctuations at small time intervals require that discretisation be done at small time scales such as daily scales. Also, simultaneous generation of synthetic flows at different sites in the same basin is required. We propose a method to equip water managers with a streamflow generator within a stochastic streamflow simulation framework. The motivation for the proposed method is to generate sequences that extend beyond the variability represented in the historical record of streamflow time series. The method has two steps: in step 1, daily flow is generated independently at each station by a two-state Markov chain, with rising-limb increments randomly sampled from a Gamma distribution and the falling limb modelled as exponential recession; in step 2, the streamflow generated in step 1 is input to a nonparametric k-nearest neighbor (KNN) time series bootstrap resampler. The KNN model, being data driven, does not require assumptions on the dependence structure of the time series. A major limitation of KNN-based streamflow generators is that they do not produce new values, but merely reshuffle the historical data to generate realistic streamflow sequences. However, daily flow generated using the Markov chain approach is capable of generating a rich variety of streamflow sequences. Furthermore, the rising and falling limbs of the daily hydrograph represent different physical processes, and hence they need to be modelled individually. Thus, our method combines the strengths of the two approaches. We show the utility of the method and the improvement over the traditional KNN by simulating daily streamflow sequences at 7 locations in the Godavari River basin in India.
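
    An illustrative single-site sketch of the two-step scheme described above, with placeholder parameters: step 1 generates daily flow from a two-state (rising/falling) Markov chain with Gamma rising-limb increments and exponential recession; step 2 applies a simple kNN bootstrap resampler. It is not the authors' multi-site implementation.

```python
# Illustrative single-site sketch of the two-step scheme with placeholder parameters.
import numpy as np

rng = np.random.default_rng(6)

# Step 1: two-state Markov chain daily flow generator (state 1 = rising, 0 = falling),
# with Gamma-distributed rising-limb increments and exponential recession.
p_stay_rise, p_stay_fall = 0.6, 0.8          # assumed transition probabilities
n_days, flow, state = 1000, [50.0], 0
for _ in range(n_days - 1):
    stay = p_stay_rise if state == 1 else p_stay_fall
    state = state if rng.random() < stay else 1 - state
    if state == 1:
        flow.append(flow[-1] + rng.gamma(shape=2.0, scale=5.0))   # rising limb
    else:
        flow.append(flow[-1] * np.exp(-0.15))                     # recession
flow = np.array(flow)

# Step 2: kNN time-series bootstrap -- continue the simulation with the successor
# of one of the k generated days whose flow is closest to the current value.
def knn_bootstrap(series, length, k=10):
    sim = [series[0]]
    for _ in range(length - 1):
        d = np.abs(series[:-1] - sim[-1])
        nbrs = np.argsort(d)[:k]
        sim.append(series[rng.choice(nbrs) + 1])
    return np.array(sim)

print(knn_bootstrap(flow, 365)[:10])
```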

  1. Self-organization in aggregating robot swarms: A DW-KNN topological approach.

    Science.gov (United States)

    Khaldi, Belkacem; Harrou, Fouzi; Cherif, Foudil; Sun, Ying

    2018-03-01

    In certain swarm applications, where the inter-agent distance is not the only factor in the collective behaviours of the swarm, additional properties such as density could have a crucial effect. In this paper, we propose applying a Distance-Weighted K-Nearest Neighbouring (DW-KNN) topology to the behaviour of robot swarms performing self-organized aggregation, in combination with a virtual physics approach to keep the robots together. A distance-weighted function based on a Smoothed Particle Hydrodynamic (SPH) interpolation approach, which is used to evaluate the robot density in the swarm, is applied as the key factor for identifying the K-nearest neighbours taken into account when aggregating the robots. The intra virtual physical connectivity among these neighbours is achieved using a virtual viscoelastic-based proximity model. With the ARGoS-based simulator, we model and evaluate the proposed approach, showing various self-organized aggregations performed by a swarm of N foot-bot robots. Also, we compared the aggregation quality of the DW-KNN approach to that of the conventional KNN approach and found better performance. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Self-Organization in Aggregating Robot Swarms: A DW-KNN Topological Approach

    KAUST Repository

    Khaldi, Belkacem

    2018-02-02

    In certain swarm applications, where the inter-agent distance is not the only factor in the collective behaviours of the swarm, additional properties such as density could have a crucial effect. In this paper, we propose applying a Distance-Weighted K-Nearest Neighbouring (DW-KNN) topology to the behaviour of robot swarms performing self-organized aggregation, in combination with a virtual physics approach to keep the robots together. A distance-weighted function based on a Smoothed Particle Hydrodynamic (SPH) interpolation approach, which is used to evaluate the robot density in the swarm, is applied as the key factor for identifying the K-nearest neighbours taken into account when aggregating the robots. The intra virtual physical connectivity among these neighbours is achieved using a virtual viscoelastic-based proximity model. With the ARGoS-based simulator, we model and evaluate the proposed approach, showing various self-organized aggregations performed by a swarm of N foot-bot robots. Also, we compared the aggregation quality of the DW-KNN approach to that of the conventional KNN approach and found better performance.

  3. Nanoscale characterization and local piezoelectric properties of lead-free KNN-LT-LS thin films

    Science.gov (United States)

    Abazari, M.; Choi, T.; Cheong, S.-W.; Safari, A.

    2010-01-01

    We report the observation of the domain structure and piezoelectric properties of pure and Mn-doped (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.1,Sb0.06)O3 (KNN-LT-LS) thin films on SrTiO3 substrates. Using piezoresponse force microscopy, it is revealed that the ferroelectric domain structure in such 500 nm thin films comprises primarily 180° domains. This is in accordance with the tetragonal structure of the films, confirmed by relative permittivity measurements and x-ray diffraction patterns. The effective piezoelectric coefficient (d33) of the films was calculated using piezoelectric displacement curves and shown to be ~53 pm V-1 for pure KNN-LT-LS thin films. This value is among the highest reported for an epitaxial lead-free thin film and shows great potential for KNN-LT-LS to serve as an alternative to PZT thin films in future applications.

  4. PHYSICAL AND ELECTRICAL PROPERTIES ENHANCEMENT OF RARE-EARTH-DOPED POTASSIUM SODIUM NIOBATE (KNN): A REVIEW

    Directory of Open Access Journals (Sweden)

    Akmal Mat Harttat Maziati

    2015-06-01

    Full Text Available Alkaline niobates, mainly potassium sodium niobate (KxNa1-xNbO3, abbreviated as KNN), have long attracted attention as piezoelectric materials because of their high Curie temperature (Tc) and piezoelectric properties. The volatility of the alkaline elements (K, Na) is, however, detrimental to the stoichiometry of KNN, contributing to the failure to achieve a high-density structure and leading to the formation of intrinsic defects. By partial doping with several rare-earth elements, these inherent defects can be reduced significantly. Therefore, considerable attempts have been made to develop doped-KNN based ceramic materials with high electrical properties. In this paper, these research activities are reviewed, including dopant types and the role of doping in the KNN perovskite structure.

  5. Metric learning

    CERN Document Server

    Bellet, Aurelien; Sebban, Marc

    2015-01-01

    Similarity between objects plays an important role in both human cognitive processes and artificial systems for recognition and categorization. How to appropriately measure such similarities for a given task is crucial to the performance of many machine learning, pattern recognition and data mining methods. This book is devoted to metric learning, a set of techniques to automatically learn similarity and distance functions from data that has attracted a lot of interest in machine learning and related fields in the past ten years. In this book, we provide a thorough review of the metric learning ...

  6. SU-E-J-124: FDG PET Metrics Analysis in the Context of An Adaptive PET Protocol for Node Positive Gynecologic Cancer Patients

    Energy Technology Data Exchange (ETDEWEB)

    Nawrocki, J; Chino, J; Light, K; Vergalasova, I; Craciunescu, O [Duke University Medical Center, Durham, NC (United States)

    2014-06-01

    Purpose: To compare PET-extracted metrics and investigate the role of a gradient-based PET segmentation tool, PET Edge (MIM Software Inc., Cleveland, OH), in the context of an adaptive PET protocol for node-positive gynecologic cancer patients. Methods: An IRB-approved protocol enrolled women with gynecological, PET-visible malignancies. A PET-CT was obtained for treatment planning prescribed to 45–50.4 Gy with a 55–70 Gy boost to the PET-positive nodes. An intra-treatment PET-CT was obtained between 30 and 36 Gy, and all volumes were re-contoured. Standard uptake values (SUVmax, SUVmean, SUVmedian) and GTV volumes were extracted from the clinician-contoured GTVs on the pre- and intra-treatment PET-CT for primaries and nodes and compared with a two-tailed Wilcoxon signed-rank test. The differences between primary and node GTV volumes contoured in the treatment planning system and those volumes generated using PET Edge were also investigated. Bland-Altman plots were used to describe significant differences between the two contouring methods. Results: Thirteen women were enrolled in this study. The median baseline/intra-treatment primary (SUVmax, mean, median) were (30.5, 9.09, 7.83)/(16.6, 4.35, 3.74), and nodes were (20.1, 4.64, 3.93)/(6.78, 3.13, 3.26). The p values were all < 0.001. The clinical contours were all larger than the PET Edge generated ones, with a mean difference of +20.6 ml for primary and +23.5 ml for nodes. The Bland-Altman analysis revealed changes between clinician/PET Edge contours to be mostly within the margins of the coefficient of variability. However, there was a proportional trend, i.e. the larger the GTV, the larger the clinical contours as compared to PET Edge contours. Conclusion: Primary and node SUV values taken from the intra-treatment PET-CT can be used to assess the disease response and to design an adaptive plan. The PET Edge tool can streamline the contouring process and lead to smaller, less user-dependent contours.

  7. Assessment of various supervised learning algorithms using different performance metrics

    Science.gov (United States)

    Susheel Kumar, S. M.; Laxkar, Deepak; Adhikari, Sourav; Vijayarajan, V.

    2017-11-01

    Our work presents a comparison of the performance of supervised machine learning algorithms on a binary classification task. The supervised machine learning algorithms taken into consideration in the following work are Support Vector Machine (SVM), Decision Tree (DT), K-Nearest Neighbour (KNN), Naïve Bayes (NB) and Random Forest (RF). This paper focuses on comparing the performance of the above-mentioned algorithms on one binary classification task by analysing metrics such as accuracy, F-measure, G-measure, precision, misclassification rate, false positive rate, true positive rate, specificity and prevalence.
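
    As a rough illustration of this kind of comparison, the hedged sketch below trains the five named classifiers on a synthetic binary task with scikit-learn and reports several of the listed metrics; the dataset and hyperparameters are placeholders, not the ones used in the paper.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                                 recall_score, confusion_matrix)

    # Synthetic stand-in for the paper's binary task (the real dataset is not described here).
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    models = {
        "SVM": SVC(),
        "DT": DecisionTreeClassifier(random_state=0),
        "KNN": KNeighborsClassifier(n_neighbors=5),
        "NB": GaussianNB(),
        "RF": RandomForestClassifier(random_state=0),
    }

    for name, clf in models.items():
        y_hat = clf.fit(X_tr, y_tr).predict(X_te)
        tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
        print(f"{name}: acc={accuracy_score(y_te, y_hat):.3f} "
              f"F1={f1_score(y_te, y_hat):.3f} "
              f"precision={precision_score(y_te, y_hat):.3f} "
              f"TPR={recall_score(y_te, y_hat):.3f} "
              f"FPR={fp / (fp + tn):.3f} "
              f"specificity={tn / (tn + fp):.3f}")
    ```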

  8. Metrication manual

    International Nuclear Information System (INIS)

    Harper, A.F.A.; Digby, R.B.; Thong, S.P.; Lacey, F.

    1978-04-01

    In April 1978 a meeting of senior metrication officers convened by the Commonwealth Science Council of the Commonwealth Secretariat, was held in London. The participants were drawn from Australia, Bangladesh, Britain, Canada, Ghana, Guyana, India, Jamaica, Papua New Guinea, Solomon Islands and Trinidad and Tobago. Among other things, the meeting resolved to develop a set of guidelines to assist countries to change to SI and to compile such guidelines in the form of a working manual

  9. A comparison of the spatial linear model to Nearest Neighbor (k-NN) methods for forestry applications.

    Science.gov (United States)

    Ver Hoef, Jay M; Temesgen, Hailemariam

    2013-01-01

    Forest surveys provide critical information for many diverse interests. Data are often collected from samples, and from these samples, maps of resources and estimates of areal totals or averages are required. In this paper, two approaches for mapping and estimating totals, the spatial linear model (SLM) and k-NN (k-Nearest Neighbor), are compared theoretically, through simulations, and as applied to real forestry data. While both methods have desirable properties, a review shows that the SLM has prediction optimality properties and can be quite robust. Simulations of artificial populations and resamplings of real forestry data show that the SLM has smaller empirical root-mean-squared prediction errors (RMSPE) for a wide variety of data types, with generally less bias and better interval coverage than k-NN. These patterns held for both point predictions and for population totals or averages, with the SLM reducing RMSPE by 9% to 67% over some popular k-NN methods, and with the SLM also more robust to spatially imbalanced sampling. Estimating prediction standard errors remains a problem for k-NN predictors, despite recent attempts using model-based methods. Our conclusion is that the SLM should generally be used rather than k-NN if the goal is accurate mapping or estimation of population totals or averages.
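
    For readers who want to reproduce the flavour of such a comparison, the sketch below contrasts a Gaussian-process regressor (used here as a convenient stand-in for the spatial linear model) with k-NN regression on a simulated spatial surface and reports RMSPE; the simulated data, kernel, and k are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.model_selection import train_test_split

    # Simulate a smooth spatial surface plus noise as a stand-in for a forest attribute.
    rng = np.random.default_rng(1)
    coords = rng.uniform(0, 10, size=(400, 2))
    signal = np.sin(coords[:, 0]) + np.cos(coords[:, 1])
    z = signal + rng.normal(scale=0.3, size=len(coords))

    c_tr, c_te, z_tr, z_te = train_test_split(coords, z, test_size=0.5, random_state=1)

    # "SLM" stand-in: GP regression with an RBF covariance (akin to kriging).
    slm = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(0.1),
                                   normalize_y=True)
    knn = KNeighborsRegressor(n_neighbors=5)

    for name, model in [("SLM (GP stand-in)", slm), ("k-NN", knn)]:
        pred = model.fit(c_tr, z_tr).predict(c_te)
        rmspe = np.sqrt(np.mean((pred - z_te) ** 2))
        print(f"{name}: RMSPE = {rmspe:.3f}")
    ```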

  10. A New Method to Improve the Electrical Properties of KNN-based Ceramics: Tailoring Phase Fraction

    KAUST Repository

    Lv, Xiang

    2017-08-18

    Although both the phase type and the fraction of multi-phase coexistence can affect the electrical properties of (K,Na)NbO3 (KNN)-based ceramics, the effects of phase fraction on their electrical properties have received little attention. In this work, by changing the calcination temperature of CaZrO3 powders, we successfully developed 0.96K0.5Na0.5Nb0.96Sb0.04O3-0.01CaZrO3-0.03Bi0.5Na0.5HfO3 ceramics containing a wide rhombohedral-tetragonal (R-T) phase coexistence with variations of the T (or R) phase fraction. It was found that a higher T phase fraction warrants a larger piezoelectric constant (d33), and d33 also showed a linear variation with respect to the tetragonality ratio (c/a). More importantly, a number of domain patterns were observed due to the high T phase fraction and large c/a ratio, greatly benefiting the piezoelectricity. In addition, improved ferroelectric fatigue behavior and thermal stability were also shown by the ceramics containing a high T phase fraction. Therefore, this work brings a new viewpoint on the physical mechanism behind the R-T phase coexistence in KNN-based ceramics.

  11. STUDY COMPARISON OF SVM-, K-NN- AND BACKPROPAGATION-BASED CLASSIFIER FOR IMAGE RETRIEVAL

    Directory of Open Access Journals (Sweden)

    Muhammad Athoillah

    2015-03-01

    Full Text Available Classification is a method for compiling data systematically according to previously established rules. In recent years, classification methods have proven helpful in many areas, such as image classification, medical biology, traffic lights, text classification, etc. There are many methods to solve classification problems, and this variety makes it difficult for researchers to determine which method is best for a given problem. This framework is aimed at comparing the ability of classification methods, namely Support Vector Machine (SVM), K-Nearest Neighbor (K-NN), and Backpropagation, in a case study of image retrieval with a five-category image dataset. The results show that K-NN has the best average accuracy, with 82%. It is also the fastest in average computation time, with 17.99 seconds during the retrieval session for all category classes. The Backpropagation, however, is the slowest among the three. On average it needed 883 seconds for the training session and 41.7 seconds for the retrieval session.

  12. Implementing metrics for process improvement

    OpenAIRE

    McAuley, Angela

    1993-01-01

    There is increasing interest in the use of metrics to control the software development process, to demonstrate productivity and value, and to identify areas for process improvement. Research work completed to date is based on the implementation of metrics in a 'standard' software development environment, and follows either a top-down or bottom-up approach. With the advent of further European unity, many companies are producing localised products, ie products which are translated and adapted t...

  13. Role of sintering time, crystalline phases and symmetry in the piezoelectric properties of lead-free KNN-modified ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Rubio-Marcos, F., E-mail: frmarcos@icv.csic.es [Electroceramic Department, Instituto de Ceramica y Vidrio, CSIC, Kelsen 5, 28049 Madrid (Spain); Marchet, P.; Merle-Mejean, T. [SPCTS, UMR 6638 CNRS, Universite de Limoges, 123, Av. A. Thomas, 87060 Limoges (France); Fernandez, J.F. [Electroceramic Department, Instituto de Ceramica y Vidrio, CSIC, Kelsen 5, 28049 Madrid (Spain)

    2010-09-01

    Lead-free KNN-modified piezoceramics of the system (Li,Na,K)(Nb,Ta,Sb)O3 were prepared by conventional solid-state sintering. The X-ray diffraction patterns revealed a perovskite phase, together with some minor secondary phase, which was assigned to K3LiNb6O17, a tetragonal tungsten-bronze (TTB) phase. A structural evolution toward a pure tetragonal structure with increasing sintering time was observed, associated with the decrease of the TTB phase. A correlation between higher tetragonality and higher piezoelectric response was clearly evidenced. Contrary to the case of LiTaO3-modified KNN, very large abnormal grains with TTB structure were not detected. As a consequence, the simultaneous modification by tantalum and antimony seems to induce during sintering a different behaviour from that of LiTaO3-modified KNN.

  14. Obtaining of BNKT-KNN ceramic powders by the Pechini Method

    Energy Technology Data Exchange (ETDEWEB)

    Yasno, J. P.; Tirado-Mejia, L.; Kiminami, R. H. G. A.; Gaona, J. S.; Raigoza, C. E. V.

    2013-10-01

    The Pechini method was used in order to obtain fine, single-phase ceramic powders of the lead-free ferroelectric system 0.97[(Bi1/2Na1/2)1-x(Bi1/2K1/2)xTiO3]-0.03[(Na1/2K1/2)NbO3], or BNKT-KNN (x = 0.00, 0.18, 0.21, 0.24, 0.27). This method allowed obtaining powders with 100% perovskite phase, which was confirmed by X-ray diffraction, for this particular system in all the studied stoichiometries using temperatures as low as 600 °C. The effects on the bonds present in the structure due to the variation of the Na-K stoichiometry were determined using infrared spectroscopy (FT-IR). Irregular nanoparticles were observed by scanning electron microscopy. (Author)

  15. Obtaining of BNKT-KNN ceramic powders by the Pechini Method

    Energy Technology Data Exchange (ETDEWEB)

    Yasno, J. P.; Tirado-Mejia, L.; Kiminami, R.; Gaona, J.; Raigoza, C. F. V.

    2013-09-01

    The Pechini method was used in order to obtain fine, single-phase ceramic powders of the lead-free ferroelectric system 0.97[(Bi1/2Na1/2)1-x(Bi1/2K1/2)xTiO3]-0.03[(Na1/2K1/2)NbO3], or BNKT-KNN (x = 0.00, 0.18, 0.21, 0.24, 0.27). This method allowed obtaining powders with 100% perovskite phase, which was confirmed by X-ray diffraction, for this particular system in all the studied stoichiometries using temperatures as low as 600 °C. The effects on the bonds present in the structure due to the variation of the Na-K stoichiometry were determined using infrared spectroscopy (FT-IR). Irregular nanoparticles were observed by scanning electron microscopy.

  16. Evaluation of normalization methods for cDNA microarray data by k-NN classification

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Wei; Xing, Eric P; Myers, Connie; Mian, Saira; Bissell, Mina J

    2004-12-17

    Non-biological factors give rise to unwanted variations in cDNA microarray data. There are many normalization methods designed to remove such variations. However, to date there have been few published systematic evaluations of these techniques for removing variations arising from dye biases in the context of downstream, higher-order analytical tasks such as classification. Ten location normalization methods that adjust spatial- and/or intensity-dependent dye biases, and three scale methods that adjust scale differences were applied, individually and in combination, to five distinct, published, cancer biology-related cDNA microarray data sets. Leave-one-out cross-validation (LOOCV) classification error was employed as the quantitative end-point for assessing the effectiveness of a normalization method. In particular, a known classifier, k-nearest neighbor (k-NN), was estimated from data normalized using a given technique, and the LOOCV error rate of the ensuing model was computed. We found that k-NN classifiers are sensitive to dye biases in the data. Using NONRM and GMEDIAN as baseline methods, our results show that single-bias-removal techniques which remove either spatial-dependent dye bias (referred later as spatial effect) or intensity-dependent dye bias (referred later as intensity effect) moderately reduce LOOCV classification errors; whereas double-bias-removal techniques which remove both spatial- and intensity effect reduce LOOCV classification errors even further. Of the 41 different strategies examined, three two-step processes, IGLOESS-SLFILTERW7, ISTSPLINE-SLLOESS and IGLOESS-SLLOESS, all of which removed intensity effect globally and spatial effect locally, appear to reduce LOOCV classification errors most consistently and effectively across all data sets. We also found that the investigated scale normalization methods do not reduce LOOCV classification error. Using LOOCV error of k-NNs as the evaluation criterion, three double
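
    The evaluation loop itself is simple to emulate: fit a k-NN classifier on data processed by each candidate normalization and compare leave-one-out error rates. The sketch below uses synthetic data and generic scikit-learn transformers as stand-ins for the paper's location and scale normalization methods.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.preprocessing import StandardScaler, FunctionTransformer
    from sklearn.pipeline import make_pipeline

    # Synthetic expression-like matrix; the paper's normalization methods (e.g.
    # IGLOESS-SLLOESS) are replaced here by generic stand-ins just to show the loop.
    X, y = make_classification(n_samples=60, n_features=100, n_informative=10,
                               random_state=0)

    normalisers = {
        "no normalisation": FunctionTransformer(),   # baseline, analogous to NONRM
        "per-feature scaling": StandardScaler(),
    }

    for name, norm in normalisers.items():
        pipe = make_pipeline(norm, KNeighborsClassifier(n_neighbors=3))
        acc = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
        print(f"{name}: LOOCV error = {1 - acc:.3f}")
    ```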

  17. Epileptic MEG Spike Detection Using Statistical Features and Genetic Programming with KNN

    Directory of Open Access Journals (Sweden)

    Turky N. Alotaiby

    2017-01-01

    Full Text Available Epilepsy is a neurological disorder that affects millions of people worldwide. Monitoring brain activity and identifying the seizure source, which starts with spike detection, are important steps in epilepsy treatment. Magnetoencephalography (MEG) is an emerging epileptic diagnostic tool with high-density sensors; this makes manual analysis a challenging task due to the vast amount of MEG data. This paper explores the use of eight statistical features and genetic programming (GP) with the K-nearest neighbor (KNN) classifier for interictal spike detection. The proposed method comprises three stages: preprocessing, genetic programming-based feature generation, and classification. The effectiveness of the proposed approach has been evaluated using real MEG data obtained from 28 epileptic patients. It achieved a 91.75% average sensitivity and a 92.99% average specificity.

  18. Optimization of internet content filtering-Combined with KNN and OCAT algorithms

    Science.gov (United States)

    Guo, Tianze; Wu, Lingjing; Liu, Jiaming

    2018-04-01

    Faced with the rampant spread of illegal content on the Internet, the results of the traditional ways to filter information, keyword recognition and manual screening, are getting worse. Based on this, this paper uses the OCAT algorithm nested with the KNN classification algorithm to construct a training corpus that can dynamically learn and update, so that the filter corpus can be improved for the constantly updated illegal content of the network, including text and pictures, and illegal content and its sources can be better filtered and investigated. After that, the research direction will focus on simplifying the updating of the recognition and comparison algorithms and optimizing the learning ability of the corpus in order to improve the efficiency of filtering and save time and resources.

  19. Latent Dirichlet Allocation (LDA) Model and kNN Algorithm to Classify Research Project Selection

    Science.gov (United States)

    Safi’ie, M. A.; Utami, E.; Fatta, H. A.

    2018-03-01

    Universitas Sebelas Maret has a teaching staff of more than 1500 people, and one of their tasks is to carry out research. On the other side, funding support for research and community service is limited, so proposals need to be evaluated to determine which research and community service (P2M) submissions are selected. At the selection stage, research proposal documents are collected as unstructured data, and the volume of stored data is very large. Extracting the information contained in these documents requires text mining technology, which is applied to gain knowledge from the documents by automating information extraction. In this article we apply Latent Dirichlet Allocation (LDA) to the documents as a model in the feature extraction process, to obtain terms that represent each document. Thereafter we use the k-Nearest Neighbour (kNN) algorithm to classify the documents based on these terms.
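
    A minimal version of this LDA-then-kNN pipeline can be written with scikit-learn as below; the toy corpus, the number of topics, and the labels are invented for illustration and do not reflect the Universitas Sebelas Maret proposal data.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    # Tiny illustrative corpus; real proposals and the number of topics are unknown here.
    docs = [
        "machine learning for crop disease detection",
        "deep learning image classification of plants",
        "community health education program in rural areas",
        "public health outreach and nutrition training",
    ]
    labels = ["research", "research", "service", "service"]

    pipe = make_pipeline(
        CountVectorizer(),                                          # bag-of-words term counts
        LatentDirichletAllocation(n_components=2, random_state=0),  # topic proportions as features
        KNeighborsClassifier(n_neighbors=1),                        # kNN over the topic space
    )
    pipe.fit(docs, labels)
    print(pipe.predict(["nutrition education for villages"]))
    ```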

  20. A Comparison of the Spatial Linear Model to Nearest Neighbor (k-NN) Methods for Forestry Applications

    Science.gov (United States)

    Jay M. Ver Hoef; Hailemariam Temesgen; Sergio Gómez

    2013-01-01

    Forest surveys provide critical information for many diverse interests. Data are often collected from samples, and from these samples, maps of resources and estimates of areal totals or averages are required. In this paper, two approaches for mapping and estimating totals, the spatial linear model (SLM) and k-NN (k-Nearest Neighbor), are compared, theoretically,...

  1. Quantum Algorithm for K-Nearest Neighbors Classification Based on the Metric of Hamming Distance

    Science.gov (United States)

    Ruan, Yue; Xue, Xiling; Liu, Heng; Tan, Jianing; Li, Xi

    2017-11-01

    The k-nearest neighbors (KNN) algorithm is a common algorithm used for classification, and also a sub-routine in various complicated machine learning tasks. In this paper, we present a quantum algorithm (QKNN) for implementing this algorithm based on the metric of Hamming distance. We put forward a quantum circuit for computing the Hamming distance between the testing sample and each feature vector in the training set. Taking advantage of this method, we realize a good analog of the classical KNN algorithm by setting a distance threshold value t to select the k nearest neighbors. As a result, QKNN achieves O(n^3) performance, which depends only on the dimension of the feature vectors, and high classification accuracy, outperforming Lloyd's algorithm (Lloyd et al. 2013) and Wiebe's algorithm (Wiebe et al. 2014).
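
    The quantum circuit is beyond a short example, but the selection step it emulates has a simple classical analog: rank training vectors by Hamming distance, optionally keep only those within a threshold t, and vote among the k nearest. The sketch below shows that classical analog under those assumptions; it is not the quantum algorithm itself.

    ```python
    import numpy as np
    from collections import Counter

    def hamming_knn(train_X, train_y, query, k=3, t=None):
        """Classify a binary feature vector by k-NN under the Hamming distance.

        A purely classical analog of the selection step described in the abstract;
        the optional threshold t keeps only neighbours within Hamming distance t
        before the k nearest are taken.
        """
        d = np.count_nonzero(train_X != query, axis=1)   # Hamming distances
        idx = np.argsort(d)
        if t is not None:
            idx = [i for i in idx if d[i] <= t]
        votes = [train_y[i] for i in idx[:k]]
        return Counter(votes).most_common(1)[0][0]

    # toy binary dataset
    X = np.array([[0, 0, 1, 1], [0, 1, 1, 1], [1, 0, 0, 0], [1, 1, 0, 0]])
    y = ["A", "A", "B", "B"]
    print(hamming_knn(X, y, np.array([0, 0, 1, 0]), k=3, t=2))
    ```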

  2. Metric Tannakian Duality

    OpenAIRE

    Daenzer, Calder

    2011-01-01

    We incorporate metric data into the framework of Tannaka-Krein duality. Thus, for any group with left invariant metric, we produce a dual metric on its category of unitary representations. We characterize the conditions under which a "double-dual" metric on the group may be recovered from the metric on representations, and provide conditions under which a metric agrees with its double-dual. We also consider some applications to T-duality and quantum Gromov-Hausdorff distance.

  3. ROUTING BASE CONGESTION CONTROL METRICS IN MANETS

    Directory of Open Access Journals (Sweden)

    Sandeep Dalal

    2014-09-01

    Full Text Available A mobile ad hoc network is self-configurable and adaptive. Due to node mobility, we cannot predict the load on the network, which leads to congestion, one of the most widely researched areas in MANETs. A lot of congestion control techniques and metrics have been proposed to overcome congestion before its occurrence or after it has occurred. In this survey we identify the currently used congestion control metrics. Through this survey we also propose a congestion control metric, RFR (resource free ratio), which considers the three most important parameters to provide congestion-free route discovery. Further, we show the results of node selection based on fuzzy logic calculations using the proposed metric.

  4. Displacement Investigation of KNN-Bitumen-Based Piezoceramics in Asphalt Concrete

    Directory of Open Access Journals (Sweden)

    Ning Tang

    2018-01-01

    Full Text Available Piezoelectric materials have excellent electromechanical coupling characteristics, so they can be widely applied in the field of structural health monitoring. Nondestructive testing based on piezoelectric techniques has become a research focus in the piezoelectric field. Asphalt concrete accumulates damage under repeated vehicle loads and natural conditions, so it is a suitable material and structure for nondestructive applications. In this study, a test system was established comprising a piezoceramic driving power supply, a laser displacement sensor, a computer, and piezo-embedded asphalt concrete. The displacement, hysteresis, creep, and dynamic behavior of KNN piezoceramic elements embedded in asphalt concrete were tested. The results indicate that the displacement output reached 0.4 μm to 0.7 μm for loads from 0 N to 150 N. The hysteresis was not obvious for loads from 0 N to 100 N, but appeared at higher loads. The creep phenomenon can be divided into two parts: an uptrend and a balance. The more severe the ageing of the asphalt binder, the larger the displacement.

  5. Feature Selection and Predictors of Falls with Foot Force Sensors Using KNN-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Shengyun Liang

    2015-11-01

    Full Text Available The aging process may lead to the degradation of lower extremity function in the elderly population, which can restrict their daily quality of life and gradually increase the fall risk. We aimed to determine whether objective measures of physical function could predict subsequent falls. Ground reaction force (GRF) data, quantified by sample entropy, were collected by foot force sensors. Thirty-eight subjects (23 fallers and 15 non-fallers) participated in functional movement tests, including walking and sit-to-stand (STS). A feature selection algorithm was used to select relevant features to classify the elderly into two groups, at risk and not at risk of falling down, for three KNN-based classifiers: local mean-based k-nearest neighbor (LMKNN), pseudo nearest neighbor (PNN), and local mean pseudo nearest neighbor (LMPNN) classification. We compared classification performances and achieved the best results with LMPNN, with sensitivity, specificity and accuracy all 100%. Moreover, a subset of GRFs was significantly different between the two groups via the Wilcoxon rank sum test, which is compatible with the classification results. This method could potentially be used by non-experts to monitor balance and the risk of falling down in the elderly population.
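
    Of the three classifiers compared, LMKNN is the easiest to sketch: for each class, the k nearest training points to the query are averaged and the query is assigned to the class whose local mean is closest. The toy example below assumes generic feature vectors and does not reproduce the paper's GRF features, sample-entropy quantification, or feature selection.

    ```python
    import numpy as np

    def lmknn_predict(X_train, y_train, x, k=3):
        """Local mean-based k-NN: for each class, average its k nearest training
        points to the query and assign the class whose local mean is closest.
        """
        X_train, y_train = np.asarray(X_train), np.asarray(y_train)
        best_class, best_dist = None, np.inf
        for c in np.unique(y_train):
            Xc = X_train[y_train == c]
            d = np.linalg.norm(Xc - x, axis=1)
            local_mean = Xc[np.argsort(d)[:k]].mean(axis=0)   # local mean of class c
            dist = np.linalg.norm(local_mean - x)
            if dist < best_dist:
                best_class, best_dist = c, dist
        return best_class

    # toy example: two clusters standing in for "faller" / "non-faller" feature vectors
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(3, 1, (20, 4))])
    y = np.array([0] * 20 + [1] * 20)
    print(lmknn_predict(X, y, rng.normal(3, 1, 4), k=5))
    ```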

  6. QSAR analysis of furanone derivatives as potential COX-2 inhibitors: kNN MFA approach

    Directory of Open Access Journals (Sweden)

    Ruchi Bhatiya

    2014-12-01

    Full Text Available A series of thirty-two furanone derivatives with their cyclooxygenase-2 inhibitory activity were subjected to quantitative structure–activity relationship analysis to derive a correlation between biological activity as a dependent variable and various descriptors as independent variables by using the V-LIFE MDS 3.5 software. The significant 2D QSAR model showed a correlation coefficient (r2) of 0.840, a standard error of estimation (SEE) of 0.195, and a cross-validated squared correlation coefficient (q2) of 0.773. The descriptors involved in building the 2D QSAR model, namely the retention index for six-membered rings, the total number of oxygens connected with two single bonds, and the polar surface area excluding P and S, play a significant role in COX-2 inhibition. 3D QSAR performed via Step-Wise K-Nearest Neighbor Molecular Field Analysis (SW kNN MFA) with the partial least-squares (PLS) technique showed high predictive ability (r2 = 0.7622, q2 = 0.7031 and standard error = 0.3660), explaining the majority of the variance in the data with two principal components. The results of the present study may be useful in the design of more potent furanone derivatives as COX-2 inhibitors.

  7. Prediction of Epileptic Seizure by Analysing Time Series EEG Signal Using k-NN Classifier

    Directory of Open Access Journals (Sweden)

    Md. Kamrul Hasan

    2017-01-01

    Full Text Available The electroencephalographic (EEG) signal is a representative signal that contains information about brain activity, which is used for the detection of epilepsy since epileptic seizures are caused by a disturbance in the electrophysiological activity of the brain. The prediction of epileptic seizures usually requires a detailed and experienced analysis of EEG. In this paper, we have introduced a statistical analysis of the EEG signal that is capable of recognizing epileptic seizures with a high degree of accuracy and helps to provide automatic detection of epileptic seizures for different ages of epilepsy. To accomplish the target research, we extract various epileptic features, namely approximate entropy (ApEn), standard deviation (SD), standard error (SE), modified mean absolute value (MMAV), roll-off (R), and zero crossing (ZC), from the epileptic signal. The k-nearest neighbours (k-NN) algorithm is used for the classification of epilepsy; then regression analysis is used for the prediction of the epilepsy level at different ages of the patients. Using the statistical parameters and regression analysis, a prototype mathematical model is proposed which helps to find the epileptic randomness with respect to the age of different subjects. The accuracy of this prototype equation depends on proper analysis of the dynamic information from the epileptic EEG.
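
    A few of the named features are straightforward to compute; the sketch below extracts SD, SE, a modified mean absolute value, and zero crossings from a signal window and feeds them to a k-NN classifier. The MMAV weighting and the synthetic windows are assumptions (common EMG/EEG conventions, not necessarily the authors' formulas); ApEn and roll-off are omitted for brevity.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def eeg_features(x):
        """SD, SE, MMAV and zero-crossing count for one signal window."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sd = x.std(ddof=1)                                   # standard deviation
        se = sd / np.sqrt(n)                                 # standard error
        i = np.arange(n)
        w = np.where((i >= 0.25 * n) & (i <= 0.75 * n), 1.0, 0.5)
        mmav = np.mean(w * np.abs(x))                        # modified mean absolute value
        zc = np.count_nonzero(np.signbit(x[:-1]) != np.signbit(x[1:]))  # zero crossings
        return np.array([sd, se, mmav, zc])

    # toy windows: low-amplitude "normal" vs high-amplitude "seizure-like" segments
    rng = np.random.default_rng(0)
    X = np.array([eeg_features(rng.normal(0, a, 256)) for a in [1] * 20 + [5] * 20])
    y = np.array([0] * 20 + [1] * 20)
    clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
    print(clf.predict([eeg_features(rng.normal(0, 5, 256))]))
    ```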

  8. Automated web usage data mining and recommendation system using K-Nearest Neighbor (KNN classification method

    Directory of Open Access Journals (Sweden)

    D.A. Adeniyi

    2016-01-01

    Full Text Available The major problem of many on-line web sites is the presentation of many choices to the client at a time; this usually results in a strenuous and time-consuming task of finding the right product or information on the site. In this work, we present a study of automatic web usage data mining and a recommendation system based on the current user's behavior through his/her click stream data on a newly developed Really Simple Syndication (RSS) reader website, in order to provide relevant information to the individual without explicitly asking for it. The K-Nearest-Neighbor (KNN) classification method has been trained to be used on-line and in real time to identify clients'/visitors' click stream data, matching it to a particular user group and recommending a tailored browsing option that meets the need of the specific user at a particular time. To achieve this, web users' RSS address files were extracted, cleansed, formatted and grouped into meaningful sessions, and a data mart was developed. Our result shows that the K-Nearest Neighbor classifier is transparent, consistent, straightforward, simple to understand, has a high tendency to possess desirable qualities, and is easier to implement than most other machine learning techniques, specifically when there is little or no prior knowledge about the data distribution.

  9. Optical and Piezoelectric Study of KNN Solid Solutions Co-Doped with La-Mn and Eu-Fe

    Directory of Open Access Journals (Sweden)

    Jesús-Alejandro Peña-Jiménez

    2016-09-01

    Full Text Available The solid-state method was used to synthesize single-phase potassium-sodium niobate (KNN) co-doped with the La3+–Mn4+ and Eu3+–Fe3+ ion pairs. Structural determination of all studied solid solutions was accomplished by XRD and the Rietveld refinement method. Electron paramagnetic resonance (EPR) studies were performed to determine the oxidation state of the paramagnetic centers. Optical spectroscopy measurements, excitation, emission and decay lifetime, were carried out for each solid solution. The present study reveals that doping KNN with La3+–Mn4+ and Eu3+–Fe3+ at concentrations of 0.5 mol % and 1 mol %, respectively, improves the ferroelectric and piezoelectric behavior and induces optical properties in the material for potential applications.

  10. Adaptation.

    Science.gov (United States)

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms varies. Adaptive characters of organisms, including adaptive behaviours, increase fitness so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  11. Random Kaehler metrics

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, Frank, E-mail: frank.ferrari@ulb.ac.be [Service de Physique Theorique et Mathematique, Universite Libre de Bruxelles and International Solvay Institutes, Campus de la Plaine, CP 231, 1050 Bruxelles (Belgium); Klevtsov, Semyon, E-mail: semyon.klevtsov@ulb.ac.be [Service de Physique Theorique et Mathematique, Universite Libre de Bruxelles and International Solvay Institutes, Campus de la Plaine, CP 231, 1050 Bruxelles (Belgium); ITEP, B. Cheremushkinskaya 25, Moscow 117218 (Russian Federation); Zelditch, Steve, E-mail: zelditch@math.northwestern.edu [Department of Mathematics, Northwestern University, Evanston, IL 60208 (United States)

    2013-04-01

    The purpose of this article is to propose a new method to define and calculate path integrals over metrics on a Kaehler manifold. The main idea is to use finite dimensional spaces of Bergman metrics, as an approximation to the full space of Kaehler metrics. We use the theory of large deviations to decide when a sequence of probability measures on the spaces of Bergman metrics tends to a limit measure on the space of all Kaehler metrics. Several examples are considered.

  12. Function valued metric spaces

    Directory of Open Access Journals (Sweden)

    Madjid Mirzavaziri

    2010-11-01

    Full Text Available In this paper we introduce the notion of an ℱ-metric, as a function valued distance mapping, on a set X and we investigate the theory of ℱ-metric spaces. We show that every metric space may be viewed as an ℱ-metric space and every ℱ-metric space (X,δ) can be regarded as a topological space (X,τδ). In addition, we prove that the category of the so-called extended ℱ-metric spaces properly contains the category of metric spaces. We also introduce the concept of an `ℱ-metric space as a completion of an ℱ-metric space and, as an application to topology, we prove that each normal topological space is `ℱ-metrizable.

  13. Structural Evolution of the R-T Phase Boundary in KNN-Based Ceramics

    KAUST Repository

    Lv, Xiang

    2017-10-04

    Although a rhombohedral-tetragonal (R-T) phase boundary is known to substantially enhance the piezoelectric properties of potassium-sodium niobate ceramics, the structural evolution of the R-T phase boundary itself is still unclear. In this work, the structural evolution of R-T phase boundary from -150 °C to 200 °C is investigated in (0.99-x)K0.5Na0.5Nb1-ySbyO3-0.01CaSnO3-xBi0.5K0.5HfO3 (where x=0~0.05 with y=0.035, and y=0~0.07 with x=0.03) ceramics. Through temperature-dependent powder X-ray diffraction (XRD) patterns and Raman spectra, the structural evolution was determined to be Rhombohedral (R, <-125 °C) → Rhombohedral+Orthorhombic (R+O, -125 °C to 0 °C) → Rhombohedral+Tetragonal (R+T, 0 °C to 150 °C) → dominating Tetragonal (T, 200 °C to Curie temperature (TC)) → Cubic (C, >TC). In addition, the enhanced electrical properties (e.g., a direct piezoelectric coefficient (d33) of ~450±5 pC/N, a conversion piezoelectric coefficient (d33*) of ~580±5 pm/V, an electromechanical coupling factor (kp) of ~0.50±0.02, and TC~250 °C), fatigue-free behavior, and good thermal stability were exhibited by the ceramics possessing the R-T phase boundary. This work improves understanding of the physical mechanism behind the R-T phase boundary in KNN-based ceramics and is an important step towards their adoption in practical applications. This article is protected by copyright. All rights reserved.

  14. Adaptation

    International Development Research Centre (IDRC) Digital Library (Canada)

    A number of CCAA-supported projects have relevance to other important adaptation-related themes such as disaster preparedness and climate.

  15. Group covariance and metrical theory

    International Nuclear Information System (INIS)

    Halpern, L.

    1983-01-01

    The a priori introduction of a Lie group of transformations into a physical theory has often proved to be useful; it usually serves to describe special simplified conditions before a general theory can be worked out. Newton's assumptions of absolute space and time are examples where the Euclidian group and translation group have been introduced. These groups were extended to the Galilei group and modified in the special theory of relativity to the Poincare group to describe physics under the given conditions covariantly in the simplest way. The criticism of the a priori character leads to the formulation of the general theory of relativity. The general metric theory does not really give preference to a particular invariance group - even the principle of equivalence can be adapted to a whole family of groups. The physical laws covariantly inserted into the metric space are however adapted to the Poincare group. 8 references

  16. Metric Tannakian duality

    Science.gov (United States)

    Daenzer, Calder

    2013-08-01

    We incorporate metric data into the framework of Tannaka-Krein duality. Thus, for any group with left invariant metric, we produce a dual metric on its category of unitary representations. We characterize the conditions under which a "double-dual" metric on the group may be recovered from the metric on representations, and provide conditions under which a metric agrees with its double-dual. We also explore a diverse class of possible applications of the theory, including applications to T-duality and to quantum Gromov-Hausdorff distance.

  17. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were subsequently queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  18. Intuitionistic fuzzy metric spaces

    International Nuclear Information System (INIS)

    Park, Jin Han

    2004-01-01

    Using the idea of intuitionistic fuzzy set due to Atanassov [Intuitionistic fuzzy sets. in: V. Sgurev (Ed.), VII ITKR's Session, Sofia June, 1983; Fuzzy Sets Syst. 20 (1986) 87], we define the notion of intuitionistic fuzzy metric spaces as a natural generalization of fuzzy metric spaces due to George and Veeramani [Fuzzy Sets Syst. 64 (1994) 395] and prove some known results of metric spaces including Baire's theorem and the Uniform limit theorem for intuitionistic fuzzy metric spaces

  19. Adaptation

    International Development Research Centre (IDRC) Digital Library (Canada)

    Nairobi, Kenya. 28. Adapting Fishing Policy to Climate Change with the Aid of Scientific and Endogenous Knowledge (Cape Verde, Gambia, Guinea, Guinea Bissau, Mauritania and Senegal; Environment and Development in the Third World (ENDA-TM), Dakar, Senegal). 29. Integrating Indigenous Knowledge in Climate Risk ...

  20. Metric learning for DNA microarray data analysis

    International Nuclear Information System (INIS)

    Takeuchi, Ichiro; Nakagawa, Masao; Seto, Masao

    2009-01-01

    In many microarray studies, gene set selection is an important preliminary step for subsequent main tasks such as tumor classification, cancer subtype identification, etc. In this paper, we investigate the possibility of using metric learning as an alternative to gene set selection. We develop a simple metric learning algorithm aiming to use it for microarray data analysis. Exploiting a property of the algorithm, we introduce a novel approach for extending the metric learning to be adaptive. We apply the algorithm to previously studied microarray data on malignant lymphoma subtype identification.
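
    One simple way to realise an adaptive metric for k-NN in the spirit described (though not the authors' algorithm) is to learn a diagonal metric by weighting each gene with a Fisher-like score and rescaling the features before the neighbour search, as sketched below on synthetic microarray-like data.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Synthetic microarray-like data: many features, few informative ones.
    X, y = make_classification(n_samples=120, n_features=200, n_informative=10,
                               n_redundant=0, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    def fisher_weights(X, y):
        """Between-class over within-class variance per feature (illustrative weights)."""
        classes = np.unique(y)
        means = np.array([X[y == c].mean(axis=0) for c in classes])
        within = np.array([X[y == c].var(axis=0) for c in classes]).mean(axis=0)
        between = means.var(axis=0)
        return between / (within + 1e-12)

    w = fisher_weights(X_tr, y_tr)
    scale = np.sqrt(w)          # rescaling features realises a diagonal (weighted) metric

    for name, (tr, te) in {"plain Euclidean": (X_tr, X_te),
                           "learned diagonal metric": (X_tr * scale, X_te * scale)}.items():
        acc = KNeighborsClassifier(n_neighbors=3).fit(tr, y_tr).score(te, y_te)
        print(f"{name}: accuracy = {acc:.3f}")
    ```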

  1. Software Metrics Capability Evaluation Guide

    National Research Council Canada - National Science Library

    Budlong, Faye

    1995-01-01

    ...: disseminating information regarding the U.S. Air Force Policy on software metrics, providing metrics information to the public through CrossTalk, conducting customer workshops in software metrics, guiding metrics technology adoption programs...

  2. Metrics That Matter.

    Science.gov (United States)

    Prentice, Julia C; Frakt, Austin B; Pizer, Steven D

    2016-04-01

    Increasingly, performance metrics are seen as key components for accurately measuring and improving health care value. Disappointment in the ability of chosen metrics to meet these goals is exemplified in a recent Institute of Medicine report that argues for a consensus-building process to determine a simplified set of reliable metrics. Overall health care goals should be defined and then metrics to measure these goals should be considered. If appropriate data for the identified goals are not available, they should be developed. We use examples from our work in the Veterans Health Administration (VHA) on validating waiting time and mental health metrics to highlight other key issues for metric selection and implementation. First, we focus on the need for specification and predictive validation of metrics. Second, we discuss strategies to maintain the fidelity of the data used in performance metrics over time. These strategies include using appropriate incentives and data sources, using composite metrics, and ongoing monitoring. Finally, we discuss the VA's leadership in developing performance metrics through a planned upgrade in its electronic medical record system to collect more comprehensive VHA and non-VHA data, increasing the ability to comprehensively measure outcomes.

  3. Fractal dimension to classify the heart sound recordings with KNN and fuzzy c-mean clustering methods

    Science.gov (United States)

    Juniati, D.; Khotimah, C.; Wardani, D. E. K.; Budayasa, K.

    2018-01-01

    Heart abnormalities can be detected from heart sounds. A heart sound can be heard directly with a stethoscope or indirectly with a phonocardiograph, a machine for recording heart sounds. This paper presents the implementation of fractal dimension theory to classify phonocardiograms into a normal heart sound, a murmur, or an extrasystole. The main algorithm used to calculate the fractal dimension was Higuchi's algorithm. There were two steps to classify the phonocardiograms: feature extraction and classification. For feature extraction, we used the Discrete Wavelet Transform to decompose the heart sound signal into several sub-bands depending on the selected level. After the decomposition process, the signal was processed using the Fast Fourier Transform (FFT) to determine the spectral frequency. The fractal dimension of the FFT output was calculated using Higuchi's algorithm. The classification of the fractal dimensions of all phonocardiograms was done with the KNN and fuzzy c-mean clustering methods. Based on the results, the best accuracy obtained was 86.17%, achieved with feature extraction by DWT decomposition at level 3, kmax of 50, 5-fold cross validation, and 5 neighbors in the K-NN algorithm. Meanwhile, for fuzzy c-mean clustering, the accuracy was 78.56%.
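
    Higuchi's algorithm itself is compact enough to show directly. The sketch below computes the fractal dimension of a 1-D signal for a given kmax; the white-noise sanity check is illustrative, and the surrounding DWT/FFT pipeline of the paper is not reproduced.

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=50):
        """Higuchi's algorithm for the fractal dimension of a 1-D signal.

        For each delay k, average the curve lengths L_m(k) over the k possible
        starting offsets, then fit log L(k) against log(1/k); the slope is the
        fractal dimension.
        """
        x = np.asarray(x, dtype=float)
        n = len(x)
        lengths = []
        for k in range(1, kmax + 1):
            lk = []
            for m in range(k):
                idx = np.arange(m, n, k)                 # subsampled series x[m], x[m+k], ...
                if len(idx) < 2:
                    continue
                norm = (n - 1) / ((len(idx) - 1) * k)    # Higuchi's normalisation factor
                lk.append(np.sum(np.abs(np.diff(x[idx]))) * norm / k)
            lengths.append(np.mean(lk))
        k_vals = np.arange(1, kmax + 1)
        slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lengths), 1)
        return slope

    # sanity check: white noise should give a dimension close to 2
    rng = np.random.default_rng(0)
    print(higuchi_fd(rng.normal(size=2048), kmax=50))
    ```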

  4. Dynamic partial reconfiguration implementation of the SVM/KNN multi-classifier on FPGA for bioinformatics application.

    Science.gov (United States)

    Hussain, Hanaa M; Benkrid, Khaled; Seker, Huseyin

    2015-01-01

    Bioinformatics data tend to be highly dimensional in nature and thus impose significant computational demands. To resolve limitations of conventional computing methods, several alternative high performance computing solutions have been proposed by scientists, such as Graphical Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The latter have been shown to be efficient and high in performance. In recent years, FPGAs have been benefiting from the dynamic partial reconfiguration (DPR) feature for adding the flexibility to alter specific regions within the chip. This work proposes combining the use of FPGAs and DPR to build a dynamic multi-classifier architecture that can be used in processing bioinformatics data. In bioinformatics, applying different classification algorithms to the same dataset is desirable in order to obtain comparable, more reliable and consensus decisions, but it can consume a long time when performed on a conventional PC. The DPR implementations of two common classifiers, namely support vector machines (SVMs) and K-nearest neighbor (KNN), are combined to form a multi-classifier FPGA architecture which can utilize a specific region of the FPGA to work as either an SVM or a KNN classifier. This multi-classifier DPR implementation achieved at least an ~8x reduction in reconfiguration time over the single non-DPR classifier implementation, and occupied less space and fewer hardware resources than having both classifiers. The proposed architecture can be extended to work as an ensemble classifier.

  5. Metric modular spaces

    CERN Document Server

    Chistyakov, Vyacheslav

    2015-01-01

    Aimed toward researchers and graduate students familiar with elements of functional analysis, linear algebra, and general topology; this book contains a general study of modulars, modular spaces, and metric modular spaces. Modulars may be thought of as generalized velocity fields and serve two important purposes: generate metric spaces in a unified manner and provide a weaker convergence, the modular convergence, whose topology is non-metrizable in general. Metric modular spaces are extensions of metric spaces, metric linear spaces, and classical modular linear spaces. The topics covered include the classification of modulars, metrizability of modular spaces, modular transforms and duality between modular spaces, metric  and modular topologies. Applications illustrated in this book include: the description of superposition operators acting in modular spaces, the existence of regular selections of set-valued mappings, new interpretations of spaces of Lipschitzian and absolutely continuous mappings, the existe...

  6. Metric diffusion along foliations

    CERN Document Server

    Walczak, Szymon M

    2017-01-01

    Up-to-date research in metric diffusion along compact foliations is presented in this book. Beginning with fundamentals from optimal transportation theory and the theory of foliations, this book moves on to cover the Wasserstein distance, the Kantorovich Duality Theorem, and the metrization of the weak topology by the Wasserstein distance. Metric diffusion is defined, the topology of the metric space is studied and the limits of diffused metrics along compact foliations are discussed. Essentials on foliations, holonomy, heat diffusion, and compact foliations are detailed and vital technical lemmas are proved to aid understanding. Graduate students and researchers in geometry, topology and dynamics of foliations and laminations will find this supplement useful as it presents facts about metric diffusion along non-compact foliations and provides a full description of the limit for metrics diffused along foliations with at least one compact leaf in two dimensions.

  7. Software Metrics: Measuring Haskell

    OpenAIRE

    Ryder, Chris; Thompson, Simon

    2005-01-01

    Software metrics have been used in software engineering as a mechanism for assessing code quality and for targeting software development activities, such as testing or refactoring, at areas of a program that will most benefit from them. Haskell has many tools for software engineering, such as testing, debugging and refactoring tools, but software metrics have mostly been neglected. The work presented in this paper identifies a collection of software metrics for use with Haskell programs. Thes...

  8. -Metric Space: A Generalization

    Directory of Open Access Journals (Sweden)

    Farshid Khojasteh

    2013-01-01

    Full Text Available We introduce the notion of a -metric as a generalization of a metric by replacing the triangle inequality with a more generalized inequality. We investigate the topology of the spaces induced by a -metric and present some essential properties of it. Further, we give characterizations of well-known fixed point theorems, such as the Banach and Caristi types, in the context of such spaces.

  9. Metric graphic sets

    Science.gov (United States)

    Garces, I. J. L.; Rosario, J. B.

    2017-10-01

    For an ordered subset W = {w1, w2, …, wk} of vertices in a connected graph G and a vertex v of G, the metric representation of v with respect to W is the k-vector r(v|W) = (d(v, w1), d(v, w2), …, d(v, wk)), where d(v, wi) is the distance between the vertices v and wi in G. The set W is called a resolving set of G if r(u|W) = r(v|W) implies u = v. The metric dimension of G, denoted by β(G), is the minimum cardinality of a resolving set of G, and a resolving set of G with cardinality equal to its metric dimension is called a metric basis of G. A set T of vectors is called a positive lattice set if all the coordinates in each vector of T are positive integers. A positive lattice set T consisting of n k-vectors is called a metric graphic set if there exists a simple connected graph G of order n + k with β(G) = k such that T = {r(ui|S) : ui ∈ V(G)\S, 1 ≤ i ≤ n} for some metric basis S = {s1, s2, …, sk} of G. If such G exists, then we say G is a metric graphic realization of T. In this paper, we introduce the concept of metric graphic sets anchored on the concept of metric dimension and provide some characterizations. We also give necessary and sufficient conditions for any positive lattice set consisting of 2 k-vectors to be a metric graphic set. We provide an upper bound for the sum of all the coordinates of any metric graphic set and enumerate some properties of positive lattice sets consisting of n 2-vectors that are not metric graphic sets.
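
    The definitions are easy to make concrete on a small example: the brute-force sketch below (using networkx, an assumption of convenience) searches all vertex subsets for the smallest resolving set, returning the metric dimension and one metric basis. It is exponential in the number of vertices and only meant to illustrate the definitions.

    ```python
    from itertools import combinations
    import networkx as nx

    def metric_dimension(G):
        """Brute-force metric dimension: the smallest |W| such that the vectors of
        distances to W distinguish every pair of vertices. Only suitable for tiny graphs.
        """
        dist = dict(nx.all_pairs_shortest_path_length(G))
        nodes = list(G.nodes)
        for k in range(1, len(nodes) + 1):
            for W in combinations(nodes, k):
                reps = {tuple(dist[v][w] for w in W) for v in nodes}
                if len(reps) == len(nodes):        # all metric representations distinct
                    return k, W
        return len(nodes), tuple(nodes)

    # the 5-cycle C5 has metric dimension 2
    print(metric_dimension(nx.cycle_graph(5)))
    ```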

  10. Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — This chapter presents several performance metrics for offline evaluation of prognostics algorithms. A brief overview of different methods employed for performance...

  11. Topics in Metric Approximation

    Science.gov (United States)

    Leeb, William Edward

    This thesis develops effective approximations of certain metrics that occur frequently in pure and applied mathematics. We show that distances that often arise in applications, such as the Earth Mover's Distance between two probability measures, can be approximated by easily computed formulas for a wide variety of ground distances. We develop simple and easily computed characterizations both of norms measuring a function's regularity -- such as the Lipschitz norm -- and of their duals. We are particularly concerned with the tensor product of metric spaces, where the natural notion of regularity is not the Lipschitz condition but the mixed Lipschitz condition. A theme that runs throughout this thesis is that snowflake metrics (metrics raised to a power less than 1) are often better-behaved than ordinary metrics. For example, we show that snowflake metrics on finite spaces can be approximated by the average of tree metrics with a distortion bounded by intrinsic geometric characteristics of the space and not the number of points. Many of the metrics for which we characterize the Lipschitz space and its dual are snowflake metrics. We also present applications of the characterization of certain regularity norms to the problem of recovering a matrix that has been corrupted by noise. We are able to achieve an optimal rate of recovery for certain families of matrices by exploiting the relationship between mixed-variable regularity conditions and the decay of a function's coefficients in a certain orthonormal basis.

  12. Overview of journal metrics

    Directory of Open Access Journals (Sweden)

    Kihong Kim

    2018-02-01

    Full Text Available Various kinds of metrics used for the quantitative evaluation of scholarly journals are reviewed. The impact factor and related metrics, including the immediacy index and the aggregate impact factor, which are provided by the Journal Citation Reports, are explained in detail. The Eigenfactor score and the article influence score are also reviewed. In addition, journal metrics such as CiteScore, Source Normalized Impact per Paper, SCImago Journal Rank, h-index, and g-index are discussed. Limitations and problems of these metrics are pointed out. We should be cautious not to rely too much on these quantitative measures when evaluating journals or researchers.

  13. Cross-cultural adaptation of the Brazilian version of the Social Rhythm Metric-17 (SRM-17) for the population of Angola

    Directory of Open Access Journals (Sweden)

    Regina Lopes Schimitt

    2011-01-01

    Full Text Available INTRODUCTION: Social rhythm is a concept that integrates the relationship between Zeitgebers (social synchronizers) and endogenous time markers, and can be assessed with the Social Rhythm Metric-17 (SRM-17). The aim of this study was to adapt the Brazilian version of the SRM-17 to Angolan Portuguese, comparing the two scales in populations that use the same language but present cultural differences. METHODS: The Brazilian version of the SRM-17 was assessed by 10 Angolan university students, who analysed the degree of clarity of each of the 15 items of the instrument using a 10 cm visual analogue scale and proposed changes to the text. The results were reviewed to produce the final version, followed by proofreading and a final report. RESULTS: The final Angolan version maintained item equivalence with respect to the Brazilian Portuguese version. The assessed version showed a satisfactory degree of clarity and semantic equivalence for most items. However, some items had clarity scores below the arithmetic mean of global comprehension of the instrument (8.38±1.0). CONCLUSION: Although Portuguese is the official language of both countries, there are significant cultural differences between the two populations. This work presents a version of a specific instrument for measuring social rhythm adapted to the Angolan reality. The cross-cultural adaptation process should be completed with validation studies of the final instrument in a larger population sample, in which the operational, measurement and functional equivalences can also be assessed.

  14. Robust Transfer Metric Learning for Image Classification.

    Science.gov (United States)

    Ding, Zhengming; Fu, Yun

    2017-02-01

    Metric learning has attracted increasing attention due to its critical role in image analysis and classification. Conventional metric learning always assumes that the training and test data are sampled from the same or similar distributions. However, to build an effective distance metric, we need abundant supervised knowledge (i.e., side/label information), which is generally inaccessible in practice because of the expensive labeling cost. In this paper, we develop a robust transfer metric learning (RTML) framework to effectively assist the unlabeled target learning by transferring the knowledge from the well-labeled source domain. Specifically, RTML exploits knowledge transfer to mitigate the domain shift in two directions, i.e., sample space and feature space. In the sample space, domain-wise and class-wise adaptation schemes are adopted to bridge the gap of marginal and conditional distribution disparities across the two domains. In the feature space, our metric is built in a marginalized denoising fashion with a low-rank constraint, which makes it more robust to noisy data in practice. Furthermore, we design an explicit rank constraint regularizer to replace the NP-hard rank minimization problem to guide the low-rank metric learning. Experimental results on several standard benchmarks demonstrate the effectiveness of our proposed RTML by comparing it with state-of-the-art transfer learning and metric learning algorithms.

  15. Project Management Metrics

    Directory of Open Access Journals (Sweden)

    Radu MARSANU

    2010-04-01

    Full Text Available Metrics and indicators used for the evaluation of the IT projects management have the advantage of providing rigorous details about the required effort and the boundaries of the IT deliverables. There are some disadvantages, as well, due to the fact the input data contains errors and the value of metrics depends on the quality of data used in models.

  16. Project Management Metrics

    OpenAIRE

    Radu MARSANU

    2010-01-01

    Metrics and indicators used for the evaluation of the IT projects management have the advantage of providing rigorous details about the required effort and the boundaries of the IT deliverables. There are some disadvantages, as well, due to the fact the input data contains errors and the value of metrics depends on the quality of data used in models.

  17. Computational visual distinctness metric

    NARCIS (Netherlands)

    Martínez-Baena, J.; Toet, A.; Fdez-Vidal, X.R.; Garrido, A.; Rodríguez-Sánchez, R.

    1998-01-01

    A new computational visual distinctness metric based on principles of the early human visual system is presented. The metric is applied to quantify (1) the visual distinctness of targets in complex natural scenes and (2) the perceptual differences between compressed and uncompressed images. The new

  18. Arbitrary Metrics in Psychology

    Science.gov (United States)

    Blanton, Hart; Jaccard, James

    2006-01-01

    Many psychological tests have arbitrary metrics but are appropriate for testing psychological theories. Metric arbitrariness is a concern, however, when researchers wish to draw inferences about the true, absolute standing of a group or individual on the latent psychological dimension being measured. The authors illustrate this in the context of 2…

  19. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    We investigate the geometrical structure of probabilistic generative dimensionality reduction models using the tools of Riemannian geometry. We explicitly define a distribution over the natural metric given by the models. We provide the necessary algorithms to compute expected metric tensors where...

  20. Privacy Metrics and Boundaries

    NARCIS (Netherlands)

    L-F. Pau (Louis-François)

    2005-01-01

    This paper aims at defining a set of privacy metrics (quantitative and qualitative) in the case of the relation between a privacy protector and an information gatherer. The aims with such metrics are: to allow assessment and comparison of different user scenarios and their differences; for

  1. Metrics for Cosmetology.

    Science.gov (United States)

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of cosmetology students, this instructional package on cosmetology is part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology, measurement terms, and tools currently in use. Each of the…

  2. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    ... (sometimes referred to as confidence) state. The TPM can also be used as a measure of algorithm performance to compare against the Trackability Metric. The Trackability Metric was developed by AMCOM to determine how "trackable" a set of data should be. The TPM will be described and results presented.

  3. Metrics for Secretarial, Stenography.

    Science.gov (United States)

    Cooper, Gloria S., Ed.; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of secretarial, stenography students, this instructional package is one of three for the business and office occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

  4. Metrics, Media and Advertisers: Discussing Relationship

    Directory of Open Access Journals (Sweden)

    Marco Aurelio de Souza Rodrigues

    2014-11-01

    Full Text Available This study investigates how Brazilian advertisers are adapting to new media and its attention metrics. In-depth interviews were conducted with advertisers in 2009 and 2011. In 2009, new media and its metrics were celebrated as innovations that would increase the overall efficiency of advertising campaigns. In 2011, this perception had changed: the profusion of metrics offered by new media, once seen as an advantage, had started to compromise ease of use and adoption. Among its findings, this study argues that there is an opportunity for media groups willing to shift from a product-focused strategy towards a customer-centric one, through the creation of new, simple and integrative metrics.

  5. The Metric Lens : Visualizing Metrics and Structure on Software Diagrams

    NARCIS (Netherlands)

    Byelas, Heorhiy; Telea, Alexandru; Hassan, AE; Zaidman, A; DiPenta, M

    2008-01-01

    We present the metric lens, a new visualization of method-level code metrics atop UML class diagrams, which allows performing metric-metric and metric-structure correlations on large diagrams. We demonstrate an interactive visualization tool in which users can quickly specify a wide palette of

  6. Optimal Detection Range of RFID Tag for RFID-based Positioning System Using the k-NN Algorithm

    Directory of Open Access Journals (Sweden)

    Joon Heo

    2009-06-01

    Full Text Available Positioning technology to track a moving object is an important and essential component of ubiquitous computing environments and applications. An RFID-based positioning system using the k-nearest neighbor (k-NN) algorithm can determine the position of a moving reader from observed reference data. In this study, the optimal detection range of an RFID-based positioning system was determined on the principle that tag spacing can be derived from the detection range. It was assumed that reference tags without signal strength information are regularly distributed in 1-, 2- and 3-dimensional spaces. The optimal detection range was determined, through analytical and numerical approaches, to be 125% of the tag-spacing distance in 1-dimensional space. Through numerical approaches, the range was 134% in 2-dimensional space and 143% in 3-dimensional space.
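
    A minimal sketch of the positioning step described above: the reader's position is estimated from the reference tags it detects (here, simply their centroid). The grid layout, the reader location and the use of the 1-D 125% figure on a 2-D grid are illustrative assumptions.

    import numpy as np

    spacing = 1.0                                   # tag spacing (arbitrary units)
    detection_range = 1.25 * spacing                # the optimal 1-D range reported above
    xs, ys = np.meshgrid(np.arange(10), np.arange(10))
    tags = np.column_stack([xs.ravel(), ys.ravel()]) * spacing   # regular grid of reference tags

    reader = np.array([4.3, 6.7])                   # true (unknown) reader position
    detected = tags[np.linalg.norm(tags - reader, axis=1) <= detection_range]

    estimate = detected.mean(axis=0)                # centroid of the detected tags
    print("estimate:", estimate, "error:", np.linalg.norm(estimate - reader))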

  7. Effects of doping on ferroelectric properties and leakage current behavior of KNN-LT-LS thin films on SrTiO3 substrate

    Science.gov (United States)

    Abazari, M.; Safari, A.

    2009-05-01

    We report the effects of Ba, Ti, and Mn dopants on ferroelectric polarization and leakage current of (K0.44Na0.52Li0.04)(Nb0.84Ta0.1Sb0.06)O3 (KNN-LT-LS) thin films deposited by pulsed laser deposition. It is shown that donor dopants such as Ba2+, which increased the resistivity in bulk KNN-LT-LS, had an opposite effect in the thin film. Ti4+ as an acceptor B-site dopant reduces the leakage current by an order of magnitude, while the polarization values showed a slight degradation. Mn4+, however, was found to effectively suppress the leakage current by over two orders of magnitude while enhancing the polarization, with 15 and 23 μC/cm2 remanent and saturated polarization, whose values are ˜70% and 82% of the reported values for bulk composition. This phenomenon has been associated with the dual effect of Mn4+ in KNN-LT-LS thin film, by substituting both A- and B-site cations. A detailed description on how each dopant affects the concentrations of vacancies in the lattice is presented. Mn-doped KNN-LT-LS thin films are shown to be a promising candidate for lead-free thin films and applications.

  8. General perceptual contrast metric

    Science.gov (United States)

    Liberg, Anna; Hasler, David

    2003-06-01

    A combined achromatic and chromatic contrast metric for digital images and video is presented in this paper. Our work is aimed at tuning any parametric rendering algorithm in an automated way by computing how much detail an observer perceives in a rendered scene. The contrast metric is based on contrast analysis in the spatial domain of image sub-bands constructed by pyramidal decomposition of the image. The proposed contrast metric is the sum of the perceptual contrast of every pixel in the image at different detail levels corresponding to different viewing distances. The novel metric shows high correlation with subjective experiments. Important applications include optimal parameter setting of any image rendering or contrast enhancement technique, and auto exposure of an image capturing device.
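
    A generic sketch of the kind of pyramid-based contrast sum described above (not the authors' exact metric): each level contributes the mean band-limited local contrast, and levels are obtained by blurring and downsampling. The sigma, number of levels and contrast definition are assumptions.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def pyramid_contrast(image, levels=4, eps=1e-6):
        current = image.astype(float)
        total = 0.0
        for _ in range(levels):
            low = gaussian_filter(current, sigma=1.0)      # local luminance estimate
            band = current - low                           # band-pass detail at this scale
            total += np.mean(np.abs(band) / (low + eps))   # mean local contrast at this level
            current = low[::2, ::2]                        # move to the next, coarser level
        return total

    toy = np.random.default_rng(1).random((128, 128))      # stand-in grayscale image
    print(pyramid_contrast(toy))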

  9. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  10. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  11. A metric for success

    Science.gov (United States)

    Carver, Gary P.

    1994-05-01

    The federal agencies are working with industry to ease adoption of the metric system. The goal is to help U.S. industry compete more successfully in the global marketplace, increase exports, and create new jobs. The strategy is to use federal procurement, financial assistance, and other business-related activities to encourage voluntary conversion. Based upon the positive experiences of firms and industries that have converted, federal agencies have concluded that metric use will yield long-term benefits that are beyond any one-time costs or inconveniences. It may be time for additional steps to move the Nation out of its dual-system comfort zone and continue to progress toward metrication. This report includes 'Metric Highlights in U.S. History'.

  12. IT Project Management Metrics

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Many software and IT projects fail to meet their objectives for various causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be used, historical data have to be collected, and metrics and indicators have to be computed and compared with those of past projects to avoid failure. This paper presents some metrics that can be used for IT project management.

  13. IT Project Management Metrics

    OpenAIRE

    Paul POCATILU

    2007-01-01

    Many software and IT projects fail to meet their objectives for various causes, among which project management carries a high weight. In order to have successful projects, lessons learned have to be used, historical data have to be collected, and metrics and indicators have to be computed and compared with those of past projects to avoid failure. This paper presents some metrics that can be used for IT project management.

  14. Mass Customization Measurements Metrics

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev; Jørgensen, Kaj Asbjørn

    2014-01-01

    A recent survey has indicated that 17% of companies have ceased mass customizing less than 1 year after initiating the effort. This paper presents measurements for a company's mass customization performance, utilizing metrics within the three fundamental capabilities: robust process design, choice navigation, and solution space development. A mass customizer assessing performance with these metrics can identify within which areas improvement would increase competitiveness the most and enable a more efficient transition to mass customization.

  15. Distance Metric Tracking

    Science.gov (United States)

    2016-03-02

    estimation [13, 2], and manifold learning [19]. Such unsupervised methods do not have the benefit of human input on the distance metric, and overly rely... to be defined that is related to the task at hand. Many supervised and semi-supervised distance metric learning approaches have been developed [17... Unsupervised PCA seeks to identify a set of axes that best explain the variance contained in the data. LDA takes a supervised approach, minimizing the intra

  16. METRICS DEVELOPMENT FOR PATENTS.

    Science.gov (United States)

    Veiga, Daniela Francescato; Ferreira, Lydia Masako

    2015-01-01

    To develop a proposal of metrics for patents to be applied in assessing the postgraduate programs of Medicine III - Capes. From the reading and analysis of the 2013 area documents of all 48 Capes areas, a proposal of metrics for patents was developed to be applied to the Medicine III programs. Except for the areas of Biotechnology, Food Science, Biological Sciences III, Physical Education, Engineering I, III and IV, and Interdisciplinary, most areas do not adopt a scoring system for patents. The proposal developed was based on the Biotechnology criteria, with adaptations. In general, deposit, granting and licensing/production are valued, in ascending order. Higher scores are also assigned to patents registered abroad and whenever students participate. This proposal can be applied to the Intellectual Production item of the evaluation form, in the Technical Production/Patents subsection. The percentages of 10% for academic programs and 40% for professional master's programs should be maintained. A program is scored as Very Good when it reaches 400 points or more; Good, between 200 and 399 points; Regular, between 71 and 199 points; Weak, up to 70 points; Insufficient, no points.
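
    A small helper that reproduces the score-to-rating bands stated above; the band boundaries come directly from the record, while reading "Weak, up to 70 points" as requiring at least one point is an interpretation.

    def patent_rating(points: int) -> str:
        # Map a program's total patent points to the proposed rating bands.
        if points >= 400:
            return "Very Good"
        if points >= 200:
            return "Good"
        if points >= 71:
            return "Regular"
        if points > 0:
            return "Weak"
        return "Insufficient"

    print(patent_rating(250))   # -> "Good"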

  17. Quality metrics for sensor images

    Science.gov (United States)

    Ahumada, AL

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftoning optimizing methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. Most image quality models are designed for static imagery

  18. Inducing Weinhold's metric from Euclidean and Riemannian metrics

    International Nuclear Information System (INIS)

    Andresen, B.; Berry, R.S.; Ihrig, E.; Salamon, P.

    1987-01-01

    We show that Weinhold's metric cannot be introduced on the equation of state surface from a Euclidean metric in the ambient space of all extensive state variables, whereas it can be induced if the ambient space is assumed only to have a Riemannian metric. This metric, however, is not unique. (orig.)

  19. Fault Management Metrics

    Science.gov (United States)

    Johnson, Stephen B.; Ghoshal, Sudipto; Haste, Deepak; Moore, Craig

    2017-01-01

    This paper describes the theory and considerations in the application of metrics to measure the effectiveness of fault management. Fault management refers here to the operational aspect of system health management, and as such is considered as a meta-control loop that operates to preserve or maximize the system's ability to achieve its goals in the face of current or prospective failure. As a suite of control loops, the metrics to estimate and measure the effectiveness of fault management are similar to those of classical control loops in being divided into two major classes: state estimation, and state control. State estimation metrics can be classified into lower-level subdivisions for detection coverage, detection effectiveness, fault isolation and fault identification (diagnostics), and failure prognosis. State control metrics can be classified into response determination effectiveness and response effectiveness. These metrics are applied to each and every fault management control loop in the system, for each failure to which they apply, and probabilistically summed to determine the effectiveness of these fault management control loops to preserve the relevant system goals that they are intended to protect.

  20. Deep Transfer Metric Learning.

    Science.gov (United States)

    Junlin Hu; Jiwen Lu; Yap-Peng Tan; Jie Zhou

    2016-12-01

    Conventional metric learning methods usually assume that the training and test samples are captured in similar scenarios so that their distributions are assumed to be the same. This assumption does not hold in many real visual recognition applications, especially when samples are captured across different data sets. In this paper, we propose a new deep transfer metric learning (DTML) method to learn a set of hierarchical nonlinear transformations for cross-domain visual recognition by transferring discriminative knowledge from the labeled source domain to the unlabeled target domain. Specifically, our DTML learns a deep metric network by maximizing the inter-class variations and minimizing the intra-class variations, and minimizing the distribution divergence between the source domain and the target domain at the top layer of the network. To better exploit the discriminative information from the source domain, we further develop a deeply supervised transfer metric learning (DSTML) method by including an additional objective on DTML, where the output of both the hidden layers and the top layer are optimized jointly. To preserve the local manifold of input data points in the metric space, we present two new methods, DTML with autoencoder regularization and DSTML with autoencoder regularization. Experimental results on face verification, person re-identification, and handwritten digit recognition validate the effectiveness of the proposed methods.

  1. Cyber threat metrics.

    Energy Technology Data Exchange (ETDEWEB)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  2. Permanence of metric fractals

    Directory of Open Access Journals (Sweden)

    Kyril Tintarev

    2007-05-01

    Full Text Available The paper studies energy functionals on quasimetric spaces, defined by quadratic measure-valued Lagrangeans. This general model of medium, known as metric fractals, includes nested fractals and sub-Riemannian manifolds. In particular, the quadratic form of the Lagrangean satisfies Sobolev inequalities with the critical exponent determined by the (quasimetric homogeneous dimension, which is also involved in the asymptotic distribution of the form's eigenvalues. This paper verifies that the axioms of the metric fractal are preserved by space products, leading thus to examples of non-differentiable media of arbitrary intrinsic dimension.

  3. The Noncommutative Ward Metric

    Directory of Open Access Journals (Sweden)

    Marco Maceda

    2010-06-01

    Full Text Available We analyze the moduli-space metric in the static nonabelian charge-two sector of the Moyal-deformed CP^1 sigma model in 1+2 dimensions. After carefully reviewing the commutative results of Ward and Ruback, the noncommutative Kähler potential is expanded in powers of dimensionless moduli. In two special cases we sum the perturbative series to analytic expressions. For any nonzero value of the noncommutativity parameter, the logarithmic singularity of the commutative metric is expelled from the origin of the moduli space and possibly altogether.

  4. A novel implementation of kNN classifier based on multi-tupled meteorological input data for wind power prediction

    International Nuclear Information System (INIS)

    Yesilbudak, Mehmet; Sagiroglu, Seref; Colak, Ilhami

    2017-01-01

    Highlights: • An accurate wind power prediction model is proposed for very short-term horizon. • The k-nearest neighbor classifier is implemented based on the multi-tupled inputs. • The variation of wind power prediction errors is evaluated in various aspects. • Our approach shows the superior prediction performance over the persistence method. - Abstract: With the growing share of wind power production in the electric power grids, many critical challenges to the grid operators have been emerged in terms of the power balance, power quality, voltage support, frequency stability, load scheduling, unit commitment and spinning reserve calculations. To overcome such problems, numerous studies have been conducted to predict the wind power production, but a small number of them have attempted to improve the prediction accuracy by employing the multidimensional meteorological input data. The novelties of this study lie in the proposal of an efficient and easy to implement very short-term wind power prediction model based on the k-nearest neighbor classifier (kNN), in the usage of wind speed, wind direction, barometric pressure and air temperature parameters as the multi-tupled meteorological inputs and in the comparison of wind power prediction results with respect to the persistence reference model. As a result of the achieved patterns, we characterize the variation of wind power prediction errors according to the input tuples, distance measures and neighbor numbers, and uncover the most influential and the most ineffective meteorological parameters on the optimization of wind power prediction results.
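
    A minimal sketch of very short-term prediction with the multi-tupled inputs named above (wind speed, direction, pressure, temperature), compared against the persistence reference model. The data are synthetic, a k-NN regressor is used in place of the paper's classifier formulation, and the scaling and k value are assumptions.

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 500
    speed = rng.uniform(0, 25, n)
    direction = rng.uniform(0, 360, n)
    pressure = rng.uniform(980, 1040, n)
    temperature = rng.uniform(-5, 35, n)
    power = np.clip(speed, 0, 15) ** 3 / 15 ** 3 + rng.normal(0, 0.05, n)   # toy power curve

    X = np.column_stack([speed, direction, pressure, temperature])
    X_train, X_test = X[:400], X[400:]
    y_train, y_test = power[:400], power[400:]

    scaler = StandardScaler().fit(X_train)
    knn = KNeighborsRegressor(n_neighbors=7).fit(scaler.transform(X_train), y_train)
    y_knn = knn.predict(scaler.transform(X_test))

    y_persist = np.roll(y_test, 1)          # persistence: repeat the previous observation
    y_persist[0] = y_train[-1]

    print("k-NN MAE:       ", np.mean(np.abs(y_knn - y_test)))
    print("persistence MAE:", np.mean(np.abs(y_persist - y_test)))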

  5. Sentiment classification of Roman-Urdu opinions using Naïve Bayesian, Decision Tree and KNN classification techniques

    Directory of Open Access Journals (Sweden)

    Muhammad Bilal

    2016-07-01

    Full Text Available Sentiment mining is a field of text mining that determines the attitude of people about a particular product, topic or politician in newsgroup posts, review sites, comments on Facebook posts, Twitter, etc. There are many issues involved in opinion mining. One important issue is that opinions could be in different languages (English, Urdu, Arabic, etc.). Tackling each language according to its orientation is a challenging task. Most of the research work in sentiment mining has been done in the English language. Currently, limited research is being carried out on sentiment classification of other languages like Arabic, Italian, Urdu and Hindi. In this paper, three classification models are used for text classification using Waikato Environment for Knowledge Analysis (WEKA). Opinions written in Roman-Urdu and English are extracted from a blog. These extracted opinions are documented in text files to prepare a training dataset containing 150 positive and 150 negative opinions, as labeled examples. The testing data set is supplied to three different models and the results in each case are analyzed. The results show that Naïve Bayesian outperformed Decision Tree and KNN in terms of accuracy, precision, recall and F-measure.
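
    A minimal scikit-learn stand-in for the WEKA experiment described above: bag-of-words features fed to Naïve Bayes, Decision Tree and k-NN classifiers, reported with accuracy, precision, recall and F-measure. The four example opinions and all parameters are invented for illustration.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, precision_recall_fscore_support

    texts = ["zabardast mobile hai", "bohat bura service", "great phone", "bad quality"]
    labels = [1, 0, 1, 0]            # 1 = positive, 0 = negative (toy Roman-Urdu/English mix)

    X = CountVectorizer().fit_transform(texts)
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.5,
                                              random_state=0, stratify=labels)

    for clf in (MultinomialNB(), DecisionTreeClassifier(), KNeighborsClassifier(n_neighbors=1)):
        y_pred = clf.fit(X_tr, y_tr).predict(X_te)
        p, r, f, _ = precision_recall_fscore_support(y_te, y_pred, average="binary",
                                                     zero_division=0)
        print(type(clf).__name__, accuracy_score(y_te, y_pred), p, r, f)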

  6. IMAGE LABELING FOR LIDAR INTENSITY IMAGE USING K-NN OF FEATURE OBTAINED BY CONVOLUTIONAL NEURAL NETWORK

    Directory of Open Access Journals (Sweden)

    M. Umemura

    2016-06-01

    Full Text Available We propose an image labeling method for LIDAR intensity images obtained by a Mobile Mapping System (MMS) using K-Nearest Neighbor (KNN) of features obtained by a Convolutional Neural Network (CNN). Image labeling assigns labels (e.g., road, cross-walk and road shoulder) to semantic regions in an image. Since CNN is effective for various image recognition tasks, we try to use the feature of a CNN (Caffenet) pre-trained by ImageNet. We use the 4,096-dimensional feature at the fc7 layer in the Caffenet as the descriptor of a region because the feature at the fc7 layer has effective information for object classification. We extract the feature by the Caffenet from regions cropped from images. Since the similarity between features reflects the similarity of the contents of regions, we can select the top K training regions most similar to a test region. Since regions in training images have manually-annotated ground truth labels, we vote the labels attached to the top K similar regions to the test region. The class label with the maximum vote is assigned to each pixel in the test image. In experiments, we use 36 LIDAR intensity images with ground truth labels. We divide the 36 images into training (28 images) and test (8 images) sets. We use class average accuracy and pixel-wise accuracy as evaluation measures. Our method was able to assign the same label as human beings in 97.8% of the pixels in the test LIDAR intensity images.
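
    A sketch of the voting step described above: regions are represented by CNN descriptors, the K most similar training regions are found, and their ground-truth labels are voted. The function extract_cnn_feature below is a hypothetical placeholder for a real pre-trained network such as the fc7 layer used in the record; here it returns random 4096-dimensional vectors so the snippet runs on its own.

    import numpy as np

    rng = np.random.default_rng(0)

    def extract_cnn_feature(region_image):
        # Placeholder for a pre-trained CNN descriptor (assumed, not real Caffe code).
        return rng.normal(size=4096)

    train_regions = [None] * 200                    # stand-ins for cropped training regions
    train_labels = rng.integers(0, 3, size=200)     # e.g. 0 = road, 1 = cross-walk, 2 = shoulder
    train_feats = np.stack([extract_cnn_feature(r) for r in train_regions])

    def label_region(region_image, K=5):
        f = extract_cnn_feature(region_image)
        dists = np.linalg.norm(train_feats - f, axis=1)    # (dis)similarity to all training regions
        top_k = np.argsort(dists)[:K]                      # K most similar training regions
        return np.bincount(train_labels[top_k]).argmax()   # majority vote of their labels

    print(label_region(None))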

  7. Efficient Privacy-Preserving Protocol for k-NN Search over Encrypted Data in Location-Based Service

    Directory of Open Access Journals (Sweden)

    Huijuan Lian

    2017-01-01

    Full Text Available With the development of mobile communication technology, location-based services (LBS) are booming prosperously. Meanwhile privacy protection has become the main obstacle for the further development of LBS. The k-nearest neighbor (k-NN) search is one of the most common types of LBS. In this paper, we propose an efficient private circular query protocol (EPCQP) with high accuracy rate and low computation and communication cost. We adopt the Moore curve to convert two-dimensional spatial data into a one-dimensional sequence and encrypt the points of interest (POIs) information with the Brakerski-Gentry-Vaikuntanathan homomorphic encryption scheme for privacy-preserving. The proposed scheme performs the secret circular shift of the encrypted POIs information to hide the location of the user without a trusted third party. To reduce the computation and communication cost, we dynamically divide the table of the POIs information according to the value of k. Experiments show that the proposed scheme provides high accuracy query results while maintaining low computation and communication cost.

  8. Engineering performance metrics

    Science.gov (United States)

    Delozier, R.; Snyder, N.

    1993-03-01

    Implementation of a Total Quality Management (TQM) approach to engineering work required the development of a system of metrics which would serve as a meaningful management tool for evaluating effectiveness in accomplishing project objectives and in achieving improved customer satisfaction. A team effort was chartered with the goal of developing a system of engineering performance metrics which would measure customer satisfaction, quality, cost effectiveness, and timeliness. The approach to developing this system involved the normal systems design phases, including conceptual design, detailed design, implementation, and integration. The lessons learned from this effort will be explored in this paper. They may provide a starting point for other large engineering organizations seeking to institute a performance measurement system for accomplishing project objectives and achieving improved customer satisfaction. To facilitate this effort, a team was chartered to assist in the development of the metrics system. This team, consisting of customers and Engineering staff members, was utilized to ensure that the needs and views of the customers were considered in the development of performance measurements. The development of a system of metrics is no different than the development of any type of system. It includes the steps of defining performance measurement requirements, measurement process conceptual design, performance measurement and reporting system detailed design, and system implementation and integration.

  9. Arbitrary Metrics Redux

    Science.gov (United States)

    Blanton, Hart; Jaccard, James

    2006-01-01

    Reducing the arbitrariness of a metric is distinct from the pursuit of validity, rational zero points, data transformations, standardization, and the types of statistical procedures one uses to analyze interval-level versus ordinal-level data. A variety of theoretical, methodological, and statistical tools can assist researchers who wish to make…

  10. Universal hypermultiplet metrics

    International Nuclear Information System (INIS)

    Ketov, Sergei V.

    2001-01-01

    Some instanton corrections to the universal hypermultiplet moduli space metric of the type IIA string theory compactified on a Calabi-Yau threefold arise due to multiple wrapping of BPS membranes and five-branes around certain cycles of the Calabi-Yau. The classical universal hypermultiplet metric is locally equivalent to the Bergmann metric of the symmetric quaternionic space SU(2,1)/U(2), whereas its generic quaternionic deformations are governed by the integrable SU(∞) Toda equation. We calculate the exact (non-perturbative) UH metrics in the special cases of (i) the D-instantons (the wrapped D2-branes) in the absence of five-branes, and (ii) the five-brane instantons with vanishing charges, in the absence of D-instantons. The solutions of the first type preserve the U(1)xU(1) classical symmetry, while they can be interpreted as the gravitational dressing of the hyper-Kaehler D-instanton solutions. The solutions of the second type preserve the non-abelian SU(2) classical symmetry, while they can be interpreted as the gradient flows in the universal hypermultiplet moduli space

  11. Metrics for energy resilience

    International Nuclear Information System (INIS)

    Roege, Paul E.; Collier, Zachary A.; Mancillas, James; McDonagh, John A.; Linkov, Igor

    2014-01-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth. - Highlights: • Resilience is the ability of a system to recover from adversity. • There is a need for methods to quantify and measure system resilience. • We developed a matrix-based approach to generate energy resilience metrics. • These metrics can be used in energy planning, system design, and operations

  12. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  13. Quality metrics in endoscopy.

    Science.gov (United States)

    Gurudu, Suryakanth R; Ramirez, Francisco C

    2013-04-01

    Endoscopy has evolved in the past 4 decades to become an important tool in the diagnosis and management of many digestive diseases. Greater focus on endoscopic quality has highlighted the need to ensure competency among endoscopists. A joint task force of the American College of Gastroenterology and the American Society for Gastrointestinal Endoscopy has proposed several quality metrics to establish competence and help define areas of continuous quality improvement. These metrics represent quality in endoscopy pertinent to pre-, intra-, and postprocedural periods. Quality in endoscopy is a dynamic and multidimensional process that requires continuous monitoring of several indicators and benchmarking with local and national standards. Institutions and practices should have a process in place for credentialing endoscopists and for the assessment of competence regarding individual endoscopic procedures.

  14. Metrics for Energy Resilience

    Energy Technology Data Exchange (ETDEWEB)

    Paul E. Roege; Zachary A. Collier; James Mancillas; John A. McDonagh; Igor Linkov

    2014-09-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth.

  15. Metrics and Assessment

    Directory of Open Access Journals (Sweden)

    Todd Carpenter

    2015-07-01

    Full Text Available An important and timely plenary session at the 2015 UKSG Conference and Exhibition focused on the role of metrics in research assessment. The two excellent speakers had slightly divergent views. Todd Carpenter from NISO (National Information Standards Organization) argued that altmetrics aren't alt anymore and that downloads and other forms of digital interaction, including social media reference, reference tracking, personal library saving, and secondary linking activity now provide mainstream approaches to the assessment of scholarly impact. James Wilsdon is professor of science and democracy in the Science Policy Research Unit at the University of Sussex and is chair of the Independent Review of the Role of Metrics in Research Assessment commissioned by the Higher Education Funding Council for England (HEFCE). The outcome of this review will inform the work of HEFCE and the other UK higher education funding bodies as they prepare for the future of the Research Excellence Framework. He is more circumspect, arguing that metrics cannot and should not be used as a substitute for informed judgement. This article provides a summary of both presentations.

  16. Symmetries of the dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.

    1998-01-01

    The geometric duality between the metric g_μν and a Killing tensor K_μν is studied. The conditions were found under which the symmetries of the metric g_μν and the dual metric K_μν are the same. Dual spinning space was constructed without the introduction of torsion. The general results are applied to the case of the Kerr-Newman metric. (orig.)

  17. Monochromatic metrics are generalized Berwald

    OpenAIRE

    Bartelmeß, Nina; Matveev, Vladimir S.

    2017-01-01

    We show that monochromatic Finsler metrics, i.e., Finsler metrics such that each two tangent spaces are isomorphic as normed spaces, are generalized Berwald metrics, i.e., there exists an affine connection, possibly with torsion, that preserves the Finsler function

  18. Spacetime Metrics from Gauge Potentials

    Directory of Open Access Journals (Sweden)

    Ettore Minguzzi

    2014-03-01

    Full Text Available I present an approach to gravity in which the spacetime metric is constructed from a non-Abelian gauge potential with values in the Lie algebra of the group U(2) (or the Lie algebra of quaternions). If the curvature of this potential vanishes, the metric reduces to a canonical curved background form reminiscent of the Friedmann S3 cosmological metric.

  19. A Unification of G-Metric, Partial Metric, and b-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Nawab Hussain

    2014-01-01

    Full Text Available Using the concepts of G-metric, partial metric, and b-metric spaces, we define a new concept of generalized partial b-metric space. Topological and structural properties of the new space are investigated and certain fixed point theorems for contractive mappings in such spaces are obtained. Some examples are provided here to illustrate the usability of the obtained results.
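
    For quick reference, the two relaxations of the metric axioms that this unification builds on can be written as follows. These are the standard textbook forms, stated here as background rather than quoted from the paper.

    % b-metric: triangle inequality relaxed by a coefficient s >= 1
    d(x,z) \le s\,\bigl[ d(x,y) + d(y,z) \bigr], \qquad s \ge 1
    % partial metric: non-zero self-distance is allowed
    p(x,x) \le p(x,y), \qquad p(x,z) \le p(x,y) + p(y,z) - p(y,y)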

  20. Random Kähler metrics

    International Nuclear Information System (INIS)

    Ferrari, Frank; Klevtsov, Semyon; Zelditch, Steve

    2013-01-01

    The purpose of this article is to propose a new method to define and calculate path integrals over metrics on a Kähler manifold. The main idea is to use finite dimensional spaces of Bergman metrics, as an approximation to the full space of Kähler metrics. We use the theory of large deviations to decide when a sequence of probability measures on the spaces of Bergman metrics tends to a limit measure on the space of all Kähler metrics. Several examples are considered.

  1. Standard for metric practice

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This standard gives guidance for application of the modernized metric system in the United States. The International System of Units, developed and maintained by the General Conference on Weights and Measures (abbreviated CGPM from the official French name Conference Generale des Poids et Mesures), is intended as a basis for worldwide standardization of measurement units. The name International System of Units and the international abbreviation SI were adopted by the 11th CGPM in 1960. SI is a complete, coherent system that is being universally adopted

  2. Enhanced Data Representation by Kernel Metric Learning for Dementia Diagnosis

    Directory of Open Access Journals (Sweden)

    David Cárdenas-Peña

    2017-07-01

    Full Text Available Alzheimer's disease (AD) is the kind of dementia that affects the most people around the world. Therefore, an early identification supporting effective treatments is required to increase the life quality of a wide number of patients. Recently, computer-aided diagnosis tools for dementia using Magnetic Resonance Imaging scans have been successfully proposed to discriminate between patients with AD, mild cognitive impairment, and healthy controls. Most of the attention has been given to the clinical data, provided by initiatives such as the ADNI, supporting reliable research on intervention, prevention, and treatment of AD. Therefore, there is a need for improving the performance of classification machines. In this paper, we propose a kernel framework for learning metrics that enhances conventional machines and supports the diagnosis of dementia. Our framework aims at building discriminative spaces through the maximization of the centered kernel alignment function, aiming at improving the discrimination of the three considered neurological classes. The proposed metric learning performance is evaluated on the widely-known ADNI database using three supervised classification machines (k-nn, SVM and NNs) for multi-class and bi-class scenarios from structural MRIs. Specifically, from the ADNI collection, 286 AD patients, 379 MCI patients and 231 healthy controls are used for development and validation of our proposed metric learning framework. For the experimental validation, we split the data into two subsets: 30% of subjects used as a blindfolded assessment and 70% employed for parameter tuning. Then, in the preprocessing stage, a total of 310 morphological measurements are automatically extracted from each structural MRI scan by the FreeSurfer software package and concatenated to build an input feature matrix. Obtained test performance results show that including supervised metric learning improves the compared baseline classifiers in both scenarios. In the multi
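
    A minimal numpy sketch of centered kernel alignment, the alignment function that the framework above maximizes. The toy feature kernel and label kernel below are assumptions; the actual pipeline would build them from the 310 FreeSurfer measurements and the diagnostic labels.

    import numpy as np

    def centered_kernel_alignment(K, L):
        n = K.shape[0]
        H = np.eye(n) - np.ones((n, n)) / n              # centering matrix
        Kc, Lc = H @ K @ H, H @ L @ H                    # centered kernels
        return np.sum(Kc * Lc) / (np.linalg.norm(Kc) * np.linalg.norm(Lc))

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 10))                        # toy subject features
    y = rng.integers(0, 3, size=50)                      # toy diagnostic labels (3 classes)
    K = X @ X.T                                          # linear feature kernel
    L = (y[:, None] == y[None, :]).astype(float)         # ideal label kernel
    print(centered_kernel_alignment(K, L))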

  3. Completion of a Dislocated Metric Space

    Directory of Open Access Journals (Sweden)

    P. Sumati Kumari

    2015-01-01

    Full Text Available We provide a construction for the completion of a dislocated metric space (abbreviated d-metric space); we also prove that the completion of the metric associated with a d-metric coincides with the metric associated with the completion of the d-metric.

  4. Understanding Traditional Research Impact Metrics.

    Science.gov (United States)

    Butler, Joseph S; Sebastian, Arjun S; Kaye, I David; Wagner, Scott C; Morrissey, Patrick B; Schroeder, Gregory D; Kepler, Christopher K; Vaccaro, Alexander R

    2017-05-01

    Traditionally, the success of a researcher has been judged by the number of publications he or she has published in peer-reviewed, indexed, high-impact journals. However, to quantify the impact of research in the wider scientific community, a number of traditional metrics have been used, including the Impact Factor, SCImago Journal Rank, Eigenfactor Score, and Article Influence Score. This article attempts to provide a broad overview of the main traditional impact metrics that have been used to assess scholarly output and research impact. We determine that there is no perfect all-encompassing metric to measure research impact, and, in the modern era, no single traditional metric is capable of accommodating all facets of research impact. Academics and researchers should be aware of the advantages and limitations of traditional metrics and should be judicious when selecting any metrics for an objective assessment of scholarly output and research impact.
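
    As a worked example of one of the traditional metrics reviewed above, the two-year Journal Impact Factor divides the citations received in year Y by items published in the two preceding years by the number of citable items published in those years. The numbers below are invented.

    def two_year_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
        # IF(Y) = citations in Y to items from Y-1 and Y-2, divided by citable items in Y-1 and Y-2
        return citations_to_prev_two_years / citable_items_prev_two_years

    print(two_year_impact_factor(citations_to_prev_two_years=600,
                                 citable_items_prev_two_years=200))   # -> 3.0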

  5. Tomographic reconstruction of quantum metrics

    Science.gov (United States)

    Laudato, Marco; Marmo, Giuseppe; Mele, Fabio M.; Ventriglia, Franco; Vitale, Patrizia

    2018-02-01

    In the framework of quantum information geometry we investigate the relationship between monotone metric tensors uniquely defined on the space of quantum tomograms, once the tomographic scheme is chosen, and monotone quantum metrics on the space of quantum states, classified by operator monotone functions, according to the Petz classification theorem. We show that different metrics can be related through a change in the tomographic map and prove that there exists a bijective relation between monotone quantum metrics associated with different operator monotone functions. Such a bijective relation is uniquely defined in terms of solutions of a first order second degree differential equation for the parameters of the involved tomographic maps. We first exhibit an example of a non-linear tomographic map that connects a monotone metric with a new one, which is not monotone. Then we provide a second example where two monotone metrics are uniquely related through their tomographic parameters.

  6. Metric adjusted skew information

    DEFF Research Database (Denmark)

    Hansen, Frank

    2008-01-01

    We extend the concept of Wigner-Yanase-Dyson skew information to something we call "metric adjusted skew information" (of a state with respect to a conserved observable). This "skew information" is intended to be a non-negative quantity bounded by the variance (of an observable in a state) that vanishes for observables commuting with the state. We show that the skew information is a convex function on the manifold of states. It also satisfies other requirements, proposed by Wigner and Yanase, for an effective measure-of-information content of a state relative to a conserved observable. We establish a connection between the geometrical formulation of quantum statistics as proposed by Chentsov and Morozova and measures of quantum information as introduced by Wigner and Yanase and extended in this article. We show that the set of normalized Morozova-Chentsov functions describing the possible...
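
    For reference, the Wigner-Yanase-Dyson skew information that this record generalizes is usually written as below (standard form, stated as background rather than quoted from the paper); [·,·] denotes the commutator and p = 1/2 recovers the Wigner-Yanase case.

    I_p(\rho, A) \;=\; -\tfrac{1}{2}\,\mathrm{Tr}\bigl( [\rho^{p}, A]\,[\rho^{1-p}, A] \bigr), \qquad 0 < p < 1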

  7. Random metric spaces and universality

    International Nuclear Information System (INIS)

    Vershik, A M

    2004-01-01

    The notion of random metric space is defined, and it is proved that such a space is isometric to the Urysohn universal metric space with probability one. The main technique is the study of universal and random distance matrices; properties of metric (in particular, universal) spaces are related to properties of distance matrices. Examples of other categories in which randomness and universality coincide (graphs, and so on) are given

  8. The metric system: An introduction

    Science.gov (United States)

    Lumley, Susan M.

    On 13 Jul. 1992, Deputy Director Duane Sewell restated the Laboratory's policy on conversion to the metric system which was established in 1974. Sewell's memo announced the Laboratory's intention to continue metric conversion on a reasonable and cost effective basis. Copies of the 1974 and 1992 Administrative Memos are contained in the Appendix. There are three primary reasons behind the Laboratory's conversion to the metric system. First, Public Law 100-418, passed in 1988, states that by the end of fiscal year 1992 the Federal Government must begin using metric units in grants, procurements, and other business transactions. Second, on 25 Jul. 1991, President George Bush signed Executive Order 12770 which urged Federal agencies to expedite conversion to metric units. Third, the contract between the University of California and the Department of Energy calls for the Laboratory to convert to the metric system. Thus, conversion to the metric system is a legal requirement and a contractual mandate with the University of California. Public Law 100-418 and Executive Order 12770 are discussed in more detail later in this section, but first we examine the reasons behind the nation's conversion to the metric system. The second part of this report is on applying the metric system.

  9. Two classes of metric spaces

    Directory of Open Access Journals (Sweden)

    Isabel Garrido

    2016-04-01

    Full Text Available The class of metric spaces (X,d) known as small-determined spaces, introduced by Garrido and Jaramillo, are properly defined by means of some type of real-valued Lipschitz functions on X. On the other hand, B-simple metric spaces introduced by Hejcman are defined in terms of some kind of bornologies of bounded subsets of X. In this note we present a common framework where both classes of metric spaces can be studied which allows us to see not only the relationships between them but also to obtain new internal characterizations of these metric properties.

  10. Metric-adjusted skew information

    DEFF Research Database (Denmark)

    Liang, Cai; Hansen, Frank

    2010-01-01

    We give a truly elementary proof of the convexity of metric-adjusted skew information following an idea of Effros. We extend earlier results on weak forms of superadditivity to general metric-adjusted skew information. Recently, Luo and Zhang introduced the notion of semi-quantum states on a bipartite system and proved superadditivity of the Wigner-Yanase-Dyson skew information for such states. We extend this result to the general metric-adjusted skew information. We finally show that a recently introduced extension to parameter values 1 < p ≤ 2 of the Wigner-Yanase-Dyson skew information is a special case of (unbounded) metric-adjusted skew information.

  11. Predicting persistence in the sediment compartment with a new automatic software based on the k-Nearest Neighbor (k-NN) algorithm.

    Science.gov (United States)

    Manganaro, Alberto; Pizzo, Fabiola; Lombardo, Anna; Pogliaghi, Alberto; Benfenati, Emilio

    2016-02-01

    The ability of a substance to resist degradation and persist in the environment needs to be readily identified in order to protect the environment and human health. Many regulations require the assessment of persistence for substances commonly manufactured and marketed. Besides laboratory-based testing methods, in silico tools may be used to obtain a computational prediction of persistence. We present a new program to develop k-Nearest Neighbor (k-NN) models. The k-NN algorithm is a similarity-based approach that predicts the property of a substance in relation to the experimental data for its most similar compounds. We employed this software to identify persistence in the sediment compartment. Data on half-life (HL) in sediment were obtained from different sources and, after careful data pruning, the final dataset, containing 297 organic compounds, was divided into four experimental classes. We developed several models giving satisfactory performances, considering that both the training and test set accuracy ranged between 0.90 and 0.96. We finally selected one model which will be made available in the near future in the freely available software platform VEGA. This model offers a valuable in silico tool that may be very useful for fast and inexpensive screening. Copyright © 2015 Elsevier Ltd. All rights reserved.
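
    A minimal sketch of the similarity-based (k-NN) idea described above: a query compound is assigned the half-life class most common among its k most similar training compounds. The descriptors, the Euclidean similarity and k = 4 are toy stand-ins, not the actual VEGA implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    train_desc = rng.random((297, 20))                 # toy molecular descriptors
    train_cls = rng.integers(0, 4, size=297)           # the four experimental HL classes

    def predict_hl_class(query_desc, k=4):
        d = np.linalg.norm(train_desc - query_desc, axis=1)   # similarity as Euclidean distance
        neighbors = np.argsort(d)[:k]                         # k most similar training compounds
        return np.bincount(train_cls[neighbors]).argmax()     # most common class among them

    print(predict_hl_class(rng.random(20)))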

  12. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    Science.gov (United States)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  13. On Indistinguishability Operators, Fuzzy Metrics and Modular Metrics

    Directory of Open Access Journals (Sweden)

    Juan-José Miñana

    2017-12-01

    Full Text Available The notion of indistinguishability operator was introduced by Trillas, E. in 1982, with the aim of fuzzifying the crisp notion of equivalence relation. Such operators allow for measuring the similarity between objects when there is a limitation on the accuracy of the performed measurement or a certain degree of similarity can be only determined between the objects being compared. Since Trillas introduced such kind of operators, many authors have studied their properties and applications. In particular, an intensive research line is focused on the metric behavior of indistinguishability operators. Specifically, the existence of a duality between metrics and indistinguishability operators has been explored. In this direction, a technique to generate metrics from indistinguishability operators, and vice versa, has been developed by several authors in the literature. Nowadays, such a measurement of similarity is provided by the so-called fuzzy metrics when the degree of similarity between objects is measured relative to a parameter. The main purpose of this paper is to extend the notion of indistinguishability operator in such a way that the measurements of similarity are relative to a parameter and, thus, classical indistinguishability operators and fuzzy metrics can be retrieved as a particular case. Moreover, we discuss the relationship between the new operators and metrics. Concretely, we prove the existence of a duality between them and the so-called modular metrics, which provide a dissimilarity measurement between objects relative to a parameter. The new duality relationship allows us, on the one hand, to introduce a technique for generating the new indistinguishability operators from modular metrics and vice versa and, on the other hand, to derive, as a consequence, a technique for generating fuzzy metrics from modular metrics and vice versa. Furthermore, we yield examples that illustrate the new results.

  14. Metric representation of DNA sequences.

    Science.gov (United States)

    Wu, Z B

    2000-07-01

    A metric representation of DNA sequences is borrowed from symbolic dynamics. In view of this method, the pattern seen in the chaos game representation of DNA sequences is explained as the suppression of certain nucleotide strings in the DNA sequences. Frequencies of short nucleotide strings and suppression of the shortest ones in the DNA sequences can be determined by using the metric representation.

  15. Extending cosmology: the metric approach

    OpenAIRE

    Mendoza, S.

    2012-01-01

    Comment: 2012, Extending Cosmology: The Metric Approach, Open Questions in Cosmology; Review article for an Intech "Open questions in cosmology" book chapter (19 pages, 3 figures). Available from: http://www.intechopen.com/books/open-questions-in-cosmology/extending-cosmology-the-metric-approach

  16. Metrics for Stage Lighting Technology.

    Science.gov (United States)

    Cooper, Gloria S., Ed; Magisos, Joel H., Ed.

    Designed to meet the job-related metric measurement needs of stage lighting technology students, this instructional package is one of five for the arts and humanities occupations cluster, part of a set of 55 packages for metric instruction in different occupations. The package is intended for students who already know the occupational terminology,…

  17. Based on Similarity Metric Learning for Semi-Supervised Clustering

    Directory of Open Access Journals (Sweden)

    Wei QIU

    2014-08-01

    Full Text Available Semi-supervised clustering employs a small amount of labeled data to aid unsupervised learning. The focus of this paper is on metric learning, with particular interest in incorporating side information to make it semi-supervised. The study is primarily motivated by an application: face-image clustering. The paper introduces metric learning and semi-supervised clustering, and a similarity metric learning method that adapts the underlying similarity metric used by the clustering algorithm. It provides new methods for the two approaches and presents a new semi-supervised clustering algorithm that integrates both techniques in a uniform, principled framework. Experimental results demonstrate that the unified approach produces better clusters than either individual approach as well as previously proposed semi-supervised clustering algorithms. The paper closes with a discussion of experiments on face-image clustering and of future work.
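    A minimal sketch of the general idea only (not the paper's algorithm): a diagonal metric is learned from must-link/cannot-link side information and then used for clustering. The feature matrix, constraint pairs, and weighting rule are all assumptions for illustration.

    ```python
    # Sketch, assuming pairwise constraints: down-weight dimensions along which
    # must-link pairs differ, up-weight dimensions along which cannot-link pairs
    # differ, then cluster in the rescaled space.
    import numpy as np
    from sklearn.cluster import KMeans

    def learn_diagonal_weights(X, must_link, cannot_link, eps=1e-6):
        ml = np.mean([(X[i] - X[j]) ** 2 for i, j in must_link], axis=0)
        cl = np.mean([(X[i] - X[j]) ** 2 for i, j in cannot_link], axis=0)
        w = (cl + eps) / (ml + eps)          # between-pair spread / within-pair spread
        return w / w.sum() * X.shape[1]

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))            # placeholder face-image features
    must_link = [(0, 1), (2, 3)]             # example side information
    cannot_link = [(0, 4), (1, 5)]

    w = learn_diagonal_weights(X, must_link, cannot_link)
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X * np.sqrt(w))
    ```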

  18. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces

  19. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
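    The abstract does not spell out the metric itself, so the sketch below only illustrates one common information-theoretic way to compare a segmentation against ground truth (variation of information); it is an assumption for illustration, not the authors' measure.

    ```python
    # Illustrative only: variation of information, VI = H(A) + H(B) - 2 I(A; B),
    # is a standard information-theoretic distance between two labelings of the same image.
    import numpy as np
    from sklearn.metrics import mutual_info_score

    def variation_of_information(seg_a, seg_b):
        a, b = np.ravel(seg_a), np.ravel(seg_b)
        h_a = mutual_info_score(a, a)        # H(A) = I(A; A)
        h_b = mutual_info_score(b, b)        # H(B) = I(B; B)
        return h_a + h_b - 2 * mutual_info_score(a, b)

    rng = np.random.default_rng(0)
    truth = rng.integers(0, 4, size=(32, 32))                  # placeholder ground-truth labels
    result = np.where(rng.random((32, 32)) < 0.9, truth, 0)    # noisy segmentation output
    print(variation_of_information(truth, result))             # 0 means identical label structure
    ```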

  20. Weyl metrics and wormholes

    Science.gov (United States)

    Gibbons, Gary W.; Volkov, Mikhail S.

    2017-05-01

    We study solutions obtained via applying dualities and complexifications to the vacuum Weyl metrics generated by massive rods and by point masses. Rescaling them and extending to complex parameter values yields axially symmetric vacuum solutions containing singularities along circles that can be viewed as singular matter sources. These solutions have wormhole topology with several asymptotic regions interconnected by throats and their sources can be viewed as thin rings of negative tension encircling the throats. For a particular value of the ring tension the geometry becomes exactly flat although the topology remains non-trivial, so that the rings literally produce holes in flat space. To create a single ring wormhole of one metre radius one needs a negative energy equivalent to the mass of Jupiter. Further duality transformations dress the rings with the scalar field, either conventional or phantom. This gives rise to large classes of static, axially symmetric solutions, presumably including all previously known solutions for a gravity-coupled massless scalar field, as for example the spherically symmetric Bronnikov-Ellis wormholes with phantom scalar. The multi-wormholes contain infinite struts everywhere at the symmetry axes, apart from solutions with locally flat geometry.

  1. Development of Quality Metrics to Evaluate Pediatric Hematologic Oncology Care in the Outpatient Setting.

    Science.gov (United States)

    Teichman, Jennifer; Punnett, Angela; Gupta, Sumit

    2017-03-01

    There are currently no clinic-level quality of care metrics for outpatient pediatric oncology. We sought to develop a list of quality of care metrics for a leukemia-lymphoma (LL) clinic using a consensus process that can be adapted to other clinic settings. Medline-Ovid was searched for quality indicators relevant to pediatric oncology. A provisional list of 27 metrics spanning 7 categories was generated and circulated to a Consensus Group (CG) of LL clinic medical and nursing staff. A Delphi process comprising 2 rounds of ranking generated consensus on a final list of metrics. Consensus was defined as ≥70% of CG members ranking a metric within 2 consecutive scores. In round 1, 19 of 27 (70%) metrics reached consensus. CG members' comments resulted in 4 new metrics and revision of 8 original metrics. All 31 metrics were included in round 2. Twenty-four of 31 (77%) metrics reached consensus after round 2. Thirteen were chosen for the final list based on highest scores and eliminating redundancy. These included: patient communication/education; pain management; delay in access to clinical psychology, documentation of chemotherapy, of diagnosis/extent of disease, of treatment plan and of follow-up scheme; referral to transplant; radiation exposure during follow-up; delay until chemotherapy; clinic cancellations; and school attendance. This study provides a model of quality metric development that other clinics may use for local use. The final metrics will be used for ongoing quality improvement in the LL clinic.

  2. Metrics for Measuring Data Quality - Foundations for an Economic Oriented Management of Data Quality

    OpenAIRE

    Heinrich, Bernd; Kaiser, Marcus; Klier, Mathias

    2007-01-01

    The article develops metrics for an economic oriented management of data quality. Two data quality dimensions are focussed: consistency and timeliness. For deriving adequate metrics several requirements are stated (e. g. normalisation, cardinality, adaptivity, interpretability). Then the authors discuss existing approaches for measuring data quality and illustrate their weaknesses. Based upon these considerations, new metrics are developed for the data quality dimensions consistency and timel...

  3. A Metric for Heterotic Moduli

    Science.gov (United States)

    Candelas, Philip; de la Ossa, Xenia; McOrist, Jock

    2017-12-01

    Heterotic vacua of string theory are realised, at large radius, by a compact threefold with vanishing first Chern class together with a choice of stable holomorphic vector bundle. These form a wide class of potentially realistic four-dimensional vacua of string theory. Despite all their phenomenological promise, there is little understanding of the metric on the moduli space of these. What is sought is the analogue of special geometry for these vacua. The metric on the moduli space is important in phenomenology as it normalises D-terms and Yukawa couplings. It is also of interest in mathematics, since it generalises the metric, first found by Kobayashi, on the space of gauge field connections, to a more general context. Here we construct this metric, correct to first order in α′, in two ways: first by postulating a metric that is invariant under background gauge transformations of the gauge field, and also by dimensionally reducing heterotic supergravity. These methods agree and the resulting metric is Kähler, as is required by supersymmetry. Checking the metric is Kähler is intricate and the anomaly cancellation equation for the H field plays an essential role. The Kähler potential nevertheless takes a remarkably simple form: it is the Kähler potential of special geometry with the Kähler form replaced by the α′-corrected hermitian form.

  4. Colonoscopy quality: metrics and implementation.

    Science.gov (United States)

    Calderwood, Audrey H; Jacobson, Brian C

    2013-09-01

    Colonoscopy is an excellent area for quality improvement because it is high volume, has significant associated risk and expense, and there is evidence that variability in its performance affects outcomes. The best end point for validation of quality metrics in colonoscopy is colorectal cancer incidence and mortality, but a more readily accessible metric is the adenoma detection rate. Fourteen quality metrics were proposed in 2006, and these are described in this article. Implementation of quality improvement initiatives involves rapid assessments and changes on an iterative basis, and can be done at the individual, group, or facility level. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Daylight metrics and energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  6. Next-Generation Metrics: Responsible Metrics & Evaluation for Open Science

    Energy Technology Data Exchange (ETDEWEB)

    Wilsdon, J.; Bar-Ilan, J.; Peters, I.; Wouters, P.

    2016-07-01

    Metrics evoke a mixed reaction from the research community. A commitment to using data to inform decisions makes some enthusiastic about the prospect of granular, real-time analysis of research and its wider impacts. Yet we only have to look at the blunt use of metrics such as journal impact factors, h-indices and grant income targets, to be reminded of the pitfalls. Some of the most precious qualities of academic culture resist simple quantification, and individual indicators often struggle to do justice to the richness and plurality of research. Too often, poorly designed evaluation criteria are “dominating minds, distorting behaviour and determining careers” (Lawrence, 2007). Metrics hold real power: they are constitutive of values, identities and livelihoods. How to exercise that power to more positive ends has been the focus of several recent and complementary initiatives, including the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and The Metric Tide (a UK government review of the role of metrics in research management and assessment). Building on these initiatives, the European Commission, under its new Open Science Policy Platform, is now looking to develop a framework for responsible metrics for research management and evaluation, which can be incorporated into the successor framework to Horizon 2020. (Author)

  7. Management Infrastructure and Metrics Definition

    OpenAIRE

    Loomis , Charles

    2010-01-01

    This document describes the project management bodies, the software development process, and the tools to support them. It also contains a description of the metrics that will be collected over the lifetime of the project to gauge progress.

  8. Hyperbolic geometry for colour metrics.

    Science.gov (United States)

    Farup, Ivar

    2014-05-19

    It is well established from both colour difference and colour order perspectives that the colour space cannot be Euclidean. In spite of this, most colour spaces still in use today are Euclidean, and the best Euclidean colour metrics are performing comparably to state-of-the-art non-Euclidean metrics. In this paper, it is shown that a transformation from Euclidean to hyperbolic geometry (i.e., constant negative curvature) for the chromatic plane can significantly improve the performance of Euclidean colour metrics to the point where they are statistically significantly better than state-of-the-art non-Euclidean metrics on standard data sets. The resulting hyperbolic geometry nicely models both qualitatively and quantitatively the hue super-importance phenomenon observed in colour order systems.
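    As one concrete instance of constant negative curvature (an assumption for illustration; the paper's exact transformation and lightness handling may differ), the chromatic plane can be mapped into the Poincaré disk and colour differences measured hyperbolically. The scale factor below is a hypothetical choice.

    ```python
    # Sketch under assumptions: scale (a*, b*) chroma coordinates into the unit
    # (Poincare) disk and use the hyperbolic distance
    #   d(u, v) = arccosh(1 + 2|u - v|^2 / ((1 - |u|^2)(1 - |v|^2))).
    import numpy as np

    def hyperbolic_chroma_distance(ab1, ab2, scale=200.0):
        u, v = np.asarray(ab1) / scale, np.asarray(ab2) / scale   # map into the unit disk
        num = 2 * np.sum((u - v) ** 2)
        den = (1 - np.sum(u ** 2)) * (1 - np.sum(v ** 2))
        return np.arccosh(1 + num / den)

    print(hyperbolic_chroma_distance([20.0, 30.0], [25.0, 28.0]))
    ```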

  9. Using TRACI for Sustainability Metrics

    Science.gov (United States)

    TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, has been developed for sustainability metrics, life cycle impact assessment, and product and process design impact assessment for developing increasingly sustainable products, processes,...

  10. Let's Make Metric Ice Cream

    Science.gov (United States)

    Zimmerman, Marianna

    1975-01-01

    Describes a classroom activity which involved sixth grade students in a learning situation including making ice cream, safety procedures in a science laboratory, calibrating a thermometer, using metric units of volume and mass. (EB)

  11. Experiential space is hardly metric

    Czech Academy of Sciences Publication Activity Database

    Šikl, Radovan; Šimeček, Michal; Lukavský, Jiří

    2008-01-01

    Roč. 2008, č. 37 (2008), s. 58-58 ISSN 0301-0066. [European Conference on Visual Perception. 24.08-28.08.2008, Utrecht] R&D Projects: GA ČR GA406/07/1676 Institutional research plan: CEZ:AV0Z70250504 Keywords : visual space perception * metric and non-metric perceptual judgments * ecological validity Subject RIV: AN - Psychology

  12. Phantom metrics with Killing spinors

    Directory of Open Access Journals (Sweden)

    W.A. Sabra

    2015-11-01

    Full Text Available We study metric solutions of Einstein–anti-Maxwell theory admitting Killing spinors. The analogue of the IWP metric which admits a space-like Killing vector is found and is expressed in terms of a complex function satisfying the wave equation in flat (2+1)-dimensional space–time. As examples, electric and magnetic Kasner spaces are constructed by allowing the solution to depend only on the time coordinate. Euclidean solutions are also presented.

  13. Towards a Visual Quality Metric for Digital Video

    Science.gov (United States)

    Watson, Andrew B.

    1998-01-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. The challenge of video quality metrics is to extend these simplified models to temporal signals as well. In this presentation I will discuss a number of the issues that must be resolved in the design of effective video quality metrics. Among these are spatial, temporal, and chromatic sensitivity and their interactions, visual masking, and implementation complexity. I will also touch on the question of how to evaluate the performance of these metrics.

  14. Method Points: towards a metric for method complexity

    Directory of Open Access Journals (Sweden)

    Graham McLeod

    1998-11-01

    Full Text Available A metric for method complexity is proposed as an aid to choosing between competing methods, as well as in validating the effects of method integration or the products of method engineering work. It is based upon a generic method representation model previously developed by the author and adaptation of concepts used in the popular Function Point metric for system size. The proposed technique is illustrated by comparing two popular I.E. deliverables with counterparts in the object oriented Unified Modeling Language (UML). The paper recommends ways to improve the practical adoption of new methods.

  15. Non-metric chaotic inflation

    Energy Technology Data Exchange (ETDEWEB)

    Enqvist, Kari [Physics Department, University of Helsinki, and Helsinki Institute of Physics, FIN-00014 Helsinki (Finland); Koivisto, Tomi [Institute for Theoretical Physics and Spinoza Institute, Leuvenlaan 4, 3584 CE Utrecht (Netherlands); Rigopoulos, Gerasimos, E-mail: kari.enqvist@helsinki.fi, E-mail: T.S.Koivisto@astro.uio.no, E-mail: rigopoulos@physik.rwth-aachen.de [Institut für Theoretische Teilchenphysik und Kosmologie, RWTH Aachen University, D-52056 Aachen (Germany)

    2012-05-01

    We consider inflation within the context of what is arguably the simplest non-metric extension of Einstein gravity. There non-metricity is described by a single graviscalar field with a non-minimal kinetic coupling to the inflaton field Ψ, parameterized by a single parameter γ. There is a simple equivalent description in terms of a massless field and an inflaton with a modified potential. We discuss the implications of non-metricity for chaotic inflation and find that it significantly alters the inflaton dynamics for field values Ψ ≳ M_P/γ, dramatically changing the qualitative behaviour in this regime. In the equivalent single-field description this is described as a cuspy potential that forms a barrier beyond which the inflaton becomes a ghost field. This imposes an upper bound on the possible number of e-folds. For the simplest chaotic inflation models, the spectral index and the tensor-to-scalar ratio receive small corrections dependent on the non-metricity parameter. We also argue that significant post-inflationary non-metricity may be generated.

  16. Symmetries of Taub-NUT dual metrics

    International Nuclear Information System (INIS)

    Baleanu, D.; Codoban, S.

    1998-01-01

    Recently geometric duality was analyzed for a metric which admits Killing tensors. An interesting example arises when the manifold has Killing-Yano tensors. The symmetries of the dual metrics in the case of Taub-NUT metric are investigated. Generic and non-generic symmetries of dual Taub-NUT metric are analyzed

  17. Requirement Metrics for Risk Identification

    Science.gov (United States)

    Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence

    1996-01-01

    The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects to improve the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.

  18. Moduli spaces of riemannian metrics

    CERN Document Server

    Tuschmann, Wilderich

    2015-01-01

    This book studies certain spaces of Riemannian metrics on both compact and non-compact manifolds. These spaces are defined by various sign-based curvature conditions, with special attention paid to positive scalar curvature and non-negative sectional curvature, though we also consider positive Ricci and non-positive sectional curvature. If we form the quotient of such a space of metrics under the action of the diffeomorphism group (or possibly a subgroup) we obtain a moduli space. Understanding the topology of both the original space of metrics and the corresponding moduli space form the central theme of this book. For example, what can be said about the connectedness or the various homotopy groups of such spaces? We explore the major results in the area, but provide sufficient background so that a non-expert with a grounding in Riemannian geometry can access this growing area of research.

  19. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  20. Metrics for measuring net-centric data strategy implementation

    Science.gov (United States)

    Kroculick, Joseph B.

    2010-04-01

    An enterprise data strategy outlines an organization's vision and objectives for improved collection and use of data. We propose generic metrics and quantifiable measures for each of the DoD Net-Centric Data Strategy (NCDS) data goals. Data strategy metrics can be adapted to the business processes of an enterprise and the needs of stakeholders in leveraging the organization's data assets to provide for more effective decision making. Generic metrics are applied to a specific application where logistics supply and transportation data is integrated across multiple functional groups. A dashboard presents a multidimensional view of the current progress to a state where logistics data is shared in a timely and seamless manner among users, applications, and systems.

  1. Eye Tracking Metrics for Workload Estimation in Flight Deck Operation

    Science.gov (United States)

    Ellis, Kyle; Schnell, Thomas

    2010-01-01

    Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set also provides a general model of human eye movement behavior, and thus, ostensibly, of visual attention distribution in the cockpit, for approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.

  2. METRICS FOR DYNAMIC SCALING OF DATABASE IN CLOUDS

    Directory of Open Access Journals (Sweden)

    Alexander V. Boichenko

    2013-01-01

    Full Text Available This article analyzes the main methods of scaling databases (replication, sharding) and their support in popular relational databases and NoSQL solutions with different data models: document-oriented, key-value, column-oriented, and graph. The article assesses the capabilities of modern cloud-based solutions and gives a model for organizing dynamic scaling in a cloud infrastructure. Different types of metrics are analyzed, including the basic metrics that characterize the operating parameters of the database technology, and the integral metrics needed to implement adaptive algorithms for dynamically scaling databases in a cloud infrastructure are defined. This article was prepared with the support of RFBR grant № 13-07-00749.

  3. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  4. The uniqueness of the Fisher metric as information metric

    Czech Academy of Sciences Publication Activity Database

    Le, Hong-Van

    2017-01-01

    Roč. 69, č. 4 (2017), s. 879-896 ISSN 0020-3157 Institutional support: RVO:67985840 Keywords : Chentsov’s theorem * mixed topology * monotonicity of the Fisher metric Subject RIV: BA - General Mathematics OBOR OECD: Pure mathematics Impact factor: 1.049, year: 2016 https://link.springer.com/article/10.1007%2Fs10463-016-0562-0

  5. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  6. Invariant metrics for Hamiltonian systems

    International Nuclear Information System (INIS)

    Rangarajan, G.; Dragt, A.J.; Neri, F.

    1991-05-01

    In this paper, invariant metrics are constructed for Hamiltonian systems. These metrics give rise to norms on the space of homogeneous polynomials of phase-space variables. For an accelerator lattice described by a Hamiltonian, these norms characterize the nonlinear content of the lattice. Therefore, the performance of the lattice can be improved by minimizing the norm as a function of parameters describing the beam-line elements in the lattice. A four-fold increase in the dynamic aperture of a model FODO cell is obtained using this procedure. 7 refs

  7. Thermodynamic Metrics and Optimal Paths

    Energy Technology Data Exchange (ETDEWEB)

    Sivak, David; Crooks, Gavin

    2012-05-08

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
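    For reference, a standard summary of the thermodynamic-length formalism the abstract refers to (notation assumed here: λ(t) is the control-parameter protocol on [0, τ], ζ the friction tensor, ⟨W_ex⟩ the mean excess work in linear response):

    ```latex
    % Thermodynamic length L and divergence C of a protocol; by the Cauchy-Schwarz
    % inequality C >= L^2 / tau, so near-equilibrium optimal protocols follow geodesics
    % of the metric induced by the friction tensor.
    \[
      \mathcal{L} = \int_0^{\tau} \sqrt{\dot{\lambda}^{\mathsf T}\,\zeta(\lambda)\,\dot{\lambda}}\;\mathrm{d}t,
      \qquad
      \mathcal{C} = \int_0^{\tau} \dot{\lambda}^{\mathsf T}\,\zeta(\lambda)\,\dot{\lambda}\;\mathrm{d}t,
      \qquad
      \langle W_{\mathrm{ex}} \rangle \approx \mathcal{C} \;\ge\; \frac{\mathcal{L}^{2}}{\tau}.
    \]
    ```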

  8. Technical Privacy Metrics: a Systematic Survey

    OpenAIRE

    Wagner, Isabel; Eckhoff, David

    2015-01-01

    The goal of privacy metrics is to measure the degree of privacy enjoyed by users in a system and the amount of protection offered by privacy-enhancing technologies. In this way, privacy metrics contribute to improving user privacy in the digital world. The diversity and complexity of privacy metrics in the literature makes an informed choice of metrics challenging. As a result, redundant new metrics are proposed frequently, and privacy studies are often incomparable. In this survey we allevia...

  9. Remarks on G-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Bessem Samet

    2013-01-01

    Full Text Available In 2005, Mustafa and Sims (2006) introduced and studied a new class of generalized metric spaces, which are called G-metric spaces, as a generalization of metric spaces. We establish some useful propositions to show that many fixed point theorems on (nonsymmetric) G-metric spaces given recently by many authors follow directly from well-known theorems on metric spaces. Our technique can be easily extended to other results as shown in application.

  10. Metrics correlation and analysis service (MCAS)

    Energy Technology Data Exchange (ETDEWEB)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya; /Fermilab

    2009-05-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  11. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2010-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information pool is disorganized, it is a difficult environment for business intelligence analysis i.e. troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by loosely coupled or fully decoupled middleware.

  12. Metrics correlation and analysis service (MCAS)

    International Nuclear Information System (INIS)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya

    2009-01-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e. troubleshooting, incident investigation and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  13. Obtención de polvos cerámicos de BNKT-KNN por el método Pechini

    Directory of Open Access Journals (Sweden)

    Yasnó, J. P.

    2013-10-01

    Full Text Available The Pechini method was used to obtain fine, single-phase ceramic powders of the lead-free ferroelectric system 0.97[(Bi1/2Na1/2)1-x(Bi1/2K1/2)xTiO3]-0.03[(Na1/2K1/2)NbO3], or BNKT-KNN (x = 0.00, 0.18, 0.21, 0.24, 0.27). The method yielded powders with 100 % perovskite phase, confirmed by X-ray diffraction, for all the studied stoichiometries at a temperature as low as 600 °C. The effects of the Na-K stoichiometry variation on the bonds present in the structure were determined using infrared spectroscopy, FT-IR. Irregular nanoparticles were observed by scanning electron microscopy.

  14. Obtención de polvos cerámicos de BNKT-KNN por el método Pechini

    Directory of Open Access Journals (Sweden)

    Yasnó, J. P.

    2013-08-01

    Full Text Available The Pechini method was used to obtain fine, single-phase ceramic powders of the lead-free ferroelectric system 0.97[(Bi1/2Na1/2)1-x(Bi1/2K1/2)xTiO3]-0.03[(Na1/2K1/2)NbO3], or BNKT-KNN (x = 0.00, 0.18, 0.21, 0.24, 0.27). The method yielded powders with 100 % perovskite phase, confirmed by X-ray diffraction, for all the studied stoichiometries at a temperature as low as 600 °C. The effects of the Na-K stoichiometry variation on the bonds present in the structure were determined using infrared spectroscopy, FT-IR. Irregular nanoparticles were observed by scanning electron microscopy.

  15. Differentiation of AmpC beta-lactamase binders vs. decoys using classification kNN QSAR modeling and application of the QSAR classifier to virtual screening

    Science.gov (United States)

    Hsieh, Jui-Hua; Wang, Xiang S.; Teotico, Denise; Golbraikh, Alexander; Tropsha, Alexander

    2008-09-01

    The use of inaccurate scoring functions in docking algorithms may result in the selection of compounds with high predicted binding affinity that nevertheless are known experimentally not to bind to the target receptor. Such falsely predicted binders have been termed `binding decoys'. We posed a question as to whether true binders and decoys could be distinguished based only on their structural chemical descriptors using approaches commonly used in ligand based drug design. We have applied the k-Nearest Neighbor ( kNN) classification QSAR approach to a dataset of compounds characterized as binders or binding decoys of AmpC beta-lactamase. Models were subjected to rigorous internal and external validation as part of our standard workflow and a special QSAR modeling scheme was employed that took into account the imbalanced ratio of inhibitors to non-binders (1:4) in this dataset. 342 predictive models were obtained with correct classification rate (CCR) for both training and test sets as high as 0.90 or higher. The prediction accuracy was as high as 100% (CCR = 1.00) for the external validation set composed of 10 compounds (5 true binders and 5 decoys) selected randomly from the original dataset. For an additional external set of 50 known non-binders, we have achieved the CCR of 0.87 using very conservative model applicability domain threshold. The validated binary kNN QSAR models were further employed for mining the NCGC AmpC screening dataset (69653 compounds). The consensus prediction of 64 compounds identified as screening hits in the AmpC PubChem assay disagreed with their annotation in PubChem but was in agreement with the results of secondary assays. At the same time, 15 compounds were identified as potential binders contrary to their annotation in PubChem. Five of them were tested experimentally and showed inhibitory activities in millimolar range with the highest binding constant Ki of 135 μM. Our studies suggest that validated QSAR models could complement
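    A minimal sketch of the kind of classification kNN QSAR setup described above (not the authors' workflow; the descriptor matrix, labels, and k are placeholders): the correct classification rate (CCR), the mean of per-class recalls, is the figure of merit used for the imbalanced binder/decoy ratio.

    ```python
    # Sketch, assuming placeholder descriptors: kNN classification of binders vs.
    # binding decoys, evaluated with CCR (balanced accuracy) for a ~1:4 class ratio.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import balanced_accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 32))            # placeholder descriptor matrix
    y = (rng.random(500) < 0.2).astype(int)   # ~1:4 binders vs. decoys (placeholder)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    knn = KNeighborsClassifier(n_neighbors=5, weights="distance").fit(X_tr, y_tr)
    ccr = balanced_accuracy_score(y_te, knn.predict(X_te))  # CCR = mean per-class recall
    print(f"CCR = {ccr:.2f}")
    ```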

  16. Quality Assessment of Sharpened Images: Challenges, Methodology, and Objective Metrics.

    Science.gov (United States)

    Krasula, Lukas; Le Callet, Patrick; Fliegel, Karel; Klima, Milos

    2017-01-10

    Most of the effort in image quality assessment (QA) has so far been dedicated to the degradation of the image. However, there are also many algorithms in the image processing chain that can enhance the quality of an input image. These include procedures for contrast enhancement, deblurring, sharpening, up-sampling, denoising, transfer function compensation, etc. In this work, possible strategies for the quality assessment of sharpened images are investigated. This task is not trivial because the sharpening techniques can increase the perceived quality, as well as introduce artifacts leading to a quality drop (over-sharpening). Here, a framework specifically adapted for the quality assessment of sharpened images and for comparing objective metrics in this context is introduced. However, the framework can be adopted in other quality assessment areas as well. The problem of selecting the correct procedure for subjective evaluation was addressed and a subjective test on blurred, sharpened, and over-sharpened images was performed in order to demonstrate the use of the framework. The obtained ground-truth data were used for testing the suitability of state-of-the-art objective quality metrics for the assessment of sharpened images. The comparison was performed by a novel procedure using ROC analyses, which is found more appropriate for the task than standard methods. Furthermore, seven possible augmentations of the no-reference S3 metric adapted for sharpened images are proposed. The performance of the metric is significantly improved and also superior to the rest of the tested quality criteria with respect to the subjective data.
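    An illustrative sketch of one ROC-style comparison of objective metrics against subjective data (an assumed setup, not the authors' exact protocol; the pair scores and preference labels are toy values): for image pairs where subjects judged which version is better, a good metric should score the preferred image higher, and the AUC of the score differences measures how reliably it does so.

    ```python
    # Sketch under assumptions: ROC AUC of metric score differences vs. subjective preferences.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def metric_auc(score_a, score_b, preferred_a):
        """score_a/score_b: metric scores for images A and B of each pair;
        preferred_a: 1 if subjects preferred A, else 0."""
        return roc_auc_score(preferred_a, np.asarray(score_a) - np.asarray(score_b))

    # toy example
    print(f"AUC = {metric_auc([0.8, 0.6, 0.9], [0.5, 0.7, 0.4], [1, 0, 1]):.2f}")
    ```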

  17. Socio-technical security metrics

    NARCIS (Netherlands)

    Gollmann, D.; Herley, C.; Koenig, V.; Pieters, W.; Sasse, M.A.

    2015-01-01

    Report from Dagstuhl seminar 14491. This report documents the program and the outcomes of Dagstuhl Seminar 14491 “Socio-Technical Security Metrics”. In the domain of safety, metrics inform many decisions, from the height of new dikes to the design of nuclear plants. We can state, for example, that

  18. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the space–time variables.

  19. Warped products and Einstein metrics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seongtag [Department of Mathematics Education, Inha University, Incheon 402-751 (Korea, Republic of)

    2006-05-19

    Warped product construction is an important method to produce a new metric with a base manifold and a fibre. We construct compact base manifolds with a positive scalar curvature which do not admit any non-trivial Einstein warped product, and noncompact complete base manifolds which do not admit any non-trivial Ricci-flat Einstein warped product. (letter to the editor)

  20. Geometry of Cuts and Metrics

    NARCIS (Netherlands)

    M. Deza; M. Laurent (Monique)

    1997-01-01

    Cuts and metrics are well-known objects that arise - independently, but with many deep and fascinating connections - in diverse fields: in graph theory, combinatorial optimization, geometry of numbers, combinatorial matrix theory, statistical physics, VLSI design etc. This book offers a

  1. Linear and Branching System Metrics

    NARCIS (Netherlands)

    J., Hilston; de Alfaro, Luca; Faella, Marco; M.Z., Kwiatkowska; Telek, M.; Stoelinga, Mariëlle Ida Antoinette

    We extend the classical system relations of trace inclusion, trace equivalence, simulation, and bisimulation to a quantitative setting in which propositions are interpreted not as boolean values, but as elements of arbitrary metric spaces. Trace inclusion and equivalence give rise to asymmetrical

  2. Axiomatic Testing of Structure Metrics

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    1994-01-01

    In this paper, axiomatic testing of software metrics is described. The testing is based on representation axioms from the measurement theory. In a case study, the axioms are given for the formal relational structure and the empirical relational structure. Two approaches of axiomatic testing are

  3. Axiomatic Testing of Structure Metrics

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    In this paper, axiomatic testing of software metrics will be described. The testing is based on representation axioms from the measurement theory. In a case study, the axioms are given for the formal relational structure and the empirical relational structure. Two approaches of axiomatic testing are

  4. Separable metrics and radiating stars

    Indian Academy of Sciences (India)

    2016-12-14

    Dec 14, 2016 ... Abstract. We study the junction condition relating the pressure to heat flux at the boundary of an accelerating and expanding spherically symmetric radiating star. We transform the junction condition to an ordinary differential equation by making a separability assumption on the metric functions in the ...

  5. A comparison of k-NN, backpropagation, and self-organising map classification methods using an optical fibre based sensor system utilised in an industrial large scale oven

    Science.gov (United States)

    Sheridan, Cormac; O'Farrell, Marion; Lewis, Elfed; Lyons, William B.; Flanagan, Colin; Jackman, Nick

    2005-06-01

    This paper reports on three methods of classifying the spectral data from an optical fibre based sensor system as used in the food industry. The first method uses a feed-forward back-propagation Artificial Neural Network; the second method involves using Kohonen Self-Organising Maps while the third method is k-Nearest Neighbour analysis. The sensor monitors the food colour online as the food cooks by examining the reflected light from both the surface and the core of the product. The combination of using Principal Component Analysis and Backpropagation Neural Networks has been successfully investigated previously. In this paper, results obtained using all three classifiers are presented and compared. The Principal Components used to train each classifier are evaluated from data that generate a "colourscale" comprising six colour classifications. This scale has been developed to allow several products of similar colour to be tested using a single network that had been trained using the colourscale. The results presented show that both the neural network and the Self-Organising Map approach perform comparably, while the k-NN method tested under-performs the other two.
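    A minimal sketch of the kind of pipeline described (not the authors' code; the spectra, colour-class labels, and component count are placeholders): reduce the reflected-light spectra with PCA and classify the components into the six-point colourscale with k-NN; an MLP or a self-organising map could be substituted for the other two classifiers.

    ```python
    # Sketch, assuming placeholder reflectance spectra: PCA followed by k-NN
    # classification into six colour classes, scored with cross-validation.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    spectra = rng.normal(size=(300, 128))         # placeholder reflectance spectra
    colour_class = rng.integers(0, 6, size=300)   # six-point colourscale labels (placeholder)

    model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
    print("mean CV accuracy:", cross_val_score(model, spectra, colour_class, cv=5).mean())
    ```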

  6. Dielectric and ferroelectric properties of strain-relieved epitaxial lead-free KNN-LT-LS ferroelectric thin films on SrTiO3 substrates

    Science.gov (United States)

    Abazari, M.; Akdoǧan, E. K.; Safari, A.

    2008-05-01

    We report the growth of single-phase (K0.44,Na0.52,Li0.04)(Nb0.84,Ta0.10,Sb0.06)O3 thin films on SrRuO3-coated ⟨001⟩-oriented SrTiO3 substrates by using pulsed laser deposition. Films grown at 600°C under low laser fluence exhibit a ⟨001⟩ textured, columnar-grained nanostructure, which coalesces with increasing deposition temperature, leading to a uniform, fully epitaxial, highly stoichiometric film at 750°C. However, films deposited at lower temperatures exhibit compositional fluctuations as verified by Rutherford backscattering spectroscopy. The epitaxial films of 400-600 nm thickness have a room temperature relative permittivity of ~750 and a loss tangent of ~6% at 1 kHz. The room temperature remnant polarization of the films is 4 μC/cm2, while the saturation polarization is 7.1 μC/cm2 at 24 kV/cm and the coercive field is ~7.3 kV/cm. The results indicate that approximately 50% of the bulk permittivity and 20% of the bulk spontaneous polarization can be retained in submicron epitaxial KNN-LT-LS thin films, respectively. The conductivity of the films remains a challenge, as evidenced by the high loss tangent, leakage currents, and broad hysteresis loops.

  7. Ontology-based metrics computation for business process analysis

    OpenAIRE

    Pedrinaci C.; Domingue J.

    2009-01-01

    Business Process Management (BPM) aims to support the whole life-cycle necessary to deploy and maintain business processes in organisations. Crucial within the BPM lifecycle is the analysis of deployed processes. Analysing business processes requires computing metrics that can help determining the health of business activities and thus the whole enterprise. However, the degree of automation currently achieved cannot support the level of reactivity and adaptation demanded by businesses. In thi...

  8. Fuzzy polynucleotide spaces and metrics.

    Science.gov (United States)

    Nieto, Juan J; Torres, A; Georgiou, D N; Karakasidis, T E

    2006-04-01

    The study of genetic sequences is of great importance in biology and medicine. Mathematics is playing an important role in the study of genetic sequences and, generally, in bioinformatics. In this paper, we extend the work concerning the Fuzzy Polynucleotide Space (FPS) introduced in Torres, A., Nieto, J.J., 2003. The fuzzy polynucleotide space: basic properties. Bioinformatics 19(5): 587-592, and Nieto, J.J., Torres, A., Vazquez-Trasande, M.M., 2003. A metric space to study differences between polynucleotides. Appl. Math. Lett. 27: 1289-1294, by studying distances between nucleotides and some complete genomes using several metrics. We also present new results concerning the notions of similarity, difference and equality between polynucleotides. The results are encouraging since they demonstrate how the notions of distance and similarity between polynucleotides in the FPS can be employed in the analysis of genetic material.

  9. Quality Metrics in Inpatient Neurology.

    Science.gov (United States)

    Dhand, Amar

    2015-12-01

    Quality of care in the context of inpatient neurology is the standard of performance by neurologists and the hospital system as measured against ideal models of care. There are growing regulatory pressures to define health care value through concrete quantifiable metrics linked to reimbursement. Theoretical models of quality acknowledge its multimodal character with quantitative and qualitative dimensions. For example, the Donabedian model distils quality as a phenomenon of three interconnected domains, structure-process-outcome, with each domain mutually influential. The actual measurement of quality may be implicit, as in peer review in morbidity and mortality rounds, or explicit, in which criteria are prespecified and systemized before assessment. As a practical contribution, in this article a set of candidate quality indicators for inpatient neurology based on an updated review of treatment guidelines is proposed. These quality indicators may serve as an initial blueprint for explicit quality metrics long overdue for inpatient neurology. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  10. A stationary q-metric

    Science.gov (United States)

    Toktarbay, S.; Quevedo, H.

    2014-10-01

    We present a stationary generalization of the static q-metric, the simplest generalization of the Schwarzschild solution that contains a quadrupole parameter. It possesses three independent parameters that are related to the mass, quadrupole moment and angular momentum. We investigate the geometric and physical properties of this exact solution of Einstein's vacuum equations, and show that it can be used to describe the exterior gravitational field of rotating, axially symmetric, compact objects.

  11. Sensory Metrics of Neuromechanical Trust.

    Science.gov (United States)

    Softky, William; Benford, Criscillia

    2017-09-01

    Today digital sources supply a historically unprecedented component of human sensorimotor data, the consumption of which is correlated with poorly understood maladies such as Internet addiction disorder and Internet gaming disorder. Because both natural and digital sensorimotor data share common mathematical descriptions, one can quantify our informational sensorimotor needs using the signal processing metrics of entropy, noise, dimensionality, continuity, latency, and bandwidth. Such metrics describe in neutral terms the informational diet human brains require to self-calibrate, allowing individuals to maintain trusting relationships. With these metrics, we define the trust humans experience using the mathematical language of computational models, that is, as a primitive statistical algorithm processing finely grained sensorimotor data from neuromechanical interaction. This definition of neuromechanical trust implies that artificial sensorimotor inputs and interactions that attract low-level attention through frequent discontinuities and enhanced coherence will decalibrate a brain's representation of its world over the long term by violating the implicit statistical contract for which self-calibration evolved. Our hypersimplified mathematical understanding of human sensorimotor processing as multiscale, continuous-time vibratory interaction allows equally broad-brush descriptions of failure modes and solutions. For example, we model addiction in general as the result of homeostatic regulation gone awry in novel environments (sign reversal) and digital dependency as a sub-case in which the decalibration caused by digital sensorimotor data spurs yet more consumption of them. We predict that institutions can use these sensorimotor metrics to quantify media richness to improve employee well-being; that dyads and family-size groups will bond and heal best through low-latency, high-resolution multisensory interaction such as shared meals and reciprocated touch; and

  12. Multi-Metric Sustainability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowlin, Shannon [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heimiller, Donna [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Pless, Jacquelyn [National Renewable Energy Lab. (NREL), Golden, CO (United States); Munoz, David [Colorado School of Mines, Golden, CO (United States)

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  13. Evaluating human detection performance of targets and false alarms, using a statistical texture image metric

    Science.gov (United States)

    Aviram, Guy; Rotman, Stanley R.

    2000-08-01

    A statistical texture image metric, which is based on the Markov cooccurrence matrix and named ICOM, is introduced and evaluated in the context of the correlation between (1) human quantitative detection performance of targets and false alarms and (2) qualitative image judgments of both natural and enhanced infrared images. Correlations between the ICOM metric and experimental detection results are compared with those obtained with the probability of edge global clutter metric, the local contrast metric (DOYLE), and the combined signal-to-clutter metric. In contrast with those metrics, which fit well either the qualitative or the quantitative results, the ICOM textural metric was found in good agreement with both the qualitative and quantitative experiments. Moreover, the ICOM metric was found appropriate for automatic extraction of potential false targets in both natural and the enhanced images. This property is used to analyze human false-detection-decision behavior, and to suggest a modification to the known constant false-alarm rate model. The modified model considers the total number of detection decisions (true and false) made by the human observer as the adaptive parameter. The model was tested and confirmed both with natural and enhanced images. The ICOM properties and the results obtained using them emphasize the robustness and the adequacy of this metric for infrared imagery evaluation.
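    Illustrative sketch only (the exact ICOM definition is not given in the abstract; the quantization level, offset, and entropy statistic below are assumptions): building a gray-level co-occurrence matrix and deriving a simple second-order texture statistic from it, the kind of quantity such co-occurrence-based clutter metrics are built from.

    ```python
    # Sketch under assumptions: co-occurrence matrix for horizontally adjacent pixels,
    # summarized by an entropy statistic.
    import numpy as np

    def cooccurrence_entropy(img, levels=16):
        q = np.minimum((img * levels).astype(int), levels - 1)   # quantize to 'levels' gray levels
        P = np.zeros((levels, levels))
        np.add.at(P, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)   # count horizontally adjacent pairs
        P /= P.sum()
        nz = P[P > 0]
        return -(nz * np.log2(nz)).sum()

    img = np.random.default_rng(2).random((64, 64))              # placeholder image in [0, 1)
    print(cooccurrence_entropy(img))
    ```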

  14. Measuring Sustainability: Deriving Metrics From Objectives (Presentation)

    Science.gov (United States)

    The definition of 'sustain', to keep in existence, provides some insight into the metrics that are required to measure sustainability and adequately respond to assure sustainability. Keeping something in existence implies temporal and spatial contexts and requires metrics that g...

  15. Framework for Information Age Assessment Metrics

    National Research Council Canada - National Science Library

    Augustine, Thomas H; Broyles, James W

    2004-01-01

    ... all of these metrics. Specifically this paper discusses an Information Age Framework for Assessment Metrics and relates its elements to the fundamental facets of a C4ISR enterprise architecture...

  16. Almost contact metric 3-submersions

    Directory of Open Access Journals (Sweden)

    Bill Watson

    1984-01-01

    Full Text Available An almost contact metric 3-submersion is a Riemannian submersion, π, from an almost contact metric manifold (M^{4m+3}, (φ_i, ξ_i, η_i)_{i=1,2,3}, g) onto an almost quaternionic manifold (N^{4n}, (J_i)_{i=1,2,3}, h) which commutes with the structure tensors of type (1,1); i.e., π_* φ_i = J_i π_*, for i = 1, 2, 3. For various restrictions on ∇φ_i (e.g., M is 3-Sasakian), we show corresponding limitations on the second fundamental form of the fibres and on the complete integrability of the horizontal distribution. Concomitantly, relations are derived between the Betti numbers of a compact total space and the base space. For instance, if M is 3-quasi-Sasakian (dΦ = 0), then b_1(N) ≤ b_1(M). The respective φ_i-holomorphic sectional and bisectional curvature tensors are studied and several unexpected results are obtained. As an example, if X and Y are orthogonal horizontal vector fields on the 3-contact (a relatively weak structure) total space of such a submersion, then the respective holomorphic bisectional curvatures satisfy B_{φ_i}(X, Y) = B'_{J_i}(X_*, Y_*) − 2. Applications to the real differential geometry of Yang-Mills field equations are indicated, based on the fact that a principal SU(2)-bundle over a compactified realized space-time can be given the structure of an almost contact metric 3-submersion.

  17. Object-Oriented Metrics Which Predict Maintainability

    OpenAIRE

    Li, Wei; Henry, Sallie M.

    1993-01-01

    Software metrics have been studied in the procedural paradigm as a quantitative means of assessing the software development process as well as the quality of software products. Several studies have validated that various metrics are useful indicators of maintenance effort in the procedural paradigm. However, software metrics have rarely been studied in the object-oriented paradigm. Very few metrics have been proposed to measure object-oriented systems, and the proposed ones have not been v...

  18. A proposal of knowledge engineering metrics

    OpenAIRE

    Britos, Paola Verónica; García Martínez, Ramón; Hauge, Ødwin

    2005-01-01

    Metrics used in the development of expert systems are not a well-investigated problem area. This article suggests some metrics to be used to measure the maturity of the conceptualization process and the complexity of the decision process in the problem domain. We propose some further work to be done with these metrics. Applying these metrics brings new and interesting problems concerning the structure of knowledge to the surface.

  19. Crowdsourcing metrics of digital collections

    Directory of Open Access Journals (Sweden)

    Tuula Pääkkönen

    2015-12-01

    Full Text Available In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with its main aim being crowdsourcing features, e.g., giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted. These metrics give us indications of how the end-users, based on their own interests, are investigating and using the digital collections. Therefore, the suggested metrics illustrate the versatility of the information needs of the users, varying from citizen science to research purposes. By analysing the user patterns, we can respond to the new needs of the users by making minor changes to accommodate the most active participants, while still making the service more approachable for those who are trying out the functionalities for the first time. Participation in the clippings and annotations can enrich the materials in unexpected ways and can possibly pave the way for using crowdsourcing more widely, including in research contexts. This creates more opportunities for the goals of open science, since source data becomes available, making it possible for researchers to reach out to the general public for help. In the long term, utilizing, for example, text mining methods can allow these different end-user segments to

  20. Validation in the Software Metric Development Process

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    In this paper the validation of software metrics will be examined. Two approaches will be combined: representational measurement theory and a validation network scheme. The development process of a software metric will be described, together with validities for the three phases of the metric

  1. Modeling and analysis of metrics databases

    OpenAIRE

    Paul, Raymond A.

    1999-01-01

    The main objective of this research is to propose a comprehensive framework for quality and risk management in the software development process based on analysis and modeling of software metrics data. Existing software metrics work has focused mainly on the type of metrics to be collected ...

  2. Invariant Matsumoto metrics on homogeneous spaces

    OpenAIRE

    Salimi Moghaddam, H.R.

    2014-01-01

    In this paper we consider invariant Matsumoto metrics which are induced by invariant Riemannian metrics and invariant vector fields on homogeneous spaces, and then we give their flag curvature formula. We also study the special cases of naturally reductive spaces and bi-invariant metrics. We end the article by giving some examples of geodesically complete Matsumoto spaces.

  3. Context-dependent ATC complexity metric

    NARCIS (Netherlands)

    Mercado Velasco, G.A.; Borst, C.

    2015-01-01

    Several studies have investigated Air Traffic Control (ATC) complexity metrics in a search for a metric that could best capture workload. These studies have shown how daunting the search for a universal workload metric (one that could be applied in different contexts: sectors, traffic patterns,

  4. Towards a catalog format for software metrics

    NARCIS (Netherlands)

    Bouwers, E.; Visser, J.; Van Deursen, A.

    2014-01-01

    In the past two decades both the industry and the research community have proposed hundreds of metrics to track software projects, evaluate quality or estimate effort. Unfortunately, it is not always clear which metric works best in a particular context. Even worse, for some metrics there is little

  5. A Common Metric for Integrating Research Findings.

    Science.gov (United States)

    Haladyna, Tom

    The choice of a common metric for the meta-analysis (quantitative synthesis) of correlational and experimental research studies is presented and justified. First, a background for the problem of identifying a common metric is presented. Second, the percentage of accounted variance (PAV) is described as the metric of choice, and reasons are given…

  6. A guide to calculating habitat-quality metrics to inform conservation of highly mobile species

    Science.gov (United States)

    Bieri, Joanna A.; Sample, Christine; Thogmartin, Wayne E.; Diffendorfer, James E.; Earl, Julia E.; Erickson, Richard A.; Federico, Paula; Flockhart, D. T. Tyler; Nicol, Sam; Semmens, Darius J.; Skraber, T.; Wiederholt, Ruscena; Mattsson, Brady J.

    2018-01-01

    Many metrics exist for quantifying the relative value of habitats and pathways used by highly mobile species. Properly selecting and applying such metrics requires substantial background in mathematics and understanding the relevant management arena. To address this multidimensional challenge, we demonstrate and compare three measurements of habitat quality: graph-, occupancy-, and demographic-based metrics. Each metric provides insights into system dynamics, at the expense of increasing amounts and complexity of data and models. Our descriptions and comparisons of diverse habitat-quality metrics provide means for practitioners to overcome the modeling challenges associated with management or conservation of such highly mobile species. Whereas previous guidance for applying habitat-quality metrics has been scattered in diversified tracks of literature, we have brought this information together into an approachable format including accessible descriptions and a modeling case study for a typical example that conservation professionals can adapt for their own decision contexts and focal populations. Considerations for Resource Managers: management objectives, proposed actions, data availability and quality, and model assumptions are all relevant considerations when applying and interpreting habitat-quality metrics. Graph-based metrics answer questions related to habitat centrality and connectivity, are suitable for populations with any movement pattern, quantify basic spatial and temporal patterns of occupancy and movement, and require the least data. Occupancy-based metrics answer questions about likelihood of persistence or colonization, are suitable for populations that undergo localized extinctions, quantify spatial and temporal patterns of occupancy and movement, and require a moderate amount of data. Demographic-based metrics answer questions about relative or absolute population size, are suitable for populations with any movement pattern, quantify demographic
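
    As a minimal illustration of the graph-based class of metrics described above, the sketch below scores hypothetical habitat patches by betweenness centrality on a small weighted movement graph (using networkx). The patch names, edge weights, and the choice of centrality are assumptions made for the example, not values or recommendations from the guide.

    ```python
    import networkx as nx

    # Hypothetical network of habitat patches (nodes) and migration pathways (edges);
    # edge weights represent relative movement costs between patches (lower = easier).
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("breeding", "stopover_A", 1.0),
        ("breeding", "stopover_B", 2.0),
        ("stopover_A", "wintering", 1.5),
        ("stopover_B", "wintering", 2.5),
        ("stopover_A", "stopover_B", 1.0),
    ])

    # Betweenness centrality as one possible graph-based habitat-quality metric:
    # patches lying on many least-cost migration routes score highly.
    centrality = nx.betweenness_centrality(G, weight="weight")
    for patch, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(f"{patch:12s} {score:.3f}")
    ```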

  7. Load Balancing Metric with Diversity for Energy Efficient Routing in Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Moad, Sofiane; Hansen, Morten Tranberg; Jurdak, Raja

    2011-01-01

    The expected number of transmissions (ETX) represents a routing metric that considers the highly variable link qualities for a specific radio in Wireless Sensor Networks (WSNs). To adapt to these differences, radio diversity is a recently explored solution for WSNs. In this paper, we propose an energy balancing metric which explores the diversity in link qualities present at different radios. The goal is to effectively use the energy of the network and therefore extend the network lifetime. The proposed metric takes into account the transmission and reception costs for a specific radio in order to choose an energy efficient radio. In addition, the metric uses the remaining energy of nodes in order to regulate the traffic so that critical nodes are avoided. We show by simulations that our metric can improve the network lifetime by up to 20%.
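
    The abstract does not spell out the metric's exact formula, so the sketch below is only a minimal illustration of the idea it describes: combine per-radio ETX with transmission/reception energy costs and penalize links through nodes whose residual energy is low. The class, function, and numbers are assumptions for the example, not the authors' metric.

    ```python
    from dataclasses import dataclass

    @dataclass
    class RadioLink:
        radio: str       # radio interface identifier
        etx: float       # expected number of transmissions on this link
        tx_cost: float   # energy per transmission attempt (mJ)
        rx_cost: float   # energy per reception/ACK (mJ)

    def energy_balanced_metric(link: RadioLink, residual_energy: float,
                               initial_energy: float) -> float:
        """Illustrative energy-aware link metric: expected energy spent on the link,
        scaled up as the forwarding node's residual energy is depleted."""
        expected_energy = link.etx * (link.tx_cost + link.rx_cost)
        depletion_penalty = initial_energy / max(residual_energy, 1e-9)
        return expected_energy * depletion_penalty

    # Choose the radio/link with the smallest metric value.
    links = [RadioLink("radio_2.4GHz", etx=1.8, tx_cost=0.9, rx_cost=0.4),
             RadioLink("radio_900MHz", etx=1.2, tx_cost=1.5, rx_cost=0.6)]
    best = min(links, key=lambda l: energy_balanced_metric(
        l, residual_energy=600.0, initial_energy=1000.0))
    print("preferred radio:", best.radio)
    ```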

  8. Some Equivalences between Cone b-Metric Spaces and b-Metric Spaces

    Directory of Open Access Journals (Sweden)

    Poom Kumam

    2013-01-01

    Full Text Available We introduce a b-metric on the cone b-metric space and then prove some equivalences between them. As applications, we show that fixed point theorems on cone b-metric spaces can be obtained from fixed point theorems on b-metric spaces.
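
    For orientation, the standard textbook definition of a b-metric (the structure underlying the cone b-metrics discussed here) can be written as follows; this is general background, not text from the paper.

    ```latex
    % A b-metric on a set $X$ with constant $s \ge 1$ is a map
    % $d : X \times X \to [0,\infty)$ such that, for all $x, y, z \in X$:
    \begin{align*}
      &\text{(i)}\;\;  d(x,y) = 0 \iff x = y, \\
      &\text{(ii)}\;  d(x,y) = d(y,x), \\
      &\text{(iii)}\; d(x,z) \le s\,\bigl(d(x,y) + d(y,z)\bigr).
    \end{align*}
    ```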

  9. Optimized Seizure Detection Algorithm: A Fast Approach for Onset of Epileptic in EEG Signals Using GT Discriminant Analysis and K-NN Classifier

    Directory of Open Access Journals (Sweden)

    Azizi E.

    2016-06-01

    Full Text Available Background: Epilepsy is a severe disorder of the central nervous system that predisposes the person to recurrent seizures. Fifty million people worldwide suffer from epilepsy; after Alzheimer's and stroke, it is the third most widespread nervous disorder. Objective: In this paper, an algorithm to detect the onset of epileptic seizures based on the analysis of brain electrical signals (EEG) has been proposed. 844 hours of EEG were recorded from 23 pediatric patients consecutively, with 163 occurrences of seizures. Signals had been collected from Children's Hospital Boston with a sampling frequency of 256 Hz through 18 channels in order to assess epilepsy surgery. By selecting effective features from seizure and non-seizure signals of each individual and putting them into two categories, the proposed algorithm detects the onset of seizures quickly and with high sensitivity. Method: In this algorithm, L-sec epochs of signals are displayed in the form of a third-order tensor in spatial, spectral and temporal spaces by applying the wavelet transform. Then, after applying general tensor discriminant analysis (GTDA) on the tensors and calculating the mapping matrix, feature vectors are extracted. GTDA increases the sensitivity of the algorithm by storing data without deleting them. Finally, K-nearest neighbors (KNN) is used to classify the selected features. Results: The results of simulating the algorithm on a standard dataset show that the algorithm is capable of detecting 98 percent of seizures with an average delay of 4.7 seconds and an average error rate of three errors in 24 hours. Conclusion: Today, the lack of an automated system to detect or predict the seizure onset is strongly felt.
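
    The paper's full pipeline (third-order wavelet tensors plus general tensor discriminant analysis) is not reproduced here. As a much simpler stand-in that shows the overall shape of such a detector, the sketch below extracts wavelet sub-band energies from single-channel epochs and classifies them with k-NN on synthetic data; the wavelet choice, feature set, and data are illustrative assumptions only.

    ```python
    import numpy as np
    import pywt
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    FS = 256  # sampling frequency (Hz), as in the described recordings

    def epoch_features(epoch, wavelet="db4", level=4):
        """Relative energy of each wavelet sub-band for one single-channel epoch."""
        coeffs = pywt.wavedec(epoch, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        return energies / energies.sum()

    # Synthetic stand-in data: 'seizure' epochs carry an extra rhythmic component.
    rng = np.random.default_rng(1)
    n_epochs, epoch_len = 200, 4 * FS
    X, y = [], []
    for i in range(n_epochs):
        label = i % 2
        sig = rng.normal(0, 1, epoch_len)
        if label == 1:
            t = np.arange(epoch_len) / FS
            sig += 3.0 * np.sin(2 * np.pi * 3.0 * t)  # rhythmic 3 Hz activity
        X.append(epoch_features(sig))
        y.append(label)
    X, y = np.array(X), np.array(y)

    knn = KNeighborsClassifier(n_neighbors=5)
    print("5-fold cross-validated accuracy:", cross_val_score(knn, X, y, cv=5).mean())
    ```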

  10. Is it possible to predict long-term success with k-NN? Case study of four market indices (FTSE100, DAX, HANGSENG, NASDAQ)

    International Nuclear Information System (INIS)

    Shi, Y; Gorban, A N; Yang, T Y

    2014-01-01

    This case study tests the possibility of prediction for 'success' (or 'winner') components of four stock and shares market indices in a time period of three years from 02-Jul-2009 to 29-Jun-2012. We compare their performance in two time frames: an initial frame of three months at the beginning (02/06/2009-30/09/2009) and a final three-month frame (02/04/2012-29/06/2012). To label the components, the average price ratio between the two time frames is computed in descending order. The average price ratio is defined as the ratio between the mean prices of the beginning and final time periods. The 'winner' components are the top one third of the total components in this order of average price ratio; that is, their mean price in the final time period is relatively higher than in the beginning time period. The 'loser' components are the last one third of the total components in the same order, as they have higher mean prices in the beginning time period. We analyse whether there is any information about the winner-loser separation in the initial fragments of the daily closing-price log-return time series. Leave-one-out cross-validation with the k-NN algorithm is applied on the daily log-returns of components, using a distance and proximity measure in the experiment. The error analysis shows that for the HANGSENG and DAX indices there are clear signs of a possibility to evaluate the probability of long-term success. The correlation distance matrix histograms and 2-D/3-D elastic maps generated with ViDaExpert show that the 'winner' components are closer to each other and that 'winner'/'loser' components are separable on elastic maps for the HANGSENG and DAX indices, while for the negative-possibility indices there is no sign of separation.
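
    A rough sketch of the kind of experiment described (leave-one-out cross-validation of k-NN on early log-return fragments with a correlation-based distance) is given below on synthetic series; the labels, window length, and distance choice are assumptions for illustration, not the study's exact protocol or data.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(7)
    n_components, n_days = 90, 63            # ~3 months of daily log-returns
    labels = np.array([1] * 30 + [0] * 60)   # 1 = 'winner', 0 = otherwise

    # Synthetic stand-in: 'winner' components share a common factor, so their
    # return fragments are more correlated with each other.
    common = rng.normal(0.0, 0.01, n_days)
    log_returns = rng.normal(0.0, 0.01, (n_components, n_days))
    log_returns[labels == 1] += common

    # Brute-force k-NN with a correlation distance between return fragments.
    knn = KNeighborsClassifier(n_neighbors=5, metric="correlation", algorithm="brute")
    acc = cross_val_score(knn, log_returns, labels, cv=LeaveOneOut()).mean()
    print(f"leave-one-out accuracy: {acc:.2f}")
    ```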

  11. A two-dimensional matrix image based feature extraction method for classification of sEMG: A comparative analysis based on SVM, KNN and RBF-NN.

    Science.gov (United States)

    Wen, Tingxi; Zhang, Zhongnan; Qiu, Ming; Zeng, Ming; Luo, Weizhen

    2017-01-01

    The computer mouse is an important human-computer interaction device. But patients with physical finger disability are unable to operate this device. Surface EMG (sEMG) can be monitored by electrodes on the skin surface and is a reflection of the neuromuscular activities. Therefore, we can control limb auxiliary equipment by utilizing sEMG classification in order to help physically disabled patients to operate the mouse. The objective is to develop a new method to extract sEMG generated by finger motion and apply novel features to classify sEMG. A window-based data acquisition method was presented to extract signal samples from sEMG electrodes. Afterwards, a two-dimensional matrix image based feature extraction method, which differs from the classical methods based on the time domain or frequency domain, was employed to transform signal samples into feature maps used for classification. In the experiments, sEMG data samples produced by the index and middle fingers at the click of a mouse button were separately acquired. Then, characteristics of the samples were analyzed to generate a feature map for each sample. Finally, machine learning classification algorithms (SVM, KNN, RBF-NN) were employed to classify these feature maps on a GPU. The study demonstrated that all classifiers can identify and classify sEMG samples effectively. In particular, the accuracy of the SVM classifier reached up to 100%. The signal separation method is a convenient, efficient and quick method, which can effectively extract the sEMG samples produced by fingers. In addition, unlike the classical methods, the new method enables feature extraction by appropriately enlarging the sample signals' energy. The classical machine learning classifiers all performed well by using these features.
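
    The sketch below illustrates, on synthetic signals, the two ideas the abstract describes: window-based sample acquisition and arranging each window as a small two-dimensional feature map before classification with SVM and KNN. Window sizes, the map layout, and the synthetic data are assumptions, not the paper's actual parameters.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    def windows(signal, win=64, step=32):
        """Window-based acquisition: slice one sEMG channel into overlapping windows."""
        return np.array([signal[i:i + win]
                         for i in range(0, len(signal) - win + 1, step)])

    def feature_map(window, rows=8):
        """Arrange a window's samples as a small 2-D matrix 'image', then flatten."""
        return np.abs(window.reshape(rows, -1)).ravel()

    rng = np.random.default_rng(3)
    X, y = [], []
    for label, gain in [(0, 1.0), (1, 2.5)]:   # two finger classes (synthetic)
        sig = gain * rng.normal(0, 1, 5000)
        for w in windows(sig):
            X.append(feature_map(w))
            y.append(label)
    X, y = np.array(X), np.array(y)

    for name, clf in [("SVM", SVC()), ("KNN", KNeighborsClassifier(n_neighbors=5))]:
        print(name, cross_val_score(clf, X, y, cv=5).mean())
    ```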

  12. Angles between Curves in Metric Measure Spaces

    Directory of Open Access Journals (Sweden)

    Han Bang-Xian

    2017-09-01

    Full Text Available The goal of the paper is to study the angle between two curves in the framework of metric (and metric measure) spaces. More precisely, we give a new notion of angle between two curves in a metric space. Such a notion has a natural interplay with optimal transportation and is particularly well suited for metric measure spaces satisfying the curvature-dimension condition. Indeed, one of the main results is the validity of the cosine formula on RCD*(K, N) metric measure spaces. As a consequence, the newly introduced notions are compatible with the corresponding classical ones for Riemannian manifolds, Ricci limit spaces and Alexandrov spaces.

  13. On characterizations of quasi-metric completeness

    Energy Technology Data Exchange (ETDEWEB)

    Dag, H.; Romaguera, S.; Tirado, P.

    2017-07-01

    Hu proved in [4] that a metric space (X, d) is complete if and only if, for any closed subspace C of (X, d), every Banach contraction on C has a fixed point. Since then several authors have investigated the problem of characterizing metric completeness by means of fixed point theorems. Recently this problem has been studied in the more general context of quasi-metric spaces for different notions of completeness. Here we present a characterization of a kind of completeness for quasi-metric spaces by means of a quasi-metric version of Hu's theorem. (Author)

  14. Forged seal detection based on the seal overlay metric.

    Science.gov (United States)

    Lee, Joong; Kong, Seong G; Lee, Young-Soo; Moon, Ki-Woong; Jeon, Oc-Yeub; Han, Jong Hyun; Lee, Bong-Woo; Seo, Joong-Suk

    2012-01-10

    This paper describes a method for verifying the authenticity of a seal impression imprinted on a document based on the seal overlay metric, which refers to the ratio of the effective seal impression pattern to the noise in the neighborhood of the reference impression region. A reference seal pattern is obtained by taking the average of a number of high-quality impressions of a genuine seal. A target seal impression to be examined, often on paper with some background text and lines, is segmented out from the background by an adaptive threshold applied to the histogram of color components. The segmented target seal impression is then spatially aligned with the reference by maximizing the count of matching pixels. Then the seal overlay metric is computed for the reference and the target. If the overlay metric of a target seal is below a predetermined limit for similarity to the genuine seal, then the target is classified as a forged seal. To further reduce the misclassification rate, the seal overlay metric is adjusted by the filling rate, which reflects the quality of the inked pattern of the target seal. Experiment results demonstrate that the proposed method can detect elaborate seal impressions created by advanced forgery techniques such as lithography and computer-aided manufacturing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
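
    The published method's adaptive color thresholding, alignment search, and filling-rate correction are more elaborate than can be shown here; the numpy sketch below only illustrates the core overlay idea, i.e., counting matched ink pixels against mismatched ('noise') pixels over a small grid of shifts. Thresholds, shift range, and data are illustrative assumptions.

    ```python
    import numpy as np

    def binarize(img, thresh=0.5):
        """Crude segmentation of the inked pattern (1 = ink, 0 = background)."""
        return (img > thresh).astype(int)

    def overlay_metric(reference, target, max_shift=3):
        """Best ratio of matched ink pixels to mismatched pixels over small shifts."""
        best = 0.0
        for dr in range(-max_shift, max_shift + 1):
            for dc in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(target, dr, axis=0), dc, axis=1)
                match = np.sum((reference == 1) & (shifted == 1))
                noise = np.sum(reference != shifted)
                best = max(best, match / (noise + 1.0))
        return best

    rng = np.random.default_rng(5)
    ref = binarize(rng.random((50, 50)), 0.7)             # stand-in reference pattern
    genuine = (ref ^ (rng.random((50, 50)) < 0.02))        # small acquisition noise
    forged = binarize(rng.random((50, 50)), 0.7)           # unrelated pattern

    print("genuine overlay:", overlay_metric(ref, genuine.astype(int)))
    print("forged overlay: ", overlay_metric(ref, forged))
    ```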

  15. Web metrics for library and information professionals

    CERN Document Server

    Stuart, David

    2014-01-01

    This is a practical guide to using web metrics to measure impact and demonstrate value. The web provides an opportunity to collect a host of different metrics, from those associated with social media accounts and websites to more traditional research outputs. This book is a clear guide for library and information professionals as to what web metrics are available and how to assess and use them to make informed decisions and demonstrate value. As individuals and organizations increasingly use the web in addition to traditional publishing avenues and formats, this book provides the tools to unlock web metrics and evaluate the impact of this content. The key topics covered include: bibliometrics, webometrics and web metrics; data collection tools; evaluating impact on the web; evaluating social media impact; investigating relationships between actors; exploring traditional publications in a new environment; web metrics and the web of data; the future of web metrics and the library and information professional.Th...

  16. Nonlinear Compensation with Modified Adaptive Digital Backpropagation in Flexigrid Networks

    DEFF Research Database (Denmark)

    Porto da Silva, Edson; Asif, Rameez; Larsen, Knud J.

    2015-01-01

    We present a modified version of adaptive digital backpropagation based on the EVM metric, and numerically assess its performance in a flexigrid WDM scenario.

  17. Information Distances versus Entropy Metric

    Directory of Open Access Journals (Sweden)

    Bo Hu

    2017-06-01

    Full Text Available Information distance has become an important tool in a wide variety of applications. Various types of information distance have been proposed over the years. These information distance measures are different from the entropy metric, as the former are based on Kolmogorov complexity and the latter on Shannon entropy. However, for any computable probability distribution, up to a constant, the expected value of Kolmogorov complexity equals the Shannon entropy. We study the similar relationship between entropy and information distance. We also study the relationship between entropy and the normalized versions of information distances.
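
    For orientation, the two notions being compared can be written out explicitly; these are the standard definitions from the literature (normalized information distance via Kolmogorov complexity, and Shannon entropy), not formulas quoted from the article.

    ```latex
    % Normalized information distance between strings $x$ and $y$:
    \[
      \mathrm{NID}(x,y) \;=\; \frac{\max\{K(x \mid y),\, K(y \mid x)\}}{\max\{K(x),\, K(y)\}} .
    \]
    % Shannon entropy of a discrete random variable $X$ with distribution $p$:
    \[
      H(X) \;=\; -\sum_{x} p(x)\,\log_2 p(x),
    \]
    % and, for computable $p$, the expected Kolmogorov complexity matches the
    % entropy up to an additive constant depending on $p$:
    % $\mathbb{E}_p[K(X)] = H(X) + O(1)$.
    ```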

  18. The Metric of Colour Space

    DEFF Research Database (Denmark)

    Gravesen, Jens

    2015-01-01

    The space of colours is a fascinating space. It is a real vector space, but no matter what inner product you put on the space, the resulting Euclidean distance does not correspond to human perception of difference between colours. In 1942 MacAdam performed the first experiments on colour matching and found the MacAdam ellipses, which are often interpreted as defining the metric tensor at their centres. An important question is whether it is possible to define colour coordinates such that the Euclidean distance in these coordinates corresponds to human perception. Using cubic splines to represent

  19. NeatSort - A practical adaptive algorithm

    OpenAIRE

    La Rocca, Marcello; Cantone, Domenico

    2014-01-01

    We present a new adaptive sorting algorithm which is optimal for most disorder metrics and, more importantly, has a simple and quick implementation. On input $X$, our algorithm has a theoretical $\Omega(|X|)$ lower bound and a $\mathcal{O}(|X|\log|X|)$ upper bound, exhibiting amazing adaptive properties which make it run closer to its lower bound as disorder (computed on different metrics) diminishes. From a practical point of view, NeatSort has proven itself competitive with (and of

  20. Metrics for building performance assurance

    Energy Technology Data Exchange (ETDEWEB)

    Koles, G.; Hitchcock, R.; Sherman, M.

    1996-07-01

    This report documents part of the work performed in phase I of a Laboratory Directed Research and Development (LDRD) funded project entitled Building Performance Assurances (BPA). The focus of the BPA effort is to transform the way buildings are built and operated in order to improve building performance by facilitating or providing tools, infrastructure, and information. The efforts described herein focus on the development of metrics with which to evaluate building performance and for which information and optimization tools need to be developed. The classes of building performance metrics reviewed are (1) Building Services, (2) First Costs, (3) Operating Costs, (4) Maintenance Costs, and (5) Energy and Environmental Factors. The first category defines the direct benefits associated with buildings; the next three are different kinds of costs associated with providing those benefits; the last category includes concerns that are broader than direct costs and benefits to the building owner and building occupants. The level of detail of the various issues reflects the current state of knowledge in those scientific areas and the ability to determine that state of knowledge, rather than directly reflecting the importance of these issues; it intentionally does not specifically focus on energy issues. The report describes work in progress and is intended as a resource that can be used to indicate the areas needing more investigation. Other reports on BPA activities are also available.

  1. Quality metrics for detailed clinical models.

    Science.gov (United States)

    Ahn, SunJu; Huff, Stanley M; Kim, Yoon; Kalra, Dipak

    2013-05-01

    To develop quality metrics for detailed clinical models (DCMs) and test their validity. Based on existing quality criteria which did not include formal metrics, we developed quality metrics by applying the ISO/IEC 9126 software quality evaluation model. The face and content validity of the initial quality metrics were assessed by 9 international experts. Content validity was defined as agreement by over 70% of the panelists. To elicit opinions and achieve consensus among the panelists, a two-round Delphi survey was conducted. Valid quality metrics were considered reliable if agreement between two evaluators' assessments of two example DCMs was over 0.60 in terms of the kappa coefficient. After reliability and validity were tested, the final DCM quality metrics were selected. According to the results of the reliability test, the degree of agreement was high (a kappa coefficient of 0.73). Based on the results of the reliability test, 8 quality evaluation domains and 29 quality metrics were finalized as DCM quality metrics. The quality metrics were validated by a panel of international DCM experts. Therefore, we expect that the metrics, which constitute essential qualitative and quantitative quality requirements for DCMs, can be used to support rational decision-making by DCM developers and clinical users. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  2. Fast similarity search for learned metrics.

    Science.gov (United States)

    Kulis, Brian; Jain, Prateek; Grauman, Kristen

    2009-12-01

    We introduce a method that enables scalable similarity search for learned metrics. Given pairwise similarity and dissimilarity constraints between some examples, we learn a Mahalanobis distance function that captures the examples' underlying relationships well. To allow sublinear time similarity search under the learned metric, we show how to encode the learned metric parameterization into randomized locality-sensitive hash functions. We further formulate an indirect solution that enables metric learning and hashing for vector spaces whose high dimensionality makes it infeasible to learn an explicit transformation over the feature dimensions. We demonstrate the approach applied to a variety of image data sets, as well as a systems data set. The learned metrics improve accuracy relative to commonly used metric baselines, while our hashing construction enables efficient indexing with learned distances and very large databases.
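
    A minimal sketch of the general idea, hashing with random hyperplanes drawn in a space transformed by a Mahalanobis matrix M = G^T G, is shown below. The 'learned' matrix here is only a placeholder; the paper's constraint-based metric learning and its implicit high-dimensional formulation are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    d, n_bits, n_points = 16, 32, 1000

    # Placeholder 'learned' Mahalanobis matrix M = G^T G (here: a random SPD matrix).
    A = rng.normal(size=(d, d)) * 0.1 + np.eye(d)
    M = A.T @ A
    G = np.linalg.cholesky(M).T          # so that x^T M y = (Gx)^T (Gy)

    # Random-hyperplane LSH in the transformed space: h_r(x) = sign(r^T G x).
    R = rng.normal(size=(n_bits, d))

    def hash_codes(X):
        return (X @ G.T @ R.T > 0).astype(np.uint8)   # one bit per hyperplane

    X = rng.normal(size=(n_points, d))
    codes = hash_codes(X)

    # Query: rank database points by Hamming distance between hash codes.
    q = rng.normal(size=(1, d))
    ham = (hash_codes(q) != codes).sum(axis=1)
    print("nearest candidates (by Hamming distance):", np.argsort(ham)[:5])
    ```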

  3. Metric approach to quantum constraints

    International Nuclear Information System (INIS)

    Brody, Dorje C; Hughston, Lane P; Gustavsson, Anna C T

    2009-01-01

    A framework for deriving equations of motion for constrained quantum systems is introduced and a procedure for its implementation is outlined. In special cases, the proposed new method, which takes advantage of the fact that the space of pure states in quantum mechanics has both a symplectic structure and a metric structure, reduces to a quantum analogue of the Dirac theory of constraints in classical mechanics. Explicit examples involving spin-1/2 particles are worked out in detail: in the first example, our approach coincides with a quantum version of the Dirac formalism, while the second example illustrates how a situation that cannot be treated by Dirac's approach can nevertheless be dealt with in the present scheme.

  4. Metrics Are Needed for Collaborative Software Development

    OpenAIRE

    Mojgan Mohtashami; Cyril S. Ku; Thomas J. Marlowe

    2011-01-01

    There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitabili...

  5. A Metric Observer for Induction Motors Control

    Directory of Open Access Journals (Sweden)

    Mohamed Benbouzid

    2016-01-01

    Full Text Available This paper deals with metric observer application for induction motors. Firstly, assuming that stator currents and speed are measured, a metric observer is designed to estimate the rotor fluxes. Secondly, assuming that only stator currents are measured, another metric observer is derived to estimate rotor fluxes and speed. The proposed observer's validity is checked through simulations on a 4 kW induction motor drive.

  6. The definitive guide to IT service metrics

    CERN Document Server

    McWhirter, Kurt

    2012-01-01

    Used just as they are, the metrics in this book will bring many benefits to both the IT department and the business as a whole. Details of the attributes of each metric are given, enabling you to make the right choices for your business. You may prefer and are encouraged to design and create your own metrics to bring even more value to your business - this book will show you how to do this, too.

  7. Chaotic inflation with metric and matter perturbations

    International Nuclear Information System (INIS)

    Feldman, H.A.; Brandenberger, R.H.

    1989-01-01

    A perturbative scheme to analyze the evolution of both metric and scalar field perturbations in an expanding universe is developed. The scheme is applied to study chaotic inflation with initial metric and scalar field perturbations present. It is shown that initial gravitational perturbations with wavelength smaller than the Hubble radius rapidly decay. The metric simultaneously picks up small perturbations determined by the matter inhomogeneities. Both are frozen in once the wavelength exceeds the Hubble radius. (orig.)

  8. Semantic Metrics for Object Oriented Design

    Science.gov (United States)

    Etzkorn, Lethe

    2003-01-01

    The purpose of this proposal is to research a new suite of object-oriented (OO) software metrics, called semantic metrics, that have the potential to help software engineers identify fragile, low-quality code sections much earlier in the development cycle than is possible with traditional OO metrics. With earlier and better fault detection, software maintenance will be less time consuming and expensive, and software reusability will be improved. Because it is less costly to correct faults found earlier than to correct faults found later in the software lifecycle, the overall cost of software development will be reduced. Semantic metrics can be derived from the knowledge base of a program understanding system. A program understanding system is designed to understand a software module. Once understanding is complete, the knowledge base contains digested information about the software module. Various semantic metrics can be collected on the knowledge base. This new kind of metric measures domain complexity, or the relationship of the software to its application domain, rather than implementation complexity, which is what traditional software metrics measure. A semantic metric will thus map much more closely to qualities humans are interested in, such as cohesion and maintainability, than is possible using traditional metrics, which are calculated using only syntactic aspects of software.

  9. About the possibility of a generalized metric

    International Nuclear Information System (INIS)

    Lukacs, B.; Ladik, J.

    1991-10-01

    The metric (the structure of the space-time) may be dependent on the properties of the object measuring it. The case of size dependence of the metric was examined. For this dependence, the simplest possible form of the metric tensor has been constructed which fulfils the following requirements: there be two extremal characteristic scales; the metric be unique and the usual one between them; the change be sudden in the neighbourhood of these scales; and the size of the human body appear as a parameter (postulated on the basis of some philosophical arguments). Estimates have been made for the two extremal length scales according to existing observations. (author) 19 refs

  10. Bounds for phylogenetic network space metrics.

    Science.gov (United States)

    Francis, Andrew; Huber, Katharina T; Moulton, Vincent; Wu, Taoyang

    2018-04-01

    Phylogenetic networks are a generalization of phylogenetic trees that allow for representation of reticulate evolution. Recently, a space of unrooted phylogenetic networks was introduced, where such a network is a connected graph in which every vertex has degree 1 or 3 and whose leaf-set is a fixed set X of taxa. This space, denoted [Formula: see text], is defined in terms of two operations on networks-the nearest neighbor interchange and triangle operations-which can be used to transform any network with leaf set X into any other network with that leaf set. In particular, it gives rise to a metric d on [Formula: see text] which is given by the smallest number of operations required to transform one network in [Formula: see text] into another in [Formula: see text]. The metric generalizes the well-known NNI-metric on phylogenetic trees which has been intensively studied in the literature. In this paper, we derive a bound for the metric d as well as a related metric [Formula: see text] which arises when restricting d to the subset of [Formula: see text] consisting of all networks with [Formula: see text] vertices, [Formula: see text]. We also introduce two new metrics on networks-the SPR and TBR metrics-which generalize the metrics on phylogenetic trees with the same name and give bounds for these new metrics. We expect our results to eventually have applications to the development and understanding of network search algorithms.

  11. Kerr-Schild metrics revisited. Pt. 1

    International Nuclear Information System (INIS)

    Gergely, L.A.; Perjes, Z.

    1993-04-01

    The particular way Kerr-Schild metrics incorporate a congruence of null curves in space-time is a sure source of fascination. The Kerr-Schild pencil of metrics g_ab + Δ l_a l_b is investigated in the generic case when it maps an arbitrary vacuum space-time with metric g_ab to a vacuum space-time. The theorem is proved that this generic case does not contain the shear-free subclass as a smooth limit. It is shown that one of the Kota-Perjes metrics is a solution in the shearing class. (R.P.) 15 refs

  12. Comparing Evaluation Metrics for Sentence Boundary Detection

    National Research Council Canada - National Science Library

    Liu, Yang; Shriberg, Elizabeth

    2007-01-01

    .... This paper compares alternative evaluation metrics including the NIST error rate, classification error rate per word boundary, precision and recall, ROC curves, DET curves, precision-recall curves...

  13. Distance in Metric Trees and Banach Spaces

    Science.gov (United States)

    Alansari, Monairah

    This thesis contains results on metric trees and Banach spaces. There is a common thread, which is the distance function. In the case of metric trees, special metrics such as the radial and river metrics yield characterization theorems. In the case of Banach spaces we consider the distance from a point in the Banach space to a subspace and, by putting conditions on subspaces, we obtain results on the speed of convergence of the error of best approximation. We first introduce the concept of metric trees, study some of their properties, and provide a new representation of metric trees by using a special set of metric rays, which we call "crossing point sets". We capture the four-point condition from these sets and show an equivalence between metric trees with the radial and river metrics and the crossing point sets. As an application of our characterization of metric trees via crossing point sets, we were able to index Brownian motions by a metric tree. The second part of this thesis contains results on the error of best approximation in the context of Banach spaces. The error of the best approximation to x via S is denoted by rho(x, S), defined as rho(x, S) = inf{d(x, y) : y in S}. Note that the well-known Weierstrass approximation theorem states that every continuous function defined on a closed interval [a, b] can be uniformly approximated by a polynomial function. The Weierstrass approximation theorem, however, gives no information about the speed of convergence of rho(f, Y_n). The Bernstein Lethargy Theorem (BLT), by contrast, is about the speed of convergence of rho(f, Y_n). We consider a condition on subspaces in order to improve the bounds given in Bernstein's Lethargy Theorem for Banach spaces.

  14. Simple emission metrics for climate impacts

    Directory of Open Access Journals (Sweden)

    B. Aamaas

    2013-06-01

    Full Text Available In the context of climate change, emissions of different species (e.g., carbon dioxide and methane) are not directly comparable since they have different radiative efficiencies and lifetimes. Since comparisons via detailed climate models are computationally expensive and complex, emission metrics were developed to allow a simple and straightforward comparison of the estimated climate impacts of emissions of different species. Emission metrics are not unique and a variety of different emission metrics has been proposed, with key choices being the climate impacts and the time horizon to use for comparisons. In this paper, we present analytical expressions and describe how to calculate common emission metrics for different species. We include the climate metrics radiative forcing, integrated radiative forcing, temperature change and integrated temperature change, in both absolute form and normalised to a reference gas. We consider pulse emissions, sustained emissions and emission scenarios. The species are separated into three types: CO2, which has a complex decay over time; species with a simple exponential decay; and ozone precursors (NOx, CO, VOC), which indirectly affect climate via various chemical interactions. We also discuss deriving impulse response functions, radiative efficiency, regional dependencies, consistency within and between metrics, and uncertainties. We perform various applications to highlight key uses of emission metrics, which show that emissions of CO2 are important regardless of what metric and time horizon are used, but that the importance of short-lived climate forcers varies greatly depending on the metric choices made. Further, the ranking of countries by emissions changes very little with different metrics despite large differences in metric values, except for the shortest time horizons (GWP20).
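
    As an example of the kind of analytical expression involved, the familiar pulse-emission metric normalised to CO2, the global warming potential, can be written as follows; this is the standard textbook form, shown for orientation rather than copied from the paper.

    ```latex
    % Absolute global warming potential (AGWP) of species $i$ for time horizon $H$,
    % with radiative efficiency $A_i$ and impulse response function $R_i(t)$:
    \[
      \mathrm{AGWP}_i(H) = \int_0^H A_i\, R_i(t)\, \mathrm{d}t,
      \qquad
      \mathrm{GWP}_i(H) = \frac{\mathrm{AGWP}_i(H)}{\mathrm{AGWP}_{\mathrm{CO_2}}(H)} .
    \]
    % For a species with a simple exponential decay, $R_i(t) = e^{-t/\tau_i}$, so
    % $\mathrm{AGWP}_i(H) = A_i\,\tau_i\,\bigl(1 - e^{-H/\tau_i}\bigr)$.
    ```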

  15. Using Genetic Algorithms for Building Metrics of Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Cristian CIUREA

    2011-01-01

    Full Text Available The paper objective is to reveal the importance of genetic algorithms in building robust metrics of collaborative systems. The main types of collaborative systems in the economy are presented and some characteristics of genetic algorithms are described. A genetic algorithm was implemented in order to determine the local maximum and minimum points of the relative complexity function associated to a collaborative banking system. The intelligent collaborative systems based on genetic algorithms, representing the new generation of collaborative systems, are analyzed and the implementation of auto-adaptive interfaces in a banking application is described.
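
    The paper's relative complexity function for a collaborative banking system is not public, so the sketch below only shows the generic genetic-algorithm loop such an analysis relies on, applied to a stand-in one-dimensional objective to locate an approximate local maximum. Population size, operators, and the objective are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def relative_complexity(x):
        """Stand-in objective; the paper's banking-system complexity function is not public."""
        return np.sin(3 * x) + 0.3 * np.cos(7 * x)

    def genetic_maximize(f, lo=0.0, hi=5.0, pop_size=40, generations=100,
                         mutation_scale=0.1):
        pop = rng.uniform(lo, hi, pop_size)
        for _ in range(generations):
            fitness = f(pop)
            # Selection: keep the better half of the population.
            parents = pop[np.argsort(fitness)[-pop_size // 2:]]
            # Crossover: average random pairs of parents.
            mates = rng.permutation(parents)
            children = (parents + mates) / 2
            # Mutation: small Gaussian perturbations, clipped to the search interval.
            children += rng.normal(0, mutation_scale, children.size)
            pop = np.clip(np.concatenate([parents, children]), lo, hi)
        best = pop[np.argmax(f(pop))]
        return best, f(best)

    x_star, f_star = genetic_maximize(relative_complexity)
    print(f"approximate local maximum at x = {x_star:.3f}, f(x) = {f_star:.3f}")
    ```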

  16. On the relation of the generalized Schwarzschild metric and Tolman metric

    International Nuclear Information System (INIS)

    Sharshekeev, O.Sh.

    1977-01-01

    The relation of the generalized Schwarzschild metric (the Schwarzschild metric with regard for the four-dimensional curvature tensor) to the Tolman metric is considered. It is shown that the Schwarzschild problem solution in the Tolman metric is quite correct as well. The obtained solutions meet the following requirements: the correspondence principle is fulfilled, and the transformation functional determinant is finite everywhere, excluding the centre, where a singular point is to be

  17. Optimal Transportation and Curvature of Metric Spaces

    OpenAIRE

    Eskin, Thomas

    2013-01-01

    In this thesis we study the notion of non-negative Ricci curvature for compact metric measure spaces introduced by Lott and Villani in their article (2009): Ricci curvature for metric measure spaces via optimal transport. We also define and prove the required prerequisites concerning length spaces, convex analysis, measure theory, and optimal transportation.

  18. Enhancing Authentication Models Characteristic Metrics via ...

    African Journals Online (AJOL)

    In this work, we derive the universal characteristic metrics set for authentication models based on security, usability and design issues. We then compute the probability of the occurrence of each characteristic metrics in some single factor and multifactor authentication models in order to determine the effectiveness of these ...

  19. Quantitative metric theory of continued fractions

    Indian Academy of Sciences (India)

    ... (log log n)^{1/2+ε}) almost everywhere with respect to the Lebesgue measure. Keywords: continued fractions; ergodic averages; metric theory of numbers. Mathematics Subject Classification: Primary 11K50; Secondary 28D99. In this paper, we use a quantitative L^2-ergodic theorem to study the metrical ...

  20. Metric solution of a spinning mass

    International Nuclear Information System (INIS)

    Sato, H.

    1982-01-01

    Studies on a particular class of asymptotically flat and stationary metric solutions, called the Kerr-Tomimatsu-Sato class, are reviewed with regard to their derivation and properties. For further study, an almost complete list of the papers on the Tomimatsu-Sato metrics is given. (Auth.)

  1. Finite Metric Spaces of Strictly Negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul; Lisonek, P.; Markvorsen, Steen

    1998-01-01

    We prove that, if a finite metric space is of strictly negative type, then its transfinite diameter is uniquely realized by the infinite extender (load vector). Finite metric spaces that have this property include all spaces on two, three, or four points, all trees, and all finite subspaces of Eu...

  2. Gravitational Metric Tensor Exterior to Rotating Homogeneous ...

    African Journals Online (AJOL)

    The covariant and contravariant metric tensors exterior to a homogeneous spherical body rotating uniformly about a common φ axis with constant angular velocity ω are constructed. The constructed metric tensors in this gravitational field have seven non-zero distinct components. The Lagrangian for this gravitational field is ...

  3. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    In this paper, we construct an invariant metric in the space of homogeneous polynomials of a given degree (≥ 3). The homogeneous polynomials specify a nonlinear symplectic map which in turn represents a Hamiltonian system. By minimizing the norm constructed out of this metric as a function of system parameters, we ...

  4. On Information Metrics for Spatial Coding.

    Science.gov (United States)

    Souza, Bryan C; Pavão, Rodrigo; Belchior, Hindiael; Tort, Adriano B L

    2018-04-01

    The hippocampal formation is involved in navigation, and its neuronal activity exhibits a variety of spatial correlates (e.g., place cells, grid cells). The quantification of the information encoded by spikes has been standard procedure to identify which cells have spatial correlates. For place cells, most of the established metrics derive from Shannon's mutual information (Shannon, 1948), and convey information rate in bits/s or bits/spike (Skaggs et al., 1993, 1996). Despite their widespread use, the performance of these metrics in relation to the original mutual information metric has never been investigated. In this work, using simulated and real data, we find that the current information metrics correlate less with the accuracy of spatial decoding than the original mutual information metric. We also find that the top informative cells may differ among metrics, and show a surrogate-based normalization that yields comparable spatial information estimates. Since different information metrics may identify different neuronal populations, we discuss current and alternative definitions of spatially informative cells, which affect the metric choice. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
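
    The most widely used of the metrics discussed, the Skaggs et al. (1993) spatial information measure, has the following standard form; it is reproduced from the general literature for orientation, not quoted from this paper.

    ```latex
    % Spatial information rate (bits/s) for a cell with mean firing rate $\lambda_i$
    % in spatial bin $i$, occupancy probability $p_i$, and overall mean rate
    % $\bar{\lambda} = \sum_i p_i \lambda_i$:
    \[
      I = \sum_i p_i\, \lambda_i \log_2\!\frac{\lambda_i}{\bar{\lambda}} ,
      \qquad
      I_{\mathrm{spike}} = \frac{I}{\bar{\lambda}}
      \quad \text{(bits per spike).}
    \]
    ```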

  5. Fixed point theory in metric type spaces

    CERN Document Server

    Agarwal, Ravi P; O’Regan, Donal; Roldán-López-de-Hierro, Antonio Francisco

    2015-01-01

    Written by a team of leading experts in the field, this volume presents a self-contained account of the theory, techniques and results in metric type spaces (in particular in G-metric spaces); that is, the text approaches this important area of fixed point analysis beginning from the basic ideas of metric space topology. The text is structured so that it leads the reader from preliminaries and historical notes on metric spaces (in particular G-metric spaces) and on mappings, to Banach type contraction theorems in metric type spaces, fixed point theory in partially ordered G-metric spaces, fixed point theory for expansive mappings in metric type spaces, generalizations, present results and techniques in a very general abstract setting and framework. Fixed point theory is one of the major research areas in nonlinear analysis. This is partly due to the fact that in many real world problems fixed point theory is the basic mathematical tool used to establish the existence of solutions to problems which arise natur...

  6. Software Power Metric Model: An Implementation | Akwukwuma ...

    African Journals Online (AJOL)

    ... and the execution time (TIME) in each case was recorded. We then obtained the application's function point count. Our results show that the proposed metric is computable, consistent in its use of units, and programming language independent. Keywords: Software attributes, Software power, measurement, Software metric, ...

  7. Validation of Metrics for Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2008-01-01

    Full Text Available This paper describes new concepts for the validation of collaborative systems metrics. The paper defines the quality characteristics of collaborative systems. A metric is proposed to estimate the quality level of collaborative systems. Measurements of collaborative systems quality are performed using specially designed software.

  8. Fuzzy Set Field and Fuzzy Metric

    OpenAIRE

    Gebray, Gebru; Reddy, B. Krishna

    2014-01-01

    The notion of a fuzzy set field is introduced. A fuzzy metric is redefined on a fuzzy set field and on an arbitrary fuzzy set in a field. The redefined metric is between fuzzy points and captures both the fuzziness and the crisp properties of a vector. In addition, a fuzzy magnitude of a fuzzy point in a field is defined.

  9. The metrics of science and technology

    CERN Document Server

    Geisler, Eliezer

    2000-01-01

    Dr. Geisler's far-reaching, unique book provides an encyclopedic compilation of the key metrics to measure and evaluate the impact of science and technology on academia, industry, and government. Focusing on such items as economic measures, patents, peer review, and other criteria, and supported by an extensive review of the literature, Dr. Geisler gives a thorough analysis of the strengths and weaknesses inherent in metric design, and in the use of the specific metrics he cites. His book has already received prepublication attention, and will prove especially valuable for academics in technology management, engineering, and science policy; industrial R&D executives and policymakers; government science and technology policymakers; and scientists and managers in government research and technology institutions. Geisler maintains that the application of metrics to evaluate science and technology at all levels illustrates the variety of tools we currently possess. Each metric has its own unique strengths and...

  10. Some Properties of Metric Polytope Constraints

    Directory of Open Access Journals (Sweden)

    V. A. Bondarenko

    2014-01-01

    Full Text Available The integrality recognition problem is considered on the sequence M_{n,k} of nested Boolean quadric polytope relaxations, including the rooted semimetric polytope M_n and the metric polytope M_{n,3}. Constraints of the metric polytope cut off all faces of the rooted semimetric polytope containing only fractional vertices, which allows the problem of integrality recognition on M_n to be solved in polynomial time. To solve the problem of integrality recognition on the metric polytope, we consider the possibility of cutting off all fractional faces of M_{n,3} by some relaxation M_{n,k}. We represent the coordinates of the metric polytope in a homogeneous form by a three-dimensional block matrix. We show that, to answer the question of cutting off the fractional faces of the metric polytope, it is sufficient to consider only constraints of the triangle inequality form.
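
    For reference, the triangle-inequality constraints that define the metric polytope have the standard homogeneous form below; this is the usual textbook description rather than the paper's block-matrix notation.

    ```latex
    % Metric polytope constraints on variables $x_{ij}$, for all distinct $i, j, k$:
    \begin{align*}
      x_{ij} - x_{ik} - x_{jk} &\le 0 && \text{(triangle inequalities)} \\
      x_{ij} + x_{ik} + x_{jk} &\le 2 && \text{(perimeter inequalities)}
    \end{align*}
    ```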

  11. Smart Grid Status and Metrics Report Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Antonopoulos, Chrissi A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clements, Samuel L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gorrissen, Willy J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ruiz, Kathleen A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Smith, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gardner, Chris [APQC, Houston, TX (United States); Varney, Jeff [APQC, Houston, TX (United States)

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  12. Metrics for border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  13. Bounded Linear Stability Margin Analysis of Nonlinear Hybrid Adaptive Control

    Science.gov (United States)

    Nguyen, Nhan T.; Boskovic, Jovan D.

    2008-01-01

    This paper presents a bounded linear stability analysis for a hybrid adaptive control that blends both direct and indirect adaptive control. Stability and convergence of nonlinear adaptive control are analyzed using an approximate linear equivalent system. A stability margin analysis shows that a large adaptive gain can lead to a reduced phase margin. This method can enable metrics-driven adaptive control whereby the adaptive gain is adjusted to meet stability margin requirements.

  14. Similarity metrics for surgical process models.

    Science.gov (United States)

    Neumuth, Thomas; Loebe, Frank; Jannin, Pierre

    2012-01-01

    The objective of this work is to introduce a set of similarity metrics for comparing surgical process models (SPMs). SPMs are progression models of surgical interventions that support quantitative analyses of surgical activities, supporting systems engineering or process optimization. Five different similarity metrics are presented and proven. These metrics deal with several dimensions of process compliance in surgery, including granularity, content, time, order, and frequency of surgical activities. The metrics were experimentally validated using 20 clinical data sets each for cataract interventions, craniotomy interventions, and supratentorial tumor resections. The clinical data sets were controllably modified in simulations, which were iterated ten times, resulting in a total of 600 simulated data sets. The simulated data sets were subsequently compared to the original data sets to empirically assess the predictive validity of the metrics. We show that the results of the metrics for the surgical process models correlate significantly with the degree of modification and that the metrics meet predictive validity. The clinical use of the metrics was exemplified by assessing the learning curves of observers during surgical process model acquisition. Measuring similarity between surgical processes is a complex task. However, metrics for computing the similarity between surgical process models are needed in many uses in the field of medical engineering. These metrics are essential whenever two SPMs need to be compared, such as during the evaluation of technical systems, the education of observers, or the determination of surgical strategies. These metrics are key figures that provide a solid base for medical decisions, such as during validation of sensor systems for use in operating rooms in the future. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Families of quasi-pseudo-metrics generated by probabilistic quasi-pseudo-metric spaces

    Directory of Open Access Journals (Sweden)

    Mariusz T. Grabiec

    2008-03-01

    Full Text Available This paper contains a study of families of quasi-pseudo-metrics (the concept of a quasi-pseudo-metric was introduced by Wilson (1931 , Albert (1941 and Kelly (1963 generated by probabilistic quasi-pseudo-metric-spaces which are generalization of probabilistic metric space (PM-space shortly [2, 3, 4, 6]. The idea of PM-spaces was introduced by Menger (1942, 1951, Schweizer and Sklar (1983 and Serstnev (1965. Families of pseudo-metrics generated by PM-spaces and those generalizing PM-spaces have been described by Stevens (1968 and Nishiure (1970.

  16. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei

    2014-06-22

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject’s behaviors through dynamical clustering of the streaming data. It automatically labels the data and adapts to normal behavior changes while identifying anomalies. Two large real HTTP traffic streams collected in our institute as well as a set of benchmark KDD’99 data are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loeve method and static AP as well as three other static anomaly detection methods, namely, k-NN, PCA and SVM.
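    As a rough illustration of the clustering step assumed above, the sketch below runs scikit-learn's Affinity Propagation on synthetic request features and treats members of unusually small clusters as candidate anomalies; the self-labeling, self-updating, and streaming machinery of the framework is deliberately omitted, and the data and 3% threshold are invented.

        import numpy as np
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(0)
        normal = rng.normal(loc=0.0, scale=1.0, size=(200, 5))   # bulk of the traffic
        rare = rng.normal(loc=6.0, scale=0.5, size=(5, 5))       # a small burst of outliers
        X = np.vstack([normal, rare])

        ap = AffinityPropagation(damping=0.9, random_state=0).fit(X)
        sizes = np.bincount(ap.labels_)

        # Members of unusually small clusters become candidate anomalies that a
        # full framework would queue for labeling and model updating.
        candidates = np.flatnonzero(sizes[ap.labels_] < 0.03 * len(X))
        print("cluster sizes:", sizes, "candidate anomalies:", candidates)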

  17. Fighter agility metrics. M.S. Thesis

    Science.gov (United States)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  18. SAPHIRE 8 Quality Assurance Software Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros

    2011-08-01

    The purpose of this review of software metrics is to examine the quality of the metrics gathered in the 2010 IV&V and to set an outline for results of updated metrics runs to be performed. We find from the review that the maintenance of accepted quality standards presented in the SAPHIRE 8 initial Independent Verification and Validation (IV&V) of April, 2010 is most easily achieved by continuing to utilize the tools used in that effort while adding a metric of bug tracking and resolution. Recommendations from the final IV&V were to continue periodic measurable metrics such as McCabe's complexity measure to ensure quality is maintained. The four software tools used to measure quality in the IV&V were CodeHealer, Coverage Validator, Memory Validator, Performance Validator, and Thread Validator. These are evaluated based on their capabilities. We attempted to run their latest revisions with the newer Delphi 2010 based SAPHIRE 8 code that has been developed and was successful with all of the Validator series of tools on small tests. Another recommendation from the IV&V was to incorporate a bug tracking and resolution metric. To improve our capability of producing this metric, we integrated our current web reporting system with the SpiraTest test management software purchased earlier this year to track requirements traceability.

  19. An Underwater Color Image Quality Evaluation Metric.

    Science.gov (United States)

    Yang, Miao; Sowmya, Arcot

    2015-12-01

    Quality evaluation of underwater images is a key goal of underwater video image retrieval and intelligent processing. To date, no metric has been proposed for underwater color image quality evaluation (UCIQE). The special absorption and scattering characteristics of the water medium do not allow direct application of natural color image quality metrics, especially to different underwater environments. In this paper, subjective testing for underwater image quality has been organized. The statistical distribution of the underwater image pixels in the CIELab color space related to subjective evaluation indicates that the sharpness and colorfulness factors correlate well with subjective image quality perception. Based on these, a new UCIQE metric, which is a linear combination of chroma, saturation, and contrast, is proposed to quantify the non-uniform color cast, blurring, and low contrast that characterize underwater engineering and monitoring images. Experiments are conducted to illustrate the performance of the proposed UCIQE metric and its capability to measure underwater image enhancement results. They show that the proposed metric has comparable performance to the leading natural color image quality metrics and the underwater grayscale image quality metrics available in the literature, and can predict with higher accuracy the relative amount of degradation with similar image content in underwater environments. Importantly, UCIQE is a simple and fast solution for real-time underwater video processing. The effectiveness of the presented measure is also demonstrated by subjective evaluation. The results show better correlation between the UCIQE and the subjective mean opinion score.
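    The linear combination described above can be sketched as follows. The weights are placeholders rather than the coefficients fitted in the paper, the three statistics (chroma standard deviation, luminance contrast, mean saturation) are one plausible reading of the components, and scikit-image is assumed for the colour-space conversions.

        import numpy as np
        from skimage import color


        def uciqe_like(rgb, w_chroma=0.47, w_contrast=0.27, w_sat=0.26):
            """Weighted sum of chroma spread, luminance contrast, and mean saturation."""
            lab = color.rgb2lab(rgb)
            luminance = lab[..., 0]
            chroma = np.hypot(lab[..., 1], lab[..., 2])
            saturation = color.rgb2hsv(rgb)[..., 1]

            sigma_chroma = chroma.std()
            # Contrast of luminance: spread between the brightest and darkest 1%.
            con_l = np.percentile(luminance, 99) - np.percentile(luminance, 1)
            return w_chroma * sigma_chroma + w_contrast * con_l + w_sat * saturation.mean()


        frame = np.random.rand(64, 64, 3)   # stand-in for an underwater video frame
        print(uciqe_like(frame))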

  20. Optimal Lyapunov metrics of expansive homeomorphisms

    International Nuclear Information System (INIS)

    Dovbysh, S A

    2006-01-01

    We sharpen the following results of Reddy, Sakai and Fried: any expansive homeomorphism of a metrizable compactum admits a Lyapunov metric compatible with the topology, and if we also assume the existence of a local product structure (that is, if the homeomorphism is an A*-homeomorphism in the terminology of Alekseev and Yakobson, or possesses hyperbolic canonical coordinates in the terminology of Bowen, or together with the metric compactum constitutes a Smale space in the terminology of Ruelle), then we also obtain the validity of Ruelle's technical axiom on the Lipschitz property of the homeomorphism, its inverse, and the local product structure. It is shown that any expansive homeomorphism admits a Lyapunov metric such that the homeomorphism on local stable (resp. unstable) 'manifolds' is approximately representable on a small scale as a contraction (resp. expansion) with constant coefficient λ_s (resp. λ_u^{-1}) in this metric. For A*-homeomorphisms, we prove that the desired metric can be approximately represented on a small scale as the direct sum of metrics corresponding to the canonical coordinates determined by the local product structure and that local 'manifolds' are 'flat' in some sense. It is also proved that the lower bounds for the contraction constants λ_s and expansion constants λ_u of A*-homeomorphisms are attained simultaneously for some metric that satisfies all the conditions described.

  1. Characterising risk - aggregated metrics: radiation and noise

    International Nuclear Information System (INIS)

    Passchier, W.

    1998-01-01

    The characterisation of risk is an important phase in the risk assessment - risk management process. From the multitude of risk attributes a few have to be selected to obtain a risk characteristic or profile that is useful for risk management decisions and implementation of protective measures. One way to reduce the number of attributes is aggregation. In the field of radiation protection such an aggregated metric is firmly established: effective dose. For protection against environmental noise the Health Council of the Netherlands recently proposed a set of aggregated metrics for noise annoyance and sleep disturbance. The presentation will discuss similarities and differences between these two metrics and practical limitations. The effective dose has proven its usefulness in designing radiation protection measures, which are related to the level of risk associated with the radiation practice in question, given that implicit judgements on radiation induced health effects are accepted. However, as the metric does not take into account the nature of radiation practice, it is less useful in policy discussions on the benefits and harm of radiation practices. With respect to the noise exposure metric, only one effect is targeted (annoyance), and the differences between sources are explicitly taken into account. This should make the metric useful in policy discussions with respect to physical planning and siting problems. The metric proposed has only significance on a population level, and can not be used as a predictor for individual risk. (author)

  2. Metrics required for Power System Resilient Operations and Protection

    Energy Technology Data Exchange (ETDEWEB)

    Eshghi, K.; Johnson, B. K.; Rieger, C. G.

    2016-08-01

    Today’s complex grid involves many interdependent systems. Various layers of hierarchical control and communication systems are coordinated, both spatially and temporally, to achieve grid reliability. As new communication network based control system technologies are being deployed, the interconnected nature of these systems is becoming more complex. Deployment of smart grid concepts promises effective integration of renewable resources, especially if combined with energy storage. However, without a philosophical focus on resilience, a smart grid will potentially lead to higher magnitude and/or duration of disruptive events. The effectiveness of a resilient infrastructure depends upon its ability to anticipate, absorb, adapt to, and/or rapidly recover from a potentially catastrophic event. Future system operations can be enhanced with a resilient philosophy through architecting the complexity with state awareness metrics that recognize changing system conditions and provide for an agile and adaptive response. The starting point for metrics lies in first understanding the attributes of performance that will be qualified. In this paper, we will overview those attributes and describe how they will be characterized by designing a distributed agent that can be applied to the power grid.

  3. Metrics Are Needed for Collaborative Software Development

    Directory of Open Access Journals (Sweden)

    Mojgan Mohtashami

    2011-10-01

    Full Text Available There is a need for metrics for inter-organizational collaborative software development projects, encompassing management and technical concerns. In particular, metrics are needed that are aimed at the collaborative aspect itself, such as readiness for collaboration, the quality and/or the costs and benefits of collaboration in a specific ongoing project. We suggest questions and directions for such metrics, spanning the full lifespan of a collaborative project, from considering the suitability of collaboration through evaluating ongoing projects to final evaluation of the collaboration.

  4. On metrics and super-Riemann surfaces

    International Nuclear Information System (INIS)

    Hodgkin, L.

    1987-01-01

    It is shown that any super-Riemann surface M admits a large space of metrics (in a rather basic sense); while if M is of compact genus g type, g>1, M admits a unique metric whose lift to the universal cover is superconformally equivalent to the standard (Baranov-Shvarts) metric on the super-half plane. This explains the relation between the different methods of calculation of the super-Teichmueller space by the author (using arbitrary superconformal transformations) and Crane and Rabin (using only isometries). (orig.)

  5. Finite Metric Spaces of Strictly negative Type

    DEFF Research Database (Denmark)

    Hjorth, Poul G.

    If a finite metric space is of strictly negative type then its transfinite diameter is uniquely realized by an infinite extent ("load vector"). Finite metric spaces that have this property include all trees, and all finite subspaces of Euclidean and Hyperbolic spaces. We prove that if the distance matrix of a finite metric space is both hypermetric and regular, then it is of strictly negative type. We show that the strictly negative type finite subspaces of spheres are precisely those which do not contain two pairs of antipodal points.

  6. Applying Sigma Metrics to Reduce Outliers.

    Science.gov (United States)

    Litten, Joseph

    2017-03-01

    Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
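    The sigma metric referred to above is conventionally computed as (allowable total error − |bias|) / CV, with all quantities in percent; the numbers below are made-up examples, not recommendations.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Sigma metric = (allowable total error - |bias|) / coefficient of variation."""
            return (tea_pct - abs(bias_pct)) / cv_pct


        print(sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=1.4))   # ~6.1, a "Six Sigma" assay
        print(sigma_metric(tea_pct=10.0, bias_pct=3.0, cv_pct=2.5))   # ~2.8, needs tighter QC rules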

  7. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and ProcessesReflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems.New to the Third EditionThis edition contains new material relevant

  8. Learnometrics: Metrics for Learning Objects (Learnometrics: metrieken voor leerobjecten)

    OpenAIRE

    Ochoa, Xavier

    2008-01-01

    - Introduction - Quantitative Analysis of the Publication of Learning Objects - Quantitative Analysis of the Reuse of Learning Objects - Metadata Quality Metrics for Learning Objects - Relevance Ranking Metrics for Learning Objects - Metrics Service Architecture and Use Cases - Conclusions

  9. MPLS/VPN traffic engineering: SLA metrics

    Science.gov (United States)

    Cherkaoui, Omar; MacGibbon, Brenda; Blais, Michel; Serhrouchni, Ahmed

    2001-07-01

    Traffic engineering must be concerned with a broad definition of service that includes network availability, reliability and stability, as well as traditional traffic data on loss, throughput, delay and jitter. MPLS and Virtual Private Networks (VPNs) significantly contribute to security and Quality of Service (QoS) within communication networks, but there remains a need for metric measurement and evaluation. The purpose of this paper is to propose a methodology which gives a measure for LSP (Label Switching Path) metrics in VPN MPLS networks. We propose here a statistical method for the evaluation of those metrics. Statistical methodology is very important in this type of study since there is a large amount of data to consider. We use the notions of sample surveys, self-similar processes, linear regression, additive models and bootstrapping. The results obtained allow us to estimate the different metrics for such SLAs.
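    Because the paper leans on sampling and bootstrapping, a minimal bootstrap sketch for one SLA metric (mean one-way delay on an LSP) is shown below; the delay samples are synthetic stand-ins for probe measurements.

        import numpy as np

        rng = np.random.default_rng(1)
        delays_ms = rng.gamma(shape=4.0, scale=5.0, size=500)   # hypothetical probe data

        # Resample with replacement to obtain a bootstrap distribution of the mean.
        boot_means = np.array([
            rng.choice(delays_ms, size=delays_ms.size, replace=True).mean()
            for _ in range(2000)
        ])
        low, high = np.percentile(boot_means, [2.5, 97.5])
        print(f"mean delay {delays_ms.mean():.1f} ms, 95% CI ({low:.1f}, {high:.1f}) ms")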

  10. Einstein metrics on tangent bundles of spheres

    Energy Technology Data Exchange (ETDEWEB)

    Dancer, Andrew S [Jesus College, Oxford University, Oxford OX1 3DW (United Kingdom); Strachan, Ian A B [Department of Mathematics, University of Hull, Hull HU6 7RX (United Kingdom)

    2002-09-21

    We give an elementary treatment of the existence of complete Kaehler-Einstein metrics with nonpositive Einstein constant and underlying manifold diffeomorphic to the tangent bundle of the (n+1)-sphere.

  11. Medicare Contracting - Redacted Benchmark Metric Reports

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...

  12. Metric Guidelines Inservice and/or Preservice

    Science.gov (United States)

    Granito, Dolores

    1978-01-01

    Guidelines are given for designing teacher training for going metric. The guidelines were developed from existing guidelines, journal articles, a survey of colleges, and the detailed reactions of a panel. (MN)

  13. Variational principles for amenable metric mean dimensions

    OpenAIRE

    Chen, Ercai; Dou, Dou; Zheng, Dongmei

    2017-01-01

    In this paper, we prove variational principles between metric mean dimension and rate distortion function for countable discrete amenable group actions, which extend recent results of Lindenstrauss and Tsukamoto.

  14. Science and Technology Metrics and Other Thoughts

    National Research Council Canada - National Science Library

    Harman, Wayne; Staton, Robin

    2006-01-01

    This report explores the subject of science and technology metrics and other topics to begin to provide Navy managers, as well as scientists and engineers, additional tools and concepts with which to...

  15. Clean Cities Annual Metrics Report 2009 (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  16. Flight Crew State Monitoring Metrics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — eSky will develop specific crew state metrics based on the timeliness, tempo and accuracy of pilot inputs required by the H-mode Flight Control System (HFCS)....

  17. Performance metrics used by freight transport providers.

    Science.gov (United States)

    2008-09-30

    The newly-established National Cooperative Freight Research Program (NCFRP) has allocated $300,000 in funding to a project entitled Performance Metrics for Freight Transportation (NCFRP 03). The project is scheduled for completion in September ...

  18. New quality metrics for digital image resizing

    Science.gov (United States)

    Kim, Hongseok; Kumara, Soundar

    2007-09-01

    Digital image rescaling by interpolation has been intensively researched over past decades, and still receives constant attention from many applications such as medical diagnosis, super-resolution, image blow-up, nano-manufacturing, etc. However, there are no agreed-upon metrics to objectively assess and compare the quality of resized images. Some existing measures such as peak signal-to-noise ratio (PSNR) or mean-squared error (MSE), widely used in the image restoration area, do not always coincide with the opinions of viewers. Enlarged digital images generally suffer from two major artifacts, blurring and zigzagging; these undesirable effects, especially around edges, significantly degrade the overall perceptual image quality. We propose two new image quality metrics to measure the degree of the two major defects, and compare several existing interpolation methods using the proposed metrics. We also evaluate the validity of the image quality metrics by comparing rank correlations.
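    For reference, the two baseline measures named above (MSE and PSNR for 8-bit images) can be computed as follows; the paper's proposed blur and zigzag metrics are not reproduced here.

        import numpy as np


        def mse(reference, test):
            return np.mean((reference.astype(float) - test.astype(float)) ** 2)


        def psnr(reference, test, peak=255.0):
            err = mse(reference, test)
            return float("inf") if err == 0 else 10.0 * np.log10(peak ** 2 / err)


        ref = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
        noisy = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
        print(f"MSE = {mse(ref, noisy):.1f}, PSNR = {psnr(ref, noisy):.1f} dB")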

  19. Supplier selection using different metric functions

    Directory of Open Access Journals (Sweden)

    Omosigho S.E.

    2015-01-01

    Full Text Available Supplier selection is an important component of supply chain management in today’s global competitive environment. Hence, the evaluation and selection of suppliers have received considerable attention in the literature. Many attributes of suppliers, other than cost, are considered in the evaluation and selection process. Therefore, the process of evaluation and selection of suppliers is a multi-criteria decision making process. The methodology adopted to solve the supplier selection problem is intuitionistic fuzzy TOPSIS (Technique for Order Preference by Similarity to the Ideal Solution). Generally, TOPSIS is based on the concept of minimum distance from the positive ideal solution and maximum distance from the negative ideal solution. We examine the deficiencies of using only one metric function in TOPSIS and propose the use of the spherical metric function in addition to the commonly used metric functions. For empirical supplier selection problems, more than one metric function should be used.
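    A crisp (non-fuzzy) TOPSIS sketch with a pluggable distance function illustrates the point that the choice of metric can change the ranking; the decision matrix, weights, and criteria directions are hypothetical, and the paper's intuitionistic fuzzy extension and spherical metric are not reproduced.

        import numpy as np


        def topsis(matrix, weights, benefit, metric):
            norm = matrix / np.linalg.norm(matrix, axis=0)            # vector normalisation
            v = norm * weights                                        # weighted matrix
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # positive ideal solution
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # negative ideal solution
            d_plus = np.array([metric(row, ideal) for row in v])
            d_minus = np.array([metric(row, anti) for row in v])
            return d_minus / (d_plus + d_minus)                       # closeness coefficient


        euclidean = lambda a, b: np.linalg.norm(a - b)
        manhattan = lambda a, b: np.abs(a - b).sum()

        suppliers = np.array([[0.7, 200.0, 4.0],    # quality, cost, delivery score
                              [0.9, 260.0, 3.5],
                              [0.8, 230.0, 4.5]])
        weights = np.array([0.5, 0.3, 0.2])
        benefit = np.array([True, False, True])      # cost is a "lower is better" criterion

        print(topsis(suppliers, weights, benefit, euclidean))
        print(topsis(suppliers, weights, benefit, manhattan))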

  20. Environmental metrics for community health improvement.

    Science.gov (United States)

    Jakubowski, Benjamin; Frumkin, Howard

    2010-07-01

    Environmental factors greatly affect human health. Accordingly, environmental metrics are a key part of the community health information base. We review environmental metrics relevant to community health, including measurements of contaminants in environmental media, such as air, water, and food; measurements of contaminants in people (biomonitoring); measurements of features of the built environment that affect health; and measurements of "upstream" environmental conditions relevant to health. We offer a set of metrics (including unhealthy exposures, such as pollutants, and health-promoting assets, such as parks and green space) selected on the basis of relevance to health outcomes, magnitude of associated health outcomes, corroboration in the peer-reviewed literature, and data availability, especially at the community level, and we recommend ways to use these metrics most effectively.

  1. New Gromov-Inspired Metrics on Phylogenetic Tree Space.

    Science.gov (United States)

    Liebscher, Volkmar

    2018-03-01

    We present a new class of metrics for unrooted phylogenetic X-trees inspired by the Gromov-Hausdorff distance for (compact) metric spaces. These metrics can be efficiently computed by linear or quadratic programming. They are robust under NNI operations, too. The local behaviour of the metrics shows that they are different from any previously introduced metrics. The performance of the metrics is briefly analysed on random weighted and unweighted trees as well as random caterpillars.

  2. Target Scattering Metrics: Model-Model and Model Data comparisons

    Science.gov (United States)

    2017-12-13

    Metrics for comparing target scattering responses are investigated for their suitability as input to classification schemes, and the investigated metrics are then applied to model-data comparisons. The targets used in the TIER simulations for the metrics study include a stainless steel replica of an artillery shell. Four potential metrics were investigated, including one based on 2D cross-correlation that is typically used in classification algorithms; both model-model and model-data comparisons are reported.

  3. Transport impacts on atmosphere and climate: Metrics

    Science.gov (United States)

    Fuglestvedt, J. S.; Shine, K. P.; Berntsen, T.; Cook, J.; Lee, D. S.; Stenke, A.; Skeie, R. B.; Velders, G. J. M.; Waitz, I. A.

    2010-12-01

    The transport sector emits a wide variety of gases and aerosols, with distinctly different characteristics which influence climate directly and indirectly via chemical and physical processes. Tools that allow these emissions to be placed on some kind of common scale in terms of their impact on climate have a number of possible uses such as: in agreements and emission trading schemes; when considering potential trade-offs between changes in emissions resulting from technological or operational developments; and/or for comparing the impact of different environmental impacts of transport activities. Many of the non-CO 2 emissions from the transport sector are short-lived substances, not currently covered by the Kyoto Protocol. There are formidable difficulties in developing metrics and these are particularly acute for such short-lived species. One difficulty concerns the choice of an appropriate structure for the metric (which may depend on, for example, the design of any climate policy it is intended to serve) and the associated value judgements on the appropriate time periods to consider; these choices affect the perception of the relative importance of short- and long-lived species. A second difficulty is the quantification of input parameters (due to underlying uncertainty in atmospheric processes). In addition, for some transport-related emissions, the values of metrics (unlike the gases included in the Kyoto Protocol) depend on where and when the emissions are introduced into the atmosphere - both the regional distribution and, for aircraft, the distribution as a function of altitude, are important. In this assessment of such metrics, we present Global Warming Potentials (GWPs) as these have traditionally been used in the implementation of climate policy. We also present Global Temperature Change Potentials (GTPs) as an alternative metric, as this, or a similar metric may be more appropriate for use in some circumstances. We use radiative forcings and lifetimes
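    For a species whose abundance decays with a single exponential lifetime, the absolute GWP over a horizon H is A · τ · (1 − exp(−H/τ)), and the GWP is this value divided by the corresponding integral for CO2. The sketch below uses that textbook formula; the radiative efficiency, lifetime, and CO2 reference value are placeholders rather than assessed numbers.

        import math


        def agwp(radiative_efficiency, lifetime_yr, horizon_yr):
            """Time-integrated radiative forcing of a 1 kg pulse emission."""
            return radiative_efficiency * lifetime_yr * (1.0 - math.exp(-horizon_yr / lifetime_yr))


        A_X = 1.3e-13            # W m^-2 kg^-1, hypothetical short-lived species
        TAU_X = 12.0             # years
        AGWP_CO2_100 = 9.2e-14   # W m^-2 yr kg^-1, assumed CO2 reference for H = 100 yr

        gwp_100 = agwp(A_X, TAU_X, 100.0) / AGWP_CO2_100
        print(f"GWP(100) ≈ {gwp_100:.0f}")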

  4. Static and Dynamic Software Quality Metric Tools

    OpenAIRE

    Mayo, Kevin A.; Wake, Steven A.; Henry, Sallie M.

    1990-01-01

    The ability to detect and predict poor software quality is of major importance to software engineers, managers, and quality assurance organizations. Poor software quality leads to increased development costs and expensive maintenance. With so much attention on exacerbated budgetary constraints, a viable alternative is necessary. Software quality metrics are designed for this purpose. Metrics measure aspects of code or PDL representations, and can be collected and used throughout the life ...

  5. Effective dimension in some general metric spaces

    Directory of Open Access Journals (Sweden)

    Elvira Mayordomo

    2014-03-01

    Full Text Available We introduce the concept of effective dimension for a general metric space. Effective dimension was defined by Lutz in (Lutz 2003) for Cantor space and has also been extended to Euclidean space. Our extension to other metric spaces is based on a supergale characterization of Hausdorff dimension. We present here the concept of constructive dimension and its characterization in terms of Kolmogorov complexity. Further research directions are indicated.

  6. GRC GSFC TDRSS Waveform Metrics Report

    Science.gov (United States)

    Mortensen, Dale J.

    2013-01-01

    The report presents software metrics and porting metrics for the GGT Waveform. The porting was from a ground-based COTS SDR, the SDR-3000, to the CoNNeCT JPL SDR. The report does not address any of the Operating Environment (OE) software development, nor the original TDRSS waveform development at GSFC for the COTS SDR. With regard to STRS, the report presents compliance data and lessons learned.

  7. Autonomous Exploration Using an Information Gain Metric

    Science.gov (United States)

    2016-03-01

    navigation goals, serving to drive an autonomous system. By continually moving to these navigation goals and taking measurements, the system incrementally explores its environment (US Army Research Laboratory report ARL-TR-7638, March 2016, by Nicholas C Fung, Jason M Gregory, and John G Rogers).

  8. A Laplacian on Metric Measure Spaces

    DEFF Research Database (Denmark)

    Kokkendorff, Simon Lyngby

    2006-01-01

    We introduce a Laplacian on a class of metric measure spaces via a direct pointwise mean value definition. Fundamental properties of this Laplacian, such as its symmetry as an operator on functions satisfying a Neumann or Dirichlet condition, are established.

  9. Engineering Design Handbook. Metric Conversion Guide

    Science.gov (United States)

    1976-07-01

    Only fragments of the handbook survive in this record: SI unit and conversion tables (e.g., cubic metre per second, m³/s; work, see energy), a table of experimentally determined constants (Avogadro constant, Bohr ...), and a chronology of US metrication noting that, as a result of international economic and political situations, the metric question was not seriously considered until the 1950s; that Public Law 90-472, authorizing the Department of Commerce to conduct the United States Metric Study, was passed by Congress; and that in 1975 the Deputy Secretary of ...

  10. Area Regge calculus and discontinuous metrics

    International Nuclear Information System (INIS)

    Wainwright, Chris; Williams, Ruth M

    2004-01-01

    Taking the triangle areas as independent variables in the theory of Regge calculus can lead to ambiguities in the edge lengths, which can be interpreted as discontinuities in the metric. We construct solutions to area Regge calculus using a triangulated lattice and find that on a spacelike or timelike hypersurface no such discontinuity can arise. On a null hypersurface however, we can have such a situation and the resulting metric can be interpreted as a so-called refractive wave

  11. Some observations on a fuzzy metric space

    Energy Technology Data Exchange (ETDEWEB)

    Gregori, V.

    2017-07-01

    Let $(X,d)$ be a metric space. In this paper we provide some observations about the fuzzy metric space in the sense of Kramosil and Michalek $(Y,N,\wedge)$, where $Y$ is the set of non-negative real numbers $[0,\infty[$ and $N(x,y,t)=1$ if $d(x,y)\leq t$ and $N(x,y,t)=0$ if $d(x,y)\geq t$. (Author)

  12. Node self-connections in network metrics.

    Science.gov (United States)

    Saura, Santiago

    2018-02-01

    Zamborain-Mason et al. (Ecol. Lett., 20, 2017, 815-831) state that they have newly proposed network metrics that account for node self-connections. Network metrics incorporating node self-connections, also referred to as intranode (intrapatch) connectivity, were however already proposed before and have been widely used in a variety of conservation planning applications. © 2017 The Author. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  13. Relaxed metrics and indistinguishability operators: the relationship

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.

    2017-07-01

    In 1982, the notion of indistinguishability operator was introduced by E. Trillas in order to fuzzify the crisp notion of equivalence relation (\cite{Trillas}). In the study of such a class of operators, an outstanding property must be pointed out. Concretely, there exists a duality relationship between indistinguishability operators and metrics. The aforesaid relationship was deeply studied by several authors that introduced a few techniques to generate metrics from indistinguishability operators and vice-versa (see, for instance, \cite{BaetsMesiar,BaetsMesiar2}). In the last years a new generalization of the metric notion has been introduced in the literature with the purpose of developing mathematical tools for quantitative models in Computer Science and Artificial Intelligence (\cite{BKMatthews,Ma}). The aforementioned generalized metrics are known as relaxed metrics. The main target of this talk is to present a study of the duality relationship between indistinguishability operators and relaxed metrics in such a way that the aforementioned classical techniques to generate both concepts, one from the other, can be extended to the new framework. (Author)

  14. Peano compactifications and property S metric spaces

    Directory of Open Access Journals (Sweden)

    R. F. Dickman

    1980-01-01

    Full Text Available Let (X,d) denote a locally connected, connected separable metric space. We say that X is S-metrizable provided there is a topologically equivalent metric ρ on X such that (X,ρ) has Property S, i.e. for any ϵ>0, X is the union of finitely many connected sets of ρ-diameter less than ϵ. It is well-known that S-metrizable spaces are locally connected and that if ρ is a Property S metric for X, then the usual metric completion (X˜,ρ˜) of (X,ρ) is a compact, locally connected, connected metric space, i.e. (X˜,ρ˜) is a Peano compactification of (X,ρ). There are easily constructed examples of locally connected, connected metric spaces which fail to be S-metrizable; however, the author does not know of a non-S-metrizable space (X,d) which has a Peano compactification. In this paper we conjecture that: If (P,ρ) is a Peano compactification of (X,ρ|X), then X must be S-metrizable. Several (new) necessary and sufficient conditions for a space to be S-metrizable are given, together with an example of a non-S-metrizable space which fails to have a Peano compactification.

  15. Almost convex metrics and Peano compactifications

    Directory of Open Access Journals (Sweden)

    R. F. Dickman

    1982-01-01

    Full Text Available Let (X,d) denote a locally connected, connected separable metric space. We say that X is S-metrizable provided there is a topologically equivalent metric ρ on X such that (X,ρ) has Property S, i.e., for any ϵ>0, X is the union of finitely many connected sets of ρ-diameter less than ϵ. It is well-known that S-metrizable spaces are locally connected and that if ρ is a Property S metric for X, then the usual metric completion (X˜,ρ˜) of (X,ρ) is a compact, locally connected, connected metric space; i.e., (X˜,ρ˜) is a Peano compactification of (X,ρ). In an earlier paper, the author conjectured that if a space (X,d) has a Peano compactification, then it must be S-metrizable. In this paper, that conjecture is shown to be false; however, the connected spaces which have Peano compactifications are shown to be exactly those having a totally bounded, almost convex metric. Several related results are given.

  16. Metrics for Offline Evaluation of Prognostic Performance

    Directory of Open Access Journals (Sweden)

    Sankalita Saha

    2010-01-01

    Full Text Available Prognostic performance evaluation has gained significant attention in the past few years. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, etc., to name a few. The research community has used a variety of metrics largely based on convenience and their respective requirements. Very little attention has been focused on establishing a standardized approach to compare different efforts. This paper presents several new evaluation metrics tailored for prognostics that were recently introduced and were shown to effectively evaluate various algorithms as compared to other conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. These metrics have the capability of incorporating probabilistic uncertainty estimates from prognostic algorithms. In addition to quantitative assessment they also offer a comprehensive visual perspective that can be used in designing the prognostic system. Several methods are suggested to customize these metrics for different applications. Guidelines are provided to help choose one method over another based on distribution characteristics. Various issues faced by prognostics and its performance evaluation are discussed followed by a formal notational framework to help standardize subsequent developments.

  17. 48 CFR 611.002-70 - Metric system implementation.

    Science.gov (United States)

    2010-10-01

    ... of measurement sensitive processes and systems to the metric system. Soft metric means the result of... with security, operations, economic, technical, logistical, training and safety requirements. (3) The...

  18. Localized Multi-Model Extremes Metrics for the Fourth National Climate Assessment

    Science.gov (United States)

    Thompson, T. R.; Kunkel, K.; Stevens, L. E.; Easterling, D. R.; Biard, J.; Sun, L.

    2017-12-01

    We have performed localized analysis of scenario-based datasets for the Fourth National Climate Assessment (NCA4). These datasets include CMIP5-based Localized Constructed Analogs (LOCA) downscaled simulations at daily temporal resolution and 1/16th-degree spatial resolution. Over 45 temperature and precipitation extremes metrics have been processed using LOCA data, including threshold, percentile, and degree-days calculations. The localized analysis calculates trends in the temperature and precipitation extremes metrics for relatively small regions such as counties, metropolitan areas, climate zones, administrative areas, or economic zones. For NCA4, we are currently addressing metropolitan areas as defined by U.S. Census Bureau Metropolitan Statistical Areas. Such localized analysis provides essential information for adaptation planning at scales relevant to local planning agencies and businesses. Nearly 30 such regions have been analyzed to date. Each locale is defined by a closed polygon that is used to extract LOCA-based extremes metrics specific to the area. For each metric, single-model data at each LOCA grid location are first averaged over several 30-year historical and future periods. Then, for each metric, the spatial average across the region is calculated using model weights based on both model independence and reproducibility of current climate conditions. The range of single-model results is also captured on the same localized basis, and then combined with the weighted ensemble average for each region and each metric. For example, Boston-area cooling degree days and maximum daily temperature are shown below for the RCP8.5 (red) and RCP4.5 (blue) scenarios. We also discuss inter-regional comparison of these metrics, as well as their relevance to risk analysis for adaptation planning.
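    A toy version of the aggregation described above is sketched below: per-model means of an extremes metric over the grid cells inside a metro-area polygon are combined into a regional value using invented weights standing in for model independence and skill.

        import numpy as np

        # rows = models, columns = LOCA grid cells inside the metro-area polygon
        model_means = np.array([[812.0, 798.0, 830.0],    # cooling degree days, model A
                                [905.0, 890.0, 921.0],    # model B
                                [861.0, 845.0, 880.0]])   # model C
        weights = np.array([0.5, 0.2, 0.3])               # hypothetical independence/skill weights

        spatial_mean = model_means.mean(axis=1)           # average over grid cells, per model
        ensemble = np.average(spatial_mean, weights=weights)
        spread = spatial_mean.min(), spatial_mean.max()   # single-model range
        print(f"weighted ensemble: {ensemble:.0f} CDD, model range {spread}")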

  19. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports
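    One common way to quantify resilience, the normalised area under the system-performance curve across a disruption-and-recovery window, is sketched below; it is only a stand-in for the paper's uncertainty-weighted resilience factor, and the performance trajectory is invented.

        import numpy as np

        t = np.linspace(0, 100, 1001)                 # hours
        performance = np.ones_like(t)                 # nominal performance = 1.0
        drop = (t >= 20) & (t < 30)                   # absorptive phase: degraded service
        recover = (t >= 30) & (t < 70)                # restorative phase: ramp back to nominal
        performance[drop] = 0.4
        performance[recover] = 0.4 + 0.6 * (t[recover] - 30) / 40

        # Uniform time grid, so the time average equals the normalised area.
        resilience = performance.mean()
        print(f"resilience ≈ {resilience:.3f} (1.0 = no loss of function)")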

  20. Baby universe metric equivalent to an interior black-hole metric

    International Nuclear Information System (INIS)

    Gonzalez-Diaz, P.F.

    1991-01-01

    It is shown that the maximally extended metric corresponding to a large wormhole is the unique possible wormhole metric whose baby universe sector is conformally equivalent to the maximal inextendible Kruskal metric corresponding to the interior region of a Schwarzschild black hole whose gravitational radius is half the wormhole neck radius. The physical implications of this result in the black hole evaporation process are discussed. (orig.)

  1. A method based on k-NN classifiers parameterized with genetic algorithms and reactance estimation for fault location in distribution systems.

    Directory of Open Access Journals (Sweden)

    Andrés Zapata-Tapasco

    2014-01-01

    Full Text Available This article presents a strategy for parameterizing a fault locator based on a simple but efficient learning technique known as k-nearest neighbors (k-NN). The technique is complemented with a thoroughly tested localization method based on estimating the fault impedance. The hybrid strategy was validated on a real prototype circuit, with error results acceptable for applications in electric power distribution systems. Finally, an important advantage of the proposed methodology is its ease of implementation when dealing with real distribution circuits (more than 100 nodes).

  2. Statistical 2D and 3D shape analysis using Non-Euclidean Metrics

    DEFF Research Database (Denmark)

    Larsen, Rasmus; Hilger, Klaus Baggesen; Wrobel, Mark Christoph

    2002-01-01

    We address the problem of extracting meaningful, uncorrelated biological modes of variation from tangent space shape coordinates in 2D and 3D using non-Euclidean metrics. We adapt the maximum autocorrelation factor analysis and the minimum noise fraction transform to shape decomposition. Furthermore, we study metrics based on repeated annotations of a training set. We define a way of assessing the correlation between landmarks, as opposed to landmark coordinates. Finally, we apply the proposed methods to a 2D data set consisting of outlines of lungs and a 3D/(4D) data set consisting of sets ...

  3. An Ionospheric Metric Study Using Operational Models

    Science.gov (United States)

    Sojka, J. J.; Schunk, R. W.; Thompson, D. C.; Scherliess, L.; Harris, T. J.

    2006-12-01

    One of the outstanding challenges in upgrading ionospheric operational models is quantifying their improvement. This challenge is not necessarily an absolute accuracy one, but rather answering the question, "Is the newest operational model an improvement over its predecessor under operational scenarios?" There are few documented cases where ionospheric models are compared either with each other or against "ground truth". For example a CEDAR workshop team, PRIMO, spent almost a decade carrying out a model comparison with ionosonde and incoherent scatter radar measurements from the Millstone Hill, Massachusetts location [Anderson et al.,1998]. The result of this study was that all models were different and specific conditions could be found when each was the "best" model. Similarly, a National Space Weather Metrics ionospheric challenge was held and results were presented at a National Space Weather meeting. The results were again found to be open to interpretation, and issues with the value of the specific metrics were raised (Fuller-Rowell, private communication, 2003). Hence, unlike the tropospheric weather community, who have established metrics and exercised them on new models over many decades to quantify improvement, the ionospheric community has not yet settled on a metric of both scientific and operational value. We report on a study in which metrics were used to compare various forms of the International Reference Ionosphere (IRI), the Ionospheric Forecast Model (IFM), and the Utah State University Global Assimilation of Ionospheric Measurements Model (USU-GAIM) models. The ground truth for this study was a group of 11 ionosonde data sets taken between 20 March and 19 April 2004. The metric parameter was the ionosphere's critical frequency. The metric was referenced to the IRI. Hence, the study addressed the specific question of what improvement the IFM and USU-GAIM models offer over the IRI. Both strengths (improvements) and weaknesses of these models are discussed.

  4. The dynamics of metric-affine gravity

    International Nuclear Information System (INIS)

    Vitagliano, Vincenzo; Sotiriou, Thomas P.; Liberati, Stefano

    2011-01-01

    Highlights: → The role and the dynamics of the connection in metric-affine theories is explored. → The most general second order action does not lead to a dynamical connection. → Including higher order invariants excites new degrees of freedom in the connection. → f(R) actions are also discussed and shown to be a non-representative class. - Abstract: Metric-affine theories of gravity provide an interesting alternative to general relativity: in such an approach, the metric and the affine (not necessarily symmetric) connection are independent quantities. Furthermore, the action should include covariant derivatives of the matter fields, with the covariant derivative naturally defined using the independent connection. As a result, in metric-affine theories a direct coupling involving matter and connection is also present. The role and the dynamics of the connection in such theories is explored. We employ power counting in order to construct the action and search for the minimal requirements it should satisfy for the connection to be dynamical. We find that for the most general action containing lower order invariants of the curvature and the torsion the independent connection does not carry any dynamics. It actually reduces to the role of an auxiliary field and can be completely eliminated algebraically in favour of the metric and the matter field, introducing extra interactions with respect to general relativity. However, we also show that including higher order terms in the action radically changes this picture and excites new degrees of freedom in the connection, making it (or parts of it) dynamical. Constructing actions that constitute exceptions to this rule requires significant fine-tuning and/or extra a priori constraints on the connection. We also consider f(R) actions as a particular example in order to show that they constitute a distinct class of metric-affine theories with special properties, and as such they cannot be used as representative toy models.

  5. Feature Selection for Natural Language Call Routing Based on Self-Adaptive Genetic Algorithm

    Science.gov (United States)

    Koromyslova, A.; Semenkina, M.; Sergienko, R.

    2017-02-01

    The text classification problem for natural language call routing was considered in the paper. Seven different term weighting methods were applied. As a dimensionality reduction method, feature selection based on a self-adaptive GA is considered. k-NN, linear SVM and ANN were used as classification algorithms. The tasks of the research are the following: to study text classification for natural language call routing with different term weighting methods and classification algorithms, and to investigate the feature selection method based on the self-adaptive GA. The numerical results showed that the most effective term weighting is TRR. The most effective classification algorithm is ANN. Feature selection with the self-adaptive GA provides improvement of classification effectiveness and significant dimensionality reduction with all term weighting methods and with all classification algorithms.
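    A compact baseline for the setup described above is TF-IDF term weighting with a k-NN classifier, as sketched below; the utterances and routing labels are toy examples, and the TRR weighting, ANN classifier, and self-adaptive GA feature selection studied in the paper are not reproduced.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.pipeline import make_pipeline

        calls = ["I want to check my account balance",
                 "my internet connection is down again",
                 "please cancel my subscription",
                 "how much do I owe on my bill",
                 "the router keeps disconnecting"]
        routes = ["billing", "tech_support", "retention", "billing", "tech_support"]

        # TF-IDF features feed a 3-nearest-neighbour classifier.
        router = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=3))
        router.fit(calls, routes)
        print(router.predict(["there is a problem with my bill"]))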

  6. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical and complex dynamic processes. The just-in-time (JIT) detection method and the k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. The Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, yielding an adaptive fault detection technique that can effectively eliminate the impact of data drift and shift on the detection performance, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
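    The two ingredients named above, the Mahalanobis distance to the reference data set and a k-NN distance statistic with an empirical control limit, can be sketched as follows; the JIT database-updating logic is omitted and the data are synthetic.

        import numpy as np
        from scipy.spatial.distance import cdist

        rng = np.random.default_rng(2)
        reference = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=300)

        # Mahalanobis distance of each reference sample to the data-set centre,
        # usable for pruning near-duplicates when the database is updated.
        cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
        diff = reference - reference.mean(axis=0)
        maha = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
        print("median Mahalanobis distance:", round(float(np.median(maha)), 2))

        # k-NN detection statistic: mean distance of a query to its k nearest
        # reference samples, with an empirical 99th-percentile control limit.
        def knn_stat(queries, ref, k=5):
            return np.sort(cdist(queries, ref), axis=1)[:, :k].mean(axis=1)

        # Calibrate on the reference set itself, skipping the zero self-distance.
        calibration = np.sort(cdist(reference, reference), axis=1)[:, 1:6].mean(axis=1)
        limit = np.percentile(calibration, 99)

        queries = np.array([[0.2, -0.1], [4.0, 4.0]])
        print(knn_stat(queries, reference) > limit)   # expect [False, True]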

  7. Vehicle-to-infrastructure program cooperative adaptive cruise control.

    Science.gov (United States)

    2015-03-01

    This report documents the work completed by the Crash Avoidance Metrics Partners LLC (CAMP) Vehicle to Infrastructure (V2I) Consortium during the project titled Cooperative Adaptive Cruise Control (CACC). Participating companies in the V2I Cons...

  8. Biomechanical metrics of aesthetic perception in dance.

    Science.gov (United States)

    Bronner, Shaw; Shippen, James

    2015-12-01

    The brain may be tuned to evaluate aesthetic perception through perceptual chunking when we observe the grace of the dancer. We modelled biomechanical metrics to explain biological determinants of aesthetic perception in dance. Eighteen expert (EXP) and intermediate (INT) dancers performed développé arabesque in three conditions: (1) slow tempo, (2) slow tempo with relevé, and (3) fast tempo. To compare biomechanical metrics of kinematic data, we calculated intra-excursion variability, principal component analysis (PCA), and dimensionless jerk for the gesture limb. Observers, all trained dancers, viewed motion capture stick figures of the trials and ranked each for aesthetic (1) proficiency and (2) movement smoothness. Statistical analyses included group by condition repeated-measures ANOVA for metric data; Mann-Whitney U rank and Friedman's rank tests for nonparametric rank data; Spearman's rho correlations to compare aesthetic rankings and metrics; and linear regression to examine which metric best quantified observers' aesthetic rankings. The biomechanical metrics of the dance movements revealed differences between groups and conditions, consistent with the view that the brain combines sensory motor elements into integrated units of behaviour. In this representation, the chunk of information which is remembered, and to which the observer reacts, is the elemental mode shape of the motion rather than physical displacements. This suggests that reduction in redundant information to a simplistic dimensionality is related to the experienced observer's aesthetic perception.
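    One common formulation of dimensionless (integrated squared) jerk for a single gesture-limb coordinate is sketched below using finite differences; the paper's exact normalisation may differ, and the trajectories are synthetic.

        import numpy as np

        def dimensionless_jerk(position, dt):
            """Duration^5 / amplitude^2 times the integrated squared jerk (lower = smoother)."""
            duration = dt * (len(position) - 1)
            amplitude = position.max() - position.min()
            velocity = np.gradient(position, dt)
            accel = np.gradient(velocity, dt)
            jerk = np.gradient(accel, dt)
            return (duration ** 5 / amplitude ** 2) * np.sum(jerk ** 2) * dt

        # Synthetic développé-like lift: smooth versus slightly tremulous.
        t = np.linspace(0, 2.0, 400)
        smooth = np.sin(np.pi * t / 2.0) ** 2
        shaky = smooth + 0.01 * np.sin(40 * np.pi * t)
        dt = t[1] - t[0]
        print(dimensionless_jerk(smooth, dt) < dimensionless_jerk(shaky, dt))   # True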

  9. Pragmatic Metrics for Monitoring Science Data Centers

    Science.gov (United States)

    Moses, J. F.; Behnke, J.

    2003-12-01

    Science data metrics and their analysis are critical components to the end-to-end data and service flow for science data centers. The Earth Science Data and Information System Project has collected records of EOS science data archive, processing and distribution metrics from NASA's Distributed Active Archive Centers since 1996. The ESDIS Science Operations Office and the DAAC data centers have cooperated to develop a DAAC metrics reporting capability called the EOSDIS Data Gathering and Reporting Systems (EDGRS). This poster illustrates EDGRS processes and metrics data applications. EDGRS currently accesses detailed archive and distribution metrics from nine DAAC sites and transfers results to a centralized collection system on a routine basis. After automated quality checks the records are immediately made available through a web-based Graphic User Interface. Users can obtain standard graphs and prepare custom queries to generate specific reports for monitoring science data processing progress. Applications are illustrated that explore methods for performing data availability studies and performance analyses. Improvements are planned to support granule-level science data accounting and characterization of product distribution.

  10. Future of the PCI Readmission Metric.

    Science.gov (United States)

    Wasfy, Jason H; Yeh, Robert W

    2016-03-01

    Between 2013 and 2014, the Centers for Medicare and Medicaid Services and the National Cardiovascular Data Registry publicly reported risk-adjusted 30-day readmission rates after percutaneous coronary intervention (PCI) as a pilot project. A key strength of this public reporting effort included risk adjustment with clinical rather than administrative data. Furthermore, because readmission after PCI is common, expensive, and preventable, this metric has substantial potential to improve quality and value in American cardiology care. Despite this, concerns about the metric exist. For example, few PCI readmissions are caused by procedural complications, limiting the extent to which improved procedural technique can reduce readmissions. Also, similar to other readmission measures, PCI readmission is associated with socioeconomic status and race. Accordingly, the metric may unfairly penalize hospitals that care for underserved patients. Perhaps in the context of these limitations, Centers for Medicare and Medicaid Services has not yet included PCI readmission among metrics that determine Medicare financial penalties. Nevertheless, provider organizations may still wish to focus on this metric to improve value for cardiology patients. PCI readmission is associated with low-risk chest discomfort and patient anxiety. Therefore, patient education, improved triage mechanisms, and improved care coordination offer opportunities to minimize PCI readmissions. Because PCI readmission is common and costly, reducing PCI readmission offers provider organizations a compelling target to improve the quality of care, and also performance in contracts that involve shared financial risk. © 2016 American Heart Association, Inc.

  11. Exploring model-based target discrimination metrics

    Science.gov (United States)

    Witus, Gary; Weathersby, Marshall

    2004-08-01

    Visual target discrimination has occurred when the observer can say "I see a target THERE!" and can designate the target location. Target discrimination occurs when a perceived shape is sufficiently similar to one or more of the instances the observer has been trained on. Marr defined vision as "knowing what is where by seeing." Knowing "what" requires prior knowledge. Target discrimination requires model-based visual processing. Model-based signature metrics attempt to answer the question "to what extent does the target in the image resemble a training image?" Model-based signature metrics attempt to represent the effects of high-level top-down visual cognition, in addition to low-level bottom-up effects. Recent advances in realistic 3D target rendering and computer-vision object recognition have made model-based signature metrics more practical. The human visual system almost certainly does NOT use the same processing algorithms as computer vision object recognition, but some processing elements and the overall effects are similar. It remains to be determined whether model-based metrics explain the variance in human performance. The purpose of this paper is to explain and illustrate the model-based approach to signature metrics.

  12. NASA education briefs for the classroom. Metrics in space

    Science.gov (United States)

    1982-01-01

    The use of metric measurement in space is summarized for classroom use. Advantages of the metric system over the English measurement system are described. Some common metric units are defined, as are special units for astronomical study. International system unit prefixes and a conversion table of metric/English units are presented. Questions and activities for the classroom are recommended.

  13. Codes in W*-Metric Spaces: Theory and Examples

    Science.gov (United States)

    Bumgardner, Christopher J.

    2011-01-01

    We introduce a "W*"-metric space, which is a particular approach to non-commutative metric spaces where a "quantum metric" is defined on a von Neumann algebra. We generalize the notion of a quantum code and quantum error correction to the setting of finite dimensional "W*"-metric spaces, which includes codes and error correction for classical…

  14. On Convergence of Fixed Points in Fuzzy Metric Spaces

    Directory of Open Access Journals (Sweden)

    Yonghong Shen

    2013-01-01

    We mainly focus on the convergence of the sequence of fixed points for some different sequences of contraction mappings or fuzzy metrics in fuzzy metric spaces. Our results provide a novel research direction for fixed point theory in fuzzy metric spaces as well as a substantial extension of several important results from classical metric spaces.

  15. Rainbow Rindler metric and Unruh effect

    Science.gov (United States)

    Yadav, Gaurav; Komal, Baby; Majhi, Bibhas Ranjan

    2017-11-01

    The energy of a particle moving on a space-time can, in principle, affect the background metric. The modifications to it depend on the ratio of the energy of the particle to the Planck energy, a scenario known as rainbow gravity. Here, we find the explicit expressions for the coordinate transformations from rainbow Minkowski space-time to an accelerated frame. The corresponding metric is also obtained, which we call the rainbow Rindler metric. As far as we are aware, this has not been done before in a concrete manner. Here it is derived from first principles, and hence all the parameters are properly identified. The advantage of this is that the calculated Unruh temperature is compatible with the Hawking temperature of the rainbow black hole horizon obtained earlier. Since the accelerated frame is of considerable importance in revealing various properties of gravity, we believe that the present result will not only fill that gap, but also help to explore different aspects of the rainbow gravity paradigm.

  16. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    Static analysis tools that are used for worst-case execution time (WCET) analysis of real-time software provide only partial information on an analyzed program. Only the longest-executing path, which currently determines the WCET bound, is indicated to the programmer. This limited view can prevent a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected ... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which ...

  17. SOCIAL METRICS APPLIED TO SMART TOURISM

    Directory of Open Access Journals (Sweden)

    O. Cervantes

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.
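
    The following sketch shows how the three centrality metrics named above can be computed on a toy relatedness graph with NetworkX. The graph, its weights, and the use of NetworkX's current-flow centralities as stand-ins for the flow betweenness/closeness of the record are all assumptions for illustration.

```python
import networkx as nx

# Toy semantic network: nodes are venues/services, edges carry relatedness weights.
G = nx.Graph()
G.add_weighted_edges_from([
    ("cathedral", "museum", 0.9),
    ("museum", "cafe", 0.6),
    ("cafe", "bus_stop", 0.8),
    ("museum", "bus_stop", 0.4),
    ("cathedral", "artisan_market", 0.7),
    ("artisan_market", "cafe", 0.5),
])

# Current-flow variants of betweenness and closeness (related to the flow metrics
# mentioned in the record), plus eccentricity on the unweighted structure.
flow_betweenness = nx.current_flow_betweenness_centrality(G, weight="weight")
flow_closeness = nx.current_flow_closeness_centrality(G, weight="weight")
eccentricity = nx.eccentricity(G)

# Rank candidate recommendations by flow betweenness, highest first.
suggestions = sorted(flow_betweenness, key=flow_betweenness.get, reverse=True)
print(suggestions)
```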

  18. The conformal metric structure of Geometrothermodynamics

    Science.gov (United States)

    Bravetti, Alessandro; Lopez-Monsalvo, Cesar S.; Nettel, Francisco; Quevedo, Hernando

    2013-03-01

    We present a thorough analysis of the invariance of the most widely used metrics in the Geometrothermodynamics programme. We centre our attention on the invariance of the curvature of the space of equilibrium states under a change of fundamental representation. Assuming that the systems under consideration can be described by a fundamental relation which is a homogeneous function of a definite order, we demonstrate that such invariance is only compatible with total Legendre transformations in the present form of the programme. We give the explicit form of a metric which is invariant under total Legendre transformations and whose induced metric produces a curvature which is independent of the fundamental representation. Finally, we study a generic system with two degrees of freedom whose fundamental relation is homogeneous of order one.

  19. Steiner trees for fixed orientation metrics

    DEFF Research Database (Denmark)

    Brazil, Marcus; Zachariasen, Martin

    2009-01-01

    We consider the problem of constructing Steiner minimum trees for a metric defined by a polygonal unit circle (corresponding to s = 2 weighted legal orientations in the plane). A linear-time algorithm to enumerate all angle configurations for degree three Steiner points is given. We provide a simple proof that the angle configuration for a Steiner point extends to all Steiner points in a full Steiner minimum tree, such that at most six orientations suffice for edges in a full Steiner minimum tree. We show that the concept of canonical forms originally introduced for the uniform orientation metric generalises to the fixed orientation metric. Finally, we give an O(s n) time algorithm to compute a Steiner minimum tree for a given full Steiner topology with n terminal leaves.

  20. A bi-metric theory of gravitation

    International Nuclear Information System (INIS)

    Rosen, N.

    1975-01-01

    The bi-metric theory of gravitation proposed previously is simplified in that the auxiliary conditions are discarded, the two metric tensors being tied together only by means of the boundary conditions. Some of the properties of the field of a particle are investigated; there is no black hole, and it appears that no gravitational collapse can take place. Although the proposed theory and general relativity are at present observationally indistinguishable, some differences are pointed out which may some day be susceptible of observation. An alternative bi-metric theory is considered which gives for the precession of the perihelion 5/6 of the value given by general relativity; it seems less satisfactory than the present theory from the aesthetic point of view. (author)

  1. Metric Learning for Hyperspectral Image Segmentation

    Science.gov (United States)

    Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca

    2011-01-01

    We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass Linear Discriminant Analysis produces a linear transform that optimally separates a labeled set of training classes. This defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
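
    A minimal sketch of the approach described above, using scikit-learn: fit multiclass LDA on labeled training spectra and then measure distances between pixels of a new scene in the learned subspace. The synthetic arrays and the choice of three LDA components are assumptions; the actual CRISM data and segmentation pipeline are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from scipy.spatial.distance import cdist

# Hypothetical labeled training spectra: X_train (n_samples, n_bands), y_train labels.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 50))
y_train = rng.integers(0, 4, size=200)

# Fit multiclass LDA; its linear transform defines a task-specific metric.
lda = LinearDiscriminantAnalysis(n_components=3).fit(X_train, y_train)

# Distances between pixels of a new scene, measured in the learned subspace.
X_scene = rng.normal(size=(500, 50))      # unlabeled pixels from a new image
Z = lda.transform(X_scene)                # project with the learned transform
pairwise = cdist(Z, Z)                    # Euclidean distance in LDA space
# `pairwise` can now drive a graph-based segmentation (edge weights, thresholds, ...).
```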

  2. Social Metrics Applied to Smart Tourism

    Science.gov (United States)

    Cervantes, O.; Gutiérrez, E.; Gutiérrez, F.; Sánchez, J. A.

    2016-09-01

    We present a strategy to make productive use of semantically-related social data, from a user-centered semantic network, in order to help users (tourists and citizens in general) to discover cultural heritage, points of interest and available services in a smart city. This data can be used to personalize recommendations in a smart tourism application. Our approach is based on flow centrality metrics typically used in social network analysis: flow betweenness, flow closeness and eccentricity. These metrics are useful to discover relevant nodes within the network, yielding nodes that can be interpreted as suggestions (venues or services) to users. We describe the semantic network built on a graph model, as well as the social metrics algorithms used to produce recommendations. We also present challenges and results from a prototypical implementation applied to the case study of the City of Puebla, Mexico.

  3. Metrical and dynamical aspects in complex analysis

    CERN Document Server

    2017-01-01

    The central theme of this reference book is the metric geometry of complex analysis in several variables. Bridging a gap in the current literature, the text focuses on the fine behavior of the Kobayashi metric of complex manifolds and its relationships to dynamical systems, hyperbolicity in the sense of Gromov and operator theory, all very active areas of research. The modern points of view expressed in these notes, collected here for the first time, will be of interest to academics working in the fields of several complex variables and metric geometry. The different topics are treated coherently and include expository presentations of the relevant tools, techniques and objects, which will be particularly useful for graduate and PhD students specializing in the area.

  4. Pragmatic quality metrics for evolutionary software development models

    Science.gov (United States)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  5. Robustness Metrics: Consolidating the multiple approaches to quantify Robustness

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Eifler, Tobias; Howard, Thomas J.

    2016-01-01

    ... this ambiguity can have significant influence on the strategies used to combat variability, the way it is quantified and ultimately, the quality of the final design. In this contribution the literature for robustness metrics was systematically reviewed. From the 108 relevant publications found, 38 metrics were determined to be conceptually different from one another. The metrics were classified by their meaning and interpretation based on the types of information necessary to calculate the metrics. Four different classes were identified: 1) Sensitivity robustness metrics; 2) Size of feasible design space robustness metrics; 3) Functional expectancy and dispersion robustness metrics; and 4) Probability of conformance robustness metrics. The goal was to give a comprehensive overview of robustness metrics and guidance to scholars and practitioners to understand the different types of robustness metrics ...

  6. Critical insights for a sustainability framework to address integrated community water services: Technical metrics and approaches.

    Science.gov (United States)

    Xue, Xiaobo; Schoen, Mary E; Ma, Xin Cissy; Hawkins, Troy R; Ashbolt, Nicholas J; Cashdollar, Jennifer; Garland, Jay

    2015-06-15

    Planning for sustainable community water systems requires a comprehensive understanding and assessment of the integrated source-drinking-wastewater systems over their life-cycles. Although traditional life cycle assessment and similar tools (e.g. footprints and emergy) have been applied to elements of these water services (i.e. water resources, drinking water, stormwater or wastewater treatment alone), we argue for the importance of developing and combining system-based tools and metrics in order to holistically evaluate the complete water service system based on the concept of integrated resource management. We analyze the strengths and weaknesses of key system-based tools and metrics, and discuss future directions to identify more sustainable municipal water services. Such efforts may include the need for novel metrics that address system adaptability to future changes and infrastructure robustness. Caution is also necessary when coupling fundamentally different tools so as to avoid misunderstanding and, consequently, misleading decision-making. Published by Elsevier Ltd.

  7. Metrically adjusted questionnaires can provide more information for scientists - an example from tourism.

    Science.gov (United States)

    Sindik, Joško; Miljanović, Maja

    2017-03-01

    The article deals with the issue of research methodology, illustrating the use of known research methods for new purposes. Questionnaires that originally do not have metric characteristics can be called »handy questionnaires«. In this article, the author considers the possibilities of improving their scientific usability, which can be ensured primarily by improving their metric characteristics and consequently using multivariate instead of univariate statistical methods. In order to establish the basis for the application of multivariate statistical procedures, the main idea is to develop strategies to design measurement instruments from parts of the handy questionnaires. This can be accomplished in two ways: by redesigning the handy questionnaires before the data are collected (a priori), or after the data have been collected, without modifying the questionnaire (a posteriori). The basic principles of applying these two strategies of metrical adaptation of handy questionnaires are described.

  8. On metric divergences of probability measures

    Czech Academy of Sciences Publication Activity Database

    Vajda, Igor

    2009-01-01

    Roč. 45, č. 6 (2009), s. 885-900 ISSN 0023-5954 R&D Projects: GA MŠk(CZ) 1M0572; GA ČR GA102/07/1131 Institutional research plan: CEZ:AV0Z10750506 Keywords : Metric divergences * Hellinger divergence * Le Cam divergence * Jensen-Shannon divergence * Total variation Subject RIV: BD - Theory of Information Impact factor: 0.445, year: 2009 http://library.utia.cas.cz/separaty/2010/SI/vajda-on metric divergences of probability measures.pdf

  9. Federal Procurement Metrication Appropriateness and Methods.

    Science.gov (United States)

    1982-09-18

    Federal Procurement Metrication Appropriateness and Methods (Final Report). Science Management Corporation, 1120 Connecticut Avenue, Washington DC; M. A. Coella; 18 Sep 82; NRC-358i ... "... reflect the views of the U.S. Metric Board." (The remainder of the scanned record consists of DTIC accession and distribution stamps.)

  10. Jacobi-Maupertuis metric and Kepler equation

    Science.gov (United States)

    Chanda, Sumanto; Gibbons, Gary William; Guha, Partha

    This paper studies the application of the Jacobi-Eisenhart lift, Jacobi metric and Maupertuis transformation to the Kepler system. We start by reviewing fundamentals and the Jacobi metric. Then we study various ways to apply the lift to Kepler-related systems: first as conformal description and Bohlin transformation of Hooke’s oscillator, second in contact geometry and third in Houri’s transformation [T. Houri, Liouville integrability of Hamiltonian systems and spacetime symmetry (2016), www.geocities.jp/football_physician/publication.html], coupled with Milnor’s construction [J. Milnor, On the geometry of the Kepler problem, Am. Math. Mon. 90 (1983) 353-365] with eccentric anomaly.
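
    For orientation, the standard textbook form of the Jacobi metric is recalled below; this is general background, not an expression quoted from the record. For a natural Hamiltonian $H = \tfrac{1}{2} g^{ij} p_i p_j + V(q)$ at fixed energy $E$, the Jacobi metric on the region $\{q : V(q) < E\}$ is

$$ ds_J^2 \;=\; 2\,\bigl(E - V(q)\bigr)\, g_{ij}\, dq^i\, dq^j , $$

    and its geodesics coincide, up to reparametrization, with the trajectories of the mechanical system at that energy (the Maupertuis principle).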

  11. What Metrics Accurately Reflect Surgical Quality?

    Science.gov (United States)

    Ibrahim, Andrew M; Dimick, Justin B

    2018-01-29

    Surgeons are increasingly under pressure to measure and improve their quality. While there is broad consensus that we ought to track surgical quality, there is far less agreement about which metrics matter most. This article reviews the important statistical concepts of case mix and chance as they apply to understanding the observed wide variation in surgical quality. We then discuss the benefits and drawbacks of current measurement strategies through the framework of structure, process, and outcomes approaches. Finally, we describe emerging new metrics, such as video evaluation and network optimization, that are likely to take on an increasingly important role in the future of measuring surgical quality.

  12. A generalization of Vaidya's radiation metric

    International Nuclear Information System (INIS)

    Gleiser, R.J.; Kozameh, C.N.

    1981-01-01

    In this paper it is shown that if Vaidya's radiation metric is considered from the point of view of kinetic theory in general relativity, the corresponding phase space distribution function can be generalized in a particular way. The new family of spherically symmetric radiation metrics obtained contains Vaidya's as a limiting situation. The Einstein field equations are solved in a ''comoving'' coordinate system. Two arbitrary functions of a single variable are introduced in the process of solving these equations. Particular examples considered are a stationary solution, a nonvacuum solution depending on a single parameter, and several limiting situations. (author)

  13. On the Plane Geometry with Generalized Absolute Value Metric

    Directory of Open Access Journals (Sweden)

    A. Bayar

    2008-01-01

    Metric spaces are among the most important and most widely studied topics in mathematics. In recent years, mathematicians began to investigate the use of metrics other than the Euclidean metric. These metrics also find their place in the computer age, in addition to their importance in geometry. In this paper, we consider the plane geometry with the generalized absolute value metric and define trigonometric functions and a norm, and then give a plane tiling example for engineers underlying Schwarz's inequality in this plane.

  14. Metrics and Evaluation Models for Accessible Television

    DEFF Research Database (Denmark)

    Li, Dongxiao; Looms, Peter Olaf

    2014-01-01

    ... to compare. Using case studies from three emerging economies (Argentina, Brazil and China) as well as industrialized nations (including Canada, Denmark, the United Kingdom and the USA), this paper examines the situation facing television accessibility. Having identified and discussed existing metrics ...

  15. Vehicle Integrated Prognostic Reasoner (VIPR) Metric Report

    Science.gov (United States)

    Cornhill, Dennis; Bharadwaj, Raj; Mylaraswamy, Dinkar

    2013-01-01

    This document outlines a set of metrics for evaluating the diagnostic and prognostic schemes developed for the Vehicle Integrated Prognostic Reasoner (VIPR), a system-level reasoner that encompasses the multiple levels of large, complex systems such as those for aircraft and spacecraft. VIPR health managers are organized hierarchically and operate together to derive diagnostic and prognostic inferences from symptoms and conditions reported by a set of diagnostic and prognostic monitors. For layered reasoners such as VIPR, the overall performance cannot be evaluated by metrics solely directed toward timely detection and accuracy of estimation of the faults in individual components. Among other factors, overall vehicle reasoner performance is governed by the effectiveness of the communication schemes between monitors and reasoners in the architecture, and the ability to propagate and fuse relevant information to make accurate, consistent, and timely predictions at different levels of the reasoner hierarchy. We outline an extended set of diagnostic and prognostics metrics that can be broadly categorized as evaluation measures for diagnostic coverage, prognostic coverage, accuracy of inferences, latency in making inferences, computational cost, and sensitivity to different fault and degradation conditions. We report metrics from Monte Carlo experiments using two variations of an aircraft reference model that supported both flat and hierarchical reasoning.

  16. All You Need to Know About Metric

    Science.gov (United States)

    American Metric Journal, 1974

    1974-01-01

    Information found necessary for South Africa's citizens to learn during their recent conversion to the metric system is presented. Twelve terms and prefixes are suggested that satisfy practically all ordinary needs. Tables are given for the most commonly used measures, with relationships between different units indicated. (LS)

  17. Strong ideal convergence in probabilistic metric spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. ... also important applications in nonlinear analysis [2]. The theory was brought to ...

  18. Calabi–Yau metrics and string compactification

    Directory of Open Access Journals (Sweden)

    Michael R. Douglas

    2015-09-01

    Full Text Available Yau proved an existence theorem for Ricci-flat Kähler metrics in the 1970s, but we still have no closed form expressions for them. Nevertheless there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.

  19. Reuse metrics and measurement: A framework

    Science.gov (United States)

    Reifer, Donald J.

    1990-01-01

    The lessons learned and experience gleaned are described by those who have started to implement the reuse metrics and measurement framework used in controlling the development of common avionics and software for its affiliated aircraft programs. The framework was developed to permit the measurement of the long-term costs/benefits resulting from the creation and use of Reusable Software Objects (RSOs). The framework also monitors the efficiency and effectiveness of the Software Reuse Library (SRL). The metrics and measurement framework, which was established to allow determinations and findings to be made about software reuse, is defined. Seven criteria that were used to guide the establishment of the proposed reuse framework are discussed. Object recapture and creation metrics are explained, along with their normalized use in effort, productivity, and quality determination. Single- and multiple-reuse-instance versions of a popular cost model are presented, which use these metrics and the proposed measurement scheme to predict software effort and duration under various reuse assumptions. Studies in using this model to predict actuals taken from the RCI database of over 1000 completed projects are discussed.

  20. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of the purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be a desirable construct. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. Therefore, a well-defined, well-normalized, well-visualized and well-experimented cohesion metric is proposed to indicate and thus enhance software cohesion strength. Furthermore, this cohesion metric can be easily incorporated into a software CASE tool to help software engineers improve software quality.
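
    To make the idea concrete, here is a deliberately simplified cohesion proxy based on which variables each statement touches. It is a sketch in the spirit of the live-variable analysis mentioned above, not the authors' metric; the function name and the averaging scheme are assumptions.

```python
def variable_usage_cohesion(statement_vars):
    """Simple cohesion proxy: for each variable, the fraction of statements that
    use it, averaged over all variables. 1.0 means every variable appears in
    every statement; values near 0 suggest the function does unrelated things.

    statement_vars: list of sets, one set of variable names per statement.
    """
    all_vars = set().union(*statement_vars) if statement_vars else set()
    if not all_vars or not statement_vars:
        return 0.0
    n = len(statement_vars)
    per_var = [sum(v in s for s in statement_vars) / n for v in all_vars]
    return sum(per_var) / len(per_var)

# Example: three statements of a small function and the variables each touches.
print(variable_usage_cohesion([{"x", "y"}, {"x", "total"}, {"total", "y"}]))  # ~0.67
```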

  1. Invariant metric for nonlinear symplectic maps

    Indian Academy of Sciences (India)

    ... as a function of system parameters, we demonstrate that the performance of a nonlinear Hamiltonian system is enhanced. Keywords: invariant metric; symplectic maps; performance optimization. PACS Nos 05.45 ...

  2. Random shortest path metrics with applications

    NARCIS (Netherlands)

    Engels, Christian; Manthey, Bodo; Raghavendra Rao, B.V.; Brieden, A.; Görgülü, Z.-K.; Krug, T.; Kropat, E.; Meyer-Nieberg, S.; Mihelcic, G.; Pickl, S.W.

    2012-01-01

    We consider random metric instances for optimization problems obtained as follows: Every edge of a complete graph gets a weight drawn independently at random. And then the length of an edge is the length of a shortest path with respect to these weights that connects its two endpoints. We prove that
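
    The construction described above is easy to sample; the sketch below draws i.i.d. exponential edge weights on a complete graph with NetworkX and returns the shortest-path closure, which is a random metric on the vertex set. The exponential distribution and the graph size are illustrative choices only.

```python
import random
import networkx as nx

def random_shortest_path_metric(n, seed=None):
    """Draw i.i.d. exponential weights on the complete graph K_n and return the
    all-pairs shortest-path distances, which form a (random) metric on n points."""
    rng = random.Random(seed)
    G = nx.complete_graph(n)
    for u, v in G.edges():
        G[u][v]["weight"] = rng.expovariate(1.0)   # distribution choice is illustrative
    return dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))

dist = random_shortest_path_metric(6, seed=42)
print(dist[0][5])   # metric distance between points 0 and 5
```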

  3. Business model metrics : An open repository

    NARCIS (Netherlands)

    Heikkila, M.; Bouwman, W.A.G.A.; Heikkila, J.; Solaimani, S.; Janssen, W.

    2015-01-01

    Development of successful business models has become a necessity in turbulent business environments, but compared to research on business modeling tools, attention to the role of metrics in designing business models in literature is limited. Building on existing approaches to business models and

  4. Metrical musings on Littlewood and friends

    DEFF Research Database (Denmark)

    Haynes, A.; Jensen, Jonas Lindstrøm; Kristensen, Simon

    We prove a metrical result on a family of conjectures related to the Littlewood conjecture, namely the original Littlewood conjecture, the mixed Littlewood conjecture of de Mathan and Teulié and a hybrid between a conjecture of Cassels and the Littlewood conjecture. It is shown that the set of nu...

  5. Click Model-Based Information Retrieval Metrics

    NARCIS (Netherlands)

    Chuklin, A.; Serdyukov, P.; de Rijke, M.

    2013-01-01

    In recent years many models have been proposed that are aimed at predicting clicks of web search users. In addition, some information retrieval evaluation metrics have been built on top of a user model. In this paper we bring these two directions together and propose a common approach to converting

  6. Strong Ideal Convergence in Probabilistic Metric Spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  7. A Paradigm for Security Metrics in Counterinsurgency

    Science.gov (United States)

    2011-06-10

    "The rest go on with their old measurements and expect me to fit them." — George Bernard Shaw, Playwright. The history of COIN security metrics ... at an estimated 263,000 in 1962. The book Souvenirs de la Bataille d'Alger, written by Saadi Yacef in 1962, inspired the movie Battle of Algiers.

  8. Contraction theorems in fuzzy metric space

    International Nuclear Information System (INIS)

    Farnoosh, R.; Aghajani, A.; Azhdari, P.

    2009-01-01

    In this paper, the results on fuzzy contractive mapping proposed by Dorel Mihet will be proved for B-contraction and C-contraction in the case of George and Veeramani fuzzy metric space. The existence of fixed point with weaker conditions will be proved; that is, instead of the convergence of subsequence, p-convergence of subsequence is used.

  9. Clean Cities 2010 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  10. Clean Cities 2011 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  11. A new universal colour image fidelity metric

    NARCIS (Netherlands)

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated colour space. The resulting colour image fidelity metric quantifies the distortion of a processed colour image relative to its original version. We evaluated the new colour image
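
    For reference, a minimal NumPy sketch of the grayscale universal image quality index that the record builds on is given below (global version, no sliding window); the colour-space extension itself is not reproduced, and the input image names are hypothetical.

```python
import numpy as np

def universal_quality_index(x, y):
    """Wang-Bovik universal image quality index between two grayscale images
    of equal size (global version, no sliding window). Returns a value in [-1, 1]."""
    x = x.astype(float).ravel()
    y = y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    denom = (vx + vy) * (mx**2 + my**2)
    return float(4 * cov * mx * my / denom) if denom > 0 else 1.0

# Hypothetical usage:
# q = universal_quality_index(original_gray, processed_gray)
```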

  12. Strong ideal convergence in probabilistic metric spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  13. Outsourced Similarity Search on Metric Data Assets

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Assent, Ira; Jensen, Christian S.

    2012-01-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example...

  14. A Lagrangian-dependent metric space

    International Nuclear Information System (INIS)

    El-Tahir, A.

    1989-08-01

    A generalized Lagrangian-dependent metric of the static isotropic spacetime is derived. Its behaviour should be governed by imposing physical constraints allowing to avert the pathological features of gravity at the strong field domain. This would restrict the choice of the Lagrangian form. (author). 10 refs

  15. Product fixed points in ordered metric spaces

    OpenAIRE

    Turinici, Mihai

    2011-01-01

    All product fixed point results in ordered metric spaces based on linear contractive conditions are but a vectorial form of the fixed point statement due to Nieto and Rodriguez-Lopez [Order, 22 (2005), 223-239], under the lines in Matkowski [Bull. Acad. Pol. Sci. (Ser. Sci. Math. Astronom. Phys.), 21 (1973), 323-324].

  16. Quantitative properties of the Schwarzschild metric

    Czech Academy of Sciences Publication Activity Database

    Křížek, Michal; Křížek, Filip

    2018-01-01

    Roč. 2018, č. 1 (2018), s. 1-10 Institutional support: RVO:67985840 Keywords : exterior and interior Schwarzschild metric * proper radius * coordinate radius Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics http://astro.shu-bg.net/pasb/index_files/Papers/2018/SCHWARZ8.pdf

  17. Visualizing energy landscapes with metric disconnectivity graphs.

    Science.gov (United States)

    Smeeton, Lewis C; Oakley, Mark T; Johnston, Roy L

    2014-07-30

    The visualization of multidimensional energy landscapes is important, providing insight into the kinetics and thermodynamics of a system, as well as the range of structures a system can adopt. It is, however, highly nontrivial, with the number of dimensions required for a faithful reproduction of the landscape far higher than can be represented in two or three dimensions. Metric disconnectivity graphs provide a possible solution, incorporating the landscape connectivity information present in disconnectivity graphs with structural information in the form of a metric. In this study, we present a new software package, PyConnect, which is capable of producing both disconnectivity graphs and metric disconnectivity graphs in two or three dimensions. We present as a test case the analysis of the 69-bead BLN coarse-grained model protein and show that, by choosing appropriate order parameters, metric disconnectivity graphs can resolve correlations between structural features on the energy landscape and the landscape's energetic and kinetic properties. Copyright © 2014 The Authors. Journal of Computational Chemistry published by Wiley Periodicals, Inc.

  18. A possible molecular metric for biological evolvability

    Indian Academy of Sciences (India)

    2012-06-25

    Proteins manifest themselves as phenotypic traits, retained or lost in living systems via evolutionary pressures. Simply ... the first such metric by utilizing the recently discovered stoichiometric margin of life for all known naturally occurring ... at the molecular level, are still debated under the context ...

  19. DIGITAL MARKETING: SUCCESS METRICS, FUTURE TRENDS

    OpenAIRE

    Preeti Kaushik

    2017-01-01

    Abstract – Business marketing is one of the areas that has been tremendously affected by the digital world in the last few years. Digital marketing refers to advertising through digital channels. This paper provides a detailed study of metrics to measure the success of a digital marketing platform and a glimpse of the future of these technologies by 2020.

  20. Description of the Sandia Validation Metrics Project

    International Nuclear Information System (INIS)

    TRUCANO, TIMOTHY G.; EASTERLING, ROBERT G.; DOWDING, KEVIN J.; PAEZ, THOMAS L.; URBINA, ANGEL; ROMERO, VICENTE J.; RUTHERFORD, BRIAN M.; HILLS, RICHARD GUY

    2001-01-01

    This report describes the underlying principles and goals of the Sandia ASCI Verification and Validation Program Validation Metrics Project. It also gives a technical description of two case studies, one in structural dynamics and the other in thermomechanics, that serve to focus the technical work of the project in Fiscal Year 2001

  1. Inferring feature relevances from metric learning

    DEFF Research Database (Denmark)

    Schulz, Alexander; Mokbel, Bassam; Biehl, Michael

    2015-01-01

    Powerful metric learning algorithms have been proposed in the last years which do not only greatly enhance the accuracy of distance-based classifiers and nearest neighbor database retrieval, but which also enable the interpretability of these operations by assigning explicit relevance weights to ...
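
    As a hedged illustration of how explicit relevance weights can be read off a learned metric, the sketch below normalizes the diagonal of a Mahalanobis matrix M into per-feature weights. The matrix here is invented, not produced by the algorithms discussed in the record, and the diagonal-based weighting is only one simple choice.

```python
import numpy as np

def feature_relevances(M):
    """Given a learned (positive semi-definite) Mahalanobis matrix M, return one
    non-negative relevance weight per feature, normalized to sum to 1. The diagonal
    of M indicates how strongly each single feature contributes to the learned
    distance d(x, y)^2 = (x - y)^T M (x - y)."""
    diag = np.clip(np.diag(M), 0.0, None)
    total = diag.sum()
    return diag / total if total > 0 else diag

# Toy learned metric over 4 features (hypothetical output of some metric learner).
M = np.array([[2.0, 0.3, 0.0, 0.0],
              [0.3, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.1, 0.0],
              [0.0, 0.0, 0.0, 0.0]])
print(feature_relevances(M))   # features with larger diagonal entries matter more
```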

  2. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    Science.gov (United States)

    Dias, Óscar J.; Lemos, José P.

    2003-11-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0), as well as the C metric in flat spacetime (Λ=0). These exact solutions describe a pair of accelerated black holes in the flat or cosmological constant background, with the acceleration A being provided by a strut in between that pushes away the two black holes or, alternatively, by strings hanging from infinity that pull them in. In this paper we analyze the extremal limits of the C metric in a background with a generic cosmological constant Λ>0, Λ=0, and Λ<0. One can generate the Nariai metric from the Schwarzschild-de Sitter solution by taking an appropriate limit, where the black hole event horizon approaches the cosmological horizon. Similarly, one can generate the Bertotti-Robinson metric from the Reissner-Nordström metric by taking the limit of the Cauchy horizon going into the event horizon of the black hole, as well as the anti-Nariai metric by taking an appropriate solution and limit. Using these methods we generate the C-metric counterparts of the Nariai, Bertotti-Robinson, and anti-Nariai solutions, among others. These C-metric counterparts are conformal to the product of two two-dimensional manifolds of constant curvature, the conformal factor depending on the angular coordinate. In addition, the C-metric extremal solutions have a conical singularity at least at one of the poles of their angular surfaces. We give a physical interpretation to these solutions; e.g., in the Nariai C metric (with topology dS2 × S̃2), to each point in the deformed two-sphere S̃2 there corresponds a dS2 spacetime, except for one point which corresponds to a dS2 spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these ...

  3. Selections of the metric projection operator and strict solarity of sets with continuous metric projection

    Science.gov (United States)

    Alimov, A. R.

    2017-07-01

    In a broad class of finite-dimensional Banach spaces, we show that a closed set with lower semicontinuous metric projection is a strict sun, admits a continuous selection of the metric projection operator onto it, has contractible intersections with balls, and its (nonempty) intersection with any closed ball is a retract of this ball. For sets with continuous metric projection, a number of new results relating the solarity of such sets to the stability of the operator of best approximation are obtained. Bibliography 25 titles.

  4. Computing strong metric dimension of some special classes of graphs by genetic algorithms

    Directory of Open Access Journals (Sweden)

    Kratica Jozef

    2008-01-01

    In this paper we consider the NP-hard problem of determining the strong metric dimension of graphs. The problem is solved by a genetic algorithm that uses binary encoding and standard genetic operators adapted to the problem. This represents the first attempt to solve this problem heuristically. We report experimental results for the two special classes of ORLIB test instances: crew scheduling and graph coloring.
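
    For readers unfamiliar with the approach, the sketch below is a minimal binary-encoded genetic algorithm with tournament selection, one-point crossover and bit-flip mutation. The fitness function is a placeholder only; the actual strong-metric-dimension objective and the problem-specific adaptations from the paper are not reproduced.

```python
import random

def binary_ga(fitness, n_bits, pop_size=40, generations=100,
              crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Minimal genetic algorithm with binary encoding, tournament selection,
    one-point crossover and bit-flip mutation (maximizes `fitness`)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            c1, c2 = p1[:], p2[:]
            if rng.random() < crossover_rate:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                     # bit-flip mutation
                for i in range(n_bits):
                    if rng.random() < mutation_rate:
                        child[i] ^= 1
                new_pop.append(child)
        pop = new_pop[:pop_size]
        best = max(pop + [best], key=fitness)
    return best

# Placeholder fitness: prefer chromosomes with few selected bits (a loose proxy for a
# "smallest resolving set" flavour of objective; the real constraint handling is omitted).
solution = binary_ga(lambda bits: -sum(bits), n_bits=20)
```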

  5. Information metric on instanton moduli spaces in nonlinear σ models

    International Nuclear Information System (INIS)

    Yahikozawa, Shigeaki

    2004-01-01

    We study the information metric on instanton moduli spaces in two-dimensional nonlinear σ models. In the CP^1 model, the information metric on the moduli space of one instanton with topological charge Q = k (k ≥ 1) is a three-dimensional hyperbolic metric, which corresponds to the Euclidean anti-de Sitter space-time metric in three dimensions, and the overall scale factor of the information metric is 4k^2/3; this means that the sectional curvature is −3/4k^2. We also calculate the information metric in the CP^2 model.

  6. Machine Learning for ATLAS DDM Network Metrics

    CERN Document Server

    Lassnig, Mario; The ATLAS collaboration; Vamosi, Ralf

    2016-01-01

    The increasing volume of physics data is posing a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from our ongoing automation efforts. First, we describe our framework for distributed data management and network metrics, automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.

  7. Data Complexity Metrics for XML Web Services

    Directory of Open Access Journals (Sweden)

    MISRA, S.

    2009-06-01

    Web services that are based on eXtensible Markup Language (XML) technologies enable integration of diverse IT processes and systems and have been gaining extraordinary acceptance, from the most basic to the most complicated business and scientific processes. Maintainability is one of the important factors that affect the quality of Web services, which can be seen as a kind of software project. The effective management of any type of software project requires modelling, measurement, and quantification. This study presents a metric for the assessment of the quality of Web services in terms of their maintainability. For this purpose we propose a data complexity metric that can be evaluated by analyzing the WSDL (Web Service Description Language) documents used for describing Web services.
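
    As a loose illustration of extracting a complexity signal from a WSDL document, the sketch below counts a few schema constructs with Python's standard XML parser. The construct weights and the file name are assumptions; this is not the metric defined in the study.

```python
import xml.etree.ElementTree as ET

def wsdl_data_complexity(wsdl_path):
    """Crude data-complexity proxy for a WSDL file: counts messages, operations,
    and XML Schema element/complexType declarations embedded in the types section."""
    tree = ET.parse(wsdl_path)
    counts = {"message": 0, "operation": 0, "element": 0, "complexType": 0}
    for node in tree.iter():
        local = node.tag.rsplit("}", 1)[-1]   # strip the XML namespace prefix
        if local in counts:
            counts[local] += 1
    # Weighting is illustrative only: nested complex types are assumed to cost more.
    return counts, counts["element"] + 2 * counts["complexType"]

# Hypothetical usage:
# counts, score = wsdl_data_complexity("StockQuoteService.wsdl")
```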

  8. A perceptual metric for photo retouching.

    Science.gov (United States)

    Kee, Eric; Farid, Hany

    2011-12-13

    In recent years, advertisers and magazine editors have been widely criticized for taking digital photo retouching to an extreme. Impossibly thin, tall, and wrinkle- and blemish-free models are routinely splashed onto billboards, advertisements, and magazine covers. The ubiquity of these unrealistic and highly idealized images has been linked to eating disorders and body image dissatisfaction in men, women, and children. In response, several countries have considered legislating the labeling of retouched photos. We describe a quantitative and perceptually meaningful metric of photo retouching. Photographs are rated on the degree to which they have been digitally altered by explicitly modeling and estimating geometric and photometric changes. This metric correlates well with perceptual judgments of photo retouching and can be used to objectively judge by how much a retouched photo has strayed from reality.

  9. Metrics for measuring distances in configuration spaces.

    Science.gov (United States)

    Sadeghi, Ali; Ghasemi, S Alireza; Schaefer, Bastian; Mohr, Stephan; Lill, Markus A; Goedecker, Stefan

    2013-11-14

    In order to characterize molecular structures we introduce configurational fingerprint vectors which are counterparts of quantities used experimentally to identify structures. The Euclidean distance between the configurational fingerprint vectors satisfies the properties of a metric and can therefore safely be used to measure dissimilarities between configurations in the high dimensional configuration space. In particular we show that these metrics are a perfect and computationally cheap replacement for the root-mean-square distance (RMSD) when one has to decide whether two noise contaminated configurations are identical or not. We introduce a Monte Carlo approach to obtain the global minimum of the RMSD between configurations, which is obtained from a global minimization over all translations, rotations, and permutations of atomic indices.
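
    A simplified, permutation-invariant fingerprint in the spirit of the record can be sketched as follows: sorted eigenvalues of a Gaussian matrix built from interatomic distances, compared with a Euclidean norm. The Gaussian width and the matrix construction are assumptions; the exact fingerprint of the paper is not reproduced.

```python
import numpy as np

def fingerprint(positions, sigma=1.0):
    """Permutation-invariant configurational fingerprint: sorted eigenvalues of a
    Gaussian overlap-like matrix built from interatomic distances (a simplified
    stand-in for the fingerprints discussed in the record)."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    M = np.exp(-(d / sigma) ** 2)            # symmetric, permutation-covariant matrix
    return np.sort(np.linalg.eigvalsh(M))    # eigenvalues are permutation-invariant

def fingerprint_distance(pos_a, pos_b, sigma=1.0):
    """Euclidean distance between fingerprints: a cheap dissimilarity measure that
    can replace RMSD-style comparisons for same-sized configurations."""
    return float(np.linalg.norm(fingerprint(pos_a, sigma) - fingerprint(pos_b, sigma)))

# Toy check: a permuted copy of the same configuration has distance close to zero.
rng = np.random.default_rng(1)
A = rng.normal(size=(8, 3))
B = A[rng.permutation(8)]
print(fingerprint_distance(A, B))   # close to zero (same structure up to permutation)
```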

  10. Metric-Aware Secure Service Orchestration

    Directory of Open Access Journals (Sweden)

    Gabriele Costa

    2012-12-01

    Secure orchestration is an important concern in the internet of services. Besides providing the required functionality, composite services must also provide a reasonable level of security in order to protect sensitive data. Thus, the orchestrator needs to check whether the complex service is able to satisfy certain properties. Some properties are expressed with metrics for a precise definition of requirements. The problem, then, is to analyse the values of metrics for a complex business process. In this paper we extend our previous work on the analysis of secure orchestration with quantifiable properties. We show how to define, verify and enforce quantitative security requirements in one framework together with other security properties. The proposed approach should help to select the most suitable service architecture and guarantee fulfilment of the declared security requirements.

  11. Anisotropic rectangular metric for polygonal surface remeshing

    KAUST Repository

    Pellenard, Bertrand

    2013-06-18

    We propose a new method for anisotropic polygonal surface remeshing. Our algorithm takes as input a surface triangle mesh. An anisotropic rectangular metric, defined at each triangle facet of the input mesh, is derived from both a user-specified normal-based tolerance error and the requirement to favor rectangle-shaped polygons. Our algorithm uses a greedy optimization procedure that adds, deletes and relocates generators so as to match two criteria related to partitioning and conformity.

  12. Quantitative metric theory of continued fractions

    Indian Academy of Sciences (India)

    Quantitative versions of the central results of the metric theory of continued fractions were given primarily by C. De Vroedt. In this paper we give improvements of the bounds involved. For a real number $x$, let
    $$x = c_0 + \cfrac{1}{c_1 + \cfrac{1}{c_2 + \cfrac{1}{c_3 + \cfrac{1}{c_4 + \ddots}}}}.$$
    A sample result we prove is that, given $\epsilon > 0$, $\bigl(c_1(x) \cdots c_n(x)\bigr)^{1/n}$ ...

  13. Agile Metrics: Progress Monitoring of Agile Contractors

    Science.gov (United States)

    2014-01-01

    ... epic. The short timeframe is usually called an iteration or, in Scrum-based teams, a sprint; multiple iterations make up a release [Lapham 2011]. ... [Rawsthorne 2012] Rawsthorne, Dan. Monitoring Scrum Projects with AgileEVM and Earned Business Value Metrics (EBV). 2012. http... "AgileEVM – Earned Value Management in Scrum Projects." Presented at Agile2006, 23-28 July 2006. [USAF 2008] United States Air Force. United ...

  14. Primordial magnetic fields from metric perturbations

    CERN Document Server

    Maroto, A L

    2001-01-01

    We study the amplification of electromagnetic vacuum fluctuations induced by the evolution of scalar metric perturbations at the end of inflation. Such perturbations break the conformal invariance of Maxwell equations in Friedmann-Robertson-Walker backgrounds and allow the growth of magnetic fields on super-Hubble scales. We estimate the strength of the fields generated by this mechanism on galactic scales and compare the results with the present bounds on the galactic dynamo seed fields.

  15. Design Management: Metrics and Visual Tools

    OpenAIRE

    Abou Ibrahim, Hisham; Hamzeh, Farook

    2017-01-01

    The iterative and multidisciplinary nature of design complicates its management. Nonetheless, the lack of adequate tools that can be used to manage design dynamics negatively affects the design process as well as the quality of the final design deliverables. In this regard, this paper introduces new metrics to measure information flow in BIM projects, and elaborates on the Level of Development (LOD) concept to reflect the design maturity of model elements in the corresponding design context. ...

  16. Metrics in Keplerian orbits quotient spaces

    Science.gov (United States)

    Milanov, Danila V.

    2018-03-01

    Quotient spaces of Keplerian orbits are important instruments for the modelling of orbit samples of celestial bodies on a large time span. We suppose that variations of the orbital eccentricities, inclinations and semi-major axes remain sufficiently small, while arbitrary perturbations are allowed for the arguments of pericentres or longitudes of the nodes, or both. The distance between orbits or their images in quotient spaces serves as a numerical criterion for such problems of Celestial Mechanics as the search for a common origin of meteoroid streams, comets, and asteroids, asteroid family identification, and others. In this paper, we consider quotient sets of the non-rectilinear Keplerian orbits space H. Their elements are identified irrespective of the values of pericentre arguments or node longitudes. We prove that distance functions on the quotient sets, introduced in Kholshevnikov et al. (Mon Not R Astron Soc 462:2275-2283, 2016), satisfy the metric space axioms and discuss the theoretical and practical importance of this result. Isometric embeddings of the quotient spaces into R^n, and into a space of compact subsets of H with the Hausdorff metric, are constructed. The Euclidean representations of the orbit spaces find their applications in the problem of orbit averaging and in computational algorithms specific to Euclidean space. We also explore completions of H and its quotient spaces with respect to the corresponding metrics and establish a relation between elements of the extended spaces and rectilinear trajectories. The distance between an orbit and subsets of elliptic and hyperbolic orbits is calculated. This quantity provides an upper bound for the metric value in a problem of close orbits identification. Finally, the invariance of the equivalence relations in H under coordinate changes is discussed.

  17. Smart Grid Status and Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Balducci, Patrick J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Weimar, Mark R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kirkham, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  18. Quantitative Embeddability and Connectivity in Metric Spaces

    Science.gov (United States)

    Eriksson-Bique, Sylvester David

    This thesis studies three analytic and quantitative questions on doubling metric (measure) spaces. These results are largely independent and will be presented in separate chapters. The first question concerns representing metric spaces arising from complete Riemannian manifolds in Euclidean space. More precisely, we find bi-Lipschitz embeddings ƒ for subsets A of complete Riemannian manifolds M of dimension n, where N could depend on a bound on the curvature and diameter of A. The main difficulty here is to control the distortion of such embeddings in terms of the curvature of the manifold. In constructing the embeddings, we will study the collapsing theory of manifolds in detail and at multiple scales. Similar techniques give embeddings for subsets of complete Riemannian orbifolds and quotient metric spaces. The second part of the thesis answers a question about finding quantitative and weak conditions that ensure large families of rectifiable curves connecting pairs of points. These families of rectifiable curves are quantified in terms of Poincare inequalities. We identify a new quantitative connectivity condition in terms of curve fragments, which is equivalent to possessing a Poincare inequality with some exponent. The connectivity condition arises naturally in three different contexts, and we present methods to find Poincare inequalities for the spaces involved. In particular, we prove such inequalities for spaces with weak curvature bounds and thus resolve a question of Tapio Rajala. In the final part of the thesis we study the local geometry of spaces admitting differentiation of Lipschitz functions with certain Banach space targets. The main result shows that such spaces can be characterized in terms of Poincare inequalities and doubling conditions. In fact, such spaces can be covered by countably many pieces, each of which is an isometric subset of a doubling metric measure space admitting a Poincare inequality. In proving this, we will find a new way to

  19. Metric entropy in linear inverse scattering

    Directory of Open Access Journals (Sweden)

    M. A. Maisto

    2016-09-01

    Full Text Available The role of multiple views and/or multiple frequencies on the achievable performance in linear inverse scattering problems is addressed. To this end, the impact of views and frequencies on the Kolmogorov entropy measure is studied. This way the metric information that can be conveyed back from data to the unknown can be estimated. For the sake of simplicity, the study deals with strip scatterers and the cases of discrete angles of incidence and/or frequencies.

  20. The Planck Vacuum and the Schwarzschild Metrics

    Directory of Open Access Journals (Sweden)

    Daywitt W. C.

    2009-07-01

    The Planck vacuum (PV) is assumed to be the source of the visible universe. So under conditions of sufficient stress, there must exist a pathway through which energy from the PV can travel into this universe. Conversely, the passage of energy from the visible universe to the PV must also exist under the same stressful conditions. The following examines two versions of the Schwarzschild metric equation for compatibility with this open-pathway idea.

  1. Fast Link Adaptation for MIMO-OFDM

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm; Kant, Shashi; Wehinger, Joachim

    2010-01-01

    We investigate link-quality metrics (LQMs) based on raw bit-error-rate, effective signal-to-interference-plus-noise ratio, and mutual information (MI) for the purpose of fast link adaptation (LA) in communication systems employing orthogonal frequency-division multiplexing and multiple-input–mult...
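
    Two link-quality metrics of the kind listed above can be sketched as follows: an exponential effective SINR mapping (EESM), one common way to compress per-subcarrier SINRs into a single effective value, and a simple mean mutual-information metric. The per-subcarrier SINR values and the calibration constant beta are invented for illustration; the record's own LQM definitions are not reproduced here.

```python
import numpy as np

def eesm_effective_sinr(sinr_linear, beta):
    """Exponential Effective SINR Mapping: compresses per-subcarrier SINRs into a
    single equivalent AWGN SINR. `beta` is an MCS-dependent calibration constant."""
    sinr_linear = np.asarray(sinr_linear, dtype=float)
    return -beta * np.log(np.mean(np.exp(-sinr_linear / beta)))

def mean_mutual_information(sinr_linear):
    """Gaussian-input mutual information (bits per symbol), averaged over
    subcarriers: a simple MI-based link-quality metric."""
    return float(np.mean(np.log2(1.0 + np.asarray(sinr_linear, dtype=float))))

# Hypothetical per-subcarrier SINRs (linear scale) after MIMO detection.
sinr = 10 ** (np.array([3.0, 8.0, 12.0, 5.0, 15.0]) / 10.0)
print(eesm_effective_sinr(sinr, beta=4.0), mean_mutual_information(sinr))
```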

  2. Adaptive dissimilarity measures, dimension reduction and visualization

    NARCIS (Netherlands)

    Bunte, Kerstin

    2011-01-01

    My thesis presents a number of extensions of the Learning Vector Quantization algorithm based on the concept of adaptive similarity measures. This form of metric learning can be used in a wide variety of applications. In the first part of this thesis ...

  3. Ideal Based Cyber Security Technical Metrics for Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    W. F. Boyer; M. A. McQueen

    2007-10-01

    Much of the world's critical infrastructure is at risk from attack through electronic networks connected to control systems. Security metrics are important because they provide the basis for management decisions that affect the protection of the infrastructure. A cyber security technical metric is the security relevant output from an explicit mathematical model that makes use of objective measurements of a technical object. A specific set of technical security metrics is proposed for use by the operators of control systems. Our proposed metrics are based on seven security ideals associated with seven corresponding abstract dimensions of security. We have defined at least one metric for each of the seven ideals. Each metric is a measure of how nearly the associated ideal has been achieved. These seven ideals provide a useful structure for further metrics development. A case study shows how the proposed metrics can be applied to an operational control system.

  4. On induced hermitian metrics for holomorphic vector bundles

    International Nuclear Information System (INIS)

    Hoang Le Minh.

    1989-09-01

    An explicit computation of induced hermitian metrics on holomorphic vector bundles is given. As an example the Fubini-Study metrics for complex projective spaces and Grassmannians are considered. (author). 5 refs

  5. A Fundamental Metric for Metal Recycling Applied to Coated Magnesium

    NARCIS (Netherlands)

    Meskers, C.E.M.; Reuter, M.A.; Boin, U.; Kvithyld, A.

    2008-01-01

    A fundamental metric for the assessment of the recyclability and, hence, the sustainability of coated magnesium scrap is presented; this metric combines kinetics and thermodynamics. The recycling process, consisting of thermal decoating and remelting, was studied by thermogravimetry and differential

  6. Using a safety forecast model to calculate future safety metrics.

    Science.gov (United States)

    2017-05-01

    This research sought to identify a process to improve long-range planning prioritization by using forecasted safety metrics in place of the existing Utah Department of Transportation Safety Index, a metric based on historical crash data. The res...

  7. The Entropy-Based Quantum Metric

    Directory of Open Access Journals (Sweden)

    Roger Balian

    2014-07-01

    Full Text Available The von Neumann entropy S(D̂) generates in the space of quantum density matrices D̂ the Riemannian metric ds² = −d²S(D̂), which is physically founded and which characterises the amount of quantum information lost by mixing D̂ and D̂ + dD̂. A rich geometric structure is thereby implemented in quantum mechanics. It includes a canonical mapping between the spaces of states and of observables, which involves the Legendre transform of S(D̂). The Kubo scalar product is recovered within the space of observables. Applications are given to equilibrium and non-equilibrium quantum statistical mechanics. There the formalism is specialised to the relevant space of observables and to the associated reduced states issued from the maximum entropy criterion, which result from the exact states through an orthogonal projection. Von Neumann's entropy specialises into a relevant entropy. Comparison is made with other metrics. The Riemannian properties of the metric ds² = −d²S(D̂) are derived. The curvature arises from the non-Abelian nature of quantum mechanics; its general expression and its explicit form for q-bits are given, as well as geodesics.

  8. Metrics of the perception of body movement.

    Science.gov (United States)

    Giese, Martin A; Thornton, Ian; Edelman, Shimon

    2008-07-28

    Body movements are recognized with speed and precision, even from strongly impoverished stimuli. While cortical structures involved in biological motion recognition have been identified, the nature of the underlying perceptual representation remains largely unknown. We show that visual representations of complex body movements are characterized by perceptual spaces with well-defined metric properties. By multidimensional scaling, we reconstructed from similarity judgments the perceptual space configurations of stimulus sets generated by motion morphing. These configurations resemble the true stimulus configurations in the space of morphing weights. In addition, we found an even higher similarity between the perceptual metrics and the metrics of a physical space that was defined by distance measures between joint trajectories, which compute spatial trajectory differences after time alignment using a robust error norm. These outcomes were independent of the experimental paradigm for the assessment of perceived similarity (pairs-comparison vs. delayed match-to-sample) and of the method of stimulus presentation (point-light stimuli vs. stick figures). Our findings suggest that the visual perception of body motion is veridical and closely reflects physical similarities between joint trajectories. This implies that representations of form and motion share fundamental properties and places constraints on the computational mechanisms that support the recognition of biological motion patterns.
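
    The multidimensional-scaling step described above can be illustrated with a short, self-contained sketch: given a matrix of pairwise dissimilarity judgments, MDS recovers a low-dimensional configuration whose inter-point distances approximate those judgments. The 4x4 judgment matrix below is invented for illustration and is not the study's data.

      # Sketch: recover a low-dimensional "perceptual space" from pairwise
      # dissimilarity judgments via multidimensional scaling (MDS).
      import numpy as np
      from sklearn.manifold import MDS

      # Symmetric dissimilarity matrix for 4 hypothetical motion-morph stimuli
      # (0 = identical, larger = judged more different).
      D = np.array([
          [0.0, 1.0, 2.0, 2.5],
          [1.0, 0.0, 1.2, 2.2],
          [2.0, 1.2, 0.0, 1.1],
          [2.5, 2.2, 1.1, 0.0],
      ])

      mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
      coords = mds.fit_transform(D)   # 2-D configuration of the stimuli
      print(coords)
      print("stress:", mds.stress_)   # lower stress = better fit to the judgments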

  9. A computational imaging target specific detectivity metric

    Science.gov (United States)

    Preece, Bradley L.; Nehmetallah, George

    2017-05-01

    Due to the large quantity of low-cost, high-speed computational processing available today, computational imaging (CI) systems are expected to have a major role for next generation multifunctional cameras. The purpose of this work is to quantify the performance of these CI systems in a standardized manner. Due to the diversity of CI system designs that are available today or proposed in the near future, significant challenges arise in modeling and calculating a standardized detection signal-to-noise ratio (SNR) to measure the performance of these systems. In this paper, we developed a path forward for a standardized detectivity metric for CI systems. The detectivity metric is designed to evaluate the performance of a CI system searching for a specific known target or signal of interest, and is defined as the optimal linear matched filter SNR, similar to the Hotelling SNR, calculated in computational space with special considerations for standardization. Therefore, the detectivity metric is designed to be flexible, in order to handle various types of CI systems and specific targets, while keeping the complexity and assumptions of the systems to a minimum.
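
    The abstract defines the detectivity metric as an optimal linear matched-filter SNR similar to the Hotelling SNR. As a rough, generic illustration (not the authors' standardized procedure), that quantity can be computed as d = sqrt(sᵀ K⁻¹ s) for a known target signature s and noise covariance K; both are synthetic placeholders below.

      # Sketch of a Hotelling-style matched-filter SNR: d^2 = s^T K^{-1} s,
      # where s is the known target signature and K the noise covariance.
      # The signal and covariance here are synthetic placeholders.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 64
      s = np.sin(np.linspace(0, 2 * np.pi, n))      # assumed target signature
      A = rng.normal(size=(n, n))
      K = A @ A.T / n + 0.1 * np.eye(n)             # synthetic noise covariance (SPD)

      # Solve K w = s rather than inverting K explicitly.
      w = np.linalg.solve(K, s)
      snr = float(np.sqrt(s @ w))                   # matched-filter (Hotelling) SNR
      print(f"matched-filter SNR: {snr:.2f}")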

  10. Toward a meaningful metric of implicit prejudice.

    Science.gov (United States)

    Blanton, Hart; Jaccard, James; Strauts, Erin; Mitchell, Gregory; Tetlock, Philip E

    2015-09-01

    [Correction Notice: An Erratum for this article was reported in Vol 100(5) of Journal of Applied Psychology (see record 2015-40760-001). There are errors in some of the values listed in Table 6 that do not alter any of the conclusions or substantive statements in the original article. The corrected portion of Table 6 is in the correction. The positive intercepts in this table represent the estimated IAT score when the criterion has a value of zero (suggesting attitudinal neutrality), except in the equation examining voter preference in Greenwald et al. (2009), where the intercept estimated the IAT score of Obama voters.] The modal distribution of the Implicit Association Test (IAT) is commonly interpreted as showing high levels of implicit prejudice among Americans. These interpretations have fueled calls for changes in organizational and legal practices, but such applications are problematic because the IAT is scored on an arbitrary psychological metric. The present research was designed to make the IAT metric less arbitrary by determining the scores on IAT measures that are associated with observable racial or ethnic bias. By reexamining data from published studies, we found evidence that the IAT metric is "right biased," such that individuals who are behaviorally neutral tend to have positive IAT scores. Current scoring conventions fail to take into account these dynamics and can lead to faulty inferences about the prevalence of implicit prejudice. (c) 2015 APA, all rights reserved.

  11. A remodelling metric for angular fibre distributions and its application to diseased carotid bifurcations.

    LENUS (Irish Health Repository)

    Creane, Arthur

    2012-07-01

    Many soft biological tissues contain collagen fibres, which act as major load bearing constituents. The orientation and the dispersion of these fibres influence the macroscopic mechanical properties of the tissue and are therefore of importance in several areas of research including constitutive model development, tissue engineering and mechanobiology. Qualitative comparisons between these fibre architectures can be made using vector plots of mean orientations and contour plots of fibre dispersion but quantitative comparison cannot be achieved using these methods. We propose a 'remodelling metric' between two angular fibre distributions, which represents the mean rotational effort required to transform one into the other. It is an adaptation of the earth mover's distance, a similarity measure between two histograms/signatures used in image analysis, which represents the minimal cost of transforming one distribution into the other by moving distribution mass around. In this paper, its utility is demonstrated by considering the change in fibre architecture during a period of plaque growth in finite element models of the carotid bifurcation. The fibre architecture is predicted using a strain-based remodelling algorithm. We investigate the remodelling metric's potential as a clinical indicator of plaque vulnerability by comparing results between symptomatic and asymptomatic carotid bifurcations. Fibre remodelling was found to occur at regions of plaque burden. As plaque thickness increased, so did the remodelling metric. A measure of the total predicted fibre remodelling during plaque growth, TRM, was found to be higher in the symptomatic group than in the asymptomatic group. Furthermore, a measure of the total fibre remodelling per plaque size, TRM/TPB, was found to be significantly higher in the symptomatic vessels. The remodelling metric may prove to be a useful tool in other soft tissues and engineered scaffolds where fibre adaptation is also present.
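
    For readers unfamiliar with the earth mover's distance underlying the remodelling metric, the toy sketch below compares two binned angular fibre distributions with SciPy's one-dimensional Wasserstein distance. It ignores the circular wrap-around and the other adaptations made in the paper; the bin layout and histograms are invented.

      # Toy earth-mover-style distance between two binned angular fibre
      # distributions (degrees). Circular wrap-around is ignored here.
      import numpy as np
      from scipy.stats import wasserstein_distance

      bin_centers = np.arange(-85.0, 90.0, 10.0)    # orientation bins in degrees

      hist_before = np.exp(-0.5 * ((bin_centers - 0.0) / 20.0) ** 2)
      hist_after  = np.exp(-0.5 * ((bin_centers - 30.0) / 25.0) ** 2)
      hist_before /= hist_before.sum()
      hist_after  /= hist_after.sum()

      # Mean "rotational effort" (degrees) to morph one distribution into the
      # other, computed as a 1-D Wasserstein distance over the bin centres.
      effort = wasserstein_distance(bin_centers, bin_centers,
                                    u_weights=hist_before, v_weights=hist_after)
      print(f"remodelling effort ~ {effort:.1f} degrees")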

  12. Metrics. [measurement for effective software development and management

    Science.gov (United States)

    Mcgarry, Frank

    1991-01-01

    A development status evaluation is presented for practical software performance measurement, or 'metrics', in which major innovations have recently occurred. Metrics address such aspects of software performance as whether a software project is on schedule, how many errors can be expected from it, whether the methodology being used is effective and the relative quality of the software employed. Metrics may be characterized as explicit, analytical, and subjective. Attention is given to the bases for standards and the conduct of metrics research.

  13. What can article-level metrics do for you?

    Science.gov (United States)

    Fenner, Martin

    2013-10-01

    Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  14. A note on generalized metrics on complex manifolds

    International Nuclear Information System (INIS)

    Rastogi, S.C.

    1986-08-01

    In 1981, Hojo introduced a generalized metric function Φ^(p), where p (≠ 1) is a real number, in a Finsler space and studied some beautiful consequences of such a metric function. The aim of this paper is to investigate the possibility of introducing a similar metric function on a complex manifold studied by Rund. It is interesting to note that such an introduction is unnatural for values of p other than 2, which corresponds to the metric function introduced by Rund. (author)

  15. Existing Model Metrics and Relations to Model Quality

    OpenAIRE

    Mohagheghi, Parastoo; Dehlen, Vegard

    2009-01-01

    This paper presents quality goals for models and provides a state-of-the-art analysis regarding model metrics. While model-based software development often requires assessing the quality of models at different abstraction and precision levels and developed for multiple purposes, existing work on model metrics do not reflect this need. Model size metrics are descriptive and may be used for comparing models but their relation to model quality is not welldefined. Code metrics are proposed to be ...

  16. What can article-level metrics do for you?

    Directory of Open Access Journals (Sweden)

    Martin Fenner

    2013-10-01

    Full Text Available Article-level metrics (ALMs) provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, usage statistics, discussions in online comments and social media, social bookmarking, and recommendations. In this essay, we describe why article-level metrics are an important extension of traditional citation-based journal metrics and provide a number of examples from ALM data collected for PLOS Biology.

  17. Not in one metric: Neuroticism modulates different resting state metrics within distinctive brain regions.

    Science.gov (United States)

    Gentili, Claudio; Cristea, Ioana Alina; Ricciardi, Emiliano; Vanello, Nicola; Popita, Cristian; David, Daniel; Pietrini, Pietro

    2017-06-01

    Neuroticism is a complex personality trait encompassing diverse aspects. Notably, high levels of neuroticism are related to the onset of psychiatric conditions, including anxiety and mood disorders. Personality traits are stable individual features; therefore, they can be expected to be associated with stable neurobiological features, including the Brain Resting State (RS) activity as measured by fMRI. Several metrics have been used to describe RS properties, yielding rather inconsistent results. This inconsistency could be due to the fact that different metrics portray different RS signal properties and that these properties may be differently affected by neuroticism. To explore the distinct effects of neuroticism, we assessed several distinct metrics portraying different RS properties within the same population. Neuroticism was measured in 31 healthy subjects using the Zuckerman-Kuhlman Personality Questionnaire; RS was acquired by high-resolution fMRI. Using linear regression, we examined the modulatory effects of neuroticism on RS activity, as quantified by the Amplitude of low frequency fluctuations (ALFF, fALFF), regional homogeneity (REHO), Hurst Exponent (H), global connectivity (GC) and amygdalae functional connectivity. Neuroticism modulated the different metrics across a wide network of brain regions, including emotional regulatory, default mode and visual networks. Except for some similarities in key brain regions for emotional expression and regulation, neuroticism affected different metrics in different ways. Metrics more related to the measurement of regional intrinsic brain activity (fALFF, ALFF and REHO), or that provide a parsimonious index of integrated and segregated brain activity (HE), were more broadly modulated in regions related to emotions and their regulation. Metrics related to connectivity were modulated across a wider network of areas. Overall, these results show that neuroticism affects distinct aspects of brain resting state activity

  18. [Measuring in Metric:] A Teacher's Workshop Manual for Individualized Instruction.

    Science.gov (United States)

    Sorenson, Juanita S.; And Others

    This manual contains nine modules on the metric system: background and overview, length and basic prefixes, volume, mass, temperature, relationships within metric system, everyday applications, relationships between metric and English, and developing a teaching plan. Each module states objectives, suggested activities, and illustration of mastery.…

  19. An inheritance complexity metric for object-oriented code: A ...

    Indian Academy of Sciences (India)

    An inheritance complexity metric for object-oriented code: A cognitive approach ... Software metrics; object-oriented programming; software complexity; cognitive weights; measurement theory; empirical validation. ... In this paper, we propose a cognitive complexity metric for evaluating design of object-oriented (OO) code.

  20. OPTIMAL EMBEDDINGS OF FINITE METRIC SPACES INTO GRAPHS

    Directory of Open Access Journals (Sweden)

    Derya Çelik

    2015-12-01

    Full Text Available We consider the embedding of a finite metric space into a weighted graph in such a way that the total weight of the edges is minimal. We discuss metric spaces with  points in detail and show that the already known classification for these cases can be obtained by simple operations on the associated graph of the given metric space.

  1. A note on a paraholomorphic Cheeger–Gromoll metric

    Indian Academy of Sciences (India)

    on the tangent bundle of Riemannian manifolds. Keywords. Cheeger–Gromoll metric; pure metric; paracomplex structure; paraholomorphic tensor field. 1. Introduction. In [1], Cheeger and Gromoll study complete manifolds of nonnegative curvature and suggest a construction of Riemannian metrics useful in that context.

  2. An inheritance complexity metric for object-oriented code: A ...

    Indian Academy of Sciences (India)

    Abstract. Software metrics should be used in order to improve the productivity and quality of software, because they provide critical information about reliability and maintainability of the system. In this paper, we propose a cognitive complexity metric for evaluating design of object-oriented (OO) code. The proposed metric is ...

  3. On the completion of partial metric spaces | Van Dung | Quaestiones ...

    African Journals Online (AJOL)

    In this note, we give an example to answer affirmatively Ge-Lin's question on the completion of partial metric spaces [3, Question 1]. We also study some related results on the completion of a partial metric space. Keywords: Partial metric space, completion ...

  4. Invariant Einstein metrics on Ledger-Obata spaces

    OpenAIRE

    Chen, Zhiqi; Nikonorov, Yuriĭ; Nikonorova, Yulia

    2016-01-01

    In this paper, we study invariant Einstein metrics on Ledger-Obata spaces $F^m/\operatorname{diag}(F)$. In particular, we classify invariant Einstein metrics on $F^4/\operatorname{diag}(F)$ and estimate the number of invariant Einstein metrics on general Ledger-Obata spaces $F^{m}/\operatorname{diag}(F)$.

  5. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  6. The universal connection and metrics on moduli spaces

    International Nuclear Information System (INIS)

    Massamba, Fortune; Thompson, George

    2003-11-01

    We introduce a class of metrics on gauge theoretic moduli spaces. These metrics are made out of the universal matrix that appears in the universal connection construction of M. S. Narasimhan and S. Ramanan. As an example we construct metrics on the c₂ = 1 SU(2) moduli space of instantons on R⁴ for various universal matrices. (author)

  7. Modified intuitionistic fuzzy metric spaces and some fixed point theorems

    International Nuclear Information System (INIS)

    Saadati, R.; Sedghi, S.; Shobe, N.

    2008-01-01

    The intuitionistic fuzzy metric space has extra conditions (see [Gregori V, Romaguera S, Veereamani P. A note on intuitionistic fuzzy metric spaces. Chaos, Solitons and Fractals 2006;28:902-5]). In this paper, we consider modified intuitionistic fuzzy metric spaces and prove some fixed point theorems in these spaces. All the results presented in this paper are new.

  8. Tide or Tsunami? The Impact of Metrics on Scholarly Research

    Science.gov (United States)

    Bonnell, Andrew G.

    2016-01-01

    Australian universities are increasingly resorting to the use of journal metrics such as impact factors and ranking lists in appraisal and promotion processes, and are starting to set quantitative "performance expectations" which make use of such journal-based metrics. The widespread use and misuse of research metrics is leading to…

  9. Understanding Acceptance of Software Metrics--A Developer Perspective

    Science.gov (United States)

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  10. Robustness of climate metrics under climate policy ambiguity

    International Nuclear Information System (INIS)

    Ekholm, Tommi; Lindroos, Tomi J.; Savolainen, Ilkka

    2013-01-01

    Highlights: • We assess the economic impacts of using different climate metrics. • The setting is cost-efficient scenarios for three interpretations of the 2 °C target. • With each target setting, the optimal metric is different. • Therefore policy ambiguity prevents the selection of an optimal metric. • Robust metric values that perform well with multiple policy targets however exist. -- Abstract: A wide array of alternatives has been proposed as the common metrics with which to compare the climate impacts of different emission types. Different physical and economic metrics and their parameterizations give diverse weights between e.g. CH₄ and CO₂, and fixing the metric from one perspective makes it sub-optimal from another. As the aims of global climate policy involve some degree of ambiguity, it is not possible to determine a metric that would be optimal and consistent with all policy aims. This paper evaluates the cost implications of using predetermined metrics in cost-efficient mitigation scenarios. Three formulations of the 2 °C target, including both deterministic and stochastic approaches, shared a wide range of metric values for CH₄ with which the mitigation costs are only slightly above the cost-optimal levels. Therefore, although ambiguity in current policy might prevent us from selecting an optimal metric, it can be possible to select robust metric values that perform well with multiple policy targets

  11. A Quantum Hybrid PSO Combined with Fuzzy k-NN Approach to Feature Selection and Cell Classification in Cervical Cancer Detection.

    Science.gov (United States)

    Iliyasu, Abdullah M; Fatichah, Chastine

    2017-12-19

    A quantum hybrid (QH) intelligent approach that blends the adaptive search capability of the quantum-behaved particle swarm optimisation (QPSO) method with the intuitionistic rationality of the traditional fuzzy k-nearest neighbours (Fuzzy k-NN) algorithm (known simply as the Q-Fuzzy approach) is proposed for efficient feature selection and classification of cells in cervical smeared (CS) images. From an initial multitude of 17 features describing the geometry, colour, and texture of the CS images, the QPSO stage of our proposed technique is used to select the best subset features (i.e., global best particles) that represent a pruned down collection of seven features. Using a dataset of almost 1000 images, performance evaluation of our proposed Q-Fuzzy approach assesses the impact of our feature selection on classification accuracy by way of three experimental scenarios that are compared alongside two other approaches: the All-features (i.e., classification without prior feature selection) and another hybrid technique combining the standard PSO algorithm with the Fuzzy k-NN technique (P-Fuzzy approach). In the first and second scenarios, we further divided the assessment criteria in terms of classification accuracy based on the choice of best features and those in terms of the different categories of the cervical cells. In the third scenario, we introduced new QH hybrid techniques, i.e., QPSO combined with other supervised learning methods, and compared the classification accuracy alongside our proposed Q-Fuzzy approach. Furthermore, we employed statistical approaches to establish qualitative agreement with regard to the feature selection in experimental scenarios 1 and 3. The synergy between the QPSO and Fuzzy k-NN in the proposed Q-Fuzzy approach improves classification accuracy, as manifested in the reduction in the number of cell features, which is crucial for effective cervical cancer detection and diagnosis.
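
    A minimal sketch of the fuzzy k-NN classification rule that the Q-Fuzzy approach builds on is given below (inverse-distance membership weighting in the style of Keller et al., simplified to crisp training labels). The two-feature synthetic data merely stands in for the selected cell features; the QPSO feature-selection stage is not reproduced.

      # Minimal fuzzy k-NN sketch: class memberships weighted by inverse distance.
      # Synthetic 2-feature data stands in for the selected cell features.
      import numpy as np

      def fuzzy_knn_memberships(X_train, y_train, x, k=3, m=2.0):
          """Return class-membership degrees of query x (simplified Keller-style rule)."""
          d = np.linalg.norm(X_train - x, axis=1)
          idx = np.argsort(d)[:k]                               # k nearest neighbours
          w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1.0))
          classes = np.unique(y_train)
          u = np.array([w[y_train[idx] == c].sum() for c in classes])
          return classes, u / u.sum()

      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
      y = np.array([0] * 20 + [1] * 20)

      classes, u = fuzzy_knn_memberships(X, y, x=np.array([2.0, 2.0]), k=5)
      print(dict(zip(classes.tolist(), np.round(u, 3))))   # fuzzy class memberships
      print("predicted class:", classes[np.argmax(u)])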

  12. Massless and massive quanta resulting from a mediumlike metric tensor

    International Nuclear Information System (INIS)

    Soln, J.

    1985-01-01

    A simple model of the ''primordial'' scalar field theory is presented in which the metric tensor is a generalization of the metric tensor from electrodynamics in a medium. The radiation signal corresponding to the scalar field propagates with a velocity that is generally less than c. This signal can be associated simultaneously with imaginary and real effective (momentum-dependent) masses. The requirement that the imaginary effective mass vanishes, which we take to be the prerequisite for the vacuumlike signal propagation, leads to the ''spontaneous'' splitting of the metric tensor into two distinct metric tensors: one metric tensor gives rise to masslesslike radiation and the other to a massive particle. (author)

  13. Pragmatic security metrics applying metametrics to information security

    CERN Document Server

    Brotby, W Krag

    2013-01-01

    Other books on information security metrics discuss number theory and statistics in academic terms. Light on mathematics and heavy on utility, PRAGMATIC Security Metrics: Applying Metametrics to Information Security breaks the mold. This is the ultimate how-to-do-it guide for security metrics. Packed with time-saving tips, the book offers easy-to-follow guidance for those struggling with security metrics. Step by step, it clearly explains how to specify, develop, use, and maintain an information security measurement system (a comprehensive suite of metrics) to

  14. On Fixed Point Theorems in Probabilistic Metric Spaces and Applications

    Science.gov (United States)

    Goleţ, Ioan; Goleţ, Ionuţ

    2008-09-01

    In [4] S. Gähler formulated an appropriate system of axioms for a distance between three points and developed a theory of 2-metric spaces. A slight enlargement of the concept of 2-metric space was given in [3], where B. C. Dhage studied so called generalized metric spaces. In the present paper we have studied contraction conditions for mappings defined on a class of probabilistic metric spaces and fixed point theorems for such mappings. As particular cases we obtain fixed point theorems for random operators and for mappings defined on deterministic metric spaces.

  15. Lee-Wick indefinite metric quantization: A functional integral approach

    International Nuclear Information System (INIS)

    Boulware, D.G.; Gross, D.J.

    1984-01-01

    In an attempt to study the stability of the Lee-Wick indefinite metric theory, the functional integral for indefinite metric quantum field theories is derived. Theories with an indefinite classical energy may be quantized with either a normal metric and an indefinite energy in Minkowski space or an indefinite metric and a positive energy in euclidean space. However, the functional integral in the latter formulation does not incorporate the Lee-Wick prescription for assuring the unitarity of the positive energy positive metric sector of the theory, hence the stability of the theory cannot be studied non-perturbatively. (orig.)

  16. Principle of space existence and De Sitter metric

    International Nuclear Information System (INIS)

    Mal'tsev, V.K.

    1990-01-01

    The selection principle for the solutions of the Einstein equations suggested in a series of papers implies the existence of space (g_ik ≠ 0) only in the presence of matter (T_ik ≠ 0). This selection principle (principle of space existence, in the Markov terminology) implies, in the general case, the absence of the cosmological solution with the De Sitter metric. On the other hand, the De Sitter metric is necessary for describing both inflation and deflation periods of the Universe. It is shown that the De Sitter metric is also allowed by the selection principle under discussion if the metric experiences the evolution into the Friedmann metric

  17. How to evaluate objective video quality metrics reliably

    DEFF Research Database (Denmark)

    Korhonen, Jari; Burini, Nino; You, Junyong

    2012-01-01

    The typical procedure for evaluating the performance of different objective quality metrics and indices involves comparisons between subjective quality ratings and the quality indices obtained using the objective metrics in question on the known video sequences. Several correlation indicators can ... as processing of subjective data. We also suggest some general guidelines for researchers to make comparison studies of objective video quality metrics more reliable and useful for the practitioners in the field.

  18. Extremal limits of the C metric: Nariai, Bertotti-Robinson, and anti-Nariai C metrics

    International Nuclear Information System (INIS)

    Dias, Oscar J.C.; Lemos, Jose P.S.

    2003-01-01

    In two previous papers we have analyzed the C metric in a background with a cosmological constant Λ, namely, the de Sitter (dS) C metric (Λ>0) and the anti-de Sitter (AdS) C metric (Λ<0). […] For Λ>0, Λ=0, and Λ<0 […] (dS₂ × S̃²): to each point in the deformed two-sphere S̃² corresponds a dS₂ spacetime, except for one point which corresponds to a dS₂ spacetime with an infinite straight strut or string. There are other important new features that appear. One expects that the solutions found in this paper are unstable and decay into a slightly nonextreme black hole pair accelerated by a strut or by strings. Moreover, the Euclidean version of these solutions mediates the quantum process of black hole pair creation that accompanies the decay of the dS and AdS spaces

  19. A Hidden-Exposed Terminal Interference Aware Routing Metric for Multi-Radio and Multi-Rate Wireless Mesh Networks

    Science.gov (United States)

    Jin, Shouguang; Mase, Kenichi

    In this paper, we propose a novel Hidden-terminal and Exposed-terminal Interference aware routing metric (HEI-ETT) for Multi-Radio and Multi-Rate wireless mesh networks in which each stationary mesh node is equipped with multi-radio interfaces that relay traffic to the extended networks by using multi-hop transmissions. We have two main design goals for HEI-ETT. First, we will characterize interferences as Hidden-terminal Interference and Exposed-terminal Interference regardless of inter- or intra-flow interference and take both interference effects into account while computing the path metric. Second, an efficient transmission rate adaptation should be employed in HEI-ETT to enhance the network throughput. We incorporated our metric in the well known Optimized Link State Routing protocol version 2 (OLSRv2), which is one of the two standard routing protocols for MANETs, and evaluated the performance of our metric by simulation. The results show that our metric outperforms existing metrics such as ETX, ETT and WCETT.
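
    For readers unfamiliar with the baseline metrics mentioned above, a small worked computation of the standard ETX and ETT link metrics follows; the delivery ratios, packet size and link rate are made-up values, and HEI-ETT's hidden/exposed-terminal interference terms are not modelled here.

      # Worked example of the baseline link metrics mentioned above.
      # ETX = 1 / (df * dr); ETT = ETX * S / B, with packet size S (bits) and
      # link rate B (bits/s). Numbers are illustrative only.
      df, dr = 0.9, 0.8            # forward / reverse delivery ratios (probe-based)
      S = 1024 * 8                 # packet size in bits
      B = 6e6                      # link rate in bits per second

      etx = 1.0 / (df * dr)        # expected transmission count
      ett = etx * S / B            # expected transmission time (seconds)
      print(f"ETX = {etx:.2f}, ETT = {ett * 1e3:.2f} ms")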

  20. On the Efficiency of Image Metrics for Evaluating the Visual Quality of 3D Models.

    Science.gov (United States)

    Lavoue, Guillaume; Larabi, Mohamed Chaker; Vasa, Libor

    2016-08-01

    3D meshes are deployed in a wide range of application processes (e.g., transmission, compression, simplification, watermarking and so on) which inevitably introduce geometric distortions that may alter the visual quality of the rendered data. Hence, efficient model-based perceptual metrics, operating on the geometry of the meshes being compared, have been recently introduced to control and predict these visual artifacts. However, since the 3D models are ultimately visualized on 2D screens, it seems legitimate to use images of the models (i.e., snapshots from different viewpoints) to evaluate their visual fidelity. In this work we investigate the use of image metrics to assess the visual quality of 3D models. For this goal, we conduct a wide-ranging study involving several 2D metrics, rendering algorithms, lighting conditions and pooling algorithms, as well as several mean opinion score databases. The collected data allow (1) to determine the best set of parameters to use for this image-based quality assessment approach and (2) to compare this approach to the best performing model-based metrics and determine for which use-case they are respectively adapted. We conclude by exploring several applications that illustrate the benefits of image-based quality assessment.
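
    A heavily simplified sketch of the image-based assessment pipeline studied in the paper is shown below: render snapshots of the reference and distorted models, score each view with a 2D metric (SSIM here), and pool the per-view scores. The renderer is stubbed out with placeholder images, and plain mean pooling is only one of the strategies the study compares.

      # Sketch of image-based quality assessment for 3D models: compare rendered
      # snapshots of the reference and distorted meshes with a 2-D metric (SSIM)
      # and pool the per-view scores. Rendering is stubbed with random images.
      import numpy as np
      from skimage.metrics import structural_similarity as ssim

      def render_views(mesh, n_views=6, size=128, seed=0):
          """Placeholder for a real renderer returning grayscale snapshots."""
          rng = np.random.default_rng(seed)
          return [rng.random((size, size)) for _ in range(n_views)]

      ref_views  = render_views("reference_mesh.obj", seed=0)
      dist_views = render_views("distorted_mesh.obj", seed=1)

      scores = [ssim(r, d, data_range=1.0) for r, d in zip(ref_views, dist_views)]
      quality = float(np.mean(scores))       # pooled image-based quality score
      print(f"per-view SSIM: {np.round(scores, 3)}  pooled: {quality:.3f}")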

  1. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...

  2. THE ROLE OF ARTICLE LEVEL METRICS IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Vladimir TRAJKOVSKI

    2016-04-01

    Full Text Available Emerging metrics based on article-level does not exclude traditional metrics based on citations to the journal, but complements them. Article-level metrics (ALMs provide a wide range of metrics about the uptake of an individual journal article by the scientific community after publication. They include citations, statistics of usage, discussions in online comments and social media, social bookmarking, and recommendations. In this editorial, the role of article level metrics in publishing scientific papers has been described. Article-Level Metrics (ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. Data sources depend on the tool, but they include classic metrics indicators depending on citations, academic social networks (Mendeley, CiteULike, Delicious and social media (Facebook, Twitter, blogs, and Youtube. The most popular tools used to apply this new metrics are: Public Library of Science - Article-Level Metrics, Altmetric, Impactstory and Plum Analytics. Journal Impact Factor (JIF does not consider impact or influence beyond citations count as this count reflected only through Thomson Reuters’ Web of Science® database. JIF provides indicator related to the journal, but not related to a published paper. Thus, altmetrics now becomes an alternative metrics for performance assessment of individual scientists and their contributed scholarly publications. Macedonian scholarly publishers have to work on implementing of article level metrics in their e-journals. It is the way to increase their visibility and impact in the world of science.

  3. Novel metrics for growth model selection.

    Science.gov (United States)

    Grigsby, Matthew R; Di, Junrui; Leroux, Andrew; Zipunnikov, Vadim; Xiao, Luo; Crainiceanu, Ciprian; Checkley, William

    2018-01-01

    Literature surrounding the statistical modeling of childhood growth data involves a diverse set of potential models from which investigators can choose. However, the lack of a comprehensive framework for comparing non-nested models leads to difficulty in assessing model performance. This paper proposes a framework for comparing non-nested growth models using novel metrics of predictive accuracy based on modifications of the mean squared error criterion. Three metrics were created: normalized, age-adjusted, and weighted mean squared error (MSE). Predictive performance metrics were used to compare linear mixed effects models and functional regression models. Prediction accuracy was assessed by partitioning the observed data into training and test datasets. This partitioning was constructed to assess prediction accuracy for backward (i.e., early growth), forward (i.e., late growth), in-range, and on new individuals. Analyses were done with height measurements from 215 Peruvian children with data spanning from near birth to 2 years of age. Functional models outperformed linear mixed effects models in all scenarios tested. In particular, prediction errors for functional concurrent regression (FCR) and functional principal component analysis models were approximately 6% lower when compared to linear mixed effects models. When we weighted subject-specific MSEs according to subject-specific growth rates during infancy, we found that FCR was the best performer in all scenarios. With this novel approach, we can quantitatively compare non-nested models and weight subgroups of interest to select the best performing growth model for a particular application or problem at hand.
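
    To make the flavour of these predictive-accuracy metrics concrete, the sketch below computes simplified stand-ins for normalized, age-adjusted and weighted MSE on synthetic height data. The exact definitions and weighting schemes are the authors'; the versions here are plausible illustrations only.

      # Simplified illustrations of MSE-style model-comparison metrics in the
      # spirit of the normalized / age-adjusted / weighted MSE described above.
      import numpy as np

      rng = np.random.default_rng(0)
      age = np.sort(rng.uniform(0, 24, 200))                    # months
      y_true = 50 + 1.1 * age + rng.normal(0, 1.5, age.size)    # observed height (cm)
      y_pred = 50 + 1.0 * age                                   # some model's predictions

      resid2 = (y_true - y_pred) ** 2
      mse = resid2.mean()

      # "Normalized" MSE: scale by the variance of the outcome.
      nmse = mse / y_true.var()

      # "Age-adjusted" MSE: average per-age-bin MSEs so dense ages don't dominate.
      bins = np.digitize(age, np.arange(0, 25, 6))
      age_adj_mse = np.mean([resid2[bins == b].mean() for b in np.unique(bins)])

      # "Weighted" MSE: weight residuals, e.g. by a crude per-point growth-rate proxy.
      weights = np.abs(np.gradient(y_true, age))
      wmse = np.average(resid2, weights=weights)

      print(f"MSE={mse:.2f}  NMSE={nmse:.3f}  age-adj={age_adj_mse:.2f}  wMSE={wmse:.2f}")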

  4. The mathematics of non-linear metrics for nested networks

    Science.gov (United States)

    Wu, Rui-Jie; Shi, Gui-Yuan; Zhang, Yi-Cheng; Mariani, Manuel Sebastian

    2016-10-01

    Numerical analysis of data from international trade and ecological networks has shown that the non-linear fitness-complexity metric is the best candidate to rank nodes by importance in bipartite networks that exhibit a nested structure. Despite its relevance for real networks, the mathematical properties of the metric and its variants remain largely unexplored. Here, we perform an analytic and numeric study of the fitness-complexity metric and a new variant, called minimal extremal metric. We rigorously derive exact expressions for node scores for perfectly nested networks and show that these expressions explain the non-trivial convergence properties of the metrics. A comparison between the fitness-complexity metric and the minimal extremal metric on real data reveals that the latter can produce improved rankings if the input data are reliable.
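
    The fitness-complexity metric referred to above is commonly computed by an iteration on the bipartite incidence matrix (a row's fitness grows with the columns it covers, while a column's complexity is penalized by the weakest rows covering it), with a normalization after every step. The sketch below uses a toy nested matrix; the minimal extremal metric variant introduced in the paper is not reproduced.

      # Standard fitness-complexity iteration on a toy row-column incidence matrix M.
      import numpy as np

      def fitness_complexity(M, n_iter=100):
          """Return (fitness of rows, complexity of columns) for a 0/1 matrix M."""
          n_rows, n_cols = M.shape
          F = np.ones(n_rows)
          Q = np.ones(n_cols)
          for _ in range(n_iter):
              F_new = M @ Q                       # rows gain from the columns they cover
              Q_new = 1.0 / (M.T @ (1.0 / F))     # columns penalized by weak rows
              F, Q = F_new / F_new.mean(), Q_new / Q_new.mean()   # normalize each step
          return F, Q

      # Toy nested matrix: row 0 is the most diversified, column 3 the "hardest".
      M = np.array([[1, 1, 1, 1],
                    [1, 1, 1, 0],
                    [1, 1, 0, 0],
                    [1, 0, 0, 0]], dtype=float)

      F, Q = fitness_complexity(M)
      print("fitness   :", np.round(F, 3))
      print("complexity:", np.round(Q, 3))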

  5. Decision Analysis for Metric Selection on a Clinical Quality Scorecard.

    Science.gov (United States)

    Guth, Rebecca M; Storey, Patricia E; Vitale, Michael; Markan-Aurora, Sumita; Gordon, Randolph; Prevost, Traci Q; Dunagan, Wm Claiborne; Woeltje, Keith F

    2016-09-01

    Clinical quality scorecards are used by health care institutions to monitor clinical performance and drive quality improvement. Because of the rapid proliferation of quality metrics in health care, BJC HealthCare found it increasingly difficult to select the most impactful scorecard metrics while still monitoring metrics for regulatory purposes. A 7-step measure selection process was implemented incorporating Kepner-Tregoe Decision Analysis, which is a systematic process that considers key criteria that must be satisfied in order to make the best decision. The decision analysis process evaluates what metrics will most appropriately fulfill these criteria, as well as identifies potential risks associated with a particular metric in order to identify threats to its implementation. Using this process, a list of 750 potential metrics was narrowed to 25 that were selected for scorecard inclusion. This decision analysis process created a more transparent, reproducible approach for selecting quality metrics for clinical quality scorecards. © The Author(s) 2015.
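
    A Kepner-Tregoe style decision analysis typically reduces, at its core, to scoring each candidate against weighted criteria. The toy sketch below shows that weighted-scoring core; the criteria, weights and candidate metrics are invented and are not BJC HealthCare's actual values or the full 7-step process.

      # Toy weighted scoring of candidate scorecard metrics (invented values).
      candidates = {
          "CLABSI rate":        {"impact": 9, "data_quality": 8, "actionable": 7},
          "Readmission rate":   {"impact": 8, "data_quality": 6, "actionable": 6},
          "Med reconciliation": {"impact": 6, "data_quality": 9, "actionable": 8},
      }
      weights = {"impact": 0.5, "data_quality": 0.3, "actionable": 0.2}

      def total(scores):
          return sum(weights[c] * s for c, s in scores.items())

      for name, scores in sorted(candidates.items(), key=lambda kv: total(kv[1]), reverse=True):
          print(f"{name:20s} weighted score = {total(scores):.2f}")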

  6. An Improved Metric Learning Approach for Degraded Face Recognition

    Directory of Open Access Journals (Sweden)

    Guofeng Zou

    2014-01-01

    Full Text Available To solve the matching problem of the elements in different data collections, an improved coupled metric learning approach is proposed. First, we improved the supervised locality preserving projection algorithm and added the within-class and between-class information of the improved algorithm to coupled metric learning, so a novel coupled metric learning method is proposed. Furthermore, we extended this algorithm to nonlinear space, and the kernel coupled metric learning method based on supervised locality preserving projection is proposed. In kernel coupled metric learning approach, two elements of different collections are mapped to the unified high dimensional feature space by kernel function, and then generalized metric learning is performed in this space. Experiments based on Yale and CAS-PEAL-R1 face databases demonstrate that the proposed kernel coupled approach performs better in low-resolution and fuzzy face recognition and can reduce the computing time; it is an effective metric method.

  7. National Metrical Types in Nineteenth Century Art Song

    Directory of Open Access Journals (Sweden)

    Leigh VanHandel

    2010-01-01

    Full Text Available William Rothstein’s article “National metrical types in music of the eighteenth and early nineteenth centuries” (2008 proposes a distinction between the metrical habits of 18th and early 19th century German music and those of Italian and French music of that period. Based on theoretical treatises and compositional practice, he outlines these national metrical types and discusses the characteristics of each type. This paper presents the results of a study designed to determine whether, and to what degree, Rothstein’s characterizations of national metrical types are present in 19th century French and German art song. Studying metrical habits in this genre may provide a lens into changing metrical conceptions of 19th century theorists and composers, as well as to the metrical habits and compositional style of individual 19th century French and German art song composers.

  8. Graev metrics on free products and HNN extensions

    DEFF Research Database (Denmark)

    Slutsky, Konstantin

    2014-01-01

    We give a construction of two-sided invariant metrics on free products (possibly with amalgamation) of groups with two-sided invariant metrics and, under certain conditions, on HNN extensions of such groups. Our approach is similar to Graev's construction of metrics on free groups over pointed metric spaces.

  9. Class Cohesion Metrics for Software Engineering: A Critical Review

    Directory of Open Access Journals (Sweden)

    Habib Izadkhah

    2017-02-01

    Full Text Available Class cohesion or degree of the relations of class members is considered as one of the crucial quality criteria. A class with a high cohesion improves understandability, maintainability and reusability. The class cohesion metrics can be measured quantitatively and therefore can be used as a base for assessing the quality of design. The main objective of this paper is to identify important research directions in the area of class cohesion metrics that require further attention in order to develop more effective and efficient class cohesion metrics for software engineering. In this paper, we discuss the class cohesion assessing metrics (thirty-two metrics that have received the most attention in the research community and compare them from different aspects. We also present desirable properties of cohesion metrics to validate class cohesion metrics.
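
    As a concrete example of the kind of metric surveyed here, the sketch below computes an LCOM1-style cohesion value (method pairs of a class that share no instance attributes versus pairs that share at least one). It is only one simple member of the family of thirty-two metrics discussed in the paper, and the example class is hypothetical.

      # Minimal LCOM1-style cohesion computation: count method pairs that share
      # no instance attributes (P) versus pairs that share at least one (Q);
      # LCOM1 = max(P - Q, 0). Higher LCOM1 suggests lower cohesion.
      from itertools import combinations

      # Attributes used by each method of a hypothetical class.
      method_attrs = {
          "deposit":  {"balance"},
          "withdraw": {"balance"},
          "report":   {"history"},
          "set_pin":  {"pin"},
      }

      P = Q = 0
      for m1, m2 in combinations(method_attrs.values(), 2):
          if m1 & m2:
              Q += 1        # the pair shares at least one attribute
          else:
              P += 1        # the pair shares nothing

      lcom1 = max(P - Q, 0)
      print(f"P={P}, Q={Q}, LCOM1={lcom1}")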

  10. Improving medication safety through the use of metrics.

    Science.gov (United States)

    Beckett, Robert D; Yazdi, Marina; Hanson, Laura J; Thompson, Ross W

    2014-02-01

    Describe medication safety metrics used at University HealthSystem Consortium (UHC) institutions and recommend a meaningful way to report and communicate medication safety information across an organization. A cross-sectional study was conducted using an electronically distributed, open-ended survey instrument. Twenty percent of the UHC institutions responded to our survey. Seventy-seven percent of those institutions responding to our survey reported their organization has defined metrics to measure medication safety; an additional 21% of the institutions were still in the process of defining metrics. Of metrics that were reported, 33% were true medication safety metrics. Results are distributed to a wide variety of institutional venues. Institutions should take several actions related to medication safety including defining local metrics; building metrics addressing preventable adverse drug events, medication errors, and technology; and reporting results to a variety of venues in order to design specific interventions to improve local medication use.

  11. Invariant distances and metrics in complex analysis

    CERN Document Server

    Jarnicki, Marek

    2013-01-01

    As in the field of "Invariant Distances and Metrics in Complex Analysis" there was and is continuous progress, this is the second extended edition of the corresponding monograph. This comprehensive book is about the study of invariant pseudodistances (non-negative functions on pairs of points) and pseudometrics (non-negative functions on the tangent bundle) in several complex variables. It is an overview over a highly active research area at the borderline between complex analysis, functional analysis and differential geometry. New chapters are covering the Wu, Bergman and several other met

  12. Multi-Robot Assembly Strategies and Metrics

    Science.gov (United States)

    MARVEL, JEREMY A.; BOSTELMAN, ROGER; FALCO, JOE

    2018-01-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies. PMID:29497234

  13. Metrics for agile requirements definition and management

    OpenAIRE

    Jussilainen, Nina

    2016-01-01

    “You Can't Manage What You Don't Measure” (Origin unknown) was the starting point for this research. The goal of this research was to define metrics to support and monitor the requirements definition and management in the Alusta P2P Invoice automation and Procurement product development in the target organization. The research was conducted as a constructive research including document analysis, interviews and facilitated workshop and it was done during June 2016-December 2016. Th...

  14. Multi-Robot Assembly Strategies and Metrics.

    Science.gov (United States)

    Marvel, Jeremy A; Bostelman, Roger; Falco, Joe

    2018-02-01

    We present a survey of multi-robot assembly applications and methods and describe trends and general insights into the multi-robot assembly problem for industrial applications. We focus on fixtureless assembly strategies featuring two or more robotic systems. Such robotic systems include industrial robot arms, dexterous robotic hands, and autonomous mobile platforms, such as automated guided vehicles. In this survey, we identify the types of assemblies that are enabled by utilizing multiple robots, the algorithms that synchronize the motions of the robots to complete the assembly operations, and the metrics used to assess the quality and performance of the assemblies.

  15. Metric propositional neighborhood logics on natural numbers

    DEFF Research Database (Denmark)

    Bresolin, Davide; Della Monica, Dario; Goranko, Valentin

    2013-01-01

    Metric Propositional Neighborhood Logic (MPNL) over natural numbers. MPNL features two modalities referring, respectively, to an interval that is “met by” the current one and to an interval that “meets” the current one, plus an infinite set of length constraints, regarded as atomic propositions...... is decidable in double exponential time and expressively complete with respect to a well-defined sub-fragment of the two-variable fragment FO2[N, =, <, s] of first-order logic interpreted over natural numbers. Moreover, we show that MPNL can be extended in a natural way...

  16. Metric preheating and limitations of linearized gravity

    International Nuclear Information System (INIS)

    Bassett, Bruce A.; Tamburini, Fabrizio; Kaiser, David I.; Maartens, Roy

    1999-01-01

    During the preheating era after inflation, resonant amplification of quantum field fluctuations takes place. Recently it has become clear that this must be accompanied by resonant amplification of scalar metric fluctuations, since the two are united by Einstein's equations. Furthermore, this 'metric preheating' enhances particle production, and leads to gravitational rescattering effects even at linear order. In multi-field models with strong preheating (q>>1), metric perturbations are driven non-linear, with the strongest amplification typically on super-Hubble scales (k→0). This amplification is causal, being due to the super-Hubble coherence of the inflaton condensate, and is accompanied by resonant growth of entropy perturbations. The amplification invalidates the use of the linearized Einstein field equations, irrespective of the amount of fine-tuning of the initial conditions. This has serious implications on all scales - from large-angle cosmic microwave background (CMB) anisotropies to primordial black holes. We investigate the (q,k) parameter space in a two-field model, and introduce the time to non-linearity, t_nl, as the timescale for the breakdown of the linearized Einstein equations. t_nl is a robust indicator of resonance behavior, showing the fine structure in q and k that one expects from a quasi-Floquet system, and we argue that t_nl is a suitable generalization of the static Floquet index in an expanding universe. Backreaction effects are expected to shut down the linear resonances, but cannot remove the existing amplification, which threatens the viability of strong preheating when confronted with the CMB. Mode-mode coupling and turbulence tend to re-establish scale invariance, but this process is limited by causality and for small k the primordial scale invariance of the spectrum may be destroyed. We discuss ways to escape the above conclusions, including secondary phases of inflation and preheating solely to fermions. The exclusion principle

  17. Indefinite metric and regularization of electrodynamics

    International Nuclear Information System (INIS)

    Gaudin, M.

    1984-06-01

    The invariant regularization of Pauli and Villars in quantum electrodynamics can be considered as deriving from a local and causal lagrangian theory for spin 1/2 bosons, by introducing an indefinite metric and a condition on the allowed states similar to the Lorentz condition. The consequence is the asymptotic freedom of the photon's propagator. We present a calculation of the effective charge to the fourth order in the coupling as a function of the auxiliary masses, the theory avoiding all mass divergencies to this order [fr

  18. Rotating Black Holes and the Kerr Metric

    Science.gov (United States)

    Kerr, Roy Patrick

    2008-10-01

    Since it was first discovered in 1963 the Kerr metric has been used by relativists as a test-bed for conjectures on worm-holes, time travel, closed time-like loops, and the existence or otherwise of global Cauchy surfaces. More importantly, it has also been used by astrophysicists to investigate the effects of collapsed objects on their local environments. These two groups of applications should not be confused. Astrophysical Black Holes are not the same as the Kruskal solution and its generalisations.

  19. Sigma Metrics Across the Total Testing Process.

    Science.gov (United States)

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed for several decades to ensure patients' safety, from a statistical quality control focus on the analytical phase to total laboratory processes. The sigma concept provides a convenient way to quantify the number of errors in extra-analytical and analytical phases through the defects per million and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown from our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
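
    The sigma metric equation referred to above is commonly written as sigma = (TEa − |bias|) / CV, with the allowable total error TEa, the observed bias and the analytical imprecision CV all expressed in percent. A small worked example with invented assay figures:

      # Worked example of the laboratory sigma metric: sigma = (TEa - |bias|) / CV.
      # The assay numbers are invented for illustration.
      tea  = 10.0   # allowable total error, %
      bias = 2.0    # observed bias, %
      cv   = 1.5    # analytical imprecision (coefficient of variation), %

      sigma = (tea - abs(bias)) / cv
      print(f"sigma metric = {sigma:.1f}")   # (10 - 2) / 1.5 = 5.3 -> "5-sigma" assay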

  20. Adaptive Sampling in Hierarchical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.
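
    A heavily simplified sketch of the adaptive sampling idea follows: cached fine-scale evaluations are kept in a spatial index, and a new query is answered from the cache when a stored point lies close enough, otherwise the fine-scale model is evaluated and the result inserted. A KD-tree and nearest-neighbour reuse stand in for the paper's dynamic metric tree and kriging interpolant.

      # Simplified adaptive sampling: reuse cached fine-scale evaluations when a
      # stored point is close enough, otherwise evaluate and cache the result.
      import numpy as np
      from scipy.spatial import cKDTree

      def fine_scale_model(x):
          """Placeholder for an expensive finer-scale constitutive evaluation."""
          return np.sin(x[0]) * np.cos(x[1])

      class AdaptiveSampler:
          def __init__(self, tol=0.05):
              self.tol = tol
              self.points, self.values = [], []
              self.tree = None

          def query(self, x):
              if self.tree is not None:
                  dist, idx = self.tree.query(x)
                  if dist < self.tol:                      # close enough: reuse cached value
                      return self.values[idx]
              y = fine_scale_model(x)                      # otherwise evaluate and cache
              self.points.append(np.asarray(x))
              self.values.append(y)
              self.tree = cKDTree(np.vstack(self.points))  # rebuild index (simplification)
              return y

      sampler = AdaptiveSampler(tol=0.1)
      rng = np.random.default_rng(0)
      queries = rng.random((200, 2))
      _ = [sampler.query(q) for q in queries]
      print(f"fine-scale evaluations: {len(sampler.points)} / {len(queries)} queries")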

  1. Self-supervised online metric learning with low rank constraint for scene categorization.

    Science.gov (United States)

    Cong, Yang; Liu, Ji; Yuan, Junsong; Luo, Jiebo

    2013-08-01

    Conventional visual recognition systems usually train an image classifier in batch mode with all training data provided in advance. However, in many practical applications, only a small number of training samples are available in the beginning and many more would come sequentially during online recognition. Because the image data characteristics could change over time, it is important for the classifier to adapt to the new data incrementally. In this paper, we present an online metric learning method to address the online scene recognition problem via adaptive similarity measurement. Given a number of labeled data followed by a sequential input of unseen testing samples, the similarity metric is learned to maximize the margin of the distance among different classes of samples. By considering the low rank constraint, our online metric learning model not only can provide competitive performance compared with the state-of-the-art methods, but also guarantees convergence. A bi-linear graph is also defined to model the pair-wise similarity, and an unseen sample is labeled depending on the graph-based label propagation, while the model can also self-update using the more confident new samples. With the ability of online learning, our methodology can well handle the large-scale streaming video data with the ability of incremental self-updating. We evaluate our model on online scene categorization, and experiments on various benchmark datasets and comparisons with state-of-the-art methods demonstrate the effectiveness and efficiency of our algorithm.
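
    The general idea of online metric learning with a low-rank constraint can be illustrated with a toy Mahalanobis metric M = LᵀL updated pair by pair with a hinge-style margin rule. This is a generic sketch under those assumptions, not the authors' exact formulation (their model also includes graph-based label propagation and self-updating).

      # Toy online metric learning with a low-rank Mahalanobis metric M = L^T L.
      # Similar pairs are pulled inside a margin, dissimilar pairs pushed outside.
      import numpy as np

      def online_metric_updates(pairs, dim, rank=2, margin=1.0, lr=0.05, seed=0):
          rng = np.random.default_rng(seed)
          L = rng.normal(scale=0.1, size=(rank, dim))       # low-rank factor of M
          for x1, x2, same in pairs:
              d = L @ (x1 - x2)
              dist2 = float(d @ d)                          # squared Mahalanobis distance
              grad = 2 * np.outer(d, x1 - x2)               # gradient of dist2 w.r.t. L
              if same and dist2 > margin:
                  L -= lr * grad                            # shrink the distance
              elif not same and dist2 < margin:
                  L += lr * grad                            # grow the distance
          return L

      rng = np.random.default_rng(1)
      stream = []
      for _ in range(300):
          same = rng.random() < 0.5
          x1 = rng.normal(size=4)
          x2 = x1 + rng.normal(scale=0.1, size=4) if same else rng.normal(size=4)
          stream.append((x1, x2, same))

      L = online_metric_updates(stream, dim=4)
      print("learned metric M =\n", np.round(L.T @ L, 2))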

  2. Covariant electrodynamics in linear media: Optical metric

    Science.gov (United States)

    Thompson, Robert T.

    2018-03-01

    While the postulate of covariance of Maxwell's equations for all inertial observers led Einstein to special relativity, it was the further demand of general covariance—form invariance under general coordinate transformations, including between accelerating frames—that led to general relativity. Several lines of inquiry over the past two decades, notably the development of metamaterial-based transformation optics, have spurred a greater interest in the role of geometry and space-time covariance for electrodynamics in ponderable media. I develop a generally covariant, coordinate-free framework for electrodynamics in general dielectric media residing in curved background space-times. In particular, I derive a relation for the spatial medium parameters measured by an arbitrary timelike observer. In terms of those medium parameters I derive an explicit expression for the pseudo-Finslerian optical metric of birefringent media and show how it reduces to a pseudo-Riemannian optical metric for nonbirefringent media. This formulation provides a basis for a unified approach to ray and congruence tracing through media in curved space-times that may smoothly vary among positively refracting, negatively refracting, and vacuum.

  3. Defining a standard metric for electricity savings

    International Nuclear Information System (INIS)

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.
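    The back-of-the-envelope arithmetic behind the definition is easy to reproduce. The sketch below uses the plant parameters stated in the letter; the emissions factor of roughly one metric ton of CO2 per MWh for an existing coal plant is an assumption consistent with the letter's round numbers, not a figure quoted from it.

```python
# Rosenfeld back-of-the-envelope: 500 MW coal plant, 70% capacity factor, 7% T&D losses
capacity_mw = 500
capacity_factor = 0.70
td_losses = 0.07
hours_per_year = 8760

generation_kwh = capacity_mw * 1000 * hours_per_year * capacity_factor   # at the plant
savings_at_meter_kwh = generation_kwh * (1 - td_losses)                  # delivered to customers
co2_tons = generation_kwh / 1000 * 1.0   # assume ~1 metric ton CO2 per MWh for existing coal

print(f"{savings_at_meter_kwh / 1e9:.2f} billion kWh/yr at the meter")   # ~2.85, i.e. ~3
print(f"{co2_tons / 1e6:.2f} million metric tons CO2/yr")                # ~3.1, i.e. ~3
```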

  4. Defining a standard metric for electricity savings

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan [Lawrence Berkeley National Laboratory and Stanford University, PO Box 20313, Oakland, CA 94620-0313 (United States); Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve, E-mail: JGKoomey@stanford.ed

    2010-01-15

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.

  5. Network Community Detection on Metric Space

    Directory of Open Access Journals (Sweden)

    Suman Saha

    2015-08-01

    Full Text Available Community detection in a complex network is an important problem of much interest in recent years. In general, a community detection algorithm chooses an objective function whose optimization captures the communities of the network, and then various heuristics are used to solve the optimization problem and extract the communities of interest to the user. In this article, we demonstrate the procedure to transform a graph into points of a metric space and develop methods of community detection with the help of a metric defined for a pair of points. We have also studied and analyzed the community structure of the network therein. The results obtained with our approach are very competitive with most of the well-known algorithms in the literature, as demonstrated over a large collection of datasets. Moreover, the time taken by our algorithm is considerably lower than that of other methods, which corroborates the theoretical findings.
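    One concrete way to realize the graph-to-metric-space pipeline described above is to embed each node by its vector of shortest-path distances and then cluster the embedded points. The toy sketch below (networkx plus scikit-learn) illustrates that pipeline only; the specific metric and detection method used in the article may differ.

```python
import networkx as nx
import numpy as np
from sklearn.cluster import KMeans

# Toy graph with three planted communities of 20 nodes each
G = nx.planted_partition_graph(l=3, k=20, p_in=0.4, p_out=0.02, seed=42)

# Embed each node as its vector of shortest-path distances to all nodes
nodes = list(G.nodes())
dist = dict(nx.all_pairs_shortest_path_length(G))
X = np.array([[dist[u].get(v, len(nodes)) for v in nodes] for u in nodes])

# Detect communities by clustering the embedded points
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels.reshape(3, 20))  # each row should be mostly one cluster label
```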

  6. Hausdorff metric in the fuzzy environment

    Directory of Open Access Journals (Sweden)

    Laura Victoria Forero Vega

    2016-10-01

    Full Text Available Context: intuitively, a set is understood as a collection of distinct elements; that is, a set is determined via the membership relation between an element of a universe and the whole. The question is simply whether an element belongs or does not belong. In a fuzzy subset, by contrast, each element of the universe is associated with a degree of membership, which is a number between 0 and 1; fuzzy subsets are thus established as a correspondence between each element of the universe and a degree of membership. Method: the study was based on previous work, in articles and books, in which authors present ideas about the importance of fuzzy subsets and the need to create new theories and spaces with them. Results: by combining the two theories, we obtain a new setting in which the Hausdorff distance extends and adjusts the notion of distance between nonempty compact subsets, familiar from the setting of metric spaces and most precisely from (Rn, du). Conclusions: the construction carried out yields a metric space with several useful properties, which are the object of the initial study.
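    For reference, the classical (crisp) Hausdorff distance that the article extends to the fuzzy setting can be computed directly for finite point sets; a minimal sketch using SciPy with made-up points:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

A = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
B = np.array([[0.1, 0.1], [1.2, 0.0], [0.0, 2.0]])

# The Hausdorff distance is the maximum of the two directed distances
d_AB = directed_hausdorff(A, B)[0]
d_BA = directed_hausdorff(B, A)[0]
print(max(d_AB, d_BA))  # 1.0: the point (0, 2) in B is 1.0 away from the nearest point of A
```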

  7. Fisher metric, geometric entanglement, and spin networks

    Science.gov (United States)

    Chirco, Goffredo; Mele, Fabio M.; Oriti, Daniele; Vitale, Patrizia

    2018-02-01

    Starting from recent results on the geometric formulation of quantum mechanics, we propose a new information geometric characterization of entanglement for spin network states in the context of quantum gravity. For the simple case of a single-link fixed graph (Wilson line), we detail the construction of a Riemannian Fisher metric tensor and a symplectic structure on the graph Hilbert space, showing how these encode the whole information about separability and entanglement. In particular, the Fisher metric defines an entanglement monotone which provides a notion of distance among states in the Hilbert space. In the maximally entangled gauge-invariant case, the entanglement monotone is proportional to a power of the area of the surface dual to the link, thus supporting a connection between entanglement and the (simplicial) geometric properties of spin network states. We further extend such analysis to the study of nonlocal correlations between two nonadjacent regions of a generic spin network graph characterized by the bipartite unfolding of an intertwiner state. Our analysis confirms the interpretation of spin network bonds as a result of entanglement and supports regarding the spin network graph itself as an information graph, whose connectivity encodes, both at the local and nonlocal level, the quantum correlations among its parts. This gives a further connection between entanglement and geometry.

  8. Value of the Company and Marketing Metrics

    Directory of Open Access Journals (Sweden)

    André Luiz Ramos

    2013-12-01

    Full Text Available Thinking about marketing strategies from a resource-based perspective (Barney, 1991), which frames assets as tangible, organizational or human, and from Constantin and Lusch's vision (1994), where strategic resources can be tangible or intangible, internal or external to the firm, raises a research approach linking Marketing and Finance. According to Srivastava, Shervani and Fahey (1998) there are three types of market assets, which generate firm value. Firm value can be measured by discounted cash flow, tying marketing activities to value-generation forecasts (Anderson, 1982; Day, Fahey, 1988; Doyle, 2000; Rust et al., 2004a). The economic value of marketing strategies and marketing metrics is drawing the attention of strategy researchers and marketing managers, making clear the need to build a bridge able to articulate marketing and finance from a strategic perspective. This article proposes an analytical framework based on different scientific approaches involving the risk and return promoted by marketing strategies, and points out advances concerning both methodological approaches and marketing strategies and their impact on firm metrics and value, using Srinivasan and Hanssens (2009) as a starting point.

  9. Fanpage metrics analysis. "Study on content engagement"

    Science.gov (United States)

    Rahman, Zoha; Suberamanian, Kumaran; Zanuddin, Hasmah Binti; Moghavvemi, Sedigheh; Nasir, Mohd Hairul Nizam Bin Md

    2016-08-01

    Social media is now recognized as an excellent communicative tool for connecting directly with consumers. One of the most significant ways to connect with consumers through these Social Networking Sites (SNS) is to create a Facebook fanpage with brand content and to place different posts on it periodically. In measuring social networking sites' effectiveness, corporate houses now analyze metrics such as engagement rate and the number of comments, shares and likes on fanpages. It is therefore very important for marketers to know the effectiveness of different contents or posts on fanpages in order to increase fan responsiveness and the engagement rate on the fan pages. In the study the authors analyzed a total of 1,834 brand posts from 17 international electronics brands. Data covering 9 months (December 2014 to August 2015), available online on the brands' fan pages, were collected for analysis. An econometric analysis was conducted using EViews 9 to determine the impact of different contents on fanpage engagement. The study picked the four most frequently posted content types to determine their impact on the PTA (People Talking About) metric and fanpage engagement activities.

  10. Electromagnetic properties of metric based media

    International Nuclear Information System (INIS)

    Schultz, A.K.

    1979-01-01

    A metric-based constitutive relation is used to calculate the optical properties of the matter-free space surrounding the sources for the Schwarzschild, Reissner-Nordstrom, Kerr-Newman, and Kerr-Taub-NUT singularity solutions to the Einstein-Maxwell field equations. Information is obtained about effective indices of refraction, the behavior of polarization states, and properties of photon trajectories in the medium. The methodology is demonstrated by presenting some new results regarding the propagation of light through crystals. The new information describes the combinations of optical phenomena which break the two-fold degeneracy of light speeds. In the metric-based constitutive relation approach, the homogeneous wave equation treatment neglects derivatives in the constitutive quantities and corresponds to the geometrical optics limit. The singularity solutions are extensively analyzed in that approximation. The effective indices of refraction with respect to a flat space-time background are functions of the coordinates and are anisotropic with respect to the coordinate directions in a spherical-like frame field. However, in a particular coordinate direction all polarization states propagate with the same speed, so that birefringent or optical activity properties are not predicted for the homogeneous limit

  11. Defining a Standard Metric for Electricity Savings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.

  12. Two-Basket Approach and Emission Metrics

    Science.gov (United States)

    Tanaka, K.; Schmale, J.; von Schneidemesser, E.

    2013-12-01

    Cutting the emissions of Short-Lived Climate-Forcing Air Pollutants (SLCPs) is gaining increasing global attention as a mitigation policy option because of direct benefits for climate and co-benefits such as improvements in air quality. Including SLCPs as target components to abate within a single basket (e.g. the Kyoto Protocol) would, however, face issues with regard to: i) the additional assumptions that are required to compare SLCP emissions and CO2 emissions within a basket in terms of climatic effects, especially because of the difference in lifetimes, and ii) the accountability of non-climatic effects in the emission trading between SLCPs and CO2. The idea of a two-basket approach was originally proposed as a climatic analogue to the Montreal Protocol dealing with ozone depleting substances (Jackson 2009; Daniel et al. 2012; Smith et al. 2013). In a two-basket approach, emissions are allowed to be traded within a basket but not across baskets. While this approach potentially ensures scientifically supported emission trading (e.g. Smith et al. 2013), it leaves open the important issue of how to determine the relative weight between the two baskets. Determining the weight is not a question that science alone can answer, as it involves a value judgment, as stressed in metric studies (e.g. Tanaka et al. 2010; Tanaka et al. 2013). We discuss emission metrics in the context of a two-basket approach and present policy implications of such an approach. In a two-basket approach, the weight between the two baskets needs to be determined a priori or exogenously. Here, an opportunity arises to present synergetic policy options targeted at mitigating climate change and air pollution simultaneously. In other words, this could be a strategy to encourage policymakers to consider cross-cutting issues. Under a two-basket climate policy, policymakers would be exposed to questions such as: - What type of damages caused by climate change does one choose to avoid? - To what extent

  13. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses...... the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed...

  14. Social Media Metrics Importance and Usage Frequency in Latvia

    Directory of Open Access Journals (Sweden)

    Ronalds Skulme

    2017-12-01

    Full Text Available Purpose of the article: The purpose of this paper was to explore which social media marketing metrics are most often used by, and are most important for, marketing experts in Latvia, and which can be used to evaluate marketing campaign effectiveness. Methodology/methods: In order to achieve the aim of this paper, several theoretical and practical research methods were used, such as theoretical literature analysis, surveying and grouping. First of all, theoretical research about social media metrics was conducted. The authors collected information about social media metric grouping methods and the social media metrics most frequently mentioned in the literature. The collected information was used as the foundation for the expert surveys. The expert surveys were used to collect information from Latvian marketing professionals to determine which social media metrics are used most often and which are most important in Latvia. Scientific aim: The scientific aim of this paper was to identify whether the importance of social media metrics varies depending on the consumer purchase decision stage. Findings: Information about the most important and most often used social media marketing metrics in Latvia was collected. A new social media metric grouping framework is proposed. Conclusions: The main conclusion is that both the importance and the usage frequency of social media metrics change depending on the consumer purchase decision stage the metric is used to evaluate.

  15. ADAPT Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Diagnostics and Prognostics Testbed (ADAPT) Project Lead: Scott Poll Subject Fault diagnosis in electrical power systems Description The Advanced...

  16. Retrieval of noisy fingerprint patterns using metric attractor networks.

    Science.gov (United States)

    González, Mario; Dominguez, David; Rodríguez, Francisco B; Sánchez, Ángel

    2014-11-01

    This work experimentally analyzes the learning and retrieval capabilities of the diluted metric attractor neural network when applied to collections of fingerprint images. The computational cost of the network decreases with the dilution, so we can increase the region of interest to cover almost the complete fingerprint. The network retrieval was successfully tested for different noisy configurations of the fingerprints, and proved to be robust with a large basin of attraction. We showed that network topologies with a 2D-grid arrangement adapt better to the fingerprints' spatial structure, outperforming the typical 1D-ring configuration. An optimal ratio of local connections to random shortcuts that best represents the intrinsic spatial structure of the fingerprints was found, and its influence on the retrieval quality was characterized in a phase diagram. Since the present model is a set of nonlinear equations, it is possible to go beyond the naïve static solution (matching two fingerprints using a fixed distance threshold), and a crossing evolution of similarities was shown, leading to the retrieval of the right fingerprint from an apparently more distant candidate. This feature could be very useful for fingerprint verification to discriminate between fingerprint pairs.
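    The diluted metric topology studied here (a 2D grid of mostly local connections plus a fraction of random shortcuts) can be mimicked with a small Hopfield-style toy model. The sketch below only illustrates the connectivity mask and the retrieval loop on random patterns; the network size, learning rule details, shortcut ratio, and fingerprint data of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
side, p_shortcut = 16, 0.05          # 16x16 grid of +/-1 units; ~5% chance of a random shortcut per pair
n = side * side

# Connectivity mask: 4 nearest neighbours on the (wrapped) grid, plus random shortcuts
mask = np.zeros((n, n), dtype=bool)
for i in range(n):
    r, c = divmod(i, side)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        j = ((r + dr) % side) * side + (c + dc) % side
        mask[i, j] = True
shortcuts = rng.random((n, n)) < p_shortcut
mask |= shortcuts | shortcuts.T
np.fill_diagonal(mask, False)

# Store a few random patterns with a Hebbian rule restricted to the mask
patterns = rng.choice([-1, 1], size=(3, n))
W = (patterns.T @ patterns) / n * mask

# Retrieve pattern 0 from a noisy cue (20% of bits flipped) by iterated sign updates
state = patterns[0] * np.where(rng.random(n) < 0.2, -1, 1)
for _ in range(20):
    state = np.sign(W @ state + 1e-9)
print("overlap with stored pattern:", float(state @ patterns[0]) / n)
```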

  17. Measurable Control System Security through Ideal Driven Technical Metrics

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Sean McBride; Marie Farrar; Zachary Tudor

    2008-01-01

    The Department of Homeland Security National Cyber Security Division supported development of a small set of security ideals as a framework to establish measurable control systems security. Based on these ideals, a draft set of proposed technical metrics was developed to allow control systems owner-operators to track improvements or degradations in their individual control systems security posture. The technical metrics development effort included review and evaluation of over thirty metrics-related documents. On the basis of complexity, ambiguity, or misleading and distorting effects, the metrics identified during the reviews were determined to be weaker than necessary to aid defense against the myriad threats posed by cyber-terrorism to human safety, as well as to economic prosperity. Using the results of our metrics review and the set of security ideals as a starting point for metrics development, we identified thirteen potential technical metrics - with at least one metric supporting each ideal. Two case study applications of the ideals and thirteen metrics to control systems were then performed to establish potential difficulties in applying both the ideals and the metrics. The case studies resulted in no changes to the ideals, and only a few deletions and refinements to the thirteen potential metrics. This led to a final proposed set of ten core technical metrics. To further validate the security ideals, the modifications made to the original thirteen potential metrics, and the final proposed set of ten core metrics, seven separate control systems security assessments performed over the past three years were reviewed for findings and recommended mitigations. These findings and mitigations were then mapped to the security ideals and metrics to assess gaps in their coverage. The mappings indicated that there are no gaps in the security ideals and that the ten core technical metrics provide significant coverage of standard security issues with 87% coverage. Based

  18. Quantitative adaptation analytics for assessing dynamic systems of systems: LDRD Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Gauthier, John H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Readiness & Sustainment Technologies (6133, M/S 1188); Miner, Nadine E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Military & Energy Systems Analysis (6114, M/S 1188); Wilson, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Resilience and Regulatory Effects (6921, M/S 1138); Le, Hai D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Readiness & Sustainment Technologies (6133, M/S 1188); Kao, Gio K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Networked System Survivability & Assurance (5629, M/S 0671); Melander, Darryl J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Software Systems R&D (9525, M/S 1188); Longsine, Dennis Earl [Sandia National Laboratories, Unknown, Unknown; Vander Meer, Jr., Robert C. [SAIC, Inc., Albuquerque, NM (United States)

    2015-01-01

    Our society is increasingly reliant on systems and interoperating collections of systems, known as systems of systems (SoS). These SoS are often subject to changing missions (e.g., nation- building, arms-control treaties), threats (e.g., asymmetric warfare, terrorism), natural environments (e.g., climate, weather, natural disasters) and budgets. How well can SoS adapt to these types of dynamic conditions? This report details the results of a three year Laboratory Directed Research and Development (LDRD) project aimed at developing metrics and methodologies for quantifying the adaptability of systems and SoS. Work products include: derivation of a set of adaptability metrics, a method for combining the metrics into a system of systems adaptability index (SoSAI) used to compare adaptability of SoS designs, development of a prototype dynamic SoS (proto-dSoS) simulation environment which provides the ability to investigate the validity of the adaptability metric set, and two test cases that evaluate the usefulness of a subset of the adaptability metrics and SoSAI for distinguishing good from poor adaptability in a SoS. Intellectual property results include three patents pending: A Method For Quantifying Relative System Adaptability, Method for Evaluating System Performance, and A Method for Determining Systems Re-Tasking.
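    The report combines individual adaptability metrics into a single index (SoSAI) for comparing designs, but the abstract does not give the combination rule. The sketch below is therefore a generic weighted-sum index with hypothetical metric names, weights, and values, shown only to illustrate the kind of design comparison such an index enables; it is not the report's method.

```python
# Hypothetical adaptability metrics for two SoS designs (names and values are illustrative,
# not taken from the report); each metric is assumed to be normalized to [0, 1].
designs = {
    "design_A": {"reconfigurability": 0.7, "spare_capacity": 0.4, "interoperability": 0.8},
    "design_B": {"reconfigurability": 0.5, "spare_capacity": 0.9, "interoperability": 0.6},
}
weights = {"reconfigurability": 0.5, "spare_capacity": 0.3, "interoperability": 0.2}

def adaptability_index(metrics, weights):
    """Generic weighted-sum index over normalized adaptability metrics."""
    return sum(weights[k] * metrics[k] for k in weights) / sum(weights.values())

for name, m in designs.items():
    print(name, round(adaptability_index(m, weights), 3))
```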

  19. Adaptation Stories

    International Development Research Centre (IDRC) Digital Library (Canada)

    By Reg'

    formed a real foundation for endogenous, and, therefore, sustainable, strategies for adaptation to climate change. The stories reinforce what we already knew: that successful adaptation must come from the people who are living on the front lines, facing the many problems caused by climate change and climate variation.

  20. MESUR metrics from scholarly usage of resources

    CERN Document Server

    CERN. Geneva; Van de Sompel, Herbert

    2007-01-01

    Usage data is increasingly regarded as a valuable resource in the assessment of scholarly communication items. However, the development of quantitative, usage-based indicators of scholarly impact is still in its infancy. The Digital Library Research & Prototyping Team at the Los Alamos National Laboratory's Research library has therefore started a program to expand the set of usage-based tools for the assessment of scholarly communication items. The two-year MESUR project, funded by the Andrew W. Mellon Foundation, aims to define and validate a range of usage-based impact metrics, and issue guidelines with regards to their characteristics and proper application. The MESUR project is constructing a large-scale semantic model of the scholarly community that seamlessly integrates a wide range of bibliographic, citation and usage data. Functioning as a reference data set, this model is analyzed to characterize the intricate networks of typed relationships that exist in the scholarly community. The resulting c...

  1. Reuse Metrics for Object Oriented Software

    Science.gov (United States)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  2. Special metrics and group actions in geometry

    CERN Document Server

    Fino, Anna; Musso, Emilio; Podestà, Fabio; Vezzoni, Luigi

    2017-01-01

    The volume is a follow-up to the INdAM meeting “Special metrics and quaternionic geometry” held in Rome in November 2015. It offers a panoramic view of a selection of cutting-edge topics in differential geometry, including 4-manifolds, quaternionic and octonionic geometry, twistor spaces, harmonic maps, spinors, complex and conformal geometry, homogeneous spaces and nilmanifolds, special geometries in dimensions 5–8, gauge theory, symplectic and toric manifolds, exceptional holonomy and integrable systems. The workshop was held in honor of Simon Salamon, a leading international scholar at the forefront of academic research who has made significant contributions to all these subjects. The articles published here represent a compelling testimony to Salamon’s profound and longstanding impact on the mathematical community. Target readership includes graduate students and researchers working in Riemannian and complex geometry, Lie theory and mathematical physics.

  3. Development of Technology Transfer Economic Growth Metrics

    Science.gov (United States)

    Mastrangelo, Christina M.

    1998-01-01

    The primary objective of this project is to determine the feasibility of producing technology transfer metrics that answer the question: Do NASA/MSFC technical assistance activities impact economic growth? The data for this project resides in a 7800-record database maintained by Tec-Masters, Incorporated. The technology assistance data results from survey responses from companies and individuals who have interacted with NASA via a Technology Transfer Agreement, or TTA. The goal of this project was to determine if the existing data could provide indications of increased wealth. This work demonstrates that there is evidence that companies that used NASA technology transfer have a higher job growth rate than the rest of the economy. It also shows that the jobs being supported are jobs in higher wage SIC codes, and this indicates improvements in personal wealth. Finally, this work suggests that with correct data, the wealth issue may be addressed.

  4. Advanced reactors: the case for metric design

    International Nuclear Information System (INIS)

    Ruby, L.

    1986-01-01

    The author argues that DOE should insist that all design specifications for advanced reactors be in the International System of Units (SI) in accordance with the Metric Conversion Act of 1975. Despite a lack of leadership from the federal government, industry has had to move toward conversion in order to compete on world markets. The US is the only major country without a scheduled conversion program. SI avoids the disadvantages of ambiguous names, non-coherent units, multiple units for the same quantity, multiple definitions, as well as barriers to international exchange and marketing and problems in comparing safety and code parameters. With a first step by DOE, the Nuclear Regulatory Commission should add the same requirements to reactor licensing guidelines. 4 references

  5. Metric Learning to Enhance Hyperspectral Image Segmentation

    Science.gov (United States)

    Thompson, David R.; Castano, Rebecca; Bue, Brian; Gilmore, Martha S.

    2013-01-01

    Unsupervised hyperspectral image segmentation can reveal spatial trends that show the physical structure of the scene to an analyst. They highlight borders and reveal areas of homogeneity and change. Segmentations are independently helpful for object recognition, and assist with automated production of symbolic maps. Additionally, a good segmentation can dramatically reduce the number of effective spectra in an image, enabling analyses that would otherwise be computationally prohibitive. Specifically, using an over-segmentation of the image instead of individual pixels can reduce noise and potentially improve the results of statistical post-analysis. In this innovation, a metric learning approach is presented to improve the performance of unsupervised hyperspectral image segmentation. The prototype demonstrations attempt a superpixel segmentation in which the image is conservatively over-segmented; that is, single surface features may be split into multiple segments, but each individual segment, or superpixel, is ensured to have homogeneous mineralogy.
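    As a toy stand-in for the approach, one can derive a whitening (Mahalanobis-style) transform from the spectra and then over-segment the pixels by clustering in the transformed space. The sketch below shows only the shape of that idea on synthetic data; it is not the paper's metric-learning objective or superpixel algorithm, and the spatial weighting and cluster count are arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
h, w, bands = 32, 32, 8
cube = rng.normal(size=(h, w, bands))                 # synthetic hyperspectral cube

# "Learn" a simple metric: whiten the spectra with the inverse covariance (ZCA-style)
spectra = cube.reshape(-1, bands)
cov = np.cov(spectra, rowvar=False) + 1e-6 * np.eye(bands)
evals, evecs = np.linalg.eigh(cov)
whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T     # square-root of the inverse covariance

# Over-segment: cluster pixels on (whitened spectrum + down-weighted spatial coordinates)
yy, xx = np.mgrid[0:h, 0:w]
features = np.hstack([spectra @ whiten, 0.1 * np.column_stack([yy.ravel(), xx.ravel()])])
superpixels = KMeans(n_clusters=50, n_init=4, random_state=0).fit_predict(features)
print(superpixels.reshape(h, w)[:4, :4])
```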

  6. Einstein metrics and Brans-Dicke superfields

    International Nuclear Information System (INIS)

    Marques, S.

    1988-01-01

    We obtain here a space conformal to the Einstein space-time, making the transition from an internal bosonic space, constructed with the constant Majorana spinors in the Majorana representation, to a bosonic ''superspace'' through the use of Einstein vierbeins. These spaces are related to a Grassmann space constructed with the Majorana spinors referred to above, where the ''metric'' is a function of internal bosonic coordinates. The conformal function is a scale factor in the zone of gravitational radiation. A conformal function dependent on space-time coordinates can be constructed in that region when we introduce Majorana spinors which are functions of those coordinates. With this we obtain a scalar field of Brans-Dicke type. 11 refs

  7. Clean Cities 2014 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Caley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Singer, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  8. Clean Cities 2013 Annual Metrics Report

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  9. Outsourced similarity search on metric data assets

    KAUST Repository

    Yiu, Man Lung

    2012-02-01

    This paper considers a cloud computing setting in which similarity querying of metric data is outsourced to a service provider. The data is to be revealed only to trusted users, not to the service provider or anyone else. Users query the server for the most similar data objects to a query example. Outsourcing offers the data owner scalability and a low initial investment. The need for privacy may be due to the data being sensitive (e.g., in medicine), valuable (e.g., in astronomy), or otherwise confidential. Given this setting, the paper presents techniques that transform the data prior to supplying it to the service provider for similarity queries on the transformed data. Our techniques provide interesting trade-offs between query cost and accuracy. They are then further extended to offer an intuitive privacy guarantee. Empirical studies with real data demonstrate that the techniques are capable of offering privacy while enabling efficient and accurate processing of similarity queries.
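    As a much-simplified illustration of transforming metric data before outsourcing it, a secret orthogonal rotation preserves all pairwise Euclidean distances, so nearest-neighbour answers computed on the transformed data match those on the originals. This toy sketch is not the paper's scheme, and rotation alone offers only weak protection; it merely shows why distance-preserving transformations make similarity queries on transformed data possible.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 8))                   # owner's metric (Euclidean) data

# Secret random orthogonal matrix: distances are preserved under x -> x @ Q.
# (Illustration only; rotation alone is weak protection and is NOT the paper's technique.)
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))
outsourced = data @ Q                              # what the service provider stores

query = rng.normal(size=8)
nn_plain = np.argmin(np.linalg.norm(data - query, axis=1))
nn_transformed = np.argmin(np.linalg.norm(outsourced - query @ Q, axis=1))
print(nn_plain == nn_transformed)                  # True: nearest neighbours are preserved
```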

  10. A Framework for Instituting Software Metrics in Small Software Organizations

    OpenAIRE

    Hisham M. Haddad; Nancy C. Ross; Donald E. Meredith

    2012-01-01

    The role of metrics in software quality is well-recognized; however, software metrics are yet to be standardized and integrated into development practices across the software industry. Literature reports indicate that software companies with less than 50 employees may represent up to 85% of the software organizations in several countries, including the United States. While process, project, and product metrics share a common goal of contributing to software quality and reliability, utilizatio...

  11. Report on metric study tour to Republic of South Africa

    Energy Technology Data Exchange (ETDEWEB)

    Laner, F. J.

    1978-01-01

    The modernized metric system, known universally as the International System of Units (abbreviated SI, from its French name), was so renamed in 1960 by the world body on standards. A map shows 98 percent of the world using or moving toward adoption of SI units. Only the countries of Burma, Liberia, Brunei, and Southern Yemen are nonmetric. The author describes a two-week session in Pretoria and Johannesburg on metrication, followed by additional meetings on metrication in Rhodesia. (MCW)

  12. Chaos of discrete dynamical systems in complete metric spaces

    International Nuclear Information System (INIS)

    Shi Yuming; Chen Guanrong

    2004-01-01

    This paper is concerned with chaos of discrete dynamical systems in complete metric spaces. Discrete dynamical systems governed by continuous maps in general complete metric spaces are first discussed, and two criteria of chaos are then established. As a special case, two corresponding criteria of chaos for discrete dynamical systems in compact subsets of metric spaces are obtained. These results have extended and improved the existing relevant results of chaos in finite-dimensional Euclidean spaces

  13. Relevance As a Metric for Evaluating Machine Learning Algorithms

    OpenAIRE

    Gopalakrishna, Aravind Kota; Ozcelebi, Tanir; Liotta, Antonio; Lukkien, Johan J.

    2013-01-01

    In machine learning, the choice of a learning algorithm that is suitable for the application domain is critical. The performance metric used to compare different algorithms must also reflect the concerns of users in the application domain under consideration. In this work, we propose a novel probability-based performance metric called Relevance Score for evaluating supervised learning algorithms. We evaluate the proposed metric through empirical analysis on a dataset gathered from an intellig...

  14. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  15. Analytic convergence of harmonic metrics for parabolic Higgs bundles

    Science.gov (United States)

    Kim, Semin; Wilkin, Graeme

    2018-04-01

    In this paper we investigate the moduli space of parabolic Higgs bundles over a punctured Riemann surface with varying weights at the punctures. We show that the harmonic metric depends analytically on the weights and the stable Higgs bundle. This gives a Higgs bundle generalisation of a theorem of McOwen on the existence of hyperbolic cone metrics on a punctured surface within a given conformal class, and a generalisation of a theorem of Judge on the analytic parametrisation of these metrics.

  16. Ricci flow of warped product metrics with positive isotropic curvature ...

    Indian Academy of Sciences (India)

    In this paper we study an ODE system associated to the Ricci flow of a special class of metrics on S^(p+1) × S^1, namely the doubly-warped product metrics, with PIC. While the ... In our context, blow-ups are defined as follows: Let g(t) be the Ricci flow on a compact n-manifold M starting at a metric g_0. If g_0 has positive scalar ...

  17. Survey on Impact of Software Metrics on Software Quality

    OpenAIRE

    Mrinal Singh Rawat; Arpita Mittal; Sanjay Kumar Dubey

    2012-01-01

    Software metrics provide a quantitative basis for planning and predicting software development processes. Therefore the quality of software can be controlled and improved easily. Quality in fact aids higher productivity, which has brought software metrics to the forefront. This research paper focuses on different views on software quality. Moreover, many metrics and models have been developed; promoted and utilized resulting in remarkable successes. This paper examines the realm of software e...

  18. Landscape Metrics to Predict Soil Spatial Patterns

    Science.gov (United States)

    Gillin, C. P.; McGuire, K. J.; Bailey, S.; Prisley, S.

    2012-12-01

    Recent literature has advocated the application of hydropedology, or the integration of hydrology and pedology, to better understand hydrologic flowpaths and soil spatial heterogeneity in a landscape. Hydropedology can be used to describe soil units affected by distinct topography, geology, and hydrology. Such a method has not been applied to digital soil mapping in the context of spatial variations in hydrological and biogeochemical processes. The purpose of this study is to use field observations of soil morphology, geospatial information technology, and a multinomial logistic regression model to predict the distribution of five hydropedological units (HPUs) across a 41-hectare forested headwater catchment in New England. Each HPU reflects varying degrees of lateral flow influence on soil development. Ninety-six soil characterization pits were located throughout the watershed, and HPU type was identified at each pit based on the presence and thickness of genetic soil horizons. Digital terrain analysis was conducted using ArcGIS and SAGA software to compute topographic and landscape metrics. Results indicate that each HPU occurs under specific topographic settings that influence subsurface hydrologic conditions. Among the most important landscape metrics are distance from stream, distance from bedrock outcrop, upslope accumulated area, the topographic wetness index, the downslope index, and curvature. Our project is unique in that it delineates high resolution soil units using a process-based morphological approach rather than a traditional taxonomical method taken by conventional soil surveys. Hydropedological predictor models can be a valuable tool for informing forest and land management decisions, water quality planning, soil carbon accounting, and understanding subsurface hydrologic dynamics. They can also be readily calibrated for regions of differing geology, topography, and climate regimes.
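    The modeling step described above (a multinomial logistic regression from terrain metrics to hydropedological unit) is straightforward to reproduce. In the sketch below the feature names mirror the metrics listed in the abstract, but the data and labels are random placeholders rather than the study's field observations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
features = ["dist_stream", "dist_outcrop", "upslope_area", "twi", "downslope_index", "curvature"]
X = rng.normal(size=(96, len(features)))   # 96 soil pits x terrain metrics (synthetic stand-ins)
hpu = rng.integers(0, 5, size=96)          # five hydropedological units (synthetic labels)

# The default lbfgs solver fits a multinomial (softmax) model for multiclass targets
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, hpu, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```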

  19. Modeling color preference using color space metrics.

    Science.gov (United States)

    Schloss, Karen B; Lessard, Laurent; Racey, Chris; Hurlbert, Anya C

    2017-07-27

    Studying color preferences provides a means to discover how perceptual experiences map onto cognitive and affective judgments. A challenge is finding a parsimonious way to describe and predict patterns of color preferences, which are complex with rich individual differences. One approach has been to model color preferences using factors from metric color spaces to establish direct correspondences between dimensions of color and preference. Prior work established that substantial, but not all, variance in color preferences could be captured by weights on color space dimensions using multiple linear regression. The question we address here is whether model fits may be improved by using different color metric specifications. We therefore conducted a large-scale analysis of color space models, and focused in-depth analysis on models that differed in color space (cone-contrast vs. CIELAB), coordinate system within the color space (Cartesian vs. cylindrical), and factor degrees (1st degree only, or 1st and 2nd degree). We used k-fold cross validation to avoid over-fitting the data and to ensure fair comparisons across models. The best model was the 2nd-harmonic Lch model ("LabC Cyl2"). Specified in CIELAB space, it included 1st and 2nd harmonics of hue (capturing opponency in hue preferences and simultaneous liking/disliking of both hues on an opponent axis, respectively), lightness, and chroma. These modeling approaches can be used to characterize and compare patterns for group averages and individuals in future datasets on color preference, or other measures in which correspondences between color appearance and cognitive or affective judgments may exist. Copyright © 2017 Elsevier Ltd. All rights reserved.
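    The kind of model compared in the study (linear regression on color space coordinates with 1st- and 2nd-harmonic hue terms, scored by k-fold cross-validation) can be sketched compactly. The data and coefficients below are synthetic placeholders, not the study's measurements or fitted weights.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
L = rng.uniform(20, 90, n)                 # CIELAB-style lightness (synthetic)
C = rng.uniform(10, 80, n)                 # chroma (synthetic)
h = rng.uniform(0, 2 * np.pi, n)           # hue angle in radians

# 1st and 2nd hue harmonics capture hue opponency and liking/disliking of both opponent hues
X = np.column_stack([L, C, np.cos(h), np.sin(h), np.cos(2 * h), np.sin(2 * h)])
preference = 0.3 * L - 0.2 * C + 5 * np.cos(h) + 3 * np.sin(2 * h) + rng.normal(0, 2, n)

scores = cross_val_score(LinearRegression(), X, preference, cv=5, scoring="r2")
print("mean cross-validated R^2:", scores.mean().round(2))
```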

  20. Diagnosis code assignment: models and evaluation metrics.

    Science.gov (United States)

    Perotte, Adler; Pivovarov, Rimma; Natarajan, Karthik; Weiskopf, Nicole; Wood, Frank; Elhadad, Noémie

    2014-01-01

    The volume of healthcare data is growing rapidly with the adoption of health information technology. We focus on automated ICD9 code assignment from discharge summary content and methods for evaluating such assignments. We study ICD9 diagnosis codes and discharge summaries from the publicly available Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC II) repository. We experiment with two coding approaches: one that treats each ICD9 code independently of each other (flat classifier), and one that leverages the hierarchical nature of ICD9 codes into its modeling (hierarchy-based classifier). We propose novel evaluation metrics, which reflect the distances among gold-standard and predicted codes and their locations in the ICD9 tree. Experimental setup, code for modeling, and evaluation scripts are made available to the research community. The hierarchy-based classifier outperforms the flat classifier with F-measures of 39.5% and 27.6%, respectively, when trained on 20,533 documents and tested on 2282 documents. While recall is improved at the expense of precision, our novel evaluation metrics show a more refined assessment: for instance, the hierarchy-based classifier identifies the correct sub-tree of gold-standard codes more often than the flat classifier. Error analysis reveals that gold-standard codes are not perfect, and as such the recall and precision are likely underestimated. Hierarchy-based classification yields better ICD9 coding than flat classification for MIMIC patients. Automated ICD9 coding is an example of a task for which data and tools can be shared and for which the research community can work together to build on shared models and advance the state of the art.
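    The evaluation idea (scoring a prediction by how far it falls from the gold-standard code in the ICD9 tree, rather than by exact match alone) can be sketched with a toy hierarchy. The codes and tree below are illustrative, and the distance function is a simple shortest-path count, not the authors' exact metric.

```python
import networkx as nx

# Toy ICD9-like hierarchy: root -> chapter -> category -> code
edges = [("root", "infectious"), ("root", "circulatory"),
         ("infectious", "038"), ("038", "038.9"),
         ("circulatory", "428"), ("428", "428.0"), ("428", "428.1")]
tree = nx.Graph(edges)

def tree_distance(gold, predicted):
    """Shortest-path distance in the code tree; 0 means an exact match."""
    return nx.shortest_path_length(tree, gold, predicted)

print(tree_distance("428.0", "428.0"))  # 0: exact match
print(tree_distance("428.0", "428.1"))  # 2: sibling code, mild error
print(tree_distance("428.0", "038.9"))  # 6: wrong chapter, severe error
```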